<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<!-- Bootstrap CSS -->
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-EVSTQN3/azprG1Anm3QDgpJLIm9Nao0Yz1ztcQTwFspd3yD65VohhpuuCOmLASjC" crossorigin="anonymous">
<!-- Copy and modify below line to add Signposting -->
<link href="" rel="self" />
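    <!--
      Example Signposting for this page: one possible set of relations, using only
      links that already appear elsewhere in this document (see signposting.org for
      the full list of relation types). Uncomment and adapt as part of the tutorial:

      <link rel="cite-as" href="https://doi.org/10.5281/zenodo.7338056" />
      <link rel="describedby" href="bioschemas.jsonld" type="application/ld+json" />
      <link rel="item" href="https://zenodo.org/records/7338056/files/Fleiss%20Kappa%20for%20document-to-document%20relevant%20assessment.csv?download=1" type="text/csv" />
      <link rel="license" href="https://creativecommons.org/licenses/by/4.0/" />
      <link rel="author" href="https://orcid.org/0000-0003-2978-8922" />
      <link rel="type" href="https://schema.org/Dataset" />
    -->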
<title>Example dataset: Fleiss kappa for doc-2-doc relevance assessment</title>
</head>
<body class="container">
<h1>Fleiss kappa for doc-2-doc relevance assessment</h1>
<p class="lead">Example dataset for Signposting tutorial</p>
    <!--
      The HTML in this example is based on the Bootstrap v5 starter template -- the class="..." attributes are layout features and are not part of Signposting.
    -->
<section class="card">
<h2 class="card-header">About this tutorial</h2>
<div class="card-body container-md">
<p>
        One of the outputs of the web app <a href="https://github.com/zbmed-semtec/TREC-doc-2-doc-relevance">TREC-doc-2-doc-relevance</a> is a measure of inter-annotator agreement on the task of document-to-document relevance assessment, computed using Fleiss' kappa.
</p>
<p>
        This page has been created to emulate the Zenodo landing page for the <a href="https://doi.org/10.5281/zenodo.7338056">Fleiss kappa for doc-2-doc relevance assessment</a> dataset, so that we can add Signposting as part of the <a href="https://github.com/stain/signposting-tutorial">signposting-tutorial</a>. For the purposes of the tutorial, imagine that the DOI below is registered to redirect to this page rather than to the real Zenodo entry.
</p>
</div>
</section>
<div id="abstract" class="card">
<h2 class="card-header">Abstract</h2>
<p class="card-body">
      Here we present a table summarizing the Fleiss’ kappa results. Fleiss’ kappa was calculated to measure the degree of agreement between four annotators who evaluated the relevance of a set of documents (15 evaluation articles) with respect to their corresponding “reference article”. The table contains 7 columns. The first column presents the topics, 8 in total. The second column shows the “reference articles”, represented by their PubMed IDs and organized by topic. The third column shows the Fleiss’ kappa results. The fourth column shows the interpretation of those results: i) “Poor” for values &lt;0.20, ii) “Fair” for values within 0.21–0.40, and iii) “Moderate” for values within 0.41–0.60. The fifth column shows the PubMed IDs of evaluation articles rated by the four annotators as “Relevant” with respect to their corresponding “reference article”. The sixth column shows the PubMed IDs of evaluation articles rated as “Partially relevant”. The seventh column shows the PubMed IDs of evaluation articles rated as “Non-relevant”.
</p>
</div>
<div class="card" id="metadata">
<h2 class="card-header">Metadata</h2>
<dl class="card-body">
<dt>Title</dt>
<dd>Fleiss kappa for doc-2-doc relevance assessment</dd>
<dt>Type</dt>
<dd>Data set</dd>
<dt>DOI</dt>
<dd><a href="https://doi.org/10.5281/zenodo.7338056">10.5281/zenodo.7338056</a></dd>
<dt>Keywords</dt>
<dd>Fleiss kappa</dd>
<dd>Inter-annotator agreement</dd>
<dt>Authors</dt>
<dd>
<a href="https://orcid.org/0000-0003-2978-8922">Olga Giraldo</a>
</dd>
<dd>
<a href="https://orcid.org/0009-0004-1529-0095">Dhwani Solanki</a>
</dd>
<dd>
<a href="https://orcid.org/0000-0002-1018-0370">Dietrich Rebholz-Schuhmann</a>
</dd>
<dd>
<a href="https://orcid.org/0000-0003-3986-0510">Leyla Jael Castro</a>
</dd>
<dt>License</dt>
<dd>
<a href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International</a>
</dd>
<dt>Published</dt>
<dd>November 19, 2022</dd>
</dl>
</div>
<div id="download" class="card">
<h2 class="card-header">Download</h2>
      <ul class="card-body">
        <li>
          <code>Fleiss Kappa for document-to-document relevant assessment.csv</code> (3.3 kB)
          <a href="https://zenodo.org/records/7338056/files/Fleiss%20Kappa%20for%20document-to-document%20relevant%20assessment.csv?download=1" class="btn btn-primary">Download</a>
        </li>
      </ul>
</div>
<div class="card" id="acknowledgement">
<h2 class="card-header">Acknowledgements</h2>
<p class="card-body">
      This work is part of the STELLA project funded by the DFG (project no. 407518790). This work was supported by the BMBF-funded de.NBI Cloud within the German Network for Bioinformatics Infrastructure (de.NBI) (031A532B, 031A533A, 031A533B, 031A534A, 031A535A, 031A537A, 031A537B, 031A537C, 031A537D, 031A538A).
</p>
</div>
    <div class="card" id="export">
<h2 class="card-header">Export</h2>
<p class="card-body">
Metadata can be downloaded as:
<a href="bioschemas.jsonld">Bioschemas JSON-LD</a>,
<a href="https://zenodo.org/records/7338056/export/json-ld">schema.org JSON-LD</a>,
<a href="https://zenodo.org/records/7338056/export/bibtex">BibTeX</a>,
<a href="https://zenodo.org/records/7338056/export/datacite-xml">Datacite XML</a>,
<a href="https://zenodo.org/records/7338056/export/dublincore">Dublin Core XML</a>
    </p>
</div>
<script type="application/ld+json" id="bioschemas">
{
"@context": "https://schema.org",
"@type": "Dataset",
"@id": "https://doi.org/10.5281/zenodo.7338056",
"conformsTo": "https://bioschemas.org/profiles/Dataset/1.1-DRAFT",
"identifier": "DOI:10.5281/zenodo.7338056",
"citation": "Giraldo O, Solanki D, Rebholz-Schuhmann D, Castro LJ. Fleiss kappa for doc-2-doc relevance assessment. Zenodo; 2022. doi:10.5281/zenodo.7338056",
"name": "Fleiss kappa for doc-2-doc relevance assessment",
  "description": "Fleiss' kappa measuring inter-annotator agreement on a document-to-document relevance assessment task. The table contains 7 columns. The first column presents the topics, 8 in total. The second column shows the \"reference articles\", represented by their PubMed IDs and organized by topic. The third column shows the Fleiss’ kappa results. The fourth column shows the interpretation of those results: i) \"Poor\" for values <0.20, ii) \"Fair\" for values within 0.21 - 0.40, and iii) \"Moderate\" for values within 0.41 - 0.60. The fifth column shows the PubMed IDs of evaluation articles rated by the four annotators as \"Relevant\" with respect to their corresponding \"reference article\". The sixth column shows the PubMed IDs of evaluation articles rated as \"Partially relevant\". The seventh column shows the PubMed IDs of evaluation articles rated as \"Non-relevant\".",
  "keywords": "Fleiss' Kappa, Inter-annotator agreement, TREC Genomics Track 2005, relevance assessment",
"license": {
"@type": "CreativeWork",
"@id": "https://spdx.org/licenses/CC-BY-4.0.html",
"name": "Creative Commons Attribution 4.0 International",
"alternateName": "CC BY 4.0",
"url": "https://creativecommons.org/licenses/by/4.0/"
},
"url": "https://zenodo.org/record/7338056",
"datePublished": "2022-11-19",
"author": [
{"@id": "https://orcid.org/0000-0003-2978-8922"},
{"@id": "https://orcid.org/0009-0004-1529-0095"},
{"@id": "https://orcid.org/0000-0002-1018-0370"},
{"@id": "https://orcid.org/0000-0003-3986-0510"}
]
}
</script>
</body>
</html>