
Quantifying the impact of context on the quality of manual hate speech annotation
Ljubešić, Nikola (Author), Mozetič, Igor (Author), Kralj Novak, Petra (Author)

PDF - Presentation file, 389.29 KB
MD5: 7019AF32FD10551FF7B3C1301583B20C
Source URL: https://www.cambridge.org/core/journals/natural-language-engineering/article/quantifying-the-impact-of-context-on-the-quality-of-manual-hate-speech-annotation/B6E813E528CE094DBE489ABD3A047D8A

Abstract
The quality of annotations in manually annotated hate speech datasets is crucial for automatic hate speech detection. This contribution focuses on the positive effects of manually annotating online comments for hate speech within the context in which the comments occur. We quantify the impact of context availability by meticulously designing an experiment: Two annotation rounds are performed, one in-context and one out-of-context, on the same English YouTube data (more than 10,000 comments), by using the same annotation schema and platform, the same highly trained annotators, and quantifying annotation quality through inter-annotator agreement. Our results show that the presence of context has a significant positive impact on the quality of the manual annotations. This positive impact is more noticeable among replies than among comments, although the former is harder to consistently annotate overall. Previous research reporting that out-of-context annotations favour assigning non-hate-speech labels is also corroborated, showing further that this tendency is especially present among comments inciting violence, a highly relevant category for hate speech research and society overall. We believe that this work will improve future annotation campaigns even beyond hate speech and motivate further research on the highly relevant questions of data annotation methodology in natural language processing, especially in the light of the current expansion of its scope of application.
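
The abstract quantifies annotation quality through inter-annotator agreement between the in-context and out-of-context rounds. As a minimal illustrative sketch, the snippet below computes Cohen's kappa (chance-corrected agreement between two annotators) over toy labels; the label set and data here are hypothetical, and the paper may rely on a different agreement coefficient, such as Krippendorff's alpha.

```python
from collections import Counter

def cohen_kappa(ann_a, ann_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(ann_a)
    # Fraction of items on which the two annotators assigned the same label.
    observed = sum(a == b for a, b in zip(ann_a, ann_b)) / n
    # Chance agreement expected from each annotator's label frequencies.
    freq_a, freq_b = Counter(ann_a), Counter(ann_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum(freq_a[lab] * freq_b[lab] for lab in labels) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical labels from two annotators over the same five comments.
round_a = ["acceptable", "offensive", "violent", "acceptable", "offensive"]
round_b = ["acceptable", "offensive", "violent", "acceptable", "acceptable"]
print(f"kappa = {cohen_kappa(round_a, round_b):.3f}")  # kappa = 0.688
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance, so comparing the coefficient across the two annotation rounds gives a single number for the effect of context availability.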

Language: English
Keywords: hate speech, manual annotation, inter-annotator agreement, impact of context
Work type: Article
Typology: 1.01 - Original Scientific Article
Organization: FRI - Faculty of Computer and Information Science
Publication status: Published
Publication version: Version of Record
Year: 2023
Pages: 1481-1494
Numbering: Vol. 29, iss. 6
PID: 20.500.12556/RUL-155112
UDC: 004.8
ISSN on article: 1351-3249
DOI: 10.1017/S1351324922000353
COBISS.SI-ID: 118777859
Publication date in RUL: 20.03.2024
Views: 86
Downloads: 7

Record is a part of a journal

Title: Natural Language Engineering
Shortened title: Nat. Lang. Eng.
Publisher: Cambridge University Press
ISSN: 1351-3249
COBISS.SI-ID: 514557465

Licences

License: CC BY 4.0, Creative Commons Attribution 4.0 International
Link: http://creativecommons.org/licenses/by/4.0/
Description: This is the standard Creative Commons license that gives others maximum freedom to do what they want with the work as long as they credit the author.

Projects

Funder: ARRS - Slovenian Research Agency
Project number: P2-0103
Name: Tehnologije znanja (Knowledge Technologies)

Funder: ARRS - Slovenian Research Agency
Project number: P6-0411
Name: Jezikovni viri in tehnologije za slovenski jezik (Language Resources and Technologies for the Slovenian Language)

Funder: ARRS - Slovenian Research Agency
Project number: N6-0099
Name: Jezikovna krajina sovražnega govora na družbenih omrežjih (The Linguistic Landscape of Hate Speech on Social Media)

Funder: Other - Other funder or multiple funders
Funding programme: Flemish Research Foundation
Project number: FWO-G070619N
Acronym: LiLaH

Funder: EC - European Commission
Funding programme: Rights, Equality and Citizenship Programme
Project number: 875263
Acronym: IMSyPP
