Intersectional Bias in Hate Speech and Abusive Language Datasets

Kim, Jae Yeon / Ortiz, Carlos / Nam, Sarah / Santiago, Sarah / Datta, Vivek

DC Field / Value
dc.contributor.author: Kim, Jae Yeon
dc.contributor.author: Ortiz, Carlos
dc.contributor.author: Nam, Sarah
dc.contributor.author: Santiago, Sarah
dc.contributor.author: Datta, Vivek
dc.date.available: 2021-12-23T07:46:20Z
dc.date.created: 2021-12-23
dc.date.issued: 2020-06-08
dc.identifier.uri: https://archives.kdischool.ac.kr/handle/11125/42835
dc.description.abstract: Algorithms are widely applied to detect hate speech and abusive language on social media. We investigated whether the human-annotated data used to train these algorithms are biased. Using a publicly available annotated Twitter dataset (Founta et al. 2018), we classified the racial, gender, and party identification dimensions of 99,996 tweets. The results showed that African American tweets were up to 3.7 times more likely to be labeled as abusive, and African American male tweets were up to 77% more likely to be labeled as hateful, compared to the others. These patterns were statistically significant and robust even when party identification was added as a control variable. This study provides the first systematic evidence of intersectional bias in hate speech and abusive language datasets.
dc.language: English
dc.publisher: AAAI Organization
dc.title: Intersectional Bias in Hate Speech and Abusive Language Datasets
dc.type: Conference
dc.identifier.bibliographicCitation: Proceedings of the Fourteenth International Conference on Web and Social Media (ICWSM), Data Challenge Workshop
dc.description.journalClass: 1
dc.citation.conferenceDate: 2020-06-08
dc.citation.conferencePlace: US
dc.citation.conferencePlace: Atlanta, Georgia, USA
dc.citation.title: Proceedings of the Fourteenth International Conference on Web and Social Media (ICWSM), Data Challenge Workshop
dc.contributor.affiliatedAuthor: Kim, Jae Yeon
dc.identifier.url: https://sites.google.com/view/icwsm2020datachallenge
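The abstract reports group disparities as odds ratios (e.g. African American tweets being "up to 3.7 times more likely" to be labeled abusive). As a minimal sketch of how such a ratio is computed from a 2x2 cross-tabulation of labels by group: the counts below are purely illustrative, not the paper's data, and the function name is hypothetical.

```python
def odds_ratio(group_pos, group_neg, other_pos, other_neg):
    """Odds of receiving a positive (e.g. "abusive") label in one
    group, divided by the odds in the comparison group."""
    return (group_pos / group_neg) / (other_pos / other_neg)

# Illustrative counts only: tweets labeled abusive vs. not abusive,
# split by an inferred group membership (not the dataset's figures).
ratio = odds_ratio(370, 630, 137, 863)
print(round(ratio, 2))  # prints 3.7
```

The paper additionally controls for party identification, which in practice would mean estimating such effects inside a logistic regression with covariates rather than from a raw cross-tabulation; this sketch shows only the headline quantity.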
Files in This Item:
    There are no files associated with this item.

