Gendered Ambiguous Pronoun (GAP) Shared Task at the Gender Bias in NLP Workshop 2019

Kellie Webster, Marta R. Costa-jussà, Christian Hardmeier, Will Radford

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The 1st ACL workshop on Gender Bias in Natural Language Processing included a shared task on gendered ambiguous pronoun (GAP) resolution. This task was based on the coreference challenge defined in Webster et al. (2018), designed to benchmark the ability of systems to resolve pronouns in real-world contexts in a gender-fair way. In total, 263 teams competed via a Kaggle competition, with the winning system achieving a log loss of 0.13667 and near gender parity. We review the approaches of the eleven systems with accepted description papers, noting their effective use of BERT (Devlin et al., 2018), both via fine-tuning and for feature extraction, as well as their use of ensembling.
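
For reference, the leaderboard figure quoted above is a multi-class logarithmic loss over the three GAP answer classes (the pronoun refers to candidate A, to candidate B, or to neither). Below is a minimal sketch of that metric, assuming the standard clipped-and-renormalised log-loss formulation used for Kaggle rankings; the class ordering, example values, and helper name are illustrative, not taken from the shared task code.

```python
import numpy as np

def multiclass_log_loss(y_true, y_prob, eps=1e-15):
    """Multi-class logarithmic loss.

    y_true: (n,) integer class indices (here 0 = A, 1 = B, 2 = NEITHER).
    y_prob: (n, 3) predicted probabilities for each class.
    """
    y_prob = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    # Renormalise rows so the clipped probabilities still sum to 1.
    y_prob /= y_prob.sum(axis=1, keepdims=True)
    n = len(y_true)
    # Average negative log-probability assigned to the true class.
    return -np.log(y_prob[np.arange(n), y_true]).mean()

# Illustrative example: three snippets whose pronouns resolve to A, NEITHER, B.
y_true = [0, 2, 1]
y_prob = [[0.90, 0.05, 0.05],
          [0.10, 0.10, 0.80],
          [0.20, 0.70, 0.10]]
print(round(multiclass_log_loss(y_true, y_prob), 5))  # ~0.228; lower is better
```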
Original language: English
Title of host publication: Proceedings of the First Workshop on Gender Bias in Natural Language Processing
Place of Publication: Florence, Italy
Publisher: Association for Computational Linguistics
Pages: 1-7
Number of pages: 7
ISBN (Electronic): 978-1-950737-40-6
Publication status: Published - 2 Aug 2019
Event: 57th Annual Meeting of the Association for Computational Linguistics, Fortezza da Basso, Florence, Italy
Duration: 28 Jul 2019 - 2 Aug 2019
Conference number: 57
http://www.acl2019.org/EN/index.xhtml

Conference

Conference: 57th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2019
Country/Territory: Italy
City: Florence
Period: 28/07/19 - 02/08/19
Internet address: http://www.acl2019.org/EN/index.xhtml
