
DORAS | DCU Research Repository



Quantified reproducibility assessment of NLP results

Belz, Anya (ORCID: 0000-0002-0552-8096), Popović, Maja (ORCID: 0000-0001-8234-8745) and Mille, Simon (ORCID: 0000-0002-8852-2764) (2022) Quantified reproducibility assessment of NLP results. In: 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 22-27 May 2022, Dublin, Ireland.

Abstract
This paper describes and tests a method for carrying out quantified reproducibility assessment (QRA) that is based on concepts and definitions from metrology. QRA produces a single score estimating the degree of reproducibility of a given system and evaluation measure, on the basis of the scores from, and differences between, different reproductions. We test QRA on 18 different system and evaluation measure combinations (involving diverse NLP tasks and types of evaluation), for each of which we have the original results and one to seven reproduction results. The proposed QRA method produces degree-of-reproducibility scores that are comparable across multiple reproductions not only of the same, but also of different, original studies. We find that the proposed method facilitates insights into causes of variation between reproductions, and as a result, allows conclusions to be drawn about what aspects of system and/or evaluation design need to be changed in order to improve reproducibility.
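The abstract describes QRA as producing a single degree-of-reproducibility score from the original and reproduction scores of a system/measure pair. The exact formulas are given in the paper itself; as a rough illustrative sketch only, the snippet below assumes the score is a coefficient-of-variation-style precision estimate with a small-sample correction, a standard construct in metrology. The function name `cv_star` and the example scores are hypothetical, not taken from the paper.

```python
from statistics import mean, stdev

def cv_star(scores):
    """Coefficient of variation (in %) across measured scores,
    with a small-sample correction factor (1 + 1/(4n)) often used
    in precision estimates. Lower values indicate that the
    reproductions agree more closely, i.e. higher reproducibility.
    NOTE: illustrative sketch only, not the paper's exact formula.
    """
    n = len(scores)
    m = mean(scores)       # mean of original + reproduction scores
    s = stdev(scores)      # sample standard deviation (ddof=1)
    return (1 + 1 / (4 * n)) * (s / m) * 100

# Hypothetical example: an original score plus three reproductions
scores = [25.3, 24.9, 25.1, 26.0]
print(round(cv_star(scores), 2))
```

Because the score is normalised by the mean, values computed for different systems and evaluation measures are on a comparable scale, which is what makes cross-study comparison of reproducibility possible.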
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Computational linguistics; Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; Research Initiatives and Centres > ADAPT
Published in: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://doi.org/10.18653/v1/2022.acl-long.2
Copyright Information: © 2022 Association for Computational Linguistics
Funders: ADAPT SFI Centre for Digital Media Technology, funded by Science Foundation Ireland through the SFI Research Centres Programme (Grant 13/RC/2106); European Regional Development Fund (ERDF); European Commission under the H2020 programme (contract numbers 786731, 825079, 870930 and 952133)
ID Code: 28371
Deposited On: 25 May 2023 15:33 by Maja Popović. Last Modified: 04 Jul 2023 10:39
Documents

Full text available as:

PDF - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
Creative Commons: Attribution 4.0
363kB
