
DORAS | DCU Research Repository


A reproduction study of an annotation-based human evaluation of MT outputs

Popović, Maja (ORCID: 0000-0001-8234-8745) and Belz, Anya (ORCID: 0000-0002-0552-8096) (2021) A reproduction study of an annotation-based human evaluation of MT outputs. In: 14th International Conference on Natural Language Generation, 20-24 Sept 2021, Aberdeen, Scotland.

Abstract
In this paper we report our reproduction study of the Croatian part of an annotation-based human evaluation of machine-translated user reviews (Popović, 2020). The work was carried out as part of the ReproGen Shared Task on Reproducibility of Human Evaluation in NLG. Our aim was to repeat the original study exactly, except for using a different set of evaluators. We describe the experimental design, characterise differences between the original and reproduction study, and present the results from each study, along with an analysis of the similarity between them. For the six main evaluation results of Major/Minor/All Comprehension error rates and Major/Minor/All Adequacy error rates, we find that (i) 4/6 system rankings are the same in both studies, (ii) the relative differences between systems are replicated well for Major Comprehension and Adequacy (Pearson's r > 0.9), but not for the corresponding Minor error rates (Pearson's r 0.36 for Adequacy, 0.67 for Comprehension), and (iii) the individual system scores for both types of Minor error rates had a higher degree of reproducibility than the corresponding Major error rates. We also examine inter-annotator agreement and compare the annotations obtained in the original and reproduction studies.
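The abstract's point (ii) compares per-system error rates across the two studies using Pearson's correlation coefficient. A minimal sketch of that comparison, using hypothetical error rates (the values below are illustrative only, not the paper's actual results):

```python
from statistics import fmean
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient r between two equal-length sequences."""
    mx, my = fmean(x), fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical Major Comprehension error rates for four MT systems,
# original study vs. reproduction (illustrative values only).
original = [12.0, 8.5, 15.2, 6.1]
reproduction = [11.4, 9.0, 14.8, 6.5]

print(round(pearson(original, reproduction), 3))
```

A high r here means the relative differences between systems are preserved across studies even if the absolute error rates shift, which is exactly the sense in which the paper reports Major error rates as well replicated (r > 0.9) and Minor error rates as less so.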
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Machine learning; Computer Science > Machine translating
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; Research Initiatives and Centres > ADAPT
Published in: Proceedings of the 14th International Conference on Natural Language Generation (INLG). Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://aclanthology.org/2021.inlg-1.31
Copyright Information: © 2021 Association for Computational Linguistics
Funders: Science Foundation Ireland SFI Research Centres Programme (Grant 13/RC/2106); European Regional Development Fund (ERDF); European Association for Machine Translation (EAMT); ADAPT NLG research group
ID Code: 28358
Deposited On: 24 May 2023 08:58 by Maja Popović. Last Modified: 24 May 2023 09:00
Documents

Full text available as:

PDF - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
Creative Commons: Attribution 4.0
199kB
