
DORAS | DCU Research Repository


Enhancing neural machine translation of low-resource languages: corpus development, human evaluation and explainable AI architectures

Lankford, Séamus (ORCID: 0000-0003-1693-9533) (2024) Enhancing neural machine translation of low-resource languages: corpus development, human evaluation and explainable AI architectures. PhD thesis, Dublin City University.

Abstract
In the current machine translation (MT) landscape, the Transformer architecture stands as the gold standard, especially for high-resource language pairs. This research delves into its efficacy for low-resource language pairs, including both the English↔Irish and English↔Marathi language pairs. Notably, the study identifies the optimal hyperparameters and subword model type to significantly improve the translation quality of Transformer models for low-resource language pairs. The scarcity of parallel datasets for low-resource languages can hinder MT development. To address this, we developed gaHealth, the first bilingual corpus of health data for the Irish language. Focusing on the health domain, models developed using this in-domain dataset exhibited very significant improvements in BLEU score when compared with models from the LoResMT2021 Shared Task. A subsequent human evaluation using the Multidimensional Quality Metrics (MQM) error taxonomy showcased the superior performance of the Transformer system in reducing both accuracy and fluency errors compared to an RNN-based counterpart. Furthermore, this thesis introduces adaptNMT and adaptMLLM, two open-source applications streamlined for the development, fine-tuning, and deployment of neural machine translation models. These tools considerably simplify the setup and evaluation process, making MT more accessible to both developers and translators. Notably, adaptNMT, grounded in the OpenNMT ecosystem, promotes eco-friendly natural language processing research by highlighting the environmental footprint of model development. Fine-tuning of multilingual large language models (MLLMs) with adaptMLLM demonstrated improvements in translation performance for two low-resource language pairs, English↔Irish and English↔Marathi, compared to baselines from the LoResMT2021 Shared Task.
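
The subword modelling and BLEU-based evaluation mentioned above can be illustrated with a short, self-contained sketch. The snippet below is not taken from the thesis or from adaptNMT; it is a minimal example using the SentencePiece and sacreBLEU libraries, with hypothetical file names, vocabulary size and example sentences.

# Minimal illustrative sketch, not code from the thesis or adaptNMT.
# It shows two techniques the abstract refers to: training a subword model
# and scoring MT output with corpus-level BLEU. File names, the vocabulary
# size and the example sentences are hypothetical placeholders.
import sentencepiece as spm
import sacrebleu

# Train a subword model on the source side of a (hypothetical) parallel corpus.
# The subword model type and vocabulary size are among the choices the thesis
# reports tuning for low-resource pairs.
spm.SentencePieceTrainer.train(
    input="train.en",          # hypothetical English side of the training corpus
    model_prefix="subword_en",
    vocab_size=8000,           # smaller vocabularies are common for low-resource data
    model_type="unigram",      # alternatives include "bpe"
)

# Segment a sentence with the trained model.
sp = spm.SentencePieceProcessor(model_file="subword_en.model")
print(sp.encode("the health programme is available", out_type=str))

# Compare system output against references with corpus-level BLEU.
hypotheses = ["the health programme is available"]      # hypothetical MT output
references = [["the health programme is available"]]    # one reference stream
print(sacrebleu.corpus_bleu(hypotheses, references).score)

In the thesis itself, corpus preparation, model building and evaluation are wrapped by the adaptNMT and adaptMLLM applications rather than invoked directly as above.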
Metadata
Item Type: Thesis (PhD)
Date of Award: March 2024
Refereed: No
Supervisor(s): Way, Andy and Afli, Haithem
Subjects: Computer Science > Artificial intelligence
Computer Science > Computational linguistics
Computer Science > Computer engineering
Computer Science > Computer software
Computer Science > Information technology
Computer Science > Machine learning
Computer Science > Machine translating
Humanities > Irish language
Humanities > Linguistics
Humanities > Translating and interpreting
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Research Initiatives and Centres > ADAPT
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 License.
Funders: Munster Technological University
ID Code: 29403
Deposited On: 22 Mar 2024 13:38 by Andrew Way. Last Modified: 22 Mar 2024 13:38
Documents

Full text available as:

PDF (ID-20216607-Date-Jan2nd-2024-Enhancing Neural Machine Translation.pdf) - 4MB
License: Creative Commons Attribution-NonCommercial-No Derivative Works 4.0