
DORAS | DCU Research Repository


Adaptation of machine translation for multilingual information retrieval in the medical domain

Pecina, Pavel, Dušek, Ondřej, Goeuriot, Lorraine (ORCID: 0000-0001-7491-1980), Hajič, Jan, Hlaváčová, Jaroslava, Jones, Gareth J.F. (ORCID: 0000-0002-4033-9135), Kelly, Liadh (ORCID: 0000-0003-1131-5238), Leveling, Johannes (ORCID: 0000-0003-0603-4191), Mareček, David, Novák, Michal, Popel, Martin, Rosa, Rudolf, Tamchyna, Aleš and Urešová, Zdeňka (2014) Adaptation of machine translation for multilingual information retrieval in the medical domain. Artificial Intelligence in Medicine, 61 (3). pp. 165-185. ISSN 1873-2860

Abstract
Objective. We investigate machine translation (MT) of user search queries in the context of cross-lingual information retrieval (IR) in the medical domain. The main focus is on techniques to adapt MT to increase translation quality; however, we also explore MT adaptation to improve the effectiveness of cross-lingual IR.

Methods and Data. Our MT system is Moses, a state-of-the-art phrase-based statistical machine translation system. The IR system is based on the BM25 retrieval model implemented in the Lucene search engine. The MT techniques employed in this work include in-domain training and tuning, intelligent training data selection, optimization of phrase table configuration, compound splitting, and exploiting synonyms as translation variants. The IR methods include morphological normalization and using multiple translation variants for query expansion. The experiments are performed and thoroughly evaluated on three language pairs: Czech–English, German–English, and French–English. MT quality is evaluated on data sets created within the Khresmoi project and IR effectiveness is tested on the CLEF eHealth 2013 data sets.

Results. The search query translation results achieved in our experiments are outstanding: our systems outperform not only our strong baselines, but also Google Translate and Microsoft Bing Translator in a direct comparison carried out on all the language pairs. The baseline BLEU scores increased from 26.59 to 41.45 for Czech–English, from 23.03 to 40.82 for German–English, and from 32.67 to 40.82 for French–English. This is a 55% improvement on average. In terms of IR performance on this particular test collection, a significant improvement over the baseline is achieved only for French–English. For Czech–English and German–English, the increased MT quality does not lead to better IR results.

Conclusions. Most of the MT techniques employed in our experiments improve MT of medical search queries. Intelligent training data selection, in particular, proves very successful for domain adaptation of MT. Certain improvements are also obtained from German compound splitting on the source language side. Translation quality, however, does not appear to correlate with IR performance: better translation does not necessarily yield better retrieval. We discuss in detail the contribution of the individual techniques and state-of-the-art features and provide future research directions.
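
The IR system named under Methods and Data is built on the BM25 retrieval model as implemented in Lucene. For readers unfamiliar with it, the standard Okapi BM25 scoring function is reproduced below as background; the parameter values mentioned are conventional defaults, not settings taken from the paper.

\mathrm{score}(D, Q) = \sum_{t \in Q} \mathrm{IDF}(t) \cdot \frac{f(t, D)\,(k_1 + 1)}{f(t, D) + k_1 \left(1 - b + b \cdot \frac{|D|}{\mathrm{avgdl}}\right)},
\qquad
\mathrm{IDF}(t) = \ln\!\left(1 + \frac{N - n_t + 0.5}{n_t + 0.5}\right)

Here f(t, D) is the frequency of term t in document D, |D| is the document length, avgdl is the average document length in the collection, N is the number of documents, n_t is the number of documents containing t, and k_1 and b are free parameters (commonly k_1 ≈ 1.2 and b ≈ 0.75).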
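
The abstract also mentions using multiple translation variants for query expansion. The following minimal sketch (Python, not the authors' code) illustrates that idea under simple assumptions: the n-best translation variants of a query are merged into one bag-of-words query and documents are ranked with a plain BM25 implementation. The toy corpus, the translation variants, and the hypothetical German source query are illustrative only.

import math
from collections import Counter

# Toy English document collection standing in for the retrieval corpus.
DOCS = {
    "d1": "chest pain and shortness of breath in elderly patients",
    "d2": "treatment of thoracic pain after physical exercise",
    "d3": "guidelines for management of chronic lower back pain",
}

K1, B = 1.2, 0.75  # conventional BM25 defaults, not tuned values from the paper

tokenized = {d: text.split() for d, text in DOCS.items()}
avgdl = sum(len(toks) for toks in tokenized.values()) / len(tokenized)
df = Counter(t for toks in tokenized.values() for t in set(toks))
N = len(tokenized)

def idf(term):
    # BM25 inverse document frequency (the "+1" variant)
    n = df.get(term, 0)
    return math.log(1 + (N - n + 0.5) / (n + 0.5))

def bm25(query_terms, doc_id):
    # Sum the BM25 contribution of every query term that occurs in the document.
    toks = tokenized[doc_id]
    tf = Counter(toks)
    score = 0.0
    for t in query_terms:
        f = tf.get(t, 0)
        if f == 0:
            continue
        score += idf(t) * f * (K1 + 1) / (f + K1 * (1 - B + B * len(toks) / avgdl))
    return score

# Hypothetical n-best MT outputs for a German query ("Schmerzen in der Brust").
# Merging them expands the query with synonymous terms ("chest" vs. "thoracic").
translation_variants = ["pain in the chest", "chest pain", "thoracic pain"]
expanded_query = sorted({t for v in translation_variants for t in v.split()})

for doc_id in DOCS:
    print(doc_id, round(bm25(expanded_query, doc_id), 3))

The design point is that merging the variants lets the retrieval model reward documents using any of the synonymous surface forms ("chest" as well as "thoracic"), rather than only the form chosen by the single best translation.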
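
The conclusions single out intelligent training data selection as particularly successful for domain adaptation, but the abstract does not detail the method. One widely used scheme for this kind of selection is cross-entropy difference (Moore-Lewis style) scoring, sketched below with deliberately simple add-one-smoothed unigram language models; whether this matches the paper's exact selection method is an assumption, and all sentences shown are illustrative only.

import math
from collections import Counter

def unigram_lm(sentences):
    # Return an add-one-smoothed unigram log-probability function.
    counts = Counter(w for s in sentences for w in s.split())
    total = sum(counts.values())
    vocab = len(counts) + 1
    def logprob(sentence):
        return sum(math.log((counts.get(w, 0) + 1) / (total + vocab))
                   for w in sentence.split())
    return logprob

# Small illustrative samples of in-domain (medical) and general-domain text.
in_domain = ["chest pain treatment", "symptoms of chronic back pain"]
general = ["the weather is nice today", "stock prices fell sharply", "pain relief advice"]

# Pool of candidate training sentences to be filtered for MT training.
pool = [
    "pain management in elderly patients",
    "football results from last night",
    "causes of chest pain",
]

lp_in, lp_gen = unigram_lm(in_domain), unigram_lm(general)

def score(sentence):
    # Per-word log-probability difference; higher means the sentence looks
    # more like the in-domain sample than like general-domain text.
    n = max(len(sentence.split()), 1)
    return (lp_in(sentence) - lp_gen(sentence)) / n

# Rank the pool; in practice only the top-scoring fraction would be kept.
for s in sorted(pool, key=score, reverse=True):
    print(round(score(s), 3), s)

In a real pipeline, the top-ranked portion of the pool would typically be added to (or retained as) the MT training data, and the rest discarded.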
Metadata
Item Type: Article (Published)
Refereed: Yes
Uncontrolled Keywords: Medical information retrieval
Subjects: Computer Science > Machine translating
Computer Science > Information retrieval
DCU Faculties and Centres: Research Initiatives and Centres > Centre for Next Generation Localisation (CNGL)
DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Publisher: Elsevier
Official URL: http://dx.doi.org/10.1016/j.artmed.2014.01.004
Copyright Information: © 2014 Elsevier
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-Share Alike 3.0 License.
ID Code: 20117
Deposited On: 17 Sep 2014 10:22 by Liadh Kelly. Last Modified 25 Oct 2018 09:31
Documents

Full text available as:

PDF (aiim2013_(2).pdf), 236kB - requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
