Kelly, Liadh ORCID: 0000-0003-1131-5238, Goeuriot, Lorraine ORCID: 0000-0001-7491-1980, Suominen, Hanna, Schreck, Tobias, Leroy, Gondy, Mowery, Danielle L., Velupillai, Sumithra, Chapman, Wendy, Martinez, David, Zuccon, Guido and Palotti, Joao (2014) Overview of the ShARe/CLEF eHealth evaluation lab 2014. Lecture Notes in Computer Science, 8685 . pp. 172-191. ISSN 0302-9743
Abstract
This paper reports on the 2nd ShARe/CLEF eHealth evaluation lab, which continues our evaluation resource building activities for the medical domain. In this lab we focus on patients' information needs, as opposed to the more common campaign focus on the specialised information needs of physicians and other healthcare workers. The usage scenario of the lab is to ease patients' and next-of-kins' understanding of eHealth information, in particular clinical reports. The 1st ShARe/CLEF eHealth evaluation lab, held in 2013, consisted
of three tasks. Task 1 focused on named entity recognition and normalization of disorders; Task 2 on normalization of acronyms/abbreviations; and Task 3 on information retrieval to address questions patients may
have when reading clinical reports. This year's lab introduces a new challenge in Task 1 on visual-interactive search and exploration of eHealth
data. Its aim is to help patients (or their next-of-kin) in readability issues related to their hospital discharge documents and related information search on the Internet. Task 2 then continues the information extraction
work of the 2013 lab, specifically focusing on disorder attribute identification and normalization from clinical text. Finally, this year's Task 3 further extends the 2013 information retrieval task, by cleaning the 2013
document collection and introducing a new query generation method and multilingual queries. De-identified clinical reports used by the three tasks were from US intensive care and originated from the MIMIC II database.
Other text documents for Tasks 1 and 3 were from the Internet and originated from the Khresmoi project. Task 2 annotations originated from the ShARe annotations. For Tasks 1 and 3, new annotations, queries, and relevance assessments were created. 50, 79, and 91 people registered their interest in Tasks 1, 2, and 3, respectively. 24 unique teams participated with 1, 10, and 14 teams in Tasks 1, 2 and 3, respectively. The teams
were from Africa, Asia, Canada, Europe, and North America. The Task 1 submission, reviewed by 5 expert peers, related to the task evaluation category of Effective use of interaction and targeted the needs of both expert and novice users. The best system had an Accuracy of 0.868 in Task 2a, an F1-score of 0.576 in Task 2b, and Precision at 10 (P@10) of 0.756 in Task 3. The results demonstrate the substantial community interest and capabilities of these systems in making clinical reports easier to understand for patients. The organisers have made data and tools available for future research and development.
Metadata
| Field | Value |
|---|---|
| Item Type | Article (Published) |
| Refereed | Yes |
| Uncontrolled Keywords | Information Extraction; Evaluation; Medical Informatics |
| Subjects | Computer Science > Interactive computer systems; Computer Science > Visualization; Computer Science > Information retrieval |
| DCU Faculties and Centres | Research Initiatives and Centres > Centre for Next Generation Localisation (CNGL); DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing |
| Publisher | Springer |
| Official URL | http://dx.doi.org/10.1007/978-3-319-11382-1_17 |
| Copyright Information | © 2014 Springer. The original publication is available at www.springerlink.com |
| Use License | This item is licensed under a Creative Commons Attribution-NonCommercial-Share Alike 3.0 License. |
| ID Code | 20109 |
| Deposited On | 12 Nov 2014 11:51 by Liadh Kelly. Last Modified 25 Oct 2018 14:33 |
Documents
Full text available as: PDF (268kB)