Costello, Eamon ORCID: 0000-0002-2775-6006, Holland, Jane and Kirwan, Colette (2018) Evaluation of MCQs from MOOCs for common item writing faws. BMC Research Notes, 11 (849). pp. 1-3. ISSN 1756-0500
Abstract
Objective: There is a dearth of research into the quality of assessments based on Multiple Choice Question (MCQ)
items in Massive Open Online Courses (MOOCs). This dataset was generated to determine whether MCQ item writing
flaws existed in a selection of MOOC assessments, and to evaluate their prevalence if so. Hence, researchers reviewed
MCQs from a sample of MOOCs, using an evaluation protocol derived from the medical health education literature,
which has an extensive evidence-base with regard to writing quality MCQ items.
Data description: This dataset was collated from MCQ items in 18 MOOCs in the areas of medical health education,
life sciences and computer science. Two researchers critically reviewed 204 questions using an evidence-based evaluation
protocol. In the data presented, 50% of the MCQs (112) have one or more item writing flaw, while 28% of MCQs
(57) contain two or more flaws. Thus, a majority of the MCQs in the dataset violate item-writing guidelines, which mirrors
findings of previous research that examined rates of flaws in MCQs in traditional formal educational contexts.
Metadata
Item Type: | Article (Published) |
---|---|
Refereed: | Yes |
Uncontrolled Keywords: | MOOCs; MCQs; Quality; Item writing; Tests; Pedagogy |
Subjects: | Social Sciences > Distance education Social Sciences > Education Social Sciences > Educational technology |
DCU Faculties and Centres: | DCU Faculties and Schools > NIDL (National Institute for Digital Learning) |
Publisher: | Springer Nature |
Official URL: | https://doi.org/10.1186/s13104-018-3959-4 |
Copyright Information: | © 2018 The Authors |
Use License: | This item is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. |
ID Code: | 22834 |
Deposited On: | 07 Dec 2018 16:15 by Thomas Murtagh. Last Modified 28 Jan 2020 13:59 |
Documents
Full text available as:
PDF (750kB)