Sudan Journal of Medical Sciences

ISSN: 1858-5051

High-impact research on the latest developments in medicine and healthcare across MENA and Africa

Item Analysis of Multiple-choice Questions (MCQs): Assessment Tool for Quality Assurance Measures

Published date: Sep 30 2021

Journal Title: Sudan Journal of Medical Sciences

Issue title: Sudan JMS: Volume 16 (2021), Issue No. 3

Pages: 334–346

DOI: 10.18502/sjms.v16i3.9695

Authors:

Amani H. Elgadal - amanielgaddal@karary.edu.sd

Abdalbasit A. Mariod

Abstract:

Background: Integration of assessment with education is vital and ought to be performed regularly to enhance learning. There are many assessment methods, such as Multiple-choice Questions (MCQs), the Objective Structured Clinical Examination, and the Objective Structured Practical Examination; the appropriate method is selected on the basis of the curriculum blueprint and the target competencies. Although MCQs can test students’ higher cognition, critical appraisal, problem-solving, and data interpretation, and can cover curricular content in a short time, their analysis has constraints. The authors aim to highlight some consequential points about psychometric analysis, displaying its roles, assessing its validity and reliability in discriminating between examinees’ performances, and offering guidance to faculty members constructing their examination question banks.

Methods: Databases such as Google Scholar and PubMed were searched for freely accessible English-language articles published since 2010, using keywords and their synonyms. Abstracts were first screened to select suitable matches; the selected articles were then read in full and summarized. Finally, the relevant data were recapitulated to the best of the authors’ knowledge.

Results: The retrieved articles showed the capacity of MCQ item analysis to assess questions’ validity and reliability, discriminate between examinees’ performances, and identify technical flaws to correct during question bank construction.
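One statistic behind the reliability claim is the Kuder-Richardson 20 (KR-20) coefficient for dichotomously scored items (see reference [17]). Below is a minimal sketch in Python, assuming a 0/1 response matrix with examinees as rows and items as columns; the function name and the sample-variance choice are illustrative conventions, not specifics from the paper:

```python
import numpy as np

def kr20(scores: np.ndarray) -> float:
    """Kuder-Richardson 20: k/(k-1) * (1 - sum(p*q) / variance of total scores)."""
    k = scores.shape[1]                          # number of items
    p = scores.mean(axis=0)                      # proportion answering each item correctly
    q = 1.0 - p                                  # proportion answering incorrectly
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinees' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Example: 5 examinees x 4 items (1 = correct, 0 = incorrect)
scores = np.array([[1, 1, 1, 0],
                   [1, 0, 1, 0],
                   [1, 1, 1, 1],
                   [0, 0, 1, 0],
                   [1, 1, 0, 1]])
print(f"KR-20 = {kr20(scores):.2f}")
```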

Conclusion: Item analysis is a statistical tool used to assess students’ performance on a test, identify underperforming items, and determine the root causes of their underperformance so that the items can be improved, ensuring an effective and accurate judgment of students’ competency.
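The sketch below illustrates two of the statistics named in the keywords: the difficulty index P (the proportion of examinees answering an item correctly) and the discrimination index D = (H − L)/n, which contrasts the top and bottom score groups (see references [1] and [3]). The 27% group split and the function names are conventional assumptions, not prescriptions from this paper:

```python
import numpy as np

def difficulty_index(scores: np.ndarray) -> np.ndarray:
    """P: proportion of examinees answering each item correctly."""
    return scores.mean(axis=0)

def discrimination_index(scores: np.ndarray, group_frac: float = 0.27) -> np.ndarray:
    """D = (H - L) / n, comparing the top and bottom 27% of examinees by total score."""
    n_group = max(1, int(round(group_frac * scores.shape[0])))
    order = np.argsort(scores.sum(axis=1))       # examinees ranked by total score
    low = scores[order[:n_group]]                # bottom group
    high = scores[order[-n_group:]]              # top group
    return (high.sum(axis=0) - low.sum(axis=0)) / n_group

# Example: 6 examinees x 3 items (1 = correct, 0 = incorrect)
scores = np.array([[1, 1, 0],
                   [1, 0, 0],
                   [1, 1, 1],
                   [0, 0, 0],
                   [1, 1, 1],
                   [1, 0, 1]])
print(difficulty_index(scores))       # e.g. item 1 answered correctly by ~83%
print(discrimination_index(scores))
```

Rules of thumb in the cited literature treat items with P between roughly 30% and 70% and D above 0.2 as acceptable; items falling outside these ranges are the ones inspected for technical flaws.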

Keywords: assessment, difficulty index, discrimination index, distractors, MCQ item analysis

References:

[1] Hingorjo, M. R. and Jaleel, F. (2012). Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. Journal of Pakistan Medical Association, vol. 62, no. 2, pp. 142–147.

[2] Vanderbilt, A. A., Feldman, M., and Wood, I. K. (2013). Assessment in undergraduate medical education: a review of course exams. Medical Education Online, vol. 18, no. 1, pp. 1–5.

[3] Gajjar, S., Sharma, R., Kumar, P., et al. (2014). Item and test analysis to identify quality multiple choice questions (MCQs) from an assessment of medical students of Ahmedabad, Gujarat. Indian Journal of Community Medicine, vol. 39, no. 1, pp. 17–20.

[4] Abdulghani, H. M., Irshad, M., Haque, S., et al. (2017). Effectiveness of longitudinal faculty development programs on MCQs items writing skills: a follow-up study. PLoS One, vol. 12, no. 10, e0185895.

[5] McKinley, D. W. and Norcini, J. J. (2014). How to set standards on performance-based examinations: AMEE Guide No. 85. Medical Teacher, vol. 36, no. 2, pp. 97–110.

[6] Ben-David, M. F. (2000). AMEE Guide No. 18: standard setting in student assessment. Medical Teacher, vol. 22, no. 2, pp. 120–130.

[7] Testa, S., Toscano, A., and Rosato, R. (2018). Distractor efficiency in an item pool for a statistics classroom exam: assessing its relationship with item cognitive level classified according to Bloom’s taxonomy. Frontiers in Psychology, vol. 9, p. 1585.

[8] Kumar, D., Jaipurkar, R., Shekhar, A., et al. (2021). Item analysis of multiple choice questions: a quality assurance test for an assessment tool. Medical Journal Armed Forces India, vol. 77, no. 1, pp. S85–S89.

[9] Mubuuke, A. G., Mwesigwa, C., and Kiguli, S. (2017). Implementing the Angoff method of standard setting using postgraduate students: practical and affordable in resource-limited settings. African Journal of Health Professions Education, vol. 9, no. 4, p. 171.

[10] Yim, M. (2018). Comparison of results between modified-Angoff and bookmark methods for estimating cut score of the Korean medical licensing examination. Korean Journal of Medical Education, vol. 30, no. 4, pp. 347–357.

[11] Ömür, S. and Selvi, H. (2010). Angoff, Ebel ve Nedelsky yöntemleriyle belirlenen kesme puanlarının sınıflama tutarlılıklarının karşılaştırılması [Comparison of the classification consistencies of cut-off scores determined by the Angoff, Ebel, and Nedelsky methods]. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, vol. 1, no. 2, pp. 109–113.

[12] Park, J., Ahn, D.-S., Yim, M. K., et al. (2018). Comparison of standard-setting methods for the Korea Radiological Technologist Licensing Examination: Angoff, Ebel, Bookmark, and Hofstee. Journal of Educational Evaluation for Health Professions, vol. 15, p. 32.

[13] Petrillo, J., Cano, S. J., McLeod, L. D., et al. (2015). Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples. Value Health, vol. 18, no. 1, pp. 25–34.

[14] Ali, S. H., Carr, P. A., and Ruit, K. G. (2016). Validity and reliability of scores obtained on multiple-choice questions: why functioning distractors matter. Journal of the Scholarship of Teaching and Learning, vol. 16, no. 1, pp. 1–14.

[15] Abdalla, M. E. (2011). What does item analysis tell us? Factors affecting the reliability of multiple-choice questions (MCQs). Gezira Journal of Health Sciences, vol. 7, no. 2, pp. 17–25.

[16] Vegada, B. N., Karelia, B. N., Pillai, A., et al. (2014). Reliability of four-response type multiple choice questions of pharmacology summative tests of II. International Journal of Mathematics and Statistics Invention. Retrieved from: https://www.semanticscholar.org/paper/%22Reliability-of-four-response-type-multiple-choice-Vegada-Karelia/43a896bff1c7b16cee1a5c89643b443f0cd0bf9d#citing-papers

[17] Glen, S. (n.d.). Kuder-Richardson 20 (KR-20) & 21 (KR-21). Retrieved from: https://www.statisticshowto.com/kuder-richardson/

[18] Velou, M. S. and Ahila, E. (2020). Refine the multiple-choice questions tool with item analysis. IAIM, vol. 7, no. 8, pp. 80–85.

[19] Coughlin, P. A. and Featherstone, C. R. (2017). How to write a high quality multiple choice question (MCQ): a guide for clinicians. European Journal of Vascular and Endovascular Surgery, vol. 54, no. 5, pp. 654–658.

[20] Salih, K. E. M. A., Jibo, A., Ishaq, M., et al. (2020). Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia. International Journal of Family Medicine and Primary Care, vol. 9, no. 7, pp. 3663–3668.

[21] Harti, S., Mahapatra, A. K., Gupta, S. K., et al. (2021). All India AYUSH post graduate entrance exam (AIAPGET) 2019–AYURVEDA MCQ item analysis. Journal of Ayurveda and Integrative Medicine, vol. 12, no. 2, pp. 356–358.

[22] Namdeo, S. K. and Sahoo, S. (2016). Item analysis of multiple-choice questions from an assessment of medical students in Bhubaneswar, India. International Journal of Research in Medical Sciences, vol. 4, no. 5, pp. 1716–1719.

[23] Garg, R., Kumar, V., and Maria, J. (2019). Analysis of multiple-choice questions from a formative assessment of medical students of a medical college in Delhi, India. International Journal of Research in Medical Sciences, vol. 7, pp. 174–177.

[24] Tarrant, M. and Ware, J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments: item-writing flaws and student achievement. Medical Education, vol. 42, no. 2, pp. 198–206.

[25] Rao, C., Kishan Prasad, H. L., Sajitha, K., et al. (2016). Item analysis of multiple-choice questions: Assessing an assessment tool in medical students. International Journal of Educational and Psychological Researches, vol. 2, no. 4, pp. 201–204.

[26] Kheyami, D., Jaradat, A., Al-Shibani, T., et al. (2018). Item analysis of multiple choice questions at the Department of Paediatrics, Arabian Gulf University, Manama, Bahrain. Sultan Qaboos University Medical Journal, vol. 18, no. 1, p. 68.

[27] Vegada, B., Shukla, A., Khilnani, A., et al. (2016). Comparison between three-option, four option and five option multiple choice question tests for quality parameters: a randomized study. Indian Journal of Pharmacology, vol. 48, no. 5, pp. 571–575.

[28] Nwadinigwe, P. I. and Naibi, L. (2013). The number of options in a multiple-choice test item and the psychometric characteristics. Journal of Education and Practice, vol. 4, pp. 189–196.

[29] Tweed, M. (2019). Adding to the debate on the numbers of options for MCQs: the case for not being limited to MCQs with three, four or five options. BMC Medical Education, vol. 19, no. 1, p. 354.

[30] Pawluk, S. A., Shah, K., Minhas, R., et al. (2018). A psychometric analysis of a newly developed summative, multiple choice question assessment adapted from Canada to a Middle Eastern context. Currents in Pharmacy Teaching and Learning, vol. 10, no. 8, pp. 1026–1032.

[31] Gupta, P., Meena, P., Khan, A. M., et al. (2020). Effect of faculty training on quality of multiple-choice questions. International Journal of Applied and Basic Medical Research, vol. 10, pp. 210–214.

[32] Ali, R., Sultan, A. S., and Zahid, N. (2021). Evaluating the effectiveness of MCQ development workshop using cognitive model framework: a pre-post study. Journal of the Pakistan Medical Association, vol. 71, no. 1(A), pp. 119–121.

[33] Alamoudi, A. A., El-Deek, B. S., Park, Y. S., et al. (2017). Evaluating the long-term impact of faculty development programs on MCQ item analysis. Medical Teacher, vol. 39, no. 1, pp. S45–S49.

[34] Steinert, Y., Mann, K., Anderson, B., et al. (2016). A systematic review of faculty development initiatives designed to enhance teaching effectiveness: a 10-year update: BEME Guide No. 40. Medical Teacher, vol. 38, no. 8, pp. 769–786.

[35] Smeby, S. S., Lillebo, B., Gynnild, V., et al. (2019). Improving assessment quality in professional higher education: could external peer review of items be the answer? Cogent Medicine, vol. 6, no. 1, 1659746.

[36] AlKhatib, H. S., Brazeau, G., Akour, A., et al. (2020). Evaluation of the effect of items’ format and type on psychometric properties of sixth year pharmacy students’ clinical clerkship assessment items. BMC Medical Education, vol. 20, no. 1, p. 190.
