The relationship of correct option location, distractor efficiency, difficulty and discrimination indices in analysis of high-stakes multiple-choice questions exam of medical students

  • Madjid Shafiayan, Tehran University of Medical Sciences
  • Balal Izanloo, Kharazmi University
Keywords: Difficulty index, Discrimination index, Distractor efficiency, Correct option position, Multiple-choice questions, Medical education

Abstract

Background: Analysis of Multiple-Choice Questions (MCQs) is a psychometric procedure that bears on the validity and reliability of an exam. Objective: This study was conducted to identify the psychometric properties of a high-stakes MCQ exam used in the assessment of undergraduate medical students. With this in mind, we investigated the effect of correct option location on the difficulty index (DIF I) and discrimination index (DI) in relation to distractor efficiency (DE) in the context of medical education. Materials and Methods: A national high-stakes MCQ exam was administered to senior medical students from universities of medical sciences to assess knowledge of basic and clinical sciences. Data were analyzed within Classical Test Theory to investigate the effect of correct option position on DIF I, DI and DE. Microsoft Excel spreadsheets, SPSS version 23 and the R psych package were used. Descriptive statistics, point-biserial correlation, Fisher's exact test, one-way ANOVA and Pearson correlation were performed. Results: The mean score was 107.30 ± 19.10, ranging from 40 to 174. Mean DIF I and DI were 0.54 ± 0.20 and 0.20 ± 0.10, respectively. Forty-three and a half percent of the MCQs had an average DIF I (0.30 < P < 0.70) and a DI > 0.2. Overall, 127/600 (21.16%) distractors were non-functional (selected by < 5% of examinees), and DE was 78.84%. Mean DIF I (± SD) for key option positions 1, 2, 3 and 4 was 0.50 ± 0.20, 0.59 ± 0.18, 0.54 ± 0.23 and 0.50 ± 0.17, respectively. Conclusion: Our data suggest that the location of the correct option markedly affects an item's DIF I. We believe our study provides considerable insight into validating MCQs for medical student assessment and optimizing the question bank.
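The indices reported above follow standard Classical Test Theory item analysis. As a minimal sketch of how such indices can be computed, the following R code (R being one of the tools the study lists) assumes a hypothetical data frame `responses` with one row per examinee and one column per item holding the selected option (1-4), and a numeric vector `key` giving the correct option for each item; neither object comes from the study's actual data, and the code is not the authors' own script.

# Score each item: 1 if the chosen option matches the key, 0 otherwise
scored <- sweep(as.matrix(responses), 2, key, FUN = "==") * 1
total  <- rowSums(scored)

# Difficulty index (DIF I): proportion of examinees answering the item correctly
dif_i <- colMeans(scored)

# Discrimination index (DI): point-biserial correlation between the item score
# and the total score with that item removed (corrected item-total correlation)
di <- sapply(seq_len(ncol(scored)), function(j) cor(scored[, j], total - scored[, j]))

# Distractor efficiency (DE): share of an item's distractors chosen by at least
# 5% of examinees (distractors below 5% are counted as non-functional)
de <- sapply(seq_len(ncol(responses)), function(j) {
  freq <- table(factor(responses[[j]], levels = 1:4)) / nrow(responses)
  distractors <- freq[-key[j]]        # options other than the keyed answer
  mean(distractors >= 0.05) * 100     # percent functional distractors
})

item_stats <- data.frame(item = colnames(responses), dif_i, di, de)
c(mean_DIF_I = mean(dif_i), mean_DI = mean(di), overall_DE = mean(de))

Under these assumptions, averaging the per-item values reproduces the kind of summary figures reported in the Results (mean DIF I, mean DI, and overall DE as the percentage of functional distractors).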


Author Biographies

Madjid Shafiayan, Tehran University of Medical Sciences

Ph.D. Candidate of Medical Education, Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran.

Balal Izanloo, Kharazmi University
Assistant Professor of Curriculum Planning, Department of Curriculum Planning, Faculty of Psychology and Education, Kharazmi University, Tehran, Iran.


Published
2019-12-11
How to Cite
Shafiayan, M., & Izanloo, B. (2019). The relationship of correct option location, distractor efficiency, difficulty and discrimination indices in analysis of high-stakes multiple-choice questions exam of medical students. Revista De La Universidad Del Zulia, 10(27), 132-151. Retrieved from https://produccioncientificaluz.org./index.php/rluz/article/view/30011