Volume : 2, Issue : 2, OCT 2018


A.M. Mamadu, Professor S.O. Emaikwu, O. Adikwu


This study set out to determine the optimal number of distractors for multiple-choice test items. A Common Person Equating design with counterbalancing was employed to compare distractor performance indices across three Mathematics test formats (i, ii and iii) that differed only in the number of distractors per item. Format ii was constructed by deleting one distractor from each item in format i, and format iii by deleting one distractor from each item in format ii. Formats i, ii and iii, each containing 20 multiple-choice Mathematics items with four, three and two distractors per item respectively, were administered to 169 Senior Secondary II (SSII) students: 120 from Golden College Yagba and 49 from Government Secondary School Makurdi, both in Makurdi, Benue State, Nigeria. The Nedelsky Model was applied to compare distractor plausibility across the three formats. Distractor plausibility was found to increase significantly with the number of distractors, with four-distractor items yielding the most plausible distractors. It is therefore recommended that item writers include at least four distractors per multiple-choice test item.
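As an illustration of the kind of distractor analysis the study describes (a simplified sketch, not the Nedelsky procedure itself), a common index of distractor plausibility is the proportion of examinees who select each incorrect option; distractors chosen by fewer than about 5% of examinees are often treated as non-functional. The data, option labels, and threshold below are hypothetical.

```python
from collections import Counter

def distractor_plausibility(responses, key, options):
    """Proportion of examinees selecting each incorrect option (distractor)."""
    counts = Counter(responses)
    n = len(responses)
    return {opt: counts.get(opt, 0) / n for opt in options if opt != key}

def functional_distractors(plausibility, threshold=0.05):
    """Distractors selected by at least `threshold` of examinees."""
    return [opt for opt, p in plausibility.items() if p >= threshold]

# Hypothetical responses of 20 examinees to one four-distractor item (key = 'C')
responses = list("CCBCACDCCCCBCACCBCDC")
p = distractor_plausibility(responses, key="C", options="ABCDE")
print(p)                          # share of examinees choosing each distractor
print(functional_distractors(p))  # distractors at or above the 5% threshold
```

Under this index, deleting a distractor (as in formats ii and iii) redistributes its selection share across the remaining options, which is why plausibility comparisons across the three formats require an equating design such as the one used in the study.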



