Abstract

Difficult multiple-choice (MC) questions can be made easy by providing a set of answer options of which most are obviously wrong. In the education literature, a plethora of instructional guides exist for crafting a suitable set of wrong choices (distractors) that enable the assessment of the students' understanding. The art of MC question design thus hinges on the question-maker's experience and knowledge of the potential misconceptions. In contrast, we advocate a data-driven approach, where correct and incorrect options are assembled directly from the students' own past submissions. Large-scale online classroom settings, such as massive open online courses (MOOCs), provide an opportunity to design optimal and adaptive multiple-choice questions that are maximally informative about the students' level of understanding of the material. In this work, we (i) develop a multinomial-logit discrete choice model for the setting of MC testing, (ii) derive an optimization objective for selecting optimally discriminative option sets, (iii) propose an algorithm for finding a globally optimal solution, and (iv) demonstrate the effectiveness of our approach via synthetic experiments and a user study. We finally showcase an application of our approach to crowd-sourcing tests from technical online forums.
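As background (this formula is not given in the abstract; the notation below is assumed for illustration): a multinomial-logit discrete choice model assigns each presented option a probability via a softmax over latent utilities, so a standard form in the MC-testing setting would be

$$
P\bigl(\text{student } s \text{ selects option } i \mid \mathcal{O}\bigr) \;=\; \frac{\exp(u_{si})}{\sum_{j \in \mathcal{O}} \exp(u_{sj})},
$$

where $\mathcal{O}$ is the presented option set and $u_{si}$ denotes the (latent) utility of option $i$ for student $s$. Under such a model, choosing the option set $\mathcal{O}$ amounts to shaping these probabilities so that the observed choice is as informative as possible about the student's understanding.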



