Modeling Examinee Heterogeneity in Discrete Option Multiple Choice Items
A new format for computer-based administration of multiple-choice items, the discrete option multiple choice (DOMC) item, is receiving growing attention owing to its potential advantages for item security and the control of testwiseness. A unique feature of the DOMC format is that an examinee can respond incorrectly to an item for different reasons: either failure to select a correct response, or selection of a distractor response. This feature motivates consideration of a new item response model that introduces an individual differences trait reflecting a general proclivity to endorse response options. Using empirical data from an actual DOMC test, we validate the model by demonstrating the statistical presence of such a trait, and we discuss its implications for test equity in DOMC tests and the potential value of added item administration constraints.
Keywords: Item response model; Discrete option multiple-choice items; Computer-based testing
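The abstract's key observation is that a DOMC item can be answered incorrectly in two distinct ways: by rejecting the key when it is presented, or by endorsing a distractor before the key appears. A minimal simulation sketch of that response process is given below. The logistic parameterization, the symbols `theta` (ability), `gamma` (endorsement proclivity), and `b` (option difficulty), and the stopping rule are illustrative assumptions, not the model estimated in the paper.

```python
import math
import random

def p_endorse(theta, gamma, b, is_key):
    # Hypothetical logistic endorsement probability (an assumption for
    # illustration): keys are endorsed more readily as ability theta
    # rises, while distractors are endorsed more readily as the general
    # endorsement proclivity gamma rises.
    if is_key:
        return 1.0 / (1.0 + math.exp(-(theta + gamma - b)))
    return 1.0 / (1.0 + math.exp(-(gamma - theta - b)))

def administer_domc(theta, gamma, options, rng):
    """Present options one at a time, as in a DOMC administration.

    options: list of (is_key, b) pairs in presentation order.
    Returns 1 if the key is endorsed; returns 0 either when a distractor
    is endorsed (one failure mode) or when the key is presented and
    rejected (the other failure mode).
    """
    for is_key, b in options:
        if rng.random() < p_endorse(theta, gamma, b, is_key):
            return 1 if is_key else 0  # endorsed a distractor -> wrong
    return 0  # never endorsed the key -> wrong for a different reason

# Usage: one simulated examinee on a three-option item (key second).
rng = random.Random(1)
item = [(False, 0.0), (True, 0.0), (False, 0.5)]
score = administer_domc(theta=0.5, gamma=0.0, options=item, rng=rng)
```

The sketch makes the trait's equity implication concrete: two examinees with identical `theta` but different `gamma` face different incorrect-response probabilities, since a higher proclivity to endorse raises the chance of selecting an early distractor.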