Adopting the Multi-process Approach to Detect Differential Item Functioning in Likert Scales

  • Kuan-Yu Jin
  • Yi-Jhen Wu
  • Hui-Fang Chen
Conference paper
Part of the Springer Proceedings in Mathematics & Statistics book series (PROMS, volume 265)


This study compared the performance of the logistic regression (LR) and odds ratio (OR) approaches to differential item functioning (DIF) detection when the three response processes of an IRTree model are considered for a five-point response scale. Three sets of binary pseudo items (BPI) were generated, indicating, respectively, an intention to endorse the midpoint response, a positive or negative attitude toward an item, and a tendency to use extreme categories. Missing values inevitably appeared in the last two sets of BPI. We manipulated the DIF patterns, the percentage of DIF items, and the use of a purification procedure (with/without). The results suggested that (1) both the LR and OR methods performed well in detecting DIF when the BPI contained no missing values; (2) the OR method generally outperformed the LR method when the BPI contained missing values; and (3) the OR method performed fairly well without purification, whereas purification improved the performance of the LR approach, especially when the number of DIF items was large.
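The decomposition described above can be sketched in code. The following is a minimal illustration, assuming the common three-process IRTree mapping for a five-point scale (midpoint endorsement, then direction, then extremity); the function name `recode_to_bpi` and the exact node coding are illustrative assumptions, not the study's implementation. Note how the direction and extremity pseudo items are structurally missing whenever the midpoint is chosen:

```python
def recode_to_bpi(response):
    """Decompose a five-point Likert response (1-5) into three binary
    pseudo items (BPI): midpoint endorsement, direction (positive vs.
    negative), and extremity.  None marks a structurally missing value."""
    if response not in {1, 2, 3, 4, 5}:
        raise ValueError("response must be an integer from 1 to 5")
    midpoint = 1 if response == 3 else 0
    # Direction and extremity are undefined (missing) for midpoint responses,
    # which is why the last two BPI sets inevitably contain missing values.
    direction = None if response == 3 else int(response > 3)
    extreme = None if response == 3 else int(response in {1, 5})
    return (midpoint, direction, extreme)

# Each raw category maps to a unique BPI pattern:
for r in range(1, 6):
    print(r, recode_to_bpi(r))
```

For example, a response of 5 yields `(0, 1, 1)` (not midpoint, positive, extreme), while a response of 3 yields `(1, None, None)`. DIF detection via LR or OR would then be run separately on each set of pseudo items.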


IRTree · Differential item functioning · Logistic regression · Odds ratio · Purification · Missing data


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Education, The University of Hong Kong, Pokfulam, Hong Kong, China
  2. Bamberg Graduate School of Social Sciences (BAGSS), Otto-Friedrich-Universität Bamberg, Bamberg, Germany
  3. Department of Social and Behavioural Sciences, City University of Hong Kong, Kowloon, Hong Kong, China