Appendices
Appendix 1: Technical Details for EM
The conditional distribution of the latent trait η given the response pattern u<sub>i</sub> and the current parameter estimates θ<sup>(t)</sup> is, by Bayes' theorem, \(f(\eta \mid u_{i},\theta^{(t)}) = \frac{f(u_{i}\mid \eta,\theta^{(t)})\,\phi(\eta \mid \theta^{(t)})}{f(u_{i})}\).
Then the conditional expectations involved in the Q function can be expressed as follows:
and
Then we have
where \(r_{j}(\theta^{(t)}) = \sum_{i=1}^{N} u_{ij}\,\frac{f(u_{i}\mid \eta,\theta^{(t)})}{f(u_{i})}\) and \(n(\theta^{(t)}) = \sum_{i=1}^{N} \frac{f(u_{i}\mid \eta,\theta^{(t)})}{f(u_{i})}\).
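The expected counts \(r_{j}\) and \(n\) above are computed by weighting each response pattern with its posterior density over the latent trait. A minimal sketch of this E-step for a unidimensional probit model follows; the function name `e_step_counts` and the 2PL-style parameterization (slopes `a`, intercepts `c`) are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import norm

def e_step_counts(U, a, c, nodes, weights):
    """Illustrative E-step for a probit 2PL-type model (hypothetical helper).

    U       : (N, J) 0/1 response matrix
    a, c    : (J,) item slopes and intercepts at the current estimate theta^(t)
    nodes   : (G,) quadrature points approximating the latent density
    weights : (G,) quadrature weights (summing to 1)
    Returns r (J, G) expected correct counts and n (G,) expected counts.
    """
    # P[j, g] = probability of a correct response to item j at node g
    z = a[:, None] * nodes[None, :] + c[:, None]
    P = norm.cdf(z)                                  # probit link
    # log f(u_i | x_g) for each person i at each node g
    logL = U @ np.log(P) + (1 - U) @ np.log(1 - P)   # (N, G)
    # posterior node weights: f(u_i | x_g) w_g / f(u_i),
    # where f(u_i) = sum_g f(u_i | x_g) w_g
    post = np.exp(logL) * weights[None, :]
    post /= post.sum(axis=1, keepdims=True)          # (N, G)
    n = post.sum(axis=0)                             # expected counts n at each node
    r = U.T @ post                                   # expected correct counts r_j
    return r, n
```

By construction the expected counts at each node sum to the sample size, which gives a quick sanity check on an implementation.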
The integrals in the equations above can be approximated by Gauss–Hermite (G–H) quadrature as follows. Note that the quadrature points \(x_{g}\) and weights \(w_{g}\) correspond to \(\phi(\eta \mid \theta^{(t)})\), the density function of \(N(0,\Phi^{(t)})\).
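The standard G–H rule targets the weight function \(e^{-t^{2}}\), so the nodes must be rescaled to match a normal density. A short sketch of this rescaling for \(N(0,\Phi)\) with a scalar variance (the function name `gh_expectation` is an illustrative assumption):

```python
import numpy as np

def gh_expectation(h, var, G=31):
    """Approximate E[h(eta)] for eta ~ N(0, var) by G-point Gauss-Hermite.

    Physicists' nodes t_g and weights w_g satisfy
    int exp(-t^2) f(t) dt ~= sum_g w_g f(t_g); substituting
    eta = sqrt(2 * var) * t absorbs the normal density into the rule.
    """
    t, w = np.polynomial.hermite.hermgauss(G)
    x = np.sqrt(2.0 * var) * t            # rescaled nodes for N(0, var)
    return (w / np.sqrt(np.pi)) @ h(x)    # raw weights sum to sqrt(pi)
```

Because the rule is exact for polynomials of low degree, `gh_expectation(lambda x: x**2, var)` recovers the variance exactly, which is a convenient correctness check.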
We take the derivatives of \(Q_{1j}\) with respect to the model parameters:
and
In the above equations, we have
and
Appendix 2: Technical Details for the Quasi-Newton Algorithm
For our objective function, \(\log \tilde{L}(\theta)\), the first derivative with respect to \(\theta_{j}\), the parameter vector for the jth item, is
where
For the probit link
Copyright information
© 2013 Springer Science+Business Media New York
Cite this paper
An, X., Yung, YF. (2013). Notes on the Estimation of Item Response Theory Models. In: Millsap, R.E., van der Ark, L.A., Bolt, D.M., Woods, C.M. (eds) New Developments in Quantitative Psychology. Springer Proceedings in Mathematics & Statistics, vol 66. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-9348-8_19
Print ISBN: 978-1-4614-9347-1
Online ISBN: 978-1-4614-9348-8