Psychometric Properties of Three New National Survey of Student Engagement-Based Engagement Scales: An Item Response Theory Analysis
We sought to develop and psychometrically describe three new student engagement scales measuring college students’ engagement with faculty (student-faculty engagement: SFE), with community-based activities (CBA), and with transformational learning opportunities (TLO), using items selected from the National Survey of Student Engagement (NSSE), a widely used, standardized student engagement survey. We applied confirmatory factor analysis for ordered-categorical measures and item response theory (IRT) to the NSSE responses of 941 US college students. Findings indicated acceptable construct validity: the scales measured related but separable areas of engagement. IRT analyses showed that SFE scores offered the most precise measurement in the middle range of student-faculty engagement, that the CBA scale most reliably measured above-average engagement, and that TLO scores described engagement with relatively uniform precision across the full range. These findings support the scales’ utility in institutional efforts to describe “local” student engagement as well as in cross-institutional comparisons.
Keywords: Student engagement · Item response theory · Test reliability · Measurement · Transformational learning · Community learning · Student-faculty interaction
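For readers unfamiliar with the IRT machinery invoked in the abstract, the following is a minimal sketch of Samejima’s (1969) graded response model, a standard IRT model for ordered-categorical items such as the NSSE’s response scales. The parameterization shown is illustrative and is not a statement of the authors’ exact specification.

```latex
% Graded response model (Samejima, 1969), sketched for illustration.
% For item i with discrimination a_i and ordered thresholds
% b_{i1} < b_{i2} < ..., the probability of a respondent with
% engagement level \theta answering in category k or higher is
P^{*}_{ik}(\theta) = \frac{1}{1 + \exp\{-a_i(\theta - b_{ik})\}},
\qquad
P_{ik}(\theta) = P^{*}_{ik}(\theta) - P^{*}_{i,k+1}(\theta),
% with the boundary conventions P*_{i1}(\theta) = 1 for the lowest
% category and P*_{i,m_i+1}(\theta) = 0 above the highest.
% Item information functions sum to the test information function,
% whose inverse square root is the conditional standard error of
% the engagement estimate:
I(\theta) = \sum_{i} I_i(\theta),
\qquad
\mathrm{SE}(\hat{\theta}) = \frac{1}{\sqrt{I(\theta)}}.
```

Under this framework, claims about where each scale measures most precisely correspond to where its test information function I(θ) peaks: near the middle of the θ range for SFE, above the mean for CBA, and relatively evenly across the range for TLO.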
We would like to thank the students who participated in our university’s administration of the National Survey of Student Engagement. We would also like to thank the reviewers whose comments improved the original manuscript. Finally, Adam thanks Tara J. Carle and Margaret Carle, whose unending support and thoughtful comments make his work possible.
- Astin, A. W. (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
- Astin, A. W., & Sax, L. J. (1998). How undergraduates are affected by service participation. Journal of College Student Development, 39(3), 251–263.
- Astin, A. W., Vogelgesang, L. J., Ikeda, E. K., & Yee, J. A. (2000). How service learning affects students. Los Angeles: Higher Education Research Institute, University of California.
- Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B, 57, 289–300.
- Bjorner, J. B., Smith, K. J., Stone, C., & Sun, X. (2007). IRTFIT: A macro for item fit and local dependence tests under IRT models. QualityMetric Incorporated and School of Education, University of Pittsburgh.
- Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39, 3–7.
- Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates.
- Gonyea, R. M., Kinzie, J., Kuh, G. D., & Nelson Laird, T. F. (2008). High impact activities: What they are, why they work, and who benefits. Program presented at the Association of American Colleges and Universities annual meeting, Washington, DC.
- Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Boston, MA: Kluwer.
- Hopkins, K. D. (1997). Educational and psychological measurement and evaluation. New York: Allyn & Bacon.
- Kinzie, J., & Evenbeck, S. (2008). Assessing student engagement in high-impact practices. Program presented at the NASPA IARC Conference, Scottsdale, AZ.
- Kuh, G. D. (2001). The National Survey of Student Engagement: Conceptual framework and overview of psychometric properties. Bloomington, IN: Indiana University, Center for Postsecondary Research.
- Kuh, G. D. (2003). What we’re learning about student engagement from NSSE. Change, 35, 24–32.
- Kuh, G. D. (2005). 7 steps for taking student learning seriously. Trusteeship, 13, 20–24.
- Kuh, G. D., Hayek, J. C., Carini, R. M., Ouimet, J. A., Gonyea, R. M., & Kennedy, J. (2001). NSSE technical and norms report. Bloomington, IN: Indiana University Center for Postsecondary Research and Planning.
- Kuh, G. D., Kinzie, J., Schuh, J. H., Whitt, E. J., & Associates. (2005). Student success in college: Creating conditions that matter. San Francisco: Jossey-Bass.
- McInnis, E. D. (2006). Nonresponse bias in student assessment surveys: A comparison of respondents and non-respondents of the National Survey of Student Engagement at an independent comprehensive Catholic university. Unpublished doctoral dissertation, Marywood University, USA.
- Muthén, L. K., & Muthén, B. O. (1998–2007). Mplus user’s guide (4th ed.). Los Angeles, CA: Muthén & Muthén.
- National Survey of Student Engagement (NSSE). (2006). NSSE 2006 codebook. Retrieved November 22, 2009, from http://www.nsse.iub.edu/pdf/2006_Institutional_Report/nsse_codebooks/NSSE%202006%20Codebook.pdf.
- National Survey of Student Engagement (NSSE). (2008). Origins. Retrieved November 22, 2009, from http://nsse.iub.edu/html/origins.cfm.
- Pace, C. R. (1984). Measuring the quality of college student experiences. Los Angeles: Center for the Study of Evaluation, University of California, Los Angeles.
- Pascarella, E. T., & Terenzini, P. T. (1991). How college affects students. San Francisco: Jossey-Bass.
- Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach. Hillsdale, NJ: Lawrence Erlbaum Associates.
- Samejima, F. (1969). Estimation of latent ability using a response pattern of graded scores. Psychometrika Monograph Supplement, 34(4, Pt. 2), 100.
- Thissen, D., Chen, W.-H., & Bock, R. D. (2002). MULTILOG 7 [Computer software]. Chicago, IL: Scientific Software International.
- Thissen, D., Steinberg, L., & Wainer, H. (1993). Detection of differential item functioning using the parameters of item response models. In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 67–113). Hillsdale, NJ: Lawrence Erlbaum Associates.
- Thorndike, R. M. (2004). Measurement and evaluation in psychology and education (7th ed.). Columbus, OH: Merrill/Prentice Hall.