Improving SET Response Rates: Synchronous Online Administration as a Tool to Improve Evaluation Quality

  • Trey Standish
  • Jeff A. Joines
  • Karen R. Young
  • Victoria J. Gallagher

Abstract

Institutions of higher education continue to migrate student evaluations of teaching (SET) from traditional in-class paper forms to online administration. Online SETs would compare favorably to paper-and-pencil evaluations were it not for widely reported decreases in response rates, which raise validity concerns stemming from possible nonresponse bias. To combat low response rates, one institution introduced a SET application for mobile devices and piloted formal synchronous classroom time for SET completion. This paper uses Leverage Salience Theory to estimate the impact of these SET process changes on overall response rates, open-ended question response rates, and open-ended response word counts. Synchronous class time improves SET responses most when faculty encourage completion on keyboarded devices and give students time to complete the SET during the first 15 minutes of a class meeting. Full administrative support requires sufficient wireless signal strength, adequate IT infrastructure, and assured student access to devices, since responses cluster around class meeting times.

Keywords

Student evaluation of teaching · SET · Leverage salience theory · Survey · Response rate

References

  1. Adams, M. J. D., & Umbach, P. D. (2012). Nonresponse and online student evaluations of teaching: Understanding the influence of salience, fatigue, and academic environments. Research in Higher Education, 53, 576–591.
  2. Aleamoni, L. M. (1999). Student rating myths versus research facts from 1924 to 1998. Journal of Personnel Evaluation in Education, 49(1), 26–31.
  3. Anderson, H. M., Cain, J., & Bird, E. (2005). Online student course evaluations: Review of literatures and a pilot study. American Journal of Pharmaceutical Education, 69, 34–43.
  4. Avery, R. J., Bryan, W. K., Mathios, A., Kang, H., & Bell, D. (2006). Does an online delivery system influence student evaluations? The Journal of Economic Education, 37(1), 21–37.
  5. Ballantyne, C. (2003). Online evaluations of teaching: An examination of current practice and considerations for the future. New Directions for Teaching and Learning, 96, 103–112.
  6. Bothell, T. W., & Henderson, T. (2003). Do online ratings of instruction make sense? In D. L. Sorenson & T. D. Johnson (Eds.), Online student ratings of instruction. New Directions for Teaching and Learning. New York: Jossey-Bass.
  7. Centra, J. (2003). Will teachers receive higher student evaluations by giving higher grades and less course work? Research in Higher Education, 44(5), 495–518.
  8. Crews, T. B., & Curtis, D. F. (2011). Online course evaluations: Faculty perspective and strategies for improved response rates. Assessment & Evaluation in Higher Education, 36(7), 865–878.
  9. Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5), 611–623.
  10. Goodman, J., Anson, R., & Belcheir, M. (2015). The effect of incentives and other instructor-driven strategies to increase online student evaluation response rates. Assessment & Evaluation in Higher Education, 40(7), 958–970.
  11. Goyder, J. (1985). Face-to-face interviews and mail questionnaires: The net difference in response rate. Public Opinion Quarterly, 49(3), 234–252.
  12. Groves, R. M., Floyd, F. J., Jr., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology (2nd ed.). Hoboken, NJ: Wiley.
  13. Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: Description and illustration. Public Opinion Quarterly, 64, 299–308.
  14. Hobson, S. M., & Talbot, D. M. (2001). Understanding student evaluations: What all faculty should know. College Teaching, 49(1), 26–31.
  15. Kuh, G. D., & Whitt, E. J. (1998). The invisible tapestry: Culture in American colleges and universities. ASHE-ERIC Higher Education Report No. 1. Washington, DC: Association for the Study of Higher Education.
  16. Laubsch, P. (2006). Online and in-person evaluations: A literature review and exploratory comparison. Journal of Online Learning and Teaching, 2(2), 62–73.
  17. Lindahl, M. W., & Unger, M. L. (2010). Cruelty in student teaching evaluations. College Teaching, 58(3), 71–76.
  18. Misra, S., Stokols, D., & Marino, H. A. (2011). Using norm-based appeal to increase response rates in evaluation research: A field experiment. American Journal of Evaluation, 33(1), 88–98.
  19. Nulty, D. D. (2008). The adequacy of response rates to online paper surveys: What can be done? Assessment & Evaluation in Higher Education, 33(3), 301–314.
  20. Sax, L., Gilmartin, S., & Bryant, A. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409–432.
  21. Spencer, K. J., & Schmelkin, L. P. (2002). Student perspectives on teaching and its evaluation. Assessment & Evaluation in Higher Education, 27(5), 397–409.
  22. Standish, T. (2017). A validation study of self-reported behavior: Can college student self-reports of behavior be accepted as being self-evident? (Doctoral dissertation). Retrieved from NC State University Electronic Theses and Dissertations. (Accession no. 33607).
  23. Young, K. R., Joines, J. A., Standish, T., & Gallagher, V. J. (2017). Student evaluations of teaching: The impact of faculty procedures on response rates.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. North Carolina State University, Raleigh, USA
