Comparing Likert Scale Functionality Across Culturally and Linguistically Diverse Groups in Science Education Research: an Illustration Using Qatari Students’ Responses to an Attitude Toward Science Survey

  • Ryan Summers
  • Shuai Wang
  • Fouad Abd-El-Khalick
  • Ziad Said

Abstract

Surveying is a common methodology in science education research, including cross-national and cross-cultural comparisons. The literature on students' attitudes toward science, in particular, illustrates the prevalence of efforts to translate instruments with the eventual goal of comparing groups. This paper draws on survey data from a nationally representative cross-sectional study of Qatari students in grades 3 through 12 to frame a discussion of whether, and to what extent, common adaptations allow comparisons among linguistically or culturally different respondents. The analytic sample contained 2615 students who responded to a previously validated 32-item instrument, 1704 of whom completed the survey in Modern Standard Arabic and 911 in English. These data are used to scrutinize variation in the performance of the instrument between groups of respondents as determined by language of survey completion and cultural heritage. Multi-group confirmatory factor analysis was employed to investigate validity issues associated with the performance of the survey in each group and to evaluate the appropriateness of using this instrument to make simultaneous comparisons across the distinct groups. Findings underscore limitations on group comparability that may persist even when issues of translation and adaptation are carefully attended to during instrument development.

Keywords

Attitudes toward science · Cross-sectional · Multi-group CFA · Translation · Validity

Supplementary material

ESM 1 (DOCX 19 kb)

Copyright information

© Ministry of Science and Technology, Taiwan 2018

Authors and Affiliations

  1. Department of Teaching and Learning, University of North Dakota, Grand Forks, USA
  2. SRI International, Washington, DC, USA
  3. School of Education, University of North Carolina at Chapel Hill, Chapel Hill, USA
  4. School of Engineering Technology, College of the North Atlantic, Doha, Qatar