
Exploring the Relationship Between Self-Assessments and OPIc Ratings of Oral Proficiency in French

  • Magda Tigchelaar
Chapter
Part of the Educational Linguistics book series (EDUL, volume 37)

Abstract

The present study analyzed students' self-assessments of what they 'can do' in spoken French in relation to the ACTFL proficiency ratings they received on an oral proficiency interview by computer (OPIc). A secondary aim was to compare different scales that have been used to convert OPIc ratings to numeric scores.

French university students (N = 216) of varying proficiency levels rated a series of can-do statements related to speaking skills and then completed the ACTFL OPIc, which was scored by certified ACTFL raters. A series of regression analyses showed that the strength of the relationship between self-assessment and OPIc ratings depended heavily on the type of numeric scale used: when OPIc ratings were ranked ordinally and analyzed with ordinal regression, self-assessment scores explained a majority (65%) of the variance in OPIc scores; when ratings were converted to an equal-interval scale and analyzed with linear regression, self-assessment scores explained approximately 30% of the variance; and on a graduated scale that reflected the increasing distances between ACTFL (2012) proficiency levels, only 20% of the variance was accounted for.
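The sensitivity of linear-regression results to the numeric coding of ordinal ratings can be sketched in Python. The sublevel labels, the graduated values, and the toy data below are illustrative assumptions, not the study's actual instruments or results; the point is only that a monotone but unequally spaced recoding of the same ratings changes the linear-model fit.

```python
import numpy as np

# ACTFL speaking sublevels in ascending order (abbreviated labels).
# Both numeric codings below are illustrative assumptions.
levels = ["NL", "NM", "NH", "IL", "IM", "IH", "AL", "AM", "AH", "S"]

# Equal-interval coding: every sublevel is one step apart.
equal_interval = {lv: i + 1 for i, lv in enumerate(levels)}
# Graduated coding: gaps widen at higher levels, mirroring the idea that
# each successive sublevel takes more ability to reach (assumed values).
graduated = dict(zip(levels, [1, 2, 3, 5, 7, 10, 14, 19, 25, 32]))

def r_squared(x, y):
    """Variance in y explained by a simple least-squares linear fit on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

# Toy data: each learner's self-assessment total paired with an OPIc sublevel.
self_assess = [12, 15, 22, 30, 34, 41, 47, 55, 60, 66]
ratings = levels  # one learner per sublevel, purely for illustration

for name, scale in [("equal-interval", equal_interval), ("graduated", graduated)]:
    y = [scale[r] for r in ratings]
    print(f"{name}: R^2 = {r_squared(self_assess, y):.2f}")
```

Note that the study's 65% figure came from ordinal regression, which models the rank order directly rather than any numeric recoding; that is a different model class and is not reproduced in this sketch.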

Keywords

Self-assessment · Oral proficiency · Can-do statements · Concurrent validity · Correlation · Regression

References

  1. ACTFL. (2012). ACTFL proficiency guidelines – speaking. Retrieved from http://www.actfl.org
  2. ACTFL. (2015). NCSSFL-ACTFL can-do statements. Retrieved from http://www.actfl.org/global_statements
  3. Bachman, L. F. (1988). Problems in examining the validity of the ACTFL oral proficiency interview. Studies in Second Language Acquisition, 10, 149–164. https://doi.org/10.1017/S0272263100007282
  4. Bachman, L. F. (2004). Statistical analyses for language assessment. Cambridge, UK: Cambridge University Press.
  5. Bachman, L. F., & Savignon, S. (1986). The evaluation of communicative language proficiency: A critique of the ACTFL oral interview. The Modern Language Journal, 70, 380–390. https://doi.org/10.1111/j.1540-4781.1986.tb05294.x
  6. Brantmeier, C. (2006). Advanced L2 learners and reading placement: Self-assessment, CBT, and subsequent performance. System, 34(1), 15–35. https://doi.org/10.1016/j.system.2005.08.004
  7. Brecht, D., Davidson, D., & Ginsberg, B. (1995). Predictors of foreign language gain during study abroad. In B. Freed (Ed.), Second language acquisition in a study abroad context (pp. 37–66). Philadelphia, PA: John Benjamins.
  8. Brown, N. A., Dewey, D. P., & Cox, T. L. (2014). Assessing the validity of can-do statements in retrospective (then-now) self-assessment. Foreign Language Annals, 47(2), 261–285. https://doi.org/10.1111/flan.12082
  9. Butler, Y. G. (2016). Self-assessment of and for young learners' foreign language learning. In M. Nikolov (Ed.), Assessing young learners of English: Global and local perspectives (pp. 291–315). New York, NY: Springer International Publishing.
  10. Byrnes, H., & Ortega, L. (2008). The longitudinal study of advanced L2 capacities. New York, NY: Routledge.
  11. Chalhoub-Deville, M., & Deville, C. (1999). Computer adaptive testing in second language contexts. Annual Review of Applied Linguistics, 19, 273–299. https://doi.org/10.1017/S0267190599190147
  12. Chen, Y. M. (2008). Learning to self-assess oral performance in English: A longitudinal case study. Language Teaching Research, 12(2), 235–262. https://doi.org/10.1177/1362168807086293
  13. Dandonoli, P., & Henning, G. (1990). An investigation of the construct validity of the ACTFL proficiency guidelines and oral interview procedure. Foreign Language Annals, 23(1), 11–21. https://doi.org/10.1111/j.1944-9720.1990.tb00330.x
  14. Davidson, D. (2010). Study abroad: When, how long, and with what results? New data from the Russian front. Foreign Language Annals, 43(1), 6–26. https://doi.org/10.1111/j.1944-9720.2010.01057.x
  15. Field, A. (2009). Discovering statistics using SPSS (3rd ed.). Thousand Oaks, CA: Sage Publications.
  16. Glover, P. (2011). Using CEFR level descriptors to raise university students' awareness of their speaking skills. Language Awareness, 20(2), 121–133. https://doi.org/10.1080/09658416.2011.555556
  17. Green, A. (2014). Exploring language assessment and testing. New York, NY: Routledge.
  18. Kenyon, D. M., & Malabonga, V. M. (2001). Comparing examinees' attitudes toward a computerized oral proficiency assessment. Language Learning & Technology, 5, 60–83. Available at http://llt.msu.edu/vol5num2/pdf/kenyon.pdf
  19. Kenyon, D. M., & Tschirner, E. (2000). The rating of direct and semi-direct oral proficiency interviews: Comparing performance at lower proficiency levels. The Modern Language Journal, 84(1), 85–101. https://doi.org/10.1111/0026-7902.00054
  20. Laerd Statistics. (2013). Ordinal regression using SPSS Statistics. Available from https://statistics.laerd.com/spss-tutorials/ordinal-regression-using-spss-statistics.php
  21. Lange, D. L., & Lowe, P. (1987). Grading reading passages according to the ACTFL/ETS/ILR reading proficiency standard: Can it be learned? Selected papers from the 1986 Language Testing Research Colloquium (pp. 111–127). Monterey, CA: Defense Language Institute. Available at https://archive.org/details/ERIC_ED287291
  22. Lee, I. (2016). Putting students at the centre of classroom L2 writing assessment. Canadian Modern Language Review, 72(2), 258–280. https://doi.org/10.3138/cmlr.2802
  23. Lowe, P. (1985). The ILR proficiency scale as a synthesizing research principle: The view from the mountain. In J. J. Charles (Ed.), Foreign language proficiency in the classroom and beyond (pp. 9–54). Lincolnwood, IL: National Textbook Company. Available at https://eric.ed.gov/?id=ED253104
  24. Malabonga, V. M., Kenyon, D. M., & Carpenter, H. (2005). Self-assessment, preparation and response time on a computerized oral proficiency test. Language Testing, 22(1), 59–92. https://doi.org/10.1191/0265532205lt297oa
  25. Malone, M., & Montee, M. (2010). Oral proficiency assessment: Current approaches and applications for post-secondary foreign language programs. Language and Linguistics Compass, 4(10), 972–986. https://doi.org/10.1111/j.1749-818X.2010.00246.x
  26. Mason, L., Powers, C., & Donnelly, S. (2015). The Boren awards: A report of oral language proficiency gains during academic study abroad. New York: Institute of International Education. Available at https://www.iie.org/Research-and-Insights/Publications/The-Boren-Awards-A-Report-Of-Oral-Language-Proficiency-Gains
  27. Meredith, R. A. (1990). The oral proficiency interview in real life: Sharpening the scale. The Modern Language Journal, 74(3), 288–296. https://doi.org/10.1111/j.1540-4781.1990.tb01065.x
  28. National Security Education Program. (2016). The language flagship. Retrieved from http://www.nsep.gov/content/language-flagship
  29. Nikolov, M. (2016). A framework for young EFL learners' diagnostic assessment: 'Can do statements' and task types. In M. Nikolov (Ed.), Assessing young learners of English: Global and local perspectives (pp. 65–92). New York, NY: Springer International Publishing.
  30. Plonsky, L., & Oswald, F. L. (2014). How big is "big"? Interpreting effect sizes in L2 research. Language Learning, 64(4), 878–912. https://doi.org/10.1111/lang.12079
  31. Purpura, J. E., & Turner, C. E. (2014). A learning-oriented assessment approach to understanding the complexities of classroom-based language assessment. Teachers College, Columbia University Roundtable in Second Language Studies: Roundtable on Learning-Oriented Assessment in Language Classrooms and Large Scale Assessment Contexts. Teachers College, Columbia University, New York, NY. Retrieved from http://www.tc.columbia.edu/tccrisls/
  32. Purpura, J. E., & Turner, C. E. (2015). Learning-oriented assessment in second and foreign language classrooms. In D. Tsagari & J. Banerjee (Eds.), Handbook of second language assessment (pp. 255–272). Boston, MA: De Gruyter Mouton.
  33. Ross, S. (1998). Self-assessment in second language testing: A meta-analysis and analysis of experiential factors. Language Testing, 15(1), 1–20. https://doi.org/10.1177/026553229801500101
  34. Surface, E., Poncheri, R., & Bhavsar, K. (2008). Two studies investigating the reliability and validity of the English ACTFL OPIc with Korean test takers: The ACTFL OPIc validation project technical report. Retrieved from http://www.languagetesting.com/wp-content/uploads/2013/08/ACTFL-OPIc-English-Validation-2008.pdf
  35. Tigchelaar, M., Bowles, R., Winke, P., & Gass, S. (2017). Assessing the validity of ACTFL can-do statements for spoken proficiency: A Rasch analysis. Foreign Language Annals, 50(3), 379–403.
  36. Thompson, G. L., Cox, T. L., & Knapp, N. (2016). Comparing the OPI and the OPIc: The effect of test method on oral proficiency scores and student preference. Foreign Language Annals, 49(1), 75–92. https://doi.org/10.1111/flan.12178
  37. Trofimovich, P., Isaacs, T., Kennedy, S., Saito, K., & Crowther, D. (2014). Flawed self-assessment: Investigating self- and other-perception of second language speech. Bilingualism: Language and Cognition, 19(1), 1–19. https://doi.org/10.1017/S1366728914000832
  38. Tschirner, E. (2016). Listening and reading proficiency levels of college students. Foreign Language Annals, 49, 201–223. https://doi.org/10.1111/flan.12198
  39. Tschirner, E., Bärenfänger, O., & Wanner, I. (2012). Assessing evidence of validity of assigning CEFR ratings to the ACTFL oral proficiency interview (OPI) and oral proficiency interview by computer (OPIc) (Technical Report 2012-US-PUB-1). Retrieved from Language Testing International: http://www.languagetesting.com/wp-content/uploads/2014/02/OPIc-CEFR-Study-Final-Report.pdf
  40. Vande Berg, M., Connor-Linton, J., & Paige, J. M. (2009). The Georgetown Consortium Project: Interventions for student learning abroad. Frontiers: The Interdisciplinary Journal of Study Abroad, 18, 1–75. Available at http://files.eric.ed.gov/fulltext/EJ883690.pdf
  41. VanPatten, B., Trego, D., & Hopkins, W. (2015). In-class vs. online testing in university-level language courses: A research report. Foreign Language Annals, 48(1), 659–668. https://doi.org/10.1111/flan.12160

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Western Michigan University, Kalamazoo, USA
