Abstract
The recent introduction of interact marks a significant shift in how New Zealand high school students’ FL spoken communicative proficiency is to be assessed, and stands in stark contrast to earlier procedures. In particular, interact signals a move from an assessment of learning model (converse) towards more open-ended assessment for learning opportunities (interact). This shift has implications for the perceived comparative usefulness of the two assessments. This chapter outlines a theoretical framework to support an evaluation of the relative usefulness, or fitness for purpose, of different assessment types – Bachman and Palmer’s (1996) six qualities of test usefulness. The chapter goes on to articulate the fundamental principles informing interact in practice, as reflected in the information teachers have received, and evaluates these principles against the test usefulness framework. Finally, the chapter presents the methodology for the 2-year study that sought stakeholder views (both teachers and students) during the initial phases of the implementation of interact.
Notes
1. Open-endedness of language is obscured by one clarification document which suggests that the former language-specific curriculum documents and the former vocabulary and structures lists (which are supposed to have been withdrawn, see Chap. 3) may be used for guidance when determining whether the appropriate level of language has been achieved (NZQA, 2014d, moderator’s newsletter, December 2012). However, the overall tenor of the guidelines signals openness of language and structures.
2. The scale as presented in the surveys (Fig. 4.2) suggests a measure from 1 to 10. This was done to indicate that strongly disagree was considered a viable response, with the mid-point (neutral) set at 5. In terms of measuring the response with a ruler, however, measurement began at 0 mm and the extreme left of the scale was regarded as 0.
3. See East (2012) for a brief discussion of the different kinds of assessment that schools in the New Zealand context can opt into. Alternatives include Cambridge International Examinations and the International Baccalaureate.
4. A subsequent opportunity for informant feedback was possible when data were re-presented in a one-hour forum in 2014 which attracted approximately 180 attendees (East, 2014).
References
Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford, England: Oxford University Press.
Bachman, L. F. (2002). Some reflections on task-based language performance assessment. Language Testing, 19(4), 453–476. http://dx.doi.org/10.1191/0265532202lt240oa
Bachman, L. F., & Palmer, A. (1996). Language testing in practice: Designing and developing useful language tests. Oxford, England: Oxford University Press.
Bachman, L. F., & Palmer, A. (2010). Language assessment in practice: Developing language assessments and justifying their use in the real world. Oxford, England: Oxford University Press.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. http://dx.doi.org/10.1191/1478088706qp063oa
Brown, H. D., & Abeywickrama, P. (2010). Language assessment: Principles and classroom practices (2nd ed.). New York, NY: Pearson.
Bryman, A. (2004a). Member validation and check. In M. Lewis-Beck, A. Bryman, & T. Liao (Eds.), Encyclopedia of social science research methods (p. 634). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412950589.n548
Bryman, A. (2004b). Triangulation. In M. B. Lewis-Beck, A. Bryman, & T. Liao (Eds.), Encyclopedia of social science research methods (pp. 1143–1144). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412950589.n1031
Canale, M. (1983). On some dimensions of language proficiency. In J. W. Oller Jr. (Ed.), Issues in language testing research (pp. 333–342). Rowley, MA: Newbury House.
Canale, M., & Swain, M. (1980). Theoretical bases of communicative approaches to second language teaching and testing. Applied Linguistics, 1(1), 1–47. http://dx.doi.org/10.1093/applin/i.1.1
Council of Europe. (2001). Common European Framework of Reference for languages. Cambridge, England: Cambridge University Press.
Denzin, N. K. (1970). The research act in sociology. Chicago, IL: Aldine.
East, M. (2008). Dictionary use in foreign language writing exams: Impact and implications. Amsterdam, Netherlands/Philadelphia, PA: John Benjamins. http://dx.doi.org/10.1075/lllt.22
East, M. (2012). Task-based language teaching from the teachers’ perspective: Insights from New Zealand. Amsterdam, Netherlands/Philadelphia, PA: John Benjamins. http://dx.doi.org/10.1075/tblt.3
East, M. (2013, August 24). The new NCEA ‘interact’ standard: Teachers’ thinking about assessment reform. Paper presented at the New Zealand Association of Language Teachers (NZALT) Auckland/Northland Region language seminar, Auckland.
East, M. (2014, July 6–9). To interact or not to interact? That is the question. Keynote address at the New Zealand Association of Language Teachers (NZALT) Biennial National Conference, Languages Give You Wings, Palmerston North, NZ.
East, M., & Scott, A. (2011a). Assessing the foreign language proficiency of high school students in New Zealand: From the traditional to the innovative. Language Assessment Quarterly, 8(2), 179–189. http://dx.doi.org/10.1080/15434303.2010.538779
East, M., & Scott, A. (2011b). Working for positive washback: The standards-curriculum alignment project for Learning Languages. Assessment Matters, 3, 93–115.
Hinkel, E. (2010). Integrating the four skills: Current and historical perspectives. In R. Kaplan (Ed.), The Oxford handbook of applied linguistics (2nd ed., pp. 110–123). Oxford, England: Oxford University Press. http://dx.doi.org/10.1093/oxfordhb/9780195384253.013.0008
Hu, G. (2013). Assessing English as an international language. In L. Alsagoff, S. L. McKay, G. Hu, & W. A. Renandya (Eds.), Principles and practices for teaching English as an international language (pp. 123–143). New York, NY: Routledge.
Koefoed, G. (2012). Policy perspectives from New Zealand. In M. Byram & L. Parmenter (Eds.), The Common European Framework of Reference: The globalisation of language education policy (pp. 233–247). Clevedon, England: Multilingual Matters.
Kramsch, C. (1986). From language proficiency to interactional competence. The Modern Language Journal, 70(4), 366–372. http://dx.doi.org/10.1111/j.1540-4781.1986.tb05291.x
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174. http://dx.doi.org/10.2307/2529310
Lazaraton, A. (1995). Qualitative research in applied linguistics: A progress report. TESOL Quarterly, 29(3), 455–472. http://dx.doi.org/10.2307/3588071
Lazaraton, A. (2002). A qualitative approach to the validation of oral language tests. Cambridge, England: Cambridge University Press.
Leaper, D. A., & Riazi, M. (2014). The influence of prompt on group oral tests. Language Testing, 31(2), 177–204. http://dx.doi.org/10.1177/0265532213498237
Lewkowicz, J. (2000). Authenticity in language testing: Some outstanding questions. Language Testing, 17(1), 43–64. http://dx.doi.org/10.1177/026553220001700102
Luoma, S. (2004). Assessing speaking. Cambridge, England: Cambridge University Press. http://dx.doi.org/10.1017/cbo9780511733017
Mangubhai, F., Marland, P., Dashwood, A., & Son, J. B. (2004). Teaching a foreign language: One teacher’s practical theory. Teaching and Teacher Education, 20, 291–311. http://dx.doi.org/10.1016/j.tate.2004.02.001
McNamara, T. (1997). ‘Interaction’ in second language performance assessment: Whose performance? Applied Linguistics, 18(4), 446–466. http://dx.doi.org/10.1093/applin/18.4.446
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco, CA: Jossey-Bass.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
Ministry of Education. (2014b). Resources for internally assessed achievement standards. Retrieved from http://ncea.tki.org.nz/Resources-for-Internally-Assessed-Achievement-Standards
NZQA. (2014a). External moderation. Retrieved from http://www.nzqa.govt.nz/providers-partners/assessment-and-moderation/managing-national-assessment-in-schools/secondary-moderation/external-moderation/
NZQA. (2014b). Internal moderation. Retrieved from http://www.nzqa.govt.nz/providers-partners/assessment-and-moderation/managing-national-assessment-in-schools/secondary-moderation/external-moderation/internal-moderation/
NZQA. (2014c). Languages – Clarifications. Retrieved from http://www.nzqa.govt.nz/qualifications-standards/qualifications/ncea/subjects/languages/clarifications/
NZQA. (2014d). Languages – Moderator’s newsletter. Retrieved from http://www.nzqa.govt.nz/qualifications-standards/qualifications/ncea/subjects/languages/moderator-newsletters/October-2014/
NZQA. (2014e). NCEA subject resources. Retrieved from http://www.nzqa.govt.nz/qualifications-standards/qualifications/ncea/subjects/
Pardo-Ballester, C. (2010). The validity argument of a web-based Spanish listening exam: Test usefulness evaluation. Language Assessment Quarterly, 7(2), 137–159. http://dx.doi.org/10.1080/15434301003664188
Poehner, M. (2008). Dynamic assessment: A Vygotskian approach to understanding and promoting L2 development. New York, NY: Springer.
Scott, A., & East, M. (2009). The standards review for learning languages: How come and where to? The New Zealand Language Teacher, 39, 28–33.
Scott, A., & East, M. (2012). Academic perspectives from New Zealand. In M. Byram & L. Parmenter (Eds.), The Common European framework of reference: The globalisation of language education policy (pp. 248–257). Clevedon, England: Multilingual Matters.
Shohamy, E. (2001). The social responsibility of the language testers. In R. L. Cooper (Ed.), New perspectives and issues in educational language policy (pp. 113–130). Amsterdam, Netherlands/Philadelphia, PA: John Benjamins Publishing Company. http://dx.doi.org/10.1075/z.104.09sho
Shohamy, E. (2007). Tests as power tools: Looking back, looking forward. In J. Fox, M. Wesche, D. Bayliss, L. Cheng, C. E. Turner, & C. Doe (Eds.), Language testing reconsidered (pp. 141–152). Ottawa, Canada: University of Ottawa Press.
Spolsky, B. (1985). The limits of authenticity in language testing. Language Testing, 2(1), 31–40. http://dx.doi.org/10.1177/026553228500200104
© 2016 Springer Science+Business Media Singapore
Cite this chapter
East, M. (2016). Investigating stakeholder perspectives on interact. In: Assessing Foreign Language Students’ Spoken Proficiency. Educational Linguistics, vol 26. Springer, Singapore. https://doi.org/10.1007/978-981-10-0303-5_4
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-0301-1
Online ISBN: 978-981-10-0303-5