Student-Related Challenges of Performing Alternative Assessments from the Perspective of Kurdish Tertiary TESOL Teachers

  • Dler Abdullah Ismael
Part of the Second Language Learning and Teaching book series (SLLT)


Most of the student-related challenges of alternative assessments are perhaps associated with those in which students are the major assessors. Brown and Hudson (1998) state that portfolios, self-assessment, and peer-assessment can be relatively difficult to produce and organise. Pedagogically, diagnosing and tackling the challenges of alternative assessments are potentially important for developing the teaching, learning, and assessment of English. This study investigated the student-related challenges of conducting alternative assessments from the perspective of Kurdish tertiary TESOL teachers. Teachers' perspectives are immensely important because their beliefs influence and shape their classroom practices (Wang, 2011). This study forms part of the continuing research "to understand how cognitive and affective factors interact in shaping what teachers do" (Borg, 2006). To this end, it used face-to-face and focus group interviews as methods of data collection, and embraced an interpretivist and phenomenological approach, which requires researchers to situate meaning units in relation to context and structure (Anderson, 2007). The participants were 12 interviewees from two English departments at a public university in the Kurdistan Region. The findings revealed that, regarding the performance of alternative assessments, teachers contended that most students had limited knowledge and skills, and were unwilling to participate, unresponsive, and uncooperative. Students were primarily interested in being spoon-fed information and in memorisation techniques, and were preoccupied with passing tests and acquiring certificates rather than with actual learning. They were also often shy and ashamed of their mistakes, and thus lacked confidence; they were unmotivated, not eager to learn, careless, inattentive to the rules, and subjective, all of which prevented the successful implementation of alternative assessment practices. The implications of these findings could be influential in tackling the aforementioned student-related challenges of alternative assessments.


Keywords: Alternative assessment · Teacher perspective · Student-related challenges · Student affective aspects · Self-assessment · Peer-assessment


References

  1. Abbas, Z. (2012). Difficulties in using methods of alternative assessment in teaching from Iraqi instructors’ points of view. AL-Fatih Journal, 48, 23–45.
  2. Anderson, R. (2007). Thematic content analysis (TCA): Descriptive presentation of qualitative data. Palo Alto, CA: Institute of Transpersonal Psychology.
  3. Anusienė, L., Kaminskienė, L., & Kavaliauskienė, G. (2007). The challenges for ESP learners: Alternative assessment of performance and usefulness of class activities. Kalbų Studijos, 10, 75–81.
  4. Ataç, B. A. (2012). Foreign language teachers’ attitude toward authentic assessment in language teaching. Journal of Language & Linguistics Studies, 8(2), 7–19.
  5. Azarnoosh, M. (2013). Peer assessment in an EFL context: Attitudes and friendship bias. Language Testing in Asia, 3(1), 1–10.
  6. Bahous, R. (2008). The self-assessed portfolio: A case study. Assessment & Evaluation in Higher Education, 33(4), 381–393.
  7. Baker, E. L. (2010). What probably works in alternative assessment (CRESST Report 772). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing (CRESST), Graduate School of Education & Information Studies, University of California, Los Angeles.
  8. Borg, S. (2006). Teacher cognition and language education: Research and practice. London and New York: Continuum.
  9. Briggs, A., & Coleman, M. (2007). Research methods in educational leadership and management. London: Sage.
  10. Brown, J. D., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32(4), 653–675.
  11. Chen, Y. M. (2008). Learning to self-assess oral performance in English: A longitudinal case study. Language Teaching Research, 12(2), 235–262.
  12. Cheng, L., Rogers, T., & Hu, H. (2004). ESL/EFL instructors’ classroom assessment practices: Purposes, methods, and procedures. Language Testing, 21(3), 360–389.
  13. Cheng, W., & Warren, M. (2005). Peer assessment of language proficiency. Language Testing, 22(1), 93–121.
  14. Chirimbu, S. (2013). Using alternative assessment methods in foreign language teaching. Case study: Alternative assessment of business English for university students. Scientific Bulletin of the Politehnica University of Timisoara, Transactions on Modern Languages, 12, 91–99.
  15. Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education. London and New York: Routledge.
  16. Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process. London: Sage.
  17. Derakhshan, A., Rezaei, S., & Alemi, M. (2011). Alternatives in assessment or alternatives to assessment: A solution or a quandary. International Journal of English Linguistics, 1(1), 173–178.
  18. Dragemark Oscarson, A. (2009). Self-assessment of writing in learning English as a foreign language: A study at the upper secondary school level. Göteborg, Sweden: Acta Universitatis Gothoburgensis.
  19. Earl, L., & Katz, S. (2006). Rethinking classroom assessment with purpose in mind: Assessment for learning, assessment as learning and assessment of learning. Winnipeg, MB: Manitoba Education, Citizenship and Youth.
  20. Finch, A. E. (2002). Authentic assessment: Implications for EFL performance testing in Korea. Secondary Education Research, 49, 89–122.
  21. Grabin, L. A. (2009). Alternative assessment in the teaching of English as a foreign language in Israel (Unpublished doctoral dissertation). University of South Africa, Pretoria, South Africa.
  22. Grami, G. M. A. (2010). The effects of integrating peer feedback into university-level ESL writing curriculum: A comparative study in a Saudi context (Unpublished doctoral dissertation). Newcastle University, Newcastle upon Tyne, UK.
  23. Hamayan, E. V. (1995). Approaches to alternative assessment. Annual Review of Applied Linguistics, 15, 212–226.
  24. Hancock, C. (1994). Alternative assessment and second language study: What and why? East Lansing, MI: National Center for Research on Teacher Learning.
  25. Hassaskhah, J., & Sharifi, A. (2011). The role of portfolio assessment and reflection on process writing. Asian EFL Journal Quarterly, 13(1), 193–231.
  26. Hidri, S. (2014). Developing and evaluating a dynamic assessment of listening comprehension in an EFL context. Language Testing in Asia, 4(1), 4.
  27. Hidri, S. (2017). Specs validation of a dynamic reading comprehension test for EAP learners in an EFL context. In S. Hidri & C. Coombe (Eds.), Evaluation in foreign language education in the Middle East and North Africa (pp. 315–337). Cham: Springer International Publishing.
  28. Ishihara, N. (2009). Teacher-based assessment for foreign language pragmatics. TESOL Quarterly, 43(3), 445–470.
  29. Ismael, D. A. (2016). The assessment practices of in-service Kurdish tertiary TESOL teachers and their cognitions of alternative assessment (Unpublished doctoral thesis). University of Exeter, Exeter, UK.
  30. İşlek, H., & Ortaokulu, M. A. G. (2012). Students’ attitudes and performance on portfolios. Journal of Educational & Instructional Studies in the World, 2(4), 42–47.
  31. Jia, Y. (2009). Ethical standards for language testing professionals: An introduction to five major codes. Shiken: JALT Testing & Evaluation SIG Newsletter, 13(2), 2–8.
  32. Khonbi, Z. A., & Sadeghi, K. (2013). The effect of assessment type (self vs. peer) on Iranian university EFL students’ course achievement. Procedia – Social and Behavioral Sciences, 70, 1552–1564.
  33. Knox, S., & Burkard, A. W. (2009). Qualitative research interviews. Psychotherapy Research, 19(4–5), 566–575.
  34. Kvale, S. (2007). Doing interviews. London: Sage.
  35. Law, B., & Eckes, M. (2007). Assessment and ESL: An alternative approach. Winnipeg, MB: Portage & Main Press.
  36. Lim, H. (2007). A study of self- and peer-assessment on learners’ oral proficiency. In CamLing: Proceedings of the Fifth University of Cambridge Postgraduate Conference in Language Research (pp. 169–176). Cambridge: Cambridge Institute of Language Research.
  37. Lin, Y. (2009). Enhancing EFL learners’ English reading proficiency through collocation instruction. English Teaching & Learning, 33(1), 37–71.
  38. Lucas, R. (2007). A study on portfolio assessment as an effective student self-evaluation scheme. The Asia-Pacific Education Researcher, 16(1), 23–32.
  39. Mertens, D. M. (2010). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods. Thousand Oaks, CA: Sage.
  40. Norris, J. M., Brown, J. D., Hudson, T. D., & Bonk, W. (2002). Examinee abilities and task difficulty in task-based second language performance assessment. Language Testing, 19(4), 395–418.
  41. Peng, J. C. (2008). Peer assessment in an EFL context: Attitudes and correlations. In Proceedings of the 2008 Second Language Research Forum (pp. 89–107). Somerville, MA: Cascadilla Proceedings Project.
  42. Pine, G. J. (2009). Teacher action research: Building knowledge democracies. Thousand Oaks, CA: Sage.
  43. Radnor, H. (2002). Researching your professional practice: Doing interpretive research. Buckingham and Philadelphia: Open University Press.
  44. Roskams, T. (1999). Chinese EFL students’ attitudes to peer feedback and peer assessment in an extended pairwork setting. RELC Journal, 30(1), 79–123.
  45. Saldaña, J. (2012). The coding manual for qualitative researchers (2nd ed.). Los Angeles, CA: Sage.
  46. Savenye, W., & Robinson, R. (1996). Qualitative research issues and methods: An introduction for educational technologists. In D. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 1171–1195). New York: Macmillan.
  47. Scarino, A. (2013). Language assessment literacy as self-awareness: Understanding the role of interpretation in assessment and in teacher learning. Language Testing, 30(3), 309–327.
  48. Shrestha, P., & Coffin, C. (2012). Dynamic assessment, tutor mediation and academic writing development. Assessing Writing, 17(1), 55–70.
  49. Suzuki, M. (2009). The compatibility of L2 learners’ assessment of self- and peer revisions of writing with teachers’ assessment. TESOL Quarterly, 43(1), 137–148.
  50. Tannenbaum, J. (1996). Practical ideas on alternative assessment for ESL students. ERIC Digest (ED395500). Washington, DC: ERIC Clearinghouse on Languages and Linguistics.
  51. Troudi, S. (2006). Empowering ourselves through action research. In P. Davidson et al. (Eds.), Teaching, learning, leading: Proceedings of the 11th TESOL Arabia conference (pp. 277–290). Dubai: TESOL Arabia Publications.
  52. Tsagari, D. (2004). Is there life beyond language testing? An introduction to alternative language assessment. CRILE Working Papers, 58, 1–23. Lancaster: Centre for Research in Language Education.
  53. Turner, D. W. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 15(3), 754–760.
  54. Turuk, M. (2008). The relevance and implications of Vygotsky’s sociocultural theory in the second language classroom. ARECLS, 5, 244–262.
  55. Vygotsky, L. (1978). Interaction between learning and development. Readings on the Development of Children, 23(3), 34–41.
  56. Wach, A. (2012). Classroom-based language efficiency assessment: A challenge for EFL teachers. Glottodidactica, 39(1), 81–92. Poznań: Wydawnictwo Naukowe UAM.
  57. Wallace, M. (1998). Action research for language teachers. Cambridge: Cambridge University Press.
  58. Wang, Z. (2011). A case study of one EFL writing teacher’s feedback on discourse for advanced learners in China. University of Sydney Papers in TESOL, 6.
  59. Wilson, E. (2009). School-based research: A guide for education students. London: Sage.
  60. Winke, P. (2011). Evaluating the validity of a high-stakes ESL test: Why teachers’ perceptions matter. TESOL Quarterly, 45(4), 628–660.
  61. Wolf, M. K., Herman, J. L., Bachman, L. F., Bailey, A. L., & Griffin, N. (2008). Recommendations for assessing English language learners: English language proficiency measures and accommodation uses (Part 3 of 3) (CRESST Report 732). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
  62. Yu, S. (2013). EFL teachers’ beliefs and practices regarding peer feedback in L2 writing classrooms. Polyglossia, 24, 74–79.
  63. Zafar Khan, S. (2011). Factors affecting the motivation of expatriate English as a foreign language (EFL) teachers in the Sultanate of Oman (Unpublished doctoral dissertation). University of Exeter, Exeter, UK.

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. London, UK
  2. Exeter University, Exeter, UK
  3. Sulaimani University, Sulaymaniyah, Iraq
