State-of-the-Art of Assessment in Tunisia: The Case of Testing Listening Comprehension

Abstract

As a prerequisite for any meaningful form of accountability, language tests are designed and administered to promote learning outcomes that serve a nation's educational needs. This chapter investigated language assessment in Tunisia by focusing on the testing of listening comprehension. Quantitative analyses in SPSS and FACETS of the scores of 646 test-takers on an achievement examination suggested that students had very low language ability and that the nine raters who graded the exam were severe and subjectively biased toward particular task types, indicating a fuzzy conception of the listening comprehension construct. This led to assessment results of questionable relevance, which in turn called into question the testability of the exam items. Implications of the results were considered, and recommendations for further research on assessment in this context were discussed.
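For readers unfamiliar with the method, the FACETS analysis reported here rests on the many-facet Rasch model. A minimal sketch, stated in Linacre's standard notation rather than in anything taken from the chapter itself, is:

\log \frac{P_{nijk}}{P_{nij(k-1)}} = B_n - D_i - C_j - F_k

where P_{nijk} is the probability that test-taker n receives score k rather than k-1 from rater j on item i, B_n is the test-taker's listening ability, D_i the item's difficulty, C_j the rater's severity, and F_k the threshold between adjacent score categories. On this reading, the rater severity and bias effects summarized in the abstract would correspond to the C_j estimates and to rater-by-item (or rater-by-task) interaction residuals.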

Acknowledgements

I would like to thank the following people for their unconditional help and cooperation at the time of data collection: Aicha Graja, Faten Belhaj, Faten Houioui, Hajer Mami, Hedia Oueslati, Rim Drira, Rim Zaoui, Safia Sahli, and Selma Ben Mrad. I, however, remain responsible for the contents of this work.

Copyright information

© 2019 The Author(s)

About this chapter

Cite this chapter

Hidri, S. (2019). State-of-the-Art of Assessment in Tunisia: The Case of Testing Listening Comprehension. In: Hidri, S. (eds) English Language Teaching Research in the Middle East and North Africa. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-98533-6_2

  • DOI: https://doi.org/10.1007/978-3-319-98533-6_2

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-319-98532-9

  • Online ISBN: 978-3-319-98533-6

  • eBook Packages: Social Sciences (R0)
