Abstract
This study investigated the test usefulness (Bachman & Palmer, 1996) of reading comprehension exams among first year students of English at the tertiary level in an EFL context. Data were gathered by means of two questionnaires and a corpus of 173 first year students’ graded reading exams. The questionnaires were administered to 64 students and 21 teachers at different institutes and universities. The scores of the reading exams were used to check the appropriateness of the construct and rater consistency. Questionnaire results indicated that most of the reading comprehension examinations had an acceptable level of reliability and a moderate level of authenticity and interactiveness. They also suggested that these exams had a high level of construct validity and practicality, and that the achievement reading tests had a harmful washback effect on first year students but a beneficial one on teachers. In contrast, the analysis of the reading scores using Cronbach’s alpha to estimate the reading tests’ reliability showed that only three of the six tests reached an acceptable level of reliability (α ≥ .70). Similarly, the construct validity assessment procedure showed that only the reading sections of the Comprehension, Composition and Grammar tests 1 and 2 could be considered construct valid. These results contradicted those of the questionnaires: the participants did not provide trustworthy answers in the reliability and construct validity parts. Therefore, most of the reading midterm and final exams designed by teachers at the tertiary level in Tunisia had low reliability and construct validity, and moderate authenticity and interactiveness.
Nevertheless, the reading exams had high practicality and a beneficial impact on the teachers, but a harmful impact on first year learners. Overall, the usefulness of the reading exams was low, since reliability and construct validity, two essential criteria, were shown to be threatened. The study has implications for test design.
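As background to the reliability analysis reported above, Cronbach’s alpha can be computed directly from an item-by-respondent score matrix using the standard formula α = (k/(k−1))·(1 − Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ the variance of each item, and σ²ₜ the variance of the total scores. The following sketch is illustrative only: the function name and the score matrix are invented for demonstration and are not taken from the study’s data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores of 5 test takers on 4 reading items
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # → 0.94; a value of .70 or above is commonly treated as acceptable
```

Under the .70 criterion used in the study, a test whose items produce an alpha below that threshold would be flagged as unreliable.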
References
Alderson, J. C., Clapham, C., & Wall, D. (1995). Language test construction and evaluation. Cambridge: Cambridge University Press.
Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford: Oxford University Press.
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford: Oxford University Press.
Brown, J. D. (1996). Testing in language programs. Upper Saddle River, NJ: Prentice Hall.
Brown, J. D. (2002). The Cronbach alpha reliability estimate. Shiken: JALT Testing & Evaluation SIG Newsletter, 6(1), 17–18.
Brown, J. D. (2003). Language assessment: Principles and classroom practices. White Plains, NY: Longman.
Cronbach, L. J. (1970). Essentials of psychological testing (3rd ed.). New York: Harper & Row.
Davies, A. (1990). Principles of language testing. Oxford: Basil Blackwell, Ltd.
Henning, G. (1987). A guide to language testing: Development, evaluation, research. Cambridge: Newbury House Publishers.
Hidri, S. (2015). Conceptions of assessment: Investigating what assessment means to secondary and university teachers. Arab Journal of Applied Linguistics, 1(1), 19–43.
Hughes, A. (1989). Testing for language teachers. Cambridge: Cambridge University Press.
Hughes, A. (2003). Testing for language teachers (2nd ed.). Cambridge: Cambridge University Press.
Pallant, J. (2005). SPSS survival manual: A step by step guide to data analysis using SPSS for Windows (Version 12). Australia: Allen & Unwin.
Read, J., & Chapelle, C. A. (2001). A framework for second language vocabulary assessment. Language Testing, 18(1), 1–32.
Taylor, L., & Weir, C. J. (2012). IELTS collected papers 2: Research in reading and listening assessment. Cambridge: Cambridge University Press.
Appendices
Appendix 1
Students’ Questionnaire
This questionnaire investigates the problems in reading comprehension tests among first year English students at the tertiary level in Tunisia and tries to find different solutions to them. This questionnaire may take you about 15 min to fill out. The information you provide will be strictly confidential and used for academic purposes only. Your cooperation is very important for this research project. Please, answer truthfully.
Section A—Background Information
-
1.
Age: 18–22 / More than 22
-
2.
Rank the skills below from the most important (1) to the least important (4)
-
3.
What is your level in the reading skill?
Section B—Test usefulness
-
Part 1: Reliability
-
1.
Did you receive an undesirable mark in a reading test/exam during this year?
Yes | No
If Yes, this occurred because of:
Not understanding the text | A bad emotional state
Lack of world knowledge about the text | Temporary illness
The length of the text | Others (please specify)
The topic of the text was boring |
-
2.
Which testing techniques did your reading tests/exams include this year?
Multiple-choice questions | True/False questions | Gap-filling tasks |
Information transfer | Paraphrasing | Summarising |
Guessing the meaning of unfamiliar words from context |
-
Identifying referents [e.g., What does the underlined word ‘it’ (line 25) refer to?]
-
Short-answer questions (The answers to these questions may vary from a word or phrase to one or two sentences)
-
Others (please specify)………………………………
-
3.
How often did your teacher give you a detailed scoring key in the reading test during this year?
-
4.
State the extent to which you agree or disagree with the two following statements [Strongly Disagree (SD); Disagree (D); Undecided (U); Agree (A); Strongly Agree (SA)]
N° | Statements | SD | D | U | A | SA |
1 | At least two different teachers should score each reading test and exam paper. | | | | |
2 | In the reading tests, you should be identified by number, not by name, as in the final exam. | | | | |
-
5.
How often did you sit for a reading test/exam this year in distracting conditions (e.g., a noisy classroom, desks and chairs in bad condition)?
-
6.
What do you think of the reading tests/exams that you took this year? They are…
-
7.
Did your teacher permit you to choose between tasks in the reading test/exam during this year?
-
8.
What do you think of the reading comprehension test/exam tasks that you took this year? They are…
-
9.
What do you think of the reading test/exam questions of this year? They are…
-
10.
What do you think of the reading tests/exams that you took this year? They are…
-
11.
How many times did your teacher inform you this year about the aspects of the reading test/exam [format and testing techniques (some of which are mentioned on page 2)] before you took it?
-
Part 2: Construct validity
-
1.
Do the reading tests/exams that you took this year test only your ability to understand a text?
-
If No, explain……………………………………………………………...
-
2.
What do you think of most of the reading test/exam questions? They are…
-
Difficult to understand
-
Neither difficult nor easy to understand
-
Easy to understand
-
3.
Did all the students in your class take the same reading test/exam on the day of the examination this year?
-
Part 3: Authenticity
-
1.
Please, tick (√) one box in each row [Never (N); Rarely (R); Sometimes (S); Often (O); Always (A)]
N° | Questions | N | R | S | O | A |
1 | How often did the reading test/exam tasks that you took this year represent situations, which are similar to what you will encounter in your life? | |||||
2 | How often did the text in the reading test/exam have a topic, which is real and not imaginary? |
-
2.
What did most of the vocabulary task(s) in the reading tests/exams that you took this year represent?
-
1.
They represented question(s) about the meanings of words or expressions
-
2.
They represented paragraph(s) to be filled in with given words
-
3.
They represented question(s) about the meanings of words or expressions and paragraph(s) to be filled in with given words
-
3.
How were the topics in the reading test/exam tasks? They were…
-
Part 4: Interactiveness
-
1.
How often did responding to the reading test/exam tasks require your topical knowledge?
-
2.
What did the understanding required in the reading test/exam tasks often involve? It involved…
-
1.
Broad knowledge of the English language
-
2.
Neither broad nor limited knowledge of the English language
-
3.
Limited knowledge of the English language
-
3.
Please, tick (√) one box in each row
N° | Questions | N | R | S | O | A |
1 | How often did it happen that the reading test/exam tasks depend on each other? | |||||
2 | How often did the reading test/exam tasks cause you any emotional threat? |
-
Part 5: Impact
-
1.
Did you often prepare for the reading comprehension tests/sub-tests and exams?
-
If No, why?...................................................................................................
-
............................................................................................................................
-
2.
Did your teacher always inform you this year about the aspects of the reading test/exam [format and testing techniques (some of which are mentioned on page 2)] before you took it?
-
3.
Did your teacher often ask you to suggest appropriate tasks for use in a reading test/exam?
-
4.
Did your teacher often ask you to suggest appropriate marking criteria for use in a reading test/exam?
-
5.
How many times did you receive remarks about your strengths and weaknesses in the two reading tests that you took this year?
-
Part 6: Practicality
-
1.
Please, tick (√) one box in each row
Questions | N | R | S | O | A | |
1 | How often did you complete the reading test/exam within the fixed time? | |||||
2 | How often have the reading tests/exams been managed without problems? |
Thank you for your cooperation
Appendix 2
Teachers’ Questionnaire
This questionnaire investigates problems in reading comprehension tests among first year English students at the tertiary level in Tunisia and tries to find different solutions to them. This questionnaire may take you about 15 min to fill in. The information you provide will be strictly confidential and used for academic purposes only. Your cooperation is very important for this research project. Please, answer truthfully.
Section A—Background Information
-
1.
Name (optional): ...................................................................................................
-
2.
Academic status: Please, tick (√) the appropriate one.
-
3.
Teaching first year reading course:
-
4.
Training:
-
Have you received any type of training in testing reading?
-
Please explain your answer, whether Yes or No .................................................. ................................................................................................................................
Section B—Test usefulness
-
Part 1: Reliability
-
1.
Which reading testing technique(s) do you often use in your reading tests/exams?
-
2.
Do you often provide a detailed scoring key for the first year reading test/exam?
-
3.
Have you received any type of training in scoring the first year reading test/exam?
-
4.
Do you often specify the acceptable responses at the outset of scoring?
-
5.
State the extent to which you agree or disagree with the two following statements [Strongly Disagree (SD); Disagree (D); Undecided (U); Agree (A); Strongly Agree (SA)]
Statements | SD | D | U | A | SA | |
1 | Each copy of a reading test and final exam should be scored by at least two independent scorers | |||||
2 | In the reading tests, first year students should be identified by number, not by name, as in the final exam |
-
6.
How are your first year reading comprehension tests/exams? They are…
-
7.
Do you often allow your first year students to choose between tasks in a reading examination?
-
8.
How often do you inform your first-year students about the reading test/exam format before finalizing it?
-
Part 2: Construct validity
-
1.
Do your first year reading test tasks only measure your students’ ability to understand a text?
-
If No, explain..........................................................................................................
-
2.
Do you often take a particular category of students’ proficiency level into consideration when designing the reading exam questions?
-
If Yes, which level of students do you often have in mind?
-
The most proficient students
-
The moderately proficient students
-
The least proficient students
-
3.
Do you often give different versions of the examination to the first-year students on the day of the reading test?
-
Part 3: Authenticity
-
1.
Please, tick (√) one box in each row
Questions | N | R | S | O | A | |
1 | How often do you opt for real-world first year reading exam tasks? | |||||
2 | How often do you give your first-year students a text which has a real-world topic in the reading exam? |
-
2.
What do the vocabulary task(s) that you often design in the first year reading tests represent?
-
1.
They represent question(s) about the meanings of words or expressions
-
2.
They represent paragraph(s) to be filled in with given words
-
3.
They represent question(s) about the meanings of words or expressions and paragraph(s) to be filled in with given words
-
Part 4: Interactiveness
-
1.
Do your reading exams always contain some tasks which involve your first-year students’ topical knowledge?
-
2.
Do you often take first year students’ characteristics (e.g., age, nationality, educational background) into consideration when designing the reading exam tasks?
-
If No, why?
-
...............................................................................................................................
What does the understanding often required in your reading test tasks involve? It involves…
-
1.
A wide range of first year students’ areas of language knowledge
-
2.
Neither wide nor narrow range of first year students’ areas of language knowledge
-
3.
A narrow range of first year students’ areas of language knowledge
-
3.
How often do you design first year reading comprehension test tasks that depend on each other?
-
Part 5: Impact
-
1.
What do you often assess in your first year reading comprehension test?
Students’ ability to understand a text
Students’ ability to write
Students’ other abilities (please specify)......................................
-
2.
Which type of testing do you often use in the first year reading test?
-
1.
Direct testing: A testing method that closely matches the reading comprehension ability being measured
-
2.
Indirect testing: A testing method that measures abilities related to the reading comprehension ability being tested, rather than this ability itself
-
3.
Direct and indirect testing
-
3.
What is your reading test/exam based on?
-
4.
Do you always familiarise your first-year students with the reading test/exam format before they take it?
-
Part 6: Practicality
-
1.
Please, tick (√) one box in each row
N° | Questions | N | R | S | O | A |
1 | How often are administrative details established clearly before the reading test/exam? | |||||
2 | How often do your first-year students complete the reading test/exam within the set time frame? | |||||
3 | How often is your scoring system for the reading test/exam feasible within your time frame? | | | | |
4 | How often have the material resources (space, equipment, and materials) been provided for the reading test/exam’s design? | | | | |
5 | How often have the material resources been provided for the reading test/exam’s administration? | | | | |
6 | How often have the reading tests/exams been administered smoothly (without problems)? |
Thank you for your cooperation
© 2018 Springer International Publishing AG
Cite this chapter
Mattoussi, Y. (2018). Testing Usefulness of Reading Comprehension Exams Among First Year Students of English at the Tertiary Level in Tunisia. In: Hidri, S. (eds) Revisiting the Assessment of Second Language Abilities: From Theory to Practice. Second Language Learning and Teaching. Springer, Cham. https://doi.org/10.1007/978-3-319-62884-4_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-62883-7
Online ISBN: 978-3-319-62884-4