
Evaluating the Sentence Form Test as a Test of English Writing for Students in Three Japanese Universities

  • Conference paper
  • Pacific Rim Objective Measurement Symposium (PROMS) 2016 Conference Proceedings

Abstract

The current project was a follow-up to PROMS presentations in 2014 and 2015, both of which pointed to the potential value of “indirect,” objective writing tests, over and above “direct” ratings of writing, for English entrance examinations as well as for lower-stakes placement tests for writing classes in Japanese universities. The 2014 results showed that the entrance examination essay rating process of one EAP department in a Japanese university, despite adherence to standard rating procedures, was fraught with difficulties, including unpredictable and overly limited use of the rubric by raters, as well as rubric flaws such as unclearly defined score categories. The 2015 results focused on an “indirect” writing test, the Sentence Form Test (SFT), a multiple-choice test of writing knowledge, in particular of correct sentence form. Results showed that the fit, difficulty, and reliability of the SFT were acceptable, and SFT scores also demonstrated criterion-related validity, as shown by their correlations with both analytic and holistic essay ratings. For the current follow-up study, a larger number of students from three Japanese universities completed the SFT, in order to examine whether it would work equally well as a test for female and male students from different universities, in a variety of programs of study, and across a wider range of English proficiency. Results showed that the SFT performed well overall in assessing students, though more test items were needed for students at lower proficiency levels. To examine the character of the test in more detail, a subsample of students with higher levels of English proficiency was drawn from the larger sample in order to enhance test targeting; results for this subsample showed that the SFT demonstrated both unidimensionality and person-invariant item calibration.



Corresponding author

Correspondence to Kristy King Takagi.


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Takagi, K. K., & Pan, Y. (2018). Evaluating the Sentence Form Test as a test of English writing for students in three Japanese universities. In Q. Zhang (Ed.), Pacific Rim Objective Measurement Symposium (PROMS) 2016 Conference Proceedings. Springer, Singapore. https://doi.org/10.1007/978-981-10-8138-5_13


  • DOI: https://doi.org/10.1007/978-981-10-8138-5_13

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-8137-8

  • Online ISBN: 978-981-10-8138-5

  • eBook Packages: Education (R0)
