Dealing with Open-Answer Questions in a Peer-Assessment Environment

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7558)

Abstract

Correcting open-answer questions is a heavy task since, in principle, all the students' answers have to be graded. In this paper we give evidence of the possibility of reducing the teacher's workload on open-question questionnaires, by means of a module managing a rough constraint-based model of the students' decisions involved in a peer-assessment task. By modeling students' decisions we relate their competence on the topic (K) to their ability to judge (J) others' work and to the correctness (C) of their own (open) answer. The network of constraints and relations established among these variables through the students' choices allows us to constrain the set of possible values of the answers' correctness (C). Our system suggests which subset of the answers the teacher should correct in order to narrow the set of hypotheses and produce a complete set of grades. The model is quite simple, yet sufficient to show that the number of required corrections can be as small as half of the initial answers. To support this result, we report on an extensive set of simulated experiments addressing three research questions: 1) is the method able to deduce the whole set of grades with few corrections? 2) which set of parameters is best for running actual experiments? 3) is the model "robust" with respect to simulations with a high probability of random data?
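The core mechanism described above — peer marks constraining the correctness of answers, with teacher corrections pruning the hypothesis space — can be illustrated with a deliberately simplified sketch. This is our own toy reconstruction, not the paper's actual model: the variable names (`C` for answer correctness, `J` for judging ability), the boolean domains, and the single constraint rule ("a good judge's mark matches the answer's true correctness") are all simplifying assumptions.

```python
from itertools import product

# Toy constraint model (hypothetical; not the paper's exact formulation).
# C[s]: is student s's answer correct?   J[s]: is s a reliable judge?
# peer_marks[(judge, answer)] = the mark the judge gave (True = "correct").
# Constraint: if J[judge] holds, the judge's mark equals the answer's C.

def consistent(C, J, peer_marks):
    """Every reliable judge's mark must match the judged answer's correctness."""
    return all(not J[j] or mark == C[a] for (j, a), mark in peer_marks.items())

def solve(students, peer_marks, known_C):
    """Enumerate all (C, J) assignments consistent with peer marks and with
    the answers the teacher has already graded (known_C)."""
    solutions = []
    for c_vals in product([False, True], repeat=len(students)):
        C = dict(zip(students, c_vals))
        if any(C[s] != v for s, v in known_C.items()):
            continue  # contradicts a teacher-graded answer
        for j_vals in product([False, True], repeat=len(students)):
            J = dict(zip(students, j_vals))
            if consistent(C, J, peer_marks):
                solutions.append((C, J))
    return solutions

students = ["s1", "s2", "s3"]
peer_marks = {("s1", "s2"): True, ("s2", "s3"): False, ("s3", "s1"): True}

before = solve(students, peer_marks, {})          # no teacher grading yet
after = solve(students, peer_marks, {"s1": True})  # teacher grades s1's answer
print(len(before), len(after))
```

Grading even a single answer shrinks the set of consistent hypotheses, which is the intuition behind the system's suggestion of which subset of answers the teacher should correct next; the paper's model additionally involves the competence variable K and a richer grade scale than booleans.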




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sterbini, A., Temperini, M. (2012). Dealing with Open-Answer Questions in a Peer-Assessment Environment. In: Popescu, E., Li, Q., Klamma, R., Leung, H., Specht, M. (eds) Advances in Web-Based Learning - ICWL 2012. ICWL 2012. Lecture Notes in Computer Science, vol 7558. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33642-3_26

  • DOI: https://doi.org/10.1007/978-3-642-33642-3_26

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33641-6

  • Online ISBN: 978-3-642-33642-3

  • eBook Packages: Computer Science, Computer Science (R0)
