Teacher Supported Peer Evaluation Through OpenAnswer: A Study of Some Factors

  • Conference paper
Computers Supported Education (CSEDU 2016)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 739))


Abstract

The OpenAnswer system computes grades for the answers that a class of students gives to open-ended questions, based on the students' peer evaluation and on the teacher's grading of a subset of the answers. Here we analyze the system's performance, expressed as its ability to infer correct grades from a limited amount of grading work by the teacher. Since performance may well depend on how several aspects of the system are defined, we analyze alternative choices for these definitions, with the aim of identifying those that yield better system behavior. The factors we investigate relate to the Bayesian framework underpinning OpenAnswer: the possible probability distributions of the key variables, the conditional probability tables, and the methods for mapping our statistical variables onto usable grades. Moreover, we analyze the relationship between the two main variables, which express the knowledge possessed by a student and her/his peer-assessment skill. By exploring alternative configurations of the system's parameters we conclude that Knowledge is in general more difficult to attain than Assessment skill. The way we reach this (unsurprising) conclusion also provides quantitative evidence for Bloom's ranking.
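To make the Bayesian machinery described above concrete, the following is a minimal, purely illustrative sketch — not the authors' actual OpenAnswer model. It assumes a hypothetical setup in which a student's Knowledge takes one of four discrete levels, a peer's reported grade depends on the true level and on the peer's assessing skill through an invented conditional probability table (CPT), and the resulting posterior distribution is mapped onto a usable grade either by its mode or by its expected value:

```python
# Toy discrete Bayesian update in the spirit of the abstract.
# All variable names, CPT shapes, and numbers are invented for illustration.

LEVELS = [0, 1, 2, 3]  # hypothetical discrete grade levels

def cpt_peer_grade(true_level, skill):
    """P(peer reports grade g | true level, peer skill in [0, 1]).
    A skilled peer concentrates mass on the true level; an unskilled
    peer drifts toward a uniform report."""
    n = len(LEVELS)
    return [skill * (1.0 if g == true_level else 0.0) + (1 - skill) / n
            for g in LEVELS]

def posterior_knowledge(prior, observed_grade, skill):
    """Bayes' rule: P(K = k | peer grade) is proportional to
    P(grade | K = k, skill) * P(K = k)."""
    unnorm = [cpt_peer_grade(k, skill)[observed_grade] * prior[k]
              for k in LEVELS]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def map_to_grade(dist, method="expected"):
    """Map a distribution over levels onto a single usable grade:
    either the expected level or the most probable level (mode)."""
    if method == "expected":
        return sum(k * p for k, p in zip(LEVELS, dist))
    return max(LEVELS, key=lambda k: dist[k])

prior = [0.25] * 4  # uniform prior over knowledge levels
post = posterior_knowledge(prior, observed_grade=3, skill=0.8)
print(map_to_grade(post, "mode"))                   # most probable level: 3
print(round(map_to_grade(post, "expected"), 3))     # expected level: 2.7
```

The two mapping methods can disagree (here 3 vs. 2.7), which is exactly the kind of design choice — distribution-to-grade mapping — whose impact the paper investigates.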


References

  1. Anderson, L.W., Krathwohl, D.R. (eds.): A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. Allyn and Bacon, Boston (2000)

  2. Anderson, J.R., Corbett, A.T., Koedinger, K.R., Pelletier, R.: Cognitive tutors: lessons learned. J. Learn. Sci. 4(2), 167–207 (1995)

  3. Bloom, B.S., Engelhart, M.D., Furst, E.J., Hill, W.H., Krathwohl, D.R.: Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. David McKay, New York (1956)

  4. Castellanos-Nieves, D., Fernández-Breis, J., Valencia-García, R., Martínez-Béjar, R., Iniesta-Moreno, M.: Semantic Web Technologies for supporting learning assessment. Inf. Sci. 181(9) (2011)

  5. Cho, K., MacArthur, C.: Student revision with peer and expert reviewing. Learn. Instr. 20(4), 328–338 (2010)

  6. Chung, H., Graf, S., Lai, K.R., Kinshuk: Enrichment of peer assessment with agent negotiation. IEEE Trans. Learn. Technol. 4(1), 35–46 (2011)

  7. Conati, C., Gartner, A., Vanlehn, K.: Using Bayesian networks to manage uncertainty in student modeling. User Model. User-Adap. Inter. 12, 371–417 (2002)

  8. De Marsico, M., Sterbini, A., Temperini, M.: The definition of a tunneling strategy between adaptive learning and reputation-based group activities. In: Proceedings of 11th IEEE International Conference on Advanced Learning Technologies, pp. 498–500 (2011)

  9. De Marsico, M., Sterbini, A., Temperini, M.: A framework to support social-collaborative personalized e-Learning. In: Kurosu, M. (ed.) HCI 2013. LNCS, vol. 8005, pp. 351–360. Springer, Heidelberg (2013). doi:10.1007/978-3-642-39262-7_40

  10. De Marsico, M., Sterbini, A., Temperini, M.: A strategy to join adaptive and reputation-based social-collaborative e-learning, through the Zone of Proximal Development. Int. J. Distance Educ. Tech. (IJDET) 11(3), 12–31 (2013)

  11. El-Kechaï, N., Delozanne, É., Prévit, D., Grugeon, B., Chenevotot, F.: Evaluating the performance of a diagnosis system in school algebra. In: Leung, H., Popescu, E., Cao, Y., Lau, R.W.H., Nejdl, W. (eds.) ICWL 2011. LNCS, vol. 7048, pp. 263–272. Springer, Heidelberg (2011). doi:10.1007/978-3-642-25813-8_28

  12. Formisano, A., Omodeo, E.G., Temperini, M.: Layered map reasoning: an experimental approach put to trial on sets. Electron. Notes Theor. Comput. Sci. 48, 1–28 (2001)

  13. Formisano, A., Omodeo, E.G., Temperini, M.: Goals and benchmarks for automated map reasoning. J. Symb. Comput. 29(2), 259–297 (2000)

  14. Gasparetti, F., Limongelli, C., Sciarrone, F.: Wiki course builder: a system for retrieving and sequencing didactic materials from Wikipedia. In: Proceedings of ITHET 2015 (2015)

  15. Jackson, K., Trochim, W.: Concept mapping as an alternative approach for the analysis of open-ended survey responses. Organ. Res. Methods 5, 307–336 (2002)

  16. Li, L.X., Liu, X., Steckelberg, A.L.: Assessor or assessee: how student learning improves by giving and receiving peer feedback. Br. J. Ed. Tech. 41(3), 525–536 (2010)

  17. Limongelli, C., Lombardi, M., Marani, A., Sciarrone, F., Temperini, M.: A recommendation module to help teachers build courses through the Moodle Learning Management System. New Rev. Hypermedia Multimed. 22(1–2) (2016)

  18. Limongelli, C., Lombardi, M., Marani, A., Sciarrone, F.: A teacher model to speed up the process of building courses. In: Kurosu, M. (ed.) HCI 2013. LNCS, vol. 8005, pp. 434–443. Springer, Heidelberg (2013). doi:10.1007/978-3-642-39262-7_50

  19. Limongelli, C., Lombardi, M., Marani, A., Sciarrone, F.: A teaching-style based social network for didactic building and sharing. In: Lane, H.C., Yacef, K., Mostow, J., Pavlik, P. (eds.) AIED 2013. LNCS, vol. 7926, pp. 774–777. Springer, Heidelberg (2013). doi:10.1007/978-3-642-39112-5_110

  20. Limongelli, C., Sciarrone, F., Temperini, M.: A social network-based teacher model to support course construction. Comput. Hum. Behav. 51, 1077–1085 (2015)

  21. Metcalfe, J., Shimamura, A.P.: Metacognition: Knowing About Knowing. MIT Press, Cambridge (1994)

  22. Miller, P.: The effect of scoring criteria specificity on peer and self-assessment. Assess. Eval. High. Educ. 28(4), 383–394 (2003)

  23. Palmer, K., Richardson, P.: On-line assessment and free-response input - a pedagogic and technical model for squaring the circle. In: Proceedings of 7th Computer Assisted Assessment Conference, pp. 289–300 (2003)

  24. Romero, C., Ventura, S.: Educational data mining: a review of the state of the art. IEEE Trans. SMC Part C 40(6), 601–618 (2010)

  25. Sadler, P.M., Good, E.: The impact of self- and peer-grading on student learning. Educ. Assess. 11(1), 1–31 (2006)

  26. Sterbini, A., Temperini, M.: Collaborative projects and self evaluation within a social reputation-based exercise-sharing system. In: Proceedings of WI-IAT 2009, vol. 3, pp. 243–246 (2009)

  27. Sterbini, A., Temperini, M.: Dealing with open-answer questions in a peer-assessment environment. In: Popescu, E., Li, Q., Klamma, R., Leung, H., Specht, M. (eds.) ICWL 2012. LNCS, vol. 7558, pp. 240–248. Springer, Heidelberg (2012). doi:10.1007/978-3-642-33642-3_26

  28. Sterbini, A., Temperini, M.: Analysis of open answers via mediated peer-assessment. In: Proceedings of the International Conference on System Theory, Control and Computing, Workshop SPEL (2013)

  29. Sterbini, A., Temperini, M.: OpenAnswer, a framework to support teacher’s management of open answers through peer assessment. In: Proceedings of Frontiers in Education, pp. 164–170 (2013)

  30. Topping, K.: Peer assessment between students in colleges and universities. Rev. Educ. Res. 68, 249–276 (1998)

  31. Yamanishi, K., Li, H.: Mining open answers in questionnaire data. IEEE Intell. Syst. 17, 58–63 (2002)



Corresponding author

Correspondence to Marco Temperini.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

De Marsico, M., Sterbini, A., Temperini, M. (2017). Teacher Supported Peer Evaluation Through OpenAnswer: A Study of Some Factors. In: Costagliola, G., Uhomoibhi, J., Zvacek, S., McLaren, B. (eds) Computers Supported Education. CSEDU 2016. Communications in Computer and Information Science, vol 739. Springer, Cham. https://doi.org/10.1007/978-3-319-63184-4_23


  • DOI: https://doi.org/10.1007/978-3-319-63184-4_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-63183-7

  • Online ISBN: 978-3-319-63184-4

