Using the Students’ Levels of Preparation and Confidence as Feedback Information in Quiz-Based Learning Activities

  • Conference paper
  • First Online:
Computers Supported Education (CSEDU 2017)

Abstract

This paper examines ways to enrich the feedback information students receive in closed-type quiz activities that include a revision phase (i.e., students are allowed to change their initial answers after they receive information from their peers, teacher, or system). Typically, in such activities, the information students receive is based on the percentage of students under each possible question choice. The conducted study analyzes the potential of two additional variables, namely the students' levels of preparation and confidence. Both variables are self-reported and, therefore, subjective. During the Fall semester of 2016, 91 sophomore students enrolled in an Information Systems course participated in the study for five weeks. The activity took place during the first 20 min of each class. Students went through three phases: (a) answering a multiple-choice quiz with 8 questions and 4 options per question, (b) receiving feedback based on the whole classroom population, and (c) seeing the correct answers and discussing them with the teacher in the lecture that followed. The students were randomly assigned to four conditions, based on the feedback they received. The Control group only received information on the percentage of students that selected each choice; the Confidence group received feedback on the percentage and the average level of confidence of students that selected each choice; the Preparation group received feedback on the percentage and the average level of preparation of students that selected each choice; and finally the Both group received feedback on the percentage and both the average levels of confidence and preparation of students that selected each choice. Analysis of the results showed that on the most challenging questions (i.e., those where students' answers diverged) the students in the Confidence, Preparation, and Both groups significantly outperformed the students in the Control group.
In addition, both the confidence and preparation variables were significantly correlated with students' performance during the initial phase, suggesting that students were accurate and sincere in describing their preparation and confidence levels. This paper is an extended version of [1], presented at the 9th International Conference on Computer Supported Education.
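The per-choice feedback described above is a straightforward aggregation: for each answer option, the percentage of students who chose it, plus (depending on condition) the mean self-reported confidence and preparation of that subgroup. As a minimal sketch (hypothetical names and data structures, not the authors' actual implementation), this could look like:

```python
from collections import defaultdict

def aggregate_feedback(responses):
    """Aggregate quiz responses into per-choice feedback.

    responses: list of dicts with keys 'choice' (e.g. 'A'..'D'), and
    'confidence' and 'preparation' (self-reported ratings).
    Returns a dict mapping each choice to the percentage of students
    who picked it and the mean confidence/preparation of that subgroup.
    """
    by_choice = defaultdict(list)
    for r in responses:
        by_choice[r["choice"]].append(r)
    total = len(responses)
    feedback = {}
    for choice, group in by_choice.items():
        feedback[choice] = {
            "percent": 100.0 * len(group) / total,
            "avg_confidence": sum(r["confidence"] for r in group) / len(group),
            "avg_preparation": sum(r["preparation"] for r in group) / len(group),
        }
    return feedback

# Example: three students answer one question.
responses = [
    {"choice": "A", "confidence": 4, "preparation": 3},
    {"choice": "A", "confidence": 2, "preparation": 5},
    {"choice": "B", "confidence": 5, "preparation": 4},
]
fb = aggregate_feedback(responses)
```

Each experimental condition would then simply display a subset of these fields: the Control group sees only `percent`, while the Both group sees all three.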


Notes

  1. http://www.socrative.com/.

  2. https://peerwise.cs.auckland.ac.nz/.

  3. http://getkahoot.com.

References

  1. Papadopoulos, P.M., Natsis, A., Obwegeser, A.: Improving the quiz: student preparation and confidence as feedback metrics. In: The Proceedings of the 9th International Conference on Computer Supported Education – CSEDU 2017, Porto, Portugal (2017, in press)

  2. Buil, I., Catalán, S., Martínez, E.: Do clickers enhance learning? A control-value theory approach. Comput. Educ. 103, 170–182 (2016)

  3. Sosa, G.W., Berger, D.E., Saw, A.T., Mary, J.C.: Effectiveness of computer-assisted instruction in statistics: a meta-analysis. Rev. Educ. Res. 81(1), 97–128 (2011)

  4. Bransford, J.D., Brown, A., Cocking, R.: How People Learn: Mind, Brain, Experience and School. National Academy Press, Washington (2000)

  5. Kleitman, S., Costa, D.S.J.: The role of a novel formative assessment tool (Stats-mIQ) and individual differences in real-life academic performance. Learn. Individ. Diff. 29, 150–161 (2014)

  6. Wang, T.-H.: Web-based quiz-game-like formative assessment: development and evaluation. Comput. Educ. 51(3), 1247–1263 (2008)

  7. Denny, P.: The effect of virtual achievements on student engagement. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2013), pp. 763–772. ACM, New York (2013)

  8. Bodemer, D.: Tacit guidance for collaborative multimedia learning. Comput. Hum. Behav. 27(3), 1079–1086 (2011)

  9. Janssen, J., Bodemer, D.: Coordinated computer-supported collaborative learning: awareness and awareness tools. Educ. Psychol. 48(1), 40–55 (2013)

  10. Lin, J.-W., Mai, L.-J., Lai, Y.-C.: Peer interaction and social network analysis of online communities with the support of awareness of different contexts. Int. J. Comput. Supp. Collab. Learn. 10(2), 139–159 (2015)

  11. Buder, J.: Group awareness tools for learning: current and future directions. Comput. Hum. Behav. 27(3), 1114–1117 (2011)

  12. Erkens, M., Schlottbom, P., Bodemer, D.: Qualitative and quantitative information in cognitive group awareness tools: impact on collaborative learning. In: Looi, C.-K., Polman, J., Cress, U., Reimann, P. (eds.) Transforming Learning, Empowering Learners: 12th International Conference of the Learning Sciences, pp. 458–465. International Society of the Learning Sciences, Singapore (2016)

  13. Schnaubert, L., Bodemer, D.: Subjective validity ratings to support shared knowledge construction in CSCL. In: Lindwall, O., Häkkinen, P., Koschmann, T., Tchounikine, P., Ludvigsen, S. (eds.) Exploring the Material Conditions of Learning: The Computer Supported Collaborative Learning (CSCL) Conference 2015, vol. 2, pp. 933–934. International Society of the Learning Sciences, Gothenburg (2015)

  14. Méndez-Coca, D., Slisko, J.: Software Socrative and smartphones as tools for implementation of basic processes of active physics learning in classroom: an initial feasibility study with prospective teachers. Eur. J. Phys. Educ. 4(2), 17–24 (2013)

  15. Papadopoulos, P.M., Demetriadis, S.N., Weinberger, A.: “Make it explicit!”: improving collaboration through increase of script coercion. J. Comput. Assist. Learn. 29(4), 383–398 (2013)

  16. DiBattista, D., Mitterer, J.O., Gosse, L.: Acceptance by undergraduates of the immediate feedback assessment technique for multiple-choice testing. Teach. High. Educ. 9(1), 17–28 (2004)

  17. Deterding, S., Dixon, D., Khaled, R., Nacke, L.: From game design elements to gamefulness: defining “gamification”. In: Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, pp. 9–15. ACM, New York (2011)

  18. Wang, A.I.: The wear out effect of a game-based student response system. Comput. Educ. 82, 217–227 (2015)

  19. Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., Koedinger, K.: Why students engage in “gaming the system” behavior in interactive learning environments. J. Interact. Learn. Res. 19(2), 185–224 (2008)

  20. Papadopoulos, P.M., Lagkas, T., Demetriadis, S.N.: How revealing rankings affects student attitude and performance in a peer review learning environment. In: Zvacek, S., Restivo, M.T., Uhomoibhi, J., Helfert, M. (eds.) CSEDU 2015. CCIS, vol. 583, pp. 225–240. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-29585-5_13

  21. Gibbons, F.X., Buunk, B.P.: Individual differences in social comparison: the development of a scale of social comparison orientation. J. Pers. Soc. Psychol. 76(1), 129–142 (1999)

  22. Buunk, A.P., Gibbons, F.X.: Social comparison orientation: a new perspective on those who do and those who don’t compare with others. In: Guimond, S. (ed.) Social Comparison and Social Psychology: Understanding Cognition, Intergroup Relations and Culture, pp. 15–33. Cambridge University Press, Cambridge (2006)

Acknowledgements

This work has been partially funded by a Starting Grant from AUFF (Aarhus Universitets Forskningsfond), titled “Innovative and Emerging Technologies in Education”.

Author information

Correspondence to Antonis Natsis.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Papadopoulos, P.M., Natsis, A., Obwegeser, N. (2018). Using the Students’ Levels of Preparation and Confidence as Feedback Information in Quiz-Based Learning Activities. In: Escudeiro, P., Costagliola, G., Zvacek, S., Uhomoibhi, J., McLaren, B. (eds) Computers Supported Education. CSEDU 2017. Communications in Computer and Information Science, vol 865. Springer, Cham. https://doi.org/10.1007/978-3-319-94640-5_5

  • DOI: https://doi.org/10.1007/978-3-319-94640-5_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94639-9

  • Online ISBN: 978-3-319-94640-5

  • eBook Packages: Computer Science; Computer Science (R0)
