Abstract
This paper examines ways to enrich the feedback information students receive in closed-type quiz activities that include a revision phase (i.e., students are allowed to change their initial answers after they receive information from their peers, teacher, or system). Typically, in such activities, the information students receive is based on the percentage of students under each possible question choice. The conducted study analyzes the potential of two additional variables, namely the students’ level of preparation and confidence. Both variables are self-reported and, therefore, subjective. During the Fall semester of 2016, 91 sophomore students enrolled in an Information Systems course participated in the study for five weeks. The activity took place during the first 20 min of each class. Students had to go through three phases: (a) answer a multiple-choice quiz with 8 questions and 4 options per question, (b) receive feedback based on the whole classroom population, and (c) see the correct answers and discuss them with the teacher in the lecture that followed. The students were randomly assigned to four conditions, based on the feedback they received. The Control group only received information on the percentage of students that selected each choice; the Confidence group received the percentage and the average level of confidence of the students that selected each choice; the Preparation group received the percentage and the average level of preparation of the students that selected each choice; and the Both group received the percentage together with both the average confidence and preparation levels of the students that selected each choice. Result analysis showed that in the most challenging questions (i.e., those where students’ answers diverged) the students in the Confidence, Preparation, and Both groups significantly outperformed the students in the Control group.
In addition, both confidence and preparation variables were significantly correlated to students’ performance during the initial phase, suggesting that students were accurate and sincere in describing their preparation and confidence levels. This paper is an extended version of [1], presented at the 9th International Conference on Computer Supported Education.
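The per-choice feedback described above (answer percentages, optionally augmented with average self-reported confidence and/or preparation, depending on the condition) can be sketched as a small aggregation routine. This is an illustrative reconstruction, not the authors' implementation; the function and field names (`feedback`, `choice`, `confidence`, `preparation`, and the condition labels) are assumptions for the sketch.

```python
from statistics import mean

def feedback(responses, condition):
    """Aggregate per-choice feedback for one quiz question.

    responses: list of dicts with keys 'choice', 'confidence', 'preparation'
               (confidence/preparation are the students' self-reported levels)
    condition: 'control', 'confidence', 'preparation', or 'both'
    Returns a dict mapping each choice to the feedback shown to students.
    NOTE: names and data shapes are illustrative; the paper does not
    specify an implementation.
    """
    total = len(responses)
    out = {}
    for choice in sorted({r["choice"] for r in responses}):
        group = [r for r in responses if r["choice"] == choice]
        entry = {"percent": round(100 * len(group) / total, 1)}
        # The Confidence and Both conditions add the average confidence
        if condition in ("confidence", "both"):
            entry["avg_confidence"] = round(mean(r["confidence"] for r in group), 2)
        # The Preparation and Both conditions add the average preparation
        if condition in ("preparation", "both"):
            entry["avg_preparation"] = round(mean(r["preparation"] for r in group), 2)
        out[choice] = entry
    return out
```

Under this sketch, the Control group would see only the `percent` field, while the Both group would see all three fields for each answer choice.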
References
Papadopoulos, P.M., Natsis, A., Obwegeser, N.: Improving the quiz: student preparation and confidence as feedback metrics. In: Proceedings of the 9th International Conference on Computer Supported Education – CSEDU 2017, Porto, Portugal (2017, in press)
Buil, I., Catalán, S., Martínez, E.: Do clickers enhance learning? A control-value theory approach. Comput. Educ. 103, 170–182 (2016)
Sosa, G.W., Berger, D.E., Saw, A.T., Mary, J.C.: Effectiveness of computer-assisted instruction in statistics: a meta-analysis. Rev. Educ. Res. 81(1), 97–128 (2011)
Bransford, J.D., Brown, A., Cocking, R.: How People Learn: Mind, Brain, Experience and School. National Academy Press, Washington (2000)
Kleitman, S., Costa, D.S.J.: The role of a novel formative assessment tool (Stats-mIQ) and individual differences in real-life academic performance. Learn. Individ. Diff. 29, 150–161 (2014)
Wang, T.-H.: Web-based quiz-game-like formative assessment: development and evaluation. Comput. Educ. 51(3), 1247–1263 (2008)
Denny, P.: The effect of virtual achievements on student engagement. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2013), pp. 763–772. ACM, New York (2013)
Bodemer, D.: Tacit guidance for collaborative multimedia learning. Comput. Hum. Behav. 27(3), 1079–1086 (2011)
Janssen, J., Bodemer, D.: Coordinated computer-supported collaborative learning: awareness and awareness tools. Educ. Psychol. 48(1), 40–55 (2013)
Lin, J.-W., Mai, L.-J., Lai, Y.-C.: Peer interaction and social network analysis of online communities with the support of awareness of different contexts. Int. J. Comput. Supp. Collab. Learn. 10(2), 139–159 (2015)
Buder, J.: Group awareness tools for learning: current and future directions. Comput. Hum. Behav. 27(3), 1114–1117 (2011)
Erkens, M., Schlottbom, P., Bodemer, D.: Qualitative and quantitative information in cognitive group awareness tools: impact on collaborative learning. In: Looi, C.-K., Polman, J., Cress, U., Reimann, P. (eds.) Transforming Learning, Empowering Learners: 12th International Conference of the Learning Sciences, pp. 458–465. International Society of the Learning Sciences, Singapore (2016)
Schnaubert, L., Bodemer, D.: Subjective validity ratings to support shared knowledge construction in CSCL. In: Lindwall, O., Häkkinen, P., Koschmann, T., Tchounikine, P., Ludvigsen, S. (eds.) Exploring the Material Conditions of Learning: The Computer Supported Collaborative Learning (CSCL) Conference 2015, vol. 2, pp. 933–934. International Society of the Learning Sciences, Gothenburg (2015)
Méndez-Coca, D., Slisko, J.: Software Socrative and smartphones as tools for implementation of basic processes of active physics learning in classroom: an initial feasibility study with prospective teachers. Eur. J. Phys. Educ. 4(2), 17–24 (2013)
Papadopoulos, P.M., Demetriadis, S.N., Weinberger, A.: “Make it explicit!”: improving collaboration through increase of script coercion. J. Comput. Assist. Learn. 29(4), 383–398 (2013)
DiBattista, D., Mitterer, J.O., Gosse, L.: Acceptance by undergraduates of the immediate feedback assessment technique for multiple-choice testing. Teach. High. Educ. 9(1), 17–28 (2004)
Deterding, S., Dixon, D., Khaled, R., Nacke, L.: From game design elements to gamefulness: defining “gamification”. In: Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, pp. 9–15. ACM, New York (2011)
Wang, A.I.: The wear out effect of a game-based student response system. Comput. Educ. 82, 217–227 (2015)
Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., Koedinger, K.: Why students engage in “gaming the system” behavior in interactive learning environments. J. Interact. Learn. Res. 19(2), 185–224 (2008)
Papadopoulos, P.M., Lagkas, T., Demetriadis, S.N.: How revealing rankings affects student attitude and performance in a peer review learning environment. In: Zvacek, S., Restivo, M.T., Uhomoibhi, J., Helfert, M. (eds.) CSEDU 2015. CCIS, vol. 583, pp. 225–240. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-29585-5_13
Gibbons, F.X., Buunk, B.P.: Individual differences in social comparison: the development of a scale of social comparison orientation. J. Pers. Soc. Psychol. 76(1), 129–142 (1999)
Buunk, A.P., Gibbons, F.X.: Social comparison orientation: a new perspective on those who do and those who don’t compare with others. In: Guimond, S. (ed.) Social Comparison and Social Psychology: Understanding Cognition, Intergroup Relations and Culture, pp. 15–33. Cambridge University Press, Cambridge (2006)
Acknowledgements
This work has been partially funded by a Starting Grant from AUFF (Aarhus Universitets Forskningsfond), titled “Innovative and Emerging Technologies in Education”.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Papadopoulos, P.M., Natsis, A., Obwegeser, N. (2018). Using the Students’ Levels of Preparation and Confidence as Feedback Information in Quiz-Based Learning Activities. In: Escudeiro, P., Costagliola, G., Zvacek, S., Uhomoibhi, J., McLaren, B. (eds) Computers Supported Education. CSEDU 2017. Communications in Computer and Information Science, vol 865. Springer, Cham. https://doi.org/10.1007/978-3-319-94640-5_5
DOI: https://doi.org/10.1007/978-3-319-94640-5_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-94639-9
Online ISBN: 978-3-319-94640-5
eBook Packages: Computer Science; Computer Science (R0)