Logic Evaluation Through Game-Based Assessment

  • Carlos Arce-Lopera
  • Alan Perea
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 973)

Abstract

Digital game-based evaluations may help solve several problems of traditional paper-based assessments, such as students’ test anxiety and evaluation rigidity. In addition, video games make it possible to record interaction data about the thinking process, which can later be used in formative assessment. A game application was developed as a tool for evaluating the logic abilities of first-year university students. The game was designed as a puzzle with different difficulty levels. Experimental results showed that the game scores were not significantly different from the grades obtained with traditional paper-based evaluations. However, for most students, the game-based assessment significantly lowered perceived frustration and increased user engagement. The use of gamification in student assessment can lower test anxiety and reveal useful insights into student thinking processes. Moreover, automatic, real-time feedback could markedly improve learning and guide students toward understanding complex scenarios.
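The paper reports only aggregate results, but the claim that game play can expose the thinking process rests on logging fine-grained interaction events. As a minimal illustration (not the authors’ implementation), the sketch below records per-move events in a puzzle session and derives the kind of coarse indicators a formative report might surface; all names here (SessionLog, MoveEvent, the "undo" move label) are hypothetical.

```python
from dataclasses import dataclass, field
from time import monotonic
from statistics import mean

@dataclass
class MoveEvent:
    level: int        # puzzle difficulty level
    move: str         # e.g. "push_left", "undo"
    elapsed_s: float  # seconds since the previous move

@dataclass
class SessionLog:
    student_id: str
    events: list[MoveEvent] = field(default_factory=list)
    _last: float = field(default_factory=monotonic)

    def record(self, level: int, move: str) -> None:
        # Timestamp each move relative to the previous one,
        # so per-move "think time" can be analyzed later.
        now = monotonic()
        self.events.append(MoveEvent(level, move, now - self._last))
        self._last = now

    def summary(self) -> dict:
        # Coarse indicators a formative report might surface.
        undos = sum(1 for e in self.events if e.move == "undo")
        return {
            "moves": len(self.events),
            "undo_rate": undos / max(len(self.events), 1),
            "mean_think_time_s": mean(e.elapsed_s for e in self.events)
                                 if self.events else 0.0,
        }
```

A grading tool built on such a log could persist these events and compare indicators like the undo rate across difficulty levels to locate where a student’s reasoning breaks down, which is the kind of real-time formative feedback the abstract points to.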

Keywords

Human factors · Gamification · Assessment · Logic


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Engineering Faculty, Universidad Icesi, Cali, Colombia
