
Assessment of Collaboration and Feedback on Gesture Performance

  • Dimitra Anastasiou
  • Eric Ras
  • Mehmetcan Fal
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1014)

Abstract

This paper proposes gesture performance as one main channel for assessing collaboration skills while multiple users solve a problem collaboratively on a tangible user interface. Collaborative problem solving comprises two dimensions, complex problem solving and collaboration; technology-based assessment of collaborative problem solving therefore includes assessing both problem-solving and collaboration skills. For assessing collaboration skills in particular, we consider gesture performance an important indicator. We differentiate between physical 3D mid-air gestures and manipulative gestures; for the latter, we developed a gesture recognition application using Kinect. Our method for combined object and gesture recognition is to merge the log files of our tangible-interface software framework (object recognition) with the Kinect log files (gesture recognition) into a single file. The application can then analyze the number of object manipulations with respect to time, subject/participant, and handedness, as illustrated in the sketch below.
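The abstract does not specify the log formats or the merging procedure. The following is a minimal Python sketch, under the assumption that both log files are CSV exports with hypothetical column names (timestamp, object_id, participant, hand); it shows one plausible way to align the two streams on time and count object manipulations per participant and hand, not the authors' actual implementation.

```python
# Hypothetical sketch: merge a tabletop object-recognition log with a
# Kinect gesture log and count object manipulations per participant and
# hand. File names and column names are assumptions, not from the paper.
import pandas as pd

# Object log from the tangible-interface framework: one row per detected
# object event (assumed columns: timestamp, object_id).
objects = pd.read_csv("tabletop_object_log.csv")
# Kinect log: one row per recognized manipulative gesture
# (assumed columns: timestamp, participant, hand).
gestures = pd.read_csv("kinect_gesture_log.csv")

# Align the two streams on time: for each gesture, take the nearest
# object event within a small tolerance window.
objects["timestamp"] = pd.to_datetime(objects["timestamp"])
gestures["timestamp"] = pd.to_datetime(gestures["timestamp"])
merged = pd.merge_asof(
    gestures.sort_values("timestamp"),
    objects.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("500ms"),
)

# Number of object manipulations per participant and hand.
counts = (
    merged.dropna(subset=["object_id"])
          .groupby(["participant", "hand"])
          .size()
          .rename("manipulations")
)
print(counts)
```

Grouping the merged events by participant and hand yields exactly the kind of breakdown the abstract mentions (timing, subject/participant, handedness); the 500 ms tolerance is an arbitrary illustrative value.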

Keywords

Assessment framework · Collaboration · Feedback · Feed-forward · Gestures · Performance · Tangible user interfaces

Notes

Acknowledgments

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 654477.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Luxembourg Institute of Science and Technology, Esch-sur-Alzette, Luxembourg
  2. Turkish Aerospace, Ankara, Turkey
