Assessment of Collaborative Problem Solving Using Linear Equations on a Tangible Tabletop

  • Valérie Maquil
  • Eric Tobias
  • Samuel Greiff
  • Eric Ras
Part of the Communications in Computer and Information Science book series (CCIS, volume 439)


Using Tangible User Interfaces (TUIs) for assessing collaborative problem solving has only been marginally investigated in technology-based assessment. Our first empirical studies focused on light-weight performance measurements, usability, user experience, and gesture analysis to deepen our understanding of how people interact with TUIs in an assessment context. In this paper we propose a new approach to assessing individual collaborative problem solving skills by combining the MicroDYN methodology with TUIs. MicroDYN items are high-quality instruments designed to assess individual problem solving skills and are based on linear structural equations. We describe how this approach, informed by the findings of our previous studies, was applied to create an assessment item for a collaborative setting with children that implements a simplified model of climate change. Finally, we propose a series of research questions and outline a future empirical study.
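The dynamics underlying a MicroDYN item can be illustrated with a small sketch. In such items, output variables change each round as a linear function of the input values set by the test taker. The coefficients and variable layout below are hypothetical and do not reproduce the paper's climate change item:

```python
# Minimal sketch of a MicroDYN-style item based on linear structural
# equations. Coefficients are hypothetical, not the paper's climate model.
# Each round the test taker sets inputs x; outputs update as
#   y_i(t+1) = y_i(t) + sum_j b_ij * x_j(t)

B = [
    [2.0, 0.0],   # input x1 raises output y1
    [0.0, -1.0],  # input x2 lowers output y2
    [1.0, 1.0],   # both inputs raise y3
]

def step(y, x):
    """Advance the system by one round given input settings x."""
    return [yi + sum(b * xj for b, xj in zip(row, x))
            for yi, row in zip(y, B)]

y = [0.0, 0.0, 0.0]
for x in [[1.0, 0.0], [0.0, 1.0]]:   # two exploration rounds
    y = step(y, x)
print(y)  # [2.0, -1.0, 2.0]
```

By varying one input at a time and observing the outputs, a test taker can infer the hidden coefficient matrix, which is the knowledge-acquisition phase that MicroDYN items are designed to measure.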


Keywords: Tangible User Interfaces · linear structural equation · MicroDYN · collaborative problem solving · technology-based assessment





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Valérie Maquil¹
  • Eric Tobias¹
  • Samuel Greiff²
  • Eric Ras¹

  1. Public Research Centre Henri Tudor, Luxembourg, Luxembourg
  2. University of Luxembourg, Luxembourg-Kirchberg, Luxembourg
