How to Leverage Reflection in Case of Inquiry Learning? The Study of Awareness Tools in the Context of Virtual and Remote Laboratory

  • Rémi Venant
  • Philippe Vidal
  • Julien Broisin
Conference paper
Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 22)


In this paper, we design a set of awareness and reflection tools that aim to engage learners in deep learning processes during practical activities carried out through a virtual and remote laboratory. These tools include: (i) a social awareness tool that reveals to learners their current and overall levels of performance, and also enables comparison between their own and their peers’ performance; (ii) a reflection-on-action tool, implemented as timelines, that allows learners to analyze in depth both their own completed work and the tasks achieved by their peers; (iii) a reflection-in-action tool, acting as a live video player, that lets users easily observe what others are doing. An experiment involving 80 students was conducted in an authentic learning setting on operating system administration; although the participants rated the system as good in terms of usability, they rated it only slightly higher than traditional computational environments with respect to fostering reflection and critical thinking.


Keywords: Virtual and remote laboratory · Computer science · Awareness tool · Reflection



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Institut de Recherche en Informatique de Toulouse, Université Toulouse III Paul Sabatier, Toulouse Cedex 04, France
