
Experience Capturing with Wearable Technology in the WEKIT Project

  • Puneet Sharma
  • Roland Klemke
  • Fridolin Wild

Abstract

In this article, we focus on capturing an expert's experiences using wearable sensors. First, we outline a set of high-level tasks that facilitate the transfer of experience from an expert to a trainee. Next, we define a mapping strategy that associates each task with one or more low-level functions, such as gaze, voice, video, body posture, hand/arm gestures, biosignals, fatigue levels, haptic feedback, and the user's location in the environment. These low-level functions are then decomposed into their associated state-of-the-art sensors. Based on the requirements and constraints of the use cases from three industrial partners, a set of sensors is proposed for the experience-capturing prototype. Finally, we discuss the attributes and features of the proposed prototype, along with its key challenges, constraints, and possible future directions.
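
To make the mapping strategy described above concrete, the sketch below shows one way the task-to-function-to-sensor association could be represented. It is a minimal illustration under stated assumptions: the task names, function names, and sensor assignments are placeholders chosen for demonstration and do not reproduce the actual WEKIT mapping.

    # Minimal sketch: two lookup tables map high-level tasks to low-level
    # capture functions, and those functions to candidate sensors.
    # All task, function, and sensor names below are illustrative
    # assumptions, not the mapping defined in the WEKIT project itself.

    TASK_TO_FUNCTIONS = {
        "demonstrate_procedure": ["video", "voice", "hand_gestures"],
        "direct_attention": ["gaze", "location"],
        "monitor_strain": ["biosignals", "fatigue"],
    }

    FUNCTION_TO_SENSORS = {
        "video": ["head-mounted camera"],
        "voice": ["microphone"],
        "hand_gestures": ["IMU armband"],
        "gaze": ["eye tracker"],
        "location": ["indoor positioning tag"],
        "biosignals": ["galvanic skin response wristband"],
        "fatigue": ["heart-rate monitor"],
    }

    def sensors_for_task(task):
        """Resolve a high-level task to the sensors needed to capture it."""
        sensors = []
        for function in TASK_TO_FUNCTIONS.get(task, []):
            for sensor in FUNCTION_TO_SENSORS.get(function, []):
                if sensor not in sensors:
                    sensors.append(sensor)
        return sensors

    # Example usage:
    # sensors_for_task("demonstrate_procedure")
    # -> ['head-mounted camera', 'microphone', 'IMU armband']

Keeping the two tables separate means the task definitions can stay stable while the sensor assignments are swapped out as hardware constraints from the individual use cases change.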


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Tromsø, Tromsø, Norway
  2. Open University of the Netherlands, Heerlen, The Netherlands
  3. Performance Augmentation Lab, Oxford Brookes University, Oxford, UK
