
Experimental Observation of Nodding Motion in Remote Communication Using ARM-COMS

  • Teruaki Ito
  • Hiroki Kimachi
  • Tomio Watanabe
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10904)

Abstract

Considering the critical issues of remote communication, this study proposes virtually connecting remote individuals through an augmented tele-presence system called ARM-COMS (ARm-supported eMbodied COmmunication Monitor System). Several robot-based remote communication systems have been proposed to address the telepresence of remote participants; however, they do not address the issue of the sense of relationship. Robotic-arm-type systems and anthropomorphization have attracted researchers' attention as ways to tackle this lack of relationship with remote participants. Nevertheless, the use of a remote person's body movement as a non-verbal message, that is, as cyber-physical media in remote communication, remains an open issue. Under these circumstances, this paper describes the system configuration of ARM-COMS based on the proposed idea and discusses its feasibility through experimental observations.
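To make the face-detection and nodding-observation steps mentioned in the abstract concrete, the sketch below shows one way head motion could be measured from a webcam stream. It is not the authors' ARM-COMS implementation: it assumes dlib's 68-point facial landmark predictor and OpenCV's solvePnP to estimate head pitch, and flags a nod whenever the pitch swings beyond an illustrative threshold within a short window; all thresholds and helper names are hypothetical.

```python
# Hypothetical sketch: estimate head pitch from webcam frames and flag nod-like
# motion. Assumes dlib's 68-point landmark model and OpenCV's solvePnP; the
# actual ARM-COMS detection pipeline is not reproduced here.
from collections import deque

import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Generic 3D face model points (mm) matching landmark indices 30, 8, 36, 45, 48, 54.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0, 170.0, -135.0),    # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)
LANDMARK_IDS = (30, 8, 36, 45, 48, 54)


def head_pitch_deg(frame):
    """Return the head pitch (degrees) of the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 0)
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    image_points = np.array(
        [(shape.part(i).x, shape.part(i).y) for i in LANDMARK_IDS], dtype=np.float64)
    h, w = frame.shape[:2]
    camera_matrix = np.array([[w, 0, w / 2],   # crude focal-length guess: image width
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix,
                               np.zeros(4), flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    pitch, _, _ = cv2.RQDecomp3x3(rot)[0]  # Euler angles in degrees (pitch, yaw, roll)
    return pitch


def is_nod(pitch_history, amplitude_deg=10.0):
    """Very simple nod test: pitch swings by more than amplitude_deg within the window."""
    values = [p for p in pitch_history if p is not None]
    return len(values) > 2 and (max(values) - min(values)) > amplitude_deg


cap = cv2.VideoCapture(0)
history = deque(maxlen=15)  # roughly half a second of frames at 30 fps
for _ in range(300):        # observe ~10 seconds of webcam input
    ret, frame = cap.read()
    if not ret:
        break
    history.append(head_pitch_deg(frame))
    if is_nod(history):
        print("nod detected")  # ARM-COMS would drive the arm-mounted monitor here
cap.release()
```

In the actual system the detected pitch trajectory would drive the arm-mounted monitor rather than print a message; the window length and amplitude threshold above are placeholder values.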

Keywords

Cyber-physical communication media · Embodied communication · Augmented tele-presence robotic arm manipulation · Face detection

Acknowledgement

This work was supported by JSPS KAKENHI Grant Number JP16K00274. The authors would like to thank all members of the Collaborative Engineering Labs at Tokushima University, and the Center for Technical Support of Tokushima University, for their cooperation in conducting the experiments.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Graduate School of Technology, Industrial and Social Sciences, Tokushima University, Tokushima, Japan
  2. Graduate School of Advanced Technology and Science, Tokushima University, Tokushima, Japan
  3. Faculty of Computer Science and System Engineering, Okayama Prefectural University, Souja, Okayama, Japan