
2.5DHANDS: a gesture-based MR remote collaborative platform

  • Peng Wang
  • Shusheng Zhang (Email author)
  • Xiaoliang Bai (Email author)
  • Mark Billinghurst
  • Weiping He
  • Mengmeng Sun
  • Yongxing Chen
  • Hao Lv
  • Hongyu Ji
ORIGINAL ARTICLE

Abstract

Current remote collaborative systems in manufacturing are mainly based on video-conferencing technology. Their primary aim is to transmit manufacturing process knowledge between remote experts and local workers. However, they do not provide experts with the same hands-on experience as working together on site in person. Mixed reality (MR) and improving network performance can enhance the experience and communication between collaborators in geographically distributed locations. In this paper, therefore, we propose a new gesture-based remote collaborative platform using MR technology that enables a remote expert to collaborate with local workers on physical tasks, and we focus on collaborative remote assembly as an illustrative use case. The key advantage over other remote collaborative MR interfaces is that it projects the remote expert’s gestures into the real worksite to improve performance, co-presence awareness, and the user collaboration experience. We aim to study the effects of sharing the remote expert’s gestures in remote collaboration using a projector-based MR system in manufacturing. Furthermore, we demonstrate the capabilities of our framework on a prototype consisting of a VR head-mounted display (HMD), a Leap Motion, and a projector. The prototype system was evaluated in a pilot study against POINTER (adding AR annotations on the task-space view with a mouse), currently the most popular method for augmenting remote collaboration. The assessment covered performance, user satisfaction, and user-perceived collaboration quality in terms of interaction and cooperation. Our results show a clear difference between the POINTER and 2.5DHANDS interfaces in performance time. Additionally, the 2.5DHANDS interface was rated statistically significantly higher than the POINTER interface in terms of awareness of the user’s attention, manipulation, self-confidence, and co-presence.
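To make the projection pipeline concrete, the following minimal sketch (not the authors' implementation; the homography values, coordinate conventions, joint data, port, and all function names are illustrative assumptions) shows one way the expert's tracked 3D hand joints could be flattened onto the task-space plane, the "2.5D" step, and streamed to the worker-side projector:

    # Minimal sketch (an assumption, not the paper's code): flatten tracked 3D
    # hand joints onto the task-space plane and send them to the worker site,
    # where a projector renders them over the real workpiece.
    import json
    import socket

    import numpy as np

    # Hypothetical calibration result: a homography mapping points on the
    # task-space plane (tracker coordinates, mm) to projector pixels.
    H = np.array([[1.2,  0.0, 320.0],
                  [0.0, -1.2, 240.0],
                  [0.0,  0.0,   1.0]])

    def to_projector_px(joint_mm):
        """Drop the height axis (the '.5' in 2.5D), then apply the homography."""
        x, y, z = joint_mm  # assumed tracker axes: x right, y up, z toward user
        p = H @ np.array([x, z, 1.0])  # project onto the table plane (x, z)
        return (p[:2] / p[2]).tolist()

    def send_hand(sock, addr, joints_mm):
        """Stream one hand pose (a list of 3D joints) as JSON over UDP."""
        pixels = [to_projector_px(j) for j in joints_mm]
        sock.sendto(json.dumps({"joints_px": pixels}).encode(), addr)

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # Dummy palm and index-tip positions standing in for live tracker data.
        send_hand(sock, ("127.0.0.1", 9000),
                  [(10.0, 180.0, 40.0), (60.0, 200.0, 95.0)])

In practice, the homography would come from a projector-camera calibration step, and the live joint positions would come from the hand tracker (e.g., the Leap Motion SDK) rather than the dummy data above.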

Keywords

Mixed reality · Augmented reality · Remote collaboration · Sharing gestures


Notes

Acknowledgments

We would like to thank Yuming Zheng for donating his personal water pump for our research, and Yue Wang for checking the English of an early version of this paper; his constructive comments helped us improve it. We also thank the anonymous reviewers for their constructive suggestions for enhancing this paper. Specifically, we thank our schoolfellows for their contributions to this research: Li Zhang for the science lead, Dechuan Han for the technical realization, and Jiaxiang Du for the experimental data collection. Moreover, we thank Shuxiang Wang for his constructive suggestions for improving the experiment. Finally, we thank the members of Northwestern Polytechnical University who participated in the experiment.

Funding information

This research was sponsored by the civil aircraft special project (MJZ-2017-G73) and the seed foundation of innovation and creation for graduate students at Northwestern Polytechnical University (ZZ2018084).


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  • Peng Wang (1)
  • Shusheng Zhang (1) (Email author)
  • Xiaoliang Bai (1) (Email author)
  • Mark Billinghurst (1, 2)
  • Weiping He (1)
  • Mengmeng Sun (1)
  • Yongxing Chen (1)
  • Hao Lv (1)
  • Hongyu Ji (1)
  1. Key Laboratory of Contemporary Designing and Integrated Manufacturing Technology, Ministry of Education, Cyber-Physical Interaction Lab, Northwestern Polytechnical University, Xi’an, China
  2. Empathic Computing Lab, University of South Australia, Mawson Lakes, Australia
