Virtual Reality Based Immersive Telepresence System for Remote Conversation and Collaboration
We developed a Virtual Reality (VR) based telepresence system that provides a novel immersive experience for remote conversation and collaboration. By wearing VR headsets, all participants are gathered into the same virtual space, represented by 3D cartoon avatars. These avatars realistically emulate the participants' head postures, facial expressions, and hand motions, enabling enjoyable group-to-group conversations among people who are spatially separated. Moreover, our VR telepresence system offers distinctly new modes of remote collaboration: users can view presentation slides or watch videos together, or cooperate on solving a math problem by working on a shared virtual blackboard, all of which can hardly be achieved with conventional video-based telepresence systems. Experiments show that our system provides an unprecedented immersive experience for tele-conversation and opens new possibilities for remote collaboration.
Keywords: Virtual reality · Telepresence system · VR avatar · Remote collaboration · Teleconferencing
This work was supported by a research grant of the Beijing Higher Institution Engineering Research Center and by the People Programme (Marie Curie Actions) of the European Union's Seventh Framework Programme (MC-IRSES, grant No. 612627).