
Natural interaction in virtual TV sets through the synergistic operation of low-cost sensors

  • Roi Méndez
  • Julián Flores
  • Enrique Castelló
  • Jose R. R. Viqueira
Long Paper

Abstract

A virtual TV set combines images from the real world with a virtual environment to produce the impression that real elements, such as actors or physical objects, are present in a computer-generated scene. The audience thus perceives the talent as being in a place where they are not. One of the most important factors in achieving a convincing sense of presence on stage is the actors' ability to interact with the virtual world in real time. To make this possible, the presenters' bodies must be tracked and their gestures detected so that they can modify the synthetic environment in a natural way. This paper presents a study that analyzes the feasibility of the synergistic use of several low-cost sensors to improve the actors' interaction with the scene, focusing mainly on natural gesture detection. A new workflow is proposed in which the system learns natural gestures through artificial intelligence techniques so that they can be used during live broadcasts. By applying this approach in pre-production, the actors can create their own custom paradigm of interaction with the virtual environment, increasing the naturalness of their behavior during live broadcasts and reducing the training time needed for new productions.
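The proposed pre-production workflow (recording a presenter's own gestures, learning them, and then matching live tracking frames against them) could be sketched roughly as follows. This is a minimal illustrative assumption, not the paper's actual method: the feature vectors, gesture names, threshold, and the nearest-centroid matching are all placeholders for whatever features the sensors provide and whatever learning technique the system employs.

```python
# Hypothetical sketch of the learn-then-recognize workflow.
# In practice the feature vectors would be derived from tracked
# skeleton joints (e.g., Kinect V2); here they are plain floats.
from math import dist  # Euclidean distance (Python 3.8+)

class GestureRecognizer:
    """Nearest-centroid classifier over per-gesture training samples."""

    def __init__(self):
        self.centroids = {}  # gesture name -> mean feature vector

    def train(self, samples):
        # samples: {gesture_name: [feature_vector, ...]} recorded in
        # pre-production while the presenter performs each gesture.
        for name, vecs in samples.items():
            n = len(vecs)
            self.centroids[name] = [sum(col) / n for col in zip(*vecs)]

    def classify(self, frame, threshold=1.0):
        # Return the closest learned gesture, or None when no centroid
        # lies within the threshold (i.e., no gesture is triggered).
        best, best_d = None, threshold
        for name, centroid in self.centroids.items():
            d = dist(frame, centroid)
            if d < best_d:
                best, best_d = name, d
        return best

# Pre-production: the presenter records a few repetitions per gesture.
rec = GestureRecognizer()
rec.train({
    "swipe_left": [[0.9, 0.1], [1.0, 0.0], [0.8, 0.2]],
    "raise_hand": [[0.0, 1.0], [0.1, 0.9], [0.2, 0.8]],
})

# Live broadcast: each tracked frame is matched against the learned set.
print(rec.classify([0.95, 0.05]))  # "swipe_left"
print(rec.classify([5.0, 5.0]))    # None: no gesture close enough
```

The key property this sketch tries to capture is that the gesture vocabulary is defined by the presenter's own recordings rather than by a fixed catalog, which is what lets each actor build a custom interaction paradigm.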

Keywords

Natural user interaction · Virtual TV set · Microsoft Kinect V2

Notes

Acknowledgements

The authors would like to thank all the volunteers who participated in the study, as well as Manolo Fidalgo for his continuous support and advice. This work received financial support from the Consellería de Cultura, Educación e Ordenación Universitaria (accreditation 2016–2019, ED431G/08) and the European Regional Development Fund (ERDF).


Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. CITIUS, University of Santiago de Compostela, Santiago de Compostela, Spain
  2. Faculty of Communication Science, University of Santiago de Compostela, Santiago de Compostela, Spain
