Abstract
We examine the problem of creating immersive virtual reality (VR) scenes from a single moving RGB-D camera. Our approach takes as input an RGB-D video containing one or more human actors and constructs a complete 3D background in which the actors are properly embedded. A user can then view the captured video from any viewpoint and interact with the scene. We also provide a manually labeled database of RGB-D video sequences together with evaluation metrics.
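The first step in any RGB-D reconstruction pipeline of this kind is back-projecting each depth frame into a 3D point cloud through the camera intrinsics. The sketch below is not the authors' implementation; it is a minimal illustration of that standard pinhole-model step, with placeholder intrinsic values (`fx`, `fy`, `cx`, `cy`) chosen for the example only.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # discard invalid (zero-depth) pixels

# Toy 2x2 depth image; one pixel has no depth reading.
depth = np.array([[1.0, 0.0],
                  [2.0, 1.0]])
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=0.5, cy=0.5)
```

Registering such per-frame clouds into a single background model is where the SLAM and fusion techniques referenced by the paper come in; this snippet covers only the per-frame geometry.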
© 2017 Springer International Publishing AG
Cite this paper
Lai, P.K., Laganière, R. (2017). Creating Immersive Virtual Reality Scenes Using a Single RGB-D Camera. In: Karray, F., Campilho, A., Cheriet, F. (eds) Image Analysis and Recognition. ICIAR 2017. Lecture Notes in Computer Science(), vol 10317. Springer, Cham. https://doi.org/10.1007/978-3-319-59876-5_25
DOI: https://doi.org/10.1007/978-3-319-59876-5_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-59875-8
Online ISBN: 978-3-319-59876-5
eBook Packages: Computer Science (R0)