Abstract
To accelerate the creation of digital 3D environments, we propose a workflow that uses neural networks to construct 3D indoor room layouts in the Unity game engine from 2D equirectangular RGB 360° panorama images. Our approach builds on HorizonNet, which uses a recurrent neural network (RNN) to generate textured room layouts as point clouds. Point clouds, however, are undesirable in VR: individual data points become visible at close range and break user immersion. We instead use 3D meshes, whose small triangular faces stitch together with no gaps in between, simulating realistic solid surfaces. We convert the room layout representation from point cloud to 3D mesh by extracting the room metadata predicted by HorizonNet and dynamically generating a textured custom mesh in Unity. The resulting mesh layouts can be applied directly in Unity VR applications: users can capture 360° images on their mobile phones and visualize the room layouts in VR through our system. Our evaluations suggest that the mesh representation improves frame rates and memory usage without affecting the layout accuracy of the original approach, providing satisfactory room layouts for VR development.
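To illustrate the point-cloud-to-mesh conversion the abstract describes, the following is a minimal sketch (not the paper's actual implementation) of turning predicted floor-plan corners into wall geometry in the vertex/index-buffer form that Unity's `Mesh` API consumes. The function `walls_to_mesh` and its input format are hypothetical; a HorizonNet-style predictor is assumed to supply ordered floor corners and a room height.

```python
def walls_to_mesh(floor_corners, ceiling_height):
    """Build a triangle mesh for the walls of a room.

    floor_corners: ordered (x, z) floor-plan corners, e.g. decoded
    from a HorizonNet-style layout prediction (hypothetical format).
    ceiling_height: room height in the same units.
    Returns (vertices, triangles): vertices as (x, y, z) tuples and
    triangles as a flat index list, matching the layout Unity's
    Mesh.vertices / Mesh.triangles expect.
    """
    vertices, triangles = [], []
    n = len(floor_corners)
    for i in range(n):
        x0, z0 = floor_corners[i]
        x1, z1 = floor_corners[(i + 1) % n]  # wrap to close the room
        base = len(vertices)
        # One wall is a quad spanning two adjacent floor corners.
        vertices += [
            (x0, 0.0, z0),             # bottom-left
            (x1, 0.0, z1),             # bottom-right
            (x1, ceiling_height, z1),  # top-right
            (x0, ceiling_height, z0),  # top-left
        ]
        # Split the quad into two triangles along its diagonal;
        # adjacent walls share corner positions, so no gaps appear.
        triangles += [base, base + 2, base + 1,
                      base, base + 3, base + 2]
    return vertices, triangles
```

A 4-corner (rectangular) room yields 4 wall quads, i.e. 16 vertices and 24 triangle indices; the same loop handles non-rectangular layouts with more corners. Texturing each quad from the corresponding strip of the equirectangular panorama would be layered on top of this geometry step.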
References
Unity. https://unity.com
Sun, C., Hsiao, C.W., Sun, M., Chen, H.T.: HorizonNet: learning room layout with 1D representation and pano stretch data augmentation. In: Proceedings of the IEEE CVPR, pp. 1047–1056. IEEE Press, New York (2019). https://doi.org/10.1109/CVPR.2019.00114
Izadinia, H., Shan, Q., Seitz, S.M.: IM2CAD. In: Proceedings of the IEEE CVPR, pp. 5134–5143. IEEE Press, New York (2017). https://doi.org/10.1109/CVPR.2017.260
Zhang, Y., Song, S., Tan, P., Xiao, J.: PanoContext: a whole-room 3D context model for panoramic scene understanding. In: Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T. (eds.) ECCV. LNCS, vol. 8694, pp. 668–686. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-10599-4_43
Luo, C., Zou, B., Lyu, X., Xie, H.: Indoor scene reconstruction: from panorama images to CAD models. In: IEEE ISMAR-Adjunct, pp. 317–320. IEEE Press, New York (2019). https://doi.org/10.1109/ISMAR-Adjunct.2019.00-21
Schubert, T., Friedmann, F., Regenbrecht, H.: Embodied presence in virtual environments. In: Paton, R., Neilson, I. (eds.) Visual Representations and Interpretations, pp. 269–278. Springer, London (1999). https://doi.org/10.1007/978-1-4471-0563-3_30
Bohil, C., Owen, C., Jeong, E., Alicea, B., Biocca, F.: Virtual reality and presence. In: Eadie, W.F. (ed.) 21st Century Communication: A Reference Handbook. SAGE Publications, Thousand Oaks (2009). https://doi.org/10.4135/9781412964005
Lee, C.Y., Badrinarayanan, V., Malisiewicz, T., Rabinovich, A.: RoomNet: end-to-end room layout estimation. In: IEEE ICCV, pp. 4865–4874. IEEE Press, New York (2017). https://doi.org/10.1109/ICCV.2017.521
Fernandez-Labrador, C., Facil, J.M., Perez-Yus, A., Demonceaux, C., Civera, J., Guerrero, J.J.: Corners for layout: end-to-end layout recovery from 360 images. IEEE Robot. Autom. Lett. 5(2), 1255–1262 (2020). https://doi.org/10.1109/LRA.2020.2967274
Guo, R., Zou, C., Hoiem, D.: Predicting complete 3D models of indoor scenes. arXiv preprint (2015). https://arxiv.org/abs/1504.02437v3
Yang, S.T., Wang, F.E., Peng, C.H., Wonka, P., Sun, M., Chu, H.K.: DuLa-Net: a dual-projection network for estimating room layouts from a single RGB panorama. In: Proceedings of the IEEE CVPR, pp. 3358–3367. IEEE Press, New York (2019). https://doi.org/10.1109/CVPR.2019.00348
Liu, C., Wu, J., Furukawa, Y.: FloorNet: a unified framework for floorplan reconstruction from 3D scans. In: Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y. (eds.) ECCV. LNCS, vol. 11210, pp. 203–219. Springer, Cham (2018). https://doi.org/10.1007/978-3-030-01231-1_13
Jones, M.B., Kennedy, R.S., Stanney, K.M.: Toward systematic control of cybersickness. Presence. 13(5), 589–600 (2004). https://doi.org/10.1162/1054746042545247
Zou, C., Colburn, A., Shan, Q., Hoiem, D.: LayoutNet: reconstructing the 3D room layout from a single RGB image. In: Proceedings of the IEEE CVPR, pp. 2051–2059. IEEE Press, New York (2018). https://doi.org/10.1109/CVPR.2018.00219
VotanicXR. https://www.votanic.com/votanicxr
Virtanen, J.P., et al.: Interactive dense point clouds in a game engine. ISPRS J. Photogramm. Remote Sens. 163, 375–389 (2020). https://doi.org/10.1016/j.isprsjprs.2020.03.007
© 2021 Springer Nature Switzerland AG
Cite this paper
Chan, J.C.P., Ng, A.K.T., Lau, H.Y.K. (2021). Constructing 3D Mesh Indoor Room Layouts from 2D Equirectangular RGB 360 Panorama Images for the Unity Game Engine. In: Stephanidis, C., Antona, M., Ntoa, S. (eds) HCI International 2021 - Posters. HCII 2021. Communications in Computer and Information Science, vol 1421. Springer, Cham. https://doi.org/10.1007/978-3-030-78645-8_19
DOI: https://doi.org/10.1007/978-3-030-78645-8_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-78644-1
Online ISBN: 978-3-030-78645-8
eBook Packages: Computer Science (R0)