Markerless Indoor Augmented Reality Navigation Device Based on Optical-Flow-Scene Indoor Positioning and Wall-Floor-Boundary Image Registration

  • Wen-Shan Lin
  • Chian C. Ho
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1013)

Abstract

For markerless indoor Augmented Reality Navigation (ARN), the camera pose is the fundamental quantity in both positioning estimation and pose estimation, and the floor plane is the essential fiducial target for image registration. This paper proposes optical-flow-scene indoor positioning and wall-floor-boundary image registration to make ARN more precise, reliable, and responsive. Experimental results show that both methods achieve higher accuracy and lower latency than well-known conventional ARN methods. In addition, the two proposed methods are implemented on a handheld Android embedded platform and verified to work well as a handheld indoor augmented reality navigation device.
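The two ideas named in the abstract lend themselves to a brief illustration. The sketch below, in Python with OpenCV, is a minimal approximation of the concepts only, not the authors' implementation: it tracks sparse optical flow between consecutive frames to infer relative camera motion (the positioning cue) and extracts a near-horizontal line in the lower image half as a stand-in for the wall-floor boundary (the registration cue). All function names, thresholds, and parameters here are illustrative assumptions.

```python
# Hedged sketch of the abstract's two cues; thresholds are assumptions.
import cv2
import numpy as np

def estimate_flow_displacement(prev_gray, gray):
    """Median image-space displacement of tracked corners between frames."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=8)
    if p0 is None:
        return None
    p1, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, p0, None, winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    if good.sum() < 10:          # too few reliable tracks this frame
        return None
    # Median is robust against outlier tracks (e.g., passing pedestrians).
    return np.median((p1[good] - p0[good]).reshape(-1, 2), axis=0)

def detect_wall_floor_boundary(gray):
    """Longest near-horizontal Hough segment in the lower image half."""
    h = gray.shape[0]
    edges = cv2.Canny(gray, 50, 150)
    edges[:h // 2, :] = 0        # the wall-floor boundary lies below the horizon
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=80, maxLineGap=10)
    if lines is None:
        return None
    def nearly_horizontal(seg):
        x1, y1, x2, y2 = seg
        return abs(y2 - y1) <= 0.2 * max(abs(x2 - x1), 1)
    flat = [seg[0] for seg in lines if nearly_horizontal(seg[0])]
    if not flat:
        return None
    return max(flat, key=lambda s: abs(s[2] - s[0]))  # widest segment
```

In a full pipeline of the kind the abstract describes, the per-frame flow displacement would presumably feed a position filter calibrated to the indoor scene, and the detected boundary line would anchor the homography used to register virtual navigation overlays onto the floor plane.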

Keywords

Augmented reality · Indoor positioning · Image registration · Navigation

Notes

Acknowledgments

This work was supported in part by Ministry of Science and Technology, Taiwan, under Grant MOST 106-2221-E-224-053.

Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Graduate School of Vocation and Technological Education, National Yunlin University of Science and Technology, Douliou, Taiwan
  2. Department of Electrical Engineering, National Yunlin University of Science and Technology, Douliou, Taiwan