Developmental Analysis of a Markerless Hybrid Tracking Technique for Mobile Augmented Reality Systems

  • Waqas Khalid Obeidy
  • Haslina Arshad
  • Siok Yee Tan
  • Hameedur Rahman
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9429)

Abstract

Continuous tracking in Augmented Reality (AR) applications is essential for registering and augmenting digital content on top of the real world. However, tracking on handheld devices such as PDAs or mobile phones imposes many restrictions and challenges with respect to efficiency and robustness, the standard performance measures of tracking. This work focuses on the pre-analysis required for the development of an accelero-visual markerless hybrid tracking technique. The technique combines visual feature-based tracking with the smartphone's accelerometer sensor to make the tracking process more efficient and robust. A pre-analysis is performed on the visual and sensor-based tracking approaches required to design the hybrid tracking technique. For visual tracking, candidate keypoint detectors and descriptors are analyzed to identify the best-performing combination. Careful selection of these visual tracking elements during the analysis stage helps achieve considerably more efficient and robust markerless augmented reality tracking on a modern-day smartphone.
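As a rough illustration of the kind of pre-analysis described above, the sketch below compares a few keypoint detector/descriptor combinations for speed and match quality using OpenCV in Python. It is not the authors' evaluation setup: the image file names, the particular detectors, and the reported criteria (keypoint count, cross-checked matches, runtime) are assumptions chosen only to show how such a comparison can be run.

```python
# Minimal sketch of a detector/descriptor pre-analysis (assumed setup, not the
# paper's actual benchmark). Requires opencv-python; SIFT needs a recent build.
import time
import cv2

def evaluate(name, detector, norm, img_ref, img_query):
    """Detect, describe, and brute-force match features; report count, matches, time."""
    t0 = time.perf_counter()
    kp1, des1 = detector.detectAndCompute(img_ref, None)
    kp2, des2 = detector.detectAndCompute(img_query, None)
    if des1 is None or des2 is None:
        print(f"{name:6s} produced no descriptors")
        return
    matcher = cv2.BFMatcher(norm, crossCheck=True)   # cross-check as a simple robustness filter
    matches = matcher.match(des1, des2)
    elapsed_ms = (time.perf_counter() - t0) * 1000.0
    print(f"{name:6s} keypoints={len(kp1):5d} matches={len(matches):5d} time={elapsed_ms:7.1f} ms")

if __name__ == "__main__":
    # Hypothetical test images: a reference view of a planar target and a
    # second view under a different pose/illumination.
    img_ref = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)
    img_query = cv2.imread("query.png", cv2.IMREAD_GRAYSCALE)
    if img_ref is None or img_query is None:
        raise SystemExit("Test images not found")

    candidates = [
        ("ORB",   cv2.ORB_create(nfeatures=1000), cv2.NORM_HAMMING),  # binary descriptor
        ("BRISK", cv2.BRISK_create(),             cv2.NORM_HAMMING),  # binary descriptor
        ("AKAZE", cv2.AKAZE_create(),             cv2.NORM_HAMMING),  # binary (MLDB) descriptor
        ("SIFT",  cv2.SIFT_create(),              cv2.NORM_L2),       # floating-point baseline
    ]
    for name, detector, norm in candidates:
        evaluate(name, detector, norm, img_ref, img_query)
```

On a smartphone, binary descriptors matched with the Hamming norm are typically far cheaper to compute and match than floating-point ones, which is one plausible reason a pre-analysis of this kind matters before committing to a detector/descriptor pair for mobile AR.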

Keywords

Markerless tracking · Mobile augmented reality · Keypoint detection · Computer vision

Acknowledgment

The authors would like to thank all those who participated in this work as part of the project sponsored by the Universiti Kebangsaan Malaysia research university grant (FRGS/1/2013/ICT01/UKM/02/9).


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Center for Artificial Intelligence Technology, Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Malaysia
