
Multiple Object Tracking in Unprepared Environments Using Combined Feature for Augmented Reality Applications

Conference paper, in Communication and Networking (FGCN 2010)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 119)

Abstract

Existing augmented reality (AR) applications perform well when tracking objects in prepared environments. However, tracking multiple objects in unprepared environments is considerably more difficult, since users are not allowed to modify the real environment, as in outdoor applications. This research focuses on multiple object tracking that combines feature tracking and color tracking in an arbitrary scene, and aims to demonstrate superior performance. The approach benefits tracking technologies because the additional visual cues support seamless integration of the real and synthetic worlds. The system requires accurate measurement of the six-degree-of-freedom (6DOF) camera pose, three degrees of freedom (3DOF) for position and three for orientation, relative to the world coordinate system. Our framework aims to enable AR applications in unprepared environments with multiple tracked objects, regardless of the movement of real objects, lights, and cameras.
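
The combination described above, sparse feature tracking fused with a color cue, can be illustrated with a short sketch. The following Python/OpenCV snippet is a minimal illustration only, not the authors' implementation: it tracks Shi-Tomasi corners with KLT optical flow and prunes them with an assumed HSV color mask. The camera source and color range are placeholders, and recovering the full 6DOF pose would additionally require 2D-3D correspondences (for example via cv2.solvePnP).

    # Minimal sketch (assumptions noted in comments), not the paper's actual pipeline:
    # combine a feature cue (KLT optical flow) with a color cue (HSV mask).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                      # assumed camera source
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    # Shi-Tomasi corners as the initial KLT feature set.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

    # Illustrative HSV range for the target color (placeholder values).
    lower, upper = np.array([0, 120, 70]), np.array([10, 255, 255])

    while ok and pts is not None and len(pts) > 0:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Feature cue: KLT sparse optical flow from the previous frame.
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = nxt[status.flatten() == 1].reshape(-1, 2)

        # Color cue: keep only features that land on target-colored pixels.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)
        h, w = mask.shape
        kept = [p for p in good
                if 0 <= int(p[1]) < h and 0 <= int(p[0]) < w
                and mask[int(p[1]), int(p[0])] > 0]

        # 'kept' holds features consistent with both cues; a full system
        # would pair them with known 3D points and call cv2.solvePnP to
        # recover the 6DOF camera pose mentioned in the abstract.
        prev_gray = gray
        pts = good.reshape(-1, 1, 2).astype(np.float32)

Pruning the feature set with a color mask is just one simple way to realize a "combined feature" cue; the paper's own cue-fusion strategy may differ.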





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cagalaban, G., Kim, S. (2010). Multiple Object Tracking in Unprepared Environments Using Combined Feature for Augmented Reality Applications. In: Kim, Th., Chang, A.CC., Li, M., Rong, C., Patrikakis, C.Z., Ślęzak, D. (eds) Communication and Networking. FGCN 2010. Communications in Computer and Information Science, vol 119. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17587-9_1


  • DOI: https://doi.org/10.1007/978-3-642-17587-9_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-17586-2

  • Online ISBN: 978-3-642-17587-9

  • eBook Packages: Computer Science, Computer Science (R0)
