Automated camera calibration and 3D egomotion estimation for augmented reality applications

  • Dieter Koller
  • Gudrun Klinker
  • Eric Rose
  • David Breen
  • Ross Whitaker
  • Mihran Tuceryan
Motion and Calibration
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1296)


This paper addresses the problem of accurately tracking the 3D motion of a monocular camera in a known 3D environment and dynamically estimating the 3D camera location. For that purpose we propose a fully automated landmark-based camera calibration method and use it to initialize a motion estimator, which employs extended Kalman filter techniques to track landmarks and to estimate the camera location at any given time. The implementation of our approach has proven efficient and robust, and our system successfully tracks in real time at approximately 10 Hz. We show tracking results for various augmented reality scenarios.
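The paper's own tracker is not reproduced here; as a rough illustration of the extended-Kalman-filter machinery the abstract refers to, the sketch below tracks only camera translation (orientation held fixed, a simplifying assumption) with a constant-velocity motion model, a pinhole measurement of known landmarks, and a numerically computed Jacobian. All names and parameters (focal length `f`, noise levels `q`, `r`) are illustrative, not taken from the paper.

```python
import numpy as np

def project(state, landmark, f=500.0):
    # Pinhole projection of a known 3D landmark, assuming the camera
    # looks down +Z with a fixed orientation (illustrative simplification).
    px, py, pz = state[:3]
    X, Y, Z = landmark
    d = Z - pz
    return np.array([f * (X - px) / d, f * (Y - py) / d])

def h_all(state, landmarks):
    # Stack projections of all known landmarks into one measurement vector.
    return np.concatenate([project(state, L) for L in landmarks])

def numeric_jacobian(h, x, eps=1e-6):
    # Forward-difference Jacobian of the measurement function.
    z0 = h(x)
    J = np.zeros((len(z0), len(x)))
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (h(x + dx) - z0) / eps
    return J

def ekf_step(x, P, z, landmarks, dt=0.1, q=1e-4, r=1.0):
    # Predict: constant-velocity model, state = [position(3), velocity(3)].
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)
    x = F @ x
    P = F @ P @ F.T + q * np.eye(6)
    # Update: linearize the projection model around the predicted state.
    h = lambda s: h_all(s, landmarks)
    H = numeric_jacobian(h, x)
    y = z - h(x)                       # innovation (pixel residuals)
    S = H @ P @ H.T + r * np.eye(len(z))
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

Each frame, the filter predicts the camera pose forward in time, projects the known landmarks, and corrects the state from the pixel residuals; with several landmarks per frame, both position and velocity become observable over the sequence.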


Keywords: Kalman filter · Augmented reality · Extended Kalman filter · Virtual object · Camera calibration





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Dieter Koller (1, 2, 3)
  • Gudrun Klinker (1)
  • Eric Rose (1)
  • David Breen (4)
  • Ross Whitaker (5)
  • Mihran Tuceryan (6)
  1. Fraunhofer Project Group for AR at ZGDV, Munich, Germany
  2. EE Dept., California Inst. of Technology, Pasadena
  3. Autodesk, Inc., Mountain View
  4. Computer Graphics Lab., California Inst. of Technology, Pasadena
  5. EE Dept., Knoxville
  6. Dept. of Comp. & Info. Science, IUPUI, Indianapolis
