A Unifying Theory for Central Panoramic Systems and Practical Implications

  • Christopher Geyer
  • Kostas Daniilidis
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1843)

Abstract

Omnidirectional vision systems can provide panoramic alertness in surveillance, improve navigational capabilities, and produce panoramic images for multimedia. Catadioptric realizations of omnidirectional vision combine reflective surfaces and lenses. A particular class of them, the central panoramic systems, preserves the uniqueness of the projection viewpoint. In fact, every central projection system, including the well-known perspective projection onto a plane, falls into this category.

In this paper, we provide a unifying theory for all central catadioptric systems. We show that all of them are isomorphic to projective mappings from the sphere to a plane with a projection center on the perpendicular to the plane. Subcases are the stereographic projection equivalent to parabolic projection and the central planar projection equivalent to every conventional camera. We define a duality among projections of points and lines as well as among different mappings.
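
To make the isomorphism concrete, the following is a minimal sketch of the two-step mapping in one possible parametrization; the symbols ℓ and m, the unit-sphere normalization, and the sign conventions are choices made here for illustration rather than the paper's exact notation. A scene point is first projected centrally onto a sphere around the single effective viewpoint, and the resulting sphere point is then projected from an axis point at height ℓ onto a plane at distance m below the sphere center:

\[
  (x, y, z) \;\mapsto\; \frac{1}{r}\,(x, y, z), \qquad r = \sqrt{x^{2} + y^{2} + z^{2}},
\]
\[
  (u, v) \;=\; \left( \frac{(\ell + m)\,x}{\ell r - z},\; \frac{(\ell + m)\,y}{\ell r - z} \right).
\]

Under this convention, ℓ = 0 places the second projection center at the sphere center and recovers the central planar (perspective) subcase, while ℓ = 1 places it at the pole and gives the stereographic subcase corresponding to the parabolic mirror; intermediate offsets cover the remaining central catadioptric configurations.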

This unification is novel and has a significant impact on the 3D interpretation of images. We present new invariances inherent in parabolic projections and a unifying calibration scheme from one view. We describe the implied advantages of catadioptric systems and explain why images arising in central catadioptric systems contain more information than those from conventional cameras. One example is that intrinsic calibration from a single view is possible for parabolic catadioptric systems given only three lines. Another example is metric rectification using only affine information about the scene.
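
As a small illustration of the parabolic invariance at work, the sketch below (Python with NumPy; the function name parabolic_project, the chosen line, and the fixed parameters ℓ = m = 1 are assumptions of this example, not the paper's code) projects samples of a 3D line through the stereographic model written out above and checks numerically that they land on a common circle. Constraints of this kind, collected from three line images, are what the single-view intrinsic calibration mentioned above can exploit.

import numpy as np

def parabolic_project(points):
    """Parabolic (stereographic) sketch: central projection onto the unit
    sphere, then projection from the pole (0, 0, 1) onto the plane z = -1."""
    s = points / np.linalg.norm(points, axis=1, keepdims=True)
    x, y, z = s[:, 0], s[:, 1], s[:, 2]
    return np.column_stack([2.0 * x / (1.0 - z), 2.0 * y / (1.0 - z)])

# Sample a 3D line that does not pass through the viewpoint (arbitrary choice).
t = np.linspace(-5.0, 5.0, 50)
p0 = np.array([1.0, -2.0, 3.0])
d = np.array([0.5, 1.0, -0.2])
uv = parabolic_project(p0 + t[:, None] * d)

# If the line image is a circle, u^2 + v^2 + D*u + E*v + F = 0 fits all
# samples; the maximum residual should be at floating-point level.
A = np.column_stack([uv[:, 0], uv[:, 1], np.ones(len(uv))])
b = -(uv[:, 0] ** 2 + uv[:, 1] ** 2)
D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
print("max deviation from fitted circle:",
      np.abs(uv[:, 0] ** 2 + uv[:, 1] ** 2 + D * uv[:, 0] + E * uv[:, 1] + F).max())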

Keywords

Great Circle, Image Center, Stereographic Projection, Perspective Projection, Parabolic Mirror


Copyright information

© Springer-Verlag Berlin Heidelberg 2000

Authors and Affiliations

  • Christopher Geyer (1)
  • Kostas Daniilidis (1)
  1. GRASP Laboratory, University of Pennsylvania, Philadelphia, PA
