Fast and Accurate Micro Lenses Depth Maps for Multi-focus Light Field Cameras

  • Rodrigo Ferreira
  • Nuno Goncalves
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9796)


Light field cameras capture a scene's multi-directional light field in a single image, allowing depth to be estimated. In this paper, we introduce a fully automatic, fast method for depth estimation from a single plenoptic image that runs a RANSAC-like algorithm for feature matching. The novelty of our approach is a global method that back-projects correspondences found by photometric similarity to obtain a virtual 3D point cloud. We then exploit micro lenses with different focal lengths in a multiple-depth-map refinement phase and reproject them to the image plane, generating an accurate depth map per micro lens. Tests with simulated and real images show a good trade-off between computation time and accuracy. Our method achieves accuracy similar to the state of the art in considerably less time (speedups of around 3x).
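The RANSAC-like matching step described above can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the authors' implementation: it assumes correspondences between two neighbouring micro-lens images reduce to 1D offsets along the epipolar baseline, votes for a dominant disparity with single-point RANSAC hypotheses, then triangulates a virtual depth from it. The function names, tolerance, and baseline/focal parameters are illustrative assumptions.

```python
import random

def ransac_disparity(matches, n_iters=200, tol=0.5, seed=0):
    """Estimate a dominant disparity from noisy point correspondences
    between two neighbouring micro-lens images.  `matches` is a list of
    (x_ref, x_other) coordinates along the epipolar baseline."""
    rng = random.Random(seed)
    best_d, best_inliers = None, []
    for _ in range(n_iters):
        x_ref, x_other = rng.choice(matches)
        d = x_other - x_ref  # one-point disparity hypothesis
        inliers = [m for m in matches if abs((m[1] - m[0]) - d) < tol]
        if len(inliers) > len(best_inliers):
            best_d, best_inliers = d, inliers
    # refine the disparity on the consensus set (least-squares mean)
    best_d = sum(b - a for a, b in best_inliers) / len(best_inliers)
    return best_d, best_inliers

def disparity_to_virtual_depth(disparity, baseline, focal):
    """Simple triangulation: depth is proportional to baseline*focal/disparity."""
    return baseline * focal / disparity

# Four correspondences with true disparity ~2.0 px, plus one outlier.
matches = [(0.0, 2.0), (1.0, 3.1), (2.0, 3.9), (3.0, 5.0), (4.0, 9.0)]
d, inliers = ransac_disparity(matches)
depth = disparity_to_virtual_depth(d, baseline=0.1, focal=5.0)
```

In the real multi-focus setting this vote would be repeated per micro lens and the resulting depths fused across lens types, as the refinement phase in the paper does.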


Keywords: Salient Point · Depth Estimation · Epipolar Line · Depth Ground Truth · Epipolar Plane Image



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Institute of Systems and Robotics, University of Coimbra, Coimbra, Portugal
