An Effective Dual-Fisheye Lens Stitching Method Based on Feature Points

  • Li Yao
  • Ya Lin
  • Chunbo Zhu
  • Zuolong Wang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11295)


A fisheye lens is a lightweight, super-wide-angle lens; two such cameras are usually sufficient to capture 360-degree panoramic images. However, their limited overlapping field of view makes it difficult to stitch the images at the boundaries. This paper introduces a novel method for dual-fisheye camera stitching based on feature points, and we further propose extending the method to video. Results show that the method produces high-quality panoramic images when stitching the original images from the Samsung Gear 360 dual-fisheye camera.
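Feature-point stitching pipelines of the kind the abstract describes typically detect and match keypoints in the narrow overlap region, then fit an alignment model robustly with RANSAC so that mismatches do not corrupt the result. The sketch below illustrates only that robust-fitting step, in pure NumPy, using a translation-only model for simplicity; the function name and all parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ransac_translation(src, dst, iters=200, tol=2.0, seed=0):
    """Estimate a 2-D translation from point matches, tolerating outliers.

    src, dst: (N, 2) arrays of matched feature coordinates.
    Returns (estimated translation, boolean inlier mask).
    """
    rng = np.random.default_rng(seed)
    best_t, best_inliers = None, np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))           # minimal sample: one match
        t = dst[i] - src[i]                  # candidate translation
        # A match is an inlier if it agrees with the candidate within tol px
        inliers = np.linalg.norm(dst - (src + t), axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_t, best_inliers = t, inliers
    # Refine with a least-squares fit over the consensus set
    best_t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return best_t, best_inliers

# Synthetic matches: 80 inliers shifted by (40, -8) px, 20 gross outliers
rng = np.random.default_rng(1)
src = rng.uniform(0, 500, (100, 2))
dst = src + np.array([40.0, -8.0]) + rng.normal(0, 0.5, (100, 2))
dst[80:] = rng.uniform(0, 500, (20, 2))      # corrupt the last 20 matches
t, inliers = ransac_translation(src, dst)
print(np.round(t, 1), int(inliers.sum()))
```

A full stitcher would use a homography or a local warp instead of a translation, but the consensus logic is the same: sample a minimal set of matches, score the induced model by its inlier count, and refine on the winners.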


Keywords: Dual-fisheye · Stitching · Panorama video · Virtual reality



This work is supported by the Natural Science Foundation of Jiangsu Province under Grant No. BK20181267.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Computer Science and Engineering, Southeast University, Nanjing, People’s Republic of China
  2. Key Laboratory of Computer Network and Information Integration, Southeast University, Ministry of Education, Nanjing, People’s Republic of China
  3. Samsung Electronics, Suwon, South Korea