Multi-focus image fusion with random walks and guided filters

  • Zhaobin Wang
  • Lina Chen
  • Jian Li
  • Ying Zhu
Regular Paper

Abstract

Multi-focus image fusion produces an all-in-focus image, which benefits both human vision and subsequent image processing. In this paper, a novel multi-focus image fusion method based on the random walk and the guided filter is proposed. The method exploits the random walk both for image decomposition and for optimization, and also uses it to compute weight maps directly; the complementary strengths of the random walk and the guided filter are balanced through manually tuned proportional coefficients. The proposed method consists of six steps. First, the source images are decomposed into base layers and detail layers with the random walk. Second, initial weight maps are computed directly with the random walk and smoothed with the guided filter to obtain refined weight maps for the detail layers and the base layers, respectively. Third, the weight maps of the detail layers and the base layers are obtained by summing the initial weight maps in different proportions. Fourth, the final weight maps of the detail layers are refined by random-walk optimization. Fifth, the fused detail layer and the fused base layer are obtained by weighted averaging of the detail layers and the base layers, respectively. Finally, the fused image is obtained by summing the fused base layer and the fused detail layer. Experiments demonstrate that the proposed method outperforms many other approaches in both subjective and objective assessments.
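
The following is a minimal illustrative sketch of the base/detail weighted-fusion idea described above, not the authors' implementation: the random-walk decomposition, the proportional combination of initial weight maps, and the random-walk optimization of the detail-layer weights are replaced by a box-filter decomposition and a Laplacian-based focus mask smoothed with the guided filter. The function names (fuse_pair, _normalize), the filter parameters, and the focus measure are assumptions made for illustration; the guided filter comes from opencv-contrib-python (cv2.ximgproc.guidedFilter).

import cv2  # requires opencv-contrib-python for cv2.ximgproc
import numpy as np

def _normalize(w1, w2, eps=1e-12):
    # Make the two weight maps sum to one at every pixel.
    s = w1 + w2 + eps
    return w1 / s, w2 / s

def fuse_pair(img_a, img_b):
    # Fuse two grayscale source images of the same scene and size.
    a = img_a.astype(np.float32) / 255.0
    b = img_b.astype(np.float32) / 255.0

    # Step 1 (simplified): two-scale decomposition into base and detail layers
    # (the paper uses a random-walk decomposition; a box filter stands in here).
    base_a, base_b = cv2.blur(a, (31, 31)), cv2.blur(b, (31, 31))
    det_a, det_b = a - base_a, b - base_b

    # Step 2 (simplified): a focus measure (absolute Laplacian response)
    # turned into a binary "more in focus" mask for each source image.
    sal_a = np.abs(cv2.Laplacian(cv2.blur(a, (3, 3)), cv2.CV_32F))
    sal_b = np.abs(cv2.Laplacian(cv2.blur(b, (3, 3)), cv2.CV_32F))
    mask_a = (sal_a >= sal_b).astype(np.float32)
    mask_b = 1.0 - mask_a

    # Guided-filter smoothing of the masks: a large radius for the base
    # layers, a small radius for the detail layers (illustrative values).
    w_base_a = cv2.ximgproc.guidedFilter(a, mask_a, 45, 0.1)
    w_base_b = cv2.ximgproc.guidedFilter(b, mask_b, 45, 0.1)
    w_det_a = cv2.ximgproc.guidedFilter(a, mask_a, 7, 1e-4)
    w_det_b = cv2.ximgproc.guidedFilter(b, mask_b, 7, 1e-4)
    w_base_a, w_base_b = _normalize(w_base_a, w_base_b)
    w_det_a, w_det_b = _normalize(w_det_a, w_det_b)

    # Final steps: weighted average per layer, then recombine base + detail.
    fused_base = w_base_a * base_a + w_base_b * base_b
    fused_det = w_det_a * det_a + w_det_b * det_b
    return np.clip(fused_base + fused_det, 0.0, 1.0)

Under these assumptions, fuse_pair returns a float image in [0, 1]; the paper's own pipeline differs in how the weight maps are generated and optimized.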

Keywords

Image fusion · Multi-focus image · Random walk · Guided filter · Weighted averaging

Notes

Acknowledgements

We would like to thank the associate editors and the reviewers for their valuable comments and suggestions. The authors also thank Shuai Wang for his generous help.

Funding

This study was funded by the National Natural Science Foundation of China (Grant no. 61201421) and the Fundamental Research Funds for the Central Universities of Lanzhou University (lzujbky-2017-187).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. School of Information Science and Engineering, Lanzhou University, Lanzhou, China
  2. Key Laboratory of Microbial Resources Exploitation and Application of Gansu Province, Institute of Biology, Gansu Academy of Sciences, Lanzhou, China