
TextureToMTF: predicting spatial frequency response in the wild


In this work, we propose a no-reference image quality assessment (NR-IQA) approach at the confluence of signal processing and deep learning. We use MTF50 (the spatial frequency at which the modulation transfer function falls to 50% of its peak value), measured on slanted edges, as the image quality metric. We introduce a comprehensive IQA dataset of images captured with hand-held phone cameras in a variety of situations, with slanted-edge targets placed around each scene. The MTF50 values measured at these slanted edges then provide ground-truth quality values for each patch in the captured images. A convolutional neural network is trained to predict MTF50 values from arbitrary image patches. We present results on the proposed dataset and on a synthetically generated TID2013 dataset, showing state-of-the-art performance for IQA in the wild.
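The MTF50 measure used above follows the standard slanted-edge pipeline: extract a 1-D edge spread function (ESF) across the edge, differentiate it to get the line spread function (LSF), take the Fourier magnitude as the MTF, and read off the frequency where it falls to half its zero-frequency value. The sketch below is a simplified 1-D illustration of that computation in NumPy, not the authors' implementation; the function name `mtf50_from_edge` and the synthetic tanh edges are assumptions for demonstration (a full slanted-edge analysis would also oversample the ESF across image rows, as in Burns' method):

```python
import numpy as np

def mtf50_from_edge(esf):
    """Estimate MTF50 (cycles/pixel) from a 1-D edge spread function.

    Differentiates the ESF into a line spread function (LSF), takes the
    Fourier magnitude as the MTF, normalises so MTF(0) = 1, and linearly
    interpolates the frequency where the MTF first drops below 0.5.
    """
    lsf = np.diff(esf)                         # LSF = derivative of ESF
    lsf = lsf * np.hanning(lsf.size)           # window to reduce spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                         # normalise to zero-frequency value
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)   # frequencies in cycles per pixel
    below = np.nonzero(mtf < 0.5)[0]
    if below.size == 0:                        # edge sharper than Nyquist resolution
        return freqs[-1]
    i = below[0]
    # linear interpolation between the two samples bracketing MTF = 0.5
    f0, f1, m0, m1 = freqs[i - 1], freqs[i], mtf[i - 1], mtf[i]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)

# Synthetic edges: a sharper edge transition yields a higher MTF50.
x = np.linspace(-8, 8, 256)
sharp = 0.5 * (1 + np.tanh(x / 0.5))
blurry = 0.5 * (1 + np.tanh(x / 2.0))
```

On these synthetic profiles, the sharp edge produces a higher MTF50 than the blurry one, which is exactly the ordering the trained network is asked to reproduce from arbitrary texture patches.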




  1. Liu, X., van de Weijer, J., Bagdanov, A.D.: RankIQA: learning from rankings for no-reference image quality assessment. In: ICCV (2017)

  2. Ponomarenko, N., Jin, L., Ieremeiev, O., Lukin, V., Egiazarian, K., Astola, J., Vozel, B., Chehdi, K., Carli, M., Battisti, F., et al.: Image database TID2013: peculiarities, results and perspectives. SPIC 30, 57–77 (2015)

  3. Sheikh, H.R., Wang, Z., Cormack, L., Bovik, A.C.: LIVE image quality assessment database release 2 (2005)

  4. Rai, P.K., Maheshwari, S., Mehta, I., Sakurikar, P., Gandhi, V.: Beyond OCRs for document blur estimation. In: ICDAR (2017)

  5. Burns, P.D.: Slanted-edge MTF for digital camera and scanner analysis. In: Proceedings of PICS Conference, IS&T (2000)

  6. Wueller, D.: Low light performance of digital still cameras. In: MCMD, vol. 8667, p. 86671H (2013)

  7. Facciolo, G., Pacianotto, G., Renaudin, M., Viard, C., Guichard, F.: Quantitative measurement of contrast, texture, color, and noise for digital photography of high dynamic range scenes. Electron. Imaging 2018, 170–1 (2018)

  8. Cao, F., Guichard, F., Hornung, H.: Measuring texture sharpness of a digital camera. In: Digital Photography V, vol. 7250, p. 72500H. International Society for Optics and Photonics (2009)

  9. Rai, P.K., Maheshwari, S., Gandhi, V.: Document quality estimation using spatial frequency response. In: ICASSP (2018)

  10. Kumar, J., Chen, F., Doermann, D.: Sharpness estimation for document and scene images. In: ICPR (2012)

  11. Ferzli, R., Karam, L.J.: A no-reference objective image sharpness metric based on the notion of just noticeable blur. TIP 18, 717–728 (2009)

  12. Mittal, A., Moorthy, A.K., Bovik, A.C.: No-reference image quality assessment in the spatial domain. TIP 21 (2012)

  13. Batten, C.F.: Autofocusing and astigmatism correction in the scanning electron microscope. MPhil thesis, University of Cambridge (2000)

  14. Hassen, R., Wang, Z., Salama, M.M.: Image sharpness assessment based on local phase coherence. TIP 22, 2798–2810 (2013)

  15. Zhang, N.F., Vladar, A., Postek, M.T., Larrabee, R.D.: A kurtosis-based statistical measure for two-dimensional processes and its applications to image sharpness. Tech. rep., NIST (2003)

  16. Xiao, B., Wang, K., Bi, X., Li, W., Han, J.: 2D-LBP: an enhanced local binary feature for texture image classification. IEEE TCSVT (2018)

  17. Verma, M., Raman, B.: Local neighborhood difference pattern: a new feature descriptor for natural and texture image retrieval. MTA, p. 11843 (2018)

  18. Armi, L., Fekri-Ershad, S.: Texture image analysis and texture classification methods - a review. arXiv:1904.06554 (2019)

  19. Fekri-Ershad, S., Tajeripour, F.: Color texture classification based on proposed impulse-noise resistant color local binary patterns and significant points selection algorithm, pp. 33–42 (2017)

  20. Zhang, M., Muramatsu, C., Zhou, X., Hara, T., Fujita, H.: Blind image quality assessment using the joint statistics of generalized local binary pattern. IEEE SPL 22, 207–210 (2014)

  21. Zhang, M., Xie, J., Zhou, X., Fujita, H.: No reference image quality assessment based on local binary pattern statistics. In: VCIP, pp. 1–6 (2013)

  22. Shi, J., Xu, L., Jia, J.: Discriminative blur detection features. In: CVPR (2014)

  23. Kang, L., Ye, P., Li, Y., Doermann, D.: Convolutional neural networks for no-reference image quality assessment. In: CVPR (2014)

  24. Bianco, S., Celona, L., Napoletano, P., Schettini, R.: On the use of deep learning for blind image quality assessment. SIVP 12, 355–362 (2018)

  25. Lin, K.Y., Wang, G.: Hallucinated-IQA: no-reference image quality assessment via adversarial learning. In: CVPR (2018)

  26. Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y.: Generative adversarial nets. In: NIPS (2014)

  27. Rusiñol, M., Chazalon, J., Ogier, J.M.: Combining focus measure operators to predict OCR accuracy in mobile-captured document images. In: DAS (2014)

  28. Ye, P., Kumar, J., Kang, L., Doermann, D.: Unsupervised feature learning framework for no-reference image quality assessment. In: CVPR (2012)

  29. Maheshwari, S., Rai, P.K., Sharma, G., Gandhi, V.: Document blur detection using edge profile mining. In: ICVGIP (2016)

  30. Kang, L., Ye, P., Li, Y., Doermann, D.: A deep learning approach to document image quality assessment. In: ICIP (2014)

  31. Saad, M.A., Bovik, A.C., Charrier, C.: Blind image quality assessment: a natural scene statistics approach in the DCT domain. TIP 21 (2012)


Author information

Correspondence to Murtuza Bohra.



About this article


Cite this article

Bohra, M., Maheshwari, S. & Gandhi, V. TextureToMTF: predicting spatial frequency response in the wild. SIViP (2020).


Keywords

  • Image quality prediction
  • Blur prediction
  • Image sharpness
  • Spatial frequency