
PMIQD 2019: A Pathological Microscopic Image Quality Database with Nonexpert and Expert Scores

  • Shuning Xu
  • Menghan Hu (corresponding author)
  • Wangyang Yu
  • Jianlin Feng
  • Qingli Li
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1181)

Abstract

In medical diagnostic analysis, pathological microscopic images are often regarded as a gold standard, and hence their study is of great necessity. High-quality pathological microscopic images enable doctors to arrive at correct diagnoses, and such images are an important cornerstone for the modernization and computerization of medical procedures. The quality of pathological microscopic images may be degraded for a variety of reasons, making it difficult to acquire key information; research on the quality assessment of pathological microscopic images is therefore necessary. In this paper, we perform a study on the subjective quality assessment of pathological microscopic images and investigate whether existing objective quality measures can be applied to them. Concretely, we establish a new pathological microscopic image quality database (PMIQD), which includes 425 pathological microscopic images of varying quality. The mean opinion scores rated by nonexperts and experts are then calculated. In addition, we investigate the prediction performance of popular existing image quality assessment (IQA) algorithms on PMIQD, including 8 no-reference (NR) methods. Experimental results demonstrate that the present objective models do not work well, and that IQA methods for pathological microscopic images still need to be developed to predict the quality rated by nonexperts and experts.
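A minimal sketch of the workflow described above: per-image mean opinion scores (MOS) are obtained by averaging ratings over all observers, and each NR model's predictions are then compared against the MOS with Spearman and Pearson correlations. The array shapes, the 1-5 rating scale, the toy data, and the use of NumPy/SciPy are illustrative assumptions, not the authors' released code or the exact PMIQD protocol.

```python
# Hedged sketch: MOS computation and NR-IQA evaluation on a PMIQD-like dataset.
# Assumptions (not from the paper): raw ratings are stored as a
# (num_raters x num_images) array, and "predicted" stands in for the output
# of any of the 8 NR models evaluated.
import numpy as np
from scipy import stats


def mean_opinion_scores(raw_scores: np.ndarray) -> np.ndarray:
    """Average each image's ratings over all raters to obtain its MOS."""
    return raw_scores.mean(axis=0)


def evaluate_nr_metric(predicted: np.ndarray, mos: np.ndarray) -> dict:
    """Correlate objective predictions with subjective MOS.

    SROCC measures monotonic agreement; PLCC measures linear agreement
    (in IQA studies PLCC is often computed after a nonlinear regression
    step, omitted here for brevity).
    """
    srocc, _ = stats.spearmanr(predicted, mos)
    plcc, _ = stats.pearsonr(predicted, mos)
    return {"SROCC": srocc, "PLCC": plcc}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: 10 nonexpert raters scoring 425 images on a 1-5 scale.
    raw = rng.integers(1, 6, size=(10, 425)).astype(float)
    mos = mean_opinion_scores(raw)
    # Stand-in for an NR model's quality predictions.
    predicted = mos + rng.normal(0.0, 0.5, size=mos.shape)
    print(evaluate_nr_metric(predicted, mos))
```

In practice the same evaluation would be run twice, once against the nonexpert MOS and once against the expert MOS, so that the gap between the two rater groups can be quantified.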

Keywords

Pathological microscopic image · Subjective image quality assessment · No-reference model observer · Database

Acknowledgment

This work is sponsored by the Shanghai Sailing Program (No. 19YF1414100), the National Natural Science Foundation of China (No. 61831015, No. 61901172), the STCSM (No. 18DZ2270700), and the China Postdoctoral Science Foundation funded project (No. 2016M600315).


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Shuning Xu (1)
  • Menghan Hu (1), corresponding author
  • Wangyang Yu (1)
  • Jianlin Feng (1)
  • Qingli Li (1)
  1. Shanghai Key Laboratory of Multidimensional Information Processing, East China Normal University, Shanghai, China
