
A Novel No-reference Subjective Quality Metric for Free Viewpoint Video Using Human Eye Movement

  • Pallab Kanti Podder
  • Manoranjan Paul
  • Manzur Murshed
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10749)

Abstract

Free viewpoint video (FVV) allows users to interactively control the viewpoint and generate new views of a dynamic scene from any 3D position, providing a better 3D visual experience with depth perception. Multiview video coding exploits both texture and depth video information from various angles to encode a number of views to facilitate FVV. The usual practice for single-view or multiview quality assessment is to apply objective metrics such as the peak signal-to-noise ratio (PSNR) or the structural similarity index (SSIM), owing to their simplicity and suitability for real-time applications. However, PSNR and SSIM require a reference image for quality evaluation and cannot be employed for FVV, since a newly synthesized view has no reference view to compare with. Conversely, the widely used subjective estimator, the mean opinion score (MOS), is often biased by the testing environment, the viewers' mood, domain knowledge, and many other factors that may influence the actual assessment. To address these limitations, in this work we devise a no-reference subjective quality assessment metric by simply exploiting the pattern of human eye traversal over FVV. Over FVV contents of different quality, the spatio-temporal gaze data recorded by the participants' eye tracker indicate a more concentrated eye traversal for relatively better quality. We therefore calculate Length, Angle, Pupil-size, and Gaze-duration features from the recorded gaze trajectory. A content- and resolution-invariant operation is carried out before synthesizing the features with an adaptive weighted function to develop a new quality metric using eye traversal (QMET). Test results reveal that the proposed QMET performs better than SSIM and MOS in assessing different aspects of coded video quality for a wide range of FVV contents.
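As a rough illustration of the pipeline summarized above (gaze-feature extraction, content- and resolution-invariant normalization, and weighted fusion), the following Python sketch may be helpful. The function names (gaze_features, qmet), the normalization scales, and the equal default weights are illustrative assumptions only; the paper derives an adaptive weighting rather than fixing the weights in advance.

    import numpy as np

    def gaze_features(x, y, pupil, duration):
        """Per-sequence Length, Angle, Pupil-size and Gaze-duration features
        from eye-tracker samples (x, y in pixels; duration in ms)."""
        dx, dy = np.diff(x), np.diff(y)
        length = np.hypot(dx, dy).sum()              # total gaze-path length
        turns = np.abs(np.diff(np.arctan2(dy, dx)))  # angular change between segments
        return np.array([length,
                         turns.mean() if turns.size else 0.0,
                         np.mean(pupil),
                         np.sum(duration)])

    def qmet(features, scale, weights=(0.25, 0.25, 0.25, 0.25)):
        """Normalize the features to [0, 1] and combine them with a weighted sum.
        The scale vector and the weights are placeholders, not published values."""
        normalized = np.clip(features / scale, 0.0, 1.0)
        # A concentrated traversal (small length and angle) accompanies better
        # quality, so those two terms enter the score inverted.
        contrib = np.array([1 - normalized[0], 1 - normalized[1],
                            normalized[2], normalized[3]])
        return float(np.dot(weights, contrib))

    # Usage with a synthetic gaze trace over a 1920x1080 view
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 1920, 200), rng.uniform(0, 1080, 200)
    pupil, dur = rng.uniform(3, 5, 200), rng.uniform(100, 400, 199)
    score = qmet(gaze_features(x, y, pupil, dur),
                 scale=np.array([4e5, np.pi, 5.0, 6e4]))
    print(round(score, 3))

The inversion of the length and angle terms encodes the observation that a more concentrated traversal accompanies better perceived quality; in the paper the feature weights adapt to the content instead of being fixed as above.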

Keywords

Eye-traversal · Eye-tracking · Free viewpoint video · Gaze-trajectory · HEVC · QMET · Quality assessment

Notes

Acknowledgement

This work was supported in part by the Australian Research Council under Discovery Projects Grant DP130103670.

References

  1. Gu, K., Zhai, G., Lin, W., Liu, M.: The analysis of image contrast: from quality assessment to automatic enhancement. IEEE Trans. Cybern. 46(1), 284–297 (2016)
  2. Rahaman, D.M., Paul, M.: Adaptive weighting between warped and learned foregrounds for view synthesize. In: 2017 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), pp. 49–54. IEEE (2017)
  3. Zhu, C., Li, S.: Depth image based view synthesis: new insights and perspectives on hole generation and filling. IEEE Trans. Broadcast. 62(1), 82–93 (2016)
  4. Battisti, F., Bosc, E., Carli, M., Le Callet, P., Perugia, S.: Objective image quality assessment of 3D synthesized views. Sig. Process.: Image Commun. 30, 78–88 (2015)
  5. Xu, M., Zhang, J., Ma, Y., Wang, Z.: A novel objective quality assessment method for perceptual video coding in conversational scenarios. In: 2014 IEEE Visual Communications and Image Processing Conference, pp. 29–32. IEEE (2014)
  6. Gu, K., Liu, M., Zhai, G., Yang, X., Zhang, W.: Quality assessment considering viewing distance and image resolution. IEEE Trans. Broadcast. 61(3), 520–531 (2015)
  7. Liu, H., Klomp, N., Heynderickx, I.: A no-reference metric for perceived ringing artifacts in images. IEEE Trans. Circuits Syst. Video Technol. 20(4), 529–539 (2010)
  8. Fang, Y., Ma, K., Wang, Z., Lin, W., Fang, Z., Zhai, G.: No-reference quality assessment of contrast-distorted images based on natural scene statistics. IEEE Signal Process. Lett. 22(7), 838–842 (2015)
  9. Zhu, K., Li, C., Asari, V., Saupe, D.: No-reference video quality assessment based on artifact measurement and statistical analysis. IEEE Trans. Circuits Syst. Video Technol. 25(4), 533–546 (2015)
  10. Gu, K., Lin, W., Zhai, G., Yang, X., Zhang, W., Chen, C.W.: No-reference quality metric of contrast-distorted images based on information maximization. IEEE Trans. Cybern. 47, 4559–4565 (2016)
  11. Tourancheau, S., Autrusseau, F., Sazzad, Z.P., Horita, Y.: Impact of subjective dataset on the performance of image quality metrics. In: 2008 15th IEEE International Conference on Image Processing (ICIP), pp. 365–368. IEEE (2008)
  12. Liu, H., Heynderickx, I.: Visual attention in objective image quality assessment: based on eye-tracking data. IEEE Trans. Circuits Syst. Video Technol. 21(7), 971–982 (2011)
  13. Böhme, M., Dorr, M., Graw, M., Martinetz, T., Barth, E.: A software framework for simulating eye trackers. In: Proceedings of the 2008 Symposium on Eye Tracking Research and Applications, pp. 251–258. ACM (2008)
  14. Seshadrinathan, K., Soundararajan, R., Bovik, A.C., Cormack, L.K.: Study of subjective and objective quality assessment of video. IEEE Trans. Image Process. 19(6), 1427–1441 (2010)
  15. Jia, L., Zhong, X., Tu, Y.: No-reference video quality assessment model based on eye tracking data. In: International Conference on Information, Electronics and Computer, pp. 97–100 (2014)
  16. Arndt, S., Radun, J., Antons, J.N., Möller, S.: Using eye-tracking and correlates of brain activity to predict quality scores. In: 2014 Sixth International Workshop on Quality of Multimedia Experience (QoMEX), pp. 281–285. IEEE (2014)
  17. Albanesi, M.G., Amadeo, R.: A new algorithm for objective video quality assessment on eye tracking data. In: 2014 International Conference on Computer Vision Theory and Applications (VISAPP), vol. 1, pp. 462–469. IEEE (2014)
  18. Tsai, C.-M., Guan, S.-S., Tsai, W.-C.: Eye movements on assessing perceptual image quality. In: Zhou, J., Salvendy, G. (eds.) ITAP 2016. LNCS, vol. 9754, pp. 378–388. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-39943-0_37
  19. Ribeiro, F., Florencio, D., Nascimento, V.: Crowdsourcing subjective image quality evaluation. In: 2011 18th IEEE International Conference on Image Processing (ICIP), pp. 3097–3100. IEEE (2011)
  20. Streijl, R.C., Winkler, S., Hands, D.S.: Mean opinion score (MOS) revisited: methods and applications, limitations and alternatives. Multimed. Syst. 22(2), 213–227 (2016)
  21. Podder, P.K., Paul, M., Murshed, M.: QMET: a new quality assessment metric for no-reference video coding by using human eye traversal. In: 2016 International Conference on Image and Vision Computing New Zealand (IVCNZ), pp. 1–6. IEEE (2016)
  22. Bross, B., Han, W.J., Ohm, J.R., Sullivan, G.J., Wiegand, T.: High efficiency video coding text specification draft 8. JCTVC-J1003, Sweden (2012)
  23. Joint Collaborative Team on Video Coding (JCT-VC): HM software manual, CVS server. http://hevc.kw.bbc.co.uk/svn/jctvc-hm/. Accessed Dec 2016
  24. Podder, P.K., Paul, M., Rahaman, D.M., Murshed, M.: Improved depth coding for HEVC focusing on depth edge approximation. Sig. Process.: Image Commun. 55, 80–92 (2017)
  25. Mulvey, F., Villanueva, A., Sliney, D., Lange, R., Cotmore, S., Donegan, M.: Exploration of safety issues in eyetracking (2008)
  26. The basics of power law. https://en.wikipedia.org/wiki/power_law. Accessed Dec 2016
  27. Salehin, M.M., Paul, M.: Human visual field based saliency prediction method using eye tracker data for video summarization. In: 2016 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), pp. 1–6. IEEE (2016)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Pallab Kanti Podder (1)
  • Manoranjan Paul (1)
  • Manzur Murshed (2)
  1. School of Computing and Mathematics, Charles Sturt University, Bathurst, Australia
  2. School of Information Technology, Federation University, Churchill, Australia
