
Evaluating Quality of Super-Resolved Face Images

  • Bir Bhanu
  • Ju Han
Part of the Advances in Pattern Recognition book series (ACVPR)

Abstract

The widespread use of super-resolution methods in applications such as surveillance has led to an increasing need for quality assessment measures. Current quality measures aim to compare different fusion methods by assessing the quality of the fused images, and they consider only the information transferred between the super-resolved image and the input images. In this chapter, we propose an integrated objective quality evaluation measure for super-resolved images, which focuses on evaluating the quality of super-resolved images constructed from input images captured under different conditions. The proposed measure combines both the relationship between the super-resolved image and the input images and the relationships among the input images. Using the proposed measure, the quality of super-resolved face images constructed from videos is evaluated under different input conditions, including variations in pose, lighting, and facial expression, and the number of input images.
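
To make the idea concrete, the following is a minimal sketch in Python of one plausible instantiation of such an integrated measure. It assumes a mutual-information formulation in the spirit of existing fusion-quality metrics; the function names, the pairwise redundancy term, and the normalization are illustrative assumptions, not the exact measure proposed in this chapter.

# Illustrative sketch (not the chapter's exact formulation) of an integrated
# quality score for a super-resolved image F built from inputs I_1..I_N.
# It combines (a) the mutual information between F and each input and
# (b) the mutual information among the inputs themselves.
import numpy as np

def _mutual_information(a, b, bins=64):
    """Mutual information (in nats) between two equally sized grayscale images."""
    hist_2d, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist_2d / hist_2d.sum()          # joint probability
    px = pxy.sum(axis=1, keepdims=True)    # marginal of a
    py = pxy.sum(axis=0, keepdims=True)    # marginal of b
    nz = pxy > 0                           # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def sr_quality_score(super_resolved, inputs, bins=64):
    """
    Hypothetical integrated score: average information transferred from each
    input to the super-resolved image, discounted when the inputs are highly
    redundant (high mutual information among themselves).
    """
    # Inputs are assumed to be registered and resampled to the size of
    # `super_resolved` before the score is computed.
    transfer = np.mean([_mutual_information(super_resolved, im, bins)
                        for im in inputs])
    # Pairwise redundancy among the input images.
    pairs = [(i, j) for i in range(len(inputs)) for j in range(i + 1, len(inputs))]
    redundancy = np.mean([_mutual_information(inputs[i], inputs[j], bins)
                          for i, j in pairs]) if pairs else 0.0
    return transfer / (1.0 + redundancy)

Under this sketch, a super-resolved image that draws information from many mutually diverse inputs scores higher than one reconstructed from nearly identical frames, which mirrors the evaluation of input conditions (pose, lighting, expression, and number of frames) described above.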

Keywords

Facial Expression · Face Recognition · Video Sequence · Input Image · Face Image

Copyright information

© Springer-Verlag London Limited 2010

Authors and Affiliations

  1. Bourns College of Engineering, University of California, Riverside, USA
  2. Lawrence Berkeley National Laboratory, University of California, Berkeley, USA
