Comparison of Techniques for Mitigating the Effects of Illumination Variations on the Appearance of Human Targets

  • Conference paper
Advances in Visual Computing (ISVC 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4842)

Abstract

Several techniques have been proposed to date to build colour invariants between camera views with varying illumination conditions. In this paper, we propose to improve colour invariance by using data-dependent techniques. To this aim, we compare the effectiveness of histogram stretching, illumination filtration, full histogram equalisation and controlled histogram equalisation in a video surveillance domain. All such techniques have limited computational requirements and are therefore suitable for real time implementation. Controlled histogram equalisation is a modified histogram equalisation operating under the influence of a control parameter [1]. Our empirical comparison looks at the ability of these techniques to make the global colour appearance of single human targets more matchable under illumination changes, whilst still discriminating between different people. Tests are conducted on the appearance of individuals from two camera views with greatly differing illumination conditions and invariance is evaluated through a similarity measure based upon colour histograms. In general, our results indicate that these techniques improve colour invariance; amongst them, full and controlled equalisation consistently showed the best performance.
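
As an illustration of how such data-dependent remappings can be applied and evaluated, the sketch below implements full histogram equalisation and one plausible form of controlled histogram equalisation, obtained here by blending the equalising look-up table with the identity mapping through a control parameter alpha. This blend and the histogram-intersection similarity are illustrative assumptions, not the exact formulations of [1] or of the paper's similarity measure, and all function names are hypothetical.

```python
import numpy as np

def equalisation_lut(channel, n_bins=256):
    """Full histogram equalisation: map each grey level through the
    normalised cumulative histogram of the channel."""
    hist = np.bincount(channel.ravel(), minlength=n_bins).astype(np.float64)
    cdf = np.cumsum(hist)
    cdf /= cdf[-1]                      # normalise to [0, 1]
    return np.round(cdf * (n_bins - 1)).astype(np.uint8)

def controlled_equalisation_lut(channel, alpha=0.5, n_bins=256):
    """Controlled equalisation (illustrative form): blend the equalising
    mapping with the identity mapping via a control parameter alpha in [0, 1].
    The exact formulation in [1] may differ."""
    identity = np.arange(n_bins, dtype=np.float64)
    equalised = equalisation_lut(channel, n_bins).astype(np.float64)
    lut = alpha * equalised + (1.0 - alpha) * identity
    return np.round(lut).astype(np.uint8)

def remap(image, lut_fn, **kwargs):
    """Apply a per-channel look-up table to an 8-bit colour image."""
    out = np.empty_like(image)
    for c in range(image.shape[2]):
        lut = lut_fn(image[..., c], **kwargs)
        out[..., c] = lut[image[..., c]]
    return out

def colour_histogram(image, bins_per_channel=8):
    """Normalised joint colour histogram used as the appearance descriptor."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins_per_channel,) * 3,
                             range=((0, 256),) * 3)
    return hist / hist.sum()

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions
    (an assumed stand-in for the paper's histogram-based similarity)."""
    return np.minimum(h1, h2).sum()

# Example: compare the same target seen from two cameras, before and after
# controlled equalisation (target_a, target_b are HxWx3 uint8 crops).
# raw_sim = histogram_similarity(colour_histogram(target_a),
#                                colour_histogram(target_b))
# eq_a = remap(target_a, controlled_equalisation_lut, alpha=0.5)
# eq_b = remap(target_b, controlled_equalisation_lut, alpha=0.5)
# eq_sim = histogram_similarity(colour_histogram(eq_a), colour_histogram(eq_b))
```

With this formulation, alpha = 1 recovers full equalisation and alpha = 0 leaves the appearance untouched; intermediate values temper the equalisation, which is the role the abstract attributes to the control parameter of [1]. Histogram stretching and illumination filtration are omitted for brevity.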

References

  1. Madden, C., Cheng, E.D., Piccardi, M.: Tracking people across disjoint camera views by an illumination-tolerant appearance representation. Machine Vision and Applications 18, 233–247 (2007)

  2. Abdel-Hakim, A.E., Farag, A.A.: CSIFT: A SIFT descriptor with color invariant characteristics. International Conference on Computer Vision and Pattern Recognition 2, 1978–1983 (2006)

  3. Finlayson, G., Hordley, S., Schaefer, G., Tian, G.Y.: Illuminant and device invariant colour using histogram equalisation. Pattern Recognition 38, 179–190 (2005)

  4. Barnard, K., Funt, B.: Camera characterization for color research. Color Research and Application 27, 153–164 (2002)

  5. Bala, R.: Device characterization. In: Sharma, G. (ed.) Digital Color Imaging Handbook. CRC Press, Boca Raton, USA (2003)

  6. Javed, O., Rasheed, Z., Shafique, K., Shah, M.: Tracking across multiple cameras with disjoint views. IEEE Conference on Computer Vision and Pattern Recognition 2, 26–33 (2005)

  7. Javed, O., Rasheed, Z., Shafique, K., Shah, M.: Tracking across multiple cameras with disjoint views. International Conference on Computer Vision 2, 952–957 (2003)

  8. Weiss, Y.: Deriving intrinsic images from image sequences. International Conference on Computer Vision 2, 68–75 (2001)

  9. Barnard, K., Funt, B., Cardei, V.: A comparison of computational colour constancy algorithms; part one: Methodology and experiments with synthesized data. IEEE Transactions on Image Processing 11, 972–984 (2002)

  10. Toth, D., Aach, T., Metzler, V.: Bayesian spatiotemporal motion detection under varying illumination. In: European Signal Processing Conference, pp. 2081–2084 (2000)

  11. Zhou, S.K., Chellappa, R.: From sample similarity to ensemble similarity: Probabilistic distance measures in reproducing kernel Hilbert space. IEEE Transactions on Pattern Analysis and Machine Intelligence 28, 917–929 (2006)

  12. Wren, C., Azarbayejani, A., Darrell, T., Pentland, A.P.: Pfinder: Real-time tracking of the human body. IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 780–785 (1997)

Editor information

George Bebis, Richard Boyle, Bahram Parvin, Darko Koracin, Nikos Paragios, Syeda-Mahmood Tanveer, Tao Ju, Zicheng Liu, Sabine Coquillart, Carolina Cruz-Neira, Torsten Müller, Tom Malzbender

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Madden, C., Piccardi, M., Zuffi, S. (2007). Comparison of Techniques for Mitigating the Effects of Illumination Variations on the Appearance of Human Targets. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2007. Lecture Notes in Computer Science, vol 4842. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76856-2_12

  • DOI: https://doi.org/10.1007/978-3-540-76856-2_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76855-5

  • Online ISBN: 978-3-540-76856-2

  • eBook Packages: Computer Science, Computer Science (R0)
