Refining Face Tracking with Integral Projections

  • Ginés García Mateos
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2688)


Integral projections can be used, by themselves, to accurately track human faces in video sequences. Using projections, the tracking problem is effectively separated into the vertical, horizontal and rotational dimensions. Each of these sub-problems is solved through the alignment of a projection signal -a one-dimensional pattern- with a projection model. This separation yields an important improvement in feature location accuracy and computational efficiency. The method is compared with the CamShift algorithm. Our experiments also show that the method is highly robust to 3D pose, facial expression, lighting conditions, partial occlusion, and variation in facial features.
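The core idea described above -- reducing 2D tracking to the alignment of 1D projection signals -- can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it computes a vertical integral projection (the mean gray level of each row of a face region) and recovers the vertical displacement by exhaustively aligning the signal against a stored model under a sum-of-squared-differences criterion. The function names and the SSD cost are assumptions for the sketch.

```python
import numpy as np

def vertical_projection(region):
    # Vertical integral projection: mean gray level of each row
    # of the (rows x cols) face region.
    return region.mean(axis=1)

def align_1d(signal, model, max_shift=10):
    # Align a 1-D projection signal to a 1-D model by exhaustive
    # search over integer shifts, minimizing the mean squared
    # difference on the overlapping part. Returns (shift, cost).
    n = len(model)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        idx = np.arange(n) + s
        valid = (idx >= 0) & (idx < len(signal))
        if valid.sum() < n // 2:
            continue  # require enough overlap to compare
        diff = signal[idx[valid]] - model[valid]
        cost = np.mean(diff ** 2)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift, best_cost

# Synthetic example: a dark horizontal band (e.g. the eye region)
# appears 5 rows lower in the current frame than in the model.
model_pv = np.ones(40); model_pv[10:14] = 0.2   # band at row 10 in the model
frame_pv = np.ones(40); frame_pv[15:19] = 0.2   # band shifted down by 5 rows
shift, cost = align_1d(frame_pv, model_pv)
print(shift)  # → 5, the recovered vertical displacement
```

The same one-dimensional alignment can then be repeated on the horizontal projection and on projections of the rotated region, which is what makes the per-frame cost so low compared to a full 2D search.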


Keywords: Facial Feature, Location Accuracy, Integral Projection, Face Model, Vertical Projection




  [1] Bradski, G.R.: Computer Vision Face Tracking for Use in a Perceptual User Interface. Intel Technology Journal Q2'98 (1998)
  [2] Spors, S., Rabenstein, R.: A Real-Time Face Tracker for Color Video. Proc. of IEEE Intl. Conference on Acoustics, Speech, and Signal Processing, Utah, USA (2001)
  [3] Kaucic, R., Blake, A.: Accurate, Real-Time, Unadorned Lip Tracking. Proc. of 6th Intl. Conference on Computer Vision (1998) 370–375
  [4] Sobottka, K., Pitas, I.: Segmentation and Tracking of Faces in Color Images. Proc. of 2nd Intl. Conf. on Aut. Face and Gesture Recognition (1996) 236–241
  [5] Stiefelhagen, R., Yang, J., Waibel, A.: A Model-Based Gaze Tracking System. Proc. of IEEE Intl. Symposia on Intelligence and Systems (1996) 304–310
  [6] Pahor, V., Carrato, S.: A Fuzzy Approach to Mouth Corner Detection. Proc. of ICIP-99, Kobe, Japan (1999) I-667–I-671
  [7] Schwerdt, K., Crowley, J.L.: Robust Face Tracking Using Color. Proc. of 4th Intl. Conf. on Aut. Face and Gesture Recognition, Grenoble, France (2000) 90–95
  [8] García-Mateos, G., Ruiz, A., López-de-Teruel, P.E.: Face Detection Using Integral Projection Models. Proc. of IAPR Intl. Workshops S+SSPR'2002, Windsor, Canada (2002) 644–653
  [9] Isard, M., Blake, A.: Contour Tracking by Stochastic Propagation of Conditional Density. Proc. of 4th Eur. Conf. on Computer Vision, Cambridge, UK (1996) 343–356
  [10] Vieren, C., Cabestaing, F., Postaire, J.: Catching Moving Objects with Snakes for Motion Tracking. Pattern Recognition Letters 16 (1995) 679–685
  [11] Pentland, A., Moghaddam, B., Starner, T.: View-Based and Modular Eigenspaces for Face Recognition. Proc. CVPR'94, Seattle, Washington, USA (1994) 84–91
  [12] La Cascia, M., Sclaroff, S., Athitsos, V.: Fast, Reliable Head Tracking Under Varying Illumination: An Approach Based on Registration of Texture-Mapped 3D Models. IEEE Trans. on PAMI 22(4) (2000) 322–336
  [13] Intel Corporation: IPL and OpenCV: Intel Open Source Computer Vision Library.

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Ginés García Mateos
    1. Dept. de Informática y Sistemas, Universidad de Murcia, Espinardo, Murcia, Spain
