On Gaze Estimation Using Integral Projection of Eye Images

  • Lan-Rong Dung
  • Yu-Cheng Lee
  • Yin-Yi Wu
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 850)


This paper presents a gaze estimation algorithm based on the integral projection of eye images; its advantages are low additional hardware requirements and low computational cost. The algorithm needs only a webcam under natural light and captures eye images in a non-intrusive way. Before integral projection, a binarization step removes image information unrelated to gaze position. Projection is performed on the binary eye images with a projection-adjustment method that prevents eye tilt from introducing projection error, and an accurate integral range of the eye ROI images is defined to achieve robust gaze estimation. The projection profile is analyzed with skewness to describe its variation across gaze positions. In the skewness calculation, the pixel coordinates of the eye ROI images are normalized so that the change in ROI size caused by the head moving back and forth does not affect the result. In the horizontal direction, the mean error angle of the algorithm is 2.29°, the maximum error angle is 4.8°, and the defined resolution is 7.5. Because the algorithm is less accurate in the vertical direction, it can only estimate the gaze direction there, not a precise angle. The computational cost is low: the average execution time per frame is only 0.01652 s, 24% of that of the competing method.
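The pipeline sketched in the abstract (binarize the eye ROI, integrally project it, then measure the skewness of the projection profile over normalized coordinates) can be illustrated as follows. This is a minimal sketch under stated assumptions: the function name `gaze_skewness` and the simple mean-based threshold are placeholders (the paper's actual thresholding and projection-adjustment steps are not reproduced here), and only the horizontal direction is shown.

```python
import numpy as np

def gaze_skewness(eye_roi, threshold=None):
    """Binarize a grayscale eye ROI, project it onto the horizontal
    axis, and return the skewness of the projection profile.
    Hypothetical helper illustrating the idea, not the paper's code."""
    # Binarize: dark (pupil/iris) pixels become 1, background 0.
    # A mean-based threshold stands in for a method such as Otsu's.
    if threshold is None:
        threshold = eye_roi.mean()
    binary = (eye_roi < threshold).astype(np.float64)

    # Horizontal integral projection: count dark pixels in each column.
    projection = binary.sum(axis=0)

    # Normalize pixel coordinates to [0, 1] so the measure is invariant
    # to ROI-size changes when the head moves back and forth.
    x = np.linspace(0.0, 1.0, num=projection.size)

    # Treat the projection as a weight distribution over the normalized
    # coordinates and compute its skewness (third standardized moment).
    total = projection.sum()
    if total == 0:
        return 0.0
    mean = (x * projection).sum() / total
    var = ((x - mean) ** 2 * projection).sum() / total
    if var == 0:
        return 0.0
    return ((x - mean) ** 3 * projection).sum() / total / var ** 1.5
```

With this convention, a dark mass concentrated toward the left of the ROI with a tail to the right yields a positive skewness and a mirrored mass yields a negative one, so the sign and magnitude of the skewness can be mapped to horizontal gaze position.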


Keywords: Gaze estimation · Integral projection · Skewness



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. National Chiao Tung University, Hsinchu, Taiwan
  2. National Chung-Shan Institute of Science and Technology, Taoyuan, Taiwan
