
Improvement of Gaze Estimation Robustness Using Pupil Knowledge

  • Kohei Arai
  • Ronny Mardiyanto
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6017)

Abstract

This paper presents an eye gaze estimation system that is robust against variations among users. Our method uses an IR camera mounted on eyeglasses so that the user is free to move. Pupil knowledge such as shape, size, location, and motion is exploited in order of priority. Pupil appearance (size, color, and shape) is used as the first priority. When this step fails, the pupil is estimated from its location as the second priority. When both steps fail, the pupil is estimated from its motion as the last priority. The aim of the proposed method is to make the system work for a wide range of users and to overcome problems caused by illumination changes and user movement. The proposed system is tested with users of various races and nationalities, and the experimental results are compared with the well-known adaptive threshold and template matching methods. The proposed method shows good performance, robustness, accuracy, and stability against illumination changes without any prior calibration.
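The abstract describes a fallback cascade: appearance first, then last known location, then frame-to-frame motion. The sketch below illustrates that priority ordering only; the function names, thresholds, and size/circularity ranges are illustrative assumptions and not the authors' actual implementation.

```python
import cv2
import numpy as np

def detect_by_appearance(gray):
    """First priority: find a dark, roughly circular blob (size/shape/colour)."""
    _, binary = cv2.threshold(gray, 50, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        area = cv2.contourArea(c)
        if area < 100 or area > 5000:          # plausible pupil size range (assumed)
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-6)
        if circularity > 0.7:                  # roughly circular shape (assumed)
            (x, y), _ = cv2.minEnclosingCircle(c)
            return int(x), int(y)
    return None

def detect_by_location(gray, last_pos, window=40):
    """Second priority: look for the darkest point near the last known pupil position."""
    if last_pos is None:
        return None
    x, y = last_pos
    roi = gray[max(0, y - window):y + window, max(0, x - window):x + window]
    if roi.size == 0:
        return None
    _, _, min_loc, _ = cv2.minMaxLoc(cv2.GaussianBlur(roi, (9, 9), 0))
    return max(0, x - window) + min_loc[0], max(0, y - window) + min_loc[1]

def detect_by_motion(prev_gray, gray):
    """Last priority: locate the strongest change between consecutive frames."""
    if prev_gray is None:
        return None
    diff = cv2.absdiff(gray, prev_gray)
    _, max_val, _, max_loc = cv2.minMaxLoc(cv2.GaussianBlur(diff, (9, 9), 0))
    return max_loc if max_val > 25 else None   # change threshold (assumed)

def locate_pupil(gray, prev_gray, last_pos):
    """Apply the three knowledge sources in the stated priority order."""
    return (detect_by_appearance(gray)
            or detect_by_location(gray, last_pos)
            or detect_by_motion(prev_gray, gray))
```

In this reading, each stage only runs when the higher-priority stage fails, which is what lets the system keep tracking when appearance cues are lost (e.g., under illumination changes) without any per-user calibration.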

Keywords

Gaze, eye detection, pupil, pupil knowledge



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Kohei Arai (1)
  • Ronny Mardiyanto (1, 2)
  1. Department of Information Science, Saga University, Japan
  2. Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia
