Block-Wise Gaze Estimation Based on Binocular Images

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10749)

Abstract

Appearance-based gaze estimation methods have proven highly effective. Unlike previous methods that estimate gaze direction from the left or the right eye image separately, we propose a gaze estimation method based on binocular images. Because estimating precise gaze points via regression models is challenging, the proposed method instead estimates the block-wise gaze position by classifying binocular images with a convolutional neural network (CNN). We divide the screen of a desktop computer into 2 × 3 and 6 × 9 blocks respectively, label the binocular images with their corresponding gazed block positions, train a CNN to classify the eye images according to these labels, and estimate the gazed block through CNN-based classification. The experimental results demonstrate that the proposed binocular-image-based gaze estimation method achieves higher accuracy than monocular-image-based methods, and it shows great potential for practical touch-screen-based applications.
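To make the block-wise formulation concrete, the sketch below shows how a screen coordinate could be mapped to a block label and how a classifier over stacked left/right eye crops might be set up. This is a minimal sketch assuming PyTorch; the abstract does not specify the framework or network configuration, so the layer layout, the 36 × 60 input size, and the names `point_to_block` and `BinocularGazeNet` are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch (assumed PyTorch) of block-wise gaze classification from
# binocular eye images. Architecture details are illustrative assumptions,
# not the authors' exact configuration.
import torch
import torch.nn as nn

def point_to_block(x, y, screen_w, screen_h, rows=6, cols=9):
    """Map a gaze point (in pixels) to a block index on a rows x cols grid."""
    r = min(int(y / screen_h * rows), rows - 1)
    c = min(int(x / screen_w * cols), cols - 1)
    return r * cols + c  # class label in [0, rows * cols)

class BinocularGazeNet(nn.Module):
    """CNN that classifies a stacked left+right eye image into gaze blocks."""
    def __init__(self, num_blocks=54):  # 54 = 6 x 9 grid; use 6 for 2 x 3
        super().__init__()
        self.features = nn.Sequential(
            # 2 input channels: left and right eye crops, assumed grayscale
            nn.Conv2d(2, 32, kernel_size=5, padding=2),
            nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(128 * 4 * 4, 256),
            nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_blocks),
        )

    def forward(self, eyes):  # eyes: (batch, 2, H, W)
        return self.classifier(self.features(eyes))

# Usage: predict the gazed block for a batch of binocular eye crops.
model = BinocularGazeNet(num_blocks=54)
batch = torch.randn(8, 2, 36, 60)  # assumed 36 x 60 per-eye crops
pred_block = model(batch).argmax(dim=1)
```

Training would proceed as standard multi-class classification (e.g., cross-entropy over the block labels produced by `point_to_block`); the grid resolution trades localization precision against classification difficulty, which is why both a coarse 2 × 3 and a finer 6 × 9 grid are evaluated.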

Keywords

Gaze estimation · Gaze block · Appearance-based · Eye image · Convolutional neural network (CNN)

Acknowledgements

This work is supported by Key Research and Development Foundation of Shandong Province (2016GGX101009), Natural Science Foundation of Shandong Province (ZR2014FM012), and Scientific Research and Development Foundation of Shandong Provincial Education Department (J15LN60). We acknowledge the support of NVIDIA Corporation with the donation of the TITAN X GPU used for this research.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Xuemei Wu (1)
  • Jing Li (2)
  • Qiang Wu (1)
  • Jiande Sun (3)
  • Hua Yan (4)

  1. School of Information Science and Engineering, Shandong University, Jinan, China
  2. School of Mechanical and Electrical Engineering, Shandong Management University, Jinan, China
  3. School of Information Science and Engineering, Shandong Normal University, Jinan, China
  4. School of Computer Science and Technology, Shandong University of Finance and Economics, Jinan, China
