
Multimedia Tools and Applications, Volume 78, Issue 6, pp 7155–7179

Driver’s eye-based gaze tracking system by one-point calibration

  • Hyo Sik Yoon
  • Hyung Gil Hong
  • Dong Eun Lee
  • Kang Ryoung ParkEmail author

Abstract

When initial driver-specific calibration is not performed, the accuracy of driver gaze detection in previous studies is affected by drivers' varying sitting positions and heights. Using dual cameras allows driver calibration to be omitted, but this increases processing time and system complexity. In addition, previous studies have not addressed the problem of the corneal specular reflection (SR) disappearing from the eye image when the driver turns his/her head severely. To address these issues, we propose a gaze tracking method based on the driver's one-point calibration that uses both the corneal SR and the medial canthus (MC), based on a maximum entropy criterion. Experiments on data collected from 26 subjects in a vehicle (wearing nothing, glasses, sunglasses, or a hat, or adopting various hand poses) showed that the accuracy of the proposed method is higher than that of other gaze tracking methods. We also demonstrated the effectiveness of our method in a real driving environment.
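
The "maximum entropy criterion" mentioned above refers to entropy-based threshold selection over the image histogram (in the style of Kapur's method), commonly used to segment dark regions such as the pupil in near-infrared eye images. The following is a minimal Python sketch of that criterion, assuming an 8-bit grayscale input; it illustrates the general technique only and is not the authors' exact implementation.

import numpy as np

def max_entropy_threshold(gray):
    """Select the threshold maximizing the summed entropies of the two classes.

    Kapur-style maximum-entropy criterion (illustrative sketch only).
    """
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist.astype(np.float64) / max(hist.sum(), 1)

    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p1, p2 = p[: t + 1], p[t + 1 :]
        w1, w2 = p1.sum(), p2.sum()
        if w1 <= 0.0 or w2 <= 0.0:
            continue  # One class is empty; skip this candidate threshold.
        q1 = p1[p1 > 0] / w1  # Normalized below-threshold distribution.
        q2 = p2[p2 > 0] / w2  # Normalized above-threshold distribution.
        h = -(q1 * np.log(q1)).sum() - (q2 * np.log(q2)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Hypothetical usage: in an NIR eye image the pupil is darker than the
# surrounding iris and sclera, so pixels below the selected threshold
# form a candidate pupil mask.
# mask = eye_image < max_entropy_threshold(eye_image)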

Keywords

Driver’s gaze detection · NIR camera and NIR illuminator · Pupil, corneal specular reflection and medial canthus · Initial calibration of driver · Maximum entropy criterion

Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2018R1D1A1B07041921), by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2017R1D1A1B03028417), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Ministry of Science and ICT) (NRF-2017R1C1B5074062).


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Hyo Sik Yoon (1)
  • Hyung Gil Hong (1)
  • Dong Eun Lee (1)
  • Kang Ryoung Park (1) (Email author)

  1. Division of Electronics and Electrical Engineering, Dongguk University, Seoul, South Korea
