Abstract
This paper develops and evaluates two webcam-based gaze-controlled interfaces for users with severe speech and motor impairment (SSMI). We configured two webcam-based gaze trackers using open-source software (Python and JavaScript) and developed a cursor-control algorithm on top of each gaze tracker. We designed a quiz application to evaluate the webcam-based gaze trackers for both users with SSMI and their able-bodied counterparts, and we also collected data using a commercial infrared-based eye gaze tracker. We noted that both users with SSMI and able-bodied users could operate the webcam-based gaze-controlled interface. For users with SSMI, however, speed of interaction was significantly faster with the low-cost infrared-based commercial gaze tracker. Results from this study can inform both the development and the selection of low-cost eye gaze trackers for users with SSMI. To our knowledge, this is the first study to evaluate webcam-based gaze trackers in a gaze-controlled interface for users with a range of abilities.
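The paper does not reproduce the cursor-control algorithm in the abstract, but two building blocks are common to gaze-controlled cursors of this kind: smoothing of the noisy per-frame gaze estimate and dwell-time selection (triggering a click when the cursor rests on a target). The sketch below is illustrative only; the class name, parameter values, and logic are assumptions for exposition, not the authors' implementation.

```python
import math


class GazeCursor:
    """Illustrative cursor control: exponential smoothing of noisy gaze
    estimates plus dwell-time selection. All parameters are hypothetical."""

    def __init__(self, alpha=0.3, dwell_frames=15, dwell_radius=40.0):
        self.alpha = alpha                # smoothing factor in (0, 1]
        self.dwell_frames = dwell_frames  # frames the gaze must stay put
        self.dwell_radius = dwell_radius  # allowed jitter, in pixels
        self.pos = None                   # smoothed cursor position
        self.anchor = None                # centre of the current dwell
        self.count = 0                    # consecutive frames inside radius

    def update(self, gaze_xy):
        """Feed one raw (x, y) gaze estimate; return (cursor_xy, selected)."""
        if self.pos is None:
            self.pos = gaze_xy
        else:
            # Exponential moving average damps frame-to-frame jitter.
            self.pos = (
                self.alpha * gaze_xy[0] + (1 - self.alpha) * self.pos[0],
                self.alpha * gaze_xy[1] + (1 - self.alpha) * self.pos[1],
            )
        if self.anchor is None or math.dist(self.pos, self.anchor) > self.dwell_radius:
            self.anchor = self.pos        # gaze moved away: restart the dwell
            self.count = 0
        self.count += 1
        selected = self.count >= self.dwell_frames
        if selected:
            self.count = 0                # fire once, then re-arm
        return self.pos, selected
```

In a real pipeline the raw `gaze_xy` would come from the webcam tracker each frame; feeding a steady fixation for `dwell_frames` consecutive frames yields one selection event, which a quiz application like the one described could map to answering a question.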
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Agarwal, A. et al. (2019). Comparing Two Webcam-Based Eye Gaze Trackers for Users with Severe Speech and Motor Impairment. In: Chakrabarti, A. (eds) Research into Design for a Connected World. Smart Innovation, Systems and Technologies, vol 135. Springer, Singapore. https://doi.org/10.1007/978-981-13-5977-4_54
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-5976-7
Online ISBN: 978-981-13-5977-4