
An Enhanced Eye-Tracking Approach Using Pipeline Computation



Low-cost, convenient tracking and authentication is an important problem today, as an alternative to approaches that require expensive dedicated hardware and a complex calibration process. This paper concentrates on two parts: (i) eye tracking and (ii) parallel execution. The first part proceeds in three steps. First, the human face is extracted from a video sequence, the sequence is divided into frames, and eye regions are detected in the image frames. Second, the iris region and the pupil (central point) are identified; pupil localization takes energy intensity and edge strength into account together, and the iris and eye corners serve as tracking points. The well-known sinusoidal head model represents the 3-D head shape, and an adaptive probability density function (APDF) is proposed to estimate head pose during facial feature extraction. Third, iris (pupil) tracking is completed by integrating the eye vector with the head-movement information obtained from the APDF. The second part focuses on reducing execution time (ET) by applying a pipeline architecture (PA). The entire process is divided into three phases: Phase I tracks the eyes and pupils and prepares an eye vector; Phase II completes the calibration process; Phase III performs matching and feature extraction. Because the three phases run in parallel, the PA improves performance in terms of ET. Experiments on the CK+ image sets show that the PA reduces ET by two-thirds (66.68%), and the average tracking accuracy (91% for the left eye pupil (LEP) and 93.6% for the right eye pupil (REP)) is robust.
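The three-phase pipeline described in the abstract can be sketched as a chain of worker threads connected by queues, so that Phase I can process frame k+1 while Phases II and III handle earlier frames. This is a minimal sketch of the pipeline structure only; the three phase functions below are hypothetical stand-ins, not the paper's actual tracking, calibration, and matching routines.

```python
import threading
import queue

def run_pipeline(frames, stages):
    """Run `stages` (a list of functions) as a linear pipeline.
    Each stage runs in its own thread, so stage i can work on the
    next frame while stage i+1 processes the previous one."""
    SENTINEL = object()  # marks end of the frame stream
    queues = [queue.Queue() for _ in range(len(stages) + 1)]

    def worker(fn, q_in, q_out):
        while True:
            item = q_in.get()
            if item is SENTINEL:
                q_out.put(SENTINEL)  # propagate shutdown downstream
                break
            q_out.put(fn(item))

    threads = [
        threading.Thread(target=worker, args=(fn, queues[i], queues[i + 1]))
        for i, fn in enumerate(stages)
    ]
    for t in threads:
        t.start()

    for f in frames:          # feed frames into the first stage
        queues[0].put(f)
    queues[0].put(SENTINEL)

    results = []              # drain the last stage's output queue
    while True:
        item = queues[-1].get()
        if item is SENTINEL:
            break
        results.append(item)
    for t in threads:
        t.join()
    return results

# Hypothetical stand-ins for the paper's three phases.
def phase1_track(frame):      # Phase I: eye/pupil tracking -> eye vector
    return ('eye_vector', frame)

def phase2_calibrate(vec):    # Phase II: calibration
    return ('calibrated',) + vec

def phase3_match(vec):        # Phase III: matching and feature extraction
    return ('matched',) + vec

out = run_pipeline(range(5), [phase1_track, phase2_calibrate, phase3_match])
```

With three stages of roughly equal cost running concurrently, the steady-state throughput approaches three frames per stage-time instead of one, which is consistent with the roughly two-thirds ET reduction the paper reports for its pipeline architecture.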





Author information

Correspondence to Mohammad Alamgir Hossain.


Cite this article

Hossain, M.A., Assiri, B. An Enhanced Eye-Tracking Approach Using Pipeline Computation. Arab J Sci Eng (2020). https://doi.org/10.1007/s13369-019-04322-7



Keywords

  • Tracking
  • Iris
  • Pipeline
  • HCI
  • APDF