Measuring Focused Attention Using Fixation Inner-Density

  • Wen Liu
  • Soussan Djamasbi (email author)
  • Andrew C. Trapp
  • Mina Shojaeizadeh
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10916)


Abstract

Examining user reactions via the unobtrusive method of eye tracking is becoming increasingly popular in user experience studies. A major focus of this type of research is accurately capturing user attention to stimuli, which is typically established by translating raw eye movement signals into fixations, that is, ocular events characterized by relatively stable gaze over a specific stimulus. Grounded in the argument that the inner-density of gaze points within a fixation represents focused attention, a recent study developed the fixation-inner-density (FID) methodology, which identifies fixations based on the compactness of individual gaze points. In this study, we compare the FID filter with a widely used method of fixation identification, namely the I-VT filter. To do so, we use a set of measures that investigate the distribution of gaze points at a micro-level, that is, the patterns of individual gaze points within each fixation. Our results show that, in general, fixations identified by the FID filter are significantly denser and more compact around their fixation center. They are also more likely to have randomly distributed gaze points within the square box that spatially bounds a fixation. Our results also show that fixation duration differs significantly between the two methods. Because the fixation is a major unit of analysis in behavioral studies, and fixation duration is a major indicator of the intensity of attention, awareness, and effort, our results suggest that the FID filter is likely to increase the sensitivity of eye-tracking investigations of behavior.
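The pipeline the abstract describes, classifying raw gaze samples into fixations and then scoring how compactly each fixation's gaze points cluster, can be sketched as follows. This is a minimal illustrative sketch: the velocity threshold, minimum duration, and the centroid-distance compactness measure are assumptions chosen for illustration, not the FID formulation from the paper or the exact parameters of Tobii's I-VT filter.

```python
import numpy as np

def ivt_fixations(x, y, t, velocity_threshold=50.0, min_duration=0.06):
    """Group gaze samples into fixations with a simple I-VT (velocity
    threshold) scheme: consecutive samples whose point-to-point velocity
    stays below `velocity_threshold` (units/sec; illustrative, real I-VT
    filters typically threshold in degrees of visual angle per second)
    form one fixation. Returns (start_idx, end_idx) pairs, inclusive."""
    x, y, t = map(np.asarray, (x, y, t))
    # Point-to-point velocity between successive samples.
    vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    slow = vel < velocity_threshold  # True where movement is fixation-like
    fixations, start = [], None
    for i, s in enumerate(slow):
        if s and start is None:
            start = i                       # fixation begins at sample i
        elif not s and start is not None:
            if t[i] - t[start] >= min_duration:
                fixations.append((start, i))  # samples start..i are stable
            start = None
    # Close a fixation that runs to the end of the recording.
    if start is not None and t[len(slow)] - t[start] >= min_duration:
        fixations.append((start, len(slow)))
    return fixations

def inner_density(x, y, start, end):
    """One plausible compactness measure (an assumption, not the paper's
    FID definition): mean Euclidean distance of a fixation's gaze points
    to their centroid. Smaller values mean a denser fixation."""
    px, py = np.asarray(x[start:end + 1]), np.asarray(y[start:end + 1])
    cx, cy = px.mean(), py.mean()
    return float(np.hypot(px - cx, py - cy).mean())
```

For example, twenty samples forming two tight clusters separated by one large jump yield two fixations, and each fixation's inner-density score reflects the spread of its points around their centroid.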


Keywords: Eye tracking · Fixation identification · Fixation-inner-density · Fixation micro-patterns



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Wen Liu¹
  • Soussan Djamasbi¹ (email author)
  • Andrew C. Trapp¹
  • Mina Shojaeizadeh¹

  1. User Experience and Decision Making Research Laboratory, Worcester Polytechnic Institute, Worcester, USA
