Novice and Expert Information Behavior: An Eye Tracking Study from Qatar

  • A. M. Salaz
  • Teresa MacGregor
  • Priya Thomas
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 676)


This paper presents findings from an exploratory pilot study of the information evaluation behavior of 30 researchers, including both novices and experts. Specifically, the participants’ approaches to evaluating the quality, credibility, and accuracy of scholarly materials were observed using Tobii eye tracking hardware and triangulated with the participants’ qualitative written descriptions of how they evaluated the material. The initial findings include hypotheses about differences between novices and experts, and about the utility of different gaze measurements for assessing information evaluation processes.
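The gaze measurements mentioned above can be illustrated with a small sketch. Commonly reported fixation-based measures include fixation count, total fixation duration, and time to first fixation per area of interest (AOI). The AOI names, data format, and numbers below are invented for illustration and are not from the study:

```python
# Hypothetical sketch of fixation-based gaze measures per area of interest (AOI).
# Each fixation is a tuple: (aoi_name, start_time_ms, duration_ms).

def gaze_metrics(fixations):
    """Summarize fixations per AOI: fixation count, total fixation
    duration, and time to first fixation (relative to the earliest
    fixation in the recording)."""
    t0 = min(start for _, start, _ in fixations)
    metrics = {}
    for aoi, start, dur in fixations:
        m = metrics.setdefault(aoi, {"count": 0, "total_ms": 0, "ttff_ms": None})
        m["count"] += 1           # fixation count
        m["total_ms"] += dur      # total fixation duration
        if m["ttff_ms"] is None or start - t0 < m["ttff_ms"]:
            m["ttff_ms"] = start - t0  # time to first fixation
    return metrics

# Invented sample data: four fixations over three AOIs on a document page.
sample = [
    ("abstract", 0, 220),
    ("author_line", 250, 180),
    ("abstract", 460, 300),
    ("references", 800, 150),
]
print(gaze_metrics(sample)["abstract"])
```

Measures like these are what allow novice and expert viewing patterns to be compared quantitatively, e.g., whether experts fixate sooner or longer on credibility cues such as the author line or reference list.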


Keywords: Information behavior · Eye tracking · Expert · Novice · Evaluation



The authors acknowledge the support of funding from Carnegie Mellon University Qatar to carry out this research.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Carnegie Mellon University, Doha, Qatar
