BCI and Eye Gaze: Collaboration at the Interface

  • Leo Galway
  • Chris Brennan
  • Paul McCullagh
  • Gaye Lightbody
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9183)


Due to a range of constraints, brain-computer interface (BCI) technology has seen limited success outside of laboratory conditions. To address the limitations that have prevented widespread deployment, an existing modular architecture has been adapted to support hybrid collaboration between commercially available BCI and eye-tracking technologies. However, combining multiple input modalities with different temporal properties presents a challenge for data fusion and collaboration at the user interface. The use of cost-effective, readily available equipment will further promote the hybrid BCI as a viable alternative interface for human-computer interaction. In this paper, we focus on navigation through a virtual smart home and control of devices within its rooms, with navigation controlled by multimodal interaction. As such, it promises a better information transfer rate than BCI alone. Consequently, an extended architecture for a personalised hybrid BCI system has been proposed.
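The fusion challenge described above can be illustrated with a minimal sketch. The following is a hypothetical example, not taken from the paper: two asynchronous event streams (eye-gaze fixations and SSVEP detections) are aligned by timestamp, and a command is issued only when both modalities select the same target within a fusion window, compensating for their different latencies. All names and the window value are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical sketch of temporal fusion for a hybrid BCI + eye-gaze interface.
# A command is confirmed only when both modalities agree on a target
# within FUSION_WINDOW seconds of each other.

@dataclass
class Detection:
    timestamp: float  # seconds since session start
    target: str       # e.g. a room or device identifier

FUSION_WINDOW = 1.5  # seconds; illustrative value, not from the paper

def fuse(gaze_events, bci_events, window=FUSION_WINDOW):
    """Return (timestamp, target) commands confirmed by both modalities."""
    commands = []
    for g in gaze_events:
        for b in bci_events:
            if g.target == b.target and abs(g.timestamp - b.timestamp) <= window:
                # Command fires once the slower modality has confirmed.
                commands.append((max(g.timestamp, b.timestamp), g.target))
                break
    return commands

gaze = [Detection(1.0, "kitchen"), Detection(4.0, "tv")]
bci = [Detection(2.0, "kitchen"), Detection(7.0, "tv")]
print(fuse(gaze, bci))  # [(2.0, 'kitchen')]
```

Here the "tv" selections fall outside the window and are rejected, while the "kitchen" selections agree in both target and time; a real system would additionally weight each modality's classification confidence.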


Keywords: Hybrid brain-computer interface · Eye tracking · Domotic control modalities



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Leo Galway (1)
  • Chris Brennan (1)
  • Paul McCullagh (1)
  • Gaye Lightbody (1)

  1. Computer Science Research Institute, University of Ulster, Coleraine, UK
