
Brain-Computer Interfaces by Electrical Cortex Activity: Challenges in Creating a Cognitive System for Mobile Devices Using Steady-State Visually Evoked Potentials

  • Pedro Morais
  • Carla Quintão
  • Pedro Vieira
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 470)

Abstract

The research field of Brain-Computer Interfaces (BCI) emerged in an attempt to enable communication between paralyzed patients and technology. By identifying an individual’s mental state through the brain’s electrical activity, a typical BCI system assigns to it a particular action on the computer. It is known that when the visual cortex is stimulated at a certain frequency, it shows activity at the same frequency. This Steady-State Visually Evoked Potential (SSVEP) activity can be used to achieve the aforementioned communication goal. In this work, we first analyze the spontaneous electrical activity of the brain in order to distinguish two mental states (concentration/meditation). Then, following an SSVEP approach, we divide the stimulating screen into four areas, each flickering at a distinct frequency. By observing the response frequency in the subject’s occipital lobe, we can estimate the 2-bit decision he or she made. We observe that such a setup is efficient for real-time BCI and can be easily integrated into mobile devices. Moreover, the user is able to voluntarily change his or her decisions, interacting with the system in a natural manner.

Keywords

BCI · EEG · SSVEP · Mobile device · Tablet · Smartphone

1 Introduction

A biomedical cognitive control system must be able to interpret the electrical signals produced in the brain and distinguish different levels of activity. Various approaches have been taken in this area, such as event-related desynchronization and synchronization (ERD/ERS) [1], evoked potentials with a latency of 300 ms (P300) [2, 3] and steady-state visual evoked potentials (SSVEP) [4, 5]. ERD/ERS work studies alpha and beta waves, characterized by frequencies of 8–12 Hz and 12–30 Hz, respectively, which can be observed, for example, during a motor imagery task. The P300 is an evoked potential with a latency of about 300 ms, which appears after a visual or auditory stimulus that requires attention and causes some surprise. SSVEPs are elicited by retinal stimulation with a signal whose frequency can vary between 3.5 Hz and 75 Hz, and consist of a continuous and periodic signal detected in the visual cortex at the same frequencies [6]. The publications in this area are many and varied, ranging from tools to control different devices to new technological solutions. Among the applications that resulted from the techniques described above, we highlight the following technological advances:

1. Activation of a mobile robot using a BCI [7], wherein a robot is controlled through four imagined movements (foot, tongue, left arm and right arm) using ERD/ERS;
2. Application of dry EEG sensors to mobile BCIs [8], which are placed on the hair and exhibit results very similar to traditional sensors that use saline solution or conductive gel for electrical contact;
3. Construction of a simple SSVEP-based BCI communication system [9], in which a user gives answers such as “yes”, “no”, “good” and “bad”;
4. Creation of an online BCI using non-flashing visual evoked potentials [10], in which, through one EEG channel, the user can write a word and perform a search on Google;
5. Characterization of stimuli based on P300 amplitude [11], a study showing that several factors influence the analyzed potential, for example the effect of motivation as a possible physiological influence on the P300 amplitude;
6. Development of a mobile-phone-based BCI for day-to-day communication [12] through SSVEP stimulation.

In this context, the importance of cross-disciplinary knowledge spanning neuroscience and computer science is evident for reaching new insights in the BCI area: solving problems related to the acquisition, storage and retrieval of brain information, as well as creating new approaches to identify different actions on the same interface.

This work aims to answer the question: “How can we develop a solution in which a user interacts with mobile devices naturally, by voluntarily changing their mental task?”. We propose a system consisting of two specific phases: first, the detection of an individual’s state of concentration and, second, the selection of the action to be taken through SSVEP after confirming the previous condition. To fulfill the first phase, the spontaneous electroencephalographic (EEG) signals are classified based on traditional frequency-band analysis (alpha band, between 8 and 13 Hz, and beta band, between 13 and 30 Hz) [13, 14]. In the second phase, SSVEP makes it possible to distinguish among at least four different actions that the subject would like to perform.

2 Relationship to Cyber-Physical Systems

A cyber-physical system (CPS) combines computing elements with natural physical processes. This approach enables the development of more specific applications such as process control, instrumentation, medical devices and smart structures. In the coming age of the Internet of Everything, the development of this type of system contributes to a new era of products in which everyone will be connected everywhere. Reaching such a solution requires a flexible architecture with new interfaces. Research into user-friendly interfaces is increasing, and one of the aims is to replace the computer keyboard and mouse with more effective means of communication. The use of touch screens, commonly available in tablets and smartphones, is a clear example of this demand. The project described in this paper presents a system that establishes the connection between machines and humans using brain-computer interfaces (BCI) and electroencephalography (EEG). EEG is used as a non-invasive electrophysiological monitoring approach to identify patterns of brain electrical activity, while BCIs provide a direct communication channel between the brain and an electronic device. Using brain electrical activity recorded at the scalp enables intelligent monitoring systems in real time. Sensory channels become a new form of input, in addition to providing information about the user’s status and intent. Using this type of information, systems can adapt dynamically to the task the user wants to run. To develop new solutions in this field, it is essential to understand how the brain works and manages information. EEG, together with other biological signals such as electrocardiography (ECG) and electromyography (EMG), opens up a new form of communication between humans and electronic devices.
This approach presents several challenges, such as the efficiency of embedded systems, the implementation of algorithms using brain electrical impulses, and distributed architectures that add autonomy to the devices and increase the efficiency of the communication mechanisms. In the end, the solution must have a high degree of robustness and should enable connection to the cloud, not being confined to local control devices.

3 Materials and Methods

To address the challenges of creating this BCI solution, the EPOC equipment [15] is used to record the EEG signals. The system comprises 14 channels (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4) and two reference electrodes located at the P3 and P4 positions (Common Mode Sense active electrode and Driven Right Leg passive electrode - CMS/DRL). The device records the signals sequentially at a rate of 128 Hz, with a resolution of 14 bits per channel, and has a frequency response between 0.16 and 43 Hz. In addition, it provides wireless data transmission at 2.4 GHz, with a battery allowing for 8 h of continuous work. One of the greatest conveniences of this device is the use of saline-solution electrodes instead of the usual conductive gel to establish contact between the electrodes and the scalp. The acquisition control and the complete signal classification process were performed with two software platforms: OpenViBE and EEGLAB. OpenViBE is an open-source, multi-platform application containing various pre-programmed modules for signal processing. In parallel, EEGLAB, a Matlab toolbox for processing EEG and event-related potentials, is also used; it requires installing the BioSig data-acquisition plug-in to read data in the General Data Format (GDF) [16].

3.1 Acquisition Protocol for Classification of the Mental State

The signals from the 14 channels are recorded while the subject is seated with a straight posture and the palms on the knees, avoiding any muscular movement. A complete test lasts a total of five minutes. The subject is asked to switch from one mental state (meditation) to the other (concentration) every 30 s, being notified by a beep. In meditation, the subject should let go of any thought, focusing attention on the breath, which should be long and deep. For the concentration state, the subject should count down from 100 in steps of 3, while visualizing the results.
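As a minimal illustration, the alternating protocol above can be encoded as a per-second label sequence; this is a sketch under our own assumptions (the function name and the label representation are ours, not part of the acquisition software):

```python
# Sketch: per-second ground-truth labels for the 5 min protocol described
# above: 30 s blocks alternating between two states, starting with meditation.
def protocol_labels(total_s=300, block_s=30):
    states = ("meditation", "concentration")
    return [states[(t // block_s) % 2] for t in range(total_s)]

labels = protocol_labels()
print(labels[0], labels[30], labels[299])  # meditation concentration concentration
```

Such a label track can later be aligned with the recorded epochs to score the classifier against the protocol's ground truth.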

3.2 Signal Processing and Classification

A band-pass analog filter from 0.16 Hz to 85 Hz, together with a 50 Hz notch filter, is applied to the signals. Since detecting the meditation/concentration states is the main purpose of this phase of the project, we focused our analysis on the prefrontal cortex electrodes (AF3, AF4), which relate to attention activities [17], and on the occipital electrodes (O1, O2), where the alpha rhythm, which reflects the subject’s level of arousal, is particularly intense.

We used the OpenViBE application to implement the following steps: 1. The electrodes are selected; 2. A 4th-order Butterworth digital band-pass filter over the alpha band (8–12 Hz) is applied to the signals; 3. The signal is subdivided into successive 5 s time windows with a 4.9 s overlap; 4. The power spectral density is computed; 5. A moving average is applied.
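Steps 2–5 above can be sketched in Python with SciPy. This is a hedged illustration, not the authors' OpenViBE pipeline: the synthetic input signal, the function names and the 5-sample moving-average length are our assumptions; only the sampling rate, filter order, alpha-band limits, window length and overlap come from the text.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch

FS = 128                 # EPOC sampling rate (Hz), from the text
WIN = 5 * FS             # 5 s analysis window
STEP = int(0.1 * FS)     # 4.9 s overlap -> ~0.1 s hop between windows

def alpha_band_power(eeg, fs=FS):
    """Band-pass to 8-12 Hz, then return smoothed alpha power per window."""
    # Step 2: 4th-order Butterworth digital band-pass over the alpha band
    sos = butter(4, [8, 12], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, eeg)
    # Steps 3-4: sliding 5 s windows, power spectral density in each
    powers = []
    for start in range(0, len(filtered) - WIN + 1, STEP):
        f, pxx = welch(filtered[start:start + WIN], fs=fs, nperseg=WIN)
        band = (f >= 8) & (f <= 12)
        powers.append(pxx[band].sum())
    # Step 5: moving average (5 samples, our choice) to smooth the estimate
    return np.convolve(powers, np.ones(5) / 5, mode="valid")

# Synthetic 30 s recording: a 10 Hz (alpha) oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(30 * FS) / FS
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
smoothed = alpha_band_power(signal)
print(smoothed.shape)
```

The resulting per-window power values are the quantity compared against the thresholds in the classification step that follows.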

The identification of the meditation and concentration states was performed during real-time signal acquisition using the following criteria: if the alpha-band power at the AF3 electrode is below a certain threshold (20 in our arbitrary units), the state is classified as “concentration”; otherwise, it is classified as “meditation”. If the alpha-band power exceeds 80, the corresponding EEG epoch is considered an artifact.
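The decision rule above reduces to a simple threshold function. The sketch below uses the paper's arbitrary-unit thresholds (20 and 80); the function and argument names are ours:

```python
# Literal reading of the paper's decision rule for one EEG epoch.
def classify_epoch(af3_alpha_power, low=20.0, high=80.0):
    """Map the AF3 alpha-band power of one epoch to a mental state."""
    if af3_alpha_power > high:
        return "artifact"       # implausibly strong alpha -> reject epoch
    if af3_alpha_power < low:
        return "concentration"  # suppressed alpha rhythm on AF3
    return "meditation"         # prominent alpha rhythm

print(classify_epoch(12.5))  # concentration
print(classify_epoch(45.0))  # meditation
print(classify_epoch(95.0))  # artifact
```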

3.3 Choosing Actions Using SSVEP

In a second stage, after confirming the “concentration” state, it is necessary to identify an action by means of SSVEP. To fulfill this objective, the subject is presented with a screen divided into four areas, each containing an image formed by a black/white checkerboard that flickers at a predefined frequency. The subject’s option is identified according to the area/image on which the subject focuses attention.

The stimulation is produced using the Psychophysics Toolbox version 3, a Matlab toolbox, which allows imposing flicker frequencies of 8.6 Hz, 10 Hz, 12 Hz and 15 Hz on the screen. The frequency emitted by each screen area is detected through analysis of the power spectrum obtained from electrodes O1 and O2, located over the primary visual areas in the occipital lobe. This process allows the action to be taken to be recognized in a natural way.
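One way to implement this detection step is to compare the pooled O1/O2 spectral power in narrow bands around the four candidate frequencies and pick the strongest. This is a sketch under our own assumptions: the ±0.3 Hz band width, the function names and the synthetic test signal are illustrative, not from the paper; only the sampling rate and the four stimulus frequencies come from the text.

```python
import numpy as np
from scipy.signal import welch

FS = 128                             # EPOC sampling rate (Hz)
STIMULI = [8.6, 10.0, 12.0, 15.0]    # screen-area flicker frequencies (Hz)

def detect_stimulus(o1, o2, fs=FS):
    """Return the stimulus frequency dominating the occipital spectrum."""
    f, p1 = welch(o1, fs=fs, nperseg=len(o1))
    _, p2 = welch(o2, fs=fs, nperseg=len(o2))
    psd = p1 + p2                    # pool both occipital electrodes
    scores = []
    for target in STIMULI:
        # Sum power in a narrow band (+-0.3 Hz, our choice) around each
        # candidate flicker frequency
        band = np.abs(f - target) <= 0.3
        scores.append(psd[band].sum())
    return STIMULI[int(np.argmax(scores))]

# Synthetic 5 s window with a strong 12 Hz SSVEP response plus noise
rng = np.random.default_rng(1)
t = np.arange(5 * FS) / FS
o1 = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.standard_normal(t.size)
o2 = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.standard_normal(t.size)
print(detect_stimulus(o1, o2))       # 12.0 expected for a 12 Hz response
```

Since each detected frequency maps to one of the four screen areas, the returned value directly encodes the subject's 2-bit choice.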

In this stage, the signals were filtered using a 4th-order Butterworth digital band-pass filter between 8 and 30 Hz, which comprises the alpha and beta bands. Once again, the analysis was performed on 5 s windows with a 4.9 s overlap. Finally, the spectrogram of those signals was generated and the results were compared with the frequencies of the visual stimuli in order to identify the action to be taken.

4 Results

4.1 Meditation/Concentration

The identification of the meditation and concentration states, corresponding to the first phase of the project, was performed during real-time signal acquisition. An example is shown next, for electrodes AF3 and O1 [Fig. 1].
Fig. 1.

Results of a 5 min trial in which the subject alternates between the meditation and concentration states. Each state lasts 30 s, starting with meditation. The graph represents the amplitude of the alpha waves at the O1 and AF3 electrodes.

As expected, during the concentration periods the alpha activity decreases substantially, to values that rarely exceed 20 (arbitrary units). It should be noted that, in this example, the two peaks observed at the O1 electrode between 30 s–60 s and 150 s–180 s correspond to unscheduled lapses of concentration during the visual imagination of the subtraction results. Analyzing the values obtained, there is a clear distinction between the meditation and concentration states in both the frontal and occipital zones, although in the latter the difference is more pronounced.

These results demonstrate that it is possible to search and identify patterns of brain activity consistent with the classification of this information in real time.

4.2 SSVEP Results

After completing the first stage of the process, with the identification of the concentration state, we move on to the results of the second phase, corresponding to the choice of the action using SSVEP. The spectrogram results show a clear identification of the 8.6 Hz, 10 Hz, 12 Hz and 15 Hz frequencies [Fig. 2] (those at which the four areas of the screen flicker).
Fig. 2.

Spectrogram showing the identification of the 8.6 Hz, 10 Hz, 12 Hz and 15 Hz frequencies using electrodes O1 and O2 during SSVEP stimulation.

We also found that the checkerboard images to which the oscillating frequencies are applied should be dispersed on the screen, as far from each other as possible. Moreover, preset options are expected to provide faster responses than a full description of the user’s desire. Taking the example of someone who wants to quench their thirst, a grid with the image of a glass of water as one of the default options will save significant time compared with writing out that same intention.

5 Conclusions

Combining the identification of a concentration state with SSVEP will allow the implementation of a hybrid system with high accuracy. The BCI approach allows the user to voluntarily change their mental task, interacting with the system naturally. This work aims to contribute to the development of an autonomous system that allows real-time monitoring on mobile devices based on brain electrical activity. Associated with this application are the interfaces that will interact with the user. The flickering checkerboard images can be replaced with icons that identify the user’s intent, with a clear differentiation of the actions to be taken. At the same time, the disruption that may be caused by the peripheral vision of a concentrated individual should be evaluated, in order to remove this undesirable activity from the signal. We also believe that the use of higher frequencies will decrease the visual fatigue caused by the oscillation of the image, making the system more comfortable. The presented approach should be tested on a statistically significant number of individuals to identify more precisely the accuracy of the results.

References

  1. Lopes da Silva, F.H., Pfurtscheller, G. (eds.): Event-Related Desynchronization. Handbook of Electroencephalography and Clinical Neurophysiology, vol. 6, pp. 51–65. Elsevier, Amsterdam, revised edn. (1999)
  2. Fazel-Rezai, R., Abhari, K.: A region-based P300 speller for brain-computer interface. Can. J. Electr. Comput. Eng. 34, 81–85 (2009)
  3. Sellers, E., Arbel, Y., Donchin, E.: BCIs that use P300 event-related potentials. In: Wolpaw, J., Wolpaw, E.W. (eds.) Brain-Computer Interfaces: Principles and Practice. Oxford University Press, Oxford (2012)
  4. Allison, B., Faller, J., Neuper, C.H.: BCIs that use steady-state visual evoked potentials or slow cortical potentials. In: Wolpaw, J., Wolpaw, E.W. (eds.) Brain-Computer Interfaces: Principles and Practice. Oxford University Press, Oxford (2012)
  5. Silberstein, R.B.: Steady state visually evoked potential (SSVEP) topography in a graded working memory task. Int. J. Psychophysiol. 42, 219–232 (2001)
  6. Pastor, M., Artieda, J., Arbizu, J., Valencia, M., Masdeu, J.: Human cerebral activation during steady-state visual-evoked responses. J. Neurosci. 23(37), 621–627 (2003)
  7. Barbosa, A.O.G., Achanccaray, D.R., Meggiolaro, M.A.: Activation of a mobile robot through a brain computer interface. In: 2010 IEEE International Conference on Robotics and Automation, pp. 4815–4821 (2010)
  8. Chi, Y.M., Wang, Y.-T., Wang, Y., Maier, C., Jung, T.-P., Cauwenberghs, G.: Dry and noncontact EEG sensors for mobile brain-computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 20, 228 (2011)
  9. Sanchez, G., Diez, P.F., Avila, E., Leber, E.L.: Simple communication using a SSVEP-based BCI. J. Phys.: Conf. Ser. 332, 012017 (2011)
  10. Liu, T., Goldberg, L., Gao, S., Hong, B.: An online brain-computer interface using non-flashing visual evoked potentials. J. Neural Eng. 7, 036003 (2010). doi:10.1088/1741-2560/7/3/036003
  11. Kleih, S.C., Nijboer, F., Halder, S., Kübler, A.: Motivation modulates the P300 amplitude during brain-computer interface use. Clin. Neurophysiol. 121, 1023–1031 (2010)
  12. Wang, Y.-T., Wang, Y., Jung, T.-P.: A cell-phone-based brain-computer interface for communication in daily life. J. Neural Eng. 8, 025018 (2010). doi:10.1088/1741-2560/8/2/025018
  13. Fisch, B., Spehlmann, R.: Fisch and Spehlmann’s EEG Primer, 3rd edn. Elsevier, Amsterdam (1999)
  14. Niedermeyer, E., da Silva, F.H.L.: Electroencephalography: Basic Principles, Clinical Applications, and Related Fields. Williams & Wilkins, Baltimore (1993)
  15.
  16. Schlogl, A., Filz, O., Ramoser, H., Pfurtscheller, G.: GDF - a general data format for biosignals. Technical report (2004)
  17. Rebollo, M.A., Montiel, S.: Atención y funciones ejecutivas. Revista de Neurología 42(Supl 2), S3–S7 (2006)

Copyright information

© IFIP International Federation for Information Processing 2016

Authors and Affiliations

  • Pedro Morais, corresponding author (1, 2, 3)
  • Carla Quintão (2, 3)
  • Pedro Vieira (2, 3)
  1. Biomedical Engineering Doctoral Program, Universidade NOVA de Lisboa, Lisbon, Portugal
  2. Department of Physics, Faculty of Sciences and Technology, Universidade NOVA de Lisboa, Lisbon, Portugal
  3. Laboratory for Instrumentation, Biomedical Engineering and Radiation Physics, Universidade NOVA de Lisboa, Lisbon, Portugal
