Investigating Brain Dynamics in Industrial Environment – Integrating Mobile EEG and Kinect for Cognitive State Detection of a Worker
In the present work, we used a wearable EEG sensor to record brain activity during simulated assembly work in a replicated industrial environment. We investigated attention-related modalities of the P300 ERP component and the engagement index (EI), which is extracted from signal power ratios of the α, β, and θ frequency bands. Simultaneously, we quantified task-unrelated movements, which have previously been reported to relate to attention level, in an automated way using a Kinect™ sensor. Reaction times (RTs) were also recorded and analyzed. We found that during the monotonous task, both the P300 amplitude and the EI decreased as the task progressed. Conversely, the quantity of task-unrelated movements increased, together with an increase in RTs. These findings lead to the conclusion that monotonous assembly work induces a decrease in workers' attention and engagement as the task progresses, which is observable in both neural (EEG) and behavioral (RT and unrelated movement) signal modalities. Apart from observing how the attention-related modalities change over time, we investigated the functional relationship between the neural and behavioral modalities using Pearson's correlation. Since the Pearson's correlation coefficients showed a functional relationship between the attention-related modalities, we propose the creation of a multimodal implicit Human-Computer Interaction (HCI) system that could acquire and process neural and behavioral data in real time, with the aim of creating a system that is aware of the operator's mental states during industrial work, consequently improving the operator's well-being.
Keywords: Wireless EEG · Kinect · ERP · P300 · Attention · Neuroergonomics
This research is financed under EU - FP7 Marie Curie Actions FP7-PEOPLE-2011-ITN.
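The engagement index mentioned in the abstract is commonly computed as EI = β / (α + θ), the ratio of band powers in the beta, alpha, and theta EEG frequency bands. A minimal sketch in Python follows; the FFT-based power estimator and the exact band limits (θ: 4–8 Hz, α: 8–13 Hz, β: 13–30 Hz) are assumptions for illustration, not details taken from the paper itself:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` in the [low, high) Hz band,
    estimated from a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def engagement_index(signal, fs):
    """EI = beta / (alpha + theta); band limits are assumed values."""
    theta = band_power(signal, fs, 4.0, 8.0)
    alpha = band_power(signal, fs, 8.0, 13.0)
    beta = band_power(signal, fs, 13.0, 30.0)
    return beta / (alpha + theta)

# Synthetic 2 s epoch dominated by a 10 Hz (alpha-band) oscillation:
# alpha power is high relative to beta, so EI comes out low.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
alpha_epoch = np.sin(2 * np.pi * 10 * t)
print(engagement_index(alpha_epoch, fs))
```

In a real-time system such as the one the abstract proposes, this computation would be repeated over short sliding windows of the EEG stream, and the resulting EI time series correlated (e.g. via Pearson's r) with the behavioral measures.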