Cognitive load associations when utilizing auditory display within image-guided neurosurgery
The combination of data visualization and auditory display (e.g., sonification) has been shown to increase accuracy and reduce perceived difficulty in 3D navigation tasks. While accuracy in such tasks can be measured in real time, subjective impressions of task difficulty are harder to obtain. Prior work using electroencephalography (EEG) has shown that cognitive load and working memory can be monitored in real time from EEG data.
In this study, we replicated a 3D navigation task (within the context of image-guided surgery) while monitoring participants’ cognitive load using relative alpha-band power derived from EEG recordings. Specifically, 13 subjects navigated a tracked surgical tool to randomly placed 3D virtual locations on a CT cerebral angiography volume while being aided by visual, aural, or combined visual and aural feedback. EEG data were recorded during the task, and participants completed a NASA TLX questionnaire afterward. In addition to replicating an existing experimental design on auditory display within image-guided neurosurgery, our primary aim was to determine whether EEG-based markers of cognitive load mirrored subjective ratings of task difficulty.
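The cognitive-load marker described above, relative alpha-band power, can be sketched as the fraction of EEG spectral power falling in the alpha band (typically 8–12 Hz) relative to a broadband reference. The following is a minimal illustration, not the study's actual pipeline; the band edges, the 1–40 Hz reference band, and the Welch parameters are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch

def relative_alpha_power(eeg, fs, alpha=(8.0, 12.0), broadband=(1.0, 40.0)):
    """Relative alpha-band weighting: alpha power / broadband power.

    eeg : 1-D array of samples from one channel
    fs  : sampling rate in Hz
    """
    # Welch PSD estimate; 2-second segments balance frequency resolution
    # against variance for short recordings (an illustrative choice).
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * int(fs)))
    alpha_mask = (freqs >= alpha[0]) & (freqs <= alpha[1])
    broad_mask = (freqs >= broadband[0]) & (freqs <= broadband[1])
    # Uniform frequency grid, so the ratio of summed PSD bins equals the
    # ratio of integrated band powers.
    return psd[alpha_mask].sum() / psd[broad_mask].sum()

# Synthetic check: a 10 Hz sinusoid plus weak noise should be alpha-dominated.
fs = 256
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
print(relative_alpha_power(signal, fs))  # close to 1 for this alpha-dominated signal
```

In practice, a rising relative alpha ratio over frontal sites is commonly read as decreasing engagement, so the mapping from this ratio to "workload" depends on electrode placement and task.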
Consistent with the existing literature, our study found evidence that auditory display can increase the accuracy of navigating to a specified target. We also found significant differences in cognitive workload across feedback modalities, although none of these differences supported the experiment's hypotheses. Finally, we found mixed results regarding the relationship between real-time measurements of cognitive workload and a posteriori subjective impressions of task difficulty.
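The relationship between the real-time EEG measure and the post-hoc NASA TLX ratings can be tested with a rank correlation, which makes no linearity assumption between the two scales. The sketch below uses hypothetical per-participant values, not study data, purely to show the shape of such an analysis.

```python
from scipy.stats import spearmanr

# Hypothetical per-participant summaries (NOT study data): mean EEG workload
# index for one condition, and the matching NASA TLX overall score (0-100).
eeg_index = [0.35, 0.38, 0.39, 0.42, 0.44, 0.47, 0.51, 0.55]
tlx_score = [40,   45,   55,   48,   58,   62,   70,   75]

# Spearman's rho: monotone association between the physiological and
# subjective measures; p tests the null of no association.
rho, p = spearmanr(eeg_index, tlx_score)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")
```

A strong positive rho would indicate the EEG index tracks subjective difficulty; the mixed results reported above correspond to this correlation being inconsistent across modalities.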
Although we did not find a significant correlation between the subjective and physiological measurements, we did observe differences in cognitive workload across feedback modalities. In addition, our study further supports the use of auditory display in image-guided surgery.
Keywords: Image-guided neurosurgery · Neuronavigation · Auditory display · Data sonification · EEG · Cognitive workload · Evaluation · Interfaces
This study was funded by Natural Sciences and Engineering Research Council of Canada (NSERC Grant N0759) and Fonds de recherche du Quebec Nature et technologies (FRQNT Grant F01296).
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed consent was obtained from all individual participants included in the study.
Supplementary material 1 (avi 22397 KB)