Algorithm for Assessing Auditory Images Perception and Verbal Information

  • Ksenija Belskaya
  • Sergey Lytaev
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1201)

Abstract

In some contexts, brain dysfunctions may be regarded as a model of phantom functional states that can arise in a healthy person under extreme conditions. To assess individual psychological characteristics and the severity of the phantom state, a clinical interview, a study of auditory speech memory, an assessment of situational and personal anxiety, and differential diagnosis of depressive conditions were used. To study auditory-cognitive functions, an original technique, "Recognition of Auditory Images", was developed. The EEG was recorded both in the pre-stimulus period and simultaneously with the perception of auditory images. In the phantom state, the latency of auditory image perception was 2.6–2.9 times longer than the reference value, and the substantive effectiveness of auditory perceptual activity was 2.5 times lower. Coherence analysis of the EEG revealed impaired formation of "interaction foci", with intercentral interaction reduced by 18.8–42.7%.
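
For illustration only: the figures above refer to pairwise EEG coherence between recording sites, a standard measure of intercentral interaction. The short Python sketch below shows one common way such a coherence index can be computed (Welch-based magnitude-squared coherence via SciPy); the sampling rate, channel roles, frequency band, and synthetic signals are assumptions made for the sketch, not parameters taken from the paper.

    # Illustrative sketch of pairwise EEG coherence (not the authors' pipeline).
    # Sampling rate, channel roles, and the alpha band are assumed for the example.
    import numpy as np
    from scipy.signal import coherence

    fs = 250                              # assumed EEG sampling rate, Hz
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 1 / fs)          # 10 s of synthetic data

    # Two synthetic "channels" sharing a 10 Hz component plus independent noise
    shared = np.sin(2 * np.pi * 10 * t)
    ch_a = shared + 0.5 * rng.standard_normal(t.size)
    ch_b = shared + 0.5 * rng.standard_normal(t.size)

    # Welch-based magnitude-squared coherence: one value in [0, 1] per frequency bin
    freqs, cxy = coherence(ch_a, ch_b, fs=fs, nperseg=2 * fs)

    # Average coherence inside the alpha band (8-13 Hz) as a single interaction index
    alpha = (freqs >= 8) & (freqs <= 13)
    print(f"mean alpha-band coherence: {cxy[alpha].mean():.2f}")

A decrease in such pairwise indices between conditions is one way a reduction of intercentral interaction, such as the 18.8–42.7% reported above, can be quantified.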

Keywords

Consciousness · Non-verbal perception · Auditory images · EEG · Coherence · Phantom states

Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. Saint Petersburg State Pediatric Medical University, Saint Petersburg, Russia
  2. Saint Petersburg Institute for Informatics and Automation, Saint Petersburg, Russia
