Assessing the Applicability of Surface EMG to Tongue Gesture Detection

  • João Freitas
  • Samuel Silva
  • António Teixeira
  • Miguel Sales Dias
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8854)

Abstract

The most promising approaches to surface electromyography (EMG) based speech interfaces commonly target the tongue muscles. Despite interesting results on small-vocabulary tasks, it remains unclear which articulatory gestures these sensors actually detect. To address this question, we propose a novel method, based on the synchronous acquisition of surface EMG and ultrasound imaging (US) of the tongue, to assess the applicability of EMG to tongue gesture detection. In this context, the US image sequences capture tongue movement over time and provide the ground truth for the EMG analysis. Using this multimodal setup, we recorded a corpus covering several tongue transitions (e.g., back to front) in different contexts. Based on the annotated tongue movement data, the EMG analysis shows that tongue transitions can be detected with EMG sensors, albeit with some variability across sensor positions and speakers and with the possibility of high false-positive rates.
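
As an illustration of the kind of analysis described above, the short Python sketch below shows one possible way to flag candidate tongue movements in a single surface EMG channel by thresholding its smoothed, rectified envelope and scoring the detections against US-annotated transition intervals. It is not the authors' implementation; the signal names (emg, fs), the annotation format (annotated_intervals), and all parameter values are assumptions made for the example.

# Illustrative sketch only: threshold-based event detection on one EMG channel,
# scored against hypothetical US-derived annotations of tongue transitions.
import numpy as np

def emg_envelope(emg: np.ndarray, fs: float, win_s: float = 0.1) -> np.ndarray:
    """Rectify the EMG signal and smooth it with a moving-average window."""
    rectified = np.abs(emg - np.mean(emg))       # remove DC offset, full-wave rectify
    win = max(1, int(win_s * fs))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def detect_events(envelope: np.ndarray, fs: float, k: float = 2.0) -> np.ndarray:
    """Return onset times (s) where the envelope rises above mean + k * std."""
    threshold = envelope.mean() + k * envelope.std()
    above = envelope > threshold
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1)
    return onsets / fs

def score(detections, annotated_intervals, tol: float = 0.2):
    """Count detections within tol seconds of an annotated tongue transition
    (true positives) versus detections outside any interval (false positives)."""
    tp = sum(any(start - tol <= d <= end + tol for start, end in annotated_intervals)
             for d in detections)
    return tp, len(detections) - tp

if __name__ == "__main__":
    # Synthetic example: 5 s of baseline noise with one EMG burst around 2.0-2.3 s.
    fs = 1000.0
    t = np.arange(0, 5, 1 / fs)
    emg = 0.05 * np.random.randn(t.size)
    burst = (t > 2.0) & (t < 2.3)
    emg[burst] += 0.5 * np.random.randn(np.count_nonzero(burst))
    env = emg_envelope(emg, fs)
    dets = detect_events(env, fs)
    print(score(dets, [(2.0, 2.3)]))   # (true positives, false positives)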

Keywords

tongue gestures · surface electromyography · ultrasound imaging · synchronization · silent speech interfaces

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • João Freitas (1, 2)
  • Samuel Silva (2)
  • António Teixeira (2)
  • Miguel Sales Dias (1, 3)
  1. Microsoft Language Development Center, Lisboa, Portugal
  2. Dep. Electronics Telecommunications & Informatics / IEETA, University of Aveiro, Portugal
  3. ISTAR-IUL, Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal
