Glossokinetic potential based tongue–machine interface for 1-D extraction

  • Kutlucan Gorur
  • M. Recep Bozkurt
  • M. Serdar Bascil
  • Feyzullah Temurtas

Abstract

The tongue is a useful and aesthetically unobtrusive organ located in the oral cavity, capable of complex movements with very little fatigue. Assistive technologies operated by the tongue, commonly called tongue–human computer interfaces or tongue–machine interfaces (TMIs), have been studied extensively for paralyzed individuals. However, many of them are obtrusive systems whose hardware, such as sensors and a magnetic tracer, is placed in the mouth and on the tongue. Hence, these approaches can be annoying, aesthetically unappealing and unhygienic. In this study, we aimed to develop a natural and reliable tongue–machine interface using only glossokinetic potentials (GKPs), by investigating the performance of machine learning algorithms for 1-D tongue-based control or communication with assistive technologies. Glossokinetic potential responses are generated by touching the buccal walls with the tip of the tongue. Ten naive healthy subjects (eight male, two female), aged 22–34 years, participated in the study. Linear discriminant analysis, support vector machine and k-nearest neighbor were used as the machine learning algorithms. The highest success rate, an accuracy of 99%, was achieved for the best participant with the support vector machine. This study may help disabled people control assistive devices in a natural, unobtrusive, fast and reliable manner. Moreover, a GKP-based TMI is expected to provide an alternative control and communication channel to traditional electroencephalography (EEG)-based brain–computer interfaces, which have significant inadequacies arising from the EEG signals.
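As a rough illustration of the classification step described in the abstract, the sketch below compares the three classifiers named there (LDA, SVM and k-NN) on hypothetical GKP feature vectors for a 1-D left/right tongue-touch task. The data, feature dimensions and classifier parameters are placeholders chosen for the example, not the authors' actual pipeline or results.

```python
# Illustrative sketch only (not the authors' pipeline): comparing the three
# classifiers named in the abstract -- LDA, SVM and k-NN -- on hypothetical
# GKP feature vectors for a 1-D left/right tongue-touch classification task.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 200 trials x 18 channel features,
# labels 0 = left buccal-wall touch, 1 = right buccal-wall touch.
y = rng.integers(0, 2, size=200)
X = rng.normal(size=(200, 18))
X[y == 1] += 0.8  # crude, assumed class separation for illustration only

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf", C=1.0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}

# Report mean 5-fold cross-validated accuracy for each classifier.
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.2f}")
```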

Keywords

  • Glossokinetic potential
  • Tongue–machine interfaces
  • Assistive technologies
  • Electroencephalography
  • Brain–computer interfaces

Notes

Acknowledgements

The authors would like to thank the students of Bozok University for their participation in this research.

Compliance with Ethical Standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

The study was approved by the Ethical Committee of Sakarya University (decision document no. 61923333/044). All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee.

Informed consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© Australasian College of Physical Scientists and Engineers in Medicine 2018

Authors and Affiliations

  • Kutlucan Gorur 1, 2
  • M. Recep Bozkurt 1
  • M. Serdar Bascil 2
  • Feyzullah Temurtas 3
  1. Department of Electrical and Electronics Engineering, Sakarya University, Sakarya, Turkey
  2. Department of Electrical and Electronics Engineering, Bozok University, Yozgat, Turkey
  3. Department of Electrical and Communication Engineering, Bandirma Onyedi Eylul University, Balikesir, Turkey
