
Brain-Controlled Selection of Objects Combined with Autonomous Robotic Grasping

Conference paper · Neurotechnology, Electronics, and Informatics

Abstract

A Brain–Computer Interface (BCI) could help to restore mobility to severely paralyzed patients, for instance through prosthesis control. However, the information transfer rate currently achievable with noninvasive BCIs is insufficient for continuous control of complex prostheses in many degrees of freedom. In this paper we present an autonomous system for grasping natural objects that compensates for the low information transfer rate of noninvasive BCIs. Using this system, one out of several objects can be grasped without any muscle activity; instead, the grasp is initiated by decoded voluntary brain wave modulations. Object selection and grasping are performed in a virtual reality environment. A universal grasp planning algorithm calculates the trajectory of a gripper online. The system can be controlled after less than 10 min of training. We found that decoding accuracy increases over time and that an increased sense of agency, achieved by permitting free selections, makes the system work most reliably.


References

  1. Wolpaw JR. Brain-computer interfaces. In: Barnes MP, Good DC, editors. Neurological rehabilitation, Handbook of clinical neurology, vol. 110. Amsterdam: Elsevier; 2013. p. 67–74.

  2. Hochberg LR, Bacher D, Jarosiewicz B, Masse NY, Simeral JD, Vogel J, Haddadin S, Liu J, Cash SS, van der Smagt P, Donoghue JP. Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature. 2012;485(7398):372–5.

  3. Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature. 2008;453(7198):1098–101.

  4. Pfurtscheller G, Neuper C, Guger C, Harkam W, Ramoser H, Schlögl A, Obermaier B, Pregenzer M. Current trends in Graz Brain-Computer Interface (BCI) research. IEEE Trans Rehabil Eng. 2000;8(2):216–9.

  5. Guger C, Edlinger G, Harkam W, Niedermayer I, Pfurtscheller G. How many people are able to operate an EEG-based brain-computer interface (BCI)? IEEE Trans Neural Syst Rehabil Eng. 2003;11(2):145–7.

  6. Vidaurre C, Blankertz B. Towards a cure for BCI illiteracy. Brain Topogr. 2010;23(2):194–8.

  7. Guger C, Daban S, Sellers E, Holzner C, Krausz G, Carabalona R, Gramatica F, Edlinger G. How many people are able to control a P300-based brain-computer interface (BCI)? Neurosci Lett. 2009;462(1):94–8.

  8. Farwell LA, Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol. 1988;70(6):510–23.

  9. Brunner P, Joshi S, Briskin S, Wolpaw JR, Bischof H, Schalk G. Does the ‘P300’ speller depend on eye gaze? J Neural Eng. 2010;7(5):056013.

  10. Frenzel S, Neubert E, Bandt C. Two communication lines in a 3 × 3 matrix speller. J Neural Eng. 2011;8(3):036021.

  11. Sahbani A, El-Khoury S, Bidaud P. An overview of 3D object grasp synthesis algorithms. Robot Auton Syst. 2012;60(3):326–36.

  12. Quandt F, Reichert C, Hinrichs H, Heinze HJ, Knight RT, Rieger JW. Single trial discrimination of individual finger movements on one hand: a combined MEG and EEG study. Neuroimage. 2012;59(4):3316–24.

  13. Rieger JW, Reichert C, Gegenfurtner KR, Noesselt T, Braun C, Heinze H-J, Kruse R, Hinrichs H. Predicting the recognition of natural scenes from single trial MEG recordings of brain activity. Neuroimage. 2008;42(3):1056–68.

  14. Wolpaw JR, Birbaumer N, Heetderks WJ, McFarland DJ, Peckham PH, Schalk G, Donchin E, Quatrano LA, Robinson CJ, Vaughan TM. Brain-computer interface technology: a review of the first international meeting. IEEE Trans Rehabil Eng. 2000;8(2):164–73.

  15. Reichert C, Kennel M, Kruse R, Hinrichs H, Rieger JW. Efficiency of SSVEF recognition from the magnetoencephalogram - a comparison of spectral feature classification and CCA-based prediction. In: NEUROTECHNIX 2013 - Proceedings of the International Congress on Neurotechnology, Electronics and Informatics. Vilamoura: SciTePress; 2013. p. 233–7.

  16. Treder MS, Blankertz B. (C)overt attention and visual speller design in an ERP-based brain-computer interface. Behav Brain Funct. 2010;6:28.

  17. Bianchi L, Sami S, Hillebrand A, Fawcett IP, Quitadamo LR, Seri S. Which physiological components are more suitable for visual ERP based brain-computer interface? A preliminary MEG/EEG study. Brain Topogr. 2010;23(2):180–5.

  18. Aloise F, Schettini F, Aricò P, Salinari S, Babiloni F, Cincotti F. A comparison of classification techniques for a gaze-independent P300-based brain-computer interface. J Neural Eng. 2012;9(4):045012.

  19. Liu Y, Zhou Z, Hu D. Gaze independent brain–computer speller with covert visual search tasks. Clin Neurophysiol. 2011;122(6):1127–36.

  20. Treder MS, Schmidt NM, Blankertz B. Gaze-independent brain-computer interfaces based on covert attention and feature attention. J Neural Eng. 2011;8(6):066003.

  21. Curran EA, Stokes MJ. Learning to control brain activity: a review of the production and control of EEG components for driving brain-computer interface (BCI) systems. Brain Cogn. 2003;51(3):326–36.

  22. Hoffmann U, Vesin J-M, Ebrahimi T, Diserens K. An efficient P300-based brain-computer interface for disabled subjects. J Neurosci Methods. 2008;167(1):115–25.

  23. Wolpaw JR, McFarland DJ. Control of a two-dimensional movement signal by a noninvasive brain–computer interface in humans. Proc Natl Acad Sci U S A. 2004;101(51):17849–54.

  24. Cherkassky V, Mulier FM. Learning from data: concepts, theory, and methods. New York, NY: John Wiley & Sons; 1998.

  25. Krusienski DJ, Sellers EW, Cabestaing F, Bayoudh S, McFarland DJ, Vaughan TM, Wolpaw JR. A comparison of classification techniques for the P300 Speller. J Neural Eng. 2006;3(4):299–305.

  26. Bradshaw LA, Wijesinghe RS, Wikswo Jr JP. Spatial filter approach for comparison of the forward and inverse problems of electroencephalography and magnetoencephalography. Ann Biomed Eng. 2001;29(3):214–26.

  27. Khatib O. Real-time obstacle avoidance for manipulators and mobile robots. Int J Robot Res. 1986;5(1):90–8.

  28. Siciliano B, Villani L. Robot force control. Norwell, MA: Kluwer Academic Publishers; 1999.

  29. Ericson C. Real-time collision detection. San Francisco, CA: Elsevier, Morgan Kaufmann Publishers; 2005.

  30. Prattichizzo D, Trinkle JC. Grasping. In: Siciliano B, Khatib O, editors. Springer handbook of robotics. Heidelberg: Springer; 2008. p. 671–700.

Acknowledgements

This work has been supported by the EU project ECHORD number 231143 from the 7th Framework Programme and by Land-Sachsen-Anhalt Grant MK48-2009/003.

Author information

Corresponding author

Correspondence to Jochem W. Rieger.

Appendix

In Sect. 2.4 we introduced the rasterization of the object and gripper surfaces into virtual point poles. Here we describe the grasp planning algorithm in more detail.

Our grasp planning algorithm simulates the action of forces between the target object and the manipulator in consecutive time frames. The object poles \( P^O \) are defined as positive and the manipulator poles \( P^M \) as negative. In accordance with Khatib [27], we assume that opposite poles attract each other while like poles do not interact. We calculated the magnitude of the force between two poles \( P_i^O \) and \( P_j^M \) as

$$ \overrightarrow{F}\left({P}_i^O,{P}_j^M\right)={e}^{-\overrightarrow{P_i^O{P}_j^M}} $$
(1)

where \( \overrightarrow{P_i^O{P}_j^M} \) is the distance between the poles and the unit of F is arbitrary. The exponential function limits F to a maximum of 1 unit, which avoids infinite forces in collision scenarios and provides a suitable scaling for both the propulsive forces between manipulator and object and the repulsive forces that reject manipulator poles penetrating the object's boundary.
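As an illustration only, the following sketch evaluates Eq. (1) for a single pole pair; representing each pole by its 3D position as a NumPy array is our assumption and not part of the original implementation.

```python
# Minimal sketch of Eq. (1): force magnitude between one object pole and one
# manipulator pole, assuming each pole is given as a 3D position (NumPy array).
import numpy as np

def pole_force_magnitude(p_obj, p_man):
    """Exponentially decaying, distance-based force magnitude, bounded by 1."""
    distance = np.linalg.norm(np.asarray(p_man, dtype=float) - np.asarray(p_obj, dtype=float))
    return np.exp(-distance)

# Poles 0.5 units apart interact with magnitude exp(-0.5) ~ 0.61; coincident
# poles are capped at 1.0, so collisions never yield infinite forces.
print(pole_force_magnitude([0.0, 0.0, 0.0], [0.5, 0.0, 0.0]))
```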

The total propulsive force \( \overrightarrow{F}\left({P}_i^M\right) \) acting on one point pole \( {P}_i^M \) of the manipulator is calculated from a set of object point poles \( {A}_O \), where

$$ {A}_O\left({P}_i^M\right):=\left\{{P}_j^O\Big|{P}_j^O\in {P}^O \wedge {\overrightarrow{n}}_j^O\cdot {\overrightarrow{n}}_i^M<0\right\} $$
(2)

which indicates that only pole pairs whose surface normals \( {\overrightarrow{n}}_i^M \) and \( {\overrightarrow{n}}_j^O \) enclose an angle greater than π/2 (i.e., a negative dot product) are involved. We included this constraint to restrict interactions to opposing surfaces. The force \( \overrightarrow{F}\left({P}_i^M\right) \) that moves the manipulator is then calculated as

$$ \overrightarrow{F}\left({P}_i^M\right)={\displaystyle \sum_{P_j^O\in {A}_O\left({P}_i^M\right)}\overrightarrow{F}\left({P}_j^O,{P}_i^M\right)} $$
(3)
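A minimal sketch of Eqs. (2) and (3) follows. It assumes that each pole is stored as a (position, outward unit normal) pair of NumPy arrays and that each pairwise attraction acts along the line connecting the two poles; this data layout and the helper name are ours, not the authors'.

```python
# Sketch of the active set (Eq. 2) and the summed propulsive force (Eq. 3)
# acting on a single manipulator point pole.
import numpy as np

def propulsive_force(man_pole, obj_poles):
    p_m, n_m = man_pole
    total = np.zeros(3)
    for p_o, n_o in obj_poles:
        # Active-set condition of Eq. (2): only opposing surface normals interact.
        if np.dot(n_o, n_m) < 0.0:
            to_object = np.asarray(p_o, dtype=float) - np.asarray(p_m, dtype=float)
            dist = np.linalg.norm(to_object)
            if dist > 0.0:
                # Magnitude from Eq. (1), direction towards the attracting object pole.
                total += np.exp(-dist) * to_object / dist
    return total

# Example: a gripper pole facing an object pole 0.2 units away is attracted to it.
gripper_pole = (np.array([0.0, 0.0, 0.2]), np.array([0.0, 0.0, -1.0]))
object_poles = [(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))]
print(propulsive_force(gripper_pole, object_poles))
```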

The manipulator's effective joint torque \( \overrightarrow{\tau} \) can be calculated by means of the Jacobian J generated from the joint angles \( \overrightarrow{q} \) and the point poles \( P^M \) [28] by

$$ \overrightarrow{\tau}={\displaystyle \sum_iJ{\left({P}_i^M,\overrightarrow{q}\right)}^T\left[\begin{array}{c} \overrightarrow{F}\left({P}_i^M\right) \\ \overrightarrow{M} \end{array}\right]} $$
(4)

where the external moments are set to \( \overrightarrow{M}=\overrightarrow{0} \). In order to simulate the manipulator movement, we calculated the new joint angle \( q_k(t) \) of an axis k by solving the equation system

$$ {\dot{q}}_k(t)={\dot{q}}_k\left(t-\Delta t\right)+\Delta t*\frac{\tau_k}{{\overrightarrow{a_k}}^TI\left(\overrightarrow{q}\right)\overrightarrow{a_k}} $$
(5)
$$ {q}_k(t)={q}_k\left(t-\Delta t\right)+\Delta t*{\dot{q}}_k(t) $$
(6)

where \( I\left(\overrightarrow{q}\right) \) is the inertia tensor of the robot's solid elements and \( \overrightarrow{a_k} \) defines one of the manipulator axes. We determined the length Δt of a time frame heuristically and dynamically, proportional to the mean distance between the point pole sets \( P^M \) and \( P^O \).
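The sketch below shows one way to evaluate Eqs. (4)-(6) numerically. It assumes that the per-pole Jacobians, the inertia tensor \( I\left(\overrightarrow{q}\right) \), and the axis vectors are supplied by an existing robot model (not reproduced here); the function names and array shapes are our assumptions.

```python
# Sketch of Eqs. (4)-(6): map per-pole forces to joint torques via the Jacobian
# transpose, then advance one joint with an explicit Euler step.
import numpy as np

def joint_torques(jacobians, forces):
    """Eq. (4): tau = sum_i J(P_i^M, q)^T [F_i; M], with external moments M = 0.

    jacobians: list of 6 x n_joints arrays, one per manipulator point pole
    forces   : list of 3-vectors, propulsive force on each pole
    """
    tau = np.zeros(jacobians[0].shape[1])
    for J, f in zip(jacobians, forces):
        wrench = np.concatenate([np.asarray(f, dtype=float), np.zeros(3)])  # M = 0
        tau += J.T @ wrench
    return tau

def euler_joint_update(q_k, qdot_k, tau_k, axis_k, inertia, dt):
    """Eqs. (5) and (6): update velocity and angle of the joint about axis k.

    axis_k : 3-vector of the manipulator axis a_k
    inertia: 3 x 3 inertia tensor I(q) of the solid elements
    """
    effective_inertia = axis_k @ inertia @ axis_k   # a_k^T I(q) a_k
    qdot_new = qdot_k + dt * tau_k / effective_inertia   # Eq. (5)
    q_new = q_k + dt * qdot_new                          # Eq. (6)
    return q_new, qdot_new
```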

Collision detection was performed for the new posture before a time frame was accepted as valid and the position update was sent to the manipulator. We used standard techniques [29] to detect surface intersections. If intersections were detected, repulsive forces were calculated for the affected point poles, directed towards their positions in the last valid time frame and with magnitudes according to Eq. (1). If no intersections were detected, the robot moved to the new coordinates. This procedure was repeated until the force closure condition [30] was satisfied.
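To make the collision response concrete, a small sketch of the repulsion step follows; pushing a penetrating pole back towards its last valid position with the magnitude of Eq. (1) applied to that displacement is our reading of the text, and the function name is hypothetical.

```python
# Sketch of the collision response: a manipulator pole that penetrated the
# object surface is pushed back towards its position in the last valid time frame.
import numpy as np

def repulsive_force(pole_now, pole_last_valid):
    back = np.asarray(pole_last_valid, dtype=float) - np.asarray(pole_now, dtype=float)
    dist = np.linalg.norm(back)
    if dist == 0.0:
        return np.zeros(3)
    return np.exp(-dist) * back / dist  # magnitude from Eq. (1), direction back out

# Example: a pole that moved 0.1 units into the object is pushed back with
# magnitude exp(-0.1) ~ 0.90 towards its last collision-free position.
print(repulsive_force([1.0, 0.0, 0.0], [0.9, 0.0, 0.0]))
```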

Copyright information

© 2015 Springer International Publishing Switzerland

Cite this paper

Reichert, C. et al. (2015). Brain-Controlled Selection of Objects Combined with Autonomous Robotic Grasping. In: Londral, A., Encarnação, P., Rovira, J. (eds) Neurotechnology, Electronics, and Informatics. Springer Series in Computational Neuroscience, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-319-15997-3_5
