
The Mahalanobis Distance for Feature Selection Using Genetic Algorithms: An Application to BCI

  • Chapter
  • First Online:
New Trends in Emerging Complex Real Life Problems

Part of the book series: AIRO Springer Series (AIROSS, volume 1)

Abstract

High dimensionality is a well-known problem that has attracted considerable attention from data scientists. Classification algorithms usually struggle with high-dimensional data, and the Support Vector Machine is no exception. Reducing the dimensionality of the data by selecting a subset of the original features is one solution to this problem. Many approaches have been proposed with positive results, including Genetic Algorithms, which have proven to be an effective strategy. In this paper, a new method using the Mahalanobis distance as a fitness function is introduced. The performance of the proposed method is investigated and compared with state-of-the-art methods.
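The chapter's exact fitness formulation is not reproduced on this page. As a rough illustration of the idea only, the sketch below (hypothetical code, not the authors' implementation) scores a candidate feature subset by the Mahalanobis distance between the two class means, using a pooled, ridge-regularised covariance estimate; a genetic algorithm would then maximise this value over binary feature masks. The function name and regularisation constant are assumptions for the example.

```python
# Hypothetical sketch: Mahalanobis-distance fitness for a candidate feature subset.
# Not the authors' implementation; it only illustrates the kind of fitness a GA
# for feature selection could maximise.
import numpy as np

def mahalanobis_fitness(X, y, mask):
    """Mahalanobis distance between the two class means on the selected features.

    X    : (n_samples, n_features) data matrix
    y    : binary labels (0/1)
    mask : boolean vector of length n_features marking selected features
    """
    Xs = X[:, mask]
    mu0 = Xs[y == 0].mean(axis=0)
    mu1 = Xs[y == 1].mean(axis=0)
    # Pooled covariance of the two classes; a small ridge keeps it invertible.
    cov = (np.cov(Xs[y == 0], rowvar=False) + np.cov(Xs[y == 1], rowvar=False)) / 2
    cov = cov + 1e-6 * np.eye(int(mask.sum()))
    diff = mu1 - mu0
    # Squared distance is diff^T * cov^{-1} * diff; return its square root.
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

# Toy usage: random data and a random feature mask (a GA chromosome).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = rng.integers(0, 2, size=100)
mask = rng.random(20) > 0.5
print(mahalanobis_fitness(X, y, mask))
```

Encoding each chromosome as a boolean mask and scoring it with a distance-based measure keeps fitness evaluation cheap compared with wrapper approaches that retrain an SVM for every candidate subset.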



Author information

Corresponding author

Correspondence to Maria Elena Bruni.


Copyright information

© 2018 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Bruni, M.E., Nguyen Duy, D., Beraldi, P., Violi, A. (2018). The Mahalanobis Distance for Feature Selection Using Genetic Algorithms: An Application to BCI. In: Daniele, P., Scrimali, L. (eds) New Trends in Emerging Complex Real Life Problems. AIRO Springer Series, vol 1. Springer, Cham. https://doi.org/10.1007/978-3-030-00473-6_9
