Feature Selection Using Tabu Search with Learning Memory: Learning Tabu Search

  • Lucien Mousin
  • Laetitia Jourdan
  • Marie-Eléonore Kessaci Marmion
  • Clarisse Dhaenens
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10079)


Feature selection in classification can be modeled as a combinatorial optimization problem. One of the main particularities of this problem is the large amount of time that may be needed to evaluate the quality of a subset of features. In this paper, we propose to solve this problem with a tabu search algorithm integrating a learning mechanism. To do so, we adapt a learning tabu search algorithm, originally designed for a railway network problem in which evaluating a solution is time-consuming, to the feature selection problem. Experiments are conducted and show the benefit of using a learning mechanism to solve hard instances from the literature.
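To illustrate the kind of method the abstract describes, here is a minimal, hypothetical sketch of a tabu search over feature subsets with a simple frequency-based learning memory. It is not the authors' exact algorithm: the bit-flip neighborhood, the tabu tenure, the aspiration criterion, the learning bonus of `0.01 * freq[j]`, and the toy `evaluate` function used below are all illustrative assumptions.

```python
import random

def learning_tabu_search(evaluate, n_features, iters=200, tabu_tenure=7, seed=0):
    """Tabu search over feature subsets (bit vectors of length n_features).

    A frequency memory `freq` records how often each feature appears in
    improving solutions; candidate moves that switch such features on get
    a small learning bonus. This is an illustrative sketch only.
    """
    rng = random.Random(seed)
    current = [rng.random() < 0.5 for _ in range(n_features)]
    best, best_score = current[:], evaluate(current)
    tabu = {}                     # feature index -> iteration until which its flip is tabu
    freq = [0.0] * n_features     # learning memory: credit for "promising" features

    for it in range(iters):
        candidates = []
        for j in range(n_features):
            neigh = current[:]
            neigh[j] = not neigh[j]           # neighborhood: flip one feature
            score = evaluate(neigh)
            # learning bias: small bonus for switching a reinforced feature on
            bias = 0.01 * freq[j] if neigh[j] else 0.0
            # aspiration criterion: a tabu move is allowed if it beats the best
            if tabu.get(j, -1) < it or score > best_score:
                candidates.append((score + bias, score, j, neigh))
        if not candidates:
            continue                           # all moves tabu; let tenures expire
        _, score, j, neigh = max(candidates)   # best admissible neighbor
        current = neigh
        tabu[j] = it + tabu_tenure
        if score > best_score:
            best, best_score = neigh[:], score
            for k, selected in enumerate(neigh):
                if selected:
                    freq[k] += 1.0             # reinforce features in improving subsets
    return best, best_score
```

In a real wrapper setting, `evaluate` would train and validate a classifier on the selected features, which is exactly the expensive step that motivates spending extra bookkeeping on a learning memory.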


Keywords: Feature Selection · Local Search · Tabu Search · Combinatorial Optimization Problem · Learning Mechanism



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Lucien Mousin (1)
  • Laetitia Jourdan (1)
  • Marie-Eléonore Kessaci Marmion (1)
  • Clarisse Dhaenens (1)

  1. Univ. Lille, CNRS, Centrale Lille, UMR 9189 - CRIStAL - Centre de Recherche en Informatique Signal et Automatique de Lille, Lille, France
