
Forward Feature Selection Based on Approximate Markov Blanket

  • Min Han
  • Xiaoxin Liu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7368)

Abstract

Feature selection has many applications in solving problems involving multivariate time series. A novel forward feature selection method based on an approximate Markov blanket is proposed. Relevant features are selected according to the mutual information between each feature and the output. To identify redundant features, a heuristic method is proposed to approximate the Markov blanket: a feature is deemed redundant if a Markov blanket for it exists within the already selected feature subset. Simulations on the Friedman data, the Lorenz time series, and the Gas Furnace time series demonstrate the validity of the proposed feature selection method.
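As a rough illustration of the procedure the abstract describes, the sketch below ranks candidate features by their mutual information with the output and adds them greedily, skipping any feature that has an approximate Markov blanket in the already selected subset. The abstract does not give the paper's exact redundancy criterion, so this sketch uses the common approximation (as in Yu & Liu's work cited below) that a single selected feature g subsumes f when I(g; f) ≥ I(f; y); the histogram MI estimator, function names, and threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    # Simple histogram-based estimate of I(X; Y) in nats
    # (illustrative; the paper may use a different estimator).
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def forward_select(X, y, mi_threshold=0.05):
    # Rank candidates by relevance I(f; y), then add them in order,
    # skipping any feature f that already has an approximate Markov
    # blanket in the selected subset: a selected feature g with
    # I(g; f) >= I(f; y) is treated as subsuming f (redundancy check).
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, j], y) for j in range(n_features)]
    order = np.argsort(relevance)[::-1]
    selected = []
    for j in order:
        if relevance[j] < mi_threshold:
            break  # remaining candidates are (nearly) irrelevant
        redundant = any(mutual_info(X[:, g], X[:, j]) >= relevance[j]
                        for g in selected)
        if not redundant:
            selected.append(int(j))
    return selected
```

On Friedman-style data with an exact duplicate feature appended, the redundancy check keeps only one copy of the duplicated input, since each copy is a perfect Markov blanket for the other.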

Keywords

Feature Selection · Redundancy Analysis · Markov Blanket · Mutual Information



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Min Han¹
  • Xiaoxin Liu¹
  1. Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China
