Feature Weighting Algorithm Based on Margin and Linear Programming

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7413)

Abstract

Feature selection is an important task in machine learning. In this work, we design a robust algorithm for selecting an optimal feature subset and present a global optimization technique for feature weighting. Margin-induced loss functions are introduced to evaluate features, and linear programming is employed to search for the optimal solution. The derived weights are then combined with the nearest neighbor rule. The proposed technique is evaluated on UCI data sets; compared with Simba and LMFW, it is both effective and efficient.
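
The abstract describes the method only at a high level: margin-induced losses score the features, a linear program finds the weights, and the weights rescale the nearest neighbor metric. The sketch below is a minimal illustration of that pipeline under stated assumptions, not the authors' exact formulation: each sample's margin is taken as the componentwise distance to its nearest miss minus the distance to its nearest hit, a hinge-style loss over these margins is minimized by a linear program with an assumed L1 penalty lam, and the resulting weights define a weighted nearest neighbor classifier. The function names margin_lp_weights and weighted_1nn_predict are illustrative.

```python
import numpy as np
from scipy.optimize import linprog


def margin_lp_weights(X, y, lam=0.1):
    """Learn non-negative feature weights from margin-based hinge losses via an LP.

    For each sample x_i the margin vector is
        m_i = |x_i - nearest miss| - |x_i - nearest hit|   (componentwise),
    and we solve
        min  sum_i xi_i + lam * sum_j w_j
        s.t. xi_i >= 1 - m_i . w,   xi_i >= 0,   w_j >= 0.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n, d = X.shape
    M = np.zeros((n, d))                        # per-sample margin vectors
    for i in range(n):
        diffs = np.abs(X - X[i])                # componentwise L1 distances to x_i
        dist = diffs.sum(axis=1)
        dist[i] = np.inf                        # never match the sample to itself
        hit = np.argmin(np.where(y == y[i], dist, np.inf))   # nearest same-class sample
        miss = np.argmin(np.where(y != y[i], dist, np.inf))  # nearest other-class sample
        M[i] = diffs[miss] - diffs[hit]
    # LP variables z = [w_1..w_d, xi_1..xi_n], all non-negative.
    c = np.concatenate([lam * np.ones(d), np.ones(n)])
    A_ub = np.hstack([-M, -np.eye(n)])          # encodes -m_i.w - xi_i <= -1
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (d + n), method="highs")
    return res.x[:d]


def weighted_1nn_predict(w, X_train, y_train, X_test):
    """Classify with the 1-nearest-neighbor rule under the weighted L1 metric."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    preds = [y_train[np.argmin((w * np.abs(X_train - x)).sum(axis=1))]
             for x in np.asarray(X_test, dtype=float)]
    return np.array(preds)
```

The LP has d weight variables plus one slack variable per training sample; the non-negativity constraints keep the weights interpretable, and the L1 term tends to drive irrelevant features toward zero weight, which is how a weighting scheme of this kind can also serve as a feature selector.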

References

  1. Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection–theory and algorithms. In: Proceedings of the 21st International Conference on Machine Learning, p. 40 (2004)

  2. Chen, M., Ebert, D., Hagen, H., Laramee, R.S.: Data, Information, and Knowledge in Visualization. IEEE Computer Graphics and Applications, 12–19 (2009)

  3. Liu, C., Jaeger, S., Nakagawa, M.: Offline Recognition of Chinese Characters: The State of the Art. IEEE Transactions on Pattern Analysis and Machine Intelligence 2, 198–213 (2004)

  4. Saeys, Y., Inza, I., Larranaga, P.: A review of feature selection techniques in bioinformatics. Bioinformatics 19, 2507–2517 (2007)

  5. Liu, H., Yu, L.: Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering 17, 494–502 (2005)

  6. Kohavi, R., John, G.: Wrappers for feature subset selection. Artificial Intelligence, 234–273 (1997)

  7. Pal, M.: Margin-based feature selection for hyperspectral data. International Journal of Applied Earth Observation and Geoinformation 11, 212–220 (2009)

  8. Peng, H., Long, F., Ding, C.: Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence 8, 1226–1236 (2005)

  9. Huang, D., Chow, T.W.S.: Effective feature selection scheme using mutual information. Neurocomputing 63, 325–343 (2005)

  10. Liu, H., Sun, J., Liu, L., Zhang, H.: Feature selection with dynamic mutual information. Pattern Recognition 42, 1330–1339 (2009)

  11. Li, Y., Lu, B.-L.: Feature selection based on loss-margin of nearest neighbor classification. Pattern Recognition 42, 1914–1921 (2009)

  12. Kononenko, I.: Estimating Attributes: Analysis and Extensions of RELIEF. In: Bergadano, F., De Raedt, L. (eds.) ECML 1994. LNCS, vol. 784, pp. 171–182. Springer, Heidelberg (1994)

  13. Sun, Y.: Iterative RELIEF for Feature Weighting: Algorithms, Theories, and Applications. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 1–17 (2007)

  14. Weinberger, K.Q., Blitzer, J., Saul, L.K.: Distance Metric Learning for Large Margin Nearest Neighbor Classification. Journal of Machine Learning Research, 207–244 (2009)

  15. Chen, B., Liu, H., Chai, J., Bao, Z.: Large Margin Feature Weighting Method via Linear Programming. IEEE Transactions on Knowledge and Data Engineering 10, 1475–1486 (2009)

  16. Merz, C.J., Murphy, P.: UCI repository of machine learning databases [OB/OL] (1996), http://www.ics.uci.edu/~mlearn/MLRRepository.html

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pan, W., Ma, P., Su, X. (2012). Feature Weighting Algorithm Based on Margin and Linear Programming. In: Yao, J., et al. (eds.) Rough Sets and Current Trends in Computing. RSCTC 2012. Lecture Notes in Computer Science, vol. 7413. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32115-3_46

  • DOI: https://doi.org/10.1007/978-3-642-32115-3_46

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32114-6

  • Online ISBN: 978-3-642-32115-3

  • eBook Packages: Computer Science, Computer Science (R0)
