Input selection with partial retraining

  • Part III: Learning: Theory and Algorithms
  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

In this article, we describe how input selection can be performed with partial retraining. By detecting and removing irrelevant input variables, resources are saved, generalization tends to improve, and the resulting architecture becomes easier to interpret. In our simulations, the relevant input variables were correctly separated from the irrelevant ones for both a regression and a classification problem.
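
The abstract only outlines the procedure. As a rough illustration of how such a scheme can be organized (a minimal sketch, not the authors' exact algorithm; the network, data, and hyperparameters below are invented for the example), a trained feedforward network can be probed by clamping one input at a time to zero, briefly retraining only the input-to-hidden weights, and scoring each input by how much the validation error rises:

```python
# Illustrative sketch of input selection via partial retraining.
# All details (architecture, data, learning rates) are assumptions made
# for this example, not the authors' published procedure.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: inputs 0 and 1 are relevant, 2 and 3 are pure noise.
X = rng.normal(size=(400, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=400)
X_tr, X_va, y_tr, y_va = X[:300], X[300:], y[:300], y[300:]

def init(n_in, n_hid):
    return {"W1": rng.normal(scale=0.5, size=(n_in, n_hid)),
            "b1": np.zeros(n_hid),
            "W2": rng.normal(scale=0.5, size=n_hid),
            "b2": 0.0}

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])
    return h, h @ p["W2"] + p["b2"]

def mse(p, X, y):
    return np.mean((forward(p, X)[1] - y) ** 2)

def train(p, X, y, epochs, lr, first_layer_only=False):
    # Full-batch gradient descent on the mean squared error.
    for _ in range(epochs):
        h, out = forward(p, X)
        err = out - y
        g2, gb2 = h.T @ err / len(y), err.mean()
        dh = np.outer(err, p["W2"]) * (1 - h ** 2)
        g1, gb1 = X.T @ dh / len(y), dh.mean(axis=0)
        if not first_layer_only:          # full training updates all weights
            p["W2"] -= lr * g2
            p["b2"] -= lr * gb2
        p["W1"] -= lr * g1                # partial retraining: first layer only
        p["b1"] -= lr * gb1
    return p

# 1) Train the full network on all inputs.
net = train(init(4, 8), X_tr, y_tr, epochs=2000, lr=0.1)
base = mse(net, X_va, y_va)

# 2) For each input: clamp it to zero, partially retrain, score the damage.
for i in range(X.shape[1]):
    Xi_tr, Xi_va = X_tr.copy(), X_va.copy()
    Xi_tr[:, i] = 0.0
    Xi_va[:, i] = 0.0
    p = {k: np.copy(v) if isinstance(v, np.ndarray) else v for k, v in net.items()}
    p = train(p, Xi_tr, y_tr, epochs=200, lr=0.1, first_layer_only=True)
    print(f"input {i}: validation MSE {mse(p, Xi_va, y_va):.4f} (baseline {base:.4f})")
```

With this setup, removing one of the noise inputs typically leaves the validation error close to the baseline, while removing a relevant input raises it noticeably; inputs whose removal barely hurts performance are candidates for elimination.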

Real World Computing Partnership

Foundation for Neural Networks

Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

van de Laar, P., Gielen, S., Heskes, T. (1997). Input selection with partial retraining. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020199

  • DOI: https://doi.org/10.1007/BFb0020199

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9

  • eBook Packages: Springer Book Archive
