
Evolutionary Approach to Overcome Initialization Parameters in Classification Problems

  • Conference paper
Computational Methods in Neural Modeling (IWANN 2003)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 2686))


Abstract

The design of nearest neighbour classifiers depends heavily on several crucial parameters involved in learning, such as the number of prototypes to use, the initial locations of these prototypes, and a smoothing parameter. These parameters are usually found by trial and error or by automatic methods. In this work, an evolutionary approach, the Evolutionary Nearest Neighbour Classifier (ENNC), is described. The main property of this algorithm is that it requires none of the above-mentioned parameters. The algorithm is based on the evolution of a set of prototypes that can execute several operators in order to increase their quality in a local sense, so that high classification accuracy emerges for the whole classifier.
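To make the role of these parameters concrete, the following is a minimal sketch of a plain nearest-prototype classification rule, not the authors' ENNC algorithm: the number of prototypes and their positions, hand-chosen here, are exactly the quantities the abstract says must otherwise be tuned or evolved. All names and values are illustrative assumptions.

```python
import math

def nearest_prototype(x, prototypes):
    """Return the label of the prototype closest to point x.

    `prototypes` is a list of (position, label) pairs. Their number and
    placement are the tuning parameters discussed in the abstract; this
    sketch fixes them by hand rather than evolving them as ENNC does.
    """
    best_label, best_dist = None, math.inf
    for pos, label in prototypes:
        d = math.dist(x, pos)  # Euclidean distance (Python 3.8+)
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

# Two hand-placed prototypes for a toy two-class problem.
prototypes = [((0.0, 0.0), "A"), ((1.0, 1.0), "B")]
print(nearest_prototype((0.1, 0.2), prototypes))  # A
print(nearest_prototype((0.9, 0.8), prototypes))  # B
```

Poorly placed or too few prototypes degrade accuracy sharply, which is why automating their selection, as ENNC does, matters.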




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Isasi, P., Fernandez, F. (2003). Evolutionary Approach to Overcome Initialization Parameters in Classification Problems. In: Mira, J., Álvarez, J.R. (eds) Computational Methods in Neural Modeling. IWANN 2003. Lecture Notes in Computer Science, vol 2686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44868-3_33


  • DOI: https://doi.org/10.1007/3-540-44868-3_33

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40210-7

  • Online ISBN: 978-3-540-44868-6

  • eBook Packages: Springer Book Archive
