Towards Integration of Memory Based Learning and Neural Networks

Chapter in Soft Computing in Case Based Reasoning

Abstract

We propose a hybrid prediction system that combines a neural network (NN) with memory-based reasoning (MBR). Both NN and MBR are frequently applied to data mining with various objectives: they can be applied directly to classification and regression without additional transformation mechanisms, and they are well suited to learning the dynamic behavior of a system over time. In our hybrid system, the feature weight set calculated from the trained NN plays the core role in connecting the two learning strategies, and an explanation for each prediction can be given by retrieving and presenting the most similar examples from the case base. Experimental results show that the hybrid system has high potential for solving data mining problems.



Copyright information

© 2001 Springer-Verlag London Limited

Cite this chapter

Shin, C.K., Park, S.C. (2001). Towards Integration of Memory Based Learning and Neural Networks. In: Pal, S.K., Dillon, T.S., Yeung, D.S. (eds) Soft Computing in Case Based Reasoning. Springer, London. https://doi.org/10.1007/978-1-4471-0687-6_5


  • DOI: https://doi.org/10.1007/978-1-4471-0687-6_5

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-85233-262-4

  • Online ISBN: 978-1-4471-0687-6

  • eBook Packages: Springer Book Archive
