Abstract
We propose a hybrid prediction system that combines a neural network (NN) with memory-based reasoning (MBR). Both NN and MBR are frequently applied to data mining, and both can be applied directly to classification and regression without additional transformation mechanisms. They are also well suited to learning the dynamic behavior of a system over time. In our hybrid system, a feature weight set calculated from the trained NN plays the core role in connecting the two learning strategies, and each prediction can be explained by retrieving and presenting the most similar examples from the case base. Experimental results show that the hybrid system has high potential for solving data mining problems.
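To make the coupling concrete, here is a minimal sketch in Python of one plausible realisation: per-feature weights are derived from a trained multilayer perceptron and then drive a weighted-Euclidean k-nearest-neighbour lookup, whose retrieved cases double as the explanation of the prediction. The connection-strength heuristic for the weights, the use of scikit-learn's MLPClassifier, and all function names here are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier


def feature_weights_from_nn(mlp):
    """Derive one weight per input feature from a trained MLP.

    Heuristic (an assumption, not the paper's exact formula):
    weight_i = sum_h |w_ih| * sum_o |w_ho|, normalised to sum to 1,
    so features with stronger connections into the network score higher.
    """
    w_ih = np.abs(mlp.coefs_[0])              # (n_features, n_hidden)
    w_ho = np.abs(mlp.coefs_[1]).sum(axis=1)  # (n_hidden,)
    w = w_ih @ w_ho                           # (n_features,)
    return w / w.sum()


def weighted_knn_predict(x, X_train, y_train, weights, k=3):
    """Classify x by feature-weighted Euclidean k-NN.

    Returns the majority-vote label and the indices of the k most
    similar training cases; presenting those cases is what provides
    the explanation of the prediction.
    """
    d = np.sqrt((((X_train - x) ** 2) * weights).sum(axis=1))
    nearest = np.argsort(d)[:k]
    votes = np.bincount(y_train[nearest])     # y_train: integer labels
    return votes.argmax(), nearest


# Hypothetical usage on a toy dataset:
X, y = load_iris(return_X_y=True)
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
w = feature_weights_from_nn(mlp)
label, cases = weighted_knn_predict(X[0], X, y, w)
print(label, cases)   # predicted class plus its supporting cases
```

One design point worth noting: because the same weight vector serves both as an interpretation of the NN and as the retrieval metric, the retrieved cases are similar in exactly the dimensions the network found informative, which is what ties the two learners together.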
Copyright information
© 2001 Springer-Verlag London Limited
Cite this chapter
Shin, C.K., Park, S.C. (2001). Towards Integration of Memory Based Learning and Neural Networks. In: Pal, S.K., Dillon, T.S., Yeung, D.S. (eds) Soft Computing in Case Based Reasoning. Springer, London. https://doi.org/10.1007/978-1-4471-0687-6_5
DOI: https://doi.org/10.1007/978-1-4471-0687-6_5
Publisher Name: Springer, London
Print ISBN: 978-1-85233-262-4
Online ISBN: 978-1-4471-0687-6