Pattern classification based on local learning

  • Jing Peng
  • Bir Bhanu
Poster Papers
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1451)

Abstract

Local learning methods approximate a target function (a posteriori probability) by partitioning the input space into a set of local regions, and modeling a simple input-output relationship in each one. In order for local learning to be effective for pattern classification in high dimensional settings, regions must be chosen judiciously to minimize bias. This paper presents a novel region partitioning criterion that attempts to minimize bias by capturing differential relevance in input variables in an efficient way. The efficacy of the method is validated using a variety of real and simulated data.
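
To make the idea concrete, below is a minimal sketch of local learning with a relevance-driven partition, written in Python. It is not the paper's criterion: the `relevance` score (difference in class proportion across a median split), the fixed tree depth, and the majority-class local models are all simplifying assumptions made for illustration.

```python
import numpy as np

def relevance(X, y, dim):
    """Crude relevance score for one input variable: how well splitting
    on its median separates two classes. A stand-in for the paper's
    differential-relevance criterion, which is not reproduced here."""
    split = np.median(X[:, dim])
    left, right = y[X[:, dim] <= split], y[X[:, dim] > split]
    if len(left) == 0 or len(right) == 0:
        return 0.0, split
    # Difference in class-1 proportion across the split: larger means
    # the variable is more informative about the label in this region.
    return abs(left.mean() - right.mean()), split

def build_regions(X, y, depth=3, min_size=10):
    """Recursively partition the input space, splitting on the most
    relevant variable at each step; each leaf region stores a simple
    local model (here just the majority class)."""
    if depth == 0 or len(y) < min_size or len(np.unique(y)) == 1:
        return {"leaf": True, "label": int(round(y.mean()))}
    scores = [relevance(X, y, d) for d in range(X.shape[1])]
    dim = int(np.argmax([s for s, _ in scores]))
    score, split = scores[dim]
    if score == 0.0:
        return {"leaf": True, "label": int(round(y.mean()))}
    mask = X[:, dim] <= split
    return {"leaf": False, "dim": dim, "split": split,
            "lo": build_regions(X[mask], y[mask], depth - 1, min_size),
            "hi": build_regions(X[~mask], y[~mask], depth - 1, min_size)}

def classify(node, x):
    """Route a query point to its local region and apply the local model."""
    while not node["leaf"]:
        node = node["lo"] if x[node["dim"]] <= node["split"] else node["hi"]
    return node["label"]

# Toy usage: two Gaussian classes in five dimensions, separated along the
# first variable only, so a relevance-aware splitter should prefer dim 0.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (200, 5))
X1 = rng.normal(0.0, 1.0, (200, 5))
X1[:, 0] += 3.0
X = np.vstack([X0, X1])
y = np.r_[np.zeros(200), np.ones(200)]
tree = build_regions(X, y)
acc = np.mean([classify(tree, x) == t for x, t in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

Because only the first variable carries class information in this toy problem, splits on it score highest, which is the sense in which a partitioning criterion can "capture differential relevance in input variables" and keep the local regions low-bias.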

Keywords

Input Space · Target Function · Pattern Classification · Split Point · Local Learning


Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Jing Peng¹
  • Bir Bhanu¹

  1. College of Engineering, University of California, Riverside, USA