
Dynamic Recursive Model Class Selection for Classifier Construction

  • Conference paper
Selecting Models from Data

Part of the book series: Lecture Notes in Statistics ((LNS,volume 89))

Abstract

Before applying an automated model selection procedure, one must first choose the class (or family) of models from which the model will be selected. If there is no prior knowledge about the data that indicates the best class of models, then the choice is difficult at best. In this chapter, we present an approach to automating this step in classifier construction. In addition to searching for the best model, our approach searches for the best model class, using a heuristic search strategy that finds the best model class for each recursive call of a divide-and-conquer tree induction algorithm. The end result is a hybrid tree-structured classifier, which allows different subspaces of a data set to be fit by models from different model classes. During the search for the best model, the method considers whether and why a model class is a poor choice, and selects a better class on that basis. We describe an implementation of the approach, the MCS system, and present experimental results illustrating the system's ability to identify the best model (and model class) efficiently.
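The recursive scheme sketched in the abstract can be illustrated with a minimal Python sketch. This is not the MCS implementation: the two candidate model classes (a majority-class leaf and a mean-threshold test), the accuracy score, and the splitting rule are all hypothetical stand-ins chosen for brevity. The structure it shows is the one described above: at each recursive call, every candidate model class is fit to the node's data, the best-scoring model is retained, and the data are partitioned and the procedure recurses, yielding a hybrid tree whose nodes may come from different model classes.

```python
class Majority:
    """Candidate class 1 (illustrative): predict the most common label."""
    def __init__(self, data):
        labels = [y for _, y in data]
        self.label = max(set(labels), key=labels.count)
    def predict(self, x):
        return self.label
    def split(self, data):
        return {}  # a leaf model induces no partition

class Threshold:
    """Candidate class 2 (illustrative): univariate test against the mean."""
    def __init__(self, data):
        xs = [x for x, _ in data]
        self.t = sum(xs) / len(xs)
        lo = [y for x, y in data if x < self.t] or [data[0][1]]
        hi = [y for x, y in data if x >= self.t] or [data[0][1]]
        self.lo = max(set(lo), key=lo.count)
        self.hi = max(set(hi), key=hi.count)
    def predict(self, x):
        return self.lo if x < self.t else self.hi
    def split(self, data):
        return {"lt": [(x, y) for x, y in data if x < self.t],
                "ge": [(x, y) for x, y in data if x >= self.t]}

def accuracy(model, data):
    return sum(model.predict(x) == y for x, y in data) / len(data)

def build(data, classes=(Majority, Threshold), depth=0, max_depth=2):
    """Dynamic model class selection: fit every candidate class at this
    node, keep the best-scoring model, then partition and recurse."""
    best = max((c(data) for c in classes), key=lambda m: accuracy(m, data))
    children = {}
    if depth < max_depth:
        for key, subset in best.split(data).items():
            if 0 < len(subset) < len(data):  # recurse only on proper subsets
                children[key] = build(subset, classes, depth + 1, max_depth)
    return best, children

def classify(tree, x):
    """Descend the hybrid tree until no matching child remains."""
    model, children = tree
    if children and isinstance(model, Threshold):
        key = "lt" if x < model.t else "ge"
        if key in children:
            return classify(children[key], x)
    return model.predict(x)
```

In this sketch the "whether and why a model class is a poor choice" diagnosis described in the chapter is reduced to a bare accuracy comparison; the actual MCS heuristics are richer than this.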




Copyright information

© 1994 Springer-Verlag New York, Inc.

About this paper

Cite this paper

Brodley, C.E., Utgoff, P.E. (1994). Dynamic Recursive Model Class Selection for Classifier Construction. In: Cheeseman, P., Oldford, R.W. (eds) Selecting Models from Data. Lecture Notes in Statistics, vol 89. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-2660-4_34


  • DOI: https://doi.org/10.1007/978-1-4612-2660-4_34

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-94281-0

  • Online ISBN: 978-1-4612-2660-4

  • eBook Packages: Springer Book Archive
