Generalized Additive Multi-Model for Classification and Prediction

  • Claudio Conversano
  • Roberta Siciliano
  • Francesco Mola
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)


In this paper we introduce a methodology that combines classification/prediction procedures derived from different approaches. In particular, starting from a general definition of a classification/prediction model, named Generalized Additive Multi-Model (GAM-M), we show how different types of statistical models based on parametric, semiparametric and nonparametric methods can be obtained. In our methodology the estimation procedure is based on a variant of the backfitting algorithm used for Generalized Additive Models (GAM). We benchmark the methodology and compare the results with those obtained from GAM and tree-based procedures.
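The abstract does not spell out the backfitting variant used for GAM-M estimation. As a rough illustration of plain backfitting for an additive model — with a simple binned-mean smoother standing in for the mix of parametric, tree-based and spline smoothers GAM-M would combine — the loop can be sketched as below. All function and variable names are ours, not the paper's.

```python
# Illustrative backfitting sketch for y = alpha + f1(x1) + f2(x2) + noise.
# NOT the paper's GAM-M estimator: GAM-M combines heterogeneous smoothers
# per component; here every component uses the same binned-mean smoother.
import numpy as np

def binned_mean_smoother(x, r, n_bins=20):
    """Smooth residuals r against x by averaging within equal-width bins."""
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
    means = np.zeros(n_bins)
    for b in range(n_bins):
        mask = idx == b
        means[b] = r[mask].mean() if mask.any() else 0.0
    return means[idx]

def backfit(X, y, n_iter=50):
    """Estimate additive components f_j by cyclically smoothing partial residuals."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=0) + f[j]   # residual excluding f_j
            f[j] = binned_mean_smoother(X[:, j], partial)
            f[j] -= f[j].mean()                          # identifiability: centre each f_j
    return alpha, f

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = 1.0 + np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, 500)
alpha, f = backfit(X, y)
mse = float(np.mean((y - alpha - f.sum(axis=0)) ** 2))
print(round(mse, 3))
```

Each pass smooths the partial residuals of one component against its own predictor, which is the standard backfitting cycle for GAM; the GAM-M variant described in the paper plugs different estimation procedures into the smoothing step.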


Keywords: Linear Discriminant Analysis, Regression Tree, Multivariate Adaptive Regression Spline, Smoothing Function, Semiparametric Model




  1. BREIMAN, L. (1996): Bagging predictors. Machine Learning, 26, 46–59.
  2. BREIMAN, L., FRIEDMAN, J.H., OLSHEN, R.A. and STONE, C.J. (1984): Classification and Regression Trees. Wadsworth, Belmont, CA.
  3. CONVERSANO, C. (1998): A Regression Tree Procedure for Smoothing in Generalized Additive Models. In M. Huskova et al. (eds.): Prague Stochastics '98 Abstracts, 13–14, Union of Czech Mathematicians and Physicists.
  4. CONVERSANO, C. (1999): Semiparametric Models for Supervised Classification and Prediction. Some Proposals for Model Integration and Estimation Procedures. Ph.D. Thesis in Computational Statistics and Data Analysis, Università di Napoli Federico II.
  5. CONVERSANO, C. and SICILIANO, R. (1998): A regression tree procedure for smoothing and variable selection in generalized additive models, submitted.
  6. FRIEDMAN, J.H. (1991): Multivariate adaptive regression splines. The Annals of Statistics, 19, 1–141.
  7. HASTIE, T.J. and TIBSHIRANI, R.J. (1990): Generalized Additive Models. Chapman and Hall, London.
  8. MERTENS, B.J. and HAND, D.J. (1999): Adjusted estimation for the combination of classifiers. In D. Hand et al. (eds.): Intelligent Data Analysis, IDA99 Proceedings, 317–330, Springer, Berlin.
  9. MOLA, F. (1998): Selection of cut points in generalized additive models. In M. Vichi and O. Opitz (eds.): Classification and Data Analysis: Theory and Application, 121–128, Springer Verlag, Berlin.
  10. MOLA, F. and SICILIANO, R. (1997): A fast splitting procedure for classification trees. Statistics and Computing, 7, 208–216.
  11. SICILIANO, R. and MOLA, F. (1994): Modelling for recursive partitioning and variable selection. In R. Dutter and R. Grossman (eds.): Compstat '94 Proceedings, 172–177, Physica-Verlag, Heidelberg.

Copyright information

© Springer-Verlag Berlin · Heidelberg 2000

Authors and Affiliations

  • Claudio Conversano (1)
  • Roberta Siciliano (2)
  • Francesco Mola (1)
  1. Dipartimento di Matematica e Statistica, Università di Napoli Federico II, Napoli, Italy
  2. Dipartimento di Economia, Università di Cagliari, Cagliari, Italy
