Computational Comparisons

  • Bertrand Clarke
  • Ernest Fokoué
  • Hao Helen Zhang
Chapter
Part of the Springer Series in Statistics book series (SSS)

Up to this point, a great variety of methods for regression and classification have been presented. Recall that, for regression, there were the Early methods such as bin smoothers and running line smoothers, Classical methods such as kernel, spline, and nearest-neighbor methods, New Wave methods such as additive models, projection pursuit, neural networks, trees, and MARS, and Alternative methods such as ensembles, relevance vector machines, and Bayesian nonparametrics. In the classification setting, apart from treating a classification problem as a regression problem with the function taking only the values zero and one, the main techniques seen here are linear discriminant analysis, tree-based classifiers, support vector machines, and relevance vector machines. All of these methods are in addition to the various versions of linear models assumed to be familiar.
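The comparisons in this chapter rest on fitting several of these methods to the same data and contrasting their training and test errors. A minimal sketch of that workflow is given below, assuming scikit-learn as the toolkit; the data-generating function, the choice of a nearest-neighbor regressor and a penalized spline as the two competitors, and all tuning settings are illustrative assumptions, not specifications taken from the book.

```python
# Illustrative sketch: compare two regression methods by training and test error.
# The sin(x) signal, noise level, and tuning constants below are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import Ridge
from sklearn.preprocessing import SplineTransformer
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=400).reshape(-1, 1)
y = np.sin(x).ravel() + rng.normal(scale=0.3, size=400)  # assumed true signal: sin(x)

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.5, random_state=0)

models = {
    "k-nearest neighbors (k=10)": KNeighborsRegressor(n_neighbors=10),
    "penalized cubic spline": make_pipeline(
        SplineTransformer(degree=3, n_knots=20), Ridge(alpha=1.0)
    ),
}

for name, model in models.items():
    model.fit(x_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(x_train))
    test_err = mean_squared_error(y_test, model.predict(x_test))
    print(f"{name}: training error = {train_err:.3f}, test error = {test_err:.3f}")
```

In practice the comparison would be repeated over many simulated data sets, or over cross-validation folds of a real data set, so that the reported errors reflect more than one random split.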

Keywords

Decision Boundary; Test Error; Training Error; Stepwise Linear Regression; Single Hidden Layer
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag New York 2009

Authors and Affiliations

  • Bertrand Clarke (1)
  • Ernest Fokoué (2)
  • Hao Helen Zhang (3)
  1. University of Miami, Miami, USA
  2. Department of Science & Mathematics, Kettering University, Flint, USA
  3. Department of Statistics, Program in Statistical Genetics, North Carolina State University, Raleigh, USA