Encyclopedia of Systems Biology

2013 Edition
| Editors: Werner Dubitzky, Olaf Wolkenhauer, Kwang-Hyun Cho, Hiroki Yokota


  • Daniel Berrar
  • Werner Dubitzky
Reference work entry
DOI: https://doi.org/10.1007/978-1-4419-9863-7_601



Overfitting refers to the process of learning specific idiosyncrasies of a training set, such as spurious artifacts or random noise, which results in an over-adaptation to the training set and therefore in a degraded ability to generalize to new, unseen data (Duda et al. 2001). Such an over-adapted or overtrained model is called overfitted.


A central goal in predictive analysis is the identification of a model with good generalization ability for new, unseen data. It is a fundamental tenet that the training and test data originate from the same distribution (stationarity assumption). However, if a model adapts too well to the idiosyncrasies of the training data, then the model will not generalize well to new cases. Hence, the model is said to be overfitted to the training set.
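This effect can be illustrated numerically. The following sketch (not part of the original entry; the toy data and model degrees are illustrative choices) fits a moderately flexible and a highly flexible polynomial to the same small, noisy sample of a smooth target function. The flexible model achieves a lower training error by adapting to the noise, but its error on an independent test sample is much worse than its training error suggests:

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Hypothetical smooth target function underlying the noisy observations
    return np.sin(2 * np.pi * x)

# Small noisy training sample and a larger independent test sample
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = target(x_train) + rng.normal(0, 0.3, 20)
x_test = np.sort(rng.uniform(0, 1, 200))
y_test = target(x_test) + rng.normal(0, 0.3, 200)

def mse(coeffs, x, y):
    # Mean squared error of a fitted polynomial on (x, y)
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A simple (degree-3) and a highly flexible (degree-12) polynomial model
simple = np.polyfit(x_train, y_train, 3)
flexible = np.polyfit(x_train, y_train, 12)

train_simple, test_simple = mse(simple, x_train, y_train), mse(simple, x_test, y_test)
train_flex, test_flex = mse(flexible, x_train, y_train), mse(flexible, x_test, y_test)
```

Here the flexible model's near-zero training error is a symptom of overfitting: it has memorized the noise in the 20 training points, so its test error is substantially larger than its training error.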

Two important characteristics of a model are its bias and its variance (see Analysis of Variance), which are in a trade-off relationship, that is, reducing the one typically increases the other (Hastie et al. 2001).


  1. Duda RO, Hart PE, Stork DG (2001) Pattern classification, 2nd edn. Wiley-Interscience, New York
  2. Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning. Springer series in statistics. Springer, New York

Copyright information

© Springer Science+Business Media, LLC 2013

Authors and Affiliations

  1. Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, Midori-ku, Japan
  2. Biomedical Sciences Research Institute, University of Ulster, Coleraine, UK