The Hypothesis of Elementary Errors

  • Hans Fischer
Part of the Sources and Studies in the History of Mathematics and Physical Sciences book series (SHMP)


In the framework of classical probability theory, the primary objective was to calculate probabilities of certain events, with the aim of making “rational” decisions based on these probabilities. Error or frequency functions only played the role of auxiliary subjects. This paradigm, however, would change fundamentally during the course of the 19th century. In the field of biological statistics, for example, probability distributions became an independent object of research. In this context, it was the prevailing opinion for a long time that almost all quantities in nature obeyed normal distributions. To justify the apparently privileged role of the normal distribution, a model was used in most cases which had originally arisen from error theory: the hypothesis of elementary errors. A random quantity obeying this hypothesis was assumed to be additively composed of a very large number of independent elements, each of them being insignificant compared with the total sum. In this case, the central limit theorem (CLT) guaranteed an approximate normal distribution of the random quantity under consideration.
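The hypothesis of elementary errors can be illustrated with a small simulation, not taken from the text: a quantity is formed as the sum of many independent elementary errors, here assumed for concreteness to be uniform on a small interval, and the empirical distribution of the sum is compared against the normal approximation predicted by the CLT. The function name and all parameter values are illustrative choices.

```python
import random
import statistics

def elementary_error_sum(n=1000, h=0.01, rng=random):
    """One observation composed of n independent elementary errors,
    each uniform on [-h, h] and thus insignificant relative to the sum."""
    return sum(rng.uniform(-h, h) for _ in range(n))

random.seed(0)
samples = [elementary_error_sum() for _ in range(5000)]

# By the CLT the sum is approximately normal with mean 0 and
# variance n * h**2 / 3, since Var(Uniform(-h, h)) = h**2 / 3.
mean = statistics.fmean(samples)          # should be near 0
std = statistics.pstdev(samples)          # should be near sqrt(1000 * 0.01**2 / 3) ~ 0.183

# Rough normality check: about 68% of samples within one standard deviation.
frac_within_one_std = sum(abs(x) < std for x in samples) / len(samples)
```

Running this, the empirical mean, standard deviation, and one-sigma coverage agree closely with the normal prediction, even though the individual elementary errors are uniform rather than normal.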





Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Mathematisch-Geographische Fakultät, Katholische Universität Eichstätt-Ingolstadt, Eichstätt, Germany
