Important Probability Distributions

Analysis of Neural Data

Part of the book series: Springer Series in Statistics ((SSS))


Abstract

In Chapter 1 we said that a measurement is determined in part by a “signal” of interest, and in part by unknown factors we may call “noise.” Statistical models introduce probability distributions to describe the variation due to noise, and thereby achieve quantitative expressions of knowledge about the signal—a process we will describe more fully in Chapters 7 and 10.



Notes

  1.

    Additional comments on this method, and its use in analysis of synaptic plasticity, may be found in Faber and Korn (1991).

  2.

    The derivation of the Poisson distribution as an approximation to the binomial is credited to Siméon D. Poisson; it appeared in his book published in 1837. Bortkiewicz (1898, The Law of Small Numbers) emphasized the importance of the Poisson distribution as a model of rare events.

  3.

    Rutherford et al. (1920, p. 172); cited in Feller (1968).

  4.

    He actually found the “probable error,” which is \(.6745\sigma \), to be 48.4 s. See Stigler (1986) for a discussion of these data.

  5.

    Actually, different authors give somewhat different advice. The acceptability of this or any other approximation must depend on the particular use to which it will be put. For computing the probability that a Poisson random variable will fall within 1 standard deviation of its mean, the normal approximation has an error of less than 10% when \(\lambda = 15\). However, it will not be suitable for calculations that go far out into the tails, or that require several digits of accuracy. In addition, a computational fine point is mentioned in many books. Suppose we wish to approximate a discrete cdf \(F(x)\) by a normal, say \(\tilde{F}(x)\). Then the value \(\tilde{F}(x+.5)\) is generally closer to \(F(x)\) than is \(\tilde{F}(x)\). This is sometimes called a continuity correction.
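    Both claims in this note can be checked numerically. The following sketch (using SciPy; the evaluation point \(x = 12\) is an arbitrary choice for illustration) computes the within-1-standard-deviation probability for \(\lambda = 15\) and compares the plain and continuity-corrected normal approximations to the Poisson cdf:

    ```python
    import numpy as np
    from scipy.stats import norm, poisson

    lam = 15
    mu, sd = lam, np.sqrt(lam)

    # probability that a Poisson(15) variable falls within 1 sd of its mean
    exact = poisson.cdf(mu + sd, lam) - poisson.cdf(mu - sd, lam)
    approx = norm.cdf(1) - norm.cdf(-1)        # normal approximation
    rel_err = abs(approx - exact) / exact      # under 10%, as the note states

    # continuity correction: F~(x + .5) is closer to F(x) than F~(x) is
    x = 12
    F = poisson.cdf(x, lam)
    plain = norm.cdf(x, mu, sd)
    corrected = norm.cdf(x + 0.5, mu, sd)
    print(rel_err, abs(corrected - F) < abs(plain - F))
    ```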

  6.

    Another reason the exponential distribution is special is that among all distributions on \((0,\infty )\) with mean \(\mu =1/\lambda \), the \({\textit{Exp}}(\lambda )\) distribution has the maximum entropy. See Eq. (4.33).
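    A quick numerical illustration of the maximum-entropy claim (a sketch using SciPy; the gamma distribution below is just one arbitrary competitor on \((0,\infty )\) with the same mean):

    ```python
    from scipy.stats import expon, gamma

    mu = 1.0  # common mean, mu = 1/lambda

    # differential entropy of Exp(lambda) is 1 - ln(lambda); here lambda = 1
    h_exp = float(expon(scale=mu).entropy())

    # a gamma distribution with the same mean has strictly lower entropy
    h_gam = float(gamma(a=2.0, scale=mu / 2.0).entropy())

    print(h_exp, h_gam)  # h_exp = 1.0 exceeds h_gam
    ```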

  7.

    The memoryless property can also be stated analogously for discrete distributions; in the discrete case only the geometric distributions are memoryless.
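    Both the continuous and discrete versions of the memoryless property can be verified with survival functions; this sketch uses SciPy, with arbitrary values of \(\lambda \), \(p\), and the conditioning points:

    ```python
    import numpy as np
    from scipy.stats import expon, geom

    # continuous case: for X ~ Exp(lambda), P(X > s + t | X > s) = P(X > t)
    X = expon(scale=1 / 2.0)                 # lambda = 2
    s, t = 1.0, 0.7
    lhs_c = X.sf(s + t) / X.sf(s)            # conditional survival probability
    rhs_c = X.sf(t)

    # discrete case: for X ~ Geometric(p) on {1, 2, ...}, the same identity
    # holds with integer shifts
    G = geom(p=0.3)
    m, k = 4, 3
    lhs_d = G.sf(m + k) / G.sf(m)
    rhs_d = G.sf(k)

    print(lhs_c, rhs_c)   # equal
    print(lhs_d, rhs_d)   # equal
    ```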

  8.

    It may be shown that \(\hat{\rho }_{XY|U}\) is equal to the correlation between the pair of residual vectors found from the multiple regressions (see Chapter 12) of \(x\) on \(u\) and \(y\) on \(u\).

  9.

    In fact, \(\hat{\rho }_{XY|U}\) is the maximum likelihood estimate; maximum likelihood estimation is discussed in Chapter 7.
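    The residual-based characterization of \(\hat{\rho }_{XY|U}\) in note 8 can be checked numerically. This sketch (simulated data with arbitrary illustrative coefficients) compares the usual sample partial-correlation formula with the correlation of least-squares residuals:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    u = rng.normal(size=n)
    x = 0.8 * u + rng.normal(size=n)   # arbitrary illustrative coefficients
    y = 0.6 * u + rng.normal(size=n)

    # sample partial correlation from the pairwise correlation matrix
    r = np.corrcoef(np.vstack([x, y, u]))
    rxy, rxu, ryu = r[0, 1], r[0, 2], r[1, 2]
    partial = (rxy - rxu * ryu) / np.sqrt((1 - rxu**2) * (1 - ryu**2))

    # correlation of residuals from regressing x on u and y on u (with intercept)
    U = np.column_stack([np.ones(n), u])
    rx = x - U @ np.linalg.lstsq(U, x, rcond=None)[0]
    ry = y - U @ np.linalg.lstsq(U, y, rcond=None)[0]
    resid_corr = np.corrcoef(rx, ry)[0, 1]

    print(partial, resid_corr)   # identical up to floating-point error
    ```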

Author information

Correspondence to Robert E. Kass.


Copyright information

© 2014 Springer Science+Business Media New York

About this chapter

Cite this chapter

Kass, R.E., Eden, U.T., Brown, E.N. (2014). Important Probability Distributions. In: Analysis of Neural Data. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-9602-1_5
