
Bayesian Inference

Part of the book series: Human–Computer Interaction Series (HCIS)

Abstract

Bayesian inference has a long-standing history in the world of statistics, and this chapter serves as an introduction for anyone who has not been formally introduced to the topic before. First, Bayesian inference is introduced using a simple, analytically tractable example. Then, computational methods are introduced. Examples are provided for common HCI problems, such as comparing two groups on a binary or a numeric variable, as well as building a regression model.


Notes

  1.

    Priors with higher variance can be considered less informative in this setting.

  2.

    A more formal version of the likelihood would be \(p(\{y_1,\ldots,y_n\}|\theta) = \prod_{i} \theta^{y_i} (1-\theta)^{(1-y_i)}\), where the set \(D = \{y_1,\ldots,y_n\}\) represents the outcomes for the sequence of attempts to turn on the device (Kruschke 2013). A short numerical sketch of this likelihood appears after these notes.

  3.

    In the Beta/Binomial approach, the prior is defined using the Beta distribution's probability density function (PDF). The simplified form of the Beta PDF (for this type of problem) is \(p(\theta|\alpha,\beta) \propto \theta^{\alpha-1} (1-\theta)^{\beta-1}\). Assuming that the friend told the child that he/she has seen these devices turn on ten times (\(\alpha = 10\)) and fail to turn on two times (\(\beta = 2\)), our prior would be \(p(\theta|\alpha = 10, \beta = 2) \propto \theta^{10-1} (1-\theta)^{2-1}\). The likelihood function is based on the Bernoulli distribution with 1 success and 0 failures, expressed as \(p(D|\theta) \propto \theta^{1} (1-\theta)^{0}\). Using Bayes' Rule we can combine the likelihood and prior to produce the posterior distribution: \(p(\theta|D) \propto p(D|\theta)\, p(\theta) = \theta^{10-1} (1-\theta)^{2-1}\, \theta^{1} (1-\theta)^{0} = \theta^{10} (1-\theta)^{1}\). The posterior density is a Beta density that we can easily interpret if we calculate its \(\alpha\) and \(\beta\) parameters: \(\alpha = 10+1 = 11\) and \(\beta = 1+1 = 2\). As such, the mean for \(\theta\) is \(M = \alpha/(\alpha+\beta) = 11/(11+2) = 0.846\); that is, the child's belief that the device will turn on is centered at 84.6 %. The standard deviation is \(SD = \sqrt{\frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}} \approx 0.096\). A 95 % probability interval based on the exact Beta quantiles places the child's belief in the device turning on roughly between 61 % and 99 % (the normal approximation \(0.846 \pm 1.96 \times 0.096\) slightly overshoots the upper bound of 1 because the posterior is skewed). A numerical sketch of this calculation appears after these notes.

  4.

    An alternative approach to solving the problem would be to use Bayesian probit regression (Jackman 2009).
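
The Bernoulli likelihood in note 2 is simple to evaluate numerically. The following short Python sketch (Python and NumPy are used purely for illustration here, and the ten-outcome vector is hypothetical) computes \(p(D|\theta)\) for a sequence of attempts at a few candidate values of \(\theta\).

    # Bernoulli likelihood from note 2: p(D|theta) = prod_i theta^y_i * (1-theta)^(1-y_i)
    import numpy as np

    def bernoulli_likelihood(y, theta):
        """Joint likelihood of the 0/1 outcomes in y for a given success probability theta."""
        y = np.asarray(y)
        return float(np.prod(theta ** y * (1.0 - theta) ** (1 - y)))

    # Hypothetical outcomes for ten attempts to turn on the device (1 = turned on, 0 = did not).
    y = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
    for theta in (0.5, 0.7, 0.8, 0.9):
        print(f"theta = {theta:.1f}  likelihood = {bernoulli_likelihood(y, theta):.6f}")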
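
The Beta/Binomial update in note 3 can be checked numerically as well. The sketch below assumes Python with SciPy available and mirrors the numbers in the note: it performs the conjugate update and reports the posterior mean, standard deviation, and a 95 % interval taken from the exact Beta quantiles.

    # Beta/Binomial conjugate update from note 3.
    from scipy import stats

    alpha_prior, beta_prior = 10, 2    # friend's report: 10 successes, 2 failures
    successes, failures = 1, 0         # child's observation: the device turned on once

    # A Beta prior combined with Bernoulli data yields a Beta posterior.
    posterior = stats.beta(alpha_prior + successes, beta_prior + failures)  # Beta(11, 2)

    print(f"posterior mean = {posterior.mean():.3f}")   # about 0.846
    print(f"posterior sd   = {posterior.std():.3f}")    # about 0.096
    lo, hi = posterior.ppf([0.025, 0.975])              # exact 95 % Beta quantiles
    print(f"95% interval   = [{lo:.3f}, {hi:.3f}]")     # roughly [0.61, 0.99]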

References

  • Albert J (2009) Bayesian computation with R. Use R! series, Springer, New York

  • Aldrich J (2008) R.A. Fisher on Bayes and Bayes' theorem. Bayesian Anal 3(1):161–170

  • Chung KL, AitSahlia F (2003) Elementary probability theory: with stochastic processes and an introduction to mathematical finance. Springer Undergraduate Texts in Mathematics and Technology, Springer

  • Cowles MK, Carlin BP (1996) Markov chain Monte Carlo convergence diagnostics: a comparative review. J Am Stat Assoc 91(434):883–904

  • Gelman A, Rubin DB (1992) Inference from iterative simulation using multiple sequences. Stat Sci 7(4):457–472

  • Gilks WR (2005) Markov chain Monte Carlo. In: Encyclopedia of biostatistics. Wiley

  • Imai K, King G, Lau O (2008) Toward a common framework for statistical analysis and development. J Comput Graph Stat 17(4):892–913

  • Jackman S (2009) Bayesian analysis for the social sciences. Wiley, Hoboken, NJ

  • Jaynes ET (2003) Probability theory: the logic of science. Cambridge University Press, Cambridge

  • Kruschke JK (2010) Doing Bayesian data analysis: a tutorial with R and BUGS, vol 1. Academic Press

  • Kruschke JK (2013) Bayesian estimation supersedes the t test. J Exp Psychol Gen 142(2):573–603

  • Lee MD, Wagenmakers EJ (2014) Bayesian cognitive modeling: a practical course. Cambridge University Press, Cambridge

  • Lynch SM (2007) Introduction to applied Bayesian statistics and estimation for social scientists. Springer

  • McGrayne SB (2011) The theory that would not die: how Bayes' rule cracked the Enigma code, hunted down Russian submarines, and emerged triumphant from two centuries of controversy. Yale University Press

  • Muchnik L, Aral S, Taylor SJ (2013) Social influence bias: a randomized experiment. Science 341(6146):647–651

  • Plummer M, Best N, Cowles K, Vines K (2006) CODA: convergence diagnosis and output analysis for MCMC. R News 6(1):7–11

  • Robert C, Casella G (2011) A short history of Markov chain Monte Carlo: subjective recollections from incomplete data. Stat Sci 26(1):102–115

  • Rossi PE, Allenby GM, McCulloch R (2005) Bayesian statistics and marketing. Wiley

  • Triantafyllopoulos K, Pikoulas J (2002) Multivariate Bayesian regression applied to the problem of network security. J Forecast 21(8):579–594

  • Trusov M, Bodapati AV, Bucklin RE (2010) Determining influential users. J Mark Res XLVII:643–658

  • Tsikerdekis M (2013) Dynamic voting interface in social media: does it affect individual votes? In: van Emde Boas P, Groen FCA, Italiano GF, Nawrocki J, Sack H (eds) SOFSEM 2013: theory and practice of computer science. Springer, pp 552–563

  • Volf P, Jakubuv J, Koranda L, Sislak D, Pechoucek M, Mereu S, Hilburn B, Nguyen DN (2014) Validation of an air-traffic controller behavioral model for fast time simulation. In: 2014 Integrated Communications, Navigation and Surveillance Conference (ICNS). IEEE, pp T1-1–T1-9

  • Wagenmakers EJ, Lee MD, Lodewyckx T, Iverson G (2008) Bayesian versus frequentist inference. In: Hoijtink H, Klugkist I, Boelen PA (eds) Bayesian evaluation of informative hypotheses in psychology. Springer, New York, pp 181–207

Author information


Correspondence to Michail Tsikerdekis.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Tsikerdekis, M. (2016). Bayesian Inference. In: Robertson, J., Kaptein, M. (eds) Modern Statistical Methods for HCI. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-26633-6_8

  • DOI: https://doi.org/10.1007/978-3-319-26633-6_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-26631-2

  • Online ISBN: 978-3-319-26633-6
