Encyclopedia of Evolutionary Psychological Science

Living Edition
| Editors: Todd K. Shackelford, Viviana A. Weekes-Shackelford

Biases

  • Dallas Novakowski
  • Sandeep Mishra
Living reference work entry
DOI: https://doi.org/10.1007/978-3-319-16999-6_627-1

A cognitive bias refers to any systematic deviation from accuracy or “rationality” in judgment and decision-making. Commonly examined biases include the following: confirmation bias (the tendency to interpret new evidence as confirming one’s existing beliefs); hindsight bias (the tendency to overestimate one’s ability to have forecasted known outcomes); base rate neglect (the tendency to ignore general frequency information and instead focus on case-specific information); and sexual over-perception (the tendency for men to over-perceive others’ sexual interest in them). Simon (1955) first proposed that people rely on simple mental shortcuts (heuristics) in judgment and decision-making, which can produce what appears to be biased decision-making. The “heuristics-and-biases program” achieved greater prominence with the publication of Tversky and Kahneman’s (1974) seminal Science paper. Tversky and Kahneman’s work largely treated heuristics and biases as errors in judgment and decision-making; that is, as violations of the standard utility-maximizing “rational” economic model of decision-making. However, this approach of understanding heuristics and biases as “errors” rather than as products of adaptive decision-making processes has been widely criticized, especially by evolution-minded researchers.

A more contemporary, evolution-informed understanding suggests that cognitive biases may arise for three possible reasons. First, the mental shortcuts (heuristics) that reflect bounded rationality may break down in systematic ways, leading to errors. Second, some errors in decision-making may have carried (historically) asymmetric fitness costs, with the consequence that organisms evolved to favor the errors with the lowest net cost (manifesting in error management biases). Third, people may appear to make systematic errors because the laboratory tasks or evaluation standards used to measure such “errors” are unnatural and incompatible with the design of the human mind (manifesting in artifacts; reviewed in Haselton et al. 2009). In the following sections, we briefly review each source of cognitive bias.

Biases as a Product of Bounded Rationality

Traditional economic utility-maximizing models of decision-making assume that actors have accurate and complete information, enough time to assess all decision alternatives, and enough cognitive capacity to work through a complete deductive process. The bounded rationality approach instead suggests that most decisions are products of simple heuristics that are cognitively efficient and robust across diverse environments (Gigerenzer and Selten 2002). Moreover, research has demonstrated that heuristics are often ecologically rational; that is, they perform well in the real-world environments to which they are matched. Some evidence suggests that heuristics operating on limited information can match or even outperform decisions made with complete information (e.g., Gigerenzer and Goldstein 1996; reviewed in Todd and Gigerenzer 2007). However, while effective in a wide variety of circumstances, heuristics are prone to breaking down in systematic ways, leading to biases in judgment and decision-making (Tversky and Kahneman 1974).
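To illustrate how such a frugal heuristic can decide on the basis of limited information, the sketch below implements the basic logic of the take-the-best heuristic studied by Gigerenzer and Goldstein (1996): cues are checked in descending order of validity, and search stops at the first cue that discriminates between the options. The function name, cue names, and data are hypothetical; this is a minimal illustration rather than a reproduction of the original simulations.

```python
def take_the_best(cues_a, cues_b, cue_order):
    """Choose between options A and B by checking cues one at a time,
    in descending order of cue validity, stopping at the first cue that
    discriminates. Cue values: 1 (positive), 0 (negative), None (unknown)."""
    for cue in cue_order:
        a, b = cues_a.get(cue), cues_b.get(cue)
        if a is None or b is None or a == b:
            continue                  # cue does not discriminate; try the next one
        return "A" if a > b else "B"  # first discriminating cue decides
    return None                       # no cue discriminates: guess

# Hypothetical cues for judging which of two cities is larger.
cue_order = ["national_capital", "has_top_league_team", "has_university"]
city_a = {"national_capital": 0, "has_top_league_team": 1, "has_university": 1}
city_b = {"national_capital": 0, "has_top_league_team": 0, "has_university": 1}

print(take_the_best(city_a, city_b, cue_order))  # "A": settled by the second cue
```

Because a single discriminating cue settles the comparison, the heuristic ignores most of the available information, yet in suitably structured environments such frugal rules can predict about as well as full-information models (Todd and Gigerenzer 2007).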

The hindsight bias, for example, has been shown to be at least partly a product of bounded rationality: it has been hypothesized to result from a memory-updating heuristic that operates when decision-makers lack access to memory cues. Supporting this hypothesis, the hindsight bias is reduced when participants are given relevant memory cues (reviewed in Blank and Nestler 2007). Some theorists argue that biases attributed to heuristics instead reflect methodological neglect of decision environments: cognitive abilities were designed for specific environments and problems, and they cannot be properly understood or assessed when studied out of context (e.g., Gigerenzer et al. 1999). Expanding on the importance of ecological rationality, evolutionary theorists posit that ancestral environments should also be considered when investigating biases.

Error Management Biases

Error management theory (Haselton and Nettle 2006) suggests that some biases reflect adapted preferences for the errors with lower fitness costs. When inferring the presence or absence of a target from a noisy stimulus, judgments can be falsely positive or falsely negative, and under uncertainty the two error types cannot be simultaneously eliminated: reducing one necessarily increases the other. Signal detection theory (Tanner and Swets 1954) shows that a decision criterion should therefore be biased toward committing the less costly error. Error management theory integrates this signal-detection logic with evolutionary reasoning about which errors were costlier over evolutionary time.
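The signal-detection logic can be made concrete with a brief numerical sketch. For an equal-variance Gaussian detection model, the criterion that minimizes expected cost depends on the target’s base rate and on the relative costs of false alarms and misses; when misses are the costlier error, the optimal criterion shifts so that the system commits more false alarms. The function and all numbers below are illustrative assumptions, not values from the entry or the cited studies.

```python
import math

def optimal_criterion(d_prime, p_signal, cost_false_alarm, cost_miss):
    """Expected-cost-minimizing criterion for equal-variance Gaussian signal
    detection (noise mean 0, signal mean d_prime, sd 1), assuming correct
    responses carry no cost. Respond "target present" whenever the observation
    exceeds the returned threshold."""
    beta = ((1 - p_signal) * cost_false_alarm) / (p_signal * cost_miss)
    return d_prime / 2 + math.log(beta) / d_prime

# Symmetric costs: the criterion sits halfway between the two distributions.
print(optimal_criterion(d_prime=1.0, p_signal=0.5,
                        cost_false_alarm=1.0, cost_miss=1.0))   # 0.5

# A miss (false negative) ten times costlier than a false alarm: the criterion
# drops, so the detector is "biased" toward making false alarms instead.
print(optimal_criterion(d_prime=1.0, p_signal=0.5,
                        cost_false_alarm=1.0, cost_miss=10.0))  # ~ -1.80
```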

Error management theory posits that biases are shaped by historical asymmetries in fitness costs. For example, the sexual over-perception bias can be explained by the costs involved in searching for a receptive mating partner (Haselton and Buss 2000). Falsely perceiving sexual interest (a false positive) costs an individual wasted courtship effort, whereas failing to perceive genuine sexual interest (a false negative) costs a reproductive opportunity. Because men do not bear the obligatory metabolic investment of gestation and lactation, a missed reproductive opportunity carried a far larger fitness cost for ancestral men than a wasted courtship attempt (Haselton and Buss 2000); selection should therefore have favored men who erred toward over-perceiving sexual interest. In short, error management theory holds that certain biases exist because recurrent adaptive problems carried asymmetric fitness consequences. Biases can also arise, however, when humans reason in environments that are unfamiliar to the evolved mind.

Biases as Artifacts

Some evolutionary theorists argue that many documented instances of bias are actually artifacts of unnatural research designs. These theorists argue that humans are functionally rational when reasoning about evolutionarily relevant information (e.g., Cosmides and Tooby 1996). However, apparent biases can occur when either the problem format or the problem content is incompatible with a mind shaped by natural selection.

Human bias is often inferred from inaccuracy in estimating probabilities or likelihoods (Tversky and Kahneman 1974, 1983). Biases stemming from apparent innumeracy, for example, might occur only because humans are not well suited to reasoning with percentages and single-event probabilities. Humans should be more proficient at estimating likelihoods when information is framed in terms of discrete, naturally occurring events (Gigerenzer 1997; Cosmides and Tooby 1996). An organism deciding whether to hunt a particular animal must, in some sense, estimate its probability of success; however, ancestral organisms never encountered success rates as percentage values or odds ratios. Instead, successes and failures were likely experienced and recalled as individual events. Someone projecting the success of a hunting attempt, for example, would more plausibly draw on the frequency of recent outcomes (e.g., three of the last ten hunting attempts were successful) than on a stated percentage (e.g., a 30% success rate). Consistent with this hypothesis, people make more accurate probabilistic (Bayesian) inferences when problems are framed as natural frequencies rather than as probabilities (reviewed in Gigerenzer and Hoffrage 2007). The extant research thus suggests that many apparent instances of bias are not necessarily errors; rather, they may be better understood as consequences of an incompatibility between the stimuli presented and the design of the mind.
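A brief worked sketch (with hypothetical numbers, not drawn from the entry or the cited studies) shows why the probability and natural-frequency formats are informationally equivalent yet place very different demands on the reasoner: the probability format requires explicit application of Bayes’ theorem, whereas the frequency format lets the posterior be read off as a simple comparison of event counts.

```python
from fractions import Fraction

# Probability format: base rate, hit rate, and false-positive rate must be
# combined via Bayes' theorem. (Illustrative numbers for a screening test.)
p_disease     = Fraction(1, 100)    # 1% base rate
p_pos_disease = Fraction(80, 100)   # P(positive test | disease)
p_pos_healthy = Fraction(10, 100)   # P(positive test | no disease)

posterior = (p_disease * p_pos_disease) / (
    p_disease * p_pos_disease + (1 - p_disease) * p_pos_healthy)

# Natural-frequency format: the same information as counts of events in a
# reference population of 1,000 people. The answer is read off directly as
# "8 of the 107 people who test positive actually have the disease."
sick_and_positive    = 8    # 1% of 1,000 = 10 sick; 80% of them test positive
healthy_and_positive = 99   # 10% of the 990 healthy people test positive

posterior_from_counts = Fraction(sick_and_positive,
                                 sick_and_positive + healthy_and_positive)

print(posterior, float(posterior))  # 8/107 ≈ 0.0748
print(posterior_from_counts)        # 8/107, identical to the Bayesian result
```

In the frequency format, the correct answer requires only keeping track of counts of concrete events, which is the kind of information an evolved mind is proposed to handle readily.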

The heuristics-and-biases program has demonstrated that human reasoning often deviates from utility maximization and from norms of rationality. Biases can affect outcomes in judicial, economic, and medical settings (e.g., Harley 2007), so understanding the functional, evolutionary reasons for bias is important for developing effective interventions. Theories that treat biases merely as flaws in the human mind have limited explanatory power. By contrast, ecological and evolutionary theories have produced many testable (and empirically supported) predictions about the nature of bias. In particular, the cognitive mechanisms that manifest in biases reflect both the historical (and contemporary) structure of decision environments and the formats and contents of the problems people face.

References

  1. Blank, H., & Nestler, S. (2007). Cognitive process models of hindsight bias. Social Cognition, 25, 132–146.
  2. Cosmides, L., & Tooby, J. (1996). Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition, 58, 1–73.
  3. Gigerenzer, G. (1997). Ecological intelligence: An adaptation for frequencies. Psychologische Beiträge, 39, 107–125.
  4. Gigerenzer, G., & Goldstein, D. G. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103, 650–669.
  5. Gigerenzer, G., & Hoffrage, U. (2007). The role of representation in Bayesian reasoning: Correcting common misconceptions. Behavioral and Brain Sciences, 30, 264–267.
  6. Gigerenzer, G., & Selten, R. (2002). Bounded rationality: The adaptive toolbox. Cambridge, MA: MIT Press.
  7. Gigerenzer, G., Todd, P. M., & the ABC Research Group. (1999). Simple heuristics that make us smart. New York: Oxford University Press.
  8. Harley, E. M. (2007). Hindsight bias in legal decision making. Social Cognition, 25, 48–63.
  9. Haselton, M. G., Bryant, G. A., Wilke, A., Frederick, D. A., Galperin, A., Frankenhuis, W. E., & Moore, T. (2009). Adaptive rationality: An evolutionary perspective on cognitive bias. Social Cognition, 27, 733–763.
  10. Haselton, M. G., & Buss, D. M. (2000). Error management theory: A new perspective on biases in cross-sex mind reading. Journal of Personality and Social Psychology, 78, 81–91.
  11. Haselton, M. G., & Nettle, D. (2006). The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review, 10, 47–66.
  12. Simon, H. A. (1955). A behavioral model of rational choice. The Quarterly Journal of Economics, 69, 99–118.
  13. Tanner, W. P., Jr., & Swets, J. A. (1954). A decision-making theory of visual detection. Psychological Review, 61, 401–409.
  14. Todd, P. M., & Gigerenzer, G. (2007). Environments that make us smart: Ecological rationality. Current Directions in Psychological Science, 16, 167–171.
  15. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
  16. Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90, 293–315.

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Department of Psychology, University of Regina, Regina, Canada
  2. Faculty of Business Administration, University of Regina, Regina, Canada

Section editors and affiliations

  • Doug P. VanderLaan
  1. Department of Psychology, University of Toronto Mississauga, Mississauga, Canada
  2. Child, Youth and Family Division, Centre for Addiction and Mental Health, Toronto, Canada