Encyclopedia of Evolutionary Psychological Science

Living Edition | Editors: Todd K. Shackelford, Viviana A. Weekes-Shackelford

Human Deception

  • Melissa S. de Roos
  • Daniel N. Jones
Living reference work entry
DOI: https://doi.org/10.1007/978-3-319-16999-6_3305-1

Keywords

Nonhuman animal; Frequency-dependent selection; Cooperative society; Financial community; Dark Triad

Definitions

Deception is notoriously difficult to define given that it takes so many forms across human and nonhuman animals (Mitchell 1993). Here, we define deception as the misrepresentation of information, whether intentional or unintentional. This misrepresentation can be conscious or unconscious and may serve a multitude of purposes, including (but not limited to) survival (e.g., physical defense), resource acquisition (e.g., obtaining food), or increased inclusive fitness (e.g., reproductive advantages).

Introduction

Deception is common in human interactions (Kashy and DePaulo 1996). It can be intentional, as in lying, strategic self-presentation, or withholding important information. Deception may also be unintentional or unconscious, as in the case of self-deception (e.g., von Hippel and Trivers 2011). Deception may stem from good intentions or a desire not to harm, or it may come from a place of selfishness or malevolence.

In the animal kingdom, deception is often used for survival. Some researchers have argued that primate survival depends in part on deception and that a form of "Machiavellian intelligence" is required to navigate social life (Byrne and Whiten 1989). Despite its benefits, deception also carries risk: to deceive and fail can bring fitness costs in the form of ostracism or death, whereas to deceive and succeed can enhance fitness.

Although a tremendous amount of research and theoretical attention has been paid to the evolution of cooperation and reciprocal altruism in modern society (Trivers 1971), players in a cooperative society may not fit a simple cooperator vs. cheater distinction. Instead, some deceptive individuals may engage in cooperation for long periods of time in order to cultivate trust and the appearance of cooperation. This cultivation of trust, however, may ultimately be in the service of large-scale defection. In this way, individuals in cooperative societies may be better placed along a continuum of strategies than sorted cleanly into cooperators and cheaters.

Deception and defection are pervasive in modern society. At the corporate level, financial fraud costs society more than 680 billion dollars annually (Wells 2007). The costs of defection, however, are not solely financial. Individuals suffer severe psychological consequences as a result of being victims of fraud or financial misbehavior, including (but not limited to) increased anxiety, depression, and even suicide (Titus and Gover 2001).

In humans, deceptive strategies can take either long- or short-term forms. In the short term, individuals may execute strategies that are direct and designed to reach broad communities over short intervals for moderate payouts; these individuals are often referred to as con-artists or swindlers. A successful confidence artist or "con-artist" may prey on a wide variety of victims (e.g., elderly women, college students). One example comes from Pratkanis and Shadel (2005, p. 35), in which an individual would call elderly individuals and ask them to donate money to a fictitious "Say No to Drugs Program." He used very broad and flexible persuasion techniques and was successful in getting a wide variety of individuals to "donate" to his organization. One particularly clever swindle occurred when this con-artist was arrested. Using his phone call from the Federal Marshal's office (where he was being held), he called one of his previous victims claiming that he was a Federal Marshal and that he could recover all of the victim's losses for a processing fee of $20,000 (Pratkanis and Shadel 2005, p. 42).

In contrast, some individuals execute strategies designed to reach very specific communities over long intervals for large payouts. These individuals are often guilty of antitrust or business/trade violations (Benson and Simpson 2009). One successful long-term cheater of this type is Bernie Madoff. Most evidence indicates that Madoff began his fraudulent trading in the early 1970s and eventually became responsible for the largest documented Ponzi scheme in history (Arvedlund 2009). Ponzi schemes are difficult to sustain because one must have great patience and build a reputation within the financial community over a significant period of time. More difficult still, one must show initial returns in order to attract more clients into the scheme.

Although Madoff's primary charge was running a Ponzi scheme, he used what is known as "affinity fraud" to perpetuate his fraudulent trades (e.g., Babiak and Hare 2006; Perri and Brody 2011). Affinity fraud occurs when an individual integrates him- or herself into a community for the purpose of eliciting trust and then proceeds to defraud that community. Madoff, however, went far beyond general integration into financial communities, serving on many financial advisory boards and forming long-standing relationships with important people in governmental positions (Collins 2011). Operating for almost 40 years, Madoff was able to defraud investors of more than 65 billion dollars (Arvedlund 2009).

Although both successful con-artists and white-collar offenders (such as Madoff) have cheated innocent people out of money, they presented themselves differently. A con-artist can be considered an intentional and direct defector. Con-artists and short-term deceivers use only temporary and superficial deceptive techniques, of which people readily become suspicious (Pratkanis and Shadel 2005). Madoff, by contrast, worked himself into the fabric of the financial community through more complex deception, giving every appearance of cooperation and contribution. In essence, he imitated or "mimicked" cooperative behavior in every facet of his public life.

A framework for comparing human and nonhuman animal deception was proposed by Mitchell (1986, as cited in Mitchell 1996). Mitchell (1993) argued that researchers could learn a great deal about human deception by studying deception across different kingdoms of living creatures. In particular, Mitchell (1996) walks the reader through levels of deception, starting with simple camouflage or mimicry and moving up to deception that relies on memory and perspective taking. These more sophisticated deceptions generally require more planning and cognitive ability.

Mitchell (1996) further articulates the process of deception: one must convince a target of fictional information without arousing suspicion. Suspicion, according to Mitchell, is the enemy of deception. This assertion makes sense, given that the human default in most cases is to believe others (Levine 2014). In fact, Levine argues that deception sometimes requires self-deception on the part of the deceived, which raises the issue of motivation. Individuals are most vulnerable to lies when they want to believe them in the first place. Take, for example, someone who is uneasy about their retirement savings. This individual meets Madoff and is motivated to believe that the solution to their financial woes lies in the impressive payouts that Madoff offers. Rather than raising red flags about unrealistic returns, the individual is motivated to believe that they are simply lucky to have found such a wise financial genius.

Differential Temporal Strategies

Jones (2014) argued that all deception falls on a long- to short-term continuum and that four key components characterize long-term deception: complex deception, slow resource extraction, community integration, and difficulty of detection. These four components are absent in short-term deceptions. Empirical evidence supports the idea that these four components do indeed intercorrelate (Jones and de Roos in press). Below we review further evidence for these claims.

Among predators and prey in the animal kingdom, mimicry is an adaptation used to mislead other organisms to the mimic's advantage (Wickler 1968, as cited in Jones 2014). In fact, many forms of life, including viruses and bacteria, plants, nonhuman animals, and humans (both children and adults), provide examples of some form of mimicry (Damian 1964 and Gilpin 1975, as cited in Jones 2014). Mimicry is especially prevalent where deception is adaptive for the organism (Gilpin 1975, as cited in Jones 2014).

Much as Madoff displayed the characteristics of a cooperator when he was in fact a cheater, some animals display characteristics of other organisms that they do not actually possess. In contrast, con-artists engage in direct parasitic and/or predatory behaviors, behaviors that are analogous to traditional infections (Damian 1964, as cited in Jones 2014) or predators (Gilpin 1975, as cited in Jones 2014).

Strategies of fraud resemble viral and bacterial infections of cooperative hosts, and as such, virulence provides a useful analogy for fraudulent behavior. Virulence reflects how aggressively a microorganism exploits its host. Some infections spread quickly and broadly, while others tailor their infection to a specific host, making the spread slower but the pathogen more difficult to detect (Levin and Bull 1994). Aggressive exploitation, however, comes at a cost: widespread detection and destruction by the host's immune defenses (Levin and Bull 1994). At one end of this spectrum, certain microorganisms infect a broad range of hosts; examples include Streptococcus pneumoniae and Haemophilus influenzae. Such microorganisms are transmitted via droplet infection (e.g., sneezing). The mechanism of infection is similar in most hosts, and these microorganisms infect similar locations within the host (e.g., the nasopharyngeal passages or respiratory system).

By contrast, certain forms of meningitis and viral infections, such as the human immunodeficiency virus (HIV), take a different approach to infection. Such infecting agents evolve within the host to mimic naturally occurring cells and molecules. HIV, for example, is not easily transmitted (i.e., not through droplets) and will often lie undetected for years within a host before causing damage. These differential parasitic strategies can be viewed as two separate "defection" strategies used by microorganisms to compromise a "cooperative" host (Zimmer 2000). Thus, it appears that in the evolutionary arms race between infecting microorganisms and multicellular hosts, many parasitic organisms have adopted mimicry strategies to appear similar to naturally occurring substances within the host's system (Damian 1964, as cited in Jones 2014).

Mimicry also occurs in the animal kingdom when an animal or species gives the appearance of possessing characteristics of an unrelated animal or species that it does not actually possess (Holling 1965, as cited in Jones 2014). Common examples can be found in organisms such as butterflies, frogs, fish, and lizards. Mimicry that evolved for predator confusion and defense is often referred to as Batesian mimicry (Malcolm 1990, as cited in Jones 2014). Such creatures may display bright coloring or markings similar to those of creatures that are unpalatable or even poisonous, while the mimicking organism itself is palatable, nontoxic, or both.

Mimicry can also work in the other direction, in favor of a predator: some organisms appear harmless when in fact they are predatory. This type of mimicry is often referred to as aggressive mimicry (Wickler 1968, as cited in Jones 2014). The Venus flytrap, for example, looks quite harmless to unsuspecting insects, but it is carnivorous and will snap shut when an insect ventures inside.

Mimicry represents an evolutionary arms race in its own right between organisms using mimicry and organisms attempting to detect it (Caley and Schluter 2003). These selective pressures are intensified by the trade-off between the costs of missed opportunities (e.g., the chance to feed on palatable prey) and the costs of mistakes (e.g., dying from biting into poisonous prey). In other words, the cost/benefit of trusting or avoiding mimics depends on the consequences of false positives and false negatives. For example, if the consequence of mistaking an unpalatable fish for a palatable one is merely an unpleasant taste, there will be only mild selective pressure on predators to distinguish between the two types of fish. If, on the other hand, one of the fish carries a lethal poison, the selective pressure on predators to distinguish between them will be intense (Caley and Schluter 2003). As noted above, however, motivation (and therefore self-deception) may also play a role: a starving predator may be more likely to fall for mimicry, because the trade-off between risking poison and starving favors risk taking.
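
To make this trade-off concrete, the sketch below compares the expected payoff of attacking versus passing up a prey item that might be a genuinely harmful model rather than a harmless mimic. The probabilities and payoff values are hypothetical, chosen only to illustrate the asymmetry described by Caley and Schluter (2003), not drawn from their data.

  # Illustrative expected-value sketch of a predator's attack decision.
  # All numbers are hypothetical; they only show how the cost of a mistake
  # changes the pressure to discriminate mimics from genuinely harmful models.

  def expected_attack_payoff(p_harmful, meal_benefit, mistake_cost):
      """Expected payoff of attacking prey that is genuinely harmful with
      probability p_harmful and a harmless mimic otherwise."""
      return (1 - p_harmful) * meal_benefit - p_harmful * mistake_cost

  p_harmful = 0.3  # hypothetical share of genuine models among look-alike prey

  # Mild consequence: the model merely tastes bad.
  print(expected_attack_payoff(p_harmful, meal_benefit=1.0, mistake_cost=0.2))   # ~0.64: worth attacking anyway

  # Severe consequence: the model is lethally poisonous.
  print(expected_attack_payoff(p_harmful, meal_benefit=1.0, mistake_cost=50.0))  # ~-14.3: avoid or learn to discriminate

Only when the cost of a mistake dwarfs the value of a meal does indiscriminate attacking become a losing strategy, which is when selection for discrimination (or avoidance) becomes strong.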

When it comes to detecting potential mimicry or deception in humans, differential pressures likewise produce divergent detection abilities. For example, when consequences are low and familiarity is sparse, humans are not much better than chance at detecting deception in other humans (von Hippel and Trivers 2011). In such situations of little consequence or closeness, most individuals lack the motivation and/or ability to detect deception. This lack of motivation is analogous to the weak selective pressure on predators that face little consequence for mistaking bad-tasting fish for palatable ones. However, when the stakes are high (e.g., the relationship is close or the event is consequential), detection of deception improves considerably.

Deception and Evolution

Frequency-Dependent Selection

The proportion of cheaters vs. cooperators follows the basic principles of frequency-dependent selection (Pfennig et al. 2001). Frequency dependence means that the payoff of a given strategy depends on how common it is in the population: when a strategy becomes too common (or too rare), selective pressures push its frequency back toward a balance. Several publications have reviewed the impact of frequency-dependent selective forces on cheater vs. cooperator strategies (Mealey 1995; Wilson et al. 1996). In particular, Mealey (1995) argued that psychopathy likely fits an evolutionary model containing a mix of cheaters and cooperators (p. 526). Mealey (1995) also noted that other variables (e.g., competitive disadvantage) play a role in determining a player's strategy (in this case, disadvantaged individuals were more likely to cheat).
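
The logic can be illustrated with a minimal replicator-style simulation. The payoff values below are hypothetical and are not taken from Mealey (1995) or Pfennig et al. (2001); they merely encode the assumption that cheaters do well against cooperators but poorly against one another, so the advantage of cheating shrinks as cheaters become common.

  # Minimal frequency-dependent selection sketch (hypothetical payoffs).
  # Cheaters exploit cooperators but fare badly against other cheaters, so the
  # cheater strategy pays only while it remains relatively rare.

  def average_payoffs(p_cheat):
      """Average payoff to each strategy when a fraction p_cheat of the
      population cheats and partners are met at random."""
      cooperate = (1 - p_cheat) * 3 + p_cheat * 2   # mutual cooperation vs. being exploited
      cheat = (1 - p_cheat) * 4 + p_cheat * 0       # exploiting a cooperator vs. cheater-on-cheater conflict
      return cooperate, cheat

  p = 0.01  # cheaters start out rare
  for generation in range(500):
      cooperate, cheat = average_payoffs(p)
      mean_fitness = (1 - p) * cooperate + p * cheat
      p = p * cheat / mean_fitness                  # discrete replicator update

  print(round(p, 2))  # settles near 0.33: a stable mix of cheaters and cooperators

With these made-up payoffs, cheating spreads while rare and stalls once cheaters meet one another often enough, yielding a stable mixed population rather than a takeover by either strategy.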

In the case of mimicry, if harmless (Batesian) mimics become exceedingly common relative to harmful (e.g., poisonous) models, it becomes advantageous for predators to risk attacking both, because the odds favor encountering a harmless individual. By contrast, if harmful organisms greatly outnumber their mimics, selective pressure would favor (a) finding an alternative source of food (i.e., predators would avoid both mimics and harmful prey altogether) and (b) evolving stronger mechanisms for differentiating mimics from non-mimics (Caley and Schluter 2003). In this way, an imbalance favoring genuinely harmful organisms can pave the way for mimics to free ride on predators' caution.

Among humans, some who mimic cooperation with selfish intentions may never defect unless they feel it is "necessary." Until necessity strikes, however, the most effective cheaters may maintain their cooperative veneer and thus a positive reputation. It should be noted that necessity may be perceived differently by different individuals.

Single vs. Dual Selective Pressures on Mimic Cheaters

Frequency-dependent selection is also context specific. In areas where predators are familiar with a particular dangerous or unpalatable species, mimicry of that species is quite effective (Pfennig et al. 2001); in such cases, predators would require highly refined mechanisms to detect the differences, and mimicry will often work. In areas where the dangerous model is unknown, mimicry confers no protection. For example, Pfennig et al. (2001) placed coral snake look-alikes in eastern regions of the United States as well as in regions of Central America. Snake predators in the United States readily attacked the coral snake look-alikes because they had no prior exposure to such snakes or to their mimics. By contrast, coral snake look-alikes placed in regions of Central America were left alone.

By analogy, detection of deception in humans may operate in a similar fashion. Individuals planning affinity fraud, for example, are more likely to go undetected once introduced into certain environments (e.g., religious communities) than into others (e.g., Wall Street). Communities accustomed to fast-paced monetary exchanges (e.g., Wall Street) are more likely to be flooded with deceptive attempts. Thus, within these communities, there is likely to be greater selective pressure on deceptive individuals both to emulate cooperation and to avoid looking like a cheater. Less pressure may fall on someone operating in a community where deception is not expected (e.g., a religious community).

Differentiating Human Deception Strategies

Indeed, research and theory on evolutionary models of cooperation have alluded to several permutations of cooperation that might differentially benefit the individual at a (slight) cost to the group. For example, Trivers (1971) noted that appearing to be a good cooperator while maintaining a mild selfish bias provides long-term benefits of resource allocation without incurring the direct costs of cheating. Such strategies, however, require social leverage to execute: when caught taking slightly more than one's share, it takes refined social skills, status, or intimidation (Tooby and Cosmides 1990) to keep group members from responding punitively to the unfairness. Moreover, as Axelrod (1984) points out, strategies are contextually based, and one might cheat when costs are low, others are submissive, detection is unlikely, payoffs are great, or one feels cheated oneself. As noted above, frequency dependence and the strategies of others greatly influence the strategy employed by a given group member (Tooby and Cosmides 1990).
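
This contextual logic can be sketched in a toy iterated prisoner's dilemma. The "mimic cheater" strategy below, which cooperates to build trust and defects only when the stakes of a round are high, is a hypothetical illustration of the idea rather than a model taken from Axelrod (1984) or Trivers (1971); the payoff values are the standard prisoner's dilemma numbers, scaled here by made-up stakes.

  # Toy iterated prisoner's dilemma: a reciprocator vs. a hypothetical "mimic
  # cheater" that cooperates to build trust and defects only on high-stakes
  # rounds. Strategies and stakes are illustrative assumptions.
  import random

  random.seed(1)
  PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
            ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

  def tit_for_tat(history, stakes):
      return history[-1][1] if history else 'C'    # copy the partner's last move

  def mimic_cheater(history, stakes):
      return 'D' if stakes >= 8 else 'C'           # defect only when the payoff is large

  def play(strategy_a, strategy_b, rounds=50):
      history_a, history_b = [], []
      score_a = score_b = 0
      for _ in range(rounds):
          stakes = random.choice([1, 1, 1, 1, 10])           # occasional high-stakes round
          move_a = strategy_a(history_a, stakes)
          move_b = strategy_b(history_b, stakes)
          pay_a, pay_b = PAYOFF[(move_a, move_b)]
          score_a += pay_a * stakes
          score_b += pay_b * stakes
          history_a.append((move_a, move_b))
          history_b.append((move_b, move_a))
      return score_a, score_b

  print(play(tit_for_tat, mimic_cheater))  # the mimic cheater pockets the high-stakes rounds

In this sketch the mimic looks like a reliable cooperator on ordinary rounds and cashes in only when the payoff for defection is large, mirroring the long-term cheating strategy described above.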

A second way an individual may engage in deception, however, is through self-deception. For example, von Hippel and Trivers (2011) reviewed a body of evidence suggesting that self-deceptive enhancement bestows advantages with respect to interpersonal deception. They argue that by deceiving oneself, one makes detection of deception by others extraordinarily difficult. The absence of truthful leaks and/or hints of deception stems from the fact that self-deceptive individuals genuinely believe that they are cooperators, that they benefit others, and that they are not cheating. In sum, mimicry may be split into two forms: mimicry stemming from intentional and planful strategy vs. mimicry stemming from self-deception and exaggerated overconfidence.

Self-deceptive enhancement is a fairly broad strategy that can be beneficial, at least in the short term. It should be noted, however, that exaggerations come at a cost. At best, these self-deceptive exaggerations represent a "mixed blessing" (Paulhus 1998, as cited in Paulhus et al. 2003), because their benefits wear off over time.

Personality and Deception: The Dark Triad

The dark triad is a term that refers to three commonly studied personality traits in the realm of interpersonal harm (Paulhus and Williams 2002). All three of these traits are linked to deception under different circumstances (Jones and Paulhus 2016). Machiavellianism is associated with strategic and planful deception (Jones and Paulhus 2009); such individuals are cautious and anticipate others' future moves (Bereczkei et al. 2013). In contrast, individuals high in psychopathy deceive in a purely short-term and aggressive fashion, wearing a temporary mask (Book et al. 2015). Finally, narcissistic individuals deceive through self-deception (Paulhus et al. 2003). In this way, the dark triad traits cover a wide array of deceptive dispositions.

It should be noted that all deception requires some level of skill. Individuals who are convincing "talkers" often rise through corporate ranks with little substance (Babiak and Hare 2006). Skill is likely to be even more necessary for deceptions that are long term or planned. Turner and Martinez (1977) found that individuals high in Machiavellianism but low in intelligence were the least successful, whereas those high in both Machiavellianism and intelligence were the most successful.

Conclusion

Deception can take predatory and defensive forms, and it ranges from the simple (e.g., camouflage) to the complicated (e.g., planned, long-term schemes). Mimicry itself can be split into more nuanced categories, which suggests that multiple forms of mimicry may operate in humans as well as nonhuman animals. Moreover, deceptive tactics vary widely. Some are likely to be perpetrated by planful mimic cheaters, such as intentional omission of information (DeScioli et al. 2011). Others are simple and direct lies based on emotional manipulation or fabrication (Patrick and Zempolich 1999). Still others involve overconfidence and entitlement (Campbell et al. 2004). Simply put, understanding human deception is complicated because there are infinite ways to deceive but few ways (perhaps only one) to tell the truth.

References

  1. Arvedlund, E. (2009). Too good to be true: The rise and fall of Bernie Madoff. New York, NY: Penguin Books.
  2. Axelrod, R. (1984). The evolution of cooperation. New York, NY: Basic Books.
  3. Babiak, P., & Hare, R. (2006). Snakes in suits: When psychopaths go to work. New York, NY: Harper Collins.
  4. Benson, M. L., & Simpson, S. S. (2009). White-collar crime: An opportunity perspective. New York, NY: Routledge.
  5. Bereczkei, T., Deak, A., Papp, P., Perlaki, G., & Orsi, G. (2013). Neural correlates of Machiavellian strategies in a social dilemma task. Brain and Cognition, 82(1), 108–116.
  6. Book, A., Methot, T., Gauthier, N., Hosker-Field, A., Forth, A., Quinsey, V., & Molnar, D. (2015). The mask of sanity revisited: Psychopathic traits and affective mimicry. Evolutionary Psychological Science, 1(2), 91–102.
  7. Byrne, R., & Whiten, A. (1989). Machiavellian intelligence: Social expertise and the evolution of intellect in monkeys, apes, and humans. Oxford: Oxford University Press.
  8. Caley, M. J., & Schluter, D. (2003). Predators favour mimicry in a tropical reef fish. Proceedings of the Royal Society of London B, 270, 667–672. doi: 10.1098/rspb.2002.2263.
  9. Campbell, W. K., Goodie, A. S., & Foster, J. D. (2004). Narcissism, confidence, and risk attitude. Journal of Behavioral Decision Making, 17, 297–311. doi: 10.1002/bdm.475.
  10. Christie, R., & Geis, F. (1970). Studies in Machiavellianism. London: Academic Press.
  11. Collins, D. (2011). Bernie Madoff's Ponzi scheme: Reliable returns from a trustworthy financial advisor. In D. Collins (Ed.), Business ethics (pp. 435–453). New York, NY: Wiley.
  12. DeScioli, P., Christner, J., & Kurzban, R. (2011). The omission strategy. Psychological Science, 22, 442–446. doi: 10.1177/0956797611400616.
  13. Jones, D. N., & de Roos, M. S. (in press). Differential reproductive patterns among the Dark Triad. Evolutionary Psychological Science.
  14. Jones, D. N., & Paulhus, D. L. (2009). Machiavellianism. In M. R. Leary & R. H. Hoyle (Eds.), Handbook of individual differences in social behavior (pp. 102–120). New York, NY: Guilford.
  15. Jones, D. N. (2014). Predatory personalities as behavioral mimics and parasites: Mimicry–deception theory. Perspectives on Psychological Science, 9(4), 445–451.
  16. Kashy, D. A., & DePaulo, B. M. (1996). Who lies? Journal of Personality and Social Psychology, 70(5), 1037.
  17. Levin, B. R., & Bull, J. J. (1994). Short-sighted evolution and the virulence of pathogenic microorganisms. Trends in Microbiology, 2, 76–81. doi: 10.1016/0966-842X(94)90538-X.
  18. Levine, T. R. (2014). Truth-default theory (TDT): A theory of human deception and deception detection. Journal of Language and Social Psychology, 33(4), 378–392.
  19. Maynard Smith, J. (1982). Evolution and the theory of games. Cambridge: Cambridge University Press.
  20. Maynard Smith, J., & Price, G. (1973). The logic of animal conflict. Nature, 246, 15–18.
  21. Mealey, L. (1995). The sociobiology of sociopathy: An integrated evolutionary model. Behavioral and Brain Sciences, 18, 523–599. doi: 10.1017/S0140525X00039595.
  22. Mitchell, R. W. (1993). Animals as liars: The human face of nonhuman deception. In M. Lewis & C. Saarni (Eds.), Lying and deception in everyday life (pp. 59–89). New York, NY: Guilford Press.
  23. Mitchell, R. W. (1996). The psychology of human deception. Social Research, 63, 819–861.
  24. Patrick, C. J., & Zempolich, K. A. (1999). Emotion and aggression in the psychopathic personality. Aggression and Violent Behavior, 3(4), 303–338.
  25. Paulhus, D. L., Harms, P. D., Bruce, M. N., & Lysy, D. C. (2003). The over-claiming technique: Measuring self-enhancement independent of ability. Journal of Personality and Social Psychology, 84, 890–904. doi: 10.1037/0022-3514.84.4.890.
  26. Paulhus, D. L., & Williams, K. M. (2002). The dark triad of personality: Narcissism, Machiavellianism, and psychopathy. Journal of Research in Personality, 36, 556–563. doi: 10.1016/S0092-6566(02)00505-6.
  27. Perri, F. S., & Brody, R. G. (2011). Birds of the same feather: The dangers of affinity fraud. Journal of Forensic Studies in Accounting & Business, 3(1), 33–46.
  28. Pfennig, D. W., Harcombe, W. R., & Pfennig, K. S. (2001). Frequency-dependent Batesian mimicry. Nature, 410, 323.
  29. Pratkanis, A., & Shadel, D. (2005). Weapons of fraud: A source book for fraud fighters. Washington, DC: AARP.
  30. Titus, R. M., & Gover, A. R. (2001). Personal fraud: The victims and the scams. Crime Prevention Studies, 12, 133–152.
  31. Tooby, J., & Cosmides, L. (1990). The past explains the present: Emotion adaptations and the structure of ancestral environments. Ethology and Sociobiology, 11, 375–424. doi: 10.1016/0162-3095(90)90017-Z.
  32. Trivers, R. L. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46, 35–57. http://www.jstor.org/stable/2822435
  33. Turner, C. F., & Martinez, D. C. (1977). Socioeconomic achievement and the Machiavellian personality. Sociometry, 40, 325–336. http://www.jstor.org/stable/3033481
  34. von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34, 1–56. doi: 10.1017/S0140525X10002657.
  35. Wells, J. (2007). Corporate fraud handbook: Prevention and detection. Hoboken, NJ: Wiley.
  36. Wilson, D. S., Near, D. C., & Miller, R. R. (1996). Machiavellianism: A synthesis of the evolutionary and psychological literatures. Psychological Bulletin, 119, 285–299.
  37. Zimmer, C. (2000). Parasite rex. New York, NY: The Free Press.

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Department of Psychology, University of Texas at El Paso, El Paso, USA

Section editors and affiliations

  • Christopher Watkins
  1. Abertay University, Dundee, Scotland