
Deception

Its Use and Abuse in the Social Sciences
David Calvey
Living reference work entry

Abstract

Deception is a controversial and emotive area, strongly associated with the transgression, violation, and breaching of research integrity. Deception has been an object of both fear and fascination for many researchers and practitioners for a long time. For some, deception runs counter to the established principle of informed consent and hence has no place in ethical decision-making. For others, however, deception does have a rich, albeit submerged, role to play in the critical imagination and toolkit of the researcher. Deception occupies a classic love or loathe position in social research, which often results in extreme and hyperbolic responses from audiences on both sides. For me, deception can be justified and has been successfully used in various research settings across the social sciences to gain rich insider knowledge and to creatively manage the problem of artificiality. It has long been demonized, maligned, and stigmatized in various research communities and is underused. This chapter shall frame the usage of deception in different contexts: popular cultural, occupational, and social scientific. The author shall explore the diaspora of classical and contemporary exemplars of deception, followed by some reflections on the longitudinal use of deception in his covert sociological study of bouncers in the night-time economy of Manchester. The chapter shall also examine the increasing ethical regulation of deception as well as investigate the future landscape for its use and abuse.

Keywords

Bouncers · Covert · Criminology · Cyber · Deception · Ethnography · Ethics · Sociology · Violence

Introduction

The chapter is organized into seven sections. Following the introduction, deception shall be framed by exploring the different contexts and categories of its use, including definitions and genealogies. Third, a diaspora of classical and contemporary exemplars from the social sciences shall be critically outlined. Fourth, the author’s covert ethnography of bouncers in the night-time economy of Manchester shall be reviewed as a display of deception in action. Fifth, the ethical governance and regulation of deception shall be examined, which has broadly stifled the use of many deceptive strategies in social research. Sixth, the future landscape of deception shall be investigated in the form of cyber research and autoethnography, followed finally by a conclusion.

The rationale here is the rehabilitation of deception in scientific research such that it can be creatively used in appropriate circumstances rather than being subject to a crude and restrictive blanket rejection as a frowned-upon, marginalized, and stigmatized last resort and outlaw methodological pariah (Calvey 2017, 2018, 2019). Detailed and comparative genealogies of deception are limited, stifled, subsumed, and glossed over in many research methodology literatures partly due to the inflated emotive and ethical responses to the uses and consequences of deception. Deception should form part of a robust scientific research imagination and a creative toolkit rather than be locked away in a closet as an unethical horror.

Framing Deception

Deception has interested scholars from multidisciplinary backgrounds for some time (Levine 2014). Masip et al. broadly define deception as:

The deliberate attempt, whether successful or not, to conceal, fabricate, and/or manipulate in any other way, factual and/or emotional information, by verbal and/or nonverbal means, in order to create or maintain in another or others a belief that the communicator himself or herself considers false. (2004, p. 147)

In sociological traditions, which trade on a covert approach (Calvey 2017), deception is typically associated with passing under a fake identity within a specific identity group and/or subculture under study in order to gain firsthand insider knowledge. As Rohy usefully describes it: “passing designates a performance in which one presents oneself as what one is not” (1996, p. 219).

Gamer and Ambach (2014) argue that “Research on deception has a long tradition in psychology and related fields. On the one hand, the drive for detecting deception has inspired research, teaching and application over many decades” (2014, p. 3). Lie detector tests, or polygraphs, have been in popular use by police and security agencies worldwide since the technology was invented at the University of California, Berkeley, United States, in 1921 by John Larson, a medical student and police officer. Various psychologists have focused on detecting deception in body and face demeanor and micro-expressions (Ekman 1985). Robinson (1996) makes a useful distinction between the interpersonal and institutional contexts of deception and views lies, falsity, belief, and intentionality as the core areas of the multidisciplinary study of deception.

Methodologically, deception has been used in various psychological field and laboratory experiments by deliberate staging, managed misinformation, and the use of confederates who fake and feign various psychological states and symptoms. For those in this tradition, the use of deception is a key part of experimental control and justified to help manage artificiality and avoid overly reactive subjects. Using deception in such traditions is typically legitimated by the detailed retrospective debriefing of the subject. Korn (1997) argues that deception is used in social psychology to create “illusions of reality,” which has led to some of the most dramatic and controversial studies in the history of psychology. Herrera, in a similar historical review, stresses that “deception is still treated as if it is a necessary evil, which it may be for some time” (1997, p. 32). Brannigan (2004) views experiments as theaters and provocatively argues that the new regulatory ethical environment will lead ultimately to the end of experimental social psychology. He hopes that such a death will help rehabilitate the field. For Brannigan (2004), deceptive experimental methodology has ironically partly led to both the rise and fall of social psychology.

Deception is bound up with lying in complex ways. Lying is typically perceived as a common form of deception and deceit in everyday life. What Adler (1997) usefully characterized as “falsely implicating.” Moral philosophy has long explored the topic as related to the development of our shared moral compass and conscience in society from childhood. What Bok (1978) refers to as a series of “moral choices” in both our private and public lives.

Barnes (1983), in his innovative sociological view of lying, argues that “lying has been a human activity for a long time, and is not merely a recent and regrettable innovation” (1983, p. 152). Barnes adds that it is “generally regarded as a form of deviance that needs to be explained, whereas truthfulness is taken for granted as the normal characteristic of social intercourse” (1983, p. 152). Barnes (1994, p. 1) later opens up his book boldly stating: “Lies are everywhere.” Barnes views lying as a mode of deception where “different contexts, however, provide different possibilities for deceit” (1994, p. 19).

Deception is also bound up with secrecy in society, which is an accepted systemic feature of modern societies (Simmel 1906; Bok 1982; Barrera and Simpson 2012). Deception is the core specialist business of MI5 and MI6 in the United Kingdom and the Federal Bureau of Investigation (FBI) and the Central Intelligence Agency (CIA) in the United States, which are familiar names to the public. The Official Secrets Act (1889, 1911, 1920, 1989) is still in operation in the United Kingdom for the purposes of protecting national security, and there is very similar legislation in Hong Kong, India, Ireland, Malaysia, Myanmar, and Singapore. Clearly spying and espionage have long roots in several societies (Goldman 2006). What Pfaff and Tiel (2004) describe as specific “military and intelligence ethics.” In many ways, this is the accepted face of professionalized deception, particularly in the current and intensified context of global counter-terrorism. In this sense, state deception is bound up with ideas of public trust (Shilling and Mellor 2015).

Turning to the different contexts of usage for deception, it is very clear that deception is woven into the fabric of everyday life and is part of the public imagination and popular culture. This context involves media scandals, exposé documentaries, and comedic dupery from mass entertainment. We have become a voyeur nation (Calvert 2000), obsessed with, and indeed normalized to, peering and watching others and strangers. Undercover journalists like Nellie Bly, Mimi Chakarova, Barbara Ehrenreich, Anna Erelle, Stuart Goldman, John Howard Griffin, Stetson Kennedy, Suki Kim, Tim Lopes, Mazher Mahmood, James O’Keefe, Donal MacIntyre, Antonio Salas, Roberto Saviano, Gloria Steinem, Chris Terrill, Polly Toynbee, Eva Valesh, Norah Vincent, Günter Wallraff, and Peter Warren, to name some of the most celebrated, have become household names in their respective countries and are probably more familiar to the public than academics using deception.

These journalists have investigated a long list of sensitive topics, wrongdoing, corruption, cover-ups, breadline poverty, violence, abuse, and criminal, deviant, and extremist organizations. Broadly, their investigative logic would often be a paradoxical one of “lying to get to the truth” in a sort of “quick and dirty realism” manner. Many of them have pushed the envelope and placed themselves at considerable personal risk and danger. The evocative, partisan, cavalier, and heroic populist image of deception has been tied up with these figures rather than academic ones, particularly when their deceptive investigations have been widely televised, caused headline scandals and media moral panics, and in some cases become major films. Such an image partly causes a problem of credibility and legitimacy for academic deceptive researchers. The imperatives and aims of their deception game differ from academic ones, and the differences should not be simply collapsed. Indeed, some academics fear that deceptive research has been effectively hijacked by investigative journalism (Van Den Hoonaard 2011).

In a very different context, the invented characters of controversial comedians Sacha Baron Cohen, Dom Joly, Novan Cavek, and Marc Wootton provocatively use antagonistic deception in their performances to satirize and challenge current thinking, celebrity culture, political figures, and household institutions. Such television hoaxing programs have attracted and sustained high viewing figures over a number of years. Clearly, deception sells well, and the cogent narrative it can weave has stood the test of time.

In the commercial world, deception has a credible place. The large supermarkets in the United Kingdom extensively use mystery shoppers and simulated clients as a way of gaining data to profile consumption patterns. The Department of Work and Pensions in the United Kingdom routinely uses surveillance tactics to investigate suspected fraudulent benefit claims. Various councils use undercover agents to review the health and safety, hygiene, and fair trading standards of the restaurant and hospitality industry. In the high-end culinary world, revered Michelin stars are still awarded on the basis of covert dining visits, and private detectives are regularly hired as an alternative way to spy on failing intimate relationships (Calvey 2017).

The medical world has also routinely used placebo experiments. This typically involves controlled and comparative experiments in clinical trials where participants are deceived as to what they are consuming and not consuming. Such accepted deception has a long history in medical science (Beecher 1955; Gotzsche 1994). Lichtenberg et al., in a review of the role of placebo in clinical practice, provocatively argue:

The placebo is the most commonly-employed treatment across cultures and throughout history. Today’s physician, resting on her evidence-based laurels, might have no trouble accepting this claim when considering the medical practice of yore. After all, what else can one make of the potions, herbs, leechings and rituals of our distant colleagues of an earlier age—medicine men, shamans, wizards—if not that they were, wittingly or ignorantly, purveyors of placebos? (2004, p. 215)

Deception has also played a significant role in police culture in different ways. As expected, the police covertly investigate a wide range of sensitive topics including pedophilia, drug abuse, football hooliganism, people trafficking, and counterfeit goods. What Loftus and Goold (2012) elegantly describe as the invisibilities of policing. There is an industry of glamorized and gritty firsthand accounts of the undercover work of former police and security professionals, some of which have been popularized in films. Again, this can produce a rather glossed, mythic, heroic, and romantic image of deception.

These different professional and practitioner contexts are more practically than theoretically driven and have different agendas, aims, challenges, and legal sensitivities from academic ones, which must be recognized and appreciated. Thus, certain forms of deception in our society that are seen to be motivated by public interest and are contained in the hands of specialist professionals with expert knowledge are more readily accepted and expected. What is clear is that deception has been used in hybrid ways over the years and is intrinsically woven into society. It is clearly still an object of both fear and fascination.

Classical and Contemporary Exemplars: The Deceptive Diaspora

There is a wide and dispersed range of studies and topics in the deceptive diaspora. On further granulation, few of these deceptive studies are purist; most employ mixed strategies such as gatekeeping and key informants. The diaspora is then more akin to a continuum than a fixed state of deception. The studies, from different eras, are drawn from various fields across the social and human sciences, including anthropology, investigative journalism, psychology, and sociology.

Some of the studies in the diaspora are what can be termed the “usual suspects” (Calvey 2017), which conventionally frame the field and often have ongoing scholarship about them, while others are less popular but still instructive gems. Because deception studies is not an incremental, integrated, or cross-fertilized field, some of the studies have a rather stand-alone status in their respective fields. I will present them in chronological order.

Nellie Bly, an alias for Elizabeth Jane Cochrane (1864–1922), was an inspirational icon for feminists, with her courageous and early covert study of a women’s lunatic asylum in New York, evocatively titled Ten Days in a Mad-House (1887). She revealed the brutalization of inmates, which resulted in police investigations and legal reform. This was one of her first undercover assignments as a young reporter at 23 years old. She gained entry after feigning hysteria, followed by a court appearance. Bly struggled to get released from the asylum and needed the support of her newspaper to verify her story as a genuine journalist.

The Tuskegee syphilis study (1932), in Alabama, United States, was a notorious example and historical marker of belligerent and harmful deceptive research, which strategically used medical misinformation and controlled nontreatment. In many ways, this became a negative landmark and milestone public health study that changed the face of modern ethical regulation of research, principally the primacy of informed consent and the protection of the subject against harm. Sensitive concerns about the institutionalized racism (Brandt 1978) that the study clearly displayed are well documented and still quoted today. Thomas and Quinn sum up the perception of this flawed and reckless public health experiment when they stress that “there remains a trail of distrust and suspicion” (1991, p. 1498).

Paul Cressey, from the University of Chicago’s famous Department of Sociology, wrote The Taxi-Dance Hall (1932) after longitudinal materials on taxi dance halls were gathered over a 5-year period by a team of covert investigators acting as “anonymous strangers.” This was a pioneering early study of the commercialization of sex work, which influenced future scholars researching sexual deviance and sex work. While serving as a case worker and special investigator for the Juvenile Protective Association in Chicago in the summer of 1925, Cressey (1899–1969) was asked to report on the new and morally controversial “closed dance halls,” open to male patrons only, as prototypical early strip joints. For Cressey, the growth of such spaces was an inevitable feature of modernity and leisure capitalism. Cressey sympathetically points toward subjugation when he states: “Feminine society is for sale, and at a neat price” (1932, p. 11).

In his exploration of racial prejudice and discrimination, psychologist Richard LaPiere (1934) travelled throughout the United States with a Chinese couple, visiting restaurants and using deceptive tactics throughout. The work of Leon Festinger and colleagues within social psychology was seminal in the study of religious cults, particularly the application of cognitive dissonance to the management of individual and group delusion. Festinger and colleagues did not use gatekeeping arrangements but instead used a team of trained researchers, including the authors, posing as “ordinary members” during the work that led to When Prophecy Fails (1956). Festinger et al. (1956) state “Our observers posed as ordinary members who believed as the others did” (1956, p. 237) and stress that their work was “as much a job of detective work as of observation” (1956, p. 252).

Melville Dalton, an organizational sociologist, explored management work culture and bureaucracy in Men Who Manage: Fusions of Feeling and Theory in Administration (1959). It is very distinctive in the length of time he spent in a sustained covert role: around a decade, gathering rich longitudinal organizational data. His justification of his covert stance centers on his critique of “controlled experiments which are not suited to my purpose” (1959, p. 275). Dalton innovatively used an extensive network of key informants or what he describes as “intimates.”

Seminal sociologist Erving Goffman’s iconic book Asylums (1961) was to have a major influence on the social science community, including healthcare fields and the anti-psychiatry movement. Goffman spent a year doing fieldwork in the mid-1950s in this gatekeeping commissioned piece of covert ethnography, which radically put the patient perspective at the heart of the analysis. Despite the recognized methodological glosses, his study cogently explored “the mortification of self” within a total institution. His covert insider account, as an assistant to the physical director, was an innovative part of developing “a sociological version of the self” (1961, p. xiii) in such a setting. Goffman later commented on fieldwork strategies, in remarks published after his death:

…with your ‘tuned up’ body and with the ecological right to be close to them (which you’ve obtained by one sneaky means or another) you are in a position to note their gestural, visual, bodily responses to what’s going on around them and you’re empathetic enough - because you’ve been through the same crap they’ve been taking - to sense what it is they’re responding to. (1989, p. 125)

Psychologists Latane and Darley (1969) explored “bystander apathy” and the “bystander effect” (Darley and Latane 1970) on strangers helping people in public by using deceptive confederates in a number of field experiments. This was later extended by Darley and Batson (1973), with similar deceptive field tactics, in their famous “Good Samaritan experiments.” Similar psychological field experiments on public honesty and stealing, using lost letters and lost money, have been conducted both earlier (Merritt and Fowler 1948) and more recently (Farrington and Knight 1979).

Sociologist Laud Humphreys’ Tearoom Trade (1970) is an infamous landmark study found in most ethics handbooks. The semi-covert study was based on his sociology doctoral thesis and analyzed “the social structure of impersonal sex” (p. 14), which was a criminal act of sexual deviance at the time in the United States. The covert stages included his covert participant observation as a voyeuristic “watch queen” and transgressive fake health researcher doing home interviews, combined with the less recognized overt interview data with his “intensive dozen” of key informants. His work was to have a seminal impact on sexuality studies. The Humphreys trope forms part of sociological folklore, and the ethical landscape was never the same again. As a gay man himself, Humphreys produced a profoundly partisan and activist piece of work which attempted to de-stigmatize homosexuality. Part of this was his valiant effort to protect the participants’ anonymity, despite intense police pressure to incriminate them, the clear threat of personal prosecution, and the risk of being academically discredited. Humphreys reflects “There was no question in my mind that I would go to prison rather than betray the subjects of my research” (1975, p. 230). Humphreys defiantly describes his cruder critics as “Ayatollahs of Research Ethics” (1980, p. 714).

Criminologist James Patrick, a pseudonym, provides a rich covert participant observation account of a juvenile gang in Glasgow over 4 months in the mid-1960s in A Glasgow Gang Observed (1973). His account of brutality and violence, alongside camaraderie and fictive kinship, became a seminal study of juvenile delinquency and a precursor to modern research on youth gangs. This undercover study would not have been possible without secret collusion with a key informant, a gang leader whom he had taught in the past. His passing as a gang member, which particularly blurred any age differences, was very artfully done.

Patrick resolved to be a “passive participant,” but this still presented him with a complex set of ethical dilemmas in terms of what he witnessed during his fieldwork. Ultimately, for Patrick: “In fact it was the internal struggle between identification with the boys and abhorrence of their violence that finally forced me to quit” (1973, p. 14).

The Rosenhan experiment (1973), provocatively titled “Being Sane in Insane Places,” was a very influential covert pseudo-patient study, which had a considerable impact on the anti-psychiatry movement. The field experiment consisted of eight pseudo patients, including psychologist David Rosenhan of Stanford University, feigning the same mental health problem in order to gain admission to different psychiatric hospitals. The majority of them were wrongly diagnosed, gained admission, and were quickly given medication for schizophrenia. Rosenhan polemically asks of psychiatric diagnosis: “If sanity and insanity exist, how shall we know them?” (1973, p. 250). Rosenhan astutely claims that a “psychiatric label has a life and an influence of its own” (1973, p. 253) and that these environments “seem undoubtedly counter therapeutic” (1973, p. 257). Rosenhan justifies his deceptive stance stressing “Without concealment, there would have been no way to know how valid these experiences were” (1973, p. 258). The study caused outrage among various professional psychiatrists.

Stanley Milgram’s work in social psychology, particularly his obedience to authority experiments (1974), often popularly characterized as the pain or torture experiments, was to be a landmark which impacted on a very broad audience both inside and outside academia. The experiments were undertaken at Yale University in the 1960s and repeated with 18 different variations before being collected in Obedience to Authority (1974). The Holocaust provides the wider political and rather emotive backdrop to his work.

Milgram was clearly influenced directly by his mentor Solomon Asch and his conformity experiments, which extensively used confederates, as well as by Allport on personality theory. Milgram’s highly staged experiments involved various types of deception, including confederates and a fake electric shock machine. They were ultimately designed to “shock” and produce counterintuitive results. Milgram (1977), in an interesting later defense of his experiments, states:

A majority of the experiments carried out in social psychology use some degree of misinformation. Such practices have been denounced as “deception” by critics, and the term “deception experiment” has come to be used routinely, particularly in the context of discussions concerning the ethics of such procedures. But in such a context, the term “deception” somewhat biases the issue. It is preferable to use morally neutral terms such as “masking”, “staging”, or “technical illusions” in describing such techniques. (1977, p. 19)

Milgram became the standard trope on deception. What Miller accurately describes as “unprecedented and continuing interest and impact” (2013, p. 17), and Blass (2004), an authoritative figure in Milgram scholarship, titled his book The Man Who Shocked the World. Questions of ethics and data reliability have been consistently raised (Baumrind 1964, 2013, 2015; Perry 2013; Brannigan et al. 2015). What Brannigan et al. (2015) recently term “unplugging the Milgram machine.” The American Association for the Advancement of Science initially awarded Milgram a research prize in 1964, but this was later revoked on ethical grounds over the question of Milgram causing deliberate harm to the participants.

Psychologist Philip G. Zimbardo’s Stanford prison experiment (Haney et al. 1973; Zimbardo 2007) was an extreme simulation funded by the Office of Naval Research and approved by the Stanford Human Subjects Research Review Committee. Zimbardo, who took the role of prison superintendent as well as researcher, was influenced by Milgram in terms of his situationalist analysis. The experiment was an attempt to explore labelling, survival, group dynamics, and situational extremity but was eventually pulled after only 6 days because of the increasing brutality of the guards toward prisoners and the harm caused to the participants. Although participants knew about the simulation, an amount of deception-induced stress was caused, as prisoners were subjected to unexpected mock arrests at home, in full view of neighbors, followed by police custody. Debriefing was carried out by both Milgram and Zimbardo in their respective experiments, but doubts have been raised about its consistency and the genuine level of support offered.

Working within the controversial pseudo-patient tradition is R. W. Buckingham and colleagues’ provocative and underutilized Living with the Dying (1976). This study was conducted by medical anthropologists who, with careful gatekeeping, used covert participant observation to explore the culture of treating terminal cancer patients in a palliative care ward in a hospital in Montreal, Canada. Despite lasting only 9 days, this was a very intense form of passing. The units were told they were being evaluated but not in what detailed form. Buckingham passionately committed himself to an embodied covert role as he assumed the role of a patient with terminal pancreatic cancer, with a second medical anthropologist acting as his cousin and his regular key research contact.

Buckingham et al. stress the emotional angst over his disguised role and his feelings of going native: “he identified closely with these sick people and became weaker and more exhausted. He was anorexic and routinely refused food. He felt ill” (1976, p. 1212). Their results showed that, although the needs of the dying and their families are widely recognized, the patient perspective still needs emphasizing. Some distancing by medical staff resulted in feelings of isolation and abandonment for the dying. They argued that such vulnerable groups need more resourced specialized care. Buckingham et al. humanely conclude:

There is a need for comfort, both physical and mental, for others to see them as individuals rather than as hosts for their disease, and for someone to breach the loneliness and help them come to terms with the end. (1976, p. 1215)

Nancy Scheper-Hughes, a medical anthropologist, conducted a controversial undercover ethnography of global organ trafficking (2004), which took her to 12 different countries and into a protected medical power elite and which was to have a concrete impact on policy and criminal prosecutions. She later describes her politicized work as a form of “engaged ethnography” (Scheper-Hughes 2009). She used mixed methods, which included several important deceptive roles to access delicate and protected information. Hence, she briefly posed as a kidney buyer in a suitcase market in Istanbul, travelled incognito with a private detective from Argentina investigating organ theft from inmates in a state home for the vulnerable, and posed as the relative of a patient looking for a kidney, dealing with brokers and sellers in person and over the telephone.

She passionately stresses: “deceptions are no longer permissible for researchers operating under the strict guidelines of human subjects protection committees. But there are times when one must ask just whom the codes are protecting” (Scheper-Hughes 2004, p. 44). She asked for the project to be given exceptional dispensation akin to a human rights investigative reporter, which was granted.

Geoff Pearson, from a sports sociology and socio-legal studies background, has openly used deception in his autoethnographic research on football hooliganism since the mid-1990s. Pearson poses the challenging questions about deception: “Can participation in criminal activity by a researcher be justified on the grounds that it is necessary to prevent the distortion of the field? Alternatively, can the difficulties in gaining and maintaining access in such spheres excuse such conduct?” (2009, p. 243). Pearson honestly reflects: “I found myself both witnessing criminal offences and being put under pressure to commit them personally” (2009, p. 245).

Pearson, because of the sensitive topic under study, walked a risky legal tightrope throughout his study, including pitch invasion, threatening behavior, and illegal alcohol consumption, which were all rites and rituals of credible acceptance in that subculture. Pearson stresses:

Little formal guidance is provided to researchers in the social sciences who wish to carry out ethnographic research within ‘criminal’ fields…Without researchers who are willing to embark upon covert research, and are sometimes willing to break the law in order to gather this data, some aspects of society will remain hidden or misunderstood…researchers will continue to operate in potentially dangerous research fields without adequate risk assessment or guidance. (2009, pp. 252–253)

The Ethical Regulation of Deception

The standard contemporary ethics debates, which have structured much discussion on deception, have centered on informed consent, debriefing, protection, and harm in various guises. Such debates help drive standard research practices. In such a framework, deception must be clearly justified and presents itself as a typical ethical and moral dilemma to be avoided or at best minimized.

Deception, in various psychological fields, is a backbone of experimental control and a long-established and accepted way to manage reactivity and artificiality (Herrera 1997). For sociology, anthropology, and allied disciplines, concerns are not typically framed within an experimental tradition and apply more to participant observational contexts, but the clear concern with violating informed consent, causing harm, and protecting the subject is shared. Barrera and Simpson (2012) stress that divergent disciplinary views of deception and norms governing usage “stifle interdisciplinary research and discovery” (2012, p. 383).

The ethical sensibilities and obligations are firmly enshrined in various professional codes and associations, which typically inform and guide various social science disciplines. Most take a standard prescriptive and pejorative view on deception, where it is frowned upon in different ways as a “last resort methodology” (Calvey 2008). Thus, the methodological orthodoxy presents deception as a pariah (Calvey 2018). The rationalizing tendencies of the ethical review boards deny ambiguity in the research relationship, which is problematic in real-world research. Most sensible researchers are not against informed consent per se but are skeptical of the pervasive “one-size-fits-all” mentality and, in some cases, the strict application of an outdated medicalized model to social science topics and fields.

In many ways, the much quoted essay by Kai Erikson (1967) displays the conventional stance of condemnation against what was generally described then as “disguised observation” and “secret observation” in sociology. Erikson argues that “It is unethical for a sociologist to deliberately misrepresent the character of the research in which he is engaged” (1967, p. 373). For Erikson, deceptive strategies may be appropriate for espionage and journalism, but he argues that professional sociology has different rules of engagement. Homan (1991), a well-quoted figure, firmly occupies the standard position in his opposition to deception in social research and takes up the Erikson position in a modern context. For him, covert methods are seldom necessary.

As a counterargument to this, a smaller number of dissident authors are broadly more sympathetic to the appropriate use of deception (Spicker 2011). Bulmer (1982) stresses the need to recognize a wider variety of observational research strategies that are not captured by the crude binary dichotomy of covert and overt research. Such simplistic reasoning, which does not recognize complexity, for Bulmer, “stultifies debate and hinders methodological innovation” (1982, p. 252). Similarly, Mitchell disagrees with crude debates about deception and stresses: “Secrecy in research is a risky but necessary business” (1993, p. 54). Barrera and Simpson suggest that researchers should adopt “a pragmatic, evidence-based approach to the question of when deception is advisable” (2012, p. 406). Similarly, Roulet et al. (2017) argue for a reconsideration of the value of covert research, as informed consent is typically ambiguous in many participant observation research settings. Roulet et al. stress: “Covert participant observation can enable researchers to gain access to communities or organizations and to collect knowledge that would otherwise remain unavailable. In some situations, covert participant observation can help create knowledge to change society for the better” (2017, p. 512).

Linked to this, there is a growing dissident literature on informed consent (Crow et al. 2006; Librett and Perrone 2010; Sin 2005), with many researchers viewing informed consent as ultimately partial, contingent, dynamic, and shifting rather than fixed and absolute. This is not to say that informed consent does not have a valuable role to play in social research or that it is only ever ceremonial. Rather, it is my contention that we effectively have a “fetish” for informed consent; it attracts a sort of blind faith which, in its simple form, denies the use of deception. Of course, there is reason to restrict covert research in certain areas and with vulnerable populations. In such instances, there is relevant and appropriate legislation and related codes of practice, such as the Mental Capacity Act 2005 and the Safeguarding Vulnerable Groups Act 2006.

Clearly there are very particular contexts and settings, where the vast majority of sensible researchers would be ordinarily compliant with specific rules. The point here is that most research settings are more mundane and less extreme. Hence, it is clearly not and never can be an ethical “one-size-fits-all” blanket situation. Put simply, covert research can be applicable in some settings and not in others rather than never applicable in any settings. Lawton (2001) and Marzano (2007) each discuss the challenges and messy complexities of doing research in palliative care units, which involved liminal mixtures of both deception and informed consent at different stages of the research process.

Concerns about the restrictive regimentation of research are not unique to deception, but what we have in the current regime is a distinct intensification of ethical regulation and governance. There is also a growing dissident literature on ethical governance and regimentation (Israel 2015; Van Den Hoonaard 2011, 2016). What researchers are effectively faced with is an “ethics creep” (Haggerty 2004), an “audit creep” (Lederman 2006), and a focus on reputational brand management (Hedgecoe 2016). Hammersley (2009, 2010), Hammersley and Traianou (2012), and Dingwall (2006, 2008) rightly caution the social science community about the growing disconnect of the ethical regime from the complexities of real-world research. Hammersley (2010) provocatively describes the process as “creeping ethical regulation and the strangling of research.” Moore (2018) argues that the current reactionary views toward deception are bound up with the repressive transparency and openness agenda, sensitivities over public trust, and the development of the audit and ethics culture.

Uz and Kemmelmeier (2017) adopt a more sympathetic approach to forms of managed deception in social and behavioral research, stressing that “rather than a type of research reserved for exceptional cases, there should be no prejudice against the use of deception” (2017, p. 104) with their important caveat that “at the time of consent, participants must be informed that the research procedures they are about to experience may include deception” (2017, p. 104). Perhaps this type of middle-ground position might be more palatable to more researchers in the social science community.

It seems as if the inflated reactions and responses to deceptive research are based on extremity and certain assumptions about the integrity of deception. As if, to put it bluntly, all deceptive research results in the inevitable harm and brutalization of both researched and researcher. Logically, then, it follows that there can be no place for deception, which I think is reductionist and reactionary. Deceptive research can be, although not always, a creative and reflective endeavor and is not utterly devoid of scientific integrity. Rather, it requires a different type of situated integrity which involves nuanced ethical self-regulation.

Put more boldly, for some in deceptive research, the wider social science community could be missing a trick. Clearly, it is not to everyone’s intellectual taste and will not become mainstream, but it still has a different and valuable contribution to make to research methodology. Deception, for me, is part of a necessary and sensible process of ethics rupture (Van Den Hoonaard 2016). This is not an extreme relativistic position urging the removal of ethical scrutiny and review but the relaxation of it as regards deception.

Some universities are thankfully encouraging more flexibility in ethical research governance by constituting discipline-specific ethics committees rather than institution-wide pseudo-medicalized ones. We certainly cannot apply a simple “checklist mentality” to complex ethical decision-making (Iphofen 2009, 2011). Ethics committees do valuable work, and they are needed as checks and safeguards to research, particularly with regard to the cost and consequences of deception. Clearly, we have to work out ways of having sensible dialogues among various stakeholders in the research community, including deceptive researchers.

Deception in Action: A Covert Study of Bouncers in the Night-time Economy

My original 6-month deceptive fieldwork (Calvey 2000, 2008, 2013) was based in a range of pubs and clubs in Manchester, where I both lived and worked. It was a nomadic and embodied ethnography that never quite finished, and by both drift and opportunism, it became a longitudinal one (Calvey 2019). Despite managed avoidance after the fieldwork period had finished, I was regularly invited into various night clubs and pubs as “ponytail Dave,” my door nickname, for a number of years. It was commonly assumed that I was in between doors and looking for some door work, with a refusal of free entry being seen as a clear offense. As my door community network was quite extensive and unbounded, it was proving problematic and messy to cleanly exit the door world. Hence, I was never ‘off duty’ as a sociologist in these liminal days. Namely, my “bouncer gaze,” if you will, was quickly resurrected as I went “back into character” and, ironically, it became a source of further and rich immersive data (Calvey 2019). My deception was now an unforeseen longitudinal one.

My covert study was a purist piece of deception with no gatekeepers, key informants, or retrospective debriefing. I had come close to developing a key informant in the fieldwork period but finally decided against it in terms of the potential information spread that would be difficult to control. I had also seriously considered retrospective debriefing in the post-fieldwork period, but this can be dangerous, messy, and impractical. I also felt, with some bouncers, that I would cause them emotional distress by this revelation, so, ironically, the continued deception seemed a way of managing fake friendships and sensitive confessions. This continued deception was a source of considerable anxiety and guilt for me. In a complex way, the deceptive bouncer self, which is a fabricated, manipulated, and engineered one, became intimately tied to my biography and true identity in ways that I had not planned for.

The analytic push was to debunk a deeply demonized subculture and occupation (Calvey 2000; Monaghan 2003; Hobbs et al. 2003; Rigakos 2008; Sanders 2016; Winlow 2001) and investigate the everyday world of bouncers as a “lived experience” (Geertz 1973) of doing the doors. I wanted to resist a type of analytic exotica in producing yet another zoo-keeping study of deviance (Gouldner 1968). My deceptive role and manufactured hyper-masculine bouncer self was deeply dramaturgical (Goffman 1967) throughout. Part of my autoethnographic layered portrait (Rambo 2005) of bouncing was about managing my “secret self,” which involved the emotional management and guilty knowledge around “ethical moments” (Guillemin and Gillam 2004) in the field, such as witnessing violence and faking friendship.

In this account of deception in action, I wanted to reject and resist a heroic, belligerent, and cavalier view of deception and replace it with a more nuanced one which involved types of ethical self-regulation and self-censorship throughout the fieldwork and beyond. Related to this, I made deliberate choices as to what I would publish to help protect the participants. A useful example of the ambivalent nature of the situated ethics in this deceptive role was when I withheld information from the police about an assault on a bouncer, which I had witnessed, at the strong request of the bouncer, who was the actual victim of the assault. My personal ethics were at odds with this, but the priority was to suspend my own moral code in the setting and not judge or correct those I was researching. In many ways, I was walking a type of legal tightrope (Pearson 2009) that could have gone wrong. In this instance, the ethics of the other (Whiteman 2018) ran counter to my own but was ultimately privileged over mine.

My deceptive ethnography was a “lived intensity” (Ferrell and Hamm 1998), an “experiential immersion” (Ferrell 1998), and a form of edgework (Lyng 1990). I perceive deception in this context as craft-like, demanding empathy, flexible passing, and nuanced mimicry. Deception got me closer to the everyday realities of being a bouncer and, in turn, to a more sensible and realistic view of how they performed violence and masculinity as a doing which involved shades of bravado, deterrence, and persuasion in both obvious and subtle ways. I wanted to disrupt the rather one-dimensional caricature of bouncers that many commentators, both academics and journalists, had cogently painted.

The Future of Deception: Cyber Research and Autoethnography

In the cyber or virtual world, which is very different from the traditional fieldwork locations, ethical concerns have not gone away but, indeed, are more difficult to regulate in this diffuse and fragmented environment. Cyberspace has, in some ways, become what I describe as a “covert playground” (Calvey 2017). Informed consent takes on new challenges in such an arena, which has become a serious concern for some researchers in their ongoing attempt to develop specific Internet ethics and protocols for various online ethical dilemmas (Buchanan 2004; Flick 2016; Hine 2005). Carusi and Jirotka (2009) accurately characterize the field of Internet and online ethics as an “ethical labyrinth,” with the Association of Internet Researchers consequently producing some ethical guidelines in 2002, modified in 2012. Granholm and Svedmark (2018) caution that online research with vulnerable populations can harm both the researcher and the researched, which ethics boards and committees must be actively vigilant about.

A diverse range of sensitive topics, including Internet sex (Sharp and Earle 2003), cosmetic surgery (Langer and Beckman 2005), and extreme dieting (Day and Keys 2008), have been explored by online lurking, and it is reasonable to assume that this is likely to increase. Murthy argues that “the rise of digital ethnographies has the potential to open new directions in ethnography” (2008, p. 837). Moreover, Murthy describes digital ethnography as a “covert affair” stressing: “my survey of digital ethnographies reveals a disproportionate number of covert versus overt projects” (2008, p. 839).

Social media is also saturated with forms of deception. The cases of bullying, intimidation, racist comments, and textual hate by internet trolls (Jane 2013; Hughey and Daniels 2013) typically involve fake cyber selves (Calvey 2017). Similarly, fake selfies produced by members of the public are deliberately designed to shock and fool as many people as possible. Likewise, instances of “fake news” are massively on the rise in social media. As Tandoc et al. stress, “Fake news has real consequences, which makes it an important subject for study” (2018, p. 149). Indeed, Google and Facebook specifically employ more staff to try to censor the dissemination of fake news. Whistleblowing, although still based on anonymity, is the opposite side of the cyber coin and can be empowering (De Maria 2008).

The other side of this coin is the recent data harvesting and behavior modeling scandal associated with corporate giants Facebook, Amazon, and Google, which caused outrage and government inquiries. Zuboff (2018) cautions us that complex forms of deception are part of the new logic of surveillance capitalism. As big data research projects using social media become much more commonplace in the future, the problem of deceptive Internet research ethics will not go away but, if anything, will intensify and become more important.

Deception is not an “outdated” methodology that is resigned to the past, as some critics might assume. It has been successfully applied more recently to critically explore a range of topical contemporary issues in autoethnographic forms. Autoethnography is a relatively recent development in qualitative research (Ellis 1999, 2004, 2007), which has a variety of styles. Despite the criticisms of autoethnography as narcissistic (Delamont 2009; Tolich 2010), it is increasingly popular in various ethnographic communities. Some styles and uses of it can involve deception, with the justification being that it is biographically and experientially based.

Ronai and Ellis (1989) explored the interactional strategies of strippers and table dancers in the United States, with Ronai retrospectively reflecting on her past biography as an erotic dancer. This dancer-as-researcher autoethnographic role was a way of accessing and exploring a closed deviant subculture and involved some deceptive tactics, particularly with the customers. A similar liminal insider position from the “dancer’s perspective,” which involved some deceptive moves and tactics, was also used by Frank (2002) in her 6-year immersion in stripping in the United States and by Colosi (2010) in her 2-year study of lap dancing in the United Kingdom. Because the community is unbounded and transitory, it is difficult to maintain any standardized form of informed consent. Ward (2010) faced similar challenges in her autoethnographic study of raving and recreational drugs culture in London. Her study initially started out as overt but shifted to covert by drift rather than design due to the fluid nature of the dance community.

Woodcock (2016) went undercover for 6 months in a UK call center which sold life insurance to explore surveillance, control, resistance, and coping mechanisms in such places. What Woodcock describes as “working the phones” involved the use of standardized customer-focused scripts and resulted in stressed employees with little autonomy. Call centers are typically known as the modern sweatshops and employ a large number of young people precariously in the service sector. Woodcock employs a Marxist analysis to unpack the alienation of this modern workplace.

Similarly, Brannan went undercover for 3 months to explore “the mechanics of mis-selling” (2017, p. 641) and how it becomes embedded in the organizational practice of an international financial services call center. Brannan explored the training, induction, and initial work of direct sales agents as a set of sales rituals and legitimacy arrangements, which run counter to the increasing regulation of such activities. Some key informants were used by Brannan to further unpack misbehavior in the workplace.

Zempi (2017) used deception in her research on victimization and Islamophobia by wearing a Muslim veil in public, at the suggestion of her interview participants. Zempi, a female from a Christian background, argues that her covert autoethnographic experience was a way of gaining insider knowledge and an opportunity to “generate appreciative criminological data” (2017, p. 9).

What these discussions partly display is the successful use of deception as part of a mixed or multiple methods strategy. For me, such hybrid methodological moves will become more popular in the future. Deception can then be potentially more sensibly viewed as a more complementary rather than necessarily antagonistic field research strategy.

Conclusions: Deception as a Creative Part of the Ethical Imagination and Research Toolkit

Deception has not gone away and is currently still a very controversial topic, which arouses suspicion, shock, fascination, and emotive outcries. Although there is generally much less deception used now, it has not been confined to the history bin and realistically probably never will be.

The award-winning documentary films The Twinning Reaction (2017), Three Identical Strangers (2018), and Secret Siblings (2018) investigate secretive psychological experiments on twin and sibling separation by Dr. Viola Bernard, a child psychologist and adoption consultant, and Dr. Peter Neubauer, a psychiatrist and principal investigator. The longitudinal experiment started in 1960 and was closed 20 years later in 1980. The participants, including various sets of twins and triplets, were babies from the Louise Wise Services, a prominent Jewish adoption agency in New York. They were part of a comparative adoption study, including extensive film footage and observational fieldwork data, with babies being separated and strategically placed in different homes. The key deceptive element here was that the researchers actively withheld from both parents and children, over the entire study period, the information that the children had any siblings. None of the participants were at any point informed about the true aims of the secretive experiment. At least three of the individuals involved have committed suicide, and genuine doubts remain as to whether all the participants are even now fully aware of the hidden nature versus nurture experiment, which has been covered up for a lengthy period of time.

Although Bernard and Neubauer, both psychoanalyst practitioners linked to the Freud family, are generally published in conventional adoption science, this belligerent and cruel deceptive study has never been published. Clearly, such an experiment stands alongside the infamous Tuskegee syphilis experiment, conducted between 1932 and 1972, for ethical belligerence and human cruelty. Drs. Neubauer and Bernard have now both passed away, the adoption agency that colluded in the experiment has closed, and the project data are sealed in the Yale University archives until 2065, with very restricted access to the data until then. It is the source of ongoing litigation and is a seismic game changer in the genealogy of ethics, as it has only recently come to light through the committed activism of filmmakers, investigative journalists, and television producers.

In a very different field, the left field activities of undercover police officers investigating targeted activist groups, exposed during 2011, have been the subject of ongoing media attention and are now under formal investigation in the United Kingdom. Such practitioner work is typically justified as part of the protection of national security. Various male officers, the most prominent being Mark Kennedy, had become involved in long-term intimate relationships while still undercover, including marriage and fatherhood, and had used the fake identities of dead children. Several women have taken legal action against the Metropolitan Police Force, and the Special Demonstration Squad was disbanded. Various compensation packages totalling millions have been paid to the victims, and a public inquiry into undercover policing in England and Wales since 1968, chaired initially by the late Sir Christopher Pitchford and then by Sir John Mitting, was launched in July 2015. Historically, the inquiry is investigating the covert tactics used by around 144 officers on around 1,000 political groups. The inquiry is estimated to report its final findings in 2023 and to have cost around £10 million to date.

These are clearly two extreme minority cases and should not represent nor spoil the use of deception in other areas. They are, however, cautionary and provocative tales of the consequences of the potentially reckless use of deception.

It appears that deception is far more extensive and diverse than first anticipated. This is compounded by some rather sanitized research accounts, which underplay and gloss over deceptive tactics and moves. What we realistically have are “varieties of deception” using a “continuum of deception” as not all theorists are doing the same thing when using deception.

My intention here is not to develop a prescriptive recipe manual on how to do deceptive research. Rather, my intention has been to compare and contrast different deceptive scenarios and reflect on what lessons can be learnt from the deceptive condition. It is my strong contention that the history and direction of the social sciences would definitely not have been the same without deception. The deceptive outsiders, outcasts, heretics, and pariahs are needed to provocatively breach and positively disrupt the canonical methodological rules around doing social research.

A number of provocative questions remain. What is the rationale for deliberate deception? Can deception be reasonably justified as a means to an end? Can deception provide a more intimate, nuanced, and layered view? Can deception potentially provide something different, something left field, push the envelope, and be outside of the box? Can deception be legitimately used to whistle blow and dig the dirt on wrongdoing? Is social science missing a trick by excluding deception as a research strategy?

In terms of justifying deception within a typical means-end schema, the principle of beneficence can still come into play, not automatically, but on a case-by-case basis via self-regulation. Clearly, some research settings with vulnerable groups, sensitive topics, and legal boundaries would not be appropriate for the use of deception. My point here is that some cases, which are in the minority, are extreme and untypical with the vast majority of research cases being more open to the use of deception.

My focus in the chapter has been on deliberate deception, whereas some common forms of deception are more akin to actively blurring your research identity, faking agreement, and concealing your true feelings and views in order to develop rapport with research participants. Many overt researchers use such chameleon tactics in the field, some without much ethical reflection. In this sense, some forms of subtle deception are more widespread than anticipated. Deception and covertness are thus not fixed states but negotiated and shifting ones.

Deceptive methodology is not a heroic panacea, as no research methodology is, and it carries some serious moral and ethical baggage which must be borne in mind, but it should not be ignored, marginalized, or censored. Deception can and should be recognized and appreciated more creatively as part of a wider and robust ethical imagination and research toolkit. Using deception shall realistically remain a niche position rather than become the mainstream, but its minority status does not mean that it lacks value. Surely we need a broad range of methodological strategies to cope with the complexity of real-world research. Deception should be afforded a fairer and less prejudiced reading. Being able to choose deception, on an equal footing with other methods, in appropriate research settings is part of our research freedoms and, ultimately, a form of creative research integrity and research ethics.

References

1. Adler JE (1997) Lying, deceiving, or falsely implicating. J Philos 94(9):435–452
2. Barnes JA (1983) Lying: a sociological view. Aust J Forensic Sci 15(4):152–158
3. Barnes JA (1994) A pack of lies: towards a sociology of lying. Cambridge University Press, Cambridge
4. Barrera D, Simpson B (2012) Much ado about deception: consequences of deceiving research participants in the social sciences. Sociol Methods Res 41(3):383–413
5. Baumrind D (1964) Some thoughts on ethics of research: after reading Milgram’s ‘behavioral study of obedience’. Am Psychol 19:421–423
6. Baumrind D (2013) Is Milgram’s deceptive research ethically acceptable? Theor Appl Ethics 2(2):1–18
7. Baumrind D (2015) When subjects become objects: the lies behind the Milgram legend. Theory Psychol 25:690–696
8. Beecher HK (1955) The powerful placebo. J Am Med Assoc 159(17):1602–1606
9. Blass T (2004) The man who shocked the world: the life and legacy of Stanley Milgram. Basic Books, London
10. Bly N (1887) Ten days in a mad-house and miscellaneous sketches: ‘trying to be a servant’ and ‘Nellie Bly as a white slave’. Ian L. Munro Publishers, New York
11. Bok S (1978) Lying: moral choice in public and private life. Pantheon Books, New York
12. Bok S (1982) Secrets: on the ethics of concealment and revelation. Pantheon Books, New York
13. Brandt AM (1978) Racism and research: the case of the Tuskegee syphilis study. Hast Cent Rep 8(6):21–29
14. Brannan MJ (2017) Power, corruption and lies: mis-selling and the production of culture in financial services. Hum Relat 70(6):641–647
15. Brannigan A (2004) The rise and fall of social psychology: the use and abuse of the experimental method. Aldine de Gruyter, New York
16. Brannigan A, Nicholson I, Cherry F (2015) Introduction to the special issue: unplugging the Milgram machine. Theory Psychol 25(5):551–563
17. Buchanan EA (ed) (2004) Readings in virtual research ethics. Information Science Publishing, Hershey
18. Buckingham RW, Lack SA, Mount BM, MacLean LD (1976) Living with the dying: use of the technique of participant observation. Can Med Assoc J 115:1211–1215
19. Bulmer M (1982) When is disguise justified? Alternatives to covert participant observation. Qual Sociol 5(4):251–264
20. Calvert C (2000) Voyeur nation: media, privacy, and peering in modern culture. Westview Press, Boulder
21. Calvey D (2000) Getting on the door and staying there: a covert participant observational study of bouncers. In: Lee-Treweek G, Linkogle S (eds) Danger in the field: risk and ethics in social research. Routledge, London, pp 43–60
22. Calvey D (2008) The art and politics of covert research: doing ‘situated ethics’ in the field. Sociology 42(5):905–918
23. Calvey D (2013) Covert ethnography in criminology: a submerged yet creative tradition in criminology. Curr Issues Crim Just 25(1):541–550
24. Calvey D (2017) The art, politics and ethics of undercover fieldwork. Sage, London
25. Calvey D (2018) Covert: the fear and fascination of a methodological pariah. In: Iphofen R, Tolich M (eds) The SAGE handbook of qualitative research ethics. Sage, London, pp 470–485
26. Calvey D (2019) The everyday world of bouncing: a rehabilitated role for covert ethnography. Qual Res 19(3):247–262
27. Carusi A, Jirotka M (2009) From data archive to ethical labyrinth. Qual Res 9(3):285–298
28. Colosi R (2010) Dirty dancing? An ethnography of lap-dancing. Willan Publishing, Abingdon
29. Cressey PG (1932) The taxi-dance hall: a sociological study in commercialized recreation and city life. University of Chicago Press, Chicago
30. Crow G, Wiles R, Heath S, Charles V (2006) Research ethics and data quality: the implications of informed consent. Int J Soc Res Methodol 9(2):83–95
31. Dalton M (1959) Men who manage: fusions of feeling and theory in administration. Wiley, New York
32. Darley JM, Batson CD (1973) From Jerusalem to Jericho: a study of situational and dispositional variables in helping behavior. J Pers Soc Psychol 27(1):100–108
33. Darley JM, Latane B (1970) The unresponsive bystander: why doesn’t he help? Appleton Century Crofts, New York
34. Day K, Keys T (2008) Starving in cyberspace: a discourse analysis of pro-eating disorder websites. J Gend Stud 17(1):1–15
35. De Maria W (2008) Whistle blowers and organizational protestors: crossing imaginary borders. Curr Sociol 56(6):865–883
36. Delamont S (2009) The only honest thing: autoethnography, reflexivity and small crises in fieldwork. Ethnogr Educ 4(1):51–63
37. Dingwall R (2006) Confronting the anti-democrats: the unethical nature of ethical regulation in social science. Med Sociol Online 1(1):51–58
38. Dingwall R (2008) The ethical case against ethical regulation in humanities and social science research. Twenty-First Century Soc 3(1):1–12
39. Ekman P (1985) Telling lies: clues to deceit in the marketplace, politics and marriage. W. W. Norton, New York
40. Ellis C (1999) Heartfelt autoethnography. Qual Health Res 9(5):669–683
41. Ellis C (2004) The ethnographic I: a methodological novel about autoethnography. AltaMira Press, Walnut Creek
42. Ellis C (2007) Telling secrets, revealing lives: relational ethics in research with intimate others. Qual Inq 13(1):3–29
43. Erikson KT (1967) A comment on disguised observation in sociology. Soc Probl 14(4):366–373
44. Farrington DP, Knight BJ (1979) Two non-reactive field experiments on stealing from a lost letter. Br J Soc Clin Psychol 18(3):277–284
45. Ferrell J (1998) Criminological verstehen: inside the immediacy of crime. In: Ferrell J, Hamm MS (eds) Ethnography at the edge: crime, deviance, and field research. Northeastern University Press, Boston, pp 20–43
46. Ferrell J, Hamm MS (1998) True confessions: crime, deviance, and field research. In: Ferrell J, Hamm MS (eds) Ethnography at the edge: crime, deviance, and field research. Northeastern University Press, Boston, pp 2–20
47. Festinger L, Riecken H, Schachter S (1956) When prophecy fails: a social and psychological study of a modern group that predicted the destruction of the world. University of Minnesota Press, Minneapolis
48. Flick C (2016) Informed consent and the Facebook emotional manipulation study. Res Ethics 12(1):14–28
49. Frank K (2002) G-strings and sympathy: strip club regulars and male desire. Duke University Press, Durham
50. Gamer M, Ambach W (2014) Deception research today. Front Psychol 5: Article 256, 1–3
51. Geertz C (1973) The interpretation of cultures: selected essays. Basic Books, New York
52. Goffman E (1961) Asylums: essays on the social situation of mental patients and other inmates. Doubleday, New York
53. Goffman E (1967) Interaction ritual: essays on face-to-face behaviour. Doubleday, New York
54. Goffman E (1989) On fieldwork. J Contemp Ethnogr 18(2):123–132
55. Goldman J (2006) The ethics of spying: a reader for the intelligence professional. The Scarecrow Press, Oxford
56. Gøtzsche PC (1994) Is there logic in the placebo? Lancet 344:925–926
57. Gouldner A (1968) The sociologist as partisan: sociology and the welfare state. Am Sociol 3(2):103–116
58. Granholm C, Svedmark E (2018) Research that hurts: ethical considerations when studying vulnerable populations online. In: Iphofen R, Tolich M (eds) The SAGE handbook of qualitative research ethics. Sage, London, pp 501–509
59. Guillemin M, Gillam L (2004) Ethics, reflexivity and ‘ethically important moments’ in research. Qual Inq 10(2):261–280
60. Haggerty KD (2004) Ethics creep: governing social science research in the name of ethics. Qual Sociol 27(4):391–414
61. Hammersley M (2009) Against the ethicists: on the evils of ethical regulation. Int J Soc Res Methodol 12(3):211–226
62. Hammersley M (2010) Creeping ethical regulation and the strangling of research. Sociol Res Online 15(4):16
63. Hammersley M, Traianou A (2012) Ethics in qualitative research: controversies and contexts. Sage, London
64. Haney C, Banks C, Zimbardo P (1973) Interpersonal dynamics in a simulated prison. Int J Criminol Penol 1:69–97
65. Hedgecoe A (2016) Reputational risk, academic freedom and research ethics review. Sociology 50(3):486–501
66. Herrera CD (1997) A historical interpretation of deceptive experiments in American psychology. Hist Hum Sci 10(1):23–36
67. Hine C (2005) Virtual methods: issues in social research on the Internet. Berg, Oxford
68. Hobbs D, Hadfield P, Lister S, Winlow S (2003) Bouncers: violence and governance in the night-time economy. Oxford University Press, Oxford
69. Homan R (1991) The ethics of social research. Macmillan, London
70. Hughey MW, Daniels J (2013) Racist comments at online news sites: a methodological dilemma for discourse analysis. Media Cult Soc 35(3):332–347
71. Humphreys L (1970) Tearoom trade: impersonal sex in public places. Aldine Publishing Company, Chicago
72. Humphreys L (1975) Tearoom trade: impersonal sex in public places, enlarged edn. Aldine Publishing Company, Chicago
73. Humphreys L (1980) Social science: ethics of research. Science 207(4432):713–714
74. Iphofen R (2009) Ethical decision making in social research: a practical guide. Palgrave Macmillan, Basingstoke
75. Iphofen R (2011) Ethical decision making in qualitative research. Qual Sociol 11(4):443–446
76. Israel M (2015) Research ethics and integrity for social scientists: beyond regulatory compliance. Sage, London
77. Jane EA (2013) Beyond antifandom: cheerleading, textual hate and new media ethics. Int J Cult Stud 17(2):176–190
78. Korn JH (1997) Illusions of reality: a history of deception in social psychology. State University of New York Press, Albany
79. Langer R, Beckman SC (2005) Sensitive research topics: netnography revisited. Qual Mark Res 8(2):189–203
80. LaPiere RT (1934) Attitudes vs. actions. Soc Forces 13(2):230–237
81. Latane B, Darley JM (1969) Bystander “apathy”. Am Sci 57(2):244–268
82. Lawton J (2001) Gaining and maintaining consent: ethical concerns raised in a study of dying patients. Qual Health Res 11(5):693–705
83. Lederman R (2006) Introduction: anxious borders between work and life in a time of bureaucratic ethics regulation. Am Ethnol 33:477–481
84. Levine TR (ed) (2014) Encyclopedia of deception. Sage, London
85. Librett M, Perrone D (2010) Apples and oranges: ethnography and the IRB. Qual Res 10(6):729–747
86. Lichtenberg P, Heresco-Levy U, Nitzan U (2004) The ethics of the placebo in clinical practice. J Med Ethics 30(6):551–554
87. Loftus B, Goold B (2012) Covert surveillance and the invisibilities of policing. Criminol Crim Just 12(3):275–288
88. Lyng S (1990) Edgework: a social psychological analysis of voluntary risk taking. Am J Sociol 95(4):851–886
89. Marzano M (2007) Informed consent, deception and research freedom in qualitative research: a cross-cultural comparison. Qual Inq 13(3):417–436
90. Masip J, Garrido E, Herrero C (2004) Defining deception. An Psicol 20(1):147–171
91. Merritt CB, Fowler RG (1948) The pecuniary honesty of the public at large. J Abnorm Soc Psychol 43(1):90–93
92. Milgram S (1974) Obedience to authority: an experimental view. Tavistock Publications, London
93. Milgram S (1977) Subject reaction: the neglected factor in the ethics of experimentation. Hast Cent Rep 7(5):19–23
94. Miller AG (2013) Baumrind’s reflections on her landmark ethical analysis of Milgram’s obedience experiments (1964): an analysis of her current views. Theor Appl Ethics 2(2):19–44
95. Mitchell RG Jr (1993) Secrecy and fieldwork. Sage, London
96. Monaghan L (2003) Danger on the doors: bodily risk in a demonised occupation. Health Risk Soc 5(1):11–31
97. Moore S (2018) Towards a sociology of institutional transparency: openness, deception and the problem of public trust. Sociology 52(2):416–430
98. Murthy D (2008) Digital ethnography: an examination of the use of new technologies for social research. Sociology 42(5):837–855
99. Patrick J (1973) A Glasgow gang observed. Eyre Methuen, London
100. Pearson G (2009) The researcher as hooligan: where ‘participant’ observation means breaking the law. Int J Soc Res Methodol 12(3):243–255
101. Perry G (2013) Behind the shock machine: the untold story of the notorious Milgram psychology experiments. The New Press, New York
102. Pfaff T, Tiel JR (2004) Ethics of espionage. J Mil Ethics 3(1):1–15
103. Rambo C (2005) Impressions of grandmother: an autoethnographic portrait. J Contemp Ethnogr 34(5):560–585
104. Rigakos GS (2008) Nightclub: bouncers, risk and the spectacle of consumption. McGill-Queen’s University Press, Montreal
105. Robinson WP (1996) Deceit, delusion and detection. Sage, Thousand Oaks
106. Rohy V (1996) Displacing desire: passing, nostalgia and Giovanni’s Room. In: Ginsberg EK (ed) Passing and the fictions of identity. Duke University Press, Durham, pp 218–233
107. Ronai CR, Ellis C (1989) Turn-ons for money: interactional strategies of the table dancer. J Contemp Ethnogr 18(3):271–298
108. Rosenhan DL (1973) On being sane in insane places. Science 179(4070):250–258
109. Roulet TJ, Gill MJ, Stenger S, Gill DJ (2017) Reconsidering the value of covert research: the role of ambiguous consent in participant observation. Organ Res Methods 20(3):487–517
110. Sanders B (2016) In the club: ecstasy use and supply in a London nightclub. Sociology 39(2):241–258
111. Scheper-Hughes N (2004) Parts unknown: undercover ethnography of the organs-trafficking underworld. Ethnography 5(1):29–73
112. Scheper-Hughes N (2009) The ethics of engaged ethnography. Anthropology News, September, pp 13–14
113. Sharp K, Earle S (2003) Cyber punters and cyber whores: prostitution on the Internet. In: Jewkes Y (ed) Dot.cons: crime, deviance and identity on the Internet. Willan Publishing, Cullompton, pp 36–52
114. Shilling C, Mellor P (2015) For a sociology of deceit: doubled identities, interested actions and situational logics of opportunity. Sociology 49(4):607–623
115. Simmel G (1906) The sociology of secrecy and of secret societies. Am J Sociol 11(4):441–498
116. Sin CH (2005) Seeking informed consent: reflections on research practice. Sociology 39(2):277–294
117. Spicker P (2011) Ethical covert research. Sociology 45(1):118–133
118. Tandoc EC, Lim ZW, Ling R (2018) Defining ‘fake news’. Digit Journal 6(2):137–153
119. Thomas SB, Quinn SC (1991) Public health then and now. Am J Public Health 81(11):1498–1505
120. Tolich M (2010) A critique of current practice: ten foundational guidelines for autoethnographies. Qual Health Res 20(12):1599–1610
121. Uz I, Kemmelmeier M (2017) Can deception be desirable? Soc Sci Inf 56(1):98–106
122. Van Den Hoonaard WC (2011) The seduction of ethics: transforming the social sciences. University of Toronto Press, Toronto
123. Van Den Hoonaard WC (2016) The ethics rupture: exploring alternatives to formal research ethics review. University of Toronto Press, Toronto
124. Ward J (2010) Flashback: drugs and dealing in the Golden Age of the London rave scene. Routledge, London
125. Whiteman N (2018) What if they’re bastards?: ethics and the imagining of the other in the study of online fan cultures. In: Iphofen R, Tolich M (eds) The SAGE handbook of qualitative research ethics. Sage, London, pp 510–525
126. Winlow S (2001) Badfellas: crime, tradition and new masculinities. Berg, Oxford
127. Woodcock J (2016) Working the phones: control and resistance in call centres. Pluto Press, London
128. Zempi I (2017) Researching victimisation using auto-ethnography: wearing the Muslim veil in public. Methodol Innov 10(1):1–10
129. Zimbardo P (2007) The Lucifer effect: understanding how good people turn evil. Random House, New York
130. Zuboff S (2018) The age of surveillance capitalism: the fight for a human future at the new frontier of power. Profile Books, London

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

Department of Sociology, Manchester Metropolitan University, Manchester, UK
