
PETs in the Surveillance Society: A Critical Review of the Potentials and Limitations of the Privacy as Confidentiality Paradigm


Abstract

“Privacy as confidentiality” has been the dominant paradigm in computer science privacy research. Privacy Enhancing Technologies (PETs) that guarantee the confidentiality of personal data or anonymous communication have resulted from this research. The objective of this chapter is to show that such PETs are indispensable, but that they fall short of being the comprehensive privacy solutions they are sometimes claimed to be under present-day circumstances. Drawing on perspectives from surveillance studies, we argue that the computer scientists’ conception of privacy through data or communication confidentiality is techno-centric and displaces end-user perspectives and needs in a surveillance society. We further show that the perspectives from surveillance studies themselves demand a critical review for their human-centric conception of information systems. Finally, we reposition PETs in a surveillance society and argue for the necessity of multiple paradigms for privacy and related design.


Notes

  1.

    The series of conferences commenced in 1961 and were dissolved in 1990 (IEEE Computer Society Press Room 2007).

  2.

    Although articulations of privacy and security solutions in the presented papers have parallels to the much longer-standing tradition of cryptography and security research, we start our account of computer science privacy research with the explicit introduction of the term “privacy” at the Spring Joint Computer Conference.

  3.

    Three of the authors were from the RAND Corporation (Ware 1967a; Petersen and Turn 1967), one from M.I.T. (Glaser 1967), and one from a company named Allen-Babcock Computing (Babcock 1967).

  4.

    In the last 10 years privacy has become one of the central concerns of Pervasive Computing, in Europe often researched under the title Ambient Intelligence. The journal IEEE Security and Privacy, for example, grew out of these research fields, giving recent privacy research a visible publication venue (Cybenko 2003).

  5.

    Later, other sub-fields of computer science proposed other types of PETs that often rely on the contractual negotiation of personal data revelation. A review of such PETs can be found in Wang and Kobsa (2006), but they are not the focus of this paper, since they do not rest on the same assumptions as “privacy as confidentiality”.

  6.

    The consequences of probabilistic identification as legal evidence have been picked up by Braman (2006), but they are beyond the scope of this paper.

  7.

    The focus of this paper is on privacy concerns raised and privacy enhancing technologies used on the Internet. We are aware that breaches also occur with devices that are off-line and on networks other than the Internet. Further, mobile technology has opened up a whole new set of questions about the feasibility of privacy enhancement with respect to location data. Whether the same assumptions and analyses hold for these problems is beyond the scope of this paper.

  8.

    In 2007 Dan Egerstad set up a number of exit nodes for Tor (a popular anonymizer, http://www.torproject.org) and sniffed over 100 passwords from traffic flowing through his nodes. The list included embassies and government institutions (Paul 2007).

  9.

    There are numerous court cases in which digital data are used as evidence in ways that claim much more than the data, at face value, seems to represent. For example, a court in the U.S.A. accepted pictures from a social network site showing a young woman enjoying a party a number of weeks after a car accident with casualties. The pictures were used as proof that she lacked remorse (Wagstaff 2007).

  10.

    Surveillance studies is a cross-disciplinary initiative to understand the rapidly increasing ways in which personal details are collected, stored, transmitted, checked, and used as means of influencing and managing people and populations (Lyon 2002). Surveillance is seen as one of the features that define and constitute modernity. In that sense, surveillance is an ambiguous tool that may be feared for its power but also appreciated for its potential to protect and enhance life chances. Departing from paranoid perspectives on surveillance, the objective of these studies is to critically understand the implications of present-day surveillance for power relations, security, and social justice.

  11.

    Unlinkable pseudonyms refer to systems in which users identify themselves with different pseudonyms for different sets of transactions. An observer should not be able to tell whether two pseudonyms belong to the same user. If implemented in an infrastructure with a trusted third party distributing the pseudonyms, the pseudonyms can be revoked, ideally only under certain (legally defined) conditions.
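    The scheme described in this note can be sketched in code. The sketch below is our own minimal illustration under stated assumptions, not any deployed PET: per-context pseudonyms are derived with a keyed hash (HMAC), so an observer without the key cannot link two pseudonyms, while a trusted third party holding the keys can revoke (re-identify) a pseudonym. All class and function names here are hypothetical.

```python
# Minimal sketch of unlinkable, revocable pseudonyms (illustrative only).
# Assumption: a trusted third party (the "authority") issues one secret
# key per user; pseudonyms are HMAC outputs over a transaction context.

import hmac
import hashlib
import secrets


def derive_pseudonym(key, context):
    # HMAC with a secret key makes pseudonyms for different contexts
    # look like independent random strings to anyone without the key.
    return hmac.new(key, context.encode(), hashlib.sha256).hexdigest()[:16]


class PseudonymAuthority:
    """Trusted third party: issues per-user secrets and can revoke."""

    def __init__(self):
        self._keys = {}  # user_id -> secret key

    def register(self, user_id):
        key = secrets.token_bytes(32)
        self._keys[user_id] = key
        return key

    def revoke(self, pseudonym, context):
        # Only under (legally defined) conditions: the authority can test
        # which registered user a given pseudonym belongs to.
        for user_id, key in self._keys.items():
            if derive_pseudonym(key, context) == pseudonym:
                return user_id
        return None


authority = PseudonymAuthority()
alice_key = authority.register("alice")

p_shop = derive_pseudonym(alice_key, "bookshop")
p_forum = derive_pseudonym(alice_key, "forum")

assert p_shop != p_forum  # different pseudonym per set of transactions
assert authority.revoke(p_shop, "bookshop") == "alice"  # revocable by TTP
```

    Note that this toy version places full linking power in the authority; deployed designs (e.g. credential systems in the tradition of Chaum 1985) distribute or cryptographically constrain that power.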

  12.

    It is therefore no surprise that in the latest uproar against the new Terms of Use of Facebook, users argued for a radical deletion of their profiles, to include the deletion of all their contributions to other profiles and all their correspondence with other people. In such instances, the protection of individual privacy is valued over the integrity of discussion forums, friends’ mailboxes, posted photographs, etc.

  13.

    Anne Roth started a blog documenting her family’s everyday activities after her partner was declared a terrorist in Germany in 2007 and they found out that the family had been under police surveillance for over a year (http://annalist.noblogs.org/). Similarly, New Jersey artist Hasan Elahi started documenting every minute of his life on the Internet after he was detained by the FBI at an airport (http://trackingtransience.net/). Both assume that keeping their lives public protects their freedoms when their data is used against them.

  14.

    The information in this document is provided “as is”, and no guarantee or warranty is given that the information is fit for any particular purpose. The above referenced consortium members shall have no liability for damages of any kind including without limitation direct, special, indirect, or consequential damages that may result from the use of these materials subject to any liability which is mandatory due to applicable law.

References

  • Babcock, J.D. 1967. A brief description of privacy measures in the RUSH time-sharing system. In AFIPS ’67 (Spring): Proceedings of the April 18–20, 1967, spring joint computer conference, 301–302. New York: ACM.

  • BBC. 2002. Flashback: Rodney King and the LA riots. BBC Online, July 10, 2002.

  • Becker, Justin, and Hao Chen. 2009. Measuring privacy risk in online social networks. In Web 2.0 security symposium. Oakland.

  • Berendt, Bettina, Oliver Günther, and Sarah Spiekermann. 2005. Privacy in e-commerce: Stated preferences vs. actual behavior. Communications of the ACM 48: 101–106.

  • Braman, Sandra. 2006. Tactical memory: The politics of openness in the construction of memory. First Monday 11 (7). http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1363/1282

  • Bundesverfassungsgericht. 1983. BVerfGE 65, 1 – Volkszählung. Urteil des Ersten Senats vom 15. Dezember 1983 auf die mündliche Verhandlung vom 18. und 19. Oktober 1983—1 BvR 209, 269, 362, 420, 440, 484/83 in den Verfahren über die Verfassungsbeschwerden.

  • Chaum, David. 1985. Security without identification: Transaction systems to make big brother obsolete. Communications of the ACM 28: 1030–1044.

  • Curry, Michael R., and David Phillips. 2003. Privacy and the phenetic urge: Geodemographics and the changing spatiality of local practice. In Surveillance as social sorting: Privacy, risk, and automated discrimination, ed. David Lyon. London: Routledge.

  • Cybenko, George. 2003. A critical need, an ambitious mission, a new magazine. IEEE Security and Privacy 1 (1): 5–9.

  • Diaz, Claudia. 2005. Anonymity and privacy in electronic services. PhD thesis, Katholieke Universiteit Leuven.

  • Dingledine, Roger, Nick Mathewson, and Paul Syverson. 2002. Reputation in privacy enhancing technologies. In CFP ’02: Proceedings of the 12th annual conference on computers, freedom and privacy, 1–6. New York: ACM.

  • Domingo-Ferrer, J., and V. Torra. 2008. A critique of k-anonymity and some of its enhancements. In Third international conference on availability, reliability and security (ARES 08), 990–993. Washington, DC: IEEE Computer Society.

  • Dwork, Cynthia. 2006. Differential privacy. In ICALP, vol. 2, 1–12. Berlin: Springer.

  • EU. 1995. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Communities, 31 (November 1995).

  • Gallagher, Cornelius E. 1967. The computer and the invasion of privacy. In SIGCPR ’67: Proceedings of the fifth SIGCPR conference on computer personnel research, 108–114. New York: ACM.

  • Glaser, Edward L. 1967. A brief description of privacy measures in the Multics operating system. In AFIPS ’67 (Spring): Proceedings of the April 18–20, 1967, spring joint computer conference, 303–304. New York: ACM.

  • Graham, Stephen. 2005. Software-sorted geographies. Progress in Human Geography 29 (5): 562–580.

  • Gross, Ralph, and Alessandro Acquisti. 2005. Information revelation and privacy in online social networks. In WPES ’05: Proceedings of the 2005 ACM workshop on privacy in the electronic society, 71–80. New York: ACM.

  • Guarda, Paolo, and Nicola Zannone. 2009. Towards the development of privacy-aware systems. Information and Software Technology 51 (2): 337–350.

  • Gutwirth, Serge. 2002. Privacy and the information age. Lanham: Rowman and Littlefield Publishers.

  • Hansen, Marit. 2008. Linkage control: Integrating the essence of privacy protection into identity management. eChallenges.

  • Hildebrandt, Mireille. 2008. Profiling and the identity of the European citizen. In Profiling the European citizen: Cross disciplinary perspectives, eds. Mireille Hildebrandt and Serge Gutwirth. Dordrecht: Springer.

  • IEEE Computer Society Press Room. 2007. Computer society history committee names top 60 events (1946–2006). IEEE Website.

  • Kifer, Daniel, and Johannes Gehrke. 2006. l-diversity: Privacy beyond k-anonymity. In IEEE 22nd international conference on data engineering (ICDE’06). Washington, DC: IEEE Computer Society.

  • Lederer, Scott, Jason I. Hong, Anind K. Dey, and James A. Landay. 2004. Personal privacy through understanding and action: Five pitfalls for designers. Personal Ubiquitous Computing 8: 440–454.

  • Lewis, Paul. 2009. Video reveals G20 police assault on man who died. The Guardian, April 7, 2009.

  • Li, Ninghui, and Tiancheng Li. 2007. t-closeness: Privacy beyond k-anonymity and l-diversity. In IEEE 23rd international conference on data engineering (ICDE’07). Los Alamitos, CA: IEEE Computer Society Press.

  • Liu, Hugo, Pattie Maes, and Glorianna Davenport. 2006. Unraveling the taste fabric of social networks. International Journal on Semantic Web and Information Systems 2 (1): 42–71.

  • Lyon, David. 2002. Editorial. Surveillance studies: Understanding visibility, mobility and the phenetic fix. Surveillance and Society 1 (1): 1–7.

  • McGrath, John. 2004. Loving big brother: Performance, privacy and surveillance space. London: Routledge.

  • Nguyen, David H., and Elizabeth D. Mynatt. 2002. Privacy mirrors: Understanding and shaping socio-technical ubiquitous computing. Technical report. Georgia Institute of Technology.

  • Nissenbaum, Helen. 2004. Privacy as contextual integrity. Washington Law Review 79 (1): 101–139.

  • Orlikowski, Wanda J. 2007. Sociomaterial practices: Exploring technology at work. Organization Studies 28: 1435–1448.

  • Owad, T. 2006. Data mining 101: Finding subversives with Amazon wishlists. http://www.applefritter.com/bannedbooks.

  • Palen, Leysia, and Paul Dourish. 2003. Unpacking “privacy” for a networked world. In CHI ’03: Proceedings of the SIGCHI conference on human factors in computing systems, 129–136. New York: ACM.

  • Paul, Ryan. 2007. Security expert used Tor to collect government e-mail passwords. Ars Technica, September 2007.

  • Petersen, H.E., and R. Turn. 1967. System implications of information privacy. In AFIPS ’67 (Spring): Proceedings of the April 18–20, 1967, spring joint computer conference, 291–300. New York: ACM.

  • Pfitzmann, Andreas, and Marit Hansen. 2008. Anonymity, unobservability, and pseudonymity: A consolidated proposal for terminology. Technical report. Technical University, Dresden.

  • Phillips, David J. 2004. Privacy policy and PETs. New Media and Society 6 (6): 691–706.

  • Rebollo-Monedero, David, Jordi Forné, and Josep Domingo-Ferrer. 2008. From t-closeness to PRAM and noise addition via information theory. In PSD ’08: Proceedings of the UNESCO Chair in data privacy international conference on privacy in statistical databases. Berlin: Springer.

  • Rouvroy, Antoinette. 2009. Technology, virtuality and utopia. In Reading panel on autonomic computing, human identity and legal subjectivity: Legal philosophers meet philosophers of technology, CPDP 2009. Heidelberg: Springer.

  • Solove, Daniel J. 2006. A taxonomy of privacy. University of Pennsylvania Law Review 154 (3): 477 (January 2006).

  • Stalder, Felix. 2002. The failure of privacy enhancing technologies (PETs) and the voiding of privacy. Sociological Research Online 7 (2). http://www.socresonline.org.uk/7/2/stalder.html.

  • Sweeney, Latanya. 2002. k-anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 10 (5): 557–570.

  • Sweeney, Latanya. 2003. Achieving k-anonymity privacy protection using generalization and suppression. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 10 (5): 571–588.

  • Tavani, Herman T., and James H. Moor. 2001. Privacy protection, control of information, and privacy-enhancing technologies. ACM SIGCAS Computers and Society 31 (1): 6–11.

  • Titus, James P. 1967. Security and privacy. Communications of the ACM 10 (6): 379–381.

  • Wagstaff, Evan. 2007. Court case decision reveals dangers of networking sites. Daily Nexus News, February 2007.

  • Wang, Yang, and Alfred Kobsa. 2006. Privacy enhancing technologies. In Handbook of research on social and organizational liabilities in information security, eds. M. Gupta and R. Sharman. Hershey, PA: IGI Global.

  • Ware, Willis H. 1967a. Security and privacy in computer systems. In AFIPS ’67 (Spring): Proceedings of the April 18–20, 1967, spring joint computer conference, 279–282. New York: ACM.

  • Ware, Willis H. 1967b. Security and privacy: Similarities and differences. In AFIPS ’67 (Spring): Proceedings of the April 18–20, 1967, spring joint computer conference, 287–290. New York: ACM.

  • Westin, A.F. 1970. Privacy and freedom. London: Bodley Head.

  • Whitten, Alma, and J.D. Tygar. 1999. Why Johnny can’t encrypt: A usability evaluation of PGP 5.0. In SSYM’99: Proceedings of the 8th conference on USENIX security symposium. Berkeley: USENIX Association.

  • Wills, David, and Stuart Reeves. 2009. Facebook as a political weapon: Information in social networks. British Politics 4 (2): 265–281.

  • Zheleva, E., and L. Getoor. 2009. To join or not to join: The illusion of privacy in social networks with mixed public and private user profiles. In WWW ’09: Proceedings of the 18th international conference on World Wide Web. New York: ACM.

  • Zwick, Detlev, and Nikhilesh Dholakia. 2003. Whose identity is it anyway? Consumer representation in the age of database marketing. Journal of Macromarketing 24 (1): 31–43.


Acknowledgements

This work was supported in part by the Concerted Research Action (GOA) Ambiorics 2005/11 of the Flemish Government, by the IAP Programme P6/26 BCRYPT of the Belgian State (Belgian Science Policy), and by the European Community’s Seventh Framework Programme (FP7/2007–2013) under grant agreement no. 216287 (TAS3—Trusted Architecture for Securely Shared Services) (see note 14). The authors also wish to thank Carmela Troncoso, Claudia Diaz, Nathalie Trussart, Andreas Pfitzmann, Brendan van Alsenoy, Sarah Bracke, Manu Luksch and Aaron K. Martin for their valuable and critical comments.

Author information

Correspondence to Seda Gürses.



Copyright information

© 2010 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Gürses, S., Berendt, B. (2010). PETs in the Surveillance Society: A Critical Review of the Potentials and Limitations of the Privacy as Confidentiality Paradigm. In: Gutwirth, S., Poullet, Y., De Hert, P. (eds) Data Protection in a Profiled World. Springer, Dordrecht. https://doi.org/10.1007/978-90-481-8865-9_19

