Whom to Trust? Using Technology to Enforce Privacy

Chapter in Enforcing Privacy

Part of the book series: Law, Governance and Technology Series (ISDP, volume 25)

Abstract

A wide variety of technologies and tools have been proposed to improve privacy protection. We first review these technologies according to two criteria: the functionality they provide and the actors involved in their use. The main classes of functionalities are information hiding (e.g., anonymisation, encryption, etc.), information management (subject privacy policies, user interfaces, etc.), transparency (dashboards, controller privacy policies) and accountability (traceability, log management, etc.). As far as the actors involved are concerned, we identify three main categories: the data subject, trusted third parties and peers. The categories of actors required to deploy a tool can have a great impact on its usability and on the type of protection and trust provided by the tool. The role of the subject is also a critical aspect that requires careful thought: it is related to the notion of consent, its value for privacy protection, but also its limitations and the risks of relying too much on it. In conclusion, we review some of the main challenges in this area, including the issues raised by the large-scale exploitation of data (“big data”) and the effective implementation of privacy by design and accountability.

Notes

  1.

    Goldberg, Ian, David Wagner and Eric A. Brewer, “Privacy-Enhancing Technologies for the Internet”, IEEE COMPCON ’97, February 1997. Goldberg, Ian, “Privacy-Enhancing Technologies for the Internet III: Ten years later”, Chapter 1, in Alessandro Acquisti, Stefanos Gritzalis, Costas Lambrinoudakis and Sabrina di Vimercati (eds.), Digital Privacy: Theory, Technologies, and Practices, December 2007. Danezis, George, and Seda Gürses, “A critical review of 10 years of privacy technology”, in Surveillance Cultures: A Global Surveillance Society?, UK, April 2010. Diaz, Claudia, and Seda Gürses, “Understanding the landscape of privacy technologies”, Extended abstract of invited talk in proceedings of the Information Security Summit, 2012, pp. 58–63. Gürses, Seda, and Bettina Berendt, “PETs in the surveillance society: a critical review of the potentials and limitations of the privacy as confidentiality paradigm”, in Serge Gutwirth, Yves Poullet and Paul De Hert (eds.), Data Protection in a Profiled World, Springer Verlag, 2009. Shen, Yun, and Siani Pearson, “Privacy-enhancing Technologies: A Review”, HP Laboratories HPL-2011-113.

  2.

    Borking, John J., “Why Adopting Privacy-enhancing Technologies (PETs) Takes so Much Time”, in Serge Gutwirth, Yves Poullet, Paul De Hert and Ronald Leenes (eds.), Computers, Privacy and Data Protection: an Element of Choice, Springer Verlag, 2011, pp. 309–341.

  3.

    Diaz, Claudia, Omer Tene and Seda F. Guerses, “Hero or Villain: The Data Controller in Privacy Law and Technologies”, Ohio State Law Journal, Vol. 74, No. 6, 2013.

  4.

    In this chapter, we use the expression “data controller” to refer to the entity collecting and processing personal data. More precisely, the data controller is defined as “the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data” by Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of such Data, European Parliament, Brussels, 1995.

  5.

    A way to enhance trust in the technology is to have all technological components entirely available in source code and open to scrutiny by communities of experts (even though this scrutiny does not bring absolute guarantees either, as illustrated by the famous Heartbleed security bug in the OpenSSL cryptography library: https://en.wikipedia.org/wiki/Heartbleed?oldid=cur).

  6.

    In particular, we do not discuss techniques and protocols dedicated to specific applications such as e-voting, even though they are very challenging and give rise to interesting and potentially wide-ranging research work.

  7.

    See, for example, Bruce Schneier’s cryptography classics library: Applied cryptography (Wiley, 1995), Practical cryptography (Wiley, 2003) and Secrets and lies (Wiley, 2004) or Mao, Wenbo, Modern cryptography: theory and practice, HP Professional Series, 2003.

  8.

    Organisation for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, 1980. See also Thirty Years After The OECD Privacy Guidelines, 2011, http://www.oecd.org/sti/ieconomy/49710223.pdf and the latest version of the guidelines: The Recommendation of the OECD Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data, July 2013. http://www.oecd.org/sti/ieconomy/2013-oecd-privacy-guidelines.pdf

  9.

    Deng, Mina, Kim Wuyts, Riccardo Scandariato, Bart Preneel and Wouter Joosen, “A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements”, Requirements Engineering, Special Issue on Digital Privacy, Vol. 16, Issue 1, March 2011, pp. 3–32.

  10.

    See, for example, Kokott, Juliane, and Christoph Sobotta, “The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR”, International Data Privacy Law, Vol. 3, No. 4, 2013.

  11.

    Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), voted by the European Parliament on 12 March 2014.

  12.

    See, for example, Wuala (https://www.wuala.com/en/), Spideroak (https://spideroak.com/) or owncloud (http://owncloud.org/features/).

  13.

    Data are encrypted locally before being transferred to the cloud, and the decryption key is not disclosed to the cloud service provider, so that users keep full control over their data.
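
    A minimal sketch of this pattern (illustrative only, not the code of any of the services cited above), using the third-party Python "cryptography" package; the upload step is a hypothetical call:

```python
# Encrypt locally with a key that never leaves the user's machine, then hand
# only the ciphertext to the cloud provider (sketch; requires "cryptography").
from cryptography.fernet import Fernet

def encrypt_for_cloud(plaintext: bytes) -> tuple:
    """Return (key, ciphertext); only the ciphertext is uploaded."""
    key = Fernet.generate_key()          # kept in a local keystore, never uploaded
    return key, Fernet(key).encrypt(plaintext)

key, blob = encrypt_for_cloud(b"medical record")
# upload(blob)                           # hypothetical call: provider sees blob only
print(Fernet(key).decrypt(blob))         # b'medical record' -- key holder only
```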

  14.

    For a survey on data minimisation properties, see: Pfitzmann, Andreas, and Marit Hansen, “A terminology for talking about privacy by data minimization: Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management”, Version v0.34, 10 August 2010. https://dud.inf.tu-dresden.de/literatur/Anon_Terminology_v0.34.pdf

  15.

    More precisely, anonymity is defined as the fact that a subject is not identifiable (i.e., cannot be uniquely characterised) within a set of possible subjects. Indeed, identity is only one of the possible ways of characterising uniquely a subject.

  16.

    Pretty Good Privacy.

  17.

    GNU Privacy Guard.

  18.

    These tools provide further functionalities, such as signature and authentication but we focus on confidentiality here.

  19.

    https://crypto.cat/#

  20.

    Off-the-Record Messaging.

  21.

    Actually, OTR provides further guarantees including deniability (a user can always deny having sent a message, just as he can deny having said something in a real-life conversation) and perfect forward secrecy (if the computer of a user is compromised or his secrets are stolen, the messages previously sent by other users remain secret).

  22.

    SSL (Secure Sockets Layer) is the ancestor of TLS (Transport Layer Security).

  23.

    Virtual Private Networks.

  24.

    Add-ons such as HTTPS Everywhere are also available to force HTTPS versions of websites when they are available.

  25.

    Goldberg, Ian, “Privacy-Enhancing Technologies for the Internet III: Ten years later”, Chapter 1, in Alessandro Acquisti, Stefanos Gritzalis, Costas Lambrinoudakis and Sabrina di Vimercati (eds.), Digital Privacy: Theory, Technologies, and Practices, Auerbach Publications, New York, December 2007.

  26.

    Beato, Filipe, Markulf Kohlweiss and Karel Wouters, “Scramble! Your Social Network Data”, in Simone Fischer-Hübner and Nicholas Hopper (eds.), Privacy-enhancing Technologies, Proceedings of the 11th International Symposium, PETS 2011, Waterloo, ON, Canada, 27–29 July 2011, Springer, Heidelberg, 2011, pp. 211–225.

  27.

    De Cristofaro, Emiliano, Claudio Soriente, Gene Tsudik and Andrew Williams, “Hummingbird: Privacy at the Time of Twitter”, in IEEE Symposium on Security and Privacy, San Francisco, CA, 21–23 May 2012, pp. 285–299.

  28.

    A time correlation attack is possible when an attacker watching the network can establish a correlation between different events, for example, the sending of a message and its receipt, based on their times of occurrence.

  29.

    A size correlation attack is possible when an attacker watching the network can establish a correlation between different events, for example, the sending of a message and its receipt, based on their sizes.
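
    A classic countermeasure against size correlation is to pad every message to a fixed size before encryption, so that ciphertext lengths carry no information. A minimal sketch follows (the 1 KB bucket and the length-prefix framing are illustrative choices, not taken from any cited system):

```python
BUCKET = 1024  # bytes: all padded messages share this length on the wire

def pad(message: bytes) -> bytes:
    if len(message) + 4 > BUCKET:
        raise ValueError("message too long for one bucket")
    header = len(message).to_bytes(4, "big")          # record the real length
    return header + message + b"\x00" * (BUCKET - 4 - len(message))

def unpad(padded: bytes) -> bytes:
    return padded[4:4 + int.from_bytes(padded[:4], "big")]

assert unpad(pad(b"hi")) == b"hi"
assert len(pad(b"hi")) == len(pad(b"x" * 900))        # indistinguishable sizes
```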

  30.

    Goldberg, op. cit.

  31.

    https://www.anonymizer.com/anonymizer_universal.html

  32.

    Ixquick was awarded the first European Privacy Seal in 2008. It offers an anonymous proxy browsing option in which the web pages retrieved after a search also follow the proxy route, which makes the user’s IP address invisible to the visited page: https://startpage.com and https://www.ixquick.com. Other privacy-preserving search engines are available, such as DuckDuckGo (https://duckduckgo.com/), which does not track users and does not store personal data (or use them, except for the search itself); it has enjoyed growing popularity, with more than 5 million queries per day in March 2014 (compared to fewer than 1 million in January 2012).

  33.

    https://ixquick.com/eng/company-background.html

  34.

    Especially type-0 technology.

  35.

    The Onion Router: https://www.torproject.org/. See also Syverson, Paul S., David M. Goldschlag, and Michael G. Reed, “Anonymous Connections and Onion Routing”, in Proceedings of the 18th Annual Symposium on Security and Privacy, IEEE CS Press, May 1997, pp. 44–54. Goldschlag, David, Michael Reed and Paul Syverson, “Onion Routing for Anonymous and Private Internet Connections”, Communications of the ACM, Vol. 42, No. 2, February 1999, pp. 39–41.

  36.

    More precisely, this type of system relies on a form of distributed trust: privacy can be breached only in case of collusion between a significant number of malicious nodes. For example, the anonymity of a Tor user can be broken only if all the nodes relaying his communication collude.
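
    The layering behind this distributed trust can be illustrated with a toy example: the sender wraps the message in one encryption layer per relay, and each relay can peel exactly one layer, so no single relay links the sender to the destination. The sketch below assumes pre-shared Fernet keys for three relays; real Tor negotiates circuit keys interactively:

```python
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]   # entry, middle, exit

def wrap(message: bytes, keys: list) -> bytes:
    for key in reversed(keys):           # innermost layer is for the exit relay
        message = Fernet(key).encrypt(message)
    return message

onion = wrap(b"GET /page", relay_keys)
for key in relay_keys:                   # each relay removes its own layer
    onion = Fernet(key).decrypt(onion)
print(onion)                             # b'GET /page' emerges only at the exit
```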

  37.

    Actually, Tor provides an additional functionality to withhold part of this information.

  38.

    https://en.wikipedia.org/wiki/Tor_%28anonymity_network%29#Licit_and_illicit_uses

  39.

    Cutillo, Leucio Antonio, Refik Molva and Thorsten Strufe, “Safebook: a privacy preserving online social network leveraging on real-life trust”, IEEE Communications Magazine, Vol. 47, No. 12, December 2009. Diaspora is another example of a decentralised social network hosted by independently owned nodes (called pods); the goals of its creators are to use its distributed design to ensure that “no big corporation will ever control Diaspora” and that users keep control over their personal data. https://blog.diasporafoundation.org/

  40.

    With the assumption that an attacker cannot observe the whole network.

  41.

    http://cappris.inria.fr/wp-content/uploads/2013/04/S%C3%A9bastien-Gambs.pdf

  42.

    ISO/IEC DIS 24760-2, Information Technology – Security Techniques – A Framework for Identity Management – Part 2: Reference architecture and requirements.

  43.

    For example, Microsoft Passport, CardSpace and its successor U-Prove, or the Liberty Alliance, now integrated within the Kantara Initiative.

  44.

    Federated Identity Management.

  45.

    Camenisch, Jan, and Els Van Herreweghen, “Design and Implementation of the Idemix Anonymous Credential System”, Proceedings of the 9th ACM Conference on Computer and Communications Security, CCS ’02, 2002, pp. 21–30.

  46.

    The technique used in Idemix is called a “zero-knowledge proof” because it reveals no information other than the veracity of the statement being proved (see Sect. 17.2.3). In addition, a probabilistic algorithm is used to produce the proofs, which is essential to ensure unlinkability (because a new proof is produced for each use).
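
    A minimal sketch of a Schnorr-style zero-knowledge proof of knowledge of a discrete logarithm, the kind of primitive underlying anonymous credential systems (toy parameters, not the actual Idemix protocol); the fresh randomness r is what makes each proof different and hence unlinkable:

```python
import secrets

p = 2**127 - 1                           # a Mersenne prime (toy-sized group)
g = 3

x = secrets.randbelow(p - 1)             # the prover's secret
y = pow(g, x, p)                         # the public value: y = g^x mod p

r = secrets.randbelow(p - 1)             # fresh randomness for this proof
commitment = pow(g, r, p)                # prover -> verifier
challenge = secrets.randbelow(p - 1)     # verifier -> prover
response = (r + challenge * x) % (p - 1) # prover -> verifier

# The check succeeds iff the prover knew x, yet reveals nothing about x.
assert pow(g, response, p) == commitment * pow(y, challenge, p) % p
```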

  47.

    Actually, true accountability can be guaranteed only if the true identity of the user was authenticated upon registering his pseudonym.

  48.

    Bitcoin has attracted much attention though, and some initiatives have been taken recently, for example, Expedia’s accepting Bitcoin payments for hotel bookings (http://www.bbc.com/news/technology-27810008) or the Monoprix e-commerce branch’s announcing its decision to accept Bitcoins before 2015. http://www.latribune.fr/entreprises-finance/services/distribution/20140409trib000824457/pourquoi-monoprix-a-decide-d-accepter-les-bitcoins.html. It should also be noted that the anonymity provided by Bitcoin has been challenged recently: see Meiklejohn, Sarah, Marjori Pomarole, Grant Jordan, Kirill Levchenko, Damon McCoy, Geoffrey M. Voelker and Stefan Savage, “A Fistful of Bitcoins: Characterizing Payments Among Men with No Names”, ;login: The USENIX Magazine, Vol. 38, No. 6, December 2013.

  49.

    https://en.wikipedia.org/wiki/Bitcoin#Criminal_activity

  50.

    Deswarte, Yves, and Sébastien Gambs, “A proposal for a privacy-preserving national identity card”, Transactions on Data Privacy, Vol. 3, Issue 3, December 2010, pp. 253–276.

  51.

    Just to take two examples, fingerprints and the iris are attributes that are, by definition, unique (or nearly so) to a person and cannot be replaced if they are “stolen” by an impostor.

  52.

    Ratha, N.K., J.H. Connell and R.M. Bolle, “Enhancing security and privacy in biometrics based authentication systems”, IBM Systems Journal, Vol. 40, No. 3, 2001, pp. 614–634.

  53.

    Gentry, Craig, “Fully Homomorphic Encryption Using Ideal Lattices”, in the 41st ACM Symposium on Theory of Computing (STOC), 2009. https://www.cs.cmu.edu/~odonnell/hits09/gentry-homomorphic-encryption.pdf

  54.

    Paillier, Pascal, “Public-Key Cryptosystems Based on Composite Degree Residuosity Classes”, EUROCRYPT, Springer, 1999, pp. 223–238.
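
    Paillier encryption is additively homomorphic: the product of two ciphertexts decrypts to the sum of the plaintexts. A toy sketch with deliberately insecure parameter sizes (for illustration only):

```python
import math, secrets

p, q = 293, 433                        # toy primes; real keys use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1                              # standard simple choice of generator

def enc(m: int) -> int:
    r = secrets.randbelow(n - 2) + 1   # fresh randomness, coprime with n
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 1
    return pow(g, m, n2) * pow(r, n, n2) % n2

def dec(c: int) -> int:
    L = lambda u: (u - 1) // n
    mu = pow(L(pow(g, lam, n2)), -1, n)
    return L(pow(c, lam, n2)) * mu % n

a, b = 17, 25
assert dec(enc(a) * enc(b) % n2) == a + b   # homomorphic addition
```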

  55.

    Often called the hiding and binding properties.
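
    A minimal hash-based commitment sketch illustrating both properties: the random nonce hides the committed value, and the hash binds the committer to it (with a cryptographic hash these guarantees are computational, not absolute):

```python
import hashlib, secrets

def commit(value: bytes) -> tuple:
    nonce = secrets.token_bytes(32)             # fresh randomness gives hiding
    return hashlib.sha256(nonce + value).digest(), nonce

def opens(digest: bytes, nonce: bytes, value: bytes) -> bool:
    return hashlib.sha256(nonce + value).digest() == digest

digest, nonce = commit(b"bid: 100")
assert opens(digest, nonce, b"bid: 100")        # honest opening succeeds
assert not opens(digest, nonce, b"bid: 999")    # binding: no second opening
```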

  56.

    Quisquater, Jean-Jacques, Louis C. Guillou and Thomas A. Berson, “How to explain Zero-Knowledge protocols to your children”, in Proceedings on Advances in Cryptology – CRYPTO ’89, Springer-Verlag, New York, 1989, pp. 628–631. Goldwasser, Shafi, Silvio Micali and Charles Rackoff, “The knowledge complexity of interactive proof-systems”, in Robert Sedgewick (ed.), Proceedings of 17th Symposium on the Theory of Computation, Providence, RI, ACM, 1985.

  57.

    Yao, Andrew Chi-Chih, “Protocols for Secure Computations”, Extended Abstract, FOCS 1982, pp. 160–164.

  58.

    De Cristofaro, Emiliano, and Gene Tsudik, “Practical private set intersection protocols with linear complexity”, in Financial Cryptography, Springer-Verlag, Berlin, 2010, pp. 143–159.

  59.

    Jawurek, Marek, Florian Kerschbaum and George Danezis, “Privacy Technologies for Smart Grids – A Survey of Options”, Microsoft Technical Report, MSR-TR-2012-119, 2012. Rial, Alfredo, and George Danezis, “Privacy-Preserving Smart Metering”, Proceedings of the 2011 ACM Workshop on Privacy in the Electronic Society, WPES, 2011. Acs, Gergely, and Claude Castelluccia, “I Have a DREAM!: Differentially Private Smart Metering”, Proceedings of the 13th International Conference on Information Hiding, Springer Verlag, 2011, pp. 118–132.

  60.

    Garcia, Flavio D., and Bart Jacobs, “Privacy-Friendly Energy-Metering via Homomorphic Encryption”, in Jorge Cuellar, Javier Lopez, Gilles Barthe and Alexander Pretschner (eds.), Security and Trust Management (STM’2010), Springer, 2011, pp. 226–238.

  61.

    This level of protection is often measured in terms of differential privacy, a formal privacy metric providing a characterisation of privacy in terms of the knowledge gained by a powerful adversary (possessing all possible auxiliary information).
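
    The standard way to achieve ε-differential privacy for a numeric query is the Laplace mechanism; a minimal sketch for a counting query (sensitivity 1, since adding or removing one individual changes a count by at most 1):

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential variables is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(true_count: int, epsilon: float) -> float:
    return true_count + laplace_noise(1.0 / epsilon)   # scale = sensitivity/epsilon

print(private_count(1042, epsilon=0.5))   # noisy but statistically useful answer
```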

  62.

    Balasch, Josep, Alfredo Rial, Carmela Troncoso, Christophe Geuens, Bart Preneel and Ingrid Verbauwhede, “PrETP: Privacy-Preserving Electronic Toll Pricing”, 19th USENIX Security Symposium, USENIX Association, 2010, pp. 63–78. Troncoso, Carmela, George Danezis, Eleni Kosta and Bart Preneel, “PriPAYD: Privacy Friendly Pay-As-You-Drive Insurance”, Proceedings of the 6th ACM workshop on Privacy in the electronic society (WPES), 2007. de Jonge, Wiebren, and Bart Jacobs, “Privacy-friendly electronic traffic pricing via commits”, in P. Degano, J. Guttman and F. Martinelli (eds.), Formal Aspects in Security and Trust, Springer, 2009, pp. 143–161.

  63.

    Chor, Benny, Eyal Kushilevitz, Oded Goldreich and Madhu Sudan, “Private Information Retrieval”, Journal of the ACM, Vol. 45, No. 6, 1998, pp. 965–981.

  64.

    Guha, Saikat, Bin Cheng and Paul Francis, “Privad: Practical Privacy in Online Advertising”, USENIX Symposium on Networked Systems Design and Implementation, NSDI 2011.

  65.

    Toubiana, Vincent, Arvind Narayanan, Dan Boneh, Helen Nissenbaum and Solon Barocas, “Adnostic: Privacy Preserving Targeted Advertising”, Network and Distributed System Security Symposium 2010.

  66.

    Backes, Michael, Aniket Kate, Matteo Maffei and Kim Pecina, “ObliviAd: Provably Secure and Practical Online Behavioral Advertising”, in Proceedings of 33rd IEEE Symposium on Security and Privacy (S&P 2012), 2012, pp. 257–271.

  67.

    Howe, Daniel C., and Helen Nissenbaum, “TrackMeNot: resisting surveillance in web search”, in Ian Kerr, Carole Lucock and Valerie Steeves (eds.), Lessons from the Identity Trail: Privacy, Anonymity and Identity in a Networked Society, Oxford University Press, Oxford, 2008.

  68.

    Krumm, John, “A survey of computational location privacy”, Personal Ubiquitous Computing, Vol. 13, 2009, pp. 391–399. Gambs, Sébastien, Marc-Olivier Killijian and Miguel Núñez del Prado Cortez, “GEPETO: A GEoPrivacy-Enhancing TOolkit”, AINA Workshops, 2010, pp. 1071–1076.

  69.

    See Sect. 17.2.4.

  70.

    Damiani, Maria Luisa, Elisa Bertino and Claudio Silvestri, “The PROBE framework for the personalized cloaking of private locations”, Transactions on Data Privacy, Vol. 3, No. 2, 2010, pp. 123–148.

  71.

    Beresford, Alastair R., and F. Stajano, “Location privacy in pervasive computing”, IEEE Pervasive Computing, Vol. 3, No. 1, 2003, pp. 46–55.

  72.

    See, for example, Kosta, Eleni, Jan Zibuschka, Tobias Scherner and Jos Dumortier, “Legal considerations on privacy-enhancing location based services using PRIME technology”, Computer Law and Security Report, Vol. 24, Issue 2, 2008, pp. 139–146.

  73.

    Allard, Tristan, Nicolas Anciaux, Luc Bouganim, Yanli Guo, Lionel Le Folgoc, Benjamin Nguyen, Philippe Pucheral, Indrajit Ray, Indrakshi Ray and Shaoyi Yin, “Secure Personal Data Servers: a Vision Paper”, in Elisa Bertino, Paolo Atzeni, Kian Lee Tan, Yi Chen and Y.C. Tay (eds.), Proceedings of the 36th International Conference on Very Large Data Bases (VLDB), Vol. 1, No. 1, 2010, pp. 25–35.

  74.

    Anciaux, N., M. Benzine, L. Bouganim, K. Jacquemin, P. Pucheral and S. Yin, “Restoring the Patient Control over her Medical History”, 21st IEEE International Symposium on Computer-Based Medical Systems (IEEE CBMS), Finland, June 2008.

  75.

    http://data.gov.uk/library/mydex

  76.

    Mydex, The case for personal information empowerment: the rise of the personal data store.

  77.

    For example, the primary purpose can be the clinical monitoring of patients or payment, and the secondary purpose medical research or the extraction of buying patterns.

  78.

    Fung, Benjamin C.M., Ke Wang, Rui Chen and Philip S. Yu, “Privacy-preserving data publishing: A survey of recent developments”, ACM Computing Surveys, Vol. 42, No. 4, June 2010.

  79.

    One famous case was the re-identification of the former governor of the state of Massachusetts in a published medical database, through a link with a public voter list.

  80.

    Eighty-seven per cent of the American population have a unique combination of zip code, date of birth and gender, according to Sweeney, Latanya, “Re-identification of De-identified Survey Data”, Carnegie Mellon University, School of Computer Science, Data Privacy Laboratory, Technical Report, 2000.

  81.

    Fung, Benjamin C.M., Ke Wang, Ada Wai-Chee Fu and Philip S. Yu, Introduction to Privacy-Preserving Data Publishing: Concepts and Techniques, Chapman & Hall/CRC, August 2010.

  82.

    Because they may lead to inconsistent databases or break interesting relations between attributes.

  83.

    Sweeney, Latanya, “k-anonymity: A model for protecting privacy”, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol. 10, No. 5, 2002, pp. 557–570.

  84.

    Machanavajjhala, Ashwin, Daniel Kifer, Johannes Gehrke and Muthuramakrishnan Venkitasubramaniam, “l-diversity: Privacy beyond k-anonymity”, ACM Transactions on Knowledge Discovery from Data (TKDD), Vol. 1, No. 1, March 2007.

  85.

    If l-diversity were not met and, for example, all records with zip code equal to “92340” and gender equal to “male” were such that the associated illness is “cancer”, then it would be possible to infer that someone with these quasi-identifiers has cancer, even if k-anonymity were met for a given k.
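
    Both properties are easy to check mechanically once the records are grouped by their quasi-identifiers; a toy sketch with hypothetical records echoing the example above:

```python
from collections import defaultdict

records = [                                    # (zip, gender, illness)
    ("92340", "male", "cancer"), ("92340", "male", "cancer"),
    ("92340", "male", "flu"),    ("75001", "female", "diabetes"),
    ("75001", "female", "flu"),  ("75001", "female", "asthma"),
]

groups = defaultdict(list)                     # equivalence classes
for zip_code, gender, illness in records:
    groups[(zip_code, gender)].append(illness)

def k_anonymous(k: int) -> bool:               # every class has >= k records
    return all(len(v) >= k for v in groups.values())

def l_diverse(l: int) -> bool:                 # >= l distinct sensitive values
    return all(len(set(v)) >= l for v in groups.values())

print(k_anonymous(3))   # True: each quasi-identifier combination covers 3 people
print(l_diverse(3))     # False: ("92340", "male") holds only 2 distinct illnesses
```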

  86.

    Dwork, Cynthia, “Differential privacy”, in ICALP (2), 2006, pp. 1–12. Dwork, Cynthia, “A firm foundation for private data analysis”, Communications of the ACM, Vol. 54, No. 1, 2011, pp. 86–95.

  87.

    Verykios, Vassilios S., Elisa Bertino, Igor Nai Fovino, Loredana Parasiliti Provenza, Yucel Saygin and Yannis Theodoridis, “State-of-the-art in Privacy Preserving Data Mining”, in SIGMOD Record, Vol. 33, No. 1, March 2004, pp. 50–57.

  88.

    Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), voted by the European Parliament on 12 March 2014.

  89.

    In addition, they are rarely read by the subject because they often take the form of long and tedious legal documents that she has neither the time nor the legal background to read.

  90.

    For example, when she leaves traces of her activities on servers or when third parties use cookies to track her on her own device, or when mobile phone applications plunder her address book or collect her location and forward this information to advertising brokers.

  91.

    Typically, they can be designated in a generic or deliberately vague way in the privacy policy of the data controller (“our partners”, “publishers”, “advertisers”, “trusted businesses”).

  92.

    Hildebrandt, Mireille, and Bert-Jaap Koops, “The Challenges of Ambient Law and Legal Protection in the Profiling Era”, The Modern Law Review, Vol. 73, Issue 3, May 2010, pp. 428–460.

  93.

    Privacy Bird is an example of a browser add-on (for Internet Explorer) that provides this facility.
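
    The comparison such a user agent performs can be sketched as follows (an illustrative toy, not Privacy Bird’s actual engine; the policy fields are made up):

```python
site_policy = {"purposes": {"service", "advertising"}, "retention_days": 365}
user_prefs = {"allowed_purposes": {"service"}, "max_retention_days": 90}

def violations(policy: dict, prefs: dict) -> list:
    issues = []
    extra = policy["purposes"] - prefs["allowed_purposes"]
    if extra:
        issues.append("undesired purposes: " + ", ".join(sorted(extra)))
    if policy["retention_days"] > prefs["max_retention_days"]:
        issues.append("data retained longer than preferred")
    return issues

print(violations(site_policy, user_prefs))
# ['undesired purposes: advertising', 'data retained longer than preferred']
```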

  94.

    Google Dashboard includes this feature but it shows only a subset of the collected data, which may actually be misleading for the users.

  95.

    For example, Scott Lederer et al. identify five pitfalls for designers (obscuring potential information flow, obscuring actual information flow, emphasising configuration over action, lacking coarse-grained control and inhibiting existing practice) and they show how existing systems either fall into these pitfalls or avoid them.

  96.

    http://primelife.ercim.eu/ (the dashboard does not seem to be maintained any longer).

  97.

    Paul, Thomas, Daniel Puscher and Thorsten Strufe, “Improving the Usability of Privacy Settings in Facebook”, CoRR abs/1109.6046, 2011.

  98.

    Wang, Yang, Saranga Komanduri, Pedro Leon, Gregory Norcie, Alessandro Acquisti and Lorrie Faith Cranor, “‘I regretted the minute I pressed share’: A Qualitative Study of Regrets on Facebook”, Proceedings of the Seventh Symposium on Usable Privacy and Security (SOUPS), ACM, July 2011.

  99.

    http://www.mozilla.org/en-US/lightbeam/

  100.

    Enck, William, Peter Gilbert, Byung-Gon Chun, Landon P. Cox, Jaeyeon Jung, Patrick McDaniel and Anmol N. Sheth, “TaintDroid: An Information-flow Tracking System for Realtime Privacy Monitoring on Smartphones”, Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation, OSDI’10, 2010, pp. 1–6.

  101.

    Achara, Jagdish Prasad, Franck Baudot, Claude Castelluccia, Geoffrey Delcroix and Vincent Roca, “Mobilitics: analyzing privacy leaks in smart phones”, ERCIM News, 93, April 2013.

  102.

    https://panopticlick.eff.org/

  103.

    Eckersley, Peter, “How Unique Is Your Web Browser?”, in Mikhail J. Atallah and Nicholas J. Hopper (eds.), Privacy-enhancing Technologies, Springer, 2010, pp. 1–18.

  104.

    http://tosdr.org/

  105.

    https://tosback.org/

  106.

    Ackerman, Mark S., and Lorrie Cranor, “Privacy Critics: UI Components to Safeguard Users’ Privacy”, CHI ’99 Extended Abstracts on Human Factors in Computing Systems, CHI EA’ 99, ACM, 1999, pp. 258–259.

  107.

    Bonneau, Joseph, Jonathan Anderson and Luke Church, “Privacy suites: shared privacy for social networks”, SOUPS 2009.

  108.

    See, for example, Barth, Adam, Anupam Datta, John C. Mitchell and Helen Nissenbaum, “Privacy and Contextual Integrity: Framework and Applications”, Proceedings of the 2006 IEEE Symposium on Security and Privacy, SP ’06, IEEE Computer Society, 2006, pp. 184–198. Becker, Moritz Y., Alexander Malkis and Laurent Bussard, “S4P: A Generic Language for Specifying Privacy Preferences and Policies”, Technical report MSR-TR-2010-32, Microsoft Research, April 2010. Le Métayer, Daniel, “A formal privacy management framework”, in Pierpaolo Degano, Joshua D. Guttman and Fabio Martinelli (eds.), Formal Aspects of Security and Trust, Proceedings of the FAST’2008 Workshop (IFIP WG 1.7 Workshop on Formal Aspects in Security and Trust), Springer Verlag, Berlin, 2009. Barth, Adam, John C. Mitchell, Anupam Datta and Sharada Sundaram, “Privacy and utility in business processes”, Proceedings of the 20th IEEE Computer Security Foundations Symposium, 2007, pp. 279–294. Karjoth, G., M. Schunter and E.V. Herreweghen, “Translating privacy practices into privacy promises – how to promise what you can keep”, Proceedings of the IEEE 4th International Workshop on Policies for Distributed Systems and Networks, 4–6 June 2003, pp. 135–146.

  109.

    W3C, Platform for privacy preferences (P3P), W3C recommendation, 2002. http://www.w3.org/TR/P3P/

  110.

    http://www.privacybird.org/

  111.

    Karat, John, Clare-Marie Karat, Carolyn Brodie and Jinjuan Feng, “Privacy in information technology: designing to enable privacy policy management in organizations”, International Journal of Human-Computer Studies, Vol. 63, Issues 1–2, July 2005, pp. 153–174.

  112.

    European Parliament and the Council, Directive 95/46/EC of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of such Data, Brussels, 1995.

  113.

    Brodie, Carolyn A., Clare-Marie Karat and John Karat, “An empirical study of natural language parsing of privacy policy rules using the Sparcle policy workbench”, in Symposium On Usable Privacy and Security (SOUPS), 2006.

  114.

    Reidenberg, Joel, and Lorrie Faith Cranor, “Can User Agents Accurately Represent Privacy Policies?”, 30 August 2002. http://ssrn.com/abstract=328860

  115.

    Le Métayer, Daniel, and Shara Monteleone, “Automated consent through privacy agents: Legal requirements and technical architecture”, Computer Law & Security Review, Vol. 25, Issue 2, 2009, pp. 136–144.

  116.

    Le Métayer, Daniel, “A formal privacy management framework”, in Pierpaolo Degano, Joshua D. Guttman and Fabio Martinelli (eds.), Formal Aspects in Security and Trust, Proceedings of the FAST’2008 Workshop, Springer Verlag, 2009, pp. 162–176.

  117.

    Barth, Adam, Anupam Datta, John C. Mitchell and Helen Nissenbaum, “Privacy and Contextual Integrity: Framework and Applications”, Proceedings of the 2006 IEEE Symposium on Security and Privacy, SP’ 06, IEEE Computer Society, 2006, pp. 184–198.

  118.

    Becker, Moritz Y., Alexander Malkis and Laurent Bussard, “S4P: A Generic Language for Specifying Privacy Preferences and Policies”, Technical report MSR-TR-2010-32, Microsoft Research, April 2010.

  119.

    These languages can also be used to specify norms in a more general sense; for example, contextual integrity (CI) has been applied to the US HIPAA (Health Insurance Portability and Accountability Act), COPPA (Children’s Online Privacy Protection Act) and GLBA (Gramm-Leach-Bliley Act).

  120.

    http://msdn.microsoft.com/en-us/library/ie/ms537343(v=vs.85).aspx

  121.

    https://addons.mozilla.org/en-US/firefox/addon/adblock-plus/

  122.

    https://www.whitehatsec.com/aviator/

  123.

    https://itunes.apple.com/us/app/onion-browser/id519296448?mt=8

  124.

    https://guardianproject.info/apps/orweb/

  125.

    https://www.mozilla.org/en-US/dnt/

  126.

    World Wide Web Consortium: http://www.w3.org/Consortium/

  127.

    http://www.w3.org/2011/tracking-protection/

  128.

    http://www.w3.org/TR/tracking-dnt/

  129.

    Howe, Daniel C., and Helen Nissenbaum, “TrackMeNot: resisting surveillance in web search”, in Ian Kerr, Carole Lucock and Valerie Steeves (eds.), Lessons from the Identity Trail: Privacy, Anonymity and Identity in a Networked Society, Oxford University Press, Oxford, 2008.

  130.

    https://www.schneier.com/blog/archives/2006/08/trackmenot_1.html

  131.

    Users should be careful about the use of these groups though, as they are dynamic by nature: the rights granted to a group such as “friends of friends” at a given time may no longer be appropriate one year later (because the size of the group may have grown dramatically).

  132.

    See, for example, http://www.daniel-puscher.de/fpw/ or http://www.privacyfix.com/

  133.

    Fong, Philip W.L., Mohd Anwar and Zhen Zhao, “A Privacy Preservation Model for Facebook-style Social Network Systems”, Proceedings of the 14th European Conference on Research in Computer Security, ESORICS ’09, 2009, pp. 303–320.

  134.

    https://www.facebook.com/about/privacy/your-info: “Granting us permission to use your information not only allows us to provide Facebook as it exists today, but it also allows us to provide you with innovative features and services we develop in the future that use the information we receive about you in new ways.”

  135.

    Solove, Daniel J., “Privacy Self-Management and the Consent Dilemma”, Harvard Law Review, Vol. 126, No. 7, May 2013, p. 2. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2171018. Zanfir, Gabriela, “Forgetting About Consent: Why the Focus Should Be on ‘Suitable Safeguards’ in Data Protection Law”, in Serge Gutwirth et al. (eds.), Reloading Data Protection, Springer, Dordrecht, 2014.

  136.

    This may not always be obvious for non-technical users though. For example, they may not be aware that different types of cookies may be stored on their computer, some of them directly by their browser, others by Adobe Flash Player, and that removing them requires different actions.

  137.

    Ensuring Consent and Revocation (EnCoRe) is a research project undertaken by UK industry and academia, to give individuals more control over their personal information: http://www.encore-project.info/

  138.

    DRM (for Digital Rights Management) technologies are used to protect digital content (for example, music or video) by preventing illegal copies or controlling access.

  139.

    Mayer-Schönberger, Viktor, “Beyond Copyright: Managing Information Rights with DRM”, Denver University Law Review, Vol. 84, No. 1, 2006, pp. 181–198.

  140.

    Hilty, Manuel, David Basin and Alexander Pretschner, “On obligations”, Proceedings of the 10th European conference on Research in Computer Security (ESORICS’05), Springer, Dordrecht, 2005, pp. 98–117.

  141.

    http://www.trustedcomputinggroup.org/files/resource_files/3B1360F8-1D09-3519-AD75FFC52338902D/03-000216.1.03_CBIHealth.pdf

  142.

    Which is, admittedly, the intended effect for privacy enforcement when the trusted execution environment is on the side of the data controller, since the objective is to force him to fulfil the sticky privacy policies.

  143.

    Mayer-Schönberger, Viktor, Delete: The Virtue of Forgetting in the Digital Age, Princeton University Press, 2009.

  144.

    Unless they are complemented by the trusted execution environment mentioned above.

  145.

    Geambasu, Roxana, Tadayoshi Kohno, Amit Levy and Henry M. Levy, “Vanish: Increasing Data Privacy with Self-Destructing Data”, in Proceedings of the USENIX Security Symposium, 2009.

  146.

    Castelluccia, Claude, Emiliano De Cristofaro, Aurélien Francillon and Mohamed Ali Kâafar, “EphPub: Toward robust Ephemeral Publishing”, 19th IEEE Conference on Network Protocols (ICNP 2011), pp. 165–175.

  147.

    See, in particular, Druschel, Peter, Michael Backes and Rodica Tirtea, The right to be forgotten – between expectations and practice, ENISA Report, 2011. https://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/the-right-to-be-forgotten. It should also be stressed that, technically speaking, data erasure is not such a simple task because of data remanence: a residual representation of the data remains in computer memory after a simple attempt to erase it, which makes it necessary to resort to more sophisticated techniques, such as overwriting, to make it more difficult to retrieve the data after its “erasure”.
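
    A hedged sketch of erasure by overwriting follows; note that journaling file systems, SSD wear levelling, caches and backups may retain copies that such a function never touches, which is precisely why erasure is harder than it looks:

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))      # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())           # push the overwrite down to the device
    os.remove(path)
```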

  148.

    Computer scientists such as Fred Schneider have also advocated accountability for similar reasons in a more general context: “Accountability, then, could be a plausible alternative to perfection. And while perfection is clearly beyond our capabilities, accountability is not. It’s therefore feasible to contemplate an exchange: accountability for perfection.” in Schneider, Fred B., “Accountability for perfection”, IEEE Security and Privacy, March–April, 2009, pp. 3–4.

  149.

    Organisation for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, Paris, 1980. See also Thirty Years After The OECD Privacy Guidelines, 2011 (http://www.oecd.org/sti/ieconomy/49710223.pdf) and the latest version of the guidelines: The Recommendation of the OECD Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data, July 2013. http://www.oecd.org/sti/ieconomy/2013-oecd-privacy-guidelines.pdf

  150.

    See, for example, Alhadeff, Joseph, Brendan Van Alsenoy and Jos Dumortier, “The accountability principle in data protection regulation: origin, development and future directions”, in Daniel Guagnin, Leon Hempel, Carla Ilten, Inga Kroener, Daniel Neyland and Hector Postigo (eds.), Managing privacy through accountability, Palgrave Macmillan, September 2012. To be fair, however, it should also be noted that accountability has been questioned, as it may sometimes be used by industry as a synonym for self-regulation and a way to avoid more stringent legal obligations. As an illustration of this trend, see Ernst & Young, “Privacy Trends 2012. The case for growing accountability”: “To avoid greater regulation, organizations in the retail and consumer products industries and GS1, a supply chain standards organization, are working with privacy commissioners to voluntarily set guidelines that address the privacy implications of using radio frequency identification (RFID) technology in their operations.”

  151.

    Article 29 Data Protection Working Party, Opinion 3/2010 on the principle of accountability, 13 July 2010. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp173_en.pdf

  152.

    Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), voted by the European Parliament on 12 March 2014.

  153.

    Pearson, Siani, and Andrew Charlesworth, “Accountability as a way forward for privacy protection in the cloud”, in M.G. Jaatun, G. Zhao and C. Rong (eds.), Cloud Computing, Proceedings of CloudCom 2009, Beijing, Springer, Berlin, 2009, pp. 131–144.

  154.

    Bennett, Colin J., “Implementing Privacy Codes of Practice”, (PLUS 8830), Canadian Standards Association, 1995. http://shop.csa.ca/en/canada/privacy/privacy-package/invt/27020152004. See also Bennett, Colin, “International privacy standards: can accountability be adequate?”, Privacy Laws and Business International, August 2010.

  155.

    Butin, Denis, Marcos Chicote and Daniel Le Métayer, “Log Design for Accountability”, Proceedings of the 4th International Workshop on Data Usage Management, IEEE Computer Society, Washington, DC, 2013.

  156.

    See, for example, for database access policies: Biswas, Debmalya, and Valtteri Niemi, “Transforming Privacy Policies to Auditing Specifications”, IEEE 13th International Symposium on High-Assurance Systems Engineering (HASE), 2011, pp. 368–375.

  157.

    Butin, Denis, and Daniel Le Métayer, “Log Analysis for Data Protection Accountability”, in Cliff Jones, Pekka Pihlajasaari and Jun Sun (eds.), FM 2014: Formal Methods: 19th International Symposium, Singapore, May 12–16, 2014, Proceedings (Lecture Notes in Computer Science/Programming and Software Engineering), Springer Verlag, 2014, pp. 163–178.

  158.

    Mechanisms have also been proposed for “accountable virtual machines” in the context of distributed computing but faults are detected by replaying the execution of the remote processor using a correct copy of its code, which is not really an applicable strategy for privacy accountability because the subject (or even the auditor) does not necessarily know (and should not have to know) the software code running on the controller’s side. See Haeberlen, Andreas, Paarijaat Aditya, Rodrigo Rodrigues and Peter Druschel, “Accountable Virtual Machines”, Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation (OSDI 2010), USENIX Association, Berkeley, CA, 2010, pp. 119–134.

  159.

    Bellare, Mihir, and Bennet S. Yee, “Forward integrity for secure audit logs”, Technical report, University of California at San Diego, 1997. Schneier, Bruce, and John Kelsey, “Secure Audit Logs to Support Computer Forensics”, ACM Transactions on Information and System Security, Vol. 2, No. 2, 1999, pp. 159–176.
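
    The core idea of these schemes can be sketched with a hash chain: each entry commits to its predecessor, so an intruder who later rewrites history breaks verification (the cited schemes additionally evolve a MAC key over time, so that even a full compromise does not allow forging past entries):

```python
import hashlib, json

def append_entry(log: list, event: dict) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps(event, sort_keys=True)
    log.append({"event": event,
                "hash": hashlib.sha256((prev + body).encode()).hexdigest()})

def verify(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        body = json.dumps(entry["event"], sort_keys=True)
        if entry["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"action": "read", "record": 42, "purpose": "billing"})
append_entry(log, {"action": "delete", "record": 42})
assert verify(log)
log[0]["event"]["purpose"] = "marketing"   # tampering with history...
assert not verify(log)                     # ...is detected
```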

  160.

    Of course, nothing would prevent the attacker from deleting these logs if he has taken control of the machine, but this might not go undetected if the machine has communicated with third parties or if logs have been periodically stored on a backup site.

  161.

    Waters, Brent R., Dirk Balfanz, Glenn Durfee and D.K. Smetters, “Building an encrypted and searchable audit log”, presented at the 11th Annual Network and Distributed System Security Symposium, 2004.

  162.

    Exceptional situations or emergencies, for example, matters of life and death, can justify a derogation from the general rule.

  163.

    See, for example, Garg, D., L. Jia and A. Datta, “Policy Auditing over Incomplete Logs: Theory, Implementation and Applications”, in Proceedings of 18th ACM Conference on Computer and Communications Security, October 2011. Butin, Denis, and Daniel Le Métayer, “Log Analysis for Data Protection Accountability”, in C. Jones, P. Pihlajasaari and J. Sun (eds.), Formal Methods 2014, 19th International Symposium on Formal Methods (FM 2014), Springer Verlag, 2014, pp. 163–178.

  164.

    For a critical review of the situation in the US, see Froomkin, A. Michael, “‘PETs must be on a leash’: how U.S. law (and industry practice) often undermines and even forbids valuable Privacy-enhancing Technology”, Ohio State Law Journal, Vol. 74, No. 6, 2013; for a plea in favour of privacy-enhancing technologies, see Diaz, Claudia, Omer Tene and Seda F. Guerses, “Hero or Villain: The Data Controller in Privacy Law and Technologies”, Ohio State Law Journal, Vol. 74, No. 6, 2013, pp. 923–964. It should be noted that the latter paper defines PETs in a more restricted sense than this chapter, covering essentially a subset of Sect. 17.2. More precisely, it restricts the term “PETs” to “technological solutions that combine three objectives: elimination of the single point of failure inherent with any centralized third party; data minimization; and subjecting protocols and software to community-based public scrutiny”.

  165.

    Diaz, Claudia, Omer Tene and Seda F. Guerses, “Hero or Villain: The Data Controller in Privacy Law and Technologies”, Ohio State Law Journal, Vol. 74, No. 6, 2013, pp. 923–964.

  166.

    Birnhack, Michael D., “A quest for a theory of privacy: context and control”, Jurimetrics: The Journal of Law, Science, and Technology, Vol. 51, No. 4, 2011.

  167.

    Solove, Daniel J., “Privacy Self-Management and the Consent Dilemma”, Harvard Law Review, Vol. 126, 2013.

  168.

    Lazaro, Christophe, and Daniel Le Métayer, “The control over personal data: true remedy or fairy tale?”, SCRIPTed, Vol. 12, Issue 1, June 2015, pp. 3–34. http://script-ed.org/?p=1927. Zanfir, Gabriela, “Forgetting about consent. Why the focus should be on ‘suitable safeguards’ in data protection law”, 2013. Available at SSRN: http://ssrn.com/abstract=2261973 or http://dx.doi.org/10.2139/ssrn.2261973

  169.

    ToS;DR (Terms of Service; Didn’t Read: http://tosdr.org) is an example of an effort in this direction. The goal of ToS;DR is to create a database of analyses of the fairness of privacy policies and to make this information available in the form of explicit icons (a general evaluation plus good and bad points), which can be expanded if needed into more detailed explanations. A key aspect of ToS;DR is that users can submit their own analyses for consideration, the goal being that, just as with Wikipedia, a group consensus will emerge to provide a reliable assessment of each policy.

  170.

    Ibid.

  171.

    https://en.wikipedia.org/wiki/Heartbleed?oldid=cur

  172.

    Le Métayer, Daniel, “Privacy by design: a formal framework for the analysis of architectural choices”, in ACM Conference on Data and Application Security and Privacy (CODASPY 2013), ACM, 2013, pp. 95–104. Antignac, Thibaud, and Daniel Le Métayer, “Privacy by design: from technologies to architectures”, Annual Privacy Forum, Springer Verlag, 2014, pp. 1–17.

  173.

    Gürses, Seda, Carmela Troncoso and Claudia Diaz, “Engineering Privacy by Design”, Presented at the Conference on Computers, Privacy & Data Protection (CPDP), 2011.

References

  • Achara, Jagdish Prasad, Franck Baudot, Claude Castelluccia, Geoffrey Delcroix and Vincent Roca, “Mobilitics: analyzing privacy leaks in smart phones”, ERCIM News, 93, April 2013.

  • Acs, Gergely, and Claude Castelluccia, “I Have a DREAM!: Differentially Private Smart Metering”, Proceedings of the 13th International Conference on Information Hiding, Springer Verlag, 2011, pp. 118–132.

  • Ackerman, Mark S., and Lorrie Cranor, “Privacy Critics: UI Components to Safeguard Users’ Privacy”, CHI ’99 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’99, ACM, 1999, pp. 258–259.

  • Alhadeff, Joseph, Brendan Van Alsenoy and Jos Dumortier, “The accountability principle in data protection regulation: origin, development and future directions”, in Daniel Guagnin, Leon Hempel, Carla Ilten, Inga Kroener, Daniel Neyland and Hector Postigo (eds.), Managing privacy through accountability, Palgrave Macmillan, September 2012.

  • Allard, Tristan, Nicolas Anciaux, Luc Bouganim, Yanli Guo, Lionel Le Folgoc, Benjamin Nguyen, Philippe Pucheral, Indrajit Ray, Indrakshi Ray and Shaoyi Yin, “Secure Personal Data Servers: a Vision Paper”, in Elisa Bertino, Paolo Atzeni, Kian Lee Tan, Yi Chen and Y.C. Tay (eds.), Proceedings of the 36th International Conference on Very Large Data Bases (VLDB), Vol. 1, No. 1, 2010, pp. 25–35.

  • Anciaux, Nicolas, Mehdi Benzine, Luc Bouganim, Kevin Jacquemin, Philippe Pucheral and Shaoyi Yin, “Restoring the Patient Control over her Medical History”, 21th IEEE International Symposium on Computer-Based Medical Systems (IEEE CBMS), Finland, June 2008, pp. 132–137.

  • Antignac, Thibaud, and Daniel Le Métayer, “Privacy by design: from technologies to architectures”, Annual Privacy Forum, Springer Verlag, 2014, pp. 1–17.

  • Article 29 Data Protection Working Party, Opinion 3/2010 on the principle of accountability, 13 July 2010. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp173_en.pdf

  • Backes, Michael, Aniket Kate, Matteo Maffei and Kim Pecina, “ObliviAd: Provably Secure and Practical Online Behavioral Advertising”, in Proceedings of 33rd IEEE Symposium on Security and Privacy (S&P 2012), 2012, pp. 257–271.

  • Balasch, Josep, Alfredo Rial, Carmela Troncoso, Christophe Geuens, Bart Preneel and Ingrid Verbauwhede, “PrETP: Privacy-Preserving Electronic Toll Pricing”, 19th USENIX Security Symposium, USENIX Association, 2010, pp. 63–78.

  • Barth, Adam, Anupam Datta, John C. Mitchell and Helen Nissenbaum, “Privacy and Contextual Integrity: Framework and Applications”, Proceedings of the 2006 IEEE Symposium on Security and Privacy, SP ’06, IEEE Computer Society, 2006, pp. 184–198.

  • Barth, Adam, John C. Mitchell, Anupam Datta and Sharada Sundaram, “Privacy and utility in business processes”, Proceedings of the 20th IEEE Computer Security Foundations Symposium, 2007, pp. 279–294.

  • Beato, Filipe, Markulf Kohlweiss and Karel Wouters, “Scramble! Your Social Network Data”, in Simone Fischer-Hübner and Nicholas Hopper (eds.), Privacy-enhancing Technologies, Proceedings of the 11th International Symposium, PETS 2011, Waterloo, ON, Canada, 27–29 July 2011, Springer, Heidelberg, 2011, pp. 211–225.

  • Becker, Moritz Y., Alexander Malkis and Laurent Bussard, “S4P: A Generic Language for Specifying Privacy Preferences and Policies”, Technical report MSR-TR-2010-32, Microsoft Research, April 2010.

  • Bellare, Mihir, and Bennet S. Yee, “Forward integrity for secure audit logs”, Technical report, University of California at San Diego, 1997.

  • Bennett, Colin J., “Implementing Privacy Codes of Practice”, (PLUS 8830), Canadian Standards Association, 1995. http://shop.csa.ca/en/canada/privacy/privacy-package/invt/27020152004

  • Bennett, Colin, “International privacy standards: can accountability be adequate?”, Privacy Laws and Business International, August 2010.

  • Beresford, Alastair R., and F. Stajano, “Location privacy in pervasive computing”, IEEE Pervasive Computing, Vol. 3, No. 1, 2003, pp. 46–55.

  • Birnhack, Michael D., “A quest for a theory of privacy: context and control”, Jurimetrics: The Journal of Law, Science, and Technology, Vol. 51, No. 4, 2011.

  • Biswas, Debmalya, and Valtteri Niemi, “Transforming Privacy Policies to Auditing Specifications”, IEEE 13th International Symposium on High-Assurance Systems Engineering (HASE), 2011, pp. 368–375.

  • Bonneau, Joseph, Jonathan Anderson and Luke Church, “Privacy suites: shared privacy for social networks”, SOUPS 2009.

  • Borking, John J., “Why Adopting Privacy-enhancing Technologies (PETs) Takes so Much Time”, in Serge Gutwirth, Yves Poullet, Paul De Hert and Ronald Leenes (eds.), Computers, Privacy and Data Protection: an Element of Choice, Springer Verlag, 2011, pp. 309–341.

  • Brodie, Carolyn A., Clare-Marie Karat and John Karat, “An empirical study of natural language parsing of privacy policy rules using the Sparcle policy workbench”, in Symposium On Usable Privacy and Security (SOUPS), 2006.

  • Butin, Denis, Marcos Chicote and Daniel Le Métayer, “Log Design for Accountability”, Proceedings of the 4th International Workshop on Data Usage Management, IEEE Computer Society, Washington, DC, 2013.

  • Butin, Denis, and Daniel Le Métayer, “Log Analysis for Data Protection Accountability”, in Cliff Jones, Pekka Pihlajasaari and Jun Sun (eds.), FM 2014: Formal Methods: 19th International Symposium, Singapore, May 12–16, 2014, Proceedings (Lecture Notes in Computer Science / Programming and Software Engineering), Springer Verlag, 2014, pp. 163–178.

  • Camenisch, Jan, and Els Van Herreweghen, “Design and Implementation of the Idemix Anonymous Credential System”, Proceedings of the 9th ACM Conference on Computer and Communications Security, CCS ’02, 2002, pp. 21–30.

  • Castelluccia, Claude, Emiliano De Cristofaro, Aurélien Francillon and Mohamed Ali Kâafar, “EphPub: Toward robust Ephemeral Publishing”, 19th IEEE Conference on Network Protocols (ICNP 2011), pp. 165–175.

  • Chi-Chih Yao, Andrew, “Protocols for Secure Computations”, Extended Abstract, FOCS 1982, pp. 160–164.

  • Chor, Benny, Eyal Kushilevitz, Oded Goldreich and Madhu Sudan, “Private Information Retrieval”, Journal of the ACM, Vol. 45, No. 6, 1998, pp. 965–981.

  • Cutillo, Leucio Antonio, Refik Molva and Thorsten Strufe, “Safebook: a privacy preserving online social network leveraging on real-life trust”, IEEE Communications Magazine, Vol. 47, No. 12, December 2009.

  • Damiani, Maria Luisa, Elisa Bertino and Claudio Silvestri, “The PROBE framework for the personalized cloaking of private locations”, Transactions on Data Privacy, Vol. 3, No. 2, 2010, pp. 123–148.

  • Danezis, George, and Seda Gürses, “A critical review of 10 years of privacy technology”, in Surveillance Cultures: A Global Surveillance Society?, UK, April 2010.

  • De Cristofaro, Emiliano, and Gene Tsudik, “Practical private set intersection protocols with linear complexity”, in Financial Cryptography, Springer-Verlag, Berlin, 2010, pp. 143–159.

  • De Cristofaro, Emiliano, Claudio Soriente, Gene Tsudik and Andrew Williams, “Hummingbird: Privacy at the Time of Twitter”, in IEEE Symposium on Security and Privacy, San Francisco, CA, 21–23 May 2012, pp. 285–299.

  • de Jonge, Wiebren, and Bart Jacobs, “Privacy-friendly electronic traffic pricing via commits”, in P. Degano, J. Guttman, and F. Martinelli (eds.), Formal Aspects in Security and Trust, Springer, 2009, pp. 143–161.

  • Deng, Mina, Kim Wuyts, Riccardo Scandariato, Bart Preneel and Wouter Joosen, “A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements”, Requirements Engineering, Special Issue on Digital Privacy, Vol. 16, Issue 1, March 2011, pp. 3–32.

  • Deswarte, Yves, and Sébastien Gambs, “A proposal for a privacy-preserving national identity card”, Transactions on Data Privacy, Vol. 3, Issue 3, December 2010, pp. 253–276.

  • Diaz, Claudia, and Seda F. Guerses, “Understanding the landscape of privacy technologies”, Extended abstract of invited talk in proceedings of the Information Security Summit, 2012, pp. 58–63.

  • Diaz, Claudia, Omer Tene and Seda F. Guerses, “Hero or Villain: The Data Controller in Privacy Law and Technologies”, Ohio State Law Journal, Vol. 74, No. 6, 2013.

  • Druschel, Peter, Michael Backes and Rodica Tirtea, The right to be forgotten – between expectations and practice, ENISA Report, 2011. https://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/the-right-to-be-forgotten

  • Dwork, Cynthia, “Differential privacy”, in ICALP (2), 2006, pp. 1–12.

  • Dwork, Cynthia, “A firm foundation for private data analysis”, Communications of the ACM, Vol. 54, No. 1, 2011, pp. 86–95.

  • Eckersley, Peter, “How Unique Is Your Web Browser?”, in Mikhail J. Atallah and Nicholas J. Hopper (eds.), Privacy-enhancing Technologies, Springer, 2010, pp. 1–18.

  • Enck, William, Peter Gilbert, Byung-Gon Chun, Landon P. Cox, Jaeyeon Jung, Patrick McDaniel and Anmol N. Sheth, “TaintDroid: An Information-flow Tracking System for Realtime Privacy Monitoring on Smartphones”, Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation, OSDI ’10, 2010, pp. 1–6.

  • European Parliament, Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), voted by the European Parliament on 12 March 2014.

  • Fong, Philip W.L., Mohd Anwar and Zhen Zhao, “A Privacy Preservation Model for Facebook-style Social Network Systems”, Proceedings of the 14th European Conference on Research in Computer Security, ESORICS ’09, 2009, pp. 303–320.

  • Froomkin, A. Michael, “‘PETs must be on a leash’: how U.S. law (and industry practice) often undermines and even forbids valuable Privacy-enhancing Technology”, Ohio State Law Journal, Vol. 74, No. 6, 2013.

  • Fung, Benjamin C.M., Ke Wang, Rui Chen and Philip S. Yu, “Privacy-preserving data publishing: A survey of recent developments”, ACM Computing Surveys, Vol. 42, No. 4, June 2010.

  • Fung, Benjamin C.M., Ke Wang, Ada Wai-Chee Fu and Philip S. Yu, Introduction to Privacy-Preserving Data Publishing: Concepts and Techniques, Chapman & Hall/CRC, August 2010.

  • Gambs, Sébastien, Marc-Olivier Killijian and Miguel Núñez del Prado Cortez, “GEPETO: A GEoPrivacy-Enhancing TOolkit”, AINA Workshops, 2010, pp. 1071–1076.

  • Garcia, Flavio D., and Bart Jacobs, “Privacy-Friendly Energy-Metering via Homomorphic Encryption”, in Jorge Cuellar, Javier Lopez, Gilles Barthe and Alexander Pretschner (eds.), Security and Trust Management (STM’2010), Springer, 2011, pp. 226–238.

  • Garg, D., L. Jia and A. Datta, “Policy Auditing over Incomplete Logs: Theory, Implementation and Applications”, in Proceedings of 18th ACM Conference on Computer and Communications Security, October 2011.

  • Geambasu, Roxana, Tadayoshi Kohno, Amit Levy and Henry M. Levy, “Vanish: Increasing Data Privacy with Self-Destructing Data”, in Proceedings of the USENIX Security Symposium, 2009.

  • Gentry, Craig, “Fully Homomorphic Encryption Using Ideal Lattices”, in Proceedings of the 41st ACM Symposium on Theory of Computing (STOC), 2009. https://www.cs.cmu.edu/~odonnell/hits09/gentry-homomorphic-encryption.pdf

  • Goldberg, Ian, David Wagner and Eric A. Brewer, “Privacy-Enhancing Technologies for the Internet”, IEEE COMPCON ’97, February 1997.

  • Goldberg, Ian, “Privacy-Enhancing Technologies for the Internet III: Ten years later”, Chapter 1, in Alessandro Acquisti, Stefanos Gritzalis, Costas Lambrinoudakis and Sabrina De Capitani di Vimercati (eds.), Digital Privacy: Theory, Technologies, and Practices, December 2007.

  • Goldschlag, David, Michael Reed and Paul Syverson, “Onion Routing for Anonymous and Private Internet Connections”, Communications of the ACM, Vol. 42, No. 2, February 1999, pp. 39–41.

  • Goldwasser, Shafi, Silvio Micali and Charles Rackoff, “The knowledge complexity of interactive proof-systems”, in Robert Sedgewick (ed.), Proceedings of the 17th Annual ACM Symposium on Theory of Computing, Providence, RI, ACM, 1985.

  • Guha, Saikat, Bin Cheng and Paul Francis, “Privad: Practical Privacy in Online Advertising”, USENIX Symposium on Networked Systems Design and Implementation, NSDI 2011.

  • Gürses, Seda, and Bettina Berendt, “PETs in the surveillance society: a critical review of the potentials and limitations of the privacy as confidentiality paradigm”, in Serge Gutwirth, Yves Poullet and Paul De Hert (eds.), Data Protection in a Profiled World, Springer Verlag, 2009.

  • Gürses, Seda, Carmela Troncoso and Claudia Diaz, “Engineering Privacy by Design”, Presented at the Conference on Computers, Privacy & Data Protection (CPDP), 2011.

  • Haeberlen, Andreas, Paarijaat Aditya, Rodrigo Rodrigues and Peter Druschel, “Accountable Virtual Machines”, Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation (OSDI 2010), USENIX Association, Berkeley, CA, 2010, pp. 119–134.

  • Hildebrandt, Mireille, and Bert-Jaap Koops, “The Challenges of Ambient Law and Legal Protection in the Profiling Era”, The Modern Law Review, Vol. 73, Issue 3, May 2010, pp. 428–460.

  • Hilty, Manuel, David Basin and Alexander Pretschner, “On obligations”, Proceedings of the 10th European Symposium on Research in Computer Security (ESORICS’05), Springer, 2005, pp. 98–117.

  • Howe, Daniel C., and Helen Nissenbaum, “TrackMeNot: resisting surveillance in web search”, in Ian Kerr, Carole Lucock and Valerie Steeves (eds.), Lessons from the Identity Trail: Privacy, Anonymity and Identity in a Networked Society, Oxford University Press, Oxford, 2008.

  • ISO/IEC DIS 24760-2, Information Technology – Security Techniques – A Framework for Identity Management – Part 2: Reference architecture and requirements.

  • Jawurek, Marek, Florian Kerschbaum and George Danezis, “Privacy Technologies for Smart Grids – A Survey of Options”, Microsoft Technical Report, MSR-TR-2012-119, 2012.

  • Karat, John, Clare-Marie Karat, Carolyn Brodie and Jinjuan Feng, “Privacy in information technology: designing to enable privacy policy management in organizations”, International Journal of Human-Computer Studies, Vol. 63, Issues 1–2, July 2005, pp. 153–174.

  • Karjoth, Günter, Matthias Schunter and Els Van Herreweghen, “Translating privacy practices into privacy promises – how to promise what you can keep”, Proceedings of the IEEE 4th International Workshop on Policies for Distributed Systems and Networks, 4–6 June 2003, pp. 135–146.

  • Kokott, Juliane, and Christoph Sobotta, “The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR”, International Data Privacy Law, Vol. 3, No. 4, 2013.

  • Kosta, Eleni, Jan Zibuschka, Tobias Scherner and Jos Dumortier, “Legal considerations on privacy-enhancing location based services using PRIME technology”, Computer Law and Security Report, Vol. 24, Issue 2, 2008, pp. 139–146.

  • Krumm, John, “A survey of computational location privacy”, Personal and Ubiquitous Computing, Vol. 13, 2009, pp. 391–399.

  • Lazaro, Christophe, and Daniel Le Métayer, “The control over personal data: true remedy or fairy tale?”, SCRIPTed, Vol. 12, Issue 1, June 2015, pp. 3–34. http://script-ed.org/?p=1927

  • Le Métayer, Daniel, and Shara Monteleone, “Automated consent through privacy agents: Legal requirements and technical architecture”, Computer Law & Security Review, Vol. 25, Issue 2, 2009, pp. 136–144.

  • Le Métayer, Daniel, “A formal privacy management framework”, in Pierpaolo Degano, Joshua D. Guttman and Fabio Martinelli (eds.), Formal Aspects in Security and Trust, Proceedings of the FAST’2008 Workshop, Springer Verlag, 2009, pp. 162–176.

  • Le Métayer, Daniel, “Privacy by design: a formal framework for the analysis of architectural choices”, in ACM Conference on Data and Application Security and Privacy (CODASPY 2013), ACM, 2013, pp. 95–104.

  • Machanavajjhala, Ashwin, Daniel Kifer, Johannes Gehrke and Muthuramakrishnan Venkitasubramaniam, “l-diversity: Privacy beyond k-anonymity”, ACM Transactions on Knowledge Discovery from Data (TKDD), Vol. 1, No. 1, March 2007.

  • Mao, Wenbo, Modern cryptography: theory and practice, HP Professional Series, 2003.

  • Mayer-Schönberger, Viktor, “Beyond Copyright: Managing Information Rights with DRM”, Denver University Law Review, Vol. 84, No. 1, 2006, pp. 181–198.

  • Meiklejohn, Sarah, Marjori Pomarole, Grant Jordan, Kirill Levchenko, Damon McCoy, Geoffrey M. Voelker and Stefan Savage, “A Fistful of Bitcoins: Characterizing Payments Among Men with No Names”, ;login: The USENIX Magazine, Vol. 38, No. 6, December 2013.

  • Organisation for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, OECD, Paris, 1980.

  • Organisation for Economic Co-operation and Development, Thirty Years After The OECD Privacy Guidelines, OECD, Paris, 2011. http://www.oecd.org/sti/ieconomy/49710223.pdf

  • Organisation for Economic Co-operation and Development, Recommendation of the OECD Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data, OECD, Paris, 2013. http://www.oecd.org/sti/ieconomy/2013-oecd-privacy-guidelines.pdf

  • Paillier, Pascal, “Public-Key Cryptosystems Based on Composite Degree Residuosity Classes”, EUROCRYPT, Springer, 1999, pp. 223–238.

  • Paul, Thomas, Daniel Puscher and Thorsten Strufe, “Improving the Usability of Privacy Settings in Facebook”, CoRR abs/1109.6046, 2011.

  • Pearson, Siani, and Andrew Charlesworth, “Accountability as a way forward for privacy protection in the cloud”, in M.G. Jaatun, G. Zhao and C. Rong (eds.), Cloud Computing, Proceedings of CloudCom 2009, Beijing, Springer, Berlin, 2009, pp. 131–144.

  • Pfitzmann, Andreas, and Marit Hansen, “A terminology for talking about privacy by data minimization: Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management”, Version v0.34, 10 August 2010. https://dud.inf.tu-dresden.de/literatur/Anon_Terminology_v0.34.pdf

  • Quisquater, Jean-Jacques, Louis C. Guillou and Thomas A. Berson, “How to explain Zero-Knowledge protocols to your children”, in Proceedings on Advances in Cryptology – CRYPTO ’89, Springer-Verlag New York, 1989, pp. 628–631.

  • Ratha, Nalini K., Jonathan H. Connell and Ruud M. Bolle, “Enhancing security and privacy in biometrics based authentication systems”, IBM Systems Journal, Vol. 40, No. 3, 2001, pp. 614–634.

  • Reidenberg, Joel, and Lorrie Faith Cranor, “Can User Agents Accurately Represent Privacy Policies?”, 30 August 2002. http://ssrn.com/abstract=328860

  • Rial, Alfredo, and George Danezis, “Privacy-Preserving Smart Metering”, Proceedings of the 2011 ACM Workshop on Privacy in the Electronic Society, WPES, 2011.

  • Schneider, Fred B., “Accountability for perfection”, IEEE Security and Privacy, March–April 2009, pp. 3–4.

  • Schneier, Bruce, Applied cryptography, Wiley, 1995.

  • Schneier, Bruce, Practical cryptography, Wiley, 2003.

  • Schneier, Bruce, Secrets and lies, Wiley, 2004.

  • Schneier, Bruce, and John Kelsey, “Secure Audit Logs to Support Computer Forensics”, ACM Transactions on Information and System Security, Vol. 2, No. 2, 1999, pp. 159–176.

  • Shen, Yun, and Siani Pearson, “Privacy-enhancing Technologies: A Review”, HP Laboratories HPL-2011-113.

  • Solove, Daniel J., “Privacy Self-Management and the Consent Dilemma”, Harvard Law Review, Vol. 126, 2013.

  • Sweeney, Latanya, “Re-identification of De-identified Survey Data”, Carnegie Mellon University, School of Computer Science, Data Privacy Laboratory, Technical Report, 2000.

  • Sweeney, Latanya, “k-anonymity: A model for protecting privacy”, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol. 10, No. 5, 2002, pp. 557–570.

  • Syverson, Paul S., David M. Goldschlag and Michael G. Reed, “Anonymous Connections and Onion Routing”, in Proceedings of the 18th Annual Symposium on Security and Privacy, IEEE CS Press, May 1997, pp. 44–54.

  • Toubiana, Vincent, Arvind Narayanan, Dan Boneh, Helen Nissenbaum and Solon Barocas, “Adnostic: Privacy Preserving Targeted Advertising”, Network and Distributed System Security Symposium (NDSS), 2010.

  • Troncoso, Carmela, George Danezis, Eleni Kosta and Bart Preneel, “PriPAYD: Privacy Friendly Pay-As-You-Drive Insurance”, Proceedings of the 6th ACM Workshop on Privacy in the Electronic Society (WPES), 2007.

  • Verykios, Vassilios S., Elisa Bertino, Igor Nai Fovino, Loredana Parasiliti Provenza, Yucel Saygin and Yannis Theodoridis, “State-of-the-art in Privacy Preserving Data Mining”, in SIGMOD Record, Vol. 33, No. 1, March 2004, pp. 50–57.

  • Wang, Yang, Saranga Komanduri, Pedro Leon, Gregory Norcie, Alessandro Acquisti and Lorrie Faith Cranor, “‘I regretted the minute I pressed share’: A Qualitative Study of Regrets on Facebook”, Proceedings of the Seventh Symposium on Usable Privacy and Security (SOUPS), ACM, July 2011.

  • Waters, Brent R., Dirk Balfanz, Glenn Durfee and D.K. Smetters, “Building an encrypted and searchable audit log”, presented at the 11th Annual Network and Distributed System Security Symposium, 2004.

  • Zanfir, Gabriela, “Forgetting About Consent: Why the Focus Should Be on ‘Suitable Safeguards’ in Data Protection Law”, in Serge Gutwirth et al. (eds.), Reloading Data Protection, Springer, Dordrecht, 2014.

Acknowledgement

This work was partially funded by the European project PRIPARE / FP7-ICT-2013-1.5 and the Inria Project Lab CAPPRIS (Collaborative Action on the Protection of Privacy Rights in the Information Society). The author also thanks Denis Butin, Mathieu Cunche, Sébastien Gambs and Christophe Lazaro for their comments on an earlier version of this chapter and the editors for their assistance in the preparation of the final version. Their suggestions helped improve and clarify this document.

Author information

Correspondence to Daniel Le Métayer.

Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Le Métayer, D. (2016). Whom to Trust? Using Technology to Enforce Privacy. In: Wright, D., De Hert, P. (eds) Enforcing Privacy. Law, Governance and Technology Series, vol 25. Springer, Cham. https://doi.org/10.1007/978-3-319-25047-2_17

  • DOI: https://doi.org/10.1007/978-3-319-25047-2_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25045-8

  • Online ISBN: 978-3-319-25047-2

  • eBook Packages: Law and Criminology (R0)
