
Profiling Technologies and Fundamental Rights and Values: Regulatory Challenges and Perspectives from European Data Protection Authorities

Chapter in Reforming European Data Protection Law

Part of the book series: Law, Governance and Technology Series, volume 20

Abstract

This paper aims to map the field of profiling, its implications for fundamental rights and values, and the measures which are or can be taken to address the challenges of profiling practices. It presents a working definition of profiling and elaborates a typology of its basic methods. In the second section, the paper gives an overview of the technological background of profiling to show how fundamental rights and values of European societies are endangered by the use of profiling. Finally, the paper presents the findings of a questionnaire addressed to European DPAs on the current and future legal framework, the domains of application, the complaints and remedies procedures regarding the use of profiling techniques, the main risks and benefits for fundamental rights, and citizens’ awareness of this topic. These findings contribute important insights to the ongoing discussion on the regulation of profiling in Europe.


Notes

  1.

    See Fraunhofer IAIS, Big Data – Vorsprung durch Wissen. Innovationspotenzialanalyse, http://www.bigdata.fraunhofer.de/content/dam/bigdata/de/documents/FraunhoferIAIS_Big-Data-Analyse_Doku.pdf, last accessed 01 April 2014. The programs of the world’s largest ICT fair CeBIT 2014, the Big Data Days 2013, and the European Data Forum, and the presentations given there, draw an interesting picture of the potential the ICT industry attributes to “Big Data” and big data analytics: http://www.cebit.de/home, last accessed 03 April 2014, http://www.big-data-days.de, last accessed 03 April 2014, and http://2014.data-forum.eu/, last accessed 03 April 2014.

  2.

    Sasa Baskarada and Andy Koronios, “Data, Information, Knowledge, Wisdom (DIKW): A Semiotic Theoretical and Empirical Exploration of the Hierarchy and its Quality Dimension,” Australasian Journal of Information Systems 18, 1 (2013): 5–24.

  3.

    Karl-Heinz Streibich, “Big Smart Data. Mehrwert für Unternehmen” (paper presented at the Big Data Days, Berlin, Germany, November 11–12, 2013).

  4.

    See Susanne Krasmann, “Der Präventionsstaat im Einvernehmen. Wie Sichtbarkeitsregime stillschweigend Akzeptanz produzieren,” in Sichtbarkeitsregime: Überwachung, Sicherheit und Privatheit im 21. Jahrhundert, ed. Leon Hempel, Susanne Krasmann and Ulrich Bröckling (Wiesbaden: VS Verlag, 2010), 53–70 and Pat O’Malley, “Risk, power and crime prevention,” Economy and Society 21, 3 (1992): 252–275.

  5.

    For some of the technical problems which harm the reliability of profiling results, see Daniel Guagnin, Leon Hempel and Justin Jung, “Evolution of Technologies in Profiling”, Working Paper, http://profiling-project.eu/wp-content/uploads/2013/08/Evolution-of-Technologies-in-Profiling_0208.pdf, last accessed 02 April 2014.

  6.

    The PROFILING project is funded under the European Union’s Fundamental Rights and Citizenship programme. The two-year project started in November 2012. More information on the project can be found on the website http://profiling-project.eu.

  7.

    Gary Marx and Nancy Reichman, “Routinizing the Discovery of Secrets: Computers as Informants,” American Behavioral Scientist 27, 4 (1984): 429.

  8.

    Roger Clarke, “Profiling: A Hidden Challenge to the Regulation of Data Surveillance,” Journal of Law and Information Science 4, 2 (1993): 403.

  9.

    Lee A. Bygrave, Data protection law: Approaching its rationale, logic and limits (The Hague: Kluwer Law International, 2002), 301.

  10.

    Mireille Hildebrandt, “Profiling and AML,” in The Future of Identity in the Information Society. Challenges and Opportunities, ed. Kai Rannenberg, Denis Royer and Andre Deuker (Heidelberg: Springer, 2009a), 275.

  11.

    Mireille Hildebrandt, “Who is Profiling Who? Invisible Visibility,” in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009c), 241.

  12.

    Gloria González Fuster, Serge Gutwirth and Erika Ellyne, “Profiling in the European Union: A high-risk practice,” INEX Policy Brief 10 (2010): 2.

  13.

    Gloria González Fuster, Serge Gutwirth and Erika Ellyne, “Profiling in the European Union: A high-risk practice,” INEX Policy Brief 10 (2010): 2.

  14.

    Mireille Hildebrandt, “Defining profiling: a new type of knowledge?,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 28.

  15.

    See Mireille Hildebrandt, “Profiling: from Data to Knowledge. The challenges of a crucial technology,” in DuD Datenschutz und Datensicherheit 30(9) (2006): 548–552 and Mireille Hildebrandt, “Defining profiling: a new type of knowledge?,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 17–47.

  16.

    Mireille Hildebrandt, “Profiling: from Data to Knowledge. The challenges of a crucial technology,” in DuD Datenschutz und Datensicherheit 30(9) (2006): 550.

  17.

    See Anton Vedder, “KDD: The challenge to individualism,” Ethics and Information Technology (1999): 275–281 and Arnold Roosendaal, Digital Personae and Profiles in Law. Protecting Individuals’ Rights in Online Contexts (Oisterwijk: Wolf Legal Publishers, 2013).

  18.

    See Anton Vedder, “KDD: The challenge to individualism,” Ethics and Information Technology (1999): 275–281.

  19.

    The important role written files played as a storage medium for information, but also as a symbol of power, in the Inquisition trials in Italy is shown by Thomas Scharff, “Erfassen und Erschrecken. Funktionen des Prozeßschriftguts der kirchlichen Inquisition in Italien im 13. und frühen 14. Jahrhundert,” in Als die Welt in die Akten kam. Prozeßschriftgut im europäischen Mittelalter, ed. Susanne Lepsius and Thomas Wetzstein (Frankfurt a.M.: Vittorio Klostermann, 2008), 255–274.

  20.

    Open Data City, Stasi versus NSA, accessed February 27, 2014, http://apps.opendatacity.de/stasi-vs-nsa.

  21.

    Bert-Jaap Koops, “Technology and the Crime Society: Rethinking Legal Protection,” Law, Innovation & Technology 1, 1 (2009): 93–124.

  22.

    Technische Universität Berlin conducted a case study on the transformation of policing practices resulting from the application of data processing technologies. Expert interviews were conducted with scholars, civil rights activists, directors of security technology companies, a police representative, and a lawyer. Both the police and the technology providers mentioned changes in the workflow and the construction of hypotheses from digitally stored information. The report on the case study’s final results will be available at http://profiling-project.eu/.

  23.

    See Nina Degele, Einführung in die Techniksoziologie (Stuttgart: UTB, 2002), 167–168.

  24.

    The results software can draw from data depend on the quality of the data sets examined, including the selection and pre-processing of the data. Major problems, especially in large-scale data sets which combine data from various sources, are poor data quality, data incompatibility, and biased data sets, all of which corrupt data mining outcomes. Furthermore, operators might not be familiar with such reliability problems and consequently might not act properly upon them. See Ana Canhoto and James Backhouse, “General Description of Behavioural Profiling,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 47–63 and Bernhard Anrig, Will Brown, and Mark Gasson, “The Role of Algorithms in Profiling,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 65–87.
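    To make the data-quality problem concrete, the following minimal Python sketch (hypothetical, invented data, not drawn from the cited studies) shows how a biased collection process corrupts what a profiler “learns”: the two groups behave identically in the population, but because positive cases from group B are under-reported at the source, any rule derived from the stored data will rate group B as worse.

        import random

        random.seed(0)

        # Hypothetical population: the attribute of interest is identically
        # distributed in groups A and B.
        def person(group):
            return {"group": group, "able": random.random() < 0.5}

        population = [person("A") for _ in range(5000)] + \
                     [person("B") for _ in range(5000)]

        # Biased collection: positive cases from group B reach the data set
        # only half of the time (an under-reporting source). The skew is in
        # the data set, not in the population.
        training = [p for p in population
                    if p["group"] == "A"
                    or not p["able"]
                    or random.random() < 0.5]

        def positive_rate(records, group):
            rows = [r for r in records if r["group"] == group]
            return sum(r["able"] for r in rows) / len(rows)

        print("observed rate, group A:", round(positive_rate(training, "A"), 3))
        print("observed rate, group B:", round(positive_rate(training, "B"), 3))
        print("true rate, group B:    ", round(positive_rate(population, "B"), 3))

    A profiler scoring the groups on such a data set reproduces the collection bias as if it were a fact about group B, which is exactly the kind of reliability problem an unaware operator cannot detect from the data alone.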

  25.

    See Toon Calders and Indrė Žliobaitė, “Why Unbiased Computational Processes Can Lead to Discriminative Decision Procedures,” in Discrimination and Privacy in the Information Society, ed. Bart Custers et al. (Berlin: Springer, 2013), 43–57.
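    The mechanism Calders and Žliobaitė describe can be illustrated with a hypothetical Python sketch (invented data and thresholds): a decision rule that never sees the protected attribute can still discriminate when another attribute, here a postcode, correlates both with group membership and with historically unequal outcomes.

        import random

        random.seed(1)

        # Group membership is never given to the rule, but postcode
        # correlates strongly with it (the classic "redlining" pattern).
        def record(group):
            same = random.random() < 0.9
            postcode = "1000" if (group == "A") == same else "2000"
            # Historical outcomes are themselves unequal between the groups.
            approved = random.random() < (0.7 if group == "A" else 0.4)
            return {"group": group, "postcode": postcode, "approved": approved}

        data = [record(g) for g in ("A", "B") for _ in range(5000)]

        def rate(rows):
            return sum(r["approved"] for r in rows) / len(rows)

        # "Blind" rule: approve applicants from postcodes whose historical
        # approval rate is at least 50%. The protected attribute is unused.
        by_postcode = {pc: rate([r for r in data if r["postcode"] == pc])
                       for pc in ("1000", "2000")}

        def decide(r):
            return by_postcode[r["postcode"]] >= 0.5

        # The rule still approves group A far more often than group B.
        for g in ("A", "B"):
            rows = [r for r in data if r["group"] == g]
            print("approval rate, group", g, "=",
                  round(sum(decide(r) for r in rows) / len(rows), 3))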

  26.

    See Bruno Latour, “Technology is Society Made Durable,” in A Sociology of Monsters: Essays on Power, Technology and Domination, ed. John Law (London: Routledge, 1991), 103–131.

  27.

    See Michalis Lianos and Mary Douglas, “Dangerization and the End of Deviance: The Institutional Environment,” British Journal of Criminology 40, 2 (2000): 261–278 and Rosamunde van Brakel and Paul De Hert, “Policing, surveillance and law in a pre-crime society: Understanding the consequences of technology based strategies,” Cahier Politiestudies 2011–3, no. 20 (2011): 163–192.

  28.

    See Mireille Hildebrandt, “Defining profiling: a new type of knowledge?,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 17–47.

  29.

    See David Lyon, “Surveillance as Social Sorting. Computer Codes and Mobile Bodies,” in Surveillance As Social Sorting: Privacy, Risk, and Digital Discrimination, ed. David Lyon (London: Psychology Press, 2003), 20.

  30.

    Profiling appears to create a dialectic form of practical knowledge which is, in the terms defined in Sect. 1.2, non-representative and representative at the same time. It is non-representative because profiles do not describe a given reality but are detected through the aggregation, mining and cleansing of data. Nevertheless, as these profiles are used to address populations according to this knowledge, they constitute those populations as a reality and thus do have a representative function.

  31.

    See Pat O’Malley, “Risk, power and crime prevention,” Economy and Society 21, 3 (1992): 252–275.

  32.

    Torin Monahan, “Surveillance as Governance: Social Inequality and the Pursuit of Democratic Surveillance,” in Surveillance and Democracy (2010): 91–110.

  33.

    See Bernhard Anrig, Will Brown, and Mark Gasson, “The Role of Algorithms in Profiling,” in Profiling the European Citizen. Cross-Disciplinary Perspectives, ed. Mireille Hildebrandt and Serge Gutwirth (Dordrecht: Springer, 2008), 65–87.

  34.

    See Fareed Zakaria, “The rise of illiberal democracy,” Foreign Affairs 76, 6 (1997): 22–43.

  35.

    Serge Gutwirth and Mireille Hildebrandt, “Some Caveats on Profiling,” in Data protection in a profiled world, ed. Serge Gutwirth, Yves Poullet and Paul de Hert (Dordrecht: Springer, 2010), 33.

  36.

    Ibid.

  37.

    As an example, consider applying automated profiling to the health sector, where the risk of wrong decisions could cost lives.

  38.

    See Daniel J. Solove, The Digital Person: Technology and Privacy in the Information Age (New York: New York University Press, 2004).

  39.

    See Serge Gutwirth and Mireille Hildebrandt, “Some Caveats on Profiling,” in Data protection in a profiled world, ed. Serge Gutwirth, Yves Poullet and Paul de Hert (Dordrecht: Springer, 2010), 31–41.

  40.

    Mireille Hildebrandt, “Who is Profiling Who? Invisible Visibility,” in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009c), 243.

  41.

    Antoinette Rouvroy and Yves Poullet, “The right to informational self-determination and the value of self-development. Reassessing the importance of privacy for democracy,” in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009), 51.

  42.

    Mireille Hildebrandt, “Who is Profiling Who? Invisible Visibility,” in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009c), 243.

  43.

    Gilles Deleuze, “Postskriptum über die Kontrollgesellschaften,” in Unterhandlungen 1972–1990, Gilles Deleuze (Frankfurt a.M.: Suhrkamp, 1992), 254–262.

  44.

    Bert-Jaap Koops, “Technology and the Crime Society: Rethinking Legal Protection,” in Law, Innovation & Technology, 1, 1 (2009): 104.

  45.

    See Deutscher Bundestag, Automatisierte Strafverfolgung, Data Mining und sogenannte erweiterte Nutzung von Daten in polizeilichen Informationssystemen, Drucksache 17/11582, 22 November 2012, http://dip21.bundestag.de/dip21/btd/17/115/1711582.pdf, last accessed 26 March 2014.

  46.

    Even though “Errichtungsanordnungen” can be requested by citizens, an expert on political activism in TUB’s case study reported that the police refused to give him the requested information, arguing that handing it out would hamper police work. Additionally, several answers of the German government to parliamentary requests regarding data gathering, storage and analytics conducted by German police forces show that essential information about this practice is kept secret in order not to impede police work. See Deutscher Bundestag, Automatisierte Strafverfolgung, Data Mining und sogenannte erweiterte Nutzung von Daten in polizeilichen Informationssystemen, Drucksache 17/11582, 22 November 2012, http://dip21.bundestag.de/dip21/btd/17/115/1711582.pdf, last accessed 26 March 2014 and Deutscher Bundestag, Computergestützte Polizeitechnik bei Polizeibehörden, Drucksache 17/8544 (neu), 06 Feb 2012, http://dipbt.bundestag.de/dip21/btd/17/085/1708544.pdf, last accessed 01 April 2014.

  47.

    See Andrej Hunko, Suchbewegungen zu Data Mining-Software gehen über gesetzlichen Auftrag des BKA hinaus, 17 March 2014, http://www.andrej-hunko.de/presse/1934-suchbewegungen-zu-data-mining-software-gehen-ueber-gesetzlichen-auftrag-des-bka-hinaus, last accessed 26 March 2014, and Deutscher Bundestag, Automatisierte Strafverfolgung, Data Mining und sogenannte erweiterte Nutzung von Daten in polizeilichen Informationssystemen, Drucksache 17/11582, 22 November 2012, http://dip21.bundestag.de/dip21/btd/17/115/1711582.pdf, last accessed 26 March 2014.

  48.

    For information about the case study conducted by Technische Universität Berlin see footnote 22.

  49.

    Stefano Rodotà, “Data Protection as a Fundamental Right”, in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009), 78.

  50.

    See Antoinette Rouvroy and Yves Poullet, “The right to informational self-determination and the value of self-development. Reassessing the importance of privacy for democracy”, in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009), 57.

  51.

    Huber v. Germany, C-524/06 (2008); a summary of the judgment is available at http://ec.europa.eu/dgs/legal_service/arrets/06c524_en.pdf. Test-Achats v. Council of Ministers, C-236/09 (2011); a summary of the judgment is available at http://ec.europa.eu/dgs/legal_service/arrets/09c236_en.pdf.

  52.

    See Daniel J. Solove, “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy,” San Diego Law Review 44 (2007): 754–764.

  53.

    There is a large body of literature on privacy taxonomies. Finn, Wright, and Friedewald summarize the debate and propose a taxonomy of seven types of privacy: privacy of the person, privacy of behaviour and action, privacy of personal communication, privacy of data and image, privacy of thoughts and feelings, privacy of location and space, and privacy of association (including group privacy). See Rachel L. Finn, David Wright and Michael Friedewald, “Seven Types of Privacy,” in European Data Protection: Coming of Age, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2013), 3–32.

  54.

    See Raphael Gellert and Serge Gutwirth, “Beyond accountability, the return to privacy?,” in Managing Privacy through Accountability, ed. Daniel Guagnin et al. (Houndmills: Palgrave Macmillan, 2012), 261–284.

  55.

    See Raphael Gellert and Serge Gutwirth, “Beyond accountability, the return to privacy?,” in Managing Privacy through Accountability, ed. Daniel Guagnin et al. (Houndmills: Palgrave Macmillan, 2012), 261–284.

  56.

    Pierre Trudel, “Privacy Protection on the Internet: Risk Management and Networked Normativity,” in Reinventing Data Protection?, ed. Serge Gutwirth et al. (Dordrecht: Springer, 2009), 322.

  57.

    The Privacy Principles are contained in the OECD Guidelines on the protection of privacy and transborder flows of personal data. These Guidelines were updated in 2013; the original version, developed in the late 1970s and adopted in 1980, was the first internationally agreed set of privacy principles. See OECD, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, accessed 14 March, 2014, http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

  58.

    The basic data protection principles largely overlap with the principles outlined in the Council of Europe’s Convention 108 for the Protection of Individuals with Regard to Automatic Processing of Personal Data (http://www.conventions.coe.int/Treaty/en/Treaties/Html/108.htm) and Directive 95/46/EC on the Protection of Personal Data (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML); however, the OECD Guidelines already included the principle of accountability, which was prominently taken up in the Article 29 Working Party’s Opinion on the Principle of Accountability in 2010 (http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp173_en.pdf, all accessed 03 March 2014).

  59.

    Data protection accountability has recently been debated among privacy scholars (see Daniel Guagnin et al., eds., Managing Privacy Through Accountability (Houndmills: Palgrave, 2012)) and is taken into account in the discussions of the current draft of the GDPR.

  60.

    Some RFID chips which use unique identifiers to initialize connections to RFID readers can be tracked by third parties through this unique ID, without any need to establish an authorized connection with the chip. See for instance http://www.spiegel.de/netzwelt/netzpolitik/sparkassen-pilotprojekt-kontaktlose-geldkarte-verraet-ihren-besitzer-a-831711.html.
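    To illustrate why a fixed identifier suffices for third-party tracking, here is a hypothetical Python sketch (invented classes, no real RFID protocol) comparing a chip that replies with a constant UID to one that emits a fresh random identifier on every read: only the former lets an unauthorized observer link sightings across locations.

        import secrets

        class FixedUidChip:
            """Replies with the same UID to every reader."""
            def __init__(self):
                self.uid = secrets.token_hex(4)
            def reply(self):
                return self.uid

        class RandomizedChip:
            """Emits a fresh, unlinkable identifier on every read."""
            def reply(self):
                return secrets.token_hex(4)

        def third_party_log(chip, locations):
            # An unauthorized observer merely records (location, id) pairs;
            # no authorized connection to the chip is ever established.
            return [(loc, chip.reply()) for loc in locations]

        for chip in (FixedUidChip(), RandomizedChip()):
            log = third_party_log(chip, ["station", "shop", "office"])
            linkable = len({uid for _, uid in log}) == 1
            print(type(chip).__name__, "-> trackable across locations:", linkable)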

  61.

    For a problematisation of inferring private data from large databases, and for efforts to avoid the disclosure of private data, see LiWu Chang and Ira S. Moskowitz, “An Integrated Framework for Database Privacy Protection,” in Data and Application Security, ed. Bhavani Thuraisingham et al., IFIP International Federation for Information Processing 73 (Springer US, 2001), 161–72; Stefan Sackmann, Jens Strüker, and Rafael Accorsi, “Personalization in Privacy-aware Highly Dynamic Systems,” Communications of the ACM 49, no. 9 (September 2006); Vassilios S. Verykios et al., “State-of-the-art in Privacy Preserving Data Mining,” SIGMOD Record 33, no. 1 (March 2004): 50–57.
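    One widely used disclosure-control idea from this literature is k-anonymity: a table is k-anonymous with respect to a set of quasi-identifiers if every combination of their values is shared by at least k records, so that no record can be singled out on those attributes alone. A minimal Python sketch with invented records:

        from collections import Counter

        def is_k_anonymous(rows, quasi_identifiers, k):
            """True if every quasi-identifier combination occurs >= k times."""
            counts = Counter(tuple(row[q] for q in quasi_identifiers)
                             for row in rows)
            return all(c >= k for c in counts.values())

        rows = [
            {"zip": "101**", "age": "20-29", "diagnosis": "flu"},
            {"zip": "101**", "age": "20-29", "diagnosis": "asthma"},
            {"zip": "102**", "age": "30-39", "diagnosis": "flu"},
        ]

        # The third record is unique on (zip, age), so 2-anonymity fails.
        print(is_k_anonymous(rows, ["zip", "age"], k=2))  # False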

  62.

    See Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,” UCLA Law Review Vol. 57 (2010): 1701.
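    Ohm’s point can be illustrated with a hypothetical linkage attack (invented data): a data set stripped of names but retaining quasi-identifiers is re-identified by joining it with a public, named register on those same attributes.

        # "Anonymized" records still carry quasi-identifiers.
        anonymized = [
            {"zip": "10115", "birth": "1975-03-02", "sex": "F", "diagnosis": "flu"},
            {"zip": "10117", "birth": "1980-11-30", "sex": "M", "diagnosis": "asthma"},
        ]

        # A public register links the same quasi-identifiers to names.
        public_register = [
            {"name": "A. Example", "zip": "10115", "birth": "1975-03-02", "sex": "F"},
        ]

        QUASI_IDENTIFIERS = ("zip", "birth", "sex")

        for record in anonymized:
            for entry in public_register:
                if all(record[q] == entry[q] for q in QUASI_IDENTIFIERS):
                    # The join re-attaches a name to a "de-identified" record.
                    print(entry["name"], "->", record["diagnosis"])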

  63.

    Some scholars criticize that accountability could become just another ineffective bureaucratic measure, while other scholars see the potential for stronger communication about data processing practices and for verifiable accounts by data processors. The impact and effectiveness of accountability will depend on its actual implementation and its adoption by data processors. A number of contributions to the debate on the principle of accountability can be found in Daniel Guagnin et al., eds., Managing Privacy Through Accountability (Houndmills: Palgrave, 2012).

  64.

    See Mireille Hildebrandt, “Profiling and AML,” in The Future of Identity in the Information Society. Challenges and Opportunities, ed. Kai Rannenberg, Denis Royer and Andre Deuker (Heidelberg: Springer, 2009a), 273–310 and Mireille Hildebrandt, “Technology and the End of Law,” in Facing the Limits of the Law, ed. Erik Claes, Wouter Devroe and Bert Keirsbilck (Heidelberg: Springer, 2009b), 443–465.

  65.

    Melik Özden, “The Right to non-discrimination,” in Series of the Human Rights Programme of the CETIM (2011): 7.

  66.

    Andrea Romei and Salvatore Ruggieri, “Discrimination Data Analysis: A Multi-disciplinary Bibliography,” in Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, ed. Bart Custers et al. (Berlin: Springer, 2013), 121.

  67.

    See Dino Pedreschi, Salvatore Ruggieri, and Franco Turini, “The Discovery of Discrimination,” in Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, ed. Bart Custers et al. (Berlin: Springer, 2013), 91–108.

  68.

    Lee A. Bygrave, Data protection law: Approaching its rationale, logic and limits (The Hague: Kluwer Law International, 2002), 3.

  69.

    Article 20 par. 1: “Every natural person shall have the right not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour.”

  70.

    Article 20 par. 3: “Automated processing of personal data intended to evaluate certain personal aspects relating to a natural person shall not be based solely on the special categories of personal data referred to in Article 9”.

  71.

    Article 20 par. 2: “Subject to the other provisions of this Regulation, a person may be subjected to a measure of the kind referred to in paragraph 1 only if the processing:

    (a) is carried out in the course of the entering into, or performance of, a contract, where the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or where suitable measures to safeguard the data subject’s legitimate interests have been adduced, such as the right to obtain human intervention; or

    (b) is expressly authorized by a Union or Member State law which also lays down suitable measures to safeguard the data subject’s legitimate interests; or

    (c) is based on the data subject’s consent, subject to the conditions laid down in Article 7 and to suitable safeguards.”

  72.

    For weaknesses and strengths of this provision, see Bert-Jaap Koops, “On decision transparency, or how to enhance data protection after the computational turn,” in Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, ed. Mireille Hildebrandt and Katja de Vries (Abingdon: Routledge, 2013), 189–213 and Mireille Hildebrandt, “The Dawn of a Critical Transparency Right for the Profiling Era,” in Digital Enlightenment Yearbook 2012, ed. Jacques Bus et al. (Amsterdam: IOS Press, 2012), 41–56.

  73.

    Article 29 Data Protection Working Party, Advice paper on essential elements of a definition and a provision on profiling within the EU General Data Protection Regulation, available at http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130513_advice-paper-on-profiling_en.pdf.

  74.

    We thank the Italian, Romanian and German DPAs, the EDPS and the Council of Europe for their feedback to our pre-test questionnaire.

  75.

    Within two months we received the completed questionnaire from 18 DPAs: Austria, Bulgaria, Croatia, Estonia, Finland, Germany, Greece, Hungary, Ireland, Italy, Lithuania, Malta, Romania, Slovakia, Slovenia, Sweden, Switzerland, and the United Kingdom. Three DPAs informed us that they would not be able to complete the questionnaire, mainly because of a lack of resources: Denmark, Luxembourg and the Netherlands. However, the DPAs of Luxembourg and the Netherlands provided some information related to the questionnaire. Eight DPAs did not respond: Belgium, Cyprus, Czech Republic, France, Latvia, Poland, Portugal and Spain.

  76.

    As defined by the Council of Europe, mainly (1) collection and storage of data, (2) correlation and analysis of data and (3) practical application of profiles.

  77.

    According to Article 15(1): “every person has the right not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.” At the same time, Article 15(2) provides an exception: “a person may nevertheless be subjected to an automated individual decision if that decision: (a) is taken in the course of the entering into or performance of a contract, provided the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or that there are suitable measures to safeguard his legitimate interests, such as arrangements allowing him to put his point of view; or (b) is authorized by a law which also lays down measures to safeguard the data subject’s legitimate interests”.

  78.

    Profiling is envisaged in the German Telemedia Act for the purposes of advertising, market research, or designing the telemedia in a needs-based manner; in Italy, legal provisions on the assessment of income tax provide for a type of profiling.

  79.

    Finland has a Guide on the processing of personal data in the context of direct marketing and a Guide on a data subject’s right of access to his/her data, which serve as official guidance; the UK has Guides on subject access rights.

  80.

    The Austrian Data Protection Commission took nine decisions which may serve as a guideline for its activities, available online at http://www.ris.bka.gv.at/Dsk/; the Italian DPA has issued several decisions on profiling, for example on loyalty cards, on customer profiling as carried out by telecom operators, in the employment sector, and in respect of interactive TV.

  81.

    The former Hungarian commissioner for data protection and freedom of information issued a report, in cooperation with the commissioner for ethnic and minority rights, on the processing of data relating to ethnic origin; the Irish DPA provides general information and advice on the right of access of data subjects to their personal data, but not specifically tailored to the issue of automated profiling; Slovenia issued a couple of non-binding opinions on a creditworthiness system, addressed both to the data controller and to data subjects; the Swedish DPA has published a leaflet on Article 12 that contains information about which public and private actors process personal data and on how to exercise the right of access to personal data; the Swiss DPA has provided guidance on subject access rights.

  82.

    Committee on Civil Liberties, Justice and Home Affairs of the European Parliament, Draft report on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), 17/12/2012. Available online at: http://www.europarl.europa.eu/meetdocs/2009_2014/documents/libe/pr/922/922387/922387en.pdf.

  83.

    Opinion of the European Data Protection Supervisor on the data protection reform package. 7/03/2012.

  84.

    Article 29 Data Protection Working Party. Advice paper on essential elements of a definition and a provision on profiling within the EU General Data Protection Regulation. 13/05/2013.

  85.

    The definition proposed states that: “Profiling means any form of automated processing of personal data, intended to analyse or predict the personality or certain personal aspects relating to a natural person, in particular the analysis and prediction of the person’s health, economic situation, performance at work, personal preferences or interests, reliability or behaviour, location or movements”.

  86.

    The main proposals for improvement concern the scope (“It […] welcomes Rapporteur Albrecht’s proposal to broaden the scope of Article 20 covering processing of personal data for the purpose of profiling or measures based on profiling. The Working Party regards this as a necessary step towards more legal certainty and more protection for individuals with respect to data processing in the context of profiling”); greater transparency and control for data subjects; more responsibility and accountability of data controllers; a balanced approach to profiling; and the role of the EDPS.

  87.

    Bulgaria considers that there is a good balance between individual rights and data controllers’ activity, and between the general prohibition on processing sensitive data and its exceptions; Hungary supports the Article; Croatia, Slovakia and Sweden do not have any comments or objections.

  88.

    Austria, Bulgaria (but only for sensitive data), Croatia (not explicitly mentioned in the Data Protection Act), Hungary (audit and impact assessment envisaged as a prior checking procedure), Italy, Malta, Slovakia, Slovenia, UK (prior checking is in the Data Protection Act but has never been enforced).

  89.

    Austria (before a court, but not in practice), Bulgaria (under civil law, not under the Data Protection Act, which foresees administrative penalties, i.e. fines/sanctions), Croatia (before a court of general jurisdiction), Estonia (no further detail), Finland (before a district court in civil procedure), Germany (the DPA for the non-public sector can impose a fine, and civil procedure also applies), Greece (under civil procedure), Hungary (through civil and other procedures), Ireland (no direct compensation before the DPA, but possible through civil procedure), Italy (judicial authorities’ competence), Lithuania (civil court competence), Malta (civil court competence), Romania (court of law competence), Slovakia (not under the DPA, but through a civil court), Slovenia (civil law gives competence to the courts and relevant authorities), Sweden (the Data Protection Act envisages compensation, but not specific to profiling), UK (court competence). Switzerland did not answer.

  90.

    Answer of the Italian DPA (‘Garante’): “On 21 June 2011, our DPA adopted a decision concerning the profiling carried out by the National Printing Institution on its employees, in particular as a result of the monitoring of the employees’ activities on the Internet. In such decision our DPA prohibited the unlawful data processing operations which had been carried out, inter alia, without informing the data subjects and notifying the processing to the Garante (http://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/1829641).”

References

  • Anrig, Bernhard, Will Brown, and Mark Gasson. “The Role of Algorithms in Profiling.” In Profiling the European Citizen. Cross-Disciplinary Perspectives, edited by Mireille Hildebrandt and Serge Gutwirth, 65–87. Dordrecht: Springer, 2008.

  • Baskarada, Sasa and Andy Koronios. “Data, Information, Knowledge, Wisdom (DIKW): A Semiotic Theoretical and Empirical Exploration of the Hierarchy and its Quality Dimension.” Australasian Journal of Information Systems 18, 1 (2013): 5–24.

  • Brakel, Rosamunde van and Paul de Hert. “Policing, surveillance and law in a pre-crime society: Understanding the consequences of technology based strategies.” Cahier Politiestudies 20 (2011): 163–192.

  • Bygrave, Lee A. Data protection law: Approaching its rationale, logic and limits. The Hague: Kluwer Law International, 2002.

  • Calders, Toon and Indrė Žliobaitė. “Why Unbiased Computational Processes Can Lead to Discriminative Decision Procedures.” In Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, edited by Bart Custers et al., 43–57. Berlin: Springer, 2013.

  • Canhoto, Ana and James Backhouse. “General Description of Behavioural Profiling.” In Profiling the European Citizen. Cross-Disciplinary Perspectives, edited by Mireille Hildebrandt and Serge Gutwirth, 47–63. Dordrecht: Springer, 2008.

  • Chang, LiWu and Ira S. Moskowitz. “An Integrated Framework for Database Privacy Protection.” In Data and Application Security, edited by Bhavani Thuraisingham, Reind van de Riet, Klaus R. Dittrich, and Zahir Tari, 161–72. IFIP International Federation for Information Processing 73. Springer US, 2001. http://link.springer.com/chapter/10.1007/0-306-47008-X_15.

  • Clarke, Roger. “Profiling: A Hidden Challenge to the Regulation of Data Surveillance.” Journal of Law and Information Science 4, 2 (1993): 403–419.

  • Degele, Nina, Einführung in die Techniksoziologie. Stuttgart: UTB, 2002.

  • Deleuze, Gilles. “Postskriptum über die Kontrollgesellschaften” In Unterhandlungen 1972–1990, Gilles Deleuze, 254–262. Frankfurt a.M.: Suhrkamp, 1992.

  • Deutscher Bundestag, “Automatisierte Strafverfolgung, Data Mining und sogenannte erweiterte Nutzung von Daten in polizeilichen Informationssystemen.” Drucksache 17/11582, 22 November 2012, http://dip21.bundestag.de/dip21/btd/17/115/1711582.pdf, last accessed 26 March 2014.

  • Deutscher Bundestag, “Computergestützte Polizeitechnik bei Polizeibehörden.” Drucksache 17/8544 (neu), 06 Feb 2012, http://dipbt.bundestag.de/dip21/btd/17/085/1708544.pdf, last accessed 01 April 2014.

  • Finn, Rachel L, David Wright and Michael Friedewald. “Seven Types of Privacy.” In European Data Protection: Coming of Age, edited by Serge Gutwirth et al., 3–32, Dordrecht: Springer, 2013.

  • Fraunhofer IAIS. “Big Data – Vorsprung durch Wissen. Innovationspotenzialanalyse.” http://www.bigdata.fraunhofer.de/content/dam/bigdata/de/documents/FraunhoferIAIS_Big-Data-Analyse_Doku.pdf, last accessed 01 April 2014.

  • Fuster, Gloria González, Serge Gutwirth, and Erika Ellyne. “Profiling in the European Union: A high-risk practice.” INEX Policy Brief 10 (2010): 1–12.

  • Gellert, Raphael and Serge Gutwirth. “Beyond accountability, the return to privacy?” In Managing Privacy through Accountability, edited by Daniel Guagnin et al., 261–284. Houndmills: Palgrave Macmillan, 2012.

  • Guagnin, Daniel et al., eds. Managing Privacy Through Accountability. Houndmills: Palgrave, 2012.

  • Gutwirth, Serge and Mireille Hildebrandt. “Some Caveats on Profiling.” In Data protection in a profiled world, edited by Serge Gutwirth, Yves Poullet and Paul de Hert, 31–41. Dordrecht: Springer, 2010.

  • Gutwirth, Serge and Paul de Hert. “Regulating profiling in a Democratic Constitutional State.” In Profiling the European Citizen. Cross-Disciplinary Perspectives, edited by Mireille Hildebrandt and Serge Gutwirth, 272–293. Dordrecht: Springer, 2008.

  • Hildebrandt, Mireille. “The Dawn of a Critical Transparency Right for the Profiling Era”. In Digital Enlightenment Yearbook 2012, edited by Jacques Bus, Malcolm Crompton, Mireille Hildebrandt, George Metakides, 41–56. Amsterdam: IOS Press, 2012.

  • Hildebrandt, Mireille. “Defining profiling: a new type of knowledge?” In Profiling the European Citizen. Cross-Disciplinary Perspectives, edited by Mireille Hildebrandt and Serge Gutwirth, 17–47. Dordrecht: Springer, 2008.

  • Hildebrandt, Mireille. “Profiling and AML.” In The Future of Identity in the Information Society. Challenges and Opportunities, edited by Kai Rannenberg, Denis Royer and André Deuker, 273–310. Heidelberg: Springer, 2009a.

  • Hildebrandt, Mireille. “Profiling: from Data to Knowledge. The challenges of a crucial technology.” DuD Datenschutz und Datensicherheit, 30, 9 (2006): 548–552.

  • Hildebrandt, Mireille. “Technology and the End of Law.” In Facing the Limits of the Law, edited by Erik Claes, Wouter Devroe and Bert Keirsbilck, 443–465. Heidelberg: Springer, 2009b.

  • Hildebrandt, Mireille. “Who is Profiling Who? Invisible Visibility.” In Reinventing Data Protection?, edited by Serge Gutwirth et al., 239–252. Dordrecht: Springer, 2009c.

  • Hunko, Andrej. “Suchbewegungen zu Data Mining-Software gehen über gesetzlichen Auftrag des BKA hinaus.” 17 March 2014, http://www.andrej-hunko.de/presse/1934-suchbewegungen-zu-data-mining-software-gehen-ueber-gesetzlichen-auftrag-des-bka-hinaus, last accessed 26 March 2014.

  • Koops, Bert-Jaap. “On decision transparency, or how to enhance data protection after the computational turn.” In Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, edited by Mireille Hildebrandt and Katja de Vries, 189–213. Abingdon: Routledge, 2013.

  • Koops, Bert-Jaap. “Technology and the Crime Society: Rethinking Legal Protection.” Law, Innovation & Technology, 1, 1 (2009): 93–124.

  • Latour, Bruno. “Technology is Society Made Durable.” In A Sociology of Monsters: Essays on Power, Technology and Domination, edited by John Law, 103–131. London: Routledge, 1991.

  • Lianos, Michalis and Mary Douglas. “Dangerization and the End of Deviance: The Institutional Environment.” British Journal of Criminology 40, 2 (2000): 261–278.

  • Lyon, David. “Surveillance as Social Sorting. Computer Codes and Mobile Bodies.” In Surveillance As Social Sorting: Privacy, Risk, and Digital Discrimination, edited by David Lyon, 13–31. London: Psychology Press, 2003.

  • Marx, Gary and Nancy Reichman. “Routinizing the Discovery of Secrets: Computers as Informants.” American Behavioral Scientist 27, 4 (1984): 423–452.

  • Monahan, Torin. “Surveillance as Governance: Social Inequality and the Pursuit of Democratic Surveillance.” In Surveillance and Democracy (2010): 91–110.

  • O’Malley, Pat. “Risk, power and crime prevention.” Economy and Society 21, 3 (1992): 252–275.

  • Ohm, Paul. “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,” UCLA Law Review 57 (2010): 1701–1777.

  • Özden, Melik. “The Right to non-discrimination.” Series of the Human Rights Programme of the CETIM, 2011.

  • Pedreschi, Dino, Salvatore Ruggieri, and Franco Turini. “The Discovery of Discrimination.” In Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, edited by Bart Custers et al., 91–108. Berlin: Springer, 2013.

  • Polakiewicz, Jörg. “Profiling – The Council of Europe’s Contribution.” In European Data Protection: Coming of Age, edited by Serge Gutwirth et al., 367–377. Dordrecht: Springer, 2013.

  • Rodotà, Stefano. “Data Protection as a Fundamental Right.” In Reinventing Data Protection?, edited by Serge Gutwirth et al., 77–82. Dordrecht: Springer, 2009.

  • Romei, Andrea and Salvatore Ruggieri. “Discrimination Data Analysis: A Multi-disciplinary Bibliography.” In Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, edited by Bart Custers et al., 109–135. Berlin: Springer, 2013.

  • Roosendaal, Arnold. Digital Personae and Profiles in Law. Protecting Individuals’ Rights in Online Contexts. Oisterwijk: Wolf Legal Publishers, 2013.

  • Rouvroy, Antoinette and Yves Poullet. “The right to informational self-determination and the value of self-development. Reassessing the importance of privacy for democracy.” In Reinventing Data Protection?, edited by Serge Gutwirth et al., 45–76. Dordrecht: Springer, 2009.

  • Sackmann, Stefan, Jens Strüker, and Rafael Accorsi. “Personalization in Privacy-aware Highly Dynamic Systems.” Communications of the ACM 49, no. 9 (September 2006): 32–38. doi:10.1145/1151030.1151052.

  • Scharff, Thomas. “Erfassen und Erschrecken. Funktionen des Prozeßschriftguts der kirchlichen Inquisition in Italien im 13. und frühen 14. Jahrhundert.” In Als die Welt in die Akten kam. Prozeßschriftgut im europäischen Mittelalter, edited by Susanne Lepsius and Thomas Wetzstein, 255–274. Frankfurt a.M.: Vittorio Klostermann, 2008.

  • Schermer, Bart. “Risks of profiling and the limits of data protection law.” In Discrimination and Privacy in the Information Society. Data Mining and Profiling in Large Databases, edited by Bart Custers et al., 137–154. Berlin: Springer, 2013.

  • Solove, Daniel J. “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy.” San Diego Law Review 44 (2007): 745–772.

  • Solove, Daniel J. The Digital Person: Technology and Privacy in the Information Age. New York: New York University Press, 2004.

  • Streibich, Karl-Heinz. “Big Smart Data. Mehrwert für Unternehmen.” Paper presented at the Big Data Days, Berlin, Germany, November 11–12, 2013.

  • Krasmann, Susanne. “Der Präventionsstaat im Einvernehmen. Wie Sichtbarkeitsregime stillschweigend Akzeptanz produzieren.” In Sichtbarkeitsregime: Überwachung, Sicherheit und Privatheit im 21. Jahrhundert, edited by Leon Hempel, Susanne Krasmann and Ulrich Bröckling, 53–70. Wiesbaden: VS Verlag, 2010.

  • Trudel, Pierre. “Privacy Protection on the Internet: Risk Management and Networked Normativity.” In Reinventing Data Protection?, edited by Serge Gutwirth et al., 317–334. Dordrecht: Springer, 2009.

  • Vedder, Anton. “KDD: The challenge to individualism.” Ethics and Information Technology (1999): 275–281.

  • Verykios, Vassilios S. et al. “State-of-the-art in Privacy Preserving Data Mining.” SIGMOD Record 33, no. 1 (March 2004): 50–57. doi:10.1145/974121.974131.

  • Zakaria, Fareed. “The rise of illiberal democracy.” Foreign Affairs 76, 6 (1997): 22–43.

  • Zarsky, Tal Z. “‘Mine Your Own Business!’: Making The Case For The Implications Of The Data Mining Of Personal Information in the Forum of Public Opinion.” Yale Journal of Law & Technology 5 (2002–2003): 1–56.

Author information

Correspondence to Francesca Bosco.

Copyright information

© 2015 Springer Science+Business Media Dordrecht

Cite this chapter

Bosco, F., Creemers, N., Ferraris, V., Guagnin, D., Koops, B.J. (2015). Profiling Technologies and Fundamental Rights and Values: Regulatory Challenges and Perspectives from European Data Protection Authorities. In: Gutwirth, S., Leenes, R., de Hert, P. (eds) Reforming European Data Protection Law. Law, Governance and Technology Series, vol. 20. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-9385-8_1
