
Profiling and Automated Decision-Making: Legal Implications and Shortcomings

Part of the book series: Perspectives in Law, Business and Innovation (PLBI)

Abstract

The increased use of profiling and automated decision-making systems raises a number of challenges and concerns. The underlying algorithms embody a considerable potential for discrimination and unfair treatment. Furthermore, individuals are treated as passive objects of algorithmic evaluation and decision tools and are unable to present their values and positions. They are no longer perceived as individuals in their own right: all that matters is the group they are assigned to. Profiling and automated decision-making techniques also depend on the processing of personal data, and a significant number of the available applications are highly privacy-intrusive. This article analyses how the European General Data Protection Regulation (GDPR) responds to these challenges. In particular, Art. 22 GDPR, which provides the right not to be subject to automated individual decision-making, as well as the information obligations under Art. 13 (2) (f) and Art. 14 (2) (g) GDPR and the access right under Art. 15 (1) (h) GDPR, will be examined in detail. General data protection principles, particularly the principle of fairness, as well as specific German scoring provisions and anti-discrimination rules, are also considered. In conclusion, various shortcomings of the present legal framework are identified and discussed, and a short outlook on potential future steps is presented.


Notes

  1. For a clarification of the term Big Data, see Forgó et al. (2017), pp. 20–22.

  2. Hacker and Petkovka (2017), p. 4.

  3. See Footnote 2.

  4. Hofmann (2016) gives a brief explanation of the characteristics of the Industry 4.0 development, pp. 12–13.

  5. Edwards and Veale (2017), p. 19 and Vedder and Naudts (2017), p. 207.

  6. Art. 4 (4) GDPR; Martini (2018), Art. 22 margin note 21.

  7. Art. 29 Data Protection Working Party (2018), p. 7 and Martini (2018), Art. 22 margin note 21.

  8. Art. 29 Data Protection Working Party (2018), p. 7.

  9. Buchner (2018), Art. 22 margin note 22.

  10. Hladjk (2017), Art. 22 margin note 4.

  11. Kamlah (2003), p. V.

  12. Steppe (2017), p. 783.

  13. Buchner (2018), Art. 22 margin note 4.

  14. Art. 29 Data Protection Working Party (2018), p. 8 and Kamlah (2016), Art. 22 margin note 2.

  15. Art. 29 Data Protection Working Party (2018), p. 8, Edwards and Veale (2017), p. 19 and Kamlah (2016), Art. 22 margin note 2.

  16. McLellan (2016).

  17. Martini (2017), p. 1017.

  18. Martini (2017), p. 1017 and Ernst (2017), p. 1026.

  19. Vedder and Naudts (2017), p. 206.

  20. Vedder and Naudts (2017), p. 210.

  21. Ernst (2017), p. 1028.

  22. See Footnote 21.

  23. Edwards and Veale (2017), p. 25.

  24. McLellan (2016) and Hoffmann-Riem (2017), p. 3.

  25. Datatilsynet, The Norwegian Data Protection Authority (2018), p. 6.

  26. Hoffmann-Riem (2017), p. 3 and McLellan (2016).

  27. Edwards and Veale (2017), p. 19.

  28. Whether these examples constitute automated decision-making in the sense of Art. 22 (1) or (4) GDPR is discussed in Sect. 4.1.1.1.

  29. IT Finanzmagazin (2017).

  30. Schwichtenberg (2015), p. 378.

  31. Lüdemann et al. (2014), p. 304 and Schwichtenberg (2015), p. 379.

  32. Schwichtenberg (2015), p. 378; Lüdemann et al. (2014), pp. 302–303.

  33. Lüdemann et al. (2014), p. 304.

  34. Der Tagesspiegel (2018) and Ernst (2017), p. 1026.

  35. Der Tagesspiegel (2018) and Schönhaar (2018).

  36. Steppe (2017), p. 781.

  37. Zuiderveen Borgesius and Poort (2017), p. 350.

  38. See Footnote 37.

  39. James (2015).

  40. See Footnote 37.

  41. Kramer (2018).

  42. Kramer (2018) and Ernst (2017), pp. 1027–1028.

  43. Ernst (2017), p. 1029 and Vedder and Naudts (2017), p. 209.

  44. Ernst (2017), p. 1029.

  45. McLellan (2016), Edwards and Veale (2017), p. 28 and Schermer (2011), p. 47.

  46. Edwards and Veale (2017), p. 28.

  47. Edwards and Veale (2017), p. 28 and Ernst (2017), pp. 1028–1029.

  48. Martini (2017), p. 1018.

  49. Edwards and Veale (2017), p. 29 and Ernst (2017), p. 1032.

  50. Hacker and Petkovka (2017), p. 8.

  51. See Footnote 21.

  52. Schermer (2011), p. 47, provides an explanatory example.

  53. Ernst (2017), p. 1030 and Jandt (2015), p. 8.

  54. See also Schaar (2016), Edwards and Veale (2017), pp. 19–20 and Hladjk (2017), Art. 22 margin note 3.

  55. A decision tree is a simple model which provides a high degree of transparency. For further information, see Datatilsynet, The Norwegian Data Protection Authority (2018), p. 13. An illustrative example is provided by Binns (2017).
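To make the transparency point concrete, here is a minimal sketch in Python (using the scikit-learn library; the applicant data and feature names are invented for illustration and are not taken from the chapter or its sources) of how a trained decision tree can be rendered as human-readable if/then rules:

```python
# Minimal sketch: a decision tree's learned logic can be printed as nested
# if/then rules, which is why it counts as a transparent model.
# The toy data and feature names below are invented for illustration.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [monthly_income_eur, existing_loans]
X = [[1500, 2], [4200, 0], [2300, 1], [5100, 1], [900, 3], [3800, 0]]
y = [0, 1, 0, 1, 0, 1]  # 0 = reject, 1 = accept

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text renders the tree as readable rules, so the rationale behind
# each automated decision can be inspected and communicated.
print(export_text(tree, feature_names=["monthly_income_eur", "existing_loans"]))
```

The printed rules are exactly the kind of "meaningful information about the logic involved" that more opaque models, such as neural networks, cannot provide as directly.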

  56. See Footnote 20.

  57. Vedder and Naudts (2017), pp. 208, 217, Hoffmann-Riem (2017), p. 29 and Martini (2017), p. 1018.

  58. Ernst (2017), pp. 1028–1029, Vedder and Naudts (2017), p. 210 and Clifford Chance (2017).

  59. Ernst (2017), p. 1030 and Clifford Chance (2017).

  60. An explanation of the term “neural networks” can be found in Datatilsynet, The Norwegian Data Protection Authority (2018), p. 14.

  61. McLellan (2016), Knight (2017), Ernst (2017), p. 1027, Martini (2017), pp. 1018–1019 and Datatilsynet, The Norwegian Data Protection Authority (2018), p. 12.

  62. Stolberg and Ceccotti (2018), Knoche (2018) and Holzinger et al. (2017).

  63. Edwards and Veale (2017), p. 54.

  64. Edwards and Veale (2017), pp. 32–33 and Hoffmann-Riem (2017), p. 23.

  65. Edwards and Veale (2017), p. 33 and Hacker and Petkovka (2017), p. 7.

  66. Hacker and Petkovka (2017), p. 7.

  67. Hildebrandt (2009), p. 243.

  68. See Footnote 66.

  69. Edwards and Veale (2017), p. 32.

  70. Art. 29 Data Protection Working Party (2018), p. 9.

  71. Hildebrandt (2009), p. 242.

  72. Hildebrandt (2009), p. 244.

  73. Kühling and Martini (2016), pp. 448–449.

  74. For details regarding the material and territorial scope, see Art. 3 and 4 GDPR.

  75. Kamlah (2016), Art. 22 margin note 2; Schulz (2017), Art. 22 margin note 4.

  76. Schulz (2017), Art. 22 margin note 4.

  77. Art. 29 Data Protection Working Party (2018), p. 19, Schulz (2017), Art. 22 margin note 5; Martini (2018), Art. 22 margin note 1. A different view is expressed by Kamlah (2016), Art. 22 margin note 4. Wachter et al. (2017), p. 95, conclude that the formulation chosen is critically ambiguous.

  78. Helfrich (2017), Art. 22 margin note 44 and Malgieri and Comandé (2017), p. 9. A more restrictive view is taken by Wachter et al. (2017), p. 92.

  79. Malgieri and Comandé (2017), p. 251.

  80. Lüdemann et al. (2014), p. 304 and Steppe (2017), p. 783.

  81. Art. 29 Data Protection Working Party (2018), p. 21 and Schulz (2017), Art. 22 margin notes 14–15.

  82. See Footnote 79.

  83. Edwards and Veale (2017), p. 45.

  84. See also Vedder and Naudts (2017), pp. 216–217.

  85. Buchner (2018), Art. 22 margin note 24.

  86. Steppe (2017), p. 784. A more restrictive view is taken by von Lewinski (2018), Art. 22 margin note 28.

  87. Schulz (2017), Art. 22 margin note 27.

  88. The European Data Protection Board (2018), the “successor” of the Art. 29 Data Protection Working Party, has endorsed the Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679, 17/EN WP251rev.01.

  89. Art. 29 Data Protection Working Party (2018), p. 22.

  90. For more details see Sect. 2.2.4.

  91. Ernst (2017), pp. 1034–1035.

  92. Schulz (2017), Art. 22 margin note 25.

  93. Schulz (2017), Art. 22 margin note 24; Buchner (2018), Art. 22 margin note 26.

  94. Steppe (2017), p. 784.

  95. See Footnote 89.

  96. See Footnote 91.

  97. Malgieri and Comandé (2017), p. 11, also refer in that context to the EU Unfair Commercial Practices Directive, stating that “the protection against pervasive marketing manipulation is already legally recognized as a legitimate interest” and arguing that “the impairing influence and pervasive manipulation of consumers vulnerability performed through algorithmic decision-making can be considered a ‘significant effect’ in terms of Article 22(1)”.

  98. See Footnote 89.

  99. Art. 29 Data Protection Working Party (2018), p. 23.

  100. Schulz (2017), Art. 22 margin notes 29–30 and Hladjk (2017), Art. 22 margin note 11.

  101. Schulz (2017), Art. 22 margin note 30.

  102. Hladjk (2017), Art. 22 margin note 11.

  103. Art. 29 Data Protection Working Party (2018), p. 13.

  104. Hoffmann-Riem (2017), p. 22.

  105. Hoffmann-Riem (2017), pp. 22–23.

  106. See Footnote 104.

  107. See Footnote 103.

  108. Martini (2017), p. 1019.

  109. See Footnote 104.

  110. Lüdemann et al. (2014), p. 305.

  111. For further remarks on that issue, see Schwichtenberg (2015), p. 380.

  112. Steppe (2017), pp. 777–778.

  113. If the automated individual decision-making is authorized by Union or Member State law, the law must also provide suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests (Art. 22 (2) (b) GDPR).

  114. Art. 29 Data Protection Working Party (2018), p. 27.

  115. For example, Wachter et al. (2017) deny such a right on the basis of Art. 22 (3) GDPR, pp. 79–80. Malgieri and Comandé (2017) argue for a right to an ex post explanation, pp. 12–13.

  116. Wachter et al. (2017), p. 80.

  117. Wachter et al. (2017), p. 81.

  118. Malgieri and Comandé (2017), pp. 12–13 and Wachter et al. (2017), p. 80.

  119. Malgieri and Comandé (2017), pp. 12–13.

  120. See Footnote 116.

  121. See Footnote 114.

  122. See Footnote 114.

  123. Art. 29 Data Protection Working Party (2018), p. 27; however, this reference is not particularly illuminating for this specific issue, as the Art. 29 Working Party seems to understand Art. 13 (2) (f) or Art. 14 (2) (g) GDPR and Art. 15 (1) (h) GDPR as not providing a right to an ex post explanation (see Sect. 4.1.1.4.2).

  124. Datatilsynet, The Norwegian Data Protection Authority (2018), p. 21.

  125. Art. 12 GDPR must be complied with, too. Inter alia, data subjects need to be informed in a “concise, transparent, intelligible and easily accessible form, using clear and plain language…”. “The controller shall also facilitate the exercise of the data subject rights under Art. 15 to 22 GDPR…”. In general, information shall be provided free of charge.

  126. Art. 13 GDPR applies where the personal data are collected from the data subject. Art. 14 GDPR is relevant if the personal data have not been obtained from the data subject.

  127. Malgieri and Comandé (2017), pp. 13–16.

  128. If pre-defined simplistic or linear models are used, it would in principle be possible to give information about the rationale ex ante (Wachter et al. 2017, p. 79).
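As an illustration of this point, consider the following minimal sketch (the weights, feature names and threshold are invented, purely for illustration): with a pre-defined linear scoring rule, the complete decision rationale is fixed before any individual case is processed and could therefore be communicated ex ante:

```python
# Minimal sketch: in a pre-defined linear scoring model the entire rationale
# (weights, bias, threshold) exists before any decision is taken, so it can
# be disclosed ex ante. All numbers and feature names here are invented.
WEIGHTS = {"monthly_income_eur": 0.004, "existing_loans": -1.5}
BIAS = -5.0
THRESHOLD = 0.0  # scores of at least 0 lead to acceptance

def score(applicant: dict) -> float:
    """Weighted sum of the applicant's features plus a fixed bias."""
    return BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)

def decide(applicant: dict) -> str:
    return "accept" if score(applicant) >= THRESHOLD else "reject"

# The rationale can be stated in advance of any concrete case, e.g.:
# "each euro of monthly income adds 0.004 to the score, each existing loan
# subtracts 1.5, and a score of at least 0 leads to acceptance."
print(decide({"monthly_income_eur": 2300, "existing_loans": 1}))  # -> accept
```

With an adaptive or non-linear model, by contrast, no such fixed rule exists in advance, which is why ex ante information about the rationale becomes difficult.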

  129. See Sect. 4.1.1.4.1.

  130. Wachter et al. (2017), pp. 83–84.

  131. Malgieri and Comandé (2017), pp. 13–14.

  132. For details see Malgieri and Comandé (2017), p. 4 and Wachter et al. (2017), pp. 83–84.

  133. Art. 29 Data Protection Working Party (2018): “Article 15 (1) (h) entitles data subjects to have the same information about solely automated decision making, including profiling, as required under Art. 13 (2) (f) and 14 (2) (g), namely:

    • the existence of automated decision making, including profiling;

    • meaningful information about the logic involved, and;

    • the significance and envisaged consequences of such processing for the data subject.

    The controller should have already given the data subject this information in line with their Article 13 obligations.” pp. 26–27.

  134. Art. 29 Data Protection Working Party (2018): “Article 15 (1) (h) says that the controller should provide the data subject with information about the envisaged consequences of the processing, rather than an explanation of a particular decision. Recital 63 clarifies this by stating that every data subject should have the right of access to obtain ‘communication’ about automatic data processing, including the logic involved, and at least when based on profiling, the consequences of such processing.” p. 27.

  135. Section 4.1.1.4.1.

  136. Automated decision-making in the sense of Art. 22 (1) GDPR.

  137. Art. 29 Data Protection Working Party (2018), p. 25.

  138. See Footnote 137.

  139. See Footnote 137.

  140. See Footnote 137.

  141. See Footnote 114.

  142. Art. 29 Data Protection Working Party (2018), p. 27. The Norwegian Data Protection Authority also requires that the data subject is informed about “how the data is to be weighted and correlated” (Datatilsynet, The Norwegian Data Protection Authority (2018), p. 21).

  143. BGH (Bundesgerichtshof), judgement from 28 January 2014—VI ZR 156/13; Ernst (2017), p. 1033.

  144. See Footnote 20.

  145. Edwards and Veale (2017), p. 22.

  146. Malgieri and Comandé (2017), p. 22.

  147. See Footnote 146.

  148. Bäcker (2018), Art. 13 margin note 54.

  149. Bäcker (2018), Art. 13 margin note 54; Malgieri and Comandé (2017), p. 22.

  150. McLellan (2016), Knight (2017), Ernst (2017), p. 1027 and Martini (2017), pp. 1018–1019.

  151. Datatilsynet, The Norwegian Data Protection Authority (2018), p. 19.

  152. Schmidt-Wudy (2018), Art. 13 margin note 2; Paal and Hennemann (2018), Art. 13 margin note 4.

  153. Section 4.1 (first paragraph); for more information see Art. 29 Data Protection Working Party (2018), pp. 12–14.

  154. Edwards and Veale (2017), pp. 32–33; detailed elaborations on Big Data and the purpose limitation principle can be found in Forgó, Hänold and Schütze (2017), pp. 17–42.

  155. With regard to the other principles, see Art. 29 Data Protection Working Party (2018), pp. 9–15.

  156. For information on the scope of application of Art. 22 (1) GDPR, see Sect. 4.1.1.1.

  157. Datatilsynet, The Norwegian Data Protection Authority (2018), p. 22.

  158. See Footnote 137.

  159. Weichert (2014), p. 170 and Edwards and Veale (2017), p. 77.

  160. For details, see Art. 21 (1) GDPR.

  161. Gesetz zur Anpassung des Datenschutzrechts an die Verordnung (EU) 2016/679 und zur Umsetzung der Richtlinie (EU) 2016/680 (Datenschutz-Anpassungs- und -Umsetzungsgesetz EU–DSAnpUG-EU); English version available at: https://www.bvdnet.de/wp-content/uploads/2017/08/BMI_%C3%9Cbersetzung_DSAnpUG-EU_mit_BDSG-neu.pdf. Accessed 17 May 2018.

  162. Greve (2017), p. 737.

  163. Buchner (2018), § 31 BDSG margin notes 4–5; Taeger (2017), pp. 3–9.

  164. Buchner (2018), § 31 BDSG margin notes 6–7.

  165. English version available at: https://www.gesetze-im-internet.de/englisch_gg/. Accessed 17 May 2018.

  166. Bundesverfassungsgericht (Federal Constitutional Court), Decision from 11 April 2018—1 BvR 3080/09.

  167. Bundesverfassungsgericht: “Grundsätzlich gehört es zur Freiheit jeder Person, nach eigenen Präferenzen darüber zu bestimmen, mit wem sie unter welchen Bedingungen Verträge abschließen will” (“In principle, it is part of every person’s freedom to determine, according to their own preferences, with whom and under which conditions they want to conclude contracts”) (Decision from 11 April 2018—1 BvR 3080/09).

  168. Allgemeines Gleichbehandlungsgesetz (AGG); English version available at: http://www.antidiskriminierungsstelle.de/SharedDocs/Downloads/DE/publikationen/AGG/agg_in_englischer_Sprache.pdf;jsessionid=1834417026F099B42C9B8BB560277233.2_cid332?__blob=publicationFile&v=3. Accessed 17 May 2018.

  169. See, e.g., Council Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin, Official Journal L 180, 19/07/2000, pp. 0022–0026; Council Directive 2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation, Official Journal L 303, 02/12/2000, pp. 0016–0022.

  170. Ernst (2017), p. 1033.

  171. See Sect. 3.2.

  172. Buchner (2018), Art. 22 margin note 1; Ernst (2017), p. 1030.

  173. Indeed, the European Parliament had proposed to also cover decisions “predominantly” based on automated processing (Wachter et al. 2017, p. 92).

  174. Section 4.1.1.1.1.

  175. Section 4.1.1.1.2.

  176. See Footnote 175.

  177. Section 4.1.1.2.

  178. See Footnote 177.

  179. Section 3.2.

  180. Section 4.1.1.4.

  181. See Footnote 180.

  182. Section 4.1.2.

  183. Section 4.1.1.4.3; a fitting example is the German credit investigation company SCHUFA, which did not want to disclose any information about the comparison groups and the particular weights of the characteristics used for the algorithm to determine creditworthiness (BGH, judgement from 28 January 2014—VI ZR 156/13).

  184. Section 4.1.1.4.3.

  185. Sections 3.2 and 4.1.1.4.4.

  186. Hacker and Petkovka (2017), p. 16; Edwards and Veale (2017), pp. 65–67; Vedder and Naudts (2017), p. 215.

  187. See Sect. 4.1.1.2.

  188. Hacker and Petkovka (2017), p. 19.

  189. Edwards and Veale (2017), p. 67.

  190. See Edwards and Veale (2017), p. 67.

  191. See also Edwards and Veale (2017), p. 67.

  192. Vedder and Naudts (2017), p. 216.

  193. Hacker and Petkovka (2017), p. 17.

  194. It has been suggested to consider expanding the scope of the law by prohibiting discrimination in all cases that are based on algorithmic data assessment (Martini 2017, p. 1021).

  195. Hacker and Petkovka (2017), p. 20.

  196. Vedder and Naudts (2017), p. 217.

  197. See Footnote 187.

  198. See Sect. 3.3.

  199. Martini (2017), p. 1021 and Edwards and Veale (2017), pp. 75–77.

  200. Martini (2017), p. 1021.

  201. Martini (2017), p. 102.

  202. Hacker and Petkovka (2017), pp. 2–42.


Acknowledgements

This work has been supported by the EU project SoBigData (http://www.sobigdata.eu/), which receives funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No. 654024, and the German national project ABIDA (http://www.abida.de/), which has been funded by the Bundesministerium für Bildung und Forschung (BMBF). The author would like to thank Marc Stauch and Julia Pfeiffenbring for their valuable support.

Author information

Correspondence to Stefanie Hänold.


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Hänold, S. (2018). Profiling and Automated Decision-Making: Legal Implications and Shortcomings. In: Corrales, M., Fenwick, M., Forgó, N. (eds) Robotics, AI and the Future of Law. Perspectives in Law, Business and Innovation. Springer, Singapore. https://doi.org/10.1007/978-981-13-2874-9_6


  • DOI: https://doi.org/10.1007/978-981-13-2874-9_6

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-2873-2

  • Online ISBN: 978-981-13-2874-9

  • eBook Packages: Law and Criminology; Law and Criminology (R0)
