
A Public Database as a Way Towards More Effective Algorithm Regulation and Transparency?

Chapter in: Regulating New Technologies in Uncertain Times

Part of the book series: Information Technology and Law Series (ITLS, volume 32)

Abstract

The increasing use of algorithmic decision-making (ADM) systems has led to new and in some cases urgent challenges for the law, specifically in the field of data protection. Decisions made by (classic and “intelligent”) algorithms can make people feel powerless, and the underlying opaqueness makes it hard to understand the reasons for a specific decision. This also increases the danger of discriminatory results, as establishing whether decisions were (indirectly) based on forbidden characteristics becomes increasingly hard. Especially on the private market, the consequences for individuals and society as a whole can be problematic. Much discussion has revolved around the question of how to achieve more transparency in order to strengthen regulation and enable accountability for those using ADM systems. These discussions mostly focus on the transparency-enhancing instruments the General Data Protection Regulation (GDPR) offers. While the GDPR offers a promising array of such instruments for data subjects and public authorities, specific instruments for public transparency are missing. The chapter discusses the notion of a public database that gives graduated access to information concerning ADM systems used by companies, allowing analysis of algorithms’ consequences and enabling individuals to make more informed decisions. Allowing such access would make it necessary to consider affected companies’ justified interests, but could further overall societal trust and acceptance while increasing control. The contribution seeks to analyze how some of the GDPR’s provisions (such as Articles 20 and 35) can help with this endeavor, to draw comparisons to similar regulatory approaches in other areas (such as environmental law), and to make specific recommendations for action.


Notes

  1. Article 29 Data Protection Working Party 2016, pp. 4, 5.

  2. Article 57 GDPR.

  3. Bull 2015, pp. 24, 25; cf. also Marsch 2018, pp. 203 et seq. for a methodical outline of the instrumental and accessory nature of the fundamental right to data protection in Article 8 of the EU Charter of Fundamental Rights (CFR).

  4. Natural persons whose data are being processed, Article 4(1) GDPR.

  5. Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). All subsequent articles cited without further description are the GDPR’s.

  6. Basic benchmarks for the lawfulness of processing acts; even though they are concretized and operationalized through other, more specific provisions, violating them can already render a processing operation unlawful, cf. Paal and Pauly 2018, Article 5 para 11; Rossnagel 2018, p. 340.

  7. Recitals are non-binding provisions that give guidance for the interpretation of the GDPR’s operative provisions.

  8. See also Article 12(1).

  9. See Article 57(1)(a).

  10. Laid down in Article 58.

  11. Laid down in Article 35.

  12. Article 33.

  13. Laid down in Articles 40 et seq.

  14. The GDPR’s risk-based approach, which ties the scope of obligations and rights to the risk connected to a specific act of processing, allows, at least in theory, for handling new problems and technical scenarios such as the use of ADM systems by acknowledging their respective risks, thereby avoiding the constant need to adapt existing laws or create new ones.

  15. While Article 13 relates to the collection of data directly from the data subject, Article 14 covers instances in which data were collected from third parties.

  16. The list of information in Article 13 is exhaustive.

  17. Cf. Mendoza and Bygrave 2017, pp. 9 et seq. on interpreting Article 22(1) as a prohibition rather than as a data subject’s right to object.

  18. Recital 71.

  19. Most notably in the case of explicit consent by the data subject.

  20. Wachter et al. 2017, p. 17.

  21. Paal and Pauly 2018, Article 35 para 29, arguing for a basic right to an explanation. Cf. also Malgieri and Comandé 2017, pp. 246 et seq. and Selbst and Powles 2017, emphasizing a functional understanding of such a right as giving at least enough information for a data subject to effectively exercise his or her rights.

  22. Wachter et al. 2017, p. 17, arguing against such a right.

  23. The wording of Article 35(3)(a) is not completely identical to that of Article 22, leaving room for the interpretation that its scope is wider, making impact assessments obligatory, for example, even where a decision is not made exclusively by the automated system.

  24. Here understood as any interested person who is neither a data subject nor part of a DPA, notably NGOs, journalists or other interested individuals.

  25. Cf. Paal and Pauly 2018, Article 13 para 31.

  26. See Wachter et al. 2018, p. 6.

  27. Cf. Malgieri and Comandé 2017, p. 246.

  28. Burrell 2016, p. 7.

  29. See Goodman and Flaxman 2016, p. 3, describing the problem of “uncertainty bias” in connection with incomplete training data.

  30. The current one being Directive 2014/52/EU, amending Directive 2011/92/EU.

  31. Especially environmental associations and NGOs.

  32. See Landmann and Rohmer 2018, § 9.

  33. This acknowledgment of limitation, made in Recital 63 with regard to data subjects’ access rights, applies a fortiori to disclosures to members of the public who are not themselves affected by the processing.

  34. See https://www.startnext.com/openschufa and https://www.openschufa.de. Last accessed 25 August 2018.

  35. Predicting how likely it is that the respective individual will settle his or her bills, pay back loans, etc.

  36. Just as is already being done by OpenSchufa.

  37. See http://www.lebensmittelklarheit.de/. Last accessed 25 August 2018.

  38. See 2.2.1 above.

  39. See, for example, Berreby D (2017) Click to agree with what? No one reads terms of service, studies confirm. https://www.theguardian.com/technology/2017/mar/03/terms-of-service-online-contracts-fine-print. Last accessed 25 August 2018.

  40. Made up of representatives from the DPAs of each Member State and replacing the Article 29 Data Protection Working Party.

  41. Informationsfreiheitsgesetz (IFG), the German Freedom of Information Act.

  42. Through reverse engineering or other similar measures.

  43. Cf. Gierschmann et al. 2018, Article 20 para 23.

  44. Article 29 Data Protection Working Party 2017, p. 10.

  45. See Dwork 2008, p. 1.

  46. An ideal that might not be completely achievable merely by requiring controllers to inform them; see above.

References

  • Article 29 Data Protection Working Party (2016) Guidelines on Transparency under Regulation 2016/679 (wp260rev.01). Available at https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227. Last accessed 25 August 2018

  • Article 29 Data Protection Working Party (2017) Guidelines on the right to data portability (WP 242 rev.01). Available at https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611233. Last accessed 25 August 2018

  • Bull HP (2015) Sinn und Unsinn des Datenschutzes. Mohr Siebeck, Tübingen

  • Burrell J (2016) How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, January–June 2016, 1

  • Dwork C (2008) Differential Privacy: A Survey of Results. In: International Conference on Theory and Applications of Models of Computation (TAMC 2008). Springer, Berlin

  • Gierschmann S, Schlender K, Stentzel R, Veil W (2018) Kommentar Datenschutz-Grundverordnung. Bundesanzeiger Verlag, Cologne

  • Goodman B, Flaxman S (2016) EU Regulations on Algorithmic Decision-Making and a “Right to Explanation”. arXiv:1606.08813v3 [stat.ML]. Available at https://arxiv.org/abs/1606.08813. Last accessed 25 August 2018

  • Landmann R v, Rohmer G (2018) Kommentar Umweltrecht. C.H. Beck, Munich

  • Malgieri G, Comandé G (2017) Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation. International Data Privacy Law 7:243–265

  • Marsch N (2018) Das europäische Datenschutzgrundrecht. Mohr Siebeck, Tübingen

  • Mendoza I, Bygrave LA (2017) The Right not to be Subject to Automated Decisions based on Profiling. University of Oslo Faculty of Law Legal Studies Research Paper Series, No. 2017–20. Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2964855. Last accessed 25 August 2018

  • Paal BP, Pauly DA (2018) Beck’sche Kompakt-Kommentare Datenschutz-Grundverordnung. C.H. Beck, Munich

  • Rossnagel A (2018) Datenschutzgrundsätze – unverbindliches Programm oder verbindliches Recht? Bedeutung der Grundsätze für die datenschutzrechtliche Praxis. Zeitschrift für Datenschutz, 339–344

  • Selbst AD, Powles J (2017) Meaningful information and the right to explanation. International Data Privacy Law 7:233–242

  • Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation. International Data Privacy Law 7:76–99

  • Wachter S, Mittelstadt B, Russell C (2018) Counterfactual Explanations without Opening the Black Box: Automated Decisions and the GDPR. Harvard Journal of Law & Technology


Author information


Correspondence to Florian Wittner.



Copyright information

© 2019 T.M.C. Asser Press and the authors

About this chapter


Cite this chapter

Wittner, F. (2019). A Public Database as a Way Towards More Effective Algorithm Regulation and Transparency? In: Reins, L. (ed) Regulating New Technologies in Uncertain Times. Information Technology and Law Series, vol 32. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-94-6265-279-8_10


  • DOI: https://doi.org/10.1007/978-94-6265-279-8_10


  • Publisher Name: T.M.C. Asser Press, The Hague

  • Print ISBN: 978-94-6265-278-1

  • Online ISBN: 978-94-6265-279-8

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
