
When GDPR-Principles Blind Each Other: Accountability, Not Transparency, at the Heart of Algorithmic Governance

Paul de Hert, Guillermo Lazcoz

DOI https://doi.org/10.21552/edpl/2022/1/7



Transparency has been at the centre of the debate on algorithmic governance. However, when the GDPR was adopted in 2016, the legislator preferred to establish accountability, rather than transparency, as the core of the Regulation's principles. Unfortunately, accountability does not yet seem to be playing the role it was assigned in the data protection ecosystem, at least when it comes to algorithmic decision-making. To turn this scenario around, we propose a reflective exercise in which we look at the concept of accountability and how it was introduced in the GDPR. By emphasising the human element in algorithmic decision-making, we find a systematic and process-oriented accountability present in the GDPR. Following arguments already made in the literature, we hold that this kind of accountability is well suited for algorithmic governance. Moreover, we argue that it could be strengthened by the Commission's proposal for a Regulation on Artificial Intelligence.
Keywords: Accountability | Transparency | GDPR | Algorithmic Decision-Making | Artificial Intelligence

Paul de Hert, Faculty of Law and Criminology, Vrije Universiteit Brussel (LSTS), Pleinlaan 2, 1050 Brussels, Belgium; Tilburg University (TILT), the Netherlands; IBOF project 'Improving accountability in human rights law'; EU project LeADS (Legality Attentive Data Scientists, Horizon 2020); <paul.de.hert@vub.be>. Guillermo Lazcoz, Centro de Investigación Biomédica en Red (CIBERER - ISCIII), Monforte de Lemos 3-5, 28029 Madrid, Spain; <guillermo.lazcoz@ehu.eus>. We want to thank Cristina Cocito for her valuable feedback, which helped us to improve this work. Thanks also to the anonymous reviewers for their comments.
