The search returned 4 results.

Two Necessary Approaches to a Privacy-Friendly 2033 (journal article)

Alexander Dix

European Data Protection Law Review, Volume 9 (2023), Issue 3, Page 305 - 310

Niels Bohr, the Danish Nobel prize winner, is known to have said, ‘Prediction is very difficult, especially if it’s about the future!’ It is therefore rather futile to try to predict the future of data and privacy protection. Instead, this article puts forward two conditions (inter alia) which should be met to ensure that privacy and data protection enjoy the same standing as at present, or an even better one. They are of a regulatory and a non-regulatory nature. On the regulatory level, it is suggested that the responsibility for implementing existing legal rules should no longer be restricted to controllers. On the non-regulatory level, the key importance of improving media literacy and raising public awareness as a condition for individual autonomy in the digital age is stressed.

Keywords: artificial intelligence, informational self-determination, media literacy, privacy by design, product liability


Data Protection in 2033: Playing Whac-A-Mole with Injustices? (journal article)

Felix Bieker, Marit Hansen

European Data Protection Law Review, Volume 9 (2023), Issue 4, Page 399 - 408

In our exploration of the future of data protection, we begin our analysis with historic patterns of discrimination in order to see more clearly what the future might hold for data protection. As the past and present inform the future, we proceed with the current and upcoming EU legislation concerning current data practices: the GDPR, the DMA and DSA, as well as the draft AI Act, AI Liability Directive and Platform Workers Directive. We ultimately find that this regulation does not sufficiently address the market incentives underlying many of the current harmful data practices. Instead, we argue that a better future requires a more systemic approach and that the law has to address infrastructures as well as service providers and manufacturers directly, as this is where informational power concentrates. Yet it is also paramount to realise that (data protection) law alone cannot fix the systemic failures created by market dynamics. In conclusion, we argue that in order to break the spiralling cycle of trying to fix harmful technologies only after large players have begun to reap immense profits, we need a more fundamental shift in these financial incentives.

Keywords: GDPR, design justice, big tech, Fundamental Rights Impact Assessment


Privacy Icons: A Risk-Based Approach to Visualisation of Data Processing (journal article, open access)

Zohar Efroni, Jakob Metzger, Lena Mischau, Marie Schirmbeck

European Data Protection Law Review, Volume 5 (2019), Issue 3, Page 352 - 366

Although the institution of consent within the General Data Protection Regulation is intended to facilitate the exercise of personal autonomy, reality paints a different picture. Due to a host of structural and psychological deficits, the process of giving consent is often neither informed nor conducive to self-determination. One key element in addressing this shortcoming is the visualisation of relevant information through icons. This article outlines a risk-based methodology for the selection, design and implementation of such privacy icons. It lays the groundwork for identifying risky data processing aspects as a first step in a larger project of creating a privacy icon set to accompany privacy policies. The ultimate goal of the privacy icons is to assist users in making better informed consent decisions through the visualisation of data processing aspects based on their inherent risks.

Keywords: Privacy Icons, Consent, Risk-Based Approach, Private Autonomy, Legal Design

