
Sparkling Lights in the Going Dark: Legal Safeguards for Law Enforcement’s Encryption Circumvention Measures journal article

Thiago Moraes

European Data Protection Law Review, Volume 6 (2020), Issue 1, Page 41 - 55

This article discusses legal safeguards that could be put in place in European jurisdictions when law enforcement authorities investigating criminal offenses implement circumvention measures to bypass encryption technologies designed to protect the right to privacy of users of electronic communication services and equipment. The analysis is structured in three parts. First, two encryption technologies used by communication applications and devices are explained: end-to-end encryption and full disk encryption. Second, two encryption circumvention measures are discussed: government hacking and unlock orders. The study assesses their effectiveness against those encryption techniques, as well as their degree of invasiveness and potential harm to individuals’ right to privacy. Third, it concludes with a list of possible legal safeguards that could be considered when implementing such measures; these safeguards are defined and discussed on the basis of European case law and an analysis of national legislation. Keywords: encryption; right to privacy; surveillance; going dark

Planet49: Pre-Ticked Checkboxes Are Not Sufficient to Convey User’s Consent to the Storage of Cookies (C-673/17 Planet49) journal article

Agnieszka Jabłonowska, Adrianna Michałowicz

European Data Protection Law Review, Volume 6 (2020), Issue 1, Page 137 - 142

Case C-673/17 Bundesverband der Verbraucherzentralen und Verbraucherverbände - Verbraucherzentrale Bundesverband e.V. v Planet49 GmbH, Judgment of the Court (Grand Chamber) of 1 October 2019. Consent of a website user, required for the lawful storage of information, or access to information already stored, in the form of cookies in his or her terminal equipment, is not validly constituted by way of a pre-ticked checkbox which the user must deselect to refuse consent. The conditions for lawful storage and access are not to be interpreted differently according to whether or not the information stored or accessed on a website user’s terminal equipment qualifies as personal data. The information that the service provider must give a website user, prior to the storage of information in his or her terminal equipment, includes the duration of the operation of the cookies and whether or not third parties may have access to them. Articles 2(f) and 5(3) of Directive 2002/58/EC - Articles 2(h) and 10 of Directive 95/46/EC - Articles 4(11) and 13 of Regulation (EU) 2016/679

Balancing Data Subjects’ Rights and Public Interest Research: Examining the Interplay between UK Law, EU Human Rights Law and the GDPR journal article

Jessica Bell, Stergios Aidinlis, Hannah Smith, Miranda Mourby, Heather Gowans, Susan E Wallace, Jane Kaye

European Data Protection Law Review, Volume 5 (2019), Issue 1, Page 43 - 53

The EU General Data Protection Regulation (‘GDPR’) seeks to balance the public interest in research with privacy rights of individuals, in particular, through research exemptions and safeguards set out in Article 89. While this affords Member States limited opportunities to modify the application of the GDPR at a national level, including for data processing that is necessary for the performance of a task carried out in the public interest, it is necessary for national approaches to conform with Article 89 safeguards where appropriate. One development of interest to the research community in the UK is a statutory power for public authorities to disclose administrative data for research under the Digital Economy Act 2017 (DEA). This article uses the DEA as a case study for analysis of the GDPR provisions governing processing of data for research purposes—including de-identification—and draws on human rights norms and jurisprudence to interpret the broad requirement for ‘appropriate safeguards’ for the ‘rights and freedoms of the data subject’ under Article 89. This analysis is important for data controllers seeking to meet their obligations under the UK framework and for those in other EU Member States considering the development of similar national provisions for data processing for research purposes. Keywords: GDPR, Public Interest Research, Privacy

Privacy Nudges: An Alternative Regulatory Mechanism to ‘Informed Consent’ for Online Data Protection Behaviour journal article

Sheng Yin Soh

European Data Protection Law Review, Volume 5 (2019), Issue 1, Page 65 - 74

The informed consent paradigm of EU data protection law has failed to foster privacy-protective behaviour online, as findings from behavioural science on bounded rationality and asymmetric information show. This article therefore proposes a soft paternalistic approach: the use of ‘privacy nudges’ as an alternative regulatory tool to informed consent, nudging users towards better privacy protection decisions. The article also discusses the potential benefits of privacy nudges, the main critiques of nudging and directions for future improvement. Keywords: Privacy Nudge, Informed Consent, Behavioural Economics

Differential Privacy and the GDPR journal article

Julian Hölzel

European Data Protection Law Review, Volume 5 (2019), Issue 2, Page 184 - 196

Under the European General Data Protection Regulation, anonymisation of personal data is not merely a legal loophole allowing controllers to escape their regulatory burden; in specific circumstances, anonymising personal data can even be a legal duty for controllers. Differential privacy has been proposed as a new approach to the problem of anonymisation. This article assesses the appropriateness of that approach with regard to the legal problem of anonymisation. Keywords: Anonymisation, Differential Privacy, Privacy Model, Model Comparison, Anonymisation Techniques
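For readers unfamiliar with the technique the abstract discusses: differential privacy is typically achieved by adding calibrated random noise to a query result, so that adding or removing any single individual changes the released value only negligibly. A minimal sketch of the standard Laplace mechanism for a counting query follows; the function names, the example data and the epsilon value are illustrative, not drawn from the article.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    # sign(u) * (-ln(1 - 2|u|)) implements the Laplace inverse CDF
    return scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def dp_count(records, predicate, epsilon: float) -> float:
    """Release a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one person changes the
    result by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative data: release how many people are 40 or older.
ages = [23, 45, 31, 62, 38, 54, 29]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
print(noisy)  # the true count (3) plus Laplace noise of scale 1/0.5 = 2
```

A smaller epsilon means more noise and stronger privacy; this tension between utility and anonymisation is exactly what the article evaluates against the GDPR's legal standard.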

Assessing the Legal and Ethical Impact of Data Reuse: Developing a Tool for Data Reuse Impact Assessments (DRIA) journal article

Bart Custers, Helena U Vrabec, Michael Friedewald

European Data Protection Law Review, Volume 5 (2019), Issue 3, Page 317 - 337

In the data economy, many organisations, particularly SMEs, may not be in a position to generate large amounts of data themselves, but may benefit from reusing data previously collected by others. Organisations that do collect large amounts of data may also benefit from reusing such data for purposes other than those originally envisioned. However, under the current EU personal data protection legal framework, constituted by the General Data Protection Regulation, there are clear limits and restrictions on the reuse of personal data: data can only be reused for purposes that are compatible with the original purposes for which the data were collected and processed. This is at odds with the reality of the data economy, in which there is a considerable need for data reuse. To address this issue, this article presents the concept of a Data Reuse Impact Assessment (DRIA), which can be considered an extension of existing Privacy and Data Protection Impact Assessments (PIAs and DPIAs). By adding new elements to these existing tools that specifically focus on the reuse of data and on data ethics, a DRIA can help strike a better balance between the protection of personal data that is being reused and the need for data reuse in the data economy. Using a DRIA may contribute to increased trust among data subjects that their personal data are adequately protected. Data subjects, in turn, may then be willing to share more data, which in the long term may also benefit the data economy. Keywords: Data Reuse, Data Protection, Privacy, Data Protection Impact Assessments, Privacy Impact Assessments

Privacy Icons: A Risk-Based Approach to Visualisation of Data Processing journal article open-access

Zohar Efroni, Jakob Metzger, Lena Mischau, Marie Schirmbeck

European Data Protection Law Review, Volume 5 (2019), Issue 3, Page 352 - 366

Although the institution of consent within the General Data Protection Regulation is intended to facilitate the exercise of personal autonomy, reality paints a different picture. Owing to a host of structural and psychological deficits, the process of giving consent is often neither informed nor conducive to self-determination. One key element in addressing this shortcoming is the visualisation of relevant information through icons. This article outlines a risk-based methodology for the selection, design and implementation of such privacy icons. It lays the groundwork for identifying risky data processing aspects as a first step in a larger project of creating a privacy icon set to accompany privacy policies. The ultimate goal of the privacy icons is to help users make better informed consent decisions through the visualisation of data processing aspects based on their inherent risks. Keywords: Privacy Icons, Consent, Risk-Based Approach, Private Autonomy, Legal Design

European Regulation of Smartphone Ecosystems journal article

Ronan Ó Fathaigh, Joris van Hoboken

European Data Protection Law Review, Volume 5 (2019), Issue 4, Page 476 - 491

For the first time, two pieces of EU legislation will specifically target smartphone ecosystems in relation to smartphone and mobile software (eg, iOS and Android) privacy, and the use and monetisation of data. Yet the two approach data use and data monetisation from radically contrasting perspectives. The first is the proposed ePrivacy Regulation, which seeks to provide enhanced protection against user data monitoring and tracking on smartphones and to safeguard privacy in electronic communications. The second, the recently enacted Platform-to-Business Regulation 2019, seeks to bring fairness to platform-business user relations (including those between app stores and app developers), and is crucially built on the premise that the ability to access and use data, including personal data, can enable important value creation in the online platform economy. This article discusses how these two Regulations will apply to smartphone ecosystems, especially in relation to user and device privacy. It analyses the potential tension points between the two sets of rules, which result from the underlying policy objectives of safeguarding privacy in electronic communications and of ensuring a functioning digital economy in the emerging era of platform governance. The article concludes with a discussion of how to address these issues at the intersection of privacy and competition in the digital platform economy. Keywords: Privacy, Smartphones, Platforms, Governance

Artificial Intelligence in Medical Diagnoses and the Right to Explanation journal article

Thomas Hoeren, Maurice Niehoff

European Data Protection Law Review, Volume 4 (2018), Issue 3, Page 308 - 319

Artificial intelligence and automation are also finding their way into the healthcare sector, with some systems even claiming to deliver better results than human physicians. However, the increasing automation of medical decision-making is accompanied by problems, such as the question of how the relationship of trust between physicians and patients can be maintained, or how decisions can be verified. This is where the right to explanation, enshrined in the General Data Protection Regulation (GDPR), comes into play. This article explains how the right is derived from the GDPR and how it should be established. Keywords: Data Protection, Privacy, AI, Artificial Intelligence, Algorithm