The search returned 41 results.

Machine Learning in Medicine: journal article open-access

Opening the New Data Protection Black Box

Agata Ferretti, Manuel Schneider, Alessandro Blasimme

European Data Protection Law Review, Volume 4 (2018), Issue 3, Page 320 - 332

Artificial intelligence (AI) systems, especially those employing machine learning methods, are often considered black boxes, that is, systems whose inner workings and decisional logic remain fundamentally opaque to human understanding. In this article, we set out to clarify what the new General Data Protection Regulation (GDPR) says on profiling and automated decision-making employing opaque systems. More specifically, we focus on the application of such systems in the domain of healthcare. We conducted a conceptual analysis of the notion of opacity (black box) using concrete examples of existing or envisaged medical applications. Our analysis distinguishes among three forms of opacity: (i) lack of disclosure, (ii) epistemic opacity, and (iii) explanatory opacity. For each type of opacity, we discuss where it originates and how it can be dealt with under the GDPR in the context of healthcare. This analysis offers insights into the contested issue of the explainability of AI systems in medicine and its potential effects on the patient-doctor relationship. Keywords: Artificial Intelligence, Machine Learning, Black Box, Medicine, GDPR, Transparency


Data Portability in Health Research and Biobanking: journal article

Legal Benchmarks for Appropriate Implementation

Gauthier Chassang, Tom Southerington, Olga Tzortzatou, Martin Boeckhout, Santa Slokenberga

European Data Protection Law Review, Volume 4 (2018), Issue 3, Page 296 - 307

This article examines the content of the data portability right (II) and the operationalisation of data portability in the health research context and its related challenges (III), considering both the GDPR provisions and the specific Guidelines of the European Data Protection Board (formerly the Article 29 Data Protection Working Party). We provide an in-depth analysis of the provisions, together with tables to ease the identification of potential implementations of data portability in health research contexts. Keywords: Data Portability, Scientific Research, GDPR, Health Data, Data Subject's Rights, European Union


Machine Learning for Diagnosis and Treatment: journal article

Gymnastics for the GDPR

Robin Pierce

European Data Protection Law Review, Volume 4 (2018), Issue 3, Page 333 - 343

Machine Learning (ML), a form of artificial intelligence (AI) that produces iterative refinement of outputs without human intervention, is gaining traction in healthcare as a promising way of streamlining diagnosis and treatment and is even being explored as a more efficient alternative to clinical trials. ML is increasingly being identified as an essential tool in the arsenal of Big Data for medicine. ML can process and analyse these data, producing outputs that can inform treatment and diagnosis. Consequently, ML is likely to occupy a central role in precision medicine, an approach that tailors treatment based on characteristics of individual patients instead of traditional ‘average’ or one-size-fits-all medicine, potentially optimising outcomes as well as resource allocation. ML falls into a category of data-reliant technologies that have the potential to enhance healthcare in significant ways. However, precisely because it is data-reliant, concerns about data protection and the GDPR may arise as ML assumes a growing role in healthcare, prompting questions about the extent to which the GDPR and related legislation will be able to provide adequate data protection for data subjects. Focusing on issues of transparency, fairness, storage limitation, purpose limitation and data minimisation, as well as specific provisions supporting these principles, this article examines the interaction between ML and data protection law. Keywords: Machine Learning, GDPR, Data Protection, Artificial Intelligence in Medicine, Health Data, Automated Processing, Data Minimisation


Interoperability of EU Databases and Access to Personal Data by National Police Authorities under Article 20 of the Commission Proposals journal article

Teresa Quintel

European Data Protection Law Review, Volume 4 (2018), Issue 4, Page 470 - 482

This contribution assesses data protection concerns relating to the processing of personal data carried out pursuant to Article 20 of the interoperability proposals, which were published by the European Commission on 12 December 2017. The proposals seek to enable all centralised EU databases for security, border and migration management to be interconnected by 2023. Under Article 20 of the proposals, Member States would be permitted to implement national provisions allowing national police authorities to query one of the interoperability components with biometric data taken during an identity check. Such queries are intended to prevent irregular migration and ensure a high level of security within the Union. In particular, this article examines the data protection concerns arising from streamlined law enforcement access to the non-law enforcement databases included in the interoperable framework, and addresses the risk that individuals become subject to unfair processing under Article 20 of the interoperability proposals. Keywords: Interoperability, EU Databases, Biometric Data, Random Police Checks, Directive (EU) 2016/680, GDPR, Irregular Migration


Contesting Automated Decisions: journal article

A View of Transparency Implications

Emre Bayamlioglu

European Data Protection Law Review, Volume 4 (2018), Issue 4, Page 433 - 446

This paper identifies the essentials of a ‘transparency model’ which aims to scrutinise automated data-driven decision-making systems not by the mechanisms of their operation but rather by the normativity embedded in their behaviour/action. First, the paper conceptualises transparency-related concerns and challenges inherent in machine learning as ‘informational asymmetries’, concluding that the transparency requirements for the effective contestation of automated decisions go far beyond the mere disclosure of algorithms. Next, the essential components of a rule-based ‘transparency model’ are described as: i) the data as ‘decisional input’, ii) the ‘normativities’ contained by the system at both the inference and decision (rule-making) levels, iii) the context and further implications of the decision, and iv) the accountable actors. Keywords: Algorithmic Transparency, Automated Decisions, GDPR Article 22


Challenges for Citizen Science and the EU Open Science Agenda under the GDPR journal article open-access

Anna Berti Suman, Robin Pierce

European Data Protection Law Review, Volume 4 (2018), Issue 3, Page 284 - 295

Present discussions on the implications of the GDPR for medical practice and health research mostly target the passive collection of health data. This article shifts the lens of analysis to the scarcely researched and rather different phenomenon of the active sharing of health data within the framework of Citizen Science projects. Starting from this focus, the article queries whether data processing requirements under the GDPR impact the advancement of Citizen Science for health research. A number of tensions between the two aims are identified, both in abstract terms and ‘in practice’, by analysing three Citizen Science scenarios and drawing parallels with the experience of ‘collective’ Clinical Trials. The limited literature on the topic makes this article an exploratory reflection on key tensions, with the aim of opening the way for further research. The discussion is inspired by the need to guarantee that the opportunities of Citizen Science are not unduly curtailed by the advent of the GDPR, but also to ensure that Citizen Science is implemented in ways that are consistent with the GDPR. Keywords: Citizen Science, Open Science, GDPR, Secondary Use of Health Data, Research Exemption