All Talk, No Action? The Effect of the GDPR Accountability Principle on the EU Data Protection Paradigm


The General Data Protection Regulation (2016/679, ’GDPR’) introduced the accountability principle to the field of EU data protection law. The principle aims to increase the controller’s responsibility for its personal data processing and to promote a risk-based approach to data protection. However, accountability, as implemented in the GDPR, fails to meet these objectives. Accountability is sometimes seen as a significant paradigm shift – as a move away from transparency and choice-based data subject control towards company liability. However, the principle does not truly replace the requirements-based approach in the GDPR. Nevertheless, accountability can effectively contribute to EU data protection law by reinforcing other GDPR obligations. This article analyses the contribution of the GDPR accountability principle to EU data protection law, and the effectiveness of the principle in the light of its objectives. Although accountability does not radically change the European data protection paradigm, the principle does contribute to increasing controllers’ responsibility and facilitating enforcement.

Keywords: Accountability | GDPR | Article 5(2) | Risk-based Approach | Big Data

I. Introduction

Accountability is the epitome of the high expectations set for the EU General Data Protection Regulation1 (2016/679, ‘GDPR’) in terms of revolutionising EU data protection law. The principle is also an illustrative example of the gap between expectations and reality.

The GDPR, adopted in 2016 and applicable two years later, was promoted as a revolution bringing EU data protection law into the 21st century through the creation of rules fit for the digital age.2 An important motivator for the reform was citizens’ declining trust in the organisations processing their personal data, especially online.3 The GDPR’s predecessor, the Data Protection Directive4 (95/46/EC, ‘DPD’), dated back to 1995 and struggled to answer the challenges of large-scale online data processing.

The GDPR set out to fix these problems with two objectives: to put citizens back in control of their personal data and to impose greater responsibility and a stricter set of rules on companies whose core business is the processing of personal data.5 The accountability principle lies at the heart of the latter. Article 5(2) of the GDPR defines accountability as an obligation of the controller (the entity determining the purposes and means of processing) to comply with the general principles of data processing defined in Article 5(1), and to demonstrate this compliance.6 The goal of accountability is to incentivise controllers to work towards a higher level of data protection compliance.7 In addition, accountability aims to establish a risk-based approach to data protection by allowing the scalability of data protection obligations, especially in high-risk cases.8 Accountability as an independent requirement, sanctioned in its own right, allowed for the abolition of the DPD’s preliminary notification and authorisation procedures9: the GDPR was set to rely mainly on ex post monitoring by the supervisory authorities. However, accountability struggles to fulfil these goals.

In this paper, I ask how the introduction of accountability affects the paradigm of EU data protection law. I answer this question by examining different arguments and expectations set for the obligation, and subsequently analysing how accountability meets them. I argue that the GDPR did not implement the principle in a way that would enable the accomplishment of its ambitious objectives, namely the introduction of a risk-based approach and an increase in controller focus on privacy protection. Nevertheless, accountability can support compliance with other GDPR obligations and enhance enforcement through improved documentation. Still, to have these effects, accountability needs to be supported by strong enforcement that aligns the controller’s commercial interests with those of data subjects.

This analysis of accountability’s potential and materialised applications contributes to our understanding of the effectiveness of the GDPR. The analysis can also help us identify effective uses for accountability as a new legal instrument. Accountability can become an important principle against which data protection compliance is measured. However, for the obligation to reach its full potential, we need to better understand its function and limitations.

This paper is structured as follows. Firstly, I analyse accountability in the light of the political discussions and drafting of the GDPR. My aim is to understand the expectations and plans covering the fulfilment of this obligation. Secondly, I concentrate on the different ways in which accountability can contribute to data protection in the EU. Thirdly, I use big data processing as an example to show how accountability, as implemented in the GDPR, operates in practice. I conclude by contemplating the different contributions accountability can make to European data protection law.

II. A Principle of High Hopes and Competing Interests

To understand the effects of accountability, it is useful to look back to the legislative process that led up to the GDPR. During the political process, accountability was subject to varying interpretations and heavy lobbying.10 The principle represented both an increase and a decrease in the level of data protection in the EU. A short overview of the political discussion illustrates the tensions and conflicting interests behind accountability, and helps to understand its final form.

In the final text of the GDPR, accountability refers to a controller’s obligation to comply with the Regulation and to demonstrate that it does so. Article 5(2) of the GDPR stipulates that ‘the controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 [general principles of data processing].’ In the same vein, Article 24(1) echoes the same obligation: ‘the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation.’

Initially, the Commission proposed accountability as a solution that enhances the responsibility of companies for their own data processing and reduces administrative duties in the area of data protection.11 The accountability obligation also presented an opportunity to scale data protection obligations to risk level.12 This risk-based approach was promoted as a way to impose stricter obligations on companies processing either large amounts of data or high-risk data.13 At the same time, it was supposed to ease the burden on small and medium-sized enterprises.14 However, proposals for the implementation of accountability varied during the legislative process. The Commission’s original proposal15 listed specific measures for accountability, but the Parliament and the Council struck out this list during the legislative process.16 The measures that the Commission suggested included documentation, data security, data protection impact assessments, prior authorization by and consultation of authorities, and designation of a data protection officer.17 Although these detailed obligations were omitted as indications of accountability, they remain in the final text of the GDPR as independent obligations.18

Other actors also expressed interest in accountability during the drafting process. Many large controllers lobbied for accountability and the risk-based approach, seeing it as a form of self-governance that would replace mandatory legal requirements with a data controller’s own risk-management and accountability measures.19 Meanwhile, opposing actors criticised accountability and the risk-based approach for entrusting controllers with excessive power over their data processing, diminishing the competence of supervisory authorities.20 These two opposing discourses likely affected the final formulation of accountability.

The GDPR defines accountability as a general obligation. Its wording does not specify the concrete measures controllers should implement to indicate compliance.21 Most importantly, the connection between risk assessments and accountability was diminished in the final text, likely as a response to the critique of entrusting controllers with too much power. However, the substance of accountability may nevertheless be seen as relative to the data protection risks present in the processing.22 The greater the risks entailed by a processing activity, the greater accountability should be expected of the controller. However, accountability is required from all controllers regardless of the risk. What varies are the measures needed to show that the obligation is fulfilled.

The level of accountability required varies on a case-by-case basis, leaving considerable room for interpretation. For example, in high-risk scenarios it is not clear whether accountability might require companies to do more than is explicitly prescribed by the GDPR. In principle, the GDPR requires controllers to implement appropriate measures, but not disproportionate efforts.23 Recital 74 specifies that the controller ‘should be obliged to implement appropriate and effective measures and be able to demonstrate the compliance of processing activities with this Regulation, including the effectiveness of the measures.’ Recital 76 explains that appropriate measures ‘should be determined by reference to the nature, scope, context and purposes of the processing.’ From a risk perspective, the GDPR obligates controllers to take the risks to natural persons into account. By contrast, it does not seem to create an active duty to promote the rights and freedoms of data subjects. However, the line between appropriate and disproportionate is not always clear in practice.

In conclusion, accountability under the GDPR is a combination of a high-level obligation and a set of detailed requirements for controllers. In addition, accountability comprises two equally important components: compliance and the ability to prove it.24 However, due to competing interests and expectations during the legislative process, the principle was somewhat attenuated in the final text of the GDPR. Instead of replacing other mandatory obligations, accountability creates an additional layer on top of other obligations. In the next section, I will examine these different compliance mechanisms in more detail.

III. The Three Aspects of Accountability

1. Accountability as an Alternative to Data Subject Control

Accountability aims to augment a controller’s responsibility for its own data processing. While the existence of this responsibility may seem obvious, the controller has not always been the primary actor in data processing. Traditionally, the law has focused on the data subject. Accountability and the risk-based approach are sometimes presented as an alternative to the more traditional model of data protection, which is based on data subject control over personal data through notice and consent.25 That system is built on a controller’s obligation to inform its data subjects about processing, and the data subject’s free choice to allow or prohibit the use of their data.

However, alternatives to data subject control are necessary. The control-based framework is often criticised as inefficient in the online era, in which the sheer volume of data processing makes it impossible for an average data subject to meaningfully understand and exercise choice over all processing of their data.26 Faced with this overwhelming amount of processing, especially online, the average data subject has neither the time nor the competence to get acquainted with everything controllers do with their data.27 Moreover, research shows that even individuals who are aware of their rights and consider them important rarely exercise them.28 This phenomenon, referred to as the Privacy Paradox, is explained by individuals being uncertain about data and its processing and, consequently, easily nudged towards more permissive data choices.29 In other words, information about data processing does not translate into rational choices. The paradox highlights a need for alternative legislative models, in recognition of the imperfect capability of data subjects to use the control vested in them.

Hildén illustrates this problem by arguing that the essence of the Privacy Paradox is not the contrast between an individual’s opinions and actions regarding privacy. Rather, its essence is the duty of control placed on data subjects where the real need is for controllers to fulfil reasonable expectations of privacy.30 Hartzog elaborates: ‘The problem with thinking of privacy as control is that if we are given our wish for more privacy, it means we are given so much control that we choke on it.’31

The accountability principle tries to solve this dilemma. Accountability challenges data subject control by forcing controllers to assume greater responsibility. The obligation shifts the burden of data protection to the controller, who should ensure that its processing is compliant.32 After all, it seems fair to assume that a controller has the information and resources necessary to evaluate its own processing activities. An accountability-based model aims to fulfil the data subject’s reasonable expectations of privacy without encumbering them with unreasonable amounts of information and choices.

Accountability is not the only feature of the GDPR that signals a move away from the idea of data subject control. For example, the new, stricter consent requirements can discourage controllers from relying on the traditional ‘notice and consent’ model. The narrow interpretation of the contract and legitimate interest legal grounds further limits the data subject’s free choice (or the illusion thereof) to exchange their personal data for the provision of services. Furthermore, the principles of data protection by design and default contribute to the controller’s obligation to ensure privacy. Accountability combined with these developments emphasises the controller’s obligation to plan their processing in accordance with fundamental rights, while also limiting the data subject’s responsibility to monitor and control their data.

However, the GDPR does not abandon the idea of data subject control altogether. Despite the criticism, the Regulation continues to strongly emphasise an individual’s control over their own data. Political statements about the Regulation highlight this discourse.33 Even the text of the GDPR enshrines the notion of data subject control in recital 7 by stating simply: ‘Natural persons should have control of their own personal data.’ Moreover, the enhanced data subject rights and strong transparency requirements entrench the notion of control. It seems that the introduction of accountability does not diminish the notion of data subject control but functions as a parallel mechanism.

Although accountability provides a potential alternative to transparency and choice, the GDPR does not live and die by this principle. Instead, the accountability obligation operates in addition to data subject control, strengthening the obligations that empower data subjects, such as transparency and individual rights. Another way in which accountability may reinforce other GDPR obligations is through the risk-based approach.

2. Accountability as a Risk-based Approach

The notion of a risk-based data protection framework is one of the cornerstones of the data protection reform brought about by the GDPR. The risk-based approach reflects an obligation for controllers to take potential risks to data subjects into account when implementing the GDPR.34 The Regulation implements a risk-based approach that both empowers and compels data controllers to assess and mitigate the risks related to their data processing. Accountability is an expression of the risk-based approach because it enhances the controller’s obligation to comply with the GDPR and to demonstrate it.35

A lack of accountability and risk orientation was a critical flaw in the GDPR’s predecessor, the Data Protection Directive, and represents one of the reasons for its inefficient enforcement.36 For example, in its opinion on the Future of Privacy in 2009, the Article 29 Working Party (‘WP 29’)37 argued that the legislation in force at the time had failed to provide effective data protection, offering accountability and a risk-based approach as solutions.38 In its opinion on accountability, the Working Party introduced its view on the concept of accountability as a way to enhance the efficiency of data protection laws, especially towards controllers in data-intensive business areas.39 Such companies, whose data processing represents a high risk to individuals, would have to implement more extensive data protection measures to fulfil their accountability obligation.40 In addition, the Working Party saw accountability as a way to credit companies for voluntarily implementing internal policies or other data protection measures not strictly required by law.41 In other words, the accountability principle was seen as a way to force stricter data protection obligations onto companies processing high-risk data.

On the other hand, the risk-based approach also has the potential to scale data protection obligations so that low-risk activities could be subject to less stringent requirements.42 The risk level determines the measures controllers are required to take.43 Aside from targeting large companies and high-risk processing with more extensive obligations, the principle could also ease the burden on small enterprises and low-risk processing. In addition, the approach guides the work of supervisory authorities, allowing them to allocate their resources primarily to high-risk activities.44

Nevertheless, the risk-based approach is criticised for confusing the roles of controllers and supervisory authorities. The risk-based approach relies on a controller’s ability and willingness to conduct objective risk assessments and to act on the results.45 The approach may incentivise controllers to do more than strictly required by law, but it also allows controllers to do less if they consider their processing activities to be of low risk.46 At the same time, the threshold for authorities to interfere is dependent on the risk level too, directing their resources to high-risk activities.47 As a result, the ultimate responsibility for first assessing the risk level, and then deciding on mitigation measures, lies with the data controller.48 The data protection authorities mainly evaluate whether the controller’s risk decision is reasonable, turning controllers into active parties that consult authorities on their decisions.49

Therefore, auditing risk-based decisions demands of supervisory authorities both the competence and the resources to review and question the decisions made by controllers. If the authorities are not adequately resourced, their ability to investigate and review the measures taken by controllers may be limited, especially with regard to actors conducting complex, large-scale processing operations.50 In order to work properly, the risk-based data protection framework requires strong enforcement. Otherwise, the role of controllers in personal data protection may become overly pronounced.51

The blurred division of duties is further underlined by the fact that under the GDPR risk is assessed both as an outcome (the intrusiveness of the processing activity) and as a process (how the controller limits privacy risks). This approach distinguishes data protection risks from many other legal risk assessments.52 In other areas of law, a risk-based approach usually refers to a system where the presence of risk is seen as a reason for the authority or legislator to interfere.53 For example, in EU cosmetics or food regulation the presence of certain risky substances prevents a product from entering the market altogether.54 In data protection, it seems that the presence of risk rather invites the supervisor to evaluate whether the risk is properly identified and whether mitigation measures are appropriate.

An illustrative example of these roles in action is the prior consultation procedure under Article 36 of the GDPR. The article requires controllers to consult with the supervisory authority when their own risk assessment (DPIA) indicates an increased level of risk that they are unable to mitigate. Therefore, the controller decides the kind of data processing they want to conduct, the implementation of this processing, and ultimately whether the risk level is low enough for the processing to continue.55 The controllers can do less mitigation when they estimate that the processing entails a low risk. This means not only that the risk management measures can be less rigorous, but also that the identification and assessment of the risks can be less thorough.56

While many controllers presumably act in good faith and implement their data processing with the rights and freedoms of data subjects in mind, a number of structural factors may limit a controller’s ability to do so. Firstly, assessing the risks related to complex technical operations can be extremely difficult, meaning that even the most meticulous controller acting in good faith can fail to properly identify and mitigate the risks.57 Secondly, the risk assessments may turn out to be intentionally shallow if a company’s business interests collide with data protection principles.58

According to the GDPR, the controller should take into account ‘the risks of varying likelihood and severity for the rights and freedoms of natural persons’ when implementing measures to comply with the Regulation (Article 24(1)). However, this is where data protection risk assessments fundamentally differ from most business risk assessments. In the area of data protection, the focus is on risks to the fundamental rights of data subjects, not the (commercial) interests of the company making the assessment. Although recital 76 of the GDPR states that risk assessments should be based on an objective assessment of the risks to data subjects59, it seems reasonable to ask whether data controllers can or will be neutral in their risk assessment.60 Since the risk is not assessed from the company’s perspective, the established risk management practices employed by companies may not adequately identify or address data protection risks.61 Moreover, data controllers may be biased when assessing risks from the perspective of data subjects.62 In particular, this may be the case in data-intensive business models where the available volume of data often correlates with revenue potential.63

If companies are incentivised to take higher privacy risks in their pursuit of profits, the major threats to data subject privacy are no longer external but caused by the companies and their business models. The same companies do not necessarily identify or mitigate these risks adequately in their internal processes. Therefore, accountability and the risk-based approach to data protection require efficient supervision to further motivate controllers to manage the risks to data subjects. The GDPR provides for considerable fines and other strong enforcement measures, but supervisory authorities need to put these tools into use to make sure the business risks of companies remain in balance with the risks to data subjects.

This risk of biased assessments and conflicting interests largely contributed to the attenuation of the risk-based approach in the final text of the GDPR. The accountability principle includes an element of risk assessment that scales the measures required from controllers to the riskiness of processing. However, the way the risk-based approach is implemented in the Regulation implies that the GDPR is not fully risk-based but instead employs the approach in addition to its rule-based requirements. On the other hand, risk plays an important role in the supervision and enforcement of the Regulation.

3. Accountability as an Enforcement Tool

The main purpose of accountability is to spell out a specific obligation for controllers to comply with the GDPR and, perhaps more importantly, to document their compliance. These features can be particularly useful for the supervision and enforcement of the GDPR. The accountability principle can facilitate enforcement in three ways: by introducing risk as a criterion for compliance and interference, by providing means to target abusive processing, and by creating a documentation obligation to facilitate supervision. These key ideas may be the most significant contribution of accountability to EU data protection law.

Firstly, accountability acts as a reinforcing factor for other requirements in the Regulation. Accountability constructs criteria to measure compliance with other obligations.64 These criteria are related to the risk of processing to the data subjects: high risks call for a high level of accountability. This approach was underlined by the WP 29 in its statement on the risk-based approach in 2014, issued during the GDPR negotiations. The Working Party stated that the level of accountability could vary depending on the risk level. However, the WP 29 emphasised that the controller is always accountable for its own data processing.65 At the same time, it stated that the accountability tools required in each case may differ depending on the risk level and that not all tools are necessary in all cases.66

As concluded in section III.2, accountability does not make the GDPR fully risk-based but the idea of risk remains integral to the GDPR. Although companies are not entirely free to determine their data protection measures based on their (self-assessed) risk levels, risk represents a means to evaluate the sufficiency of measures. The nature of data processing is a factor in determining the scope of a company’s data protection obligations.67 Moreover, while data protection laws apply equally to everyone and the exceptions are construed narrowly, the GDPR does recognise some scalability in obligations, especially when the company processes large amounts of data.68 In addition, the European Court of Justice has, to some extent, considered risk levels of data processing. For example, in its decision C-131/12 (‘Google Spain’), the Court concluded that the availability of data through a search engine was more intrusive to the individual than the existence of the same information on a publisher’s website.69

Another way for accountability to mitigate data protection risks is by enhancing the competence of authorities to intervene. On its own, accountability could provide means to intervene in abusive processing that does not explicitly breach any individual obligations under the GDPR. It is worth noting that breaches of accountability as such are fineable in the higher category of €20 million or 4% of the global annual turnover of the undertaking (GDPR Article 83(5)). However, so far EU data protection authorities seem to have relied on the accountability principle mostly as an aggravating factor in cases involving other breaches of the GDPR.70

Accountability in the GDPR is a high-level obligation, and the specific measures required to show accountability remain for controllers to determine on a case-by-case basis.71 Accountability can, therefore, capture controller behaviour that does not explicitly breach any specific obligations but nonetheless shows negligence in the protection of personal data.72 Accountability can facilitate enforcement of the GDPR by helping to direct supervision to high-risk processing, to set penalties in proportion, and to transform some of the GDPR’s high-level obligations into specific requirements. Furthermore, accountability increases controller responsibility for data processing. The controller cannot outsource data protection choices to data subjects but must ensure that their own processing is compliant, regardless of any consents or other actions by data subjects.

Finally, accountability also requires controllers to demonstrate their compliance. The documentation required to meet this obligation can materially aid supervision of the GDPR. Supervisory authorities need complete and detailed information about the data processing they are tasked with overseeing. Therefore, companies can greatly facilitate or hinder supervision by controlling the material provided, especially in more complex processing operations. Accountability aims to ensure both that sufficient material exists and that the companies have an incentive to provide it to the authorities. This documentation obligation is particularly important in the context of complex processing activities that may involve new or sophisticated technology, long processing chains, and multiple actors. Understanding this kind of processing in order to supervise it can be challenging, and collecting all the information needed for a complete picture is not always simple. The obligation of the controller to provide documentation facilitates the work of authorities.

To conclude, the three aspects of accountability – as an alternative to data subject control, as a risk-based approach, and as a tool for enforcement – all seem to be somewhat diminished in the final text of the GDPR compared to the expectations at the time of drafting the Regulation. Nevertheless, this does not mean that accountability does not contribute to the level of data protection in the EU. While accountability does not completely replace control-based models of data protection, it significantly increases controller responsibility. Similarly, accountability did not make the GDPR a fully risk-based regulation, but it allows for the consideration of risk, making data protection obligations more scalable than before. Most importantly, accountability facilitates enforcement, thereby increasing the effectiveness of the GDPR. Accountability, together with the other Article 5 principles of processing, has a particularly important role in the supervision of emerging technologies and processing activities that are not yet subject to specific guidelines or interpretative praxis. For example, big data processing has proven challenging to regulate through the traditional means provided by EU data protection law and can be more effectively addressed through accountability. I will explore this further in the following section.

IV. Big Data and the Limits of Accountability

One of the reasons for introducing accountability and a risk-based approach to the GDPR was the challenge posed by modern online data processing to data protection. An example of this is the so-called big data paradigm that refers to a narrative promoting the processing of personal data for the sake of technological progress.73 Big data processing involves large datasets that are processed and analysed automatically in order to predict human behaviour, and to reveal patterns and trends.74 One of the best-known contemporary examples of such processing is the targeted advertising conducted by so-called platforms, ‘big tech’ or GAFAM companies that include Google, Amazon, Facebook, Apple, and Microsoft. Such companies employ big data analytics to collect revenue from advertising to finance the (free) services they provide to consumers.75 In this context, the discourse on regulating big data through accountability illustrates the many potential uses, and some of the challenges, of the principle. It is also worth noting that while the compatibility of big data with many data protection principles has been discussed in detail, to date the relationship between accountability and big data has received less attention.76

Big data processing, and data analytics in general, is an illustrative example of the collision between modern technologies and the law.77 The processing raises concerns, for example, when it comes to overly generalised predictions, discrimination, and, clearly, data protection.78 The premises of big data processing and European data protection law are antithetical. Big data processing is often perceived to be anonymous, meaning that individuals are not identifiable from the data. However, research shows that the combination of multiple datasets easily allows identification at least by indirect means.79 Furthermore, the principles of big data processing are difficult to align with the fundamental principles of the GDPR, such as the obligation to predetermine the means and purposes of data processing (Article 5(1)(b)).80

Big data differs from traditional forms of data processing in that the data and its processing usually constitute a valuable business in themselves, rather than a side effect of other activities. When personal data is processed to achieve a certain purpose – such as customer data being processed to sell goods and services, employee data to manage the employment relationship, or contact information for marketing – the primary purpose of processing (in our examples: sales, employment, and marketing, respectively) serves as a natural benchmark against which to assess the processing. Is the processing necessary for and proportionate to this purpose? Do we need all this data for this purpose?

In the big data context, such a benchmark rarely exists. Big data processing is characterised by ‘trial and error,’ meaning that data is processed in anticipation of finding something interesting, not to fulfil a predetermined goal.81 The data produced by online ecosystems is generally heterogeneous, unstructured, and collected with the assumption of it becoming useful in the future.82 At the time of processing, the ultimate purpose of processing remains unknown and, consequently, it is impossible to predetermine whether all the data and processing operations are necessary.83 This difference between big data and traditional data processing is partly a result of changes in cost structure. The falling costs of computing power and data storage make data collection and processing less expensive.84 Unlike before, it is feasible, and even profitable, to process data first and only then consider whether the results are interesting or meaningful.

Accountability is sometimes presented as a solution to these incompatibilities between big data and purpose-based European data protection law. Accountability and the risk-based approach were partly promoted as an alternative to traditional data protection models.85 Taken far enough, these principles could allow big data analytics as long as the risk level is sufficiently low.86 Accountability could, in particular, have challenged the requirement for predetermined purposes, allowing for more room to process low-risk data and only identify purposes for the discovered correlations.87 This risk-based approach would then rely on the controller’s accountability and responsibility to protect data during initial processing. However, as concluded in Chapter III.2, accountability did not take this extreme form in the GDPR. The risk-based approach was watered down during GDPR negotiations due to concerns that it potentially compromises the fundamental rights of data subjects and lowers the overall level of data protection in the EU.88 Consequently, the limitations of accountability are particularly evident in this area, which the principle was supposed to facilitate.89

It can be said that data protection laws are, by nature, incompatible with big data discourse, given that enforced privacy rights directly restrict data processing.90 While it may not be impossible to conduct profitable data business in a privacy-friendly manner, the different restrictions that apply to EU and non-EU actors lead to an imbalance in global competition.91 This inequality is detrimental to the efficiency of the accountability principle. As discussed in Chapter III.2, accountability depends on either an inherent motivation of data controllers to respect fundamental rights or strict enforcement that externally motivates controllers to comply with data protection rules. Therefore, in a big data context, accountability may not be an optimal solution when it is directly juxtaposed with a controller’s business interests.

On the other hand, accountability is criticized for failing to contribute to a higher level of data protection, as it merely entrenches the requirements-based approach of the GDPR. Hence, accountability as an obligation to fulfil the GDPR’s specific requirements may not address the issues inherent in big data processing. The formalistic requirements to process individual data points for various predetermined purposes fail to address the inferences drawn from data, which can be more intrusive than the data collection itself.92 That is, the Regulation does not consider the consequences of data processing as long as the data and the purposes for which it is used are valid. In this vein, Wachter and Mittelstadt argue that EU data protection law concentrates on the data as such and does not sufficiently regulate the use of data and the conclusions drawn from it.93 If we accept a high level of data protection as the ultimate goal, it seems fair to ask whether a more risk-based approach would yield better results. Accountability, implemented in a manner that puts less emphasis on mandatory pre-processing operations and more emphasis on the overall responsibility of the controller, could also extend to the consequences of the processing on the individual.

Despite introducing accountability and the risk-based approach, the GDPR continues to build heavily on the idea of predetermining the limits of data processing and minimizing the use of data. In essence, accountability means compliance with the obligations of the Regulation, which build on the assumption that the controller plans its actions in advance in order to process as little data as possible. The GDPR imposes obligations on the controller – among others, those of purpose limitation, data minimization, limited retention, and transparency towards data subjects (GDPR Article 5). Both these general principles of processing and the other obligations of the Regulation support the conclusion that accountability for the mere risks of processing is not, alone, sufficient to comply with EU data protection rules. Accountability is evaluated in light of the other obligations, and how well the controller fulfils them. The risk-based approach, too, as implemented in the GDPR, calls for a preliminary risk assessment and the implementation of appropriate mitigation measures before processing takes place.94 Therefore, the principles and obligations of the GDPR become the criteria against which accountability and risk are measured.

While the GDPR and big data clearly start from very different premises, the important question is which approach should be followed. The fact that the GDPR does not easily allow big data processing is not necessarily a negative outcome, especially in the light of the Regulation’s primary purpose of protecting fundamental rights. The requirement to predetermine the scope of processing is intended to ensure that all controllers take the necessary precautions to handle personal data.

The fact that accountability enforces long-standing, key principles of European data protection law, namely purpose limitation, data minimization, and individual control, might restrict technological development in the area of big data. However, even if these principles collide with big data processing, it is not obvious that they should give way to technical development. A fully risk-based approach to data protection, relying on the accountability of controllers, likely could not ensure sufficiently high protection of personal data, at least not without clearly defined criteria for demonstrating appropriate accountability. For example, Hahn argues that purpose specification is necessary to ensure that controllers know why they process the data in the first place, and that eliminating the principle could legitimise excessive and discriminatory data processing.95 In other words, the data protection principles conflicting with big data are there for a reason. The general principles form the criteria against which accountability and the risks related to data processing can be measured.

However, regardless of the structural incompatibilities between the GDPR and big data, the big data paradigm constitutes an integral part of the environment in which the GDPR operates. The paradigm is connected to competition policy and economic growth, following the argument that data processing is an indispensable feature of future society.96 The premises of data protection law and big data processing are different, and despite some initial hopes, the introduction of the accountability principle does not change this setting. Accountability mainly augments the core principles of EU data protection law instead of replacing them with a more risk-based approach. This appears to be a conscious political decision to set the fundamental rights to data protection and privacy against the kind of technological development that leads towards ever more extensive data processing.

V. Conclusions

To conclude, accountability embodies many of the goals and challenges of EU data protection law. At the time the GDPR was drafted, this principle represented an alternative to the paradigm of data subject control, the introduction of a risk-based approach, and improved scalability of obligations. In this paper, I have examined how the accountability principle contributes to EU data protection law. I conclude that accountability does not represent a significant paradigm shift. While the principle brings with it new perspectives, such as the risk-based approach and enhanced controller responsibility, it does not alter the core principles of the law. The GDPR continues to rely heavily on the idea of data subject control over personal data and builds on the requirement to predetermine the purposes and means of processing. The traditional qualities of accountability are particularly evident in the areas where the principle was supposed to bring the most change, namely big data analytics and emerging technologies.

However, I argue that accountability has the potential to become an important tool for GDPR enforcement. Accountability and the risk-based approach can meaningfully contribute to the assessment of compliance with other GDPR obligations, such as data security or transparency, by representing a benchmark against which to assess whether these obligations are fulfilled.97 Furthermore, accountability creates an obligation to act proactively, especially towards the high-level principles of the Regulation that can be difficult to translate precisely into specific action items. The obligation to document compliance, which is a key feature of accountability, supports auditing and supervisory action by ensuring the availability of the material required for such purposes. Lastly, a lack of accountability may constitute an aggravating factor in assessing GDPR breaches and determining the amount of sanctions.

Beyond its direct impact, accountability has clear rhetorical significance. Accountability as an independent obligation forces every controller to make efforts to comply with the GDPR. Potentially, this could be extended as far as ideas of data stewardship or a fiduciary duty to act in the interests of data subjects. Accountability also underlines the nature of data protection as a collective right. The data subjects retain control over their own data, but the principal responsibility for protecting personal data lies with the controller who processes this data.

Accountability may not fulfil the goals its creators originally pursued, but it has an important role to play in the GDPR. The obligation carries elements of several functions: a paradigm shift from data subject control to controller responsibility, a move towards a risk-based approach to data protection, and more practical enforcement purposes. Ultimately, the versatility of the principle may prove to be its biggest strength, allowing accountability to develop into a formidable asset for EU data protection law.


[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

[2] European Commission ‘Statement by Vice-President Ansip and Commissioner Jourová ahead of the entry into application of the General Data Protection Regulation’ (Statement 24 May 2018) STATEMENT/18/3889

[3] European Commission Special Eurobarometer 431 ‘Data Protection’ (2015); and European Union Agency for Fundamental Rights Fundamental Rights Report 2020 Data protection and privacy (2020)

[4] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data

[5] European Commission (n2)

[6] According to GDPR art. 5(1) the general principles of data processing include lawfulness, fairness, and transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and confidentiality.

[7] Opinion of the European Data Protection Supervisor on the Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions ‘A comprehensive approach on personal data protection in the European Union’ (14 January 2011) para 102

[8] Christopher Kuner, Lee A Bygrave, Christopher Docksey, and Laura Drechsler, The EU General Data Protection Regulation (GDPR): A Commentary (Oxford University Press 2020) 562

[9] Kuner et al. (n 8) 560

[10] Jockum Hildén, The Politics of Datafication: The influence of lobbyists on the EU's data protection reform and its consequences for the legitimacy of the General Data Protection Regulation (University of Helsinki 2019) 165

[11] European Commission Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) COM/2012/011 final - 2012/0011 (COD), 10

[12] European Commission ‘The GDPR: new opportunities, new obligations - What every business needs to know about the EU’s General Data Protection Regulation (Publications Office of the European Union 2018) 3

[13] Article 29 Working Party Opinion 3/2010 on the principle of accountability adopted on 13 July 2010 (WP 173) para 12

[14] European Commission ‘Communication from the Commission to the European Parliament and the Council, Data protection as a pillar of citizens’ empowerment and the EU’s approach to the digital transition - two years of application of the General Data Protection Regulation’ COM(2020) 264 final; see also recital 13 of the GDPR

[15] European Commission (n 11)

[16] Lina Jasmontaite and Valerie Verdoodt, ‘Accountability in the EU Data Protection Reform: Balancing Citizens’ and Business’ Rights’ in Aspinall D., Camenisch J., Hansen M., Fischer-Hübner S., Raab C. (eds) ‘Privacy and Identity Management: Time for a Revolution?’ Springer 2016, 3

[17] European Commission (n 11) art. 22(2)

[18] Art. 30 on records of processing, art. 32 on data security, art. 35 on data protection impact assessment, art. 36 on prior consultation, and art. 37 on data protection officers.

[19] Hildén (n 10) 108

[20] Hildén (n 10) 130

[21] Lachlan Urquhart, Tom Lodge, and Andy Crabtree, ‘Demonstrably Doing Accountability in the Internet of Things’ (2019) 27 International Journal of Law and Information Technology 1, 7

[22] Claudia Quelle, ‘Enhancing Compliance under the General Data Protection Regulation: The Risky Upshot of the Accountability- and Risk-Based Approach’ (2018) 9 European Journal of Risk Regulation 502, 502–503

[23] Quelle (n 22) 507

[24] Kuner et al. (n 8) 561-562

[25] Helen Nissenbaum, ‘A Contextual Approach to Privacy Online’ (2011) 140 Daedalus 32, 34; and Attila Kiss and Gergely László Szöke, ‘Evolution or Revolution? Steps Forward to a New Generation of Data Protection Regulation’ in Gutwirth S., Leenes R., de Hert P. (eds) Reforming European Data Protection Law (Springer 2015) 318

[26] Brendan Van Alsenoy, Eleni Kosta and Jos Dumortier, ‘Privacy notices versus informational self-determination: Minding the gap’ (2014) 28 International Review of Law, Computers & Technology 185, 190; Woodrow Hartzog, ‘The Case Against Idealising Control’ (2018) EDPL 423, 426

[27] Iris van Ooijen and Helena Vrabec, ‘Does the GDPR Enhance Consumers’ Control over Personal Data?: An Analysis From a Behavioural Perspective’ (2018) 42 Journal of Consumer Policy 91, 94-95

[28] Susan B. Barnes, ‘A privacy paradox: Social networking in the United States’ (2006) 11 First Monday

[29] Alessandro Acquisti, Laura Brandimarte, and George Loewenstein ‘Privacy and Human Behavior in the Age of Information’ (2015) Science 509–514

[30] Hildén (n 10) 200

[31] Hartzog (n 26) 429

[32] Brendan Van Alsenoy, 'Liability under EU Data Protection Law: From Directive 95/46 to the General Data Protection Regulation' (2016) 7 J Intell Prop Info Tech & Elec Com L 271, para 43

[33] See, for example, European Commission ‘Data protection reform – frequently asked questions’ (Memo 4 November 2010) MEMO/10/542; and European Commission ‘European Commission sets out strategy to strengthen EU data protection rules’ (Press release 4 November 2010) IP/10/1462

[34] Quelle (n 22) 508

[35] Milda Macenaite, ‘The “Riskification” of European Data Protection Law through a Two-Fold Shift’ (2017) 8 European Journal of Risk Regulation 506, 524-525

[36] Article 29 Working Party ‘The Future of Privacy, Joint contribution to the Consultation of the European Commission on the legal framework for the fundamental right to protection of personal data adopted on 01 December 2009’ (WP 168) para 90

[37] The working party was a cooperation body of all EU national data protection authorities established under article 29 of the DPD. It was replaced by the European Data Protection Board (EDPB) after the GDPR took effect.

[38] WP 168 (n 36) para 31

[39] WP 173 (n 13) para 4–5

[40] WP 173 (n 13) para 12

[41] WP 173 (n 13) para 14

[43] Macenaite (n 35) 513–514

[44] Maria Eduarda Gonçalves, ‘The EU Data Protection Reform and the Challenges of Big Data: Remaining Uncertainties and Ways Forward’ (2017) 26 Information & Communications Technology Law 90, 101

[45] Quelle (n 22) 516

[46] Henry Pearce, ‘Big data and the reform of the European data protection framework: an overview of potential concerns associated with proposals for risk management based approaches to the concept of personal data’ (2017) 26 Information & Communications Technology Law 312, 323

[47] Robert Baldwin and Julia Black ‘Really Responsive Risk-Based Regulation’ (2010) 32 Law and Policy 181, 188

[48] Gonçalves (n 44) 101

[49] Quelle (n 22) 515–516

[50] Gonçalves (n 44) 101

[51] Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think (First Mariner Books 2015) 174

[52] Quelle (n 22) 514.

[53] Quelle (n 22) 511–512

[54] Regulation (EC) No 1223/2009 of the European Parliament and of the Council of 30 November 2009 on cosmetic products; Regulation (EC) No 178/2002 of the European Parliament and of the Council of 28 January 2002 laying down the general principles and requirements of food law, establishing the European Food Safety Authority and laying down procedures in matters of food safety

[55] Macenaite (n 35) 520

[56] Quelle (n 22) 517

[57] Pearce (n 46) 315-326

[58] Bert-Jaap Koops, ‘The trouble with European data protection law’ (2014) 4(4) International Data Privacy Law 250, 254-255

[59] GDPR recital 76: ‘The likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. Risk should be evaluated on the basis of an objective assessment, by which it is established whether data processing operations involve a risk or a high risk.’

[60] Macenaite (n 35) 534

[61] Pearce (n 46) 326-327

[62] Quelle (n 22) 524

[63] Bart Custers and Helena Uršič, ‘Big data and data reuse: a taxonomy of data reuse for balancing big data benefits and personal data protection’ (2016) 6(1) International Data Privacy Law 4, 7

[64] European Data Protection Supervisor (n 7) para 100

[65] Article 29 Working Party ‘Statement on the role of a risk-based approach in data protection legal frameworks adopted on 30 May 2014’ (WP 218) para 3

[66] ibid

[67] Orla Lynskey, ‘Grappling with “Data Power”: Normative Nudges from Data Protection and Privacy’ (2019) 20 Theoretical Inquiries in Law 189, 203

[68] For example, GDPR art. 35 (DPIA) and art. 37 (DPO). About narrow interpretation, see for example EU Court of Justice argumentation about so-called household exemption (GDPR art. 2(2)(c)) in cases C-212/13 František Ryneš v Úřad pro ochranu osobních údajů (2014) and C-101/01 Bodil Lindqvist (2003)

[69] C-131/12 Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014) para 36

[70] Examples of national cases include Belgian Data Protection Authority Decision DOS-2019-01377 (2022, IAB); and Finnish Data Protection Ombudsman Decision 1150/161/2021 (2021, Psychotherapy Centre Vastaamo)

[71] Urquhart et al. (n 21) 7

[72] Kuner et al. (n 8) 566

[73] Joseph Alhadeff, Brendan van Alsenoy and Jos Dumortier, ’The Accountability Principle in Data Protection Regulation: Origin, Development and Future Directions’ in Carla Ilten, Inga Kroener, Daniel Neyland, Hector Postigo, D Guagnin, and L Hempel (eds.) Managing Privacy through Accountability (Palgrave Macmillan 2012) 49

[74] Robert Nisbet, Gary Miner and Ken Yale, Handbook of Statistical Analysis and Data Mining Applications (Elsevier 2018) 39

[75] Viktoria H.S.E. Robertson, 'Excessive data collection: Privacy considerations and abuse of dominance in the era of big data' (2020) 57(1) Common Market Law Review 161, 163

[76] See, for example, Isabel Hahn ‘Purpose Limitation in the Time of Data Power: Is There a Way Forward?’ (2021) 7(1) European Data Protection Law Review 31; and Liana Colonna, ’Data Mining and Its Paradoxical Relationship to the Purpose Limitation Principle’ in Gutwirth, Serge, Paul De Hert, and Ronald Leenes (eds.) Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges (Springer 2014)

[77] Mayer-Schönberger et al. (n 51) 173-174

[78] Ira S Rubinstein, 'Big Data: The End of Privacy or a New Beginning?' (2013) 3 International Data Privacy Law 74, 77

[79] Pearce (n 46) 318

[80] Bart van der Sloot and Sascha van Schendel, 'Ten Questions for Future Regulation of Big Data: A Comparative and Empirical Legal Study' (2016) 7 J Intell Prop Info Tech & Elec Com L 110, 119-120

[81] Rubinstein (n 78) 78

[82] Ioanna D. Constantiou and Jannis Kallinikos, ‘New Games, New Rules: Big Data and the Changing Context of Strategy’ (2015) 30 Journal of Information Technology 44, 49–50

[83] In GDPR terms, big data processing is often categorised as profiling (art. 4(1)(4)) or automated decision-making (art. 22). Following the purpose-based approach of the GDPR, big data processing can qualify either as an independent purpose or as compatible further processing (art. 6(4)), that is processing for another purpose than the one for which the data was originally collected. The GDPR allows further processing if the new purpose is compatible with the original in terms of context, the nature of the data, the consequences to the data subject, and applied safeguards. If big data is processed as an independent purpose, in most cases it likely falls under legitimate interest legal ground (GDPR art. 6(1)(f)) requiring a balancing test between the controller’s interest to process and the data subject’s interest in privacy. At the moment it is not entirely clear whether ‘detecting correlations’ could be a legitimate interest or compatible further processing.

[84] Nisbet et al. (n 74) 26

[85] Kiss et al. (n 25) 317-318

[86] Mayer-Schönberger et al. (n 51) 173-174

[87] Digital Europe (n 42) 2

[88] Gonçalves (n 44) 104

[89] Quelle (n 22) 513

[90] Tal Zarsky, 'Incompatible: The GDPR in the Age of Big Data' (2017) 47 Seton Hall Law Review 996

[91] Michal S. Gal and Oshrit Aviv, ‘The Competitive Effects of the GDPR’ (2020) 16(3) Journal of competition law & economics 349, 384

[92] Omer Tene and Jules Polonetsky, ‘Big Data for All: Privacy and User Control in the Age of Analytics’ (2013) 11 Northwestern Journal of Technology and Intellectual Property 239, 270

[93] Sandra Wachter and Brent Mittelstadt, ‘A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI’ (2019) Columbia Business Law Review 494, 511–512

[94] See GDPR article 35 on Data Protection Impact Assessment, requiring a DPIA to be done ’prior to the processing’, and article 25 on Data Protection by Design and by Default, which requires the measures to be implemented ‘at the time of determination of the means for processing’.

[95] Hahn (n 76) 40

[96] Hildén (n 10) 3

[97] European Data Protection Supervisor (n 7) para 100
