
The Future of EU Data Protection Law for Collectives:

A Reverse Brussels Effect


Abstract

The article discusses the future of European Union (EU) data protection law following its fifth anniversary in May 2023. It argues for the expansion of the concept of personal data to cater for collectives and not just the individual. It discusses the interests of collectives in data protection that are overlooked by current laws and highlights the importance of extending data protection to collectives. In this regard, the article argues that EU data protection law should undergo a reverse Brussels effect, which allows the EU to look outward and learn from cultures that privilege communities and groups to identify a collective cultural value that can inform a theoretical framework that recognises collective rights. It proposes ubuntu as one such value that the EU can learn from and transpose into the current data protection system without making drastic changes. It proposes an Ubuntu Framework, together with accompanying principles, that can be incorporated into the GDPR.

Keywords: collectives, legal persons, groups, ubuntu, GDPR, collective identity, Brussels effect


I. Introduction

The fifth anniversary of the GDPR[1] coincides with data protection law in the European Union being considered to be at its peak and having its best moment.[2] During these five years, there have been rapid societal changes and technological advancements in artificial intelligence, large language models, big data and other data-based technologies. The five-year mark, together with these social and technological developments, provides an opportunity for reflection on the future of data protection law in the EU: 'Which fundamental concepts should be revised; what can and should data protection law aim for in ten years and which regulatory approach should be adopted?' To cater for social changes and address the challenges posed by technological developments, this article submits that the fundamental concept of personal data, which is the subject of data protection, has to be revised to cater for collectives. Collectives in this instance refers to both organised and non-organised entities and groups.

Data protection law should aim to accommodate a collective dimension to achieve a holistic realisation of the right to data protection. To achieve a truly collective dimension to data protection, it is recommended that EU data protection law look outward and undergo a 'reverse Brussels effect',[3] where the EU learns from countries that have collective cultures that privilege communities and groups. Through the reverse Brussels effect, the EU will be able to identify a collective cultural value that can inform a theoretical framework that recognises collective rights for groups, which can be transposed into the current system without a drastic change to EU law. For this article, ubuntu[4] is selected as an example of a collective cultural value that the EU can learn from to develop a collective dimension to data protection law. An Ubuntu Framework is proposed that contains principles which can be integrated into the GDPR, as an example of how collective data protection could look in the EU when influenced by ubuntu.

This article proceeds as follows. Section II traces and outlines the development of data protection for collectives from the first data protection laws to its contemporary state, and outlines the limitations of collective data protection in the current state. Section III explores the interests of collectives in data protection that have been overlooked and highlights the importance of extending protection to collectives. Section IV addresses the reverse Brussels effect and demonstrates how EU data protection law can learn from the collective culture of ubuntu to attain a truly collective dimension to data protection. Section V introduces the Ubuntu Framework, discussing the selected values of ubuntu and the principles derived from these values. Section VI provides a brief conclusion to the discussion.

II. A Limited Approach to Data Protection for Collectives[5]

Data protection law in the EU has gone through two main phases. The first phase ran from the 1970s until the adoption of the Data Protection Directive.[6] The second phase ran from the adoption of the Data Protection Directive to the entry into force of the GDPR. Data protection law is currently in its third phase. Each change of phase has diminished the protection for collectives. Throughout all phases, there have been two divergent approaches: those that extend data protection law to collectives and those that do not.

The legitimacy of extending data protection to collectives was an important part of the early data protection discussions in the first phase. During this phase, six countries recognised the legitimacy of extending data protection to groups and collectives.[7] In these countries, two approaches emerged: (i) similar protection for individuals and collectives[8] and (ii) limited sectoral protection for collectives.[9] However, in both these approaches, some countries made a distinction between collectives, groups and legal persons. An example is Austria, where protection was initially extended to legal persons and subsequently extended to collectives and groups. This divide seems to have reflected a desire to strike a balance between the approach of extending protection and that of denying protection to collectives.

In Europe, the Council of Europe recognised the legitimacy of extending data protection law to collectives in Convention 108.[10] The Convention allows contracting parties to extend the protection offered under the Convention to 'groups of persons, associations, foundations, companies, corporations and any other bodies consisting directly or indirectly of individuals, whether or not such bodies possess legal personality'.[11] In Convention 108+, the successor of Convention 108, rights in the Convention are granted to natural persons, though parties retain the right to extend protection to legal persons to protect their legitimate interests.[12] In the EU, the Data Protection Directive applied to an identified or identifiable natural person.[13] However, it acknowledged the legitimacy of extending protection to collectives by clarifying that it did not affect existing legislation in member states that accorded protection to legal persons.[14] It also provided limited indirect protection for legal persons by recognising that data about legal persons could be considered as relating to natural persons when a legal person's name was similar to that of a natural person.[15]

Austria, Denmark and Luxembourg therefore retained the laws extending protection to collectives and were joined by Italy in its implementation of the Directive. However, Luxembourg in 2007 and Italy in 2013 removed the protection for groups and collectives. The limitation of data protection to natural persons was carried into the GDPR.[16] However, the GDPR departs from the Directive in that it expressly states that it does not apply to legal persons.[17] The e-Privacy Directive is currently the only EU legislation that extends data protection to collectives, particularly legal persons. The Directive lays down clear rules concerning unsolicited communications, which apply to legal persons.[18] However, the protection in this instance is confined to the limits laid down by those provisions.

The justification for protecting collectives was three-fold. First, there was concern with how information on collectives could be traced to an individual. As such, complete protection of the individual was secured by protecting the data of collectives. The second related to the threats that the collection and processing of collectives' data posed to their interests.[19] Collective entities were seen as being similarly affected by the increased processing of information that natural persons were experiencing.[20] It was therefore desirable to redress the balance of power between collectives and data controllers by regulating the processing of their data.[21] The third concerned how collectives, like natural persons, had information that was worth protecting. The exclusion of such data from the law would have denied the information the protection it deserved.[22]

On the other hand, Sweden, France and Germany passed legislation that did not extend protection to collectives.[23] They would be joined by eleven other countries.[24] The justification for not extending protection to collectives was two-fold. First was the claim that the interests data protection laws seek to protect are only applicable to individuals. Collectives were therefore seen as ineligible for such protection, as rights such as privacy and personal integrity could only be exercised by individuals who held a legitimate claim.[25] Extending protection to collectives in the same law would therefore blur the boundary of when the law would apply, regulating data about groups while enabling the protection of individuals.[26] It was therefore desirable to protect the data of collectives through a different regime that applied solely to collectives. The second related to the market position of legal persons, who viewed the extension of data protection to legal persons as enabling competitors to access information that the businesses stored.[27]

III. Overlooked Data Protection Interests of Collectives

The concerns identified in the 1970s as justifying the extension of data protection to collectives, outlined in Section II above, are even more relevant now in a data-driven world where almost all aspects of everyday life are digitally recorded and stored. One would therefore expect legislation addressing the concerns raised in the 1970s; however, to date, these concerns have not been addressed. The absence of such legislation is best explained by the evolution of data protection into a fundamental right separate from the right to privacy.[28] This makes it a subjective right invoked by citizens when their interests are threatened.[29] As the GDPR gives effect to the right to data protection, the primary focus of the law is the individual and not the collective.

This approach overlooks the fact that data-driven innovation has enabled platforms and organisations to collect more personal data from individuals than ever before. Platforms and organisations are now more interested in population-level insights that can be derived from the personal data they collect. The individual data subject is important in so far as they enable connections to be made to other individuals who share similar attributes. There is an inverse flow where an individual's attributes are used to draw inferences about the group or collective they belong to, and vice versa. Data processing therefore takes on a collective and group dimension as opposed to being focused on the individual. As a result, risks of harm are no longer experienced only at an individual level but also at a collective and group level.

This challenges the notion that the protection of personal data that identifies people individually will naturally protect collectives. EU data protection law does not address harms or risks that arise from this relationality, and it does not provide a mechanism for collectives to exercise the data protection rights that individuals have concerning their data. Data protection statutes have protected collective issues as a sum of individual issues, which are addressed using remedies premised on individual rights.[30] The individual data subject from whom data is collected is the only one who bears responsibility for the protection of the data of collectives. This undermines collective claims to data protection, yet these claims should not be downplayed, as there is a possibility that collectives will be unable to control and perceive their own spaces and practices due to data extraction.[31]

Furthermore, practices such as profiling are being used to create aggregate data sets that are valuable to different decision-makers. The inferences derived from these aggregate sets are viewed as more profitable than individual profiles by companies monetising such profiles, given the richness of the inferences that can be drawn.[32] Once purchased, the profiles can be used for a variety of purposes, including targeting and controlling collectives. Some of these purposes might have negative consequences for collectives, as the profiles developed can become more realistic than the identity of the collective itself. Collectives have goals, interests and operational modes that are not reducible to the individual members constituting them.[33] In most instances, they have diverging purposes and they play different economic, political, legal and social roles.[34] The roles they play and the goals they pursue contribute to the formulation of a 'collective identity'. If the developed profile is more realistic than the identity of the collective, there is a possibility of the collective suffering a significant impact on its interests. Thus, collectives have an interest in the quality of the information that is processed and the information systems used to manage this information.

Apart from profiling, artificial intelligence has enabled the automated and systematic application of predictive analytics. Such analytics allow predictions to be made about collectives and the individuals making up these collectives, based on predictive data models. Predictive analytics rely on 'collective data sets', including anonymised sets, as predictive capacities can be extracted from them.[35] The predictions made can be used to target individuals or different collectives in society on a large scale. This can then result in the exploitation or unequal treatment of different individuals or groups in society. For example, a person may be offered different insurance rates by virtue of having been predicted to be a higher risk, based on inferences about their financial status drawn from readily available data about a group they have been predicted to belong to. The result is imbalances of power, arising from the potential misuse of data and predictive models, between different collectives in society and the organisations processing their data. Collectives have an interest in balancing and addressing these power asymmetries.

Related to this is how data protection law fails to acknowledge the relationality of data so as to ensure that societal harms are prevented while facilitating collective interests.[36] Relationality in this instance refers to how data about a single individual, when processed, can reveal data about other individuals they are connected with. Relationality is what gives data value, as it allows the aggregation of data collected from different individuals who share the same characteristics, without consideration of the individual. The process of aggregation removes the identified or identifiable data subject, leading to a collective data subject.[37] This data is then used to target and influence decisions that affect individuals and the collectives they are part of, often with grave consequences, without the individuals or collectives being aware of the causes of the phenomena they experience. An example of the effects of aggregation is predictive policing, which results in increased law enforcement in areas where data about crimes and violence deems it necessary.[38]

This has the potential to impact how individuals interact with the collectives they belong to. Collectives have an interest in ensuring that there is a suitable environment in which their members can come together 'to exchange information, share feelings, make plans and act in concert' to attain their own objectives and those of the collective.[39] Individual members have an interest in the collective as it fosters an environment that allows them to share, make plans and act in concert. Individual members in this instance would rely on those they associate with in the collective to ensure that what has been shared and planned stays within the environment of the collective.[40] These interests in the protection of their data allow collectives to undertake their different roles to the benefit of society.[41] However, these interests are not catered for by data protection law.

To overcome the inadequacies of the individual dimension, recent works have sought to expand the scope of data protection by examining its collective dimension.[42] Bieker proposes a reformulation of the term 'data subject' to relate to one or more individuals.[43] This would extend data protection rights to collectives, rights which could be enforced by an individual or a collective in line with the case law of the ECtHR on measures of secret surveillance. van der Sloot proposes a two-tiered system for data protection. In this system, with respect to negative obligations for the state and data controllers, collectives would be entitled to similar protection, with exceptions concerning the protection of sensitive data. In respect of positive rights for persons and data subjects, collectives would be entitled to limited rights of access to data and rectification of data. Collectives would also be offered a lower level of protection in comparison to natural persons.

Tzanou proposes the adoption of an egalitarian project: a new normative orientation for data protection guided by methodologies that bring neglected perspectives and narratives to the fore. This would ensure the inclusivity and diversity of data protection law in the EU. The focus of her proposal is to cater for the perspectives of vulnerable and disadvantaged groups that are subjected to monitoring and additional forms of surveillance, which have been missed by data protection law in the EU. She therefore advocates for a paradigm shift in data protection that would be human-centric and focused on society.[44] However, these formulations have one major limitation. The proposals are premised on Western conceptualisations of fundamental rights as a remedy against intrusions on the individual. They therefore retain their individualistic dimension and fail to effectively address the collective dimension of a right to data protection.[45] Addressing the issues identified above, as well as the interests of collectives, requires dealing with large, systemic and deeply rooted data collection and processing practices from a collective perspective. Adequately protecting collectives with their diverging interests requires an alternative system that recognises collective rights. However, such a system is less developed in the EU and would require the traditional system of rights to be changed together with EU primary law.[46]

IV. Towards a Paradigm Shift: Reverse Brussels Effect

The process of changing the traditional system of rights within the EU is a utopian endeavour, and while it is one well worth pursuing,[47] I wish to introduce a novel argument that does not involve changing the rights system. I call for a shift of the current focus of EU data protection law from the Brussels effect to a reverse Brussels effect. The reverse Brussels effect entails a two-fold task: (i) identifying collective cultures that have rights systems that privilege collectives over individuals and (ii) identifying elements from these cultures that can be integrated into current and future EU data protection law. Several countries have collective cultural norms that inform collective rights from which the EU can learn. This article aims to propose an alternative approach to the integration of collective rights in data protection law in the EU without changing the traditional system of rights. This is done by demonstrating that there are elements within collective cultural norms such as ubuntu that the EU can learn from to develop collective data protection.

The two-fold task of the reverse Brussels effect is an important first step towards the realisation of an alternative solution for collective data protection. It allows the development of a contrast between relational and collective societies, represented by concepts such as ubuntu, and the individualistic societies of the EU. Ubuntu in this instance is taken at the level of ideology and principle, where it is comparable to individualism, rather than at the level of regulation at which data protection law exists. This is because ubuntu reflects a diversity of practices, values and cultures found in Africa. However, it is important to acknowledge that it is a unique challenge to carry out such socio-legal transplantation in the EU and in the GDPR, where the individual dimension prevails in today's culture. Nonetheless, given the need for the realisation of a collective right to data protection, those in the EU will have to take up the challenge. The cultural norm proposed in this instance is ubuntu. Ubuntu is ideal as a starting point for this process because it is 'a collection of values and practices' that is nuanced 'across different ethnic groups' but has a common theme of seeing human beings as part of a 'relational, communal, societal, environmental and spiritual world'.[48]

Ubuntu is a socio-cultural concept that explains individual behaviour in groups. It emphasises the notion that individuals are members of groups and that their actions affect not only themselves but also others in their groups. It is best expressed as an aphorism in various native languages of Southern Africa, the most common being the Zulu umuntu ngumuntu ngabantu, which translates to 'a person is a person through other people'.[49] It has also been expressed as 'I am because we are; and since we are, therefore I am'.[50] Ubuntu provides an understanding of individual behaviour in groups in African society because the interests of individual members are directed towards activities and behaviours that ensure the good of the group.[51]

Two important questions then arise: whether the transplantation of ubuntu from African culture is consistent with European socio-legal culture, and whether it is feasible to integrate it into the GDPR.[52] To answer these two questions, I propose an Ubuntu Framework that contains data protection principles derived from the values of ubuntu. Ubuntu is made up of several values such as communitarianism, interdependence, dignity, solidarity and responsibility. This article focuses on the values of communitarianism, interdependence, responsibility, relationality and dignity. A framework containing principles premised on the values of ubuntu would provide a different kind of starting point for a collective right to data protection in the GDPR. The principles would have a similar status to the principles found in the GDPR. This will allow ubuntu to be infused into the GDPR, thereby contributing to the development of a collective dimension to data protection. The validation of the consistency of ubuntu with European socio-legal culture and the feasibility of its integration into the GDPR is to be pursued as future work.

V. Integration of Ubuntu into the GDPR: An Ubuntu Framework[53]

The Ubuntu Framework is based on four principles to ensure a holistic collective right to data protection that caters for collective interests and mitigates potential risks. For each of the principles, a description of the value underpinning the principle is provided, as well as an account of how the principle might impact a collective right to data protection. The framework itself is underpinned by communitarianism. At its core, ubuntu is concerned with communitarianism, which entails communalism,[54] a mindset in which group welfare and interests are greater than those of the individual. Ubuntu emphasises the duties of the individual in the community and encourages respect for each person as a social unit.[55] The right of one member of the community is the duty of another member of the community;[56] as such, community rights are embedded in the culture of the people.[57]

This communal character presents a challenge to the consistency of ubuntu with the individualistic lifestyle of the EU with regard to the nature of the relationship between the individual and the collective.[58] However, it is important to note that the communal character of ubuntu does not imply that individual rights are subordinated. It instead implies that individuals pursue their own success and goals by pursuing the community good.[59] Two possible approaches emerge from this. The first is where data protection focuses on the welfare of collectives and not individuals, with the individual being an organic component of the group. The second is where data protection law recognises that individual data protection and collective data protection would be incomplete without mutual recognition and complementarity.

The Ubuntu Framework proposed here is premised on the second approach, as it strikes a balance between individual rights and collective rights. It will allow individuals and organisations that require data for their goals and success to pursue those goals by also pursuing the community good of protecting the data of groups and collectives, thereby minimising risk and harm to them. The values and principles that make up the Ubuntu Framework are as follows.

1. Interdependence

In ubuntu, interdependence is seen from the perspective of individuals forming links in a chain. Interdependence requires a person's existence to acquire an interpersonal nature dependent on other people.[60] Human beings are social beings who must appreciate their social relatedness to other people, such that violation of the being of the other becomes intolerable.[61] The value of interdependence in ubuntu recognises the individual as equal in moral terms to the community, while placing greater emphasis on communal interests and rights.[62] Interdependence allows individuals to discover who they are through other people, but also places emphasis on the distinctive identity of the individual.

Principle: 'Data shall be collected and processed in a manner that protects the rights and interests of groups, collectives, associations and individuals who are part of these collectives or groups, where the data being processed relates to the group, collective or association (interconnectedness).'

This principle is important for collective data protection because it establishes the collective dimension of data protection. The exclusion of 'companies' from the principle is deliberate, to limit the scope of the collective dimension of data protection to groups, collectives and associations. Legal entities such as companies have different dimensions that should be addressed separately.

2. Relationality

Ubuntu is at its core relational due to the special moral importance of relationships. In ubuntu, individuality exists; however, it exists in a relational way, as individuals are part of a network of relations and exist foremost as participants in their family, group and community.[63] This aspect of relationality is aptly captured by Mbiti in the following terms: 'Only in terms of other people does the individual become conscious of his own being, his own duties, his privileges and responsibilities towards himself and towards other people'.[64] This underscores the point that, at their core, humans are deeply social and relational.[65] In extending data protection to collectives, relationality creates a different starting point for data protection law by acknowledging that human beings belong to, and are part of, a community.

Principle: ‘Personal data shall be collected and processed in a manner that protects the rights and interests of individuals and collectives whose data is inherently connected or related to the data subject whose personal data is being processed (relationality).’

This principle is important for collective data protection because it establishes that data collected from an individual is inherently connected to data about other people or a particular collective. This addresses relationality, which is now a key feature of data collection and processing.

3. Responsibility

In the communitarian way of life, responsibility refers to the obligations imposed upon an individual to care for another person or other people. The obligations imposed include helping others in distress, avoiding causing harm to other people and showing concern for the welfare of others in society.[66] Responsibility entails rights and duties that are inalienable to the whole community. Responsibility in a communitarian society is not based on the notion of a social contract but rather on a communitarian ethic and its associated imperatives.[67] Imperatives can be either positive or negative, and negative imperatives that impose obligations have corresponding rights.[68] Responsibility in the ubuntu sense demands that those who are in positions of power avoid actions and activities that will cause harm, given that an individual's 'social status goes hand in hand with one's responsibility or sense of duty towards, or in relation to, others'.[69]

Principle: 'Data controllers shall be responsible for ensuring that data collected from individuals is processed in a manner that produces minimal risk and harm for groups and collectives (responsibility).'

The principle of responsibility will impose additional obligations on data controllers to consider the impact of their data processing practices, such as aggregation, on groups and collectives, including the ones they create themselves. This has the potential to address power imbalances for collectives and groups, as controllers would be obliged to exercise restraint in data practices that have consequences for groups and collectives.

4. Dignity

Under ubuntu, dignity is seen as something that is inherent in all people, making individuals divine and thereby deserving of respect. Actions or activities that undermine, threaten, destroy or hurt human beings or the community are frowned upon, as they affect the very foundation of society.[70] As individuals deserve respect and are valued, they are not mere objects, tools or a means to an end. Individuals and the community they belong to must never be treated as a means to an end. In this respect, ubuntu is no different from values found in the EU or other parts of the world concerning dignity.[71] However, what distinguishes ubuntu from Western understandings of dignity is the notion that the dignity of persons is 'best realised in relationships with others'.[72] This is because an individual does not exist alone in the community but exists with others, and anything that harms the individual harms the community, and vice versa.

Principle: 'Data shall be processed in a manner that respects the dignity of the individuals, groups and collectives from whom it was collected, and shall not be processed for commodification or trading, whether or not the individual, group or collective is identifiable (dignity).'

The dignity principle would act as a safeguard for group and collective identity when the data of groups is processed. This would offer better safeguards than the present approach, where the data of groups and collectives is commodified and traded, leading to grave consequences for the individuals who belong to or are ascribed to these groups.

VI. Conclusion

Data protection law in the EU has gone through significant changes from the 1970s to its current state. However, these changes have come at the expense of the interests of collectives when it comes to their data. Data protection law has therefore overlooked collectives in its development and currently lacks a collective dimension. This article argued that, in answering the questions 'Which fundamental concepts should be revised; what can and should data protection law aim for in ten years and which regulatory approach should be adopted?', EU data protection law must revise the fundamental concept of personal data to cater for collectives. It recommended that EU data protection law look outward and undergo a 'reverse Brussels effect', where the EU learns from countries that have collective cultures that privilege communities and groups. Values from these countries, such as ubuntu, could ultimately help the EU in developing a collective dimension to data protection. One possible approach would be through the use of the proposed Ubuntu Framework, whose principles can be integrated into EU data protection law. The consistency of such an approach with EU socio-legal culture and the feasibility of integrating such a framework into the GDPR remain the subject of further study.

Notes

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (GDPR) OJ L 119/1. The GDPR has been in force since 25 May 2018.

[2] Maria Tzanou, ‘The future of EU Data Privacy Law: Towards a more egalitarian Data Privacy’ (2020) 7 (2) JICL 449.

[3] The Brussels effect refers to the 'EU's unilateral ability to regulate global markets by setting the standards' as well as 'the adoption of EU-style regulations by foreign governments'. Anu Bradford, The Brussels Effect: How the European Union Rules the World (Oxford University Press 2019). The reverse Brussels effect in this instance would be where the EU looks outward to learn from other jurisdictions and adopts regulations influenced by foreign legislation and values.

[4] A cultural term commonly used in Southern Africa that defines how an individual exists in a community.

[5] For this part, groups/collectives refer to both organised and non-organised groups and collective entities. Organised groups and collectives include both legal/juristic persons including companies and similar associations and non-legal persons. In certain instances, specific reference will be made to the exact type of group/collective entity that was the subject of protection.

[6] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281/31.

[7] Austria, Federal Law of October 18, 1978, on the Protection of Personal Data (Data Protection Act); Denmark, Private Registries Act; Luxembourg, Nominal Data (Automatic Processing) Act of 31 March 1979; Norway, Act relating to Personal Data Registers of 9 June 1978; Iceland, Act No. 63/1981 Respecting Systematic Recording of Personal Data (however, Iceland would clarify in 1989 that the law applied to natural persons only); and Switzerland, Loi fédérale sur la protection des données 1992 (Federal Data Protection Law).

[8] Austria, Norway and Luxembourg.

[9] Denmark protected in respect of credit rating agencies as well as warnings.

[10] Council of Europe Convention 108: Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data 1981, ETS 108 (Convention 108).

[11] Art 3 (2) (b) Convention 108.

[12] Council of Europe, Explanatory Report to the Protocol amending the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

[13] Art 3 Data Protection Directive.

[14] Recital 24 Data Protection Directive.

[15] Joined Cases C-92/09 and C-93/09 Volker und Markus Schecke GbR and Hartmut Eifert v Land Hessen [2010] ECR I-11063. See also Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data (01248/07/EN WP 136) 12 (A29WP Opinion 4/2007).

[16] Art 2 GDPR.

[17] Recital 14 GDPR. However, countries such as Austria and Denmark have retained some semblance of protection of the data of collectives. See for example, Complainant: unknown GmbH (limited company), pharma trade v Bundesamt für Sicherheit im Gesundheitswesen DSB - 2020-0.191.240 (DSB - 2020-0.191.240).

[18] Arts 12 and 13 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on Privacy and Electronic Communications) OJ L 201/37.

[19] Partial opinion on Private Registers, Opinion No. 687; Lee A Bygrave, Data Protection Law: Approaching Its Rationale, Logic and Limits (Kluwer Law International 2002) 186.

[20] Douwe Korff, ‘Study on the protection of the rights and interests of legal persons with regard to the processing of personal data relating to such persons’ (Commission of the European Communities) 23.

[21] Ibid.

[22] E Hogrebe, ‘Legal Persons in European Data Protection Legislation: Past Experiences, Present Trends and Future Issues’, Report for the OECD Working Party on Information, Computer and Communications Policy (DSTI/ICCP/81.25) 6.

[23] Sweden, Data Act; France, Law relating to data processing, files and freedoms and Germany, Bundesdatenschutzgesetz (BDSG).

[24] United Kingdom, Finland, Ireland, Netherlands, Slovenia, Portugal, Belgium, Czechia, Hungary, Slovakia and Spain.

[25] Bygrave (n 19) ch 12 where he offers an extensive account of the privacy-based argument against extending data protection to groups and collectives.

[26] I N Walden and R N Savage, ‘Data Protection and Privacy Laws: Should Organisations Be Protected?’ (1988) 37 (2) The International and Comparative Law Quarterly 337, 343.

[27] Bygrave (n 19) 197.

[28] Charter of Fundamental Rights of the European Union OJ C 326/02. See also Gloria González Fuster, The Emergence of Personal Data Protection as a Fundamental Right in Europe (Springer 2014).

[29] Inge Graef and Bart van der Sloot, ‘Crossroads of Data Protection and Competition Law’ [2022] EBLR 519.

[30] Urbano Reviglio and Rogers Alunge, ‘“I Am Datafied Because We Are Datafied”: An Ubuntu Perspective on (Relational) Privacy’ (2020) 33 Philosophy & Technology 595, 598.

[31] Parminder Jeet Singh and Jai Vipra, ‘Economic Rights Over Data: A Framework for Community Data Ownership’ (2019) 62 Development 53.

[32] Lanah Kammourieh et al, ‘Group Privacy in the Age of Big Data’ in Linnet Taylor, Luciano Floridi and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technologies 48.

[33] Bygrave (n 19) 251; Nathalie A Smuha, ‘Beyond the Individual: Governing AI’s Societal Harm’ (2021) 10(3) Internet Policy Review 1, 5.

[34] Bygrave (n 19) 176.

[35] Rainer Mühlhoff, ‘Predictive Privacy: Collective Data Protection in the Context of Artificial Intelligence and Big Data’ (2023) 10 Big Data & Society.

[36] Salomé Viljoen, ‘A Relational Theory of Data Governance’ (2021) 131 The Yale Law Journal 573, 613.

[37] Aisha P L Kadiri, ‘Data and Afrofuturism: an emancipated subject?’ (2021) 10(4) Internet Policy Review 14.

[38] Andrew Guthrie Ferguson, The Rise of Big Data Policing (New York University Press 2017).

[39] See Edward J. Bloustein, Individual & Group Privacy (2nd edn, Routledge 2003) comments on group privacy which apply with equal force to data protection.

[40] Ibid.

[41] See Alan F Westin, Privacy and Freedom (Atheneum 1967) comment on the importance of organisational privacy. The same applies to data protection.

[42] Alessandro Mantelero, ‘From Group Privacy to Collective Privacy: Towards a New Dimension of Privacy and Data Protection in the Big Data Era’ in Linnet Taylor, Luciano Floridi and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technologies 142; Bart van der Sloot, ‘Do privacy and data protection rules apply to legal persons and should they? A proposal for a two-tiered system’ (2016) 31 Computer Law and Security Review 26.

[43] Felix Bieker, The Right to Data Protection: Individual and Structural Dimensions of Data Protection in EU Law (TMC Asser Press 2022) 243.

[44] Tzanou (n 2) 449.

[45] Li Yang and Min Yan, ‘The Conceptual Barrier to Comparative Study and International Harmonisation of Data Protection Law’ (2021) 51(3) Hong Kong Law Journal 917, 925.

[46] Bieker (n 43) 188.

[47] Ibid.

[48] J R Mugumbate and A Chereni, ‘Now, the theory of Ubuntu has its space in a social world’ (2020) 10 (1) African Journal of Social Work v-xv.

[49] J Y Mokgoro, ‘Ubuntu and the Law in South Africa’ (1998) 1(1) Potchefstroom Electronic Law Journal 1, 2 <http://dx.doi.org/10.4314/pelj.v1i1.43567> accessed 30 August 2023 and Chemhuru, Makuvaza and Mutasa, ‘Making Human Rights education discourse relevant’ (2016) 9(2) Africology: The Journal of Pan African Studies.

[50] J Mbiti, African religions and philosophy (Anchor Books 1970).

[51] H N Olinger, J J Britz and S N Oliver, ‘Western privacy and/or Ubuntu? Some critical comments on the influences in the forthcoming data privacy bill in South Africa’ (2007) 39(1) The International Information and Library Review 34.

[52] I am grateful to reviewer 2 for providing these two questions.

[53] The following section draws upon a thesis I submitted in 2019 in fulfilment of my LLM, Blessing Mutiro, ‘Communications data retention: Making safeguards for the protection of the right to privacy relevant to groups: lessons from ubuntu’ (LLM thesis, University of Birmingham 2019).

[54] Olinger, Britz and Oliver (n 51).

[55] Chemhuru, Makuvaza and Mutasa (n 49) 111.

[56] Josiah A M Cobbah, ‘African values and the Human Rights Debate: An African Perspective’ (1987) 9(3) Human Rights Quarterly 320, 321.

[57] Makau Mutua, ‘Human Rights in Africa: The Limited Promise of Liberalism’ (2008) 51(1) African Studies Review 34 <https://doi.org/10.1353/arw.0.0031> accessed 30 August 2023.

[58] See Kwame Gyekye, Tradition and Modernity: Philosophical Reflections on the African Experience (Oxford University Press 1997) for a discussion on the debates around the relationship of the individual and the community.

[59] C Ewuoso and S Hall, ‘Core Aspects of Ubuntu: A Systematic Review’ (2019) 12(2) South African Journal of Bioethics and Law 93.

[60] Olinger, Britz and Oliver (n 51) 34.

[61] Chemhuru, Makuvaza and Mutasa (n 49) 111.

[62] Chuma Himonga, ‘The Right to Health in an African Cultural Context: The Role of Ubuntu in the Realization of the Right to Health with Special Reference to South Africa’ (2013) 57(2) Journal of African Law 165, 178.

[63] J Ogude, Ubuntu and personhood (Africa World Press 2018), Mark Coeckelbergh, ‘The Ubuntu Robot: Towards a Relational Conceptual Framework for Intercultural Robotics’ (2022) 28 Science and Engineering Ethics 16.

[64] Mbiti (n 50) 141.

[65] Coeckelbergh (n 63) 15.

[66] Gyekye (n 58) 67.

[67] Ibid, 44, 69.

[68] Himonga (n 62) 179.

[69] Mluleki Mnyaka and Mokgethi Motlhabi, ‘The African Concept of Ubuntu/Botho and Its Socio-Moral Significance’ (2005) 3 Black Theology 215, 224.

[70] Ibid.

[71] J Teffo, ‘Botho/ubuntu as a Way Forward for Contemporary South Africa’ (1998) 38 Word and Action 4.

[72] Mnyaka and Motlhabi (n 69) 221.
