Published in European Law Blog, 1.04.2021
By Vagelis Papakonstantinou and Paul De Hert
1. GDPR mimesis as the only regulatory method?
EU regulatory work on technology-related fronts has recently spiked. The EU has been extremely busy implementing its European Digital Strategy. Over a short period it has released a draft Data Governance Act (DGA), a Digital Services Act (DSA) and a Digital Markets Act (DMA), while also working on its proposal for an AI Regulation. This recent battery of EU acts to regulate technology prompted our earlier comment, on this blog, on EU law “act-ification”. Now, instead of focusing on the titles of these initiatives, we wish to turn our attention to their content, in order to identify a second phenomenon: GDPR mimesis.
EU personal data protection law follows a by now well-known scheme: a specialized, unique set of terms, a set of basic principles and case-specific rights, all monitored by a specialized public agency. In some more detail, “data subjects” (meaning individuals) and “controllers” and “processors” (meaning those doing the processing) interact through the “processing” of common or “sensitive” “personal information” (all terms closely, and uniquely, defined in the personal data protection context). This processing needs to be based on a set of special principles (e.g., fair and lawful processing, data minimization, purpose specification). Special rights (e.g., information, access, rectification) need to be observed. All of these interactions are monitored by Data Protection Authorities, specialized state agencies established particularly for this purpose and carrying out only this task.
Today, the most prominent representative of this scheme is the EU General Data Protection Regulation (the “GDPR”). While the same system is also reproduced in other basic instruments (such as the Law Enforcement Directive and the Regulation on processing by EU institutions and bodies), it is the GDPR that stands out among them, if for nothing else than its sheer breadth of scope: with the exception of specific areas of personal data processing, the GDPR aims at regulating nothing less than any and all personal data processing in Europe.
The GDPR system was not created in 2016. On the contrary, it is built on premises as old as the first Hessian data protection act of 1970. Data protection laws were released during the 1970s and 1980s mostly with technology regulation in mind. After an initial batch of similar laws during the 1970s (Hesse was followed by Sweden, France and Germany), most other EU Member States followed during the 1980s. Each and every act introduced during this period applied more or less the same scheme outlined above. Its formal adoption as the European approach to personal data protection came in 1995, through Directive 95/46/EC. The only newcomer in the 1995 Directive’s text (which otherwise simply merged the Member States’ approaches) was the establishment of an EU cooperation mechanism: the Article 29 Working Party (now the European Data Protection Board (EDPB)). The GDPR merely expands and furthers the same system, adapted to new technological and social challenges (most notably taking into account that the 1995 Directive was drafted before the rise of the internet).
The GDPR’s success is undisputed. It has raised global awareness and by now constitutes the basic, global text of reference when it comes to the protection of personal information (and privacy). The acronym has found its way into policy documents all over the world and is a recognized reference point within the business community.
However, has the GDPR’s success dulled EU legislators’ imagination? Is its influence so dominant that each new EU regulatory initiative for technology and digital life is obliged to pay tribute to its system through replicating it? Is GDPR mimesis the only regulatory method when it comes to the regulation of technology in EU law today?
2. EU’s Data Governance Act: definitional, substantive and institutional mimesis
In late 2020 the Commission announced its intention to introduce measures “to boost data sharing and support European data spaces”, based on the ever-increasing volumes of data generated by public bodies, businesses and citizens. Accordingly, its draft DGA is a Regulation aimed at fostering the availability of data for use by increasing trust in data intermediaries and by strengthening data-sharing mechanisms across the EU. Three pillars are constructed in the DGA to realize this ambition. The first pillar aims to enable greater data sharing among public and private sector entities. A second pillar establishes a notification and compliance framework for providers of data sharing services, with the aim of creating more trustworthy data sharing. The third pillar establishes a (voluntary) registration regime for data-altruist entities.
Regardless of the merits of the DGA, its mirroring of the GDPR system is unmissable. GDPR mimesis takes three forms: definitional, substantive and institutional. Definitional GDPR mimesis is evoked through the DGA’s terminology. A new, unique set of terms is introduced in Article 2 of the DGA: “data holders”, “data users”, “data” and “data sharing”. These correspond to the GDPR’s “data subjects”, “controllers”, “personal data” and “processing” (in its Article 4).
Then there is substantive mimesis. The DGA identifies a special set of principles to govern the provision of data sharing services (Article 11), organizes a system of adequacy for data exports outside the EU (Article 30) and introduces new ‘special’ rights in order to assist individuals who want to engage in “data altruism” (Article 19). It also establishes a new state authority to monitor all of the above (Articles 12, 13 and Chapter V). All of the above have easily identifiable parallels in the text of the GDPR.
Finally, institutional mimesis comes most prominently in the text of the DGA with regard to its suggested cooperation mechanism: the European Data Innovation Board (Article 26). Its name immediately brings to mind the GDPR’s EDPB, an administrative body endowed with legally binding powers. There is, however, little real kinship: the DGA’s Innovation Board is no more than an expert body.
This institutional mimesis may cause the most serious problems of the three types of GDPR mimesis. While definitional mimesis may be unavoidable within a technology-related setting, and substantive mimesis may serve continuity in the protection of individuals from technological developments, institutional mimesis raises public expectations. To anyone not familiar with the intricate details of the GDPR and the DGA, the sharing of names is likely to suggest shared aims, means and objectives. However, this is not the case here: the DGA’s objectives are different from those of the GDPR. Individuals, and businesses, expecting a common approach on the basis of GDPR mimesis are thus bound to be disappointed.
3. EU Regulation on AI: more mimesis
In its Resolution 2020/2012(INL) on a Framework of Ethical Aspects of Artificial Intelligence, Robotics and Related Technologies (the “AI Ethical Aspects Resolution”), the European Parliament suggests that the Commission introduce a Regulation “on ethical principles for the development, deployment and use of artificial intelligence, robotics and related technologies”. In another text of ours we highlighted the GDPR-feel of the model of regulation proposed by the Parliament to tackle Artificial Intelligence (AI). Definitional mimesis becomes visible through the new set of actors introduced in Article 4 (“user”, “developer” and “deployer”), resembling the GDPR’s “data subject”, “controller” and “processor”. Substantive mimesis is identifiable in Article 5, whose principles very much follow those of the GDPR. Other unmissable GDPR-reminiscent ideas, akin to institutional mimesis, include the “risk assessments” of Article 14 (cf. Data Protection Impact Assessments in the GDPR), the “compliance assessment” of Article 15 (cf. prior consultations in the GDPR) and the “European Certificate of Ethical Compliance” (cf. the European Data Protection Seal in the GDPR). In addition, the Parliament recommends the establishment of “national supervisory authorities” to monitor all of the above (Article 18); space for an EDPB-like institution is openly left in Article 20.
The Commission’s final take on AI regulation, of course, cannot be prejudged. Its draft is expected to be released in the first quarter of 2021. It remains to be seen whether it will follow the Parliament’s proposals or not.
4. When does GDPR mimesis kick in? The counterexample of the Digital Services Act package
Is the GDPR the standard go-to regulatory model in Europe for any and all technology-related legislative initiatives? Actually, this is not the case. At the same time as releasing its DGA draft, the Commission also introduced two important proposals comprising the so-called Digital Services Act package: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Both texts are counterexamples to GDPR mimesis. Not a trace of the EU personal data protection scheme can be found in them. Why is that? Is GDPR mimesis after all not a horizontal but an isolated phenomenon, relevant only to the above two cases?
The DSA and the DMA, as was the case with the GDPR, have a long regulatory history. The DSA in particular furthers and expands Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce. This e-Commerce Directive is an impressive text in its own right that, like the 1995 Data Protection Directive, withstood for more than twenty years the internet revolution that took place in the meantime. In other words, the aim-setting of the DSA and the DMA is entirely different: they aim at regulating the provision of services over the internet. Their objective is to protect consumers and offer legal certainty to providers. They could well be the result of “path dependency” within the same policy cycle, in the sense that they build on and further the aims and objectives of their predecessor, the e-Commerce Directive. While such path dependency is understandable, it might at the same time prejudice their approach.
In contrast, the GDPR furthers the fundamental right to data protection (notwithstanding its “free movement of data” imperative). Similarly, the DGA and the Parliament’s proposed regulation of AI adopt a protective approach towards individuals. Their main concern is not to regulate the market, as is the case for the DSA and the DMA, but to protect individuals. GDPR mimesis, then, is a phenomenon found not in any and all new technology regulation in the EU, but only in those instruments aiming at protecting individuals against technology’s counter-effects. Most likely, it is this protective raison d’être that throws them into the arms of the GDPR’s model par excellence – albeit, to the point of asphyxiation.
Is GDPR mimesis such a bad thing after all? Does it not make sense for EU legislators to copy a model that has demonstrably served its purposes well, placing the EU at the international forefront when it comes to protecting individuals from the unwanted consequences of technology?
Yes and no. From a legal-technical point of view, complexity is increased. If all of the above initiatives come through, the same company could be a “controller” or “processor” under the GDPR, a “data holder” under the DGA and a “developer” under the AI regulation – not to bring any DSA or DMA characterization into the picture. But perhaps ours is an age of complexity, and simplicity in the digital era is long foregone. Notwithstanding any such pessimistic outlook, the fact remains that lawyers and state authorities will most likely have a particularly hard time juggling all of the above capacities simultaneously. Consistency – if it ever was an EU law objective at all, as most pertinently questioned by Brownsword (p. 155) – would be substantially hampered.
Perhaps then the GDPR has formulated an EU model for technology regulation? A kind of acquis? While tempting from an EU law point of view (in line with the “Brussels effect” identified by Bradford), this finding may prove problematic. Would then the EU approach to technology essentially comprise a highly structuralist, bureaucratic approach composed of special roles, rights and principles and the establishment of new state authorities?
Even from a straightforward human-creativity perspective, mimesis is a bad thing. One is allowed, and indeed compelled, to stand on the shoulders of giants, but at some point one has to make one’s own contribution. Within a law-making context, Fuller has clarified that rules can be creative, but should be at least minimally clear and intelligible, free of contradictions, relatively constant, and possible to obey, amongst other things (p. 39). Mimesis in any one of its forms (definitional, substantive or institutional), particularly among regulatory texts of different aims, scope and objectives, may lead to confusion through the creation of unattainable public expectations.
For all of the above reasons we believe that GDPR mimesis in EU law-making is ultimately a bad thing. The GDPR is an immensely successful legal instrument with a life and history of its own. EU personal data protection is currently busy tilting the planet towards stronger protection of individuals’ privacy under a technological deluge. This seat is therefore taken. Any new EU regulatory initiative will have to create a story of its own.