LinkedIn article, published on 30 September 2018


Article 1 of the GDPR rarely attracts much attention: at best, it is viewed as a repetition of the GDPR's title; at worst, as a functional article, a necessary introduction to the GDPR – and to the much juicier articles that follow, on material scope and territoriality. This is a shame. Article 1 sets the framework for the GDPR, and frameworks are important.

Unless there is consensus on, and understanding of, the framework, not much can be achieved. People need to agree on the basics in order for any meaningful interaction to take place. In the case of the GDPR, if anyone does not think that individuals should be protected by law against the processing of their data by third parties (because, say, individuals are perfectly capable of protecting themselves), or if a European thinks that a Regulation (federal law) should not be involved (because each Member State knows better), then he or she should go no further: the GDPR will only cause them distress.

Consequently, Article 1 matters. However, setting up the GDPR framework is only part, the lesser one, of its importance. Its crucial contribution is hidden behind its unassuming paragraph 1: “This Regulation lays down rules relating to the protection of natural persons with regard to the processing of personal data”. It is there that the GDPR provides its answer to the basic question of whether technological progress should be regulated by law.

This is an old question, ultimately referring to the relationship between law and science. Should the law interfere at all with scientific research? Should it say what science should, and should not, work on? Should it lay down prohibitions for scientists? Or should regulators patiently wait until scientific progress hits the streets, and then intervene on its actual uses? Even then, should the law anticipate these uses, or should the law react only once they have started affecting the lives of a significant number of people?

Today science is popularised through, and in part paid for by, technology. Technology, as enabled by science, is used for the processing of personal data. Hence, the GDPR steps in.

The GDPR, therefore, “lays down rules”. This is an important statement in itself. It assumes that rules need to be laid down. It also assumes that rules can be laid down. These rules pertain to the processing of personal data, thereby regulating technology and, through it, science. The GDPR is typical of the European approach to technological developments: they are welcome, but rules are both feasible and necessary.

If this is the case, is it possible to “simply say no” to a particular personal data processing operation under the GDPR? Is it appropriate for a Data Protection Authority or compliance legal counsel to advise against carrying out a particular processing operation altogether? Would a simple “not allowed under the GDPR” be an acceptable answer to a question by marketeers or engineers on whether to set up a new type of processing?

I think not. By “laying down rules” the GDPR describes and classifies personal data processing operations. This is a first, necessary step for regulation; however, it also means that there are rules for any given type of processing. If a specific type of processing seems to run against GDPR rules, it should not be prohibited altogether; rather, rules need to be applied to it so that it is brought within the GDPR's boundaries.

In practice, this means that a simple “no” will not do. If a particular processing operation appears unlawful, the adequate response under the GDPR is not unequivocal prohibition but a description of the conditions under which it would become lawful. In other words, the GDPR is to be viewed not as a yes/no switch, but rather as a filter to be applied to any personal data processing operation so that it becomes lawful.

Admittedly, this has been the norm for Data Protection Authorities so far, under the previous legal framework. I see no reason for this approach to change, particularly now that the GDPR, rightly or wrongly, appears to be the go-to legal response to anything from bioengineering to robots and artificial intelligence. While none of this should, or even could, be restricted outright by any law, this does not mean that rules should not be applied to it.