Europe Is Not “So Back”: Why Cookie Banners Are Here to Stay (Despite the Reform) and the Hard Route Not Taken
“Europe is so back. No more cookie banners.” Alas, no. Cookie banners are staying. But the banner is just an irritating symbol of the fact that European politicians can’t quite muster the will to do the hard things we need.
Ursula von der Leyen was right in this year’s State of the Union Address that “Europe is in a fight.” Thanks to the published “omnibus” reform proposals, we now know what Brussels is bringing to this fight.
Good intentions? Sure. Some recognition that our current economic malaise is due to regulatory overreach and the failure to safeguard the common market? Yes.
However, European politicians cannot abandon the “luxury” mindset: the belief that we can pursue all policy aims simultaneously, with “win-wins” and no difficult trade-offs (Rebuild our industry! But also: no emissions! And no nuclear!).
As the authors of the excellent Constitution of Innovation argued, we are on a path similar to that of twentieth-century Argentina—albeit worse due to Russia’s violent imperialism and a demographic crisis. To avoid that fate, the authors propose strengthening the common EU market while limiting the EU’s regulatory ambitions. Of course, the former faces opposition from national governments, and the latter is anathema to Brussels (a small illustration: “civil servants are being asked to rethink and streamline laws they authored, championed and built their careers on …”).
The proposed “digital omnibus” legislation is an excellent example of how challenging it is to meaningfully reform the excesses of EU regulation.
Last week, I commented on its leaked version (Leaked GDPR reform: some good ideas but what about enforcement?). Earlier this week, the Commission published the official proposal, which is largely in line with the content of the leaked document.
Cookie consent banners are here to stay
I have bad news for those who are celebrating the proposal’s headline aim of getting rid of cookie consent banners as a big win over regulatory overreach. To understand why, let’s look at why businesses are currently expected to ask for user consent.
I agree with the Commission that the cookie consent law—from the ePrivacy Directive—is “outdated and inadequate for contemporary privacy and data needs.” As interpreted by the European Data Protection Board (EDPB), the law requires prior user consent for many basic information exchanges between an Internet service and a user device (see my Consent for everything? EDPB guidelines on URL, pixel, IP tracking).
The exceptions to that are interpreted extremely narrowly and don’t include such table-stakes purposes as using parts of web links (URLs) to inform the digital service about which advertising partner originated the traffic or what advertising campaign the link was associated with. Similarly, the exceptions don’t cover basic measures used to detect attempts to defraud advertisers (which would mean asking fraudsters to kindly consent to anti-fraud measures!). It doesn’t matter that this isn’t personal data. It doesn’t matter that this isn’t even remotely sensitive. On the EDPB’s absolutist reading of the law, prior user consent is required—hence, the consent banners/pop-ups.
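To make concrete how generic this kind of data is, here is a minimal Python sketch of campaign attribution from a landing-page URL. The URL and values are invented for illustration, though the `utm_*` query keys are a common industry convention:

```python
# Illustrative only: extracting campaign-attribution parameters from a
# landing-page URL. The "utm_*" keys identify the advertising partner and
# campaign that originated a visit -- no individual is identified.
from urllib.parse import urlparse, parse_qs

def attribution_from_url(url: str) -> dict:
    """Return conventional attribution parameters found in a URL's query string."""
    params = parse_qs(urlparse(url).query)
    keys = ("utm_source", "utm_medium", "utm_campaign")
    # Keep only the attribution keys; ignore everything else in the query.
    return {k: params[k][0] for k in keys if k in params}

url = "https://shop.example/landing?utm_source=partner-a&utm_medium=banner&utm_campaign=spring-sale"
print(attribution_from_url(url))
# {'utm_source': 'partner-a', 'utm_medium': 'banner', 'utm_campaign': 'spring-sale'}
```

On the EDPB’s reading, even reading such link fragments on the user’s device can fall within the consent requirement, despite nothing in the output pointing to a person.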
The legal change proposed by the Commission doesn’t even attempt to address the bulk of the issue. Processing non-sensitive, non-personal data will still require prior consent.
What the Commission proposed was to exclude personal data processing from the ePrivacy Directive and mostly replicate the framework within the GDPR, but with minor tweaks.
The tweaks include the addition of two exemptions from consent: creating aggregated usage statistics and security measures. Despite the Commission’s optimism, the new exemptions will be practically insignificant because they will be interpreted with anti-business prejudice by the same people who produced the already cited EDPB guidelines requiring consent for processing even generic web link fragments.
With the EDPB in charge, we can expect the analytics exemption not to apply to the standard third-party analytics that businesses actually use, to say nothing of any analytics not deemed “aggregated” because they track individual users. Similarly, the security exemption will almost certainly not cover anti-advertising-fraud measures. In other words, the purposes for which actual website operators process data will still require consent. The cookie banner will live to fight another day.
You might object that this is against the spirit of the reform, and surely there must be a way to interpret the law more flexibly? I would agree, but here we reach the core of the problem. It is the reason why, in practice, all the other good ideas from the Commission’s proposal also risk being largely nullified.
The issue is enforcement.
Privacy enforcement will undermine the reform
The officials in charge of applying the changed rules will be the same people who brought us the idea that prior user consent should be required for nearly all Internet communications. In other words, people who genuinely believe their job is to promote data protection and privacy above everything else. Even if legislators were to give them an explicit duty to care about “economic growth” or “innovation,” it would change little. In fact, privacy enforcers would likely argue that they are already doing that (e.g., some EDPB documents contain sections on balancing—I’ll leave it to the reader to guess how serious such balancing is).
What is especially difficult for privacy enforcers to internalise is the importance of clarity. I criticised the opinion on AI models issued by the EU-level body of data protection authorities, the EDPB, by pointing out how it failed to address regulatory uncertainty:
Yes, the Opinion does not say that AI is illegal in the EU, but let’s be honest: even in the EU, explicitly making such a declaration is politically unpalatable. Instead, the EU privacy enforcers did what they usually do. They kept as much enforcement flexibility for themselves as possible, opening the door for any EU national enforcer to impose fines worth billions. Of course, the other side of that coin is that those who want to use AI in the EU have no idea whether all their GDPR compliance efforts will be judged as not good enough in a year or two. …
Some privacy regulators may protest that this was not their intent; after all, they did provide the list of things to try. But such an answer would show the fundamental disconnect from economic reality. Consciously or not, regulators can thwart development not only by explicitly banning it but also by creating an environment of uncertainty. Even the threat of discretionary regulatory enforcement—combined with the risk of heavy fines—can significantly chill investment decisions (and thus innovation) at the margins.
Why was the EDPB opinion formulated in a way that allows claiming that real-world AI efforts are illegal under the GDPR? Because an influential part of the EU’s data protection officials holds that view.
It should thus not be surprising that privacy activists are now saying the quiet part out loud in their opposition to the Commission’s proposal. For example, in NOYB’s view, the problem with the draft reform is precisely that it would legally enable state-of-the-art AI, which, according to them, is illegal today.
To be clear, some European privacy authorities have recently shown a genuinely pragmatic approach. The best example is the French authority’s guidance on AI. However, it is not a coincidence that the leading European player in that market is a French national champion (Mistral), a fact of which the French government is justifiably proud.
This kind of clarity of purpose, derived from national interest and a “whole-of-government” mobilization, is not something we can rely on for pragmatic, consistent enforcement across the EU.
Pragmatic voices are barely heard in documents adopted by the EDPB.
The EDPB gets to make the rules because courts have been excessively deferential to data protection authorities thus far. Furthermore, even getting an EU court to review the EDPB’s interpretations is procedurally complex and likely to take years.
In their reliance on interpretations from data protection authorities, the EU courts have failed to appreciate that these authorities often do not even attempt to act in the general public interest, as we would expect from a European public authority. Instead, they are effectively American-style campaigners for a single cause—a small aspect of the public interest.
This also partially explains the preoccupation with American “big tech” companies: activists seek “David-vs-Goliath” fights. It’s much harder to get a magazine cover for cracking down on homegrown illegal “sellers” of personal data, scammers, and their ilk.
The hard route not taken
Enforcement reform. Reforming the enforcement framework is essential for the success of any significant improvement to EU data protection law. If done well, this could reduce the need for substantive changes in laws like the GDPR.
This assumes, however, that an appropriately motivated and resourced authority could produce helpful guidance, both balancing privacy (and data protection) interests with other values and clearly informing organizations how to comply. The French data protection authority, CNIL, has demonstrated that this is possible in its guidance on AI, which contrasts starkly with the vague and non-committal language of the EDPB’s opinion on AI models.
I proposed one way to achieve this in A serious target for improving EU regulation: GDPR enforcement. I suggested establishing an EU tribunal with a clear mandate to balance privacy and other EU goals to approve guidance documents and cross-border enforcement decisions. Perhaps this idea could be integrated with the “specialized commercial courts” proposed in the Constitution of Innovation.
I also suggested that a centralised, EU-level data protection authority might be a good idea—though it would likely need to be created from scratch to safeguard ideological neutrality and a capacity for serious balancing of interests.
GDPR/ePrivacy changes and the courts. The disproportionate, myopic practice of data protection enforcement has resulted in a situation where the EU courts require clear guidance from the EU legislature—maybe even a treaty change—to undo the damage.
In the proposed GDPR reform, the Commission tries to rely on a recent judgment from the EU’s highest court, the Court of Justice (CJEU), to clarify that the definition of “personal data”—and thus the GDPR’s scope of application—should be construed more narrowly than many privacy authorities push for. This is now being criticised for not aligning perfectly with what the Court said. I understand why the Commission preferred to have such “cover” for beneficial GDPR changes.
The criticism the Commission’s proposal faced highlights the problem with relying on courts for guidance, rather than acknowledging that this area of law is a mess and that the courts need guidance from the legislature (and perhaps from the “masters of the treaties”—the member states).
The CJEU is unpredictable; it could issue a decision pushing the interpretation of “personal data” in a different direction next month. This is a significant risk, given that the GDPR is currently a de facto “law of everything” in the digital economy, with enforcers wielding powers to block technological progress while lacking both the competence and the legitimacy to do so.
Stating openly that the EU legislature wants to go in a different direction, instead of claiming merely to codify what the EU Court has said, would make it clear to the courts that they’re no longer dealing just with a technical legal question, but with a considered political choice about the future of Europe.
Yes, this strategy is politically more challenging, but I’m skeptical that what the Commission is currently attempting will hold up.
Simplification and the definition of “personal data”
EU data protection law is notoriously complex—partially due to its drafting, but even more so due to its enforcement. The complexity is especially taxing for smaller organizations and may, in some cases, act as a business moat for larger ones. Critics of the proposed GDPR reform argue that modifying the GDPR text will introduce new interpretive challenges. To some extent, this is inevitable; that’s how the law works.
Nevertheless, the Commission’s proposal includes a provision moving toward true simplification: cutting the GDPR down to a manageable size and denying its unearned status as the “law of everything.” It is the provision, mentioned above, on the definition of “personal data.”
In my previous comment, I summarised it in the following way:
In short, the idea is that whether something counts as personal data for you (and whether the GDPR applies to you) depends on whether you are capable of identifying the individuals to whom the data relates.
I also suggested a potential benefit of this approach:
…this will prompt many organisations to separate the processing of data allowing the identification of individuals (e.g. let other entities specialised in GDPR compliance … handle that processing) and otherwise to only process pseudonymous data (without being able to identify individuals).
Acknowledging that the GDPR should only apply to organisations that are “reasonably likely” to identify specific individuals would create a powerful incentive to process data in such a way that individuals cannot be identified. I can imagine that many organisations, large and small, would jump on that opportunity—even if it means incurring technical costs or knowing less about their customers. The payoff in reduced regulatory complexity would, I think, be very clear.
Of course, such organisations would still be “in the shadow” of the GDPR in the sense of needing to make sure that they are not reasonably likely to identify individuals. Thus, e.g., pragmatic GDPR guidance on data security and organisational measures could still be helpful for them. But that would still constitute a very significant simplification.
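One way such a separation could work in practice is a keyed-pseudonym scheme, where only a separate, compliance-specialised entity holds the secret needed to link pseudonyms back to raw identifiers. This is a sketch under assumptions, not legal advice, and all names in it are hypothetical:

```python
# Sketch: the main organisation stores only pseudonyms plus non-identifying
# attributes; the secret key needed to (re)compute pseudonyms from raw
# identifiers is held solely by a separate "identity holder" entity.
import hashlib
import hmac

class IdentityHolder:
    """Hypothetical separate entity that alone holds the pseudonymisation key."""

    def __init__(self, secret: bytes):
        self._secret = secret

    def pseudonymise(self, user_id: str) -> str:
        # Keyed hash (HMAC-SHA256): deterministic, so the same user always
        # maps to the same pseudonym, but without the secret the mapping
        # cannot be recomputed or reversed from the raw identifier.
        return hmac.new(self._secret, user_id.encode(), hashlib.sha256).hexdigest()

holder = IdentityHolder(secret=b"kept-only-by-the-identity-holder")

# What the main organisation sees and stores: a stable pseudonym and an
# event attribute, with no practical means of identifying the person.
event = {"user": holder.pseudonymise("alice@example.com"), "page": "/pricing"}
```

Whether such an arrangement would actually take the main organisation outside a “reasonably likely to identify” test is exactly the kind of question that pragmatic guidance would need to answer.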
The criticism that this particular proposal received looks almost knee-jerk: any assault on the GDPR as the “law of everything” must be rejected. A potentially clear benefit—the reduction in the processing of personal data—is presented as undesirable. This is presumably because “true data minimization” can only be done under a full GDPR regime, without even the slightest hint of reducing privacy enforcers’ control. Such criticism proves too much and is rather revealing about the critics. It shows that those who engage in it start from a position of mistrust towards “business,” demonstrating that they are incapable of serious balancing of rights and interests.
That said, I am still skeptical that this change to the GDPR’s text, presented as a mere codification of what the Court said, will prove sustainable. Whatever the data protection authorities’ official opinion of the Commission’s proposal turns out to be, we can expect them to undermine it as much as possible. Whether the courts will police that several years down the line is also uncertain.
So this big chance for simplification will likely quickly fizzle out with new vague, “case-by-case” guidance, prompting lawyers to advise their clients that they are almost always at risk of being “reasonably likely” to identify individuals in the local data protection authority’s view.