ANALYSIS

Recital 26, the Digital Omnibus, and why deidentification statements are becoming inevitable

EU data law is becoming situational, requiring organizations to justify when identification is unrealistic.


Contributors:

Noemie Weinbaum

AIGP, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, CDPO/FR, FIP

Senior Managing Counsel, Privacy and Compliance

UKG

Flora Garcia

CIPP/E, CIPP/US, CIPT, FIP

Former Chief Privacy Officer

Wayfair, McAfee, Time

Roy Kamp

AIGP, CIPP/A, CIPP/E, CIPP/US, CIPM, CIPT, FIP

Legal Director

UKG

If Ella Fitzgerald and Louis Armstrong were singing about EU data law today, their famous tomatoes-and-potatoes duet might need a new bridge.

"You say anonymization, I say deidentification."

"You say personal data, I say not for me."

"Let's not call the whole thing off — but let's stop borrowing each other's keys."

In a previous article, we explored how anonymization remains the high note of the EU General Data Protection Regulation world: rare, demanding and binary. Either the melody can no longer be traced back to a person, or it can. Most datasets that claim to be anonymized are, in reality, something more modest and far more common: pseudonymized — also known as deidentified. They are still playing the same tune, just behind a curtain.

That distinction mattered because the GDPR has, until now, treated personal data as a yes-or-no proposition. There has been no "mostly anonymous" refrain, no comfortable middle ground. And for years, that rigidity was justified by a single, deceptively compact provision: Recital 26.

Recital 26 has always set the tempo. It makes clear that data protection does not hinge on theoretical possibilities — whether, given unlimited time, resources and technology, someone somewhere could reidentify a person — but on whether an individual is identifiable or can be singled out by means reasonably likely to be used, taking into account cost, time, technology and purpose.

The recital — and remember, recitals are not binding but provide strong, example-based color on the original legislative intent — embeds a contextual, risk-based and fundamentally relative concept of personal data into the GDPR's DNA. The Court of Justice of the European Union has been faithfully riffing on that theme for more than a decade, spanning decisions from Breyer v. Bundesrepublik Deutschland to EDPS v. Single Resolution Board.

Yet recital-level guidance and case law, however clear on paper, proved insufficient in practice. Supervisory authorities have often defaulted to absolutism. If someone, somewhere, could identify a person, the data was treated as personal data everywhere. The relative concept became a theoretical solo few dared to play.

With the proposed Digital Omnibus, the EU does not abandon Recital 26. On the contrary, it finally drags its logic out of the recital grey zone and into the main score of Article 4 itself. And in doing so, it addresses two developments that now converge in the new definition of personal data.

The first is a quiet but decisive neutralization of a doctrinal wobble in recent CJEU case law. The second is a practical consequence that many organizations have not yet fully absorbed: Companies will increasingly need to articulate, document and defend how and why they cannot identify individuals in the data they process. That requires a detailed understanding, not Jedi mind tricks. It forces companies to prove a negative: to show that they hold no decoder ring that could reveal the natural persons behind the data.

This is not deregulation; it is redistribution of responsibility.

For more than a decade, the court has insisted that personal data is a relative concept. Whether information relates to an identifiable person depends on whether a given entity has a means reasonably likely to be used to identify that person. This logic, already present in Recital 26, was crystallized in Breyer and later reaffirmed in SRB. Sherlock Holmes would, most certainly, have approved. You assess the clues you actually have, not those that exist somewhere else in the world — or that could exist somewhere else in the world.

If Entity B can link a dataset to an individual, the data is personal for Entity B. If Entity A cannot, and has no realistic access to additional information, the data is not personal for Entity A. That is the relative concept in action: already the reality in practice, and part of the operative text once the Omnibus is eventually adopted.
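For readers who want the "decoder ring" made concrete, the Entity A/Entity B asymmetry can be sketched with keyed pseudonymization. This is a hypothetical illustration, not a compliance recipe: the key, identifiers and record structure below are invented for the example.

```python
import hmac
import hashlib

# Hypothetical secret key held ONLY by Entity B (the "decoder ring").
SECRET_KEY = b"held-only-by-entity-b"

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, one-way token (HMAC-SHA256)."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

# Entity B, holding the key, can re-derive the token for any known person,
# so it can link records back to individuals: personal data for Entity B.
token = pseudonymize("jane.doe@example.com", SECRET_KEY)

# Entity A receives only tokens. Without SECRET_KEY it cannot repeat the
# computation, and the HMAC cannot be reversed, so by Entity A's own means
# reasonably likely to be used, the record is not linkable to a person.
record_for_entity_a = {"user": token, "event": "login"}
```

The design point mirrors the legal one: identifiability is a property of an entity's means, not of the bytes themselves. The same record is personal data in Entity B's hands and, absent realistic access to the key, arguably not in Entity A's.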

And then came the confusion

In Gesamtverband Autoteile-Handel eV v. Scania and again in SRB, the court suggested that information might nevertheless become personal data for a sending entity if it is transferred to a recipient that can identify the individual. Read literally, this appeared to introduce a form of backward attribution. Identifiability seemed to flow upstream. 

A recipient's capabilities — and the scope of their existing engagement or data sets — could retroactively alter the legal nature of the sender's data. While this is a familiar songbook for experienced data professionals, the degree of futureproofing that this requires may still be unrealistic in practice. 

That reasoning sits uneasily with Recital 26. If identifiability depends on the means reasonably likely to be used by the entity in question, why should another entity's means suddenly matter? The court's likely intention was narrower. It wanted the GDPR to apply to the act of transferring data where identification becomes foreseeable, not to reclassify all of the sender's other processing activities. But the wording left room for doubt, and doubt turns a classy tune into trashy noise in operational privacy.

This uncertainty might have remained a doctrinal or academic discussion if it were not for the EU Data Act. While the Data Act focuses on data subjects' combined product and service data rather than the data required for data holders to operate a business, the clarifications within the act will have broad impacts. 

Under Chapter 2 of the Data Act, data holders must make data available to users and third parties as applicable or as requested by the data subject, especially when the data subject wishes to enter into a new vendor relationship. Much of the product and service data is technical or industrial in nature and not personal data in the hands of the data holder. 

If data holders were required to determine, for every potential recipient, whether that recipient could link the data to an identifiable individual, the system would quickly become unworkable. In practice, data holders would rarely be able to make such determinations with confidence: Sharing the data would expose them to GDPR risk, while refusing to share could expose them to Data Act sanctions. Given how interconnected vendors and systems are, this interpretation would also undermine one of the EU's freedoms, the free movement of data, and with it the EU's digital strategy as part of the Digital Single Market.

"Hotel California," privacy edition: Your data can check out any time you like, but you can never be sure the data is allowed to leave.

This is the context in which the third proposed new sentence of Article 4(1) GDPR, as contained in the Digital Omnibus proposals, must be understood. By clarifying that information does not become personal for an entity merely because a subsequent recipient has means reasonably likely to identify a person, the European Commission is not restating Recital 26. It is correcting a doctrinal spillover that threatened to undermine the practical effectiveness of the EU's broader digital framework, a correction that better aligns the goals of the GDPR with the practicalities of modern data ecosystems.

Other people's keys do not unlock your door. Or, as Sherlock Holmes might put it, you do not solve a case with evidence you cannot access.

As confirmed by a Commission representative at the IAPP Europe Data Protection Congress 2025, this provision is explicitly aimed at neutralizing the Scania/SRB transfer logic to the extent it risked creating a compliance deadlock under the Data Act. The GDPR's scope is clarified not to weaken protection, but to ensure coherence across EU digital law and across the member states.

Notably, the European Data Protection Board-European Data Protection Supervisor Joint Opinion on the Digital Omnibus does not contest this move. It does not attempt to revive an absolute concept of personal data. It does not argue that pseudonymized or technical data should be treated as personal data for everyone, everywhere. Instead, it emphasizes legal certainty, accountability and supervision. It accepts that some data will fall outside the GDPR for some entities and positions regulators to scrutinize how those determinations are made.

Where the second convergence becomes unavoidable

If Recital 26's relative concept is now firmly embedded in the operative text of the GDPR, organizations will increasingly need to explain why they cannot identify individuals in the data they process. Assertions will no longer be enough. Deidentification will need a documented narrative.

Much like transfer impact assessments after "Schrems II," organizations will be expected to articulate what data they hold; what identifiers have been removed or transformed; what additional information they do not possess; what technical, legal and organizational barriers exist; and why identification is not reasonably likely for them in that specific context. The issue is not whether anonymization can be claimed, but whether nonidentifiability can be demonstrated.

This development extends far beyond classic pseudonymization. For years, arguing that IP addresses, device identifiers or technical logs were not personal data in a given context carried significant enforcement risk, even where CJEU case law and Recital 26 supported that conclusion. The SRB judgment, the Omnibus proposal and the Commission's clarifications together send a different signal. The relative concept is not an exception. It is the rule.

That does not make the GDPR disappear. It makes it situational. And with situationality comes responsibility.

Anonymization remains the Fitzgerald moment: dazzling, rare and binary, a high and unchanged bar. But the Commission's proposal makes clear that organizations do not always need to reach that bar: Data can fall outside the GDPR for entities that cannot realistically identify individuals. In that sense, the change is less about anonymization and more about making nonidentifiability a defensible position. Rather than lowering the anonymization bar, the Commission's proposal acknowledges that Recital 26 was always right, just not loud enough.

Or, to stay with the Eagles, you can check out any time you like, but you had better be able to explain how you locked the door.

The first two new sentences in Article 4(1) codify what Recital 26 and the CJEU have been saying or suggesting for years: "Information relating to a natural person is not necessarily personal data for every other person or entity, merely because another entity can identify that natural person. Information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity."

The third sentence fixes what case law may have unintentionally destabilized: "Such information does not become personal for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the natural person to whom the information relates." Together, the new sentences close the door on the absolute concept of personal data through precision, not deregulation.

They also usher in a new compliance reality — one defined by better explanations and more logic.

In privacy, as in music, clarity matters. Not every tune is jazz. Not every dataset is anonymous. And sometimes the most important note is the one that never plays, provided you can explain why.



Tags:

AI and machine learning, Data security, Law and regulation
