
The European Commission has unveiled a sweeping proposal to revise the General Data Protection Regulation (GDPR), aiming to simplify compliance while introducing technically nuanced changes to key concepts like personal data, AI processing, pseudonymisation, and consent mechanisms.
The official draft, published on November 19, 2025, is part of Brussels' broader effort to streamline regulation without compromising core privacy principles. Lukasz Olejnik, a privacy and cybersecurity researcher and long-time GDPR advisor, published a detailed assessment of the proposal, calling the changes both “sensible” and “far-reaching.” One of the most consequential revisions redefines the very scope of what counts as personal data, shifting it from a universal standard to an entity-relative concept.
The GDPR, enforced since May 2018, governs data protection across the EU and has become a global benchmark. While the original regulation placed heavy compliance burdens on firms of all sizes, the latest proposal attempts to rebalance those obligations.
Personal data and AI
Under the proposed amendment to Article 4(1), information will only be considered personal data if the entity in question can reasonably identify the data subject using its own means. This entity-relative model diverges from the current interpretation upheld by the Court of Justice of the European Union (CJEU), which considers identifiability through both direct and third-party means.
For instance, if an adtech firm receives hashed email addresses that it cannot, on its own, resolve back to individuals, even though its partners can, the company could now argue that the data is not personal for it. This may significantly reduce the scope of data protections in certain cases, allowing more leeway for data-driven business models to operate under less regulatory scrutiny.
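To make the hashing example concrete, here is a minimal sketch of how email addresses are typically turned into one-way digests before sharing. The function name and normalisation steps are illustrative assumptions, not taken from any specific adtech pipeline; the point is that the recipient of the digest cannot recover the address on its own, while a partner who already holds the plaintext can compute the same digest and match it.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalise an email address and return its SHA-256 digest.

    The digest cannot be reversed to the plaintext address; only a
    party that already holds the address (or a candidate list) can
    re-derive the same digest and link it to an individual.
    """
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()
```

Because normalisation happens before hashing, `hash_email("Alice@Example.com")` and `hash_email("alice@example.com")` produce the same 64-character digest, which is what lets partners match records without exchanging plaintext.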
Another critical area of reform involves the use of special-category data, such as political beliefs, health status, or sexual orientation, in AI model development. The proposal acknowledges that incidental or residual presence of such data may occur in large-scale datasets. Under the new Article 9(2)(k), controllers may proceed with processing if:
- They make technical and organizational efforts to avoid collecting such data;
- They remove it when identified;
- If removal requires disproportionate effort, they must protect it from influencing outputs or being shared.
This approach allows AI developers to operate with some flexibility while maintaining safeguards, a marked shift from the blanket restrictions currently in place.
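The "remove it when identified" obligation can be pictured as a filtering stage in a data pipeline. The sketch below is purely illustrative: the regex patterns stand in for whatever real classifier a controller would use to flag special-category content, and none of the names come from the proposal itself.

```python
import re
from typing import Optional

# Hypothetical patterns standing in for a real special-category
# classifier (health status, political opinions, etc.).
SENSITIVE_PATTERNS = [
    re.compile(r"\bdiagnos(ed|is)\b", re.IGNORECASE),
    re.compile(r"\bvoted for\b", re.IGNORECASE),
]

def filter_record(text: str) -> Optional[str]:
    """Return the record unchanged, or None if it is flagged as
    containing special-category data and must be removed."""
    if any(p.search(text) for p in SENSITIVE_PATTERNS):
        return None  # remove when identified
    return text

corpus = ["bought a new bike", "was diagnosed with asthma"]
cleaned = [t for t in corpus if filter_record(t) is not None]
```

In practice the classifier would be far more sophisticated, and records that cannot be removed at proportionate cost would instead be shielded from model outputs, as the third condition describes.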
Strengthening pseudonymisation
Article 41a introduces long-awaited clarity on pseudonymised data. It empowers the Commission to define when such data can be considered non-personal, offering potential relief for controllers employing privacy-preserving technologies. This provision could unlock new use cases for data analytics and research without compromising identifiability protections.
A separate clause under Article 9(2)(l) explicitly permits biometric verification where the means remain under the sole control of the data subject, such as on-device fingerprint or facial recognition authentication. This confirms the legality of local biometric processing, aligning with existing practices in mobile and IoT ecosystems.
User consent
Perhaps the most visible and practical change comes in how consent is collected and managed. The proposal introduces a strict purpose-based regime for operations on user devices, now governed by a closed list in Article 88a. Only four scenarios allow processing without consent:
- Transmission of communications,
- Services explicitly requested by the user,
- First-party audience measurement,
- Security maintenance.
In all other cases, user consent is required, and the mechanism for obtaining that consent is fundamentally transformed. The Commission mandates that consent must be possible to grant or refuse with a single click, and that the decision must be respected for six months without re-prompting. It also mandates support for machine-readable consent signals, potentially expressed via web browsers, operating systems, or digital identity wallets.
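The six-month window implies a simple persistence model: record the timestamp of each consent decision and suppress re-prompts until the window lapses. A minimal sketch follows; the 183-day constant approximates the proposal's six months and is an assumption, since the draft states the period in months, not days.

```python
from datetime import datetime, timedelta

# Approximation of the proposed six-month window (assumption:
# the draft specifies months, not a day count).
CONSENT_TTL = timedelta(days=183)

def needs_reprompt(granted_at: datetime, now: datetime) -> bool:
    """Return True only once the stored consent decision has
    outlived the six-month validity window."""
    return now - granted_at > CONSENT_TTL
```

A site would consult this check before rendering any consent dialog, showing nothing while the stored decision is still in force.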
These changes directly address long-standing criticism of “consent fatigue” and deceptive interface patterns. Once standards emerge, websites will be legally required to recognize these automated signals, bringing earlier proposals such as the W3C's long-abandoned Do Not Track and California's Global Privacy Control (GPC) back into relevance, though the EU is likely to favour a homegrown technical standard.
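A machine-readable signal of this kind already exists in practice: browsers supporting Global Privacy Control send the request header `Sec-GPC: 1`. A minimal server-side check might look like the sketch below (the function name is illustrative; whatever standard the EU adopts may use a different header).

```python
def honours_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control
    opt-out signal, sent by supporting browsers as `Sec-GPC: 1`."""
    return headers.get("Sec-GPC", "").strip() == "1"
```

Under the proposal, once such a signal is recognized the site would have to treat it as the user's consent decision, rather than presenting its own banner.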
Notably, media service providers are temporarily exempt from the obligation to honour machine-readable consent signals while delivering their services, raising potential friction between privacy expectations and economic models that rely on ad revenue.
While the proposal still needs to pass through the legislative process, its clarity and technical precision mark a significant departure from the GDPR's sometimes vague interpretations. Olejnik comments that small businesses, in particular, stand to benefit from the narrower scope of “personal data” and clearer consent requirements.