
Over 500 leading cryptographers and security researchers from 34 countries have issued a stark warning to European governments, urging them to reject the EU's proposed “Chat Control” regulation.
In a joint open letter published today, the experts call the legislation “technically infeasible,” a “danger to democracy,” and a direct threat to the security and privacy of European citizens.
The warning comes just days before an EU Council working party meeting on September 12 and ahead of a pivotal ministerial vote scheduled for October 14. The outcome hinges on undecided countries like Germany, whose abstention or opposition could halt the proposal entirely.
The letter was signed by 502 scientists with established credentials in cybersecurity, cryptography, and privacy engineering. Notable signatories include Cas Cremers (CISPA Helmholtz Center), Bart Preneel (KU Leuven), Michael Veale (UCL), Carmela Troncoso (EPFL), and René Mayrhofer (JKU Linz), all highly respected within the global infosec community.
They argue that the proposed regulation, formally known as the EU Regulation to Prevent and Combat Child Sexual Abuse (CSAR), would mandate automated scanning of private communications, even on end-to-end encrypted (E2EE) platforms, using AI and machine learning tools. The scientists emphasize that such systems would be riddled with false positives, trivially easy to evade, and would introduce dangerous backdoors into secure messaging systems.
“There is no machine-learning algorithm that can detect unknown CSAM without committing a large number of errors,” the letter states. “All known algorithms are fundamentally susceptible to evasion.”
A technological and legal minefield
The latest draft of the Chat Control regulation, dated July 24, 2025, and circulated to EU delegations, includes compromises such as limiting detection to visual content (images, videos, and URLs) and excluding audio and text. However, the scientific community stresses that these adjustments do not address the core flaws.
Specifically, they highlight:
- Infeasibility at scale: Scanning billions of messages and images daily cannot achieve the required accuracy; even a low false-positive rate would overwhelm authorities and implicate innocent users (see the back-of-the-envelope sketch after this list).
- Breakage of encryption: On-device scanning undermines E2EE by introducing a de facto backdoor, creating a single point of failure vulnerable to exploitation.
- Function creep risk: The system's capabilities could easily be repurposed for censorship, political surveillance, or broader law enforcement use.
- Democratic fragility: The researchers warn that the regulation risks normalizing mass surveillance in democratic societies under the guise of child protection.
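To see why "a low false-positive rate" still breaks down at messaging scale, here is a back-of-the-envelope sketch in Python. All of the numbers (daily volume, false-positive rate, prevalence, detection rate) are illustrative assumptions for the sake of the arithmetic, not figures from the open letter or the draft regulation.

```python
# Back-of-the-envelope estimate of how a "low" false-positive rate behaves
# at EU messaging scale. Every number below is an illustrative assumption.

daily_images = 5_000_000_000      # assumed daily volume of scanned images/videos
false_positive_rate = 0.001       # assumed 0.1% FPR, optimistic for unknown-CSAM ML
prevalence = 1e-6                 # assumed fraction of traffic that is actually CSAM
true_positive_rate = 0.9          # assumed detection rate on genuine material

actual_positives = daily_images * prevalence
false_alarms = (daily_images - actual_positives) * false_positive_rate
true_hits = actual_positives * true_positive_rate

# Positive predictive value: the chance that a flagged item is genuine.
ppv = true_hits / (true_hits + false_alarms)

print(f"Flagged items per day: {false_alarms + true_hits:,.0f}")
print(f"Of which false alarms: {false_alarms:,.0f}")
print(f"Chance a flag is real: {ppv:.2%}")
```

Under these assumptions, roughly five million items would be flagged every day, and well under one in a thousand flags would point to genuine material, which is exactly the kind of overload and wrongful implication the researchers describe.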
The proposal's reliance on client-side scanning is particularly controversial. Even with user consent mechanisms and the promise of EU-vetted detection technologies, the scientific consensus is that such systems cannot be built securely or reliably.
Chat Control and the political landscape
The draft regulation, currently backed by 14 EU member states including Denmark, France, and Spain, proposes sweeping mandates, such as empowering authorities to compel services to scan all user-uploaded visual content for known and unknown CSAM using machine learning and AI models.
Providers must assess and mitigate risks of CSAM dissemination and implement technical controls such as age verification. Though the proposal claims not to “weaken encryption,” it requires scanning prior to encryption, directly contravening E2EE principles.
Despite language that aims to reassure providers that E2EE can coexist with detection, the regulation explicitly permits mandated client-side scanning. Scientists argue this turns every device into a surveillance node, undermining the very concept of secure communication.
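To make concrete what "scanning prior to encryption" means in practice, here is a deliberately simplified Python sketch of the message flow the researchers object to. The blocklist, detector, and reporting hooks are placeholders invented for illustration, not any provider's implementation or the regulation's actual technical design.

```python
import hashlib
from typing import Callable

# Deliberately simplified sketch of client-side scanning. The detector,
# blocklist, and reporting hooks below are placeholders for illustration only.

BLOCKLIST = {  # stand-in for a vetted database of known-content digests
    hashlib.sha256(b"known illegal sample").hexdigest(),
}

def detector(plaintext: bytes) -> bool:
    """Placeholder 'detection technology': exact-hash lookup. Real proposals
    envisage perceptual hashing and ML classifiers, which is where the
    false-positive and evasion problems discussed above arise."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send_message(plaintext: bytes,
                 encrypt: Callable[[bytes], bytes],
                 report: Callable[[bytes], None],
                 transmit: Callable[[bytes], None]) -> None:
    # The crux: the plaintext is inspected *before* it is encrypted, so the
    # end-to-end encryption applied afterwards no longer covers what the
    # scanner (and whoever controls its database) gets to see.
    if detector(plaintext):
        report(plaintext)          # forwarded outside the E2EE channel
    transmit(encrypt(plaintext))   # only now does E2EE apply
```

The design choice the scientists single out is visible in the last three lines: the scan sits on the device, outside the encrypted envelope, which is why they describe it as turning every device into a surveillance node regardless of how the encryption itself is implemented.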
Opposition has been growing, with countries like Austria, Belgium, the Netherlands, and Finland holding firm against the regulation. Germany, under a new government since May 2025, has yet to take a clear position. Its vote is critical to forming a blocking minority needed to defeat the legislation.
Meanwhile, advocacy groups such as the Pirate Party and privacy-focused companies like Tuta (formerly Tutanota) have been vocally critical. Tuta warned that the law would force them to either sue the EU or cease operations in Europe, as implementing client-side scanning would violate their commitment to privacy and E2EE.
Adding to the controversy, investigative reports have revealed that tech lobbyists and affiliated NGOs, some backed by the Oak Foundation and WeProtect, have spent tens of millions of dollars promoting Chat Control since 2019, raising concerns about the influence of AI-industry and law enforcement interests on EU policy.