Will Europe scan our chats under the guise of child sex abuse?
Let us give a straight answer: no.
Looking at power balances, it is unlikely that the European Union will approve the controversial regulation against child sexual abuse that would open the door to chat control.
Even if it did approve it, the power to ‘control chats’ would not be handed to Ursula von der Leyen, nor in theory even to national governments, but only to national judiciaries.
In fact, the need to discuss it in Brussels is what prevents many governments (including Italy’s) from proceeding on their own.
Having said all this, the question arises: how is it possible that in democratic Europe there are those who want a universal surveillance mechanism like those of Russia and China?
What protects our chats
The term ‘end-to-end encryption’ (E2EE) refers to a system in which only the sender and the receiver can access the content, which remains unreadable to outsiders and even to the service providers themselves (messaging companies such as WhatsApp and Signal, and data-storage companies such as Apple and Google).
It is called end-to-end precisely because it protects data ‘endpoint-to-endpoint’, i.e. from the sender’s device to the receiver’s device.
This technology ensures the highest level of security by preventing encryption keys from being stored on servers that are potentially susceptible to compromise.
The downside is that the provider cannot provide the data in the clear (e.g. to collaborate in paedophile or terrorism investigations) as it does not possess the keys, which are stored exclusively on the users’ devices.
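The idea can be sketched with a toy Diffie-Hellman key agreement in Python. The parameters below are textbook values, far too small for real use, and real messengers rely on far more elaborate machinery such as the Signal protocol; the point is only that the two endpoints derive the same secret key while the server relaying their public values never learns it.

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement (textbook parameters p=23, g=5,
# illustration only, never use such small values in practice).
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's private key, never leaves her device
b = secrets.randbelow(p - 2) + 1   # Bob's private key, never leaves his device

A = pow(g, a, p)                   # public values: these are all
B = pow(g, b, p)                   # the relaying server ever sees

# Each endpoint combines its own secret with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # the same secret on both endpoints

# Derive a symmetric session key from the shared secret.
session_key = hashlib.sha256(str(shared_alice).encode()).digest()
print("endpoints agree:", shared_alice == shared_bob)
```

The server can forward `A` and `B` all day long: without `a` or `b`, which are stored exclusively on the devices, it cannot reconstruct the session key.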
Well: the regulation being debated in Europe, and which perhaps without Europe would already be in force, risks precisely undermining end-to-end encryption, which is the cornerstone of our digital security.
The road to hell is paved with good intentions
The proposal for a ‘Child Sexual Abuse Regulation’, presented by the European Commission in May 2022, aims to require providers to detect, report and remove child pornography, as well as to prevent crimes such as grooming.
The proposal also includes increased support for victims.
The text affirms the principle of technological neutrality, stating that no technology would be favoured or excluded, as long as it complies with the requirements.
In itself, the use of end-to-end encryption would not be banned; on the contrary, it would be recognised as an essential tool for confidentiality of communications.
However, a court would have the power to issue a detection order (an order to detect illegal content).
In that case, service providers would have to adopt suitable tools to detect that content in their users’ chats or cloud storage.
On paper, the chosen technologies should neither compromise confidentiality nor lend themselves to other uses. To support this approach, an ‘EU Centre for Combating Online Sexual Abuse’ envisaged by the proposal would provide free detection technologies that formally comply with European data-protection law.
The dangers of chat control
The proposal, however, immediately raised strong controversy. The ‘chat control’ measures, which would authorise the scanning of private communications, have been described as disproportionate and detrimental to citizens’ fundamental rights.
Numerous experts and associations have pointed out that the security of online communications is based on end-to-end encryption.
Mandatory scanning would in fact lead to content being monitored directly on devices, turning smartphones and computers into tools for the authorities’ preventive surveillance of citizens.
In addition, the creation of backdoors, i.e. exceptions to encryption that make chats readable, would introduce weaknesses that cybercriminals and hostile powers could exploit.
The proposed detection techniques also raise concerns.
Tools such as fingerprinting, hashing and artificial-intelligence models, once applied to billions of messages, would inevitably generate both false positives and false negatives.
The consequences would be severe: innocent users would be wrongly flagged as paedophiles, while actual offenders slipped through undetected.
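A toy Python sketch shows why hash-based detection cuts both ways (the ‘blocklist’ content and the one-byte fingerprint below are invented for illustration, and real systems use far more sophisticated perceptual hashes): an exact hash is trivially evaded by changing a single byte, while a coarser fingerprint that survives small changes starts colliding with innocent content.

```python
import hashlib

# Hypothetical blocklist of exact SHA-256 hashes of known illegal files.
known_bad = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def exact_match(content: bytes) -> bool:
    return hashlib.sha256(content).hexdigest() in known_bad

# Exact hashing: flipping even one byte evades detection (false negative).
assert exact_match(b"known-bad-image-bytes")
assert not exact_match(b"known-bad-image-byteZ")

# A deliberately coarse "fingerprint" (a toy 8-bit byte-sum digest) tolerates
# small changes, but inevitably collides with unrelated content.
def coarse_fingerprint(content: bytes) -> int:
    return sum(content) % 256

bad_fp = coarse_fingerprint(b"known-bad-image-bytes")
innocent = b"ZZQ"  # an unrelated string whose toy fingerprint happens to collide
print("false positive:", coarse_fingerprint(innocent) == bad_fp)
```

Real perceptual-hash systems sit somewhere between these two extremes, which is exactly why, at the scale of billions of messages, some rate of both error types is unavoidable.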
The risk of mass surveillance
Underlying the concerns is client-side scanning, i.e. the use of systems that scan content directly on the device before it is encrypted and sent.
This approach would entail the mandatory integration (foreseen in Article 7 of the regulation draft) of automated mechanisms on users’ endpoints.
This means that messages would be checked before being protected by encryption, opening the way to generalised surveillance, reduced privacy guarantees and potential censorship.
Indeed, such a system, once installed, could be used not only to search for child pornography content, but to block or report any image or text included by the authorities in an opaque database that cannot be verified by users.
In practice, client-side scanning would transform tools born to protect privacy into generalised surveillance systems.
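The flow can be sketched in a few lines of Python (all names, the stand-in cipher and the blocklist entries are hypothetical): the check runs on the device before encryption, so whatever the opaque blocklist contains, child abuse material or anything else, is matched against every message in the clear.

```python
import hashlib
from typing import Callable, Optional

# Hypothetical opaque blocklist pushed to the device; users cannot inspect
# what it contains, so any content could be listed alongside abuse material.
OPAQUE_BLOCKLIST = {
    hashlib.sha256(b"target-image").hexdigest(),
    hashlib.sha256(b"protest-flyer.pdf").hexdigest(),  # nothing prevents this
}

def report_to_authority(digest: str) -> None:
    print("flagged:", digest[:12])

def send_message(plaintext: bytes,
                 encrypt: Callable[[bytes], bytes]) -> Optional[bytes]:
    # The scan happens BEFORE encryption, on the user's own device.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in OPAQUE_BLOCKLIST:
        report_to_authority(digest)
        return None                 # message blocked and reported
    return encrypt(plaintext)       # E2EE only protects what passes the scan

# With a trivial stand-in 'encrypt', the scanner still saw the plaintext:
ciphertext = send_message(b"protest-flyer.pdf", encrypt=lambda m: m[::-1])
print("blocked" if ciphertext is None else "sent")
```

End-to-end encryption is left formally intact, yet it no longer guarantees anything: the surveillance simply moves to the endpoint, upstream of the encryption.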
The legal precedent: the Podchasov v. Russia ruling
Ironically, only a year ago European case law produced a landmark ruling against precisely these systems, in a case that concerned Russia: the judgment of the European Court of Human Rights in Podchasov v. Russia (March 2024).
The Russian authorities had asked Telegram to provide decryption tools to allow the FSB access to the messages of suspected terrorists.
Telegram had refused, arguing that a backdoor would compromise the security of all users. The court agreed with the company.
Referring to Article 8 of the European Convention on Human Rights, the judges ruled that generalised access to communications constitutes indiscriminate surveillance. Such a practice undermines the very essence of the right to privacy and threatens freedom of expression.
A tough fight
It is no coincidence that, despite the fact that the Council of the European Union (the body in which the representatives of the 27 governments sit) had started the technical examination of the anti-pedophilia regulation as early as 2022, a final vote was only scheduled for 14 October.
At the moment, as many as 15 governments, including Italy’s, are reportedly in favour.
Germany, however, together with other central European countries that still remember mass surveillance under communist regimes, is forming a ‘blocking minority’. Moreover, it seems that France, under the new Lecornu government, is having second thoughts.
On such a sensitive matter, moreover, the Council must reach an agreement with Parliament, which already came out against chat control in 2023.
It is, in short, very unlikely that the regulation will pass.
Europe is the last dam
It is not true, therefore, that ‘Europe wants to control our chats’.
For the Spaniards or the Italians, in fact, the opposite is true: if the matter were not a European competence, their national authorities would already be scanning their chats.
The proof is what is happening in the United Kingdom: since leaving the EU, it has been trying on its own to force digital companies to impose backdoors.
The attempt so far seems to have foundered in the face of Apple’s strenuous opposition.
The policy remains torn between two opposing needs: protecting minors and safeguarding privacy. But the Podchasov ruling and security analyses show how the chat control approach risks being disproportionate and counterproductive: instead of providing greater protection, it could make European citizens more vulnerable.
Let us hope for a beacon of common sense.