Chat controls: The EU plans to abolish data protection for digital communications | Panda Anku

Largely unnoticed by the public, the EU Commission is working in coordination with the EU governments to abolish regulations protecting privacy in digital communication. Under the guise of combating child abuse on the Internet, a regulation is to be agreed that will oblige all network operators, if ordered by the authorities, to search all communication content for depictions of child abuse or for grooming attempts by adults.

Protest against chat controls in front of the European Commission office in Berlin in May 2022 [Photo by CC0 by Jakob Rieger]

As reported, such an order could even extend to searching a user’s devices for depictions of child abuse. With these so-called “client-side scans” (CSS), all outgoing messages are checked on the sender’s device to determine whether they contain images of child abuse. If a supposed hit is triggered, an automatic alert is sent to the control body, which can then involve the relevant authorities in the case.
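The scanning mechanism described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the proposed protocol: the function names, the use of SHA-256 (real systems use perceptual hashes such as Apple's NeuralHash), and the placeholder encryption are all assumptions made for clarity. The key point it demonstrates is that the check runs on the sender's device before any encryption takes place.

```python
import hashlib

# Hypothetical blocklist of known-image digests distributed to the device.
# Real deployments would use perceptual hashes rather than exact
# cryptographic hashes; SHA-256 stands in here for simplicity.
BLOCKLIST = {
    hashlib.sha256(b"known-abuse-image-bytes").hexdigest(),
}


def encrypt(data: bytes) -> bytes:
    """Placeholder for real end-to-end encryption."""
    return bytes(b ^ 0x5A for b in data)


def client_side_scan(attachment: bytes) -> bool:
    """Return True if the outgoing attachment matches the blocklist."""
    return hashlib.sha256(attachment).hexdigest() in BLOCKLIST


def send_message(attachment: bytes) -> str:
    # The scan runs BEFORE encryption, on the sender's device --
    # this is what undermines the end-to-end confidentiality guarantee.
    if client_side_scan(attachment):
        return "reported"  # automatic alert to the control body
    ciphertext = encrypt(attachment)  # encryption only happens after the scan
    return "sent"
```

Because the plaintext is inspected before `encrypt` is ever called, the confidentiality promise of end-to-end encryption no longer applies to the scanned content, regardless of how strong the encryption itself is.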

Client-side scanning breaks the confidentiality of digital communications normally guaranteed by end-to-end encryption, since messages are checked before encryption, regardless of any reasonable suspicion. Apple planned a similar approach for its devices, against protests from recognized IT security researchers, who concluded in a study that CSS poses a threat to privacy, IT security, freedom of expression and democracy as a whole.

The current bill does not prescribe any definitive technical procedures that would be part of a mandatory protocol. The ordering of such procedures will be left to a body yet to be set up, which is to be integrated into Europol and which can impose orders on individual providers. This creates a Europe-wide police surveillance center with far-reaching powers to lift the confidentiality of communications.

Civil rights organizations are extremely critical of the current draft. The chat controls are incompatible with European fundamental rights; they neutralize the effectiveness of end-to-end encryption, rendering it obsolete and thus directly calling the anonymous use of the Internet into question. Child protection organizations such as the German Child Protection Association also consider the measures to be too far-reaching.

Regardless of the specific enforcement of these chat controls, they constitute a serious violation of human rights. As with other legislative efforts, the stated goal of limiting the dissemination of child abuse images is a smokescreen. Rather, it is about building a surveillance apparatus with far-reaching censorship powers, which can then be expanded at will.

The expansion of censorship instruments, once created, is nothing new at EU level. The “upload filters” introduced as part of the EU copyright reform were originally planned as part of the TERREG regulation to filter out terrorist content, and failed there only because of public resistance.

Claiming that chat controls are necessary to combat child abuse facilitates approval of this dangerous technology, which could not otherwise be adopted.
