Big Brother EU: The CSAR Encryption Joke

Ah, the joy of the sunny days. While everyone scrambles around figuring out the SPF value they need to keep those sunburns at bay, the European Union is playing another round of ‘Big Brother.’

Last year, the European Commission proposed a new regulation, known as the ‘Regulation to Prevent and Combat Child Sexual Abuse’, or CSAR for short.

The proposed regulation aims to establish a framework within the EU to combat child sexual abuse, both offline and online. However, it’s the online aspect that has some people, including me, a bit worried. Let’s look at what this new proposal entails.

What is CSAR?

CSAR is a comprehensive legislative proposal by the European Union aiming to combat online child sexual abuse. The regulation addresses a significant problem: child sexual abuse affects at least one in five children, and the advent of online platforms has further facilitated its spread, making it all the more urgent to act.

The proposal seeks to establish a legal framework for prevention, investigation, and victim support. It stresses the vital role of online service providers in maintaining a safe online environment. It also highlights the need for a harmonized approach across the EU to combat online child sexual abuse effectively and emphasizes the need to balance the rights of child victims with the rights of other users and service providers.

One of the notable suggestions in the proposal is to establish a European Centre to prevent and counter child sexual abuse. This body will maintain databases, facilitate cooperation, assist national authorities, support victims, and promote prevention efforts.

The Key Components of CSAR

CSAR proposes a myriad of measures to combat online child sexual abuse. These include providing detection technologies to service providers, mandating human oversight of these technologies, and setting clear reporting obligations. The regulation also underscores the need for swift removal of identified Child Sexual Abuse Material (CSAM) and applies to hosting services even if they don’t directly offer their services in the EU.

Further, the proposal emphasizes victims’ rights and support, requiring that data related to potential online child sexual abuse be preserved only as long as necessary. Service providers must designate a single point of contact and, if offering services in the EU, a legal representative within the Union. Lastly, providers acting in good faith and in strict adherence to the regulation are exempted from criminal liability.

The Trouble with Encryption

Now, my problem with the proposal isn’t that the EU wants to act against the sexual abuse of children; that goal is, on the contrary, commendable. It’s their methodology I take issue with, specifically the idea of potentially breaking end-to-end encryption.

The CSAR proposal recognizes that certain technologies, like encryption, can make it challenging to detect CSAM. The proposal suggests that detection orders should apply regardless of whether the content is encrypted, indicating an expectation for service providers to detect CSAM, even in encrypted communications.

But here’s the catch: If someone or something can detect the content of a message, then end-to-end encryption isn’t truly end-to-end anymore, is it? It’s like having a secret language that only two people understand, except there’s a third party that can decipher it whenever they want. Doesn’t sound very secret now, does it?
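To make the point concrete, here is a minimal sketch in Python, using a toy XOR one-time-pad cipher (real messengers use proper protocols like the Signal double ratchet; the toy cipher and the variable names are mine, not anything from the CSAR proposal). The key exists only at the two endpoints, so a relay server holding just the ciphertext has nothing to scan:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy one-time-pad: XOR each byte with the shared key.
    XOR is its own inverse, so this both encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
# Shared key known only to the two endpoints (never sent to the server).
shared_key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(shared_key, message)

# This is all the relay server ever sees -- without the key it cannot
# inspect the content, so any "detection order" on this traffic implies
# either handing over keys or scanning the message BEFORE encryption.
assert ciphertext != message

# Only the other endpoint, holding the key, recovers the plaintext.
assert xor_cipher(shared_key, ciphertext) == message
```

That last comment is the crux: the only ways to detect content in such a system are to weaken the encryption itself or to scan on the device before the message is encrypted, and either option means the encryption is no longer end-to-end in any meaningful sense.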

As we speak, the EU Council is continuing to debate a law that would require communication providers to scan all communications, potentially including end-to-end encrypted conversations. They’re even contemplating including audio conversations in this.

What concerns me is that some of the world’s largest democracies are discussing how to break encryption to surveil us all. And most people don’t seem to be paying any attention to it. What a time to be alive.

Glow Agent
