EU law would require Big Tech to do more to combat child sexual abuse, but a key question remains: How?

The European Commission recently proposed regulations to protect children by requiring tech companies to scan the content in their systems for child sexual abuse material. This is an extraordinarily wide-reaching and ambitious effort that would have broad implications beyond the European Union’s borders, including in the U.S.

Unfortunately, the proposed regulations are, for the most part, technologically infeasible. To the extent that they could work, they would require breaking end-to-end encryption, which would make it possible for the technology companies – and potentially the government and hackers – to see private communications.

The regulations, proposed on May 11, 2022, would impose several obligations on tech companies that host content or provide communication services – social media platforms, texting services and direct messaging apps among them – requiring them to detect certain categories of images and text.

Under the proposal, these companies would be required to detect previously identified child sexual abuse material, new child sexual abuse material, and solicitations of children for sexual purposes. Companies would be required to report detected content to the EU Centre, a centralized coordinating entity that the proposed regulations would establish.

Each of these categories presents its own challenges, which combine to make the proposed regulations impossible to implement as a package. The trade-off between protecting children and protecting user privacy underscores how combating online child sexual abuse is a “wicked problem.” This puts technology companies in a difficult position: required to comply with regulations that serve a laudable goal but without the means to do so.

Digital fingerprints

Researchers have known how to detect previously identified child sexual abuse material for over a decade. This method, first developed by Microsoft and known as PhotoDNA, assigns a “hash value” – a sort of digital fingerprint – to an image, which can then be compared against a database of previously identified and hashed child sexual abuse material. In the U.S., the National Center for Missing and Exploited Children manages several databases of hash values, and some tech companies maintain their own hash sets.

The hash values for images uploaded or shared using a company’s services are compared with these databases to detect previously identified child sexual abuse material. This method has proved extremely accurate, reliable and fast, which is critical to making any technical solution scalable.
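
As a rough illustration, the lookup can be sketched in a few lines of Python. Everything here is hypothetical – the hash set, its placeholder entry and the function names – and the sketch uses a plain cryptographic hash for simplicity, whereas production systems such as PhotoDNA use perceptual hashes that still match after an image is resized or re-encoded.

    # Minimal sketch of hash-based matching against a database of
    # previously identified material. Illustrative only: real systems
    # use perceptual hashes rather than SHA-256, which matches only
    # byte-identical files.
    import hashlib
    from pathlib import Path

    # Hypothetical database of known hash values; the entry below is
    # just the SHA-256 digest of the string "test", as a placeholder.
    KNOWN_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def fingerprint(path: Path) -> str:
        """Return a hex digest of the file's raw bytes."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def is_known(path: Path) -> bool:
        """True if the file's fingerprint appears in the hash database."""
        return fingerprint(path) in KNOWN_HASHES

The check itself is a constant-time set lookup, which is part of why the approach stays fast even at the scale of a large platform.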

The problem is that many privacy advocates consider it incompatible with end-to-end encryption, which, strictly construed, means that only the sender and the intended recipient can view the content. Because the proposed EU regulations mandate that tech companies report any detected child sexual abuse material to the EU Centre, compliance would break end-to-end encryption, forcing a trade-off between effective detection of the harmful material and user privacy.
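
A toy example makes the conflict concrete. In the sketch below (which uses the third-party cryptography package purely for illustration), the provider’s servers only ever see ciphertext, and the hash of that ciphertext reveals nothing about the hash of the image inside it:

    # Why server-side hash matching fails under end-to-end encryption:
    # the provider can hash only the ciphertext, and that hash bears no
    # relation to the hash of the underlying image.
    import hashlib
    from cryptography.fernet import Fernet

    image_bytes = b"...raw image bytes..."
    plaintext_hash = hashlib.sha256(image_bytes).hexdigest()

    key = Fernet.generate_key()  # held only by sender and recipient
    ciphertext = Fernet(key).encrypt(image_bytes)
    ciphertext_hash = hashlib.sha256(ciphertext).hexdigest()

    # The server can compute ciphertext_hash, but without the key it has
    # no way to recover plaintext_hash and compare it against a database.
    assert plaintext_hash != ciphertext_hash

Detection therefore has to happen either on the user’s device before encryption – so-called client-side scanning – or by giving the provider access to the plaintext, and either option weakens the guarantee that only the sender and recipient can view the content.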
