An online-safety expert argues that end-to-end encryption endangers children
Messaging platforms can and should balance privacy and welfare, says John Carr
ENCRYPTION TOOLS have been around for millennia. Julius Caesar used one form in his dispatches to his field commanders. Mary Queen of Scots used another to plot the overthrow of Queen Elizabeth I. And now some of the world’s largest tech companies are integrating end-to-end encryption into their mass-messaging systems.
End-to-end encryption allows people to send messages that can be read only by the sender and the receiver. This technology is a many-splendoured thing. But the problem I have with encryption is not a generalised one. It is the role it plays in enabling crimes against children, both online and offline.
Currently, information that tech companies give to law enforcement contributes to thousands of arrests of suspected child-sex offenders each month, and protects an even larger number of children from sexual abuse. America’s National Center for Missing and Exploited Children (NCMEC), which distributes information to law-enforcement agencies worldwide on behalf of a group of national child-safety bodies, received 32m referrals for child sexual abuse last year. One company—Meta—accounted for 85% of these.
Despite this, Meta confirmed earlier this year that it will continue with its “pivot to privacy”. This involves rolling out end-to-end encryption across two of its largest communication platforms, Facebook Messenger and Instagram Direct (WhatsApp, which is also owned by Meta, is already end-to-end encrypted). The design of these platforms makes them vulnerable to exploitation by people looking to groom and sexually abuse children. Authentication of a new user’s actual identity, in particular their age, is weak. If end-to-end encryption is introduced without appropriate safeguards, it will become a lot easier to commit and hide terrible crimes against minors.
Meta says its roll-out of end-to-end encryption will involve “robust safety measures” and that it will “continue providing reports to law enforcement”. But the company is blinding itself to the evidence. End-to-end encryption prevents the screening of data while it’s transferred between devices, and so will make it harder for Meta to identify child sexual abuse on its platforms. The NCMEC estimates that 70% of Meta’s reports—or around 14m incidents of child sexual abuse—could go undetected every year if the company rolls out end-to-end encryption without safety measures in place. The police cannot investigate perpetrators or rescue children if they don’t have the reports in the first place.
Proponents of end-to-end encryption tout its privacy credentials, since messages cannot be intercepted by third parties. Many people believe encryption will help them avoid persecution by totalitarian regimes, but I fear that is true only of incompetent ones: under the most advanced surveillance, people can be identified from the metadata associated with their messages. Even if that were not the case, how can you tell parents in Toronto that you cannot do the maximum possible to protect their child because of what a tyrant might do with the same technology in Tehran?
It is entirely possible to deploy tools alongside end-to-end encryption that allow patterns of child sexual abuse to be detected with a very high degree of accuracy. More than 99% of reports received each year by the NCMEC are generated by automated tools. The best-known program, PhotoDNA, has been used at various points by Google, Facebook, Adobe, Reddit and X (formerly Twitter) since 2009.
Through its Safety Tech Challenge Fund, the British government supported the development of proof-of-concept tools capable of detecting abusive material within encrypted environments. These projects showed that it is technically feasible to detect abuse on encrypted services while maintaining user privacy. Client-side scanning is a good example. It scans messages on the sender’s device, in milliseconds, before they are encrypted, allowing child-sex-abuse material to be detected and addressed.
Such a tool does one or more of only three things: flag abusive material that is already known; flag images that are likely to feature child sex abuse; and flag behaviour that is likely to indicate a child is being groomed for a sexual purpose. In the case of the latter two, if a human moderator checks the flag and concludes it was a false alarm, that is the end of it. These tools do not collect, store, process or in any meaningful way “see” anything else. No record is made, so no record can be kept. No investigation ensues. Nobody’s time is wasted. Nobody’s reputation is affected.
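For the technically minded, here is a minimal sketch of the idea in Python. Everything in it is illustrative rather than a description of any deployed system: the hash list and the encrypt_and_send and report hooks are hypothetical stand-ins, and real tools such as PhotoDNA use perceptual hashes, which survive resizing and re-compression, rather than the exact cryptographic hash used here for simplicity.

```python
import hashlib

# Hypothetical set of hashes of known child-sex-abuse images, as might be
# supplied by a clearing house. Real deployments use perceptual hashes
# (e.g. PhotoDNA); SHA-256 is used here only to keep the sketch
# self-contained and runnable.
KNOWN_HASHES: set[str] = set()

def matches_known_material(image_bytes: bytes) -> bool:
    # Exact-match lookup against the known-hash set.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def send_image(image_bytes: bytes, encrypt_and_send, report) -> None:
    # Scan on the sender's device, before encryption. A match is escalated
    # for human review via the (hypothetical) report hook; anything else is
    # encrypted and sent as normal, and no record of it is made anywhere.
    if matches_known_material(image_bytes):
        report(image_bytes)
    else:
        encrypt_and_send(image_bytes)
```

The design choice the sketch illustrates is the one made in the text: a clean message never leaves the device unencrypted and never generates a record.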
Why won’t Meta introduce these technologies across Facebook Messenger and Instagram Direct Messages? The company will enjoy substantial business advantages from making the “pivot to privacy”, most obviously reduced overhead costs. In 2021 Meta spent $5bn and employed 40,000 people to work on “safety and security”. A large part of that would have been taken up by moderators, but encryption hugely reduces the need for such staff: you cannot moderate what you cannot see. The potential for Meta to be exposed to bad publicity over how much criminal content its systems pick up will also decline.
Unlike for Meta, there are no advantages for children. Quite the opposite. So the company must think again. It is not either/or: it’s possible to keep the contents of messages hidden from prying eyes and protect kids at the same time. So why not?■
John Carr is secretary of the Children’s Charities’ Coalition on Internet Safety. He is supporting Britain’s Home Office with its campaign to work with tech companies to ensure their platforms are safe for children.
From the December 9th 2023 edition