WhatsApp, Signal and five other messaging services have joined forces to attack the government's Online Safety Bill. They fear the bill will kill end-to-end encryption, warning in an open letter that it could open the door to 'routine, general and indiscriminate surveillance of personal messages'. The stakes are high: WhatsApp and Signal are threatening to leave the UK market if encryption is undermined. The intervention comes as the House of Lords begins its line-by-line committee-stage scrutiny of the bill today.

Encryption provides a defence against fraud and scams; it allows us to communicate with friends and family safely; it enables human rights activists to send incriminating information to journalists. Governments and politicians even use it to keep their secrets from malicious foreign actors (and their colleagues). Encryption should not be thrown away in a panic.

The government has responded to these concerns by declaring that the bill 'in no way represents a ban on end-to-end encryption'. This is technically true but deceptive. The bill gives Ofcom the power to require services to install tools (so-called 'accredited technology') to scan encrypted communications for child exploitation and terrorism content.

Advocates claim this is possible without undermining encryption: by scanning for certain content on a user's device before messages are encrypted, an approach known as client-side scanning. However, just as one cannot be half pregnant, something cannot be half encrypted. Once a service starts reading messages for any purpose, the entire premise of encryption disappears. A 2021 paper from fifteen computer scientists and security researchers explained that it is 'moot' to talk about encryption 'if the message has already been scanned for targeted content.'
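To see why, consider a minimal sketch of where a mandated scanner would sit in the message path (the function names here are hypothetical, and a toy XOR cipher stands in for real end-to-end encryption). The scan runs on the plaintext before anything is encrypted, so the strength of the cipher is beside the point:

```python
def scan_and_report(plaintext: str) -> bool:
    # Stand-in for an 'accredited technology' scanner: it sees the full
    # plaintext of every message and reports matches to a central authority.
    flagged = "targeted content" in plaintext  # placeholder matching rule
    if flagged:
        print("reporting to central authority:", plaintext)
    return flagged

def encrypt(plaintext: str, key: bytes) -> bytes:
    # Toy XOR cipher standing in for genuine end-to-end encryption
    # (e.g. the Signal protocol). Its strength is irrelevant here.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext.encode()))

def send_message(plaintext: str, key: bytes) -> bytes:
    scan_and_report(plaintext)      # the scanner reads the message first...
    return encrypt(plaintext, key)  # ...encryption happens too late to matter
```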

With respect to child exploitation material, messages could be checked against the PhotoDNA database. But that database only contains hashes of known, historic photos and videos, and it cannot be stored on users' devices. Matching against it therefore means creating a software vulnerability that could be exploited by malicious actors, and sending data back to a central database to check for a match. Alternatively, companies could use machine learning to detect nudity, with flagged images reviewed by the authorities. But that approach has a high failure rate. Just last year, a father lost his Google account and was reported to the police after sending a naked photo of his child to a doctor.
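Here is a rough sketch of how such hash-based checking works, with hypothetical names, and an ordinary SHA-256 standing in for PhotoDNA's proprietary perceptual hash (which, unlike SHA-256, is designed to survive resizing and re-encoding). Because the database stays server-side, every candidate hash has to leave the device, which is the reporting channel described above:

```python
import hashlib

# Server-side database of hashes of known, previously identified images
# (illustrative value only).
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    # SHA-256 only matches byte-identical files; a perceptual hash such as
    # PhotoDNA is built to match visually similar ones as well.
    return hashlib.sha256(image_bytes).hexdigest()

def check_attachment(image_bytes: bytes) -> bool:
    digest = image_hash(image_bytes)
    # In a deployed system this lookup happens on a central server, so the
    # digest must be transmitted off the device before encryption.
    return digest in KNOWN_IMAGE_HASHES
```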

Some contend that privacy should be sacrificed in the fight against child abuse. But there are clearly limits to this logic. Few would consent to the state putting CCTV in every bedroom to crack down on the abuse of children. Yet that is effectively what a technology notice could mean: a CCTV camera in everyone's phone. Ofcom could even require the use of scanning technology without independent oversight (unlike under the Investigatory Powers Act, which at least requires authorities to seek permission from a tribunal and is generally targeted at specific individuals rather than enabling mass surveillance).

Message scanning is also open to serious mission creep. There would be enormous pressure to scan communications for other purposes, from 'disinformation' in the UK to any unsanctioned material in authoritarian countries. This is why platforms, which do not want to create a vulnerability in their products or set a global precedent for their billions of users, really could leave the relatively small UK market because of the bill. The shutdown of WhatsApp in particular would be a political disaster for any government, and not just because ministers and MPs would lose their main communications platform; the millions of people who use it across the country would lose theirs too.

Ironically, one of the services that could be forced to leave the UK is Element, a UK-based start-up that provides secure communications channels for organisations, including the UK government's defence establishment. A British success story no longer able to operate in its home country hardly screams 'Rishi's science and tech superpower'.

Child exploitation is abhorrent. It is well understood that most child sexual abuse happens within the family and, to a lesser extent, at the hands of local figures in a position of trust. There are also ongoing concerns around sexting, sexual extortion and revenge porn. None of this can be resolved through message scanning.

Professor Ross Anderson of the University of Cambridge explains that we cannot expect 'artificial intelligence' to replace police officers, teachers and social workers in child protection. 'The idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman and has lured many a gullible government department on to the rocks,' Anderson writes.

Ultimately, message scanning only serves to distract. It wastes policymakers' and law enforcement's time, does little to protect children, and threatens fundamental freedoms.
Matthew Lesh is the Director of Public Policy and Communications at the Institute of Economic Affairs