They’re still trying to ban cryptography
The sales-office-to-data-localization pipeline for mass surveillance.
Some bad ideas never die. Since the late 1980s, spy agencies and cops have argued that the public should not have access to working cryptography, because this would mean that terrorists, mafiosi, drug dealers and pedophiles would be able to communicate in perfect, unbreakable secrecy.
The problem is that working cryptography protects everyone, not just the Four Horsemen of the Infocalypse: the same cryptographic tools that protect instant messages and “Darknet” sites also protect your communications with your bank, your Zoom therapy session, and the over-the-air updates for your pacemaker and your car’s anti-lock braking system.
Deliberately introducing defects into cryptographic tools — “back doors” for cops and spies — is a deadly proposition, carrying enormous risks. It’s hard enough to keep cryptographic systems continually secure without also having to maintain a “secret” backdoor: a deliberate, catastrophic failure mode that is only supposed to trigger when the right people want it to.
In a digital world, cryptography is all that stands between you (your cameras, thermostat, car, work email, family photos and records, etc) and everyone who might attack you (ransomware creeps, stalkers, identity thieves, corporate spies, crooked cops, foreign governments, etc).
And yet.
Governments around the world continue to periodically insist that working cryptography be outlawed and replaced with deliberately defective tools. All the way back in the early 1990s, the Clinton administration tried — and failed — to do this. Not long after, the Electronic Frontier Foundation won a landmark lawsuit that effectively ended any US attempts to ban cryptography.
But while crypto bans are a mostly dead idea in the USA, they’re alive and well in the UK. While New Labour toyed with these ideas in the early 2000s, the idea really took root under the Conservatives, first with David Cameron’s coalition government, and then with the rapid succession of Tory PMs-for-a-day, each of whom has reliably demanded crypto backdoors, provided their political careers survived long enough to get around to it.
This is such a recurring motif that I actually wrote a standard article explaining why a ban on working cryptography was bound to fail, which I dusted off and republished every time it came up.
The central theme of my explainer isn’t about the risks to all parts of society and the economy from banning working crypto (though I do touch on these); it’s that such a ban is also impossible to accomplish. As I wrote:
For Theresa May’s proposal to work, she will need to stop Britons from installing software that comes from software creators who are out of her jurisdiction. The very best in secure communications are already free/open source projects, maintained by thousands of independent programmers around the world. They are widely available, and thanks to things like cryptographic signing, it is possible to download these packages from any server in the world (not just big ones like Github) and verify, with a very high degree of confidence, that the software you’ve downloaded hasn’t been tampered with.
May is not alone here. The regime she proposes is already in place in countries like Syria, Russia, and Iran (for the record, none of these countries have had much luck with it). There are two means by which authoritarian governments have attempted to restrict the use of secure technology: by network filtering and by technology mandates.
Theresa May has already shown that she believes she can order the nation’s ISPs to block access to certain websites (again, for the record, this hasn’t worked very well). The next step is to order Chinese-style filtering using deep packet inspection, to try and distinguish traffic and block forbidden programs. This is a formidable technical challenge. Intrinsic to core Internet protocols like IPv4/6, TCP and UDP is the potential to “tunnel” one protocol inside another. This makes the project of figuring out whether a given packet is on the white-list or the black-list transcendentally hard, especially if you want to minimise the number of “good” sessions you accidentally blackhole.
More ambitious is a mandate over which code operating systems in the UK are allowed to execute. This is very hard. We do have, in Apple’s iOS platform and various games consoles, a regime where a single company uses countermeasures to ensure that only software it has blessed can run on the devices it sells to us. These companies could, indeed, be compelled (by an act of Parliament) to block secure software. Even there, you’d have to contend with the fact that other EU states and countries like the USA are unlikely to follow suit, and that means that anyone who bought her iPhone in Paris or New York could come to the UK with all their secure software intact and send messages “we cannot read.”
But there is the problem of more open platforms, like GNU/Linux variants, BSD and other unixes, Mac OS X, and all the non-mobile versions of Windows. All of these operating systems are already designed to allow users to execute any code they want to run. The commercial operators — Apple and Microsoft — might conceivably be compelled by Parliament to change their operating systems to block secure software in the future, but that doesn’t do anything to stop people from using all the PCs now in existence to run code that the PM wants to ban.
More difficult is the world of free/open operating systems like GNU/Linux and BSD. These operating systems are the gold standard for servers, and widely used on desktop computers (especially by the engineers and administrators who run the nation’s IT). There is no legal or technical mechanism by which code that is designed to be modified by its users can co-exist with a rule that says that code must treat its users as adversaries and seek to prevent them from running prohibited code.
This, then, is what Theresa May is proposing:
* All Britons’ communications must be easy for criminals, voyeurs and foreign spies to intercept
* Any firms within reach of the UK government must be banned from producing secure software
* All major code repositories, such as Github and Sourceforge, must be blocked
* Search engines must not answer queries about web-pages that carry secure software
* Virtually all academic security work in the UK must cease — security research must only take place in proprietary research environments where there is no onus to publish one’s findings, such as industry R&D and the security services
* All packets in and out of the country, and within the country, must be subject to Chinese-style deep-packet inspection and any packets that appear to originate from secure software must be dropped
* Existing walled gardens (like iOS and games consoles) must be ordered to ban their users from installing secure software
* Anyone visiting the country from abroad must have their smartphones held at the border until they leave
* Proprietary operating system vendors (Microsoft and Apple) must be ordered to redesign their operating systems as walled gardens that only allow users to run software from an app store, which will not sell or give secure software to Britons
* Free/open source operating systems — that power the energy, banking, ecommerce, and infrastructure sectors — must be banned outright
Theresa May will say that she doesn’t want to do any of this. She’ll say that she can implement weaker versions of it — say, only blocking some “notorious” sites that carry secure software. But anything less than the programme above will have no material effect on the ability of criminals to carry on perfectly secret conversations that “we cannot read”. If any commodity PC or jailbroken phone can run any of the world’s most popular communications applications, then “bad guys” will just use them. Jailbreaking an OS isn’t hard. Downloading an app isn’t hard. Stopping people from running code they want to run is — and what’s more, it puts the whole nation — individuals and industry — in terrible jeopardy.
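To make the excerpt’s point about cryptographic signing concrete, here is a minimal sketch, in Python with the widely used `cryptography` library, of how a downloaded package can be checked against a developer’s published Ed25519 public key. The filenames and the key are hypothetical placeholders; real projects publish their public keys and detached signatures alongside their releases.

```python
# Minimal sketch: verify a downloaded package against its developer's
# Ed25519 public key. Filenames are hypothetical placeholders.
from cryptography.hazmat.primitives.serialization import load_pem_public_key
from cryptography.exceptions import InvalidSignature

# The developer's public key, obtained once from a source you trust
# (the project's website, a keyserver, a previous install, etc).
with open("developer_pubkey.pem", "rb") as f:
    public_key = load_pem_public_key(f.read())

# The package and its detached signature, fetched from *any* mirror.
with open("secure-messenger-1.2.3.tar.gz", "rb") as f:
    package = f.read()
with open("secure-messenger-1.2.3.tar.gz.sig", "rb") as f:
    signature = f.read()

try:
    # For an Ed25519 key, verify() raises InvalidSignature on any mismatch.
    public_key.verify(signature, package)
    print("Signature valid: this is the package the developer signed.")
except InvalidSignature:
    print("Signature INVALID: the download was tampered with. Do not install.")
```

Because the check happens on your own machine, against a key you already trust, it doesn’t matter whether the file came from Github, an obscure mirror, or a USB stick passed hand to hand, which is exactly why blocking a few “notorious” download sites accomplishes so little.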
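And to make the excerpt’s point about tunneling one protocol inside another concrete: here is a toy sketch of how an arbitrary payload can be dressed up as ordinary-looking DNS lookups. The carrier domain is a hypothetical placeholder, and real tunneling tools (iodine, dnscat2 and the like) are far more robust, but the principle is the same, and it is why a filter trying to sort “good” packets from “bad” ones has such a miserable job.

```python
# Toy sketch of protocol tunneling: smuggle an arbitrary payload inside
# what looks, to a filter, like ordinary DNS queries. The carrier domain
# is a hypothetical placeholder.
import binascii

def encode_as_dns_queries(payload: bytes, carrier_domain: str = "example.com"):
    """Chop a payload into hex chunks that fit inside DNS labels (63 chars max)."""
    hexed = binascii.hexlify(payload).decode()
    chunks = [hexed[i:i + 60] for i in range(0, len(hexed), 60)]
    # Each "query" is just a hostname lookup under an innocuous domain.
    return [f"{chunk}.{carrier_domain}" for chunk in chunks]

def decode_from_dns_queries(queries):
    """Reassemble the payload on the far side of the firewall."""
    hexed = "".join(q.split(".")[0] for q in queries)
    return binascii.unhexlify(hexed)

secret = b"meet at the usual place, 9pm"
queries = encode_as_dns_queries(secret)
assert decode_from_dns_queries(queries) == secret
print(queries[0])  # prints a hostname that looks like any other DNS lookup
```

A censor can only catch this sort of thing by modeling every protocol that might be carrying every other protocol, while somehow not blackholing the legitimate traffic (VPNs, CDNs, corporate tunnels) that uses the same tricks — the “transcendentally hard” problem the excerpt describes.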
In the mid-2000s, Google decided to enter China, establishing a corporate presence within Chinese borders. Though the company did some engineering in-country, the primary reason for Google’s presence was to have a sales office — people who could maximize Google’s revenue from Chinese advertisers.
Once Google put its people and its money into China, censorship of Google’s Chinese search results became inevitable. China has laws that prohibit certain kinds of content — for example, mentions of the 1989 massacre of student protesters in Tiananmen Square.
Hypothetically, this prohibition applies to every online service in the world, but practically, the Chinese state can only enforce its will where it has something to grab hold of. From then on, the Chinese state could threaten Google’s staff and/or its bank accounts as a means of coercing Google’s participation in censorship.
Call this the “enforcement nexus” — for a government to enforce a law, it needs something to seize. Governments have broad latitude to seize things and people within their territorial borders (though this is not absolute, as I’ll discuss below).
But when it comes to conduct outside a government’s territory, enforcement depends upon the cooperation of another government — this is why so many crime dramas turn on a desperate dash for countries that don’t have extradition treaties.
A government can project enforcement power into any territory that will allow it to seize the people or property of its adversaries. When the Argentinian government defaulted on its bonds, it failed to reckon with the fact that its US dollar holdings were stashed in the US Federal Reserve Bank in New York.
That meant that the vulture capitalists seeking to squeeze Argentina could argue their case in their home court in the USA, seeking a judgment that could be enforced domestically — that is, by seizing the Argentinian government’s assets held on US soil.
Back when Google went into China, the Great Firewall was still nascent, riddled with holes (companies like Cisco had yet to give the Chinese state the technical assistance it needed to plug those gaps).
It took years for China to perfect its national firewall. When the Chinese state hacked Google as part of a project to kidnap and torture dissidents, Google was able to withdraw to the semi-independent territory of Hong Kong and continue to serve search results to Chinese internet users, who would have to do a little bit of technical stuff at their end in order to reach Google’s servers, but nothing too onerous.
Google’s retreat to Hong Kong cost the company something: without a sales force (and also lobbyists) on the ground in China, its ad business suffered. It was harder to reach customers, and harder for those customers to pay Google.
In other words, the choice to go into China was always about revenue maximization, not about the ability to do business. So long as Chinese businesses knew about Google, so long as they could pay Google, and so long as Chinese internet users were visiting Google, Google could do business “in” China from anywhere.
As the Chinese firewall was completed, however, China gained a new enforcement nexus: they could block Google for failing to comply with their censorship rules.
National firewalls are everywhere today. Sometimes, they’re sold as turnkey solutions — by both Chinese and western firms — to poor countries with very little technical capacity of their own. Spy agencies from large, powerful countries love it when poor countries install foreign-made national firewalls, as these are key to “third-party collection” (when a spy agency taps into another spy agency’s files) and “fourth-party collection” (when a spy agency taps into another spy agency that has tapped into another spy agency’s files).
As national firewalls proliferate, so too do enforcement nexuses. After Edward Snowden revealed that US tech giants were allowing US spy agencies to plunder their user data, the EU imposed a (perfectly reasonable) data localization regulation that required US tech companies to keep Europeans’ data on servers within the EU (this regulation remains contentious and fragile).
The EU doesn’t have a regional or national firewall, so tech giants who don’t want to comply with the regulation could simply withdraw their sales offices and engineering departments and lobbyists from the EU and ignore the rule — at least to the extent that they could convince US courts not to enforce EU judgments against them.
But the EU has other enforcement nexuses it could rely upon. It could order European banks and payment processors to block payments to tech firms that ignore the localization rule. Payment processing remains a highly regulated, concentrated industry, and even if, say, Facebook was willing to give up on 520,000,000 European consumers by retreating to the USA, it’s unlikely that Visa and Mastercard would follow suit.
Another important enforcement nexus: App Stores. Apple’s iOS platform blocks third-party app stores outright, while Google’s Android allows for sideloading, but uses contractual shenanigans to prevent alternatives to Google’s core apps from coming pre-installed on Android devices, and requires that users confront a string of sphincter-tighteningly terrifying warnings before they can “sideload” an app.
So long as Google and Apple maintain personnel and cash in reach of EU enforcers, the EU can order the companies to block apps that violate its data-localization laws.
The playbook, then, goes like this: first, require that tech companies maintain an enforcement nexus (personnel and money) within a government’s territory. Then, use national firewalls to block noncompliant companies. If you don’t have a national firewall, use chokepoints — like App Stores or payment processors — to prevent the companies from doing business from beyond the national borders.
Of course, none of this is perfect.
Not every payment processor will toe the line, and the EU has financial links to states with weak financial regulations or weak enforcement, both in the EU (Malta, Cyprus, etc) and beyond its borders (the UK, Azerbaijan, etc).
National firewalls can be circumvented with Tor or a VPN. China has so many enforcement nexuses over Apple — hundreds of millions of customers, as well as Apple’s key manufacturing operations — that it was able to order Apple to remove VPNs from the iOS App Store.
And, of course, money talks. Countries seeking to force compliance on an offshore tech giant may have the technical capacity to block noncompliant firms at the border through a national firewall, or to seize those firms’ assets or personnel, but will still choose not to because they are more interested in the company’s business (or its bribes) than they are in its compliance.
The EU, for example, has conspicuously failed to enforce its General Data Protection Regulation (GDPR) against Google and Facebook, who have never even tried to comply with it (and whose arguments as to why they needn’t do so are so flimsy as to be laughable).
The reasons for this nonenforcement are complex, tied up with the compromises of European federalism. Without getting into too much detail here, I’ll just say that, having captured the government of Ireland (winner of the prize for the EU’s most subservient tax-haven), the tech giants have neutered Ireland’s privacy enforcers while also insisting that all privacy cases against them must be heard in Ireland (this ruse worked for years, but is finally petering out).
It’s sometimes hard to predict what a tech giant will do when pressured by a state to do something it objects to on financial or ethical grounds. Google left China in 2010. Meanwhile, Facebook has ignored Uganda’s orders to censor and block accounts linked to opposition parties — and Uganda has retaliated by blocking Facebook with its national firewall (but Uganda either can’t or won’t block VPNs, so one of the main effects of blocking Facebook is to teach Ugandans how to download, install and use a VPN).
Enter American culture-war nonsense.
In Texas, they want to ban websites that explain how to get an abortion, as well as sites that ship the pills for a medication abortion. In Florida, they want to force bloggers who write about the state government to pay a fee and register with the state, prohibiting anonymous commentary about the state legislature and its actions. Florida has also required that online providers cease permitting their users to display pronouns other than the ones they were assigned at birth. Of course, online services have no way to know what pronouns any of their users were assigned at birth, so sites like Github are complying with Florida law by simply not displaying pronouns to Floridian users.
The biggest barrier to enforcing these laws is the US Constitution, which these laws assuredly violate. It’s entirely possible that a lower court will uphold these laws. It’s conceivable that an appeals court will do so as well. It’s not outside the realm of possibility that the current Supreme Court — illegitimately stacked with far-right partisan hacks lacking any shred of principle — will follow suit.
But it’s far from a sure thing. It’s not even clear whether the legislatures that passed these laws and the governors who signed them want them to be enforced. After all, if these policies do come into force, large numbers of corporations are likely to shutter their offices and move out of state (especially in Florida, a state that is less economically important than Texas).
For these cynical political operators, having their laws overturned by “activist judges” lets them eat their cake and have it too — they don’t have to alienate the business lobby, and they get a steady supply of red meat for their cruel base, driving voter turnout and donations from frightened bigots.
To the extent that governments embrace a rule of law, they take certain enforcement nexuses off the table — and tech businesses know it. When the FBI tried to force Apple to weaken the security of its iOS devices, Apple bet (correctly) that US courts would force the FBI to back off. When the Chinese state demanded that Apple add backdoors to its devices, Apple immediately capitulated. US courts may be awfully deferential to the US government, but Chinese courts are effectively agents of the Chinese state.
It’s not that Apple cares more about its Chinese customers or manufacturing capacity than it values its business in the USA — rather, Apple was betting that America wouldn’t use its enforcement nexus because doing so would violate the US Constitution.
If Florida and Texas — and other fever-swamp red states — do manage to get their censorship laws past the courts, they will need to find an enforcement nexus. Companies like Github could simply move any personnel and offices out of state and restore pronouns to their pages, counting on the federal courts to pre-empt any attempt by Florida to enforce its laws in California.
Texas thinks it has found an enforcement nexus: Internet Service Providers (ISPs). This is actually a pretty reliable enforcement nexus: by definition, ISPs need to have physical capital located within the territories they serve, and they also need to hire or contract with technicians to build and maintain it. What’s more, the ISP sector is massively concentrated, and most Americans are served by two or fewer broadband companies, so the Texas AG’s office wouldn’t have to file hundreds of injunctions to get ISPs to block the websites the legislature wants to censor.
If you want to build a Great Firewall of Texas, ISP monopolies are a feature, not a bug.
Enter Cold War 2.0. The US government — and many state governments — want to ban TikTok on the grounds that the masses of data it collects can be easily harvested by the Chinese state. This is a perfectly reasonable thing to worry about, even if the rationale behind it is muddied by jingoistic nationalism, an obvious desire to protect America’s increasingly ossified and uncompetitive tech businesses, and the fact that keeping Americans corralled inside US giants’ walled gardens makes it easier for US law enforcement and spy agencies to keep tabs on Americans.
Indeed, this is the exact same reasoning the Russian state employed when it passed its data localization laws. Russia says (correctly) that the US government treats US tech giants’ user data as an adjunct to its own intelligence troves. Russia points to the EU’s (entirely justified) concerns about their citizens’ data being stored on US soil, in reach of US spies. And, like the US and EU, they don’t mention that they are quite excited by the prospect of having all their residents’ data kept in country, where there are all the enforcement nexuses you could want.
Unlike Russia, the US doesn’t have a national firewall. Thus far, all the US has done is order federal employees not to install Tiktok on their government-supplied devices. To call this a “hollow, symbolic gesture” is to do a grave disservice to honest, hardworking hollow, symbolic gestures all around the world.
Fed employees can still install Tiktok on their personal devices — and so can their families, who share local networks with them at home, which can be plundered for all kinds of intimate data.
The US government could pass a strong federal privacy law with a private right of action, and then ban Tiktok and its US competitors from gathering data that could be used by anyone’s spies or cops, but such legislation has been effectively stalled for a generation now. I’m going to keep fighting for it — but I’m not going to hold my breath.
Especially not now that the tech giants are such willing and powerful accomplices to red states that have passed forced-birth laws and that rely on mass commercial surveillance to find and arrest women who’ve used Google or Facebook to seek abortions.
Another course of action: the US could order Apple and Google to nuke Tiktok from their App Stores (when it comes to censorship, mobile app store monopolies are also a feature and not a bug). This is what China did during the unprecedented protests against Xi Jinping and his establishment, forcing Apple to push a broken version of AirDrop, its anonymous wireless file-sharing tool, ending its use in organizing dissident protests.
Such a move would be unconstitutional — but again, it’s not clear that the Supreme Court cares about the Constitution except in the most cynical, instrumental way.
And now, finally, we come back to the UK, where, as prophesied, the government is once again promising to ban working cryptography, following in the steps of China, Russia and Iran.
Having left the EU, the UK has some exciting new enforcement nexuses. For one thing, it is unshackled from EU human rights law (though this is a lot less true as of last week, thanks to the new, post-Brexit Northern Ireland deal).
And despite its catastrophic economic deterioration, the UK remains an important market for tech giants, who will be loath to quit the British Isles and conduct their sales from Ireland or France (though here, the UK itself is somewhat constrained, as many politically connected, wealthy people would be furious if their mobile phones and social media stopped working because tech giants had walked away from the country).
As in the US, the UK’s internet services are run by a cartel of giant ISPs who have shown themselves to be biddable partners in censorship campaigns.
One wrinkle, though: one of the fastest-growing, most prominent encrypted messaging services doesn’t come from a tech giant — it comes from Signal, a nonprofit, mission-driven foundation, whose instant messaging tool was the first really secure communications technology to crack the hard problems of usability and penetrate to a mass audience.
What’s more, Signal’s president, the brilliant and principled Meredith Whittaker, has pledged that Signal won’t comply with an order to break its security. As Whittaker told Ars Technica:
We would absolutely exit any country if the choice were between remaining in the country and undermining the strict privacy promises we make to the people who rely on us. The UK is no exception.
It’s a promise that Signal can make good on: they don’t need a UK sales office, because they have nothing to sell. They don’t need a UK bank account, because they have no UK sales office (though they do rely on donations, and if the UK government orders payment processors to block these, it will inflict some pain on the foundation).
The real question isn’t whether Whittaker will make good on her promise. Anyone who knows her (disclosure: she’s a friend of mine) knows that she is deadly serious about this.
The question is whether the App Store duopoly can be forced to de-list Signal for UK users, and whether the ISPs will block Signal’s traffic for people who use Signal over the web.
Note that none of this will stop Signal from being used by the “bad guys” that the UK government claims to be concerned with. The Four Horsemen of the Infocalypse can register their mobile device accounts with mail-drops in Ireland or France (or anywhere).
They can also use VPNs to get around any national web-blocking. The UK government could try to follow the Chinese lead in blocking VPNs, but the country’s financial sector — already reduced to a rump after post-Brexit departures to Frankfurt and Paris — would be incandescent, depending as it does on VPNs to connect trading desks, analysts and markets.
After decades of trying, the attempts of governments to control how the people in their borders use the internet are finally bearing fruit. The toxic mix of corporate greed, mass surveillance, data localization, and monopolization in payment processing, network provision and mobile operating systems has finally put the Great Firewall of China within reach of every government, even beleaguered also-rans like poor, drowning Florida.
Cory Doctorow (craphound.com) is a science fiction author, activist, and blogger. He has a podcast, a newsletter, a Twitter feed, a Mastodon feed, and a Tumblr feed. He was born in Canada, became a British citizen and now lives in Burbank, California. His latest nonfiction book is Chokepoint Capitalism (with Rebecca Giblin), a book about artistic labor markets and excessive buyer power. His latest novel for adults is Attack Surface. His latest short story collection is Radicalized. His latest picture book is Poesy the Monster Slayer. His latest YA novel is Pirate Cinema. His latest graphic novel is In Real Life. His forthcoming books include Red Team Blues, a noir thriller about cryptocurrency, corruption and money-laundering (Tor, 2023); and The Lost Cause, a utopian post-GND novel about truth and reconciliation with white nationalist militias (Tor, 2023).