Privacy first

A powerful principle with a vast constituency.

Cory Doctorow
6 min read · Dec 6, 2023


A hospital room with a hospital bed. The patient in the bed is wearing some kind of red mind-control helmet with a red cord snaking away to a switchplate on the wall. He is grimacing and clutching his sheets. A breakaway wall shows a caricature of Uncle Sam whose legs stick out to suggest a horseshoe magnet. His face has been replaced with the glowing red eye of HAL 9000 from Kubrick's '2001: A Space Odyssey.' Behind him is a 'code waterfall' as seen in the credit sequences of the Wachowskis' 'Matrix' movies.

If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2023/12/06/privacy-first/#but-not-just-privacy

The internet is embroiled in a vicious polycrisis: child safety, surveillance, discrimination, disinformation, polarization, monopoly, journalism collapse — not only have we failed to agree on what to do about these, there’s not even a consensus that all of these are problems.

But in a new whitepaper, my EFF colleagues Corynne McSherry, Mario Trujillo, Cindy Cohn and Thorin Klosowski advance an exciting proposal that slices cleanly through this Gordian knot, which they call “Privacy First”:

https://www.eff.org/wp/privacy-first-better-way-address-online-harms

Here’s the “Privacy First” pitch: whatever is going on with all of the problems of the internet, all of these problems are made worse by commercial surveillance.

  • Worried your kid is being made miserable through targeted ads? No surveillance, no targeting.
  • Worried your uncle was turned into a Qanon by targeted disinformation? No surveillance, no targeting.
  • Worried that racialized people are being targeted for discriminatory hiring or lending by algorithms? No surveillance, no targeting.
  • Worried that AI is being trained on your personal data? No surveillance, no training data.
  • Worried that the news is being killed by monopolists who exploit the advantage conferred by surveillance ads to cream 51% off every ad-dollar? No surveillance, no surveillance ads.
  • Worried that social media giants maintain their monopolies by filling up commercial moats with surveillance data? No surveillance, no surveillance moat.

The fact that commercial surveillance hurts so many groups of people in so many ways is terrible, of course, but it’s also an amazing opportunity. Thus far, the individual constituencies for, say, saving the news or protecting kids have not been sufficient to change the way these big platforms work. But when you add up all the groups whose most urgent cause would be significantly improved by comprehensive federal privacy law, vigorously enforced, you get an unstoppable coalition.

America is decades behind on privacy. The last really big, broadly applicable privacy law we passed was a law banning video-store clerks from leaking your porn-rental habits to the press (Congress was worried about their own rental histories after a Supreme Court nominee’s movie habits were published in the Washington City Paper):

https://en.wikipedia.org/wiki/Video_Privacy_Protection_Act

In the decades since, we’ve gotten laws that poke around the edges of privacy, like HIPAA (health data) and COPPA (data on kids under 13). Both laws are riddled with loopholes and neither is vigorously enforced:

https://pluralistic.net/2023/04/09/how-to-make-a-child-safe-tiktok/

Privacy First starts with the idea of passing a fit-for-purpose, 21st century privacy law with real enforcement teeth (a private right of action, which lets contingency lawyers sue on your behalf for a share of the winnings):

https://www.eff.org/deeplinks/2022/07/americans-deserve-more-current-american-data-privacy-protection-act

Here’s what should be in that law:

  • A ban on surveillance advertising:

https://www.eff.org/deeplinks/2022/03/ban-online-behavioral-advertising

  • Data minimization: a prohibition on collecting or processing your data beyond what is strictly necessary to deliver the service you’re seeking.
  • Strong opt-in: None of the consent-theater click-throughs we suffer through today. If you don’t give informed, voluntary, specific opt-in consent, the service can’t collect your data. Ignoring a cookie pop-up would not count as consent, so you could simply dismiss it and know you won’t be spied on.
  • No preemption. The commercial surveillance industry hates strong state privacy laws like the Illinois biometrics law, and they are hoping that a federal law will pre-empt all those state laws. Federal privacy law should be the floor on privacy nationwide — not the ceiling:

https://www.eff.org/deeplinks/2022/07/federal-preemption-state-privacy-law-hurts-everyone

  • No arbitration. Your right to sue for violations of your privacy shouldn’t be waivable in a clickthrough agreement:

https://www.eff.org/deeplinks/2022/04/stop-forced-arbitration-data-privacy-legislation

  • No “pay for privacy.” Privacy is not a luxury good. Everyone deserves privacy, and the people who can least afford to buy private alternatives are most vulnerable to privacy abuses:

https://www.eff.org/deeplinks/2020/10/why-getting-paid-your-data-bad-deal

  • No tricks. Getting “consent” with confusing UIs and tiny fine print doesn’t count:

https://www.eff.org/deeplinks/2019/02/designing-welcome-mats-invite-user-privacy-0

A Privacy First approach doesn’t merely help all the people harmed by surveillance, it also prevents the collateral damage that today’s leading proposals create. For example, laws requiring services to force their users to prove their age (“to protect the kids”) are a privacy nightmare. They’re also unconstitutional and keep getting struck down.

A better way to make the internet safer for kids is to ban surveillance. A surveillance ban doesn’t carry the foreseeable abuses of a law like KOSA (the Kids Online Safety Act), such as bans on information about trans healthcare, medication abortion, or banned books:

https://www.eff.org/deeplinks/2023/05/kids-online-safety-act-still-huge-danger-our-rights-online

When it comes to the news, banning surveillance advertising would pave the way for a shift to contextual ads (ads based on what you’re looking at, not who you are). That switch would change the balance of power between news organizations and tech platforms — no media company will ever know as much about their readers as Google or Facebook do, but no tech company will ever know as much about a news outlet’s content as the publisher does:

https://www.eff.org/deeplinks/2023/05/save-news-we-must-ban-surveillance-advertising
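
To make that mechanism concrete, here is a minimal, purely illustrative sketch (not from the EFF whitepaper; every function and data name is hypothetical) of the difference: a contextual ad is chosen from the page it will appear on, while a behavioral ad is chosen from a surveillance dossier about the reader.

```python
# Purely illustrative sketch: contextual vs. behavioral ad selection.
# Every name here is hypothetical; this is not any real ad platform's API.

PAGE_KEYWORD_ADS = {
    "camping": "Ad: lightweight tents",
    "recipes": "Ad: cast-iron cookware",
    "laptops": "Ad: laptop sleeves",
}


def contextual_ad(page_text: str) -> str:
    """Pick an ad from what the reader is looking at right now.

    Only the publisher's own content is consulted: no tracking cookies,
    no cross-site profile, nothing about who the reader is.
    """
    words = set(page_text.lower().split())
    for keyword, ad in PAGE_KEYWORD_ADS.items():
        if keyword in words:
            return ad
    return "Ad: untargeted house ad"


def behavioral_ad(tracking_profile: dict) -> str:
    """What surveillance advertising does instead: the ad is picked from a
    dossier about the reader, compiled by following them across the web."""
    interests = tracking_profile.get("inferred_interests", ["unknown"])
    return "Ad targeted at someone interested in " + ", ".join(interests)


if __name__ == "__main__":
    article = "Five camping recipes you can cook over an open fire"
    print(contextual_ad(article))  # driven by the article itself
    print(behavioral_ad({"inferred_interests": ["pregnancy", "debt relief"]}))  # driven by the dossier
```

The asymmetry described above is visible in the sketch: the contextual path needs only the article, which the publisher knows better than anyone, while the behavioral path only works for whoever holds the tracking profile.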

This is a much better approach than the profit-sharing arrangements that are being trialed in Australia, Canada and France (these are sometimes called “News Bargaining Codes” or “Link Taxes”). Funding the news by guaranteeing it a share of Big Tech’s profits makes the news into partisans for that profit — not the Big Tech watchdogs we need them to be. When Torstar, Canada’s largest news publisher, struck a profit-sharing deal with Google, they killed their long-running, excellent investigative “Defanging Big Tech” series.

A privacy law would also protect access to healthcare, especially in the post-Roe era, when Big Tech surveillance data is being used to target people who visit abortion clinics or secure medication abortions. It would end the practice of employers forcing workers to wear health-monitoring gadgets. This is characterized as a “voluntary” way to get a “discount” on health insurance — but in practice, it’s a way of punishing workers who refuse to let their bosses know about their sleep, fertility, and movements.

A privacy law would protect marginalized people from all kinds of digital discrimination, from unfair hiring to unfair lending to unfair renting. The commercial surveillance industry shovels endless quantities of our personal information into the furnaces that fuel these practices. A privacy law shuts off the fuel supply:

https://www.eff.org/deeplinks/2023/04/digital-privacy-legislation-civil-rights-legislation

There are plenty of ways that AI will make our lives worse, but copyright won’t fix them. For issues of labor exploitation (especially of creative workers), the answer lies in labor law:

https://pluralistic.net/2023/10/01/how-the-writers-guild-sunk-ais-ship/

And for many of AI’s other harms, a muscular privacy law would starve AI of some of its most potentially toxic training data:

https://www.businessinsider.com/tech-updated-terms-to-use-customer-data-to-train-ai-2023-9

Meanwhile, if you’re worried about foreign governments targeting Americans — officials, military, or just plain folks — a privacy law would cut off one of their most prolific and damaging sources of information. All those lawmakers trying to ban TikTok because it’s a surveillance tool? What about banning surveillance, instead?

Monopolies and surveillance go together like peanut butter and chocolate. Some of the biggest tech empires were built on mountains of nonconsensually harvested private data — and they use that data to defend their monopolies. Legal privacy guarantees are a necessary precursor to data portability and interoperability:

https://www.eff.org/wp/interoperability-and-privacy

Once we are guaranteed a right to privacy, lawmakers and regulators can order tech giants to tear down their walled gardens, rather than relying on tech companies to (selectively) defend our privacy:

https://pluralistic.net/2022/11/14/luxury-surveillance/#liar-liar

The point here isn’t that privacy fixes all the internet’s woes. The policy is “privacy first,” not “just privacy.” When it comes to making a new, good internet, there’s plenty of room for labor law, civil rights legislation, antitrust, and other legal regimes. But privacy has the biggest constituency, gets us the most bang for the buck, and has the fewest harmful side-effects. It’s a policy we can all agree on, even if we don’t agree on much else. It’s a coalition in potentia that would be unstoppable in reality. Privacy first! Then — everything else!
