Facebook shouldn’t be in charge of how you use Facebook

Unfollow Everything and the need for (good) tech regulation.

Cory Doctorow
9 min read · Oct 8, 2021
[Header image: a mobile phone screen with a small Facebook logo in the top left corner and a large red X over the screen, captioned "Unfollow everything in your Facebook News Feed."]

The Facebook whistleblower’s testimony was refreshing and frightening by turns, revealing the company’s awful internal culture, where product design decisions that benefited its users were sidelined if they were bad for its shareholders.

The big question now is, what do we do about it? The whistleblower, Frances Haugen, rejected the idea that Facebook’s power should be diminished; rather, she argued that it should be harnessed — put under the supervision of a new digital regulator.

Regulating tech is a great idea (assuming the regulations are thoughtful and productive, of course), but even if we can agree on what rules tech should follow, there’s still a huge debate over how the tech sector should be structured.

Like, should we leave monopolies intact so that we only have to keep track of a few companies to make sure they’re following the rules? Or should we smash them up — through breakups, unwinding anticompetitive mergers, and scrutinizing future mergers?

For me, the answer is self-evident: if we don’t make Big Tech weak, we’ll never bring them to heel. Giant companies can extract “monopoly rents” — huge profits — and cartels can agree on how to spend those profits to subvert regulation.

https://www.eff.org/deeplinks/2021/08/starve-beast-monopoly-power-and-political-corruption

We need to fix the internet, not the tech giants. The problem isn’t just that Zuck is really bad at being the unelected pope-emperor of the digital lives of 3,000,000,000 people — it’s that the job of “pope-emperor of 3,000,000,000 people” should be abolished.

I believe that people who rely on digital tools should have the final say in how those tools serve them. That’s the proposition at the core of the “nothing about us without us” movement for accessible tech, and the ethos of Free Software.

Technologists should take every reasonable step to make their products and services suitable for users, and regulators should step in to ban certain design choices: for example, algorithms that result in racial discrimination in housing, finance and beyond.

The law should step in when sites or apps are deceptive or fraudulent or otherwise harmful; people hurt by negligent security and other choices should have remedies in law, both as private individuals and through their law enforcement officials.

But even if we did all that — and to be clear, we don’t — it wouldn’t be enough to deliver technological self-determination, the right to decide how the technology you use works.

For example, when the W3C was standardizing EME (Encrypted Media Extensions) — a shameful incident in which they created a standard for video DRM — there was a lot of work put into accessibility, including safeguarding closed captions and audio description tracks.

But even the most inclusive design process can’t contemplate all of the ways in which users will need to adapt their tools. My friend Jennifer has photosensitive epilepsy and was hospitalized after a strobe effect in a Netflix stream triggered a series of grand mal seizures.

EME could have accommodated that use-case by implementing a lookahead algorithm that checked for upcoming strobes and skipped past them or altered their gamma curves so that they didn’t flash. The committee rejected this proposal, though.
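To make the rejected proposal concrete: a player with a few seconds of decoded video buffered could scan ahead for rapid luminance swings, then either skip the offending span or compress it. Here is a minimal sketch of that idea in TypeScript; the Frame shape, thresholds, and function names are my own illustration, not anything EME or the committee ever specified.

```typescript
// Illustrative sketch only: a lookahead pass over decoded frames that
// detects strobing (rapid luminance swings) and flattens it before display.
// Nothing here is part of EME; names and thresholds are hypothetical.

interface Frame {
  timestampMs: number;
  luminance: number; // average frame luminance, normalized to 0..1
}

const FLASH_DELTA = 0.2; // luminance swing big enough to count as a flash
const MAX_FLASHES_PER_SECOND = 3; // echoes WCAG's three-flash guideline

// Count frame-to-frame luminance swings that exceed the flash threshold.
function countFlashes(frames: Frame[]): number {
  let flashes = 0;
  for (let i = 1; i < frames.length; i++) {
    if (Math.abs(frames[i].luminance - frames[i - 1].luminance) > FLASH_DELTA) {
      flashes++;
    }
  }
  return flashes;
}

// If the lookahead window strobes, pull each frame's luminance toward the
// window's mean ("alter the gamma curve"); a player could instead skip
// past the span entirely.
function dampStrobes(frames: Frame[]): Frame[] {
  if (frames.length < 2) return frames;
  const seconds =
    (frames[frames.length - 1].timestampMs - frames[0].timestampMs) / 1000;
  const flashRate = countFlashes(frames) / Math.max(seconds, 0.001);
  if (flashRate <= MAX_FLASHES_PER_SECOND) return frames; // safe as-is

  const mean = frames.reduce((sum, f) => sum + f.luminance, 0) / frames.length;
  return frames.map((f) => ({
    ...f,
    luminance: mean + (f.luminance - mean) * 0.25, // compress the swings
  }));
}
```

Nothing about this is exotic; it is the kind of guard a video player could run entirely on the user's side, which is exactly what DRM's legal armor forbids toolsmiths from shipping.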

But rejecting the anti-seizure proposal wasn’t all the committee did. They also rejected a proposal to extract a promise from the companies involved in EME’s creation to refrain from threatening toolsmiths who added this feature on their own, either to help themselves or on behalf of other users.

The reason such a promise was necessary is that DRM enjoys special legal protection: distributing a tool that bypasses DRM — even to prevent grand mal seizures — can be prosecuted as a felony under Section 1201 of the DMCA, carrying up to 5 years in prison and a $500,000 fine for a first offense.

The companies making browser DRM said that they didn’t need to promise not to destroy the lives of toolsmiths who added accessibility features to their product because they would add every necessary accessibility feature themselves.

Except they wouldn’t. They blocked an anti-seizure tool, and Dan Kaminsky’s proposal to shift color palettes to compensate for color-blindness, and a proposal for automated captioning tools to bypass DRM so they could ingest videos and run multiple rounds of speech-to-text analysis.

And even if they had accepted all of these measures, it wouldn’t have been enough. No one can anticipate all the ways that people need to adapt their tools. “Nothing about us without us” can’t just mean, “Some people with disabilities helped design this.”

It also has to mean, “I, a person using this tool, get a veto over how it works. When my interests conflict with the manufacturer’s choices, I win. It’s my tool. Nothing about me without me.” That’s the soul of technological self-determination.

Not only is Zuck a bad custodian of 3b lives, but every company is a bad custodian — or at least, an imperfect one — when it comes to its users’ lives. Companies often do good things for their users, but when user interests conflict with shareholder priorities, users lose.

Companies should try to do good things, and we should set minimum standards. But users should also have ways to adapt those tools to suit their own needs: first, because companies can’t anticipate all those needs, and second, because companies have conflicts of interest.

In response to the whistleblower’s testimony, Slate’s Future Tense published an article by Louis Barclay describing his misadventures with a Facebook add-on he created called “Unfollow Everything” — a tool that helped people use Facebook less and enjoy it more.

https://slate.com/technology/2021/10/facebook-unfollow-everything-cease-desist.html

Unfollow Everything unfollowed all the friends, groups and pages you followed on Facebook. This eliminated Facebook’s News Feed entirely, and then users could either manually check in on friends (unfollowing isn’t the same as unfriending) or selectively follow them again.
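Mechanically, a tool like this is plain UI automation: it walks the unfollow controls Facebook already exposes and clicks them on the user’s behalf. A hypothetical content-script sketch follows; the selector and pacing are stand-ins (I haven’t seen Barclay’s source, and Facebook’s markup is obfuscated and changes constantly):

```typescript
// Hypothetical content-script sketch of what a bulk-unfollow tool does:
// find Facebook's own unfollow controls and click them for the user.
// The selector is a placeholder, not Facebook's real markup.

const UNFOLLOW_SELECTOR = '[aria-label="Unfollow"]'; // placeholder

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function unfollowEverything(): Promise<number> {
  let clicked = 0;
  // Re-query on each pass, since the page re-renders as entries are
  // unfollowed; cap the passes so the sketch can't loop forever.
  for (let pass = 0; pass < 50; pass++) {
    const buttons = document.querySelectorAll<HTMLElement>(UNFOLLOW_SELECTOR);
    if (buttons.length === 0) break;
    for (const button of buttons) {
      button.click(); // unfollow, not unfriend: the connection survives
      clicked += 1;
      await sleep(500); // pace the clicks so the page can keep up
    }
  }
  return clicked;
}

unfollowEverything().then((n) => console.log(`Unfollowed ${n} feeds.`));
```

The fragility is the point: because the tool rides on Facebook’s own interface rather than a stable API, Facebook can break it at will — and, as we’ll see, threaten its author with legal ruin besides.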

Users who tried Unfollow Everything really liked it. They found themselves spending less time on Facebook, but enjoying the time they spent there a lot more. Both Barclay and his users felt “addicted” to Facebook, and the tool helped them “control the addiction.”

I’m not a fan of analogizing the habitual use of digital tools to “addiction,” but that’s not the point here — before Unfollow Everything, these users didn’t like Facebook but kept returning to it, and after, they kept using it, and liked it more.

It would be really interesting to understand this phenomenon better. After all, Facebook has spent a lot of money on internal experiments where social scientists and designers collaborated to increase the time users spend on Facebook (AKA “engagement”).

In his spittle-flecked rebuttal to the WSJ’s Facebook Files, Mark Zuckerberg insisted that the point of this research was to increase user satisfaction, and that the extra ad revenue from additional pageviews was just a coincidence.

https://mashable.com/article/mark-zuckerberg-refutal-whistleblower

I don’t think this is true. But it’s a claim we can empirically investigate, and Unfollow Everything would be a great tool for such an investigation. That’s why a group of academics at Switzerland’s University of Neuchâtel sought a collaboration with Barclay for a study.

That study never happened and probably never will, because Facebook had Unfollow Everything deleted from the Chrome store, kicked Barclay off the service and threatened to rain legal hell upon Barclay’s head if he ever made any Facebook tools, ever again.

Like the companies that standardized DRM at the W3C, Facebook says its users’ wellbeing and satisfaction are primary factors in its product design and demands a veto over any modifications so it can deliver the best product to those users.

And like those companies, Facebook actively opposes — and wields its legal might against — toolsmiths who adapt its products and services to reflect the needs of users when those needs conflict with its shareholders’ interests.

Time and again, Facebook demonstrates that it cannot be trusted to wield a veto over how we use Facebook. Think of how the company is attacking Ad Observatory, an academic project that recruits volunteer Facebook users to track paid political disinformation on the platform.

Facebook falsely claimed that Ad Observatory compromised users’ privacy, and directed researchers to its own tools for analyzing content on the platform — tools that are riddled with errors and omissions and flat-out misleading distortions:

https://pluralistic.net/2021/08/06/get-you-coming-and-going/#potemkin-research-program

(To be fair, Facebook did once have a reliable research tool, CrowdTangle, which produced accurate and thus unflattering data about FB, so Facebook neutered the tool, disbanded the team that worked on it, and forced the founder out of the company.)

https://pluralistic.net/2021/07/15/three-wise-zucks-in-a-trenchcoat/#inconvenient-truth

As part of its smear campaign against Ad Observatory, Facebook argued that it has to shut down this kind of activity, because that’s how it defends us from privacy-invading scumbags like Cambridge Analytica.

It’s true that Cambridge Analytica did “research” on FB that compromised millions of users’ privacy, but Facebook has already disqualified itself from defending us from Cambridge Analytica by failing to defend us from Cambridge Analytica.

If the specter of platform abuse — racial discrimination, harassment, infringement, malware, fraud, privacy invasion — grants Facebook a veto over independent modifications of its products, FB will fail to protect us, and will kill the tools that can protect us (from Facebook — and other bad actors).

And yet, Facebook really is used in ways that harm its users. Sometimes, the abuse comes from third parties that create plugins, bots, scrapers or other interoperable tools that modify Facebook. But you know what? Many of these abuses come from Facebook itself.

That’s why Facebook can’t be in charge of the Facebook Cops. That’s why Facebook can’t have the final say over how we modify Facebook (and why no tech company should have the final say over how its users adapt the things they buy and use).

Which brings me back to the whistleblower and her call for regulating Facebook. Yeah, we need to do that. We need things like a federal privacy law with a private right of action, robust protections against harassment, and vigorous enforcement of anti-fraud rules.

That’s how you distinguish between good mods and bad mods — by passing democratically accountable laws that tell us what we are and are not entitled to. Not by letting Facebook (or Apple, or Google, or Netflix) decide what is and isn’t acceptable.

This is the thesis of “Privacy Without Monopoly,” the EFF white paper Bennett Cyphers and I wrote on how to balance interoperability and privacy: we do need privacy protections, and we can’t rely on monopolists to decide what those should be.

https://www.eff.org/wp/interoperability-and-privacy

As I said at the start, a digital regulator isn’t a bad idea, assuming the rules are good ones. We don’t have to wait for the creation of a new regulatory agency to get partway there, either. The FTC can use antitrust settlements to impose conditions on Big Tech.

For example, a settlement with FB could require it to seek permission from a “special master” before it could threaten to sue an interoperator like Ad Observatory, Unfollow Everything, Friendly Browser, GBWhatsApp, or any of the many other independent tools FB has pursued.

The special master could determine whether FB was engaged in privacywashing (or securitywashing, or safetywashing, etc.) in a bid to make it seem legitimate to harm its users for its shareholders’ benefit.

Users don’t always make wise choices about how to use their tools, but tech giants can’t be trusted to distinguish between “preventing harms to users” and “preventing harms to the bottom line.”

That should be obvious, but if there’s any doubt, surely the Unfollow Everything scandal settles the matter.

Cory Doctorow (craphound.com) is a science fiction author, activist, and blogger. He has a podcast, a newsletter, a Twitter feed, a Mastodon feed, and a Tumblr feed. He was born in Canada, became a British citizen and now lives in Burbank, California. His latest nonfiction book is How to Destroy Surveillance Capitalism. His latest novel for adults is Attack Surface. His latest short story collection is Radicalized. His latest picture book is Poesy the Monster Slayer. His latest YA novel is Pirate Cinema. His latest graphic novel is In Real Life. His forthcoming books include The Shakedown (with Rebecca Giblin), a book about artistic labor markets and excessive buyer power; Red Team Blues, a noir thriller about cryptocurrency, corruption and money-laundering (Tor, 2023); and The Lost Cause, a utopian post-GND novel about truth and reconciliation with white nationalist militias (Tor, 2023).
