A systemic (not individual) approach to content moderation

All the good stuff happens BEFORE the moderator gets involved.

Cory Doctorow
9 min read · Mar 12, 2022
Image: an animated GIF of a "Useless Machine," a box with a switch on its lid; each time a human finger flips the switch, a mechanical finger pops out, flips it back, and retracts, locking the two in an endless duel. Credit: 陳韋邑 (modified), https://www.youtube.com/watch?v=PzlFOEJ1MiA, CC BY 3.0: https://creativecommons.org/licenses/by/3.0/legalcode

As Mike Masnick is fond of saying, “Content moderation at scale is impossible.” The evidence for that statement is all around us, in the form of innumerable moderation gaffes, howlers and travesties:

  • Blocking mask-making volunteers to enforce a mask-ad ban;
  • Removing history videos while trying to purge Holocaust denial;
  • Deplatforming antiracist skinheads when trying to block racist skinheads;
  • Removing an image of Captain America punching a Nazi for depicting a Nazi.

But in a forthcoming Harvard Law Review paper, evelyn douek proposes that content moderation at scale can be improved, provided we drop the individual-focused “digital constitutionalism” model in favor of systemic analysis and regulation.

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4005326

The paper opens with a blazing, lucid critique of the status quo: the "first wave," "standard model" of moderation scholarship, in which platforms act as "New Governors." In this model, platforms have a set of speech policies about what you can say, enforced by moderators. This produces a "decision factory" where thousands of human moderators make billions of calls, and when they get it wrong, the platform steps in as a kind of courtroom to hear appeals.

This has been a disaster. The calls are often wrong and the appeals process is either so cursory as to be useless or so slow that the thoroughness doesn’t matter. It’s a mistake to conceive of platforms as “a transmission belt implementing rules in an efficient and reliable way, organized around a limited set of hierarchically organized institutions and rights of appeal.”

But more importantly, these individual appeals and the stories of how they go wrong give no picture of what is happening to speech on the platform as a system: what kinds of speech are being systemically downranked or amplified.

And even more importantly, focusing on how moderators (or filters) react to speech is wildly incomplete, because almost everything that’s important about platforms’ speech regulation happens…
