A systemic (not individual) approach to content moderation

All the good stuff happens BEFORE the moderator gets involved.

Cory Doctorow

--

An animated gif depicting a ‘Useless Machine’: a box with a switch on it. When the operator flips the switch, a mechanical finger pops out of the box, flips it back, and retracts under the lid. The loop shows the human and mechanical fingers locked in an endless duel. Image: 陳韋邑 (modified) https://www.youtube.com/watch?v=PzlFOEJ1MiA CC BY 3.0: https://creativecommons.org/licenses/by/3.0/legalcode

As Mike Masnick is fond of saying, “Content moderation at scale is impossible to do well.” The evidence is all around us, in the form of innumerable moderation gaffes, howlers and travesties:

  • Blocking mask-making volunteers to enforce a mask-ad ban;
  • Removing history videos while trying to purge Holocaust denial;
  • Deplatforming antiracist skinheads when trying to block racist skinheads;
  • Removing an image of Captain America punching a Nazi for depicting a Nazi.

But in a forthcoming Harvard Law Review paper, evelyn douek proposes that content moderation at scale can be improved, provided we drop the individual-focused “digital constitutionalism” model in favor of systemic analysis and regulation.

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4005326

The paper opens with a blazing, lucid critique of the status quo: the “first wave,” “standard model” of moderation scholarship, which casts the platforms as “New Governors.” In this model, each platform has a set of speech policies about what you can say, enforced by moderators. This produces a “decision factory” where thousands of…
