When Facebook came for your battery, feudal security failed
A tale of whistleblowers, security through obscurity, and binding arbitration.

Next week (Feb 8–17), I’ll be in Australia, touring my book Chokepoint Capitalism with my co-author, Rebecca Giblin. We’ll be in Brisbane on Feb 8, and then we’re doing a remote event for NZ on Feb 9. Next is Melbourne, Sydney and Canberra. I hope to see you!
When George Hayward was working as a Facebook data scientist, his bosses ordered him to run a “negative test,” updating Facebook Messenger to deliberately drain users’ batteries in order to determine how power-hungry various parts of the app were. Hayward refused, Facebook fired him, and he sued:
https://nypost.com/2023/01/28/facebook-fires-worker-who-refused-to-do-negative-testing-awsuit/
Hayward balked because he knew that among the 1.3 billion people who use Messenger, some would be placed in harm’s way if Facebook deliberately drained their batteries — physically stranded, unable to communicate with loved ones experiencing emergencies, or locked out of their identification, payment method, and all the other functions filled by mobile phones.
As Hayward told Kathianne Boniello at the New York Post, “Any data scientist worth his or her salt will know, ‘Don’t hurt people…’ I refused to do this test. It turns out if you tell your boss, ‘No, that’s illegal,’ it doesn’t go over very well.”
Negative testing is standard practice at Facebook, and Hayward was given a document called “How to run thoughtful negative tests” regarding which he said, “I have never seen a more horrible document in my career.”
We don’t know much else, because Hayward’s employment contract included a non-negotiable binding arbitration clause, which means that he surrendered his right to seek legal redress from his former employer. Instead, his claim will be heard by an arbitrator — that is, a fake corporate judge who is paid by Facebook to decide if Facebook was wrong. Even if the arbitrator finds in Hayward’s favor — something arbitrators do far less frequently than real judges do — the judgment, and all the information that led up to it, will be confidential, meaning we won’t get to find out more: