How lock-in hurts design
Felonizing the desire path.
Berliners: Otherland has added a second date (Jan 28) for my book-talk after the first one sold out — book now!
If you’ve ever read about design, you’ve probably encountered the idea of “paving the desire path.” A “desire path” is an erosion path created by people departing from the official walkway and taking their own route. The story goes that smart campus planners don’t fight the desire paths laid down by students; they pave them, formalizing the route that their constituents have voted for with their feet.
Desire paths aren’t always great (Wikipedia notes that “desire paths sometimes cut through sensitive habitats and exclusion zones, threatening wildlife and park security”), but in the context of design, a desire path is a way that users communicate with designers, creating a feedback loop between those two groups. The designers make a product, the users use it in ways that surprise the designer, and the designer integrates all that into a new revision of the product.
This method is widely heralded as a means of “co-innovating” between users and companies. Designers who practice the method are lauded for their humility, their willingness to learn from their users. Tech history is strewn with examples of successful paved desire-paths.
Take John Deere. While today the company is notorious for its war on its customers (via its opposition to right to repair), Deere was once a leader in co-innovation, dispatching roving field engineers to visit farms and learn how farmers had modified their tractors. The best of these modifications would then be worked into the next round of tractor designs, in a virtuous cycle.
But this pattern is even more pronounced in the digital world, because it's much easier to update a digital service than it is to update all the tractors in the field, especially if that service is cloud-based, meaning that when you modify the back-end, every user is instantly updated. The most celebrated example of this co-creation is Twitter, whose users created a host of its core features.
Retweets, for example, were a user creation. Users who saw something they liked on the service would type “RT” and paste the text and the link into a new tweet composition window. Same for quote-tweets: users copied the URL for a tweet and pasted it in below their own commentary. Twitter designers observed this user innovation and formalized it, turning it into part of Twitter’s core feature-set.
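For readers who never saw it, the convention was purely manual. Here's a toy sketch (in TypeScript, purely illustrative, not Twitter's code) of the string-mangling users did by hand before the feature existed:

```typescript
// A toy model of the convention users invented before retweets were a
// feature: type "RT", credit the author, paste the text by hand.
function manualRetweet(author: string, text: string): string {
  return `RT @${author}: ${text}`;
}

// Quote-tweets were the same trick in reverse: your commentary first,
// then the copied URL of the tweet you were quoting.
function manualQuoteTweet(comment: string, tweetUrl: string): string {
  return `${comment} ${tweetUrl}`;
}

console.log(manualRetweet("jack", "just setting up my twttr"));
// => RT @jack: just setting up my twttr
```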
Companies are obsessed with discovering digital desire paths. They pay fortunes for analytics software to produce maps of how their users interact with their services, run focus groups, even embed sneaky screen-recording software into their web-pages:
https://www.wired.com/story/the-dark-side-of-replay-sessions-that-record-your-every-move-online/
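To make the mechanics concrete, here's a minimal sketch of how a session-replay script works: hook the browser's input events, serialize them, and ship them home in batches. The endpoint and event shape here are made up, but the pattern matches what these tools do:

```typescript
// Illustrative sketch of a session-replay recorder: hook global input
// events, serialize them, and ship them to a collection server.
// The endpoint and event shape are hypothetical.
type ReplayEvent = { type: string; target: string; ts: number };

const buffer: ReplayEvent[] = [];

for (const type of ["click", "keydown", "scroll", "mousemove"]) {
  window.addEventListener(type, (e) => {
    buffer.push({
      type,
      target: e.target instanceof Element ? e.target.tagName : "window",
      ts: Date.now(),
    });
  }, { capture: true, passive: true });
}

// Flush the recorded interactions every few seconds.
setInterval(() => {
  if (buffer.length === 0) return;
  navigator.sendBeacon("https://analytics.example.com/replay", JSON.stringify(buffer));
  buffer.length = 0;
}, 5000);
```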
This relentless surveillance of users is pursued in the name of making things better for them: let us spy on you and we’ll figure out where your pain-points and friction are coming from, and remove those. We all win!
But this impulse is a world apart from the humility and respect implied by co-innovation. The constant, nonconsensual observation of users has more to do with controlling users than learning from them.
That is, after all, the ethos of modern technology: the more control a company can exert over its users, the more value it can transfer from those users to its shareholders. That's the key to enshittification, the ubiquitous platform decay that has degraded virtually all the technology we use, making it worse every day:
https://pluralistic.net/2023/02/19/twiddler/
When you are seeking to control users, the desire paths they create are all too often a way of wrestling control back from you. Take advertising: every time a service makes its ads more obnoxious and invasive, it creates an incentive for its users to search for "how do I install an ad-blocker":
https://www.eff.org/deeplinks/2019/07/adblocking-how-about-nah
More than half of all web-users have installed ad-blockers. It’s the largest consumer boycott in human history:
https://doc.searls.com/2023/11/11/how-is-the-worlds-biggest-boycott-doing/
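The mechanics of this boycott are mundane. An ad-blocker is mostly a list of rules that tell the browser which requests to drop. Here's a sketch of one such rule in the shape used by Chrome's declarativeNetRequest extension API (the ad-server domain is hypothetical):

```typescript
// A sketch of an ad-blocking rule in the shape used by Chrome's
// declarativeNetRequest API. Real blockers ship many thousands of
// these; the ad-server domain here is hypothetical.
const blockAds = [
  {
    id: 1,
    priority: 1,
    // Drop the request before it ever leaves the browser.
    action: { type: "block" },
    condition: {
      // "||" anchors to the domain, "^" marks a separator, so this
      // matches the ad host on any page that embeds it.
      urlFilter: "||ads.example.com^",
      resourceTypes: ["script", "image", "sub_frame", "xmlhttprequest"],
    },
  },
];

// An extension would register these as a static ruleset in its manifest
// or at runtime via chrome.declarativeNetRequest.updateDynamicRules().
export default blockAds;
```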
But zero app users have installed ad-blockers, because reverse-engineering an app requires that you bypass its encryption, triggering liability under Section 1201 of the Digital Millennium Copyright Act. This law provides for a $500,000 fine and a 5-year prison sentence for “circumvention” of access controls:
https://pluralistic.net/2024/01/12/youre-holding-it-wrong/#if-dishwashers-were-iphones
Beyond that, modifying an app creates liability under copyright, trademark, patent, trade secrets, noncompete, nondisclosure and so on. It’s what Jay Freeman calls “felony contempt of business model”:
https://locusmag.com/2020/09/cory-doctorow-ip/
This is why services are so horny to drive you to install their app rather than use their websites: they are trying to get you to do something that, given your druthers, you would prefer not to do. They want to force you to exit through the gift shop; you want to carve a desire path straight to the parking lot. Apps let them mobilize the law to literally criminalize those desire paths.
An app is just a web-page wrapped in enough IP to make it a felony to block ads in it (or do anything else that wrestles value back from a company). Apps are web-pages where everything not forbidden is mandatory.
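That "wrapped web-page" claim is literal, not rhetorical. Here's a minimal Electron sketch (the URL is a placeholder) showing that the entire engineering of many apps amounts to a native window that loads the service's own site, now beyond the reach of your ad-blocker:

```typescript
// A minimal Electron shell: the whole "app" is a native window that
// loads the service's web-page. The URL is a placeholder.
import { app, BrowserWindow } from "electron";

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1024, height: 768 });
  // Same page as the browser would render, minus the user's ability
  // to block ads, strip trackers, or otherwise modify it.
  win.loadURL("https://app.example.com/");
});

app.on("window-all-closed", () => app.quit());
```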
Seen in this light, an app is a way to wage war on desire paths, to abandon the cooperative model of co-innovation in favor of the adversarial model of user control and extraction.
Corporate apologists like to claim that the proliferation of apps proves that users like them. Neoliberal economists love the idea that business as usual represents a "revealed preference." This is an intellectually unserious tautology: "you do this, so you must like it."

Calling an action a "preference" or a "choice" when no alternative is permitted is a cheap trick, especially when set against the preferences that reveal themselves the moment a real choice is possible. Take commercial surveillance: when Apple gave iOS users a choice about being spied on, a one-click opt-out of app-based surveillance, 96% of users chose no spying.
But then Apple started spying on those very same users who had opted out of being spied on by Facebook and other Apple competitors:
https://pluralistic.net/2022/11/14/luxury-surveillance/#liar-liar
Neoclassical economists aren’t just obsessed with revealed preferences — they also love to bandy about the idea of “moral hazard”: economic arrangements that tempt people to be dishonest. This is typically applied to the public (“consumers” in the contemptuous parlance of econospeak). But apps are pure moral hazard — for corporations. The ability to prohibit desire paths — and literally imprison rivals who help your users thwart those prohibitions — is too tempting for companies to resist.
The fact that the majority of web users block ads reveals a strong preference for not being spied on ("users just want relevant ads" is such an obvious lie that it doesn't merit serious discussion):
https://www.iccl.ie/news/82-of-the-irish-public-wants-big-techs-toxic-algorithms-switched-off/
Giant companies attained their scale by learning from their users, not by thwarting them. The person using technology always knows something about what they need to do and how they want to do it that the designers can never anticipate. This is especially true of people who are unlike those designers — people who live on the other side of the world, or the other side of the economic divide, or whose bodies don’t work the way that the designers’ bodies do:
https://pluralistic.net/2022/10/20/benevolent-dictators/#felony-contempt-of-business-model
Apps — and other technologies that are locked down so their users can be locked in — are the height of technological arrogance. They embody a belief that users are to be told, not heard. If a user wants to do something that the designer didn’t anticipate, that’s the user’s fault:
https://www.wired.com/2010/06/iphone-4-holding-it-wrong/
Corporate enthusiasm for prohibiting you from reconfiguring the tools you use to suit your needs is a declaration of the end of history. “Sure,” John Deere execs say, “we once learned from farmers by observing how they modified their tractors. But today’s farmers are so much stupider and we are so much smarter that we have nothing to learn from them anymore.”
Spying on your users to control them is a poor substitute for asking users their permission to learn from them. Without technological self-determination, preferences can’t be revealed. Without the right to seize the means of computation, the desire paths never emerge, leaving designers in the dark about what users really want.
Our policymakers swear loyalty to “innovation” but when corporations ask for the right to decide who can innovate and how, they fall all over themselves to create laws that let companies punish users for the crime of contempt of business-model.
I’m Kickstarting the audiobook for The Bezzle, the sequel to Red Team Blues, narrated by Wil Wheaton! You can pre-order the audiobook and ebook, DRM free, as well as the hardcover, signed or unsigned. There’s also bundles with Red Team Blues in ebook, audio or paperback.
If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/01/24/everything-not-mandatory/#is-prohibited
Image:
Belem (modified)
https://commons.wikimedia.org/wiki/File:Desire_path_%2819811581366%29.jpg
CC BY 2.0
https://creativecommons.org/licenses/by/2.0/deed.en