[Illustration: computer skeletons © Saratta Chuengsatiansup]
Last year, I coined the term "enshittification" to describe the way that platforms decay. That obscene little word did big numbers; it really hit the zeitgeist.

The American Dialect Society made it its Word of the Year for 2023 (which, I suppose, means that now I'm definitely getting a poop emoji on my tombstone).

So what's enshittification and why did it catch fire? It's my theory explaining how the internet was colonised by platforms, why all those platforms are degrading so quickly and thoroughly, why it matters and what we can do about it. We're all living through a great enshittening, in which the services that matter to us, that we rely on, are turning into giant piles of shit. It's frustrating. It's demoralising. It's even terrifying.

I think that the enshittification framework goes a long way to explaining it, moving us out of the mysterious realm of the "great forces of history", and into the material world of specific decisions made by real people; decisions we can reverse and people whose names and pitchfork sizes we can learn.

Enshittification names the problem and proposes a solution. It's not just a way to say "things are getting worse", though, of course, it's fine with me if you want to use it that way. (It's an English word. We don't have ein Rat für englische Rechtschreibung. English is a free-for-all. Go nuts, meine Kerle.) But in case you want to be more precise, let's examine how enshittification works. It's a three-stage process: first, platforms are good to their users. Then they abuse their users to make things better for their business customers. Finally, they abuse those business customers to claw back all the value for themselves. Then, there is a fourth stage: they die.

* * *

Let's do a case study. What could be better than Facebook?

Facebook arose from a website developed to rate the fuckability of Harvard undergrads, and it only got worse after that. When Facebook started off, it was only open to US college and high-school kids with .edu and K-12.us addresses. But in 2006, it opened up to the general public. It effectively told them:
Yes, I know you're all using MySpace. But MySpace is owned by a billionaire who spies on you with every hour that God sends. Sign up with Facebook and we will never spy on you. Come and tell us who matters to you in this world.
That was stage one. Facebook had a surplus – its investors' cash – and it allocated that surplus to its end users. Those end users proceeded to lock themselves into Facebook. Facebook, like most tech businesses, had network effects on its side. A product or service enjoys network effects when it improves as more people sign up to use it. You joined Facebook because your friends were there, and then others signed up because you were there.

But Facebook didn't just have high network effects, it had high switching costs. Switching costs are everything you have to give up when you leave a product or service. In Facebook's case, it was all the friends there that you followed and who followed you. In theory, you could have all just left for somewhere else; in practice, you were hamstrung by the collective action problem.

It's hard to get lots of people to do the same thing at the same time. So Facebook's end users engaged in a mutual hostage-taking that kept them glued to the platform. Then Facebook exploited that hostage situation, withdrawing the surplus from end users and allocating it to two groups of business customers: advertisers and publishers.

To the advertisers, Facebook said:
Remember when we told those rubes we wouldn't spy on them? Well, we do. And we will sell you access to that data in the form of fine-grained ad-targeting. Your ads are dirt cheap to serve, and we'll spare no expense to make sure that when you pay for an ad, a real human sees it.
To the publishers, Facebook said:
Remember when we told those rubes we would only show them the things they asked to see? Ha! Upload short excerpts from your website, append a link and we will cram it into the eyeballs of users who never asked to see it. We are offering you a free traffic funnel that will drive millions of users to your website to monetise as you please.
And so advertisers and publishers became stuck to the platform, too.

Users, advertisers, publishers – everyone was locked in. Which meant it was time for the third stage of enshittification: withdrawing surplus from everyone and handing it to Facebook's shareholders.

For the users, that meant dialling down the share of content from accounts you followed to a homeopathic dose, and filling the resulting void with ads and pay-to-boost content from publishers. For advertisers, that meant jacking up prices and drawing down anti-fraud enforcement, so advertisers paid much more for ads that were far less likely to be seen. For publishers, this meant algorithmically suppressing the reach of their posts unless they included an ever-larger share of their articles in the excerpt. And then Facebook started to punish publishers for including a link back to their own sites, so they were corralled into posting full text feeds with no links, meaning they became commodity suppliers to Facebook, entirely dependent on the company both for reach and for monetisation.

When any of these groups squawked, Facebook just repeated the lesson that every tech executive learnt in the Darth Vader MBA:

"I have altered the deal. Pray I don't alter it any further."

Facebook now enters the most dangerous phase of enshittification. It wants to withdraw all available surplus and leave just enough residual value in the service to keep end users stuck to each other, and business customers stuck to end users, without leaving anything extra on the table, so that every extractable penny is drawn out and returned to its shareholders. (This continued last week, when the company announced a quarterly dividend of 50 cents per share and that it would increase share buybacks by $50bn. The stock jumped.)

But that's a very brittle equilibrium, because the difference between "I hate this service, but I can't bring myself to quit," and "Jesus Christ, why did I wait so long to quit?" is razor-thin.

All it takes is one Cambridge Analytica scandal, one whistleblower, one livestreamed mass-shooting, and users bolt for the exits, and then Facebook discovers that network effects are a double-edged sword. If users can't leave because everyone else is staying, when everyone starts to leave, there's no reason not to go. That's terminal enshittification.

This phase is usually accompanied by panic, which tech euphemistically calls "pivoting". Which is how we get pivots such as:
In the future, all internet users will be transformed into legless, sexless, low-polygon, heavily surveilled cartoon characters in a virtual world called the "metaverse".
That's the procession of enshittification. But that doesn't tell you why everything is enshittifying right now and, without those details, we can't know what to do about it. What is it about this moment that led to the Great Enshittening? Was it the end of the zero-interest rate policy (ZIRP)? Was it a change in leadership at the tech giants?

Is Mercury in retrograde?

Nope.

* * *

[Illustration: statue © Saratta Chuengsatiansup]
The period of free Fed money certainly led to tech companies having a lot of surplus to toss around. But Facebook started enshittifying long before ZIRP ended; so did Amazon, Microsoft and Google. Some of the tech giants got new leaders. But Google's enshittification got worse when the founders came back to oversee the company's AI panic – excuse me, AI pivot. And it can't be Mercury in retrograde, because I'm a Cancer, and as everyone knows, Cancers don't believe in astrology.

When a whole bunch of independent entities all change in the same way at once, that's a sign that the environment has changed, and that's what happened to tech. Tech companies, like all companies, have conflicting imperatives. On the one hand, they want to make money. On the other hand, making money involves hiring and motivating competent staff, and making products that customers want to buy. The more value a company permits its employees and customers to carve off, the less value it can give to its shareholders.

The equilibrium in which companies produce things we like in honourable ways at a fair price is one in which charging more, worsening quality and harming workers costs more than the company would make by playing dirty.

There are four forces that discipline companies, serving as constraints on their enshittificatory impulses:

Competition. Companies that fear you will take your business elsewhere are cautious about worsening quality or raising prices.

Regulation. Companies that fear a regulator will fine them more than they expect to make from cheating will cheat less.

These two forces affect all industries, but the next two are far more tech-specific.

Self-help. Computers are extremely flexible and so are the digital products and services we make from them. The only computer we know how to make is the Turing-complete von Neumann machine, a computer that can run every valid program.

That means that users can always avail themselves of programs that undo the anti-features that shift value from them to a company's shareholders. Think of a boardroom table where someone says, "I've calculated that making our ads 20 per cent more invasive will net us 2 per cent more revenue per user."

In a digital world, someone else might well say, "Yes, but if we do that, 20 per cent of our users will install ad blockers, and our revenue from those users will drop to zero, for ever." This means that digital companies are constrained by the fear that some enshittificatory manoeuvre will prompt their users to google, "How do I disenshittify this?"
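A back-of-envelope sketch of that boardroom exchange, using only the hypothetical figures above (a 2 per cent revenue bump set against a 20 per cent blocker uptake), makes the deterrent concrete:

```python
# Back-of-envelope sketch of the hypothetical boardroom arithmetic above (illustrative only).
baseline = 1.00          # current ad revenue per user, normalised to 1
uplift = 1.02            # +2 per cent revenue per user from more invasive ads
blocker_share = 0.20     # share of users who respond by installing an ad blocker

# Revenue from blocker users drops to zero, for ever; the rest deliver the uplift.
expected = (1 - blocker_share) * uplift * baseline

print(f"expected revenue per user after the change: {expected:.3f}x baseline")
# -> 0.816x baseline: the "2 per cent win" becomes an 18 per cent loss once users can help themselves
```

On those assumptions, the "win" is an 18 per cent loss, and that fear is what does the disciplining.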

And, finally, workers. Tech workers have very low union density, but that doesn't mean that tech workers don't have labour power. The historical "talent shortage" of the tech sector meant that workers enjoyed a lot of leverage. Workers who disagreed with their bosses could quit and walk across the street and get another, better job.

They knew it and their bosses knew it. Ironically, this made tech workers highly exploitable. Tech workers overwhelmingly saw themselves as founders in waiting, entrepreneurs who were temporarily drawing a salary, heroic figures to be.

That's why mottoes such as Google's "Don't be evil" and Facebook's "Make the world more open and connected" mattered; they instilled a sense of mission in workers. It's what the American academic Fobazi Ettarh calls "vocational awe" or Elon Musk calls being "extremely hardcore".

Tech workers had lots of bargaining power, but they didn't flex it when their bosses demanded that they sacrifice their health, their families, their sleep to meet arbitrary deadlines. So long as their bosses transformed their workplaces into whimsical "campuses", with gyms, gourmet cafeterias, laundry service, massages and egg-freezing, workers could tell themselves that they were being pampered, rather than being made to work like government mules.

For bosses, there's a downside to motivating your workers with appeals to a sense of mission. Namely, your workers will feel a sense of mission. So when you ask them to enshittify the products they ruined their health to ship, workers will experience a sense of profound moral injury, respond with outrage and threaten to quit. Thus tech workers themselves were the final bulwark against enshittification.

The pre-enshittification era wasn't a time of better leadership. The executives weren't better. They were constrained. Their worst impulses were checked by competition, regulation, self-help and worker power. So what happened?

* * *

One by one, each of these constraints was eroded, leaving the enshittificatory impulse unchecked, ushering in the enshittocene.

It started with competition. From the Gilded Age until the Reagan years, the purpose of competition law was to promote competition between companies. US antitrust law treated corporate power as dangerous and sought to blunt it. European antitrust laws were modelled on US ones, imported by the architects of the Marshall Plan. But starting in the 1980s, with the rise of neoliberalism, competition authorities all over the world adopted a doctrine called "consumer welfare", which essentially held that monopolies were evidence of quality. If everyone was shopping at the same store and buying the same product, that meant that was the best store, selling the best product โ€” not that anyone was cheating.

And so, all over the world, governments stopped enforcing their competition laws. They just ignored them as companies flouted them. Those companies merged with their major competitors, absorbed smaller companies before they could grow to be big threats. They held an orgy of consolidation that produced the most inbred industries imaginable, whole sectors grown so incestuous they developed Habsburg jaws, from eyeglasses to sea freight, glass bottles to payment processing, vitamin C to beer.

Most sectors of the global economy are dominated by five or fewer companies. If smaller companies refuse to sell themselves to these cartels, the giants have free rein to flout competition law further, with "predatory pricing" that keeps an independent rival from gaining a foothold. When Diapers.com refused Amazon's acquisition offer, Amazon lit $100mn on fire, selling diapers way below cost for months, until Diapers.com went bust, and Amazon bought them for pennies on the dollar.

Lily Tomlin used to do a character on the TV show Rowan & Martin's Laugh-In, an AT&T telephone operator who'd do commercials for the Bell system. Each one would end with her saying: "We don't care. We don't have to. We're the phone company."

Today's giants are not constrained by competition. They don't care. They don't have to. They're Google.

That's the first constraint gone, and as it slipped away, the second constraint – regulation – was also doomed.

When an industry consists of hundreds of small- and medium-sized enterprises, it is a mob, a rabble. Hundreds of companies can't agree on what to tell Parliament or Congress or the Commission. They can't even agree on how to cater a meeting where they'd discuss the matter.

But when a sector dwindles to a bare handful of dominant firms, it ceases to be a rabble and it becomes a cartel. Five companies, or four, or three, or two or just one company can easily converge on a single message for their regulators, and without "wasteful competition" eroding their profits, they have plenty of cash to spread around.

This is why competition matters: it's not just because competition makes companies work harder and share value with customers and workers; it's because competition keeps companies from becoming too big to fail, and too big to jail.

Now, there are plenty of things we don't want improved through competition, like privacy invasions. After the EU passed its landmark privacy law, the GDPR, there was a mass-extinction event for small EU ad-tech companies. These companies disappeared en masse and that's a good thing. They were even more invasive and reckless than US-based Big Tech companies. We don't want to produce increasing efficiency in violating our human rights.

But: Google and Facebook have been unscathed by European privacy law. That's not because they don't violate the GDPR. It's because they pretend they are headquartered in Ireland, one of the EU's most notorious corporate crime havens. And Ireland competes with the EU's other crime havens – Malta, Luxembourg, Cyprus and, sometimes, the Netherlands – to see which country can offer the most hospitable environment.

The Irish Data Protection Commission rules on very few cases, and more than two-thirds of its rulings are overturned by the EU courts, even though Ireland is the nominal home to the most privacy-invasive companies on the continent. So Google and Facebook get to act as though they are immune to privacy law, because they violate the law with an app.

* * *

[Illustration: computer puke © Saratta Chuengsatiansup]
This is where that third constraint, self-help, would surely come in handy. If you don't want your privacy violated, you don't need to wait for the Irish privacy regulator to act, you can just install an ad blocker.

More than half of all web users are blocking ads. But the web is an open platform, developed in the age when tech was hundreds of companies at each other's throats, unable to capture their regulators. Today, the web is being devoured by apps, and apps are ripe for enshittification. Regulatory capture isn't just the ability to flout regulation, it's also the ability to co-opt regulation, to wield regulation against your adversaries.

Today's tech giants got big by exploiting self-help measures. When Facebook was telling MySpace users they needed to escape Murdoch's crapulent Australian social media panopticon, it didn't just say to those MySpacers, "Screw your friends, come to Facebook and just hang out looking at the cool privacy policy until they get here." It gave them a bot. You fed the bot your MySpace username and password, and it would log in to MySpace and pretend to be you, scraping everything waiting in your inbox and copying it to your Facebook inbox.

When Microsoft was choking off Apple's market oxygen by refusing to ship a functional version of Microsoft Office for the Mac in the 1990s – so that offices were throwing away their designers' Macs and giving them PCs with upgraded graphics cards and Windows versions of Photoshop and Illustrator – Steve Jobs didn't beg Bill Gates to update Mac Office. He got his technologists to reverse-engineer Microsoft Office and make a compatible suite, the iWork suite, whose apps – Pages, Numbers and Keynote – could read and write Microsoft's Word, Excel and PowerPoint files.

When Google entered the market, it sent its crawler to every web server on earth, where it presented itself as a web-user: "Hi! Hello! Do you have any web pages? Thanks! How about some more? How about more?"

But every pirate wants to be an admiral. When Facebook, Apple and Google were doing this adversarial interoperability, that was progress. If you try to do it to them, that's piracy.

Try to make an alternative client for Facebook and they'll say you violated US laws such as the Digital Millennium Copyright Act and EU laws like Article 6 of the EU Copyright Directive. Try to make an Android program that can run iPhone apps and play back the data from Apple's media stores and they'd bomb you until the rubble bounced. Try to scrape all of Google and they'll nuke you until you glow.

Tech's regulatory capture is mind-boggling. Take that law I mentioned earlier, Section 1201 of the Digital Millennium Copyright Act or DMCA. Bill Clinton signed it in 1998, and the EU imported it as Article 6 of the EUCD in 2001. It is a blanket prohibition on removing any kind of encryption that restricts access to a copyrighted work – things such as ripping DVDs or jailbreaking a phone – with penalties of a five-year prison sentence and a $500,000 fine for a first offence. This law has been so broadened that it can be used to imprison creators for granting access to their own creations.

Here's how that works: in 2008, Amazon bought Audible, an audiobook platform. Today, Audible is a monopolist with more than 90 per cent of the audiobook market. Audible requires that all creators on its platform sell with Amazon's "digital rights management", which locks their audiobooks to Amazon's apps.

So say I write a book, then I read it into a mic, then I pay a director and an engineer thousands of dollars to turn that into an audiobook, and sell it to you on the monopoly platform, Audible, that controls more than 90 per cent of the market. If I later decide to leave Amazon and want to let you come with me to a rival platform, I am out of luck. If I supply you with a tool to remove Amazon's encryption from my audiobook, so you can play it in another app, I commit a felony, punishable by a five-year sentence and a half-million-dollar fine, for a first offence.

That's a stiffer penalty than you would face if you simply pirated the audiobook from a torrent site. But it's also harsher than the punishment you'd get for shoplifting the audiobook on CD from a truck stop. It's harsher than the sentence you'd get for hijacking the truck that delivered the CD.

Think of our ad blockers again. Fifty per cent of web users are running ad blockers. Zero per cent of app users are running ad blockers, because adding a blocker to an app requires that you first remove its encryption, and that's a felony. (Jay Freeman, the American businessman and engineer, calls this "felony contempt of business model".)

So when someone in a boardroom says, "Let's make our ads 20 per cent more obnoxious and get a 2 per cent revenue increase," no one objects that this might prompt users to google, "How do I block ads?" After all, the answer is, you can't. Indeed, it's more likely that someone in that boardroom will say, "Let's make our ads 100 per cent more obnoxious and get a 10 per cent revenue increase." (This is why every company wants you to install an app instead of using its website.)

There's no reason that gig workers who are facing algorithmic wage discrimination couldn't install a counter-app that co-ordinated all the Uber drivers to reject any job that fell below a certain pay threshold. No reason except felony contempt of business model: the threat that the toolsmiths who built that counter-app would go broke or land in prison for violating DMCA 1201, the Computer Fraud and Abuse Act, trademark, copyright, patent, contract, trade secrecy, nondisclosure and noncompete – or, in other words, "IP law".
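Purely as a thought experiment (there is no such app or public API, and building one is precisely what that thicket of law forbids), the co-ordination logic itself would be trivial; every name in this sketch is invented for illustration:

```python
# Purely hypothetical sketch of the "counter-app" logic described above.
# No real ride-hail API is used or implied; Offer and the agreed floor are invented for illustration.
from dataclasses import dataclass

@dataclass
class Offer:
    job_id: str
    payout: float        # what the driver would be paid for the job
    est_minutes: float   # estimated time to complete it

def hourly_rate(offer: Offer) -> float:
    """Effective hourly pay implied by an offer."""
    return offer.payout / (offer.est_minutes / 60)

def should_accept(offer: Offer, shared_floor_per_hour: float) -> bool:
    """Accept only offers that clear the floor the drivers have collectively agreed on."""
    return hourly_rate(offer) >= shared_floor_per_hour

# A 9.00 job estimated at 30 minutes works out to 18/hour; with a collectively
# agreed floor of 25/hour, every driver running the app would turn it down at once.
print(should_accept(Offer("abc123", payout=9.00, est_minutes=30), shared_floor_per_hour=25.0))  # False
```

The hard part was never the code; it's that shipping it invites the legal arsenal listed above.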

IP isn't just short for intellectual property. It's a euphemism for "a law that lets me reach beyond the walls of my company and control the conduct of my critics, competitors and customers". And "app" is just a euphemism for "a web page wrapped in enough IP to make it a felony to mod it, to protect the labour, consumer and privacy rights of its user".

We don't care. We don't have to. We're the phone company.

* * *

What about that fourth constraint: workers? For decades, tech workers' bargaining power and vocational awe put a ceiling on enshittification. Even after the tech sector shrank to a handful of giants. Even after they captured their regulators. Even after "felony contempt of business model" extinguished self-help for tech users. Tech was still constrained by its workers' sense of moral injury in the face of the imperative to enshittify.

Remember when tech workers dreamt of working for a big company for a few years, before striking out on their own to start their own company that would knock that tech giant over? That dream shrank to: work for a giant for a few years, quit, do a fake start-up, get "acqui-hired" by your old employer, as a complicated way of getting a bonus and a promotion. Then the dream shrank further: work for a tech giant for your whole life, get free kombucha and massages on Wednesdays.

And now, the dream is over. All that's left is: work for a tech giant until they fire you, like those 12,000 Googlers who got fired last year, eight months after a stock buyback that would have paid their salaries for the next 27 years.

Workers are no longer a check on their bosses' worst impulses. Today, the response to "I refuse to make this product worse" is "turn in your badge and don't let the door hit you in the ass on the way out".

I get that this is all a little depressing. OK, really depressing. But hear me out! We've identified the disease. We've identified its underlying mechanism. Now we can get to work on a cure.

There are four constraints that prevent enshittification: competition, regulation, self-help and labour. To reverse enshittification and guard against its re-emergence, we must restore and strengthen each of these.

On competition, it's actually looking pretty good. The EU, the UK, the US, Canada, Australia, Japan and China are all doing more on competition than they have in two generations. They're blocking mergers, unwinding existing ones, taking action on predatory pricing and other sleazy tactics. Remember, in the US and Europe, we already have the laws to do this; we just stopped enforcing them.

I've been fighting these fights with the Electronic Frontier Foundation for 22 years now, and I've never seen a more hopeful moment for sound, informed tech policy.

Now, the enshittifiers aren't taking this lying down. Take Lina Khan, the brilliant head of the US Federal Trade Commission, who has done more on antitrust in three years than all her predecessors combined managed over the past 40 years. The Wall Street Journal's editorial page has run more than 80 pieces trashing Khan, insisting that she's an ineffectual ideologue who can't get anything done. Sure, that's why you ran 80 editorials about her. Because she can't get anything done.

Reagan and Thatcher put antitrust law in a coma in the 1980s. But it's awake, it's back and it's pissed off.

What about regulation? How will we get tech companies to stop doing that one weird trick of adding "with an app" to escape enforcement?

Well, here in the EU, they're starting to figure it out. Recently, the main body of the Digital Markets Act and the Digital Services Act went into effect, and they let people who get screwed by tech companies go straight to the European courts, bypassing the toothless watchdogs in places like Ireland.

In the US, they might finally get a digital privacy law. You probably have no idea how backwards US privacy law is. The last time the US Congress enacted a broadly applicable privacy law was in 1988. The Video Privacy Protection Act makes it a crime for video-store clerks to leak your video-rental history. It was passed after a rightwing judge who was up for the Supreme Court had his rentals published in a DC newspaper. The rentals weren't even all that embarrassing.

Sure, that judge, Robert Bork, wasn't confirmed for the Supreme Court, but that was because he was a virulent loudmouth who served as Nixon's solicitor-general. Still, Congress got the idea that their own video records might be next, freaked out and passed the VPPA. That was the last time Americans got a big, national privacy law. And the thing is, there are a lot of people who are angry about it. Worried that Facebook turned Grampy into a QAnon? That Insta made your teen anorexic? That TikTok is brainwashing Gen Z into quoting Osama bin Laden?

Or that cops are rolling up the identities of everyone at a Black Lives Matter protest or the Jan 6 riots by getting location data from Google?

Or that red state attorneys-general are tracking teen girls to out-of-state abortion clinics?

Or that Black people are being discriminated against by online lending or hiring platforms?

Or that someone is making AI deepfake porn of you?

Having a federal privacy law with a private right of action – which means that individuals can sue companies that violate their privacy – would go a long way to rectifying all of these problems. There's a big coalition for that kind of privacy law.

What about self-help? That's a lot farther away, alas. The EU's DMA will force tech companies to open up their walled gardens for interoperation. You'll be able to use WhatsApp to message people on iMessage, or quit Facebook and move to Mastodon, but still send messages to the people left behind. But if you want to reverse-engineer one of those Big Tech products and mod it to work for you, not them, the EU's got nothing for you. This is an area ripe for improvement. My big hope here is that Stein's Law will take hold: anything that can't go on forever will eventually stop.

Finally, there's labour. Here in Europe, there's much higher union density than in the US, which American tech barons are learning the hard way. There is nothing more satisfying in the daily news than the recent salvo by Nordic unions against that Tesla guy. But even in the US, there's a massive surge in tech unions. Tech workers have realised they're not founders-in-waiting. In Seattle, Amazon's tech workers walked out in sympathy with Amazon's warehouse workers, because they're all workers.

We're seeing bold, muscular, global action on competition, regulation and labour, with self-help bringing up the rear. It's not a moment too soon, because the bad news is enshittification is coming to every industry. If it's got a networked computer in it, the people who made it can run the Darth Vader MBA playbook on it, changing the rules from moment to moment, violating your rights and then saying: "It's OK, we did it with an app."

From Mercedes effectively renting you your accelerator pedal by the month to Internet of Things dishwashers that lock you into proprietary dish soap, enshittification is metastasising into every corner of our lives. Software doesn't eat the world, it just enshittifies it.

There's a bright side to all this: if everyone is threatened by enshittification, then everyone has a stake in disenshittification. Just as with privacy law in the US, the potential anti-enshittification coalition is massive. It's unstoppable.

* * *

The cynics among you might be sceptical that this will make a difference. After all, isn't "enshittification" the same as "capitalism"? Well, no.

I'm not going to cape for capitalism. I'm hardly a true believer in markets as the most efficient allocators of resources and arbiters of policy. But the capitalism of 20 years ago made space for a wild and woolly internet, a space where people with disfavoured views could find each other, offer mutual aid and organise. The capitalism of today has produced a global, digital ghost mall, filled with botshit, crap gadgets from companies with consonant-heavy brand names and cryptocurrency scams.

The internet isn't more important than the climate emergency, gender justice, racial justice, genocide or inequality. But the internet is the terrain we'll fight those fights on. Without a free, fair and open internet, the fight is lost before it's joined.

We can reverse the enshittification of the internet. We can halt the creeping enshittification of every digital device. We can build a better, enshittification-resistant digital nervous system, one that is fit to co-ordinate the mass movements we will need to fight fascism, end genocide, save our planet and our species.

Martin Luther King said: "It may be true that the law cannot make a man love me, but it can stop him from lynching me, and I think that's pretty important." And it may be true that the law can't force corporations to conceive of you as a human being entitled to dignity and fair treatment, and not just an ambulatory wallet, a supply of gut bacteria for the immortal colony organism that is a limited liability corporation. But it can make them fear you enough to treat you fairly and afford you dignity โ€” even if they don't think you deserve it.

Cory Doctorow is a special adviser to the Electronic Frontier Foundation and a visiting professor of computer science at the Open University. His next book 'The Bezzle', published by Head of Zeus, is out this month. This piece is adapted from his Marshall McLuhan Lecture, delivered at the Embassy of Canada in Berlin last month