A case study Tainter explores is the Roman Empire. Initially, the republic and later empire expanded through conquest, using loot and plunder to finance state operations, including further expansion. But, eventually, Rome expanded to the point that further conquests were uneconomical—it simply could not field armies that far abroad.

When conquests ended, the Roman state had to finance its operations from resources under its control. This meant taxes, which meant a whole host of bureaucrats: census takers, to see who lived where, and surveyors, to see who owned what and what it was worth, as well as tax collectors and scribes. But all these bureaucrats had to eat, which meant that their incomes were deducted from that tax revenue.

So more expenses, and more complexity to finance those expenses, which created more expenses. This meant that every new crisis had to be financed out of tax revenue that was already strained by the costs of collecting it, which demanded yet more bureaucrats to find the necessary resources.

This was a downward spiral for Rome which, towards the end of the empire, featured a massively expanded bureaucracy and state apparatus but struggled to mobilize the resources it needed to manage its many challenges. Eventually, the state simply couldn’t finance its own operations and collapsed.

2/3

Is this perhaps why the hegemonic system of the capitalist state feels so *exhausted* and unable to tackle seemingly any problem, despite theoretically being able to mobilize vastly more resources than it could even a generation or two ago?

Powerful states that previously won world wars against industrial peers now struggle to win wars against far weaker opponents. Wealthy governments struggle to afford infrastructure projects. Grand national undertakings belong to an increasingly mythic past.

It’s in this milieu that I interpret (easily foreseeable) results like this:

“In our in-progress research, we discovered that AI tools didn’t reduce work, they consistently intensified it.”

“For instance, engineers, in turn, spent more time reviewing, correcting, and guiding AI-generated or AI-assisted work produced by colleagues. These demands extended beyond formal code review. Engineers increasingly found themselves coaching colleagues who were ‘vibe-coding’ and finishing partially complete pull requests. This oversight often surfaced informally—in Slack threads or quick desk-side consultations—adding to engineers’ workloads.”

As a society, we have massively invested in generative “AI” in a desperate attempt to manage the increasing complexity of our lives (and to goose the wealth of a handful of tech barons), but the end result has been unprofitable slop and *more* work to fix the errors generated by these massively expensive systems.

I strongly suspect that generative “AI” represents the stage of declining returns to additional complexity—not just diminishing marginal returns but outright negative returns, as we invest ever more resources into managing the fallout.

hbr.org/2026/02/ai-doesnt-redu

3/3

