🚨 AI IS ABOUT TO TRIGGER THE BIGGEST ECONOMIC CRISIS SINCE 2008.
Not because AI is failing, but because AI is moving faster than the traditional economy can adjust.
This is what Citrini laid out in their research, and here's everything you need to know about it.
White collar workers make up roughly half of US employment. But more importantly, the top 20% of income earners drive around 60%+ of total consumer spending in the US.
That means a relatively small group supports a very large part of:
• housing demand
• car purchases
• travel
• restaurants
• software subscriptions
• private school
• mortgage quality
Now AI coding tools are already reducing the cost of building software. Companies are renegotiating large SaaS contracts because internal teams using AI can now replicate core features faster and cheaper.
When companies cut 10-15% of staff, seat-based software revenue drops automatically.
FIRST IMPACT: Margins improve because payroll drops.
SECOND IMPACT: High-income wages weaken.
THIRD IMPACT: Spending slows.
But machines produce output. Machines do not consume output. If a $180,000 product manager role is replaced by a $200/month AI tool, corporate profit rises but household income falls.
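The arithmetic behind that swap is worth making explicit. The salary and tool price are the figures from the example above; the split between profit gained and wages lost is the point:

```python
# Illustrative numbers from the example above: a $180,000/yr role
# replaced by a $200/month AI tool.
salary = 180_000          # annual household income that disappears
tool_cost = 200 * 12      # annual cost of the AI tool: $2,400

corporate_savings = salary - tool_cost  # shows up as corporate profit
lost_wage_base = salary                 # disappears from consumer spending

print(f"Corporate savings: ${corporate_savings:,}")  # $177,600
print(f"Lost wage base:    ${lost_wage_base:,}")     # $180,000
```

Almost the entire salary converts into margin, but the consumer economy loses the full $180,000 of spending power.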
If this happens at scale, you get a loop:
AI improves → companies cut jobs → spending weakens → companies protect margins → buy more AI → cut more jobs.
Each step makes sense for one company. But across the whole economy, it shrinks the wage base that supports demand.
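That feedback loop can be sketched as a toy simulation. The cut rate and feedback strength below are made-up assumptions for illustration, not figures from Citrini's research:

```python
# Toy model of the loop: each round, firms cut a slice of the wage base;
# the resulting spending weakness pushes firms to cut a bit harder next round.
def displacement_loop(wage_base: float, rounds: int,
                      cut_rate: float = 0.03,  # assumed initial payroll cut per round
                      feedback: float = 0.5    # assumed spending-to-cuts feedback
                      ) -> float:
    for _ in range(rounds):
        wage_base -= wage_base * cut_rate
        # weaker spending -> margin pressure -> slightly deeper cuts next round
        cut_rate *= 1 + feedback * cut_rate
    return wage_base

# Each round is rational for one firm; in aggregate, the wage base
# that supports demand keeps shrinking, and shrinks faster over time.
print(round(displacement_loop(100.0, rounds=10), 1))
```

Under these assumptions the wage base contracts a little faster each round, which is the compounding dynamic described above.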
Now look at what just happened. After Anthropic announced Claude's new code-security update, cybersecurity stocks sold off sharply.
Why? Because investors immediately realized AI is now automating tasks inside cybersecurity itself, a sector that was considered protected from automation risk.
That is the displacement effect spreading beyond software into security, compliance, and enterprise defense roles. This is how the loop expands.
Now add credit.
US private credit has grown to over $2 trillion. Many deals in software and tech were priced assuming stable recurring revenue. If AI compresses pricing power, those revenue assumptions weaken.
If defaults rise, private credit funds, insurers, and pension holders face pressure.
Now look at housing. The US mortgage market is around $13 trillion. Mortgages are underwritten on stable income assumptions.
If high-income job stability weakens, lenders tighten standards. If housing demand weakens while incomes are under pressure, prices soften.
That affects household wealth and confidence.
Now look at government revenue. The US government collects most of its money from payroll taxes and income taxes. If labor income shrinks while capital income rises, the tax base shifts toward sources that are taxed at lower rates and collected less reliably than wages.
At the same time, displaced workers need support. That means higher spending and lower tax receipts at the same time.
This is different from 2008. 2008 was a financial system failure. This would be a labor structure shift. In past tech revolutions, new jobs replaced old jobs quickly.
The risk here is speed.
If AI replaces high-income cognitive roles faster than new high-income roles are created, spending drops before the system adapts.
This does not guarantee collapse. But it increases the risk of:
• weaker discretionary spending
• pricing pressure in software and services
• rising stress in private credit
• housing sensitivity in high-income cities
• fiscal deficits widening during a productivity boom
AI can be extremely productive and still create short-term economic instability. The system was built on scarce human intelligence.
If intelligence becomes cheap and widely available through machines, the value of human labor changes. Markets will reprice that shift.
If you want to track whether this risk is building, watch simple data:
• white collar job openings
• wage growth in high-income sectors
• enterprise software renewal pricing
• private credit default rates
• housing delinquencies in tech-heavy metros
• payroll tax receipts
If those weaken while AI capability keeps accelerating, the displacement scenario becomes real.
The risk is not AI failing. The risk is AI working faster than institutions can adjust.