
Is There An A.I. Bubble? (And Why It Matters To All Of Us)

I’ve been hearing a lot about a possible “A.I. Bubble” lately. Is this in your awareness as well?

Whether it’s on your radar or not, we need to talk about it, because these financial “bubbles” matter in ALL of our lives — they decide who wins, who loses, and what kind of future could become ‘normal’ without our consent.

Let’s start by understanding why that’s the case.

Why We Need To Talk About This

How many of you lived through the dot-com bubble in the late nineties? I see those hands! I was there, too, but honestly I wasn’t super aware of what was happening; I was kinda busy graduating high school and starting college. So this felt like an important topic for me to dig into and understand better myself.

Like a lot of the things we discuss on Hello Tomorrow, bubbles felt to me like a complex event, almost certainly a system problem, that I’ve heard people reference flippantly many times while my “spidey sense” told me they didn’t really understand what they were talking about.

Here’s what I learned in making this episode: while financial bubbles do feel like some kind of inaccessible video game being played by the holders of capital (in large part because that’s exactly what they are), understanding bubbles matters because bubbles don’t just wreck markets — they really do reshape our lives.

We ALL live downstream from the games played by the holders of capital. (If you’re not super clear on that yet, I’d recommend going back to Episode 11.) Right now, the games they play largely decide which skills get rewarded, which jobs disappear, which companies survive, and who will end up holding the bag when the next bubble pops. 

In other words, if we don’t understand bubbles, we experience them as random chaos. When we do understand them, we can see the pattern — and patterns give us agency. When we can see the pattern, we can distinguish between what’s real and what’s hype.

Here’s what else I learned: bubbles are moments when the future gets “locked in” to some degree. In these moments, capital moves around really fast, rules get rewritten in a hurry, power consolidates, and defaults can “harden” before most people even realize what’s happening. 

So, understanding bubbles isn’t about “timing the market”… unless you’re sitting on a pile of cash and looking to invest, in which case, by all means, go for it. But for most of us, this is an opportunity to witness yet another once-in-a-generation event, this time with a powerful new technology.

These are unique times when a “decision window” is open and we get to decide whether we want to be passive spectators or conscious creators in helping shape what comes next.

What A Bubble Actually Is (And Isn’t)

So what is a bubble?

A bubble is what happens when a real, world-changing idea gets valued as if the future has already arrived before reality can catch up.

So, with A.I., folks with capital are valuing it like it’s already making a lot of money, even though it’s not yet.

What causes this to happen?

Well, it’s not caused by stupidity, greed, or “irrational markets.” A bubble is mostly caused by misaligned time horizons.

Humans are incredible at imagining futures, but we’re terrible at pacing ourselves. We want the shiny thing NOW. We’re a little bit more Veruca Salt than we care to admit. We treat long-term transformation like it should somehow show up on next quarter’s balance sheet.

That mismatch — between how long real change takes and how fast capital expects returns — is where bubbles are born.

Here’s how they actually form in six steps:

Step One:

Something genuinely new and exciting appears. Railroads were real. The internet was real. A.I. is real. Whatever the technology is, it really is going to change what’s possible, and we see it. This part matters: bubbles form around truth, not fiction. But then…

Step Two:

We dramatically overestimate how fast the payoff will arrive. We see the future… and completely screw up the timeline. We start pricing things as if we’ve already arrived at the destination. And then…

Step Three:

Money rushes in faster than reality can keep up. Capital outruns capability. Expectations outrun infrastructure. Hype outruns the humans. Then…

Step Four:

Speculation feeds on itself. Rising prices become proof. Belief becomes validation. Skepticism sounds boring, caution sounds weak. Reality quietly exits the building. Then…

Step Five:

Something breaks the spell. Revenue doesn’t match expectations. Costs stay high. Growth slows, just a little. And suddenly everyone asks the same question: “Wait… what are we actually paying for?” And then…

Step Six:

The story collapses. Not the technology — the fantasy. Remember the dot-com era: the internet was, and is, very real, but Pets.com still went out of business.

What Happens When The Bubble Bursts?

When a bubble ends, what actually collapses isn’t the underlying technology; it’s the story we told ourselves about that technology (usually: the idea that it would be fast, or easy, or that “everyone would win”). Businesses and investors that over-leveraged on that fantasy lose, too.

When that story breaks, one of two things usually happens.

The first option is loud

That’s the kind of collapse we remember — this is railroads, dot-com, crypto.

Prices fall hard. Companies fail publicly. Headlines scream. Trillions of dollars in paper value vanish seemingly overnight. It feels like a catastrophe, and in the headlines and for investors, sometimes it is.

But the technology survives.

Railroads still crisscross continents.

The internet still runs the world.

Crypto is still out there.

Even when a bubble pops dramatically, the fantasy story may die but the infrastructure remains.

That’s the first option. Loud.

The second option is quieter

Sometimes when a bubble pops, it just kind of… dissolves. In these cases, there’s no single dramatic crash, no obvious “popping noise.” Instead, expectations slowly reset. Funding dries up. Companies disappear or get acquired. Power concentrates. 

Electricity, automobiles, and radio were bubbles like this.

These industries didn’t explode with a bang; they reorganized. Behind the scenes, entire sectors were reshaped. Over time, jobs changed. Business models shifted. A few dominant players emerged and quietly rewrote the rules.

This part is important: for those of us who aren’t “Capital,” this version is counterintuitively more disruptive, because by the time it becomes visible to us, the defaults have already been set… by those with capital.

The big point is: all bubbles burst. Some with a bang and some… without.

So Where Are We With A.I. Right Now?

So where does A.I. sit in this cycle? 

I would say we’re probably between Steps Four and Five — we know A.I. works, and an astonishing amount of potentially reality-ignoring money has been poured into its future promise.

Conservatively, and I mean conservatively, hundreds of billions of dollars have already been invested directly into A.I. companies, models, chips, and infrastructure. The hyperscalers — Microsoft, Google, Amazon, Meta — are collectively spending $200–400+ billion per year on A.I.-related capital expenditures. That includes data centers, energy infrastructure, chip technology, and acquisitions.

When we zoom out and add committed future spend, we’re looking at north of a trillion dollars either already deployed or firmly locked in.
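To get a rough sense of how those figures stack up to “north of a trillion dollars,” here’s a back-of-envelope sketch in Python. The specific inputs (an assumed $300 billion already invested directly and an assumed three-year horizon for committed future spend) are illustrative placeholders drawn from the ranges above, not reported figures:

```python
# Back-of-envelope check on the "north of a trillion dollars" claim.
# All inputs are illustrative assumptions, not reported figures.

already_invested = 300e9  # "hundreds of billions" invested directly in A.I. companies, models, chips

# Hyperscaler A.I.-related capital expenditure, per year (the $200-400+ billion range)
annual_capex_low = 200e9
annual_capex_high = 400e9

committed_years = 3  # assumed horizon of "firmly locked in" future spend

total_low = already_invested + annual_capex_low * committed_years
total_high = already_invested + annual_capex_high * committed_years

print(f"Low estimate:  ${total_low / 1e12:.1f} trillion")   # $0.9 trillion
print(f"High estimate: ${total_high / 1e12:.1f} trillion")  # $1.5 trillion
```

Even at the conservative end, the total lands right around a trillion dollars, which is why the “massive bet” framing below isn’t hyperbole.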

That does not mean we’re “definitely in a bubble.” But it does mean we are making a massive bet on a future timeline: a timeline where productivity gains arrive quickly, costs fall fast enough, and monetization catches up to expectations.

And right now, that timeline is… optimistic. Revenue is still catching up, productivity gains are cloudy, and energy constraints are real. And a non-trivial number of startups are, frankly, “A.I. plus a pitch deck.”

Which is why this moment feels so intense, so charged, so unstable.

And that brings us to the deeper pattern underneath all of this.

What Is Creative Destruction?

There’s a great term that describes exactly what’s happening: “creative destruction.” 

Creative destruction is the idea that progress doesn’t arrive gently. Old systems don’t politely step aside… they get destroyed. Jobs disappear. Industries collapse. Skills lose value. Entire ways of organizing work and life stop making sense.

And then — sometimes — something better emerges.

When economist Joseph Schumpeter popularized the term in the 1940s, creative destruction was intertwined with capitalism as its “engine of innovation.” It’s the reason economies evolve instead of fossilize. It’s why new industries are born.

But this pattern isn’t unique to capital markets; it’s a systems pattern.

We see it in many areas of life:

  • Ecology: forests burn so ecosystems regenerate
  • Biology: extinction clears space for adaptation
  • Science: old paradigms fall apart so new models can emerge
  • Culture: movements tear down norms to create social progress
  • Politics: revolutions rewrite civic contracts

If some of this language reminds you of a Fourth Turning, I think you’re hearing the same music I am.

The important part, I think, is to remember that we are part of the dance. Our choices help create the future. And the capital systems we are choosing to run the show right now are extremely artificial — by which I mean, they are not integrated with our biological world at ALL — so there is absolutely ZERO guarantee that the creative destruction brought about by A.I. will actually emerge with any kind of goodness on the other side.

In healthy systems, like the one earth’s biosphere has been cultivating for half a billion years, destruction and regeneration are in balance.

In unhealthy systems, destruction accelerates faster than renewal.

That’s when things break.

And that’s the danger zone we’re flirting with right now.

A Destructive Creation

This is the distinction I think we desperately need: creative destruction is value neutral. It can lead to renewal or collapse. The outcome depends entirely on incentives and ownership.

But what if A.I. isn’t creative destruction, but a destructive creation?

A destructive creation would be what happens when we build astonishing tools on quarterly timelines inside extractive incentive systems without cultural, ethical, or institutional shock absorbers.

If we did that, we wouldn’t just be tearing down old forms of work. We’d be creating new systems faster than humans and our planet can adapt to them.

Faster than governance can evolve.

Faster than culture can metabolize change.

Faster than meaning can catch up.

The real risk with A.I. isn’t really that it replaces jobs… that’s been part of every major technological shift we’ve ever seen.

The real risk is that we collapse time horizons, centralize power, externalize human and planetary cost, and call the whole thing “inevitable progress.”

That wouldn’t be evolution. That would be a stress test on our global society… and it would be one most of us did not consciously agree to take.

The deepest danger of A.I. is that it lives inside an organizing story that only cares about creating capital. The tech will do what we tell it — but right now we, at the decision-making level, only care about making more capital… not because all humans feel that way but because that is what the current system demands. 

So, if nothing changes, that IS what the tech will end up being used for.

Do you see why we need a new organizing story so badly?

How This Actually Ends

Back to bubbles.

Here’s the honest truth we need to sit with.

Not all bubbles explode.

Some pop dramatically. Railroads. Dot-com. Crypto. Big headlines, big losses, seemingly overnight.

Others don’t really pop at all, but diffuse. Electricity. Automobiles. Radio. The hype quietly fades, expectations reset, consolidation happens, and the technology becomes part of the background of everyday life.

And the uncomfortable but honest answer about A.I. is: we don’t yet know which kind of bubble this will be.

What I think we know is that some kind of correction is coming. Capital moved really fast. Expectations are likely ahead of reality. Whether that correction looks like a sharp crash or a slow deflation, the fantasy will break. But remember, when it does, the technology will remain.

The bubble matters to investors. 

The after matters to all of us.

The Optimistic Rebellion

So what do we do with this? Here’s your optimistic rebellion for this week, in 3 steps…

1. Notice what you now understand

You now see something most people don’t. Most conversations about A.I. are stuck at the surface: hype vs fear, utopia vs apocalypse. You now understand the system underneath: time horizons, capital flows, incentive design, historical patterns.

That matters.

Understanding how bubbles form doesn’t make you smarter than other people, but it does make you a lot less manipulable.

2. Help others see it too

Now that you understand this, you can help others understand it. Share this episode with just one person. And even more importantly, learn to share the idea in your own words. When the “A.I. bubble” comes up, consider saying something like:

“Yeah, maybe. But bubbles aren’t really about whether the tech is real — they’re about capital getting ahead of what we can build. The bigger question is what happens after the correction, because that affects all of us.”

That’s a higher-quality conversation. And right now, we desperately need better conversations.

3. Start looking for leaders

Finally — and this part really matters — start actively looking for political candidates and leaders who can speak intelligently about this stuff.

Not with slogans. Not with panic. Not with tech-bro salvation fantasies. But with historical awareness, systems thinking, long-term time horizons, and an understanding that markets alone cannot steward civilizational technologies.

Because whether the A.I. bubble pops loudly or diffuses quietly, the downstream effects will touch labor, education, healthcare, democracy, and culture.

And in moments like these, we desperately need leaders who understand the difference between hype and transformation.

The future isn’t fragile because of A.I.; it’s fragile because of how poorly we understand the forces shaping it and the systems that will seek to control it.

Understanding the system doesn’t need to make us cynical. But it does make us responsible. 

Stay vigilant, pay attention… the future needs you.
