
Can a $500B AI budget beat 3 billion years of R&D?


Bilal Khan
October 16, 2025

Hardware muscle built the first AI giants,
But it won’t build the next trillion-dollar firms.

Today, the world equates AI development with scale:
more GPUs, more data, more training.

Indeed, this fuelled the first trillion-dollar AI winners.

But every new model burns more power, and even more money.

You’ve seen the headlines - OpenAI and Google building GPU farms big enough to have their own postcodes.

That can’t scale forever.

So where will progress come from next?

I’d bet on systems that behave less like machines,
And more like living organisms.

I’ve been waiting for AI to look to biology for answers ever since my master’s in Computational Neuroscience.

Nature never had the luxury of endless power
(and neither do we),

So it had to learn efficiency the hard way.

And it did - at a level of mastery we can barely reach today.

A jumbo jet burns through tonnes of fuel,
While a Bar-tailed Godwit can fly 11,000 km without stopping for food or rest.

That gap between evolution and engineering is exactly where AI has the most to learn.

Our brains run on about 20 watts,
Roughly the same as a bright lightbulb.

OpenAI, meanwhile, is planning data centres that would draw 17 gigawatts,
Roughly half of the UK’s electricity usage - and nearly a billion times the brain’s power budget.

That’s why I’ve been following Zuzanna Stamirowska, a good friend of mine and the founder of Pathway, with great interest.

Her team is rethinking AI by building systems that mimic the way the brain is organised:

Efficient, adaptive, and able to continuously learn without starting over.

If the last decade was about scaling AI through brute force,
the next will be about teaching it to think efficiently.

That’s how the next generation of trillion-dollar AI companies will scale.


You can find the original paper here: Pathway Research Paper
Such clarity in how they framed the problem!
