
Why AI Demand Will Exceed Anything We Can Build

Most investors are measuring the wrong thing. Here's what I learned after using Claude for everything in my investing business this week.

Let me start with something personal, because this isn't abstract for me.

I run a one-person investing and trading business. Research, analysis, content, trading, compliance, all of it. One person.

This week alone, I used Claude to:

  • Stress-test my portfolio against three Strait of Hormuz scenarios (each with its own probability weighting and sector implications)

  • Build an interactive dashboard showing stock-by-stock impact across the scenarios

  • Generate a trade flow map visualising how the Hormuz disruption cascades through global supply chains

  • Research, draft and publish two newsletters and four X threads

  • Produce two YouTube Shorts from script to final render

A year ago, I couldn't do most of that. Not because I lacked the ideas. Because I lacked the hours and the technical skills.

That gap - between what one person could do in 2024 and what one person can do in 2026 - is the most underrated economic story of this decade. And it's about to get dramatically bigger.

The Variable Most People Are Ignoring

When analysts debate AI demand, they almost always focus on adoption. How many users? What's penetration? When does the curve plateau?

That's a reasonable question for most technologies. Phones plateau. Cars plateau. Televisions plateau. Demand follows the adoption curve up, and then it flattens out.

But a small number of technologies don't work that way. They're called General Purpose Technologies - a category economists identified decades ago. The original 1992 paper by Bresnahan and Trajtenberg laid out the framework: GPTs don't just get adopted. They keep gaining new abilities over time, and they spawn entirely new industries as a byproduct.

Electricity is a GPT. So is the internal combustion engine. So is the internet.

Here's the thing that matters for investors: for a GPT, the demand curve decouples from the adoption curve. Even after "everyone has it," demand keeps growing because the technology keeps getting more capable and keeps enabling things that weren't possible before.

The best proof of this is internet traffic. US internet adoption hit ~75% around 2010 and hasn't moved much since. But global internet traffic grew roughly 1,000x between 2002 and 2022. Why? Because streaming happened. Cloud happened. Mobile happened. None of those existed at "peak adoption" - they were all downstream innovations that needed the internet as an input.
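The decoupling is easy to quantify. A quick sketch of the implied growth rate, using the rough figures above (the ~1,000x traffic multiple and the 2002-2022 window are the article's approximations):

```python
# Internet traffic grew roughly 1,000x from 2002 to 2022, while US
# adoption plateaued around 2010. What annual growth rate does a
# 1,000x multiple over 20 years imply?

years = 2022 - 2002
growth_multiple = 1000

annual_rate = growth_multiple ** (1 / years) - 1
print(f"~{annual_rate:.0%} traffic growth per year for {years} years")  # ~41%
```

Compounding at roughly 41% per year for two decades, with adoption flat for the last twelve of those years: that is what a demand curve decoupled from an adoption curve looks like.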

AI looks like that kind of technology. And what I saw in my own workflow this week is Exhibit A for why.

The Evidence Nobody Can Explain Away

Forget my personal anecdote for a moment. Look at the public data.

Claude's growth pattern tells you everything:

Anthropic's user base went from roughly 15 million to 30 million between July 2024 and July 2025. That's 2x.

But actual compute consumption - the tokens their users pushed through the models - grew roughly 50x in the same period. Anthropic's revenue went from $1B annualized in late 2024 to $30B annualized by March 2026. That's 30x in 16 months.

Claude.ai web visits went from 16 million per month in January 2025 to 164 million per month in August 2025. 10x in seven months.

And on April 1, 2026 - Anthropic publicly admitted that Claude Code users were hitting usage limits "way faster than expected."

Stop and think about what that data is actually saying. The number of users didn't explode. The amount of work those users are doing with AI exploded.

That's what I meant earlier. My user count didn't change between 2024 and 2026. I was already using Claude. What changed was what I could do with it. Research projects that used to take me a week now take an afternoon. Tasks I couldn't do at all (coding, video production, interactive dashboards) are now routine.

Every existing user of these tools is quietly becoming a 10x or 50x compute consumer without anyone adding them to a customer count.
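The per-user arithmetic is worth making explicit. A minimal sketch using the article's figures (the user counts and the 50x token multiple are approximations from public reporting, not official Anthropic disclosures):

```python
# Per-user compute growth implied by the figures above: users roughly
# doubled while total token consumption grew ~50x, so consumption per
# user grew ~25x over the same year.

users_2024, users_2025 = 15e6, 30e6   # approximate user counts
token_multiple = 50                   # approximate total compute growth

user_growth = users_2025 / users_2024
per_user_growth = token_multiple / user_growth

print(f"User growth: {user_growth:.0f}x")                   # 2x
print(f"Per-user compute growth: {per_user_growth:.0f}x")   # 25x
```

An adoption chart captures the 2x. It is blind to the 25x.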

And then there's the coding data:

In 2025, new iOS app submissions jumped 24% year-on-year - the first meaningful increase since 2016. December 2025 alone was up 60% year-on-year. Appfigures specifically attributed this to "agentic coding" - tools like Claude Code, Cursor, and Replit letting people who couldn't build software build software.

The consumer AI app category hit $3 billion in revenue by end of 2025, up 273% year-on-year. (Source: Appfigures, Sensor Tower, The Decoder.)

These aren't Anthropic's customers. These are the downstream effects - people using AI tools to build other things that then create their own demand for more AI. That's the second-order demand that never shows up in adoption charts because it's not part of any existing category.

Why the Infrastructure Can't Keep Up

So if demand is genuinely exploding, how are the suppliers responding?

They're spending everything they have. And it's not enough.

Company      Planned AI Capex 2026
Amazon       $200B
Alphabet     $185B
Microsoft    $145B
Meta         up to $135B
Oracle       $50B
Total        ~$700B+

Roughly 75% of that is AI-specific. Call it $450-525 billion going into AI this year alone. (Source: CNBC, Axios.)

And here's the interesting part: they would spend more if they could.

SemiAnalysis, probably the most respected independent research shop covering chip supply chains, put it bluntly in early 2026: "We are now firmly in the silicon shortage phase. Hyperscalers would deploy more capital if silicon existed."

High-bandwidth memory (the specialist chips that make AI training and inference work) is supply-constrained through 2027. TSMC's advanced process nodes are bottlenecked because every major AI chip is transitioning to them at the same time. Global datacenter power demand went from 49 gigawatts in 2023 to 96 gigawatts in 2026, with AI consuming about 40 of those gigawatts and climbing.

The binding constraint on AI growth right now isn't demand. It isn't adoption. It isn't even capital. It's physical capacity - the chips, the power, the cooling, the construction schedules.

The counterintuitive part:

When DeepSeek published a model in January 2025 showing you could get competitive AI performance at roughly 1/45th the compute cost, tech pundits declared the AI buildout was over. "If it's cheaper, you need less of it."

Every hyperscaler immediately raised their capex guidance.

  • Meta went +50% to $60-65B

  • Alphabet raised to $91-93B

  • Microsoft jumped +74% quarter-on-quarter

Satya Nadella tweeted "Jevons paradox strikes again." The reference is to economist William Stanley Jevons, who noticed in 1865 that when steam engine efficiency improved, coal consumption went up, not down. Cheaper energy per unit meant more uses became economically viable, which meant more total energy consumed.

That's what's happening here. Cheaper inference doesn't reduce compute demand. It unlocks a much longer list of use cases that suddenly make economic sense. And the total consumption goes up.

The efficiency improvement is the demand expansion.
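One way to see why a price cut can raise total spending is a toy elasticity model. The ~45x cost reduction comes from the DeepSeek discussion above; the constant-elasticity demand curve and the elasticity value of 1.5 are purely illustrative assumptions, not measured parameters:

```python
# Toy Jevons-paradox model: if demand for compute is price-elastic
# (elasticity > 1), cutting the price increases total spending,
# not just total usage.

def demand(price, elasticity=1.5, scale=1.0):
    """Constant-elasticity demand: quantity = scale * price^(-elasticity)."""
    return scale * price ** -elasticity

old_price, new_price = 1.0, 1 / 45   # ~45x cheaper inference (illustrative)

old_q, new_q = demand(old_price), demand(new_price)
old_spend, new_spend = old_price * old_q, new_price * new_q

print(f"Usage grows {new_q / old_q:.0f}x")              # ~302x more compute
print(f"Spending grows {new_spend / old_spend:.1f}x")   # ~6.7x more total spend
```

With any elasticity above 1, the spending line goes up when price goes down. That is the mechanism behind "the efficiency improvement is the demand expansion."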

The Bear Case Worth Taking Seriously

I don't want to sound like every AI thesis needs to be bullish. Let me steelman the other side.

Goldman Sachs published research saying AI contributed "basically zero" to US GDP in 2025. That's a real number from a real institution. It should bother you.

Surveys also consistently show that only around 26% of companies report tangible value from AI deployments, and enterprise AI implementation failure rates are cited in the 70-85% range.

Microsoft has paused planned data centre projects in some regions. Yale's business school published an analysis that was blunt: "inflection bubbles are real, the timing is the key unknown."

These aren't cranks. They're serious people raising serious questions.

Here's my honest response.

Erik Brynjolfsson, one of the most cited economists on productivity and technology, has a concept called the "Productivity J-Curve." He argues that GPTs consistently show flat or negative measured productivity in their early adoption phases because businesses have to restructure themselves around the new technology before the gains show up in the data. The benefits of electrification didn't appear in productivity statistics for roughly two decades after factories started electrifying. That doesn't mean the benefits weren't real. It means the measurement system lagged the actual impact.

His 2025 paper in the Quarterly Journal of Economics documented an average 14-15% productivity improvement for workers using generative AI, with the largest gains concentrated in junior or lower-skilled workers. That's a specific, measured effect that's happening right now in controlled studies.

So both things can be true. AI can be contributing "basically zero" to measured aggregate GDP while simultaneously driving 50x growth in real compute consumption and 60% spikes in iOS app creation. Those aren't contradictions. The first one is a measurement problem. The second one is the underlying reality.

The bear case is: what if the measurement problem goes on forever, the capital gets stranded, and the hyperscalers are over-building into a fantasy?

The counter: the suppliers are supply-constrained. This isn't what a speculative overbuild looks like. In a bubble, you see excess capacity chasing uncertain demand. Here you see committed customers being rationed because the hardware physically doesn't exist yet.

You can hold both of these thoughts. The bear case is real. And the infrastructure demand is also real and growing faster than the supply can catch up.

How to Actually Position

A few thoughts on what to do about this, not as "advice" but as how I'm thinking about my own portfolio.

The application layer is where the narrative is. The infrastructure layer is where the scarcity is.

Most retail investors default to buying whatever AI application or model company is in the news. That's the layer with the most uncertainty. We don't know which chatbot wins. We don't know which enterprise tool dominates. We don't know which vertical AI agent becomes a unicorn.

But we know - with high confidence - that whoever wins will need vastly more compute than exists today. And the compute supply chain has fewer players, harder moats, and much more visible capacity constraints.

Amazon is the cleanest infrastructure play I've been thinking about.

AWS hit $35.6 billion in Q4 2025 revenue - up 24% year-on-year, its fastest growth in 13 quarters. That acceleration is partly from Anthropic (which uses AWS as its primary cloud) and partly from OpenAI's committed $100 billion in incremental AWS spend.

Amazon is trading around 27x forward earnings, which is historically on the low end of its valuation range. And unlike pure-play AI names, the rest of Amazon's business (e-commerce, advertising) isn't existentially threatened if AI disappoints. You get the AI infrastructure story with a diversified floor underneath.

The broader infrastructure stack: TSMC, HBM suppliers (SK Hynix, Micron, Samsung), power infrastructure companies. The silicon shortage is multi-year. These are supply-constrained businesses facing structural excess demand. They're not growth-at-any-price bets. They're "you physically can't make enough of what everyone wants" bets.

The compute demand thesis isn't really speculation. It's physics. You can't serve 50x token growth on flat infrastructure.

The Part That Changed How I Think

Everything above is an investment argument. Here's what I actually believe, beyond the portfolio.

The most important thing I discovered this week isn't that Amazon stock is cheap or that silicon is scarce. It's that the gap between what I could do before AI and what I can do now is not a 20% productivity improvement.

It's a category change. Tasks that I couldn't do at all (build interactive dashboards, produce videos, generate professional map graphics) are now routine afternoon work. And I'm not special. I'm not a developer. I didn't study AI. I'm just someone who decided to actually use these tools for real work.

If you're reading this and thinking "I should understand AI better as an investor," that's correct but insufficient. The bigger opportunity is using it.

Pick one thing you don't know how to do. A document analysis, a data visualization, a research project, a piece of writing, a small software tool. Something that would normally take you days or that you'd pay someone else to do. Give it to Claude or ChatGPT. Actually work with the output. Iterate. See what happens.

Most people who try AI casually never discover what it can actually do for them. The people who use it seriously figure it out within a week.

Whether that knowledge stays abstract or becomes part of how you work is the question that matters - not just for your portfolio, but for how much of the next decade you participate in.

What to Remember

  • Demand for compute is decoupling from user growth. The same users are doing dramatically more with AI than they could 12 months ago.

  • The hyperscalers know this, which is why they're collectively spending $700B+ in 2026 and still can't build fast enough.

  • The bear case is a measurement problem, not a capacity glut. Supply is the binding constraint, not demand.

  • The Jevons paradox means efficiency gains expand the demand curve rather than shrinking it.

  • The best investment exposure is in the physically constrained supply chain, not the narrative-driven application layer.

  • And beyond all the investment logic: the tool is sitting in your browser. Whether you learn to use it matters more than whether you own the stock.

Sources: Bresnahan & Trajtenberg (1992) "General Purpose Technologies, Engines of Growth?"; Brynjolfsson et al. (2025) "Generative AI at Work," QJE Vol. 140(2); Business of Apps (Claude stats); Appfigures, Sensor Tower, The Decoder (iOS data); CNBC, Axios (hyperscaler capex); SemiAnalysis (silicon shortage); DevClass (Claude Code limits); NPR (Jevons paradox); Goldman Sachs, Yale SOM (bear case)
