TL;DR
- The AI companies getting rich aren't building models — they're selling the infrastructure underneath (NVIDIA, TSMC, cloud providers)
- Model quality is converging; the real differentiator is who owns their data, compute, distribution, and feedback loop
- Vertical integration (building your own stack) is the SpaceX playbook applied to AI
- If your company's "AI strategy" is a subscription, you don't have a strategy — you have an expense
- The Stack Test: Do you own your data, compute, distribution, and feedback loop?
In 2001, Elon Musk flew to Russia to buy refurbished intercontinental ballistic missiles. He wanted to strip the warheads, add an upper stage, and use them to send a small greenhouse to Mars. The Russians kept raising the price — to around $20 million, then $21 million, and then per rocket rather than for the lot. On the flight home, Musk opened a spreadsheet, broke a rocket down into its raw materials — aluminum, titanium, carbon fiber — and calculated that the materials cost roughly 2% of what the aerospace industry was charging for a finished rocket.
The problem was never the physics. The problem was the supply chain.
Twenty years later, the same spreadsheet logic applies to AI. And almost nobody is doing the math.
The Expensive Illusion
Here's what the AI landscape actually looks like right now, underneath the hype.
OpenAI builds a model. To train that model, they rent compute from Microsoft Azure. Azure runs on NVIDIA GPUs. NVIDIA gets its chips fabricated by TSMC. TSMC needs advanced lithography machines from ASML, ultra-pure specialty materials, and enormous amounts of power.
Every layer in that stack takes a margin. Every margin makes the final product more expensive. And every dependency makes you more fragile.
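The compounding effect is easy to underestimate. A minimal sketch of how stacked margins multiply — the layer names and margin percentages below are illustrative assumptions, not reported figures from any of these companies:

```python
# Sketch: how margins compound through a supply chain.
# All margin values here are hypothetical, for illustration only.
def stacked_price(raw_cost: float, margins: list[float]) -> float:
    """Price after each layer marks up the layer below it."""
    price = raw_cost
    for m in margins:
        price *= 1 + m
    return price

# Hypothetical stack: fab -> chipmaker -> cloud -> model API
margins = [0.45, 0.55, 0.30, 0.20]
price = stacked_price(100.0, margins)
print(f"${100.0:.0f} of silicon becomes ${price:.0f} at the top of the stack")
```

Even with modest per-layer markups, the multiplication means the buyer at the top pays several times the raw input cost — which is exactly the margin a vertically integrated player gets to keep.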
OpenAI isn't building the future of intelligence. They're building a very expensive wrapper around someone else's hardware, running on someone else's cloud, powered by someone else's chips. The companies getting genuinely rich from the AI boom aren't the ones building the models — they're the ones selling the shovels. NVIDIA's market cap tells you everything you need to know about who's actually winning.
This is the dirty secret of the AI gold rush: most of the value is flowing downward through the supply chain, not staying with the model builders.
Follow The Money
Who's actually getting rich from AI? The model builders get the headlines; the supply chain gets the money. OpenAI raised roughly $13 billion to build frontier models. NVIDIA made $60 billion-plus in profit selling the industry the tools to train them. The shovels always win the gold rush. (Margins and figures are approximate, based on publicly available data.)
The First Principles Play
Elon sees this. He's seen it before, because it's the same problem he solved with rockets.
When SpaceX couldn't buy affordable rockets, they didn't negotiate harder. They built their own. When they couldn't get reliable engines, they designed Merlin from scratch. When launch costs were too high, they made rockets reusable — something the entire aerospace industry said was impossible.
Now look at what he's doing with xAI and the broader ecosystem.
He's building his own data centers. Not renting — building. The Memphis supercomputer cluster, Colossus, started with 100,000 NVIDIA H100 GPUs, doubled to 200,000 in just 92 additional days, and is now being expanded toward one million GPUs. The initial cluster went up inside an abandoned Electrolux factory in 122 days — a timeline the industry said was impossible. But the play isn't to buy more GPUs than everyone else. The play is to eventually not need them at all.
If you follow the pattern, the next moves are already happening. Tesla has signed a multibillion-dollar deal with Samsung Foundry to manufacture its next-generation AI chips at Samsung's Taylor, Texas fab, and Musk has floated a massive Tesla-owned chip fabrication facility he calls TeraFab. Vertical integration of power generation is underway — Colossus runs on Tesla Megapacks and hundreds of megawatts of gas turbines while permanent grid connections are built out. And distribution flows through X, with its hundreds of millions of users — a platform xAI owns outright after acquiring X Corp in March 2025.
Apple understood something similar about chips a decade ago. They stopped using Intel and built their own silicon, designed specifically for what their devices needed to do. The M-series chips didn't just match Intel — they destroyed the performance-per-watt curve because they weren't carrying the overhead of being general-purpose. They were built for purpose.
The irony is that Apple's first-principles thinking was pointed at the wrong problem for AI. Tim Cook's Apple optimized for battery life and mobile efficiency — exactly right for phones, exactly wrong for training language models. They designed chips to run cool in your pocket, not hot in a data center. Great engineering, wrong war.
And now Apple finds itself in an awkward position. They've got extraordinary chip design capability but no training data, no model, and a brand built on privacy that makes it nearly impossible to collect the data they'd need. So they partner with OpenAI — the tech equivalent of a luxury automaker putting someone else's engine in their car. It works, but it's not a long-term strategy.
Why This Matters Beyond Elon
I'll be honest about why this pattern obsesses me. It's because I see the small-scale version of it every single day.
I run AI transformation at Welspun One, one of India's fastest-growing logistics platforms. And the most common thing I hear from companies — the thing that makes me want to flip a table — is this: "We bought a ChatGPT Enterprise subscription. We've done AI transformation."
No. You haven't. You've rented someone else's intelligence and called it your own.
Real transformation looks like what Elon does at the infrastructure level, applied to your business. It means asking first-principles questions. Not "which AI tool should we buy?" but "what data do we actually have, what decisions does it need to inform, and what capability do we need to build internally to make that happen?"
When I built the Asset Intelligence department at Welspun, we didn't start by shopping for AI vendors. We started by mapping every data source across the organization — leasing, construction, facilities, finance — and asking why none of them talked to each other. The answer wasn't a better chatbot. The answer was a unified data infrastructure that made intelligence possible.
That's the moat. Not the AI model on top. The infrastructure underneath.
The Stack Test
Here's a simple framework for evaluating any AI company — or any company trying to use AI:
Do you own your data? Not rent access to it. Own it. Control it. Understand it deeply enough to know what's missing.
Do you own your compute? Or are you at the mercy of someone else's pricing, someone else's capacity constraints, someone else's strategic priorities?
Do you own your distribution? Can you reach your users directly, or do you need someone else's platform to deliver your product?
Do you own your feedback loop? Are you getting the data back from your users that makes your product better over time, or is that data flowing to someone else?
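The four questions above can be run as a literal checklist. A minimal sketch — the four layers come from the article, but the yes/no scoring scheme is my own simplification (a real assessment would grade each layer on a spectrum):

```python
# Sketch of the Stack Test as a checklist. The four layers are from
# the article; the boolean scoring is an illustrative simplification.
STACK_TEST = ("data", "compute", "distribution", "feedback_loop")

def stack_test(ownership: dict[str, bool]) -> tuple[int, list[str]]:
    """Return how many layers a company owns, and which ones it rents."""
    owned = sum(ownership.get(layer, False) for layer in STACK_TEST)
    rented = [layer for layer in STACK_TEST if not ownership.get(layer, False)]
    return owned, rented

# Illustrative reading of a hypothetical model builder, not audited data
score, gaps = stack_test({"data": True, "compute": False,
                          "distribution": False, "feedback_loop": True})
print(f"Owns {score}/4; rents: {', '.join(gaps)}")
```

The useful output isn't the score — it's the `rents` list, because every entry in it is a margin you're paying and a dependency that can be repriced or cut off.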
Apply this test to the major AI players and the picture gets uncomfortable. OpenAI fails on compute and distribution. Google passes on infrastructure but their AI still plays second fiddle to their advertising business. Apple fails on data and models.
Elon? He's methodically checking every box. Custom data centers built in record time. Custom chips being fabricated with Samsung. Owned distribution through X, which xAI acquired in March 2025. A feedback loop from hundreds of millions of users. Tesla energy infrastructure powering the compute. And a track record of collapsing supply chains until the margin belongs to him.
The Uncomfortable Truth
The companies that survive the next decade of AI won't be the ones with the best models. Models are converging. The gap between the best and the fifth-best is shrinking every quarter. The real differentiator is shifting from intelligence to intimacy — the company that wins may be the one you simply can't leave.
What isn't converging is infrastructure. Ownership. Vertical integration. The ability to operate when the supply chain gets disrupted, when compute costs spike, when your cloud provider decides to compete with you.
This is true at the industry level, and it's true at the company level. If your AI strategy is "subscribe to the best tool," you don't have a strategy. You have an expense.
The companies that win will be the ones that did what Elon did on the flight home from Russia. They'll open a spreadsheet, look at the raw inputs, and ask: why am I paying someone else's margin on every single layer?
And then they'll start building.
First Principles Framework
The AI Moat Test
Score any company — or your own — on the five things that actually determine survival. It's a framework for strategic thinking, not an audit; the scores reflect one analyst's opinion.
I'm not saying every company needs to fabricate its own chips. I am saying that the question "what do we own versus what do we rent?" is the most important strategic question in AI right now, and almost nobody is asking it.
This analysis is part of a larger framework. See The AI Shakeout for the full interactive editorial covering consolidation, moats, and who survives.