OpenAI CEO Sam Altman says scaling up compute will drive both AI breakthroughs and the company's revenue.
In a recent blog post tied to Nvidia's planned $100 billion investment, Altman argues that more compute is the key to unlocking AI's full potential.
"Maybe with 10 gigawatts of compute, AI can figure out how to cure cancer. Or with 10 gigawatts of compute, AI can figure out how to provide customized tutoring to every student on earth. If we are limited by compute, we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build," he writes.
Altman describes a vision of an "AI factory" capable of delivering a gigawatt of new infrastructure every week. He admits this is "extremely difficult" and will take years, but he believes "if AI stays on the trajectory that we think it will, then amazing things will be possible." As Altman puts it, "increasing compute is the literal key to increasing revenue."
This perspective reflects a view in the industry that the real moat in AI isn't the models themselves, but the infrastructure required to run them at scale. Companies need massive compute both to serve huge numbers of users at low cost and to handle energy-hungry models that can spend hours or even days on a single problem.
Altman also warns that the US is losing ground: other countries, likely China among them, are moving "much faster" to invest in chip manufacturing and new energy.
The closed-loop economics behind Big Tech's AI boom
Some critics in editorials and on social media point to the OpenAI-Nvidia partnership as a textbook case of circular investment. Nvidia puts billions into startups like OpenAI, but only if that money is spent on Nvidia hardware. The cash never really leaves Nvidia's ecosystem, and when it returns as revenue, it becomes worth even more to Nvidia, since capital markets value each dollar of reported sales at a multiple.
Other tech giants use similar playbooks. Microsoft's investment in OpenAI is closely linked to Azure cloud budgets, while Amazon and Google secure their stakes in startups like Anthropic by delivering most of the required cloud and chip infrastructure themselves.
At first glance, all this activity looks like booming market demand. But much of the headline growth comes from internal capital loops and exclusive supply agreements. For independent players without deep-pocketed investor-suppliers, it's getting harder to compete.
The cycle is reminiscent of the dotcom bubble of the late 1990s, when telecom vendors used vendor financing to inflate demand that later collapsed, leaving behind excess capacity and battered balance sheets. Still, there is an important difference: today's AI companies are already generating real revenue, and their products are actually selling, so the dynamics aren't quite the same.