
“We have now grown our data center business by approximately 13x since the emergence of ChatGPT in fiscal 2023,” Nvidia CFO Colette Kress said on the company’s earnings call Wednesday.
While impressive, the number is hardly surprising: global AI spending is expected to reach $2.5 trillion this year, and Nvidia’s biggest customers, the major AI hyperscalers Amazon, Alphabet, Meta, and Microsoft, all reported record capital spending earlier this month.
Hyperscalers also made surprise financial commitments of nearly $700 billion for 2026, unnerving investors who are growing wary of AI spending.
Earlier this month, Evercore analysts warned that heavy capital spending could make hyperscalers’ cash flows negative.
And despite record multibillion-dollar commitment after commitment to expand AI infrastructure and drive adoption of the technology across the U.S. economy, the returns have yet to materialize. A Goldman Sachs analyst recently said that the contribution of AI to U.S. GDP in 2025 was “essentially zero.”
Nvidia CEO Jensen Huang spent much of his time in the investor call justifying that capital spending increase.
“I believe their cash flow is growing, and the reason is very simple: We have now seen the transformation of agentic AI and the utility of agents in enterprises around the world,” Huang said.
Enterprise adoption of AI beyond the tech world, and whether those companies actually see real productivity gains and revenue from integrating it, matters enormously to Nvidia: demonstrable returns are precisely what the AI industry still lacks to ease concerns over an AI bubble.
A recent survey found that although 70% of companies now use AI, more than 80% reported no impact on employment or productivity.
Last week, OpenAI COO Brad Lightcap told TechCrunch that his company “hasn’t really seen enterprise AI enter the enterprise business process.”
Some experts believe that Anthropic’s Claude Cowork, unveiled earlier this month, will prove a turning point in AI’s penetration of the workforce, so much so that they expect it to trigger a massive extinction-level event for software companies, and perhaps for white-collar work as well. Huang, too, singled out Claude Cowork for praise on the call.
Huang also had a technical explanation to justify the capital expenditure commitments.
“In this new world of AI, computation equals revenue,” Huang said, a phrase he repeated several times during the call. Huang argues that tokens, that is, the chunks of data that AI models process, are the most important part of the new AI economy. The more tokens a model uses, the more computing power and time it requires. So, as models become more complex, computing demands are also increasing “exponentially,” Huang said. He argued that capital expenditure commitments would go toward building this compute capacity, which would thus power higher-level models and translate into revenue.
“The token production capacity that the world needs is enormous, over $700 billion, and I am confident that we will continue to generate tokens… fundamentally because every single company depends on software, every software will depend on AI, and so every company will produce tokens,” Huang said. “If new software requires tokens to be generated and tokens are monetized, it stands to reason that their data center build-out directly increases their revenue.”
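Huang’s “computation equals revenue” argument is, at bottom, simple arithmetic: tokens served times the price per token. The sketch below illustrates that logic with purely hypothetical numbers; none of the figures, function names, or prices come from Nvidia or the call.

```python
# Back-of-envelope sketch of Huang's "computation equals revenue" logic.
# All numbers here are hypothetical illustrations, not figures from Nvidia.

def projected_daily_revenue(tokens_per_day: float, price_per_million_tokens: float) -> float:
    """Daily revenue if every token generated is monetized at a flat rate."""
    return tokens_per_day / 1_000_000 * price_per_million_tokens

# A hypothetical fleet serving 1 trillion tokens/day at $2 per million tokens:
daily = projected_daily_revenue(1e12, 2.0)
print(f"${daily:,.0f} per day")  # $2,000,000 per day

# The other half of the argument: more complex "reasoning" models consume far
# more tokens per request, so compute demand (and, under this logic, revenue)
# scales with model complexity. Illustrative per-request token counts:
tokens_per_request = {"simple completion": 500, "reasoning model": 20_000}
for name, tokens in tokens_per_request.items():
    print(f"{name}: {tokens:,} tokens/request")
```

The sketch captures the claim’s structure, not its validity: it assumes every generated token is actually monetized, which is exactly the premise skeptics of the capital-spending boom question.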
Huang’s justification did not immediately reassure the market. Shares initially rose in response to the report, but after the call the gains shrank to less than 1%, even though revenue exceeded market expectations.
OpenAI and China remain blind spots
Throughout the call, Huang also tried to address rumors of a rift with OpenAI, which first surfaced after a $100 billion Nvidia investment announced in September 2025 reportedly stalled in its early stages months later. Two back-to-back reports then claimed that Huang was privately criticizing OpenAI’s business approach, while OpenAI was unhappy with the projected speeds of Nvidia’s chips.
In the call on Wednesday, Huang repeatedly praised the AI giant’s offering, but revealed that the investment has still not been finalized.
“We continue to work with OpenAI toward a partnership agreement and believe we are close,” Huang said on the call. In its filing, however, Nvidia declined to provide assurance that “the transaction will be completed.”
Another source of uncertainty weighing on Nvidia is China. The company said that, as of this month, the Trump administration has finally cleared it to begin shipping small quantities of its H200 chips to China, where Nvidia once held a 95% market share before Trump banned the chipmaker’s sales there, sparking a saga of tit-for-tat trade moves between the two global superpowers. But it remains unclear whether Chinese officials will allow the imports, and Nvidia is not factoring them into this year’s expected revenue.