Nvidia CEO Rebuts Report That $100B OpenAI Deal Has Stalled
Recommended for you

Nvidia and Other Tech Players Reportedly in Talks to Invest in OpenAI
Several major technology companies, led by a prominent chipmaker, are reportedly exploring minority investments in OpenAI, signaling renewed strategic capital flows into leading generative-AI developers. The reported interest, which may include very large single-source commitments, would be structured to preserve OpenAI’s operational control while tightening commercial ties around chips, cloud and distribution.

Nvidia Pushes Back on OpenAI Rift as AI-Fueled Selling Drags Software and Asset Managers
Nvidia’s CEO publicly pushed back on reports that a high-profile investment framework with OpenAI had broken down, stressing that the talks were being mischaracterized and that any early memorandum was nonbinding. Markets nonetheless punished software and asset-management names as investors and credit desks repriced the prospect that generative AI will compress incumbent software economics and raise credit risk in private-credit books.

NVIDIA Pulls Back From OpenAI and Anthropic Investments
NVIDIA signaled it will step back from further headline private-equity placements in OpenAI and Anthropic, citing closing IPO windows and strategic ecosystem goals, though company spokespeople also emphasized that earlier memoranda were nonbinding and that the company still expects to participate in ongoing financing discussions in unspecified forms. The move looks less like an outright retreat than a reallocation of capital toward supply-chain and capacity anchoring (public stakes, the CoreWeave commitment) while minimizing large balance-sheet equity exposure amid rising policy and procurement scrutiny.

OpenAI posts 900M weekly users and secures $110B private round
OpenAI says it now reaches about 900M weekly users and roughly 50M paid subscribers, driven in part by a locally priced India tier. Multiple outlets report a very large private financing with an opening tranche near $100–110B and strategic talks with Amazon, NVIDIA and SoftBank, though sources diverge on how much of that headline amount is binding versus illustrative.

Nvidia Faces Market Stress Test As Cloud Players Build Their Own AI Chips
Nvidia heads into earnings under intense scrutiny as analysts expect quarterly revenue of roughly $66.16B and continued high margins, while cloud providers accelerate in-house AI chip programs and TSMC capacity limits cap the upside. Recent industry moves, from Broadcom’s commercial tensor-processor push to Nvidia’s portfolio reshuffle and a public clarification from CEO Jensen Huang on OpenAI financing, sharpen near-term questions about supply timelines, commercial exclusivity and who captures the next wave of inference demand.

OpenAI closes in on $100B-plus funding; valuation may exceed $850B
OpenAI is finalizing an initial tranche of a landmark financing expected to exceed $100 billion, which would push its pro forma value above $850 billion on a pre-money valuation near $730 billion. Industry sources say talks with strategic backers, including advanced discussions with SoftBank over an incremental commitment roughly in the $30 billion range, could anchor the round, though no binding agreements have been announced.

Altman’s High-Stakes Wager: OpenAI’s Trillion-Dollar Buildout, Hiring Pullback, and the Reality Check on AI-Driven Deflation
OpenAI is pressing ahead with an extraordinary infrastructure build while trimming hiring as cash outflows mount, betting that cheaper inference and broader automation will compress prices. Industry signals — from $1.5 trillion-plus global infrastructure spending to investor scrutiny and warnings about concentrated supplier power — complicate the path from capacity to economy‑wide deflation.

Nvidia CEO Argues AI Expansion Will Cut Energy Costs Over Time
Nvidia’s CEO says the current surge in AI compute will raise electricity use in the near term but argues that hardware, software and grid-level innovations will lower per-unit energy and compute costs over time. The claim hinges on sustained investment, faster deployment of efficient accelerators, and coordinated grid upgrades amid risks from permitting, supply‑chain constraints and uneven demand.