Nvidia is usually the company other companies have to answer to, not the other way around. But on Tuesday, the $4 trillion chipmaker did something unusual: It took to X to publicly defend itself after a report suggested that one of its biggest customers, Meta, is considering shifting part of its AI infrastructure to Google's in-house chips, known as TPUs.
The catalyst was a report from The Information claiming that Google has been pitching its TPUs to outside companies including Meta and several major financial institutions. Google already rents these chips to customers through its cloud service, but expanding TPU use into customers' own data centers would mark a major escalation of its rivalry with Nvidia.
That was enough to rattle Wall Street, and Nvidia itself.
“We’re delighted by Google’s success—they’ve made great advances in AI, and we continue to supply to Google,” Nvidia wrote in a post on X. “Nvidia is a generation ahead of the industry—it’s the only platform that runs every AI model and does it everywhere computing is done.”
It's not hard to read between the lines. Google's TPUs may be gaining traction, but Nvidia wants investors, and its customers, to know that it still sees itself as unstoppable.
Brian Kersmanc, a bearish portfolio manager at GQG Partners, had predicted this moment. In an interview with Fortune late last week, he warned that the industry was beginning to recognize Google's chips as a viable alternative.
“Something I think was very understated in the media, which is fascinating, but Alphabet, Google’s Gemini 3 model, they said that they use their own TPUs to train that model,” Kersmanc said. “So the Nvidia argument is that they’re on all platforms, while arguably the most successful AI company now, which is [Google], didn’t even use GPUs to train their latest model.”
Why Google suddenly matters again
For much of the past decade, Google's AI chips were treated as a clever in-house tool: fast, efficient, and tightly integrated with Google's own systems, but not a real threat to Nvidia's general-purpose GPUs, which command more than 90% of the AI accelerator market.
Part of that is architectural. TPUs are ASICs, custom chips optimized for a narrow set of workloads. Nvidia, in its X post, made sure to underline the difference.
“Nvidia offers greater performance, versatility, and fungibility than ASICs,” the company said, positioning its GPUs as the universal option that can train and run any model across cloud, on-premise, and edge environments. Nvidia also pointed to its latest Blackwell architecture, which it insists remains a generation ahead of the field.
But the past month has changed the tone. Google's Gemini 3, trained entirely on TPUs, has drawn strong reviews and is being framed by some as a true peer to OpenAI's top models. And the idea that Meta could deploy TPUs directly inside its data centers, reducing its reliance on Nvidia GPUs in parts of its stack, signals a potential shift that investors have long wondered about but hadn't seen materialize.
Meanwhile, the Burry feud escalates
The defensive posture wasn't limited to Google. Behind the scenes, Nvidia has also been quietly fighting on another front: a growing feud with Michael Burry, the investor famous for predicting the 2008 housing collapse and a central character in Michael Lewis's classic The Big Short.
After Burry posted a series of warnings comparing today's AI boom to the dotcom and telecom bubbles, arguing that Nvidia is the Cisco of this cycle, meaning it similarly supplies the hardware for the build-out but could suffer severe corrections, the chipmaker circulated a seven-page memo to Wall Street analysts specifically rebutting his claims. Burry himself published the memo on Substack.
Burry has accused the company of excessive stock-based compensation, inflated depreciation schedules that make data center build-outs appear more profitable, and enabling “circular financing” in the AI startup ecosystem. Nvidia, in its memo, pushed back line by line.
“Nvidia does not resemble historical accounting frauds because Nvidia’s underlying business is economically sound, our reporting is complete and transparent, and we care about our reputation for integrity,” it said in the memo, which Barron's was first to report.
