Meta has announced a sweeping ‘multigenerational’ deal with Nvidia, deepening the Facebook parent’s reliance on the AI chip giant. According to a report by Business Insider, the agreement will see Meta build data centers powered by millions of Nvidia’s current and next-generation chips for both AI training and inference. The move also underscores Meta’s commitment to Nvidia even as it develops in-house chips and explores alternatives such as AMD processors and Google’s TPUs.
Why Meta-Nvidia deal is bad news for Intel and AMD
The Meta-Nvidia deal is significant because it goes beyond GPUs. Meta will also deploy Nvidia’s CPUs, a domain traditionally dominated by Intel and AMD. CPUs handle the general-purpose computing tasks in data centers, whereas GPUs are designed for specialised workloads such as AI training. By sourcing both from Nvidia, Meta reduces complexity and consolidates its AI stack under a single vendor. Analysts say this approach could squeeze Intel and AMD’s market share, especially as Nvidia markets its forthcoming Vera CPU as a stand-alone product. With CPUs playing a larger role in AI inference, where efficiency and cost matter, Nvidia’s expansion into this space threatens to erode the long-standing dominance of Intel and AMD.

Along with the chips, Meta will also use Nvidia’s networking equipment and confidential computing technology to run AI features inside WhatsApp. The companies plan to deploy Nvidia’s next-generation Vera CPUs beyond the current Grace CPU, further cementing Nvidia’s role in Meta’s ecosystem.
Analysts weigh in
* Patrick Moorhead, Moor Insights & Strategy: The deal may cool speculation about Meta’s TPU talks with Google, but it highlights Nvidia’s growing role in AI infrastructure.
* Rob Enderle, Enderle Group: CPUs are cheaper and more power-efficient for inference, making Nvidia’s dual offering attractive. CIOs also prefer a “one-throat-to-choke” model, simplifying vendor management.