
Inside Nvidia’s ‘grid-to-chip’ vision: How Vera Rubin and Spectrum-XGS push toward AI giga-factories

Nvidia will be front-and-center at this week’s Global Summit for members of the Open Compute Project (OCP), emphasizing its “grid-to-chip” philosophy.

The company is making announcements on several fronts, including the debut of Vera Rubin MGX, its next-gen architecture fusing CPUs and GPUs, and Spectrum-XGS Ethernet, a networking fabric designed for “giga-scale” AI factories.


It’s all part of a bigger play by Nvidia to position itself as connective tissue throughout the AI tech stack, embedding itself in every layer, from chips and networking to full data center infrastructure and software orchestration.

“Data centers are evolving toward giga scale,” said Nvidia senior product marketing manager Joe Delaere ahead of the event. “AI factories that manufacture intelligence generate revenue, but to maximize that revenue, the networking, the compute, the mechanicals, the power and the cooling, all have to be designed as one.”

Putting numbers on next-gen Vera Rubin infrastructure

Nvidia will provide more detailed specifications for its Vera Rubin NVL144 MGX-generation open architecture rack servers at the event — although the servers themselves will not be available until late 2026.

The Vera Rubin chip architecture is the successor to Nvidia’s Blackwell. It is purpose-built for “massive-context” processing to help enterprises dramatically speed AI projects to market.

Vera Rubin MGX brings together Nvidia’s Vera CPUs and Rubin CPX GPUs, all using the same open MGX rack footprint as Blackwell. The system allows for numerous configurations and integrations.

“MGX is a flexible, modular building block-based approach to server and rack scale design,” Delaere said. “It allows our ecosystem to create a wide range of configurations, and do so very quickly.”

Vera Rubin MGX will deliver almost eight times more performance than Nvidia’s GB300 for certain types of calculation, he said. The architecture is liquid-cooled and cable-free, allowing for faster assembly and easier serviceability. Operators can quickly mix and match components such as CPUs, GPUs, and storage, supporting interoperability, Nvidia said.

Matt Kimball, principal data center analyst at Moor Insights and Strategy, highlighted the modularity and cleanness of the MGX tray design.

“This simplifies the manufacturing process significantly,” he said. For enterprises managing tens or even hundreds of thousands of racks, “this design enables a level of operational efficiency that can deliver real savings in time and cost.”

Nvidia is also showing innovation with cooling, Kimball said. “Running cooling to the midplane is a very clean design and more efficient.”

With electricity supplies under increasing pressure, there’s a new trade-off between the cost of chips and their energy efficiency, making chips like Nvidia’s latest more attractive. Brandon Hoff, research director for enabling technologies at IDC, said, “You get more tokens per watt. That’s kind of where we’re ending up. People have the money, they don’t have the power.”
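To make the “tokens per watt” framing concrete, here is a minimal sketch of how a power-constrained operator might compare two rack generations. The throughput and power figures are hypothetical placeholders, not published Nvidia specifications; the point is that when power, not money, is the limit, efficiency per watt determines total site output.

```python
# Illustrative only: hypothetical throughput and power figures,
# not published Nvidia specifications.

def tokens_per_watt(tokens_per_second: float, power_watts: float) -> float:
    """Efficiency metric: inference tokens produced per watt drawn."""
    return tokens_per_second / power_watts

# Hypothetical older vs. newer rack under the same 1 MW site power budget.
site_budget_w = 1_000_000
old_rack = {"tokens_per_s": 500_000, "power_w": 120_000}
new_rack = {"tokens_per_s": 1_500_000, "power_w": 130_000}

for name, rack in (("old", old_rack), ("new", new_rack)):
    eff = tokens_per_watt(rack["tokens_per_s"], rack["power_w"])
    racks_that_fit = site_budget_w // rack["power_w"]
    site_tokens = racks_that_fit * rack["tokens_per_s"]
    print(f"{name}: {eff:.2f} tokens/W, {racks_that_fit} racks fit, "
          f"{site_tokens:,.0f} tokens/s site-wide")
```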

Dovetailing with the Vera advances, Nvidia and its partners are gearing up for the 800 VDC era. Moving from traditional 415 VAC or 480 VAC three-phase systems offers data centers increased scalability, improved energy efficiency, reduced materials usage, and higher performance capacity, according to Nvidia. The necessary infrastructure has already been adopted by the electric vehicle and solar industries.
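A rough back-of-the-envelope calculation shows why higher distribution voltage helps: for the same delivered power, current falls as voltage rises, and resistive conductor losses scale with the square of the current. The sketch below uses hypothetical power and resistance values, not figures from Nvidia or the OCP specification.

```python
# Illustrative comparison of distribution current and resistive losses
# at 415 VAC (three-phase) vs. 800 VDC for the same delivered power.
# All numbers are hypothetical, chosen only to show the scaling.
import math

power_w = 1_000_000            # 1 MW rack row (hypothetical)
conductor_resistance = 0.001   # ohms per conductor run (hypothetical)

# Three-phase AC line current: I = P / (sqrt(3) * V_line-to-line * power factor)
v_ac, power_factor = 415, 0.95
i_ac = power_w / (math.sqrt(3) * v_ac * power_factor)

# DC current: I = P / V
v_dc = 800
i_dc = power_w / v_dc

for label, current in (("415 VAC", i_ac), ("800 VDC", i_dc)):
    loss = current**2 * conductor_resistance  # I^2 * R per conductor run
    print(f"{label}: {current:,.0f} A, ~{loss / 1000:.1f} kW lost per conductor run")
```

Under these assumed numbers the 800 VDC run carries roughly 15% less current and wastes correspondingly less power in the conductors, which is one way to read Nvidia’s reduced-materials and efficiency claims.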

But the transition requires collaboration from all the layers of the stack, and Nvidia is working with more than 20 industry leaders to create a shared blueprint, it said.

Supporting ‘giga-scale’ AI super-factories

Along with Vera Rubin MGX, Nvidia will this week introduce Spectrum-XGS Ethernet support for OCP.


