What's up with Eos, Nvidia's shrinking supercomputer?


Updated Nvidia can't seem to decide how big its Eos supercomputer is.


In a blog post this month revisiting Eos, the ninth-most-powerful publicly known supercomputer on last fall's Top500 ranking, the GPU slinger said the machine was built from 576 DGX H100 systems with a total of 4,608 GPUs. That's about what we expected when the computer was announced.

While impressive in its own right, that's less than half the number of GPUs Nvidia claimed the system had in November, when it took to the web to tout the machine's performance in a variety of MLPerf AI training benchmarks.

Back then, the Eos super apparently packed 10,752 H100 GPUs, which would have spanned 1,344 DGX systems. At almost 4 petaFLOPS of frugal FP8 performance per GPU, that configuration would have been capable of 42.5 exaFLOPS of peak AI compute. Compare that to the 18.4 AI exaFLOPS Nvidia says the system can produce today, and Eos seems to have lost some muscle tone.
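The arithmetic behind those headline numbers is easy to sanity-check. A minimal sketch, assuming each H100 delivers 3,958 teraFLOPS of sparse FP8 compute (Nvidia's published spec for the SXM part, the "almost 4 petaFLOPS" figure; the exact per-GPU number Nvidia used for its totals is an assumption here):

```python
# Back-of-the-envelope check of Eos's peak AI compute figures.
# Assumes 3,958 teraFLOPS of sparse FP8 per H100 SXM (Nvidia's
# published spec); Nvidia's own totals may round differently.

H100_FP8_TFLOPS = 3_958  # teraFLOPS per GPU, sparse FP8

def peak_exaflops(gpu_count: int) -> float:
    """Peak AI compute in exaFLOPS for a cluster of H100s."""
    return gpu_count * H100_FP8_TFLOPS / 1e6  # 1 exaFLOPS = 1e6 teraFLOPS

print(f"{peak_exaflops(4_608):.1f}")   # 576 DGX H100 systems (8 GPUs each)
print(f"{peak_exaflops(10_752):.1f}")  # 1,344 DGX H100 systems
```

The two results land at roughly 18.2 and 42.6 exaFLOPS, in line with the 18.4 and 42.5 figures Nvidia has quoted for the two cluster sizes, the small gaps coming down to rounding of the per-GPU number.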
