Development of Elon Musk’s Grok 3 accelerated with the Colossus supercomputer, which used 100,000 Nvidia H100 GPUs for training.
The development cycle of Elon Musk’s “scary smart” Grok 3 was accelerated by the Colossus supercomputer, which xAI says was built in just eight months. The Tesla boss has called Grok 3 the “smartest AI on Earth.”
(Image: A 3D-printed miniature model of Elon Musk and the Grok logo, seen in an illustration taken February 16, 2025. REUTERS/Dado Ruvic/Illustration)
Grok 3 was trained on 100,000 Nvidia H100 GPUs, amounting to 200 million GPU-hours of compute, ten times what was used for Grok 2. That large increase in computational power allows the model to work through bigger datasets in less time while delivering better accuracy.
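For a sense of scale, those figures imply the following (a rough back-of-the-envelope estimate based only on the numbers reported above; the per-GPU runtime and the Grok 2 total are derived, not figures published by xAI):

200,000,000 GPU-hours ÷ 100,000 GPUs = 2,000 hours per GPU
2,000 hours ÷ 24 ≈ 83 days of continuous training per GPU
200,000,000 GPU-hours ÷ 10 = an implied ~20,000,000 GPU-hours for Grok 2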
Beyond the hardware upgrade, xAI also improved Grok 3’s capabilities by refining its training process. The updated model uses synthetic datasets, self-correction, and reinforcement learning to enhance its performance.
This is a developing story. Stay tuned for the latest updates.