Billionaire Elon Musk uses 100,000 NVIDIA H100 GPUs to train AI

Elon Musk’s xAI is making significant strides in artificial intelligence with Grok 3, its large language model (LLM). The company is reportedly using an astounding 100,000 NVIDIA H100 GPUs to train the model, far exceeding the 40,000 A100 GPUs reportedly used to train OpenAI’s GPT-4 and showcasing the ambitious scale of xAI’s endeavors.

Additionally, Musk has announced Grok 2, an upgraded successor to the existing Grok and Grok 1.5 models, scheduled for release in August. At the same time, he has already begun promoting Grok 3, touting it as a superior product.

The 100,000-GPU cluster illustrates xAI’s commitment to building a model that can surpass current competitors, despite training costs estimated at up to 3 billion USD. In a related move, Musk has disclosed plans to acquire NVIDIA’s next-generation Blackwell AI accelerators for xAI, in a deal valued at up to 9 billion USD. Together, these purchases demonstrate the scale of xAI’s investment in hardware as it positions itself as a formidable player in an increasingly competitive AI landscape.
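As a rough sanity check on that 3 billion USD figure, a back-of-envelope calculation puts the hardware bill alone in the same range. The per-unit price below is an assumption (commonly cited H100 prices, not a figure from the article), and the sketch ignores power, networking, and data-center costs:

```python
# Back-of-envelope check of the reported ~3 billion USD training cost.
# ASSUMPTION: H100 unit prices of roughly 25,000-40,000 USD, a commonly
# cited retail range; the article does not state a per-GPU price.
GPU_COUNT = 100_000
UNIT_PRICE_LOW_USD = 25_000
UNIT_PRICE_HIGH_USD = 40_000

low = GPU_COUNT * UNIT_PRICE_LOW_USD
high = GPU_COUNT * UNIT_PRICE_HIGH_USD

# Hardware alone lands at 2.5-4.0 billion USD, bracketing the
# article's estimate (which also covers the training run itself).
print(f"GPU hardware alone: {low / 1e9:.1f}-{high / 1e9:.1f} billion USD")
```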

These developments have attracted considerable interest in the technology community, and whether Grok 3 can live up to Musk’s description remains a subject of great anticipation.
