Elon Musk Claims He Just Activated the World's Most Powerful AI Supercomputer

Behold Colossus: Elon Musk's new supercomputer, allegedly powered by a staggering 100,000 Nvidia AI chips — more than any other single AI system on the planet, if the claim holds up.

"Colossus is the most powerful AI training system in the world," Musk said in a tweet.
The supercomputer is built with Nvidia H100 graphics processing units, which are the industry's most coveted pieces of hardware for training and running generative AI systems, such as AI chatbots and image generators.
And xAI's current tally is just the beginning. Musk claimed that, in a few months, Colossus will "double" in size to 200,000 AI chips, including 50,000 H200 GPUs — a newer model that Nvidia says offers nearly twice the memory capacity of its predecessor and 40 percent more bandwidth.
Fast Learners
Musk founded xAI only last summer; its flagship product is Grok, a foul-mouthed AI chatbot integrated into X-formerly-Twitter.

As Fortune notes, Nvidia sees Musk as one of its best customers, since he'd already bought tens of thousands of GPUs for Tesla — about $3 to $4 billion worth — before branching out with xAI.
Some of those chips, originally intended to train Tesla's Full Self-Driving system, would be used to train an early version of Grok.
To secure this latest wealth of 100,000 H100 GPUs, Musk likely had to spend billions more, with each AI chip fetching a price of around $40,000. Luckily for him, xAI raised around $6 billion in a May funding round, thanks to the backing of notable tech VC firms including Andreessen Horowitz.
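A quick back-of-the-envelope check of that "billions more" figure, assuming the article's rough $40,000 per-unit H100 price (actual pricing varies by volume and vendor):

```python
# Rough estimate of xAI's hardware outlay for Colossus.
# Both figures are approximations taken from the article.
H100_UNIT_PRICE_USD = 40_000   # approximate street price per H100
GPU_COUNT = 100_000            # H100s reportedly in Colossus

total_usd = H100_UNIT_PRICE_USD * GPU_COUNT
print(f"Estimated GPU cost: ${total_usd / 1e9:.1f} billion")  # → $4.0 billion
```

That ~$4 billion covers the GPUs alone, before networking, power, and facility costs — consistent with xAI's $6 billion raise being largely spoken for.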
Claim to the Throne
The monstrous supercomputer's launch, however, was preceded by controversy. Last week, Memphis locals who live near the Tennessee data center complained about "untenable levels of smog" created by the supercomputer, which could augur further disputes over the xAI facility down the line.

And Colossus's claim to the throne may be short-lived, as rivals race to amass even larger GPU fleets. Microsoft, for example, reportedly aims to stockpile 1.8 million AI chips by the end of the year (though this number sounds highly optimistic, if not infeasible).
In January, Mark Zuckerberg signaled that Meta intends to buy an additional 350,000 Nvidia H100s by the same deadline.
For now, though, Colossus remains a singular statement of raw computing power. According to Fortune, it'll be put to use to train Grok-3, which Musk aims to release in December.