AsianFin – Elon Musk's artificial intelligence startup, xAI, has announced plans to scale up its Colossus supercomputer by tenfold, integrating over one million GPUs in a bid to close the gap with competitors like Google, OpenAI, and Anthropic.
Earlier this year, Colossus, which was built in just three months, was hailed as the world's largest AI supercomputer, comprising a cluster of more than 100,000 interconnected Nvidia GPUs. These chips are used primarily to train Musk's chatbot, Grok.
With this expansion, xAI aims to strengthen its AI capabilities, positioning itself to compete more aggressively in the rapidly evolving field of artificial intelligence.