xAI records 11% GPU utilization in recent AI model training

6 articles · Updated · The Information · Apr 29
  • xAI, which operates around 500,000 Nvidia GPUs, achieved only 11% model FLOPs utilization (MFU) in recent weeks, according to an internal memo.
  • The low utilization highlights industry-wide challenges in maximizing GPU efficiency during bursty AI model training, often due to memory bandwidth limitations and complex networking across large GPU fleets.
  • This inefficiency partly explains xAI's recent deal with Cursor to supplement computational needs, as achieving over 40% utilization remains difficult for most major AI developers.
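The 11% figure refers to model FLOPs utilization: the fraction of the fleet's theoretical peak compute that the training run actually achieves. A minimal sketch of the arithmetic, using the common "~6 FLOPs per parameter per token" training approximation; the model size, throughput, and per-GPU peak below are illustrative assumptions, not figures from the memo:

```python
# Sketch: estimating model FLOPs utilization (MFU) for a training run.
# All numbers here are illustrative assumptions, not xAI's reported figures.

def mfu(params: float, tokens_per_sec: float,
        num_gpus: int, peak_flops_per_gpu: float) -> float:
    # Training cost is commonly approximated as ~6 FLOPs per parameter
    # per token (forward + backward pass).
    achieved_flops = 6 * params * tokens_per_sec
    peak_flops = num_gpus * peak_flops_per_gpu
    return achieved_flops / peak_flops

# Hypothetical example: a 300B-parameter model on 500,000 GPUs,
# each with ~989 TFLOPS dense BF16 peak (H100-class hardware).
u = mfu(params=300e9, tokens_per_sec=3e7,
        num_gpus=500_000, peak_flops_per_gpu=989e12)
print(f"MFU: {u:.1%}")  # → MFU: 10.9%
```

With these assumed inputs the fleet delivers roughly a tenth of its paper capacity, which is why memory bandwidth and interconnect stalls, rather than raw GPU count, dominate the efficiency picture at this scale.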