NVIDIA tests mini AI data centres at homes and small businesses
Tom's Guide · May 7
The trial with startup Span uses XFRA nodes installed alongside electrical panels and HVAC systems to tap unused residential power capacity for AI workloads.
NVIDIA says shifting computing from centralised cloud servers to local systems could cut costs, lower latency, improve privacy and support always-on autonomous AI agents.
The move aligns with a wider industry push toward on-device and edge AI, as companies seek alternatives to energy-hungry data centres and greater control over user data.
If every home becomes a mini data centre, who will ultimately control the AI agents and the data they process?
Can our current electrical grids actually support millions of home-based AI supercomputers running at the same time?
Will personal AI supercomputers create a new digital divide between those who can afford them and those who cannot?
Piloting Distributed AI Compute in 100 Homes: The Next Frontier for Edge Infrastructure and Smart Living
Overview
In Q3 2026, NVIDIA, Span, and PulteGroup will launch a pilot programme to bring advanced AI compute directly into residential homes and small businesses in the U.S. The initiative will install XFRA nodes (mini AI data centres), creating a distributed network for artificial intelligence workloads. The initial rollout will target about 100 homes in the southwestern U.S. By placing compute closer to end users, the approach sidesteps major challenges faced by centralised data centres, such as siting, permitting, and power constraints, while aiming to deliver low-latency AI services that can scale quickly to meet growing demand.