Analysts urge enterprises to rethink AI hardware as agentic AI favors CPUs and ASICs

6 articles · Computerworld · Updated Apr 28
  • Major cloud providers such as Google, Amazon, and Microsoft are adopting their own CPUs and low-power ASICs for AI inference workloads, with 80–85% of AI tasks expected to shift to inference within 2–3 years.
  • Analysts note that CPUs are regaining importance as the orchestration layer for AI systems, while ASICs offer greater efficiency and lower cost than GPUs for specific agentic AI tasks.
  • The transition to agentic AI is driving demand for more complex infrastructure, with enterprises and AI-native firms needing to optimize for inference per watt rather than server core count (see the sketch after this list).
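A back-of-envelope way to reason about inference per watt is simply throughput divided by power draw. The Python sketch below uses made-up accelerator figures (none of these numbers come from the article) to show how a lower-power ASIC can beat a GPU on this metric even at lower raw throughput.

```python
# Hypothetical illustration of "inference per watt": throughput divided by
# power draw. All figures below are placeholder assumptions, not benchmarks.

def inference_per_watt(tokens_per_second: float, power_watts: float) -> float:
    """Tokens generated per second for each watt consumed."""
    return tokens_per_second / power_watts

# Assumed example accelerators for comparison (illustrative only).
accelerators = {
    "general-purpose GPU": {"tokens_per_second": 1200.0, "power_watts": 700.0},
    "inference ASIC":      {"tokens_per_second": 900.0,  "power_watts": 150.0},
}

for name, spec in accelerators.items():
    ipw = inference_per_watt(spec["tokens_per_second"], spec["power_watts"])
    print(f"{name}: {ipw:.2f} tokens/s per watt")
```

Under these assumed numbers, the ASIC delivers roughly 6 tokens/s per watt versus about 1.7 for the GPU despite its lower raw throughput, which is the kind of comparison the analysts suggest enterprises should be making.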
  • Is the GPU's decade-long reign over AI finally coming to an end?
  • Are AI data centers the hidden reason your personal electricity bill is skyrocketing?
  • Can the world's power grids actually support the AI revolution's insatiable energy demand?
  • Is 'inference per watt' the new metric that will decide the next trillion-dollar tech giant?
  • As AI agents multiply into the billions, who is legally responsible when they go rogue?
  • Nvidia paid $20B for a rival's tech. Is it a genius move or a desperate defense?