Anthropic explores buying AI inference chips from Fractile
The Information · May 2
The talks involve chips from the three-year-old London startup, expected next year, as Anthropic seeks another supplier alongside Google, Amazon and Nvidia.
Any deal remains early-stage and could fall through, but it would strengthen Anthropic's leverage as annual server and chip spending is projected to reach tens of billions of dollars.
Anthropic's revenue pace has tripled since late last year, worsening compute shortages that have led to customer usage limits and outages; meanwhile, AI firms are increasingly pursuing non-Nvidia inference chips to cut costs and improve margins.
Can Anthropic’s multi-chip strategy and Fractile’s SRAM tech really end the AI compute crunch, or will new bottlenecks emerge?
Could the rapid rise of startups like Fractile disrupt Nvidia’s dominance, or will hyperscalers always control the AI hardware market?
Will the surge in AI datacenter power usage trigger greater public resistance or force major changes in energy policy by 2028?
The $66 Billion CoreWeave Deal and Anthropic’s Quest for AI Chip Autonomy by 2027
Overview
By early 2026, Anthropic surged to a $30 billion annual revenue run rate, driven by enterprise adoption of its AI products. To meet soaring compute demands, it secured major partnerships with CoreWeave for GPU access and AWS for Trainium chips, supported by Amazon's $8 billion investment. Alongside these collaborations, Anthropic is exploring custom AI chip development to gain autonomy, reduce costs, and optimize performance amid growing reliance on external suppliers. Meanwhile, UK startup Fractile aims to disrupt the AI chip market with innovative in-memory compute technology, backed by $200 million in funding. The next few years will be critical as these efforts shape the future of AI hardware and industry competition.