- Amazon raised its total investment in Anthropic to $13 billion with a new $5 billion infusion announced April 20, 2026.
- Anthropic committed to spending over $100 billion on Amazon Web Services over the next 10 years to secure large-scale compute.
- The deal grants Anthropic up to 5 gigawatts of computing capacity and covers Amazon’s Trainium2 through Trainium4 AI accelerator chips, including chips not yet released.
- Venture capital firms have reportedly been in discussions to fund Anthropic at a valuation of $800 billion or more.
What Happened
On April 20, 2026, Anthropic announced that Amazon had agreed to invest an additional $5 billion in the company, as reported by TechCrunch, bringing Amazon’s total investment in Anthropic to $13 billion. In a reciprocal arrangement, Anthropic committed to spending over $100 billion on Amazon Web Services (AWS) over the next 10 years in exchange for dedicated computing capacity to train and run its Claude models.
Why It Matters
The deal reflects a financing structure that Amazon has applied at least twice in quick succession: converting equity stakes in AI labs into long-term, guaranteed cloud revenue. Two months prior, Amazon joined a $110 billion funding round for OpenAI — contributing $50 billion — in a deal similarly structured partly as cloud infrastructure commitments rather than straight cash, at a $730 billion pre-money valuation for the ChatGPT maker.
The pattern signals that compute access has become both the primary bottleneck and the primary negotiating lever in large-scale AI development agreements.
Technical Details
At the center of the agreement is Amazon’s custom silicon stack. Anthropic’s $100 billion commitment specifically covers access to Trainium2 through Trainium4 — Amazon’s AI accelerator family, which Amazon positions as a competitor to Nvidia’s GPU lineup. Trainium3, the most recent chip in the series to ship, was released in December 2025; Trainium4 chips are not yet available but fall within the deal’s contracted scope.
Anthropic also secured an option to purchase capacity on future Amazon chips as they become available. In aggregate, the arrangement grants Anthropic access to up to 5 gigawatts of new computing capacity for model training and inference workloads — a figure that reflects the infrastructure scale required to develop and serve frontier AI systems.
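For scale, the commitment can be annualized with a back-of-envelope calculation. The sketch below uses the article’s reported figures; the even split across years and the use of the $100 billion lower bound are simplifying assumptions, not deal terms.

```python
# Back-of-envelope annualization of the reported AWS commitment.
# Assumptions: "over $100 billion" treated as a $100B lower bound,
# spread evenly across the 10-year term (the actual schedule is not disclosed).
total_commitment_usd = 100e9
term_years = 10

annual_spend_usd = total_commitment_usd / term_years
print(f"Implied average AWS spend: at least ${annual_spend_usd / 1e9:.0f}B per year")
```

Even under this flat-spread assumption, the implied floor of roughly $10 billion per year in cloud spend illustrates why the deal doubles as a long-term revenue guarantee for AWS.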
Who’s Affected
Anthropic gains a committed compute pipeline at a scale necessary to train future Claude models without relying on variable spot-market availability. AWS competitors — Google Cloud and Microsoft Azure in particular — face tightening competitive dynamics as AI labs sign multi-year infrastructure commitments that reduce those labs’ platform optionality. For Nvidia, Amazon’s expanded Trainium roadmap from Trainium2 through Trainium4 represents a continued effort by a major cloud provider to develop alternatives to Nvidia accelerators within the AI training stack.
What’s Next
The announcement may precede a broader Anthropic fundraising round. According to TechCrunch’s reporting, venture capital firms have been in discussions with Anthropic about terms that would value the company at $800 billion or more — a substantial increase from its most recently known valuation. Anthropic has not confirmed whether or when such a round will be finalized.