Amazon Just Bet $25 Billion on Anthropic While Funding OpenAI Too: What the Dual Investment Strategy Means



Amazon announced on April 20 that it will invest up to $25 billion more in Anthropic, the maker of Claude, on top of $8 billion already committed. The deal includes a $100 billion, decade-long AWS spending commitment from Anthropic and secures up to 5 gigawatts of compute capacity for training and deploying Claude models. This Amazon Anthropic investment lands just two months after Amazon agreed to invest up to $50 billion in OpenAI, making the cloud giant the only company simultaneously backing both leading AI labs with multibillion-dollar infrastructure deals.

This is not a story about picking winners. It is a story about what happens when cloud infrastructure becomes the most valuable asset in the AI economy, and one company decides to own the rails on both sides.

What the Amazon Anthropic Deal Actually Includes

The structure of the deal matters more than the headline number. Amazon is putting $5 billion into Anthropic immediately at the company’s latest valuation of $380 billion. The remaining $20 billion is contingent on Anthropic hitting certain commercial milestones, a structure that ties future capital to actual performance rather than hype.

In return, Anthropic has committed to spending over $100 billion on AWS technologies over the next 10 years. That includes current and future generations of Trainium, Amazon’s custom AI chips, along with Graviton processors and the full AWS infrastructure stack.

The compute numbers tell the real story. Anthropic will secure up to 5 gigawatts of capacity for training and deploying Claude. That includes new Trainium2 capacity coming online in the first half of 2026 and nearly 1 gigawatt total of Trainium2 and Trainium3 capacity by the end of the year. The deal covers chips through Trainium4, hardware that does not even exist yet.

For context, AWS collaborated with Anthropic to launch Project Rainier, a cluster of nearly half a million Trainium2 chips that was the largest AI compute cluster in the world at launch.

Why Amazon Is Funding Both Sides of the AI Race

Here is what makes this deal unusual. Amazon agreed to invest up to $50 billion in OpenAI just two months earlier, in February 2026. That deal included a commitment from OpenAI to consume at least 2 gigawatts of AWS Trainium-based compute, plus 3 gigawatts of dedicated Nvidia inference capacity.

Now Amazon has done essentially the same thing with Anthropic: equity investment paired with a massive AWS infrastructure commitment.

This is not indecision. It is a deliberate strategy to make AWS the default compute layer for frontier AI, regardless of which lab ends up on top. Consider the combined numbers:

  • OpenAI: Up to $50 billion in investment, multi-year AWS commitment, 2 GW of Trainium compute
  • Anthropic: Up to $25 billion in investment (total $33 billion), $100 billion AWS commitment, 5 GW of compute
  • Combined AWS commitment: Well over $200 billion in cloud spending from the two most important AI companies on the planet

Microsoft has also invested in both labs, putting more than $13 billion into OpenAI and up to $5 billion into Anthropic. But Amazon’s play is different. Microsoft’s OpenAI investment was about Azure. Amazon’s dual bet is about making Trainium the chip that trains the next generation of frontier models, regardless of who builds them.

The Revenue Flip That Forced Amazon’s Hand

The timing of this deal is not accidental. On April 7, Anthropic announced that its annualized revenue run rate had crossed $30 billion, surpassing OpenAI’s $25 billion for the first time.

That number deserves scrutiny. Anthropic’s revenue was approximately $1 billion at the end of 2024 and $9 billion at the end of 2025. Hitting $30 billion by early April 2026 represents a 30x increase in roughly 15 months. No enterprise software company has ever scaled this fast.
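As a quick sanity check on that claim, the implied growth rate can be worked out in a few lines. The figures are the run rates reported above; the 15-month window is approximate:

```python
# Back-of-the-envelope check: a $1B run rate (end of 2024) growing to a
# $30B run rate (early April 2026) is a 30x increase over roughly 15 months.
start_run_rate = 1.0   # annualized run rate in $B, end of 2024
end_run_rate = 30.0    # annualized run rate in $B, April 2026
months = 15            # approximate elapsed time

# Implied average month-over-month growth, compounded
monthly_growth = (end_run_rate / start_run_rate) ** (1 / months) - 1
print(f"Implied growth: ~{monthly_growth:.1%} per month, compounded")
```

Compounding at roughly 25% month over month for more than a year is, as the comparison to enterprise software history suggests, essentially without precedent.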

The growth is coming from a specific place. When Anthropic announced its Series G fundraising in February, over 500 business customers were each spending more than $1 million annually on Claude. By April, that number had doubled to over 1,000 customers. Enterprise demand for Claude’s coding capabilities and the Opus 4.6 model’s performance across reasoning benchmarks have driven adoption at a pace that Anthropic itself struggled to serve.

Anthropic said explicitly that enterprise and developer demand, combined with a “sharp rise” in consumer usage, had led to “inevitable strain” on its infrastructure that impacted reliability and performance. The Amazon deal is, in part, a direct response to capacity constraints that were costing Anthropic business.

The Valuation Surge: From $380 Billion to $800 Billion in Two Months

Anthropic raised $30 billion at a $380 billion valuation in its Series G round in February 2026. By mid-April, venture capital firms were offering to invest at an $800 billion valuation, more than doubling the company’s worth in under 60 days.

The company is now in early discussions with Goldman Sachs, JPMorgan, and Morgan Stanley about a potential IPO that could come as early as October 2026, with an expected raise exceeding $60 billion.

For comparison, OpenAI last raised at approximately $340 billion. Anthropic’s revenue overtaking OpenAI’s, combined with the Amazon infrastructure partnership, has shifted the calculus for investors who had assumed OpenAI would remain the unchallenged leader in the AI lab race.

TechCrunch reported on April 14 that Anthropic’s rise is giving some OpenAI investors “second thoughts,” with several late-stage funds now actively seeking Anthropic allocation for the first time.

What This Means for the Cloud Market

The Amazon Anthropic investment is part of a broader restructuring of how cloud computing works in the AI era. The old cloud model sold generic compute and storage. The new model sells purpose built AI infrastructure, and the customer relationships are worth hundreds of billions.

AWS Gets a Captive Customer Base

Every developer building on Claude through AWS Bedrock is now deeper in the Amazon ecosystem. As part of the expanded deal, AWS customers will be able to access the full Anthropic-native Claude console from within AWS, with no additional credentials, contracts, or billing relationships. Same access controls, same monitoring, same billing.

This is meaningful because it means enterprises can standardize on Claude without leaving their existing AWS governance frameworks. For a Fortune 500 company already spending $50 million a year on AWS, adding Claude becomes a line item, not a procurement event.
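To make the “line item, not a procurement event” point concrete, here is a minimal sketch of what calling Claude through Amazon Bedrock looks like. The model ID is a placeholder, not a real identifier, and the request body follows the Anthropic Messages format Bedrock uses for Claude models; treat the details as illustrative rather than a definitive integration guide.

```python
import json

# Placeholder model ID; real IDs depend on the Claude versions enabled
# in your AWS account and region.
MODEL_ID = "anthropic.claude-placeholder-model-id"

def build_claude_request(prompt: str, max_tokens: int = 512) -> str:
    """Build the JSON body Bedrock expects for Claude (Anthropic Messages format)."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_claude_request("Summarize last quarter's AWS spend.")

# In a live AWS environment, the call rides on the same IAM roles,
# CloudTrail logging, and billing as any other AWS service, e.g.:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.invoke_model(modelId=MODEL_ID, body=body)
```

The governance point is that nothing here introduces a new vendor relationship: credentials, access control, and metering all ride on AWS primitives the enterprise already operates.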

The Custom Silicon Play

The deal commits Anthropic to multiple generations of Trainium chips through Trainium4. This gives Amazon something Nvidia does not want it to have: a guaranteed customer for its custom silicon roadmap.

If Anthropic proves that Trainium can train frontier models as effectively as Nvidia’s hardware (and Project Rainier suggests it can), Amazon has a reference customer that legitimizes the entire Trainium line for every other AWS customer considering it.

Google’s Parallel Move

Anthropic is not exclusive to Amazon. The company also recently expanded its partnership with Google and Broadcom, securing approximately 3.5 gigawatts of next generation TPU capacity starting in 2027, on top of roughly 1 gigawatt of Google compute already committed for 2026.

Anthropic is effectively playing the cloud giants against each other, securing compute from AWS, Google Cloud, and custom chip partnerships simultaneously. This multi-cloud strategy gives Anthropic leverage that OpenAI, which remains more closely tied to Microsoft Azure, does not have.

What Enterprise Buyers Should Watch

If you are an enterprise leader evaluating AI infrastructure, this deal changes the conversation in three specific ways.

First, Claude on AWS just became a first-class citizen. The integrated console access and native AWS billing mean that organizations already committed to AWS can deploy Claude with minimal procurement friction. Expect AWS account teams to lead with Claude in enterprise AI conversations.

Second, the infrastructure lock-in cuts both ways. Anthropic’s $100 billion AWS commitment means the company’s interests are now deeply aligned with Amazon’s. If Trainium performance lags behind Nvidia on training workloads, Anthropic still has to use it. Watch for benchmark disclosures that compare Trainium and Nvidia performance on Claude training specifically.

Third, pricing pressure is coming. With both OpenAI and Anthropic competing for enterprise customers on AWS, and Google offering Gemini on its own cloud, the next 12 months will see aggressive API pricing moves. The compute infrastructure deals reduce marginal costs for both labs, and those savings will eventually flow to customers.

The Bigger Picture: Infrastructure as Kingmaker

The Amazon Anthropic investment is the clearest signal yet that the AI industry’s center of gravity is shifting from model development to infrastructure. The models themselves are converging in capability. Claude Opus 4.6, GPT-5.4, and Gemini 3.1 Ultra all score within a few percentage points of each other on major benchmarks. What separates winners from losers now is the ability to serve those models at scale, reliably, at a cost structure that supports actual business models.

Amazon has positioned itself at the center of that infrastructure layer by funding both leading labs and locking in decade long spending commitments. Whether Anthropic or OpenAI “wins” the model race is, from Amazon’s perspective, beside the point. Both are building on AWS. Both are buying Trainium. Both are spending hundreds of billions through Amazon’s cloud.

The real question is whether this dual bet creates a conflict of interest that eventually forces Amazon to choose, or whether the cloud market is large enough that backing both labs is not a contradiction but a hedge that pays off regardless.

For now, the numbers suggest Amazon can afford to be agnostic. The AI infrastructure market is growing faster than any single lab can capture, and the company that owns the compute layer collects rent from everyone.

FAQ

How much has Amazon invested in Anthropic total?

Amazon has now committed up to $33 billion to Anthropic across multiple rounds. That includes $8 billion in previous investments and up to $25 billion in the April 2026 deal, with $5 billion deployed immediately and $20 billion tied to commercial milestones.

Why is Amazon investing in both OpenAI and Anthropic?

Amazon’s strategy is to make AWS the default infrastructure layer for frontier AI. By investing in both labs and securing long term compute commitments, Amazon ensures that regardless of which lab leads in model performance, the training and inference runs on AWS using Amazon’s Trainium chips.

How does Anthropic’s revenue compare to OpenAI’s in 2026?

As of April 2026, Anthropic’s annualized revenue run rate reached $30 billion, surpassing OpenAI’s approximately $25 billion. This marks the first time Anthropic has outearned OpenAI, driven primarily by enterprise and developer demand for Claude’s coding and reasoning capabilities.

What is Anthropic’s current valuation?

Anthropic raised at a $380 billion valuation in February 2026. By mid-April, the company was receiving investment offers at approximately $800 billion, and is exploring a potential IPO as early as October 2026 with an expected raise exceeding $60 billion.

What is the Anthropic AWS $100 billion commitment?

Anthropic has committed to spending over $100 billion on AWS technologies over the next 10 years. This covers Trainium chips (through Trainium4), Graviton processors, and the full AWS infrastructure stack, making it one of the largest cloud procurement deals ever announced.

Ty Sutherland

Ty Sutherland is the Chief Editor of AI Rising Trends. Living in what he believes to be the most transformative era in history, Ty is deeply captivated by the boundless potential of emerging technologies like the metaverse and artificial intelligence. He envisions a future where these innovations seamlessly enhance every facet of human existence. With a fervent desire to champion the adoption of AI for humanity's collective betterment, Ty emphasizes the urgency of integrating AI into our professional and personal spheres, cautioning against the risk of obsolescence for those who lag behind. AI Rising Trends stands as a testament to his mission, dedicated to spotlighting the latest in AI advancements and offering guidance on harnessing these tools to elevate one's life.
