OpenAI is widening access to its generative AI models by making them available on Amazon Web Services, a step that comes shortly after updating its agreement with Microsoft to relax earlier cloud exclusivity restrictions.
Summary
OpenAI is rolling out its newest models and the Codex agent on AWS through Amazon Bedrock, following a revised Microsoft deal that supports multi-cloud deployment.
Amazon Bedrock Managed Agents, powered by OpenAI, enable enterprises to create AI agents with memory and the ability to handle multi-step tasks within AWS environments.
Amazon is strengthening its AI strategy with closer OpenAI collaboration and a potential $25 billion investment in Anthropic, as demand for large-scale AI infrastructure continues to grow.
Reports indicate the updated agreement allows OpenAI to distribute its products across multiple cloud providers. Shortly after the change, the company confirmed that its models would be accessible via AWS, offering enterprises an additional platform to use its latest technologies.
Through AWS, developers can experiment with OpenAI models and its Codex coding agent using Amazon Bedrock, according to a joint announcement released Tuesday. Broader access is expected to roll out in the coming weeks.
AWS CEO Matt Garman, speaking at an event in San Francisco, said customer demand has long pointed toward this integration, noting that it’s something users have wanted “for a really long time.”
Previously, AWS customers were limited to OpenAI’s open-weight models launched in August. The new rollout broadens access to include more advanced systems through Bedrock’s unified APIs and enterprise-grade controls.
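For developers, the practical upshot of Bedrock's unified APIs is that an OpenAI model is called the same way as any other hosted model, with only the model ID changing. A minimal sketch of what such a call could look like, assuming boto3 and AWS credentials are configured; the model ID shown matches Bedrock's listing for the August open-weight release, but IDs can vary by region and should be verified:

```python
# Sketch: calling an OpenAI model on Amazon Bedrock through the unified
# Converse API. The helper only builds the request arguments; the actual
# network call (commented below) needs boto3 and valid AWS credentials.

def build_converse_request(model_id, prompt, max_tokens=512):
    """Assemble keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

# Model ID for the open-weight gpt-oss model on Bedrock (verify per region).
request = build_converse_request(
    "openai.gpt-oss-120b-1:0",
    "Summarize our Q3 incident report.",
)

# With credentials configured, the call would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-west-2")
#   response = client.converse(**request)
#   print(response["output"]["message"]["content"][0]["text"])
```

Because the request shape is identical across providers, swapping in a different Bedrock-hosted model is a one-line change to the model ID.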

A central feature of the launch is Amazon Bedrock Managed Agents powered by OpenAI. These tools are designed to help organizations build AI agents capable of remembering past interactions and executing complex, multi-step workflows. By combining OpenAI’s models with AWS infrastructure, businesses can deploy production-ready agents directly within their existing systems.
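The pattern these managed agents automate, persistent memory plus a loop in which the model takes several tool-using steps before answering, can be illustrated with a small client-side sketch. This is not the Bedrock Managed Agents API itself (which handles memory and orchestration server-side); the `model_call` function and tool registry here are hypothetical stand-ins:

```python
# Illustrative sketch of the agent pattern that managed agents automate:
# a memory list that persists across steps, and a loop that lets the model
# request tools repeatedly until it produces a final answer.

def run_agent(model_call, tools, task, memory, max_steps=5):
    """Run a multi-step agent loop over a shared memory list."""
    memory.append({"role": "user", "text": task})
    for _ in range(max_steps):
        action = model_call(memory)  # the model decides the next step
        if action["type"] == "final_answer":
            memory.append({"role": "assistant", "text": action["text"]})
            return action["text"]
        # Otherwise execute the requested tool and feed the result back.
        result = tools[action["tool"]](**action["args"])
        memory.append({"role": "tool", "text": str(result)})
    return "step limit reached"

# Toy demonstration with a stubbed "model" that looks up a value, then answers.
def stub_model(memory):
    if not any(m["role"] == "tool" for m in memory):
        return {"type": "tool_call", "tool": "lookup", "args": {"key": "region"}}
    return {"type": "final_answer", "text": "Deployed in us-east-1."}

memory = []
answer = run_agent(
    stub_model,
    {"lookup": lambda key: {"region": "us-east-1"}[key]},
    "Where are we deployed?",
    memory,
)
```

Because `memory` outlives a single call, the same list can be passed to later tasks, which is the "remembering past interactions" behavior the managed service provides out of the box.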
OpenAI’s partnership with Microsoft remains important. Microsoft has provided computing resources since before ChatGPT’s debut in 2022. However, internal communications suggest the relationship also imposed limitations. Revenue chief Denise Dresser told employees the partnership “has been critical” but also “limited our ability to meet enterprises where they are — for many that’s Bedrock.”
The newly revised agreement allows OpenAI to limit revenue-sharing obligations with Microsoft while expanding availability across multiple cloud platforms. Amazon CEO Andy Jassy described the move as “very interesting” in a post on X, hinting at more developments ahead.
Expanding AWS partnership
OpenAI’s collaboration with Amazon has been growing steadily in recent months.
In November, the company revealed a $38 billion commitment linked to AWS, shortly after reaffirming that Microsoft Azure would remain the exclusive cloud provider for certain third-party API services.
About three months later, Amazon expanded the partnership further, announcing plans to invest $50 billion in OpenAI. The AI company also said it would use AWS infrastructure — including up to two gigawatts of Trainium chip capacity — to train its models.
This announcement came amid scrutiny following a Wall Street Journal report suggesting OpenAI had fallen short of internal growth and revenue targets. The report also questioned spending strategies, contributing to declines in shares of chipmakers like Nvidia and Broadcom.
OpenAI executives strongly rejected the claims. CEO Sam Altman and CFO Sarah Friar said in a joint statement, “This is ridiculous,” adding that the company remains “fully committed to acquiring as much compute as possible.”
Amazon ramps up AI infrastructure
Amazon has been accelerating its investments across the AI ecosystem alongside its work with OpenAI.
Just a week earlier, the company confirmed a new $5 billion investment in Anthropic, the firm behind the Claude AI models, as competition for computing resources intensifies.
The agreement includes provisions for up to $20 billion in additional funding tied to performance milestones, bringing the potential total to $25 billion.
As part of the deal, Anthropic has committed to spending more than $100 billion over the next decade on AWS infrastructure to support training and deployment. The company will also gain access to as much as 5 gigawatts of computing power, with roughly 1 gigawatt, built on Trainium2 and Trainium3 chips, expected to come online by year-end.
These developments highlight Amazon’s strategy to position AWS as a leading platform for advanced AI workloads, while OpenAI’s latest move reflects a shift toward a more flexible, multi-cloud approach for delivering its technology to enterprise customers.