OpenAI and Amazon Web Services are getting closer, and this partnership is about making OpenAI's tools easier to access through systems many businesses already use.
Just yesterday, both companies announced that OpenAI's latest models, its Codex coding agent, and a managed agent service are coming to Amazon Bedrock in limited preview. The update means companies already building on AWS can use OpenAI tools without needing separate environments, billing systems or security frameworks.
That is useful because large organisations usually cannot adopt new technology casually. Many have strict procurement processes, compliance requirements and internal governance rules that slow adoption, even when teams want quick access.
AWS explained the thinking behind the partnership in its announcement, saying, “Today, we are announcing a major expansion of our partnership with OpenAI that brings frontier AI to the infrastructure millions of organisations already trust.”
The partnership effectively places OpenAI within an ecosystem businesses already know how to operate, which removes a lot of friction from adoption.
Why Is OpenAI Joining Amazon Bedrock?
Amazon Bedrock is AWS's platform for businesses that want access to different AI models through a single service.
Companies using Bedrock already had access to models from Anthropic, Meta, Mistral, Cohere, Amazon and other providers. OpenAI is now joining that ecosystem, which gives AWS customers another option without forcing them into a separate setup.
AWS said customers can use OpenAI models through the same Amazon Bedrock APIs and controls they already use.
That means businesses do not need to create new infrastructure or learn a different security model just to access OpenAI.
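In practice, "the same Bedrock APIs" means the standard `bedrock-runtime` Converse interface, where swapping providers comes down to changing a model ID. The sketch below illustrates that idea with boto3; the OpenAI model ID shown is a hypothetical placeholder, not an actual identifier from either company's announcement.

```python
# Sketch: invoking a Bedrock-hosted model through the standard Converse
# API shape. The model ID is a hypothetical placeholder for illustration.

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build a request payload in the bedrock-runtime Converse API shape.

    The same payload structure works across Bedrock-hosted providers,
    which is the point: switching models means changing only the ID.
    """
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

request = build_converse_request(
    model_id="openai.example-model-v1",  # hypothetical placeholder ID
    prompt="Summarise this deployment runbook.",
)

# With AWS credentials configured, the call itself would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.converse(**request)
#   print(response["output"]["message"]["content"][0]["text"])
print(request["modelId"])
```

Because the payload shape is provider-agnostic, a team already calling Anthropic or Meta models on Bedrock would reuse the same IAM policies, logging and request code when trying an OpenAI model.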
OpenAI spoke about the partnership in a similar way, saying, “Today, OpenAI and AWS are expanding our strategic partnership to help enterprises build using OpenAI capabilities in their AWS environments.”
The company also said, “We're excited to give AWS customers access to the best frontier models, agents, and tools, which will operate within the systems, security protocols, compliance requirements, and workflows they already use.”
For large organisations, convenience is often as important as the model itself. A useful AI system still needs to fit into procurement workflows, internal controls and finance approvals before anyone can start using it.
OpenAI also announced that GPT-5.5 is now available on Bedrock, calling it “our best frontier model GPT-5.5”.
What Is Changing With Codex?
The second major announcement is Codex, OpenAI鈥檚 coding agent, arriving on AWS.
Both companies cited the same usage figure: more than 4 million people already use Codex every week.
AWS said, “More than 4 million people use Codex every week to automate coding work, write and refactor code, explain complex systems, generate tests, and accelerate software delivery.”
That is a substantial user base, especially for a product still gaining traction across enterprise teams.
OpenAI explained the update, saying, “Organisations can now power Codex with OpenAI models served directly from Amazon Bedrock.”
The company added, “This allows any company with an AWS commit and Bedrock access to frictionlessly start using OpenAI鈥檚 powerful coding agent and products.”
This gives companies a neater commercial setup because existing AWS spending commitments can now cover more of their AI usage, rather than requiring additional vendor contracts.
It also lowers the barrier for developers between wanting to use an AI coding assistant and actually getting approval to work with one.
What Are Bedrock Managed Agents?
The third announcement is Amazon Bedrock Managed Agents, powered by OpenAI.
This service is built for companies developing AI agents that can handle longer tasks, retain context and work through more complicated workflows with minimal human input.
OpenAI explained, “With Bedrock Managed Agents, organizations can build agents that maintain context, execute multi-step workflows, use tools, and take action across complex business processes.”
AWS says much of the operational work is handled automatically and its announcement said, “With Bedrock Managed Agents, deploying production-ready OpenAI-powered agents on AWS is fast and straightforward, so you can focus on what your agents should do, not the infrastructure behind them.”
That is appealing to businesses interested in AI agents but unwilling to spend months stitching together permissions, audit trails, logging systems and governance controls.
According to AWS, each agent operates with its own identity, logs actions for auditing and runs within the customer's own environment.
Ben Kus, CTO at Box, explained why that appeals to enterprise teams.
He said, “With Amazon Bedrock Managed Agents, powered by OpenAI, developers can build optimized, production-scale AI applications that bring together the strengths and capabilities of OpenAI’s latest models with the scale, security, and infrastructure of AWS.”
He added, “That combination will result in agents that continuously learn what works over time, tailor responses to each user’s specific environment, and operate with the governance and auditability enterprises require, all running on the cloud we already trust.”
And What Does This Mean For Everyday Users?
This partnership is mainly aimed at enterprise customers, but regular users may feel the effects over time as well.
When businesses can adopt OpenAI more easily, employees are more likely to see OpenAI tools built into software they already use at work, whether that is coding software, research tools, internal assistants, customer support systems or document workflows.
That could mean fewer standalone AI products and more OpenAI-powered features gradually appearing within existing workplace software.
It also gives businesses another route to use OpenAI without managing separate vendor relationships, which may make companies more comfortable adopting these systems at scale.
For AWS customers, it creates a more convenient route to experiment with OpenAI and then roll it out more widely if internal teams are happy with the results.
In AWS’ words, this is just “the beginning of a deeper collaboration between AWS and OpenAI,” so it’s only a matter of time before more developments are announced…