
ServiceNow announces multi-year partnership with OpenAI to bring GPT-5.2 into its AI Control Tower
For enterprise buyers, the deal signals a broader shift: general-purpose models are becoming interchangeable, while the platforms that control how they are deployed and governed remain differentiated.
ServiceNow lets enterprises develop agents and applications, plug them into existing workflows, and manage orchestration and monitoring through its unified AI Control Tower.
John Essien, senior vice president of product management at ServiceNow, said the partnership doesn't mean ServiceNow will stop using other models to power its services.
"We will remain an open platform. There are certain things where we will partner with each model provider based on their expertise," Essien said in an email to VentureBeat. "ServiceNow will continue to support a hybrid, multi-model AI strategy, where customers can bring any model onto our AI platform. Instead of specialization, we give enterprise customers maximum flexibility by combining powerful general-purpose models with our own LLMs built for ServiceNow workflows."
What the OpenAI partnership unlocks for ServiceNow customers
ServiceNow customers get:
- Voice-first agents: speech-to-speech and voice-to-text support
- Enterprise knowledge access: Q&A grounded in enterprise data, with improved search and discovery
- Operational automation: incident summarization and resolution support
ServiceNow said it plans to work directly with OpenAI to “build real-time speech-to-speech AI agents that can listen, reason, and respond naturally without text mediation.”
The company is also interested in using OpenAI's computer-use models to automate actions in enterprise tools like email and chat.
Enterprise Playbook
The partnership strengthens ServiceNow's position as a control layer for enterprise AI, separating general-purpose models from the services that govern how they are deployed, monitored, and secured. Instead of owning the models, ServiceNow is emphasizing orchestration and guardrails: the layer enterprises need to scale AI safely and quickly.
Some companies that serve enterprise customers view the partnership positively.
Tom Bachant, co-founder and CEO of AI workflow and support platform Unthread, said this could further reduce integration friction. "Deeply integrated systems often lower barriers to entry and simplify initial deployment," he told VentureBeat in an email. "However, as organizations expand AI into core business systems, flexibility becomes more important than standardization. Enterprises ultimately need the ability to optimize for performance benchmarks, pricing models, and internal risk conditions, none of which remain constant over time."
As enterprise AI adoption accelerates, partnerships like this show that the real battleground is shifting away from the models themselves toward the platforms that control how those models are used in production.