The AI Control War: How ServiceNow’s OpenAI Bet Exposes the New Enterprise Tech Arms Race
As AI models become interchangeable, ServiceNow’s OpenAI partnership reveals a new battleground: control over enterprise AI deployment. The collaboration integrates GPT-5.2 into ServiceNow’s AI Control Tower and Xanadu platform, positioning the company as a middleman between cutting-edge models and enterprise workflows.
John Aisien, ServiceNow’s SVP of product management, emphasized the platform’s openness:
"We will remain an open platform... customers can bring any model to our AI platform."
This approach prioritizes orchestration, guardrails, and workflow-specific large language models (LLMs) over building frontier models.
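To make the "bring any model" claim concrete, the sketch below shows what a model-agnostic orchestration layer with pluggable guardrails might look like. It is a minimal, hypothetical illustration of the pattern, not ServiceNow's or OpenAI's actual API; every class and function name here is invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Protocol


class ChatModel(Protocol):
    """Any LLM backend the customer brings (frontier, open-weight, or in-house)."""
    def complete(self, prompt: str) -> str: ...


@dataclass
class Guardrail:
    name: str
    check: Callable[[str], bool]  # returns True if the output passes the policy


class Orchestrator:
    """Routes a workflow prompt to a registered model, then applies guardrails."""

    def __init__(self) -> None:
        self._models: dict[str, ChatModel] = {}
        self._guardrails: list[Guardrail] = []

    def register_model(self, name: str, model: ChatModel) -> None:
        self._models[name] = model

    def add_guardrail(self, guardrail: Guardrail) -> None:
        self._guardrails.append(guardrail)

    def run(self, model_name: str, prompt: str) -> str:
        output = self._models[model_name].complete(prompt)
        for rail in self._guardrails:
            if not rail.check(output):
                raise ValueError(f"Output blocked by guardrail: {rail.name}")
        return output
```

The point of the pattern is that swapping one model for another, or for a workflow-specific LLM, becomes a registration change rather than a rewrite of the surrounding workflow; the guardrails stay in the platform layer regardless of which model runs underneath.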
However, IT managers at mid-sized organizations evaluating this strategy must weigh practical limitations, such as voice-first agents' contextual accuracy in noisy environments, against the ongoing cost of maintaining multi-model flexibility.
Features like enterprise knowledge access and operational automation aim to reduce integration friction. Yet hybrid strategies still require careful evaluation of how OpenAI's models interact with legacy channels such as email and chat.
For instance, voice-first agents may struggle with domain-specific jargon or require extensive training to avoid misrouting support tickets.
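One common mitigation for that misrouting risk is to auto-route a ticket only when the classifier clears a confidence threshold and to fall back to human triage otherwise. The sketch below assumes a transcription-plus-classification pipeline; the jargon map, threshold, and function names are hypothetical, not ServiceNow features.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class RoutingDecision:
    queue: str
    confidence: float
    needs_human_review: bool


# Hypothetical domain glossary used to expand jargon before classification.
JARGON_MAP = {"p1": "priority one incident", "cmdb": "configuration management database"}

CONFIDENCE_THRESHOLD = 0.85  # below this, a human triages the ticket instead


def normalize_transcript(transcript: str) -> str:
    """Expand domain-specific jargon so the classifier sees standard terminology."""
    words = [JARGON_MAP.get(word.lower(), word) for word in transcript.split()]
    return " ".join(words)


def route_ticket(
    transcript: str, classify: Callable[[str], tuple[str, float]]
) -> RoutingDecision:
    """classify(text) -> (queue_name, confidence); any classifier can be plugged in."""
    queue, confidence = classify(normalize_transcript(transcript))
    if confidence < CONFIDENCE_THRESHOLD:
        return RoutingDecision("human_triage", confidence, needs_human_review=True)
    return RoutingDecision(queue, confidence, needs_human_review=False)
```

The design choice worth noting is that accuracy problems are handled by an explicit fallback path rather than by trusting the model, which keeps a voice-first agent useful even when noisy audio or unfamiliar jargon degrades its confidence.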