Scaling AI into production is forcing a rethink of enterprise infrastructure

Presented by Nutanix


Across all industries, organizations are focusing on how to move AI from pilots, proofs of concept, and cloud-based experimentation to deployment at scale – in real workloads, for real users, in real business environments. VentureBeat spoke to Nutanix President and Chief Commercial Officer Tarkan Maner and EVP of Product Management Thomas Cornely about what this change demands, and what it will take to get it right.

“AI is transforming everything we do in general, not just in technology, but from regulated industries like banking, health care, government, education, to non-regulated industries like manufacturing and retail,” Maner said. “As an end-to-end platform company, we welcome this change. It creates more opportunities for us as a company to better serve our customers as we move forward.”

But there is still a practical gap between experimentation and production, Cornely said.

“It’s one thing to run an experiment, create a prototype. It’s another thing to take that prototype and deploy it to 10,000 employees,” he explained. “We went from people focusing on training models to chatbots and now agents, where the demands and pressures on AI infrastructure are rapidly increasing.”

Agentic AI introduces a new layer of enterprise complexity

The rise of agentic AI makes this change particularly consequential. These systems offer a degree of autonomy as well as multi-step workflows across applications and data sources that create new operational demands.

Enterprises now have to grapple with multiple agents running simultaneously, unpredictable and real-time workloads, and the need to coordinate access to infrastructure across teams.

“OpenCL is now making it much easier for anyone to create agents and work with agents,” Cornely said. “You don’t want those agents running on premises with your data. You need to build the right architecture around it to protect the enterprise from what an agent can do.”

As these systems become more autonomous, the challenge extends from how they operate to how they interact with enterprise data, systems, and teams.

AI is augmenting human work, not replacing it

Maner said agentic AI is fundamentally not a substitute for human capabilities but an enhancer of them. The goal for enterprises is not to eliminate human work but to find the right balance between human decision making, AI-powered automation, and agent-based workflows.

“We believe that there will be love, peace and harmony between AI, agentic tools and robotics systems and human capital,” Maner said. “If the right vendors provide the right tooling and the right services, that harmony can be optimized for better outcomes for businesses, enterprises, governments and public sector organizations.”

How enterprises are getting started with AI at scale

In practice, the move from experimentation to real-world deployment is where the challenges are most visible. Despite the momentum, many organizations are still working out how to extend AI beyond their initial use cases.

As they do, organizations quickly run into practical constraints. Many start in the cloud because of the easy access to resources and services, but practical considerations like data, governance and control, and cost quickly come to the fore.

The cloud can be used to conduct experiments, with the ultimate goal of bringing applications back on premises as they move toward production, using platforms that address security and cost.

The use cases gaining the most traction include document search and knowledge retrieval, security and predictive threat detection, software development and coding workflows, and customer support and service operations. In the security space, banking customers in Europe and the US and others are deploying AI-powered tools, including facial recognition and predictive threat detection. Meanwhile, in the customer support industry there is a growing focus on end-to-end, 360-degree customer engagement, from pre-sales to post-sales advocacy.

Industry-specific AI transformation is already underway

Across industries, the shift from experimentation to actual deployment is already taking shape in different ways. In retail, AI is transforming store operations: cameras and robotics are used for targeted in-aisle marketing at the point of purchase decision, cashier-less checkout is replacing traditional POS systems, and freed-up human capital is being redeployed into back-office and merchandising functions.

In healthcare, Nutanix works with customers on applications related to diagnostics, treatment, remote health and hospital operations with cloud partners including AWS and Azure. In manufacturing and logistics, the transformation is equally pronounced.

The operational challenges of scaling enterprise AI

As AI use cases continue to grow, enterprises are facing a new range of operational challenges. Managing multiple AI workloads and agents, coordinating infrastructure access across teams, ensuring security and governance, and integrating AI systems with existing business processes are now of paramount concern to IT and business leaders.

The gap between AI developers who emphasize speed and accessibility and infrastructure teams responsible for security, uptime, and governance is one of the defining challenges of this time.

“Now I’m running agents, and they’re all going to fight to get access to resources to solve my problems,” Cornely said. “What you want now is infrastructure that allows you to set constraints, control resources.”

AI Factory: A shared platform for AI production

These challenges are driving demand for what Maner and Cornely describe as an AI factory: a shared infrastructure environment that supports multiple users and workloads simultaneously, enabling both experimentation and production while balancing developer agility with enterprise governance.

At GTC 2026, Nutanix announced the Nutanix Agentic AI solution, an end-to-end platform spanning core infrastructure, Kubernetes-based container services running on a topology-aware hypervisor, and advanced services for building and governing agents.

“We are launching an end-to-end platform, from core infrastructure to an end-to-end management framework for your AI factories through PaaS and advanced PaaS services,” Cornely said. “Really enabling self-service for the teams that will be building these applications across the enterprise.”

Hybrid environments are essential for enterprise AI strategy

Operating in this type of environment requires flexibility in infrastructure. Hybrid infrastructure is not a compromise but a necessity. Some workloads will always run in the public cloud, while others must remain on premises due to security requirements, regulatory compliance, data sovereignty, or competing IP considerations.

“Especially in regulated industries, as sovereignty becomes a bigger issue, data gravity becomes a bigger issue, security, and there’s also a lot more competitive differentiation in the industry, it will depend on what the company wants for its IP,” Maner said.

This is the foundation of Nutanix’s platform positioning, he said.

“We have full cohesion, bringing those applications, that data and all the customizations for these use cases from on-premises to off-premises and hybrid mode,” he said. “Doing this not just in one cloud, but across multiple clouds.”

That flexibility also extends to the broader ecosystem. Nutanix works with hyperscalers including AWS, Azure and Google Cloud, as well as regional service providers and emerging neoclouds. Nutanix NeoCloud offers a full software stack for providers to run their own clouds and deliver advanced AI services, giving enterprise customers already running Nutanix a simple extension of compute, networking and AI capabilities.

Maner called this arrangement a win for both sides. For enterprises, it means simplified access to hybrid AI services; for neoclouds, a proven platform to build on. It’s all automated and secure by default, Cornely said.

“All of the governance problems that now come with agentic AI are the same problems we’ve been solving for every other application running in your cloud for the last 16 years,” he said.

From pilot to production: operationalizing AI across the enterprise

Ultimately, the goal is not to run a successful AI pilot, but to operationalize AI in real-world use cases, manage the infrastructure as a shared resource, support collaboration between infrastructure teams and AI developers, and scale from initial projects to enterprise-wide deployments.

“There’s a huge gap right now between the people building AI applications, those AI engineers, those agentic AI developers, and your classical infra teams,” Cornely said. “They need the tooling to enable infra teams so they can support your AI engineers. That’s what we provide with our Agentic AI solution.”


Sponsored articles are content produced by a company that is either paying for the post or that has a business relationship with VentureBeat, and they are always clearly marked. For more information, contact sales@venturebeat.com.


