Scaling AI through measurable business outcomes

Christine Sarros, SVP, Oracle

Christine Sarros is Senior Vice-President of Oracle’s Enterprise Engineering organization, which is tasked with delivering cutting-edge IT services to more than 170,000 employees globally. She has 25 years of experience leading Fortune 500 technology organizations and currently heads a team of 1,500 global staff within the Oracle Cloud Infrastructure organization. She also leads initiatives to find AI solutions for enterprise search and support, and companion assistants for secure dev and ops functions. Christine is an active member of the Oracle Women’s Leadership community, whose mission is to develop, engage, and empower current and future generations of women leaders.


New technologies are coming through all the time. How do you separate real value from hype?

Any new technology can be oversold, and AI is no different. I start with developing clear, desired business outcomes. If you think about engineering and automation since the Industrial Revolution, the goal has been to baseline end-to-end processes and align them with the outcomes we want to achieve. Then, I ask whether we already have a process that delivers part of them, or whether we need to build from scratch. From there, I identify high-impact use cases and prioritize those with high volume and clear ROI that drive customer success. Value can be efficiency, simplification, or innovation, but it must be measurable.
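The prioritization described above, ranking use cases by volume and measurable ROI and discarding anything without a measurable outcome, can be sketched as a simple scoring exercise. This is an illustrative sketch only; the names, weights, and numbers are hypothetical, not an Oracle methodology:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    monthly_volume: int       # how often the process runs per month
    est_roi: float            # estimated value per execution (illustrative units)
    outcome_measurable: bool  # can we baseline and measure the outcome?

def priority_score(uc: UseCase) -> float:
    """Rank by annualized impact; unmeasurable outcomes score zero."""
    if not uc.outcome_measurable:
        return 0.0
    return uc.monthly_volume * 12 * uc.est_roi

candidates = [
    UseCase("service-desk deflection", monthly_volume=50_000, est_roi=4.0,
            outcome_measurable=True),
    UseCase("meeting-note summaries", monthly_volume=8_000, est_roi=0.5,
            outcome_measurable=False),
]

for uc in sorted(candidates, key=priority_score, reverse=True):
    print(f"{uc.name}: {priority_score(uc):,.0f}")
```

The point of the sketch is the gate, not the arithmetic: a use case that cannot be measured never makes the list, however interesting it sounds.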

Next, I confirm where the data resides, how we can access and distribute it, and how process and technology components will come together to produce the outcome. Security and governance are essential, and a secure, unified data core is a major enabler.

This approach lets me see how much of the work already exists in our processes and how much needs to be introduced. It also clarifies which technologies to combine, whether that means using existing systems in a new way or orchestrating new systems, including standalone AI components.

Any new technology can be oversold, and AI is no different.


Scaling technology across a large organization is a demanding task. How do you approach it at Oracle?

We build for scale and flexibility. In our data-center options, we consider data residency and offer both commercial and sovereign platforms. Customers can also retain data on-premises. Internally, I plan across a multi-year horizon, with minimum and maximum thresholds. My guiding principle is to assume that everything we build may need to run at maximum capacity. I consider availability and resiliency requirements, asking whether a service is Tier-0 or something lighter, like a bi-weekly reporting service. I set the investment level according to the nature of the service.
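The planning rule above, provision as if the service may run at maximum capacity, but scale the investment to the tier of the service, can be sketched in a few lines. The tiers, thresholds, and numbers here are hypothetical illustrations, not Oracle's actual planning model:

```python
def planned_units(min_units: int, max_units: int, tier: int) -> int:
    """Tier-0 services are provisioned assuming maximum capacity;
    lighter tiers (e.g. a bi-weekly reporting service) start at the
    minimum threshold and scale up only if demand materializes."""
    return max_units if tier == 0 else min_units

# Illustrative thresholds for two services:
print(planned_units(3, 12, tier=0))  # critical service: assume max capacity
print(planned_units(3, 12, tier=2))  # reporting service: minimum footprint
```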

Running in the cloud lets you scale up and down quickly, so you do not have to reinvent the wheel. You can plug and play many Oracle services. If some applications or business data sit on Azure or GCP, you can plug in with our multi-cloud options. You can add new Oracle databases with AI integrated and tie them to those environments, which makes deploying a new Oracle database straightforward.

We build for scale and flexibility.


How are you using AI in operations inside Oracle?

We started with lower-complexity use cases. One example is our AI service desk, where an intelligent chatbot is integrated into our direct messaging system for employees to use when they have an issue, such as being unable to connect or a laptop freezing. We have achieved 48% ticket deflection on an extremely high monthly volume.
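For readers unfamiliar with the metric: deflection is simply the share of incoming tickets the assistant resolves without human escalation. A minimal sketch, with illustrative numbers rather than Oracle's actual volumes:

```python
def deflection_rate(deflected: int, total: int) -> float:
    """Share of tickets resolved by the chatbot without reaching an agent."""
    if total == 0:
        raise ValueError("no tickets in the reporting period")
    return deflected / total

# Hypothetical monthly volume: 24,000 of 50,000 tickets deflected.
rate = deflection_rate(deflected=24_000, total=50_000)
print(f"{rate:.0%}")  # → 48%
```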

What is most exciting is the reasoning capability of generative AI [Gen AI]. We use outcome data to identify systemic failures. AI recommends what to prioritize and suggests potential root-cause fixes, so it is not just deflecting tickets but seeking to avoid or resolve the issue permanently. This improves process efficiency over time and lets engineering teams focus more on innovation instead of heavy manual analysis or basic root-cause corrective action.

The goal is to be proactive. If we proactively recognize patterns in the data, we can pre-empt impact on the customer. AI surfaces those patterns from an analytics standpoint and can even help generate code to resolve issues. That is what I am most excited about.

Our AI service desk achieved 48% ticket deflection on an extremely high monthly volume.


Where do you see the strongest business value from AI?

We see clear value in our SaaS [Software-as-a-Service] Fusion platforms.

  • Supply chain: In Fusion Cloud SCM, AI features help predict demand by combining internal and external signals, historical patterns, real-time market changes, and simulations. We have reduced some supply-chain planning cycles by 70%.
  • Finance: In ERP [enterprise resource planning], we close our books and release earnings in under 10 days, nearly 60% faster than most organizations of our size.
  • HR: We have shortened onboarding, so new hires are productive from day one. From offer to full system access now takes about 24 hours.

These outcomes come from orchestrating systems with AI that drives call-to-action activities. The reasoning element of the automation identifies what should happen next across back-end systems. As the models learn, processes become more efficient.

In ERP [enterprise resource planning], we close our books and release earnings in under 10 days, nearly 60% faster than most organizations of our size.


With so many AI use cases competing for attention, how do you decide where to invest and how to roll them out?

I always start with step one: define the measurable business outcome. Some things are interesting on their own, like AI-generated meeting notes or email drafts, but they are more valuable when connected to real workflows. For example, you might tie the inbox to relevant documents, auto-update tickets, send action items, or trigger code changes.

Then, I match the complexity to the goal. If a process is mature, low-complexity improvements can move quickly. If the goal is to do something new, we talk through the data and systems involved, and we build iteratively toward that goal. It does not have to be a ‘Big Bang.’ You might begin with service-desk ticket deflection, then correlate device history to enable auto-remediation. Reasoning analytics can highlight where efficiency or innovation will have the highest pay-off.

This is basic Agile: act now in small measures, iterate, and show incremental value instead of a long waterfall that may miss the mark. Most large companies have vast amounts of data, so even organizing data is a meaningful task to test and improve. Oracle products have database and AI components built in, with optional add-on AI services, so you do not need to be an AI expert to architect the solution. We provide options based on the outcomes you are trying to achieve.


How are you navigating data sovereignty, and what should organizations consider?

Our strategy is to maximize flexibility for the customer. That can mean running in specific countries, in their own data centers, in a sovereign or dedicated model, or in a multi-cloud setup. Dedicated Region lets you operate a full Oracle Cloud platform inside your data center with a small, scalable footprint that starts at three racks, which helps when you want a smaller deployment. If you prefer taking only what you need, offerings such as Exadata Cloud at Customer bring specific cloud database capabilities on-premises.

Because regulations evolve, we also support government cloud options and air-gapped isolated regions. It is not one-size-fits-all. Many organizations use more than one model at the same time. Some workloads run well in commercial regions; others require sovereign or dedicated deployment; and some must meet FedRAMP-type isolation. The aim is that our products run consistently across platforms so you do not need special builds when you choose the model that fits each workload. Start with your regulatory, residency, and operational requirements, then select the mix that serves each application best. We continue to invest to keep these choices current as rules change globally.


Multi-cloud can be powerful, but it can also be complex. What does your approach mean for clients in practice?

Our goal is to make multi-cloud as simple as possible. Our database services run on OCI and are also directly available in partner data centers. If a customer already has an Oracle database on OCI, they can tie it to back-end services on AWS, Azure, or Google Cloud. If they want to move something from those clouds onto an Oracle database, they can do that, too. We have simplified administration through our console, so cloud operators can manage resources across environments without wrestling with underlying architectural complexity.


What do you think CXOs should focus on?

AI is changing how every industry and job function operates, but the data behind AI is core to economic growth. Organizations today have to keep pace with the amount and the speed of data so the business can make informed, data-driven decisions, create new products and services, sustain innovation, and gain real-time insight into customer behavior, market trends, and operational efficiency.

For us at Oracle, the first priority is using our cloud platforms to their fullest capacity. We run Oracle on Oracle whenever we can. In addition, we partner with strategic providers. For example, we use Zoom alongside our contact center, so we can combine our own CX [customer experience] best practices with contact-center components in the Zoom system. The common thread is correlating secure data across platforms to truly understand customers, deliver the services they need, and help them grow their own businesses.

AI is changing how every industry and job function operates.


AI is changing software engineering and the CIO remit. How do you see the role of the CIO evolving over the next few years?

I would summarize it in two foundations. First, understand your data. That means your core business data and any external data you need, where it resides, and the governance and security around it. Second, make that data accessible in environments that can scale. Cloud is ideal for this, and there are options to meet regulatory and security needs, including sovereignty and isolation models.

From an AI perspective, decide how much you will consume now and how much you will add as standalone AI components. When AI first emerged, it lived on a few platforms. Today, many products include AI in the platform, and you can compose additional AI where needed. At Oracle, we support that choice with strategic partnerships, including with NVIDIA, and we provide a choice of large language models [LLMs].

By consolidating on cloud and modernizing the operating model, you can translate cost savings from legacy on-premises sprawl into agile innovation. That innovation might be new AI or new capabilities for a specific business segment. The question for every CIO is whether the way you have been running things over the past decade can keep pace with the industry as it evolves. Flexibility in the cloud, combined with clear, business-driven and measurable AI outcomes, is what moves you forward.

The question for every CIO is whether the way you have been running things over the past decade can keep pace with the industry as it evolves.