We all know that company performance depends on several factors, and that many of those factors are variable and even non-deterministic. If life is prone to inconsistency, so is business. Much of this is because of the unpredictability of human behavior, which is why it is interesting to grasp it with the new kinds of statistical models provided by what we call today data science.
In data science, building such a model is like assembling gears to create a mechanism that works on data. The only systematic and consistent approach is the scientific method – in other words, an inductive and iterative process. We make assumptions from the data to explain the fluctuations and correlations we observed, and then identify the models that could reproduce these observations.
We then have to check the assumption by testing it on new data that hasn’t been used during the learning step, and if the hypothesis is wrong, we have to repeat this process until we can construct a good model. This process reveals a kind of chicken-and-egg dilemma between data and model – data is needed to determine the model, and the model is necessary to leverage the data and to reveal its value.
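The inductive loop described above – fit on one portion of the data, check on data never seen during learning – can be sketched in a few lines. This is a minimal illustration with made-up synthetic data (a simple linear relationship with noise), not a real business dataset:

```python
import random

random.seed(42)

# Hypothetical observations: a linear effect (true slope 2.0) plus noise.
data = [(x, 2.0 * x + random.gauss(0, 1)) for x in range(100)]
random.shuffle(data)

# Hold out data that is never used during the learning step.
train, test = data[:70], data[70:]

# "Learning": least-squares slope through the origin, fit on train only.
slope = sum(x * y for x, y in train) / sum(x * x for x, _ in train)

def mse(pairs, m):
    """Mean squared error of the model y = m * x on a set of pairs."""
    return sum((y - m * x) ** 2 for x, y in pairs) / len(pairs)

# If test error is far worse than train error, the hypothesis is wrong
# and we iterate: new assumptions, new model, new check.
print(f"fitted slope: {slope:.3f}")
print(f"train MSE: {mse(train, slope):.3f}, test MSE: {mse(test, slope):.3f}")
```

The point is the workflow, not the model: the held-out set is the only honest check that the assumptions generalize beyond the data used to make them.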
An important aspect of these techniques is machine learning. Also called statistical learning, this is traditionally defined as a form of artificial intelligence (AI) that enables computer systems to learn without being explicitly programmed. Contrary to natural intelligence, it needs a huge amount of data from which to learn. A child learns to identify cats and dogs with only a few examples; “deep learning” algorithms would need many, many more.
In and of itself, machine learning doesn’t play an important role in analyzing business processes. That analysis is more a matter of applying analytics to data logs in order to reveal the real orchestration of the business operations that make up a complete IT system. The purpose of machine learning is to automate a task that is only a single node in a business process, which can be represented as a graph connecting several such nodes. Indeed, this graph is one of the main outcomes of process mining.
A process mining solution can be applied to event logs to reveal a full and accurate picture of the business process – not the process described in the manual, nor the process as perceived either by management or by individual front-line staff, but the actual process, with all its secret add-ons, workarounds, shortcuts, dead-ends, and compromises. This is, in fact, the starting point of any business transformation initiative.
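One simple way to see how an event log reveals the actual process is to count “directly-follows” relations – how often activity B occurs right after activity A within the same case – and read the result as a graph. The sketch below uses a tiny invented event log (the case IDs and activity names are purely illustrative, not from any real system); real process mining tools do far more, but the core idea is the same:

```python
from collections import Counter, defaultdict

# Hypothetical event log: (case_id, activity) pairs, already in time order.
event_log = [
    ("c1", "receive"), ("c1", "check"), ("c1", "approve"), ("c1", "pay"),
    ("c2", "receive"), ("c2", "check"), ("c2", "check"), ("c2", "pay"),
    ("c3", "receive"), ("c3", "approve"), ("c3", "pay"),
]

# Group activities per case, preserving order, to rebuild each trace.
traces = defaultdict(list)
for case, activity in event_log:
    traces[case].append(activity)

# Count directly-follows edges: (a, b) means b occurred right after a.
edges = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        edges[(a, b)] += 1

# The edge counts ARE the discovered process graph - including the
# workaround (check -> check) and the shortcut (receive -> approve)
# that no process manual would mention.
for (a, b), n in sorted(edges.items()):
    print(f"{a} -> {b}: {n}")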
First – ESOAR
When faced with the complicated reality of real-world processes, the first thing to do is to apply a series of measures in a defined sequence, starting with the elimination of wasteful tasks, before redesigning and automating those that remain. Capgemini’s approach is the ESOAR (Eliminate, Standardize, Optimize, Automate, Robotize) methodology.
It’s important to note here that, while the ultimate aim with intelligent automation may be to introduce Automate and Robotize measures, the actions to Eliminate, Standardize, and Optimize must be applied first, based on business knowledge and analytics derived during the process mining stage.
People or machines?
Capgemini’s proven methodology for deciding between humans and machines comprises three steps:
- Identifying tasks that can be performed better and/or faster with AI
- Measuring the value that AI can add
- Designing human-in-the-loop solutions when the expected efficiency is not reached by machines alone
With a probabilistic approach, we can take into account all possible errors, and use mathematics to assess the process efficiency of a given human-in-the-loop solution. We can then show that its process efficiency is better than that of the machine-only process, while its cost remains much lower than that of the human-only process. A large French insurance company provides a real-world case in point.
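The arithmetic behind such a comparison can be sketched with entirely made-up numbers (these are illustrative assumptions, not figures from the insurance case): the machine handles every case, but escalates a fraction of low-confidence cases to a human reviewer, and we compare the expected accuracy and per-case cost of the three setups.

```python
# Assumed, illustrative parameters - not real benchmark figures.
p_machine = 0.90            # machine-only accuracy across all cases
p_machine_confident = 0.95  # machine accuracy on the cases it keeps
p_human = 0.98              # human accuracy on escalated cases
r = 0.20                    # fraction of cases escalated to a human
cost_machine = 0.05         # assumed cost per case, machine
cost_human = 1.00           # assumed cost per case, human

# Human-in-the-loop: a weighted mix of confident-machine and human work.
accuracy_hitl = (1 - r) * p_machine_confident + r * p_human
# Every case touches the machine; only escalated cases add human cost.
cost_hitl = cost_machine + r * cost_human

print(f"machine-only: accuracy {p_machine:.2f}, cost {cost_machine:.2f}")
print(f"human-only:   accuracy {p_human:.2f}, cost {cost_human:.2f}")
print(f"hybrid:       accuracy {accuracy_hitl:.3f}, cost {cost_hitl:.2f}")
```

Under these assumptions the hybrid reaches an expected accuracy of 0.956 at a cost of 0.25 per case – better than the machine alone, far cheaper than the human alone. The real analysis is of course richer (error types, confidence calibration, queueing of escalations), but the expected-value structure is the same.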
The introduction of automation and AI is often regarded these days as synonymous with the large-scale replacement of people with machines.
As I have argued above, this won’t necessarily be the case. It is often possible to orchestrate activities between people and machines; in fact, it’s not only possible, but preferable, as machines are unable to reach the same efficiency as humans for some tasks. By developing models and frameworks that reengineer processes for the digital age, we can deliver business outcomes that are better than could be achieved by either machines or humans on their own.
It’s a different vision, which – as is so often the case at Capgemini – is rooted in practicality and in real-world circumstances, in pursuit of operational excellence for our clients.
Finally, it’s also an optimistic vision – one that, by keeping humans in the loop, helps organizations meet their obligations not just to their customers and their balance sheets, but also to wider society.
To learn how Capgemini’s Intelligent Process Automation offering provides human-in-the-loop processes that can deliver better outcomes when implementing intelligent automation and AI, contact: firstname.lastname@example.org
Read Taoufik’s full point of view in which he argues that intelligent automation will increasingly involve smart orchestration of tasks between machines and humans to reach operational excellence.
Taoufik Amri helps Capgemini’s clients implement intelligent automation into their business processes. As the principal data scientist for Capgemini’s Business Services, Taoufik identifies tasks that can be performed better and/or faster with AI, measures the value added by AI with advanced quantitative business process models, and designs human-in-the-loop solutions.