One of the most enjoyable aspects of my role as CTO for Capgemini’s Business Services is being able to work with technology companies that are pioneering breakthrough innovations—and this is one of the reasons we created our global partnership with Celaton in 2016. Their work around AI and machine learning is impressive to say the least, and the industry is taking notice—Celaton recently received the 2017 Queen’s Award for Enterprise in Innovation for the development of their inSTREAM™ offering.

I’ve had the pleasure of collaborating with Celaton’s CEO, Andrew Anderson, applying inSTREAM™ to improve business process transformation for our clients. In this blog, Andrew shares some of his insights on how machine learning can help manage unpredictable data volume and deliver significant benefits in finance and accounting.

Andrew Anderson, CEO, Celaton

According to a recent study, we create a staggering 2.5 quintillion bytes of data every day. Many organizations rely on data to run their business, covering everything from order management and cash flow to HR and customer service. Yet the sheer amount of data that organizations receive daily from staff, suppliers and customers across a wide variety of channels can be overwhelming and, if left unprocessed, damaging to an organization’s financial performance and reputation.

In addition, the amount of data received is subject to surges created by factors outside the organization’s control, such as “acts of God.” For example, a disruption on the rail network may cause an influx of compensation claims for train operating companies, just as a sudden flash flood may cause an influx of claims for insurance companies. These surges are unpredictable and therefore difficult to plan for. Organizations are forced either to put additional strain on existing staff to process the incoming data, often resulting in delays for customers and suppliers, or to employ last-minute temporary staff at an inflated cost to the business. So, the question is: how do organizations plan for and effectively manage these surges in data?

Unfortunately, there is no way to predict surges 100% of the time, so organizations need an effective system in place that can cope with any incoming data volume and scale on demand. More and more organizations are turning to machine learning technologies like Celaton’s inSTREAM™ to achieve this.

Technology with machine learning capabilities can not only handle the plethora of incoming data regardless of volume or document type, but also learn to understand the meaning and intent of the content. This allows it to recognize and extract key data, enrich it with other sources of information, and then process and deliver accurate information to other business systems or robots for further processing. Software that learns as a natural consequence of processing, by monitoring the actions of the humans in the loop, achieves greater productivity per person through continuous optimization. The greater the volume of data, the faster it learns.
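To make that pattern concrete, here is a minimal Python sketch of human-in-the-loop learning with confidence-based routing. It is purely illustrative: the classifier, document labels, confidence threshold and `ask_human` callback are assumptions for the example, not Celaton’s actual inSTREAM™ implementation.

```python
# A minimal human-in-the-loop sketch (illustrative only; not Celaton's
# actual inSTREAM API). A text classifier handles confident cases itself,
# routes low-confidence documents to a person, and learns incrementally
# from each correction.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

LABELS = ["invoice", "claim", "query"]        # assumed document types
vectorizer = HashingVectorizer(n_features=2**16)
model = SGDClassifier(loss="log_loss")        # log loss enables predict_proba

def bootstrap(texts, labels):
    """Seed the model with a small labeled sample."""
    model.partial_fit(vectorizer.transform(texts), labels, classes=LABELS)

def process(text, ask_human, threshold=0.8):
    """Classify one document, deferring to a human when confidence is low."""
    X = vectorizer.transform([text])
    probs = model.predict_proba(X)[0]
    label = model.classes_[probs.argmax()]
    if probs.max() < threshold:
        label = ask_human(text)               # the human decides the edge case
        model.partial_fit(X, [label])         # and the system learns from it
    return label
```

The design point is that every human decision on a low-confidence document feeds straight back into the model, so the more volume the system processes, the less often it needs to ask.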

Taking it a step further, the real value is realized when machine learning technologies are applied as part of a business process transformation program. For example, Capgemini’s Automation Drive Suite includes the Virtual Delivery Center, which uses Celaton’s machine learning coupled with autonomous robots to deliver straight-through machine-to-machine processing without human intervention, unless there are exceptions to be handled. This delivers significant benefits in both productivity and accuracy in finance and accounting processes.
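As a rough illustration of what straight-through processing with exception handling can look like, consider the sketch below. The flow, field names and validation rules are assumptions for the example, not the actual Virtual Delivery Center implementation.

```python
# A simplified straight-through processing sketch (assumed flow and field
# names; not the actual Virtual Delivery Center implementation). Clean
# items flow machine-to-machine; only exceptions stop for a person.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Invoice:
    supplier: str
    po_number: Optional[str]
    amount: float

def validate(inv: Invoice) -> list:
    """Return a list of exception reasons; an empty list means straight-through."""
    issues = []
    if inv.po_number is None:
        issues.append("missing purchase order number")
    if inv.amount <= 0:
        issues.append("non-positive amount")
    return issues

def process(inv: Invoice, post_to_erp, exception_queue):
    issues = validate(inv)
    if issues:
        exception_queue.append((inv, issues))   # routed to a human for handling
    else:
        post_to_erp(inv)                        # posted with no human touch
```

In a healthy process the exception queue stays short; the productivity and accuracy gains come from the large majority of items that never need a human touch.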

Applying machine learning technologies to optimize business processes means that organizations can continue to deliver great customer service, without delays and without additional staff, regardless of increases in volume or unpredictable surges.
