
Geospatial analytics: The key to unlocking the UK’s electric vehicle revolution

Capgemini
Capgemini Invent Geospatial Community
Jul 1, 2025

The United Kingdom (UK) is striving to bring about an electric vehicle (EV) revolution. EV adoption and rollout among UK citizens is an important part of achieving the country’s long-term sustainability goals.

However, the UK is currently falling behind on building the charging infrastructure required to support this revolution. According to the UK Infrastructure Bank, an average of 1,600 charge points were installed per month in 2023 – under half the 3,250 required to meet forecast demand. The UK government also faces an estimated funding shortfall of £1.5 billion to build the charging infrastructure needed to meet its goal of 100% of new vehicles being zero emission by 2035.

Our geospatial analytics community at Capgemini Invent believes that geospatial data should be an important component in supporting the EV revolution, and that policymakers should be fully utilising its potential.

In this article, we review how the UK’s 2030 Geospatial Strategy supports the EV revolution, the key challenges the EV revolution faces in the UK, and finally how geospatial techniques offer solutions to support the transition to EVs.

Can the 2030 Geospatial Strategy support the electric vehicle revolution in the UK?

The UK’s 2030 Geospatial Strategy, released by the UK Geospatial Commission, outlines a strategic framework for the utilisation of geospatial analytics and techniques by the public sector to support economic growth, while also fostering environmental stewardship and enhancing social well-being.

The strategy offers missions and opportunity areas for increasing the use of geospatial techniques to support a wide array of sectors. In relation to the electric vehicle revolution, it outlines a comprehensive framework and offers insights on how the public sector can further support electric vehicle adoption and infrastructure by utilising geospatial data and analytics.

The Challenges

Infrastructure building

The UK government estimates that 300,000 charge points will be needed to support a full EV rollout by 2030. The government’s announcement of a delay to this target provides more time, but the UK is still short of EV charge points. As of March 2024, there were 59,590 EV charging points across 32,322 charging locations. Building this infrastructure and selecting the locations of charge points present a significant challenge – so what is preventing expansion of the network?

One of the main blockers is a lack of funding and financial incentives for the expansion. Although the UK government has committed £1.6 billion to the UK’s charging infrastructure through schemes such as the Local EV Infrastructure (LEVI) fund and the Rapid Charging Fund (RCF), additional funding is still required, as noted above. Without sufficient investment and incentives for charge point infrastructure, the EV rollout may be delayed and pushed further back.

Charge anxiety

Charge anxiety is a growing phenomenon challenging the adoption and rollout of electric vehicles. It can be defined as the fear that charge points are unreliable, too costly, and too sparsely distributed to use effectively. This anxiety is interconnected with the need to expand the charge point network in the UK. Public confidence in electric vehicles is increasing, with over half of motorists aged 16-49 stating they would switch to an electric vehicle within the next ten years. However, a general anxiety persists that the charging infrastructure is insufficient and that charge points themselves fail to meet consumer needs.

Equity and access

Ensuring equitable access to the charge point network remains a critical challenge for UK citizens. Various segments of society encounter difficulties when trying to use the network. One significant factor is the urban-rural divide. A County Councils Network report revealed that rural drivers have access to only one charge point for every 16 km, whereas London drivers enjoy a more favourable ratio of one charge point for every 1.2 km.

Figure 1 below shows the rural/urban classification of local authority districts, and Figure 2 shows the number of EV chargers per km² for the same boundary areas. Comparing these maps visually shows that local authorities with a higher proportion of urban areas tend to have a higher density of EV chargers than those that are more rural. The boxplot in Figure 3 confirms this: the average density of EV charge points increases in local authorities categorised as more urban.

Figure 1: Rural Urban Classification for England
Source: https://geoportal.statistics.gov.uk/
Visual produced using kepler.gl
Figure 2: Number of EV Chargers per km²
Source: https://chargepoints.dft.gov.uk
Visual produced using kepler.gl
Figure 3: Boxplot of charge point density by Rural Urban Classification
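
To make this comparison concrete, here is a minimal sketch of how a charger-density-by-classification analysis like the one behind Figures 1-3 could be reproduced in Python with geopandas. The file paths and column names (district_code, rural_urban_class, longitude, latitude) are illustrative assumptions rather than the actual ONS or DfT schemas.

```python
import geopandas as gpd
import pandas as pd

# Local authority districts with a rural/urban classification label (hypothetical file)
districts = gpd.read_file("local_authority_districts.geojson")

# Charge point registry with longitude/latitude columns (hypothetical file)
chargers = pd.read_csv("national_chargepoint_registry.csv")
chargers = gpd.GeoDataFrame(
    chargers,
    geometry=gpd.points_from_xy(chargers["longitude"], chargers["latitude"]),
    crs="EPSG:4326",
)

# Re-project to British National Grid so areas come out in square metres
districts = districts.to_crs("EPSG:27700")
chargers = chargers.to_crs("EPSG:27700")

# Count chargers per district and derive density per km²
joined = gpd.sjoin(chargers, districts, predicate="within")
counts = joined.groupby("district_code").size().rename("n_chargers")
districts = districts.set_index("district_code").join(counts).fillna({"n_chargers": 0})
districts["chargers_per_km2"] = districts["n_chargers"] / (districts.geometry.area / 1e6)

# Summarise density by rural/urban class - the basis for a boxplot like Figure 3
print(districts.groupby("rural_urban_class")["chargers_per_km2"].describe())
```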

The second major factor is those living with no at-home access to charging. This can be for varying reasons, including living in a flat or not having access to off-street parking. The challenge remains that if citizens cannot charge at home, they will be forced to use public charge points instead. These can cost more than at-home charging, which makes it harder to provide affordable access to charging for economically disadvantaged individuals.

Solutions

Demand prediction and population movement

Geospatial analytics can provide detailed insights into origin-destination (O-D) movements, how travel patterns are changing, and how they differ from region to region. Traditionally, this O-D flow data has been captured through the census, which has limitations in sample size, frequency of updates, and the types of journey covered. Anonymised mobile phone data is rapidly becoming a more promising source: it can model estimates for smaller areas and generate more frequent and timely outputs, meaning it can respond to changing travel trends faster.

Geospatial analytics has a critical role to play in unlocking these travel flow insights. Used correctly, this data can inform where EV charge points are prioritised and how future funding is allocated. This aligns with the aims of the 2030 Geospatial Strategy, which highlights the power of population movement data and how aggregated, anonymised data from mobile phones and apps can be utilised by the public sector.

Figure 4: Travel to work data – from place of usual residence to Manchester
Source: 2021 Census Data
Visual produced using kepler.gl
Figure 5: Travel to work data – from Manchester to place of usual residence
Source: 2021 Census Data
Visual produced using kepler.gl
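
For illustration, the aggregation behind O-D maps such as Figures 4 and 5 can be sketched in a few lines of pandas. The input file and column names below are assumptions, not the actual census table layout.

```python
import pandas as pd

# Hypothetical O-D table: one row per origin/destination pair with a trip count
flows = pd.read_csv("travel_to_work_flows.csv")  # columns: origin, destination, trips

# Total inbound commuting per destination area
inbound = flows.groupby("destination")["trips"].sum().sort_values(ascending=False)

# The ten largest flows into a single area of interest, e.g. Manchester
manchester_in = (
    flows[flows["destination"] == "Manchester"]
    .sort_values("trips", ascending=False)
    .head(10)
)

print(inbound.head())
print(manchester_in[["origin", "trips"]])
```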

Business fleets

Business fleets will play a critical part in the EV transition; however, there are significant operational, financial, and logistical challenges to consider to ensure feasibility. Combining advanced routing algorithms with geospatial analytics can help businesses scenario-model fleet performance under a range of different conditions. This can help answer key strategic questions through data-driven analysis: What fleet mix will best suit business needs? Will depots have sufficient capacity for charging requirements? How adaptable is the fleet to future demands?

From an operational standpoint, geospatial analytics can combine live data feeds from charge points with vehicle routes, range, capacity, delivery requirements, and driver schedules to optimise routes – minimising driver downtime and maximising efficiency.
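
As a toy illustration of the kind of scenario question such modelling answers, the sketch below works out where a charging stop must be inserted on a planned route given a vehicle’s usable range. The route, distances, and charger locations are invented; real fleet tools would also account for payload, temperature, charger speed, and delivery windows.

```python
from dataclasses import dataclass

@dataclass
class Leg:
    origin: str
    destination: str
    km: float

def plan_charging(legs, range_km, chargers_at):
    """Walk the route, recharging (where a charger exists) whenever the next
    leg would exceed the remaining range."""
    remaining = range_km
    stops = []
    for leg in legs:
        if leg.km > remaining:
            if leg.origin not in chargers_at:
                raise ValueError(f"Route infeasible: no charger at {leg.origin}")
            stops.append(leg.origin)
            remaining = range_km
        remaining -= leg.km
    return stops

# Invented example route and charger network
route = [Leg("Depot", "Leeds", 90), Leg("Leeds", "Newcastle", 150), Leg("Newcastle", "Edinburgh", 195)]
print(plan_charging(route, range_km=280, chargers_at={"Leeds", "Newcastle"}))  # ['Newcastle']
```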

Data and pricing

To combat the problems of equity and access, the effective use of data and standardised pricing are important to ensure everyone can benefit from both geospatial data and electric vehicles. Demographic, property, street, and traffic data can help ensure all areas are adequately supplied with charging infrastructure.

Pricing is also an important aspect of ensuring equity between regions. The UK government has tried to reduce charge anxiety by implementing regulations such as the Public Charge Point Regulations 2023, which introduce key changes including standardised pricing metrics to improve consumers’ experience with charge points.

The future is geospatial and electric

Many challenges still stand in the way of a successful transition to electric vehicles. The need to build the required charging infrastructure and the charge anxiety that persists among the British public could hinder any EV revolution. However, the 2030 Geospatial Strategy seeks to alleviate some of these concerns by offering a strategic framework to enable the use of geospatial data and techniques by the public sector. Geospatial analytics is key to supporting the EV rollout and to enabling consumers to make informed travel decisions in the future.

A journey toward more sustainable end-user devices in IT operations

Aleksandra Domagala
Jun 26, 2025

In today’s rapidly evolving business landscape, organizations are under immense pressure not only to remain competitive but also to operate sustainably. According to the Capgemini Research Institute, sustainability remains a top priority, with 82% of organizations increasing investments in 2025 and 98% planning to do so by 2026.

Embracing cost reduction and resource conservation isn’t just a strategic move for end-user services and workplace leads – it’s a transformative journey toward a more resilient and responsible future. By focusing on these areas, organizations can achieve significant financial savings, reduce their environmental footprint, and enhance their corporate reputation.

The necessity for transitioning to profitable business models with reduced environmental impact

It is evident that sustainability is not just a fleeting trend but a powerful force reshaping the future. This is also visible in the workplace: for instance, 76% of organizations required to report emissions already have a sustainable end-user computing strategy. This revolution in end-user operations is driven by increasing awareness of the environmental impact of electronic waste and the soaring cost of energy. Consider this stark reality: in 2024, twice as much e-waste was generated as in 2010, and only 20% of it was properly recycled. Additionally, global average electricity prices rose by over 46% between 2010 and 2024. These trends highlight the urgent need for organizations to adopt sustainable practices, for the sake of both the environment and their own business.

A journey toward proficiency in sustainable end-user devices

At Capgemini, we observe the complexities and challenges that organizations face in their sustainability efforts when addressing questions related to end-user devices, such as:

  • What the optimal end-user device catalog is,
  • Which original equipment manufacturer (OEM) to select,
  • How to measure the environmental impact and potential savings,
  • And how to incorporate circular economy aspects into IT.

For this reason, we have developed our proprietary approach to assist clients at every stage of their maturity journey. This method provides a comprehensive understanding of available data, delivers precise and detailed recommendations, and ultimately achieves better and more measurable results.

To achieve this, we work with partners who are committed to sustainability to make sure our solutions are thorough and effective. We start with data gathering and analysis, using tools like digital experience monitoring and environmental impact assessment. Based on this data, we provide tailored recommendations to optimize energy consumption, refresh devices based on performance and experience, and assist in deciding how to allocate sustainable and efficient devices. More importantly, our proprietary methodology encompasses not only environmental and business elements but also employees’ productivity and experience. Our approach isn’t just about lowering the impact on the planet and cutting costs; it’s about making sure employees stay productive and happy, since they’re the ones who’ll be using the tech at the end of the day.

Our case studies demonstrate the tangible benefits of this approach. For instance, our assessment for a client in the retail industry indicated a remarkable potential carbon saving of 959 tons of CO2e annually, representing a 50 percent reduction in electricity consumption, and a cost saving of one million euros per year. Imagine the tremendous impact on both the environment and the business – truly a win-win scenario!

Start your journey toward more sustainable end-user devices 

Are you looking to conserve resources and reduce the energy costs of your end-user devices? Do you want to stay on top of sustainability goals while delivering a great employee experience?

Capgemini experts can incorporate employee experience and performance into sustainability efforts. This approach helps organizations meet sustainability goals while driving long-term success and resilience.

Check out our exciting Point of View: Achieving net zero: Cutting costs and carbon with sustainable devices.

Are you looking to start your journey toward more sustainable end-user devices?

Talk to us!

About the author

Aleksandra Domagala

Product Manager, CIS
Aleksandra is a Product Manager with a background in organizational psychology which enables her to create evidence-based solutions, adjust them to a multicultural context, and design delightful user experiences. She is engaged in the development of immersive workspaces and sustainable workplace solutions. Aleksandra has vast experience in digital transformations, employee research, consulting and change management.

    Machines need zero trust too: Why devices deserve context-aware security

    Lee Newcombe
    Jun 25, 2025

    In the first post in this series, I wrote about the business and security outcomes that can be achieved for users (and the organizations to which they belong!) by adopting approaches labeled as “zero trust.” But why should we limit ourselves to interactions with human users? Don’t machines deserve a little attention too?

    The answer, of course, is “yes” – not least because this would otherwise be a remarkably short post. So, I’m going to talk about the application of those high-level characteristics of zero trust mentioned in my last post – dynamic, context-based security – to operational technology (OT).

    As every OT professional will quite rightly spell out – at length – OT is not IT. They have grown from separate disciplines, talk different network protocols, have different threat models, and often have different priorities when it comes to the application of the confidentiality, integrity, and availability triad we have used for so long in the security world. When your company faces losses of millions of dollars a day from a production line outage, or your critical national infrastructure (CNI) service can no longer function, availability rapidly becomes the key business issue, particularly where intellectual property may not be a core concern. Before diving into the application of dynamic, context-based security principles to OT, we should probably set a little more context:

    • OT facilities may not be as well-segmented as modern corporate IT networks. They were either isolated or “behind the firewall,” so why do more? (Of course, best practice has long pointed toward segmentation; however, if best practice were always implemented, I’d likely be out of a job.)
    • OT covers a vast range of technologies and different types of devices, from sensors out in the field through to massive manufacturing plants. Threat models differ! Context matters.
    • Devices often have embedded operating systems (typically cut-down versions of standard operating systems); these systems require patching and maintenance if they are not to become susceptible to known vulnerabilities.
    • Equipment requires maintenance. You’ll often find remote access facilities in the OT environment for the vendors to be able to conduct such maintenance remotely. (You might see where this is going from a security perspective.)
    • The move toward intelligent industry is pushing OT toward increasing use of machine learning and artificial intelligence, all of which is heavily reliant upon data – which means you need a way to export that data to the services performing the analysis. Your “air gap” isn’t really an air gap anymore. (And if we’re talking about critical national infrastructure, then there may well also be some sovereignty issues to consider.)
    • Legacy is a real problem. What happens if a business buys a specialist piece of kit and then the vendor goes bust? It could well form a critical part of the manufacturing process, and so stripping it out is not always possible, let alone straightforward.
    • OT doesn’t always talk IP. This is a problem for traditional security tools that only understand IP. We need to use specialized versions of traditional security tooling like monitoring solutions – solutions that can understand the communications protocols in use. Meanwhile, network transceivers/data converters may contain software components that can sometimes get overlooked from a security perspective.
    • Good models for thinking about OT security are out there, e.g., the Purdue model and the IEC 62443 series (which provide structures for the different levels of technology and functionality in OT environments, from the physical switches and actuators up to the enterprise information and management systems). It’s not as much of a wild west out there as my words so far may indicate – but we can do better.

    For the purposes of this article, the above highlights some interesting requirements from an OT security perspective:

    1. We need to understand the overall OT environment, and be able to secure access into and within it.
    2. We need to make the OT environment more resilient – reduce the blast radius of compromise. We really do not want one compromised machine taking out a whole facility.
    3. We want to be able to control machine-to-machine communications, and communications across the different layers of the Purdue model, e.g., from the shop floor to the management systems, or even across to the enterprise environment for import into the data lake for analysis purposes.

    Lots of interesting problems, some of which seem very similar to those discussed in the context of securing human user access to applications and systems.

    How do we start the process of finding some solutions? Well, first things first. We need a way to distinguish the devices we are securing, i.e., some form of machine identity. We have a variety of options here, from the installation of trusted digital certificates through to the use of network-based identifiers (including IP addresses and hardware addresses where available). Once we have identities, we can start to think of how to use them to deliver context-based security.
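
    As a flavour of what the certificate route can look like, here is a minimal Python sketch that derives a machine identity from an installed X.509 certificate and checks it against an allow-list. The file path and device names are hypothetical, and a production deployment would also validate the full chain, revocation status, and key usage.

```python
from datetime import datetime, timezone

from cryptography import x509  # requires cryptography >= 42 for the *_utc properties
from cryptography.x509.oid import NameOID

# Hypothetical allow-list of known machine identities
KNOWN_DEVICES = {"plc-line-3.factory.example", "hmi-packaging-1.factory.example"}

def device_identity(cert_pem: bytes) -> str:
    """Return the certificate's common name after a basic validity-window check."""
    cert = x509.load_pem_x509_certificate(cert_pem)
    now = datetime.now(timezone.utc)
    if not (cert.not_valid_before_utc <= now <= cert.not_valid_after_utc):
        raise ValueError("certificate outside its validity window")
    return cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)[0].value

with open("device_cert.pem", "rb") as f:  # hypothetical path
    identity = device_identity(f.read())

print("allowed" if identity in KNOWN_DEVICES else "denied", identity)
```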

    Let’s start by establishing some baselines of normal behavior:

    • How do the devices in scope communicate?
    • What other devices do they communicate with, and what protocols do they use?
    • Are there some obvious segmentation approaches that we can take based on those communication patterns? If not, are there some more context-based approaches we can take, e.g., do specific communications tend to take place at specific times of day?

    Such profiling may need to take place over an extended period of time in order to get a true understanding of the necessary communications. We should certainly be looking at how we control support access from vendors into the OT environment; let’s just start by making sure Vendor A can only access their own technology and not that of Vendor B. Let’s not forget about support access from internal users either, particularly if they have a habit of using personal or other unapproved devices. Going back to that segmentation point for a second, do we have any legacy equipment that is no longer in active support? If so, are we able to segment such kit away and protect access into and out of that environment to limit the risk associated with such legacy kit?
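
    To make the baselining idea concrete, the toy sketch below learns which conversations were seen during an observation window and then flags anything new, or anything happening at an unusual hour. The flow records and host names are invented; a real deployment would build the baseline from OT-aware monitoring tooling.

```python
from collections import defaultdict

# Invented flow records observed during baselining: (source, destination, protocol, hour)
observed_flows = [
    ("plc-line-3", "historian", "modbus-tcp", 9),
    ("plc-line-3", "historian", "modbus-tcp", 14),
    ("vendor-a-gw", "plc-line-3", "https", 10),
]

# Baseline: for each conversation, the hours at which it was seen
baseline = defaultdict(set)
for src, dst, proto, hour in observed_flows:
    baseline[(src, dst, proto)].add(hour)

def check(src, dst, proto, hour, tolerance=2):
    key = (src, dst, proto)
    if key not in baseline:
        return "ALERT: conversation never seen during baselining"
    if min(abs(hour - h) for h in baseline[key]) > tolerance:
        return "ALERT: known conversation at an unusual time"
    return "ok"

print(check("plc-line-3", "historian", "modbus-tcp", 13))  # ok
print(check("vendor-b-gw", "plc-line-3", "https", 11))     # never seen
print(check("vendor-a-gw", "plc-line-3", "https", 3))      # unusual hour
```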

    Whether we are trying to apply dynamic, context-based security to machines or users, many of the same considerations apply:

    1. Is there a way to uniquely identify and authenticate the entities requesting access?
    2. Where are the signals going to come from to enable us to define the context used to either grant or deny access?
    3. How can we segment the resources to which access is being requested?
    4. Where are we going to apply the enforcement mechanisms that act as the barriers to access? Do these mechanisms have consistent network connectivity or must they operate independently?
    5. How do we balance defense in depth with simplicity and cost of operation?

    If an organization already has some technologies that can help to deliver the required outcomes, e.g., some form of security service edge (SSE), there will often be some merit in extending that coverage to the OT environment, particularly with respect to remote access into such environments.

    I’ve shown that we can apply the same zero trust principles to machines that we can apply to users. However, knowing the principles and believing they have value is one thing, finding an appropriate strategy to deliver them in an enterprise context is something completely different. The final post in this series will talk about how we can approach doing this kind of enterprise security transformation in the real world.

    About the author

    Lee Newcombe

    Expert in Cloud security, Security Architecture, Zero Trust and Secure by Design
    Dr. Lee Newcombe has over 25 years of experience in the security industry, spanning roles from penetration testing to security architecture, often leading security transformation across both public and private sector organizations. As the global service owner for Zero Trust at Capgemini, and a member of the Certified Chief Architects community, he leads major transformation programs. Lee is an active member of the security community, a former Chair of the UK Chapter of the Cloud Security Alliance, and a published author. He advises clients on achieving their desired outcomes whilst managing their cyber risk, from project initiation to service retirement.

      How artificial intelligence can drive real sustainability

      Capgemini
      Jun 27, 2025

      The question of sustainability is a question of digitalization. Artificial intelligence (AI) is a powerful tool that can make digital operations more efficient, furthering any goal a company may have – including increased sustainability.

      This blog is part of a three-part series co-developed by Capgemini and Microsoft, exploring how AI-driven digitalization can accelerate operational excellence and sustainability. From enterprise-wide deployment to the evolving human-AI dynamic, the series highlights key enablers for unlocking value responsibly at scale.

      Sustainability is one of the defining challenges of our time – and it’s an opportunity for innovation as much as a call to action. Like any powerful technology, AI has an environmental footprint that must be acknowledged and addressed. At the same time, AI offers a unique opportunity to reach our sustainability goals. One third (33%) of executives say they have already started using AI for sustainability initiatives. Organizations worldwide are using AI to make their digitization more efficient – and therefore more sustainable.

      Versatile technology for every need

      The field of AI is evolving rapidly and offers immense potential. The technology has advanced beyond Generative AI, which simply reacts to prompts written by humans. Now, AI agents can act more autonomously and more accurately, collaborating with each other to perform complex tasks. AI is a powerful technology built for flexibility in model and scale, with applications across every industry.

      High-powered digitalization with AI

      Harnessing AI’s capabilities to boost existing digital solutions could completely change the game when it comes to addressing sustainability issues.

      Consider its capability to monitor and manage complex systems. For example, in the U.S. and the U.K., AI-powered sensors and software can measure and predict the real-time capacity of transmission lines in the energy grid. The optimization of this complex system has unlocked significant unused capacity on long-distance transmission lines. This directly enables the adoption of renewable energy sources, which are often located far from where their power is needed. The U.K.’s National Grid used this technology to increase capacity by 60% and add an additional 600 megawatts (MW) of offshore wind capacity.
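
      As a purely illustrative sketch of the underlying idea (not the National Grid’s actual models), dynamic line rating comes down to predicting available capacity from sensor and weather readings. The toy example below trains a regressor on entirely synthetic data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500

# Synthetic sensor/weather readings
ambient_temp_c = rng.uniform(-5, 35, n)
wind_speed_ms = rng.uniform(0, 15, n)
solar_wm2 = rng.uniform(0, 900, n)

# Synthetic "true" capacity: cooler, windier conditions allow higher loading
capacity_mw = (
    600
    - 4.0 * ambient_temp_c
    + 12.0 * wind_speed_ms
    - 0.05 * solar_wm2
    + rng.normal(0, 10, n)
)

X = np.column_stack([ambient_temp_c, wind_speed_ms, solar_wm2])
model = GradientBoostingRegressor().fit(X, capacity_mw)

# Predicted capacity for a hypothetical set of current readings
print(model.predict([[8.0, 9.5, 120.0]]))
```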

      When coupled with high-performance cloud computing, AI can significantly accelerate the development of innovative sustainability solutions. R&D teams are already deploying AI solutions in materials science. Using digital twins to simulate and predict the properties of materials that could be used in new kinds of batteries, they’re cutting development time down from years to weeks.

      What’s more, AI and digitalization are empowering the human workforce, especially when it comes to sustainability initiatives. Some companies are already using AI to collect Scope 3 emissions data from suppliers, streamlining the process and reducing the administrative burden on employees. This leaves humans more time for strategizing, decision-making, and implementation.

      Effective use of AI

      So how can organizations leverage AI to drive their sustainability agenda?

      Precision is key. Carefully optimized AI consumes fewer resources – and gets the job done more efficiently. Recent research shows that organizations are achieving measurable efficiencies, leading to cost reductions ranging from 26% to 31%. To find the best solution, companies first need to identify their needs. Guided by an expert partner like Capgemini, they can then choose the proper algorithm, model, and agents for each use case. Capgemini can then streamline deployment by seamlessly and securely integrating agentic capabilities into a company’s existing technology infrastructure.

      It’s also important for companies not to neglect the fundamentals on either side of AI agents: humans and data. To operate most efficiently, AI agents need access to robust and reliable data sets. They also need to be directed by human employees – who themselves need to be trained in AI management. With precise direction and clear data to process, AI agents can make a significant contribution to any sustainability initiative.

      A tool for a more sustainable future

      While digitalization increases energy and resource consumption, AI represents a powerful lever for making these digital processes more efficient. Organizations can strategically leverage AI’s analytical and predictive power to not only reduce their environmental footprint but also empower their workforce.

      Authors

      Régis Lavisse

      Sustainability Lead, Microsoft France

      Régis began his career as an operational manager of sales and technical teams in the electricity and gas industries. Passionate about the impact of technologies on economy and societies, and in the face of the environmental emergency, he joined ENGIE Digital in 2017 to accelerate the transition to a carbon-neutral economy through digital technology. Having joined Microsoft in 2023, Régis is now leading Sustainability for Microsoft France.

      Mark Oost

      AI, Analytics, Agents Global Leader, Capgemini

      Prior to joining Capgemini, Mark was the CTO of AI and Analytics at Sogeti Global, where he developed the AI portfolio and strategy. Before that, he worked as a Practice Lead for Data Science and AI at Sogeti Netherlands, where he started the Data Science team, and as a Lead Data Scientist at Teradata and Experian. Throughout his career, Mark has worked with clients from various markets around the world and has used AI, deep learning, and machine learning technologies to solve complex problems.

      In uncertain times, supply chains need better insights enabled by agentic AI

      Dnyanesh Joshi
      Jun 26, 2025

      Intelligent decision-making has never been so important, and agentic AI is a technology that can deliver the actionable insights the chief supply chain officer needs to build resilience and agility.

      To call the current business climate volatile is an understatement – and at enterprises across multiple industrial sectors, the people most keenly impacted by the resulting uncertainty are likely those responsible for managing their organization’s supply chains. These vital logistical links are subject to powerful external forces – from economic and political factors to environmental impacts and changes in consumer behavior. It’s critical that the executives in charge of supply chains, and their teams, take advantage of every tool to make smarter decisions.

      New, multi-AI agent systems can deliver the insights that not only make supply chains more resilient, but also help executives identify opportunities to reduce logistics costs. But organizations must be ready to take advantage of these powerful tools. Preparing for success includes creating the right roadmap and engaging the right strategic technology partner.

      Common pain points in the chain

      In my conversations with chief supply chain officers, I’ve identified several common pain points they’re keen to address. Most are being challenged to improve supply planning, reduce inventory cycle times and costs, better manage logistics investments, and do a better job of assessing risks associated with suppliers and other partners across their ecosystem.

      A company’s own data is an important source of the information required to help CSCOs achieve these goals and to enable agentic AI. Unfortunately, legacy business intelligence systems are not up to the task. There are several ways in which they fail to deliver:

      • Analytics systems rarely support strategic foresight and transformative innovation – instead providing business users with yet another dashboard.
      • The results are often, at best, a topic for discussion at the next team meeting – not sufficient for a decision-maker to act upon immediately and with confidence.
      • Systems typically fail to personalize their output to provide insights contextualized for the person viewing them – instead offering a generic result that satisfies nobody.
      • Systems often aggregate data within silos, which means their output still requires additional interpretation to be valuable.

      In short, many legacy systems miss the big picture, miss actionable meaning, miss the persona – and miss the point.

      Based on my experience, I recommend an organization address this through multi-AI agent systems.

      With the introduction of the Gen AI Strategic Intelligence System by Capgemini, this could be the very system that bridges the gap between the old way of working and a value-driven future. The system converts the vast amounts of data generated by each client, across their enterprise, into actionable insights. It is agentic: it operates continuously and is capable of independent decision-making, planning, and execution without human supervision. This agentic AI solution examines its own work to identify ways to improve it, rather than simply responding to prompts. It is also able to collaborate with multiple AI agents with specialized roles to engage in more complex problem-solving and deliver better results.
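
      To illustrate the pattern only (this is not the Capgemini system itself), the sketch below wires together a few specialised agents that hand work to one another and critique the result before it reaches a decision-maker. The model call is a stub so the structure runs as-is.

```python
from dataclasses import dataclass, field

def llm_stub(prompt: str) -> str:
    # Placeholder for a real model call
    return f"[model response to: {prompt[:60]}...]"

@dataclass
class Agent:
    role: str
    history: list = field(default_factory=list)

    def act(self, task: str) -> str:
        answer = llm_stub(f"As the {self.role}, {task}")
        self.history.append((task, answer))
        return answer

planner = Agent("supply-chain planner")
analyst = Agent("root-cause analyst")
reviewer = Agent("quality reviewer")

kpi_alert = "on-time delivery dropped 4% in region North last week"
plan = planner.act(f"break down how to investigate: {kpi_alert}")
analysis = analyst.act(f"execute this plan against warehouse and carrier data: {plan}")
critique = reviewer.act(f"check this analysis for gaps before it reaches the CSCO: {analysis}")
print(critique)
```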

      How would organizations potentially go about doing this?

      Establish an AI-driven KPI improvement strategy

      First, organizations must establish a well-defined roadmap to take full advantage of AI-enabled decision-making – one that aligns technology with business objectives.

      For CSCOs, this starts by identifying the end goals – the core business objectives and associated KPIs relevant to supply chain management. These are the basis upon which the supply chain contributes to the organization’s value, and strengthening them is always a smart exercise. The good news is that even small improvements to any of these KPIs can deliver enormous benefits.

      The roadmap should take advantage of pre-existing AI models to generate predictive insights. It should also ensure scalability, reliability, and manageability of all AI agents – not just within the realm of supply chain management, but throughout the organization. That also means it should be designed to leverage domain-centric data products from disparate enterprise resource planning and IT systems without having to move them to one central location.

      Finally, the roadmap must identify initiatives to ensure the quality and reliability of the organization’s data by pursuing best-in-class data strategies. These include:

      • Deploying the right platform to build secure, reliable, and scalable solutions
      • Implementing an enterprise-wide governance framework
      • Establishing the guardrails that protect data privacy, define how generative AI can be used, and shield brand reputation.

      An experienced technology partner

      Second, the organization must engage the right strategic partner – one that can provide business transformation expertise, industry-specific knowledge, and innovative generative AI solutions.

      Capgemini leverages its technology expertise, its partnerships with all major Gen AI platform providers, and its experience across multiple industrial sectors to design, deliver, and support generative AI strategies and solutions that are secure, reliable, and tailored to the unique needs of its clients.

      Capgemini’s solution draws upon the client’s data ecosystem to perform root-cause analysis of KPI changes and then generates prescriptive recommendations and next-best actions – tailored to each persona within the supply chain team. The result is goal-oriented insights aligned with business objectives, ready to empower the organization through actionable roadmaps for sustainable growth and competitive advantage.

      Applying agentic AI to the supply chain*

      Here’s a use case that demonstrates the potential of an agentic AI solution for supply chain management.

      An executive responsible for supply chain management is looking for an executive-level summary and 360-degree visualization dashboard. They want automated insights and recommended next-best actions to identify savings opportunities.

      An analytics solution powered by agentic AI can incorporate multiple KPIs into its analysis – including logistics spend, cost per mile, cycle time, on-time delivery rates, cargo damage, and claims. It can also track performance of third-party logistics service providers – including on-time performance, adherence to contractual volumes, freight rates, damages, and tender acceptance.

      The solution can then apply AI and machine learning to optimize asset use through better design of loadings and routes. Partner performance can be analyzed – including insights into freight rates, delays, financial compliance, and lead times – and used to negotiate better rates.
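
      As a small illustration of the kind of KPI computation that feeds such analysis, the sketch below derives spend, cost per mile, on-time rate, and tender acceptance per carrier from a shipment table. The schema and figures are invented.

```python
import pandas as pd

shipments = pd.DataFrame({
    "carrier":         ["3PL-A", "3PL-A", "3PL-B", "3PL-B"],
    "cost_usd":        [1200.0, 950.0, 1400.0, 1100.0],
    "miles":           [400, 310, 520, 380],
    "on_time":         [True, False, True, True],
    "tender_accepted": [True, True, False, True],
})

kpis = shipments.groupby("carrier").agg(
    spend_usd=("cost_usd", "sum"),
    cost_per_mile=("cost_usd", lambda c: c.sum() / shipments.loc[c.index, "miles"].sum()),
    on_time_rate=("on_time", "mean"),
    tender_acceptance=("tender_accepted", "mean"),
)
print(kpis)
```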

      The impact of this can include a reduction in logistics spend of approximately 10 percent, an opportunity to save approximately five percent through consolidation of routes and services, and a 15 percent improvement in transit lead time.

      Capgemini enables this use case through an AI logistics insights 360 solution offered for the Gen AI Strategic Intelligence System by Capgemini. Just imagine this agent working 24/7 on your behalf: it doesn’t sleep, it doesn’t get tired, it doesn’t take vacation, and it’s completely autonomous.

      Real results that relieve supply chain pressures

      Capgemini’s modeling suggests that with the right implementation and support, the potential benefits include reducing overall supply chain spending by approximately five percent – including a 10-percent reduction in logistics spend. Other benefits include a three percent improvement in compliance, plus 360-degree order visibility and tracking.

      Given that today’s supply chains are being subjected to so many pressures from so many sources, those are meaningful advantages that cannot be ignored.

      *Results based on industry benchmarks and observed outcomes from similar initiatives with clients. Individual results will vary.

      The Gen AI Strategic Intelligence System by Capgemini works across all industrial sectors and integrates seamlessly with various corporate domains. Download our PoV here to learn more, or contact the expert below if you would like to discuss this further.

      Meet the authors

      Dnyanesh Joshi

      Large Deals Advisory, AI/Analytics/Gen-AI based IT/Business Delivery oriented Deals Shaping Leader
      Dnyanesh is a seasoned large-deals advisory and AI/analytics/Gen AI-based IT and business delivery deal-shaping leader with 24+ years of experience in winning large deals through value creation: pricing strategy, accelerator frameworks and products, Gen AI-based strategic operating models and productivity gains, enterprise data strategy, enterprise data governance, Gen AI and supervised, unsupervised, and machine learning-based business metrics enhancements, and technology consulting. Other areas of expertise include pre-sales and solution selling, product development, global program delivery, and the implementation of transformational technologies within the BFSI, telecom, and energy-utilities domains.

        Achieving regulatory excellence with India’s managed services 

        Syed Sanaur Rab
        Jun 26, 2025

        A revolution in trade and transaction reporting 

        With rising regulatory pressures and data challenges, financial institutions increasingly demand efficient trade and transaction reporting. In response, there is a noticeable shift toward managed services solutions, with India emerging as a key destination. 

        India’s rise in this domain can be attributed to several factors, including a robust talent pool, cost advantages, technological innovation, and operational efficiency, and organizations like Capgemini are playing a central role in transforming how trade and transaction reporting is managed. In today’s financial landscape, institutions are under mounting pressure to meet increasingly complex and evolving regulatory requirements. From trade and transaction reporting (TTR) to data reconciliation, regulatory submissions, and analytics, the operational burden is growing – and so are the associated costs. Compliance is no longer just a legal necessity; it’s a strategic imperative that demands specialized expertise, scalable infrastructure, and round-the-clock operational support.

        These challenges are compounded by the need for agility, accuracy, and cost-efficiency. Financial institutions must navigate a web of jurisdictional rules, manage vast volumes of data, and ensure timely reporting – all while keeping operational costs in check. This is where India’s value proposition becomes particularly compelling. 

        India has rapidly emerged as a global hub for managed services, offering a unique blend of deep domain expertise, advanced technological capabilities, and cost-effective delivery models. Its workforce is not only technically proficient but also increasingly specialized in financial services operations, regulatory compliance, and digital transformation. 

        At Capgemini, we are leveraging this strategic advantage through our Post-Trade Transaction Reporting Practice. By expanding the scope of managed services beyond reporting, Capgemini is helping financial institutions transform compliance from a cost centre into a source of strategic value. This blog explores how India is not just supporting this shift – but leading it.

        Leveraging a vast talent pool 

        India has long been recognized for its diverse and highly skilled workforce, and the financial services sector is no exception. With a large pool of professionals possessing a unique blend of expertise in finance, technology, and regulatory compliance, India is increasingly recognized for its ability to manage complex reporting requirements. These professionals bring a strong understanding of global financial markets, regulatory standards, and the technologies required to handle large-scale data processing, making India an ideal base for supporting trade and transaction reporting needs. 

        For financial institutions, this talent pool offers deep expertise in navigating regulatory landscapes such as MiFID II, EMIR, Dodd-Frank, and SFTR. These frameworks demand stringent data reporting and reconciliation processes – an area where India’s workforce excels.

        Cost-effectiveness and operational efficiency 

        In addition to technical expertise, India offers a significant cost advantage, making it an attractive destination for financial institutions aiming to optimize operational costs. Institutions are under constant pressure to streamline processes and reduce overhead while maintaining compliance and reporting accuracy. Leveraging managed services in India can significantly lower operational costs, as labor expenses are considerably lower than in many Western markets. 

        Moreover, the cost-effectiveness extends beyond just labor. Infrastructure and technology investments in India can be more easily scaled, allowing financial institutions to adopt cutting-edge solutions at a fraction of the cost. This provides access to best-in-class capabilities without the need for substantial capital expenditures. 

        Technological innovation and automation 

        India is increasingly becoming a global leader in IT infrastructure and innovation, with a focus on technologies transforming the trade and transaction reporting landscape. At Capgemini, there is a strong emphasis on integrating advanced technologies such as automation, data analytics, and AI into managed services offerings. 

        AI and machine learning streamline data aggregation, reconciliation, and validation, significantly reducing manual errors and improving speed and accuracy. These technologies enable financial institutions to achieve shorter turnaround times, ensuring that they meet regulatory deadlines and respond quickly to market changes. 

        As adoption of these technologies accelerates, India is becoming a key player. By partnering with Indian service providers, financial institutions can stay ahead of regulatory and technological curves as well as emerging market trends. 

        24/7 operational capabilities

        Financial markets operate continuously, requiring reliable, round-the-clock support for reporting functions. India, with its well-established infrastructure, offers a 24/7 operational model, ensuring financial institutions meet their reporting obligations across time zones. 

        Indian teams offer continuous monitoring and rapid response. This uninterrupted support is critical for global financial institutions with operations in multiple regions, ensuring seamless compliance and reporting activities across different time zones. 

        Post-trade transaction reporting and specialized expertise

        One of the key areas in which India excels is post-trade transaction reporting. This includes critical processes like data reconciliation, regulatory submissions, and compliance checks that ensure transparency and reduce market risks. By focusing on building specialized talent pools, including subject matter experts, India enables firms to navigate the complexities of global regulations such as EMIR and the U.S. Dodd-Frank Act.

        Capgemini, for example, has established a dedicated Post-Trade Transaction Reporting Practice that helps financial institutions optimize operations by streamlining these processes. Using advanced analytics, automation, and regulatory expertise, it helps clients reduce risk and ensure compliance. Centralizing these processes delivers cost-effective, high-quality services that are vital to managing regulatory obligations.

        Regulatory change management   

        As financial regulations evolve globally, institutions must be agile and adapt their systems and processes in real time. Regulatory change management is a key area where Indian managed services providers add value. Changes in regulatory frameworks can be complex and costly to implement, particularly when new rules require re-architecting internal systems or updating reporting platforms. 

        Capgemini offers specialized solutions to help financial institutions navigate these changes. Whether it’s adapting existing systems to meet new regulations or developing entirely new platforms for reporting, Capgemini supports its clients through every phase of the change management process. This proactive approach ensures that financial institutions remain compliant with evolving regulations while avoiding costly penalties or operational disruptions. 

        Conclusion

        India’s emergence as a hub for trade and transaction reporting managed services reflects a broader shift toward outsourcing and automation in the financial services industry. With a wealth of talent, cost advantages, and a strong focus on technological innovation, India is transforming the way financial institutions manage regulatory compliance and reporting.  

        Author

        Syed Sanaur Rab

        Manager

          Unlocking the future of Project Management-as-a-Service through the power of Gen AI

          Przemysław Struzik, Iwona Drążkiewicz, Bernadetta Siemianowska
          Jun 26, 2025

          Several global trends – particularly the rise in digital transformations, the growing importance of connected technologies, and the demographic shifts affecting the global workforce – are likely to soon lead to a shortage of professionals in project management (PM), organizational change management (OCM), and Global Business Services (GBS).

          In this context, the integration of connected technologies may provide a solution. One of the most promising developments is the emergence of Project Management as a Service (PMaaS) driven by Generative AI (Gen AI). This future-ready platform is poised to revolutionize reporting, resource management, portfolio and program management, and more, significantly reducing the workload of project managers by the end of 2030.

          The Connected Enterprise and Gen AI

          The concept of a Connected Enterprise revolves around the seamless integration of data, connectivity, and technology to drive business innovation, enhance efficiency, and foster growth. Gen AI, with its ability to generate human-like text, analyze vast amounts of data, and provide actionable insights, is at the forefront of this transformation.

          By leveraging Gen AI, PMaaS platforms offer unprecedented levels of automation and intelligence, richer predictive insights and strategic advice, and scalable solutions available 24/7, enabling organizations to streamline their project management processes. This results in better project outcomes, reduced risk, and significant cost savings for Capgemini’s clients.

          Transforming reporting and analytics

          Traditional project reporting is often a time-consuming and labor-intensive task. Gen AI automates the generation of reports by analyzing project data in real-time and presenting it in a clear, concise, and visually appealing format. For example:

          • Gen AI not only collects updates but also generates custom reports based on predefined criteria.
          • It creates tailored reports for different stakeholders (e.g. project managers, clients, or executives) by transforming raw data into insightful summaries, charts, or KPIs.
          • It also creates interactive dashboards that display real-time project data and updates in a visual and intuitive way.
          • Moreover, Gen AI automatically gathers and compiles project updates by integrating with tools such as task management platforms (e.g. Jira, Wrike, Smartsheet) and collaboration tools (e.g. Microsoft Teams). It extracts data on project progress, task completion rates, budget use and milestones without manual input from team members.

          This saves time and ensures that stakeholders have access to up-to-date information, enabling better decision-making.
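
          To give a sense of the aggregation step underneath such reporting, here is a minimal sketch that turns raw task records (a hard-coded list standing in for a Jira or Smartsheet export) into the per-workstream summary a Gen AI layer could then narrate for each stakeholder.

```python
import pandas as pd

# Stand-in for an export from a task management tool
tasks = pd.DataFrame({
    "workstream":    ["Data migration", "Data migration", "UAT", "UAT", "UAT"],
    "status":        ["Done", "In Progress", "Done", "Done", "Blocked"],
    "estimate_days": [5, 8, 3, 2, 4],
})

summary = (
    tasks.assign(done=tasks["status"].eq("Done"))
    .groupby("workstream")
    .agg(
        total_tasks=("status", "size"),
        completion_rate=("done", "mean"),
        blocked=("status", lambda s: (s == "Blocked").sum()),
    )
)
print(summary)
# A Gen AI layer would combine `summary` with milestone and budget data and
# draft the stakeholder-specific narrative described above.
```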

          Enhancing resource management

          The complexity of resource allocation will be reduced as Gen AI helps match the right skills to the right tasks (for example, via a profile-match percentage), considering availability (globally or regionally), business priorities, skills, and project demands (the scope of each project management task can be split between junior and senior resources).

          Gen AI will enable dynamic adjustments to resource plans, further eliminating inefficiencies and ensuring optimal resource utilization across portfolios. Additionally, Gen AI provides insights into resource utilization patterns, helping organizations make informed decisions about hiring and training.
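
          A toy version of this matching logic is sketched below: each available resource is scored against a task’s required skills and the best fit is ranked first. The profiles and skills are invented, and a real implementation would also weight seniority, cost, and utilisation as described above.

```python
def match_score(required: set, profile: set) -> float:
    """Fraction of the required skills covered by a candidate's profile."""
    return len(required & profile) / len(required)

task_skills = {"python", "power-bi", "stakeholder-management"}

candidates = {
    "junior_pm": {"skills": {"power-bi", "jira"}, "available": True},
    "senior_pm": {"skills": {"python", "power-bi", "stakeholder-management"}, "available": True},
    "lead_pm":   {"skills": {"python", "stakeholder-management", "risk"}, "available": False},
}

ranked = sorted(
    ((name, match_score(task_skills, c["skills"]))
     for name, c in candidates.items() if c["available"]),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranked)  # [('senior_pm', 1.0), ('junior_pm', 0.333...)]
```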

          Streamlining portfolio and program management

          Managing a portfolio of projects and programs requires a holistic view of all ongoing initiatives. Gen AI provides this by aggregating data from multiple projects and presenting it in a unified dashboard. This enables portfolio and project managers to monitor progress, identify risks, and make strategic adjustments in real-time. Furthermore, Gen AI simulates various scenarios to predict the impact of different decisions, enabling proactive management.

          Reducing administrative burden and personalized knowledge management

          One of the most significant benefits of Gen AI in PMaaS is the reduction in administrative tasks it delivers. For example:

          • Onboarding new program team members is simplified through personalized learning paths based on the role, experience, and learning style of the new team member.
          • AI-powered virtual assistants or chatbots can support new team members by answering frequently asked questions about specific tools and workflows.
          • Gen AI can analyze new team members’ tasks and project assignments, proactively delivering relevant knowledge resources or updating to-do lists for any team member.
          • It can also handle meeting scheduling by automatically finding suitable times, reminding participants about upcoming meetings and agenda points, and sending follow-up emails with action points to help keep everyone on track.

          This enables project managers to focus on more strategic activities, such as stakeholder engagement and risk management.

          Predictive analytics for project outcomes

          Gen AI predicts the likelihood of project success based on factors such as team performance, project complexity, and external influences. Leveraging historical data, real-time project inputs, and machine learning models to forecast outcomes, this technology can also recommend corrective actions if a project is off track against its predicted outcomes.
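
          As an illustrative sketch of the forecasting idea (with entirely synthetic data), a simple classifier can be fitted on historical project records and used to score an in-flight project’s likelihood of on-time delivery:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400

# Synthetic historical project records
team_velocity = rng.normal(1.0, 0.2, n)  # delivered vs planned story points
scope_changes = rng.poisson(2, n)        # change requests to date
complexity = rng.uniform(1, 5, n)        # 1 = simple, 5 = very complex

# Synthetic outcome: higher velocity helps; churn and complexity hurt
logit = 3 * (team_velocity - 1) - 0.4 * scope_changes - 0.5 * (complexity - 3)
on_time = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([team_velocity, scope_changes, complexity])
model = LogisticRegression().fit(X, on_time)

# Probability of on-time delivery for a hypothetical in-flight project
print(model.predict_proba([[0.9, 5, 4.0]])[0, 1])
```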

          The future of PMaaS

          As we look towards the future, the integration of Gen AI in PMaaS platforms will continue to evolve. Advanced natural language processing capabilities will enable more intuitive interactions with project management tools, making them accessible to a broader range of users.

          Additionally, the continuous learning capabilities of Gen AI will ensure that these platforms become increasingly accurate and efficient over time.

          Conclusion

          While concerns about accuracy and governance remain, advances in AI-driven risk mitigation strategies and tighter oversight will address these issues effectively. As a result, PMaaS platforms powered by Gen AI will drastically reduce the need for manual project management tasks, enabling organizations to scale project execution with unprecedented speed and efficiency. This enhances efficiency and enables project managers to focus on strategic activities that drive business growth. As connected technologies continue to advance, the Connected Enterprise will become a reality, powered by the intelligent capabilities of Gen AI.

          PMaaS, driven by Generative AI, will be the cornerstone in realizing this vision. Leveraging AI’s capabilities, PMaaS seamlessly aligns portfolios, manages resources, and optimizes operations across departments and regions, echoing Capgemini’s approach of delivering continuous, digital, and sustainable business value. This future holds tremendous promise for the PMaaS model, making it indispensable to companies that aim to stay competitive in a rapidly evolving digital economy.

          A Connected Enterprise ensures that every aspect of an organization—from operations to customer experience—operates in sync. Similarly, AI-enabled PMaaS will create more cohesive, transparent, and agile project environments driven by data-driven insight and predictive analysis. In this future state, organizations will no longer see project management as a support function but as an integrated service that drives growth, adaptability, and long-term sustainability. Just as Capgemini’s model emphasizes continuous value delivery, the future of PMaaS promises to be a key driver of the Connected Enterprise—bridging silos, fostering collaboration, and ensuring that business outcomes are consistently achieved.

          At Capgemini, the future of PMaaS lies in harnessing the collective power of our specialized teams to deliver unparalleled value to our clients. This means our clients benefit from a holistic transformation experience—one that enhances data agility, drives sustainability, and ensures that every project not only meets but also exceeds expectations.

          This is the future of PMaaS: a fusion of technological innovation and expert collaboration, creating a trusted partnership that helps clients thrive in an ever-evolving business landscape.

          Meet our experts


          Przemysław Struzik

          IFAO Transformation Projects & Consulting, Capgemini’s Business Services
          Przemyslaw helps organizations future-proof their delivery models by scaling Project Management-as-a-Service through Gen AI and helps shape and deliver innovative solutions to clients.

          Iwona Drążkiewicz

          Business Transformation Manager, Capgemini’s Business Services
          Iwona drives business transformation through optimizing and automating clients’ process infrastructure by designing and implementing program management that augments deployment effectiveness and efficiency.

          Bernadetta Siemianowska

          Business Transformation Manager, Capgemini’s Business Services
          Bernadetta drives business transformation through optimizing and automating clients’ process infrastructure by designing and implementing program management that augments deployment effectiveness and efficiency.

            Introducing Snowflake Openflow: Revolutionizing data integration 

            Sagar Lahiri
            Jun 25, 2025

            In today’s data-driven world, the ability to seamlessly integrate and manage data from various sources is crucial for businesses. Snowflake, a leader in data cloud solutions, has introduced a groundbreaking service called Snowflake Openflow. This fully managed, global data integration service is designed to connect any data source to any destination, supporting both structured and unstructured data. Let’s dive into what makes Snowflake Openflow a game-changer. 

Openflow stands out for the way it separates the control plane from the data plane, which allows for more flexible and efficient management of data pipelines. Here are some key features that make Openflow exceptional:

Centralized control: Openflow provides a single control plane for managing data planes and runtimes. This centralization simplifies administration and makes it easier to apply consistent policies across pipelines.

Programmability: It allows data engineers to build and modify pipelines dynamically from Openflow processors and controller services, which accelerates the introduction of new data flows and features.

Scalability: Openflow supports scalable configurations, making it suitable for both small- and large-scale deployments.

High availability: As a fully managed, global service, Openflow is designed to keep data flowing without hands-on infrastructure maintenance.

Flexibility: Openflow supports multiple runtimes, custom pipeline processing, and different deployment modes, providing a high degree of flexibility in pipeline design and operation.

            What is Snowflake Openflow? 

            Snowflake Openflow is built on Apache NiFi®, an open-source data integration tool that automates the flow of data between systems. Openflow enhances Apache NiFi® by offering a cloud-native refresh, simplified security, and extended capabilities tailored for modern AI systems. This service ensures secure, continuous ingestion of unstructured data, making it ideal for enterprises. 

            Openflow and Apache NiFi stand out as superior data integration tools due to their robust ETL/ELT capabilities and efficient handling of CDC (change data capture) transformations. Openflow’s seamless integration with Snowflake and AWS, combined with its user-friendly CLI, simplifies the management of data pipelines and ensures high performance and scalability. 

Some of the components of Openflow are (a conceptual sketch of how they fit together follows this list):

• Control Plane: The Openflow control plane is a multi-tenant application, designed to run on Kubernetes within your container platform. It is the backend component that manages the creation of data planes and Openflow runtimes.
• Data Plane: The Data Plane is where data pipelines execute, within individual Runtimes. You will often have multiple Runtimes, all associated with a single Data Plane, to isolate different projects or teams, or to separate SDLC stages.
• Runtime: Runtimes host your data pipelines, with the framework providing security, simplicity, and scalability. You can deploy Openflow Runtimes in your VPC using a CLI, deploy Openflow Connectors to your Runtimes, and build new pipelines from scratch using Openflow processors and controller services.
• Data Plane Agent: The Data Plane Agent creates the Data Plane infrastructure and installs the Data Plane software components, including the Data Plane Service. It authenticates with the Snowflake System Image Registry to obtain Openflow container images.
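
To make the relationship between these components concrete, here is a minimal, purely illustrative Python sketch of the hierarchy described above. The class and field names are assumptions for illustration only and are not part of the Openflow API.

```python
from dataclasses import dataclass, field
from typing import List

# Purely illustrative model of the Openflow component hierarchy described
# above; these class and field names are NOT part of any Snowflake API.

@dataclass
class Runtime:
    name: str                                            # e.g. one runtime per team or project
    pipelines: List[str] = field(default_factory=list)   # connectors / flows deployed here

@dataclass
class DataPlane:
    vpc: str                                             # where pipelines actually execute
    runtimes: List[Runtime] = field(default_factory=list)

@dataclass
class ControlPlane:
    """Multi-tenant backend that manages data planes and their runtimes."""
    data_planes: List[DataPlane] = field(default_factory=list)

    def register_data_plane(self, plane: DataPlane) -> None:
        self.data_planes.append(plane)

# A single data plane isolating two teams into separate runtimes.
plane = DataPlane(vpc="customer-vpc", runtimes=[
    Runtime(name="finance-team", pipelines=["sharepoint-ingest"]),
    Runtime(name="iot-team", pipelines=["kafka-ingest"]),
])
control = ControlPlane()
control.register_data_plane(plane)
```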

Workflow summary (a sketch of the Bronze-to-Silver transformation step follows this list):

            • AWS cloud engineer/administrator: installs and manages Data Plane components via Openflow CLI on AWS. 
            • Data engineer (pipeline author): authenticates, creates, and customizes data flows; populates Bronze layer. 
            • Data engineer (pipeline operator): configures and runs data flows. 
            • Data engineer (transformation): transforms data from Bronze to Silver and Gold layers. 
            • Business user: utilizes Gold layer for analytics. 
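
To make the transformation step concrete, here is a minimal sketch that uses the Snowflake Python connector to promote records from a Bronze table to a Silver table. The connection parameters, table names, and SQL are hypothetical placeholders rather than Openflow artifacts; in this workflow, the Openflow connectors have already landed the raw data in the Bronze layer.

```python
import snowflake.connector

# Hypothetical connection details; in practice these would come from a
# secrets manager rather than being hard-coded.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MEDALLION",
)

# Promote raw (Bronze) records into a cleaned, deduplicated Silver table.
# Table and column names are illustrative only.
conn.cursor().execute("""
    CREATE OR REPLACE TABLE SILVER_ORDERS AS
    SELECT DISTINCT
        order_id,
        customer_id,
        TRY_TO_TIMESTAMP(order_ts_raw) AS order_ts,
        amount
    FROM BRONZE_ORDERS
    WHERE order_id IS NOT NULL
""")
conn.close()
```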

            Key aspects of Apache NiFi 

            Dataflow automation: NiFi automates the movement and transformation of data between different systems, making it easier to manage data pipelines. 

            Web-based interface: It provides a user-friendly web interface for designing, controlling, and monitoring dataflows. 

            FlowFiles: In NiFi, data is encapsulated in FlowFiles, which consist of content (the actual data) and attributes (metadata about the data). 

            Processors: These are the core components that handle data processing tasks such as creating, sending, receiving, transforming, and routing data. 

            Scalability: NiFi supports scalable dataflows, allowing it to handle large volumes of data efficiently. 

Apache NiFi’s intuitive web-based interface and powerful processors enable users to automate complex dataflows with ease, offering a high degree of flexibility and control. Together, Openflow and NiFi give data engineers and business users reliable data ingestion, transformation, and analytics, making them a strong choice for modern data integration needs.
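
For readers new to NiFi, the short Python sketch below mimics the FlowFile idea of pairing content with attributes and passing it through a processor-like function. It is a conceptual illustration only; NiFi’s actual FlowFile and processor APIs are Java-based and are not shown here.

```python
from dataclasses import dataclass, field
from typing import Dict

# Conceptual stand-in for a NiFi FlowFile: content plus metadata attributes.
@dataclass
class FlowFile:
    content: bytes
    attributes: Dict[str, str] = field(default_factory=dict)

# Conceptual stand-in for a processor: inspect the content and tag the
# FlowFile so a downstream connection could route it.
def route_on_size(flowfile: FlowFile, threshold: int = 1024) -> FlowFile:
    relationship = "large" if len(flowfile.content) > threshold else "small"
    flowfile.attributes["route"] = relationship
    return flowfile

ff = FlowFile(content=b'{"order_id": 42}', attributes={"source": "kafka"})
ff = route_on_size(ff)
print(ff.attributes)  # {'source': 'kafka', 'route': 'small'}
```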

            Key features of Snowflake Openflow 

1. Hybrid deployment options: Openflow supports both Snowflake-hosted and Bring Your Own Cloud (BYOC) options, providing flexibility for different deployment needs.
2. Comprehensive data support: It handles all types of data, including structured, unstructured, streaming, and batch data.
3. Global service: Openflow is designed to be a global service, capable of integrating data from any source to any destination.

How Openflow works

            Openflow simplifies the data pipeline process by managing raw ingestion, data transformation, and business-level aggregation. It supports various applications and services, including OLTP, internet of things (IoT), and data science, through a unified user experience. 

            Deployment and connectors 

            Openflow offers multiple deployment options: 

• BYOC: deployed in the customer’s VPC.
• Managed in Snowflake: utilizing Snowflake’s platform.

            It also supports a wide range of connectors, including SaaS, database, streaming, and unstructured data connectors, ensuring seamless integration with various data sources. 

            Key use cases 

1. High-speed data ingestion: Openflow can ingest data at multi-GB/sec rates from sources like Kafka into Snowflake’s Polaris/Iceberg (a hand-rolled sketch of this kind of pipeline follows this list).
2. Continuous multimodal data ingestion for AI: Near real-time ingestion of unstructured data from sources like SharePoint and Google Drive.
3. Integration with hybrid data estates: Deploy Openflow as a fully managed service on Snowflake or in your own VPC, either in the cloud or on-premises.
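
To give a feel for the kind of Kafka-to-Snowflake ingestion that an Openflow connector manages for you, here is a deliberately hand-rolled sketch using kafka-python and the Snowflake Python connector. This is not Openflow code; the topic, table, and connection details are hypothetical, and a managed connector would add the batching, retries, and scaling omitted here.

```python
import json

from kafka import KafkaConsumer        # kafka-python, assumed installed
import snowflake.connector

# Consume JSON events from a hypothetical "orders" topic.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Hypothetical Snowflake landing zone for the Bronze layer.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="INGEST_WH", database="RAW", schema="BRONZE",
)
cur = conn.cursor()

for message in consumer:
    record = message.value
    # Land each event as-is; downstream jobs refine it into Silver/Gold.
    cur.execute(
        "INSERT INTO BRONZE_ORDERS (order_id, payload) "
        "SELECT %s, PARSE_JSON(%s)",
        (record.get("order_id"), json.dumps(record)),
    )
```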

            Roadmap and future developments 

            Snowflake has outlined an ambitious roadmap for Openflow, with key milestones including private and public previews, general availability, and the introduction of new connectors. The service aims to support a wide range of databases, SaaS applications, and unstructured data sources by the end of 2025. 

            Conclusion 

            Snowflake Openflow is set to revolutionize the way businesses handle data integration. With its robust features, flexible deployment options, and comprehensive support for various data types, Openflow is poised to become an essential tool for enterprises looking to harness the power of their data. 

            Sagar Lahiri

            Data Architect, Insights & Data
Tech enthusiast, passionate about modern data platforms at Capgemini Insights & Data. As a Snowflake Data Engineer and Architect, I specialize in helping clients unlock the full potential of their data. With a deep understanding of Snowflake’s cloud-native architecture, I design and implement scalable, secure, and high-performance data solutions tailored to each organization’s unique needs.

              Five reasons why digital accessibility must matter

              Capgemini
              Laurie Bazelmans, Amish Desai
              Jun 23, 2025

              Sarah wants to access her new online banking account to pay an urgent bill. Frustrated and anxious, she spends close to an hour trying to complete a simple task that should only take a few minutes. Unfortunately, her bank’s digital interface is not compatible with her screen reader, which she relies on as a person with a visual impairment.

If you think this is an extremely uncommon scenario, consider that roughly 80 million people, or one-fifth of the EU’s population, live with a disability.[1] A disability can affect anyone at any time and can include temporary conditions, such as recovering from surgery or a short-term injury that prevents someone from accessing a once-routine service.

              As a brand in the digital space, you should strive to design and develop more accessible services, especially since digital accessibility can suddenly become very important for any of your customers.

              Here are five more reasons why this topic deserves your attention.

              1. Tap into a large and valuable audience

              People with disabilities will naturally favor brands that ease their accessibility challenges. A study in the Netherlands revealed that 45% of iOS and 61% of Android users “have one or more accessibility settings activated on their phone.”[2] By making features like screen reader compatibility, video captioning, and voice recognition standard practice in software development, your business will see a significant uptick in transactions thanks to new satisfied customers.

              2. Your brand reputation is on the line

              Whether physical or mental, temporary or permanent, disabilities can limit people’s access to essential services in areas like banking, transportation, healthcare, and education. As digital technologies become increasingly integral to our daily lives, addressing accessibility is not only a matter of social justice but also an economic necessity. If your business can design products and services that are easier to access by everyone, your new and existing customers will have a favorable view of your brand, which can engender strong customer loyalty.

              3. Non-compliance will be costly

Starting June 28, 2025, the European Accessibility Act (EAA) will introduce measures requiring EU businesses to adhere to the updated digital accessibility guidelines, as it aims to reduce barriers to entry and ensure everyone can participate in the digital realm.[3] This new standard will catch many affected businesses by surprise. A January 2025 survey revealed that only 11% of organizations feel confident they will meet the June deadline, while another 35% aren’t even sure whether they fall within the scope of the EAA.[4] By prioritizing accessibility today, you’ll avoid potential legal pitfalls tomorrow.

              4. A better user experience (UX) benefits everyone

              Accessibility is about more than just meeting legal requirements; it’s about creating a highly accessible service that’s more enjoyable to use. By adding new features, you enhance the overall experience for all users. Plus, did you know search engines favor accessible websites?

              5. Continuous improvement leads to long-term growth

Accessibility is not a one-time fix; it’s a continuous journey that will only become more relevant in the future. Disabilities such as visual impairments, hearing loss, cognitive decline, and reduced mobility are a fact of life as populations age, so if you want to be a business that values a diverse customer base, you’ll make accessibility a core element of your business strategy. By doing so, you’ll set a positive example in your industry, winning respect and loyalty from customers, employees, and stakeholders alike.

              For example, a national postal company recognized the importance of enhancing accessibility and collaborated with Capgemini to evaluate over 100 of their omnichannel journeys. This led to the identification of more than 250 recommendations for improving the UX of their app and website.

              Ready to create a more inclusive digital environment? Stay tuned for our follow-up article, Five steps to widespread digital accessibility.

              Contact us to learn more about how we can help you with your digital accessibility journey.


              • [1] https://www.who.int/health-topics/disability#tab=tab_1
              • [2] https://appt.org/en/stats
              • [3] https://creative-boost.com/european-accessibility-act/#:~:text=European%20Accessibility%20Act%20Exemptions,less%20than%20€2%20million
              • [4] https://abilitynet.org.uk/news-blogs/eaa-only-11-organisations-confident-they-will-meet-june-deadline

              Authors

Laurie Bazelmans

              User Experience and Front-End Interactions Offer Leader, Netherlands
              Laurie is a product and services expert at Capgemini, specializing in user experience (UX) and behavioral psychology. As Offer leader UX & Frontend Interactions and UX Business Partner, she harnesses UX as a strategic lever for business growth – translating complex customer needs and journeys into impactful, user-centered solutions. Throughout her consulting career, she has elevated digital transformation initiatives, focusing on customer needs, business goals, and structured UX strategies.

              Amish Desai

              Global User Experience and Front-End Interactions Offer Leader
              With 20+ years in digital transformation, Amish has led Fortune 100 firms to profit through design and product innovation. Highlights include training 2,000+ CPG staff in Design Thinking, pioneering digital-first ventures in finance, and launching connected commerce for a century-old retailer. His pinnacle achievement is forming global teams that excel in crafting digital customer experiences at the nexus of immersive tech, customer insight, and business value. He teaches UX design, product, and strategy in academic and entrepreneurial institutions as a token of gratitude for those who have assisted him over the years.

                Why your bank’s customer service needs to up the empathy – and AI may hold the key

                P.V. Narayan
                Jun 24, 2025

                Marketing guru Shep Hyken once said “make every interaction count, even small ones.” This quote has always stuck with me because it’s so human, and because it explains why we feel a strong emotional connection to certain brands. We are more likely to become repeat customers if we experience good customer service, even in a small interaction.

It is well known that contact center agents are the face of any bank. They are on the front lines, dealing with customer interactions and shaping your bank’s perception. The unfortunate reality is that today’s customer service isn’t living up to customers’ needs. Consumers in 2025 expect more, and it’s on banks to step up.

                Today’s consumer won’t stand for generic banking – they expect a personalized, seamless experience. More than that, they want it to feel human. Often, this demand lands with the staff at a contact center. But can we expect this staff to keep up with ever-growing customer expectations unaided? Or, even worse, can we expect the contact center to deliver a great experience when the perception is that banks are actively trying to automate away their jobs?

                Capgemini’s World Retail Banking Report 2025 finds that only 16% of agents appear satisfied with their jobs. Attrition continues to rise, increasing the cost of recruitment and time spent training agents. In between, customers are looking for empathy in basic interactions – and instead find things impersonal and procedural.

I’m convinced we’ll do right by customers if we deploy technology to help overworked agents. Technology, after all, is a tool. AI can help eliminate friction and let these agents deliver the kind of seamless experiences that customers are hungry for. By implementing predictive AI capabilities, banks can prevent issues before they occur, based on historical patterns and trends, reducing the number of complaints and spotting anomalies in real time.

                In the World Retail Banking Report, we sought to understand how 8,000 millennial and Gen Z customers view perhaps the single most important feature of their banking relationship: the card. The consensus was clear: there is room for improvement at every point of the customer journey. And there is a clear need for personal connection.

                The worrying part of our research findings was the extent to which bank teams seemed aware of dissatisfaction among customers. Consider this: 68% of banking institutions acknowledged poor customer satisfaction as a major issue. What’s more concerning is that over 60% of bank marketing staff say they are overwhelmed by the number of applications they receive, and many banks acknowledge the KYC process can take days.

                All of this is taking place against a backdrop of profound technological change. These changes have benefited nimble, digital-first players such as Monzo and Revolut. While they may seem small compared to the scale of US megabanks, they have succeeded in capturing valuable market niches. They did so by creating smooth digital experiences, broadening the aperture of services available and sidestepping much of the friction that can hinder established banks. They created real customer connection.

                AI can let US banks build this connection too, removing bottlenecks in manual processes such as card applications. At a strategic level, it can inform banking strategy, create products with in-built personalization and close the customer service gap with the emerging neobank players.

                By proactively predicting and addressing trends, the technology can assist banks in staying ahead of customer complaints and operational bottlenecks, making the process smoother for both agents and customers. 

                However, AI can’t do it alone – many customers will still want the option to connect with a human being. After all, personal finance is personal, whether it’s a customer loan application or resolving a disputed charge. But AI can empower those humans, giving them a better insight into the customer’s situation and request.

                For example, if a customer is angry about an unauthorized credit card transaction, a human agent augmented by AI can use sentiment analysis to detect the customer’s anger. The AI can then direct the query to an agent who has a high success rate in managing similar complaints and calming frustrated customers. AI can even proactively anticipate scenarios to help agents better serve customers.
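
As a rough sketch of how such sentiment-based routing could work, the snippet below scores an incoming message and picks the agent with the best track record on escalated complaints. The scoring function and agent data are hypothetical placeholders; a production system would call a real sentiment model and the bank’s own routing engine.

```python
from typing import Dict, List

def score_sentiment(message: str) -> float:
    """Hypothetical stand-in for a sentiment model: -1.0 (angry) to 1.0 (happy)."""
    angry_words = {"unauthorized", "angry", "furious", "unacceptable"}
    hits = sum(word in message.lower() for word in angry_words)
    return max(-1.0, 0.5 - 0.5 * hits)

# Illustrative agent profiles with historical success on escalated complaints.
AGENTS: List[Dict] = [
    {"name": "Agent A", "complaint_success_rate": 0.92},
    {"name": "Agent B", "complaint_success_rate": 0.71},
]

def route_message(message: str) -> str:
    """Send negative-sentiment messages to the agent best at de-escalation."""
    if score_sentiment(message) < 0.0:
        best = max(AGENTS, key=lambda a: a["complaint_success_rate"])
        return best["name"]
    return AGENTS[0]["name"]  # default queue for neutral or positive messages

print(route_message("There is an unauthorized charge on my card and I am furious."))
# -> Agent A
```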

                Furthermore, by automating routine inquiries, AI allows agents to focus on complex, high-value tasks that require empathy, creativity, and judgment – attributes that customers are increasingly expecting. In this way, AI enables agents to provide more personalized service at scale, bridging the gap between human empathy and efficiency.

                To put it simply, AI can make customer service agents much happier and more productive in their work. This takes more than a technology strategy: bank leaders will have to implement a thorough change management plan. That means educating employees about the potential of AI and their role in augmenting human capabilities, as well as clearly delineating what work will be done entirely by AI, and where AI will play a supporting role.

                It’s also crucial that banks adopt a customer-centric AI strategy, focusing not only on operational efficiencies, but also how these technologies can directly enhance customer experience and employee experience. AI’s role is not just to solve problems faster, it’s to solve them better and with more empathy, while providing seamless self-service options and empowering agents to be more competent with contextual insights and continuous learning.

                The bottom line: bank executives must push the boundaries of innovation to explore the potential of AI – in a safe and controlled fashion – that strives to deliver enhanced client engagement. It’s time to make every interaction count.

                Author

                P.V. Narayan

                EVP and Head of US Banking and Capital Markets, Capgemini