
The role of AI in wealth management

Shreya Jain
14 Oct 2022

During the last few years, Artificial Intelligence (AI) has rapidly established various use cases for itself across different areas of Financial Services (FS).

Increasingly, AI-based applications are being used not only to augment human expertise in routine tasks, but also to streamline more strategic business processes. One prominent industry area that AI is significantly transforming is Wealth Management (WM), which demands the highest levels of accuracy, impeccable precision in analysis, and insights derived from sheer volumes of data. The WM industry is starting to harness the benefits of AI in ripe use cases, the success of which is, in turn, opening doors to explore newer applications.1

Leveraging AI in Wealth Management

There are many ways for WM firms to leverage AI, with strategies and niches varying by the client segments they serve, the investment types they advocate, their overall investment philosophies, and the AI capabilities they possess. Even in the field of digital advisory alone, services range from digital-only advisory to hybrid advisory to simply augmenting portfolio rebalancing capabilities with AI-derived insights.

AI Use Cases in Wealth Management

As artificial intelligence is poised to enhance the various touchpoints in the wealth landscape, some front-running use cases that stand to benefit most readily are emerging:

Portfolio Management

AI can churn through huge volumes of data almost instantaneously and derive meaningful, context-relevant insights. Financial Institutions (FIs) can leverage this capability to generate portfolio insights sensitive to dynamic and wider contexts. Robo-advisors from FIs such as Vanguard and Charles Schwab can build, monitor, and automatically rebalance a diversified portfolio based on the client’s goals.2 3
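The rebalancing loop a robo-advisor automates can be sketched in a few lines. This is an illustrative Python sketch only: the target weights and the 5% drift band are assumptions for the example, not any provider’s actual policy.

```python
# Illustrative sketch: threshold-based rebalancing. All numbers
# (target weights, 5% drift band) are assumptions for the example.

def rebalance(holdings, targets, drift_band=0.05):
    """Return per-asset trade amounts that restore target weights,
    or an empty dict if every asset is within the drift band."""
    total = sum(holdings.values())
    weights = {a: v / total for a, v in holdings.items()}
    drifted = any(abs(weights[a] - targets[a]) > drift_band for a in targets)
    if not drifted:
        return {}  # within tolerance: no trades needed
    # Positive amount = buy, negative = sell.
    return {a: round(targets[a] * total - holdings[a], 2) for a in targets}

trades = rebalance({"stocks": 7000.0, "bonds": 3000.0},
                   {"stocks": 0.6, "bonds": 0.4})
# Stocks are overweight (0.70 vs 0.60): sell 1000 of stocks, buy 1000 of bonds.
```

A real robo-advisor would layer tax awareness, trading costs, and goal tracking on top of this basic drift check.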

Augmented Advisory and Next Best Action (NBA)

With the advent of technologies that facilitate tapping into more and more data on clients, AI can help FIs harness this ever-increasing pool to arrive at bespoke recommendations for each client. Morgan Stanley has developed an NBA system, which leverages machine learning to consider clients’ life events and generate hyper-personalized investment proposals in near-instant time frames.4
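To make the idea concrete, here is a hypothetical next-best-action ranker in Python that scores candidate actions against a client’s recent life events. The actions, events, and weights are invented for illustration; a production system such as Morgan Stanley’s learns these signals from data rather than hard-coding them.

```python
# Hypothetical NBA ranker: every action, event, and weight below is
# invented for illustration only.

ACTION_SIGNALS = {
    "discuss_529_plan": {"new_child": 0.9, "salary_increase": 0.3},
    "review_mortgage_options": {"home_purchase": 0.9},
    "rebalance_portfolio": {"salary_increase": 0.5, "market_drop": 0.7},
}

def next_best_actions(client_events, top_n=2):
    """Rank candidate actions by how strongly recent events support them."""
    scores = {
        action: sum(w for event, w in signals.items() if event in client_events)
        for action, signals in ACTION_SIGNALS.items()
    }
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [action for action, score in ranked[:top_n] if score > 0]

print(next_best_actions({"new_child", "salary_increase"}))
# -> ['discuss_529_plan', 'rebalance_portfolio']
```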

Tax Planning

Taxation is a vast and important domain for high-net-worth individuals, within which AI finds multiple use cases. From automated tax filing that appropriately classifies tax-sensitive transactions to recommending tax-saving investments, there is large scope for both generic and tailor-made AI solutions. Thus, new AI tax solutions focused on different needs are now entering the market. AiTax guarantees that clients pay the lowest amount of tax legally possible, using AI to scan for opportunities and eliminate the risk of human error.5 6
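As a toy illustration of the classification step, a minimal rule-based pass over transaction descriptions might look like the sketch below. The categories and keywords are assumptions for the example; real solutions combine such rules with trained models.

```python
# Toy rule-based classifier; categories and keywords are invented
# for illustration only.

RULES = [
    ("charitable_donation", ("donation", "charity")),
    ("retirement_contribution", ("401k", "ira contribution")),
    ("capital_gain", ("sale of shares", "stock sale")),
]

def classify_transaction(description):
    """Return the first matching tax category, or 'unclassified'."""
    text = description.lower()
    for category, keywords in RULES:
        if any(keyword in text for keyword in keywords):
            return category
    return "unclassified"

print(classify_transaction("Donation to Red Cross"))  # charitable_donation
```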

Client Onboarding

Wealth management firms face Know-Your-Customer (KYC) requirements that differ from the rest of the industry, owing to the more stringent regulatory due diligence required when screening their clients. Artificial intelligence can help automate these time- and labour-intensive tasks while adequately considering contextual relevance. Deutsche Bank Wealth Management is implementing the Finantix KYC Solution, which uses AI-powered, multi-language natural language processing to verify users. It screens adverse news and background information on existing and potential clients, and builds detailed profiles by aggregating, distilling, and classifying that information by relevance and risk level.7
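A drastically simplified version of adverse-media screening can be sketched as follows. The risk terms and the severity grading are assumptions for illustration; commercial KYC solutions use full natural language processing rather than keyword matching.

```python
# Toy adverse-media screen; the risk terms and severity rule are
# assumptions for illustration, not how commercial KYC tools work.

RISK_TERMS = {"fraud", "laundering", "sanctions", "bribery"}

def screen_client(client, articles):
    """Flag articles that mention the client together with risk terms."""
    hits = []
    for article in articles:
        text = article.lower()
        if client.lower() not in text:
            continue  # article is not about this client
        found = sorted(term for term in RISK_TERMS if term in text)
        if found:
            hits.append({
                "article": article,
                "terms": found,
                "risk": "high" if len(found) > 1 else "medium",
            })
    return hits

hits = screen_client("Acme Corp", [
    "Acme Corp fined in fraud and bribery case",
    "Local weather improves this weekend",
])
# One hit, with terms ['bribery', 'fraud'], graded 'high'.
```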

Cyber Security

With an ever-increasing amount of data being stored on cloud servers, the responsibility to protect the privacy of clients’ financial records and personal information falls on wealth management firms. This makes an ideal case for AI software with sophisticated, up-to-date, real-time monitoring capabilities that flag issues at first notice. Goldman Sachs has a fund of $72.5 million exclusively for investment in AI, of which a crucial use case is the prevention of cyberattacks using AI-powered anomaly detection software based on real-time data analysis.8
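One common building block of such monitoring is statistical anomaly detection. Below is a minimal Python sketch that flags a value far outside recent history using a sliding-window z-score; the window size, warm-up length, and 3-sigma threshold are illustrative assumptions, far simpler than a production detector.

```python
# Sliding-window z-score detector; window size, warm-up length, and
# the 3-sigma threshold are illustrative assumptions.

from collections import deque
from statistics import mean, stdev

class AnomalyDetector:
    def __init__(self, window=50, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, x):
        """Return True if x is anomalous relative to recent history."""
        anomalous = False
        if len(self.values) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(self.values), stdev(self.values)
            anomalous = sigma > 0 and abs(x - mu) / sigma > self.threshold
        self.values.append(x)
        return anomalous

detector = AnomalyDetector()
for reading in [100, 102, 99, 101, 100, 98, 103, 100, 101, 99]:
    detector.observe(reading)   # build a baseline of normal activity
print(detector.observe(500))    # True: far outside recent behaviour
```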

The Future of AI in Wealth Management

The Wealth Management industry has only just begun to realize the impact of AI. It is still coming to terms with adopting the readily available solutions that aid portfolio managers and help streamline processes. However, the FS landscape is evolving to include advancements such as open banking, better accessibility of third-party markets, growing interest in ESG investing, and other dynamic changes. As the landscape changes, wealth management firms stand to discover and invent newer roles for themselves in customer relationship journeys. These emerging WM roles can benefit greatly by building upon the unprecedented functionality provided by AI solutions, particularly in newer areas like algorithmic trading and real estate investing.9

With the plethora of AI use cases still available to be piloted, we have only scratched the surface of what could be a transformed wealth management industry.10

Sources

  1. https://www.finextra.com/the-long-read/339/wealth-management-will-see-an-ai-revolution-in-2022-delivering-hyper-personalisation-at-scale
  2. https://investor.vanguard.com/advice/digital-advisor
  3. https://intelligent.schwab.com/
  4. https://www.forbes.com/sites/cognitiveworld/2020/01/09/how-ai-and-robotics-can-change-taxation/
  5. https://www.aitax.com/
  6. https://fintech.global/AIFinTechForum/deutsche-bank-wealth-management-to-implement-finantixs-kyc-solution/
  7. https://www.analyticsinsight.net/goldman-sachs-is-betting-on-artificial-intelligence-to-dive-growth/
  8. https://www.datarobot.com/blog/ai-for-real-estate-investment/
  9. https://sloanreview.mit.edu/article/the-pursuit-of-ai-driven-wealth-management/

Author

Shreya Jain

Manager, Global Banking Industry

    Achieving successful digital F&A transformation across industries

    Joanna Jaroszewska, R2A Process Modelling Consultant, Capgemini’s Business Services
    12 Oct 2022

    The challenges of implementing best-in-class finance processes vary depending on the industry a particular client operates in. Capgemini’s Digital Global Process Model platform plays a key role in transforming business operations across industries.

    In the previous article in this series we focused on how Capgemini’s Digital Global Process Model (D-GPM) platform mitigates the common process frictions that occur when running an enterprise.

    In this article, we’ll focus on how D-GPM works as a key enabler of Capgemini’s digital transformation platform (Digital Global Enterprise Model – or D-GEM) to mitigate the challenges of transforming finance and accounting (F&A) operations that arise across industries such as pharmaceuticals, retail, consumer goods, media and entertainment, utilities, and manufacturing.

    The influences that impact different industries

    Imagine living in a world where running a business is the same regardless of what services you provide or what you produce. It’s a wonderful vision – but the reality is very different.

    At Capgemini, our experience of helping clients transform their businesses has taught us that while many processes are common across all industries, each client still comes with their own challenges. Because of this, there simply isn’t one model that fits all. Every business has its own specific needs, characteristics, and risks, based on the specificities of the industry sector within which it operates, that require a tailored approach.

    The question is then – what are the indicators that define the diversity within each sector, and what influences them? The first and most important specific indicator is what a particular company does – meaning what it produces or distributes. When looking at a company through this prism the following factors must be considered:

    • Are the goods produced or distributed dependent on external factors such as seasonality or dictated by consumer preferences?
    • Is the company exposed to fraud, litigation, and reputational damage?
    • Is the company’s activity dependent on a specific piece of equipment or intellectual property that it can’t operate without?
    • Does what is produced have a significant impact on the environment and CO2 emissions?
    • Does the company struggle with broad product segmentation?

    And these are not the only questions that need to be answered. All organizations need to handle segmentation between vendors and customers; however, this can create challenges when managing special business relationships, such as the additional factors that must be considered when doing business with utility or pharmaceutical companies. Organizations may also need to attract a high volume of specific vendors or customers, which can cause business to stagnate, and to maintain relationships with government or regulatory entities, which brings more scrutiny to every activity the organization undertakes.

    All of these areas need special attention when it comes to redesigning your business processes. Ignoring red flags related to distinctive risks and challenges is a recipe for disaster. This means that like it or not, all of the above factors impact the success of any digital transformation across all industries. So, how can you keep these challenges in check?

    Digital transformation across industries made easy

    Our D-GEM digital transformation platform leverages D-GPM’s sector-specific process models to mitigate the process characteristics that result from each sector’s specificity. Deep business knowledge of each sector enables us to understand the differences between sectors, and any challenges that may arise, which helps us better identify and prioritize opportunities for process improvement.

    All of this enables us to reimagine and transform our clients’ business operations, helping them achieve – what we call – the Frictionless Enterprise.

    To learn more about how Capgemini’s D-GEM reshapes and streamlines your business processes to deliver a truly Frictionless Enterprise, please feel free to contact: joanna.jaroszewska@capgemini.com

    Joanna Jaroszewska works to develop Capgemini’s Digital Global Process Model platform, driving the digital transformation of her clients’ business operations.

    About author

    Joanna Jaroszewska

    R2A Process Modelling Consultant, Capgemini’s Business Services
    Joanna Jaroszewska works to develop Capgemini’s Digital Global Process Model platform, driving the digital transformation of her clients’ business operations.

      What if investment bankers were software architects

      Adam Witkowski
      12 Oct 2022

      A New Perspective

      Investment bankers often look at things by dividing them into two groups: assets and liabilities. The assets are, for example, stocks, commodities, and cash, while the liabilities are debts, taxes, and wages to pay.

      Let’s look at software from this perspective. What are the assets? It is very simple: all that the software does that creates value, meaning the business features. That was easy.

      So now, what are the liabilities? All that the software needs in order to deliver that value: code, modules, TeamCity/Jenkins tasks, dependencies, etc. In other words, the cost.

      Nice perspective, but what does it give us? It gives us some insight if we start thinking about how to maximize this ratio: assets/liabilities.

      We can do it in two ways:

      • Increasing the number of business features
      • Decreasing the cost.

      Increasing the value

      To achieve this, we can design the system in a way that the number of business features can grow faster than the cost of creating them.

      How can this be achieved? In many ways. Here are some of them:

      • Reusing data and platform for multiple business purposes (example: one platform with one collection of data used by multiple business projects).
      • Making the system scalable from the business feature perspective so that, for example, it is very easy to add more features by simple configuration etc.
      • Making software flexible. In order to do that, you need a deep understanding of how it is used, so you know what is likely to change in the future. Once you have that understanding, you make those pieces as configurable as possible.

      Decreasing the cost

      We can decrease the cost while keeping the same value for users.

      How can that be achieved? In many ways. Here are some of them:

      Cleanup

      Usually, people do not like cleanups because:

      • They are risky (it is usually difficult to be sure nobody is using a component; if there are still some forgotten users, removal will cause a problem)
      • They bring little appreciation (if you decommission a system, people say “thanks”; if you introduce a new one, you get promoted, although the first option is better for maximizing the assets-to-liabilities ratio)
      • They simply get forgotten (cleanup is never urgent; you can always do it later, so it never gets done).

      Nassim Nicholas Taleb often refers to Via Negativa. In case you are not familiar with it: it focuses on removing rather than adding, so it is an “if you are gaining weight, do not start taking pills but remove carbs from your diet” kind of thinking.

      This is a great perspective for looking at software if we have already identified the costs.

      You can remove many future issues before they happen by:

      • Removing dependencies (people add them because they need them, but they do not remove them when they no longer need them)
      • Decommissioning whole projects (the TCO goes down; the team is happy because, instead of maintaining, it can focus on creative work; onboarding of new joiners gets faster; etc.).

      Simplify

      I am sure you have heard about it many times. I would say software architects agree with it in theory but not in practice. They still come up with complex, fancy designs, and I would risk the statement that complexity is the biggest problem in software engineering.

      If we go a bit further: if a system is designed to be simple it may succeed or not, but if it is designed in a complex way, it will certainly fail.

      Self-maintaining system

      Try to design the system in a way that does not require people to run it. If something has failed, got stuck, or requires restarting, it should be handled automatically by resubmission logic. If something fails because of an external system or team, your software can take the action that would normally be done by humans, for example raising a ticket for the relevant team, sending an email, etc.

      Do not leave space for human intervention.

      Of course, there will be cases like a coding bug or race condition that will require the dev team to take a look, but only these cases should require people. Any non-creative work should be done by software.
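      The resubmission-then-escalate pattern above can be sketched as follows. This is a minimal illustration: `raise_ticket` is a placeholder for whatever ticketing API a real system would call, and the retry count and back-off are invented for the example.

```python
# Sketch of resubmission-then-escalate; `raise_ticket` is a placeholder
# for a real ticketing API, and the retry/back-off policy is illustrative.

import time

def raise_ticket(summary):
    print(f"TICKET: {summary}")  # stand-in for a real ticketing call

def run_with_resubmission(step, retries=3, delay=0.0):
    """Run `step`, resubmitting on failure; escalate when retries run out."""
    last_error = None
    for _ in range(retries):
        try:
            return step()
        except Exception as exc:
            last_error = exc
            time.sleep(delay)  # back off before resubmitting
    raise_ticket(f"{step.__name__} failed after {retries} attempts: {last_error}")
    return None
```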

      Tests

      Where do automated tests sit in the asset and liability perspective? That depends. If the tests exercise a contract that changes very rarely and guarantee quality, then they certainly belong to the assets. If the tests are tightly coupled to technical details of the solution and require frequent changes, meaning the cost of their maintenance is high, then they belong to the liabilities. Obviously, flaky tests (ones that may have different outcomes for the same input) are liabilities.
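      The distinction can be illustrated with a made-up function and two tests: the first asserts the stable contract (an asset), while the second couples to an internal detail and must be edited whenever that detail changes (a liability). Everything here is invented for the example.

```python
# `price_with_tax` is a made-up function for the example.

def price_with_tax(net, rate=0.2):
    return round(net * (1 + rate), 2)

# Asset: tests the contract, and survives any refactoring that keeps it.
def test_contract():
    assert price_with_tax(100) == 120.0

# Liability: couples to an internal detail (the default-rate constant),
# so it must be edited every time that detail moves.
def test_implementation_detail():
    assert price_with_tax.__defaults__ == (0.2,)

test_contract()
test_implementation_detail()
```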

      Summary

      Although each module or line of code is a liability and removing them is good, I am not suggesting cutting corners by putting everything into one module or compressing code. It is important to understand that each technical entity is justified only if it delivers value for the users.

      If we cannot find any, I would suggest removing it.

      As they say, “A good developer writes code, a great developer removes it.”

      Meet our Experts

      Adam Witkowski

      Delivery Architect
      Adam joined Capgemini 3 years ago. He has been programming for 25 years. Currently he works as Delivery Architect in Enterprise Data and Analytics. He also supports the company as SPOC for FS Poland Architecture and Head of Risk, Trade Management and Collateral. He is involved in the creation and activities of the Cutting Edge Community as the Cohead. He speaks Italian and Czech. Privately a husband and father. He is interested in history, literature, mathematics and soccer.

        How your personal data helps to achieve sustainable development goals: MyData Conference 2022

        Pierre-Adrien Hanania
        20 Sep 2022

        Albert Einstein said it best: “Information is not knowledge.” Perhaps some of you can empathize with this statement. After all, if you are reading this, you either use or are interested in adopting data sharing solutions. As such, you’ll understand that having information and knowing how to use it are completely different things.

        In the quest to tackle global challenges at a global level, bridging this gap is the key to achieving ambitious aims such as the Sustainable Development Goals (SDGs).

        This year, the annual MyData Conference took place in Helsinki, Finland. As Public Sector Data and AI Offer Leader, I felt quite at home in this city. For a start, organizations like Helsinki Region Infoshare publish datasets for the common good, such as the water posts that can be freely used to fill water bottles. These are becoming an increasingly important way to cope with climate change. I saw at least one person benefitting from this data on the 22nd of July.

        The Digital Social Contract

        In recent years, we have seen the emergence of mission economies that share a purpose: sustainability. Another thing they share is their approach to tackling their goals, which involves the utilization of data and AI. Together, they are giving rise to a digital social contract that is powering the creation of augmented public services.

        The sharing of data in the Public Sector is deeply linked to the value of personal data, as its intelligent use can help to best serve citizens in their rights and duties.

        In the context of the rise of data and AI strategies on a governmental level across geographies, sharing data the right way is not only a technological question – it is also a matter of geopolitics.

        AI unleashes the full potential of emerging technologies, converting them from tools we use to tools that serve without additional input.

        Data and AI are intimately connected to the key values of our democracies, with the power to improve societies worldwide, while requiring refreshed positioning on crucial topics such as accountability, transparency, and non-discrimination.

        In all cases, data and AI are enabling the public sector to achieve its missions with more pace, efficiency, and security. The end-to-end automation of case management and document processing results in intelligent administration. The implementation of large-scale automation fosters engagement, as liberated citizens can interact with public servants and processes around the clock. Moreover, the level of security and service is markedly improved, with automation powering real-time threat, incident, and anomaly detection. Taken together, data and AI generate insight that can be leveraged to feed a better decision-making process, from understanding a situation to suggesting next-best actions.

        Relieving instead of replacing

        It is important to remember that data and AI are tools, just like a carpenter’s electric saw. They make the work easier, add precision, and improve efficiency. But they are only as good as the operators – that’s us. An augmented public service should be the best of both worlds, with humans making the decisions based on the information gathered by these tools. It’s also worth pointing out that the human operator makes AI ethical, a quality it is unable to achieve on its own. Only by integrating technology with human management can we revolutionize the following four playing fields:

        • Intelligent Automation of Administration
        • Interactions between the citizen and servant
        • Anomaly detection and identification
        • Improved decision-making processes

        AI for Good

        This emphasis on the ethical use of AI underpins the AI for Good initiative, supported by the United Nations’ International Telecommunication Union (ITU). Like everyone else at Capgemini, I am extremely proud of our involvement in this movement. All members are committed to ensuring data and AI are used to achieve the following Sustainable Development Goals (SDGs):

        • Education: Decent work and economic growth, quality education, and good health and wellbeing.
        • The Environment: Life on land, clean water and sanitation, and climate action.
        • Nourishment: Zero hunger, no poverty, and good health and wellbeing.
        • Health: Clean water and sanitation, good health and wellbeing, and reduced inequalities.
        • Peace and Information: Peace, justice, and strong institutions combined with quality education.
        • Justice: Peace, justice, and strong institutions.

        Capgemini’s Collaborative Data Ecosystems: Sharing is caring

        When it comes to some of our greatest challenges, there really are some very simple solutions. The French journalist Hervé Kempf said it best:

        “Consume less, share better”

        This sentiment typifies the ethos of collaborative data ecosystems, which are partnerships between stakeholders to share and manage relevant data and insights. Operating on the assumption that the collective is stronger than the individual, these ecosystems create value for all participants, value that they couldn’t generate on their own. But organizations must have both a vision and a strategy that enables effective execution.

        Strategy and Business Model

        Organizations should follow the five-step plan of action:

        • Vision: Definition of key objectives, analysis of challenges and opportunities, industry benchmarks, case prioritization, and stakeholder identification and management.
        • Design: Specifics of the ecosystem, new services and products, industry-specific strategies for data sharing, and a long-term roadmap.
        • Liaison: Represent clients at venues, participate in advocacy coalitions, facilitate the formation of partnerships, and assist in partner selection.
        • Governance: Set up operating models for partners, design collaboration rules, secure trust between stakeholders, ensure data privacy and quality, and standardize digital collaboration processes.
        • Trust: Ensure legal compliance, guard against threats with cybersecurity, advise ecosystem participants, educate authorities and the public, and create and operate label certification.

        Additionally, advisory services and industry specific point of views can accelerate collaboration.

        Implementation of Collaborative Data Ecosystems

        It is vital that organizations establish data collaboration platforms and implement security and privacy protocols. Emerging technologies, such as federated learning and data mesh, should be combined with strong data engineering platforms and the relevant privacy set-ups: homomorphic encryption and differential privacy are two examples of how data exchange can occur in a safe way.
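        As one concrete example of these privacy set-ups, the Laplace mechanism of differential privacy releases an aggregate statistic with calibrated noise. The sketch below is illustrative only; the epsilon value, the sensitivity-1 counting query, and the record predicate are assumptions for the example.

```python
# Laplace mechanism sketch; epsilon, the sensitivity-1 counting query,
# and the predicate are illustrative assumptions.

import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random()
    while u == 0.0:          # avoid log(0) at the distribution's edge
        u = random.random()
    u -= 0.5                 # now u is in (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    """Release a count of matching records with calibrated Laplace noise
    (a counting query has sensitivity 1)."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(42)  # deterministic demo only
noisy = private_count(range(100), lambda x: x < 40, epsilon=0.5)
# The true count is 40; the released value is 40 plus Laplace(0, 2) noise.
```

Smaller epsilon means more noise and stronger privacy; participants can then share such noisy aggregates without exposing individual records.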

        Furthermore, Capgemini provides organizations with several accelerators that make this possible. Our 890 and Industrialized Data and AI Engineering Acceleration (IDEA) offers are invaluable tools when establishing data collaboration ecosystems.

        Your Data for Good: the quest for SDGs

        Collaborative data ecosystems can have a profound effect on cost efficiency, omnichannel empowerment, insight multiplication, use case enablement, and citizen engagement. When these five factors are driven by data, they can transform environments and territories, education and work, information, justice and safety, food and agriculture, and healthcare. As you may have noticed, these are some of the UN’s more notable sustainable development goals.

        To give you just one example, let’s take a look at healthcare supply chain management:

        In 2018, each hospital in the United States spent an average of $11.9 million on medical and surgical supplies. This accounts for up to one-third of total operating expenses for some. Despite this, improving supply chain and inventory management is often not considered a high priority for hospitals; providers tend to focus more on the processes surrounding direct patient care. Yet, having these supplies is necessary for delivering high-quality care. Due to the Covid-19 pandemic, improving agility and resilience to demand and supply-side shocks has become even more critical. As a result, hospital managers are increasingly looking for ways to leverage data and technology to gain insights into inventory, pricing, lead times, and demand trends.

        The proof is in the pudding

        To give you some idea of how effectively collaborative data systems meet sustainable development goals, allow me to give you a couple of examples from the projects I’ve been involved with at Capgemini.

        Federated Learning

        The Federated Learning platform, developed by Capgemini, makes it possible for hospitals to share trained Artificial Intelligence (AI) models to create a global model that outperforms local versions. Three Spanish hospitals were able to dramatically improve both the speed and accuracy of COVID-19 screening. This was achieved by aggregating the clinical experience of each hospital to develop automated medical diagnosis models.
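        The core idea can be sketched as FedAvg-style weighted averaging: each hospital shares only model weights, never patient records. The sketch below is a heavy simplification of any real platform; models are plain weight lists, and the hospital sample counts are invented for the example.

```python
# FedAvg-style aggregation sketch; models are plain weight lists and
# the hospital sample counts are invented for the example.

def federated_average(local_models):
    """Average local model weights, weighted by local sample counts.

    local_models: list of (weights, n_samples) pairs, one per site."""
    total = sum(n for _, n in local_models)
    size = len(local_models[0][0])
    return [
        sum(weights[i] * n for weights, n in local_models) / total
        for i in range(size)
    ]

global_model = federated_average([
    ([0.2, 0.8], 100),   # hospital A: smaller cohort
    ([0.4, 0.6], 300),   # hospital B: larger cohort
])
# Hospital B's larger cohort pulls the average toward its weights (~[0.35, 0.65]).
```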

        OnDijon: the smart city

        Dijon was struggling with siloed operations, competing ambitions, and the lack of a big picture. Capgemini and Bouygues worked with the Dijon metropole to develop an Artificial Intelligence (AI) platform that connected the control center with every machine, scanner, and citizen. This smart city initiative made use of open data, involving citizens in the process of creating new public services that unleash the potential of real-time information, smart mobility, and traffic-based lighting. The result was a 40% cost reduction for services through improved responsiveness to citizen activity. The city also expects to see energy savings of 65% over the next 12 years.

        The rise of the smart citizen at the heart of data collaboration

        Before bringing this analysis of data-enabled sustainable development goals to a close, let’s return to the heart of the subject. It is easy to forget the human element in all this talk of technology, data, and goals. But human centricity is what this is all about. It’s extremely satisfying to know that data is being used for the betterment of civilization. However, it doesn’t make much difference if engagement is low. That’s why it is worth remembering that real change in the world will only happen when the overriding majority of citizens are both informed and motivated. In short, we need everyone to become smart citizens.

        We are well on our way, but here are three guiding ideas to keep in mind:

        1. The smart citizen shall be involved in data projects from the beginning
        2. Citizenship shall be nurtured over time
        3. The citizen shall move from a consuming role to a producing role in data

        I’m sure that you’ve got the message by now – each and every one of us is directly involved in the quest for the value of data. So, let’s go out there and embrace our part!

        About Author


        Pierre-Adrien Hanania

        Global Public Sector Head of Strategic Business Development
        “In my role leading the strategic business development of the Public Sector team at Capgemini, I support the digitization of the public services across security and justice, public administration, healthcare, welfare, tax and defense. I previously led the Data & AI in Public Sector offer of the Group, focusing on how to unlock the intelligent use of data to help organizations deliver augmented public services to the citizens along trusted and ethical technology use. Based in Germany, I previously worked for various European think tanks and graduated in European Affairs at Sciences Po Paris.”

          The value of digital transformation in finance processes

          Michal Mróz, D-GEM Global Process Owner, Capgemini’s Business Services
          7 Oct 2022

          Capgemini’s Digital Global Process Model, a key enabler of our digital transformation of business operations, drives standardization and harmonization of your finance processes on a global scale.

          Regardless of industry, size, infrastructure, market position, or strategy, running a business today involves overcoming countless obstacles that have a long-term impact on your organization’s ability to serve customers, achieve satisfactory financial results, and increase competitiveness in the marketplace.

          These challenges need to be minimized if you want to achieve a new dimension for your business – where creating a seamless and intelligent connection between your processes and people is not the exception, but the daily routine. This type of connection, ultimately, leads to a more frictionless approach to finance.

          What you need is a methodology that delivers these frictionless outcomes to your clients. One that provides a complete overview of your people, processes, technology, and governance through carefully monitored control points. All while accelerating your transition to transformed, future-proof, next-generation, and AI-enabled order-to-cash (O2C), purchase-to-pay (P2P), and record-to-analyze (R2A) processes.

          The key to digital transformation in finance

          To achieve this you either need to build a pioneering approach to process execution from scratch or partner with a global leader in this field to ensure success.

          Whatever approach you take, the key enabler of digital transformation remains the same – the ability to drive finance business process standardization and harmonization across your finance function without causing major disruption in your day-to-day operations.

          The question is then: how do you evolve into a mature organization capable of managing your processes, technology, and resources efficiently – without getting bogged down in the minute details of your finance transformation?

          Simplifying digital transformation across industries

          At Capgemini, our Digital Global Enterprise Model (D-GEM) platform addresses challenges in finance transformation by offering the tools and techniques needed to deliver increased efficiency, faster time to market, and an enhanced, customer-first, user experience to organizations. This, in turn, enables organizations to transition to – what we call – the Frictionless Enterprise.

          D-GEM also helps organizations transform their business operations by leveraging our Digital Global Process Model (D-GPM) – a key enabler of digital transformation within D-GEM, and a pioneering approach to best-in-class process execution that drives standardization and harmonization of business processes across industries on a global scale.

          Our D-GPM has now been consolidated into its own dynamic and agile platform that maps our clients’ finance processes at a highly detailed level in accordance with the latest global standards, enabling us to meet the challenges of ever-changing business requirements and deliver the expected results even faster than before.

          Optimize your finance and accounting processes

          The D-GPM platform is not only a comprehensive knowledge hub and modelling platform, but also a tool equipped with various functions required for the successful execution of digital transformation regardless of the industry it is leveraged in.

          This includes a repository of process re-engineering recommendations, AI-augmented cloud-based controls, scenario simulation through digital twins, integrated process modeling workflow, a fully developed RACI framework, ERP (SAP S/4HANA and Oracle) and sector-specific libraries, and enhanced KPIs and metrics.

          This approach to digital transformation helps our clients evolve into mature organizations capable of efficiently managing their processes, technology, and resources – making frictionless finance operations a reality for our clients.

          In the next blog in this series, we’ll be discussing how D-GPM helps overcome industry-specific challenges through its detailed process mapping capabilities.

          To learn more about how Capgemini’s D-GEM reshapes and streamlines your business processes to deliver a truly Frictionless Enterprise, please feel free to contact: michal.mroz@capgemini.com

          About author

          Michal Mróz

          D-GEM Global Process Owner, Capgemini’s Business Services
          Michal Mróz leads a team that focuses on re-platforming and enhancing content, and re-designing Capgemini’s unique Digital Global Enterprise Model platform.

            Context-aware analytics is driving enhanced customer satisfaction

            Capgemini
            7 Oct 2022

            Implementing context-aware capabilities across your contact center operations can drive faster, more accurate call and ticket resolution times and a seamless, more meaningful customer experience.

            Customer experience agents handle and resolve thousands of calls, issues, and tickets on a daily basis – many of which are similar in nature or characterized by a single underlying, often hidden, root cause.

            Identifying the similarities in calls or tickets is often extremely challenging, and agents need to check previous tickets in order to help the end-user. In addition, ticket or call patterns are often only manually identified at a later pattern analysis and root cause identification stage, which delays issue resolution by upwards of eight weeks. All of this creates negative customer and agent experiences, while also increasing the cost of customer operations.

            As such, the modern contact center faces a range of challenges, including:

            • An increasing number of similar issues that are not proactively addressed
            • Increased resolution time per ticket
            • Repeated analysis of similar issues by different agents.

            Leveraging context-aware analytics and ticket resolution

            Let’s look at a typical scenario, common to most businesses:

            • Michael calls the contact center in German regarding an audio issue with his laptop: “Ich habe ein Problem mit dem Ton im Laptop.” (“I have a problem with the sound on my laptop.”)
            • Sofia also calls the contact center in Spanish with a similar issue: “No puedo escuchar nada en mi sistema.” (“I can’t hear anything on my system.”)
            • James then chats with the contact center in English about the same audio issue: “I’ve just received a new laptop, but I can’t hear anything.”

            Although these three challenges are similar, it’s difficult for contact center teams to identify call and issue patterns, especially when dealing with thousands of conversations across multiple languages in near real time.

            To solve these challenges, contact centers are deploying plug-in analytical tools with purpose-trained AI components that enable next-generation, context-aware analytics.

            These tools analyze calls in near real time and detect patterns in inbound calls, highlighting any underlying issues by comparing thousands or tens of thousands of previous tickets. The pattern detection tool helps unravel trends and anomalies in these tickets based on their context and details – and it does it faster and more accurately than a human is capable of.

            Next-generation call pattern detection

            But how does a context-aware call center work in practice? Consider the situation outlined previously: thousands of different customers call your contact center about an issue, each in their own native language.

            The tool’s engine detects the language, translates it, and extracts the meaningful data from the raw text using advanced natural language processing (NLP). Key information is extracted and compared to previous cases in your contact center’s data repositories. Based on the established underlying context, all similar tickets are identified and analyzed in seconds, and an alert is sent to your support teams to ensure the issue is handled by the right team quickly and efficiently.
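The engine described above can be sketched end to end in a few lines of Python. This is a toy illustration only: the phrase table stands in for a real machine-translation service, and word overlap stands in for the purpose-trained NLP and embedding models a production tool would use; all names and thresholds here are invented.

```python
import re

# Toy stand-in for a machine-translation service (illustrative phrases only).
TRANSLATIONS = {
    "ich habe ein problem mit dem ton im laptop": "i have a problem with the sound on my laptop",
    "no puedo escuchar nada en mi sistema": "i can't hear anything on my system",
}

def normalize(ticket: str) -> str:
    """Lower-case a ticket, strip trailing punctuation, map known phrases to English."""
    text = ticket.lower().strip().rstrip(".!? ")
    return TRANSLATIONS.get(text, text)

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of word sets -- a crude proxy for semantic similarity."""
    ta = set(re.findall(r"[a-z']+", normalize(a)))
    tb = set(re.findall(r"[a-z']+", normalize(b)))
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def group_tickets(tickets: list[str], threshold: float = 0.15) -> list[list[str]]:
    """Greedily cluster tickets whose normalized text is similar enough."""
    groups: list[list[str]] = []
    for t in tickets:
        for g in groups:
            if similarity(t, g[0]) >= threshold:
                g.append(t)
                break
        else:
            groups.append([t])
    return groups
```

Run against the three calls above plus an unrelated billing ticket, the three audio issues land in one group and the billing ticket in another – the kind of pattern a support team would be alerted to.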

            Your product or service teams then work on early root cause analysis, based on the information gathered from multiple, similar tickets. And your support teams are now able to proactively recommend and inform the business about similar issues, reducing the number of inbound tickets your customer experience agents need to deal with.

            This ensures patterns are discovered much earlier, which leads to fewer complaints, significantly improved wait times, and more satisfied customers. All of this is achieved through technologies such as voice-to-text, natural language understanding (NLU), and context-aware pattern analytics.

            The technology behind the context-aware call center
              • Transcribing calls – a speech-to-text module helps convert audio calls into readable transcripts
              • Addressing language gaps – translation module translates calls and text messages into English quickly and easily
              • Fixing corporate lingo – the tool leverages enterprise glossaries and knowledge graphs to understand and process any business-specific terms
              • Understanding real meaning behind conversations – the tool leverages NLP and named entity recognition (NER) technology to extract the necessary information
              • Understanding trends – the tool uses data analytics to identify patterns and trends, with any abnormal patterns usually indicating an underlying issue.

            Implementing a frictionless customer experience

            Introducing context-aware analytics capabilities across your call center helps identify and resolve customer issues in near real time. This helps to discover and proactively address the deeply hidden root causes behind these issues, leading to faster, more accurate resolution of similar issues.

            In turn, this can reduce customer complaints by up to 70% – driving a seamless and frictionless customer experience.

            Learn how Capgemini’s Intelligent Process Automation leverages AI and intelligent automation to monitor, understand, and react to the root cause of your call and ticket issues, driving a more meaningful, productive, and frictionless relationship between your customers and employees.

            About author

            Vinod Nair

            Technology Offering Lead, Innovation Cluster, Capgemini’s Business Services
            Vinod Nair is a technology leader in the Innovation Cluster of Capgemini’s Intelligent Automation practice, with extensive knowledge in process automation, artificial intelligence, solutioning, product development, and helping organizations in their automation journey.

            Marek Sowa

            Head of Generative Technologies Center of Excellence, Capgemini's Business Services
            Marek Sowa is head of Capgemini’s Intelligent Automation Offering & Innovation focused on adopting AI technologies into business services. He leverages the potential hidden in deep and machine learning to increase the speed, accuracy, and automation of processes. This helps clients to transform their business operations leveraging the combined power of AI and RPA to create working solutions that deliver real business value.

              Capgemini strengthens its position in the Azure ecosystem by achieving three advanced specializations in Data & AI

              Elayaraja Eswaran
              6 October 2022

              In a fast-evolving business context where Data & AI are driving the biggest transformation for companies, our clients are looking for partners with highly specialized skills.

              Microsoft specializations validate partners’ capability to deliver specialized services and support according to Microsoft’s highest technical standards with an independent audit.

              We are proud to be the only technology partner awarded by Microsoft across the full scope of the Data, Analytics & AI advanced specializations:

              • Analytics on Microsoft Azure Advanced Specialization
              • Data Warehouse Migration to Microsoft Azure Advanced Specialization
              • AI and Machine Learning on Microsoft Azure Advanced Specialization

              By earning these three specializations, each with strict requirements, we differentiate ourselves from our competitors and bring our clients proof points of our architectural expertise and our ability to build very high-quality solutions.

              They are a recognition of our deeply knowledgeable and experienced leadership in the data engineering space, enabled by strong competencies in cloud data environments, AI, and analytics – and of our investment in our strategic assets, IDEA by Capgemini and 890 by Capgemini.

              What does the Analytics on Microsoft Azure advanced specialization mean for Capgemini?

              This specialization validates our deep experience in planning and delivering analytics solutions in Microsoft Azure, enabling customers to use the full breadth of their data assets to help build transformative and secure analytical solutions at an enterprise scale.

              On top of meeting all the requirements, we brought bold proof points from client projects delivered globally, showcasing our strong capabilities in the assessment, design, automated deployment, and implementation of sophisticated data analytics projects.

              What does the Data Warehouse Migration to Microsoft Azure advanced specialization mean for Capgemini?

              This specialization, which requires an active Gold Cloud Platform Competency, differentiates and positions our expertise in analyzing existing workloads, generating schema models, and performing extract, transform, and load (ETL) operations to migrate data to cloud-based data warehouses.

              We have completed well over 20 data warehouse customer projects, supported by a significant number of customer testimonials, showcasing very strong knowledge, experience, and skills in migrating highly sophisticated and complex data warehouse projects for prominent global brands.


              What does the AI and Machine Learning on Microsoft Azure advanced specialization mean for Capgemini?

              The AI and Machine Learning on Microsoft Azure advanced specialization demonstrates our deep knowledge, extensive experience, and proven success in planning and deploying AI and Machine Learning on Microsoft Azure cloud.

              Our end-to-end portfolio of services, including Data, AI, and machine learning capabilities and solutions, empowers larger digital transformations by deploying AI cognitive services and machine learning solutions – from the assessment phase through design, pilot, implementation, and post-implementation – to realize the full breadth of these transformative, secure solutions at enterprise scale.

              “We are very proud of this great achievement demonstrating Capgemini’s trusted partnership with Microsoft and our commitment to Azure ecosystem. This demonstrates Capgemini’s extensive Data & AI expertise and capabilities on Azure Services.”

              Zhiwei Jiang, CEO, Insights & Data (I&D), Global Business Line, Capgemini

              Did you know that data modernization is key to a winning data strategy? Learn more here – https://www.capgemini.com/solutions/industrialized-data-ai-engineering-acceleration-by-capgemini-with-microsoft/

              From end-to-end strength to open innovation: A new outlook for cybersecurity

              Geert van der Linden
              7 Oct 2022

              Cybersecurity today is the platform that’s driving some of the most innovative advances in the world around us.

              From self-driving cars to wind farms, from fintech to vaccines, it’s cybersecurity that’s opening the doors to the sort of experiences, breakthroughs, and futures that we could only imagine a few years ago.

              The new face of cybersecurity

              As technology has advanced, it’s certainly expanded the threat surface that organizations need to protect. The explosion of connected devices, the move to the cloud, new ways of working, and the need to build richer experiences for customers – all these developments have pushed cybersecurity to new heights and new places.
              Gone are the days when simply protecting your infrastructure was enough. Today, success is defined not by time spent behind firewalls, but by the ability to enhance connections with partners and suppliers, boost productivity, and imagine new connected experiences for customers.

              From protection to foundations

              The world-leading organizations that we help and support have embraced this deeper and broader vision of cybersecurity. No longer do they see it as the remit of a single team or function. No longer is it measured simply by its ability to stop bad things from happening. Today, they’re adopting a cybersecurity mindset and culture that spans the organization. From manufacturing operations to customer service, from supply chain to finance, they’re embracing a multi-layered, zero-trust approach.

              It’s this view of cybersecurity as a foundation rather than a protective layer that’s allowing huge strides to be taken.

              Connected self-driving cars, for example, become possible when organizations have true confidence in their entire technology infrastructure. Deep collaboration on vaccines and other medical innovations is opened up when organizations have the confidence to share and collaborate in the cloud.

              At Capgemini, it’s our mission to help organizations realize the benefits of end-to-end strength in cybersecurity. This means sharing our expertise, not just bringing in our experts; it means building in security using the very latest technology rather than plugging in a solution; it means applying our deep sector expertise; and, ultimately, it means accelerating innovation as both a force for good and a commercial opportunity.

              To put it simply: when you’re stronger inside, you’re more open to the outside.
               
              Contact Capgemini to understand how secure foundations can create open futures for you.

              Data-driven CX is not what you think it is

              Naresh Khanduri
              6 October 2022

              So, what is data-driven CX? Simply put, it’s applying AI to data to create better customer experiences, regardless of where they happen along the customer lifecycle.

              From marketing and sales to customer service and e-commerce, companies have different departments for every major business function. And all of them collect massive amounts of information about each customer interaction.

              To efficiently serve customers, however, departments must have access to one another’s data to know if, when, and how a person may have interacted with their brand. It requires all enterprise data to be collected and stored in a shared database that’s easily accessible by everyone. This is nothing new. Companies typically extract a customer’s identity from their CRM system and see if there’s a similar identity in, say, the e-commerce system and then merge the two to build a single view of the customer. There’s only one problem. It’s not complete. There’s a vital missing element: experience data.
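The merge step described above can be sketched as a simple join on a shared key. The field names, and the use of email as the match key, are assumptions for illustration; real identity resolution matches on many signals, not one exact key.

```python
# Build a single customer view by joining CRM and e-commerce records on a
# shared key. Later records win on field conflicts, so CRM is applied last.
def merge_identities(crm: list[dict], ecom: list[dict], key: str = "email") -> list[dict]:
    by_key: dict[str, dict] = {}
    for record in ecom + crm:
        merged = by_key.setdefault(record[key], {})
        merged.update(record)
    return list(by_key.values())

# Hypothetical records from two systems, matched on the same email address.
crm = [{"email": "ana@example.com", "name": "Ana", "segment": "retail"}]
ecom = [{"email": "ana@example.com", "last_order": "furniture"}]
view = merge_identities(crm, ecom)
# view is a single record combining both systems' fields -- but note it still
# contains no experience data (clicks, dwell time, abandoned carts).
```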

              Data-driven experience = (enterprise data + experience data)^AI

              Experience data is related to what customers are doing when browsing a website or using an app; they may be clicking on several different things, showing interest in various product categories, or maybe abandoning their carts due to lengthy page loads or mandatory registration requirements – all of this is essential data to collect.

              It can provide valuable insights into how to approach customers and how to customize the marketing message based on a user’s historical actions. For example, if enterprise data only shows that a customer has placed orders for furniture, they will be marketed to as just a random furniture buyer, despite them having shown specific interest in baby furniture or lighting. It can also be something as simple as determining who clicked on the latest ad and then targeting that customer with a more focused message. It’s important that the right message reaches the right audience because sending someone a buy one, get one free coupon for a product line they’ve never shown interest in will likely not incentivize them to buy. It won’t result in the trigger action the brand wants.

              Joining enterprise data with experience data, then responding with the right message fast enough to engage customers, is not easy. It certainly shouldn’t be done manually, because the rules must change constantly as customer behaviors evolve. That’s why AI needs to be part of the equation – and until it is, CX will not improve.

              Three challenges to overcome

              Privacy and consent. Most users will consent to their data being stored (including the use of first-party cookies, though not third-party cookies, where data is shared with others) as long as they know what information is being collected and they have full control over it. Brands must respect their relationship with users. This means sharing with them how their data will be used and having systems in place that guarantee data security.

              Third-party cookies. Most companies rely on third-party cookies to follow a customer across their digital experiences, and when popular browsers like Google Chrome finally stop supporting them, they’ll have to turn to other means to understand their customers. Creating a first-party data strategy is key. Using tools like experience IDs across a brand’s multiple digital properties will help stitch data across the different domains of an enterprise.

              High customer expectations. Customers want Amazon-like or better experiences as well as more control of the information they share with brands. Better tools and technologies can help introduce transparency so brands can attract and retain customers and keep them happy.

              What Capgemini can do and what clients can expect

              Using our unique framework, we help clients start building their first-party data, understand customer intent, and stitch customer identity across domains to provide a personalized experience. Our framework complements technology solutions provided by our customer data platform (CDP) partners like Tealium, Adobe, Salesforce, Microsoft, Pega, SAP, ActionIQ, and others. It addresses the challenges posed by the third-party cookie conundrum by creating cross-referencing experience IDs for customer interactions, and stitches the customer journey together to understand behaviors and deliver personalization across marketing, sales, service, and commerce.

              Next, we have CX-specific AI algorithms. Next best action (NBA) and next best offer (NBO) are terms commonly used in predictive analytics solutions. Knowing what those actions and offers should be at any moment is pivotal – because if a customer is only looking for product information or wants to voice a complaint, but is instead targeted with an ad, frustration will grow and they may take their business elsewhere. CX algorithms are never perfect, but we have a structure that can calculate a customer’s intent and where they are on the customer engagement index to ensure our predictions are as accurate as possible.
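The intuition behind NBA can be illustrated with a toy scorer: rank candidate actions by inferred intent, and let a low engagement-index value dampen aggressive actions such as showing an offer. The intents, actions, and weights below are invented for the example; they are not Capgemini’s actual algorithms.

```python
# Invented intent-to-action scores, for illustration only.
ACTION_SCORES = {
    "browsing":  {"show_offer": 0.8, "send_info": 0.5, "route_to_agent": 0.1},
    "complaint": {"show_offer": 0.0, "send_info": 0.2, "route_to_agent": 0.9},
}

def next_best_action(intent: str, engagement: float) -> str:
    """Pick the best action for an intent; engagement (0..1) dampens hard-sell moves."""
    weighted = {
        action: score * (engagement if action == "show_offer" else 1.0)
        for action, score in ACTION_SCORES[intent].items()
    }
    return max(weighted, key=weighted.get)
```

With these weights, a complaining customer is routed to an agent rather than shown an ad, and a lightly engaged browser receives information instead of an offer.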

              Powered by AI algorithms, our data-driven CX solutions help save time and money and improve customer experience across three dimensions:

              1. Acquisition. Thanks to optimized marketing spend, the cost to obtain new customers will decrease significantly, helping boost the return on investment from marketing campaigns.
              2. Customer engagement. More engaged customers will lead to higher sales per customer or a higher frequency of purchases across all customer segments.
              3. Customer retention. Since every department now has relevant, up-to-date data pertaining to each customer, they can respond accordingly. For example, when a customer is transferred to the call service center, an agent will already know about the order delay. It leads to higher retention and much lower churn rates.

              Data-driven CX will result in things like not retargeting a logged-in website visitor with ads they’ve already seen to keep marketing costs down. The beauty of it is that once it’s implemented, it connects all parts of the business via data insights, allowing AI to automatically filter the data and recommend actions that deliver superior customer experience all around.



              Author:

              Naresh Khanduri

              Global Generative AI for CX Lead, Capgemini
              Naresh, with over six years at Capgemini, currently serves as Executive Vice President – Global Head of Generative AI for CX. He drives the design and execution of key strategies that differentiate Capgemini in the marketplace. His expertise in Data and Generative AI enhances customer experience across Marketing, Sales, Service, and Commerce, shaping the future of CX through innovative AI applications.

                Radical ideas about quantum sensing arise from multidisciplinary collaboration

                Edmund Owen
                6 Oct 2022

                As part of the BMW Group Quantum Challenge, we discovered that quantum could help with sensing in various ways, including overcoming the problem of obscuration by stray particles, and potential bottlenecks arising from limitations on the amount of incoming data that quantum machines can handle.

                Earlier in this blog series, we communicated some of our excitement about taking part in the BMW Group’s Quantum Computing Challenge, and the value of our holistic approach, which brought together expertise from across Capgemini Group to consider the end-to-end quality process. Next, machine learning expert Barry Reese outlined our research into whether quantum machine learning (QML) could enhance the ML-based methods already used to screen automotive parts for flaws.

                Now I’d like to share what happened when we applied quantum to sensing. As we explained earlier, although the challenge was very focused on the application of quantum algorithms, we wanted to take a broader approach. Identifying a crack involves not just the image processing itself but also image capture and sensing, in terms of the production line, data acquisition, and automated decision-making. We decided to look at the opportunities to apply quantum in these areas.

                Looking into the future of sensing

                My contribution here was future-focused and inspired by the host of emerging new technologies which use quantum effects for enhanced sensing and imaging. My thoughts homed in on the possibility of using quantum sensing techniques to address obscuration, where dust or sprays of paint would normally prevent a traditional camera from capturing a clear image. Using single photon detection, LiDAR images could be captured to identify crack formation when it’s happening – even if it is obscured by stray particles.

                We also explored a future where quantum sensing and quantum computing could be used in combination. One of the key hurdles of image processing using quantum computers is sure to be the uploading of data. High-resolution images contain lots of information. Uploading it all will be hindered by a bottleneck caused when transferring classical data onto a quantum computer. The process is limited by physical principles, which are unlikely to be addressed quickly by quantum hardware development roadmaps. (Assuming it’s even possible.)
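A rough way to see the asymmetry: amplitude encoding can store n values in only ⌈log₂ n⌉ qubits, yet preparing that state generally requires on the order of n operations, so the upload dominates. The figures below are back-of-the-envelope illustrations, not claims about any particular hardware.

```python
import math

def qubits_needed(n_values: int) -> int:
    """Minimum qubits to amplitude-encode n_values amplitudes."""
    return math.ceil(math.log2(n_values))

def naive_load_ops(n_values: int) -> int:
    """Generic state preparation scales roughly linearly in the amplitude count."""
    return n_values  # order-of-magnitude estimate

# A 1024x1024 image fits in just 20 qubits, but loading it still touches
# about a million amplitudes -- storage is cheap, upload is not.
```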

                To overcome this bottleneck, we suggested extracting quantum states from a quantum computer and using these in a quantum sensing approach. Again, the thinking is ahead of the curve, but in my view could be viable in five to 10 years. Transferring quantum information from one computer to another is going to be necessary for large-scale quantum computing – and this is where so-called flying qubits could come in. Our proposal was to use these information-carrying particles as a quantum sensor. An output state from a quantum computer could be transmitted to the part under inspection and an imaging system would be used to project the state onto the part. Information about the part would be imprinted onto the flying qubit state through interactions with the part’s surface.

                Figure 1: The hybrid sensing/computing approach

                This information would be captured and transmitted back to a post-processing quantum computer, which could use a quantum neural network to process whether the part is faulty or not. By uploading information from the part in a quantum-native form, this quantum sensing/computing hybrid approach would avoid memory bandwidth bottlenecks (see Figure 1).

                Viable new approaches to sensing

                This work suggested several ways in which quantum technology and techniques could contribute to sensing in the context of automotive quality assessment. The technology could overcome obscuration by stray particles and, via the hybrid approach, improve the flow of data into quantum computers.

                In the final article in this series, I’ll explain our related work on quantum adoption roadmapping.


                Edmund Owen

                Principal Quantum Physicist at Cambridge Consultants (Capgemini Invent)
                Edmund combines his experience in modelling and quantum systems with the expertise of engineers, programmers and designers to develop quantum products that provide practical solutions to commercially and socially relevant problems.