
House of the rising data

Pierre-Adrien Hanania
March 31, 2021

By Pierre-Adrien Hanania and Iftikhar Ahmed

Public organizations are becoming increasingly data-driven. In 2020, 47% of public sector organizations stated that decision making in their organizations is “completely data-driven.”

Brace yourself, data-powered organizations are coming

This number will only increase in the coming years as more public services migrate to digital channels to meet citizen expectations. Whether it be AI judges in Estonia, cancer screening in hospitals, or the detection of bark beetles endangering Swedish forests, AI promises us unprecedented access to new insights. Even though it is exciting to think about the possibilities the increasing amount of data opens up for public organizations, it is equally important to understand their responsibility in using that data. Only if data is used effectively, efficiently, and securely – and the insights gathered from it are trustworthy and ethically processed – will the journey to fully data-driven government succeed and eventually enable technologies such as AI.

As of today, only 9% of public organizations claim to have “successfully deployed [AI] use cases in production and continue to scale more throughout multiple business teams.” That’s why data governance needs to be addressed. Data governance includes many different aspects, but it can be broadly understood as the holistic approach to data management throughout an organization, whether it be a central command center for a city authority, a hospital dealing with capacity data, or a welfare agency leveraging case management. A good data governance framework addresses many questions:

  • Data availability – Do I have access to the data needed for the public service I deliver?
  • Relevance – Do I have the right data for these services?
  • Usability – Is the data I collected actually usable?
  • Integrity – Is the data non-biased, representative, and complete?
  • Security – Is the data safely stored and protected against cyberattacks?

To answer these questions, organizations need a strong governance plan that ensures data is collected, stored, and used purposefully and efficiently. This is done by introducing standardized business processes that bring clear policies, procedures, roles, and responsibilities to all aspects of data management within the organization.
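To make the checklist above concrete, here is a minimal sketch in Python (with invented column names) of the kind of automated checks a governance process might run over an incoming dataset before releasing it for use:

    import pandas as pd

    def governance_checks(df: pd.DataFrame, required_cols: list) -> dict:
        """Basic availability, usability, and integrity checks on a dataset."""
        report = {}
        # Availability/relevance: are the fields the service needs present at all?
        report["missing_columns"] = [c for c in required_cols if c not in df.columns]
        # Usability: what share of the collected values is actually populated?
        report["completeness"] = float(df.notna().mean().mean())
        # Integrity: duplicate rows often signal a broken collection pipeline.
        report["duplicate_rows"] = int(df.duplicated().sum())
        return report

    # Invented example: hospital capacity data with a missing column and a gap.
    beds = pd.DataFrame({"hospital_id": [1, 1, 2], "beds_free": [5, 5, None]})
    print(governance_checks(beds, ["hospital_id", "beds_free", "region"]))

Real governance frameworks go much further (bias and representativeness are harder to automate), but even this level of gating catches many problems before they propagate downstream.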

Data on its voyage through life

This quest for data governance impacts the entire data lifecycle. It starts with the discovery and cataloging of data, where data sources are identified and the data from these sources is centralized and aggregated to build a strong foundation for the steps that follow. If we take the example of a bed availability monitoring system, data needs to be consolidated in relation to occupancy, weather, emergency helplines, and health agency reports.

This enables organizations to capture and understand their data better. Given the federated nature of IT systems in the public sector, activating data effectively and thoroughly is even more important. Next come the data-enabling processes, in which the data is prepared, data formats are standardized, and the source data is enriched so that the necessary insights can be gained. In our bed availability example, this means collecting a complete picture of availability across geographies and hospitals in order to see clearly the patterns and variables relevant to the situation. During this step, the data is also normalized, and outliers are identified and processed. Distinguishing regions where bed availability is an issue due to a specific occurrence, such as a new COVID-19 wave, from regions with a structural deficit, such as a permanent shortage of resources, helps accurately identify the hot spots causing anomalies. The normalization of data and the handling of anomalies ensure that data quality improves and that the data is controlled and managed efficiently.

The lifecycle culminates in the final step, where real-life advantage is created through data-driven decision making. A hospital facing a pandemic can then make the best use of the gathered data around cases, such as available beds or pandemic patterns, to take the best decisions on resource planning and patient reallocation. It is only then, with a resilient and well-governed data playground guaranteeing process efficiency, that cutting-edge technologies such as AI become possible to embrace.
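As a rough illustration of the enabling step, here is a minimal Python sketch, using invented occupancy figures, of how bed-availability data might be normalized and screened for outliers:

    import pandas as pd

    # Invented regional bed-occupancy rates gathered from several hospitals.
    occ = pd.DataFrame({
        "region": ["A", "B", "C", "D", "E"],
        "occupancy": [0.72, 0.68, 0.99, 0.70, 0.74],
    })

    # Normalize so that regions of different sizes become directly comparable.
    rng = occ["occupancy"].max() - occ["occupancy"].min()
    occ["occ_norm"] = (occ["occupancy"] - occ["occupancy"].min()) / rng

    # Flag outliers via a simple z-score; real pipelines would tune the threshold
    # and then ask why a region is hot (new COVID-19 wave vs. structural deficit).
    z = (occ["occupancy"] - occ["occupancy"].mean()) / occ["occupancy"].std()
    occ["hot_spot"] = z > 1.5

    print(occ[occ["hot_spot"]])  # region C stands out in this toy sample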

AI in the public sector will be governed, or won’t be at all

As in other fields, efficient data governance in public services is key to the adoption of AI. Because the public sector deals with crucial services (sensitive and personal data) and issues (security, justice, health, etc.), the above-mentioned pillars of data trust, security, and ethics are critical when it comes to the implementation of AI and other data-driven projects.

There is no room for breaches or other errors because, once lost, citizen trust is very hard to regain. To address these concerns, Capgemini has envisioned the AI & Data Engineering offering. Throughout the data lifecycle, Capgemini addresses all stages – from the platform foundation to activated data for AI and analytics execution.

Platform sweet platform – Data needs a trusted and resilient home

The customizable AI & Data Engineering Platform addresses the concerns that are specific to the public sector and provides a safe harbor for AI implementation in at least four ways:

  • Steering centralized ingestion of qualitative and quantitative data

The wildly diverse data sources used to collect data in the public sector, and their differing data quality standards, need to be addressed and managed. To effectively support a process such as welfare fraud detection, various risk indicators from existing governmental systems covering taxes, health insurance, residence, education, etc. need to be considered. The raw data is collected and stored by different agencies and ministries at the state, municipal, and federal levels. Only if the collection and ingestion of that data is managed well will trustworthy results be feasible. In the AI & Data Engineering Framework, the effective ingestion and processing of different types of data is addressed as part of the “Data Foundation” building block.
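A minimal sketch of what centralizing such indicators can look like, with invented registry extracts and an intentionally crude rule (real fraud detection uses far richer features and operates under strict legal safeguards):

    import pandas as pd

    # Invented extracts from two separate registries.
    tax = pd.DataFrame({"citizen_id": [101, 102], "declared_income": [18000, 52000]})
    residence = pd.DataFrame({"citizen_id": [101, 102], "registered_persons": [6, 2]})

    # Centralize: join on a shared identifier so the risk indicators line up.
    merged = tax.merge(residence, on="citizen_id", how="outer")

    # A toy indicator: low declared income combined with a large household
    # merely flags a case for human review, never an automated decision.
    merged["review_flag"] = (merged["declared_income"] < 20000) & (
        merged["registered_persons"] >= 5)
    print(merged)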

  • Building a resilient and robust platform

The pandemic saw a spike in cyberattacks against hospitals. In France, for example, the number of such cyberattacks almost quadrupled, underscoring the fact that resilience against such threats is key for the success of any data-driven organization. Such a safety standard can only be ensured if the platform on which the data is processed is secure and offers protection against any kind of cyberattack. The Capgemini Framework addresses and solves these concerns as part of the “Platform Foundation” building block.

  • Ensuring the ethical use of data

Ethical AI is not a concern specific to the public sector, but the democratic accountability of crucial public services does demand special attention. Data collection and processing through public services must be transparent and comprehensible for citizens, especially when the decision-making process is data-driven. Only then can citizen requirements be satisfied regarding data ethics and explainability. Public sector projects can only be successful if the general public accepts and supports the process outcome, for example, in an intelligent job matching engine.

Data protection is another pillar of this quest for ethical AI. Ensuring the safety and integrity of citizen data is essential in public services, for example, in health agencies where patient data must be safely processed. With data anonymization techniques such as those addressed in the “Data Trust” block of the framework, embracing AI while meeting GDPR requirements is possible.
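As an illustration of two common anonymization techniques – pseudonymization of direct identifiers and generalization of quasi-identifiers – here is a minimal Python sketch with invented patient fields (note that pseudonymized data still counts as personal data under GDPR):

    import hashlib

    SALT = b"rotate-me-and-store-me-separately"  # never hard-code this in practice

    def pseudonymize(patient_id: str) -> str:
        """Replace a direct identifier with a salted hash (pseudonymization)."""
        return hashlib.sha256(SALT + patient_id.encode()).hexdigest()[:16]

    def generalize_age(age: int) -> str:
        """Coarsen a quasi-identifier into a band to reduce re-identification risk."""
        low = (age // 10) * 10
        return f"{low}-{low + 9}"

    record = {"patient_id": "NL-4711", "age": 47, "diagnosis": "J18.9"}
    safe = {
        "patient_ref": pseudonymize(record["patient_id"]),
        "age_band": generalize_age(record["age"]),
        "diagnosis": record["diagnosis"],
    }
    print(safe)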

  • Opening the way towards decision-supporting intelligence

In Capgemini research from 2020, 68% of city officials stated that smart city initiatives – which build on a strong digitization of processes – have helped them manage the COVID-19 crisis effectively. Be it for a city, a hospital, or a central government, the pandemic has proven how important it is to have a solid information infrastructure that enables decision makers to take evidence-based administrative action. Only with such an infrastructure, and with the help of efficient data governance, will it be possible to ensure successful AI projects in the public sector and make data-driven governments a reality. Building on its ability to master data, a data-driven organization can better understand the current situation, predict coming occurrences, and proactively take decisions based on the gathered and analyzed data.

These benefits can be extended to every part of the public sector – from central command centers for smart territories to seamless smart borders for airport and security authorities; from data-driven hospitals gathering medical data for the good of the patient to end-to-end automated administrative processes in organizations dealing with heavy documentation; from efficient identification and control of real-time threats to more user-friendly and effective public services for the citizens.

Strong data platforms will be data’s rising sun momentum

Data-driven governments are no longer a utopia; they are fast becoming reality. More and more public organizations are using the data they have access to, to improve their own insights and the services that they offer to citizens.

In society 5.0, every part of the public sector has potential that depends on secure, robust, relevant, trustworthy, and ethical data. But only through effective data governance can the data, throughout its whole lifecycle, be managed and processed in a way that fulfills all the requirements public organizations have for it. It is clear that data-driven governments are the future. However, they will only have an impact if mastering data is holistically introduced throughout the organizational structures.

Our AI in Public Sector proposition is part of the suite of Capgemini services – Perform AI (https://www.capgemini.com/solutions/data-and-ai-in-the-public-sector/) – and is a catalyst for the AI-infused transformation of public services. Want to know more? Check out our AI for Public Sector point of view here.

Robo advisors and the Dutch banks: Strategically ignoring the hype or waiting for maturity?

Capgemini
March 31, 2021

Part 2 in the Retail Investor series

There is a legion of robo advisors available to the retail investor, whether offered domestically or internationally. And although they are a typical FinTech value proposition, these innovative “online, inventive, or digital stewards of wealth” are also offered by the world’s largest asset management companies (e.g., BlackRock, Morgan Stanley, and Vanguard). The Netherlands is considered one of the most innovative countries in the world in the financial sector. So surely there must be a reason why these services are not offered – or at least not actively advertised – by any of the large Dutch banks?

What is a robo advisor?

Robo advisors come in different shapes and sizes, making a single definition somewhat troublesome. They are best described as automated digital financial management tools to help investors manage their portfolios with moderate to minimal human involvement (from the bank). Their advice is built on algorithms or rules, which are fed by a questionnaire about the investor’s preferences (think about risk profile, resources, goals, etc.).
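As a toy illustration of that rule layer, here is a minimal Python sketch, with invented questions and weights, of how questionnaire answers might be scored and mapped to a model allocation (a sketch only, not investment advice or any bank’s actual logic):

    def risk_score(answers: dict) -> int:
        """Very simplified scoring of a suitability questionnaire."""
        score = {"low": 0, "medium": 2, "high": 4}[answers["risk_appetite"]]
        score += 2 if answers["horizon_years"] >= 10 else 0
        score += 1 if answers["has_emergency_fund"] else 0
        return score

    def model_portfolio(score: int) -> dict:
        """Map the score to a stock/bond split using a toy rule."""
        equity = min(90, 30 + score * 10)
        return {"equity_pct": equity, "bonds_pct": 100 - equity}

    answers = {"risk_appetite": "medium", "horizon_years": 15, "has_emergency_fund": True}
    print(model_portfolio(risk_score(answers)))  # {'equity_pct': 80, 'bonds_pct': 20}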

What are the advantages and disadvantages of robo advisors?

By automating the digital advice and portfolio management functions, organizations and investors can benefit from much lower fees than traditional wealth management firms would charge. Better yet, because of the leaner cost structure (e.g., little or no human intervention), the service is no longer restricted to HNWIs (high-net-worth individuals) by a large minimum investment. Instead, it becomes accessible to ordinary retail investors with a minimum investment of a few hundred euros. Finally, these robo advisor services are often digitally well designed from a customer journey perspective. Everything can be managed online, from changing your preferences to gaining insights into the asset classes and returns of your portfolio.

Robo advisors also have some disadvantages. These services are not 100% personalized (yet). Even though an investor’s main concern is the desired returns, humans always have specific and different needs, which they want to express to another human in order to feel understood or assured. There are no face-to-face meetings with a robo advisor. When the market suddenly undergoes a drastic correction, the robo advisor will not calm you down with its experience and knowledge of how financial markets work.

In which context are robo advisors relevant?

To fully understand the context for robo advisors, consider the four investment options below. Robo advisors are especially relevant in the latter two categories:

  • Execution-only investing – Usually when people start to invest, they start small and just try their luck by themselves, often choosing simple financial instruments such as stocks.
  • Guided investing – Investing in index funds or mutual funds is particularly easy, since investors do not have to check in often or make big decisions (e.g., they just follow the index). In neither of these first two categories do banks offer actual advice during investment decisions.
  • Personal banking – The next categories can have many different names and forms, often with many varieties and different limits. Personal banking is typically investing with advice from an advisor, subject to a minimum investment (limits and naming can differ among banks). You can choose to receive advice or fully trust the advisor; either way, a team of experts will manage your portfolio and potentially offer additional services too.
  • Wealth management – The last category is also known as private banking, and these services are usually offered to the wealthy (HNWIs). Wealth management firms are highly specialized, with an expert team of bankers that cater to a larger set of HNWI needs from an investment and lifecycle perspective.

Are robo advisors hype, or are Dutch banks missing an opportunity?

Wealth management’s large minimum investment is mostly explained by the high fees, which are often charged every quarter as a percentage of the assets under management. And while many wealth management firms have been able to quickly adapt to a new online or digital business in these turbulent times (Financial Times), according to the Capgemini Wealth Report (2020), “wealth managers must navigate an uncharted, post-pandemic world without a playbook.” The unusual events of this year have caused investors to critically assess their traditional wealth managers, especially scrutinizing two subjects: advisory fees and personalized services/advice along the customer journey. Data from the report, gathered from more than 2,500 investors, supports these findings:

  • 33% of all respondents said they were uncomfortable with the fees wealth firms charged.

This is especially true given growing concerns about volatile financial markets and rising expectations. Additionally, the gap between existing and desired states merits further consideration, since:

  • 22% of HNWIs say they plan to change their primary wealth manager.

A top reason for switching is high fees. The obvious question, then, is: if robo advisors can offer wealth management services at lower fees, why do they seem not to have been widely adopted by top Dutch banks? The answer probably lies in clients’ expectations of hyper-personalization. Investors not only seek a reasonable return, they also desire value-added services and are willing to pay extra for them (e.g., wealth transfer and inheritance management). Unfortunately, current robo advisors are not yet capable of delivering this. They have been adopted by some European banks, and these automated services greatly helped with the surge of new customers in March 2020, but we’re not there yet. However:

  • 74% of investors are likely to consider BigTech wealth management services.

Robo advisors have the potential to offer financial well-being services, currently limited to HNWIs, to average retail investors. However, hyper-personalization is the missing link and probably requires data-driven capabilities. Google has already launched Google Pay, and we know BigTech can do almost anything with data. Survey results indicate that 74% of the surveyed investors are interested in BigTech, which may pose a real threat to current financial institutions. What if Google were to deliver wealth management services, potentially through robo advisors, at low cost and with great accessibility to all (not just HNWIs)? If no action is taken, current financial institutions may very well lose clients to FinTechs and BigTechs (as the survey results indicate). Keeping investors satisfied with sustainable investments alone might not be a durable growth strategy in the foreseeable future.

What’s next?

In the next article we will explore the future of investment management. If you are interested in this topic, connect with me on LinkedIn.

How to design for users despite growing challenges

Capgemini
March 30, 2021

The perceived importance of users, as the final recipients of a solution, grows continuously – not only when it comes to developing commercial products used for private purposes but also in the work environment. All in all, usefulness, effectiveness, and increased productivity in performed tasks are the key qualities of many solutions. However, to meet these expectations we must hear what users expect. And this is where things usually get tough. I’ve heard these statements many times:

Users are important but:

  • We don’t have any users yet.
  • We have many users. Hundreds! Thousands! Hundreds of thousands! And all of them are different.
  • Our project is so specific. There is no way to gather feedback.
  • We just can’t change the solution.

So, we can’t conduct any research. We can’t design for users.

Interestingly, as a psychologist in the IT world, I see that when the topic of people, people’s needs or, worse still, people’s emotions appears on the horizon, some get tight-lipped. Alternatively, the available project resources suddenly get really scarce and there is not much space for user research. This might be the result of the attitudes mentioned above. In the end, the finished solution is ready not for people but for the solution itself. So how do we overcome those challenges and design for real, for people? Let’s find answers to the above statements:

  • We don’t have any users yet.

No users? Well, we don’t build time machines (yet!), so somewhere out there should be people who use similar solutions or have similar needs. We can engage an external agency to recruit user research participants based on our requirements, look for a specific community on social media, or even ask our colleagues for help. Let me give you an example of what I mean. Consider an app presenting different data for further analysis. In many roles, people have to use some kind of dashboard, analyze data, download reports, etc. So why not ask colleagues for help? Although the content will be different, the usage patterns might be similar and the feedback will be priceless. Or rather, the feedback has a very specific value: the value of not developing a solution with significant flaws and having to redo it afterwards.

  • We have many users. Hundreds! Thousands! Hundreds of thousands! And all of them are different.

This is a frequent notion in big organizations. Should we give up without even trying? No, it is always better to ask a few people than to not ask anyone. According to the Nielsen Norman Group (a top UX consulting organization), five users are enough to find 85% of usability problems in a design.[1] Of course, when we design for groups of users with significantly different needs and contexts, it gets a bit more complex and more users are needed for testing – still not hundreds, but much smaller numbers. Let me visualize this. If 5–6 users can’t find the button to submit an IT issue in a self-service IT portal, or 4–5 users spend 10 minutes scheduling a meeting in an app dedicated to this, it means that we have a serious problem. Importantly, we don’t need to worry about statistical significance and we don’t have to ask hundreds of people to get valid feedback. Even a relatively small sample can direct our attention to aspects we overlooked because we were familiar with the product and watched it develop over time.
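The arithmetic behind that 85% figure comes from the Nielsen/Landauer model, in which each test user independently uncovers a given usability problem with a probability of roughly 0.31. A quick check in Python:

    # Nielsen/Landauer model: share of problems found by n users,
    # assuming each user uncovers a given problem with probability L.
    L = 0.31  # average value reported in their research
    for n in (1, 3, 5, 10):
        print(n, "users ->", round(1 - (1 - L) ** n, 2))
    # 5 users -> 0.84, i.e., roughly the 85% cited above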

  • Our project is so specific. There is no way to gather feedback.

I agree that every project is different and comes with its own challenges when designing for experience. I can’t, however, agree that in some cases there is literally no way to figure out what can be improved to create a user-centric solution. I believe that there is always a way to find people who will share their feedback. Sometimes, they will even feel appreciated and listened to (at last!). It all depends on how we present the need for user feedback.

Even if we find the right people, there can be a language barrier. We have worked with translators while still running interactive exercises and engaging conversations with our research participants. Or, let’s imagine the need to gather user feedback on a service, as opposed to a more tangible tool (e.g., a web application). That’s not a problem at all. We can use storyboards, record a short movie, or describe the solution in any other way that lets us learn how it can be improved – though I have to admit that a bit of creativity is needed here.

  • We just can’t change the solution.

In many projects, Capgemini CIS implements out-of-the-box solutions, e.g., the Microsoft 365 suite (with Teams gaining popularity during the COVID-19 pandemic) or the ServiceNow platform. In most of these cases, we can’t change the features (although in some we successfully did!). But even if a single functionality can’t be adjusted to the organization and its users, we can significantly change how the solution will be perceived by users. Our Digital Adoption Managers team is responsible for activities aimed at increasing tool adoption: communication, marketing elements, and gamification campaigns. Although the tools can’t be changed, we can improve users’ perception of new solutions. They do not have to perceive new systems, applications, or services as a whimsical idea from top managers designed to make their lives even more difficult. If we highlight the benefits and show, in a targeted and innovative communication campaign, how new tools will help users overcome their current daily struggles, the results will be impressive. Nonetheless, a proper understanding of different groups of users (personas) is crucial to designing a proper solution, no matter whether that is a tangible device or application or the much less palpable communication sent to users. To fine-tune these activities, relevant research should of course be conducted.

I hope that you will find some of these ideas useful in your projects. If you have other challenges or want to learn more, contact me directly on LinkedIn.

Five success factors for M&A integrations

Ludvig Daae
March 26, 2021

Is your next merger bound for success? With these five tips it will be!

While M&As are on the agenda of every board member worldwide, they don’t always provide the desired synergies. In this article, we share five success factors that increase the chance that synergies are realized. We base these factors on the successful and unsuccessful integrations we have encountered over the years, no matter the size of the companies involved.

Perform proper IT due diligence at the start

Without proper IT due diligence pre-merger, or at least post-merger, there is a huge risk that financial, compliance, and legal issues will pop up. These can wipe out any synergy benefits or even undo an entire post-merger integration. With today’s business relying heavily on IT, it is key to focus on information systems and information management. Costs for IT restructuring can make the business case less attractive. Also, companies engaging in transatlantic mergers should consider the specific laws and regulations that define how information management should be performed. For US-based companies this is the Sarbanes-Oxley Act (SOX), while for companies working with European customers it is the General Data Protection Regulation (GDPR). SOX requires that publicly held companies have business continuity plans in place, which means their European subsidiaries should have them in place as well. The GDPR, on the other hand, mandates that companies know where their customer data resides.

Base the synergies that are to be realized on clear strategic decisions

Synergy means that two companies combined will be greater than the sum of their parts. Synergies are most often the reason that mergers and acquisitions take place, but realizing them is easier said than done. To increase the probability that synergies will be realized down the line, it is important that they are clear from the start. By setting clear strategic decisions and expected outcomes upon which the merger is based, executives ensure that the synergies will naturally follow for both parties, mainly because this establishes guidance on where the integration projects should go. Linking the synergies to KPIs can help track whether progress is still in line with strategic goals.

Have a digital perspective on integrations

In current market conditions, accelerating digital transformation is key for improving profitability and growth. That’s why you should take a digital perspective on the integration. When two companies merge, their infrastructures, processes, etc. are not yet aligned. To ensure that these integrate successfully to the required level (depending on strategic decisions), think about the applications, agility, scalability, and how both parties can exchange digital capabilities. Validate whether the application landscape aligns with digital ambitions. What resource capabilities support the ambitions, and which capabilities need to be added? In other words, use IT modernization to ensure that business transformation takes place during the integration. For example, when agility and scalability are preferred, focus on consolidating multiple ERP systems into one ERP system and align the business processes of both companies. This also ensures scalability and ease of transition for future mergers. Don’t take any shortcuts while running these projects – such as moving workloads without proper analysis – because of the business continuity risk. IT modernization is difficult work, and it is an ongoing effort: you’re never fully modernized because people, processes, and technology are always changing. But a post-merger integration (PMI) provides a great opportunity to use the momentum to better align them.

Embed change management in each project or workstream instead of separating it as a responsibility

Culture eats strategy for breakfast. Each project or workstream lead should understand and feel that they are responsible for successful change management. There shouldn’t be a separate change management stream that can take accountability away from other streams. To successfully integrate two companies and their people and cultures, effective communication and giving employees ownership of projects are key. Using the leadership skills of an executive can have a positive effect on change; for example, have the CIO of the acquiring company provide the introduction and the strategic reasoning before a principles workshop. This can help both sides understand each other and the overarching vision. Ensuring that employees can interact and get to know each other through workshops on specific topics is key to bridging fears and motivations. Highlighting cultural differences upfront can take away a lot of misunderstanding.

Define clear principles and follow up on them

More often than not, IT managers try to use integrations to finance state-of-the-art, over-priced solutions that do not provide value to the transformation. Clear principles help to avoid such situations. Principles are fundamental statements that function as guidelines for integrations. They are key in bringing different parties together and should be prepared via executive alignment meetings. Always be very rigorous in setting correct and unambiguous principles, otherwise alignment and behavior could deteriorate along the way. To ensure principles are applied, make sure that they are embedded in the governance. This can be done in a couple of ways – for example, by deciding on them at board level and continuously carrying them out in workshops. When performing further integration projects, the principles should be taken as standards by the steering committee. When there are many integration projects, put a quality board in place to check that principles are applied when decisions are made, or agree to only discharge a project team once its adherence to the principles has been validated. By doing this you’ll also benefit from increased financial control, considering that a lot of capital is made available for mergers and acquisitions.

We look forward to having a conversation with you around these success factors. Please reach out to Ludvig Daae.

About Author

Ludvig Daae

Expert in Digital Transformation and Innovation
I help IT leaders of global organizations to navigate through the complexity of organizational and technological change: from Digital / IT strategy design to implementation of that strategy.

    On target! How to ensure your marketing gets results

    Capgemini
    March 26, 2021

    What do your customers want? What do they expect your brand to deliver? And are you delivering it? These are all questions that CMOs and their marketing teams have become used to. After all, getting to ‘know your customer’ is nothing new.

    Currently, however, direct-to-consumer marketing typically takes a fairly broad-brush approach – a mass mail shot here, an ad campaign there. These get results, but do they really generate the return on marketing investment (ROMI) hoped for?

    Largely, the answer to this is no. That’s because they don’t drill down into the unique needs and expectations of individual customers. A consumer who only drinks a particular brand of coffee. A customer’s birthday, anniversary, new home. Marketers still have a way to go to improve how they leverage their greatest asset – data.

    Data, data everywhere

    There is so much data available. On customers. On product lifecycles. Across multiple interaction channels. It’s everywhere. But it’s data that is fragmented, duplicated, and inconsistent across functional silos, brands, and partners. Then there’s the added challenge of how to manage consent in line with GDPR compliance in Europe and other legislative regimes. Where do CMOs start?

    Lots of companies are collecting data via pop-ups on customers’ screens, tagging, and cookies. But many of them are not asking for their customers’ consent to use the data to provide a better, more personalized experience, so it’s wasted effort. Looking ahead to 2022, when Google plans to end support for third-party cookies, a valuable avenue for collecting customer data is closing.

    So, marketers must transform how they capture customer data and turn it into a true business asset. By using it to gain a deep understanding of customer characteristics, behaviors, and needs, they can target their customers with personalized, timely offers and services. This will ensure they remain relevant.

    At Capgemini, we believe the way to achieve this lies in segmentation and profiling. This helps companies put the customer at the heart of all marketing and sales decisions by generating a deep customer understanding that is based on data-driven insights and direct customer feedback.

    Building customer profiles

    Segmentation and profiling use primary data (collected directly from the customer) enriched with second-party data (captured via data alliances), and third-party data (captured by other companies with no direct relationship to the consumer). Building a profile based on this data is enabled by artificial intelligence (AI), which helps to identify behavior patterns, and segments customers into clusters with specific profiles. In turn, this enables more relevant targeting (hobbies, interests, lifestyle, etc).
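    As a rough sketch of the clustering step, here is what segmenting customers on a few behavioral features might look like in Python with scikit-learn (the features and numbers are invented for illustration):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Invented per-customer features: visits/month, avg basket (EUR),
        # days since last purchase.
        X = np.array([
            [12, 35.0, 3], [11, 40.0, 5], [2, 80.0, 40],
            [1, 95.0, 55], [6, 20.0, 10], [7, 22.0, 8],
        ])

        # Scale first: k-means is distance-based and the features differ in units.
        Xs = StandardScaler().fit_transform(X)

        # Three segments, e.g., frequent small-basket, rare big-basket, mid-range.
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xs)
        print(labels)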

    Some sectors are already ahead of the game. Media organizations and retailers, for example, make use of primary and second-party data in their direct-to-consumer marketing. Retailers are also tapping into data captured by their loyalty card schemes to target customers with relevant offers, and we are seeing some great uses of data in certain instances. Take French retailer Monoprix, for example. The company used purchasing history to surprise its customers with discounts on a particular product it knew those customers already bought. So, in a sense, this was a gift to the customers, rather than them being offered a discount on something they didn’t really want.

    Other sectors, however, such as consumer packaged goods (CPG), don’t always have direct access to their customers because they reach the end consumer via retailers. Nonetheless, in some instances, we are beginning to see companies developing their own direct-to-consumer channels.

    Capture, analyze, and activate

    No matter how mature a company’s approach to data collection, the desired outcome is the same – to measurably improve their sales and customer outcomes through targeted customer interaction. Segmentation and profiling bring this to life.

    At Capgemini, we work with our clients to help them build data strategies across three pillars of segmentation and profiling activity: capture, analyze, and activate.

    We built a data management platform for a leading CPG company wanting to increase the value of its data for the marketing organization. This revolutionized the way in which its data was curated, analyzed, and used. The solution brings together the world’s largest data sets, advanced in-house machine learning tools, deep category understanding, powerful analytics, and compelling insights. The result? A data-driven approach to previously complex decision-making, helping to fine-tune campaign content and media outputs. The analysis also provides a deeper understanding of the company’s competitors, emerging products, upcoming trends, and consumers.

    For a global automotive manufacturer, we used statistical models and AI to predict customers’ purchase propensity, model affinity, and price elasticity. In addition, self-learning models recalibrate weekly based on direct customer feedback. We have helped to boost sales conversion by a factor of five through data-driven actions.
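    For readers who want a feel for what a purchase-propensity model involves, here is a deliberately tiny Python sketch with invented features and data; the weekly recalibration described above would simply mean refitting on fresh feedback:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Invented lead features: [website visits, test drives, days since contact]
        X = np.array([[1, 0, 90], [5, 1, 10], [8, 2, 3],
                      [2, 0, 60], [7, 1, 7], [0, 0, 120]])
        y = np.array([0, 1, 1, 0, 1, 0])  # 1 = purchased

        model = LogisticRegression().fit(X, y)

        # Propensity score for a new lead; refit weekly to "recalibrate".
        print(model.predict_proba([[6, 1, 5]])[0, 1])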

    Segmentation and profiling can clearly produce benefits fast and we can see so many ways in which to use data to improve marketing outcomes. However, a word of caution is in order. It is important to be sure that your customer is happy to receive the offers and information you target them with. After their consent has been given, be sure their data and contacts are managed appropriately. Don’t overdo it. Too much can risk your relationship with the customer. Be smart with their data. Make it work for your customer as well as for you.

    Download our ‘Put your customer first with connected marketing’ whitepaper to learn more.

    Get in touch to learn more about segmentation and profiling in Capgemini’s Connected Marketing offer.

    Contact:

    Stephane Sun | Senior Director

    Florian Seltene | Senior Manager

    Award-winning AI query handling you can trust

    Marek Sowa
    March 26, 2021

    For years, queries sent to finance departments had to be processed manually. This was extraordinarily time-consuming, extremely prone to human error, and often took employees away from more business-critical tasks. However, this problem is now a thing of the past as part of Capgemini’s move towards implementing frictionless operations for our clients.

    Building on our deep expertise…

    Building on our deep learning expertise, Capgemini’s Artificial Intelligence Lab created our AI Query Handling tool, which utilizes natural language classification (NLC) to assist in the data extraction process – providing relevant data points for a variety of finance-related questions. This ensures questions are always sent to the right person or department within a matter of minutes – not days.

    For example, if a client needs a copy of an invoice sent to them, they simply provide the email address and invoice number associated with the order. Our AI Query Handling tool can then extract and classify the information needed to send a copy of the invoice – without human intervention. Or, if the query comes with supplemental documentation, the tool will scan the document, retrieve the relevant information, and provide the answer quickly and efficiently. If any information is missing, a request for supplemental materials is sent to the relevant people almost instantly to keep the process moving.
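    The core routing idea behind NLC can be sketched in a few lines of Python. The queues, training texts, and confidence threshold below are all invented; they illustrate the general pattern of classifying free-text queries and escalating to a person when confidence is low, not Capgemini’s actual implementation:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression

        # Tiny invented training set: query text -> queue that should handle it.
        texts = ["please send a copy of invoice 123",
                 "copy of the invoice for order 9",
                 "when will my payment be processed",
                 "payment status for last month",
                 "update my bank account details",
                 "change of bank details on file"]
        queues = ["invoice_copy", "invoice_copy", "payment_status",
                  "payment_status", "master_data", "master_data"]

        vec = TfidfVectorizer()
        clf = LogisticRegression().fit(vec.fit_transform(texts), queues)

        def route(query: str, threshold: float = 0.5) -> str:
            """Return a target queue, or escalate when the model is unsure."""
            probs = clf.predict_proba(vec.transform([query]))[0]
            best = probs.argmax()
            return clf.classes_[best] if probs[best] >= threshold else "human_review"

        print(route("could you resend the invoice for order 451?"))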

    Keeping coworkers in the loop when needed

    Our AI Query Handling tool is often able to identify when important information is missing, and it quickly emails a request to fill in the blanks. It is for this reason that finance departments utilizing it have seen a 90% reduction in inquiry response times.

    However, sometimes our tool needs help from its human coworkers. The AI Query Handling tool recognizes when data is inaccurate or insufficient, or when a question requires a more detailed answer. In any of these instances, it quickly intervenes and sends the query to the right person or department. The employee there then figures out what is missing, provides the information, and sends it back to the tool. This enables the process to proceed automatically or, in more complicated instances, enables the right people to spend the time needed to resolve the issue properly before moving it forward in the process. Either way, it is just one of the ways Capgemini is reimagining key business operations.

    AIConics likes what Capgemini’s got…

    AIConics recently recognized the innovative, game-changing possibilities that Capgemini’s AI Query Handling tool provides to finance departments and a number of other functions, thanks to its scalability to HR and supply chain queries. The AI Lab is thrilled to receive this recognition for all their hard work – so congratulations go out to all involved!

    The AI Query Handling tool is part of Capgemini’s Intelligent Process Automation (IPA) offering. To learn more about how IPA can help your business run more efficiently, helping you transition to – what we call – the Frictionless Enterprise, contact: marek.a.sowa@capgemini.com

    Marek Sowa is head of Capgemini’s Intelligent Automation Offering focused on adopting AI technologies into business services. He leverages the potential hidden in deep and machine learning to increase the speed, accuracy, and automation of processes. This helps clients to transform their business operations leveraging the combined power of AI and RPA to create working solutions that deliver real business value.

    Marcin Stachowiak is Head of Capgemini’s Intelligent Automation Lab and Senior AI Lead for Capgemini’s Business Services. He is responsible for establishing Capgemini’s machine learning-driven automation innovation strategy, and for managing our internal centers of excellence that focus on machine learning-based production systems. This means Marcin is well-versed in overseeing effective systems integrations, while always keeping our clients’ wider needs in mind.

    How to drive a frictionless financial close

    Capgemini
    March 25, 2021

    There are increased expectations from CFOs for faster financial and management report delivery – not only to meet regulatory requirements, which are becoming more stringent, but also to enable quicker responses to business needs.

    The goal of the one-day financial close or continuous close is no longer limited to a few organizations – it is increasingly becoming a CFO objective across all industries. This puts greater responsibility on the record-to-analyze (R2A) team to reduce financial close cycle times in addition to enhancing the accuracy of reported figures.

    The close process is challenging, as stakeholder inputs are required across a variety of departments based on data housed in different systems. Activities also need to be completed within the close period, which requires a great deal of follow-up emails and team coordination – often in a highly stressful environment.

    Proven ways to improve your close process

    With this in mind, how can you ensure that your financial close process is frictionless? These five key points can help you reduce the cycle time of period-end activities, and help you transition to – what we call – the Frictionless Enterprise:

    • A solid close calendar – agree, sign off on, and strictly follow an entity-level close calendar at the beginning of the year, with only minimal exceptions by stakeholders
    • Enhanced workflow, tracking, and scheduling – track all close tasks with a defined responsibility matrix by leveraging BlackLine or Trintech Cadency technology instead of Excel spreadsheets. These tools can also be used to trigger actions, automate follow-up, and provide status updates without having to use email. Building the dependencies between the tasks and allowing automated task scheduling is a great way to speed up the close and free up resources for more analytical activities (see the sketch after this list)
    • Rationalize close tasks – move all non-close-based tasks outside the month-end close period as much as possible; process journals before the month-end close period, and complete reconciliations in a continuous manner rather than during or after the close. Move away from a once-a-month mindset towards continuous accounting activities
    • Leverage automation – move towards touchless journals with workflow-driven approvals, leveraging robotic process automation (RPA) and/or BlackLine or Cadency technology to automate close activities such as accruals, reversals, allocations, and depreciation runs.
    • Enhanced reporting – standardize and automate financial and management reporting by leveraging tools such as Power BI and Tableau along with natural language generation (NLG) technology.
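    The dependency-and-scheduling idea in the second bullet can be illustrated with Python’s standard-library graphlib. The close tasks below are invented, but the pattern – run every task as soon as its prerequisites finish, rather than in a fixed serial order – is exactly what the workflow tools automate:

        from graphlib import TopologicalSorter  # Python 3.9+

        # Invented close tasks and their prerequisites.
        deps = {
            "post_accruals": set(),
            "run_depreciation": set(),
            "intercompany_recs": {"post_accruals"},
            "trial_balance": {"post_accruals", "run_depreciation"},
            "management_report": {"trial_balance", "intercompany_recs"},
        }

        ts = TopologicalSorter(deps)
        ts.prepare()
        while ts.is_active():
            batch = list(ts.get_ready())  # tasks whose prerequisites are all done
            print("can run in parallel:", batch)
            ts.done(*batch)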

    In addition to this end-to-end transformation that integrates your systems across the organization, a single ERP platform supported by multiple BlackLine or Cadency modules – and a rationalized chart of accounts and entity structure – can help you streamline your end-to-end R2A processes, leading to a truly frictionless financial close.

    What frictionless finance can really offer

    Implementing a frictionless approach to your close process can reduce your cycle time by 40–50% and lead to improved statement accuracy. This, in turn, can free up your finance teams to concentrate more on business-critical tasks and enable your CFO to focus more on the strategic priorities that really matter to them.

    On top of this, removing the friction from your R2A process can help you implement continuous analysis and finance intelligence to enable informed business decision-making. This can lead to a confident, imperceptible period-end close with greater regulatory compliance, improved investor confidence, and enhanced market response.

    What’s not to love about frictionless?

    To learn how Capgemini’s Frictionless Finance leverages its partner technology ecosystem to drive the digital transformation of your finance function and deliver a frictionless close, contact: arush.kumar@capgemini.com

    Arush Kumar is responsible for driving digital transformation of finance processes leveraging Capgemini’s Finance Powered by Intelligent Automation offer, which adds value to our clients’ business operations by implementing best-in-class processes and driving financial savings.

    Deploying and managing containerized applications at scale: Kubernetes

    Rens Huizenga
    March 25, 2021

    The strength of an enterprise’s cloud application portfolio is intrinsically linked to successful customer experiences and great enterprise performance. For success, organizations need to be able to develop and deploy applications at speed, at scale, and securely.

    As a result, containerized applications are becoming essential to organizations across sectors. These work by bundling all the components of an application into one container, enabling organizations to easily run software as it moves from one computing environment to another.

    These applications improve agility, streamline development and deployment operations, increase scalability, and optimize resources. However, managing and orchestrating these at scale can certainly be a challenge. This is where the Kubernetes platform comes in.

    What is Kubernetes?

    Kubernetes is the leading platform for orchestrating containerized applications, providing methods and interfaces that offer predictability, scalability, and high availability. It is available for enterprises to manage on their own, or it is offered by cloud service providers like Microsoft as Kubernetes-as-a-Service.

    While many organizations decide to run the orchestration of Kubernetes on their own, this means manually handling processes like lifecycle management, scaling, and scheduling. These require a lot of time and investment, made more complex as more applications are developed and deployed.
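    To give a concrete flavor of that do-it-yourself burden, here is a minimal sketch using the official Kubernetes Python client; it assumes a reachable cluster, a local kubeconfig, and an existing deployment called “web” (all invented example names):

        from kubernetes import client, config

        # Assumes a kubeconfig on this machine and a deployment named "web"
        # in the "default" namespace (invented example names).
        config.load_kube_config()
        apps = client.AppsV1Api()

        # Manual scaling: exactly the kind of task a managed platform automates.
        apps.patch_namespaced_deployment_scale(
            name="web", namespace="default",
            body={"spec": {"replicas": 5}})

        for d in apps.list_namespaced_deployment("default").items:
            print(d.metadata.name, d.status.ready_replicas, "/", d.spec.replicas)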

    Capgemini: your partner for Kubernetes

    Alongside our partners, including AWS, Red Hat, VMware, and Microsoft, we use our expertise to help organizations design, build, and manage cloud-native applications in public, private, or hybrid cloud environments. Our three-step process is highly tailored to your organization’s needs, so that you get the best from containerized applications.

    • Phase one – landscape analysis: To kickstart your move, we’ll analyze your existing applications for their cloud readiness. This includes assessing current tech debt against cloud-native principles and identifying gaps in your current architecture versus the desired state. This process will define the vision, decision framework, and notional architecture to reach your goals while prioritizing your ongoing business strategy.
    • Phase two – innovate: A strategic vision and roadmap define the actions needed to establish the foundational technologies and platforms to drive continuous innovation across multiple cloud infrastructures. We support organizations to re-platform existing monolithic applications and their functional capabilities into autonomous microservices, using best practices for how to containerize an application.
    • Phase three – optimize: Capgemini and its partners use the Kubernetes cluster management platform to orchestrate the networking, storage, security, image registry, and general computing of your containerized workloads. We can extend Kubernetes into your existing data centers, public clouds, and edge environments. We can also use our bespoke Kubernetes and Kubernetes-native tools to optimize deployments at scale, supported by a continuous measurement and optimization framework.

    Capgemini can help you derive the benefits of cloud computing to support cloud-native apps and drive greater business agility across many areas. We provide the opportunity to scale and operate applications agnostic of the prevailing infrastructure, additionally enabling complete monitoring, measurement, and control of your containers. Our end-to-end multi-cloud managed service includes turnkey solutions and flexible pricing models. Discover more here.

    Simplifying the Letter of Credit issuance through Trust

    Capgemini
    March 25, 2021

    In global trade, the letter of credit (LC) is a commonly used trade finance instrument to ensure payment for goods and services. LCs are used in 11–15% of global trade, accounting for over a trillion dollars per year.

    A typical LC process involves the following parties:

    • Applicant – importer who initiates the LC process
    • Issuing bank – bank that issues the LC for the applicant
    • Beneficiary – exporter for whom the LC is being issued
    • Advising bank – bank for the beneficiary, located in the country of beneficiary.

    Challenges in the current process

    • Process latency: With multiple stakeholders at different locations, delivery of documents via courier or other means consumes a lot of time. This results in higher turnaround time at each step of the process, from initiation to LC submission.
    • Security: As the documents pass through multiple touchpoints, they can be tampered with or lost in transit.
    • Fraudulent activity: Once the bank documents are handed over, the applicant can alter them or submit bogus documents.
    • Lack of trust: As there is no clear visibility in the process, this can lead to loss of trust between parties.
    • Restricted visibility: Since there is no real-time visibility throughout this long process, the buyer and seller face a black box when trying to find out where a delay occurred.

    To overcome these challenges, it is essential to develop a simplified, trust-based letter of credit issuance solution, which raises the following questions:

    • What are the typical steps required to build a solution for a system based on trust?
    • What components are required to create a system based on trust?
    • What service provider should be chosen to develop the solution?
    • How should the solution scale up as more personas are added to the system?
    • How should the system recognize and authorize its partners?
    • What are the levels of digitization to be considered (for example digitizing a bid or contract)?
    • How should one enable a “need-to-know basis access” feature?
    • How should existing applications be integrated?

    The solution:

    Capgemini has vast experience in building solutions based on trust. An accelerator built for LC issuance consists of components built on the Hyperledger Fabric framework, an access control mechanism that provides access only on a need-to-know basis, and a standard set of RESTful APIs for easy integration with existing applications.

    An add-on feature lets the applicant run a bidding process and choose a single bank from among multiple banks. This helps the applicant get the best rate and terms on the LC. Below is a short illustration of the accelerator in action:

    Benefits of the Capgemini accelerator:

    • Processing speed: Increased by almost 90% through real-time, digital tracking of documents.
    • Security: All documents and transactions are encrypted and stored on an immutable ledger, referenced in a chain of interlinked blocks (see the sketch after this list). Individual transactions within a channel are restricted by attribute-based access control (ABAC).
    • Fraud: Only transactions that comply with the consensus configured for the network are committed to the blocks. These interlinked blocks are nearly impossible to tamper with, eliminating fraudulent activity.
    • Trust: Access to the documents is controlled by certificate-based authentication. The documents can be retrieved directly from the immutable ledger, ensuring authenticity.
    • Visibility: Provides end-to-end, real-time visibility of the complete process and helps in identifying delay points. This enables faster reporting and decision making.
    • Cost reduction: Reduces operational costs by 25%–30% by eliminating duplicated effort on data validation and reconciliation.
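    Why interlinked blocks are so hard to tamper with can be shown with a toy hash chain in Python. This is a simplification for illustration, not Hyperledger Fabric itself (Fabric chaincode is typically written in Go or Node.js), but the linking principle is the same:

        import hashlib, json

        def block_hash(block: dict) -> str:
            return hashlib.sha256(
                json.dumps(block, sort_keys=True).encode()).hexdigest()

        # Each block stores the hash of its predecessor, interlinking the chain.
        chain = [{"prev": "0" * 64, "tx": "LC issued for applicant A"}]
        chain.append({"prev": block_hash(chain[0]), "tx": "documents submitted"})
        chain.append({"prev": block_hash(chain[1]), "tx": "payment released"})

        # Tampering with an early transaction breaks every later link.
        chain[0]["tx"] = "LC issued for applicant B"
        print(block_hash(chain[0]) == chain[1]["prev"])  # False -> tamper detected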

    Capgemini, a trusted partner for building trusted networks

    Capgemini has an ecosystem to help businesses improve their LC issuance processes to achieve better compliance and speed, and it has reference roadmaps for developing a trust network for business processes. Its team of domain experts can refine the business processes and, with the help of its cloud-native and blockchain experts, Capgemini’s accelerators play a major role in performing assessments, building target architectures, and creating scalable designs, configurations, adaptors, and reusable code patterns. Capgemini has models that can help scale up the solutions to enterprise level, and their benefits have been realized at PoC level. Blockchain service provider alliances enable Capgemini to get the assistance required to build end-to-end solutions for its clients.

    Conclusion

    Any process that involves multiple stakeholders and requires data immutability, data transparency, trust, automated business logic, and system resiliency is a good fit for blockchain. Any business looking to upgrade such processes can save up to 20% by developing solutions of this kind.

    This blog is co-authored by Ganesh Prabhu.

    So you want to be a Data Rock star?

    Zhiwei Jiang
    March 24, 2021
    By Zhiwei Jiang, CEO, Insights & Data and Ron Tolido, CTO, Insights & Data, Capgemini

    Fine. We all agree that “data is the new oil” is getting a bit worn out. ‘Smart energy’ works a lot better as a metaphor. It’s settled. Now, about that other cliché in the IT industry: “Data Scientist is the sexiest job on Earth.” Is it really still better than being the lead singer in a band?

    You wouldn’t necessarily say so, if you look at recent publications. AI and do-it-yourself tools are coming, rendering the role of a Data Scientist potentially obsolete through intelligent automation and through augmenting power users at the business side. And then there is a quickly growing number of off-the-shelf analytical and AI solutions as well, ready to be used by just adding water: your own data.

    If we look at the latest projections of the World Economic Forum in their Future of Jobs report, we see – of course – the evolving need for skills in the technology field, combined with critical, analytical thinking and creativity, and – arguably due to COVID-19 – active learning, resilience, stress tolerance, and flexibility. And look at the top three job roles in demand: Data Analysts and Scientists, AI and Machine Learning Specialists, and Big Data Specialists. Without diving into what exactly might be the difference between these job roles (or why ‘big data’ is still a thing, for that matter), it is obvious the outlook is not so bad for anybody setting their mind on being an ace in data.

    But it’s gonna take time, a whole lot of precious time. And a very particular set of skills that have quickly been changing positions. Here’s our take on the top 7 skills we believe need to be mastered to become the leader of the data band. We have combined it with some tips for entertainment – safely to be enjoyed at home, of course – to get you inspired.

    1.     You do the math

    Yes, AI will definitely assist and augment more and more in the heavy lifting of data science and analytics. But having a deep affinity with algorithms and logic, even when they go into new areas (such as deep learning and reinforcement learning) is key. You need to appreciate what is going on under the hood in order to make informed decisions about which ways, approaches, and tools to use for the problem at hand. After all, a fool with a tool is still a fool. You may not want to go as hardcore math as Alan Turing, but watching The Imitation Game is always illustrative.

    2.      OK Computer

    Admittedly, in the end it is all about the outcomes the data-powered enterprise creates. Technology is only a way of achieving that. But every business is now a technology business: it’s technology that brings us a surge in real-time data points from so many more sources; it provides us with the means to collect, store, integrate, and analyze them. Technology enables us to visualize insights at any point of action and take intelligent, automated action. Not everybody needs to become a nerd like the characters in the IT Crowd series, but you don’t want to be a Jen Barber either.

    3.     I see you

    The WEF puts emphasis on problem-solving and analytical skills in its outlook. And rightfully so; it’s a complex technology business world that needs a solid IQ level. But it is also a human world, and humans aren’t algorithmic, data-driven, intelligently automated beings. Understanding the problem is more crucial than solving it, so you had better work on upping your EQ – building empathy, conversational capabilities, and the ability to respectfully balance the objectives of being data-powered and being human. Jada Pinkett Smith’s Red Table Talk will bring you a healthy shot of EQ.

    4.     Let’s get down to business

    In a technology business, the best use of data is typically made far from central IT and data management, right in the middle of the business. In order to thrive there, your sector or domain knowledge needs to be nothing less than substantial. And then, when you master that sector or domain, make sure you have your own list of relevant external and open datasets – and increasingly also algorithms – to bring into every new project. It will be the litmus test for your industry insight. Want to get a flavor of a domain, e.g., Marketing? Mad Men is our killer recommendation.

    5.     What’s your story?

    Few things are more difficult to bear than a poorly understood, unappreciated Data Scientist. Still, it happens regularly that even the most imaginative, deeply smart insights and predictions do not land in the business, let alone get acted upon. Cold hard facts do not often do the trick, you see. Proper storytelling and visualization skills are needed to tempt your clients into the data-powered journey. Have a look at the Fargo series and notice how it comes back again and again with another compelling “true story”: bingeworthy avant la lettre.

    6.     Just be good to me

    Being “data-powered” seems tempting and rewarding, but pronounced in a certain way, it suddenly sounds eerie. It’s up to every practitioner to seriously understand the ethical considerations of data and AI, and then live and breathe them every single day. Data the good way, data for good purposes. There is no shortcut, no workaround. Stay tuned, as very soon we will introduce you to our code of seven principles for ethical AI. Want to see an all-too-relevant ethical dilemma evolve? The Circle with Emma Watson and Tom Hanks brings you right there.

    7.     I can do it in the mix

    Want to “Be Like Water” – ultra-agile, ultra-adaptive, and ultra-responsive? You’re going to need tightly integrated, multi-disciplinary teams that rapidly bring solutions to operations. No time for silos or egos. Software development brought us DevOps; DataOps was the response of the data community. And it’s only the beginning, as all the skills mentioned here must go into the same cocktail, both within the team and within the individual team members. Specialization is bliss. Fusion is better. Ask Debbie Ocean about multi-skill teams and see how that works out in Ocean’s Eight.