
Let’s reinvent the onboarding experience

Dr. Sandra Duesing
16 Jan 2023

Explore Capgemini’s HR Cloud Advisory approach to support the optimal onboarding experience as part of “Experience Excellence in HR”

The current HR Cloud market is broad and offers a variety of features and functionalities that can be difficult to keep track of. Regardless of the type of system (e.g., HR suite, HR enterprise platform, or experience solution), decisions should be taken with a focus on people’s experience and Experience Excellence in HR, without losing sight of the “Experience Excellence Value Case”: the optimal balance between the HR automation ambition, HR IT investments, and the desired level of people experience and seamless workflow design. Careful consideration of all business, people, HR, and IT-related requirements is essential to ensure that the right solution is selected, that the configuration succeeds, and that it fits the company’s business case and experience culture.

HR Cloud selection and implementation

To take the right decisions, we recommend starting with a structured vendor selection approach that ensures the selected HR Cloud system best fits the company’s budget, HR products and service delivery ambition, process requirements (e.g., ET&C and legal compliance), and the desired people experience and ways of working. A cloud readiness assessment stream then ensures that the company enters the subsequent cloud implementation with all necessary preparations and requirements in place. To achieve this, the targeted HR IT strategy and vision must be considered in detail upfront, defining the organization’s future HR IT landscape ambition and the related optimization and investment decisions to be taken.

For a successful cloud implementation, it is also crucial to empower the HR team, employees, and managers as new processes are rolled out, to carefully manage and moderate political obstacles, and to proactively promote a shared future cloud mindset. Our dedicated cloud advisory approach, combined with agile program and change management capabilities, is tailored to suite, enterprise, and experience solutions and their associated vendors. Developed by vendor-agnostic HR Cloud advisory and systems experts, it enables customers to take the right decisions from both technical and functional perspectives. Handled in this way, organizations can realize major benefits in seamless workflow and process design, HR IT architecture optimization, and the defined Experience Excellence Value Case.

Rethink the onboarding experience

But what does all this mean in concrete terms? Let’s dig into one prominent example our clients frequently raise as a major area of concern: large potential for experience-driven HR IT optimization is often identified in the onboarding process. Onboarding covers how organizations integrate new employees and prepare them for their new roles. When the process is streamlined in our experience-driven way, we directly see project outcomes such as improved retention rates, employee engagement, and productivity as return on investment. But how? As part of our HR IT onboarding redesign, end-to-end processes, structures, and responsibilities are reviewed and redefined in an experience- and user-journey-driven way, considering all external and internal stakeholders involved, such as external workers, recruiters, hiring managers, and IT employees. This is what makes onboarding an increasingly complex case for HR IT landscape optimization.

Almost all major HR management software platforms offer their own Onboarding modules. In this blog series, we take a closer look at the offerings from vendors of the suite solutions Oracle, SAP and Workday, as well as the enterprise solution, ServiceNow. Furthermore, we will offer an approach that leads to the optimal Onboarding experience, covering three Onboarding phases leveraged by HR Cloud technologies.

Our next blog, #1 in the series, takes a closer look at how HR Cloud technologies can help create the decisive factor for employees to stay.

Interested? Looking forward to reconnecting! 

Part of Capgemini’s HR CloudWatch Blog, hosted by Dr. Sandra Duesing, Vice President of Capgemini Invent Germany, and the Global Head of “Reinventing HR.”

Our authors

Dr. Sandra Duesing

Vice President in Workforce & Organization and the Global Head of Reinventing HR | Capgemini Invent
As Capability Lead Workforce & Organization at Capgemini Invent, with a dedicated focus on Experience Excellence in HR & HR IT, I am passionate about re-imagining work and unlocking the underlying human potential to drive digital transformation journeys for business and society successfully.

Svenja Stegemann

Senior Consultant in Workforce & Organization | Capgemini Invent

Anne Geiter

Consultant in Workforce & Organization | Capgemini Invent

    What to expect from cybersecurity in 2023

    Geert van der Linden
    20 Dec 2022

    Rising geopolitical tensions, mass digitalization, more hybrid working, and a skilled labor shortage. As we enter 2023, it goes without saying that cybersecurity teams have a lot on their plate, and you’d be forgiven for feeling we live in an age of permacrisis. While a new era of almost limitless connectivity is changing the way we live, work and produce, organizations must adapt quickly or risk significant costs.

    In response, more organizations are waking up to the value of cybersecurity investment. This is reflected in global spending which Gartner estimates could be as high as $1.75 trillion by 2025. This year it was approximately $172 billion and, in some areas like data analytics, investment is paying off. Security teams are becoming increasingly effective at proactively detecting and mitigating cyber threats, with the added power of data and automation also playing more of a role.

    Nonetheless, the scope of cyber breaches continues to grow, and malicious actors continue to evolve, as do their targets. Today, a car manufacturer should be just as concerned about a supplier, or its equipment, being infected with malware as about a malfunctioning part. Such ever-growing complexity calls for a mindset change. With the typical IT team in an enterprise with revenue between $150M and $500M numbering only 11 people, it is virtually impossible to monitor and analyze everything. Employees remain the most vulnerable targets, and as a result they need to be just as alert to the risk of starting fires as the firefighters themselves.

    Here’s a look at some of the key trends in 2023:

    The end of perimeter and the rise of zero trust

    Traditionally, cybersecurity has been framed as an ongoing battle between hackers and criminals on the outside, and security experts on the inside. It is easy to frame organizations as closed shops and this narrative is reflected in popular culture. However, the reality is much more complex.

    The pandemic changed working patterns and a hybrid approach has become the norm for many businesses; employees are just as likely to be working from another country as they are from the office. At the same time, data is flowing outside of traditional closed networks and into the cloud, while the 5G-powered Internet of Things (IoT) means that equipment is too. Hospitals, for instance, are increasingly using connected medical devices for patient care, and yet one report found that over half of internet-connected devices used in hospitals have a vulnerability that could put patient safety, confidential data, or the usability of a device at risk. This, in some cases, can be life threatening, and it is why the end of perimeter security must be followed by ‘zero trust’.

    Zero-trust security is exactly what it sounds like: don’t trust anyone when it comes to cybersecurity. Whether CEO or intern, every user is guilty until verified and must be granted access every time they pick up their tools, eliminating any room for doubt and allowing for better monitoring of unusual behavior. Zero trust is crucial to enabling digitalization and the cloud to thrive; it is no coincidence that Gartner reports that zero trust network access will remain the fastest-growing segment in network security, with growth of 36 percent in 2022 and 31 percent in 2023.
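
    To illustrate the principle, here is a minimal sketch of per-request, zero-trust access evaluation. The request fields, checks, and thresholds are illustrative assumptions, not any specific vendor’s policy engine.

    ```python
    # Minimal sketch of zero-trust, per-request access evaluation.
    # Field names, checks, and thresholds are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Request:
        user_id: str
        mfa_verified: bool        # identity re-verified for this request/session
        device_compliant: bool    # device posture: managed, patched, encrypted
        resource: str
        context_risk: float       # contextual risk signal, 0.0 (low) to 1.0 (high)

    def is_access_granted(req: Request) -> bool:
        """Every request is evaluated on its own merits - no implicit trust
        from network location or seniority."""
        if not req.mfa_verified:
            return False
        if not req.device_compliant:
            return False
        if req.context_risk > 0.7:
            return False          # unusual behavior: deny and flag for review
        return True

    print(is_access_granted(Request("ceo", True, True, "hr-payroll", 0.2)))      # True
    print(is_access_granted(Request("intern", True, False, "hr-payroll", 0.2)))  # False
    ```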

    Zero trust is not an overnight fix but a multiyear journey, depending on the amount of legacy infrastructure involved as well as the requirements of the industry, which is why we anticipate that 2023 will be the year when more organizations embed it. While some industries, like finance, are already close to or at zero trust, others like automotive and healthcare are not. To stabilize and tighten security frameworks beyond network zoning, it’s imperative that every vertical moves towards it.

    5G security gets hot

    The introduction of 5G into the digital ecosystem means that almost anything can be connected to the internet. It adds IoT into the ecosystem alongside IT and OT, where the product itself becomes a point of vulnerability. Whether it’s cars, washing machines, or factories, 5G is transformative and the foundation of Intelligent Industry.

    5G security will take off in 2023, boosted by businesses migrating to the cloud, and so its security architecture – with data flowing between organizations and telcos – will come under the spotlight. In tandem with recognizing the benefits of 5G-powered connectivity, leaders must make security a board-level priority. Without doing so, it will be difficult for organizations to overcome these challenges, educate their employees and vendors, and streamline communication between cybersecurity teams and decision makers.

    Supply chain vulnerabilities require DevSecOps

    As more specialist connected devices are manufactured, threat actors are focusing on vulnerabilities further down the supply chain, such as the specialist manufacturer of a connected car part. With these attacks only intensifying as geopolitical aggressions on intellectual property and influence increase, we can expect – and require – security to be embedded at the stage of development.

    Security by design requires the convergence of development, security, and operations teams, with the goal of automating security at every phase of the software development lifecycle; applied end-to-end, this reduces effort and costs and improves compliance. This is called DevSecOps, and it will be crucial to meeting 2023’s requirement to do more with less. If we fail to do so, the serious implications of not embedding security early on will continue to hit critical sectors such as healthcare, automotive, energy, and even agriculture more frequently.

    Bank on data, not AI

    There is a saying about waiting ages for a bus, only for several to arrive at once. Such is the expectation drummed up about the capabilities of non-human software to resolve our woes, but don’t bank on the bus arriving in 2023. While there’s no doubt that AI and automation technology will continue to advance in capability, it’s not advancing at the rate many would hope. Instead, next year, data analytics and mining will take greater prominence.

    Both will be critical to relieving some of the pressure on IT teams. A study by Capgemini’s partner IBM found that 67% of cybersecurity incident responders say they experience stress and/or anxiety in their daily lives, with an alarming 65% seeking mental health assistance as a result of responding to cybersecurity incidents. Pressure has become part of the status quo in cybersecurity, and this is a global problem. By better harnessing data, teams can deliver better insights and correlations on attack trends, while forecasting future attacks.

    Hyperscalers race ahead

    Finally, worldwide spending on cloud is expected to reach $1.3 trillion by 2025 as more and more businesses migrate. At the same time, 79% of companies experienced at least one cloud data breach in the last 18 months, which is shining a spotlight on hyperscaler security. The added value and integrations of platforms like Microsoft Azure and Amazon Web Services are significant, and this puts more pressure on smaller security providers, who will continue to lose market share in the year ahead. But next year, the hyperscalers will be busy proving they are able to deliver secure cloud environments as part of the package. Businesses need to be able to move into the cloud with confidence, and for SMEs especially, affordability is crucial.

    Although there is little sugar-coating the scale of the challenges, there is room for hope in 2023. Investment is continuing to rise, even within the context of global inflation, and capabilities are advancing. The security environment can feel overwhelming, and more skilled workers are required to alleviate the tensions, but advancements in data analytics are already proving their worth. The sooner businesses can harness them while embedding a security mindset across all levels – with suppliers and employees – the more likely it is that next year will be a transformative period for the security industry.

    Contact Capgemini to understand how we are uniquely positioned to help you structure cybersecurity strength from the ground up. 

      Intelligent HR operations – drive amazing people experiences

      Capgemini
      19 Dec 2022

      Enterprises are increasingly relying on service providers to overcome the challenge of delivering intelligent, frictionless HR operations that drive enhanced, more personalized people experiences.


      If the global pandemic has taught us anything, it is the need to provide an irresistible and amazing people experience. Indeed, investment in the data, services, analytics, and tools to boost employee empowerment and engagement has had an extremely high return during this turbulent period.

      According to a survey of senior HR leaders, 79% see an acceleration of digital transformation in their organizations due to the pandemic, while 96% see the role of HR shifting from being an administrative service provider to concentrating more on designing employee experiences and satisfaction, acting as change agents, and developing talent.

      The HR function must keep up with this pace of digital transformation and disruptive business practices in order to deliver on growth, but is being impacted by a number of challenges, including labor shortages and the changing expectation of employees working from home.

      Unique APAC challenges for HR

      We only need to look at the APAC market, and its unique specificities, to understand the level of these challenges compared to other regions. Up to 78% of the Asian workforce was working in person until 2020 (pre-pandemic), a much higher share than in the US and EMEA, making the move towards hybrid working more challenging.

      Highly diverse work cultures, a lack of strong tech infrastructure, and a high proportion of blue-collar workers in APAC have also contributed to this slow adoption of the hybrid work model in the region. In addition, APAC’s highly fragmented and diversified market, shaped by a myriad of languages, local regulatory requirements, and varied ways of working, is making it challenging to drive standardization.

      Moreover, a recent survey by Korn Ferry states that APAC faces an imminent labor shortage of 47 million people and $4.238 trillion in unrealized annual revenue across the region by 2030. With the war for talent becoming increasingly competitive as employees prioritize experience over pay, and with skillsets continuously changing, HR leaders have started to look at the gig economy to provide the greatest flexibility without hitting the bottom line.

      Another important report highlights the focus on HR tech modernization in APAC, with 89% of respondents preferring to implement people analytics solutions but only 38% believing their organizations are ready. The main reasons for this sluggish adoption are the diverse nature of the APAC region, differing levels of economic maturity, multiple HCM systems, complex data transmission, a shortage of the right talent, inadequate systems and processes, budget limitations, and difficulty in securing executive buy-in.

      People-centric, frictionless HR operations

      These challenges and priorities are defining a new future state for HR shared services and outsourcing. For the first time, “focus on core business outcomes” is the most important driver for companies, with “cost” falling to second place. This shows that now, more than ever, companies view HR outsourcing and transformation as a strategic driver of business value creation through innovation and differentiation.

      Given this shift, organizations are now looking for transformation partners who can proactively respond to regulatory changes and market shifts, while bringing cutting-edge HR solutions to help with organizational and people challenges.

      Today’s CHROs and CXOs now need to focus on standardizing and automating their employee processes to create consumer-grade people experiences. All of this comes after designing and executing an efficient, end-to-end service delivery model driven by intelligent, data-driven, frictionless HR operations that seamlessly connect people, processes, and technology.

      To learn how Capgemini’s Intelligent People Operations can drive a personalized and frictionless people experience across your organization, contact: ajay.chhabra@capgemini.com or rashmeet.kaur@capgemini.com

      About authors

      Ajay Chhabra, Practice Leader – APAC, Intelligent People Operations, Capgemini’s Business Services

      Ajay Chhabra

      Practice Leader – APAC, Intelligent People Operations, Capgemini’s Business Services
      Ajay Chhabra leads Capgemini’s Intelligent People Operations practice for APAC with a specific focus on HR transformation and advisory. With over 17 years of professional experience, Ajay is passionate about solving clients’ HR and payroll challenges through consulting, transformation, and innovative solutions.
      Rashmeet Kaur, Team Lead, Intelligent People Operations, Capgemini’s Business Services

      Rashmeet Kaur

      Team Lead, Intelligent People Operations, Capgemini’s Business Services
      Rashmeet Kaur is a team lead with Capgemini’s Intelligent People Operations practice. She has worked on projects in different industries involving strategy, advisory, & consulting, HR transformation, and shared services setup.

        Monte Carlo: is this quantum computing’s killer app?

        Capgemini
        16 Dec 2022

        As the quantum computing revolution unfolds, companies, start-ups, and academia are racing to find the killer use case

        Among the most viable candidates and strongest contenders are quantum computing Monte Carlo (QCMC) simulations. Over the past few years, the pace of development has certainly accelerated, and we have seen breakthroughs, both in hardware and software, that bring a quantum advantage for finance ever closer.

        • Roadmaps for hardware development have been defined and indicate that an estimated quantum advantage is within 2–5 years’ reach. See for example IBM and IonQ, who both mention 2025 as a year when we can expect the first quantum advantage.
        • End-to-end hardware requirements have been estimated for complex derivatives pricing at a T-depth of 50 million, and 8k qubits. Although this is beyond the reach of current devices, simple derivatives might be feasible with a gate depth of around 1k for one sample. These numbers indicate that initial applications could be around the corner and put a full-blown advantage on the roadmap. Do note, however, that these simple derivatives can also be efficiently priced by a classical computer.
        • Advances in algorithmic development continue to reduce the required gate depth and number of qubits. Examples are variational data loaders, or iterative amplitude estimation (IAE), a simplified algorithm for amplitude estimation. For the “simple derivatives,” the IAE algorithm can run with around 10k gates as opposed to 100k gates for 100 samples with full amplitude estimation.
        • There is an increasing focus on data orchestration, pipelines, and pre-processing, readying organizations for adoption. Also, financial institutions worldwide are setting up teams that work on QCMC implementation.

        All these developments beg the question: what is the actual potential of quantum computing Monte Carlo? And should the financial services sector be looking into it sooner rather than later? Monte Carlo simulations are used extensively in the financial services sector to simulate the behavior of stochastic processes. For certain problems, analytical models (such as the Black-Scholes equation) are available that allow you to calculate the solution at any one moment in time. For many other problems, such an analytical model is just not available. Instead, the behavior of financial products can be simulated by starting with a portfolio and then simulating the market behavior.

        Here are two important examples:

        • Derivatives pricing: Derivatives – financial products that are derived from underlying assets – include options, futures contracts, and swaps. The underlying assets are treated as stochastic variables that behave according to some distribution function. To price derivatives, the behavior of the underlying assets has to be modelled (a minimal pricing sketch follows this list).
        • Risk management: To evaluate the risk of a portfolio, for example interest rates or loans, simulations are performed that model the behaviour of the assets in order to discover the losses on the complete portfolio. Stress tests can be implemented to evaluate the performance of the portfolio under specified scenarios, or reverse stress tests can be carried out to discover scenarios that lead to a catastrophic portfolio performance.
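
        To ground the idea, here is a minimal classical Monte Carlo sketch that prices a European call option under geometric Brownian motion (Black-Scholes dynamics). The parameter values are illustrative assumptions; real pricing and risk engines are far richer.

        ```python
        # Minimal classical Monte Carlo pricing of a European call option under
        # geometric Brownian motion. Parameter values are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(42)

        S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0  # spot, strike, rate, vol, maturity
        n_samples = 1_000_000

        # Simulate terminal prices in one step (exact for GBM), then discount payoffs.
        Z = rng.standard_normal(n_samples)
        ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
        payoffs = np.maximum(ST - K, 0.0)

        price = np.exp(-r * T) * payoffs.mean()
        stderr = np.exp(-r * T) * payoffs.std(ddof=1) / np.sqrt(n_samples)
        print(f"MC price ~ {price:.3f} +/- {1.96 * stderr:.3f} (95% confidence)")
        ```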

        Classical Monte Carlo simulations require on the order of (1/ε)^2 samples, where ‘ε’ is the required accuracy (the width of the confidence interval). For large cases, this quickly becomes prohibitive: with an accuracy of 10^(-5), billions of samples are required. Even if workloads are parallelized on large clusters, this might not be feasible within an acceptable runtime or at an acceptable cost. Take for example the start of the Covid-19 crisis. Some risk models looking at the impact of Covid on worldwide economies would almost certainly have taken months to build and run, and it is likely that before completion the stock market would have dropped 20%, making the modelling irrelevant.

        Quantum computing Monte Carlo promises, in theory, a quadratic speedup over classical systems. Instead of (1/ε)^2  iterations on a classical system, (1/ε) iterations on a quantum computer would attain the same accuracy. This means that large risk models that take months to complete may become feasible within just hours.
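
        The back-of-the-envelope sketch below makes that scaling concrete by comparing the roughly (1/ε)^2 classical sample count with the roughly (1/ε) iteration count promised by quantum amplitude estimation; constant factors and hardware overheads, which the next paragraph warns about, are deliberately ignored.

        ```python
        # Back-of-the-envelope comparison of classical Monte Carlo sample counts
        # (~(1/eps)^2) with the ~(1/eps) iterations suggested by quantum amplitude
        # estimation. Constant factors and quantum hardware overheads are ignored.
        def classical_samples(eps: float) -> float:
            return (1.0 / eps) ** 2

        def quantum_iterations(eps: float) -> float:
            return 1.0 / eps

        for eps in (1e-2, 1e-3, 1e-5):
            c, q = classical_samples(eps), quantum_iterations(eps)
            print(f"eps={eps:.0e}: ~{c:.0e} classical samples vs ~{q:.0e} quantum "
                  f"iterations ({c / q:.0e}x fewer)")
        ```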

        Unfortunately, it’s never as easy as it seems! Although sampling on quantum computers is quadratically faster, a large overhead could completely diminish any quantum speedup. In practice, expressing a market model as quantum data seems extremely difficult. There are a few workarounds for this problem, such as the data loaders announced by QCWare, or a variational procedure published by IBM, but it remains to be seen whether these work well on real problems.

        However, if quantum hardware and software continue to develop at their current pace, we can expect some very interesting and valuable uses for quantum Monte Carlo applications. A business case can easily be made, because if  QCMC improves risk management simulations, then the reserved capital required by compliance regulations could be reduced, freeing up capital that can be used in multiple other ways.

        Furthermore, the derivatives market in Europe alone accounts for a notional €244 trillion. A slightly inaccurate evaluation of this market could lead to a large offset to its actual value, which in turn could lead to instability and risks. Given the huge potential for derivative pricing and risk management, the benefit of significant and deterministic speedups, and an industry that is fully geared up to benefit from quantum, QCMC seems to be one of the killer applications.

        However, before QCMC works successfully in production, a lot of work remains to be done. As with any application, proper data pipelines need to be implemented first. The time series required for risk management need to be processed for stationarity, frequency, and time period. If policy shifts to daily risk management, data streams also have to be kept up to date. And if a quantum advantage is to be benchmarked, its classical counterpart must be benchmarked too. Additional necessary developments, such as building the required infrastructure (given the hybrid cloud nature of quantum applications), its relation to compliance regulations, and security considerations, are still in their early stages.

        Given the huge potential of quantum computing Monte Carlo, a number of pioneering financial services companies have already picked it up; Wells Fargo, Goldman Sachs, JP Morgan Chase, and HSBC are well established in their research into using QCMC or its subroutines. Certainly, these front runners will not be late to the quantum party, and they will be expecting to see benefits from these exploratory POCs and early implementations, likely in the near future.

        Deploying algorithms in productionized workflows is not easy, and it is even more difficult when a technology stack is fundamentally different. But, these challenges aside, if the sector as a whole wants to benefit from quantum technology, now is the time to get curious and start assessing this potential killer app.

        First published January 2021; updated Nov 2022
        Authors: Camille de Valk and Julian van Velzen

          Digital security and quantum computing: An essentially paradoxical relationship

          Capgemini
          14 Dec 2022

          All of our online actions today are governed by a set of cryptographic rules, allowing secure exchanges between different parties; but a new threat is looming.

          While we are impatiently waiting for the emergence of quantum technologies, which will bring major technological progress, the internet is preparing for the day when quantum computers will be able to decrypt our secure communications, called “Q-Day” by some.

          But, while quantum hardware is not yet mature enough to decrypt the algorithms currently used, our data is already at risk from hackers, who accumulate encrypted data in order to decrypt it in the future. So, how do we prepare for this eventuality and how has this threat, as of today, changed the way we think about this relationship between cybersecurity and quantum computing?

          Quantum computing is a scientific technological evolution. Conversely, digital security is a concept: in essence, it can be interpreted differently depending on its uses, which are sometimes competing (Read more on this subject from B. Buzan and D. Batistella). Exploring the relationship between quantum computing and digital security can therefore generate paradoxical discourse.

          The paradox of quantum progress: threat or opportunity for security?  

          Investment in quantum technologies is booming, and patent publications have grown exponentially over the last 10 years. The emergence of numerous venture capital companies in the ecosystem also demonstrates the market’s interest in these technologies. The exceptional computing power of quantum computers will enable numerous use cases, but it will also offer cybercriminals tools to deconstruct the security systems already deployed.

          For example, via a quantum implementation of Shor’s algorithm, it will be possible to break current cryptographic protections, which rely mainly on the practical impossibility of factoring numbers that are products of large primes in a reasonable amount of time. Therefore, the arrival of the quantum computer, and its use by networks of cybercriminals, will mean a significant risk for companies. Data requiring long-term storage (biometrics, strategic building plans, strategic IP, etc.) could then be easily accessible to malicious entities using quantum computing power.
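
          As a toy illustration of why this matters (deliberately tiny numbers, nothing like real key sizes), the sketch below shows that an RSA-style private key falls out immediately once the public modulus is factored; Shor’s algorithm is precisely what would make that factoring step efficient on a large enough quantum computer.

          ```python
          # Toy RSA-style example: security rests entirely on the hardness of
          # factoring n = p * q. Numbers are absurdly small, for illustration only.
          from math import gcd

          p, q = 101, 113                # secret primes
          n, e = p * q, 3                # public key: modulus and exponent
          phi = (p - 1) * (q - 1)
          assert gcd(e, phi) == 1
          d = pow(e, -1, phi)            # private exponent (needs p and q)

          msg = 42
          cipher = pow(msg, e, n)        # anyone can encrypt with the public key

          def factor(n: int) -> tuple:
              """Brute-force factoring: trivial here, infeasible for 2048-bit moduli
              on classical hardware - but efficient with Shor's algorithm."""
              f = next(i for i in range(2, int(n**0.5) + 1) if n % i == 0)
              return f, n // f

          p2, q2 = factor(n)
          d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
          print(pow(cipher, d2, n))      # 42 - plaintext recovered by the attacker
          ```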

          These threats call into question the creation of a digital space known as “trusted.” The risk is therefore too great to stand still in the face of this quantum revolution, and the launch of initiatives is necessary, as soon as possible, to identify the risks associated with the various activities of the company, and to reassure on a large scale.

          At the same time, new technologies promising quantum-safe security are emerging. First, new mathematically based cryptography is being developed that is designed to remain “unbreakable” even by quantum computers. According to the viewpoint published by ANSSI on January 4, 2022, the National Institute of Standards and Technology (NIST) has been analyzing “post-quantum cryptography” (PQC) algorithms since 2016 to strengthen the defenses of the virtual world. The resulting standardization recommendation is expected to be published by 2024.

          Second, physics-based cryptography, called quantum key distribution (QKD), is emerging as an alternative for ultra-secure communications. While post-quantum cryptography is based on the premise that no efficient (quantum) algorithm has been found to “break” it, QKD technology relies on physical principles to detect the interception of secure keys. In case of interception, these keys can be regenerated until the parties are certain that the communication is secure.

          This standardization of post-quantum algorithms is organized around a competition run by NIST to analyze different algorithms and keep the most efficient ones. In this competition, 69 candidate algorithms were submitted in 2017, of which only 7 reached the third round of qualification in July 2020. The algorithms selected in this round are based on problems for which the best known solution methods take exponential time, and which are expected to remain hard to solve even for quantum computers. They are concentrated around “lattice” problems (based on finding the shortest vector in a lattice of points) as well as “code-based” algorithms (based on the decoding of a linear code). The results of this selection round were published on July 5, 2022.

          Finally, by drastically accelerating the learning speed of artificial intelligence and machine learning algorithms, quantum computing will also strengthen all the security systems that rely more and more on these technologies. Security operations centers (SOCs), which deploy tools to detect weak signals of attacks, and which aim to quickly identify deviant behavior in networks and uses, will be all the more effective. In all sectors, from fraud detection in banking to industrial incidents, quantum computing will increase the effectiveness of SOCs. As a result, the need for security teams to ramp up on these already visible technologies will only increase in the coming years.

          The paradox of quantum research: wait too long or act too fast?

          In all public and private organizations, the development of security teams’ expertise in quantum issues is still in its infancy. In the best of cases, the first reflections are oriented around the business benefits of quantum technologies, leaving cybersecurity issues and the concrete analysis of risks and opportunities in the background. This observation is reinforced by a lack of awareness of the subject within companies, which means limited internal training initiatives for employees, and so leaves the subject to external expert partners.

          How can we professionalize the approach to a technology that has not yet been democratized? At what point does research become real enough to trigger industrialized programs? Even as we enter the development phase of a new generation of more powerful quantum computers, the need for companies to experiment becomes essential, in order to avoid being threatened by cybercrime networks that are constantly one step ahead of the lines of defense, and more agile in absorbing new technologies and hijacking them.

          The presence of cloud leaders (Google, Amazon, Microsoft, IBM, etc.) in this market allows a democratization of, and relatively easy access to, these technologies. The availability of quantum resources, via platforms already implemented by their customers, creates very fertile ground for various experiments. But how do you secure sensitive information from these projects when using external and shared devices, which is the very basis of the cloud model? So-called “blind quantum computing” ensures that even the owner of the quantum computer is unable to know what computations users are performing on it. However, while this can have great applications in terms of privacy protection, there is, in return, a risk of losing any insight into the users’ intentions.

          Waiting too long or acting too fast? The answer is not simple and will come from a collective movement, driven by the construction of large training programs in quantum engineering (certification courses, allocation of research budgets, etc.), or the pairing between organizations and start-ups.

          The geopolitical paradox: a universal issue or a question of sovereignty?

          All scientific revolutions end up becoming universal, one way or another. They impact societal structures, means of production, and, de facto, citizens; the quantum revolution is no exception to the rule.

          However, it is clear that it is the subject of a geopolitical confrontation. Like digital technology, quantum computing is a field of economic and political competition between states. Some states invest more than others, implying two phenomena: a competitive advantage in the context of the market economy around quantum computing that will be created in the coming years, and a use for national intelligence purposes.

          China is well on its way to becoming the leader in quantum technologies, especially in the field of communications, with an estimated investment volume of 10 billion euros. It has the largest QKD quantum communication network, with satellites and a fiber optic network capable of communicating over 4600 km. This is a strategic project that aims to protect its commercial and military communications from intrusions.

          France, for its part, aims to become a leader on a European scale, with a plan to invest 1.8 billion euros over five years, announced in January 2021. With a very dynamic ecosystem of start-ups and academics around the Saclay plateau, France is investing in ways to facilitate meetings between academic experts and industry. Examples include Pasqal, which is developing its offer (a quantum computer capable of computing at room temperature, providing the best ratio of computing capacity to energy consumed in the world) and merging with Qu&Co to create a European leader, and Alice and Bob, which is raising €27 million.

          This dynamism has led to some initial achievements, such as the French Navy’s development of the Girafe system (interferometric gravimeters for research with embarkable cold atoms), an autonomous navigation system based on quantum detection technologies that can calculate its exact position without using the American GPS network, scheduled for 2026/2027.

          On the cyber side, the inauguration of the Cyber Campus on February 15, 2022, is another example: it demonstrates a French willingness to organize cross-sectoral, public/private cooperation around major cybersecurity challenges and future innovations. It is typically the place where the challenges of the arrival of quantum computing can be discussed.

          Quantum computing then becomes a question of digital sovereignty – the idea that in a polarized world order, it will be important for states to assert a quantum power and a capacity for self-determination over their digital space.

          Quantum computing will bring intrinsic contradictions: on the one hand, universal scientific progress and the strengthening of our anti-cybercrime capacities, and on the other, the reinforcement of a geopolitical and economic conflict and a threat to digital trust.

          First published in Les Echos, 15 September 2022: Avec le quantique la sécurité numérique entre dans l’ère du paradoxe (With quantum, digital security enters the era of paradox)

          Author: Clément Brauner, Quantum Computing Lead, Capgemini Invent.
          Co-authors: Jeanne Heuré, Head of Digital Trust & Cyber, and Nicolas Gaudilliere, CTO – both of Capgemini Invent

            Persona-led platform design drives enhanced finance intelligence

            Daniel Jarzecki
            12 Dec 2022

            Customized, persona-led design drives adoption of a finance intelligence analytics platform, creates a data-driven culture, and enables a more frictionless approach to finance operations.


            According to recent Capgemini research, only 50% of organizations benefit from data-driven decision-making, while only 39% of organizations successfully turn data-driven insights into sustained competitive advantage.

            In the world of finance and accounting (F&A), this really begs the question: how can your finance function implement an analytics platform that gives its users the insights they need to unlock potential and value for your organization?

            No one size fits all

            We live in a world full of information that comes at us from almost every aspect of our lives. We’re constantly bombarded with content, most of which has no relevance to us. And we appreciate the option to personalize how we receive and store this information through adding it to our feeds and favorites.

            The F&A world works in exactly the same way. The three main personas that work with finance intelligence in your F&A function (CFO, transformation lead, and service delivery lead) need the right dashboards and metrics embedded into an easy-to-use platform that gives them the information they need to provide actionable insights at the touch of a button.

            Let’s take accounts payable invoice processing as an example, with the same input data. While a finance intelligence platform can give your service delivery leads (SDLs) visibility of the status of all open or unpaid invoices, your transformation leads (TLs) will typically be more interested in the health of the overall invoice channel mix (paper, email, e-invoice, etc.), while the CFO will only want to look at the days payable outstanding (DPO) metric.

            Furthermore, fraud risk alerts will be relevant for the CFO and your SDLs, while metrics such as on-time payment will only interest your TLs and SDLs.

            The same metric – but at a different granularity level – can be relevant to different personas. For example, a single unpaid invoice vs. the systematic problem of late payment.
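
            A minimal sketch of this persona-led mapping is shown below; the persona names, metric names, and granularity levels are illustrative assumptions rather than the actual configuration of any finance intelligence platform.

            ```python
            # Illustrative persona-to-dashboard mapping for the accounts payable
            # example above; metric names and granularity levels are assumptions.
            PERSONA_DASHBOARDS = {
                "cfo": {
                    "metrics": ["days_payable_outstanding", "fraud_risk_alerts"],
                    "granularity": "portfolio",      # trends, not single documents
                },
                "transformation_lead": {
                    "metrics": ["invoice_channel_mix", "on_time_payment_rate"],
                    "granularity": "process",        # systemic issues, e.g. late payment
                },
                "service_delivery_lead": {
                    "metrics": ["open_invoice_status", "fraud_risk_alerts",
                                "on_time_payment_rate"],
                    "granularity": "transaction",    # individual unpaid invoices
                },
            }

            def dashboard_for(persona: str) -> dict:
                """Return only the metrics and level of detail relevant to a persona."""
                return PERSONA_DASHBOARDS[persona]

            print(dashboard_for("transformation_lead"))
            ```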

            But how can you customize your finance intelligence platform for different personas and users?

            Persona-led analytics platform design

            Understanding the role and tasks of your users enables you to select the most relevant content for each persona and build customized dashboards that help them be more productive. Showing a limited number of meaningful and actionable KPIs also helps your users stay focused on the right areas.

            The more seamless the adoption of a next-generation finance intelligence platform, the closer you are to establishing a data-driven culture. And while having the business insights alone will not make your challenges go away, having the right analytics and alerts in front of you will help you make the optimal decisions you need to succeed.

            In turn, this enables you to give your customers what they want, while achieving the benefits of a truly Frictionless Enterprise.

            To learn more about how Capgemini’s Finance Intelligence can help start your journey in smart analytics and real-time, frictionless decision-making, contact: daniel.jarzecki@capgemini.com

            About author

            Daniel Jarzecki

            Expert in Digital Transformation and Innovation
            Daniel Jarzecki is a transformation director with over 19 years of experience in managing Business Services delivery teams, building successful solutions, and running transformation programs for Capgemini clients across multiple industry sectors. Daniel’s passion is to enhance business operations with data-driven insights to help clients transform and improve.

              How to safeguard and protect our global forest ecosystems?

              Pierre-Adrien Hanania
              8 December 2022

              Key takeaways on how Data & AI can play a leading role to safeguard and protect our global forest ecosystems 

              Land use – including deforestation, which releases heat-absorbing carbon into the atmosphere – accounts for 25 percent of global greenhouse gas emissions, according to the Intergovernmental Panel on Climate Change (IPCC) Special Report on Climate Change and Land. In addition to playing a crucial role in carbon sequestration (essential in a warming environment), forests are home to earth’s most diverse species, and they provide a natural barrier between natural disasters and urban zones – all of which contribute to the United Nations’ 2030 Agenda for Sustainable Development Goals (specifically, goals 3, 9, 14 and 15).  

              As part of Capgemini’s support for AI For Good, we recently gathered a range of experts from forestry research programs, startups, and business project teams to discuss how best to observe, defend, and enhance the world’s forests. These leaders shared their insights and experience with diverse technologies, all with the same goal – ensuring that the forests remain for years to come.  

              Here are three key takeaways from that conversation:  

              New observational technologies are increasing our capacities:

              Using AI, the physical labor that goes into the tedious process of analyzing imagery for forestry insights can be reduced tremendously, while improving data precision and quality. In combination with pre-existing government data and in-situ data, forestry professionals now possess high-quality tree-maps, which can then be leveraged to determine the effects climate change has on sustainable land practices, support or improve species habitat, and provide a more sustainable harvest. “AI and satellites give us the scale to be able to apply skill sets that people weren’t applying to the climate before,” said Kait Creamer, marketing manager of Overstory, a company specializing in vegetation intelligence through the use of geo-satellite imagery.  

              New observational capabilities are promising, but must be paired with defensive action:  

              “We need to know the past in order to predict the future,” argued Ms. Jonckheere, Forest and Climate Officer at the Food and Agriculture Organization of the UN (FAO). “And for this, machine learning and AI can really help.” Geo-satellite data can be used in combination with algorithms to predict the size, spread, and probability of a fire outbreak, protecting forests and also preventing the loss of life by inhabitants of rural areas.  

              In addition to fire prevention insights, AI and data can easily identify which trees on the ground are affected by invasive species, for example, the spruce bark beetle in Sweden. These insights allow professionals to visualize and manage an infestation.  

              Stéphane Mermoz, CEO and Research Scientist at GlobEO, a company that provides services based on Earth observation and remote sensing data, shared that another use case for predictive algorithms is illegal mining. Data show that illicit mining operations on Indigenous lands and in other areas formally protected by law have hit a record high in the past few years, so analysis through AI and machine learning can be used to build correlations for predicting deforestation.

              Data analytics and AI are presenting key opportunities to defend the local ecosystems that are essential to life. “The forest is my backyard,” commented Alook Johnson, an indigenous trapper from Canada supported by the ShagowAskee Group. Whether we are far or near to the forest, citizens are all concerned with its health and conservation. AI techniques can also reimagine the place of trees in our lives – in a forest far from highly populated cities or merged directly into our urbanized areas to prevent urban heat islands.  

              Policy and public sector coordination is key:  

              “Policy is the thing which holds us all accountable,” Ms. Creamer remarked, “in a way that maybe an individual couldn’t.” Without both policy support and economic viability, many of the small businesses and innovators exploring these technologies will not be able to scale to the level that the current environmental crisis requires. Ms. Creamer remarked, “when we’re conscious of making policy that serves our communities and businesses – that has a climate in mind – there’s this inherent motivation to follow through.”  

              According to Ms. Jonckheere, two things are needed. The first is global data, like the IPCC global report, that serves the needs of policymakers. The other is action, which needs to be encouraged on a national and local scale. Globally there is a UN forestry network and there are global goals, but it is then up to individual nations to come up with policies and measures and follow up with implementation. Linking the two is crucial, because global data products are very useful where there is no national data available to the national government or local end users.

              Data and AI are game-changing tools when supporting and counteracting the degradation of our world’s forests, but rather than relying upon the existence of new innovations, it is a commitment to action that will be decisive in this sphere.  

              Watch the full replay on YouTube.

              Author


              Pierre-Adrien Hanania

              Global Public Sector Head of Strategic Business Development
              “In my role leading the strategic business development of the Public Sector team at Capgemini, I support the digitization of the public services across security and justice, public administration, healthcare, welfare, tax and defense. I previously led the Data & AI in Public Sector offer of the Group, focusing on how to unlock the intelligent use of data to help organizations deliver augmented public services to the citizens along trusted and ethical technology use. Based in Germany, I previously worked for various European think tanks and graduated in European Affairs at Sciences Po Paris.”

                Focus on data ecosystems in the era of financial services

                Ashvin Parmar
                7 September 2022

                Constructing data ecosystems has helped financial companies become nimbler and move faster.

                When it comes to data, the financial services industry has some of the greatest opportunities but also faces tremendous risks and pressures. As competition increases, along with customer expectations, financial companies are navigating regulatory, security, and privacy minefields along the road to delivering greater innovation.

                To successfully make this data journey, finance has become a leading sector in terms of building data ecosystems. As such, it offers critical lessons for companies just starting to explore the possibilities that data sharing can unlock.

                “It helps from the bottom-line perspective in terms of bringing greater efficiencies,” according to Ashvin Parmar, Vice President, Insights & Data Practice Leader at Capgemini. “But more importantly, it lays the foundation for innovation. You come up with new products and bring them to market faster.”

                Parmar, who works closely with some of the biggest names in finance as well as emerging fintech startups, says the adoption of cloud computing and the shifting economics of data have enabled financial companies to become more aggressive and experimental with how they leverage their data. As the line blurs between traditional financial services and retail experiences, companies know they must rapidly adapt.

                “The banks and insurers don’t have a choice but to start to collaborate,” Parmar says. “So, the desire is there to grow and reach new prospects and clients and to service them better. By procuring data from a variety of sources, they can enrich their own data and improve in areas like risk management. They also get a better view of the client and their preferences.”

                For instance, developing robust know your customer (KYC) systems is vital for financial companies from both a security and regulatory perspective to fight fraud and money laundering. But if each company has to build its own KYC platform, it can be costly while offering only limited reach for the data it can access.

                A fintech startup is cutting through this problem, thanks to its neutral KYC platform. The startup has partnered with a wide range of financial players, who share their data in its system. Companies that engage it get access to a deeper pool of data without having to create their own algorithms and technology stacks.

                “It reduces the cost and you can have better risk management because you’re benefiting from experiences and the data from your competitors,” Parmar says.

                Capgemini is working with the fintech startup to create a credit analytics platform that brings in data from a wide range of sources while leveraging Capgemini’s domain expertise in this area, including its risk management model. Parmar believes this joint offering of credit analytics as a service will allow financial firms to remain focused on their core services rather than expending resources on making sure models are up to date and procuring the right data.

                Companies can also accelerate their data transformation via 890 By Capgemini, a platform that offers access to ecosystems of industry-specific data that can be combined with exclusive internal data. This curated experience includes data, insights, and outcome exchanges that help companies quickly and securely benefit from the power of data sharing.

                Getting started

                When a company is ready to take the leap into data sharing, Parmar offers a few guidelines for getting started.

                Naturally, a company should have a data-driven culture. But to complement that, it’s also vital to have a privacy culture.

                “Privacy is not just about the technology and the government’s regulations or the processes,” Parmar says. “A culture of privacy has to be there for all the other controls to work properly.”

                Capgemini has a lightweight survey created in partnership with leading academics to allow a company to measure its internal privacy sensitivity. The survey delivers a heatmap so companies can see where their privacy blind spots are. That’s the first step toward addressing cultural issues.

                From there, companies need to understand their data environment, including its maturity and readiness to undertake data sharing. This should also include a clear inventory of internal IT systems in terms of their capabilities (or lack thereof) for connecting to partners or any type of ecosystem.

                Finally, companies need to set clear long-term goals and understand what drives them from a business perspective. Is it data monetization? Increasing the footprint? These answers will help identify the technology needs and business capabilities required to realize these plans.

                “It’s a journey,” Parmar says. “Maybe you start with an internal data marketplace to build your first data exchange platform. And that evolves into an external data exchange. And that morphs into new data products.”

                The good news is that once a company is ready to take those steps, cloud providers and hyperscalers now have tools that can help them move quickly. These companies are making massive investments in the technologies and services that are rapidly reducing costs even further and making them more accessible. While the focus was initially on just getting customers into the cloud and saving money, these cloud providers are now offering more sophisticated tools for specific verticals, such as finance.

                “If you are a bank, you don’t have to start from scratch,” Parmar says. “There is a tremendous amount of R&D being done to facilitate these data exchanges. The barriers to entry into a marketplace are going away. It’s becoming easier and it’s becoming safer to innovate with data exchanges and data ecosystems.”


                  How technology is enabling wealth democratization

                  Shreya Jain
                  30 November 2022

                  Historically, wealth management has only been accessible to the ultra-wealthy. Although there have never been any official barriers stopping people from investing, the exorbitant minimum ticket price excluded those on a lower budget. The Federal Reserve found that the wealthiest people control the majority of equities, with families in the top 10% of income brackets owning 70% of the market value of all stocks.1

                  However, the rapid rise of digital technologies is opening up new investment vehicles, such as cryptocurrencies, while widening the investor community by providing simpler and cheaper ways to invest. Investment hubs are shifting from Wall Street institutions to smartphone apps with incentives like free trading and fractional ownership of stocks. This financial revolution is being led by an influx of FinTechs like Robinhood that are simplifying the process of investing and making it more accessible. Well-established financial institutions have clearly taken notice and are acting quickly to stay relevant to a changing and expanding audience. Financial services giants that have followed this movement of wealth democratization include DBS, which launched its NAV Planner that leverages more than one hundred AI models to deliver personalised and actionable insights, helping Singapore residents better manage their money and grow their wealth,2 and JPMorgan Chase, which acquired Nutmeg, one of the most successful robo-advisory providers in the British wealth management market,3 enabling people from across the financial spectrum to passively invest funds based on their investment goals and preferred risk level.

                  Here are a few examples of how technology is enabling arguably the greatest market democratization of our times.4

                  • Micro-investing: In a bid to expand the investing audience beyond high-net-worth individuals (HNWIs), WealthTech firms offer micro-investing platforms that allow investors with smaller savings to enter the investment market. Each time you make a purchase, the platform rounds up the cost to the nearest dollar and invests the difference into an ETF-based investment account (a minimal sketch of this round-up mechanic follows this list). By lowering the cost of entry into the investment market and eliminating per-transaction fees and investment minimums, micro-investing platforms let people easily invest small amounts of money.5
                  • Robo-investing: Robo-advisors use machine-learning algorithms to automatically build the best investment portfolios for clients, based on their financial situation and future goals. Robo-advisors enable people to invest money towards financial freedom by offering “more accessible investment and money management options at a fraction of the cost hitherto available with traditional models”.6 Robo-investing offers a form of passive investing that expands the reach of investments to an audience that does not actively track the market.
                  • Digital brokerage: Digital brokerages are online platforms that leverage modern technologies to allow customers to get stock market data and access a range of investment opportunities. This breaks through a major investment barrier – the expensive flat fees per trade that were traditionally charged by brokers.
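
                  As referenced above, here is a minimal sketch of the round-up mechanic behind micro-investing; the purchase amounts and the ETF-based account are purely illustrative assumptions.

                  ```python
                  # Minimal sketch of the micro-investing "round-up" mechanic described
                  # above. Purchase amounts and the ETF account are illustrative.
                  import math

                  def round_up_contribution(purchase: float) -> float:
                      """Round a purchase up to the nearest dollar; return the spare change."""
                      return round(math.ceil(purchase) - purchase, 2)

                  purchases = [4.35, 12.80, 7.00, 3.15]
                  contributions = [round_up_contribution(p) for p in purchases]

                  etf_account_balance = sum(contributions)  # swept into an ETF-based account
                  print(contributions)                      # [0.65, 0.2, 0.0, 0.85]
                  print(f"Invested this week: ${etf_account_balance:.2f}")
                  ```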

                  Advances in technology are changing the face of the wealth management industry by expanding its audience and introducing more inclusive use cases for all types of clients. These include retirement planning using robo-retirement technology7 and portfolios tailored to the risk appetite of each individual.

                  The democratization of wealth is a trend that looks set to stay, and it appears to be accelerating. As more providers cater to a greater number of retail customers, early adopters continue to explore new technologies and alternative investment possibilities that simplify previously complex investments and make these options accessible and practical. Technology-led democratization is opening up new opportunities. Banks that are not already on board need to start strategizing to make investments available to a wider section of society.

                  Sources:

                  1. https://www.federalreserve.gov/econres/scfindex.htm
                  2. https://www.dbs.com/annualreports/2020/consumer-banking-wealth-management.html
                  3. https://www.jpmorganchase.com/news-stories/jpmorgan-chase-enters-agreement-to-acquire-digital-wealth-manager
                  4. https://www.windmill.digital/blog/how-wealthtech-is-democratizing-investing/
                  5. https://www.investopedia.com/terms/m/microinvesting-platform.asp
                  6. https://www.investopedia.com/best-robo-advisors-4693125
                  7. https://www.bankrate.com/retirement/how-to-invest-for-retirement-with-robo-advisor/

                  Meet our Experts

                  Shreya Jain

                  Manager, Global Banking Industry

                    Serendipity systems: Architecting personalization systems at scale

                    Neerav Vyas
                    4 November 2022

                    Personalization systems are all about the right advice at the right time. When it’s spot on, that advice leaves us stunned as to why we never thought of it before. If it’s more than spot on, it makes us feel eternally grateful. Welcome to serendipity systems – the way next-generation personalization engines aim to deliver that feeling consistently.

                    Think of the last innovative enterprise you interacted with. Can you think of one where some form of recommendation or personalization was not part of the experience? Recommendations are no longer product features, service attributes, gimmicks, or nice-to-haves. They are the central organizing design principle of modern experiences. Personalization wasn’t an add-on for Amazon, Uber, Netflix, or Airbnb. It was core to their experiences because personalization was core to their business model. We’re moving to a world where not doing personalization is a recipe for guaranteeing underperformance and obsolescence. So, how do we do this well?

                    Serendipitous influence

                    Architecting modern personalization systems isn’t just about data and algorithms. Recommendation architectures persuade and influence our choices. The choices they present, and how they are presented, change not only how we discover products and experiences but also how we reflect on our own preferences. Done well, recommendation and personalization are systems of “serendipitous influence.” The best recommendations are those that inspire you in a way you didn’t expect. They make you wish you’d gotten that recommendation earlier. Consistently fostering serendipitous delight becomes the hallmark of enterprises that want to provide delightful experiences. In our experience building such systems, we find there are four key dimensions that help firms elevate from personalization to serendipity:

                    Data

                    1. A consolidated view of “customers” and the enterprise is critical for consistent experiences. Data silos are likely to result in incongruent experiences as customers traverse channels, and in inconsistent recommendations across content, products, and services.
                    2. The proliferation of data across applications, devices, and brands makes it critical that data is connected, otherwise the customer is more likely to ask “Why don’t you know who I am?”
                    3. Resolving unknown users to known ones is also critical in maximizing the total market of customers that you can personalize for. Amazon’s internal data lake (Project Andes) was the start of creating a producer-consumer “data mesh” to democratize its insights on customers. This enables Amazon teams to use data across the entire ecosystem to recognize customers across channels and devices (e.g., from the web to mobile, to Prime Video, and to Alexa devices). A toy sketch of this kind of identity resolution follows this list.
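                    As a purely illustrative sketch (not Amazon’s actual implementation), identity resolution of this kind can be modeled as a graph of identifiers – cookies, device IDs, hashed emails – that are linked whenever they are observed together, so that any one of them resolves to the same customer. The class and identifier formats below are hypothetical.

                    class IdentityGraph:
                        """Toy identity-resolution store using union-find: identifiers seen
                        together (e.g., at login) are merged into one canonical customer."""

                        def __init__(self):
                            self._parent: dict[str, str] = {}

                        def _find(self, ident: str) -> str:
                            self._parent.setdefault(ident, ident)
                            while self._parent[ident] != ident:
                                self._parent[ident] = self._parent[self._parent[ident]]  # path halving
                                ident = self._parent[ident]
                            return ident

                        def link(self, a: str, b: str) -> None:
                            """Record that two identifiers belong to the same customer."""
                            self._parent[self._find(a)] = self._find(b)

                        def same_customer(self, a: str, b: str) -> bool:
                            return self._find(a) == self._find(b)

                    graph = IdentityGraph()
                    graph.link("web:cookie_123", "email:ab12hash")     # web login
                    graph.link("mobile:device_987", "email:ab12hash")  # app login, same account
                    print(graph.same_customer("web:cookie_123", "mobile:device_987"))  # True – recognized across channels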

                    Intelligence

                    1. The best personalization systems are built from strong data-powered cultures. Literacy in data science, AI, and ML is necessary, but not sufficient. Teams need to be literate in utilizing insights to drive action.
                    2. Linking analytics to outcomes and a holistic set of KPIs is needed to monitor the health of recommendation systems and understand their impact (intentional or otherwise). YouTube found that optimizing for clicks created click-bait recommendations, resulting in poorer experiences and less user engagement than optimizing for time spent watching videos.
                    3. Rules-based strategies help to “fake it till you make it,” but they struggle at scale and often underperform due to biases in the rules and an inability to iterate quickly. Personalization systems should allow employees to build campaigns that optimize against business goals rather than rules (e.g., maximize revenue, minimize carbon footprint through fewer shipments). This allows employees to be more strategic and creative while permitting the underlying analytical models to act on data volumes and patterns that would otherwise go unnoticed. A sketch of this goal-based scoring follows this list.
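                    The following minimal sketch illustrates the shift from hand-written rules to goal-based optimization: a business user picks an objective and the system scores candidate recommendations against it. The item attributes, objective names, and scoring functions are simplified assumptions for illustration only.

                    from dataclasses import dataclass

                    @dataclass
                    class Candidate:
                        item_id: str
                        relevance: float   # model-predicted engagement probability
                        price: float       # unit price
                        shipments: int     # proxy for the carbon cost of fulfilment

                    # Business users pick a goal; scoring, not a rule list, decides the ranking.
                    OBJECTIVES = {
                        "maximize_revenue":   lambda c: c.relevance * c.price,
                        "minimize_shipments": lambda c: c.relevance / max(c.shipments, 1),
                    }

                    def rank(candidates, objective, k=3):
                        return sorted(candidates, key=OBJECTIVES[objective], reverse=True)[:k]

                    catalog = [
                        Candidate("sneakers", 0.30, 120.0, 1),
                        Candidate("toothpaste-3pack", 0.60, 9.0, 1),
                        Candidate("furniture-set", 0.10, 900.0, 4),
                    ]
                    print([c.item_id for c in rank(catalog, "maximize_revenue")])
                    # ['furniture-set', 'sneakers', 'toothpaste-3pack']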

                    Amazon.com is a different store for every customer. Its Personalization Platform (P13N) allows business teams to set strategies and filters that leverage personalization algorithms optimized against business outcomes (e.g., improve conversion, drive engagement). The system also understands when signals like customer intent are changing. If I searched for shoes in the morning but am now searching for toothpaste, my recommendations should reflect that my intent and needs have shifted, and that consumer goods and staples are more relevant than athleisure products. These signals are optimized against business or experience KPIs, which allows dynamic recommendations to improve business outcomes while boosting the customer experience.
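                    A toy example of such intent-shift handling (not the actual P13N system) might infer the dominant category from recent session activity, weighting newer events more heavily, and boost matching items. The categories, scores, and decay factor below are invented for illustration.

                    from collections import Counter

                    def infer_intent(recent_categories, decay=0.7):
                        """Dominant category in the session, with newer events weighted higher."""
                        weights, w = Counter(), 1.0
                        for category in reversed(recent_categories):  # newest event first
                            weights[category] += w
                            w *= decay
                        return weights.most_common(1)[0][0]

                    def rerank(items, intent, boost=2.0):
                        """items = (item_id, category, base_score); boost items matching the current intent."""
                        return sorted(items, key=lambda it: it[2] * (boost if it[1] == intent else 1.0),
                                      reverse=True)

                    session = ["shoes", "shoes", "toothpaste", "toothpaste"]  # morning: shoes; now: toothpaste
                    items = [("running-shoes", "shoes", 0.8), ("whitening-toothpaste", "toothpaste", 0.5)]
                    print(rerank(items, infer_intent(session)))  # the toothpaste item now ranks first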

                    Design

                    1. Experimentation leads to better personalization, and design is key to better experimentation. The best recommendation cultures view experimentation on these systems as an end-to-end exercise, from data and analytics to UI/UX development, creative development, and qualitative research.
                    2. Recommendation Experience Design (RXD) as a competency is integral to designing systems that create the proper nudges and to understanding the intended and unintended consequences of personalization. Done well, these teams are a blend of technologists, behavioral scientists, creatives, and experience designers.
                    3. Volume of experimentation can be a valuable KPI unto itself.
                      “If you double the number of experiments you do per year, you’re going to double your inventiveness.” – Jeff Bezos.
                    4. Without trust in the system, it’s hard to get adoption, and without adoption the system will not survive. There is a fine line between helping someone become aware of options and navigate their choices, and manipulating them. Systems that manipulate are unlikely to survive.

                    Through user experience research, Stitch Fix found that people couldn’t judge what they would like from clothing images alone. Customers might say from an image that they don’t like something, but when they interacted with it or put it on, they would find they loved it. This encouraged the team to be more aggressive in sending products that the algorithms suggested a customer would like, even if the customer stated they weren’t interested in those types of products.

                    Orchestration

                    1. Orchestrating intelligence and actions for interventions across the customer journey is essential to consistent experiences. Enterprises need a view of the critical journeys (if not all journeys), and this requires organizations to easily orchestrate data and analytics internally and externally to empower employees to improve customer experiences.
                    2. Employee experiences can be as critical as customer experiences. If employee experience is an afterthought (and poorly done) then adoption of the system will be low and the corresponding pace of experimentation will suffer (if it occurs at all).

                    Stitch Fix’s orchestration of customer and merchandising data enabled the development of Hybrid Designs, its internal AI-driven design group. The apparel designs are a true collaboration built on the orchestration of human intelligence and artificial intelligence, which helped Stitch Fix generate 2021 revenue of $2.1 billion with over four million active users.

                    Recommendation systems that are thoughtfully architected across these four dimensions drive differentiated and innovative experiences through more experimentation and greater degrees of adoption within the enterprise and by customers. This results in systems that evolve from a goal that’s transactional – “Will I buy this?” – to those that make us wonder “How did I live without this?” What does the future hold? The expectation from consumers is some form of personalization. Moving forward, discovery should be like talking with a friend who knows you so well that they can anticipate your needs. This is a world where we’ve democratized access to serendipity, and firms should provide such experiences or be left behind. It is either serendipity at scale or obsolescence with haste.

                    INNOVATION TAKEAWAYS

                    Personalization is key
                    Personalization is no longer a nice-to-have; it’s critical for maintaining a competitive advantage.

                    Serendipity tops it
                    The best recommendations are those that inspire and delight you in a way you didn’t expect.

                    Four dimensions
                    Data, intelligence, design, and orchestration are the key dimensions for architecting innovative, serendipitous personalization systems.

                    Cover it all
                    End-to-end experimentation is critical in architecting and designing personalization systems that actually make an impact on consumer and employee experiences.

                    Interesting read?

                    Capgemini’s Innovation publication, Data-powered Innovation Review | Wave 4, features 18 such articles crafted by leading Capgemini and partner experts, sharing inspiring examples ranging from digital twins in the industrial metaverse, “humble” AI, and serendipity in user experiences, all the way to permacomputing and the battle against data waste. In addition, several articles are written in collaboration with key technology partners such as Alation, Cognite, Toucan Toco, DataRobot, and The Open Group to reimagine what’s possible. Find all previous Waves here.

                    Author

                    Neerav Vyas

                    Head of Customer First, Co-Chief Innovation Officer, Insights & Data, Global
                    Neerav is an outstanding leader, helping organizations accelerate innovation, drive growth, and facilitate large-scale transformation. He is a two-time winner of the Ogilvy Award for Research in Advertising and an AIconics 2019 and 2020 finalist for Innovation in Artificial Intelligence for Sales and Marketing.