
Data and tech: The future of commerce is data-driven

Kees Jacobs
Jul 20, 2023

Part 2: Handle data with care

Welcome back to the second part of my blog series on the future of commerce and the role of data and technology. In the first part, we discussed the importance of channel-less commerce, the impact of disruption in the industry, the need for connected capabilities, and the significance of data-driven competencies and value. In this continuation, we will explore additional key elements and strategies for companies to succeed in the data-driven era of commerce.

Data sources: Whoever masters the data owns the future

In a connected commerce landscape, the data landscape expands beyond traditional, modern, and omnichannel models. While it’s critical for companies to master their owned data, including transactional data, first-party customer data, marketing and customer experience data, and supply chain data, there is a vast array of external data sources that further enhance consumer and business intelligence. This includes third-party data from ecosystem partners, social media data, search data, ratings and reviews, location-specific data, and even cross-sector data collaborations. The more companies can blend internal and external data, the more powerful their insights and decision-making capabilities become.

Data collaboration: If you want to go far, go together

There is an African proverb that I really like: ‘If you want to go fast, go alone – but if you want to go far, go together’. The future of commerce is defined by ecosystem collaborations. Companies need to be able to share and blend their internal commercial data with data from existing and new connected commerce ecosystem partners.

By combining data resources and knowledge, companies can create powerful ecosystems that generate valuable insights and provide enhanced services to consumers. Collaboration can range from partnerships with last-mile intermediaries, social commerce platforms, and third-party marketplaces, to direct-to-consumer/business initiatives. Nestlé’s example of Purina’s digital ecosystem showcases the power of data collaboration in providing holistic and personalized experiences to pet owners throughout their pet’s lifetime.

Predictive and generative analytics: Even more intelligent intelligence

Recent technological advancements have significantly enhanced our ability to analyze data and identify patterns. For example, we are all amazed by the power of Generative AI to create original and realistic content, such as images, music, or text, by learning from vast amounts of data. We are seeing how predictive analytics can determine likely future outcomes, and how prescriptive analytics provides recommendations on what actions should be taken.

These advanced analytics capabilities can be leveraged to optimize merchandising, pricing, marketing and sales execution, individual consumer experiences, and operational efficiencies. Data products that deliver this level of intelligence can take various forms, from strategic storytelling and business solutions to tactical self-service dashboards and operational execution through real-time algorithms and machine-learning models. I see that successful companies are increasingly able to automate and industrialise the delivery of these data products within the business, and to take effective advantage of the new innovations that arrive almost daily.

Data foundations: Quality in, quality out

While the value of data and analytics lies in their ability to drive business decisions and actions, it is crucial to have a robust data foundation. What I often see is that companies embark on specific data-driven business initiatives, but only realise at the end of such programs that they need to organize proper data foundations at scale. Quality data input leads to quality output.

Organizations must proactively manage their data and technology platforms, ensuring data availability, trust, governance, and master data management. Data capture, processing, cleansing, modeling, analytics, sharing, and consumption are all vital components of a well-managed data foundation. The architecture should support flexible data capturing, data storage, and data preprocessing. Competencies in data engineering and data science are essential for generating and activating data products.
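To make "quality in, quality out" concrete, here is a minimal sketch of a cleansing step in a data pipeline, dropping records that fail basic trust checks before they reach analytics. The field names and rules are assumptions for illustration only, not part of any real platform.

```python
# Illustrative "quality in, quality out" sketch: a minimal cleansing step
# that filters out untrustworthy records before analytics consumes them.
# Field names and validation rules are assumptions for this example.

raw_orders = [
    {"order_id": "A1", "amount": 120.0, "customer_id": "C9"},
    {"order_id": None, "amount": 50.0, "customer_id": "C2"},   # missing key
    {"order_id": "A3", "amount": -10.0, "customer_id": "C4"},  # invalid value
]

def cleanse(records):
    """Keep only records with a valid id and a non-negative amount."""
    return [r for r in records
            if r["order_id"] is not None and r["amount"] >= 0]

clean = cleanse(raw_orders)
print(len(clean))  # 1 record survives the quality checks
```

In practice this logic would live in a governed pipeline (capture, processing, cleansing, modeling), but the principle is the same: bad records are caught before they pollute downstream data products.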

Composable tech: Plug-and-play

To stay competitive, retailers and consumer goods brands are modernizing their technology architectures. Composable architectures, such as Crafted Commerce, provide a modular approach that enables development and consumption across channels, touchpoints, and modalities. These architectures combine cloud-native, headless, API-first, and microservices principles, offering better interoperability, scalability, and innovation flexibility.
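To illustrate the composable, API-first idea in miniature: each commerce capability (catalog, pricing, cart) is an independent service behind its own interface, and a headless front end composes them per request. The sketch below is hypothetical Python; the service names and payloads are assumptions, not any real product's API.

```python
# Hypothetical sketch of composable, API-first commerce: independent
# capability services composed by a headless front end. All names and
# payload shapes here are illustrative assumptions.

class Service:
    """Stand-in for a microservice reachable over an API."""
    def __init__(self, name, handler):
        self.name, self.handler = name, handler

    def call(self, payload):
        return self.handler(payload)

# Each capability is its own replaceable module.
catalog = Service("catalog", lambda p: {"sku": p["sku"], "title": "Demo item"})
pricing = Service("pricing", lambda p: {"sku": p["sku"], "price": 9.99})

def product_page(sku):
    """A channel front end composing two independent services."""
    item = catalog.call({"sku": sku})
    price = pricing.call({"sku": sku})
    return {**item, **price}

print(product_page("SKU-1"))  # {'sku': 'SKU-1', 'title': 'Demo item', 'price': 9.99}
```

The point of the pattern is that either service can be swapped for a different vendor's module without touching the front end, which is what gives composable architectures their interoperability and innovation flexibility.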

Composable tech empowers companies to differentiate themselves and adapt to evolving consumer-centric business models. Agility, lean delivery processes, and continuous innovation are key to leveraging composable architectures effectively.

Does your organisation manage its data estate properly, both internally and externally, and does your organisation operate from a future-fit data and technology architecture?

It’s clear that the future of commerce is data-driven, and it’s essential for companies to organize their data capabilities accordingly. Stay tuned for part three of my blog series, where I will focus on data cultures and business-transformational data journeys at scale…

About the author

Kees Jacobs

Consumer Products & Retail Global Insights & Data Lead, Capgemini
Kees is Capgemini’s overall Global Consumer Products and Retail sector thought leader. He has more than 25 years’ experience in this industry, with a track record in a range of strategic digital and data-related B2C and B2B initiatives at leading retailers and manufacturers. Kees is also responsible for Capgemini’s strategic relationship with The Consumer Goods Forum and a co-author of many thought leadership reports, including Reducing Consumer Food Waste in the Digital Era.


    Network protection is key to successful Ethernet deployments

    Parthasarathy Varadharajan
    19 July 2023

    Ethernet networks are relied upon to transport real time data and critical business information. Such networks provide critical end-to-end services to subscribers and enterprise users. They must be reliable and resilient. As such, network designers must design their networks to be highly fault tolerant and capable of rapid recovery, to ensure near zero service downtime. The focus of this blog is to detail the various transport protocols implemented in routers/switches used by service providers to achieve a high degree of network protection, and therefore service availability.

    Protection broadly involves three phases. These are i) service establishment along with the establishment of alternate/backup services ii) failure detection, and iii) a repair and recovery process. A failure event in a network can occur due to link flapping, a node failure, or a path failure. The protocols required for protection, monitoring and repair vary, depending on the transport mechanism used (i.e. Ethernet vs IP).
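The three phases above can be sketched as a simplified state machine, in illustrative Python (all names are assumptions, not any vendor's implementation):

```python
# Minimal sketch of the three protection phases described above:
# (i) establish a service together with its backup path,
# (ii) detect a failure via a monitoring protocol (e.g. BFD or ECFM),
# (iii) recover by switching traffic onto the surviving path.
# All names are illustrative assumptions.

class ProtectedService:
    def __init__(self, primary, backup):
        # Phase (i): primary and backup are provisioned at the same time.
        self.primary = primary
        self.backup = backup
        self.active = primary

    def report_failure(self, path):
        # Phase (ii): the monitoring protocol reports a fault on a path.
        if path == self.active:
            self.recover()

    def recover(self):
        # Phase (iii): switch to whichever path survived.
        self.active = self.backup if self.active == self.primary else self.primary

svc = ProtectedService(primary="path-A", backup="path-B")
svc.report_failure("path-A")
print(svc.active)  # the backup path now carries traffic
```

Real implementations differ per transport (Ethernet vs. IP vs. MPLS), but each of the protocols discussed below fills in one or more of these three phases.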

    Protection techniques must also ensure that service restoration happens within 50 milliseconds, which has become the de-facto industry standard. To meet such stringent performance requirements, software implementations should take advantage of hardware assist techniques supported by silicon vendors.

    One such hardware assist technique is ‘offloading’. Offloading involves packet generation and processing of the received packets used for monitoring at a very high frequency. Errors are reported by the hardware offloading mechanism back to the software, in the event of non-reception of packets, which then leads to switching to protected paths. To achieve protection, resources must be provisioned such that when the protected resource becomes unavailable, the backup resource takes over. The backup resource(s) can be placed in standby mode or can actively transport traffic in a load-balanced fashion. Standby mode is easier to implement but consumes more resources. The backup resources are used only when the primary goes down. Active mode is difficult to implement but conserves resources since sharing happens on the protection path.
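The contrast between standby and active backup modes can be sketched as a path-selection function; this is an illustrative model, not router code, and the path names are assumptions:

```python
# Illustrative sketch contrasting the two backup modes described above:
# standby keeps the backup idle until failover; active mode load-balances
# flows across all live paths, each protecting the others.

def select_path(flow_id, paths, mode):
    """Pick a forwarding path for a flow given the protection mode.

    `paths` is ordered with the primary first; `flow_id` stands in for
    a per-flow hash used for load balancing.
    """
    up = [p for p in paths if p["up"]]
    if not up:
        raise RuntimeError("no path available")
    if mode == "standby":
        # Primary carries all traffic; backup is used only after failure.
        return up[0]["name"]
    # Active mode: spread flows across all live paths.
    return up[flow_id % len(up)]["name"]

paths = [{"name": "primary", "up": True}, {"name": "backup", "up": True}]
print(select_path(7, paths, "standby"))  # primary
print(select_path(7, paths, "active"))   # backup (7 % 2 == 1)
paths[0]["up"] = False                    # primary fails
print(select_path(7, paths, "standby"))  # backup takes over
```

The trade-off in the sketch mirrors the text: standby is a trivial selection rule but leaves capacity idle, while active mode uses all provisioned bandwidth at the cost of more complex flow distribution.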

    Given this background, let us examine the plethora of protocols available for protection, starting with Layer2 Ethernet traffic, IP traffic and MPLS transport.

    Ethernet links transporting Layer2 traffic can be connected in a mesh fashion to protect against failures. The usage of STP (Spanning Tree Protocol) blocks the redundant links from forwarding traffic. This protocol ensures that redundant links become available on primary link or node failures. Since STP does not meet the convergence time requirements, evolutions of STP, namely RSTP (Rapid Spanning Tree Protocol) and MSTP (Multiple Spanning Tree Protocol) – which guarantee faster convergence and load balancing – are deployed instead.

    To ensure predictable topology convergence, operators now use ring topologies instead of mesh networks. The elusive 50 ms convergence with load balancing was finally achieved with the advent of ERPS (Ethernet Ring Protection Switching). ERPS uses ECFM (Ethernet Connectivity Fault Management) as the monitoring mechanism for faster failure detection. ECFM with hardware offloading support detects failures within 10 msec, giving ERPS a 40 msec window for switchover. Hardware-aided MAC flushing and hardware-assisted failover support ensure that 50 msec service protection via ERPS is a reality. Operators can also aggregate links using LACP (Link Aggregation Control Protocol) for higher bandwidth and availability. The usage of micro-BFD (Bidirectional Forwarding Detection) with LACP enables faster failure detection and convergence of aggregated links.
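The arithmetic behind the 50 msec budget can be sketched as follows. The 3.33 ms transmission interval is the standard fast continuity-check interval; combined with the usual three-missed-packets detection rule it yields roughly 10 msec detection, leaving roughly 40 msec for switchover:

```python
# Rough timing budget behind the 50 ms protection target: with monitoring
# packets offloaded to hardware at a fast 3.33 ms interval, three missed
# packets declare a fault in ~10 ms, leaving ~40 ms for ring switchover.

tx_interval_ms = 3.33        # fast continuity-check interval (hardware offload)
missed_before_down = 3       # conventional 3x detection multiplier
detect_ms = tx_interval_ms * missed_before_down
switchover_budget_ms = 50 - detect_ms

print(round(detect_ms, 2))             # ~10 ms to detect the failure
print(round(switchover_budget_ms, 2))  # ~40 ms left for switchover
```

This is why hardware offload is essential: generating and policing packets every few milliseconds for many sessions is impractical in software alone.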

    The routing technology used in IP networks inherently supports redundancy. The best path is used for packet forwarding and the alternate/inferior path, though established, is used for forwarding only when the best path goes down. Since convergence time is high in the plain vanilla routing methodology, IP networks use BFD as a monitoring protocol, along with fast reroute support in routing protocols. This allows them to realize link and node protection and achieve faster convergence. BFD sessions are offloaded after the initial packet exchange and, just like ECFM, can detect failures within 10 msec. BFD sessions associated with routing protocols assist in faster routing protocol convergence. To achieve the 50 msec industry benchmark, backup paths are computed using the LFA (Loop-Free Alternate) mechanism and installed in the forwarding path beforehand. Protection techniques supported in the hardware ensure backup paths are installed on failover scenarios, providing service availability even in route-scaled deployments. IP networks also support ECMP (Equal Cost Multi-Path) which provides protection and load balancing. An alternate mechanism in IP networks for node redundancy is achieved via the usage of the VRRP (Virtual Router Redundancy Protocol) protocol along with BFD for monitoring.
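The LFA idea described above can be sketched in a few lines: the loop-free backup next-hop is computed and installed ahead of time, so failover is a local table swap rather than a full routing-protocol reconvergence. Prefixes and router names below are illustrative assumptions:

```python
# Sketch of precomputed LFA-style protection: each prefix carries both a
# primary next-hop and a preinstalled loop-free backup, so switchover is
# a local lookup change. All prefixes and router names are illustrative.

forwarding_table = {
    # prefix: (primary next-hop, precomputed loop-free backup)
    "10.0.0.0/24": ("R2", "R3"),
    "10.0.1.0/24": ("R2", "R4"),
}
failed_next_hops = set()

def next_hop(prefix):
    primary, backup = forwarding_table[prefix]
    # BFD-style detection marks the neighbor down; every affected prefix
    # then falls back to its preinstalled backup without reconvergence.
    return backup if primary in failed_next_hops else primary

print(next_hop("10.0.0.0/24"))  # R2
failed_next_hops.add("R2")       # BFD declares R2 unreachable
print(next_hop("10.0.0.0/24"))  # R3
print(next_hop("10.0.1.0/24"))  # R4
```

Because the backup is already in the forwarding table, the switchover cost is independent of route scale, which is how the 50 msec target stays achievable even in large deployments.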

    IP MPLS networks support a wide variety of protection schemes, with BFD again acting as the monitoring mechanism. RSVP-TE (Resource Reservation Protocol – Traffic Engineering) inherently contains FRR (Fast Re-route) support and further supports the provision of backup tunnels to protect traffic-engineered tunnels. LDP (Label Distribution Protocol) supports FRR techniques, including LFA and RLFA (remote LFA) for protection. Segment routed LSPs (Label Switched Path) not only support LFA and RLFA, but also TI-LFA (Topology Independent LFA) for complete protection coverage.

    Conclusion

    Service providers design the edge and core of their networks with service protection and high availability as one of their key objectives. Redundancy at every level, including link, node and path, is the cornerstone of such networks. The transport protocols and network topology vary, depending on the services and applications supported. Capgemini’s ISS switching solution supports a rich set of protocols with full support for redundancy, allowing equipment manufacturers to build devices for high service scale and availability.

    Author

    Parthasarathy Varadharajan

    Senior Director – Principal Engineer, Capgemini Engineering
    Varadharajan is an IP/MPLS Architect with 25 years of experience in datacom and the telecommunications industry. He has been involved in the development of frameworks for mobile backhaul gateways, data center switches, secure routers, metro ethernet devices and industrial switches.

      Data and tech: The future of commerce is channel-less

      Kees Jacobs
      Jul 19, 2023

      Part 1: Unlocking value from commerce with data

      In my role as global Data and Analytics leader for Capgemini’s Consumer Products and Retail Sector, I often see business leaders having a blind spot when it comes to the increasing value of data in driving sales for their organization. So perhaps it is not surprising that so few of them know how to unlock this at scale within their companies.

      Recently at the Consumer Goods Forum Global Summit in Kyoto, Nestlé’s global head of sales and customers, Jordi Bosch, made an inspiring statement when he said: “Data and tech are the future of sales.”

      In this three-part blog series, I will explore the implications of this statement for all consumer goods and retail companies as they strive to transform their sales and marketing organisations in this new, more data-driven era of commerce.

      Data-driven transformation to channel-less commerce

      Consumer behavior has undergone a permanent transformation, along with consumers’ expectations of the companies that serve them. Nestlé refers to this as “channel-less commerce” (others call it ‘Connected’ or ‘eco-system led’ Commerce), where consumer goods brands and retailers must be able to provide personally relevant and consistent experiences across an eco-system of owned and third-party channels and touchpoints.

      On stage in Kyoto, my colleague Owen McCabe, who is supporting Nestlé in their transformation, presented the impact this has on companies, including different metrics (customer lifetime value) and next-levels of data capabilities (e.g. first-party data), in order to offset additional costs of customer acquisition and fulfilment.

      Data flows to connect capabilities end-to-end

      To thrive in this changing business model landscape, the consumer goods industry needs to master a new set of end-to-end capabilities turbo-powered by data. These can be split into front-end and back-end capabilities. The front end includes search and merchandising, personalized content generation, integrated marketing automation, comprehensive product information, dynamic pricing and promotions, and channel-less carts and checkout experiences.

      These front-end capabilities must then seamlessly align with back-end capabilities like inventory and order management, last-mile logistics and fulfilment, customer management and loyalty, risk management, customer service, and sales management. All functions must be rewired to work together as one demand-team, in effective collaboration with selected eco-system partners, with data as the ‘electric current’ flowing end-to-end, powering the organisation to meet consumer expectations.

      Data value: Benefits for all

      What I often find when working with clients, is that ultimately those who best leverage the data are those who best understand the value of the data. Specifically, by linking that data to the concrete business benefits it enables for consumers, retailers, consumer goods companies, and the wider ecosystem of value-chain partners. Consumers will see value when they get relevant inspiration, personalized experiences and can seamlessly shop across channels.

      For companies, data value translates into growth, profitability, and customer satisfaction. By using data to understand consumer needs, companies can develop products and services that meet those needs, improve reach, engagement and conversion across channels (in collaboration with channel partners) and touchpoints, increase repeat purchases, and deliver on promises. Smart analytics play a crucial role in improving decision-making and optimizing the entire customer journey.

      Does your organization leverage data to connect front-end and back-end operations in a seamless way, and does your organization truly understand the business benefits that such data can unlock?

      Stay tuned for the second and third parts of this blog series, where I will delve deeper into data sources, data collaboration, predictive and generative analytics, data foundations, composable tech architectures, data cultures, and the transformational data journey.

      The future of commerce is channel-less, and it’s essential for companies to embrace it. Stay tuned…



        Technology in rugby: 3 concrete use cases

        Jerome Chavoix
        Jul 18, 2023

        Data and AI are redefining both the broadcasting of rugby matches and the game itself – revolutionizing the fan experience, improving athlete performance, and developing women’s sports in the process.

        To take the pulse of this ongoing revolution, the Capgemini Research Institute has published the second edition of its report, A whole new ball game: Why sports tech is a game changer.

        Does this mean that sport comes down to mathematics? What we can say for certain is that, for several years now, technology has been making inroads into all sports, for professionals and fans alike. The Capgemini Research Institute’s 2023 report, A whole new ball game: Why sports tech is a game changer, even underlines, with figures to back it up, that tech is transforming the experience of players and fans alike, both inside and outside stadiums. As a partner of World Rugby, the sport’s global governing body, Capgemini supports the organization’s digital transformation, enabling it to benefit from the latest technological innovations.

        #1: Technology for improved performance

        In rugby, as in other sports, digital technology is already improving the game and individual team performance with advances like body sensors to record players’ physiological data and game statistics transmitted in real time to coaches and television viewers. Today, thanks to new sources of data, but also artificial intelligence, data processing is becoming much more precise, refining the feedback of information in real time and even enabling predictions to be made. Clothing, smartwatches, heart rate monitors, virtual reality helmets, and connected mouthguards are tools that can be used to monitor players’ state of health, analyze playing technique, identify vulnerabilities, and even predict the outcome of a match.

        Building on the Momentum Tracker solution, originally launched at the World Rugby Sevens Series in Dubai in 2020, we are now using artificial intelligence to measure the performance of men’s and women’s teams as well as their ability to improve during competition.

        Momentum Tracker’s algorithm also provides each team’s coaching staff with reliable statistics on the performance of their opponents, enabling them to better guide their game strategy.

        Data has also become an important management tool. It now lies at the heart of team management. In November 2022, the French Rugby Federation announced its partnership with an American data analytics company to bring together on a single platform the data collected over the last few years during international and Top 14 matches. Still, this invaluable aid, which will be used by coaches to build their game strategy, does not replace the intuition and know-how of the experts who coach the French national team.

        #2: Technology for better player health

        Tech is also opening up new prospects for preventing injuries during games, which is of particular interest in contact sports such as rugby. Prevent Biometrics, an American company, has developed a connected mouthguard to capture the actual intensity of the impacts players undergo. All of this data, impossible to detect by human observation alone, enables players to be protected and for a library of data to be built up, with the aim of developing measures to protect and monitor players’ health. The system was tested by World Rugby at the Women’s Rugby World Cup in 2021.

        #3: Technology for an enhanced fan experience

        Technology is revolutionizing the fan experience both inside and outside the stadium.

        The Capgemini Research Institute’s report reveals that almost 7 out of 10 sports fans prefer to watch a sports match outside the stadium if tech sufficiently enhances the viewing experience. Better camera angles, better broadcast quality, real-time statistics, and immersive experiences have profoundly transformed the fan experience outside stadiums. As a result, the number of spectators in stadiums has tended to decline over the last three years. While major international competitions are still very popular, this trend has seriously affected lower-level events. Responding to the question, “Have you attended a match in a stadium in the last 12 months?”, only 37% of fans surveyed answered in the affirmative in 2023, compared with 80% in the last quarter of 2019.

        In response to this trend, and with a view to democratizing rugby, Capgemini is supporting World Rugby by developing innovative statistics with three objectives in mind: to provide live analysis of game phases and players, to help players understand the often complex rules, and to gamify the experience. Relying on data and artificial intelligence, new predictive statistics are being developed – for example, the percentage chance of scoring a try in a given type of situation, or even real-time victory predictions.

        As these three examples show, technology has become omnipresent in sport. However, it still faces certain limits.

        In stadiums, although technology has enabled several improvements to the fan experience (video replay, online drink ordering, information on previous matches thanks to apps, etc.), its use is still limited. On the one hand, this is due to the lack of connectivity in today’s stadiums, and, on the other, because the fans who attend matches also, above all, come to enjoy an atmosphere, an experience, and emotions that the screen can’t match yet, or may even hinder. So, while technology continually improves the spectator experience as well as play, there are still issues and use cases to be addressed.

        The 2023 Rugby World Cup in France, which kicks off in September, will bring together fans from around the world, both at venues and on screens, to enjoy the game’s highest level of play, enabled by the most advanced technology. The tournament aims to offer spectators – veterans and novices alike – an unprecedented level of immersion during matches. I look forward to experiencing it.

        Author

        Jerome Chavoix

        Head of Partnerships Acceleration, Capgemini Invent
        Jérôme has over 20 years of experience in business development and setting up major partnerships. In July 2018, he took over as the Head of Customer Engagement at Capgemini Invent in France, and is now in charge of the French team at frog, part of Capgemini Invent.

          Unleashing the power of quantum computing: The imperative for application research

          Julian van Velzen
          Jul 17, 2023

          As we approach the era of quantum advantage, where quantum computers outperform classical computers for specific tasks, exploring how we can harness the potential of quantum computing and algorithms for real-world applications becomes crucial.

          Among the industries eagerly anticipating the impact of quantum computing, material science and drug discovery hold immense promise. However, transitioning from quantum computational advantage to practical implementation won’t be a walk in the park and will require addressing complementary challenges. Without dedicated research into practical applications, we risk having powerful quantum computers sitting idle while companies are still unprepared to adopt them.

          This article will delve into the essential requirements for quantum computers and algorithms to become useful in these fields. It will highlight the significance of focused application research in driving advancements in quantum technology.

          Unleashing quantum potential in material science and drug discovery

          Material science, at the forefront of innovation, presents abundant opportunities for leveraging quantum computing. By delving into the atomic and molecular levels, material scientists strive to enhance properties and functionalities. Advanced alloys, nanomaterials, and topological materials are just a few examples of materials defined by quantum behavior, and they therefore offer exciting avenues for exploration.

          Another promising industry for a quantum advantage is the pharmaceutical industry. The industry faces significant challenges in the drug discovery process, with escalating costs and time requirements. Accelerating the virtual screening of potential drugs in an expanding chemical space is imperative.

          Both industries are heavy users of molecular and material simulation. Approximate methods like density functional theory (DFT) are invaluable for exploring and designing drugs and materials and can model many desired properties. However, simulating other chemical features can be more challenging. In material science, for example, approximate solutions for charge transfer processes, photochemical reactions, and catalytic reactions often fail to capture the underlying quantum physics that determines the material’s behavior. In many cases, these characteristics are defined by electron dynamics and, in some cases, strong many-electron correlations, which are notoriously difficult for classical solvers.

          This is where a potential quantum advantage comes in. As we scale up the power of quantum computers, simulating electron dynamics is one area in which scientists hope for a quantum computational advantage.

          Still, material scientists and pharmaceutical companies aspire to answer higher-level questions such as material behavior and performance, or the potency and selectivity of candidate drugs. Simulating electron dynamics for small systems alone is insufficient to answer these questions. On the other hand, the genuine quantum effects that determine the behavior of some materials or molecules make quantum computers an attractive tool. Therefore, we’ll likely need a combination of quantum and classical computers and solvers to answer these questions.

          The urgent need for application research

          To make quantum computers useful, despite all their limitations, we must emphasize the criticality of extensive application research. Applied research goes beyond developing isolated quantum algorithms. It focuses on identifying applications that can genuinely benefit from a quantum approach and integrating quantum technologies into existing computational workflows. Note that this will not be easy. It must deal with complex matters in an interdisciplinary environment of quantum information scientists, domain knowledge experts and business owners. It must deal with the limits of today’s quantum computers while building tomorrow’s applications, all with uncertain specifications and timelines.

          However, if done well, it will usher companies into the era of quantum computing. It will allow companies to benefit from quantum computers as soon as they’re available, first and foremost by developing the right capabilities and knowledge. Given the steep and interdisciplinary learning curve, companies should expect several years before becoming quantum-ready.

          However, with the high pace of current developments, that time might be now. The benefits go beyond capability and knowledge development, too. Essential technologies must be developed to integrate quantum computing into workflows. Additionally, application research serves as a compass for the quantum industry: by understanding the computational requirements companies face, researchers can influence the direction of development and help ensure that future systems support industry requirements.

          Conclusion

          As we inch closer to quantum advantage, the pressing question arises: How do we make quantum computing truly useful? While numerous companies are exploring quantum computing, the percentage of them publishing their findings remains minimal. This highlights the urgent need for dedicated teams of chemists, material scientists, and computational experts to bridge the gap between theoretical advancements and practical applications. If we want to prevent the transformative power of quantum computing from going unused while companies are still getting ready, then initiating application research today is key.

          This article first appeared on Forbes.com

          Meet the author

          Julian van Velzen

          Quantum CTIO, Head of Capgemini’s Quantum Lab
          I’m passionate about the possibilities of quantum technologies and proud to be putting Capgemini’s investment in quantum on the map. With our Quantum Lab, a global network of quantum experts, partners, and facilities, we’re exploring with our clients how we can apply research, build demos, and help solve business and societal problems that till now have seemed intractable. It’s exciting to be at the forefront of this disruptive technology, where I can use my background in physics and experience in digital transformation to help clients kick-start their quantum journey. Making the impossible possible!

            Sustainable, green supply chains: A manufacturer’s new reality

            Vincent de Montalivet
            14 July 2023

            A Q&A with Vincent de Montalivet, Principal, Data for Net Zero offer leader, and Christopher Scheefer, Vice President, North American Lead for Intelligent Industry at Capgemini

            The world’s manufacturers are more connected than ever, and they simply cannot afford to be at the mercy of unreliable, insecure supply chains. Yet supply chains are among the business operations most vulnerable to outside forces and the cost can be significant, ranging from lost sales and production time to lower brand image and increased difficulty in raising capital.

            Insights derived from top-quality data can help manufacturers avoid climate-driven disruptions in their supply chains – or at least mitigate their effects – by making these vital operations more sustainable.

            Capgemini’s Vincent de Montalivet and Christopher Scheefer talk about how they help manufacturers leverage data to transform their operations. Here, they explain how the sustainable supply chain creates financial and operational advantages even as it reduces environmental impact and mitigates risk.

            What is a sustainable supply chain? Is it one that’s environmentally responsible?

            Vincent de Montalivet: Environmental excellence is certainly an important part of it. Consumers, investors, and regulators are increasingly concerned about carbon emissions and other environmental issues, and companies must respect and respond to those concerns. Green supply chains are particularly important as companies make net-zero commitments and need to track their emissions across their ecosystem and the life cycle of their products. But sustainability is more than just being green. What we’re really talking about is the transformation of the supply chain beyond its traditional role in which partners work with an enterprise to design and manufacture a product and then bring it to market. The sustainable supply chain encompasses that, but then expands to cover the entire lifecycle of the product in a way that reduces that product’s impact on the environment and minimizes risks to the business. It accounts for how a manufacturer’s decisions and activities impact climate change, but also how climate change impacts the organization – in terms of both financial and organizational risks.

            Can you provide an example?

            Christopher Scheefer: Sure. A manufacturer buys from a partner in its supply chain that experiences a chemical spill and has to suspend business while it addresses this. From an organizational perspective, the manufacturer needs to find a new source for that chemical so it can continue to operate. But that spill could also negatively impact the manufacturer’s brand because the manufacturer is associated with that supplier. It could also have financial implications – both in terms of sales but also in the form of making it harder to attract capital from investors who are increasingly concerned about environmental, social, and governance (ESG) issues. So it’s absolutely paramount that organizations include sustainable risk analysis in their network.

            That’s a greatly expanded role for the traditional supply chain, isn’t it?

            Scheefer: Absolutely. A supply chain has traditionally been treated as a means to manage disruptions – for example, those caused by the COVID-19 pandemic or by international conflicts – and, in legacy supply chains, the chief procurement officer would rank members of their network primarily on time, quality, and cost.

            de Montalivet: We call those the currencies of a supply chain: they’re the main criteria that companies consider as they make decisions. Now, manufacturers need to add carbon as a currency. We’re witnessing increasing risks to supply chains arising from climate catastrophes and, as we look ahead, extreme weather events are only expected to grow more frequent and more severe. So there’s a real need for companies to make their supply chains more sustainable because that also means making them more resilient.

            Scheefer: In the modern business environment, manufacturers must also deal with the rise of the circular economy. Chief procurement officers now must also manage reverse logistics – for example, they must figure out how to bring a product back into the company for reuse, refurbishment, or remanufacture after the customer is finished with it. And once the product has reached its end of life, companies must be able to determine its ultimate waste footprint in order to report on its environmental performance to shareholders, potential investors, and regulators.

            Those are significant changes and the supply chain of today must look very different from those of the past. What does today’s supply chain look like?

            Scheefer: From a technology perspective, a modern, sustainable supply chain is intelligent and connected. The intelligence provides the manufacturer with the ability to track the priority, the commitment, and the execution of sustainability for all its suppliers, globally, at any given time. In the past that would have been impossible – but now we can apply advanced, data-driven solutions employing artificial intelligence and other leading-edge technologies to the problem. It’s still a challenge – especially for enterprises that rely upon a globe-spanning network of suppliers and sub-suppliers, distributors, and other partners. But data mastery makes it viable.

            de Montalivet: Connectivity is also important. It enables the manufacturer to manage the business in a way not previously possible. For example, the supply chain can now be connected to research and development, to ensure new product designs are more sustainable. It can be connected more deeply into manufacturing, where it will have an effect on new methods, new systems, and new machinery. And connectivity enables more sustainable sourcing through marketplaces that provide a true view of the supply chain network and a true view of the organization’s exposure. AI and analytics play huge roles in this because they’re critical technologies for assessing risk on the scale we’re describing. As an aside, it’s why AI is playing an increasingly vital role in the insurance industry as that sector comes to grips with the risks associated with climate change.

            Can you provide some examples of how AI enables or enhances the sustainable supply chain?

            de Montalivet: The Capgemini Research Institute has identified more than 70 AI-enabled use cases related to climate action. It focused on some of the top use cases for its report, Climate AI: How artificial intelligence can power your climate action strategy. For the manufacturing sector, these include tracking greenhouse gas emissions and tracing GHG leaks at industrial sites, and improving the energy efficiency of manufacturing facilities and industrial processes. Capgemini researchers also cited a number of use cases that have direct implications for the sector’s supply chains – including designing new products that reduce waste and emissions during prototyping, production, and use, improving demand planning, reducing the waste of raw materials, and route optimization and fleet management.

            Scheefer: AI-derived insights are giving decision makers the information they need to transition manufacturing away from “make to stock,” in which a company creates products and warehouses them on the assumption that a customer, eventually, will want them. It facilitates “make to order,” in which the product isn’t produced until there’s a customer for it. This confers several advantages – including eliminating waste, reducing warehousing space, and making better use of a company’s workers. At the same time, better management allows the organization to reduce water use, energy consumption, and associated emissions. All of this has implications for the supply chain as well.

            How are successful companies enabling their supply chains to become more sustainable?

            de Montalivet: It may sound obvious, but if an organization can’t see all of the parameters across its supply chains in real time, it’s almost impossible to optimize them in a sustainable manner. So companies that already employ a logistics control tower are off to a good start. The control tower isn’t a new concept – it’s been around for years – but it’s essential to incorporate sustainability data into it.

            Scheefer: Successful companies also recognize this is a journey. Once they’ve defined and implemented a sustainable supply chain, they can then start to apply data and analytics to continuously improve it. Capgemini offers solutions that help with this. Our Data for Net Zero solution enables organizations to master data from different sources and suppliers, then share it across their business functions and value chain. Meanwhile, our Sustainable AI solution ensures that employing the computational power of AI is itself accomplished in the most energy efficient, sustainable way possible.

            You obviously believe a sustainable supply chain is essential. Why?

            Scheefer: We do, because the supply chain is linked to so many other aspects of a business. Well-managed, sustainable supply chains have become a transformational agent and a necessary capability. It’s absolutely essential that a company has a sustainable supply chain strategy. If the chief procurement officer is not making their supply chains more sustainable, they’re losing out. Increasingly, it’s a license to operate and should be a corporate imperative.

            Continue the conversation, get in touch with us:

            Vincent de Montalivet, Principal, Data for Net Zero Offer Leader

            Christopher Scheefer, Vice President, North American Lead for Intelligent Industry at Capgemini

            Building virtual storage in electricity: is there a shortcut?

            Dr Danica Vukadinovic Greetham
            7 June 2023

            Flexibility and predictability in electric grids – multi-scale challenges and future directions through meso-level aggregation.

            Needed: Intelligent Local Grids. When: Immediately

            The push to Net Zero places most of its bets on electrification[1]. While, on average, total UK electricity demand has been decreasing since 2005[2], massive changes are brewing. Increases in distributed generation and demand due to the electrification of heat and transport are fast approaching.

            These changes will require much more ‘intelligence’ from low voltage local networks to deliver more flexible load management if they are to continue to function within operational and regulated limits, and maintain grid stability. Such flexibility management at the local level must also be designed to benefit from the wholesale electricity and balancing market, otherwise the investment in flexibility will be under-optimized.

            Increased visibility through monitoring will be the key to unlocking new intelligent solutions and keeping costs down. One of the benefits of more data is to facilitate more reliable forecasts of both demand and supply, which in theory can enable the operators to act more optimally. Smooth demand curves have a double advantage: they postpone the need for expensive reinforcements or negotiating massive reserves to be able to meet peak demand, and they allow for better prediction.

            As we know, demand smoothing is difficult, but it can be achieved through time-of-use tariffs, by automating demand shifts (e.g., through smart charging or intermittently switching off appliances), or by using storage[3].
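The storage lever can be illustrated with a toy example. Below is a minimal sketch of a threshold-based peak-shaving rule; the demand curve, battery size, and threshold are all invented figures for demonstration, and a real dispatch strategy would also account for efficiency losses, tariffs, and forecast uncertainty.

```python
# Illustrative threshold-based peak shaving with a small battery.
# All figures (demand curve, battery size, threshold) are hypothetical.

def peak_shave(demand, threshold, capacity, charge=0.0):
    """Discharge the battery when demand exceeds `threshold`;
    recharge it from the grid when demand sits below the threshold."""
    shaved = []
    for d in demand:
        if d > threshold and charge > 0:
            delta = min(d - threshold, charge)             # discharge to shave the peak
            charge -= delta
            shaved.append(d - delta)
        elif d < threshold and charge < capacity:
            delta = min(threshold - d, capacity - charge)  # recharge off-peak
            charge += delta
            shaved.append(d + delta)
        else:
            shaved.append(d)
    return shaved

# Hypothetical half-hourly demand (kW) with an evening peak:
demand = [2, 2, 3, 8, 9, 4, 2]
smoothed = peak_shave(demand, threshold=5, capacity=6)
print(smoothed)  # peaks pulled down toward the threshold, troughs filled in
```

The resulting curve is flatter than the original: the peak is reduced and off-peak demand rises slightly as the battery recharges, which is exactly the double advantage of smoothing described above.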

            So, the solution is mostly…storage?

            Storage requires the least behavioural change, but, as we know, it is expensive. An alternative approach is to create virtual storage through flexibility. By ‘flexibility’, we mean a shift of activities in time and/or space that reduces demand during peak periods, moves it from substations with little headroom to those with more, or moves it from a less green to a greener generation mix.

            But, to shift demand, we need to be able to predict it accurately.

            New sources of data can improve the prediction of local demand: mobility, transport, local weather forecasts, smart meters, etc. Still, we are dealing with complex human interactions, where the only constant is change. For example, the future equivalents of the TV pickup[4] for soap operas would be difficult to predict – as content becomes available 24/7, and one can pause broadcasting, synchronous breaks are rarer.

            A complex system of a dynamic environment and human behaviour presents us with various challenges:

            • climate change impacts time-of-year usage patterns and the frequency and intensity of peak electricity demand[5];
            • shifting working patterns caused by COVID moved demand from industrial buildings to individual households and disrupted daily time-of-use patterns;
            • different policy considerations risk the creation of new peaks (e.g. an increased early evening peak, or localised night peaks for EV charging);
            • technological innovations (cryptocurrency, large language models) generate significant new electricity demand.

            Follow the yellow brick road…

            Predicting accurately and then shifting demand equates to building robust virtual storage solutions, but obviously, the devil is in the detail.

            We can think about three different levels of time/space demand shifting:

            • Macro-level – Large consumers (e.g., data centres[6] and other industrial consumers)
            • Meso-level – Aggregated flexibility services
            • Micro-level – Individual households, secondary substations, or distribution feeders[7]

            The different levels present different technical challenges when trying to create data profiles of these entities and ecosystems.

            Individual large consumers, or categories of commercial consumers, are generally easier to forecast, as they have smoother profiles[8]. But they also have harder operational constraints, need more time to react, and produce a vast range of data, so the flexibility challenge is greater. That is why it is often easier to shift domestic demand than commercial or industrial demand.

            On the other hand, flexibility at the meso and micro levels allows for more adaptable solutions. However, while they might be able to act faster, accurate forecasting is much more difficult at the feeder or individual level due to the volatility and variety of behaviours, not to mention privacy and scalability issues. What’s more, local dispatch optimisation problems can become too big to solve simultaneously.

            The answer, therefore, may lie in the meso-level. Creating data-profiles of similar neighbourhoods and similar businesses[9], and aggregating these, could provide an innovative ‘network of small virtual plants’, providing a storage solution that is robust, flexible, and cost-effective. By creating virtual storage, the UK can manage predicted increases in electricity demand, and thus improve its odds of meeting its Net Zero pledge.
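The smoothing effect of meso-level aggregation can be demonstrated with synthetic data. The sketch below uses a deliberately crude, spiky household model (all figures are invented, not real smart-meter readings); it only illustrates why an aggregate of many volatile profiles is far less volatile relative to its mean, and hence easier to forecast, than any individual profile.

```python
# Toy illustration with synthetic data: aggregating many volatile
# household profiles yields a much smoother meso-level profile.
import random
import statistics

random.seed(42)

def household_day():
    """48 half-hourly readings (kW): a small base load, occasional
    appliance spikes, and some noise. Purely hypothetical."""
    return [0.3 + (2.5 if random.random() < 0.1 else 0.0) +
            random.uniform(0, 0.2) for _ in range(48)]

def cv(profile):
    """Coefficient of variation: volatility relative to the mean."""
    return statistics.pstdev(profile) / statistics.mean(profile)

single = household_day()
aggregate = [sum(col) for col in zip(*(household_day() for _ in range(500)))]

print(f"single household CV: {cv(single):.2f}")
print(f"500-household CV:    {cv(aggregate):.2f}")  # far smaller
```

Because the households’ spikes are largely independent, the relative volatility of the aggregate shrinks roughly with the square root of the number of households, which is the statistical intuition behind the ‘network of small virtual plants’ idea.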


            [1] https://www.iea.org/reports/net-zero-by-2050

            [2] https://www.gov.uk/government/statistical-data-sets/historical-electricity-data

            [3] N.B. that the technology is still some distance from scaling to the level needed

            [4] https://en.wikipedia.org/wiki/TV_pickup

            [5] https://www.pnas.org/content/114/8/1886

            [6] Our data centers now work harder when the sun shines and wind blows (blog.google)

            [7] https://doi.org/10.1016/j.erss.2019.02.008

            [8] https://arxiv.org/pdf/2106.11750.pdf

            [9] https://www.creds.ac.uk/publications/development-of-a-profile-based-electricity-demand-response-estimation-method-an-application-based-on-uk-hotel-chillers/


            Author

            Dr Danica Vukadinovic Greetham

            Technical Consultant, Hybrid Intelligence, Capgemini Engineering
            Danica has over 15 years of industrial and academic experience in predictive analytics of large human-activity datasets, including smart energy, brain networks, and social media. She helps companies across different sectors with data and digitalization strategies and roadmap creation, and enjoys problem solving and creating innovative solutions.

              5G private networks for intelligent and connected cruise ships

              Capgemini
              Capgemini
              7 June 2023

              A cruise ship’s crew and passengers expect continuous access to affordable, high-quality internet services while onboard. But the communications network on a typical cruise ship is a complex system, integrating multiple types of infrastructure to meet the needs of the ship, its crew, and its guests.

              It will often include existing wired infrastructure, supporting multiple network protocols, and will be augmented by both WiFi and private cellular networks such as 4G, LTE, and 5G to meet current and future connectivity requirements. The often unpredictable nature of a cruise ship’s journey means the network must constantly adjust to changing conditions, selecting the most appropriate connectivity options for providing reliable and fast internet access, and supporting onboard applications.

              Optimizing backhaul connectivity is crucial to delivering the desired internet access. To address this while at sea, the ship’s network may rely on a combination of Geostationary (GEO), Medium Earth (MEO), and more recent high-bandwidth, low-latency Low Earth Orbit (LEO) satellite connections, depending on availability and the ship’s location. Alternatively, when at port or close to coastal areas, it may utilize terrestrial public cellular and port WiFi networks to provide the best possible internet connection.

              Deploying a private 5G network can address many of the onboard challenges faced by a cruise ship’s communication network by providing faster, more reliable connectivity.

              Enabling a host of new applications

              A private cellular network based on 5G can augment existing WiFi and wired infrastructure and, by enabling modern applications requiring high bandwidth, low latency, and higher connection density, create a more intelligent and connected ship.

              5G technology offers high-speed broadband connectivity, low latency, a vast number of simultaneous connections, and quality of service through network slicing. Unlike WiFi networks, 5G can operate at higher frequencies, potentially providing faster speeds and better coverage. Moreover, its slicing technology can be optimized for specific use cases such as providing high-speed internet access to all the guests on a cruise ship, thus alleviating bandwidth constraints and improving the overall guest experience.

              A host of new applications, based on a private 5G network and with improved backhaul internet bandwidth connections, can help reduce operational costs for a ship’s owners, enhance guest experiences, and generate revenue through monetizing applications.

              Exploring the benefits

              Private 5G networks can enable the remote monitoring and maintenance of a ship’s critical systems, allowing cruise lines to diagnose and resolve issues more quickly and efficiently, reducing downtime and associated maintenance costs, and improving overall performance. And through secure connectivity, real-time data collections, and automated controls, a 5G private network can be employed to optimize energy consumption in a ship’s lighting and HVAC systems.

              And, by allowing video data to be processed quickly and more reliably, a private 5G network with Edge computing can improve the effectiveness of real-time high-definition video surveillance applications, helping to create a safer, more secure onboard environment for passengers and crew.

              The benefits extend beyond greater operational security and efficiencies. 5G private networks can improve the guest and crew experience, too.

              For instance, a 5G private network can offer the immense bandwidth required to enable live streaming of onboard performances, games, and entertainment for thousands of guests. Exclusive, high-profile events can be streamed externally over the internet by prioritizing onboard 5G network traffic using network slicing and optimized SD-WAN routing for backhaul internet connections, which benefit from improved bandwidth via LEO satellite connections.

              Furthermore, by leveraging a 5G private network, cruise line operators can partner with a Mobile Virtual Network Operator (MVNO) or Neutral Host Network (NHN) provider to offer guests and crew low-cost or no-cost seamless cell service. Not only will this optimize onboard operations, enhance the guest experience, and reduce costs associated with cellular service, but this solution also presents a monetization opportunity for cruise lines to offer innovative solutions to guests and crew, addressing the pain points of high cellular service costs and poor connectivity.

              Potential use cases

              There are many potential use cases for the deployment of 5G private networks onboard a cruise ship. Here are some more examples.

              Improving security, safety, and operational efficiencies:

              • Use IoT/sensors to monitor and optimize fuel/energy consumption, creating a digital shadow/twin of key operational domains for improved real-time inventory management.
              • Stream real-time high-resolution videos with live vision AI/analytics for enhanced security, faster onboard check-in, improved cruise ship operations, and crowd management, among other benefits.
              • Use automated vehicles and robots onboard for operational efficiencies and for loading crates and baggage for reduced ship turnaround times.
              • Provide onboard immersive crew and employee enablement and training.
              • Conduct drone-based inspections, onboard asset maintenance, and asset revamping.

              Improving guest and crew experiences:

              • Offer live performance simulcast.
              • Provide immersive AR, VR, and XR onboard guest experiences.
              • Stream onboard edge-enabled entertainment and games.
              • Leverage the Smart Energy Management Digital HUB to create Smart Cabin applications for guests.
              • Use AI/ML-enabled vision-based embarkation/disembarkation/identity verification processes at sea and at port.
              • Enable improved onboard guest-to-guest, guest-to-crew, and crew-to-crew communication, including push-to-talk services, through the onboard mobile app.
              • Enable seamless connectivity to mobile network operator services for guests and crew to communicate with their families on land.

              Authors

              Pradeep Nambiar

              Director, NA Services BU Pre-Sales – Industry Solutions, Capgemini Engineering
              Mr. Nambiar specializes in creating and advising on a wide range of offerings. These include Digital Continuity/PLM/Digital Twin, IIOT, 5G Private Networks, Digital Native transformations, Servitization potential exploration, secure engineering, and industrial process improvement using Autonomous AI. He serves clients in sectors such as Logistics, Transportation/Airlines/MRO, Hospitality (Hotels/Cruise Lines), Commercial Real Estate, and Engineering Procurement & Construction (EPC).

                Jerry P Nicholas

                Senior Director BU Pre-Sales – 5G & Edge Solutions
                As a 5G/Edge/IoT solution lead, Mr. Nicholas is dedicated to advising clients on their digital transformation journey. Working with Capgemini’s ecosystem partners, he provides end-to-end solutions to telecom, enterprise, industrial, maritime, and automotive customers.

                  Happy or frustrated — How do your customers feel?

                  Christian Schacht
                  13 Jul 2023

                  Studies have shown that agents often know exactly how to resolve the customer’s request but they are either not enabled or not empowered to execute on this

                  I’ve just had a great service experience with one of the world’s best-known tech brands. I’d called about a replacement part and was taken seamlessly through a series of processes all built around resolving my query. I felt guided through every step and loved the fact that the call handler not only had a complete record of my engagement history, from purchase to contract to service, but took the time to check if there was anything else I needed while on the call. Clearly, their key performance indicator was one of resolution, rather than call time/cost.

                  Compare that to how a colleague felt recently on contacting a long-term utility provider to inform them about a change of address. There was no synchronization between his electricity and gas accounts with the same company and, despite having the identical contact details and name on each, both accounts had different customer identifiers. This meant my colleague was passed from pillar to post to de-register (twice) and then register at the new address (twice).

                  That’s not all. Having jumped through countless hoops, my colleague received no offer of a new tariff or incentive for his loyalty. Guess which one of us loves our provider? In fact, while I’m happily telling everyone about my tech company experience, he has moved to a new utilities provider with a reputation for customer centricity.

                  In a situation such as that with the incumbent utilities company described above, the problem is often the way the company is organized internally. Departments operate in silos and often don’t work together, either because their systems aren’t integrated or for political and/or historical reasons. So a customer with a product or payment query might be routed via several departments before reaching an agent able to help.

                  Then there is the challenge of customer service agents organized around and measured by KPIs based on call times and the ensuing costs. Sadly, they are far more likely to hang up if they can’t resolve an issue quickly in order to hit their KPI. This applies both to internal customer engagement teams and to outsourced providers. In both instances, organizing agents around a “customer happiness” KPI instead of an “agent productivity” metric would make it far more likely that the call is successfully dealt with. Why? Because it’s all about the customer, rather than the call handler.

                  Studies have shown that agents often know exactly how to resolve the customer’s request, but they are either not enabled or not empowered to execute on this. Giving individual agents responsibility for effective call resolution from the start to the finish of the customer contact is a proven approach. Here we see an agent taking the call, identifying the problem, and liaising with the relevant departments (service, payment, maintenance, contract renewal, etc.) on behalf of the customer, rather than handing them on to another department. The result? The customer feels at the center of the story, and the agent has only one focus: to resolve the issue or query. And the more efficient your call resolution processes are, the more cost effective they become.

                  What is perhaps even more exciting is that empowered agents can significantly increase the company’s top line. Based on a 2022 Forrester customer experience benchmark in the US, home and auto insurers that empower their agents to solve problems themselves could see a whopping $1 billion in incremental revenue. Airlines that do the same could see an $833 million boost to their top line.

                  Technology can be used as an enabler to empower your agent and delight your customers, for example, by routing the customer directly to the best agent to take personal control of the situation. Technology can help you understand the customer context and make smart decisions to match the customer’s needs. Technology can also support the agent by guiding them to the right resolution and providing all required access, information, and support personalized for the specific situation your customer is contacting you for.

                  How you organize your customer service teams is just one of several strands in the customer experience story that I will explore further in this series of articles. Look out for “How to make the customer your biggest fan — Use data” next.

                  To discover how Capgemini’s Augmented Service offer can help you reorganize your customer-facing teams to put your customers first, visit our website here.

                  Author

                  Christian Schacht

                  Vice President, Global Offer Lead Augmented Service; Head of Digital and ERP Financial Services
                  “I have over 18 years of experience in strategy, concept, design, and execution, delivering innovative digital transformation solutions for multiple industries. I help clients connect with customers, partners, and employees, and create great experiences across digital and traditional channels.”

                    What a proud moment: Capgemini wins six 2023 Microsoft Partner of the Year Awards!

                    Nico Steenkamp
                    12 Jul 2023

                    I am beyond thrilled to share that 2023 has proved to be another triumph for the Capgemini and Microsoft partnership: Capgemini has won six 2023 Microsoft Partner of the Year Awards. This year’s success cements Capgemini’s position as a leading Microsoft partner and demonstrates the value we jointly bring to our clients.

                    Everyone who knows me knows that I love to move fast – but when moments like this arise, I believe it’s important to pause, celebrate, and reflect. I really want to thank the amazing team behind this top achievement and spend some time reflecting on why these awards are so important to us and what makes up the secret formula for success behind our wins.

                    At Capgemini, we are passionate about helping our clients transform and manage their businesses by harnessing the power of technology. Together with Microsoft, we continue to co-create and co-innovate across different industries. We are especially proud to see this commitment recognized through wins across a range of categories and industries, which demonstrate how our diverse capabilities and global coverage enable us to deliver true business value for our clients.

                    Let’s take a look at the winning categories in more detail.

                    • Global System Integrator (GSI) Award, Western Europe – The Capgemini Microsoft partnership spans more than 25 years and has allowed us to deliver real impact to our clients in Western Europe. Together, we help organizations use Microsoft technology to yield new, impactful experiences for their customers and employees, redefine and innovate their processes, and deliver new digital products and services that drive impact in their marketplaces. I’m thrilled we were recognized in this category.
                    • SAP on Azure – As one of the most accredited SAP partners, with over 12,000 SAP specialists, Capgemini has a unique ability to guide clients on their SAP on Azure transformations. This year we have further strengthened our partnership through a strategic initiative in Europe aimed at bringing differentiated and innovative industry solutions to help clients accelerate and succeed in their SAP on Azure journeys.
                    • GSI Growth Champion – As a Microsoft Cloud Solution Partner, Capgemini holds more than 45,000 Microsoft certifications, ranking us among the top three Microsoft Partners in the world. To date, we are also proud to be the only Microsoft Partner to achieve accreditations in Analytics, AI and ML, and data warehouse migrations. This combination has allowed us to champion Microsoft’s growth globally across a variety of industries and solution areas.
                    • Country Award, Sweden – Sweden is a leading force for digital innovation and advanced research. Over the past year, Microsoft, Capgemini, and Sogeti (part of Capgemini) have accelerated a new approach to customer experience for businesses across the country. Through the Microsoft Digital Customer Experience of the Future offering, we’ve helped organizations create personalized customer experiences that can strengthen relationships, fuel innovation, and drive lifelong growth.
                    • Industry Award, Financial Services, United States – Close collaboration and deep knowledge of Microsoft services mean Capgemini is uniquely placed to unlock the full potential of the latest Microsoft financial services technologies for our clients. This past year, we’ve launched several new Microsoft accelerators across FSI domains and verticals. We’ve also accelerated the adoption of newer technologies within the industry and created new industry offerings for Azure Marketplace.
                    • Security Award, France – This award resulted from our engagement with a French multinational seeking to enhance its security. Leveraging our strong Microsoft partnership, we provided comprehensive support, including a security operations center (SOC), identity management tools, Microsoft Defender for Endpoint, and Microsoft Defender for Office 365. Our implementation of new solutions through a security roadmap enables the client to continually elevate their cybersecurity measures for the future.

                    Geared towards accelerating our partnership for the future

                    So, what’s next for our partnership?

We believe Capgemini’s role centers on our ability to bring our clients’ business ambitions to life. Our Microsoft partnership allows us to do just that – it helps us implement faster and release value sooner, all while reducing risk and increasing security.

                    My counterpart, Soren Lau, General Manager of Partner Development at Microsoft, acknowledges our core strengths and joint success:

                    “Congratulations to Capgemini for receiving six Microsoft Partner of the Year Awards 2023, including the GSI Growth Champion Partner of the Year Award. By combining Microsoft services with their industry knowledge and cloud capabilities, they have created innovative solutions and services for their customers. We’re excited to celebrate Capgemini at Inspire as they continue to enable digital transformation.”
                    Each year we set the bar higher, and I’m thrilled to see that we’ve been recognized for it. I know I speak for everyone at Capgemini when I say we’re excited to continue to push the boundaries and deliver even more value to our clients in 2024 and beyond.

                    Look out for more here: https://www.capgemini.com/about-us/technology-partners/microsoft/

                    Author

                    Nico Steenkamp

                    Global Microsoft Partner Executive
Nico has extensive experience in partner management, outsourcing, consulting, program management, and delivery, gained from multiple leadership roles across the Capgemini Group. He has also managed the commercials and portfolio of Capgemini’s private cloud offerings. Nico is based out of Austin, Texas, and enjoys playing golf, hiking, and attending music concerts in his spare time.