
The major trends in the semiconductor industry right now

Brett Bonthron
Aug 18, 2023

Takeaways from the GSA European Executive Forum and SEMICON West 2023

Introduction

In the past few months, we witnessed two major semiconductor events across the globe. The 2023 Global Semiconductor Alliance (GSA) European Executive Forum gathered leading global senior executives on June 14-15 in Munich to address the most pressing issues affecting an industry caught in the throes of change, and SEMICON West 2023 took place in San Francisco on July 11-13 to discuss key challenges affecting the global microelectronics industry. In this article, we’ve distilled the major trends that arose during both events; trends that will continue to shape this industry for the foreseeable future. These include supply chain volatility, sustainability, government investments, generative AI, geopolitical tensions, equality, and the tremendous opportunities in automotive. We’ve also mapped out Capgemini’s role as an intermediary in building trust and understanding and in welcoming new players to the market.

Resilient supply chains require flexible production and shipments

Semiconductors are pervasive and will only become more so. They are the brains of digitization and, though this is not widely known, are among the most traded goods in the world. Any disruption in the semiconductor supply chain can significantly impact the global economy.

The first big trend centers on building resilience against the volatility of the semiconductor supply chain and ensuring end-to-end transparency to improve forecasting and demand management. Supply chain issues caused by the fragility of the chain and the incompatibility of production cycles have cost semiconductor customers, such as the automotive industry, many billions of dollars in lost sales and profits. Automotive customers controversially asked to be given control over the flow of chips from one Tier 1 supplier to another. For semiconductor companies, it is imperative to build the resilience and process maturity that will enable them to switch easily between industries. GSA triggered a dialogue on how to be better prepared for whatever the future may hold by increasing inter-industry cooperation and building strategic relationships.

Harnessing the transformational power of sustainability

Another major trend broached at both events was a clear focus on sustainability and on producing the products that drive it. As the earth’s ability to provide what we need decreases, the need to act on sustainability increases. Semiconductor companies have published strategies and goals for becoming sustainable, and are launching initiatives focused on products that enable low power consumption or reduce their customers’ carbon footprint.

Sanjiv Agarwal, Vice President Global Semiconductor Industry Leader, says, “Semiconductor companies need to embrace sustainability and aim to make technology sustainable. Sustainability is everyone’s responsibility.”

With the semiconductor industry projected to double by 2030 and its carbon emissions projected to quadruple over the same period, sustainability and government investments also dominated the agenda at SEMICON West. Five key messages emerged:

  • AI is extremely compute-intensive – for example, a ChatGPT query consumes roughly thirteen times as much energy as a Google search
  • Companies should design for sustainability – dedicated engineering teams are needed to support sustainability goals (equipment, sub-fab, process recipes, and operations). Companies such as Intel and Applied Materials already have engineers dedicated to sustainability as part of their engineering PODs
  • Every semiconductor company has thousands of suppliers – Intel, for instance, has around 16,000; however, most suppliers have yet to set their own sustainability goals. There is therefore an urgent need to establish metrics and develop a measurable roadmap to net zero. Sustainable procurement is gaining traction in the market.
  • Digital technologies can help reduce the carbon footprint and make fabs more sustainable – by optimizing efficiency through advanced analytics (ML and AI); improving digital lifecycle collaboration within fabs (a digital twin platform spanning the lifecycle of a fab can reduce production loss and energy waste); and ensuring enterprise-level collaboration across fabs.
  • Companies have made more progress at their US sites than in other regions – for example, Intel and STMicro are net water positive in the US but not elsewhere.

Generative AI – Need for extreme compute power and smaller suppliers

GenAI is probably our generation’s most disruptive innovation, with the potential to shape humanity’s future. From simple task automation to writing code to drug discovery, its scope of application is practically limitless, and the semiconductor industry is right at the forefront of enabling this transformation. With technologies that can impact so many industries in so many ways, there are always the early adopters, the ones who need a plan, the late risers, the ones with FOMO, and the ones who remain in a state of inertia until market forces apply.

Surprisingly, with GenAI no one wants to maintain the status quo. Almost every industry is looking for ways to adopt GenAI in its day-to-day operations, whether in manufacturing, sales, marketing, IT, or customer service – and the high-tech segment is leading the pack in adoption. As GenAI-based applications and use cases for design and manufacturing support proliferate, they will transform how current factory automation functions, creating a major shift in how the industry adapts and molds itself to this new reality. According to Vignesh Natarajan, High-tech Segment Leader, Europe, Capgemini, “As generative AI becomes mainstream, the transformation of the data center space will be driven by semiconductor players, who will be the crucial building blocks in the power chain competence.”

For Capgemini, the biggest trends are generative AI-based use cases, AI-based development use cases, AI-based joint design use cases, and foundry solutions. This will be the big wave as demand for consumer electronics continues to grow, albeit more slowly than during the Covid era. Demand for electrification, sustainable solutions, and smart cities, however, will soar. Government funding of large-scale projects will provide a floor for demand and produce the next “boom” cycle for semis.

“Our ambition is to support the semiconductor ecosystem companies in scaling up to meet their market opportunity with solutions in Intelligent Industry and Enterprise Management,” says Shiv Tasker, Global Industry Vice President, Semiconductor and Electronics.

Digital Twin offers fast scalability

Digital twin technology shows huge potential in the semiconductor industry through its ability to simulate an entire fab, its manufacturing processes, and various use cases and models to improve efficiency and productivity. Companies are looking to transform many aspects of their manufacturing processes. Examples where semiconductor companies are focusing include:

  1. Device-scale twin – detailed visualization of a device to reduce cycles of silicon learning, thus reducing waste and resources,
  2. Process-scale twin – using simulation to streamline process development thus reducing chemicals and electricity usage,
  3. Equipment-scale twin – improving first-time-right outcomes from design through installation by finding issues before the physical build and by building equipment expertise faster and more effectively.

Digital twin, or the digital omniverse, coupled with generative AI provides an incredible opportunity: it can generate millions of variations of a model and, through reinforcement learning, converge on the best-performing output or model. When implemented well, it can scale product output at a speed that tests the laws of physics.
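The variation-and-selection loop behind this idea can be sketched in a few lines: evaluate many simulated recipe variations against a toy process model and keep the best performer. This is only an illustrative stand-in, with an invented yield function and plain random search rather than a real twin platform or reinforcement learning.

```python
import random

def simulated_yield(temp_c: float, pressure_kpa: float) -> float:
    """Toy stand-in for a process twin: yield peaks at a sweet spot of
    temperature and pressure that the search does not know in advance."""
    return 1.0 - 0.001 * (temp_c - 350) ** 2 - 0.002 * (pressure_kpa - 80) ** 2

def search_variations(n_variations: int, seed: int = 0):
    """Evaluate many random recipe variations against the twin and
    return the best-scoring one, mimicking variation-based exploration."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_variations):
        candidate = (rng.uniform(300, 400), rng.uniform(50, 110))
        score = simulated_yield(*candidate)
        if best is None or score > best[1]:
            best = (candidate, score)
    return best

recipe, score = search_variations(10_000)
```

In practice the twin would be a physics-based or data-driven simulation and the exploration would be guided rather than random, but the principle of generating many variations and selecting the best-performing model is the same.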

OEMs’ growing needs, especially in automotive

Automotive is a huge driver for many of the changes facing the semiconductor industry. In fact, there was palpable tension at the GSA event between the semiconductor representatives and auto manufacturers. The auto market is hard to resist for any semiconductor manufacturer due to its size, but the auto manufacturers will never forget the chip shortages of the Covid era and the tremendous damage they did to their business. The evolving supply chain relationships and the trust challenges were the subject of many formal and side-bar discussions.

Sanjiv Agarwal adds: “At Capgemini, we work both sides of the equation: helping chip manufacturers get to market and fit into the automotive ecosystem, and working with the automotive manufacturers to create their chip strategy, select and work with foundries to manufacture those chips, and integrate chips into their designs.” We bring the promise of an affordable, ever-smarter, software-driven mobility ecosystem that is centered on customer needs and protects them from both physical and digital threats.

Geo-political tensions

Geopolitical tensions are more a shared concern than a trend, but they will have a large impact on the way semiconductor companies work, since 60-70 percent of all chips are manufactured in Taiwan or South Korea, both relatively volatile regions. Divergent national approaches exacerbate these concerns. The US, for example, has shifted from outsourcing production to encouraging chip producers to move operations stateside. In general, the US CHIPS Act and the European Chips Act will “onshore” more production and drive geographic diversification of production.

Brett Bonthron, Executive Vice President and Global High-tech Industry Leader, says, “Through the two Chips Acts, semiconductor companies see that governments understand the criticality of the industry.”

The US CHIPS Act is a true public-private partnership model and probably the first proactive federal program to be executed jointly with the states, which will manage permits, labor, land, and other logistics.

Statements of interest are currently being accepted for all direct funding opportunities (USD 2 billion floor, no ceiling), and over 400 have already been received. The US CHIPS Act envisions success in four areas:

  • Leading-edge logic – at least two new large-scale clusters of leading-edge logic fabs, in which US-based engineers will develop the process technologies underlying next-generation logic chips.
  • Memory – US-based fabs will produce high-volume memory chips on economically competitive terms, and R&D for next-generation memory technologies critical to supercomputing and other advanced computing applications will be conducted in the US.
  • Advanced packaging – the US will be home to multiple advanced packaging facilities and a global leader in commercial-scale advanced packaging technology.
  • Current generation and mature – the US will have strategically increased its production capacity for current-gen and mature chips. Chipmakers will also be able to respond more nimbly to supply and demand shocks.

Similarly, the European Chips Act enables the EU to address semiconductor shortages and strengthen Europe’s technological leadership. It will mobilize more than €43 billion of public and private investment via the Member States across five key areas:

  1. Strengthen Europe’s research and technology leadership towards smaller and faster chips,
  2. Put in place a framework to increase production capacity to 20% of the global market by 2030,
  3. Build and reinforce capacity to innovate in the design, manufacturing, and packaging of advanced chips,
  4. Develop an in-depth understanding of the global semiconductor supply chains,
  5. Address the skills shortage, attract new talent, and support the emergence of a skilled workforce.

Diversity and workforce development

Diversity, workforce development, and talent were major topics at both events, with the consensus that inclusion must start at a much earlier age and that more women and minorities must be enabled to move into leadership positions. To renew the existing workforce, many companies are partnering with universities, granting scholarships, and launching apprenticeship programs so that when the new fabs are ready and the current workforce is close to retirement, a new, more diverse talent pool will be ready.

Conclusion

The semiconductor industry is in a state of flux. This year’s GSA European Executive Forum outlined the five major trends – supply chain resiliency, generative AI, geopolitical tensions, the impact of the automotive industry, and sustainability – to emerge from this transition. There are, of course, numerous other factors at play, including inclusion and reducing barriers to entry within the industry. Several topics remained unsaid, for example the shifting relationships between automotive OEMs and tier-one suppliers, or the evolution of the semiconductor company vis-à-vis the value chain. However, at the end of the day, semiconductors are fundamentally about propelling civilization forward and enabling the creation of better societies. As something that is also written into our raison d’être, Capgemini has the substantial know-how and technology vision to drive this ultimate goal forward.

Meet our experts

Brett Bonthron

Executive Vice President and Global High-tech Industry Leader
Brett has over 35 years of experience in high-tech, across technical systems design, management consulting, start-ups, and leadership roles in software. He has managed many waves of technology disruption from client-server computing to re-engineering, and web 1.0 and 2.0 through to SaaS and the cloud. He is currently focusing on defining sectors such as software, computer hardware, hyper-scalers/platforms, and semiconductors. He has been an Adjunct Faculty member at the University of San Francisco for 18 years teaching Entrepreneurship at Master’s level and is an avid basketball coach.

Vignesh Natarajan

High-tech Segment Leader, North & Central Europe, Capgemini
Vignesh has spent nearly two decades in the Consulting, Engineering, and IT services space with a specialized focus on Manufacturing organizations. He is passionate about Technology and digitalization, and how they can transform the human experience and enrich lives. In his current role, he helps our strategic customers realize their digitalization roadmap fueled by Innovation and state-of-the-art technologies with a strong focus on decarbonization. He strongly believes that unleashing human potential through technology is the only way to a sustainable future for humanity and that Semiconductor organizations will lead from the front in this transformational journey.

Sanjiv Agarwal

Global Semiconductor Lead, Capgemini
With about 30 years of experience in the TMT sector, Sanjiv is experienced in enabling digital transformation journeys for customers using best-of-breed technology solutions and services. In his current role as global semiconductor industry leader, he works closely with customers on producing sustainable technology, driving the use of AI/ML, digital transformation, and the global supply chain.

Shiv Tasker

Global Industry Vice President (ER&D), Technology, Media and Telecom at Capgemini
With more than three decades of executive management, sales, and marketing experience in the hi-tech sector, Shiv possesses a proven track record of helping SaaS organizations scale by building high-performing sales teams. During the course of his career, he spearheaded the growth of a startup, elevating it to over $100 million in annual recurring revenue (ARR) within a four-year period.

    Welcome to where intelligence transforms everything at Google Cloud Next

    Genevieve Chamard
    18 Aug 2023

    Having a finger on the pulse of technology is critical. But with so much happening so quickly, keeping up can be a struggle. So, I’m truly excited for Google Cloud Next 2023.

    Google Cloud Next ’23 is the flagship annual conference where I’ll be joining my team at Capgemini and some of the brightest minds in the field. We’ll be both experiencing and sharing the latest innovations, technology, and trends from industry experts and global business leaders. 

    If you’re planning to attend, I invite you to join me there for exclusive insights and transformative opportunities tailored just for you. This includes immersive industry demos, live podcast episodes and speaking sessions with our clients – exploring how Capgemini and Google Cloud work together to transform businesses like yours every day.

    Here’s a look into two of the spotlight sessions:

    Elevate your possible with responsible Generative AI 

    More than just a buzzword, generative AI (GenAI) is making a tangible impact on businesses across all industries. I’m delighted to share with you a glimpse of Capgemini’s journey with GenAI at Google Cloud Next.

    Get the inside story of how Capgemini specialists trained their teams on 250 generative AI use cases and built 52 client demos – all in the span of two months. You’ll explore practical applications and best practices for tangible outcomes, giving you a glimpse into the adoption journey.

    A leading US bank builds a next-gen enterprise with Google Cloud 

    Explore the journey taken by a leading US bank to become a data-master enterprise. The company adopted a data-driven strategy and accelerated product and service development while capitalizing on market trends, better serving its customers, and getting an edge over the competition. 

    I’m excited to share with you three immersive experiences available at our booth, showcasing the advantage of leveraging Google Cloud’s innovations in three domains.

    Explore the combined power of human intelligence, cloud, and Generative AI 

    • Financial services: Witness how the home-insurance sector is adapting to Google Cloud technology, creating personalized payment models based on user behavior, regardless of their risk profile 
    • Retail: Experience how customers can receive tailored recommendations and interact with audio conversations to self-checkout smartly and conveniently – powered by generative and conversational AI 
    • Automotive: Drive your imagination, literally. With AD SHORTY you’ll explore how autonomous vehicles will be able to take drivers to new areas and terrains, and not just through common roads and freeways.  

    Capgemini’s booth will also host the Cloud Realities podcast; join over 100,000 listeners as Google experts discuss key trends, challenges, and opportunities for organizations, exploring sustainability, cybersecurity, and business transformation. Tune in to gain practical advice on navigating large-scale cloud transformations from our Chief Cloud Evangelist and Chief Architect for Cloud, Dave Chapman.

    So, if you’re attending, please drop me a message on LinkedIn, or find me at booth #1215 and let’s discuss the possibilities in the world of cloud technology.

    Author

    Genevieve Chamard

    Global AWS Partnership Executive
    Genevieve is an expert in partnership strategy at a global level, with 13 years of innovation and strategy consulting experience. Teaming up with partners and startups, Genevieve helps translate the latest, bleeding-edge technologies into solutions that create captivating new customer experiences, intelligent operations, and automated processes. She specializes in global partnership strategy and management, go-to-market and growth strategy, industry vertical solution builds, pilot definition and management, and emerging technology and start-up curation.

      Migrating your SAP to the cloud?
      The most important step is before you begin

      Devendra Goyal
      11 Aug 2023

      Migrating SAP to the cloud can be a daunting task for any organization, and it requires a significant investment of time, resources, and expertise.

      As with any major undertaking, there are challenges at every step of the way, from planning and preparation to execution and beyond. That’s why it’s important to partner with an experienced team that has the SAP and cloud expertise needed to ensure a successful migration.

      Why is a partner necessary for an SAP cloud migration?

      Your company likely already has a team with SAP expertise, cloud expertise, automation experience, and project management skills. So why add the cost and hassle of an external partner? There are a few reasons. A good partner knows what to expect, and when. Your partner will keep this project on track, no matter what else is going on in your organization. You have many tasks; your partner has one – getting your SAP up on the cloud, and doing it as efficiently as possible.

      An SAP and cloud partner with experience

      When considering a partner for your SAP migration, there are several factors to keep in mind. One of the most important is SAP experience. You’ll want to work with a partner that has a proven track record in your specific industry and with the SAP products you use. Look for a partner that can provide references and case studies demonstrating their ability to deliver successful SAP projects.

      An SAP and cloud partner with expertise

      Another key factor is cloud expertise. Your partner should have deep expertise in the cloud providers and native tools that you plan to use, as well as a good understanding of cloud best practices. They should be able to help you select the right cloud infrastructure for your needs and ensure that your SAP applications are optimized for performance and scalability.

      Automation and offers can also be important differentiators when choosing a partner for your SAP migration. Look for a partner who can offer automation tools to streamline your migration and reduce the risk of errors. Additionally, a partner who provides packaged offers or services will help simplify your migration and reduce costs.

      What else comes with experience and expertise?

      There are numerous other attributes that come with experience and expertise. One of these is sound competency: a well-rounded team should have diverse skills and experience in SAP, databases, operating systems, cloud, data migration, security, and compliance, ensuring that all aspects of your migration are addressed. Connected with competency is project management. Nothing is more frustrating and discouraging than a poorly managed project. Look for a partner who has a detailed plan and methodology in place, with a clear timeline and a risk management strategy. A partner with strong quality control processes can ensure that all aspects of your migration are thoroughly tested and validated early – when changes are still easy.

      The stamp of approval

      Finally, don’t overlook the value of certifications and partnerships when choosing a partner for your SAP to cloud migration. Certifications should be relevant, and partnerships should include hyper-scalers and SAP vendors. These certifications and partnerships can ensure that your migration is completed to the highest standards, and your partner has access to the latest tools and resources. (You do NOT want to finish your migration only to realize that it’s already a year behind the times.)

      Beyond the migration itself, long-term success depends on ongoing support and maintenance for your SAP applications in the cloud. It’s therefore essential to choose a partner with the capability to provide business-as-usual (BAU) support, so that any issues are quickly addressed and your applications continue to run smoothly.

      Learn more about our cloud offers on our website or contact us here to share your experience and questions. 

      Author

      Devendra Goyal

      Head – Global S2C Offer & Transformation Delivery

        Unleashing the data mesh revolution: Empowering business with cutting-edge data products

        Dan O’Riordan
        9th August 2023

        The principles of data mesh have moved beyond being just theoretical concepts for data architects and forward-thinking executives. It’s time to start delivering on data mesh’s promise of exceptional data products. Data mesh principles can help us uncover the valuable insights that businesses need.

        Feedback Fusion: The power of continuous iteration for product success

        When building a product, it’s crucial to understand the utility of the product and how any changes to the product will impact its utility over time.

        If we consider building a mobile phone or any other product, the cost of building a phone that is unusable will be significant. Therefore, conducting thorough research in the beginning to understand what the market wants is critical before beginning the product-development process.

        Once we have built and distributed a phone, we need to continually consider feedback from different channels, including social media and online reviewers, to continuously iterate and improve the phone.

        This feedback loop is just as imperative for data; however, in the past, data developers have typically waited for feedback from data consumers and then reacted. This has introduced time delays and, ultimately, frustration for data consumers.

        With product thinking, this approach is turned on its head: data product developers continuously monitor both quantitative and qualitative feedback from consumers.

        This feedback allows data product teams to proactively evolve the data product to ensure that as data consumers need new capabilities, they are being built into the data product, thus avoiding delays and frustration, and enabling better outcomes for the organization.
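        As a minimal sketch of the quantitative side of such a feedback loop, the following records consumer queries against a data product and surfaces a signal when the error rate breaches a threshold, so the product team can act before consumers complain. The metric names and thresholds are invented for illustration, not taken from any specific product’s telemetry.

```python
from dataclasses import dataclass, field

@dataclass
class DataProductMonitor:
    """Minimal sketch of proactive feedback monitoring for a data product:
    record each consumer query and surface signals before users complain."""
    error_rate_threshold: float = 0.05
    queries: int = 0
    failures: int = 0
    consumers: set = field(default_factory=set)

    def record_query(self, consumer: str, ok: bool) -> None:
        self.queries += 1
        self.consumers.add(consumer)
        if not ok:
            self.failures += 1

    def signals(self) -> dict:
        error_rate = self.failures / self.queries if self.queries else 0.0
        return {
            "error_rate": error_rate,
            "needs_attention": error_rate > self.error_rate_threshold,
            "distinct_consumers": len(self.consumers),
        }

monitor = DataProductMonitor()
for i in range(90):
    monitor.record_query(f"team-{i % 3}", ok=True)   # healthy traffic
for _ in range(10):
    monitor.record_query("team-0", ok=False)         # a breakage begins
status = monitor.signals()
```

        A real implementation would feed these signals into the product team’s backlog alongside qualitative feedback, so capability gaps are addressed as consumers’ needs evolve rather than after complaints arrive.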

        Data mesh dilemma: Embracing innovation amidst fear and uncertainty

        Data mesh principles, which center on the notion of first-class data products among other factors, have gained an unprecedented amount of interest in the past eighteen months. The conversation in the data mesh community has largely focused on the principles of data mesh and what they mean for each organization. Most organizations have invested heavily in cloud but are still struggling to keep up with the pace the business requires. “Why does it take me three to six months to get a new or modified dataset? Who’s responsible for data governance? How can I be sure the dataset can be trusted?” And the list of questions goes on.

        What we discovered during these conversations with clients is that there is broad acceptance that data mesh and its principles make good sense, but also a fear factor around the pain an organization must go through to reach the promised land of a truly federated data estate of quality, secured, discoverable data products. So most organizations have kicked the can down the road.

        Start small, think big, and design for industrialization

        Here are useful guidelines to help reduce this fear of failure.

        1. To effectively build data products, it’s crucial to identify the problem you’re trying to solve and determine why a data product is the appropriate solution from the beginning of the process. Taking the time to clarify the reasons behind your approach will ultimately save you a great deal of time, money, and effort. This fundamental step is applicable to any product-development process, and it’s no different when building data products.

        A simple data product canvas should be completed during this phase, with business and domain experts committed to it. Note: forget about technology entirely during this phase.

        2. Many organizations have not changed their approach to data management in the last 30 years. It is commonly believed that all data must be centralized into a data warehouse or data lake before it can be analyzed, which is difficult and costly in terms of both human resources and technology. Today, decision-makers wait for data to be made available before it can be used. This means waiting for data pipelines to be specified and built, typically without complete knowledge of the value of the data to a particular use case. This unnecessarily elongated process is fragile and hurts an organization’s ability to compete using data.

        Fortunately, solutions like Starburst/Trino offer intelligent connectors and a highly optimized federated MPP SQL engine that enables the creation of data products by analysts in the lines of business (domains) with no need for intimate knowledge of the source technology. Lines of business can quickly access data and determine its applicability to a use case without having to rely on central data teams.

        If we consider this in the context of cloud-data migrations, solutions like Starburst/Trino enable these data products to be created, managed, and retired while the underlying data platforms are migrated. The system administrators only need to update the connector to ensure uninterrupted service for business users. With Starburst we want to give the data-product teams the option to decide on what works best for them to deliver the best data product that will satisfy the requirements as outlined by the data product canvas.
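        To illustrate the federated idea at toy scale, the sketch below uses SQLite’s ATTACH as a stand-in for a federated engine: two independent data stores are joined in a single SQL statement without first copying either into a central warehouse. Trino/Starburst does this across heterogeneous catalogs at enterprise scale; all schemas and data here are invented for illustration.

```python
import os
import sqlite3
import tempfile

# Two independent "sources": an orders store and a customer (CRM) store.
tmp = tempfile.mkdtemp()
orders_db = os.path.join(tmp, "orders.db")
crm_db = os.path.join(tmp, "crm.db")

with sqlite3.connect(orders_db) as con:
    con.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(1, 10, 99.0), (2, 11, 25.0), (3, 10, 50.0)])

with sqlite3.connect(crm_db) as con:
    con.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    con.executemany("INSERT INTO customers VALUES (?, ?)",
                    [(10, "Acme"), (11, "Globex")])

# "Federated" query: join across both sources in one SQL statement,
# without moving either dataset into a central store first.
con = sqlite3.connect(orders_db)
con.execute(f"ATTACH DATABASE '{crm_db}' AS crm")
rows = con.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o JOIN crm.customers c ON c.id = o.customer_id
    GROUP BY c.name ORDER BY total DESC
""").fetchall()
con.close()
```

        In a real federated engine, the equivalent of each ATTACH is a catalog backed by an intelligent connector, so the underlying platforms can be migrated or swapped while the data product’s query interface stays stable.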

        3. Finally, to ensure the quality of data products is maintained over time as business needs change, a continuous monitoring and feedback loop is key. Data product producers need to understand who is using their data product, how, and for what purpose, so they can manage it proactively. This requires technology capabilities to provide that insight, as well as an agile approach to streamline the pipeline from ideation to production and constantly improve efficiency. We view this as building a factory-like model for data products.

        Data mesh in action

        At online fashion retailer Zalando, various lines of business independently utilize Amazon S3 for storing and managing datasets, eliminating the need for a central data team. A central data “enabling team” oversees data-governance standards and identifies reuse opportunities, while a dedicated platform team supplies compute services including a distributed SQL Engine (Starburst) for analytics. This clear division of responsibilities – lines of business managing data, the enabling team governing it, and the platform team providing technology – prevents bottlenecks and centralization, fostering agility in leveraging data to maintain a competitive edge.

        A prominent French state organization has been devising its data-estate roadmap for 2025 over the past year. Its current extensive data platform comprises batch processing, streaming processing, AI, and use cases, with concerns about cloud readiness. With a complex data estate plagued by performance and monitoring issues, its goal is to streamline operations using a new data platform based on Starburst and Apache Iceberg. The primary objective is simplification and reduced complexity, achieved by focusing on business outcomes and scaling with data-mesh principles.

        “Start small, think big and design for industrialization.”

        Dawn of a new era

        The rise of data mesh and its principles plus the technical offerings from Starburst marks the dawn of a new era for data products. As businesses embrace the principles of data mesh, it’s essential to address the fear factor associated with adopting this approach. By following the guidelines outlined in this article – focusing on identifying the problem to be solved, leveraging modern solutions like Starburst/Trino for data management, and implementing continuous monitoring and feedback loops – organizations can confidently embark on their journey towards a truly federated data estate. Success stories like Zalando and the large French state organization demonstrate the transformative power of data mesh in improving efficiency, agility, and competitiveness. As we move forward, it’s crucial for businesses to embrace the promise of data mesh, shifting from theoretical discussions to real-world implementation. Only then will they be able to harness the full potential of exceptional data products and uncover the valuable insights needed for sustained success in an increasingly data-powered world.

        INNOVATION TAKEAWAYS

        OVERCOMING ADOPTION HURDLES IN A FEDERATED DATA ESTATE

        Data mesh principles enhance data-product creation, driving valuable insights and competitiveness, but adoption is slowed by perceived challenges in achieving a federated data estate.

        THE THREE PILLARS OF EFFECTIVE DATA MESH IMPLEMENTATION

        Implementing data mesh effectively involves problem identification, utilizing modern data-management solutions, and establishing continuous monitoring and feedback loops.

        DATA MESH IN ACTION

        Success stories like Zalando and a large French state organization showcase the benefits of data mesh, including improved efficiency, agility, and competitiveness.

        BRIDGING THE GAP: PRACTICAL STEPS TO DATA MESH SUCCESS

        Moving from theory to practice in data-mesh implementation allows organizations to better harness data-product power and succeed in a data-powered world.

        Interesting read?

        Capgemini’s Innovation publication, Data-powered Innovation Review | Wave 6, features 19 such fascinating articles, crafted by leading experts from Capgemini and key technology partners like Google, Starburst, Microsoft, Snowflake, and Databricks. Learn about generative AI, collaborative data ecosystems, and an exploration of how data and AI can enable the biodiversity of urban forests. Find all previous Waves here.

        Dan O’Riordan

        VP AI & Data Engineering, Capgemini
        A visionary with the architectural skills, experience, and insight to move any application, computing platform, infrastructure, or data operation to the cloud. He works regularly with the CxOs of large enterprises across different industries as they embark on digital transformation journeys. A key part of digital transformation requires an organization to be data-centric. Organizations on their cloud journeys have started to migrate applications, and are now also looking at how to migrate their data operations and then build and deliver data services using the latest AI and ML services from the cloud service providers.

        Andy Mott

        Partner Solution Architect, Starburst
        With more than 20 years of experience in data analytics, Andy Mott is skilled at optimizing the utility of analytics within organizations. When determining how to generate value or fortify existing revenue through technology, Andy considers the alignment of an organization’s culture, structure, and business processes, ensuring that its strategic direction will ultimately enable it to outcompete its respective market with data. Andy is currently EMEA head of partner solutions architecture and a data mesh lead at Starburst, and lives in the United Kingdom.

          Navigating the complexity of enterprise asset management in the energy and utilities sector

          Mark Hewett
          Aug 8, 2023

          Enterprise Asset Management (EAM) is a crucial component of any asset-intensive industry, and as technology continues to evolve, managing assets effectively and efficiently is becoming increasingly complex.

          In this post, we’ll explore several key areas of EAM in energy and utilities, including connection volume, IT/OT convergence, commissioning and decommissioning, and the circular economy, as well as security considerations.

          Key Challenges:

          The sustained increase in connections to the grid poses a significant challenge in keeping a grid operational and balanced. It also poses a challenge for those who are responsible for tracking and maintaining these assets. The more assets there are to manage, the more difficult it becomes to ensure that they are all tracked, monitored, and maintained effectively. This leads to increased costs, reduced efficiencies, and potential downtime for critical infrastructure assets.

          How to address this challenge:

          To address this challenge, COOs and operations directors are turning to new technologies to collect real-time data about their infrastructure’s performance and health, which can then be analyzed to identify potential issues before they become major problems. These digital technologies can provide operations managers with insights and actionable information that they can use to optimize asset management and operational processes. Used intelligently, these platforms can “predict” failures and support interventions in the network that drive down cost while also reducing operational impacts (e.g., outages or leakage).

          By leveraging these digital technologies and exploiting an improved operational awareness of the network, asset and operations managers can improve efficiencies, reduce costs, and ensure that all assets are tracked and managed effectively.
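As a minimal sketch of the “identify potential issues before they become major problems” idea, the toy below flags sensor readings that deviate sharply from their recent rolling baseline. The window, threshold, and temperature figures are illustrative assumptions, not a production condition-monitoring model.

```python
from statistics import mean, stdev

def watch(readings, window=8, threshold=3.0):
    """Flag readings that deviate sharply from the rolling baseline of the
    previous `window` samples -- a toy stand-in for predictive monitoring."""
    alarms = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and abs(readings[i] - mu) > threshold * sigma:
            alarms.append(i)
    return alarms

# Hypothetical transformer temperature (deg C): steady, then a sudden spike
# of the kind an operator would want to investigate before it becomes an outage.
temps = [60.1, 60.3, 59.8, 60.0, 60.2, 59.9, 60.1, 60.0, 74.5, 60.1]
print(watch(temps))  # [8] -- the 74.5 reading trips the alarm
```

Real systems would layer forecasting, asset-specific failure models, and maintenance scheduling on top, but the core loop – baseline, deviation, alert – is the same.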

          IT/OT convergence:

          The convergence of IT and OT in asset management refers to the integration of two distinct areas of technology – information technology (IT) and operational technology (OT) – to create a consistent digital ecosystem that enables organizations to manage their infrastructure efficiently. IT and OT have traditionally been separate disciplines (under separate management, through CIOs and engineering directors, respectively), with IT focused on managing data and information systems while OT focused on managing physical assets, infrastructure, and processes. As digital transformation has accelerated, there has been growing recognition of the need to integrate these two areas to create a more holistic view of assets and infrastructure across the organization – which in turn drives changes to organizational structures, governance, and processes.
          By integrating IT and OT systems, organizations gain a more complete picture of their capital and IT assets in the context of the operational needs of the business, including data on performance, maintenance, and utilization. This enables organizations to identify areas for improvement, optimize asset utilization, and reduce downtime and maintenance costs. Furthermore, IT/OT convergence also enables organizations to align their asset management goals with their overall business objectives. By connecting asset management to business objectives, organizations can ensure that assets are managed in a way that supports their strategy and objectives.

          Commissioning and decommissioning:

          Commissioning and decommissioning are critical processes within infrastructure operations management that can have a disproportionate impact on a company’s strategy, objectives, and ultimately its business results. Bringing new assets online, ensuring they are functioning correctly, and retiring assets at the end of their useful life are demanding and involved operational procedures. These processes require careful planning and management to ensure that assets are brought into operation and removed from operation effectively and efficiently, without disrupting the rest of the network, and they are key considerations for the efficient running of any grid infrastructure.
          Commissioning involves a series of tests and checks to ensure that new assets are functioning correctly and safely. This process can involve everything from checking electrical and mechanical systems to verifying that the asset meets regulatory and safety standards. The commissioning process is critical to ensuring that assets are safe to operate and will perform as expected in the operational environments they are intended to be used in. Digital system integration is a core enabler to the smooth running of an effective commissioning process.
          Decommissioning, on the other hand, involves retiring assets that are no longer needed or have reached the end of their useful life. This process can involve everything from removing equipment and disposing of hazardous materials to shutting down systems and securing the site. Proper decommissioning is critical because it ensures that assets are retired safely and efficiently, reducing the risk of accidents and the impact on the environment. And in a digitally enabled environment, this also reduces the impact on other operational assets as dependencies are easier to identify and manage accordingly.

          Operations managers leverage data and analytics to gain insights into their infrastructure assets’ performance and make informed decisions about the risks associated with commissioning and decommissioning. By tracking performance metrics and analyzing data on asset utilization, maintenance costs, and downtime, operational managers can identify opportunities to optimize the timeline for when an asset should be decommissioned from operation. To manage this, operational managers must balance the risk of keeping an asset operational against the cost of replacing the asset with newer, more efficient equipment, minimizing impacts on customers and business operations alike.
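The keep-versus-replace balance described above can be reduced to a toy expected-cost comparison. All figures, the straight-line amortization, and the single-year framing are illustrative assumptions – a real asset model would weigh many more factors (customer impact, dependencies, regulatory commitments).

```python
def keep_or_replace(failure_prob, outage_cost, upkeep,
                    replacement_cost, life_years, new_upkeep):
    """Compare the expected yearly cost of keeping an aging asset against
    replacing it with newer equipment (replacement amortized over its life)."""
    keep = failure_prob * outage_cost + upkeep
    replace = replacement_cost / life_years + new_upkeep
    return ("replace", replace) if replace < keep else ("keep", keep)

# Hypothetical aging transformer: 12% yearly failure risk, costly outages.
decision, yearly_cost = keep_or_replace(
    failure_prob=0.12, outage_cost=400_000, upkeep=25_000,
    replacement_cost=300_000, life_years=20, new_upkeep=8_000)
print(decision, yearly_cost)  # replace 23000.0
```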

          Related to the decommissioning process, the circular economy is an emerging trend in EAM that involves designing products, solutions, and systems with a focus on sustainability and circularity. The idea is that operational assets can be retired to secondary or tertiary operational roles or returned to the manufacturer for reconditioning or material reuse. It aims to minimize waste and the consumption of natural resources by keeping materials in use for as long as possible, through reusing, repairing, refurbishing, and recycling.

          Circular economy:

          In a circular economy, assets are designed and managed to ensure their long-term value, with a focus on minimizing waste and reducing environmental impact. Asset managers play a key role in promoting the circular economy by commissioning assets that are designed for maintenance and that support circularity and by implementing effective recycling and repurposing programs. If these requirements are not specified and driven hard through design and implementation, then the assets delivered and deployed are destined for landfill.
          Furthermore, asset managers can implement effective recycling and repurposing programs to ensure that materials are reused and recycled at the end of the asset’s lifecycle. This can involve everything from implementing a recycling program for electronic waste to repurposing old equipment for use in other applications.

          Security Considerations:

          Lastly, the most overlooked element in most operational environments is security. We have found it to be a critical consideration in EAM, particularly as assets become more connected and digitally enabled. COOs and operations directors need to ensure that all assets are protected from cyber threats and other security risks that cause financial, reputational, and operational damage to their business.
          To ensure that assets are protected from security threats, operations managers should implement security protocols such as zero-trust security methodologies, which assume that all devices are potentially compromised and implement measures to verify the identity of users and devices before granting access to the wider infrastructure. Other security measures can, and should, include network segmentation, access control, data encryption, and intrusion detection and prevention systems.

          Operations managers can also incorporate regular security patching and maintenance procedures into asset management processes to ensure that vulnerabilities are addressed promptly. This can involve everything from updating firmware and software to conducting regular vulnerability assessments and penetration testing. They should also prioritize security by incorporating security considerations into asset design and asset selection processes.

          Conclusion:

          In conclusion, EAM is an increasingly complex and evolving field that requires careful planning and management to ensure that assets are used effectively, safely, and efficiently throughout their lifecycle. By leveraging technologies such as IoT sensors and advanced analytical tools, embracing IT/OT convergence, prioritizing security, and promoting supply chain circularity, COOs and operations directors can optimize asset management processes and achieve greater success in the digital age. If one or more of the topics we touched on in this blog is of interest to you, or you are curious to know more, please watch these videos, in which our SMEs Sven Strassburg (from our IBM partnership) and Capgemini’s Mark Hewett discuss these very topics with a focus on the Energy Transition and Utilities sector.

          Co-authored by Mark Hewett, Sven Strassburg and Woody Falck.

          Authors

          Mark Hewett

          Vice President | Energy and Utilities
          As Vice President for our Energy Transition and Utilities team in the UK, I have a strong focus on energy networks and the intelligent transformation of network businesses across the UK to meet the challenges of the future. As a chartered engineer and former Army officer, I worked across several sectors – including global high tech, the public sector, and aviation – before finding my home in Energy Transition and Utilities.

            Software-defined vehicles (SDV): The answer to truck driver shortages?

            Fredrik Almhöjd
            Aug 2, 2023

            Although most truck OEMs acknowledge software-defined vehicles as a new norm for the commercial vehicle industry, they still need to convince their customers that these vehicles will add value to their businesses – especially around the top three objectives of improved uptime, productivity, and fuel efficiency.

            SDVs have a major role to play in helping fleet operators overcome the international shortage of truck drivers, explain Fredrik Almhöjd and Jean-Marie Lapeyre, Chief Technology & Innovation Officer, Global Automotive Industry at Capgemini. That’s because SDVs can transform the driver experience, potentially attracting younger people and women who currently don’t see truck-driving as a career option.

            “Without action to make the driver profession more accessible and attractive, Europe could lack over two million drivers by 2026, impacting half of all freight movements and millions of passenger journeys.” That is the stark prediction of the International Road Transport Union (IRU), commenting on a study it conducted in 2022. The outlook isn’t any more reassuring in other regions.

            So what are transportation companies to do, and how can truck OEMs help? In this article, we’ll argue that software-defined vehicles (SDVs) could be a big part of the answer. We’ll be building on ideas from earlier blogs.

            In the passenger car market, the concept of SDVs is often promoted on the basis that it will create a better customer experience for the driver. For commercial vehicle fleet operators, by contrast, the main focus has always been, and will continue to be, on total cost of ownership (TCO). Until recently, efforts to improve life for the driver, while important, have received less attention.

            However, with driver shortages becoming critical, truck-driving needs to be made more attractive to jobseekers. The IRU suggests that attracting more women and young people is an important part of the solution – but current working conditions make that difficult.

            SDVs can help with the challenge of recruiting and retaining staff.

            SDV features can make drivers’ lives better

            So what SDV features might improve driver experience? Truck drivers will enjoy many of the same benefits as car drivers, such as customized infotainment – though obviously, this must not distract them from the job.

            Consumer-oriented SDV features can be tailored for trucks. For example, a framework for companion apps on smartphones could be adapted to support the needs of HGV drivers in finding places to stop, eat, and sleep, avoiding illegal and dangerous use of phones while driving. In addition, although software can’t improve the quality of facilities available to drivers, it can help direct them to the most satisfactory ones based on a driver’s personal preferences and ratings by other users.

            With all the functionality they need integrated and automated (and configured for personal habits and preferences that they have already stored), the job can be done safely, easily, and legally. Similar technology could be used to help last-mile delivery drivers navigate between stops.

            Integrate drivers’ digital lives

            Many people, especially younger ones, now expect their digital lives to be streamlined and integrated across work and leisure. To appeal to these individuals, SDVs could be equipped to remember drivers’ preferences regarding infotainment modes and transfer them across trucks. Their preferred smartphone apps, or similar ones, could also be made available via the truck’s console.

            By integrating various aspects of working life, we can make the driver’s job easier, as well as more pleasant. A common complaint from truck drivers is that they have to unload cargo themselves because there is nobody else to do it. An SDV can contact the destination to communicate the arrival time and nature of the cargo, increasing the chances of the relevant staff being on hand with the right equipment.

            When a truck is an SDV, advanced driver assistance system (ADAS) features can easily be added. Some of these features can help to make truck-driving more attractive to younger people and women by allowing multiple tasks to be performed simultaneously. Stress levels for the driver are reduced significantly if they can organize their working day – including route optimization and scheduling of pickups and deliveries – while they’re on the road. This can be achieved through partial automation of driving tasks, whether via assistance systems or fully autonomous driving (say, up to level 4), paired with services that help with routing and scheduling.

            Overcome negative perceptions of truck-driving careers

            For women in particular, personal safety issues can be a deterrent to working as a truck driver. Connected vehicle software can help here too. For example, AI-enabled services can monitor sensor data and warn when someone is approaching a stationary truck, and biometrics can control who has access to the cabin. Predictive maintenance can reduce or eliminate the risk of breaking down in a lonely spot. (And with SDVs, we can go beyond preventive maintenance via telematics and alerts to their natural successor, self-diagnosis by the vehicle.)

            Thanks to SDV connectedness, dispatchers can more easily monitor drivers’ safety and send help if needed. The same communications facilities could streamline interaction between communities of drivers who can look out for one another, reducing any sense of isolation.

            Long hours away from home are another turn-off for many potential drivers. SDVs’ communications technologies can improve their work-life balance, with social media style software, in-vehicle display screens, and cameras keeping the driver in touch with family or friends during stops.

            Work-life balance can be further improved by advanced route optimization techniques. An SDV route can be automatically optimized to accommodate a driver’s personal preferences and constraints, as well as requirements such as refueling and rest stops. It can then be continuously adjusted to reflect the current circumstances such as weather and traffic conditions, helping drivers to finish work on schedule.
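As one simplified illustration of the constraint handling behind such route optimization, the sketch below inserts mandatory rest breaks into a planned sequence of driving legs. It is loosely inspired by the EU 4.5-hour/45-minute driving-time rule; weather, traffic, and personal preferences – the factors the article says should also feed in – are omitted, and it assumes no single leg exceeds the driving limit.

```python
def schedule(legs_h, max_drive=4.5, break_h=0.75):
    """Build a drive/break plan: insert a rest break whenever the next leg
    would push continuous driving past `max_drive` hours."""
    plan, since_break = [], 0.0
    for leg in legs_h:
        if since_break + leg > max_drive:
            plan.append(("break", break_h))
            since_break = 0.0
        plan.append(("drive", leg))
        since_break += leg
    return plan

# Four hypothetical legs (hours); a break is slotted in before the third.
print(schedule([2.0, 2.0, 1.5, 3.0]))
```

A production planner would re-run this continuously as conditions change, which is exactly the "continuously adjusted" behavior described above.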

            Deliver better driver experience and financial benefits for fleet operators

            Despite their urgent need to recruit more drivers, at the end of the day truck buyers are still likely to focus on the more tangible benefits of SDVs. The good news is that many of the features that give drivers a better experience simultaneously increase productivity, uptime, or fuel efficiency – for example, predictive maintenance and real-time route optimization, both mentioned above.

            The same is true of services that address electric vehicles’ range limitations and shortages of charging stations (as discussed in our recent e-mobility blog). Suppose the truck’s battery is getting flat, and the nearest charging station has a long wait time. An SDV can save energy in various ways: for example by modifying engine parameters or environmental settings such as aircon, or by advising changes in driving behavior. With these adjustments, the driver can continue to a charging station with an acceptable wait time, improving productivity and likely reducing frustration too.

            Safer driving is yet another example of an SDV capability that benefits both employer and driver. Examples here include the use of sensors to detect when vehicles get too close to one another, or when drivers are tired and need a break. For example, a truck could raise an alert when its driver is blinking more frequently than is normal for them, indicating exhaustion.
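The blink-rate idea can be sketched as a comparison against the driver's own baseline, matching the "more frequently than is normal for them" framing above. The 1.5x factor and the figures are illustrative assumptions, not validated fatigue science.

```python
from statistics import mean

def fatigue_alert(baseline_history, current_rate, factor=1.5):
    """Alert when the blink rate (blinks/min) exceeds this driver's own
    rested baseline by an illustrative margin."""
    baseline = mean(baseline_history)
    return current_rate > factor * baseline

rested = [14, 16, 15, 15, 14, 16]   # this driver's usual blinks/min
print(fatigue_alert(rested, 15))    # False -- normal for them
print(fatigue_alert(rested, 26))    # True  -- time to suggest a break
```

Keying the threshold to the individual, rather than a fleet-wide average, is what makes the alert personal rather than nagging.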

            Make driver appeal part of the business case for SDVs

            For truck OEMs and tier 1s, the case for SDVs is clear. They can enhance revenue flows via a shift from one-off purchases to full lifecycle engagement, and improve automotive sustainability performance, for example by reducing waste in R&D processes. Ultimately, SDVs can help to make the brand central to customers’ businesses. In addition, selling SDVs makes sense as part of the journey to autonomous driving and in the context of companies’ overall digital transformation.

            Software-defined vehicles as passenger cars

            SDVs are already proving their worth in the passenger car market, where improved driver experience is a more obvious selling point. (Read our “point of view” report on software-driven transformation for more.)

            An excerpt from a recent Connected Mobility infographic – please download the full version here

            The question is how to demonstrate the value of SDVs to truck customers such as fleet operators. Industry concepts such as software-driven transformation are not always much help here. Instead, OEMs can point to the business benefits that result from SDV adoption. And right now, improved driver experience could be among the most important of those benefits because of its ability to help overcome driver shortages.

            For more information, visit the commercial vehicles area of Capgemini’s website, and read the earlier articles in this blog series.

            About Author

            Fredrik Almhöjd

            Director, Capgemini Invent
            Fredrik Almhöjd is Capgemini’s Go-to-Market Lead for Commercial Vehicles in the Nordics, with 25+ years of sector experience plus extensive know-how in Sales & Marketing and Customer Services transformation.

            Jean-Marie Lapeyre

            EVP and Chief Technology & Innovation Officer, Global Automotive Industry
            Jean-Marie Lapeyre works with automotive clients to develop and launch actionable technology strategies to help them succeed in a data and software-driven world.

              Expert Perspectives

              How a system-based modular approach minimizes risk and accelerates SaMD go-to-market​

              Capgemini
              27 Jul 2023

              In the field of healthcare systems compliance, the difference between 99% and 100% is a chasm.

              “Software defect.” “False upstream alarms.” “Firmware error.” “Login error.” “Software bug.” “Sensor failure.” “Cybersecurity vulnerability.” The FDA list of software-related recalls is sobering reading. We see the story behind each event – the teams that worked overtime designing and building a new device, quality assurance professionals checking each and every vulnerability, last-minute adjustments, the elation of a seemingly successful release… And all the while, a tiny flaw lay hidden from sight, with the power to derail everything, compromising safety and exposing liabilities.

              As a legal manufacturer, we specialize in safety and compliance for Software as a Medical Device (SaMD). It’s a fast-growing field, with risks hiding in every nook and cranny. How do we make sure we catch and neutralize every risk? How do we ensure safety and compliance, without sacrificing speed? Let’s dive in. ​

              Compliance and agility by design​

              There’s a common fear that safety and compliance slow down go-to-market. A justified concern? Yes and no. Of course, extra steps take extra time. That’s why, wherever possible, we work to ingrain safety and compliance considerations into every process. For example, a remote patient monitoring system that records patient data needs to do many things: some are basic functions; others cross over into the “safety” category. Safety and compliance are mandatory and are integrated into the development process. Pharma and MedTech companies will speed development and improve quality if they adopt two changes: agile processes combined with modularity by design. To see how modularity adds value, let’s look closer at the challenges SaMD teams face.

              Challenges of connected systems ​

              With today’s more connected and complex systems, the boundaries of what constitutes “Software as a Medical Device” are often blurred. For example, when a system is distributed – with parts of an application running on a wearable, a mobile phone, or in the cloud – and combines medical and non-medical functions, is the whole system SaMD? How do we manage a combination of safety, non-safety, administrative, and other functions that all need to work together to fulfill a medical purpose? If the system is developed as a monolith, getting regulatory approval takes more effort, and modifying, improving, or adding functions after the system has received initial market authorization carries an additional compliance burden. What’s the solution?

              The benefits of modularization for connected health ​

              A pragmatic approach to accelerate SaMD development, manage safety and regulatory complexity, and maintain flexibility, is modularization. We segregate SaMD products by function and by risk category (high, medium and low risk). This makes it possible to apply the appropriate level of risk control and testing measures in each case. Using pre-built, ready-to-use SaMD modules developed under certified processes and qualified tools, assures reliable, fast and compliant software. ​
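As an illustration of segregating by function and risk category, the sketch below maps hypothetical modules of a remote patient-monitoring product to risk classes, each carrying its own level of verification rigor. The module names and review gates are invented for illustration, not a regulatory scheme.

```python
# Verification rigor per risk class -- illustrative gates only.
RIGOR = {
    "high":   ["unit tests", "integration tests", "formal risk analysis",
               "independent review", "clinical validation"],
    "medium": ["unit tests", "integration tests", "risk analysis"],
    "low":    ["unit tests"],
}

# Hypothetical module inventory, segregated by function and risk.
MODULES = {
    "arrhythmia_alert": "high",    # safety function: a missed alarm can harm a patient
    "trend_dashboard":  "medium",  # informs clinicians, no direct alarm
    "user_login":       "low",     # administrative, no medical purpose
}

def verification_plan(module):
    """Return the testing/review gates required for a module's risk class."""
    return RIGOR[MODULES[module]]

print(verification_plan("arrhythmia_alert"))
```

The payoff is the one described above: changing the login screen no longer re-opens the verification burden of the alarm logic.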

              Modularization reduces regulatory complexity and – together with agile development models – speeds up time to market, while providing the flexibility we need to continuously adapt functions. Most critically, it reduces risk. The “hidden flaw” we talked about earlier – in a modular system there’s no place to hide, and agile development makes it possible to catch and correct flaws early in the development cycle.

              Reducing risk and regulatory complexity​

              We believe that risk is best managed when technical and regulatory responsibility go hand in hand. The closer a development team is to the consequences of success or failure – the more skin they have in the game – the more we count on them to scrupulously manage risk. We’ve been working in the SaMD field from the start, at the intersection of software, life sciences, and regulatory affairs, so taking regulatory responsibility for our work was a natural step. How to manage regulatory compliance is an important question for every innovator – a far-reaching question with many dimensions. For us, the most crucial dimension is the link between technical and regulatory responsibility.
              You can find more about our offer and our Legal Manufacturer capabilities on our webpage, and we’re also available to consult on any aspect of risk and compliance.

              When your innovations hit the market, dozens of factors affect their success. Avoidable mistakes should not be one of them. Let’s make your products flawless.  

              Meet our experts

              Andrew Koubatis

              Intelligent Medical Products and Systems Lead, Capgemini Engineering
              Providing Pharma and MedTech with service offers to accelerate and de-risk product development. “Intelligent products and systems allow us to break the traditional boundaries of the healthcare ecosystem, providing greater patient insights through data, more effective, reliable and personalized treatments, driving better outcomes and supporting value-based care with connected and interoperable technologies.”

              Frédéric Burger Ph.D.

              CTO Life Sciences and Regulatory Affairs, Global Life Sciences Center of Excellence Leader, Capgemini Engineering
              Leading the global life sciences portfolio and solutions in Pharma and Medical Devices. “This is undoubtedly a new stage in the use of data in the life sciences industry. With the combination of Regulatory Sciences and a clear strategy on Digital implementation, the data is now at the core of any new journey. We support our clients with expertise, strong assets, and methodologies for accelerating their transformation.”

                Sharing without showing: Data clean rooms allow for unprecedented collaboration

                Jennifer Belissent
                26 July 2023

                Imagine the potential for secure data collaboration. With the boundaries between different companies, organizations, and entire industries blurring, the use cases are endless. Organizations can perform joint data analysis and train machine-learning (ML) models while ensuring that confidential information will stay protected from their sharing partners. It’s all happening in the world of data clean rooms.

                Pharmaceutical companies can identify the best hospitals for clinical trials with a look-alike analysis against patient records. Insurance companies can collaborate to identify fraudulent claims. Media outlets can offer premium placement to advertisers to ensure targeted messaging. Loyalty programs can deliver truly personalized services across hotels, airlines, and other services. Telecom operators can collaborate with location data to enrich those personalized services. Emergency and social services can collaborate to help those in need.

                Yet in many cases, the relevant data is personal information, and protected by privacy laws and bonds of trust. How can that data be shared?

                The use cases for secure collaboration with data clean rooms are endless

                Imagine the following scenario.

                A crowd of spectators is watching a big game and the teams are tied. The tension mounts. The fans grow restless. He shoots. He scores! The roar of the crowd can be heard all the way down the neighborhood street. And all the consumer brands want to know who is watching and how to reach these audiences. Yet, these sports fans are watching the game in the privacy of their homes, and the network they’re watching on must legally protect their data.

                How can these media outlets share their viewer data – or the insights from it – without violating data protection laws and the trust of their subscribers?

                It turns out that a similar question was posed by an academic in the early 1980s. Professor Andrew Yao introduced the problem: Alice and Bob, both millionaires, want to know which of them is richer but neither wants to reveal his or her exact wealth. Through complex mathematical proofs, Yao’s Millionaires’ problem was solved, proving it is possible to share insights without showing the underlying data. Fortunately, modern methods do not require arduous manual calculations.
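To make “sharing without showing” concrete, here is a deliberately simplified toy: Alice and Bob agree on a secret order-preserving mask, and a referee compares only the masked values. This is not a cryptographically secure protocol (the masked gap still leaks information to the referee); Yao's actual solution, and modern clean rooms, rely on techniques such as garbled circuits, secure enclaves, or differential privacy.

```python
import secrets

def who_is_richer(alice_wealth, bob_wealth):
    """Toy Millionaires' problem: a referee learns only who is richer,
    never either fortune.  Alice and Bob share a secret positive scale
    and offset; since the mask is order-preserving, comparing the masked
    values gives the right answer."""
    scale = secrets.randbelow(10**6) + 1    # shared secret, strictly positive
    shift = secrets.randbelow(10**9)        # shared secret offset
    masked_a = scale * alice_wealth + shift  # what Alice sends the referee
    masked_b = scale * bob_wealth + shift    # what Bob sends the referee
    return "Alice" if masked_a > masked_b else "Bob"  # referee's only output

print(who_is_richer(7_000_000, 9_500_000))  # Bob
```

The underlying numbers never leave their owners in the clear – which is the whole promise of the clean-room model, delivered in production systems with far stronger guarantees than this sketch.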

                “Sharing without showing? You bet!”

                Increased demand for data sharing

                For potential advertisers or anyone who wants to collaborate with data, that’s great news. Data sharing and collaboration deliver business value. A recent Capgemini study, Data sharing masters, found that companies with collaborative data ecosystems reported better business outcomes including new revenues, reduced costs, increased productivity, and greater customer satisfaction. And that promise has spurred new data ecosystem initiatives.

Companies have long used their own data to better understand their customers or to improve operations. Increasingly, data teams turn to external data sources to enrich their internal data and enhance analytics. Budgets for external data are significant and growing. In a recent survey conducted by external data platform Explorium, 22 percent of respondents said they were spending more than $500,000 on external data, with 13 percent saying they spent more than $1 million (up from 7 percent in a similar survey in 2021).

                Customer data was the number one type of data acquisition: 52 percent purchased data on companies, followed by 44 percent purchasing demographic data. And the number of sources has grown as well: 44 percent of firms acquire external data from five or more providers. That’s up from only 9 percent the previous year. However, procuring external data is not without challenges, with regulatory constraints often topping the list. Concerns about GDPR or other privacy regulations loom large, and for good reason.

                Introducing modern data clean rooms

                Not long ago, data sharing meant copying and sending files to a partner. That practice certainly complicated data governance. Short of a manual audit, knowing who accessed the data and for what purpose was impossible. Now, using the principles demonstrated by Yao’s millionaires, two or more parties can derive insights from data without revealing the underlying information.

                With a Snowflake Global Data Clean Room, each party controls its own data, allowing governed, controlled analytics by other parties. That is to say, each party specifies who can access the data and for what purpose. Let’s take a look at how it would work with Yao’s two millionaires, Alice and Bob.

                First, each party creates a table with the data to be shared. Then one party, let’s say Bob, creates a table to store allowed statements. This is where the queries that Bob will allow another party to run against his data will be maintained. He then creates an access policy granting use of these statements and applies this access policy to his data table.

                Next, Bob defines the exact statement or query he will allow, and inserts it into his “allowed statements” table. The statement includes the comparison of their wealth and the answers that will be returned in each case: “Bob is richer,” “Alice is richer,” or “Neither is richer.” Finally, he grants Alice permission to access and use his data for only this specific purpose. Alice then asks the question in the form of the specified query and receives the response: Bob is richer. Sorry, Alice.
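The allowed-statements flow above can be sketched in a few lines of Python. This is a toy model of the pattern, not Snowflake's implementation: the `CleanRoom` class, its method names, and the wealth figures are all invented for illustration. The point it demonstrates is that Bob's data can only be queried through statements he has explicitly allowed, and only the agreed answer ever leaves the room.

```python
# Toy model of a data clean room's "allowed statements" pattern (illustrative only).
# Each party registers private data plus the exact queries others may run;
# any other request is rejected.

class CleanRoom:
    def __init__(self):
        self._tables = {}   # party -> private data (never exposed directly)
        self._allowed = {}  # data owner -> {statement name: query function}

    def register(self, party, data):
        self._tables[party] = data

    def allow(self, owner, name, query_fn):
        # query_fn receives the owner's data and the caller's data,
        # and must return only the agreed result.
        self._allowed.setdefault(owner, {})[name] = query_fn

    def run(self, caller, owner, name):
        if name not in self._allowed.get(owner, {}):
            raise PermissionError(f"{caller} may not run '{name}' on {owner}'s data")
        return self._allowed[owner][name](self._tables[owner], self._tables[caller])

room = CleanRoom()
room.register("bob", {"wealth": 5_000_000})
room.register("alice", {"wealth": 3_000_000})

# Bob allows exactly one statement: the comparison, returning a label only.
room.allow(
    "bob", "who_is_richer",
    lambda mine, theirs: "Bob is richer" if mine["wealth"] > theirs["wealth"]
    else "Alice is richer" if theirs["wealth"] > mine["wealth"]
    else "Neither is richer",
)

print(room.run("alice", "bob", "who_is_richer"))  # -> Bob is richer
```

Alice learns the answer, but any query outside the allowed list (say, asking for Bob's raw wealth) is refused.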

                Now imagine a more realistic business scenario where two companies want to know which customers they have in common – an overlap analysis. They would put the data in tables, establish the statements to compare their customer lists, and specify the information to be returned. Or one company might be interested in finding new prospects among a partner’s customers and would perform a look-alike analysis comparing customer attributes.
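A minimal sketch of such an overlap analysis, assuming both parties hash their customer identifiers with a shared salt agreed out of band (the emails and salt below are made up): matching happens on pseudonyms, and only a count is revealed.

```python
import hashlib

def pseudonymize(emails, salt):
    # Hash identifiers with a shared salt so records can be matched
    # without the raw values ever being exchanged.
    return {hashlib.sha256((salt + e.lower()).encode()).hexdigest() for e in emails}

SALT = "shared-secret-salt"  # agreed out of band (illustrative value)

brand_customers = {"ana@example.com", "li@example.com", "sam@example.com"}
outlet_viewers = {"li@example.com", "sam@example.com", "zoe@example.com"}

overlap = pseudonymize(brand_customers, SALT) & pseudonymize(outlet_viewers, SALT)
print(f"customers in common: {len(overlap)}")  # only the count is revealed -> 2
```

A look-alike analysis would work the same way, comparing attribute profiles instead of identifiers.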

                Data clean rooms transform the ad world

                In a real use case, commonly seen in media and advertising these days, brands want to optimize their ad spend through better targeting to specific customers or personas – like the fans watching that exciting game. Media outlets want to offer premium placements by knowing exactly which programming the brand’s customers are watching. Comparing customers is a win-win. However, neither wants to show the underlying data. The clean room allows them to share without showing. In this case, as illustrated in the diagram, the returned information would include a customer count for each of the media outlet’s programs, but not specific customer data, in order to ensure compliance with privacy regulations. All queries of the data would be monitored and logged for audit purposes.

                In the past, this scenario required data to be copied and moved across the AdTech value chain from enrichment to activation to attribution. Not only were there the aforementioned governance concerns, but that data was also immediately stale. With Snowflake, live, near real-time data can be shared where it resides – no copies necessary. Data governance capabilities allow all parties to assign access and use policies that limit both who can query the data and exactly which queries are allowed. Additional capabilities add further security to the clean room. Data can be encrypted, anonymized, tokenized, or pseudonymized with built-in hashing functions, or obfuscated with data masking or by injecting differential privacy.
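Two of those safeguards can be sketched in Python. These are illustrations of the underlying ideas, not Snowflake's built-in functions: a salted-hash tokenizer for pseudonymization, and the classic Laplace mechanism for injecting differential-privacy noise into a count.

```python
import hashlib
import math
import random

def tokenize(value: str, salt: str) -> str:
    # Pseudonymization via salted hash: stable tokens for matching,
    # but the original value cannot be read back.
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def noisy_count(true_count: int, epsilon: float = 1.0) -> int:
    # Laplace mechanism for a counting query (sensitivity 1):
    # add noise with scale 1/epsilon, then clamp to a valid count.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return max(0, round(true_count + noise))

print(tokenize("li@example.com", "shared-secret-salt"), noisy_count(1240))
```

Smaller `epsilon` means more noise and stronger privacy; the reported audience count stays useful in aggregate while individual viewers remain protected.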

                With today’s technology, data clean rooms allow parties across teams, companies, government agencies, and international organizations to collaborate and securely share sensitive or regulated data. As Thomas Edison said, “The value of an idea lies in the use of it.” The more data is used, the more value is created. Secure data collaboration accelerates value creation.

                INNOVATION TAKEAWAYS

CROSS-INDUSTRY COLLABORATION AND DATA SHARING

                A growing trend that’s here to stay.

                DATA CLEAN ROOMS FACILITATE JOINT DATA ANALYSIS AND ML

While ensuring that confidential information stays protected from sharing partners.

                DATA ECOSYSTEMS AND SECURE DATA COLLABORATION

                They accelerate value creation.

                Interesting read?

Capgemini’s Innovation publication, Data-powered Innovation Review | Wave 6, features 19 such fascinating articles, crafted by leading experts from Capgemini and key technology partners like Google, Starburst, Microsoft, Snowflake, and Databricks. Learn about generative AI, collaborative data ecosystems, and an exploration of how data and AI can enable the biodiversity of urban forests. Find all previous Waves here.

                Jennifer Belissent

                Ph.D., Principal Data Strategist, Snowflake
                Jennifer Belissent joined Snowflake as Principal Data Strategist in 2021. Prior to joining Snowflake Jennifer spent 12 years at Forrester Research as an internationally recognized expert in data sharing and the data economy, data leadership and literacy, and best practices in building world-class data organizations. At Snowflake, Jennifer helps customers develop Data Cloud strategies that facilitate data access and deliver business value. Jennifer earned a Ph.D. and an M.A. in political science from Stanford University and a B.A. in econometrics from the University of Virginia.

                  Closing the generational life insurance gap with education

Samantha Chow
26 July 2023

                  The percentage of Americans covered by life insurance has been steadily decreasing since the 1970s.

                  My family is no exception.

                  Traditionally, newborn life insurance policies protected against the financial burden of death. My great-grandmother purchased a $1,500 life insurance policy for my grandmother when she was born. When my mother was born, her grandmother bought one for her. Even my brother, only ten years older than me, has been insured since birth. As a child of the ‘70s, though, I was the first in generations not to be insured in childhood.

Significant advances in healthcare, changes in the socioeconomic and demographic characteristics of the population, and lower war casualty rates have led many to experience a sense of mortality resistance. Despite the recent pandemic, the threat of death feels less imminent than it once did, and today we are more likely to be able to cover the costs of a funeral. Therefore, the urgency to insure the lives of our loved ones — and especially youngsters — has subsided.

                  Ultimately, this change has led to a startling reality: Less than 60% of Americans are currently covered by life insurance — a number that has declined steadily since 1971, with a 13% reduction in the last decade alone.

                  Mortality resilience and the growing generational wealth gap

American adults continue to put short-term priorities ahead of saving for retirement or planning for a catastrophe. They prioritize vacations (29%), recreational activities (23%), and paying monthly bills (49-60%). Along with believing life insurance coverage is “too expensive,” many say they have “other financial priorities” beyond saving for what’s next. As a result, less than half of Millennials and Gen Zers currently have a life insurance policy.

                  And yet, global mortality resilience, a measurement of how resilient we are to death, is low, at just 43%, and the global mortality protection gap hit a record $406 billion last year. Both are key indicators that households are more vulnerable than ever to the loss of a breadwinner.

                  At the same time, the majority of seniors today have life insurance coverage. And with the payout of their policies, we are about to experience the greatest wealth transfer the world has ever seen. The result is record-breaking distribution rates of death benefits, with approximately $89 million in 2022, compared to $76 million paid out in 2019.

The question is: Will the beneficiaries of these policies — often Millennials and Gen Xers — reinvest in securing their futures, and the future of those after them, with proper life insurance coverage?

                  Meeting the needs of the next generations

                  Meeting the needs of the next generation will require education and innovation.

For one, it is our responsibility to teach Millennials, Gen Xers, and younger generations that the true value of a life insurance policy is not only realized after one’s death: it can also provide income replacement, serve as an investment vehicle, cover long-term care, and more.

                  We also need to build a library of policies that match the lifestyle choices of Millennials and Gen Xers to help bridge the gap. If Gen X wants to save, pay their bills and retire early, then maybe carriers need to reconsider offering a Return of Premium Term, for example, that returns the policy premium if the insured outlives the term.
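As a rough illustration of that Return of Premium trade-off (all premium figures below are invented for the example, not market rates): the policyholder pays more per month than for plain term coverage, but the full premium comes back if they outlive the term.

```python
# Illustrative math for a Return of Premium (ROP) term policy.
# Figures are made up; real pricing depends on age, health, and carrier.

def total_premiums(monthly_premium: float, term_years: int) -> float:
    return monthly_premium * 12 * term_years

plain_term = total_premiums(30, 20)  # $30/month plain 20-year term
rop_term = total_premiums(55, 20)    # $55/month ROP 20-year term

refund_if_outlived = rop_term        # full premium returned at end of term
extra_cost = rop_term - plain_term   # what the refund feature costs along the way

print(f"paid over term: ${rop_term:,.0f}, "
      f"refunded if outlived: ${refund_if_outlived:,.0f}, "
      f"extra vs plain term: ${extra_cost:,.0f}")
```

For a saver-minded Gen Xer, the pitch is that the extra cost functions like forced savings rather than a sunk expense.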

                  Here are examples of how two life insurance companies are already bridging the life insurance gap with education and product — across the generational and financial divide:

• Kemper Life Insurance: With a focus on low-income communities, Kemper’s engagement approach is very personal. The company provides door-to-door sales and premium collection services, establishing face-to-face relationships with its customers. While many customers are not in the financial position to purchase large life insurance policies, Kemper’s goal is to help them make the right financial decisions for their security, whether it be life, accident and health, or contents coverage.
• Prudential Financial: Prudential’s Stages for Retirement education program is designed to help younger generations prepare for retirement at every stage by providing personalized projections of how much they will need for retirement, which products will help them meet these projections over time, and advice on how to reach savings goals throughout their lives.

                  To move forward, we need to take a step back and reeducate everyone on the value of income replacement, debt payment, and cash value opportunities within life insurance policies, and to provide products that fit today’s needs. The true value of life insurance has not changed; it’s been forgotten. It’s our job to help everyone remember.

                  Author

Samantha Chow

                  Global Leader for Life Insurance, Annuities and Benefits Sector at Capgemini
Samantha Chow is an expert in the global life, annuity, and benefits markets and has 25 years of experience. She has deep expertise in driving the growth of enterprise-wide capabilities that facilitate transformational and cultural change, focusing on customer experience, operational efficiency, legacy modernization, and innovation to support competitive advancement.


                    Data and tech: The future of commerce is connected

                    Kees Jacobs
                    Jul 25, 2023

                    Part 3: Embed data within the business

                    Welcome to the third part of this initial blog series on the future of commerce and the role of data and technology. In the earlier parts, we discussed the impact of channel-less commerce, the need for connected capabilities, the significance of data-driven competencies and value, and the importance of managing data foundations, data collaboration and composable technology architectures. In this final part, we will focus more on data cultures and business-transformational data journeys at scale.

                    Data-driven competencies: From data adhocracy to data democracy

                    Data and technology are critical for executing end-to-end connected commerce capabilities and navigating the evolving consumer goods and retail landscapes. Companies need to develop next-level data and technology competencies, ensuring that (real-time) intelligence is embedded in every business decision, operational action, and consumer touchpoint. From experiences with our clients, I know that most companies have made progress in this area through omnichannel initiatives, and often have some strong ‘pockets of excellence’, but there is still a need to further democratize data capabilities across the whole organization. Successful data masters have both the foundational data infrastructure to make data and insights accessible and the right data behaviors to leverage data for business impact at scale.

                    Data culture: The hearts and minds

                    Ultimately, while data and technology professionals play a crucial role, I clearly see that the responsibility of data ownership and value creation should be more prominently extended to business users. A data-driven culture shift is necessary for success in a data-driven era. This requires addressing the hearts and minds of employees across various functions, such as category managers, supply chain operators, store staff, and B2B sales teams. Integration of data within business processes, breaking down silos, transparency in efficiency and effectiveness measures, and fostering open innovation are all essential elements of a data culture. Investing in new skills, both technical and soft, is crucial for driving business outcomes and working in multidisciplinary teams.

                    The transformational data journey: Towards the end-to-end game

To embark on a successful data-driven connected commerce journey, I see three pillars to be most critical: intelligence activation within the business, orchestration for scale across the organization, and appropriate data and technology enablement. Intelligence activation involves embedding data, AI, and analytics at the heart of business operations and decision-making, demonstrating concrete business value. Orchestration for scale requires a balanced approach that combines centralized and local capabilities, as well as skilled talent and automation. Data enablement ensures the proper management of data platforms, data quality, governance, and collaboration.

                    The approach: Think big, start small, scale fast

Achieving maturity on these three dimensions simultaneously is essential for unlocking the full potential of data-driven commerce. We have had good experiences integrating all three dimensions in so-called ‘hothouses’, which bring together data and tech capabilities with business processes and cultural change to help embed analytics into core processes and your people’s work. Multidisciplinary teams (with T-shaped profiles) focus on demonstrating real business value (with rapid learning cycles from proven benefits) while building the blueprint for accelerated scaling and enabling fit-for-purpose data and technology tooling.

                    Where is your company on this journey? A few questions to ask yourself:

                    1. Do you see measurable value from data across your end-to-end business?
                    2. Are you managing your underlying data foundations accordingly?
                    3. Are you effectively scaling up, taking advantage of new innovations – and do you run your data and analytics engine efficiently?

                    The future of commerce belongs to those who can harness the power of data and technology. Embracing channel-less commerce, leveraging connected capabilities, and cultivating a data-driven culture are crucial steps for success.

                    By tapping into various data sources, collaborating within ecosystems, and leveraging advanced analytics, companies can gain valuable insights and deliver personalized experiences to consumers. Investing in data foundations, adopting composable tech architectures, and focusing on the hearts and minds of employees will further accelerate the transformation towards a data-driven era of commerce. It’s time for companies to embrace the opportunities presented by data and technology and position themselves at the forefront of the evolving consumer goods and retail landscape.

                    This was the last part of this blog series. We will further elaborate on the various topics mentioned in subsequent blog series – stay tuned for those.

                    The bottom line: The future of commerce is connected, and it’s essential for companies to embrace it!

                    Meet the author

                    Kees Jacobs

                    Consumer Products & Retail Global Insights & Data Lead, Capgemini
                    Kees is Capgemini’s overall Global Consumer Products and Retail sector thought leader. He has more than 25 years’ experience in this industry, with a track record in a range of strategic digital and data-related B2C and B2B initiatives at leading retailers and manufacturers. Kees is also responsible for Capgemini’s strategic relationship with The Consumer Goods Forum and a co-author of many thought leadership reports, including Reducing Consumer Food Waste in the Digital Era.
