The Open Footprint® Data Model
A foundation for sustainability data

John Lewsey
16th August 2024

The Open Footprint Data Model Standard helps organizations build trustworthy, well-structured sustainability data. This can support better decision-making for reducing corporate impact on the planet, as well as compliance with tough new reporting rules.

Dealing with the climate crisis is a hugely complex global challenge involving governments, organizations, and citizens. To solve it, we need bold innovation, as well as new skills and new ways of living more efficiently.

Many governments and organizations across the globe are setting out net-zero plans that declare ambitious targets for carbon reduction in the next couple of decades. Alongside this, many governments are introducing tough sustainability reporting rules, such as the European Union’s Corporate Sustainability Reporting Directive (CSRD) and California’s Climate Corporate Data Accountability Act, all to ensure that organizations declare their environmental impact.

With all these plans being published and regulatory reporting rules being introduced, it seems like everyone is falling into step and making progress towards reducing our impact on the planet.

Right?

Unfortunately, it’s a bit more complicated than that. Organizations need reliable, trustworthy sustainability data to drive decision-making and produce regulatory reporting, and this data is far from trivial to collect and analyze.

GHG reporting is complex

Let’s consider the greenhouse gas (GHG) emissions domain, which is only one part of the wider sustainability domain alongside water use, land use, biodiversity, and other topics. GHG reporting is divided into three scopes.

Scope 1 covers direct emissions generated by an organization’s activities. Scope 2 covers indirect emissions generated by purchased energy. Scope 3 covers the remaining indirect emissions generated by upstream and downstream activities such as the supply chain and product distribution, use, and disposal.

For many organizations, Scope 3 emissions far outweigh those from Scope 1 and Scope 2, especially if they have complex supply chain and distribution ecosystems. However, Scope 3 emissions are often the hardest to quantify as they involve gathering detailed sustainability data from potentially hundreds of parties across complex, global ecosystems.

In its Data for net zero report, the Capgemini Research Institute found that only 22% of surveyed organizations say they are measuring Scope 3 emissions.

The Open Footprint® Data Model Standard

The recently published Open Footprint Data Model Standard enables organizations to capture and share sustainability data in a consistent, transparent, and traceable way, regardless of industry sector or ecosystem complexity. By using this standard as part of their data discipline, organizations can set up a solid data foundation that supports better decision-making and opens the way for more innovation in tackling the climate crisis.

The standard has been produced by the Open Footprint Forum, part of The Open Group. The Forum includes sustainability and data experts from a wide range of member organizations, including Capgemini. Version 1.0 of the standard covers all air emissions; later versions will cover other aspects of sustainability.

The model covers all the key information that is critical to recording and sharing sustainability data, including the following (a minimal sketch follows the list):

  • How an organization is structured
  • How it works with other organizations
  • What facilities and assets it has
  • What activities it conducts that contribute to emissions
  • What emissions are generated or captured
  • How those emissions are calculated
  • How emissions relate to a product over its lifecycle
  • What emissions are included in which sustainability report against which reporting standard
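To make the model’s scope concrete, here is a minimal, hypothetical sketch of how some of these concepts might hang together as linked records. The entity and field names below are illustrative assumptions, not the standard’s actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Facility:
    facility_id: str
    name: str
    organization_id: str                    # owning organization

@dataclass
class Activity:
    activity_id: str
    facility_id: str                        # where the activity happens
    description: str                        # e.g., "natural gas combustion, boiler 3"

@dataclass
class Emission:
    emission_id: str
    activity_id: str                        # which activity generated it
    gas: str                                # e.g., "CO2", "CH4"
    scope: int                              # GHG Protocol scope: 1, 2, or 3
    amount_kg_co2e: float
    calculation_id: Optional[str] = None    # link to how the figure was derived

@dataclass
class SustainabilityReport:
    report_id: str
    reporting_standard: str                 # e.g., "CSRD"
    emission_ids: List[str] = field(default_factory=list)
```

The point of the linkage is that every reported figure can be walked back through these references: report to emission, emission to activity, and activity to facility and organization.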

A key principle of the model is the ability to support full traceability of data from source to report. This is critical for building trust in sustainability data with stakeholders such as regulators, shareholders, and the public.


How the model helps organizations

Standardizing data within the organization. Often the first battle is to get a coherent view of sustainability data within the organization. This involves collecting data from a diverse set of business units and organizing it to support analysis and reporting. Basing this common view on a robust, fit-for-purpose data model is an essential step that must be taken early. The Open Footprint® Data Model is ideally suited to this task.

Sharing sustainability data between organizations. Organizations need to share sustainability data with partners, customers, and other stakeholders. Using the Open Footprint Data Model to standardize how this data is provided reduces friction, increases the utility of the data, and offers a way of proving provenance to increase trust. This is especially true for GHG Scope 3, which involves integrating data from many parties.

A foundation for sustainability insights from analytics and AI. Deciding which activities an organization needs to stop, change, or accelerate is key to meeting net-zero commitments and making concrete reductions to environmental impact. Looking past simple mitigations to find the bigger wins requires aid from more advanced analytics and AI. This kind of analysis relies on sustainability data that is consistent, comprehensive, and of excellent quality, fused with other business data. Basing the sustainability data on a robust model is an essential foundation to this task.

Support for regulatory reporting. Tough rules on sustainability reporting to regulators are already in place for some types of organization in many areas. Over time, these rules are expected to get tougher and widen in scope. Some reporting standards require an organization to state not only the measure of an environmental impact it has made, such as a GHG emission, but also how that measure was calculated. Having a data model that supports traceability from source to report, including a record of the calculation steps, is essential for this task. This was one of the main use cases that the Open Footprint Data Model was built around.
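As an illustration of that traceability requirement, here is a minimal, hypothetical sketch of a calculation record using the widely used “activity data × emission factor” estimation method. The class and field names, and the factor value, are assumptions for illustration, not the standard’s schema:

```python
from dataclasses import dataclass

@dataclass
class EmissionCalculation:
    # Field names are illustrative assumptions, not the standard's actual schema.
    source_record: str        # provenance of the input data (e.g., an invoice ID)
    activity_amount: float    # e.g., kWh of electricity purchased
    activity_unit: str
    emission_factor: float    # kg CO2e per activity unit
    factor_source: str        # where the factor came from

    def result_kg_co2e(self) -> float:
        # The common "activity data x emission factor" estimation method
        return self.activity_amount * self.emission_factor

calc = EmissionCalculation(
    source_record="utility-invoice-2024-03",
    activity_amount=12_000.0,
    activity_unit="kWh",
    emission_factor=0.207,                      # hypothetical grid factor
    factor_source="illustrative national grid factor table",
)
print(calc.result_kg_co2e())                    # 2484.0 kg CO2e, with its derivation on record
```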

Acting on sustainability data

To crack the climate problem, or at least help to minimize an organization’s impact on the planet, we need trustworthy sustainability data. Adopting a common data standard is a clear enabler to building a solid data foundation on which we can rely for evidence-based sustainability decisions. This foundation is also a critical enabler for the innovation we need to apply to the climate problem, so we can look after the one and only planet we live on.

Innovation takeaways

Sustainability decision-making needs trusted data – Any decision-making needs to rely on trusted data, but this is especially true for the sustainability domain. It is a new topic for many organizations and the stakes are high when it comes to placing bets on what interventions are going to make a material difference to reducing environmental impact.

Base sustainability data on a robust model – The sustainability domain is complex, and a simplistic data model trying to capture those complexities isn’t going to cut it. It is better to start with a robust model that can properly cope with the intricacies of the domain than to migrate data repeatedly as simpler models fall short.

Align sustainability data across ecosystems – Having standardized data within an organization is a great start, but a business’s ecosystems often include many organizations and geographies. Sustainability data needs to be aligned using well-founded standards, so that sharing data is low friction and high value.

Author

John Lewsey

Principal Solution Architect, Insights & Data, Capgemini
John is the CTO for Insights & Data in the UK. He uses his 30 years of systems engineering experience to help clients think through and deliver complex, large-scale information and analytics programmes. He is also a team co-leader within the Open Footprint Forum, where he contributes to the development of the Open Footprint Data Model Standard. John is based in the UK and is a Chartered Engineer with the Institution of Engineering and Technology (IET).

    The quantum computing teams of the BMW Group and Capgemini benchmark quantum computing applications with QUARK

    Julian van Velzen
    25 Jul 2024

    Quantum computing has the potential to solve industry-relevant problems in the near future. But what applications are the most promising?

    Which optimization algorithm or machine learning model is best-suited for any given use case? And what quantum hardware technology performs best in the most important metrics?

    Why QUARK?

    The BMW Group and Capgemini have recently joined forces to explore the potential of quantum computing, with a particular focus on benchmarking quantum algorithms using the Quantum Computing Application Benchmarking (QUARK) framework. QUARK is a standardized benchmarking tool that aims to provide an unbiased assessment of quantum computing performance.

    One of the key advantages of QUARK is its ability to offer an unbiased framework for evaluating quantum algorithms and quantum machine learning (QML) training models. This is particularly important in the rapidly evolving field of quantum computing, where different hardware platforms and approaches are being developed. QUARK’s open-source nature and neutrality ensure that the benchmarking process is not skewed towards any specific vendor or technology.

    QUARK addresses the challenges inherent in benchmarking QML methods and optimization algorithms. These tasks are notoriously difficult to assess, as they often involve complex interactions between the algorithm, the problem instance, and the quantum hardware. QUARK provides a structured framework to tackle these challenges, enabling a more comprehensive and meaningful evaluation of quantum computing capabilities.
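    To illustrate the general shape of application-level benchmarking (run each solver on each problem instance and record comparable metrics), here is a minimal, hypothetical sketch in Python. It is not QUARK’s actual API, just the pattern such a framework standardizes:

```python
import time

def run_benchmark(instances, solvers, score):
    """Run every solver on every problem instance and collect comparable metrics."""
    results = []
    for instance in instances:
        for name, solve in solvers.items():
            start = time.perf_counter()
            solution = solve(instance)
            elapsed = time.perf_counter() - start
            results.append({
                "instance": instance["name"],
                "solver": name,
                "runtime_s": elapsed,
                "quality": score(instance, solution),   # application-level metric
            })
    return results

# Toy usage: one stand-in problem, one stand-in "solver"
instances = [{"name": "toy", "data": [3, 1, 2]}]
solvers = {"builtin": lambda inst: sorted(inst["data"])}
print(run_benchmark(instances, solvers,
                    score=lambda inst, sol: sol == sorted(inst["data"])))
```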

    Why Maximum Independent Set Problems?

    The variety of industry-relevant combinatorial optimization problems is immense. They involve finding the best solution from a large number of possible configurations. Quantum computers, with their ability to explore multiple solutions simultaneously, have the potential to outperform classical computers in solving these complex optimization problems.

    Many optimization use cases, such as sensor placement, windmill placement, and traffic optimization, can be modelled as maximum independent set (MIS) problems. Neutral-atom-based quantum devices benefit from the structure of this problem class and could be a candidate for an early quantum advantage. By leveraging the QUARK framework, the teams can rigorously assess the performance of such devices on these real-world applications, providing valuable insight into the practical usefulness of quantum computing.
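    To make the problem class concrete, here is a small, self-contained sketch of an exact (brute-force) maximum independent set solver. It is only feasible for tiny graphs, which is precisely why heuristics on quantum hardware are interesting, but it shows the objective such devices are benchmarked against:

```python
from itertools import combinations

def maximum_independent_set(nodes, edges):
    """Exact brute force: impractical beyond small graphs, but it defines
    the target that quantum heuristics are benchmarked against."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(nodes), 0, -1):           # try largest sets first
        for subset in combinations(nodes, size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                return set(subset)                   # no edge inside: independent
    return set()

# Toy sensor-placement style graph: a 5-cycle, where edges mark conflicting
# positions; the best independent set has two nodes.
print(maximum_independent_set([1, 2, 3, 4, 5],
                              [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]))
```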

    Next Steps and Challenges

    As the BMW Group and Capgemini continue their collaboration, the next steps will involve further development and refinement of the QUARK framework, particularly in the area of graph coloring problems. This will allow the teams to benchmark quantum algorithms on a wider range of optimization challenges, pushing the boundaries of what is possible with quantum computing.

    The methods generated in the coming weeks and months will undoubtedly challenge quantum hardware providers. By using objective benchmarks with open-source tools like QUARK, the providers can transparently and directly compare the performance for different applications across different quantum technologies. We hope the neutral atom quantum computer providers will join in the open benchmarking methodology, as this will help drive the development of more powerful and efficient quantum hardware, ultimately accelerating the adoption of quantum computing in various industries.

    The quantum computing teams of the BMW Group and Capgemini are very excited to develop and work with this framework and to answer the question: which device will provide a real quantum advantage first?

    Meet the Authors

    Julian van Velzen


    Quantum CTIO, Head of Capgemini’s Quantum Lab
    I’m passionate about the possibilities of quantum technologies and proud to be putting Capgemini’s investment in quantum on the map. With our Quantum Lab, a global network of quantum experts, partners, and facilities, we’re exploring with our clients how we can apply research, build demos, and help solve business and societal problems that till now have seemed intractable. It’s exciting to be at the forefront of this disruptive technology, where I can use my background in physics and experience in digital transformation to help clients kick-start their quantum journey. Making the impossible possible!

      Buildings are costly – for your bottom line and for the environment. Optimizing their efficiency solves both issues

      Miguel Sossa
      Aug 23, 2024

      Boost building efficiency, save money, and reduce carbon emissions with Capgemini’s Energy Command Center

      The US is home to approximately six million commercial buildings – all of which consume large amounts of energy, draw on company budgets, and create carbon emissions that contribute to the environmental pollution and resource depletion that is accelerating climate change.

      In spite of global efforts to increase the sustainability of buildings and construction, the built environment continues to account for a staggering 37 percent of global carbon emissions. For organizations that invest heavily in land and commercial real estate – whether for office space, manufacturing and distribution facilities, or commercial storefronts – ensuring buildings are energy-efficient not only helps to keep global carbon emissions in check, but also reduces budgetary impact at a time when every dollar counts.

      But benchmarking and tracking your buildings’ energy consumption and emissions can be difficult. The challenge is in getting a clear picture of emissions and consumption across multiple assets – and often numerous regions. It takes just the right integrated datasets along with robust and actionable insights to optimize efficiency and reduce energy consumption.

      Our new tool is making that happen by helping companies effectively reduce energy consumption by up to 30 percent – and qualify for Energy Star certification.

      Reducing energy use is about data management

      Whether your organization operates a handful of commercial real estate assets or hundreds of facilities around the globe, optimizing for efficiency is an exercise in data management.

      Most building owners simply don’t have the ability to see, gather, and consolidate the data they need to understand the problems – and solutions. Capturing data around energy consumption and emissions from disparate building systems such as heating and cooling, lighting, and ventilation requires a platform that can integrate inputs from diverse sources and systems, analyze them, and provide intelligent insights on how to optimize operations.

      Capgemini’s Energy Command Center (ECC) is a unique new smart building management system that brings data together from various sources to offer a comprehensive view of the energy consumption of a building or a portfolio of assets. This control tower not only monitors existing systems, it also offers guidance on how to optimize them and layer in investments in renewables. It even suggests optimal locations and potential geographic moves based on costs and consumption patterns.
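      As a purely hypothetical sketch of the consolidation idea (the feed shapes and field names are assumptions for illustration, not ECC’s actual interfaces), normalizing readings from disparate building systems into one portfolio-level view might look like:

```python
from collections import defaultdict

# Hypothetical feeds from different building systems; shapes are illustrative.
raw_feeds = [
    {"building": "Plant-A", "system": "hvac",     "kwh": 1840.0},
    {"building": "Plant-A", "system": "lighting", "kwh": 310.5},
    {"building": "HQ",      "system": "hvac",     "kwh": 920.0},
]

portfolio = defaultdict(float)
for reading in raw_feeds:
    portfolio[reading["building"]] += reading["kwh"]   # roll up to building level

for building, total_kwh in sorted(portfolio.items()):
    print(f"{building}: {total_kwh:.1f} kWh")
```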

      Data insights are key to Energy Star certification

      With expense mitigation a top priority for companies in many sectors, every dollar saved on energy costs has the potential to increase operational margins. But energy savings are also proof positive that companies are taking measurable steps toward sustainability, which can be critical when it comes to regulatory compliance and marketability to customer segments.

      The Energy Star program was designed by the U.S. Environmental Protection Agency to help organizations adopt cost-saving energy efficiency solutions that protect the climate. It’s also a stamp of approval with wide recognition. Energy Star-certified buildings use 35 percent less energy than average buildings, and enjoy lower utility bills. This enables asset owners to secure better financing terms. Their buildings often have higher property values, and can command a sales price of one to five percent more than average. 

      Commercial asset owners that get their buildings qualified for Energy Star certification not only save money, but also have more leverage when it comes to attracting tenants, building rental income, and meeting compliance requirements. Within the federal government, for example, tenants are only allowed to occupy Energy Star-certified buildings.

      At Capgemini, we’re working on ensuring our properties and those of our clients meet these stringent standards, too. And Energy Command Center is playing a key role, enabling companies to boost their Energy Star score by 25 points.

      Walking the walk on sustainability

      When we first developed Energy Command Center, piloting it with our own assets was the obvious choice. We shaped and deployed the tool with in-house engineering expertise and rigor, and tested it across our large footprint in India.

      Thanks to our long history of systems integration, along with our expertise in utilities and building infrastructure, we designed a comprehensive and partner-agnostic system that works. And by collaborating with certification agencies like Energy Star, we were able to align ECC with climate strategies and sustainability goals in a bid to transform our physical environment.

      This unique tool is now helping our clients and partners reduce their energy consumption and their carbon footprint – and we’re proud to envision its potential for creating positive global change. After all, acting on climate change is at the heart of our corporate priorities, and helping clients to reduce inefficiencies, build strong value propositions, and create the kind of future we all want is ultimately why we do what we do.

      Learn more about how Capgemini is leading sustainability.

      Meet the author

      Miguel Sossa


      Vice President & Americas Sustainability GTM Lead, Capgemini
      Miguel is Vice President and Sustainability GTM Lead for Capgemini Americas. Miguel has over 20 years of experience navigating Fortune 500 clients through complex sustainability and organizational challenges. He is a champion for positive social and environmental change, which has led him to create a new sustainability-focused scholarship fund to empower underrepresented students to pursue their dream careers while meeting urgent environmental and social needs. The fund, MÁS, will provide financial and mentorship support to graduate students enrolled in Michigan’s Erb Institute for Global Sustainable Enterprise.


        Gilles Bacquet
        24 May 2024

        How supply chain quality management helps suppliers in uncertain times

        In the first blog of this three-part series, we covered the importance of order management to supply chains, and how the process can be improved.

        In this entry, we will discuss Supply Chain Quality Management – what it is, how it works and why it matters.

        Quality problems in the supply chain: a hypothetical example

        A retail company has been experiencing a surge in customer complaints about a particular smartphone model. Customers report issues such as malfunctioning screens, battery failures, and overheating after a few months.

        The company discovers the defects stem from components supplied by an overseas vendor. The vendor has been struggling with quality control issues in its manufacturing processes. However, due to the lack of robust supplier quality management procedures in place, the retail company failed to identify and address these issues promptly, and now struggles to find an alternative, more reliable source of components.

        The importance of supplier awareness

        In manufacturing, an average of 80 percent of a product’s value comes from suppliers.

        Mastering supplier management (and thus the quality of supplier goods) is critical for all organizations with a supply chain – especially in this era of global disruption and uncertainty. This involves mitigating supply risks, which is in the DNA of the Supply Chain Quality Management (SCQM) team. The job of that team is to provide a situational awareness picture of the supply chain, as well as steps to solve the many problems that can occur.

        To this end, businesses need a robust action plan that contains, for example, a set of quick containment actions that support quality control or complaint management. One way to support this is through the Eight Disciplines (8D) approach. Originally developed at Ford Motor Company, the methodology can be used for supply chain problem identification and solving.

        Regularly assessing a company’s global supply chain is the first step in properly monitoring (and understanding) the global manufacturing capabilities of its suppliers. This understanding allows companies to, for example, pre-empt critical component shortages by changing the manufacturer for a specific part. To this end, we implement various specialized audit methodologies (e.g., VDA 6.3 and Aero Excellence) that provide detailed insights into the quality of your supply chain, and that leverage best practices from several industries.

        These methodologies allow us to identify individual patterns – but also global ones. For example, we recently performed a complete assessment campaign for one of our clients, in which we audited more than 200 suppliers in 35 countries over 17 weeks.

        From this snapshot, we created a supplier development program using robust methodologies, such as Advanced Product Quality Planning (APQP), that were initially developed for the automotive industry but are today widely applied in other sectors. We can also create custom audit methods for clients.

        Rightshoring for your supply chain

        Rightshoring is locating manufacturing in areas that provide the best combination of cost and efficiency. To help our customers succeed, we rely on our rightshore vision, which leverages a network of more than 2,500 experts across the world.

        Our local consultants and audit teams quickly get to work on the client premises, reducing travel time, costs, and project eCO2 emissions. We interact, where possible, with suppliers in the local language, streamlining remediation. Through this rightshoring approach, we have demonstrated a 60% reduction in eCO2 compared to the traditional approach of experts traveling overseas.

        As anyone who has bought an item that did not live up to expectations knows, quality control is essential. And it grows more important as products become more complex, since more components mean more points of failure.

        As electronic goods become increasingly complex, and as supply chains continue to endure geopolitical instability, SCQM, and the people who practice it, will be more important than ever.

        In the final part of this blog series, we will discuss the importance of sustainability in supply chains, and what steps can make your supply chains more sustainable.

        If you are currently facing delivery disruptions, or if you need to ramp up your supply chain to meet changing demand, we can help. Capgemini has years of experience helping companies across sectors and countries with supply chain quality management, along with access to some of the world’s leading experts in the subject. To find out more, contact our expert.

        Author

        Gilles Bacquet

        Senior Portfolio & Product Manager, Resilient & Sustainable Supply Chain offers owner
        Gilles is a Production & Supply Chain engineer who joined the Capgemini group in 2001. Starting as a consultant expert in supplier quality management for the automotive and aeronautics sectors, he has extended his responsibilities to creating supply chain offers and developing business overseas. He leads the Resilient & Sustainable Supply Chain offers for Capgemini Engineering.


              Resilient supply chains: Sustainability

              Gilles Bacquet
              3rd June 2024

              Sustainability is a priority for all stakeholders, and organizations are feeling pressure to reduce emissions and improve labor practices across the supply chain.

              In the previous part of this series, we discussed Supply Chain Quality Management – what it is, how it works, and why it matters.

              This final installment is about the importance of sustainability in supply chains, and the sustainability steps you can take.

              On February 23, 2022, the European Commission presented its proposal for a new supply chain law. Now known as the EU Supply Chain Act or Corporate Sustainability Due Diligence Directive (CS3D), it is not yet in effect at the time of writing. However, the CS3D is intended to address the sustainability problems faced (and caused) by modern supply chains.

              In the next few years, the CS3D and similar legislation will force many companies to modify their supply chains. Companies must make preparations now.

              The temperature is rising and the pressure is on

              The pressure on corporations to master their environmental impacts is probably highest in consumer goods industries, where each consumer is also a citizen of a world that increasingly feels the effects of climate change. But this pressure to change is being felt in sectors everywhere, and it will continue to grow as companies are required to become more sustainable.

              However, focusing on sustainability is unfeasible if we only look at a company’s direct emissions (i.e., Scopes 1 and 2) because, on average, 80 percent of a company’s carbon dioxide equivalent (CO2e, also written eCO2, a measure developed by the United Nations’ Intergovernmental Panel on Climate Change) is generated by its inbound and outbound supply chains (Scope 3).

              Because of the complexities of calculating supply chain carbon, it is critical to integrate sustainability criteria into selecting, developing, or removing suppliers from the portfolio (Scope 3, Categories 1 and 2).

              Implementing these criteria begins by asking procurement teams to:

              • Monitor the current eCO2 of each supplier by commodity use, such as oil
              • Base eCO2 calculations on mass/product-based models wherever possible. This is a stoichiometric approach, measuring which materials enter and exit a system, in contrast to spend-based models, which reflect a company’s spending rather than its eCO2 output (see the sketch after this list).
              • Identify supplier technology levers to optimize their emissions, using the Three Horizons framework – a McKinsey growth strategy approach for planning in uncertain times.
              • Contact suppliers to obtain their commitments to sustainability (their sustainability roadmaps) and establish tools to report on their progress.
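              As a minimal sketch of the contrast between the two estimation approaches mentioned above, with made-up factors purely for illustration:

```python
# Two common ways to estimate a purchased component's footprint.
# All figures below are invented for illustration only.

spend_eur = 50_000          # what was paid to the supplier
spend_factor = 0.4          # kg CO2e per EUR for this commodity (illustrative)

mass_kg = 8_000             # material actually delivered
mass_factor = 2.1           # kg CO2e per kg of this material (illustrative)

spend_based = spend_eur * spend_factor   # 20,000 kg CO2e: tracks money, not materials
mass_based = mass_kg * mass_factor       # 16,800 kg CO2e: tracks physical flows

print(f"spend-based: {spend_based:,.0f} kg CO2e")
print(f"mass-based:  {mass_based:,.0f} kg CO2e")
```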

              Further insight can be obtained through a global environmental, social, and governance (ESG) evaluation – which provides a detailed assessment of your organization’s performance against various sustainability metrics, in addition to environmental impact. Capgemini’s partnership with EcoVadis (a leading ESG ratings company) can streamline this process.

              More sustainability levers to pull

              Another lever is to optimize the logistical connections between suppliers and delivery centers (scope 3, categories 4 and 9). For example, real-time localization of parcels (including tracking critical parameters such as humidity or temperature) allows route optimization and the maximization of logistic volume (for example, how efficiently shipping containers are packed).

              Through these steps, combined with our sustainable packaging offers that optimize weight and remove hard-to-recycle materials (for example, moving from plastic to cardboard), we have helped our clients save about a hundred thousand kilotons of eCO2 to date.

              All of which means going green is its own kind of resilience.

              These efforts (which are undoubtedly considerable) go beyond eco credentials. The reputational enhancement, improved regulatory compliance, and potential cost efficiencies offered by increased sustainability all contribute to the resilience of supply chains.

              Indeed, most of the carbon that companies are responsible for driving down is in their supply chains (scope 3), but, due to their complexity and scale, a sophisticated analysis and remediation strategy is required.

              A company that calls itself sustainable without integrating its inbound and outbound supply chains is either misinformed or dishonest. And, either way, the consequences for failing to do this work will only get worse as time goes on.

              A variety of stakeholders, international organizations, and industry bodies will continue to require increasing sustainability commitments from companies. Anticipating the required changes is not just risk mitigation but a competitive advantage: those who understand them best can adapt fastest in a quickly changing regulatory (and planetary) environment.

              Need some help with your supply chain sustainability efforts? Choose a leading international partner with years of expertise across a range of industries. Find out how we can help you: contact our expert.

              Author

              Gilles Bacquet

              Senior Portfolio & Product Manager, Resilient & Sustainable Supply Chain offers owner
              Gilles is a Production & Supply Chain engineer who joined the Capgemini group in 2001. Starting as a consultant expert in supplier quality management for the automotive and aeronautics sectors, he has extended his responsibilities to creating supply chain offers and developing business overseas. He leads the Resilient & Sustainable Supply Chain offers for Capgemini Engineering.


                    Let’s talk about scaling GenAI in life sciences

                    Sanjeev Jain
                    Aug 21, 2024

                    Moving beyond proof of concept trials has proven difficult for many, but a recent Capgemini conference produced useful ideas for scaling across the enterprise

                    When discussing generative AI with executives at life sciences enterprises over the past year, it’s clear most of the industry’s leaders are excited about the potential for the technology to transform their business – but it’s equally apparent they have encountered significant challenges. To address this, Capgemini hosted a day-long conference in New York City for some of its clients in the sector, dedicated to addressing the issues related to using GenAI to drive value at scale. Several common threads emerged from the conference – providing delegates with useful insights about the way forward.

                    A highlight of the event was an end-of-day panel discussion about adopting generative AI at scale in life sciences, during which Capgemini experts and industry leaders shared successful strategies based on their own experiences with the technology. The panel included Sheetal Chawla, Executive Vice President Life Sciences, Capgemini; Michele Pesanello, Vice President Life Sciences, Capgemini Invent; Scott Barnes, Vice President, Capgemini Insights & Data; and Brian Eden, Vice President, Global Life Sciences Tech Ops, Capgemini. It was moderated by Chris Scheefer, Vice President, Intelligent Industry, Capgemini.

                    Executive and business user buy-in

                    The first takeaway was that most enterprises in the sector are actively exploring generative AI. They’re identifying use cases and launching proof of concept trials. What’s more, those designing trials generally understand the importance of starting with simple use cases that will deliver results quickly and without demanding a huge commitment in resources.

                    We think this is a great approach, because quick wins can demonstrate the value of the technology. This seems to be working, as these tests have mostly been positive.

                    Second, the ability to leverage GenAI is often greatest if a champion on the executive team is driving the effort. Interestingly, many conference attendees attribute C-suite buy-in to someone on the executive team having been exposed to public GenAI tools such as ChatGPT. They’ve used it, or seen their kids using it, and they understand the potential.

                    We think this commitment from the top is vital: GenAI’s benefits are significant and will transform the entire organization, even as its deployment and use must be carefully overseen to eliminate any potential risks. So, it’s critical the executive committee and board are actively involved.

                    Third, many organizations are thinking beyond the technology itself, and approaching generative AI holistically. During the panel discussion, one delegate noted their company conducts AI innovation workshops to identify promising use cases. Another explained how their IT team identifies partners on the business side of the enterprise who are excited about generative AI, then brings these people into proof of concept tests at an early stage. These enthusiasts become ambassadors who can help bridge the gap between IT and business users, identify use cases, and educate those on the business side once a proof of concept is ready for scaling.

                    These are also great strategies. Based on Capgemini’s engagements with clients, it’s become evident that a successful GenAI strategy must include people from across the organization. That’s why, early in the process, we help our clients establish a cross-functional team responsible for GenAI that includes both technology and business representatives from a range of departments.

                    Scaling challenges: data preparedness, education, managing expectations

                    Panelists and audience members also shared some common challenges and concerns. Scaling up from proof of concept is a major issue for many. One challenge Capgemini often encounters when clients are trying to scale is the need to ensure GenAI can effectively access and leverage all the enterprise’s data. Over the past decade, businesses have done a good job of organizing, validating, and applying governance rules to data so it can be used for analytics – but the data involved has been structured. GenAI requires this data to function, but also needs access to the organization’s unstructured data – such as images and video – and the same discipline must be applied to this material as well.

                    As an attendee pointed out, it’s important that IT work with its technology vendors to address GenAI security within security solutions. At Capgemini, we expand that to include applying security and governance to what we call knowledge forums: the legal, operational, engineering, and other corporate wisdom that forms the intellectual property of the company. These are the sources of an enterprise’s competitive advantages and they must be protected when, for example, training the large language models that enable GenAI.

                    It’s also essential to share IT’s vision for GenAI with business users – not only to educate them about how it will be used, but also to manage expectations and address concerns about the technology’s potential impact on jobs. It’s understandable that employees may hear about GenAI’s ability to create efficiencies and reduce costs, and worry this translates to staff reductions. But we believe the technology’s real strength lies in its ability to unlock the value between the silos within an organization. GenAI can connect disciplines across the enterprise, enabling people to do more. While companies should consider efficiencies, the primary driver for embracing GenAI should be growth.

                    Open dialogs across the enterprise and partners

                    As noted, it’s important to establish the right links, early in the process, between business users and technology professionals. What’s more, expanding that dialog to include strategic partners makes the discussion even more valuable as companies seek to scale GenAI across the enterprise – as our day-long conference demonstrated.

                    The New York event was Capgemini’s second conference of 2024 devoted to GenAI and life sciences, following a similar event in Boston in March. We’re already planning our next conference, to be held in December in San Francisco. If you wish to know more about these events, I am happy to discuss them with you.


                    Meet your experts

                    Sheetal J. Chawla

                    Head of Life Sciences, Americas
                    Sheetal Chawla is a Member of the Americas Executive Committee at Capgemini, leading the Northeast Region and Life Sciences Business. With over 20 years of experience in consulting, business leadership, and the life sciences industry, she has held roles at Omnicom Group, Roche (Genentech), and Iqvia. Sheetal serves on the Board of the New York Botanical Gardens and has previously served on boards including the National Diversity Council and Rutgers University. She’s been recognized as an ISG “Rock Star Leader” and received NAFE’s “Woman of Excellence” Award. Sheetal holds a BA and MBA from Johns Hopkins University and Sciences Po Paris.

                    Michele Pesanello

                    North America Invent Life Sciences Sector Leader, Capgemini Invent
                    Versatile executive with a passion for leveraging data and emerging technologies to enable positive outcomes for patients, HCPs, and health-related organizations. A proven leader, helping companies transform business by way of breakthrough strategies and re-imagined operations through the application of emerging technologies such as AI, intelligent automation, IoT, and digital. Focused on developing and strengthening business processes and client relationships, utilizing a unique blend of business, sales, and technology acumen to deliver sustainable solutions for complex challenges in the evolving healthcare ecosystem.

                    Scott Barnes

                    Vice President | NA, Insights & Data | Head of Customer First and Data Strategy
                    Scott Barnes is a Vice President in Capgemini’s Insights & Data practice, and leads the Customer First and Data Strategy Portfolio offerings. Scott is a seasoned analytics professional with over 30 years in the Analytics, AI & Information Management space. He devotes his time to delivering smarter insights and stronger outcomes to his clients and has a passion for delivering improved business outcomes by harnessing the exponential growth of data, smarter algorithms and faster processing speed. By combining data science and design thinking with digital assets, he delivers transformative value across markets to clients by leveraging capabilities including data mastery, cognitive insights, data science and analytics.

                    Brian Eden

                    Vice President, Global Life Sciences Technical Operations Leader, Capgemini
                    Leading process and digital solutions in Pharma and Medical Device Operations. “We are at an exciting moment when our data systems and analytics are finally capable of helping us fulfill the promise of Industry 4.0 for Pharma and Med Tech. We must move digital transformation forward boldly, all the while keeping our efforts grounded in the fundamentals of data architecture and Lean Thinking that got us to where we are today.”

                    Chris Scheefer

                    Vice President & North America Intelligent Industry Lead, Capgemini

                      Building Gen AI applications for business growth – actions behind the scenes

                      Capgemini
                      21 Mar 2024

                      Over the last few years, we have been witnessing a strong adoption of artificial intelligence and machine learning (AI/ML) across industries with a wide variety of applications.

                      Use cases range from cost reduction via automation to the generation of additional business via the introduction of AI-infused products and services.

                      The launch of the generative AI (GenAI) application ChatGPT by OpenAI in November 2022 only accelerated AI adoption. At present, many of the tech giants including the leading cloud platform vendors like Google, Microsoft, and Amazon have strong GenAI offerings along with those from many smaller vendors and open-source platforms.

                      In short, GenAI is an AI discipline in which AI foundation models (FMs) are trained on vast amounts of multimodal data (text, images, audio, and video; often terabytes of data and trillions of parameters). Given appropriate user requests as input, FMs can generate a wide variety of multimodal synthesized outputs. Large language models (LLMs) are a subclass of FMs specializing in text. An added benefit of GenAI is its highly capable natural language processing (NLP), in many cases with multimodal input/output, making it an unprecedented technology for human-computer interfaces. This is one of the key reasons for the heightened interest in GenAI.

                      GenAI, with applicability in virtually all industries, can significantly improve many of the day-to-day operations of a business as well as help launch new business capabilities. While some GenAI-based autonomous products like certain types of text, image, and audio/video processing are emerging, many of the enterprise-grade usage scenarios that are currently in focus involve GenAI-based digital assistants to humans.

                      These GenAI-based assistants, in the form of chatbots and copilots, can:

                      • Respond to open-ended questions in a more human-like manner
                      • Improve overall customer experience
                      • Detect features and anomalies in images and transactions
                      • Help with code writing and testing
                      • Expand work automation
                      • Improve a wide range of document processing
                      • Make cognitive and semantic content searches more efficient and effective
                      • Provide advanced analytics to assess what-if scenarios
                      • Assist in creative content generation.

                      Typical metrics for business growth are revenue increase and healthy profitability. Productivity, innovation, and time-to-market are the key enablers of business growth. Depending on the situation, GenAI can positively impact some or all of these enablers. A recent McKinsey study [1] estimates that GenAI-enhanced productivity and innovation could add between $2.6 and $4.4 trillion to the global economy annually, and identifies four use case categories that would account for around 75% of the value delivered:

                      • Customer operations
                      • Marketing and sales
                      • Software engineering
                      • R&D

                      An early 2023 Capgemini Research Institute report [2] that explored a wide variety of industry use cases and surveyed nearly a thousand executives shows the broad applicability of GenAI and high ROI expectations from GenAI adoption. Of course, to realize significant business growth benefits, GenAI-based applications need to be functionally completed using additional application components besides the GenAI piece and need to be scalable, reliable, and integrated with other enterprise systems as necessary.

                      Example: A GenAI-enhanced multimodal and omnichannel B2C commerce application

                      Figure 1. Modular and component-based architecture for “Casey” – A GenAI-powered virtual retail assistant

                      We at Capgemini recently developed a virtual retail assistant named “Casey” to accept orders and drive the order-to-cash process for partner stores (see figure 1). Casey is voice-activated and GenAI-enabled, powered by Capgemini solution accelerator components,[3] Google/GCP, and Soul Machines. For the end-to-end application, we layered a ‘digital human’ with conversational AI and cloud-native headless commerce APIs,[3] all pre-integrated for conversational commerce kiosks. It serves as a store-in-store order kiosk, allowing the partner stores to maximize their channel reach with minimal investment. Casey is a business growth enabler: it opens a new revenue channel in which it is easy to market innovative offers, its cost does not grow rapidly with business volume (i.e., it is highly productive), and the solution construction allows for fast time-to-market implementation. Casey’s solution architecture is modular, which has enabled us to use it as a basis for many other digital channel use cases in a variety of industries, for example, grocery, general retail, call center, telco, and automotive.

                      As this example illustrates, to build GenAI-powered applications that cover full customer journeys and thus yield tangible business value, we need either to combine several other application components and technologies or to integrate the GenAI parts into otherwise functionally complete existing applications, where suitable ones are available.

                      Creating enterprise-grade GenAI-based apps: Key considerations

                      To build a GenAI-based enterprise-grade application delivering substantial business growth, we need to consider:

                      • Opportunity formulation. Identification of the right business-relevant opportunities with realistic ROI projections is a critical success factor (CSF) for eventual success with GenAI-based applications. Especially as companies embark on GenAI adoption, it can reduce the risk of failure if GenAI is used to augment existing activities and processes. For example, the addition of GenAI into an existing customer churn prediction algorithm could process unstructured data like call recordings from customer interactions and customer reviews, capture additional insights like ‘sentiment’, specific store or product issues, competition strengths and weaknesses, possible new product bundles, and suggest appropriate ‘white glove’ treatments to reduce churn. As another example, GenAI could assist in a customer’s product exploration by improving existing user interfaces with visuals and helpful hints and by simplifying the actual purchase action by making the supporting processes more transparent.
                      • Solution design. One of the first considerations in drafting a GenAI-based solution strategy is to recognize that GenAI-powered interactions with customers or end users can produce actions that do not follow strict workflows, i.e., the complete application needs the flexibility to react appropriately to free-flowing human-GenAI conversations. If the solution is built from scratch, such flexibility can be developed from the ground up, which, of course, means a larger development burden. Cloud-first development and use of pre-built components (such as Capgemini’s Digital Cloud Platform [3]) can significantly reduce this burden. If the GenAI components are incorporated into an existing solution, that solution will most likely have to be refactored to properly integrate the new with the old, including changing or upgrading some functionality of the older components (for example, from batch processing to real-time response). The choice of the appropriate GenAI tool/platform and the availability of data required for the proper functioning of the solution are also key considerations.
                      • Customer/employee experience and data orchestration. The value of GenAI in chatbots (and copilots) is the level of personalization and context an unscripted conversation can provide to a customer or employee. To retain this value, an enterprise must think through how to orchestrate various interaction points (or digital teammates) for consistency, as well as how to share interaction and customer data so the next conversation at a different interaction point can pick up where that customer left off last time. These chatbots are also a tool to empower employees to assist customers more broadly: an employee who previously relied on what she knew at that moment now has access to comprehensive and granular data on demand. Enterprises must also consider an orchestration layer to connect the various GenAI initiatives and data.
                      • Scale-out. GenAI is still an emerging technology; hence, it is advisable to start small, prove concrete business value, and then scale out to realize the target business benefits. However, in GenAI use cases where technical feasibility has already been proven elsewhere and a realistic business case for the solution is deemed positive, it can be worth the time and effort to create solution architectures with possible scale-out in mind. Such architectures would consider solution performance under production workload, availability and disaster recovery, security and data privacy, identity and access management, error handling, development and run cost optimization, and sustainable development practices. In the scale-out phase, a cloud-based solution approach is often superior and should be duly considered. Some of the GenAI-specific considerations are enterprise data foundations and trust (solid source of truth for customers, vendors, products, promotions, knowledge base, etc.), LLM selection, LLM lifecycle management, prompt version control across environment tiers, UX design for free-flowing conversations, balancing intent-based and generative-based interactions, incorporation of human-in-the-loop, response feedback loop, cost monitoring and optimization, technical debt management, and responsible AI governance.
                      • Measure and improve. Adequate measurement of solution performance is essential to understanding the current maturity of the solution and possible future enhancements; thus, measurement mechanisms should be built into the solution as first-class citizens. High-level KPIs from traditional solutions can be reused in GenAI-powered solutions, for example, reduction in churn rate, increase in revenue per customer, and efficiency in anomaly detection. However, it would be insightful to also add metrics related to model and system quality and the performance of the GenAI components (see, for example, a summary of relevant metrics in [4]), which could include response error rate, the range of input over which response accuracy stays acceptable, system latency, throughput, and run cost. A minimal sketch of this instrumentation follows the list.
                      • Learn and grow. Capturing and sharing experiences as the solutions are developed and rolled out – and learning from them – is extremely valuable for fast-developing technologies like GenAI. Some design documentation, decisions taken along with the rationales, and stakeholder and end-user feedback are good ways to capture experiences from which lessons learned can be derived. This process would help in improving the solution over time as well as increase the organizational maturity to take on higher value (and potentially higher complexity) GenAI-based projects down the line. Over time, defining a robust set of build patterns across use cases would be helpful for asset reuse, solution management, and acceleration of new use case implementations.
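                      As a minimal, hypothetical sketch of the measurement idea above (the generation call and the quality check are stand-ins, not any specific product’s API):

```python
import statistics
import time

def measured_call(generate, prompt, log):
    """Wrap a GenAI call so each response carries latency and failure metrics."""
    start = time.perf_counter()
    response = generate(prompt)            # `generate` is a stand-in callable
    log.append({
        "latency_s": time.perf_counter() - start,
        "failed": not response,            # crude stand-in for a quality check
    })
    return response

log = []
for prompt in ["order status?", "return policy?"]:
    measured_call(lambda p: f"stub answer to: {p}", prompt, log)

print("response error rate:", sum(r["failed"] for r in log) / len(log))
print("median latency (s):", statistics.median(r["latency_s"] for r in log))
```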

                      Concluding remarks

                      Done right, GenAI has tremendous power to push most enterprises forward with healthier business growth and higher market competitiveness. However, it’s important to acknowledge potential challenges such as bias in training data, the “black box” nature of some AI algorithms (limited explainability), and ethical considerations.

                      As a productivity enabler, GenAI is expected to accelerate automation by ten years, with nearly half of current tasks automated by the end of this decade.[1] Not to be left behind, enterprises should focus both on identifying which GenAI-powered applications are the most valuable for them and on acquiring, either in-house or via partners, adequate skills to understand the ‘what’ and the ‘how’ of GenAI. In the early stages of GenAI maturity, spot solutions can bring quick wins; as maturity grows, incorporating GenAI more broadly across enterprise value chains should be considered to reach higher benefit goals – and this will take some foundational investment in data, UX strategy, integration strategy, and building a GenAI platform.

                      Capgemini at Google Cloud Next 2024

                      Google Cloud Next brings together a diverse mix of developers, decision makers, and cloud enthusiasts with a shared vision for a better business future through technology. As a Luminary Sponsor, Capgemini is committed to elevating the event experience with opportunities to boost learning and engagement and get fresh insight into today’s riveting topics – including generative AI.

                      Whether the aim is empowering businesses or their people to unlock the power of generative AI, Capgemini is at the forefront of this revolution. Our continuous work in this growing domain means we are equipped to help our partners capitalize on this unique technology and engineer use cases for enhanced and unprecedented customer experiences.

                      Come by our booth and let’s discuss the possibilities in the world of Generative AI, Cloud, Data/AI, and Software Engineering. Or reach out to us – we would love to hear your perspective on how we can get ready for what comes next.

                      References:

                      [1] https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier

                      [2] https://www.capgemini.com/us-en/insights/research-library/generative-ai-in-organizations/

                      [3] https://www.capgemini.com/us-en/solutions/digital-cloud-platform-for-retail/

                      [4] https://cloud.google.com/transform/kpis-for-gen-ai-why-measuring-your-new-ai-is-essential-to-its-success

                      Authors

                      Manas K. Deb

                      PhD, MBA, VP & Business Leader, Cloud CoE, Capgemini/Europe
                      A long-time veteran of the software industry covering products and consulting, Manas has co-created several Cloud CoEs within Capgemini and has been actively involved in a variety of cloud transformation projects delivering business value. In collaboration with the customer, he explores their challenges and opportunities in the areas of innovation, digital transformation, and cloud computing, which helps him leverage Capgemini’s assets and his own experience to advise the customer on a best-fit roadmap to reach their goals. Manas has bachelor’s and master’s degrees in engineering, an MBA, and a PhD in applied mathematics and computer science from the University of Texas at Austin.

                      Jennifer Marchand

                      Enterprise Architect Director and GCP CoE Leader, Capgemini/Americas
Jennifer leads the Google Cloud CoE for Capgemini Americas, with a focus on solutions and investments for the CPRS, TMT, and MALS MUs, and on supporting pre-sales across all MUs. She has been with Capgemini for 18 years, focusing on cloud transformation since 2015. She works closely with accounts to bring solutions to our clients around GenAI, AI/ML on Vertex AI and Cortex, Data Estate Modernization on BigQuery, SAP on Google Cloud, Application Modernization & Edge, and Call Center Transformation and Conversational AI. She leverages the broader Capgemini ecosystem across AIE, Invent, ER&D, I&D, C&CA, and CIS to shape cloud and transformation programs focused on business outcomes.

                        Innovation, meet intelligence. Join us at Google Cloud Next ’24 to discover next-level digital transformation 

                        Herschel Parikh
                        2 Apr 2024

Reflecting on the past year, it’s truly remarkable how technology has accelerated innovation across industries. Generative AI in particular, once a niche concept, has rapidly become a cornerstone of tech discussions.

And within the dynamic landscape of Google Cloud, this cutting-edge technology is seamlessly integrated into its solutions. Together, we’re empowering businesses to unlock untapped value on their digital transformation journeys.

As a proud Luminary Sponsor of Google Cloud Next ’24, Capgemini will be showcasing exactly how, with an array of topics demonstrated through live demos, speaking sessions with our clients, live podcast episodes, and much more.

For me, this is not just another tech conference; it’s an opportunity to help businesses explore the possibilities of Google Cloud. We have curated an exciting week that will reveal valuable business transformation strategies and cross-sector intelligence solutions across generative AI, cloud, data/AI, sustainability, and software engineering to help you achieve your business goals.

Here’s a sneak peek at the immersive experiences in store for you at Google Cloud Next ’24:

                        In our Capgemini booth, you’ll have the opportunity to experience a range of immersive demos, listen in on our live Cloud Realities podcast series hosting Google Cloud thought leaders – and explore the potential of Google Cloud with our experts.

We also invite you to immerse yourself in cutting-edge technology demos across the retail, grocery, telecommunications, and financial services sectors. Come see how data, cloud, and generative AI drive tangible business value through applications such as:

                        • Smart cart – the world’s smartest shopping cart 
                        • Casey – your digital human assistant 
                        • Inventory management – connecting customer experience habits to business operations 
                        • Proactive home care – living safer and smarter as a homeowner 
                        • Intelligent property – creating digital twins with real-time insights and recommendations 
                        • Autonomous networks – come by and activate “energy savings mode.” 

                        Learn from our leading clients as they take to the stage to explore their experiences: 

                        Auto Club Group’s resilient tech stack journey – Join Madhu Nandagiri and Viral Patel from Auto Club Group along with Capgemini’s Prashant Shastri to explore Auto Club Group’s journey from legacy to leading edge with Capgemini and Google Cloud for seamless migration of core insurance apps. 

                        GenAI transformation: Cox Communications’ strategic journey to innovation – Hosted by Samantha McConnell of Cox Communications and Tim Sandkuhler of Capgemini, this session will explore how Cox Communications has enhanced sales and service value using Google Cloud’s Vertex AI suite. They’ll also share use cases for improving digital sales and Net Promoter Score (NPS), and reducing contact center costs. 

                        Virgin Voyages: Mastering business continuity and operational support – Frank Farro from Virgin Voyages and Kim Wilson from Capgemini will explore how the Google Cloud infrastructure, built by Capgemini, facilitated the migration of all applications and workloads for Virgin Voyages. This migration led to reduced business costs and lower on-premises infrastructure expenses, and eliminated disbursement costs for multiple cloud providers. 

                        Plus, get personalized insights from our experts:

Cloud Talk – Are you getting the benefits you expected when moving to the cloud? Hear from our Chief Cloud Evangelist, Dave Chapman, who will explore common challenges for businesses that have moved to the cloud and provide you with strategies to solve them.

Leveraging Real-time 3D Technology for Augmented Commercial Property Insurance – Explore how Intelligent Property redefines commercial property insurance, empowering you with actionable insights for informed decision-making and sustainable business growth.

                        Cloud Realities podcasts – Tune in as we speak with industry experts to delve into the latest tech trends and advancements.

Next, meet Casey – Who is Casey? Come meet our digital barista assistant, who will craft a personalized coffee experience for you. Visit us on level 2 in the Activation Zone to treat yourself.

                        I’m truly honored and excited to join all these visionaries and industry experts as we showcase market-leading solutions and cutting-edge thought leadership. Join me at Booth #840 to witness how data, cloud, and Generative AI drive tangible business value.

                        Let’s connect and chat about your plans to transform. See you at Next ’24!

                        Author

                        Herschel Parikh

                        Global Google Cloud Partner Executive
                        Herschel is Capgemini’s Global Google Cloud Partner Executive. He has over 12 years’ experience in partner management, sales strategy & operations, and business transformation consulting.

                          How microgrids can harness AI to proactively protect community energy

                          Claire Gotham
                          Jul 4, 2024

                          As the US navigates the energy transition, microgrids will play a key role in building a more resilient, reliable energy supply across the country. Drawing from a range of clean, local energy sources, microgrids will offer independence from the increasingly unstable national grid.

                          Local and smart – the energy of the future

                          Technological innovation is at the heart of a successful energy transition. And as artificial intelligence begins to radically transform industries, Energy & Utilities is no exception. Here we look at the role that AI will play in creating responsive, smart microgrids that harness the power of local energy and empower local energy consumption.

Data and AI are at the heart of power grids’ efficiency and security. By the 2030s, the technical architecture of microgrids themselves will be optimized using data-rich models, digital twins, and real-world feedback across thousands of deployments, creating new levels of efficiency and resilience that benefit local and national energy ecosystems.

Energy executives today are already realizing AI’s benefits by analyzing production scheduling scenarios using simulation modeling. In everyday usage in our 2030s community, constrained policy optimization (CPO) and deep reinforcement learning will be widely used to predict when energy is produced most cleanly and efficiently – for instance, while the sun is shining or the wind is blowing. The system will then automatically store any excess, while energy is cheap, in batteries and other forms of energy storage across the community.

AI-driven microgrid management will also be able to forecast times of high energy usage and then sell accumulated energy as prices rise. In parallel with this automated intelligence, active local prosumers will also participate in the energy ecosystem, making real-time choices around energy usage, storage, and reselling. Thus, micro-producers’ profits will be maximized while expenses for local end-users fall, putting them in greater control of their energy.
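To make this scheduling idea concrete, here is a minimal, purely illustrative Python sketch. A simple greedy rule (store surplus generation, discharge when prices spike) stands in for the constrained policy optimization and deep reinforcement learning approaches described above; the battery model, capacities, and price threshold are all invented for the example.

```python
# Illustrative stand-in for AI-driven microgrid scheduling: store surplus
# clean generation while energy is cheap, sell stored energy at peak prices.
from dataclasses import dataclass

@dataclass
class Battery:
    capacity_kwh: float
    soc_kwh: float = 0.0  # current state of charge

    def charge(self, kwh: float) -> float:
        stored = min(kwh, self.capacity_kwh - self.soc_kwh)
        self.soc_kwh += stored
        return stored

    def discharge(self, kwh: float) -> float:
        released = min(kwh, self.soc_kwh)
        self.soc_kwh -= released
        return released

def schedule_hour(battery: Battery, gen_kwh: float, load_kwh: float,
                  price: float, high_price: float = 0.30) -> str:
    """Greedy policy: bank surplus generation; cover deficits from the
    battery (and sell back) when the price crosses the peak threshold."""
    surplus = gen_kwh - load_kwh
    if surplus > 0:
        stored = battery.charge(surplus)
        return f"stored {stored:.1f} kWh of surplus generation"
    if price >= high_price:
        released = battery.discharge(-surplus)
        return f"discharged {released:.1f} kWh at peak price ${price:.2f}/kWh"
    return "drawing from the grid at off-peak prices"

# A sunny midday surplus followed by an expensive evening peak.
battery = Battery(capacity_kwh=500)
for hour, gen, load, price in [(12, 400, 250, 0.10), (18, 50, 300, 0.42)]:
    print(f"{hour}:00 -> {schedule_hour(battery, gen, load, price)}")
```

In a real deployment, the forecasts feeding this decision (generation, load, and price) would come from the learned models described above rather than being passed in by hand.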

                          Finally – and critically – AI will determine when any part of the grid could falter. It will then trigger the “islanding” of the microgrid ahead of any grid outages or other potentially damaging fluctuations. This island mode creates an energy ecosystem in which all community buildings continue to be powered independently. Critical infrastructure such as hospitals, manufacturers and retailers, and data centers are protected from energy instability, thereby protecting an area’s commercial health and citizens’ welfare. Thanks to AI, this protection will not be reliant on human intervention, which ultimately bolsters the area’s resilience.
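As a toy illustration of that islanding trigger, the sketch below uses a moving-average frequency-deviation score as a stand-in for the predictive AI; the 60 Hz nominal frequency is the US standard, but the window size, risk scaling, and threshold are invented. Real controllers would draw on far richer signals (voltage, phasor measurements, weather data) and learned models.

```python
# Toy islanding controller: watch grid frequency, estimate risk, and
# disconnect (island) before a predicted outage or damaging fluctuation.
from collections import deque

NOMINAL_HZ = 60.0      # US grid nominal frequency
RISK_THRESHOLD = 0.5   # island when the risk score exceeds this

class IslandingController:
    def __init__(self, window: int = 10):
        self.readings: deque[float] = deque(maxlen=window)
        self.islanded = False

    def risk_score(self) -> float:
        # Larger average deviations from 60 Hz map to higher risk (0..1).
        if not self.readings:
            return 0.0
        mean_dev = sum(abs(f - NOMINAL_HZ) for f in self.readings) / len(self.readings)
        return min(1.0, mean_dev / 0.2)

    def ingest(self, frequency_hz: float) -> None:
        self.readings.append(frequency_hz)
        if not self.islanded and self.risk_score() > RISK_THRESHOLD:
            self.islanded = True
            print("High risk detected: opening the grid interconnection "
                  "and running on local resources (island mode).")

controller = IslandingController()
# Stable readings, then a deepening frequency sag ahead of an outage.
for f in [60.01, 59.99, 60.02, 59.80, 59.65, 59.50]:
    controller.ingest(f)
```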

                          Author

                          Claire Gotham

                          VP, Utilities and NA Renewables Lead
Claire Gotham is an experienced Utilities and Renewables executive who has successfully developed complex projects, led diverse teams to deliver, and achieved business strategy goals. Her skill set comprises over 25 years of experience in consulting and business development. She is an SME in Commodities Risk Management, Renewables Strategy, Energy Transition, and Public Speaking and Training. She has led over 100 industry trainings and been a featured speaker at panels, podcasts, and industry events. She has also served as an Expert Witness and QIR (Qualified Independent Representative).

                            Local energy: a source of opportunity and resilience in the US energy transition

                            Claire Gotham
                            Jul 4, 2024

As we begin to move away from fossil fuels, the electrification of the US economy will be essential. Electricity demand is now estimated to grow by 4.7% over the next five years – a stark jump from the near-flat 0.5% annual demand growth we’ve seen for the past decade.

Rising demand from data centers and electric vehicle charging depots is creating major new loads, compounded by the move to reshore manufacturing in the US and the emergence of new energy facilities such as green hydrogen plants. US grid operators are struggling to handle all this additional load, resulting in power gaps and connection delays. Add to this the increase in weather and climate disasters that regularly cause outages across the country, and the problems with the grid in its current state become impossible to ignore.

                            At Capgemini, we believe that it’s not just about what the energy transition averts that’s important, but also what it enables. Here we look at what a brighter energy future could look like for the US – one that successfully navigates the move to electrification and empowers its communities with affordable, reliable power supply. How? By supplementing the main grid with independent, local systems of microgrids.

                            Microgrids – helping to power the future

The US Department of Energy (DOE) believes that, by 2035, microgrids will be essential building blocks of the future electricity delivery system, supporting resilience, decarbonization, and affordability. Community microgrids are distinct from private or single-site commercial ones in that they span an entire substation grid area, benefiting thousands of customers.


In the 2030s, we envision these microgrids spread across the USA. The main grid will of course remain a critical piece of energy infrastructure; however, microgrids will serve to boost and strengthen it. A key use case will be extreme weather: microgrids can disconnect from the main grid when it is down, unstable, or overloaded and switch supply to their own networks of distributed energy resources. This resilient energy infrastructure will insulate the local area from outages, protecting critical infrastructure, commerce, and citizens’ welfare.

                            Driving efficiency up and emissions down

The USA is a vast country, and increasing electrification in end-use sectors means US electric power demand will only increase through 2050. Despite marginal improvements, the distribution of electricity across the USA remains inefficient, with around 5% lost in transmission and distribution. Local generation can therefore play a key role in reducing emissions by cutting waste, regardless of how ‘green’ the sources are. Local power grids will increase efficiency by bringing the generation and storage of energy much closer to its consumption.
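As a back-of-the-envelope illustration (assuming, purely for the arithmetic, that the ~5% loss figure applies uniformly):

$$
E_{\text{generated}} = \frac{E_{\text{delivered}}}{1 - \ell}, \qquad
\ell \approx 0.05 \;\Rightarrow\; \frac{100\ \text{MWh}}{0.95} \approx 105.3\ \text{MWh}
$$

That is, every 100 MWh consumed requires roughly 105 MWh of central generation; producing energy at or near the point of use avoids most of that ~5 MWh overhead.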


                            This energy efficiency gain becomes particularly important when we look at the rise of AI. The potential of AI to transform every industry is undoubtedly huge. Yet its high energy intensity, coupled with the growing demand for it, threatens to put the already overloaded grid under strain. By localizing energy generation, businesses can more efficiently meet the energy demands of AI, thereby empowering their innovation.

                            Building active community engagement

A diverse range of energy sources is needed to make the transition successful, and in this microgrid-enabled future, communities across the country will be actively engaged in decisions around energy mixes. The geographic diversity of the US requires solutions that can be adapted to specific landscapes and regional preferences. Renewables such as wind and solar can be combined with more novel technologies such as small modular reactors (SMRs), green hydrogen, and carbon capture and storage (CCS) to improve communities’ local natural environment and civic health, as well as their carbon impact, as they transition away from fossil fuels.

                            Residents and businesses will also be able to purchase energy directly peer-to-peer from within their own microgrid, with profits then invested back into the community. In this way, the commercial benefits of the energy transition are shared more widely, with communities witnessing first-hand the positive impacts of local energy development.

                            Decentralization – a key strategy in energy cybersecurity

With its dependency on legacy technologies, the US electrical grid of the 2020s is extremely vulnerable to cyberattacks. Attacks are increasing not just in number but in sophistication, as hostile state actors and criminals dramatically increase their use of AI-enhanced digital tools to disrupt critical energy infrastructure.

Microgrids’ distributed architecture offers greater inherent resiliency, as there is no single point of failure. AI-powered microgrids have proactive, predictive intelligent defenses that don’t rely on local cybersecurity skills. The microgrid infrastructure also offers inherent redundancy through its diversity: for instance, if a solar installation is attacked, the microgrid can automatically isolate the affected area while continuing to rely on other energy sources.

                            Energizing high-skilled employment  

An additional benefit will be the new, highly skilled local jobs associated with designing, installing, and maintaining microgrids. As reliance on oil and gas subsides, workers from those industries can reskill to become part of this new, positive energy era. The energy employment gender gap will also begin to close as diversity and inclusion programs train more female talent for the growing renewables sector.

                            Last but not least, this new generation of workers will be highly motivated to continually innovate to improve the way the world is powered, as they feel the widespread impact in their local community. The energy transition will not be something that’s happening to them, but rather be the opportunity for them to actively shape their future, together.

                            Author

                            Claire Gotham

                            VP, Utilities and NA Renewables Lead
Claire Gotham is an experienced Utilities and Renewables executive who has successfully developed complex projects, led diverse teams to deliver, and achieved business strategy goals. Her skill set comprises over 25 years of experience in consulting and business development. She is an SME in Commodities Risk Management, Renewables Strategy, Energy Transition, and Public Speaking and Training. She has led over 100 industry trainings and been a featured speaker at panels, podcasts, and industry events. She has also served as an Expert Witness and QIR (Qualified Independent Representative).