
From the pitch to the boardroom: How rugby can empower leadership in the corporate world

Capgemini
Oct 3, 2023

Rugby, representing values that are also relevant in the business world, can help individuals develop not just athletically but also as leaders in the workplace. In this capacity, it can break down some of the barriers that women traditionally face.

What do the captain of a rugby team and the manager at a company have in common? Both motivate, evaluate, train, and – of course – communicate with a group of people. In both cases, a leader must thoughtfully assign the available roles, foster a group spirit, and instill collective momentum by encouraging everyone to work together.

There are, of course, notable differences between the world of the oval ball and that of business. The profiles in a rugby team are more homogeneous (in terms of age, gender, etc.) than in the office. The pace is also different: a few hours on the field versus five days of work per week. And, above all, while players are allowed time to find their bearings at the start of the sports season, the business world is less patient. The manager is expected to mobilize their team to deliver results as quickly as possible.

The art of being a good leader

Business has always shared many values with the sport. Whether you are a rugby captain or a manager, the most important quality can be summed up in one word: trust. The trust you inspire and the trust you give. You must be able to make decisions that set the collective in motion while giving everyone the space needed to take the personal initiatives that will in turn fuel the group’s dynamic. Another essential quality is passion, which a leader must be able to transmit to and nurture within their team.

Taking leadership skills from the pitch to the office

From stadium turf to the office carpet, the lessons learned from sports are as relevant as ever. Managers need to be able to adapt and question their habits to succeed in defusing conflicts and easing tensions, and they must be flexible in order to navigate a fluid environment. It’s a question of both mindset and organization. And it requires a certain intellectual agility – the ability to question the way one works and change direction when necessary.

The values of rugby and sport can aid and inspire decision-makers in their daily lives and careers.

Solidarity, cooperation, collective intelligence, team spirit, respect, and courage – these values, which are fundamental to achieving success in sport, are just as relevant to leaders in other environments.

Sport is a particularly good avenue for developing soft skills, which are in turn highly prized in the professional world. They can complement an individual’s professional skills by enabling them to effectively accomplish tasks within a larger group that requires one to navigate various relationships. Sport also fosters personal development, improving individual well-being, which subsequently helps workers develop better within their companies.

Sport as a vehicle of inclusion

Another advantage of sport, and rugby in particular, is that it contributes to the inclusion of women in the workplace, and especially to their promotion to leadership positions.

Although it took a long time to emerge – the French women’s team was only formed at the very end of the 1980s – women’s rugby is growing steadily. Illustrating this progress, France counted more than 26,000 registered female rugby players in 2022.

And the sport is starting to occupy television screens with, for example, the broadcast of the 2023 Women’s Six Nations Championship. This growth in coverage is just the beginning. Media coverage contributes to developing the sport, which is proving to be an important source of empowerment through which women can acquire leadership abilities that are valued in management positions.

Rugby enables women to affirm their personalities, express themselves, and develop a fighting spirit. Importantly, it gives them the confidence to strive for positions traditionally dominated by men. Rugby and business thus work side by side to facilitate women’s access to leadership – and that’s great news!

With our three-year partnership announced in September 2021, we joined the Worldwide Partners family for Rugby World Cup 2023 and became World Rugby’s Global Digital Transformation partner. Rugby World Cup France 2023 is set to be the major attraction in the sporting calendar this year, bringing the rugby family and new fans together for a celebration of 200 years of the sport. Capgemini has worked with France 2023 to enhance the tournament’s unforgettable moments on and off the field.

Women in Rugby

Global Partner of Women in Rugby and Worldwide Partner of Rugby World Cup 2021

Meta Connect 2023 keynote: Unveiling the future of mixed reality and AI – is the metaverse still in focus?

Alexandre Embry
Sep 29, 2023

Most of us followed the recent keynote from Mark Zuckerberg at Meta Connect 2023. It was interesting to see the focus on #mixedreality and the convergence of #immersive and #AI.

Quest 3, the latest headset able to blend the physical and digital worlds, and a new slate of AI assistants – which, embodied as avatars, could become incredibly realistic NPCs in immersive environments – sound like great improvements when it comes to digital interactions.
Indeed, media observers noticed a change in the messaging: the word “metaverse” was no longer part of the narrative.

Does that mean Meta is no longer focusing on the next generation of the internet?
I encourage those who think so to have a look at this recent, incredible video, ‘Mark Zuckerberg: First Interview in the Metaverse’, from the excellent Lex Fridman podcast: https://lnkd.in/eGD_eeD4

This interview is extremely insightful, and it is always interesting to understand Zuckerberg’s vision of the metaverse and what will come next.

But beyond this, the technology they used during this remote discussion is just amazing, capturing emotion like never before and creating the real feeling of being in the same room.

So, is the metaverse no longer part of Meta’s tech research agenda? The passion Mark shares here – drawing his vision of the future of online interactions for the years to come and of what Meta is working on – is a good answer in itself. And yes, it will come.

It is always interesting to observe the difference between a narrative dedicated to media and investors, and what’s really going on behind the scenes.

Meet the author

Alexandre Embry

Vice President, Head of the Capgemini AI Robotics and Experiences Lab
Alexandre leads a global team of experts who explore emerging tech trends and devise at-scale solutions across various horizons, sectors, and geographies, with a focus on asset creation, IP, patents, and go-to-market strategies. Alexandre specializes in exploring and advising C-suite executives and their organizations on the transformative impact of emerging digital tech trends. He is passionate about improving the operational efficiency of organizations across all industries, as well as enhancing the customer and employee digital experience. He focuses on how the most advanced technologies – such as embodied AI, physical AI, AI robotics, polyfunctional robots and humanoids, digital twins, real-time 3D, spatial computing, XR, and IoT – can drive business value, empower people, and contribute to sustainability by increasing autonomy and enhancing human-machine interaction.

    How to conduct a watertight risk assessment for 5G networks

    Aarthi Krishna & Kiran Gurudatt
    29 Sep 2023

    By 2025, 5G networks are expected to cover a third of the world’s population. As the global footprint of 5G expands, so does the associated security risk. While the underlying security capabilities of 5G are superior to those of previous generations, they are not without limitations.

    Retail customers use public networks with limited security liability, but most organizations using 5G (typically for manufacturing and operations) will need to build a private network or use a hybrid public/private model that is built to meet their specific requirements. The complexity of such an ecosystem makes risk assessment an essential part of implementing security for 5G.

    In the previous blog, we looked at the challenges associated with 5G deployment architectures and why risk assessment must be holistic in nature, covering both the horizontal and vertical axes of the network. Here, we take a closer look at what it takes to conduct a full risk assessment.

    Essential steps for a robust risk assessment

    A thorough risk assessment ensures full coverage of the 5G network. It has to be comprehensive and end-to-end, with a full understanding of the people, processes, and technology risks, while adhering to the necessary frameworks, such as NIST, the ISA/IEC 62443 standard, and MITRE FiGHT.

    At Capgemini, we follow three essential steps, as with any risk assessment:

     1. Discovery

    The discovery phase aims to gather all necessary information about the 5G environment, its assets, the number of and types of devices, and the use cases deployed, along with the organization’s risk appetite and existing security policies.

    2. Assessment

    Once all the necessary information has been collected, the assessment phase evaluates security controls and policies that are pertinent to the 5G network. Every identified gap is assigned a risk score, and reports and visual aids are created to clearly communicate these findings.

     3. Reporting

    After the assessment, a complete view of the current maturity level and the risk scores is reported, and supporting recommendations are presented to enhance the security posture.

    Our approach to 5G risk assessment is divided into two key parts, one covering the technical controls and the other covering managerial and operational controls:

    • Technical controls: This part of the assessment addresses the technical controls implemented in a 5G network, spanning components that include the endpoints, mobile edge computing (MEC), radio access network (RAN), core, and other functional elements like NFV and network slicing. For each of these components, the assessment is further classified into several sub-categories. For instance, the sub-categories against which an endpoint is assessed include its access control, network security, supplier security, physical security, and asset management. Such sub-categorization is especially useful when assessing functional elements. For example, network slicing in 5G is a key functional element that enables the delivery of meaningful guarantees for network coverage, performance, capacity, or even security. Slicing essentially divides the underlying physical network infrastructure into multiple virtual networks, each catering to a specific quality of service (QoS), such as low latency for real-time applications, high bandwidth for multimedia streaming, or ultra-reliability for critical communications. While this adds significant value to a network and is expected to provide monetization opportunities, each slice has its own security requirements to protect against the attack vectors relevant to it. Our risk assessment approach covers the following sub-categories for slicing:
      – security for the installation and configuration of a slice
      – security during the slice preparation phase
      – security during the slice run phase
      – security for the slice decommissioning phase
      – inter- and intra-slice security
      – slice interface security
    • Management & operational controls: This part of the assessment covers risks primarily related to governance, human resource management, incident management, operations management, monitoring, audit and testing, and threat awareness. This means asking those non-technical but critical questions, such as the following (a sketch of how findings from both parts might be recorded appears after this list):
      – Has a potential dependency on a single supplier of 5G equipment been considered?
      – Have personnel with access to critical or sensitive components of 5G networks been security-vetted?
      – Are there documented plans in place in case of a disaster affecting the ongoing operation of the 5G network?
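To make the output of this two-part assessment concrete, below is a minimal, illustrative sketch of how identified gaps might be recorded and rolled up into risk scores across both technical and management/operational controls. The component names, the 1-to-5 scales, and the likelihood-times-impact scoring convention are assumptions for illustration only, not a Capgemini methodology or a standards-body specification.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    component: str      # e.g., "RAN", "MEC", "network slicing", "governance"
    control_type: str   # "technical" or "management/operational"
    gap: str            # description of the identified gap
    likelihood: int     # 1 (rare) to 5 (almost certain) -- assumed scale
    impact: int         # 1 (negligible) to 5 (severe) -- assumed scale

    @property
    def risk_score(self) -> int:
        # A common simple convention: risk = likelihood x impact
        return self.likelihood * self.impact

findings = [
    Finding("network slicing", "technical",
            "no isolation testing between slices", 3, 5),
    Finding("governance", "management/operational",
            "single 5G equipment supplier with no documented fallback", 2, 4),
]

# Reporting phase: present the highest risks first.
for f in sorted(findings, key=lambda f: f.risk_score, reverse=True):
    print(f"{f.component:20} {f.control_type:25} risk={f.risk_score:2}  {f.gap}")
```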

    This risk assessment approach offers several key advantages:

    • Standards: It brings together recommendations and guidance from various organizations and regulations, such as ENISA, ETSI, IETF, ITU-T, ISO, O-RAN, OWASP, NIST, and GDPR.
    • Deployment model: It addresses security risk for various non-public network (NPN) deployment models, including stand-alone deployment and public network integrated deployment.
    • End-to-end capabilities: It covers the entire operational technology (OT) and Internet of Things (IoT) ecosystem that sits on top of the 5G network. This includes security concerning manufacturers, suppliers, telco operators, edge, OT, and IoT devices.
    • Compliance: It is compliant with industry-accepted standards (e.g., IEC 62443) to facilitate auditing and certification. It incorporates functional and operational requirements for different security levels, such as SL1, SL2, and SL3.
    • Comprehensive coverage: It provides full coverage of technical and non-technical risks while spanning the entire 5G architecture, from edge devices through RAN, core, MEC, and cloud to the applications.

    Ongoing monitoring and compliance

    A sound risk assessment is the start of any security journey. After the assessment and the deployment of various security controls, it is crucial to establish an ongoing process for monitoring and responding to security events in the 5G network. Continuous monitoring allows organizations to detect and respond to any security incident promptly, maintaining a robust security posture – the next blog in the series will consider how to deliver such a monitoring program for 5G networks. Stay tuned.

    Contact Capgemini today to find out about 5G security.

    Author

    Aarthi Krishna

    Global Head, Intelligent Industry Security, Capgemini
    Aarthi Krishna is the Global Head of Intelligent Industry Security within the Cloud, Infrastructure and Security (CIS) business line at Capgemini. In her current role, she is responsible for the Intelligent Industry Security practice, with a portfolio focused on both emerging technologies (such as OT, IoT, 5G, and DevSecOps) and industry verticals (such as automotive, life sciences, and energy and utilities) to ensure our clients can benefit from a true end-to-end cyber offering.

    Kiran Gurudatt

    Director, Cybersecurity, Capgemini

      Network transformation for digital experiences: the CAMARA APIs evolution

      Deepak Gunjal
      Sep 28, 2023

      Learn how software-defined systems and APIs will transform the mobile network landscape, and what you can do now to take advantage of this change.

      The modern economy runs on data, and ever more of it is delivered via mobile networks to phones and IoT devices. The applications that run on these devices – from video streaming, to predictive machine maintenance, to drone piloting – increasingly demand more functionality from mobile network providers.

      Application providers may want to adjust levels of service dynamically, e.g., a sports streaming service may want more bandwidth during big matches, or a crop inspection drone may want short periods of stable data throughput and low latency during flyovers. Others may want more data about who is trying to connect and where they are, e.g., to optimize services, prevent fraud, or geofence services.

      Modern mobile networks are increasingly able to serve these needs. As they move from physical hardware to software-defined systems, networks gain greater ability to adjust their complex network setups in real-time. That could be very useful for application providers.

      Doing so is a win-win. Communication Service Providers (CSPs) can monetize their network’s increasingly sophisticated capabilities, whilst application providers get more bespoke services, and so can build better products that users will pay for.

      The challenge is getting mobile applications and networks to talk to each other.

      Unlocking the Potential of APIs

      Enter LF CAMARA, a joint initiative between telco operators, vendors, and hyperscalers, which aims to develop APIs that can expose network capabilities to application developers.

      The project launched in February 2022 as an open-source project in the Linux Foundation, with a legal framework and terms of reference aligned with the GSMA Operator Platform Group (OPG). It is a serious, collaborative attempt to solve a problem that has proven tricky in the past.

      As network technologies have undergone rapid evolution, so have the capabilities they can provide to consumers. Advancements from 4G to 5G have brought new network functions, such as the Service Capability Exposure Function (SCEF) and the Network Exposure Function (NEF), which are designed to expose certain network capabilities – such as prioritizing traffic flows, monitoring device status, and verifying locations – to external applications via 3GPP-defined REST APIs (also known as RESTful APIs).

      The LF CAMARA project is creating open, global, and interoperable REST APIs (the Service APIs) that grant access to network capabilities across various networks, irrespective of the network a customer uses. The APIs serve as abstractions of the network capabilities, sparing application developers the need to understand the complexities of network technologies.

      This combination of ease of use and cross-network collaboration empowers applications not only to add new functionality easily, but also to deliver that functionality consistently across different telco networks and countries.

      Digital Everywhere

      A common goal of digital services is to be accessible on user devices regardless of location. Maintaining a consistent quality of experience (QoE) is key.

      Within LF CAMARA, the evolving Open APIs are designed to embed these capabilities directly within communication networks. They empower application providers to monitor changes in user experiences when connected to the network, to interact programmatically with the network through Open APIs, and adjust their application’s behavior accordingly.

      As an example, consider a video streaming application provider. Using the Device Location API, they can subscribe to updates on when users change location. Depending on the location, they may want to change the way the user connects, for example using the Traffic Influence API to request the network connects the user’s application to a local low-latency edge service. This can enhance the user experience, but the application can also take it a step further by utilizing the Quality on Demand API to request specific quality of service for its application traffic, to ensure the desired QoE.
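A minimal sketch of the last step in that flow is shown below, assuming a CAMARA-style Quality on Demand endpoint. The base URL, token, field names, and profile name are hypothetical placeholders; the exact request schema depends on the CAMARA API version and the operator’s exposure platform.

```python
import requests

# Illustrative sketch of requesting quality of service for application traffic.
# The gateway URL, credential, and payload fields are assumptions: consult the
# CAMARA Quality on Demand API definition for the version your operator exposes.
BASE = "https://api.example-operator.com/qod/v0"  # hypothetical API gateway
HEADERS = {"Authorization": "Bearer <token>"}     # operator-issued credential

session_request = {
    "device": {"ipv4Address": {"publicAddress": "203.0.113.45"}},
    "applicationServer": {"ipv4Address": "198.51.100.10"},
    "qosProfile": "QOS_E",   # e.g., a low-latency profile name (assumed)
    "duration": 3600,        # seconds of guaranteed QoS
}

resp = requests.post(f"{BASE}/sessions", json=session_request,
                     headers=HEADERS, timeout=10)
resp.raise_for_status()
print("QoD session created:", resp.json().get("sessionId"))
```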

      In essence, Open APIs enable application providers to proactively adapt their services based on real-time network insights, ensuring users consistently enjoy high-quality digital experiences, regardless of their circumstances.

      The API Platform

      Delivering on the potential of LF CAMARA necessitates an API platform which can translate application requests into the right network responses.

      Such a platform must serve a dual role. On one hand, it should provide northbound API exposure capabilities, i.e. allow application providers to talk to the network. On the other, it must implement transformation functions to effectively carry out the instructions expressed through the APIs.

      Transformation encompasses the translation of CAMARA API resource abstractions into 4G/5G network APIs, such as 3GPP SCEF/NEF northbound APIs (among other standard interfaces and APIs). This includes handling associated parameters and managing communication with 4G/5G network functions.
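As a hedged illustration of this transformation role, the sketch below maps a simplified CAMARA-style QoD request onto a simplified NEF-style payload. The field names on both sides are illustrative stand-ins, not the full CAMARA or 3GPP TS 29.522 schemas.

```python
# Illustrative only: translating a CAMARA-style Quality-on-Demand request into
# an NEF-style (AsSessionWithQoS-like) payload. Real mappings handle many more
# parameters, error cases, and notification callbacks.
def camara_qod_to_nef(qod: dict) -> dict:
    return {
        "ueIpv4Addr": qod["device"]["ipv4Address"]["publicAddress"],
        "afAppId": "video-streaming-app",   # hypothetical application id
        "qosReference": qod["qosProfile"],  # profile mapped to a network QoS reference
        "usageThreshold": {"duration": qod["duration"]},
    }

qod_request = {
    "device": {"ipv4Address": {"publicAddress": "203.0.113.45"}},
    "qosProfile": "QOS_E",
    "duration": 3600,
}
print(camara_qod_to_nef(qod_request))
```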

      Engineering the Future of Networks

      Implementing an end-to-end API solution requires a comprehensive understanding of all elements, including familiarity with 3GPP specifications, technical expertise in network infrastructure organization and protocols, procedural knowledge, and security compliances. These factors are critical when integrating an API platform with 4G/5G networks. The API platform must adeptly manage these complexities to offer a simplified LF CAMARA API exposure to its consumers.

      Capgemini has been an integral part of the LF CAMARA project from the start, actively contributing to the EdgeCloud subproject. Our involvement focuses on evolving the APIs necessary to expose edge computing services within the telco environment. Capgemini’s edge service platform, the Intelligent Edge Application Platform (IEAP), already provides CAMARA APIs for Device Location, Device Status, and Quality on Demand (QoD), complete with the required transformation functions. And it is continuously evolving to support other APIs, such as Traffic Influence, Edge Sites Selection and Routing, and more.

      Furthermore, the successful integration of these APIs with the 3GPP-compliant Network Exposure Function (NEF) has been rigorously tested within a 5G Standalone (SA) core network.

      APIs that expose network capabilities could offer significant business benefits, to both application providers and networks themselves. But getting it right is technically complicated. Thanks to our involvement in LF CAMARA, combined with our deep historical expertise in the area, Capgemini is now in a leading position to help both sides take advantage of this new opportunity.

      Contact us to explore CAMARA APIs for Edge

      Meet our expert

      Deepak Gunjal

      Senior Director – Advanced Connectivity
      Deepak currently serves as Senior Director, CTO Connectivity office, at Capgemini Engineering. He represents Capgemini Engineering in various standardization bodies, mainly the GSMA Operator Platform Group (OPG), the Operator Platform API Group (OPAG), and the Linux Foundation CAMARA project. He also contributes to the architectural evolution of Capgemini’s cloud-native platforms supporting edge computing, network API exposure, and more in mobile networks. He has over twenty-three years of experience in the telecom and software industry.

        Is Cloud-first really a panacea?

        Srinivas Patnaik
        Sep 28, 2023

        How modern enterprises can effectively navigate Cloud adoption and repatriation strategies – no matter where they are on their transformation journeys

        In recent years, a cloud-first approach has emerged as the go-to strategy for organizations looking to modernize and optimize their IT infrastructures. The adoption of Cloud services by Fortune 500 companies has surged significantly – driven by the allure of scalability, reduced infrastructure costs, and heightened agility. The impressive revenue growth of the top three hyperscale Cloud providers further reinforced this trend.

        Nonetheless, amid the euphoria, not everyone found themselves on cloud nine. In 2021, venture capital firm Andreessen Horowitz fired the first shot – garnering considerable attention with an article that challenged the widely held belief that Cloud adoption was a universal panacea. By analyzing the financial performance of various software companies (including Dropbox), the authors questioned the sustainability of Cloud’s advantages when business growth slows.

        The Dropbox case, which showcased substantial savings and improved margins after repatriating workloads, served as a compelling example. The article also extended its scope to highlight how Cloud’s impact on profit margins could lead to potential losses in market capitalization – amounting to hundreds of billions of dollars. The takeaway was clear – Cloud costs should take center stage as a primary metric – and companies should explore optimization, repatriation, and hybrid strategies to manage the intricate balance of Cloud costs and benefits.

        In 2022, Basecamp entered the arena with a public endorsement of repatriation – sharing their first-hand experience with Cloud adoption. Their verdict was that Cloud services – while advantageous for simple applications and sporadic workloads – do not necessarily deliver the promised savings for medium-sized companies with stable growth like theirs. The touted benefits of reduced complexity were overshadowed by significant costs that could have been mitigated by in-house management at a fraction of the price. Basecamp’s argument against overselling Cloud advantages resonated in their call for a more decentralized Internet future, which could be achieved through self-managed hardware.

        Fast-forward to 2023 – and a hybrid IT landscape is emerging as the prevailing reality. F5’s report on The State of Application Strategy underscores that organizations of all kinds are adopting hybrid strategies – distributing workloads between public clouds and on-premises infrastructure. The debate over Cloud versus on-premises solutions remains unsettled. As organizations grapple with the decision of adopting the Cloud or repatriating to on-premises solutions, they find themselves navigating a complex terrain that’s rife with implications for IT budgets and performance. However, organizations can start assessing and reviewing their Cloud migration or repatriation efforts by carefully considering the following aspects.

        Enterprises embarking on their Cloud journeys

        For enterprises embarking on their Cloud journeys, an evaluation of application suitability is paramount. Lessons from Basecamp’s experience offer valuable insights into assessing Cloud adoption proposals against the specific business context of applications. Applications with limited growth potential might not justify the return on investment for Cloud adoption. The reality remains that numerous organizations continue to host critical applications on-premises due to technical, regulatory, and economic considerations. Opting out of Cloud migration when ROI is unclear is a legitimate approach.

        Cloud assessment often designates certain applications as prime candidates for “Lift and Shift” – the process of moving applications to the Cloud without extensive modifications. However, organizations should scrutinize this approach, as it might sacrifice the full spectrum of Cloud-native features and optimizations. While this method facilitates quick migration, it tends to perpetuate inefficiencies and complexities, leading to elevated costs and limited scalability. Deciding between rearchitecting and Lift and Shift should hinge on the long-term strategic significance and growth potential of applications.

        Enterprises that have already embraced the Cloud

        For enterprises that have already embraced the Cloud and are now assessing the overall benefits, a focus on Financial Operations (FinOps) is crucial. FinOps aligns technical and financial teams to manage Cloud costs, monitor usage, and make informed decisions to achieve cost efficiency and budget control. By analyzing the Cloud bill, organizations can uncover instances of suboptimal migrations and identify resource inefficiencies. This insight paves the way for migration strategy re-evaluation, application rearchitecting, and adherence to Cloud resource management best practices.
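As a simple illustration of the kind of analysis FinOps enables, the sketch below flags resources whose utilization is low relative to their cost. The billing records and the 20% threshold are invented for the example; real cloud bills carry far more dimensions (tags, reservations, egress, and so on).

```python
# Minimal FinOps-style sketch: flag resources whose utilization is low relative
# to their monthly cost. All records and the threshold are illustrative.
billing = [
    {"resource_id": "vm-web-01",   "monthly_cost": 1200.0, "avg_util": 0.62},
    {"resource_id": "vm-batch-07", "monthly_cost": 3400.0, "avg_util": 0.08},
    {"resource_id": "db-replica",  "monthly_cost": 2100.0, "avg_util": 0.15},
]

THRESHOLD = 0.20  # assumed cut-off for "suboptimal" utilization

candidates = [r for r in billing if r["avg_util"] < THRESHOLD]
for r in sorted(candidates, key=lambda r: -r["monthly_cost"]):
    print(f"{r['resource_id']}: ${r['monthly_cost']:,.0f}/month at "
          f"{r['avg_util']:.0%} utilization - review for rightsizing, "
          "rearchitecting, or repatriation")
```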

        Repatriating workloads warrants careful consideration – especially for workloads characterized by predictability, stable usage patterns, or specific compliance and security needs. This process involves a meticulous analysis of workload requirements, infrastructure readiness, and potential operational shifts. Repatriation empowers organizations to regain control over resources, curtail Cloud-related expenses, and tailor infrastructure to exact business needs.

        Site Reliability Engineering (SRE) could also be an essential companion on repatriation journeys. The symbiotic relationship between SRE practices and Cloud repatriation underscores their significance. The proliferation of SRE practices – initially conceived at Google, itself a Cloud provider – emphasizes their role in efficient infrastructure management. SRE operations can facilitate workload repatriation while sustaining cost savings and operational efficiency.

        Artificial Intelligence (AI) and Automation emerge as pivotal enablers when organizations transition back to on-premises data centers. AI holds the potential to optimize data center performance through real-time data analysis and machine learning algorithms. This translates to enhanced power consumption efficiency, improved capacity planning, effective resource allocation, augmented security through anomaly detection, and automated cooling and power systems. AI-driven solutions align with sustainability and Environmental, Social, and Governance (ESG) goals while also enhancing efficiency, cost reduction, and overall data center performance. These are all significant incentives for on-premises alternatives.

        Effectively navigating your Cloud adoption and repatriation strategy with Capgemini

        Organizations must approach Cloud adoption and repatriation decisions pragmatically and harness the wisdom garnered from years of Cloud migration. Tailoring choices to individual business and technical contexts while eschewing dogmatic thinking remains pivotal.

        With a wealth of experience in delivering optimal Cloud migrations, Capgemini’s mature Cloud advisory practice and Cloud Modernization with ADMnext offering are primed to assist organizations in making these pivotal decisions. To learn more about how Cloud Modernization with ADMnext can help you chart your course through the ever-evolving Cloud landscape, drop me a line below.

        Meet the author

        Srinivas Patnaik

        ADM Solutions Leader
        As a Lead Solution Architect, I drive ADM deals as part of Capgemini’s Portfolio Solutions team. I orchestrate solutions across multiple towers and bring together the best and relevant offerings to craft compelling propositions for our customers. I’m passionate about helping customers optimize and modernize their application portfolios.

          What have two decades of tracking Europe’s digital government journey taught us?

          Jochem Dogger
          Sep 26, 2023

          It’s been more than 20 years since Capgemini teams began tracking Europe’s progress toward digital public services. Through the findings of the European Commission’s annual eGovernment Benchmark, we explore what has changed and what remains the same after two decades of digital government transformation.

          Can you imagine a public sector service provider today acknowledging “a total absence of any publicly available website”? No! Neither can we. But that was one of the possible research outcomes explored in the European Commission’s first ever eGovernment Benchmark report published in 2003 and reporting on digital progress between October 2001 and October 2002.


          A lot has changed in the past two decades, not least the massive upswing in the use of mobile technology to access government services. As the 2023 eGovernment Benchmark report is published, we take a look at what it tells us about the ongoing digitalization of public services in Europe and assess what’s changed – and what hasn’t – over the past 20 years.


          The latest eGovernment Benchmark study captured the digital transformation of governments in 2021 and 2022. It seems a far cry from the very first study carried out at a time when it was estimated that, in July 2002, just under 1 in 10 (9.1%) of the world’s population were internet users. By July 2022, that figure had risen to a little under 7 in 10 (69%) of the world’s population.

          Then and now – how things have changed 

          The ubiquity of internet access is understandably reflected in the ‘then and now’ findings of the eGovernment Benchmark studies across the years. In the 2003 report we learned that one in five public sector organizations did not have a website. Today, all the public sector service providers in the latest eGovernment Benchmark evaluation had a website that citizens could visit. This is important in terms of the availability of digital government services, with citizen-centric service delivery a core driver of digital transformation. Indeed, the EU’s ambitious Digital Decade policy program aims to make key public services in Europe available 100% online by 2030.


          So, how have Europe’s government digital services progressed? In the earlier eGovernment Benchmark, we discovered that just 20% of services were available online. Compare this with the 84% that can be completed fully online today – in other words, with fully electronic case handling end-to-end. Back at the advent of eGovernment, web-enabling public services largely meant either simply making information available online or enabling one-way interaction (downloading forms) to start a procedure. The latest eGovernment Benchmark reveals that 82% of services now allow online authentication and 70% enable safe and secure authentication with eID. Further, 68% of online forms are pre-filled with personal data, ensuring that users only need to enter information once.

          Interestingly, in the first eGovernment Benchmark, the smartphone was just a concept in the heads of a few ambitious entrepreneurs. It seems extraordinary today that the first report merely addressed mobility as follows: “…and maybe in the future, services provided by governments through WAP” – or, in other words, via wireless networks. Now, of course, 93% of European government websites are mobile friendly and the latest report tells us that 63% of online services can be completed on smartphones. Two decades ago, no one had even heard of smartphones, let alone considered the possibility of using them to access online services.

          20 years on – similar service gaps to be bridged 

          However, not everything in digital government has advanced at speed in the past two decades. For example, it is perhaps surprising in light of the prevailing high-level of consumer internet use that one unchanging aspect of eGovernment is the disparity between online services for citizens and those for businesses. It is, in fact, one of the three main service gaps that the 2023 eGovernment Benchmark report urges Europe’s governments to bridge. 

          In the first ever Benchmark, we learned that by October 2002 around one in ten public services for citizens were available online in the participating countries, with that figure rising significantly to 31% of public services for business users (almost 20 percentage points higher than for citizens). While this gap has narrowed, there is still a marked disparity between digital government services for businesses and for citizens. The 2023 eGovernment Benchmark reveals that 80% of public services for citizens are available online, rising to 92% for business users.

          Central governments set the pace

          Another of the service gaps referenced by both the inaugural 2003 and latest 2023 report is that between local & regional governments and central governments. In the earlier report, while we don’t have statistics to draw on, we read as follows: “… the best results were achieved by centrally co-ordinated public services that have limited complex procedures … and the services with the lowest scores are typically coordinated by local service providers and have more complex procedures…”  

          It’s a similar story in 2023 where 88% of evaluated central government services were completely online, compared to 76% of evaluated regional government services and 62% of evaluated local government services. The report states: “Creating a level playing field between different levels of government is the first step towards better online services for everyone.”

          Breaking down barriers to trade

          The third service gap discussed in the latest report evidences another changing aspect of eGovernment. Back in 2003, cross-border trade in Europe was barely touched on, with the exception of a case study on Dutch Customs. Such services were, essentially, non-existent. Today, while still far from on par with in-country national services, 49% of services for cross-border users are fully online. The report notes that this is an aspect of eGovernment that is “ready for the next step … with more than 30 countries connected to eIDAS and the ongoing improvements on the Your Europe portal”.

          The journey continues: interoperability is key in the future 

          The eGovernment Benchmark is a continuously evolving measurement, inspiring countries to keep improving their online services, whether that’s better accessibility, increased transparency, or tighter online security – all evolving aspects of digital government services for the past two decades.

          So, what next on the ongoing digital transformation journey for Europe’s public sector? The latest report argues that interoperability will be crucial for minimizing the service gaps. For example, it will help to create the recommended ‘level playing field’ between central and regional & local governments by enabling existing architectural building blocks, such as eID and eSignature, to be easily adopted on other websites. Europe’s Interoperability framework and its new Interoperable Europe Act will play a vital role in this move forward. Greater interoperability will also boost cross-border service delivery by removing barriers, such as a lack of appropriate translation functions and cross-border eID options.

          Achieving Europe-wide interoperability needs investment. The 2023 eGovernment Benchmark report concludes that this could ‘potentially’ come from the redirection of funds from the Recovery and Resilience Facility. Such funding would be a boon to smaller government authorities still struggling in their digitalization efforts.

          Informing eGovernment policy

          It has been fascinating to look at the changing context for eGovernment in the EU through the findings of the eGovernment Benchmark across two decades. The annual survey tracks continued improvements in online public services, while comparing how governments deliver those services across Europe. In measuring digital government as a pillar of digital progress, the eGovernment Benchmark aims to help public sector leadership, policy makers and those in everyday operations make better decisions for continual improvement. 

          Find out more

          Read the eGovernment Benchmark 2023 for the full analysis of Europe’s progress towards connected digital governments that put users – citizens, businesses, cross-border organizations – at their heart.

          Authors


          Jochem Dogger

          Manager in the Data, Research & Evaluations team
          “The public sector is increasingly realizing the potential of the data it gathers to improve citizens’ lives. The challenge ahead is to keep using data in an ethical and responsible manner, while opening up vital data sources to citizens and entrepreneurs and facilitating interoperable data exchange between institutions. This will enable governments to realize the economic, societal, political and environmental benefits that data has to offer.”

          Niels van der Linden

          Vice President and EU Lead at Capgemini Invent
          “Making it easy for citizens and businesses to engage with government increases the uptake of cost-effective and more sustainable digital services. Currently, however, many governments do not yet share service data, missing out on the one-government experience and preventing them deriving actionable insights from monitoring and evaluating the state-of-play. We help to design, build, and run trusted, interoperable data platforms and services built around the needs of citizens and businesses.”

            The future of logistics – how AI is revolutionizing decision-making

            Jorg Junghanns
            Sep 26, 2023

            Complementing human experience and expertise with AI-generated insights enables logistics professionals to tackle complex challenges with confidence and make informed choices that drive business growth and innovation.

            I was recently interviewed by a renowned German logistics publication on the topic of how organizations are leveraging artificial intelligence (AI) to reshape the logistics landscape, which is leading to smarter decision-making and increased efficiency.

            In this article, I summarize the interview by talking about how AI is making its mark and the exciting possibilities and opportunities it is set to create for logistics in the future.

            AI-powered decision-making

            AI is a broader concept encompassing methods for machines to simulate human intelligence. Machine learning (ML) is a subset of AI that focuses on machines learning from data without explicit programming. In logistics, both AI and ML have distinct roles. AI encompasses rule-based and expert systems, while ML is used for tasks like demand forecasting and route optimization.

            Logistics also employs predictive analytics, natural language processing (NLP), and ML to revolutionize decision-making:

            • Predictive analytics use historical data and external factors to forecast trends (a minimal forecasting sketch follows this list)
            • NLP bridges language barriers for better customer understanding
            • ML automates tasks while detecting patterns.
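As a minimal illustration of the forecasting idea in the first bullet, the sketch below predicts next week’s shipment volume with a simple moving average. The numbers are invented, and production systems would use far richer models and features (weather, promotions, market data).

```python
# Illustrative demand-forecasting sketch: a moving average over weekly shipment
# volumes. The history values are made up for the example.
weekly_shipments = [120, 135, 128, 150, 160, 155, 170, 180]

def moving_average_forecast(history, window=4):
    """Forecast the next period as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

print(f"Next-week forecast: {moving_average_forecast(weekly_shipments):.0f} shipments")
```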

            By integrating these technologies, logistics professionals can access actionable data, empowering data-driven decisions. Importantly, AI complements human expertise, enhancing problem-solving and innovation.

            Current AI applications

            AI is making its mark in two key areas – technology and data:

            • Technological advancements – autonomous vehicles, including cars, trains, and drones, are used for efficient last-mile delivery. Built-in cameras and sensors identify package details. Warehouses benefit from AI-driven robotics, including automated guided vehicles (AGVs), autonomous mobile robots (AMRs), and more
            • Data-driven insights – AI algorithms analyze historical data, market trends, and external sources like blockchain. This enhances logistics demand forecasting, route optimization, warehouse layout redesign, and automated inventory management. Technologies like intelligent document processing (IDP) and NLP streamline data management and improve communication.

            Future potential of AI

            The future of logistics holds significant potential, especially in two areas:

            • Autonomous vehicles and drones – self-driving trucks and delivery drones promise to transform logistics by reducing labor costs and enabling faster, flexible deliveries
            • Enhanced visibility – AI, combined with blockchain data and robotic process automation (RPA), will continue to improve supply chain operations with enhanced demand forecasting, inventory optimization, and end-to-end visibility. This shift from reactive to proactive supply chain management will increase resilience and sustainability in our volatile, uncertain, complex, and ambiguous (VUCA) world.

            The current state and industry adoption

            AI is already widely adopted in logistics. Major players such as Kuehne+Nagel, DHL, Amazon, and Alibaba lead the way, optimizing their operations with AI. Startups and technology providers offer specialized AI solutions, making the technology accessible to a broader range of businesses. At Capgemini, we apply these advancements to achieve next-generation supply chain performance.

            Logistics professionals often seek guidance on AI implementation, vendor selection, integration with existing systems, data security, and privacy concerns in AI applications. They also inquire about best practices for navigating the transition to AI-driven logistics. Additionally, concerns about job displacement by AI solutions are prevalent. It’s essential to prioritize AI technologies based on critical use cases and positive business outcomes. This ensures that AI adoption is purposeful and impactful in the logistics sector.

            A glimpse into the future

            Looking ahead to the next 5 to 10 years, quantum computing could usher in transformative changes. This technology promises to solve complex problems beyond the capabilities of traditional computers, offering real-time fleet and route optimization and simulation of intricate supply chain networks. The focus will be on harnessing technology to develop sustainable, revolutionary, and inclusive supply chains.

            Another area for opportunity is empowering your people with new and exciting roles to drive digital transformation and unlock enhanced outcomes – not just in logistics, but across your entire supply chain. Significant investment is required to not only streamline processes and implement new technologies, but also support emerging roles and skillsets to respond to and stay ahead of the evolving nature of work within the supply chain.

            To discover how Capgemini’s Intelligent Supply Chain Operations delivers cognitive, touchless operations, and data-driven decision-making to your organization, contact: joerg.junghanns@capgemini.com

            Meet our expert

            Jörg Junghanns

            Global VP – Supply Chain Orchestration, Intelligent Supply Chain Operations, Capgemini’s Business Services
            Jörg is leading Capgemini’s global Supply Chain Orchestration capability within BSv’s Intelligent Supply Chain Operations, driving transformative solutions across industries. He employs innovation and strategic thinking to empower supply chain growth, utilizing Capgemini’s Digital Services for planning, order management, procurement, and automation. With a global background, he excels in digital strategy, shared services, process design, and project management. Additionally, Jörg leads Capgemini’s European business for Intelligent Supply Chain Operations.

              Surviving – and thriving – on the data frontier

              Helen Ristov
              Sep 25, 2023

              How you can strategize and craft a solid data architecture to lead and succeed in a data-centric future 

              The sheer amount of data that is being processed today is pressing organizations to adapt and adopt new technologies that can handle real-time workloads and machine learning applications. The world is evolving at a frantic pace – and data is driving this rapid evolution. Many now liken data to digital oil – and the “data rush” into this space has opened many opportunities as organizations redesign their data architectures to stay relevant and lead their markets going forward.  

              In my previous post, we looked at why taking a deep dive into your data and a comprehensive maturity assessment are critical in outperforming your market. Now, we’ll be going further and looking at how you can take action, strategize, and craft the future state of your data architecture. This is essential in effectively addressing your main data challenges today – and how you will successfully handle and adapt to the data workloads of tomorrow. Here are some common themes you should consider when developing your future data architecture solutions: 

              • Separate and understand your systems of engagement and channels – and relay this information through the appropriate databases and APIs for each one
              • Maintain a consolidated data lake layer with a curated data vault for analytics and reporting. You’ll want to separate the raw and curated data, as they usually serve different business functions – for example, R&D, development, and production
              • Migrate your core data services to cloud processing and integrate real-time channels for analytics using queues (a minimal sketch of this pattern follows this list).
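The third bullet’s pattern can be illustrated with a minimal sketch: events arrive on a real-time queue, land unchanged in a raw zone, and a curated transform feeds analytics and reporting. The queue, zones, and event shape are illustrative stand-ins for services such as Kafka and a cloud data lake.

```python
import json, queue

# Minimal sketch of queue-based ingestion with separated raw and curated zones.
events = queue.Queue()
raw_zone, curated_zone = [], []

events.put(json.dumps({"order_id": 1, "amount": "42.50", "channel": "mobile"}))

while not events.empty():
    raw = events.get()
    raw_zone.append(raw)            # keep the untouched record for R&D and replay
    record = json.loads(raw)
    curated_zone.append({           # typed, validated view for analytics/reporting
        "order_id": int(record["order_id"]),
        "amount": float(record["amount"]),
        "channel": record["channel"],
    })

print(f"raw: {len(raw_zone)} records, curated: {len(curated_zone)} records")
```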

              Taking action, strategizing, and crafting the future state of your data architecture – How can Capgemini help you? 

              Our enterprise data architecture services help clients develop blueprints and architectural runways that define the structure and operations of their organizations. Our intent here is to determine how your company can most effectively achieve its current and future business and technology objectives – while simultaneously shepherding innovative methods, processes, and technologies into your organization’s landscape.  

              When evaluating enterprise data architecture services providers, it’s essential to seek out a partner who can seamlessly integrate with your business from end to end. At Capgemini, enterprise data architecture services comprise an overarching strategy and visioning, complete current state assessment, target state definition, delivery assurance, planning and budgeting, and operating model assessment and definition: 

              Strategy and visioning 

              We begin by helping you define the required and appropriate business and technology transformation roadmap – and the requisite strategic and tactical initiatives. We also assist in creating a governance model to marshal the realization of your envisioned transformation. 

              Current state assessment 

              Here, we review your current business and technology environment and seek out areas of opportunity and improvement. This review is based on a fit-gap analysis of your current capabilities and systems – with a plan for attaining your desired business and technology transformation. 

              Target state definition 

              Next, we define your ultimate target state for your business and technology architecture environment, along with the various interim architecture configurations required to attain your preferred state. We utilize various internal and external reference models for acceleration here. 

              Delivery assurance 

              As an extension of target state definition and planning services, we provide advisory services to in-flight programs and projects – or post-facto review of delivered work. These services are focused on realizing the requisite business value/outcome and the solution requirements (including approved architecture standards). 

              Planning and budgeting 

              In planning and budgeting, our focus is on the likelihood of attaining the envisioned solution design based on planned and/or inflight efforts. This includes identifying any required mitigation tactics to increase the chances of successful solution realization. 

              Operating model assessment and definition 

              Here, we provide an assessment, analysis, and definition service to internal and external parties. We guide the design and implementation of an appropriate operating model with relevant monitoring mechanisms that rely on strategic and tactical performance indicators. 

              Bringing everything together with architecture capability development and integrated dashboards 

              As a service to internal and external parties, architecture capability development begins with defining an appropriate architecture capability for your business utilizing Capgemini’s EA Capability Framework. Our EA Capability Framework encompasses a full maturity assessment of your existing situation and the crafting of an engagement model, along with defining a transition roadmap for establishing and improving your architecture capabilities. 

              Dashboards that consume data from multiple sources are a good way to connect and retrieve insights at the enterprise level – and connect your disparate data environment into a unified data and analytics core. Various applications like PowerBI and Tableau can be used to generate business intelligence reports, which incorporate automated refreshes of underlying data sources.  

              In utilizing Capgemini’s EA Capability Framework and our ADMnext^Data offering, we recently engaged with a client to build an executive reporting suite using PowerBI to track and monitor network infrastructure at their plasma centers across all of North America. 

              With ADMnext^Data, we have the capacity to build plug-and-play dashboards that integrate with your existing technology stacks, and we also work with you to develop an MVP data architecture with a unified reporting framework and dashboard application. To learn what ADMnext^Data can do for your business – and how you can take action to strategize and craft the ideal future state of your data architecture – drop me a line below.

              Meet the author

              Helen Ristov

              Managing Delivery Architect 
              I lead the delivery and architecture of next-generation data platforms and applications. With over 20 years of experience, I work with clients on data transformation and platform enhancements to enable analytics and data-driven environments. I also work on the development of platform-embedded enterprise dashboards and software applications, which can provide a unified view across the scope of business operations – and critical insights for decision makers.

                Seeing the successful growth of your business starts with taking a deep look into your data

                Helen Ristov
                Sep 25, 2023

                Why a comprehensive maturity assessment is critical in outperforming your market 

                Over the last ten years, we’ve seen an exponential increase in the amount of data captured, stored, and consumed across the globe – from just 2 ZB (zettabytes) to now over 150 ZB – with projected growth to exceed 200 ZB in the next five years. As the volume of data grows, the infrastructure that supports our data-driven society is being pressed – not only for innovations in data processing – but for fresh governance and policy paradigms as well.

                In a rather short period of time, traditional analytics have evolved into new fields of data science and engineering to handle these challenges by utilizing a plethora of tools developed to address growing demands. As with any disruption of this scale, there were pioneers who ventured into the unknown and helped pave the way – while new governance policies were adapted and refined to fit evolving business needs. 

                Many organizations are still operating on antiquated technologies that do not support the necessary functions for advanced analytics and machine learning. Tools and technologies have been created that are optimized to handle large workloads at the speeds needed for near real-time processing. Not all businesses require this level of sophistication, but a data assessment is a good starting point to aggregate the demands of your business and summarize where you are proficient and where you are falling short.  

                Assessing your data maturity 

                Many organizations are asking themselves how they compare to their industry peers. It’s important to gauge how sophisticated your environments are – and whether your organization can stay competitive in its respective niche areas. There are different data maturity levels ranging from the basic functions of reporting to data-driven – where decisions are made in a fully automated way. Various tools can be used or suggested for each category depending on complexity and use cases. However, proceeding with a realistic assessment of your business and your desired outcomes is a good place to start. Your organization can be evaluated and categorized to align the correct technologies and make suggestions that are appropriate for your business.  
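As a lightweight illustration, the sketch below encodes one possible maturity ladder and a toy classification rule. The level names and the mapping logic are assumptions for the example, not a formal Capgemini assessment scale.

```python
from enum import IntEnum

# Illustrative maturity ladder; names and rules are invented for the example.
class DataMaturity(IntEnum):
    BASIC_REPORTING = 1   # static, backward-looking reports
    SELF_SERVICE_BI = 2   # analysts explore governed data themselves
    PREDICTIVE = 3        # ML models forecast outcomes
    DATA_DRIVEN = 4       # decisions are made in a fully automated way

def classify(uses_ml: bool, automated_decisions: bool, self_service: bool) -> DataMaturity:
    if automated_decisions:
        return DataMaturity.DATA_DRIVEN
    if uses_ml:
        return DataMaturity.PREDICTIVE
    if self_service:
        return DataMaturity.SELF_SERVICE_BI
    return DataMaturity.BASIC_REPORTING

print(classify(uses_ml=True, automated_decisions=False, self_service=True).name)
# -> PREDICTIVE
```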

                What are some of the most pressing data challenges that modern organizations face? 

                The sheer volume of data 

                When more data is collected, more monitoring and validation are required to manage the full data lifecycle. Applications and dashboards that help with data management are becoming increasingly important in organizing and viewing data through a real-time lens. The pillars to consider when implementing a data lifecycle management solution include data creation, storage, usage, disposition, and archival. Many organizations combine hot and cold data storage with a retention policy on the cloud to save on the costs associated with managing data.

                Multiple data repositories 

Large organizations may wind up with dozens of business solutions – each with its own data repository. These could include databases, CRM software, ERPs, cloud storage, and so on. When multiple systems are involved, it’s difficult to break out of these silos and into a more integrated platform for data-driven decisions. Creating a curated, linked repository should be a top priority for most organizations.

                Data quality 

The amount of data passing through multiple data stores (and throughout an entire organization) can lead to a host of data issues, such as naming conventions and field types that differ for the same field. Cases like these can often be rectified by using data catalogues and crawlers to standardize variables under common names. In addition, data may be out of date or simply incorrect, and making judgments based on this sort of data can cost your firm a significant amount of money every year.
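As a minimal sketch of catalogue-driven standardization – assuming pandas; the column mapping and freshness threshold are hypothetical – mismatched field names can be mapped to common names and stale records flagged:

```python
import pandas as pd

# Hypothetical catalogue mapping: source-specific names -> common names
CATALOGUE = {
    "cust_id": "customer_id",
    "CustomerID": "customer_id",
    "updated": "last_updated",
    "last_modified_dt": "last_updated",
}

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Rename known source columns to their catalogue names."""
    return df.rename(columns={c: CATALOGUE[c] for c in df.columns
                              if c in CATALOGUE})

crm = standardize(pd.DataFrame(
    {"CustomerID": [1, 2],
     "last_modified_dt": ["2023-09-01", "2021-01-15"]}))
crm["last_updated"] = pd.to_datetime(crm["last_updated"])

# Flag records not updated within an (illustrative) 12-month window
stale = crm[crm["last_updated"] < pd.Timestamp.now() - pd.DateOffset(months=12)]
print(stale)
```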

                Data integration 

Data integration is the process of combining disparate data sources into a common view – and it often helps resolve data quality and synchronization issues. Companies with mature data integration processes often see improved operational efficiency and more valuable insights gained by aligning their systems. Building a common data model eases future integrations, because all integration processes will speak the same language.
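As a minimal sketch of a common data model – assuming pandas; the source schemas and canonical fields are hypothetical – records from a CRM and an ERP can be normalized to one schema before loading into a curated repository:

```python
import pandas as pd

# Hypothetical canonical schema for the common data model
COMMON_COLUMNS = ["customer_id", "order_total", "source_system"]

def from_crm(df: pd.DataFrame) -> pd.DataFrame:
    """Map CRM-specific fields onto the common data model."""
    out = df.rename(columns={"CustomerID": "customer_id",
                             "DealValue": "order_total"})
    out["source_system"] = "crm"
    return out[COMMON_COLUMNS]

def from_erp(df: pd.DataFrame) -> pd.DataFrame:
    """Map ERP-specific fields onto the common data model."""
    out = df.rename(columns={"cust_no": "customer_id",
                             "invoice_amt": "order_total"})
    out["source_system"] = "erp"
    return out[COMMON_COLUMNS]

crm = pd.DataFrame({"CustomerID": [1], "DealValue": [1200.0]})
erp = pd.DataFrame({"cust_no": [2], "invoice_amt": [450.0]})

# One integrated view: every downstream consumer speaks the same language
unified = pd.concat([from_crm(crm), from_erp(erp)], ignore_index=True)
print(unified)
```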

                Data governance 

Comprehensive data policies are essential for keeping effective track of your changing ecosystem. To attain the value and outcomes you seek, it is critical to build a data governance foundation around trust, transparency and ethics, risk mitigation and security, education and training, collaboration and shared culture, and accountability and decision rights.
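As a purely illustrative sketch – the metadata fields and rules below are hypothetical, not a formal governance standard – even a simple policy check over a data catalogue can make accountability and decision rights auditable:

```python
# Hypothetical catalogue entries; in practice these would come from
# a metadata store or data catalogue tool.
CATALOGUE = [
    {"table": "sales_orders", "owner": "finance-team",
     "classification": "internal", "retention_days": 2555},
    {"table": "web_clicks", "owner": None,
     "classification": None, "retention_days": 365},
]

REQUIRED_FIELDS = ["owner", "classification", "retention_days"]

def policy_violations(entries):
    """Report tables missing metadata the governance policy requires."""
    for entry in entries:
        missing = [f for f in REQUIRED_FIELDS if not entry.get(f)]
        if missing:
            yield entry["table"], missing

for table, missing in policy_violations(CATALOGUE):
    print(f"{table}: missing {', '.join(missing)}")
    # -> web_clicks: missing owner, classification
```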

                Data analysis and automation – Supporting your key business cases with trusted analytics 

Valuable insights are needed to drive effective executive decisions, and the reliability of your insights is only as solid as your data and supporting systems. The end goal of your data infrastructure should be to support your key business cases with trusted analytics. Incorporating data processes that automate reports, insights, and forecasts will streamline your operations and enable employees to spend their time deriving value from reports instead of scrubbing data. As an example, pipelines can be created to automate your typical data transformation jobs.
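As a minimal sketch of such a pipeline – assuming Apache Airflow 2.x as the orchestrator; the DAG name, schedule, and task bodies are placeholders – a daily transformation job might look like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies: real jobs would call your warehouse,
# transformation logic, and reporting layer.
def extract():
    print("pull raw data from source systems")

def transform():
    print("apply the standard transformation job")

def load():
    print("publish curated tables for reports and dashboards")

with DAG(
    dag_id="daily_transformation",   # hypothetical pipeline name
    start_date=datetime(2023, 9, 1),
    schedule="@daily",               # run the job once per day
    catchup=False,
) as dag:
    (PythonOperator(task_id="extract", python_callable=extract)
     >> PythonOperator(task_id="transform", python_callable=transform)
     >> PythonOperator(task_id="load", python_callable=load))
```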

                Moving from a nascent and siloed data function to a data-driven organization starts with a comprehensive data maturity assessment  

An effective assessment can help you recognize where your business is struggling most today – and provide corrective actions and recommendations to address these concerns.

It can also aid you in determining the scope of your data transformation. Identifying the correct KPIs to track progress is worthwhile but genuinely difficult – they should be closely aligned with business objectives. Management will want to know if their investments are paying off, and picking the correct measure is key here. For a data transformation project, I would suggest the following categories and KPIs to help you measure success:

• Overall: Improvement in operating profit margin
• Operational Efficiency: Faster data extraction and processing times, reductions in defects, errors, maintenance costs, and labor, and increases in system uptime
• Employee Engagement: Increases in employee productivity, working hours saved, reductions in incidents, and improvements to SLAs
• Customer Engagement and Satisfaction: Time spent on apps, lead generation, digital marketing KPIs, customer retention, and customers registered on site
• New Sources of Revenue: Customers buying via AI-based recommendations and product revenue associated with new platform features

With our ADMnext^Data offering, Capgemini has already conducted several assessments with clients to help them understand their key data challenges by examining support tickets and logs and categorizing them into core problem areas. Through these assessments, we’ve helped a global CPG leader achieve over €30M in savings annually, and we are currently exploring GenAI initiatives to provide root-cause analysis and ticket resolution that suggest the best corrective actions.

Overall, a complete data assessment can help you measure how you stack up and prepare for future workloads as a proficient, data-driven organization. And, according to the Capgemini Research Institute, data-driven organizations currently enjoy a performance advantage of between 30% and 90% across customer engagement, top-line benefits, operational efficiency, and cost savings.

                In my next post, I’ll be taking a deeper dive into how you can address the key data challenges outlined above. In the meantime, if you want to get started on your own comprehensive data maturity assessment, drop me a line below. 

                Meet the author

                Helen Ristov

                Managing Delivery Architect 
                I lead the delivery and architecture of next-generation data platforms and applications. With over 20 years of experience, I work with clients on data transformation and platform enhancements to enable analytics and data-driven environments. I also work on the development of platform-embedded enterprise dashboards and software applications, which can provide a unified view across the scope of business operations – and critical insights for decision makers.

                  Full version of the Taskforce on Nature-related Financial Disclosures (TNFD) framework released: An encouraging signal for nature-related reporting

                  Aurélie Gillon & Anne-Sophie Herbert-Génot
                  22 Sep 2023

                  Following several successive beta versions and a consultation with market participants, the TNFD’s finalized set of recommendations is now available and will allow companies in all sectors to study nature-related issues.

The TNFD (Taskforce on Nature-related Financial Disclosures) unites leading private-sector players committed to assessing their environmental impact and integrating biodiversity and sustainability into business strategies. Capitalizing on our expertise in these fields, Capgemini is part of the TNFD Forum and is providing an initial analysis of the first full version of the methodology.

For companies to be fully consistent on sustainability topics, nature-related concerns need to be properly assessed and tracked.

Our economies are inherently interconnected with nature, rather than separate from it. There is an urgent need to recognize the financial risk behind nature’s collapse, as biodiversity loss is progressing at an unprecedented rate, far faster than the natural extinction rate: according to the WWF, monitored wildlife populations have declined by an average of 69% since 1970[1]. Governments have begun to acknowledge the criticality of this topic: more than 190 countries adopted the Kunming-Montreal Global Biodiversity Framework (GBF) in December 2022[2], committing to ambitious targets to protect and restore nature and encouraging governments to introduce requirements for the evaluation and disclosure of risks and impacts. Additionally, biodiversity loss is now recognized by the world’s central banks as a source of systemic risk just as important as climate change.

Unfortunately, unlike carbon issues, for which awareness has risen, only a few companies are currently studying their dependence on nature and their impact on it (across supply chains, operations, corporate values, etc.). This oversight has been driven notably by the lack of an internationally recognized methodology or tool to measure and assess specific biodiversity-related issues.

The TNFD aims to address this gap. Built as an extension of the climate-focused TCFD (Task Force on Climate-related Financial Disclosures), it significantly broadens its predecessor’s scope. Businesses and investors now have a common, comprehensive framework to transparently assess and communicate on both climate- and biodiversity-related impacts, risks, and opportunities, reflecting the interconnectedness of the climate change and biodiversity crises.


                  [1] https://www.wwf.eu/?7780966/WWF-Living-Planet-Report-Devastating-69-drop-in-wildlife-populations-since-1970

                  [2] https://www.unep.org/news-and-stories/story/cop15-ends-landmark-biodiversity-agreement

                  What is the TNFD?

                  The TNFD is a global, market-led, and science-based initiative, supported by governments. Its primary objective is to address the urgent need to incorporate nature considerations into financial and business decisions. It provides a risk management and disclosure framework that helps organizations identify, assess, and report on nature-related issues along their value chains, including financing activities. By promoting the integration of nature-related risks and opportunities into strategic planning, the TNFD aims to steer global financial flows towards nature-positive outcomes, fostering a more sustainable and resilient economy.

Starting in October 2021, the TNFD framework was progressively refined across multiple beta versions, taking into account stakeholders’ feedback. September 2023 marked a pivotal milestone with the release of the first full version of the risk management and disclosure framework (v1.0). Among the significant modifications since the last beta version, the framework unveils a simplified and more focused “scoping” guidance, used to generate working hypotheses about organizations’ potential nature-related dependencies, impacts, risks, and opportunities.

                  The TNFD framework structure: What’s in it for corporates?

The published methodology is composed of the TNFD Recommendations, which must be adopted to comply with the framework, as well as a suite of additional implementation guidance, which is not required but is suggested to ease adoption.

                  The content can be split into three distinct categories:

• Key concepts and definitions are introduced to summarize the science-based definitions and make them accessible to businesses (a shared-language purpose). The framework brings together the concepts of risks, impacts, dependencies, and opportunities under the term “nature-related issues” and describes the specificities of each notion in depth.
• Guidance is provided to assess nature-related risks and opportunities and incorporate them into corporates’ and financial institutions’ development strategies in the short, medium, and long term. These elements are not required to formally adopt the TNFD Recommendations but are suggested to support internal assessment in preparation for external disclosure. To this end, the TNFD created the LEAP approach, a structured, ready-to-use methodology for internally assessing impacts and dependencies regarding nature. In its final version published in September, the TNFD reinforced the LEAP approach with:
  • Filters for sectors, value chains, and specific locations, to make the assessment manageable so that organizations can locate their interfaces with nature;
  • Alignment with the requirements of the new ISSB and CSRD standards on materiality analysis and assessment in Europe;
  • Pilot-testing insights from corporates and financial institutions, with a TNFD case-study catalogue made available.

  The LEAP approach itself is a four-phase process that encourages organizations to be careful with the scope of their assessment and to involve the affected stakeholders:
  • Locate your interface with nature
  • Evaluate your nature-related dependencies and impacts
  • Assess your nature-related risks and opportunities
  • Prepare to respond and report

  This methodology is not a linear process but an approach with iterative components: it can and must have different starting points for sectors, companies, and even business units, depending on their links to and dependencies on nature. The LEAP approach is not mandatory for adhering to the TNFD recommended disclosures: it was drafted to help organizations prepare their disclosures, and not everything identified, assessed, and evaluated necessarily needs to be disclosed.

  Assessment requires metrics and indicators for comparison and evaluation. Consequently, the TNFD specifies qualitative and quantitative metrics and indicators tied to the business situation (location, sector, and biome) to properly assess both positive and negative impacts. The indicators are not only science-based but also practical to implement and maintain across annual reporting cycles, and they align with global policy goals. TNFD guidance also covers two steps prior to choosing and implementing metrics: target setting, and scenario building to develop and test the resilience of the chosen strategies. Practical examples and case studies are provided to facilitate the understanding of each step of the process.
• Finally, the core content of the TNFD consists of the disclosure recommendations, whose use is required to comply with the methodology. Since consistency of approach, structure, and language with the TCFD is key to early market adoption, the methodology took inspiration from its predecessor: 11 of the 14 recommended disclosures come from the TCFD framework, to which nature-related issues have been added so that nature reporting can begin alongside, or integrated with, climate reporting. The three remaining recommendations were replaced by a new general-requirements component introduced to the overall approach (location, scope, approach to materiality, stakeholder engagement, etc.), with additional guidance for certain sectors and biomes. All are organized along the same four pillars as the TCFD: governance, strategy, risk and impact management, and metrics and targets. The harmonization of metrics will allow corporates from all over the world to make external disclosures in the same, comparable format.

                  What are the next steps for the market?

Now that the framework has been published, the TNFD encourages voluntary adoption by corporates, assists other organizations in converting these recommendations into sustainability reporting standards, and explores with governments and regulators its use for nature-related reporting by corporates. With the TNFD as a starting point, regulatory developments related to biodiversity and nature will intensify, and financial institutions will be better informed in their investment decisions.

Thus, corporates need to anticipate this change and rethink their business models based on a better understanding of their interactions with nature. While the exercise will be challenging at first, it is a key and necessary step towards breaking the silos in corporates’ sustainability strategies, which are currently more carbon-oriented. With access to a clear framework and guidelines, corporates must now leverage them and provide feedback that will benefit all TNFD users. We believe that these new considerations will unlock new opportunities and value creation, from local to global levels, as previous work on carbon and net-zero topics has done in recent years.

Companies in all sectors finally have a comprehensive methodology to study biodiversity issues and understand where they currently stand. Beyond disclosing their situation, companies should anticipate the next step rather than wait for the final release of the SBTN’s set of targets for nature: it is time to start thinking about your pledges.

                  Aurélie Gillon, Director, Biodiversity Lead at Capgemini Invent

This post has been written by the Capgemini Invent France Sustainable Futures team: Aurélie Gillon, Biodiversity Director; Anne-Sophie Herbert-Génot, Managing Consultant; with the support of Alexandre Le Déméet, Senior Consultant; Adrien Cosson, Senior Consultant; Luna Simonet, Consultant; and Mathis Larquetoux, Consultant

                  Authors

                  Anne-Sophie Herbert-Génot

                  Managing Consultant
With a background in agronomy engineering, specialized in environmental management, and a Ph.D. in Life Cycle Assessment, Anne-Sophie is dedicated to driving sustainability and the energy transition for a broad range of industries in the Sustainable Futures division of Capgemini Invent.

                  Aurélie Gillon

                  Sustainability Director, Biodiversity Lead
                  With a dual educational background from ENS Ulm and HEC Paris, Aurélie is a Director in the Sustainable Futures practice of Capgemini Invent. As one of the Capgemini Group’s thought leaders on sustainability, she has developed the Group’s nature-positive offering, embodied its biodiversity voice, and represented Capgemini Invent at key sustainability events.