
Graphite: The unsung hero of sustainable energy, and why we need more than a ‘silver bullet’

Pascal Brier
Oct 26, 2023

I recently had a debate with a colleague on the critical materials that will be needed to support our #sustainability transformation.

One of these materials is certainly #graphite, a derivative of carbon renowned for its excellent electrical conductivity, high-temperature stability and chemical inertness.
Although we mostly mention lithium, graphite stands out as a game changer in energy storage technology, particularly in the production of lithium-ion batteries, which are central to the electric car industry. Few people know that each current electric-vehicle battery contains 60 to 90 kilogrammes of graphite.

And yet close to 90% of the world’s graphite is currently refined in China, which recently decided to tighten its export restrictions on this highly strategic material. While I’ve seen many commentators call for Europe and the US to increase their graphite refinement capabilities, even going as far as to develop synthetic graphite, I would add that addressing the diverse energy needs of our evolving world will require a multitude of materials and technologies; it’s unlikely that a perfect ‘silver bullet’ will overcome all challenges.

This is why it is also important to keep investing in alternative forms of battery #technology, such as sodium-ion batteries, solid-state batteries, silicon batteries, or even hydrogen fuel cells.

Meet the author

Pascal Brier

Group Chief Innovation Officer, Member of the Group Executive Committee
Pascal Brier was appointed Group Chief Innovation Officer and member of the Group Executive Committee on January 1st, 2021. Pascal oversees Technology, Innovation and Ventures for the Group in this position. Pascal holds a Master’s degree from EDHEC and was voted “EDHEC of the Year” in 2017.

    Navigating the new norms of retirement with the SECURE 2.0 Act

    Abhishek Singh
    26 October 2023

    The topic of retirement – and people’s readiness for it – plays an important role in investment and financial advice. As concerns grow over retirement savings, financial advice must adapt and change.

    Retirement models around the world

    Different countries have different vehicles for accumulation and decumulation of retirement investments.

    The UK, for example, has:

    • The State Pension,
    • Workplace pensions,
    • and Personal pensions.

    Australia has the Superannuation Guarantee. Meanwhile, IRAs and 401(k)s are the primary vehicles in the USA.

    All plans evolve over the years, and much government legislation has been enacted to strengthen them. One such plan recently introduced in the USA is the SECURE 2.0 Act. Passed by Congress in December 2022, it is an expansion of the Setting Every Community Up for Retirement Enhancement (SECURE) Act of 2019.

    A game-changer in the American landscape

    The Act was introduced amid growing concerns about retirement security, influenced by global issues and lasting economic challenges. As a recent US News survey of 2,000 adults points out, “50% of respondents say they had to pause saving for retirement at some point in 2022, and 41% of those surveyed stopped contributing to retirement funds like 401(k)s or individual retirement accounts.”

    A recent poll, conducted by YouGov for Bankrate, found that over half of Americans feel they’re lagging in retirement savings, and almost half don’t have any retirement plan. This is especially true for small business employees. One of the main goals behind the SECURE 2.0 Act was to target these concerns.

    It is important to remember that a significant percentage of this population is going to be the beneficiary of the Great Wealth Transfer, estimated to be close to $72 trillion. Wealth firms, especially those advising Mass Affluents and High Net Worth Gen Zs and Millennials, must adjust their strategies keeping in mind some of the important changes brought by the SECURE 2.0 Act.

    Unlocking benefits for all ages

    The Act addresses a range of age groups. On one hand, it allows for higher catch-up amounts of $10,000 for employees aged 60-63. At the same time, it treats student debt repayments as contributions to a 401(k) and enables transfers from certain 529 accounts to Roth IRAs.
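    To make the higher catch-up limit concrete, here is a small illustrative calculation. The $10,000 figure comes from the Act as described above; the retirement age, the 6% annual return, and the contribution schedule are hypothetical assumptions, not figures from the Act.

```python
# Illustrative only: compound growth of the extra catch-up room for ages 60-63.
# The $10,000 annual figure is from the text; the 6% return is an assumption.

def future_value(annual_contribution: float, years_contributing: int,
                 years_to_retirement: int, annual_return: float) -> float:
    """Value at retirement of a fixed annual contribution, compounded yearly."""
    total = 0.0
    for year in range(years_contributing):
        years_growing = years_to_retirement - year
        total += annual_contribution * (1 + annual_return) ** years_growing
    return total

# Four years of the enhanced catch-up (ages 60-63), retiring at 65.
extra = future_value(10_000, years_contributing=4, years_to_retirement=5,
                     annual_return=0.06)
print(f"Extra retirement savings at 65: ${extra:,.0f}")
```

    Even this rough sketch shows why advisors treat the enhanced catch-up window as material to a client’s plan.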

    Addressing student loan payments is, as the official Senate paper on SECURE 2.0 puts it, a significant benefit for young adults who are only just embarking on the retirement savings journey. The Act allows more tax-optimized funds to become available for both accumulation and decumulation cycles.

    A greater role of financial advisors in the SECURE era

    Under all these circumstances, the role of Financial Advisors becomes more and more important. As the spectrum of Wealth Management expands to the Mass Affluent segment, maximizing the benefits of SECURE 2.0 becomes a key part of advice. As a Vanguard article on the subject points out, “starting in 2024, employers will have the option to make contributions to an employer-sponsored retirement plan—i.e., 401(k), 403(b), SIMPLE IRA, 457(b), and similar plans—that match an employee’s qualified student loan payments, even if the worker doesn’t directly contribute to the retirement plan.” It is important for Financial Advisors to be aware of these changes, to help their clients optimize their retirement savings journey.

    Small businesses, big benefits

    SECURE 2.0 requires businesses to auto-enroll employees in new 401(k) and 403(b) plans. From 2025, these plans will start with a minimum 3% contribution rate. There is also a provision for transferring employees’ retirement plans, especially low-balance ones, when they switch jobs. This is vital for low-balance savers, often found in smaller enterprises.

    JP Morgan Asset Management’s 11th Annual Guide to Retirement highlights the Act’s incentives for small businesses to offer retirement plans via tax credits. This targets the nearly 50% of private-sector employees who work in establishments with no such plans. Those with employer-sponsored setups are twice as likely to save towards retirement. It’s essential for Financial Advisors to be aware of this significant change and integrate it into their own advice.

    Conclusion

    The SECURE 2.0 Act brings significant changes that should encourage retirement savings for Americans and will inevitably become an important part of all financial advice conversations. Wealth firms must ready themselves to quickly adopt these changes so they can remain relevant and continue to be the advisor of choice for their clients.

    Author

    Abhishek Singh

    Head of Wealth Management (North America) –  Banking and Capital Markets
    Abhishek provides Wealth Management domain leadership for clients, bringing to the table an understanding of the latest trends and strategies in the industry as well as Capgemini collaboration with industry-leading partners, to provide innovative solutions in Wealth Management.

      Exploring probabilistic modeling and the future of math

      Robert H. P. Engels
      Oct 23, 2023

      Some days you get these chains of interesting events following up on each other.

      This morning I read the “GPT can solve math[..]” paper (https://lnkd.in/dzd7K3sx), then I read some responses to it (from Gary Marcus, posts on X, etc., among others). During and after TED AI I had many interesting discussions on the topic of probabilistic modelling vs. models of math as we know (knew?) it, and this paper sparked some thoughts (so: mission accomplished).

      It occurs to me that we have a generation of PhD students building LLMs who have probably never really engaged with model thinking and mathematical proofs, i.e., the thinking behind Einstein’s relativity theory or Euler’s graph theory: the type of thinking that led (indeed) to a mathematical model that you can implement in a calculator (low footprint), that calculates correctly (100% trustworthy), and that, in addition, calculates 100% correctly on input never seen before.

      The question really condenses down to whether you believe in the abstraction capability of the algorithms currently used for training today’s LLMs. Are attention layers at all able to build abstractions on their own (and not just regurgitate abstractions served up ready-made by humans)? Optimism in the Valley is high: just add more data and the problem will go away.

      But without changing the underlying attention-layer design, this seems to be a fallacy. Learning to abstract really means building metalevels on your information, condensing signals and their relations. That is something different from predicting chains of tokens. Such an abstraction layer can be seen as building a 3D puzzle, whereas current attention mechanisms seem single-layered. With a single layer, the most you can build is a 2D puzzle.

      With that picture in mind, you can observe that the current solutions from LLM suppliers extend the 2D puzzle, making it larger (adding more data from all over) or giving it higher resolution for a specific task (like the math paper mentioned above). But no sincere attempt seems to have been made yet to build a 3D picture, which would mean rocking the foundation of the foundation models and rebuilding the attention-layer mechanism to cover for this deficit.

      Until then, let’s focus on getting functions to work reliably, off-load model-based tasks (math, engineering, logic, reasoning) to external capability agents, and stop pretending that 2D can become 3D without changing the foundation.

      Meet the author


      Robert Engels

      CTIO, Head of AI Futures Lab
      Robert is an innovation lead and a thought leader in several sectors and regions, and holds the position of Chief Technology Officer for Northern and Central Europe in our Insights & Data Global Business Line. Based in Norway, he is a known lecturer, public speaker, and panel moderator. Robert holds a PhD in artificial intelligence from the Technical University of Karlsruhe (KIT), Germany.

        Transforming the data terrain through generative AI and synthetic data

        Aruna Pattam
        18th October 2023

        Welcome to the brave new world of data, a world that is not just evolving but also actively being reshaped by remarkable technologies. It is a realm where our traditional understanding of data is continuously being challenged and transformed, paving the way for revolutionary methodologies and innovative tools.

        Among these cutting-edge technologies, two stand out for their potential to dramatically redefine our data-driven future: generative AI and synthetic data.

        In this article, we will delve deeper into these fascinating concepts.

        We will explore what generative AI and synthetic data are, how they interact, and, most importantly, how they are changing the data landscape.

        So, strap in and get ready for a tour into the future of data!

        Understanding generative AI and synthetic data

        Generative AI refers to a subset of artificial intelligence, particularly machine learning, that uses algorithms like generative adversarial networks (GANs) to create new content. It’s “generative” because it can generate something new and unique from random noise or existing data inputs, whether that be an image, a piece of text, data, or even music.

        GANs are powerful algorithms that comprise two neural networks — the generator, which produces new data instances, and the discriminator, which evaluates them for authenticity. Over time, the generator learns to create more realistic outputs.
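        As a rough illustration of that adversarial loop, here is a deliberately tiny GAN in plain NumPy: a linear generator learns to mimic samples from a normal distribution while a logistic discriminator tries to tell real from fake. All numbers (target distribution, learning rate, step count) are arbitrary choices for this sketch; real GANs use deep networks and frameworks such as PyTorch or TensorFlow.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator must learn to imitate: samples from N(3, 1).
def sample_real(n):
    return rng.normal(3.0, 1.0, n)

a, b = 1.0, 0.0   # generator g(z) = a*z + b, with noise z ~ N(0, 1)
w, c = 0.0, 0.0   # discriminator D(x) = sigmoid(w*x + c)

lr, batch, steps = 0.05, 64, 2000
for _ in range(steps):
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    real = sample_real(batch)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step: push D(fake) toward 1 (non-saturating GAN loss).
    d_fake = sigmoid(w * fake + c)
    grad_x = -(1 - d_fake) * w   # gradient of the loss w.r.t. each fake sample
    a -= lr * np.mean(grad_x * z)
    b -= lr * np.mean(grad_x)

fake_mean = b   # E[a*z + b] = b for z ~ N(0, 1)
print(f"generator mean after training: {fake_mean:.2f} (target: 3.0)")
```

        The same push-and-pull dynamic, scaled up to deep networks, is what lets production GANs produce realistic images, text, and tabular data.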

        Today, the capabilities of generative AI have evolved significantly, with models like OpenAI’s GPT-4 showcasing a staggering potential to create human-like text. The technology is being refined and optimized continuously, making the outputs increasingly indistinguishable from real-world data.

        Synthetic data refers to artificially created information that mimics the characteristics of real-world data but does not directly correspond to real-world events. It is generated via algorithms or simulations, effectively bypassing the need for traditional data collection methods.

        In our increasingly data-driven world, the demand for high-quality, diverse, and privacy-compliant data is soaring.

        Current challenges with real data

        Across industries, companies are grappling with data-related challenges that prevent them from unlocking the full potential of artificial intelligence (AI) solutions.

        These hurdles can be traced to various factors, including regulatory constraints, sensitivity of data, financial implications, and data scarcity.

        1. Regulations:

        Data regulations have placed strict rules on data usage, demanding transparency in data processing. These regulations are in place to protect the privacy of individuals, but they can significantly limit the types and quantities of data available for developing AI systems.

        2. Sensitive data:

        Moreover, many AI applications involve customer data, which is inherently sensitive. The use of production data poses significant privacy risks and requires careful anonymization, which can be a complex and costly process.

        3. Financial implications:

        Financial implications add another layer of complexity. Non-compliance with regulations can lead to severe penalties.

        4. Data availability:

        Furthermore, AI models typically require vast amounts of high-quality, historical data for training. However, such data is often hard to come by, posing a challenge in developing robust AI models.

        This is where synthetic data comes in.

        Synthetic data can be used to generate rich, diverse datasets that resemble real-world data but do not contain any personal information, thus mitigating any compliance risks. Additionally, synthetic data can be created on-demand, solving the problem of data scarcity and allowing for more robust AI model training.

        By leveraging synthetic data, companies can navigate the data-related challenges and unlock the full potential of AI.

        What is synthetic data?

        Synthetic data refers to data that’s artificially generated rather than collected from real-world events. It’s a product of advanced deep learning models, which can create a wide range of data types, from images and text to complex tabular data.

        Synthetic data aims to mimic the characteristics and relationships inherent in real data, but without any direct linkage to actual events or individuals.

        A synthetic data generating solution can be a game-changer for complex AI models, which typically require massive volumes of data for training. These models can be “fed” with synthetically generated data, thereby accelerating their development process and enhancing their performance.
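        As a minimal sketch of the idea (not a production technique), one can fit a simple statistical model to real data and sample fresh rows from it. Real synthetic-data solutions use far richer generative models such as GANs, but the principle of “same statistics, no real individuals” is the same. The toy age/income dataset below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "real" dataset (invented): 1,000 customers with correlated age and income.
age = rng.normal(40, 10, 1000)
income = 1200 * age + rng.normal(0, 8000, 1000)
real = np.column_stack([age, income])

# Fit a simple model to the real data: its mean vector and covariance matrix.
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Sample brand-new synthetic rows that share those statistics but
# correspond to no actual individual.
synthetic = rng.multivariate_normal(mean, cov, size=1000)

print("real mean:     ", np.round(mean, 1))
print("synthetic mean:", np.round(synthetic.mean(axis=0), 1))
```

        The synthetic rows preserve the age-income correlation of the original data while containing no personally identifiable information, which is exactly the property that makes synthetic data useful for training and testing.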

        One of the key features of synthetic data is its inherent anonymization.

        Because it’s not derived from real individuals or events, it doesn’t contain any personally identifiable information (PII). This makes it a powerful tool for data-related tasks where privacy and confidentiality are paramount.

        As such, it can help companies navigate stringent data protection regulations, such as GDPR, by providing a rich, diverse, and compliant data source for various purposes.

        In essence, synthetic data can be seen as a powerful catalyst for advanced AI model development, offering a privacy-friendly, versatile, and abundant alternative to traditional data.

        Its generation and use have the potential to redefine the data landscape across industries.

        Synthetic data use cases:

        Synthetic data finds significant utility across various industries due to its ability to replicate real-world data characteristics while maintaining privacy.

        Here are a few key use cases.

        Testing and development:

        In testing and development, synthetic data can generate production-like data for testing purposes. This enables developers to validate applications under conditions that closely mimic real-world operations.

        Furthermore, synthetic data can be used to create testing datasets for machine learning models, accelerating the quality assurance process by providing diverse and scalable data without any privacy concerns.

        Healthcare:

        The health sector also reaps benefits from synthetic data. For instance, synthetic medical records or claims can be generated for research purposes, boosting AI capabilities without violating patient confidentiality.

        Similarly, synthetic CT/MRI scans can be created to train and refine machine learning models, ultimately improving diagnostic accuracy.

        Financial services:

        Financial services can utilize synthetic data to anonymize sensitive client data, allowing for secure development and testing.

        Moreover, synthetic data can be used to enhance scarce fraud detection datasets, improving the performance of detection algorithms.

        Insurance:

        In insurance, synthetic data can be used to generate artificial claims data. This can help in modeling various risk scenarios and aid in creating more accurate and fair policies, while keeping the actual claimant’s data private.

        These use cases are just the tip of the iceberg, demonstrating the transformative potential of synthetic data across industries.

        Conclusion

        In conclusion, the dynamic duo of generative AI and synthetic data is set to transform the data landscape as we know it.

        As we’ve seen, these technologies address critical issues, ranging from data scarcity and privacy concerns to regulatory compliance, thereby unlocking new potential for AI development.

        The future of synthetic data is promising, with an ever-expanding range of applications across industries. Its ability to provide an abundant, diverse, and privacy-compliant data source could be the key to unlocking revolutionary AI solutions and propelling us toward a more data-driven future.

        As we continue to explore the depths of these transformative technologies, we encourage you to delve deeper and stay informed about the latest advancements.

        Remember, understanding and embracing these changes today will equip us for the data-driven challenges and opportunities of tomorrow.


        Aruna Pattam

        Head of AI Analytics & Data Science, Insights & Data, APAC
        Aruna is a seasoned data science leader with a successful track record of developing and implementing data and analytics and data science solutions through cutting-edge technologies, agile development, continuous delivery, and DevOps. With over 22 years of experience, Aruna is a Microsoft-certified data scientist and AI engineer. She is a member of the Responsible AI Think Tank at CSIRO NAIC, which focuses on the responsible and ethical use of #AI in businesses in Australia, and a known public voice in Australia for Women in AI.

          The future of learning is immersive

          Alexandre Embry
          Oct 18, 2023

          Here’s the latest point of view from Capgemini, which I co-authored with my colleague Isabelle Lamothe, addressing how immersive and metaverse experiences can revolutionize the future of learning and training.

          Against a backdrop of transformation in our relationship with work, companies today must continually adapt to gain or maintain their competitive edge. Today, 77% of employers find it difficult to recruit the right people, while 60% of workers will require training before 2027.

          If we talk about employee experience, we cannot ignore the major pillar that we call the “future of learning”. And a lot of work is required to transform companies in the right direction.

          This objective can be accelerated by using technologies to empower the ways we train and learn. And we are deeply convinced that immersive experiences, like the metaverse, are a major lever for reaching the next level of learning.

          Today, many organizations continue to innovate through the metaverse and immersive experiences, and curiosity about the metaverse remains high, especially when it comes to training, as shown by a study carried out by the Capgemini Research Institute: 61% of respondents believe that immersive experiences can have an impact on the training sector. We still have some time to go before the metaverse reaches full maturity, but it opens a wide field of actionable possibilities right now.

          At Capgemini, we aim to become a strong partner for our clients, accompanying them as they take their training ecosystems to the next level.

          Let’s open a discussion if you’re interested.

          Meet the author

          Alexandre Embry

          Vice President, Head of the Capgemini AI Robotics and Experiences Lab
          Alexandre leads a global team of experts who explore emerging tech trends and devise at-scale solutioning across various horizons, sectors and geographies, with a focus on asset creation, IP, patents and go-to market strategies. Alexandre specializes in exploring and advising C-suite executives and their organizations on the transformative impact of emerging digital tech trends. He is passionate about improving the operational efficiency of organizations across all industries, as well as enhancing the customer and employee digital experience. He focuses on how the most advanced technologies, such as embodied AI, physical AI, AI robotics, polyfunctional robots & humanoids, digital twin, real time 3D, spatial computing, XR, IoT can drive business value, empower people, and contribute to sustainability by increasing autonomy and enhancing human-machine interaction.

            Empowering agents: How generative AI supports contact center agents in banking

            Vinay Patel
            18 October 2023

            In the fast-paced world of banking, customer service plays a pivotal role in building trust and loyalty.

            Contact center agents are the frontline champions responsible for addressing customer queries, resolving issues, and providing essential assistance. However, with the increasing complexity of financial products and the growing demands of customers, contact center agents often face challenges in providing accurate and timely information. This is where generative AI emerges as a game-changer, empowering agents with the right tools to enhance their performance and deliver a seamless customer experience.

            Generative AI in banking

            Generative AI, a groundbreaking technology, is making significant strides in the banking sector. By harnessing the power of advanced language models, it allows banks to streamline customer service operations, automate routine tasks, and provide personalized interactions. From generating tailored responses to enhancing fraud detection and risk assessment, Generative AI is transforming the way banks interact with customers and optimize their internal processes, ultimately leading to improved efficiency and customer satisfaction.

            Use cases of generative AI in banking contact centers

            1. Abstract summarization: Abstract summarization is a cutting-edge approach implemented in contact centers to empower agents and streamline their workflow. The process begins with the contact center system recording and transcribing customer calls accurately. Utilizing the power of GPT-3, a robust language model, the system then generates concise summaries by extracting essential information from the conversations. These summaries are then presented to the agents, allowing them to quickly review and approve the condensed call transcripts.
            2. Insight extraction: Harnessing the power of Generative AI, contact centers can extract valuable business insights from vast conversation-related datasets. By analyzing the reasons behind customer calls in conjunction with average handle time, average hold time, and average queue time, agents become truly empowered. These insights provide a deeper understanding of call patterns, enabling agents to optimize their performance and improve the overall efficiency.
            3. Real time digital assistant: Generative AI can be used to assist agents in handling customer inquiries more efficiently. By granting access to a vast knowledge base, the AI model efficiently retrieves relevant information and proposes suitable responses for agents. This capability empowers them to deliver accurate and consistent answers to customer queries, resulting in enhanced satisfaction and a more seamless customer service experience.
            4. Complaint resolution: Resolving customer complaints is crucial for maintaining a positive brand image and delivering a high-quality experience. Generative AI can analyze historical data on successful complaint resolutions and recommend the most effective strategies to agents. This ensures that agents have the necessary tools to address customer grievances promptly and effectively, while also providing a superior customer experience.
            5. Personalized customer interaction: Generative AI can empower agents to provide personalized customer interaction and tailored responses to customers, thereby enhancing their overall experience. By analyzing customer data, Gen AI can generate customized responses that cater to specific needs, fostering meaningful and impactful interactions that make customers feel valued and understood.
            6. Automated ticket routing: Some queries might require specialized assistance or escalation to higher-level support. Generative AI can analyze incoming queries and route them to the appropriate departments, ensuring a streamlined workflow and reducing response times.
            7. Automated email response: By analyzing vast amounts of customer data, Gen AI can generate personalized and contextually appropriate email replies, tailored to each customer’s specific inquiry. This technology streamlines the response process, saving agents valuable time and effort, while ensuring customers receive timely and accurate solutions, ultimately enhancing overall customer service and experience.
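            As an illustration of the ticket-routing idea in use case 6, here is a deliberately simple sketch. A production system would classify queries with a language model or embeddings; this version just scores departments by trigger-phrase hits, and all department names and phrases below are invented.

```python
# Hypothetical departments and trigger phrases; a production system would use
# an LLM or embedding classifier rather than keyword counts.
ROUTES = {
    "fraud":     ["unauthorized", "suspicious", "stolen", "fraud"],
    "mortgages": ["mortgage", "refinance", "interest rate", "closing"],
    "cards":     ["credit card", "card declined", "limit", "rewards"],
}

def route_ticket(query: str, default: str = "general-support") -> str:
    """Score each department by trigger-phrase hits; route to the best match."""
    text = query.lower()
    scores = {dept: sum(kw in text for kw in kws) for dept, kws in ROUTES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else default

print(route_ticket("There is an unauthorized charge on my stolen card"))
```

            Even this crude version shows the shape of the workflow: score, pick the best destination, and fall back to general support when nothing matches.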

            Benefits of Generative AI for agents:

            1. Enhanced efficiency: By automating repetitive tasks and providing quick access to information, Generative AI significantly reduces the average handling time for customer inquiries. Agents can focus on more complex issues and provide a higher level of service.
            2. Consistency and accuracy: Generative AI ensures consistent and accurate responses across all customer interactions. This consistency builds trust and eliminates the risk of misinformation.
            3. Continuous learning: AI models learn from every customer interaction, making them more knowledgeable and better equipped to handle future queries effectively.
            4. Improved job satisfaction: Empowered with Generative AI tools, agents can handle customer inquiries with confidence and efficiency, leading to higher job satisfaction and lower agent turnover.
            5. Cost savings: Automating routine tasks and optimizing agent performance results in cost savings for banks, as they can handle more queries with the same staff strength.

            Conclusion

            Generative AI is revolutionizing the banking industry by empowering contact center agents to deliver exceptional customer service. By providing quick access to information, personalized customer interactions, and compliance support, Generative AI enhances agent performance and customer satisfaction. As banks continue to adopt this technology, they will undoubtedly witness significant improvements in their contact center operations, ultimately leading to a more positive and seamless customer experience.

            Author

            Vinay Patel

            Senior Director, Contact Center Transformation Leader
            In the Banking and Capital Markets sector, Vinay is focused on delivering customer-centric contact centers that leverage a customer experience hub to optimally engage customers across interactions.

              Closing the distance on last-mile delivery

              Pravin Chaudhary
              Oct 17, 2023

              In the first blog of this series, Navigating the complex web of last-mile delivery, we explored the challenges of last-mile delivery.

              We covered: 

              • Physical limitations
              • Technological limitations
              • The limitations of traditional supply and demand planning

              Now let’s see what well-managed last-mile delivery looks like in practice.

              What constitutes the capacity of a Delivery Hub? 

              Determining the capacity of a delivery hub is a complex process that requires the consideration of multiple factors. Typically, a delivery hub serves a certain radius within a city, ranging from a few localities to several kilometres based on the area’s population, housing, and office density.

              The overall capacity of a delivery hub comprises several components, including storage capacity for parcels, sorting and bagging capacity, and manpower capacity needed for parcel delivery. 

              Manpower is a crucial and expensive aspect of last-mile delivery and necessitates careful planning.  During the eight hours of operation, a delivery executive must report to the hub in the morning, collect the bag and route plan, travel the route, deliver the parcels, return to the hub, account for undelivered parcels, deposit cash collected from cash on delivery orders, and resolve any other issues. 

              To estimate the manpower needed for a given day, first, the historical productivity of delivery executives (i.e., the number of parcels delivered in a day) is determined. Those numbers are then used to estimate the required number of personnel needed to deliver, handle returns, etc. at the desired capacity.
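              That estimate boils down to simple arithmetic, sketched below with invented numbers; the buffer percentages for returns handling and absences are illustrative assumptions, not industry figures.

```python
import math

def required_executives(forecast_parcels: int,
                        historical_productivity: float,
                        return_handling_share: float = 0.10,
                        absentee_buffer: float = 0.05) -> int:
    """Headcount needed for a day, padded for returns work and absences.

    historical_productivity = parcels delivered per executive per day.
    The two buffer percentages are illustrative assumptions.
    """
    base = forecast_parcels / historical_productivity
    padded = base * (1 + return_handling_share) * (1 + absentee_buffer)
    return math.ceil(padded)   # you cannot roster a fraction of a person

print(required_executives(4200, historical_productivity=60))
```

              Rounding up matters here: under-rostering by even one executive means parcels carried over to the next day.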

              Demand forecasting plays a critical role in this process as delivery executive productivity depends on the parcel-carrying capacity of their vehicle. The mix of small, medium, and large parcels is also essential as it directly impacts their carrying capacity, making it a separate prediction problem. Once demand is calculated, it’s time to plan your routes. 

              Route planning and bagging 

              To ensure optimal productivity, a delivery executive must receive a daily route plan that takes advantage of population density. For instance, someone delivering to a large office complex can efficiently deliver all parcels in just a few hours, while someone delivering to sparsely populated residential areas may have to travel longer distances.

              Delivery hubs must be intelligent enough to create daily route plans based on orders, maximizing productivity while minimizing distance travelled. 

              Once the route planning is complete, picking and bagging operations commence. The pick list must be designed intelligently to facilitate parcel bagging according to the route plan. Route planning and bagging are critical components that contribute to the productivity of a delivery executive, i.e., the number of parcels delivered by an executive per day.
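              A minimal sketch of that bagging logic: group orders by locality so each bag follows one route, then split any group that exceeds the vehicle’s parcel-carrying capacity. The orders, localities, and capacity below are invented; a real planner would work from geo-coordinates and a routing engine.

```python
from collections import defaultdict

# Hypothetical orders: (order_id, locality).
ORDERS = [
    (1, "office-park"), (2, "office-park"), (3, "office-park"),
    (4, "riverside"), (5, "riverside"), (6, "old-town"),
]
VEHICLE_CAPACITY = 2  # parcels per trip (illustrative)

def plan_bags(orders, capacity):
    """One bag = one route: orders for the same locality, capped at capacity."""
    by_locality = defaultdict(list)
    for order_id, locality in orders:
        by_locality[locality].append(order_id)
    bags = []
    for locality, ids in by_locality.items():
        # Split oversized locality groups into capacity-sized bags.
        for i in range(0, len(ids), capacity):
            bags.append((locality, ids[i:i + capacity]))
    return bags

for locality, ids in plan_bags(ORDERS, VEHICLE_CAPACITY):
    print(locality, ids)
```

              Grouping dense localities into single bags is what lets an executive delivering to one office complex finish in a few hours while sparse routes take all day.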

              While doorstep delivery is a time-consuming process, increasingly, consumers are looking for additional value in various forms. 

              Value-added services 

              To provide exceptional customer experience and enhance loyalty, various value-added services are included on top of the standard last-mile delivery, such as:  

              • Free returns 
              • Replacements 
              • Exchange offers and detach options
              • Open box deliveries 
              • Slotted deliveries  
• Demos and installations
              • Recycle options 

              These additional services are performed by the same delivery executives, but they require more time and thus reduce productivity (i.e., the number of packages delivered per day).
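A quick back-of-the-envelope model shows how value-added services (VAS) eat into daily drop counts. The timings below are illustrative assumptions, not measured figures.

```python
# Illustrative effect of value-added services on drops per day.
# All timings are assumed for the sake of the example.

def effective_drops(shift_minutes: int, minutes_per_drop: int,
                    vas_jobs: int, minutes_per_vas: int) -> int:
    """Drops possible after reserving time for value-added service jobs."""
    remaining = shift_minutes - vas_jobs * minutes_per_vas
    return max(remaining // minutes_per_drop, 0)

# An 8-hour (480-minute) shift at 5 min/drop, with ten 15-minute VAS jobs:
print(effective_drops(480, 5, 10, 15))
# Compare with a VAS-free day:
print(effective_drops(480, 5, 0, 15))
```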

              Moreover, options such as pre-orders, early access, and part payments add complexity to the process, and the planner must be adept at integrating these nuances into the plan to ensure that last-mile delivery is successful and timely. 

              To manage these services effectively, excellent back-end technology is crucial to ensure smooth workflow. A look at the process of returns and replacements makes it clear why. 

              Returns and replacements 

Returns and replacements are critical aspects of the eCommerce industry, as they contribute to both higher sales and greater customer satisfaction. Typically, the return rate ranges from 20 to 30%, and the same last-mile delivery team handles customer returns, replacements, and exchanges.

              To manage these processes efficiently, planners must estimate the daily return rate and create plans to ensure sufficient delivery hub capacity to accommodate the pick-up, transportation, storage, and processing of returns. These complex processes must be supported by your back-end technology, such that every event is automatically logged into the system and shared with the appropriate people.  
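The return-rate planning described above can be sketched as follows; the 25% rate and the hub capacity figure are assumptions for illustration.

```python
# Illustrative daily returns plan: expected pick-ups vs. hub capacity.

def returns_plan(daily_deliveries: int, return_rate: float, hub_capacity: int) -> dict:
    """Expected return pick-ups and whether the hub can absorb them."""
    expected = round(daily_deliveries * return_rate)
    return {"expected_returns": expected, "within_capacity": expected <= hub_capacity}

# 1,000 deliveries at an assumed 25% return rate, against a 300-unit hub:
print(returns_plan(1000, 0.25, 300))
```

In practice this check would feed the back-end system described above, so that a capacity shortfall triggers a plan change before pick-ups are scheduled.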

              One more way delivery companies manage these challenges is through the use of third parties – let’s see how that looks in practice. 

              Third-party logistics 

              The eCommerce industry has event-driven peak and BAU (business as usual) days every month. To manage costs, companies use hybrid models for deliveries, allocating volumes between their own capacities and 3PLs to handle peak loads while maintaining service levels. With variable pricing models and differing SLAs, the speed vs cost vs reliability equation is crucial in choosing the right 3PLs.
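Under the simplifying assumption that in-house delivery is cheaper per parcel than the 3PL, the hybrid allocation might be sketched like this; all volumes and rates are hypothetical.

```python
# Illustrative hybrid split of daily volume between in-house capacity and a 3PL.

def allocate(volume: int, own_capacity: int, own_cost: float, third_party_cost: float):
    """Fill in-house capacity first (assumed cheaper), overflow to the 3PL."""
    own = min(volume, own_capacity)
    overflow = volume - own
    total_cost = own * own_cost + overflow * third_party_cost
    return own, overflow, total_cost

# A peak day of 1,500 parcels against 1,200 in-house capacity:
print(allocate(1500, 1200, 1.0, 1.6))
```

A real allocation would also weigh the speed and reliability SLAs mentioned above, not cost alone.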

              Technology integration is also key to seamless order flow and tracking, as well as accurate planning for offloading volumes at the source, middle-mile, or last-mile delivery hub. 

              In summary 

              To optimize last-mile delivery, it is essential to focus on several key areas. The first is route optimization, which involves using data and technology to identify the most efficient delivery routes to reduce travel time and distance. This can be achieved with GPS tracking, real-time traffic updates, and predictive analytics. 

              Another critical area to focus on is customer communication and transparency. Customers want to know when their delivery will arrive and be able to track it in real-time. Providing accurate and timely updates can help improve customer satisfaction and reduce the number of missed deliveries. 

              Furthermore, the use of alternative delivery methods, such as locker systems or drones, can help overcome the challenges of traditional last-mile delivery methods. Locker systems provide a secure and convenient location for customers to pick up their packages, while drones can deliver packages directly to customers’ doorsteps, bypassing traffic and roadblocks. 

              Finally, it is crucial to prioritize sustainability in last-mile delivery. The increasing volume of deliveries has resulted in a significant increase in carbon emissions, and companies must take steps to reduce their environmental impact. This can be achieved using electric vehicles, bike couriers, and optimizing delivery routes to reduce unnecessary travel. 

              Optimizing last-mile delivery is crucial to meet customer expectations, reduce costs, and reduce environmental impact. Challenges in the areas of capacity, workforce and logistics are cutting deep into retailers’ margins, but they are solvable. By focusing on route optimization, customer communication, alternative delivery methods, and sustainability, companies can create a more efficient and effective last-mile delivery process.  

              A supply chain planner who possesses a deep understanding of the intricacies of last-mile delivery, and is equipped with the right tools and technologies, will play a critical role in the profitability of the entire organization. 

To learn more, strike up an exploratory conversation, or share experiences and feedback, contact Pravin Chaudhary, Director Supply Chain, Industry Platform CPR, at Pravin.a.chaudhary@capgemini.com

              Author

              Pravin Chaudhary

              Director, Consumer Products & Retail Lead Capgemini
              Pravin is Capgemini’s Supply Chain thought leader for Consumer Products and Retail Sector. He has more than 17 years of experience in running supply chains for Global Consumer Products and e-commerce companies. Pravin specializes in Supply Chain planning, Fulfillment design and Optimization, Order to Cash process and e-commerce Supply design and Last-Mile deliveries. 


                Design, build, and run “intuitive” deal desk operations

                Deepak Bhootra
                Oct 16, 2023

                Implementing frictionless, digitally augmented, and data-driven deal desk operations can drive operational excellence and increased value across the sales lifecycle.

                Just as a rally driver relies on a navigator to get them from A to B so they can stay focused on the speed and agility needed to win the race, so the sales function needs an “intuitive deal desk” to optimize deal value while mitigating risk.

                But what – you might ask – is a “deal desk”? And how can it be made “intuitive”?

A deal desk is a cross-functional team that analyzes, tracks, converts opportunities to quotes, reviews, and approves sales opportunities in real time, freeing sales teams to focus on speed to sale. It supports the strategic revenue and profitability goals of the sales strategy, and brings rigor and structure to the tactical, dynamic realities of negotiation, approvals, and the non-standard attributes of supporting a deal across the sales lifecycle.

                However, a deal desk operation frequently comes up against a range of challenges that include:

                • Insufficient, inaccurate, and inconsistent data
                • Lack of insights into sales drivers
                • Lack of visibility into pricing analytics
                • Outdated sales technology.

                These challenges often result in inefficient and non-intuitive deal desk operations.

                In turn, this can lead to unnecessary time and effort spent dealing with day-to-day tactical issues and losing sight of strategic levers such as win-rate drivers, improving margins, and stopping revenue leakage.

                Leverage intelligent, frictionless deal desk operations

                What’s needed is a smart, seamless deal desk operations model that can be tailored to the culture, practices, and needs of the individual organization – and that empowers the people who use it.

                At Capgemini, we recommend implementing a “Deal Life Cycle Cockpit” (Figure 1) that leverages three key elements:

                • Deal input – information on the deal helps establish accountability across the deal sales lifecycle.
                • Deal optimization – a deal review interface that automates analysis, expands configuration capabilities, and helps to make sense of optimization recommendations and how they are adopted by sales personnel.
• Deal review – integrating recommendations (from data science and pricing analytics) on key deal optimization techniques supports business outcomes and, in turn, the bottom line.
                Figure 1: Intuitive Deal Desk Framework – “Deal Lifecycle Cockpit”

                Technology eliminates process drag and improves how internal and external users experience it. This not only delivers hard gains (ROI, revenue gains and incremental margin), but also qualitative benefits such as improved CSAT/NPS that translates to stickiness, repurchase, loyalty, and “mindshare.” The integration of rich data and extended sales and pricing analytics delivers incremental business gains.

                Accelerating deal approval and win rates

                Capgemini partners with our clients to reimagine their deal desk operations model – integrating, streamlining, and optimizing sales touchpoints across the lead-to-order lifecycle through leveraging our tested process, data, technology, and outcome-driven approach.

                Enriching our clients’ digital sales strategy with data-driven insights and operational excellence across the sales function has led to some truly transformative business outcomes, including:

                • 20–40% decrease in deal approval turnaround time
                • 3–5% increase in win rates
                • 20–30% increase in customer satisfaction
                • 35–45% reduction in costs, and
                • 15–25% reduction in escalations.

                To prevent you from crashing into a ditch and getting covered in mud, intuitive deal desk operations help you drive operational excellence and increased value across your organization’s sales lifecycle.

                To learn how Capgemini’s Empowered Sales Operations solution can help you in unleashing competitive advantage from your deal desk operations for increased sales and user experience, contact: deepak.bhootra@capgemini.com.

                Meet our expert

                Deepak Bhootra

                GTM Lead, Empowered Sales Operations, Capgemini’s Business Services
                Deepak Bhootra is an established executive with two decades of global leadership experience. He delivers process excellence and sales growth for clients by optimizing processes and delivering seamless business transformation.

                  How to securely monitor a 5G network

                  Aarthi Krishna & Kiran Gurudatt
                  16 Oct 2023

                  Every generation of wireless technology has required organizations to adapt their security practices to effectively monitor and protect their networks. But monitoring a 5G network presents a new level of complexity due to the different protocols and architecture involved.

                  In our final blog of the 5G security series, it’s time to explore the complexities of monitoring a 5G network and how organizations can ensure that their infrastructure is watertight.

                  The 5G step change

                  Traditionally, security monitoring has focused on IT networks, such as MPLS or IP networks, where most security operations centers (SOCs) operate from. These SOCs primarily monitor enterprise systems like office, financial, and HR systems. However, with the proliferation of connectivity in operational environments, including manufacturing facilities, warehouses, and critical infrastructure, the monitoring scope has evolved to include operational technology (OT) networks too.

                  OT networks differ from enterprise networks in terms of protocols and tools required for monitoring. Proprietary protocols often govern devices and equipment in the OT environment, each requiring specific protocols and tools for monitoring.

                  Unlike IP networks, 5G networks operate on cellular protocols and follow cellular standards developed over previous generations (e.g., 2G, 3G, 4G). The difference is that as organizations deploy their own private or hybrid 5G networks, the responsibility for monitoring these networks shifts from telco providers to the enterprises themselves.

                  This is a completely new world for organizations, introducing unique complexities tied to cellular protocols and the division between the control and data plane (the former handles the initial handshake, authentication, encryption, and bandwidth allocation, while the latter facilitates the actual data transfer). Monitoring both planes and correlating the data is essential for effective 5G network security operations.

                  24×7 log collection

                  A fundamental aspect of 5G network security is continuous monitoring through 24×7 log collection. Logs are gathered from various components spanning from the user equipment (UE) to the core, providing crucial insights into potential security events.

The extent of log collection depends on the deployment model adopted. In private deployment models, higher volumes of log collection are possible. However, where the 5G architecture is shared with mobile network operators (MNOs), the enterprise must collaborate with the service provider to ensure the necessary logs are collected.

                  To achieve comprehensive monitoring, it is essential to collect logs from both the control plane and the data plane of the 5G architecture. Additionally, specialized toolsets are required as existing enterprise log collection tools may not fully comprehend the specific protocols, such as GTP, used in 5G networks. These tools not only collect data but also correlate them to identify ongoing attacks effectively.
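As a toy illustration of that correlation step, the sketch below groups control-plane and user-plane log records by a shared session identifier so both planes can be analyzed together. The field names are hypothetical, not taken from any specific 5G toolset.

```python
# Illustrative correlation of control-plane and user-plane log records
# keyed on a shared session identifier (field names are hypothetical).
from collections import defaultdict

def correlate(control_logs, user_logs):
    """Group both planes' events per session for joint analysis."""
    sessions = defaultdict(lambda: {"control": [], "user": []})
    for rec in control_logs:
        sessions[rec["session_id"]]["control"].append(rec["event"])
    for rec in user_logs:
        sessions[rec["session_id"]]["user"].append(rec["event"])
    return dict(sessions)

ctrl = [{"session_id": "s1", "event": "auth_ok"}]
user = [{"session_id": "s1", "event": "gtp_tunnel_up"}]
print(correlate(ctrl, user))
```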

                  Indicators of compromise

The next aspect of monitoring is indicators of compromise (IoCs), which play a vital role in detecting security attacks within the 5G environment. The best-in-class toolsets available today provide a range of IoCs that SOC analysts can use to identify potential security breaches. These IoCs can be categorized into device-related and traffic/performance-related indicators.

Some examples of device-related IoCs include:

• Detecting unknown devices in the network
• Monitoring changes in device connection status
• Identifying new device vendors
• Detecting devices that have not been seen for a specific period
• Identifying new device types
• Monitoring abnormal device traffic usage
• Tracking abnormal traffic usage by devices in specific locations
• Identifying user equipment (UE) connection failures
• Detecting consistent failures in UE IP allocation
• Identifying conflicting IMEI numbers with SUPI and SUCI mapping
• Detecting unknown UEs joining the network
• Monitoring repeated UE authorization failures
• Identifying devices with unknown locations
• Identifying devices with vulnerabilities or performance issues

Similarly, some traffic and performance IoCs include:

• Identifying unauthorized traffic patterns
• Monitoring compliance with quality of service (QoS) parameters
• Detecting abnormal traffic for specific devices or applications
• Monitoring the absence of traffic
• Identifying abnormal protocol usage for user equipment (UE) and Internet of Things (IoT) devices
• Detecting spikes in control traffic to UE, radio access network (RAN), and core
• Monitoring spikes in user plane data, potentially indicating distributed denial of service (DDoS) attacks

                  These IoC examples offer a glimpse into the extensive use cases built around them, and with the right tools, SOC analysts should feel empowered to detect and respond to security breaches effectively.
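One of the simplest device-related IoCs above, detecting unknown devices in the network, can be sketched as an allow-list check against a device inventory. The identifiers below are made up for illustration.

```python
# Illustrative IoC rule: flag device IDs seen on the network but missing
# from the inventory allow-list (identifiers are hypothetical).
KNOWN_DEVICES = {"imei-001", "imei-002", "imei-003"}

def unknown_device_alerts(observed_events):
    """Return device IDs observed on the network but absent from inventory."""
    observed_ids = {event["device_id"] for event in observed_events}
    return sorted(observed_ids - KNOWN_DEVICES)

events = [{"device_id": "imei-002"}, {"device_id": "imei-999"}]
print(unknown_device_alerts(events))
```

Real toolsets implement this and the other IoCs above with far richer context (device fingerprints, location, traffic baselines), but the core pattern is the same comparison of observed state against an expected baseline.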

                  Best practices for securely monitoring 5G networks

Monitoring a cellular network can be complex, but taken step by step it can also be manageable and efficient:

                  • Develop expertise: Invest in training and familiarize the team with the unique aspects of 5G protocols and the control-data plane division.
                  • Collaborate with telco providers: Engage with telco providers to understand their monitoring capabilities and coordinate efforts to ensure end-to-end security for private 5G networks.
                  • Adopt specialized tools: Acquire monitoring tools designed specifically for 5G networks, capable of monitoring both the control plane and the data plane. These tools should provide comprehensive visibility and correlation capabilities.
                  • Implement network slicing: Leverage network slicing capabilities to isolate and monitor different slices within the 5G network. This approach enhances security and allows focused monitoring for specific services or devices.
                  • Continuous monitoring and analysis: Implement real-time monitoring and analysis to identify anomalies, detect potential threats, and respond promptly. Incorporate threat intelligence feeds to stay updated on emerging threats and vulnerabilities in 5G networks.

As these different components come together in different deployment models, achieving end-to-end security in 5G can become challenging. This is why IT, OT, and cellular network security policies must all be well aligned and integrated to deliver enterprise-grade security governed by zero-trust principles, protecting north–south and east–west traffic as well as data at rest and in transit.

                  Overall, any monitoring of a 5G network requires organizations to adapt their security practices to the unique characteristics of cellular protocols and the control-data plane division. By investing in expertise, collaborating with telco providers, leveraging specialized tools, and adopting best practices, organizations can ensure the security of their 5G networks and start embracing the benefits of 5G technology.

                  Contact Capgemini today to find out about 5G security.

                  Author

                  Aarthi Krishna

                  Global Head, Intelligent Industry Security, Capgemini
Aarthi Krishna is the Global Head of Intelligent Industry Security with the Cloud, Infrastructure and Security (CIS) business line at Capgemini. In her current role, she is responsible for the Intelligent Industry Security practice, with a portfolio focused on both emerging technologies (such as OT, IoT, 5G, and DevSecOps) and industry verticals (such as automotive, life sciences, and energy and utilities) to ensure our clients can benefit from a true end-to-end cyber offering.

                  Kiran Gurudatt

                  Director, Cybersecurity, Capgemini

                    Our journey to winning the NTIA 5G challenge

                    Ashish Yadav
                    Oct 16, 2023
                    capgemini-engineering

                    The 5G Challenge aimed to accelerate the adoption of open, interoperable wireless network equipment to support vendor diversity, supply chain resiliency and national security.

                    From Morse Code to the NTIA

                    When Guglielmo Marconi sent a Morse Code signal using radio waves to a distance of 3.3 kilometers in 1895, he may not have fully understood its impact on communication in the years to come. His actions sparked a revolution that continues to transform how people communicate.

                    83 years later, the National Telecommunications and Information Administration (NTIA) was created. It has played a crucial role in shaping the nation’s telecommunications policies and promoting innovation and growth in the technology field.

                    As part of the U.S. Department of Commerce, the NTIA is the Executive Branch agency that advises the President on telecommunications and information policy issues. NTIA’s programs and policymaking focus primarily on expanding broadband Internet access and adoption in America, expanding the use of spectrum by all users, advancing public safety communications, and ensuring that the Internet remains an engine for innovation and economic growth.

NTIA officials have worked tirelessly to fight the digital divide and advance digital equity. In 1994, the agency sponsored the first virtual government hearing over the Internet. Schools and public libraries nationwide offered public access points for Americans who otherwise would not have been able to view the hearing. It was undoubtedly a significant shift in thinking and in how people connected.

                    In 1995, the principles of a book co-written by former US Vice President Al Gore, ‘The Global Information Infrastructure: Agenda for Cooperation’ helped transform the Internet into a shared international resource throughout the rest of the 1990s. As part of this, the digitization of information on the website of the NTIA and other U.S. agencies made government more accessible to everyday Americans.

                    The 2022 5G Challenge

                    In 2022, NTIA launched a 2-year ‘5G Challenge’ program to foster a vibrant 5G O-RAN vendor community. The 5G Challenge, hosted by the US Institute for Telecommunication Sciences (ITS), aimed to accelerate the adoption of open, interoperable wireless network equipment to support vendor diversity, supply chain resiliency, and national security.

                    In its first year (2022), NTIA/ITS required the contestants to successfully integrate hardware and/or software solutions for one or more 5G network subsystems: Central Unit (CU), Distributed Unit (DU) and Radio Unit (RU).

The Capgemini team participated in the 2022 5G Challenge, competing in the Central Unit category and winning all three challenge stages.

Capgemini qualified in Stage One by demonstrating compliance with 3GPP and O-RAN Alliance standards, and was successfully evaluated on end-to-end functionality and standards conformance. After successfully demonstrating wrap-around testing in Stage Two, we proceeded to Stage Three. Capgemini won Stage Three, and with it the challenge, by successfully integrating subsystems from five different vendors: user equipment (UE), Radio Unit, Distributed Unit, Central Unit, and Core.

                    The 2023 5G Challenge

                    The 2023 5G Challenge was the second of the two 5G Challenges. NTIA selected contestants with high-performing 5G subsystems that showcased multi-vendor interoperability across RUs and combined CUs and DUs (CU+DU). CableLabs hosted the challenge and provided two separate 5G test and emulation systems.

                    Capgemini was assigned one of the two subsystems. Contestant subsystems that passed the emulated testing were integrated with the Capgemini subsystem and the CableLabs baseline system consisting of a 5G standalone core and UE emulator. ‘Plug-and-play’ performance was evaluated using a standard corpus of performance metrics. The testing involved cold pairing with radios selected by NTIA.

                    The level of testing and interoperability justifies the event’s name. It really was a challenge, but one in which the Capgemini team participated with great enthusiasm. It was also a testimonial to the resilience, compliance, and quality of the Capgemini CU and DU framework. We completed all the stages for wrap-around emulation testing, End-to-end (E2E) integration testing to establish and test E2E sessions (CU+DU and RU), and Mobility testing between two E2E sessions.

                    Success!

                    The NTIA awarded Capgemini first place for Multi-Vendor E2E Integration. NTIA specifically applauded Capgemini during the ceremony for achieving a 100% pass rate on all feature and performance tests, which was the icing on the cake.

                    It took hard work, teamwork, personal sacrifices, motivation, hope, and triumph to reach the final hard-won victory. The Capgemini team’s incredible journey with the 5G Challenge 2023 started on March 1st, 2023, with Stage One kick-off, and culminated in September 2023, when NTIA awarded us first place.

                    At the closing ceremony, held in the CableLabs/Krio facilities, Boulder, Colorado, on September 21st, Capgemini was presented with two prizes: Multi-Vendor E2E Integration and Wrap-around testing for the Open RAN CU and DU.

                    This experience of competing in the two NTIA challenges over the past two years reinforced the foundation and importance of O-RAN, and why interoperability is so crucial for the broader ecosystem. 

                    It has been an incredible journey for the Capgemini team to be part of this historic initiative by NTIA. It brought the diverse vendor community together to give a meaningful push to making wireless networks interoperable. It also gave us all a common purpose to serve, by building a resilient supply chain for national security.


                    Meet our expert

                    Ashish Yadav

                    Head of Strategic Partnerships and Technical Product Management, Software Frameworks & Solutions, Capgemini Engineering
                    Ashish Yadav is a leader with more than 20 years of engineering experience, managing strategic partnerships for start-ups and Fortune 500 global technology companies. In her current role, she is responsible for global strategic partnership alliances and technical product marketing for the Software Frameworks & Solutions portfolio at Capgemini. This group is responsible for building innovative offerings in the area of 5G, networking, cloud/Edge and the automotive sector.