
E-mobility for truck OEMs: overcoming obstacles to take-up

Fredrik Almhöjd
09 June 2023

Truck OEMs are committed to electrification to reduce emissions, particularly Scope 3 emissions. Most OEMs have forged ahead with launching EVs, but this segment of their portfolio is not growing as fast as it needs to.

In this article, Fredrik Almhöjd recommends three actions commercial vehicle manufacturers can take now to start overcoming customer resistance to e-mobility, with an emphasis on charging and connected services.

In an earlier blog article, Fredrik gave an overview of automotive sustainability in the commercial vehicle space and explained why companies need to take a holistic view to achieve their CO2 targets and other environmental aims.

This time, however, we are going to focus on a specific contributor to automotive sustainability – e-mobility. This is one of the industry’s biggest trends and undoubtedly the most important single enabler of OEMs’ sustainability ambitions.

That’s because emissions arising from products sold and in use represent the biggest and most challenging element of truck OEMs’ total emissions. Switching sales from ICE trucks to EVs is crucial in reducing these downstream emissions, which fall within Scope 3 of the Science Based Targets initiative (SBTi).

We will be talking mainly about BEVs here, but most of the discussion applies equally to FCEVs.

The balance isn’t shifting fast enough towards e-mobility

Truck OEMs are pushing hard in this area – virtually all have already launched BEV trucks onto the market. And they all have ambitious plans to grow the proportion of their portfolio represented by BEVs. Volvo Trucks, for example, has set itself a target of 50% fully electric vehicles by 2030 and aims to reach net zero in its value chain by 2040. Similarly, Scania has estimated that 50% of its vehicle sales will be electric by the end of the decade. Other truck OEMs have similar ambitions.

However, many truck companies are finding the EV segment of their portfolio is not growing as fast as they need it to. This is mainly because their customers are strongly TCO-driven, and the TCO just isn’t attractive enough compared with ICE trucks. In addition, concerns remain about range limitations and insufficient charging infrastructure.

As well as jeopardizing net zero targets, slow take-up of electromobility creates a Catch-22 situation because, while cost is a barrier to take-up, high volumes are needed to drive costs down.

Truck OEMs have the power to accelerate EV take-up

Truck OEMs therefore urgently need to improve EV take-up. Here are three ways they can start doing so now.

1. Formulate a clear go-to-market strategy and organization

The go-to-market strategy is an important determinant of EV success, and the right internal organization is key. Although selling EVs is a longer and more complex process than selling conventional trucks, in my view EV sales should sit within the traditional sales organization, which already knows the customers.

Of course, the sales organization needs to be motivated to promote EVs, which involves setting appropriate KPIs and targets. At the same time, salespeople need training and tools to equip them with the competencies to satisfy customers’ EV concerns. In addition, the salespeople need to know how to work with other stakeholders within the wider automotive ecosystem, such as power companies and providers of charging equipment, to create the right package for each customer.

The sales force also needs to be conversant with the arguments for EVs, both financial and environmental. In contrast with image-conscious private car users, truck customers are firmly focused on commercial realities, and will usually only buy an EV if they are convinced that it makes business sense. This won’t happen unless salespeople themselves are convinced of the case for EVs, and therefore education is key.


Major benefits of EVs

Many authorities consider EVs the #1 technology for the decarbonization of road transport. Aside from CO2 reduction, other benefits include noise reduction, which means that trucks can continue to operate at night, even in cities. This brings both general productivity benefits and specific advantages in areas such as waste removal and distribution.

Circular economy benefits include the possibility of giving truck batteries a second life in applications such as grid energy storage and backup power. Cutting waste like this brings financial as well as automotive sustainability benefits. In addition, EVs can store energy, potentially offsetting fluctuations in wind and solar power and hence facilitating growth in renewable energy use.


2. Put customers’ minds at rest via connected services and other tech tools

Truck OEMs should offer software tools and services that address customers’ questions, such as simulators for calculating a fleet’s battery and charging needs, and the implications for routing. We’ll discuss the important topic of automotive-connected services in more detail later in this blog series.
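As a purely illustrative sketch of what such a simulator might calculate — with hypothetical consumption and charger figures, not real OEM data — a first-cut estimate of a fleet's daily energy need and depot charger count could look like this:

```python
from dataclasses import dataclass

@dataclass
class Route:
    distance_km: float              # daily distance driven
    consumption_kwh_per_km: float   # assumed BEV truck consumption

def fleet_charging_needs(routes: list[Route],
                         charger_kw: float = 150,
                         overnight_hours: float = 8.0) -> dict:
    """Rough estimate of daily energy demand and depot chargers needed.

    All parameters are illustrative assumptions; a real simulator would
    also model payload, temperature, topography, charge curves, tariffs,
    and routing.
    """
    daily_kwh = sum(r.distance_km * r.consumption_kwh_per_km for r in routes)
    kwh_per_charger = charger_kw * overnight_hours
    chargers = -(-daily_kwh // kwh_per_charger)  # ceiling division
    return {"daily_kwh": daily_kwh, "depot_chargers": int(chargers)}
```

Even a toy model like this lets a salesperson discuss concrete numbers with a fleet manager; production tools would add live routing and tariff data.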

Ecosystem partners are likely to deliver many of the solutions required, and it’s vital to work out who will be responsible for what, as we discussed in an earlier article in this series, The commercial vehicle ecosystem – who will do what? Typically, services that access critical vehicle functions, as well as those relating to productivity and other top operational priorities, will be among those provided by the OEM itself. Still, many others can be left to third parties.

3. Develop an adequate charging infrastructure

Of course, customers will not experience true peace of mind until charging anxieties are satisfactorily addressed. For fleet managers, the risk that a truck will not be fully utilized because of charging issues can be a deal-breaker. Fortunately, this risk can be virtually eliminated using route planning and battery monitoring services in the context of adequate charging infrastructure.

To provide this infrastructure, OEMs need to collaborate with energy industry partners and major customers to agree on strategy and responsibilities; they may well decide to invest in the infrastructure themselves. Design complexities include deciding how to use public charging points versus those at the depot, and which types of chargers should be used where. Additional options for trucks include megawatt chargers, which deliver higher power at a higher cost.

It should be remembered that trucks usually can't use existing car-oriented charging infrastructure: they need bigger parking spaces and access routes, and they draw higher levels of power from the grid. Dedicated infrastructure for trucks is therefore likely to be needed.

Truck OEMs can’t wait for the charging challenge to solve itself. Each company should undertake a proactive study of its customers’ charging requirements, evaluate potential business models for strategic fit, and then create an implementation roadmap for its chosen model.


European infrastructure requirements, and official plans to meet them

ACEA, the European Automobile Manufacturers’ Association, believes that trucks in the EU region will need 10,000–15,000 higher-power public and destination charging points by 2025, and 40,000–50,000 by 2030. By this later date, ACEA says there should also be 40,000 lower-power public overnight chargers at truck parking areas.
 
Recent EU initiatives to address this need include the Alternative Fuel Infrastructure Regulation (AFIR), which requires governments to provide at least 3,600 kW of electric truck charging capacity every 60 km on primary motorways and 1,500 kW of truck charging capacity every 100 km on secondary motorways. The regulation also calls for charging hubs in every major city and charging stations in designated truck parking areas. 2030 is the deadline for all these requirements but there are also interim targets.
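To make those AFIR figures concrete, here is a back-of-envelope calculation of the minimum capacity implied for a corridor of a given length. It is illustrative only — the regulation's actual provisions are more nuanced, with interim targets and exemptions:

```python
def afir_min_capacity_kw(length_km: float, primary: bool = True) -> int:
    """Minimum truck-charging capacity implied by the AFIR 2030 targets:
    3,600 kW every 60 km on primary motorways, 1,500 kW every 100 km on
    secondary ones. A simplified reading, for illustration only."""
    interval_km, pool_kw = (60, 3_600) if primary else (100, 1_500)
    pools = max(1, round(length_km / interval_km))
    return pools * pool_kw
```

For a 600 km primary corridor this implies ten 3,600 kW pools, i.e. 36 MW of installed truck-charging capacity — a useful order-of-magnitude input for an infrastructure roadmap.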
 
Although the regulation has been criticized for not going far or fast enough, it provides manufacturers with a useful basis for their infrastructure roadmaps.


Moving forward through concerted action

With the actions suggested above, OEMs should be able to significantly increase EV take-up in the short to medium term, accelerating progress toward automotive sustainability goals. As ever, though, they can’t do it all on their own. They need help from a wider range of ecosystem partners, and, just as importantly, from governments, which should use fiscal and other measures to make the TCO of EVs more favourable relative to ICE vehicles so that cost-conscious truck buyers become willing to embrace e-mobility. We’ll revisit these topics in later articles, and will also take a deeper dive into the two key e-mobility enablers: connected services and charging.

About Author

Fredrik Almhöjd

Director, Capgemini Invent
Fredrik Almhöjd is Capgemini’s Go-to-Market Lead for Commercial Vehicles in the Nordics, with 25+ years of sector experience plus extensive know-how in Sales & Marketing and Customer Services transformation.

    Software-defined vehicles (SDV): The answer to truck driver shortages?

    Fredrik Almhöjd
    Aug 2, 2023

    Although most truck OEMs acknowledge software-defined vehicles as a new norm for the commercial vehicle industry, they still need to convince their customers that these vehicles will add value to their businesses – especially around the top three objectives of improved uptime, productivity, and fuel efficiency.

    SDVs have a major role to play in helping fleet operators overcome the international shortage of truck drivers, explain Fredrik Almhöjd and Jean-Marie Lapeyre, Chief Technology & Innovation Officer, Global Automotive Industry at Capgemini. That’s because SDVs can transform the driver experience, potentially attracting younger people and women who currently don’t see truck-driving as a career option.

    “Without action to make the driver profession more accessible and attractive, Europe could lack over two million drivers by 2026, impacting half of all freight movements and millions of passenger journeys.” That is the stark prediction of the International Road Transport Union (IRU), commenting on a study it conducted in 2022. The outlook isn’t any more reassuring in other regions.

    So what are transportation companies to do, and how can truck OEMs help? In this article, we’ll argue that software-defined vehicles (SDVs) could be a big part of the answer. We’ll be building on ideas from earlier blogs.

    In the passenger car market, the concept of SDVs is often promoted on the basis that it will create a better customer experience for the driver. For commercial vehicle fleet operators, by contrast, the main focus has always been, and will continue to be, on TCO. Until recently, efforts to improve life for the driver, while important, have received less attention.

    However, with driver shortages becoming critical, truck-driving needs to be made more attractive to jobseekers. The IRU suggests that attracting more women and young people is an important part of the solution – but current working conditions make that difficult.

    SDVs can help with the challenge of recruiting and retaining staff.

    SDV features can make drivers’ lives better

    So what SDV features might improve driver experience? Truck drivers will enjoy many of the same benefits as car drivers, such as customized infotainment – though obviously, this must not distract them from the job.

    Consumer-oriented SDV features can be tailored for trucks. For example, a framework for companion apps on smartphones could be adapted to support the needs of HGV drivers in finding places to stop, eat, and sleep, avoiding illegal and dangerous use of phones while driving. In addition, although software can’t improve the quality of facilities available to drivers, it can help direct them to the most satisfactory ones based on a driver’s personal preferences and ratings by other users.

    With all the functionality they need integrated and automated (and configured for personal habits and preferences that they have already stored), the job can be done safely, easily, and legally. Similar technology could be used to help last-mile delivery drivers navigate between stops.

    Integrate drivers’ digital lives

    Many people, especially younger ones, now expect their digital lives to be streamlined and integrated across work and leisure. To appeal to these individuals, SDVs could be equipped to remember drivers’ preferences regarding infotainment modes and transfer them across trucks. Their preferred smartphone apps, or similar ones, could also be made available via the truck’s console.

    By integrating various aspects of working life, we can make the driver’s job easier, as well as more pleasant. A common complaint from truck drivers is that they have to unload cargo themselves because there is nobody else to do it. An SDV can contact the destination to communicate the arrival time and nature of the cargo, increasing the chances of the relevant staff being on hand with the right equipment.

    When a truck is an SDV, ADAS features can easily be added. Some of these features can help to make truck-driving more attractive to younger people and women by allowing multiple tasks to be performed simultaneously. Stress levels for the driver are reduced significantly if they can organize their working day – including route optimization and scheduling of pickups and deliveries – while they’re on the road. This can be achieved through partial automation of driving tasks, whether via assistant systems or fully autonomous driving (say up to level 4), paired with services that help with routing and scheduling.

    Overcome negative perceptions of truck-driving careers

    For women in particular, personal safety issues can be a deterrent to working as a truck driver. Connected vehicle software can help here too. For example, AI-enabled services can monitor sensor data and warn when someone is approaching a stationary truck, and biometrics can control who has access to the cabin. Predictive maintenance can reduce or eliminate the risk of breaking down in a lonely spot. (And with SDVs, we can go beyond preventive maintenance via telematics and alerts to their natural successor, self-diagnosis by the vehicle.)

    Thanks to SDV connectedness, despatchers can more easily monitor drivers’ safety and send help if needed. The same communications facilities could streamline interaction between communities of drivers who can look out for one another, reducing any sense of isolation.

    Long hours away from home are another turn-off for many potential drivers. SDVs’ communications technologies can improve their work-life balance, with social media style software, in-vehicle display screens, and cameras keeping the driver in touch with family or friends during stops.

    Work-life balance can be further improved by advanced route optimization techniques. An SDV route can be automatically optimized to accommodate a driver’s personal preferences and constraints, as well as requirements such as refueling and rest stops. It can then be continuously adjusted to reflect the current circumstances such as weather and traffic conditions, helping drivers to finish work on schedule.

    Deliver better driver experience and financial benefits for fleet operators

    Despite their urgent need to recruit more drivers, at the end of the day truck buyers are still likely to focus on the more tangible benefits of SDVs. The good news is that many of the features that give drivers a better experience simultaneously increase productivity, uptime, or fuel efficiency – for example, predictive maintenance and real-time route optimization, both mentioned above.

    The same is true of services that address electric vehicles’ range limitations and shortages of charging stations (as discussed in our recent e-mobility blog). Suppose the truck’s battery is getting flat, and the nearest charging station has a long wait time. An SDV can save energy in various ways: for example by modifying engine parameters or environmental settings such as aircon, or by advising changes in driving behavior. With these adjustments, the driver can continue to a charging station with an acceptable wait time, improving productivity and likely reducing frustration too.
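The trade-off described — reaching a station with an acceptable wait rather than simply the nearest one — can be sketched as a simple selection rule. The station fields and data are hypothetical; a real SDV service would use live connected data and a richer cost model:

```python
def pick_charging_station(stations: list[dict], range_km: float):
    """Pick the reachable station minimising driving time plus expected
    wait. Returns None if none is reachable, in which case the SDV would
    switch to energy-saving measures and re-plan."""
    reachable = [s for s in stations if s["distance_km"] <= range_km]
    if not reachable:
        return None
    return min(reachable, key=lambda s: s["drive_min"] + s["wait_min"])
```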

    Safer driving is yet another example of an SDV capability that benefits both employer and driver. Examples here include the use of sensors to detect when vehicles get too close to one another, or when drivers are tired and need a break. For example, a truck could raise an alert when its driver is blinking more frequently than is normal for them, indicating exhaustion.
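A minimal sketch of the blink-rate idea — with an arbitrary threshold factor, not a validated fatigue model — might compare the observed blink rate against the driver's stored personal baseline:

```python
def fatigue_alert(blink_intervals_s: list[float],
                  baseline_blinks_per_min: float,
                  factor: float = 1.5) -> bool:
    """Raise an alert when the observed blink rate exceeds the driver's
    personal baseline by `factor`. Threshold values are illustrative."""
    if not blink_intervals_s:
        return False
    mean_interval = sum(blink_intervals_s) / len(blink_intervals_s)
    observed_per_min = 60.0 / mean_interval
    return observed_per_min > factor * baseline_blinks_per_min
```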

    Make driver appeal part of the business case for SDVs

    For truck OEMs and tier 1s, the case for SDVs is clear. They can enhance revenue flows via a shift from one-off purchases to full lifecycle engagement, and improve automotive sustainability performance, for example by reducing waste in R&D processes. Ultimately, SDVs can help to make the brand central to customers’ businesses. In addition, selling SDVs makes sense as part of the journey to autonomous driving and in the context of companies’ overall digital transformation.

    Software-defined vehicles as passenger cars

    SDVs are already proving their worth in the passenger car market, where improved driver experience is a more obvious selling point. (Read our “point of view” report on software-driven transformation for more.)

    An excerpt from a recent Connected Mobility infographic

    The question is how to demonstrate the value of SDVs to truck customers such as fleet operators. Industry concepts such as software-driven transformation are not always much help here. Instead, OEMs can point to the business benefits that result from SDV adoption. And right now, improved driver experience could be among the most important of those benefits because of its ability to help overcome driver shortages.

    For more information, visit the commercial vehicles area of Capgemini’s website, and read the earlier articles in this blog series.

    About Author

    Fredrik Almhöjd

    Director, Capgemini Invent
    Fredrik Almhöjd is Capgemini’s Go-to-Market Lead for Commercial Vehicles in the Nordics, with 25+ years of sector experience plus extensive know-how in Sales & Marketing and Customer Services transformation.
    Jean-Marie Lapeyre

    EVP and Chief Technology & Innovation Officer, Global Automotive Industry
    Jean-Marie Lapeyre works with automotive clients to develop and launch actionable technology strategies to help them succeed in a data and software-driven world.


      How distributed ledger technology can impact the role of centralized clearing parties

      George Holt
      29 July 2024

      Periodically, transformative technology emerges that instigates profound changes across multiple industries, redefining the way we live our lives and conduct business. Distributed ledger technology (DLT) is one contemporary example.

      DLT operates on the principle of decentralization, structured upon layered protocols and frameworks. At its core, it distributes transaction data across numerous points, all connected to a shared ledger acting as the golden source – removing many reconciliation efforts. This shared ledger, updated collectively by “nodes” utilizing diverse consensus mechanisms, forms the backbone of DLT’s architecture, with blockchain technology driving its mechanics.

      Tokenization and physical assets

      One of DLT’s most striking applications is the tokenization of physical assets, creating what are known as digital assets. From tokenized securities to cryptocurrencies and central bank digital currencies (CBDCs), these digital representations revolutionize the trading landscape. Tokenization empowers investors to transact assets with unprecedented speed and fractional ownership, fostering liquidity and lowering bureaucratic hurdles.

      Impact on centralized clearing parties (CCPs)

      Through DLT, counterparties engage in direct trading, bypassing traditional intermediaries. This raises pertinent questions about the future role of CCPs in this DLT-driven paradigm shift. But before we look into the future, let’s examine some current developments in this arena.

      What’s the latest in the industry?

      BlackRock, a leading global asset manager, has revealed plans for a digital fund leveraging Ethereum’s blockchain, while simultaneously acquiring a stake in Securitize, a platform facilitating asset tokenization. This strategic move underscores its commitment to tokenization infrastructure and signals a shift in approach.

      Meanwhile, Cleartoken, a new industry disruptor in this space, recently raised $10 million in seed investment. Its declared mission is to be one of the first entrants in the CCP space for digital assets: the plan is to establish a central clearinghouse that will mitigate risk and encourage wider institutional adoption of cryptocurrencies by creating a more secure trading environment.

      As the industry forges ahead, regulators and governments are starting to recognize the potential of digital assets. Treasury Secretary Janet Yellen advocates for US leadership in the crypto space, while regulators from the UK, Singapore, Switzerland, and Japan collaborate to explore digital asset use cases. Additionally, the US Securities and Exchange Commission’s approval of spot Bitcoin ETFs earlier this year highlights the evolving regulatory landscape.

      Opportunities for traditional CCPs

      Despite the emerging threat to their traditional role, CCPs also have opportunities to exploit. Many parties will require guidance and assistance to navigate the digital asset infrastructure space, and CCPs’ existing relationships place them uniquely well both to help inform regulators and to guide participants through the new regulatory landscape. CCPs could also take the lead in setting up the technical infrastructure within organizations that makes digital asset trading possible.

      Clearing parties’ familiarity with the market, individual participants, and the regulators means they can be at the forefront of change if they adopt the right strategy. This would allow them to continue their role as facilitators, with less control of the processing but more influence within the individual parties. There will also be demand for hard copies of ledgers at fixed points in time; as leaders on the blockchain with high stakes in various chains, existing CCPs would be well placed to produce these.

      Whether participants strongly believe in the power of DLT or have lingering doubts, one thing is clear: DLT is here to stay and it’s changing the world of post-trade financial services forever.

      Meet our expert

      George Holt

      Senior Consultant, Capgemini


        How digital assets reshape the post-trade landscape in capital markets

        George Holt
        08 July 2024

        The continued advancement of digital assets, including cryptocurrencies, security tokens, and other blockchain derivatives, is ushering in a new era in the capital markets. This shift extends beyond trading, revolutionizing the logging, clearing, and settling of transactions within the post-trade sphere. As digital assets establish their presence, they reveal significant challenges and unique opportunities. These developments have the potential to reshape the financial services industry.

        Streamlining operations and mitigating risks

        Digital assets expedite and streamline transaction processes far beyond the capabilities of traditional financial tools. Underpinned by blockchain technology, they facilitate transactions that are not only faster but also settle in real time, paving the way for atomic settlement. This eliminates the need for the protracted settlement periods typical of legacy systems, thereby reducing counterparty risks and boosting market liquidity.

        The revolutionary role of smart contracts

        Smart contracts are a pivotal innovation in utilizing digital assets post-trade. Embedded directly into blockchain code, these contracts execute automatically, upheld by a decentralized network of computers via network-wide consensus. For example, in a bilateral trade, both parties must agree on the trade economics before the contract is considered upheld and the trade is written to the ledger. Smart contracts can automate the complex and labor-intensive tasks of post-trade operations, from compliance verification to dividend issuance and managing corporate actions. This automation potential may significantly reduce operational costs and curtail human error, streamlining the entire post-trade process.
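The bilateral-trade rule described above can be modelled in a few lines. This is a toy Python stand-in for on-chain contract logic, not real smart-contract code (which would typically be written in a language such as Solidity and executed by the network's nodes under consensus):

```python
def settle_bilateral_trade(buyer_terms: dict, seller_terms: dict,
                           ledger: list) -> bool:
    """Write the trade to the ledger only when both parties have
    submitted identical trade economics, mirroring the agreement rule
    a smart contract would enforce before settlement."""
    if buyer_terms != seller_terms:
        return False  # no agreement -> nothing is written
    ledger.append({"status": "settled", **buyer_terms})
    return True
```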

        Navigating the integration minefield

        Yet, for all their advantages, digital assets present formidable integration challenges within the traditional capital markets framework. Regulatory clarity is still, at best, a work in progress globally, as authorities grapple with appropriate frameworks to govern these digital assets. Moreover, the existing technological infrastructures of conventional financial institutions often require extensive overhauls to accommodate blockchain transactions, necessitating significant investments in new technology and workforce retraining.

        Evolving regulations

        As the impact of digital assets becomes more apparent within financial markets, regulators are under pressure to evolve existing legislation to include these innovations. The trajectory of these evolving regulations will critically shape the digital asset landscape within capital markets. Clear, consistent regulatory directives are vital to balance fostering innovation with ensuring market stability and safeguarding investor interests.

        The path ahead

        The impact of digital assets on the post-trade sector signals a pivotal transformation in capital market operations. Though the journey ahead is fraught with regulatory, technological, and operational complexities, the promise of enhanced efficiency, reduced costs, and bolstered security presents a compelling case for broader adoption of digital assets. As the market landscape adapts, stakeholders must remain flexible, leveraging new technologies and adapting to emerging paradigms to stay competitive in this evolving arena.


          How GenAI is transforming document search and knowledge management

          Rajesh Iyer
          26 July 2024

          From data deluge to insights: How GenAI is transforming document search and knowledge management in financial services

          Organizations, particularly in the financial services sector, have long mastered the management of structured data within relational databases. These firms have honed their expertise in data storage, ensuring data quality, and leveraging this data for applications, reporting, and analytics. However, the advent of GenAI has transformed the handling of unstructured data, unlocking new possibilities in knowledge management and search capabilities across enterprise processes and workflows.

          While structured data benefits from centralized storage and easy retrieval through tables and keys, managing unstructured data presents unique challenges. Ensuring that documents are not duplicated across various storage platforms like SharePoint, Teams, and Content Management Systems is less straightforward. Although some progress has been made in solving storage issues, the rigor seen in relational databases is often lacking.

          The time-consuming process of gathering and auditing information from large collections of documents can significantly hamper productivity. The complexity increases when integrating structured and unstructured data to provide a seamless and efficient user experience for business, technical, and operational purposes. The value of advanced, GenAI-powered search and knowledge management systems becomes evident, offering speed, accuracy, and scale, thus enhancing overall organizational efficiency.

          Approaching the problem from multiple fronts

          Now that we’ve examined the challenges and business value of this organizational capability, let’s discuss how to address it from multiple angles. The following chart offers an overview of the key dimensions involved in building this capability. In the subsequent sections, we will delve into the specifics of how AI and advanced techniques can be effectively implemented across the organization.

          1) Information Stewards for feedback loop

          The role of Information Stewards in ensuring ongoing data readiness is crucial. Information Stewards are responsible for monitoring and managing the quality, security, and compliance of the data environment. Their oversight ensures that the data remains accurate and secure. Additionally, integrating feedback from Information Stewards is essential for continuously improving data quality and AI model performance. This ongoing process helps maintain a high standard of data readiness and enhances the effectiveness of AI implementations.

          The organizational structure of the financial services firm will determine the specific responsibilities of each Information Steward. For example, every line of business (LOB) and operational horizontal, such as contact centers, back-office operations, and strategy teams, will have designated stewards. If the firm uses disparate content management systems, additional effort will be required to standardize unstructured data governance processes, ensuring the integrity of the unstructured data landscape.

          2) AI augmented data enhancement

          To ensure the quality, accuracy, and completeness of data, several capabilities are essential. Deploying classification algorithms to automatically identify document types and topics is crucial for effective data classification. Tag generation and metadata management play a significant role by automatically generating metadata tags for roles, topics, and divisions. Additionally, adhering to data standards is necessary to ensure that documents are reviewed and approved before publishing.

          Document standards, such as mandatory sections for intended audience, role-based security permissions, and change audit trails, must be strictly enforced. Approaches need to be developed to automate data augmentation from system logs, incorporating this information into service desk tickets to record which systems were accessed for resolving issues. The goal is to enhance human entries with automated data from logs and other sources, thereby reducing user friction and improving the accuracy and completeness of information.
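As a deliberately simple illustration of automated tag generation, here is a keyword lookup standing in for the trained classification models described above; the tag names and keywords are invented:

```python
# Hypothetical tag vocabulary; a production system would use a trained
# classifier rather than keyword matching.
TAG_KEYWORDS = {
    "compliance": ["audit", "regulation", "policy"],
    "operations": ["incident", "service desk", "ticket"],
}

def auto_tags(text: str) -> set[str]:
    """Assign metadata tags whose keywords appear in the document text."""
    lower = text.lower()
    return {tag for tag, words in TAG_KEYWORDS.items()
            if any(word in lower for word in words)}
```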

          3) Database for unstructured data augmented with structured data

          Combining structured and unstructured data involves several key strategies. Implementing vector databases for dynamic indexing of unstructured data significantly improves the speed and accuracy of search queries. Enhancing unstructured data with structured data, such as document metadata and access permissions, adds valuable context.

          Adding user role-based context makes large language models (LLMs) more effective in addressing queries. By including roles and their key performance indicators (KPIs) as additional context, GenAI applications can better understand the motivations behind specific questions. This enables them to respond to general queries, such as “What are the top three things I should worry about today?” with greater expertise and relevance.
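In practice, adding role and KPI context can be as simple as prepending it to the retrieved excerpts when assembling the LLM prompt. A hedged sketch — the prompt layout and field names are illustrative, not a specific product's API:

```python
def build_prompt(question: str, role: str, kpis: list[str],
                 doc_snippets: list[str]) -> str:
    """Assemble an LLM prompt that grounds retrieved document chunks in
    the user's role and KPIs, so general questions get relevant answers."""
    context = "\n".join(f"- {snippet}" for snippet in doc_snippets)
    return (
        f"You are assisting a {role} whose KPIs are: {', '.join(kpis)}.\n"
        f"Relevant excerpts:\n{context}\n\n"
        f"Question: {question}"
    )
```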

          Additionally, exploring advanced techniques like combining Retrieval-Augmented Generation (RAG) architecture with knowledge graphs can further augment the enterprise context, providing a more comprehensive and efficient data management solution. GraphRAG approaches add an extra advisory layer that helps identify related document chunks specific to the document repository being queried.

          4) Hybrid search and agentic architecture

          To enable quick and effective data search and presentation to end users, a hybrid search and agentic architecture is essential. This approach combines the precision of vector search with semantic search to enhance search accuracy. Result enhancement is achieved through ranking fusion techniques, which merge results from both search types.
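One common ranking fusion technique is Reciprocal Rank Fusion (RRF), which merges several ranked result lists so that documents ranked highly in any list rise to the top. The sketch below applies it to two hypothetical result lists (the document IDs are placeholders).

```python
# Reciprocal Rank Fusion (RRF): merge ranked lists from, e.g., a vector
# search and a keyword search. k=60 is the conventional damping constant.

def reciprocal_rank_fusion(result_lists: list[list[str]], k: int = 60) -> list[str]:
    """Score each document by summed reciprocal ranks across all lists."""
    scores: dict[str, float] = {}
    for results in result_lists:
        for rank, doc_id in enumerate(results):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["doc_a", "doc_b", "doc_c"]   # hypothetical vector-search results
keyword_hits = ["doc_b", "doc_d", "doc_a"]  # hypothetical keyword-search results
fused = reciprocal_rank_fusion([vector_hits, keyword_hits])
```

With the inputs above, `doc_b` (ranked first in one list and second in the other) ends up first in the fused ranking, ahead of `doc_a`.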

          Additionally, the ability to call APIs across multiple domains, such as CRM, document repositories, service desks, and requirements, further enriches the search capabilities. An agentic architecture, with libraries for specific functionalities, ensures an improved customer experience (CX). This architecture allows AI libraries to augment GenAI applications’ capabilities, such as performing mathematical calculations, rendering reports, and creating SQL queries against specific databases.

          This evolution is crucial as it enables applications to explore areas like intelligent decision-making, rules execution, and product recommendations. The goal is twofold: first, to enhance enterprise context retrieval, and second, to augment GenAI with AI and other APIs to deliver a superior customer experience.

          5) Establish an alerting and workflow process for missing information

          To automate continuous monitoring of processes and workflows, it’s essential to integrate systems for alerts and monitoring. Establishing a monitoring and alerts system allows for the oversight of data quality and completeness, promptly notifying teams of any anomalies or gaps.

          Once alerts are triggered, workflow automation is used to respond efficiently, with predefined workflows in place to address and rectify identified data issues. This ensures timely and effective resolution of data quality problems.
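A minimal sketch of this monitoring loop, assuming the mandatory-section standards described earlier: scan each document for required sections and open a predefined remediation workflow item for every gap. The section names and the ticket shape are illustrative.

```python
# Sketch of the monitoring-and-alerts pass: flag documents missing the
# mandatory sections and emit a remediation ticket for each. The section
# list and workflow fields are invented for illustration.

MANDATORY_SECTIONS = ["intended audience", "security permissions", "change audit trail"]

def find_gaps(doc: dict) -> list[str]:
    """Return the mandatory sections missing from a document."""
    present = {s.lower() for s in doc["sections"]}
    return [s for s in MANDATORY_SECTIONS if s not in present]

def run_monitor(docs: list[dict]) -> list[dict]:
    """One monitoring pass: emit a workflow ticket per incomplete document."""
    tickets = []
    for doc in docs:
        gaps = find_gaps(doc)
        if gaps:
            tickets.append({
                "doc_id": doc["id"],
                "missing": gaps,
                "action": "notify_steward",  # predefined remediation workflow
            })
    return tickets
```

Run on a schedule, a pass like this keeps the alert queue current, so stewards see data-quality gaps as they appear rather than during periodic audits.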

          Given that this is an ongoing effort, there is a pressing organizational need to keep the data fresh, up-to-date, and complete with the highest level of quality. This dedication to data integrity ensures that users receive the best possible information when they need it.

          Bringing it all together

          While financial services firms have long excelled in managing structured data within relational databases, the advent of GenAI has opened up transformative possibilities for handling unstructured data. This evolution is crucial for enhancing knowledge management and search capabilities across enterprise processes and workflows. Managing unstructured data poses unique challenges, including preventing document duplication across various storage platforms and ensuring data accuracy and completeness.

          To overcome these challenges, the problem needs to be approached from multiple fronts:

          • Information Stewards ensure data quality, security, compliance, and continuous improvement of AI performance.
          • Classification algorithms and metadata management ensure data quality and adherence to standards.
          • Combining structured and unstructured data with vector databases and RAG architecture improves search accuracy.
          • Incorporating hybrid vector and semantic search, ranking fusion, and API integration further refines search precision.
          • Monitoring and alert systems with automated workflows maintain data quality and completeness.

          By addressing these challenges from multiple fronts and leveraging advanced AI techniques, financial services firms can unlock the full potential of their data, driving superior decision-making and operational efficiency.

          Want to learn more?

          Check out the latest reports from the Capgemini Research Institute, packed with cutting-edge insights on Generative AI. Explore topics such as Turbocharging software with GenAI, Harnessing the value of GenAI, and Why consumers love GenAI.


          Meet our expert

          Shankar Ramanathan

          Senior Director, AI & Machine Learning, Financial Services Insights & Data

          Vishal Bhalla

          Senior Director, Portfolio Lead, Financial Services Insights & Data

            Welcome to where imagination transforms everything, at Power Platform Community Conference 2024

            Vivek Desai
            Aug 27, 2024

            Technology advances in large leaps, so having a finger on the pulse of technology is critical – but keeping up can be a struggle. With that, I’m truly excited for the third annual Power Platform Community Conference (PPCC) 2024.

            The PPCC is a prime opportunity to learn how to leverage the latest innovations in Microsoft Power Platform powered by generative AI (Gen AI). You’ll gain access to more than 150 sessions and keynotes, along with 20 hands-on workshops, and opportunities to connect with and gain insights from Microsoft thought leaders, product experts and developers, MVPs, and peers.

            Our booth will be hosting immersive industry demos, live podcast episodes, and speaking sessions with our clients exploring how Capgemini and Microsoft Cloud transform businesses like yours every day. I invite you to join me there for customized insights and transformative opportunities, including:

            • Art of the Possible: Demos showcasing the power of Microsoft Power Platform and Copilot Studio.
            • Tailored Offerings: Solutions for diverse business needs to help you drive adoption and realize value from Microsoft Power Platform, Copilot Studio, Dynamics 365, and Azure AI.
            • Success Stories: Hear real-world examples of our solutions in action.
            • Expert Consultations: Talk to our SMEs about how financial services customers are leveraging our solutions.

            Our expert sessions

            Unleashing Innovation & Digital Transformation with Power Platform and Copilot 

            In this dynamic panel discussion, industry leaders from the financial services sector will share their transformative journey with Power Platform and Copilot. Discover how they harnessed low-code development, automation, and AI-driven insights to revolutionize employee experiences, drive innovation, and foster creativity within their organizations.

            Empowering Financial Services: Build and Modernize Enterprise Applications using Power Platform  

            Join this session to learn how Power Platform disrupts traditional enterprise app development in financial services. Attendees will learn about the shift from monolithic systems and legacy tech stacks to agile, low-code development, enabling faster innovation.

            If you’re attending, please send me a message on LinkedIn, or find me at booth #113. Let’s discuss the limitless possibilities in the world of cloud technology.

            Meet our expert

            Vivek Desai

            Global Head of FCC Hyperscaler – Center of Excellence
            Vivek is a global leader for Microsoft COE and Business Group at Capgemini Financial Services, responsible for hands-on solution architecture, strategy, and engineering at scale for complex cloud transformation initiatives across various sectors. Additionally, Vivek leads global go-to-market and new business growth initiatives, driving budgets and KPIs to deliver profitable business growth while excelling in solution, architecture, and engineering delivery.

              Insuring the future with a payer-provider partnership

              Capgemini
              10 September 2024

              New technologies and regulations make collaboration more valuable than ever

              In brief:

              • New technologies and regulations are changing the healthcare landscape.
              • For proactive health payers, these changes carry immense opportunity.
              • By partnering with providers and leveraging new technologies, healthcare payers can unlock new value.

              New technologies and evolving regulations present opportunities for healthcare payers, and also highlight the need for closer collaboration with providers. Many of the current challenges in healthcare – including simple and transparent payments, consistent quality of care, and data standardization – could be improved if payers and providers had access to the same information. Recently, the Healthcare Financial Management Association (HFMA) cited collaboration between payers and providers as essential. By reducing administrative friction and breaking down information silos, payers and providers can reap the full benefit of the changing healthcare landscape, and further their common goal of delivering high quality care to patients.

              Here are four key technological and regulatory changes, and the value that could be gleaned from closer collaboration between insurers and providers:

              1. Technology advancements create valuable patient data and personalized care

              Advancements in technology such as electronic health records (EHR), wearable devices, and remote patient monitoring (RPM) are expanding the capabilities of personalized care. These are giving rise to new innovations such as Google AI tools that offer a non-invasive, scalable, and cost-effective way to predict cardiovascular risks using retina scans, enabling early detection, personalized treatment, and broader access to care. Current barriers to reaching patients earlier can be overcome by responsibly sharing the data these new technologies produce, in compliance with data and privacy regulations. This will enable payers to collaborate with providers to play an active part in early detection, better define insurance plans, process payments more quickly, and deliver better care before issues progress.

              2. Generative AI enables efficiencies for payers and providers alike

              Generative AI (Gen AI) has opened a new frontier helping payers automate claims, assess risks, personalize coverage, and support members through chatbots and virtual assistants. For providers, Gen AI is being used in various areas, including supporting clinical decisions, automating routine administrative tasks, and educating patients.

              Gen AI has started helping both payers and providers reduce operational costs, streamline processes, and bring efficiencies. However, for members to fully benefit from these innovations, challenges like system integration, data privacy, and security must be addressed. Investments in new technologies can break down data silos and improve information sharing between payers and providers.

              3. Changes in Medicare Advantage (MA) create opportunities

              Although current enrollments are concentrated between two MA providers with a combined share of 47%, there may be an opportunity for smaller payers to bite off a bigger share of the market. A report from the Kaiser Family Foundation found that 40% of MA beneficiaries underutilized their benefits in 2023. Payers that encouraged customers to better take advantage of those benefits could be rewarded with growth. Also, the pie is growing for all payers. The Congressional Budget Office (CBO) projects that the share of all Medicare beneficiaries enrolled in Medicare Advantage (MA) plans will rise from 54% to 64% by 2034.

              Considering these developments, MA plan providers are revisiting their strategies to take advantage of this potential growth. Doing so should also improve benefit design and transparency in sharing data with CMS.

              4. Regulations for coverage transparency and authorization wait times

              In 2024, the health payer industry will undergo significant regulatory changes, focusing on price transparency. Healthcare payers are at various stages of adopting the Transparency in Coverage Rule and the No Surprises Act, both of which are central to these transparency efforts.

              The Medicare Advantage and Part D Final Rule will introduce policy updates affecting marketing, prior authorization, and network adequacy. Payers must also adapt to the CMS Advancing Interoperability and Improving Prior Authorization Processes Final Rule, which emphasizes the integration of system functions and coordination across the healthcare ecosystem. These rules address weaknesses in prior authorization processes by:

              • Requiring payers to issue decisions within 72 hours for expedited requests, and seven days for standard requests to reduce urgent care wait times, starting in 2026.
              • Mandating adoption of the HL7 Fast Healthcare Interoperability Resources (FHIR) Prior Authorization API, which will automate authorizations, thereby boosting efficiency.
              • Requiring payers to publicly report prior authorization metrics, including denial rates and reasons.
              • Requiring payers to upgrade their patient access API to include prior authorization data and implement a provider access API by January 2027.

              This will streamline, automate, and bring transparency to the prior authorization process, dramatically reducing patient wait times.
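To make the decision windows concrete, the sketch below computes the latest allowed decision time for a prior authorization request under the rule's headline deadlines (72 hours for expedited requests, seven days for standard requests, starting in 2026). It deliberately ignores extensions and other edge cases the rule also covers.

```python
# Illustration only: the two headline decision windows from the CMS prior
# authorization rule. A real system would also handle extensions and
# payer-specific business rules.

from datetime import datetime, timedelta

def decision_deadline(received: datetime, expedited: bool) -> datetime:
    """Latest allowed decision time for a prior authorization request."""
    window = timedelta(hours=72) if expedited else timedelta(days=7)
    return received + window
```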

              Payers should go beyond the mandate and embrace interoperability

              The CMS Advancing Interoperability and Improving Prior Authorization Processes Final Rule should not be limited to prior authorization only. The healthcare payer of tomorrow should treat this as a step towards enhancing the interoperability of healthcare data across systems, improving the transparency and efficiency of all processes, and ultimately ensuring better coordination of care.

              The FHIR standard enhances healthcare data exchange and integration, and while most healthcare payers have taken steps towards adopting it, few are benefitting fully. To gain the most value from interoperability will require:

              • Embracing cloud-based solutions for scalability and real-time access
              • Standards compliance and governance
              • Implementing patient-centric interoperability through APIs.

              New technologies are worth the investment

              For healthcare payers that keep pace with new technologies, these changes represent an opportunity. To support API-based secure data exchange and governance, payers will need to update core administrative systems. Investments in integrated data analytics, predictive modeling solutions, and Gen AI are also crucial for delivering accurate, personalized, real-time information to members.

              The health payer ecosystem must be modular to allow for flexible data sharing with external entities, including information about plans, pricing, coverage, members and compliance, as well as analytics derived from the same. Establishing standards, robust auditing, rigorous testing, and regular monitoring is essential for seamless data exchange and governance.

              There’s work for providers too. By implementing EHR and interoperability solutions, providers will improve clinical workflows, enhance personalized care plans, and improve patient engagement, ultimately resulting in superior service delivery and coverage.

              By focusing on these measures, payers and providers can drive operational excellence, creating a more efficient, responsive, and cost-effective healthcare system.

              “We will explore in detail how leveraging digital tools, data analytics, and AI can deliver operational excellence in the health insurer’s complex provider management space. Join us in the next chapter to understand how regulatory changes, value-based benefit plans, and industry changes will impact your future organizational goals, and to create a proactive roadmap. Stay tuned for our next blog.”

              A quiet revolution is underway for semiconductors

              Ravi Gupta
              May 3, 2024

               
              Softwarization for Semiconductors 

              At the heart of our era’s digital transformation—powering everything from satellites to energy-saving lights, from our connected world to the GPS we use to get away from the world for the weekend—all the changes we’ve lived through come down to tiny, intricate semiconductor chips. Today, the semiconductor industry is undergoing a quiet revolution, one that will have profound effects on the digital world. 

              In this article, we’ll explore the opportunities facing chip companies at this turning point, along with some of the challenges along the way. We’ll answer the questions:

              • What’s driving this softwarization shift?
              • What opportunities will this give rise to?
              • How has this transformation affected other industries?
              • How can we take full advantage of these changes?
              • How will we know if we’re on the right track?

              Market opportunities and present approach 

               The first question is, what horizon are we looking at? If your core mission is simply to manufacture the best chips possible, your horizon will be different than that of a company whose mission is, for example, to support its customers’ needs. Whatever your mission may be, expect opportunities everywhere.  

              More custom chips… More custom applications  

              More custom everything, in fact, and chips are no exception. Multiple trends are driving the need for more specialized chips, often called Application-Specific Integrated Circuits (ASICs). This includes demand across industries, from automotive, life sciences, healthcare, and telecom to entertainment, space, manufacturing, and defense – and the list goes on!

              It all starts from the need for semiconductor companies to pursue the strategy of expanding into several adjacent industries for these crucial reasons: 

              • Diversification of revenue streams: Diversifying into multiple industries reduces dependence on a single market, thereby mitigating risks of market volatility. 
              • Leveraging core competencies: Semiconductor companies can capitalize on their existing strengths and technologies to offer solutions in related industries, maximizing the ROI of their research and development investments. 
              • Growth opportunities: Adjacent markets often provide significant growth opportunities, especially as emerging technologies and trends (like IoT, AI, or 5G) create new demands and opportunities (R&D) across various sectors. 
              • Economies of Scale: Operating across multiple markets can lead to economies of scale in manufacturing and R&D, reducing costs per unit and increasing overall efficiency. 

              In other words, semiconductor companies (a.k.a. “Semicoms”) create customized hardware tailored to specific industry solutions. This involves designing and manufacturing chips with particular features and capabilities that cater to the needs of different sectors such as automotive, healthcare, or telecommunications. This is what we call a “hardware-centric” approach.

              In the HW-centric approach, the focus is on creating a product that meets the industry’s unique hardware needs first. The primary value lies in the physical chip’s capabilities, with software playing a supporting but important role that is essential to bring out the full potential of the hardware: it is the intermediary that makes the hardware accessible and useful to the overall system by providing the necessary interfaces, controls, and customizations.

              Then, what’s the challenge with this approach? 

              1. Long lead times. Time-consuming design and manufacturing processes due to the complexity of custom HW – often leading to missed time-to-market targets.
              2. Rigid solutions/fixed functionality. HW with fixed functions designed for specific tasks within an industry, with limited flexibility for updates or changes once the HW is deployed.
              3. High HW costs. Significant investment in design, prototyping, and fabrication of industry-specific HW.
              4. Software (SW) costs still constitute a significant portion (up to 40%) of the overall budget, yet there is no effective corresponding revenue model.
                • SW is seen as an essential part of the HW, expected to be included in the purchase price of the chip.
                • SW is viewed only as a cost – the cost of doing business, necessary to make the HW operable and appealing to customers, rather than as a product or service that could be sold independently.
                • SW updates, bug fixes, and support are usually provided as part of post-sale services with no additional charges.
                • SW customization possibilities are limited by the HW – hence there are very few opportunities for additional SW-based revenue.

              So, for a semiconductor company that needs to pursue a strategy of expanding into several adjacent industries while still leveraging a HW-centric approach, the undesired challenges can be summarized as follows:

              1. Costly and time-consuming design and production. Today’s business model necessitates designing and manufacturing a diverse array of chips, each tailored to specific industry requirements. This process is not only expensive but also time-consuming, involving extensive hardware and software development for each unique industry solution. The high cost of design and production is a major concern, especially given the risk of missing crucial market deadlines.
              2. Software development as a cost center. Despite the considerable investment (nearly 40% of the overall budget) in software development to support these industry-specific solutions, it doesn’t translate into direct revenue generation. Software, in this model, is a cost center rather than a profit center.
              3. Rigidity and lack of adaptability. The solutions the industry offers today are inherently rigid. They come with fixed functionalities, which means any significant change in industry requirements or standards necessitates a new round of costly and time-consuming chip development. This lack of flexibility in chip offerings limits the ability to adapt to evolving market needs without incurring substantial expenses and facing the same risks.
              4. Scalability challenges. Scaling production up or down to meet fluctuating market demands is a major hurdle in current operations. With each industry requiring distinct chip designs, rapid adjustment of production volumes becomes complex and costly, limiting the ability to respond to market dynamics efficiently.
              5. Environmental concerns. The current approach also raises sustainability issues, as the frequent development of new hardware increases material use and waste. This conflicts with global environmental sustainability trends, pushing the industry to consider more eco-friendly production methods.

              Some additional considerations include: 

              • Supply chain dependencies. Reliance on a complex supply chain for diverse hardware production is a vulnerability, particularly in times of unprecedented disruption. 
              • Inventory and logistics complexities. Managing a broad spectrum of customized chips leads to intricate inventory and logistical challenges. 
              • Rapid obsolescence: The pace of technological advancement can quickly make our hardware obsolete, demanding continual innovation. 

              Moving from hardware-designed to Software-defined 

              Softwarization for semiconductors describes the desired future, based on the adoption of a “software-centric” approach.

              In a software-centric approach, the objective is to develop a more standardized, limited array of base chips that can be customized for various industries and solutions through software. The idea is to reduce the number of unique hardware designs and instead leverage software to provide industry-specific functionalities – moving the “logic” from silicon to software: the base hardware is standardized and simplified, and the software layered on top provides the industry-specific customization.
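The shift can be pictured as a single standardized base chip whose industry-specific behavior is switched on in software. The chip name, feature set, and industry profiles below are invented purely to illustrate the pattern.

```python
# Conceptual sketch of the software-defined model: one base chip, with
# industry-specific behavior enabled as software features. All names here
# (BC-100, the features, the profiles) are hypothetical.

BASE_FEATURES = {"crypto_accel", "dsp", "camera_isp", "can_bus", "low_power_radio"}

INDUSTRY_PROFILES = {
    "automotive": {"can_bus", "crypto_accel", "dsp"},
    "healthcare": {"low_power_radio", "crypto_accel"},
    "telecom": {"dsp", "crypto_accel"},
}

def configure_chip(industry: str) -> dict:
    """Enable one industry's software feature set on the shared base chip."""
    wanted = INDUSTRY_PROFILES[industry]
    unsupported = wanted - BASE_FEATURES
    if unsupported:
        raise ValueError(f"base silicon lacks: {sorted(unsupported)}")
    return {
        "base_chip": "BC-100",  # hypothetical standardized part
        "industry": industry,
        "enabled": sorted(wanted),
        "disabled": sorted(BASE_FEATURES - wanted),
    }
```

The point is that adding a new industry means adding a profile (software), not taping out a new chip (hardware) – which is exactly the economics the software-centric approach aims for.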

              Finally, the software-defined custom silicon is validated against industry frameworks for performance and functionality. The integrated solution (hardware plus software) still needs to meet the stringent performance and functionality standards of specific industries.

              An “industry framework” in the context of semiconductor products and software refers to a set of standards, regulations, guidelines, or specifications that have been established by industry groups, regulatory bodies, or standard-setting organizations. These frameworks are designed to ensure that products and services meet certain levels of quality, performance, safety, compatibility, and interoperability within a specific industry. It can include (but is not limited to):

              • Technical Standards: Specifications for product design, materials, processes, and performance. For example, in telecommunications, standards like 3GPP or IEEE define how devices should communicate and interoperate.
              • Industry-specific software protocols: For software, frameworks might include coding standards, architectural guidelines, and protocols that are widely accepted in specific industries.
              • Compliance checklists: In some industries, there are comprehensive checklists or guidelines that products must adhere to for legal or market access reasons.
              • Interoperability guidelines: Standards ensuring that products from different manufacturers can work together seamlessly, common in areas like home automation (e.g., Zigbee or Z-Wave standards) or data technology (e.g., USB or Bluetooth standards).
              • Quality certifications: Benchmarks for product quality and reliability, such as ISO 9001 for quality management systems or the Automotive Quality Standard IATF 16949.
              • Security protocols: In industries where data security is paramount, like finance or healthcare, there are specific standards for data protection and cybersecurity (e.g., HIPAA for healthcare data in the U.S., or PCI DSS for payment card security).

              Advantages of Softwarization

              Our Desired Future is to gain the following advantages

              1. Efficient and cost-effective design and production: Adopting a software-centric approach, the semiconductor company streamlines its business model by developing a limited array of versatile base chips. These chips can be customized for various industries through software, significantly reducing the cost and time involved in hardware development. This approach allows for quicker iterations and a more efficient production process, effectively addressing the risk of missing market deadlines.
              2. Software development as a revenue generator: In this future model, software development transitions from being a cost center to a key revenue stream. By offering customizable software solutions, feature upgrades, ongoing service subscriptions, and a robust ecosystem for third-party applications, semiconductor companies can monetize their software development efforts more effectively. This ecosystem approach not only allows for direct revenue generation through licensing and platform fees, but also enhances the value proposition of their products, creating a ‘platform effect’ that attracts more users and developers, thereby expanding market reach and creating new revenue opportunities.
              3. Flexibility and adaptability: The ability to update and customize software for different industry requirements means semiconductor companies (Semicoms) can adapt to market changes swiftly, without the need for time-consuming and expensive hardware redevelopment. This adaptability allows them to respond rapidly to evolving industry needs.
              4. Enhanced scalability: The standardized hardware base in a software-centric approach simplifies scaling production to match market demand. The need for distinct hardware designs for each industry is eliminated, making it easier and more cost-effective to adjust production volumes, enhancing Semicoms’ ability to respond efficiently to market dynamics.
              5. Sustainable and environmentally friendly: This future-focused approach aligns Semicoms with global environmental sustainability trends by significantly reducing the frequency of new hardware development. The focus on software updates and longer-lasting hardware reduces material use and waste, promoting a more eco-friendly production model.

              Additional Considerations:

              • Reduced supply chain dependencies: Reliance on a complex supply chain is diminished as the demand for diverse hardware production decreases. This shift reduces Semicoms’ vulnerability to supply chain disruptions, creating a more resilient business model.
              • Simplified inventory and logistics: By minimizing the variety of customized chips, inventory and logistics management becomes more straightforward, reducing operational complexity and cost.
              • Slower technological obsolescence: With a software-centric approach, the lifecycle of hardware is extended. The ability to continually update and adapt the software reduces the pressure of rapid hardware obsolescence, allowing for sustained innovation and relevance in the market.

              Overall

              Transitioning to this futuristic software-centric approach transforms key aspects of operations, positioning a Semiconductor company to be more agile, cost-effective, environmentally conscious, and capable of generating new revenue streams.

              As a direct consequence of this software-based transformation, a semiconductor company can offer tailored products and services to its industry customers, thereby enhancing its value proposition and increasing its differentiation in the market.

              In essence, for semiconductor companies, the software-centric model means access to cutting-edge, customizable technology solutions that are sustainable, cost-effective, and come with comprehensive support, all of which are crucial for staying competitive in today’s fast-paced market.

              To view Capgemini’s approach and point of view on softwarization for semiconductors, see below.


              Author

              Ravi Gupta

              Senior Director – Semiconductors Tech & Digital Industry
              Ravi brings over 30 years of experience in IT and high-tech. Prior to joining Capgemini, he worked at Intel for 25 years, where he held various leadership roles in Systems Engineering, Platform Validation, Presales, and Business Development. At Capgemini, Ravi is chartered to work with the global semiconductor industry to recognize new technology trends and to partner closely with Capgemini Engineering on developing capability offers, thought-leadership content, and account-specific GTM functions. Ravi holds a Bachelor’s degree in Engineering from the University of Mumbai, specializing in microprocessor design, and has earned many industry certifications in technical and business management streams.

                How Gen AI will redefine the chip design process

                Ravi Gupta
                Jun 13, 2024

                Simulated Futures – how generative AI enables the semiconductor industry to rethink its ways to produce semiconductor chips.

                In this article, we build on the ways Gen AI will revolutionize the semiconductor chip design and manufacturing process, as outlined in the first article of this series (Part 1 – How Gen AI will revolutionize the semiconductor industry). Here we explore current applications (Now Technology) as well as forthcoming possibilities (Near Technology).

                Gen AI encompasses a diverse array of applications throughout the silicon design and manufacturing flows, ranging from ML-driven architecture exploration to automated RTL coding, test suite generation, floorplan optimization, bug tracking, and predictive modeling for manufacturing processes. These applications aim to shorten design cycles, increase productivity, reduce costs, and improve overall product quality, aligning with the industry’s goal of continuous improvement.

                While Electronic Design Automation (EDA) companies have made strides in AI integration, Gen AI propels further advancements by addressing holistic data interdependencies across design and manufacturing workflows, while emphasizing IP protection and data security. Indeed, the data generated throughout the design phase is a gold mine that every silicon company would want to keep tightly protected. Unlike traditional tool-centric approaches, Gen AI focuses on comprehensive solutions that recognize the intricate relationship of data across all design stages and unlock new opportunities for innovation and efficiency.

                The integration of artificial intelligence into semiconductor design processes through Gen AI empowers continuous improvement to optimize efficiency, enhance product quality, and transform competitiveness in the industry. By fostering collaboration between AI specialists and semiconductor SMEs, companies can unlock the full potential of Gen AI, to drive innovation and shape the future of silicon design.

                Innovating the chip design with the AI revolution (Now Technology)

                The semiconductor chip design process is complex, involving various stages: system specification, architectural design, functional design, logic design, circuit design, and physical design and verification prior to manufacturing. The chip design process requires a delicate balance of Performance, Power, and Area (PPA) while also adhering to stringent design rules. This means semiconductor companies must follow an iterative process to optimize designs.

                The adoption of Gen AI into chip design processes yields numerous benefits, including competitive differentiation, innovation opportunities, and the creation of valuable IP assets through collaborative ventures.

                When we look at today’s technology (Now Technology), ready solutions are available to streamline this complex process and improve productivity. These tools pre-empt bugs with root-cause analysis, significantly reducing iterations; cutting the utilization of costly compute-farm resources while minimizing design errors leads to improvements in overall chip design quality. Some of the commonly available solutions are:

                1. Synopsys DSO.ai™ (Design Space Optimization AI) searches for optimization targets in the large solution spaces of chip design, utilizing reinforcement learning to enhance power, performance, and area (PPA).
                2. Cadence Verisium™ offers AI-enhanced verification, debugging, and testing capabilities. The platform optimizes verification workloads, boosts coverage, and accelerates root-cause analysis of bugs.
                3. In early 2023, Synopsys™ launched a full-stack AI-driven EDA (Electronic Design Automation) suite that employs AI across architecture, design, and manufacturing stages to automate many tedious and repetitive tasks, freeing engineers’ time for enhancing chip quality.
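To make the design-space-exploration idea concrete, here is a minimal Python sketch. The cost model, parameter ranges, and random search below are invented purely for illustration; commercial tools such as DSO.ai evaluate candidates with real synthesis and place-and-route runs and use reinforcement learning rather than random sampling.

```python
import random

# Toy PPA cost model, invented for illustration only: real tools score
# candidate configurations with synthesis/place-and-route, not a formula.
def ppa_cost(cfg):
    power = cfg["vdd"] ** 2 * cfg["freq_ghz"]       # dynamic power ~ V^2 * f
    perf_penalty = max(0.0, 2.0 - cfg["freq_ghz"])  # penalize missing 2 GHz
    area = cfg["cells"] / 1000.0                    # area proxy
    return power + 5.0 * perf_penalty + area

def random_search(n_trials=2000, seed=0):
    """Sample the design space and keep the lowest-cost configuration."""
    rng = random.Random(seed)
    best_cfg, best_cost = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "vdd": rng.uniform(0.6, 1.1),       # supply voltage (V)
            "freq_ghz": rng.uniform(1.0, 3.0),  # target clock frequency
            "cells": rng.randint(500, 5000),    # standard-cell budget
        }
        cost = ppa_cost(cfg)
        if cost < best_cost:
            best_cfg, best_cost = cfg, cost
    return best_cfg, best_cost

best, cost = random_search()
print(best, cost)
```

The point of the sketch is the loop structure: propose a configuration, score its PPA trade-off, keep the best. AI-driven tools replace the random proposal step with a learned policy that steers exploration toward promising regions.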

                Thanks to AI, these tools enhance the semiconductor chip design process by assisting chip designers in creating, verifying, and optimizing designs faster and with better quality. This shift is transformative for the semiconductor industry, helping it overcome the challenges of ever-increasing complexity and design cost.

                Innovating the chip design with generative AI revolution (Near Technology)

                Semiconductor chip designs are becoming increasingly complex in the face of ever-emerging compute-intensive applications, propelling the need for advanced-node chips. Chip design complexities have also continued to increase over the last few years, posing significant challenges for engineers. The process requires incredible attention to detail, as a single mistake can be costly. This is where generative AI can play a transformative role.

                Generative AI encompasses the collection and preprocessing of data from prior designs, the training of machine learning models, and their seamless integration into design flows for optimization. This shift requires designers to adopt new, data-driven workflows. The transformative approach is showcased by the introduction of Gen AI-powered design Co-Pilots by several companies. These Co-Pilots streamline processes, leveraging existing design data, including requirements, specifications, IP details, and bug histories, to offer personalized guidance to designers and engineers. They excel at scenario exploration, rapidly evaluating design alternatives and mitigating risks. Through continuous learning from historical data and user feedback, these Co-Pilots evolve into tailored tools, accelerating the design process while enhancing product quality. They don’t just facilitate collaboration among design teams; they also serve as repositories of collective intelligence.

                For instance, Synopsys’s AI tool, Synopsys.ai Copilot, announced last fall, is intended to answer questions about how to use the company’s design tools and can create workflow scripts. It can also generate RTL, a form of chip design language that specifies chip architecture, from plain-English conversation.
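The retrieval-grounded pattern behind such co-pilots can be sketched in a few lines of Python. The corpus, document names, and keyword-overlap scoring below are invented purely for illustration; production co-pilots pair embedding models and vector stores with an LLM on top of the retrieval step.

```python
# Minimal sketch of the "design co-pilot" retrieval pattern: ground an
# assistant's answer in prior design data (specs, bug reports).
from collections import Counter

# Hypothetical design corpus; contents are invented for illustration.
DESIGN_CORPUS = {
    "spec_uart": "UART block supports 8N1 framing, baud up to 3 Mbps",
    "bug_1042": "FIFO overflow in UART RX path when baud exceeds 1.5 Mbps",
    "spec_pll": "PLL locks within 50 us, output jitter below 2 ps RMS",
}

def retrieve(query, corpus, k=2):
    """Rank documents by keyword overlap with the query; return top k ids."""
    q = Counter(query.lower().split())
    scored = []
    for doc_id, text in corpus.items():
        overlap = sum((q & Counter(text.lower().split())).values())
        scored.append((overlap, doc_id))
    scored.sort(reverse=True)
    return [doc_id for _, doc_id in scored[:k]]

hits = retrieve("UART baud overflow bug", DESIGN_CORPUS)
print(hits)  # the two UART-related documents rank first
```

The retrieved documents would then be placed into the model's prompt, which is how a co-pilot offers guidance that is personalized to a team's own requirements, IP details, and bug histories rather than generic knowledge.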

                Now, let us take this idea to the next level.

                Imagine a tool that can learn from the numerous existing designs of recent years: chip characteristics like performance, power, transistor size, process technology, and materials used. The tool could then use this data to prepare and recommend new chip design models for industry-specific use cases like ultra-low power, IoT (Internet of Things), and compute.

                Today, a compelling instance of the significant advances enabled by a high degree of automation in chip development is the proprietary design generator tool. These tools enhance efficiency, quality, and time-to-market. Engineers use them to delineate the parameters of a System-on-a-Chip (SoC) through a specifications framework based on standard applications like Excel. The tool then processes this specification, orchestrating the integration of semiconductor Intellectual Properties (IPs) according to the outlined criteria to construct the SoC. Design generator tools assist beyond integration, conducting comprehensive quality assessments and generating essential collateral for subsequent phases of development. They also significantly streamline the effort required for SoC Register Transfer Level (RTL) design.
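As a toy illustration of the spec-to-RTL idea above, the Python sketch below turns a tabular SoC specification, as might be exported from a spreadsheet, into a top-level RTL stub. The spec schema, IP names, and emitted Verilog are simplified inventions; real design generators handle buses, clock domains, parameter validation, and full IP metadata.

```python
# Hypothetical tabular SoC spec, as a design generator might read from Excel.
soc_spec = [
    {"instance": "cpu0",  "ip": "riscv_core", "params": {"XLEN": 32}},
    {"instance": "uart0", "ip": "uart",       "params": {"FIFO_DEPTH": 16}},
]

def generate_top(name, spec):
    """Emit a Verilog top-level stub that instantiates each IP in the spec."""
    lines = [f"module {name} (input clk, input rst_n);"]
    for block in spec:
        params = ", ".join(f".{k}({v})" for k, v in block["params"].items())
        lines.append(f"  {block['ip']} #({params}) {block['instance']} "
                     f"(.clk(clk), .rst_n(rst_n));")
    lines.append("endmodule")
    return "\n".join(lines)

print(generate_top("soc_top", soc_spec))
```

Even in this toy form, the value is visible: the integration boilerplate is derived mechanically from the spec, so a change to one spreadsheet row propagates consistently into the RTL, the checks, and the downstream collateral.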

                With the industry’s progression towards Generative Artificial Intelligence (Gen AI), a pressing need arises for existing tools to evolve and seamlessly incorporate a blend of automation, AI, and Gen AI capabilities. We believe the day is not far off when design generator tools will leverage Gen AI features to further augment their automation prowess. As an expected use case, such a tool could expand its capabilities to generate output code in various programming languages and align collateral generation with diverse formatting requirements for subsequent development stages. Better yet, we could use Gen AI to build better chips to run Gen AI.

                Additionally, as the adoption of chiplets grows, generative AI can help make design and verification processes more effective. The biggest things that separate chiplets from the SoC are the partitioning and the interconnect. Imagine a generative AI-based design tool that improves productivity when partitioning chiplets: by learning from various chiplet designs about performance, power, area, memory, and I/O characteristics, it could help optimize each silicon block, allowing designers to meet PPA target specs and the cost challenges of a design. It does leave us wondering, though: how might minor changes in the design affect a chiplet’s overall performance and characteristics? And can this be predicted from previous designs?
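To ground the partitioning idea, here is a deliberately tiny Python sketch that scores two-way chiplet partitions by area balance plus cut interconnect. The blocks, link widths, and cost weights are invented for illustration; real flows would use learned PPA models and full netlist data rather than brute-force enumeration.

```python
import itertools

# Hypothetical blocks (area units) and inter-block link widths, invented
# for illustration only.
blocks = {"cpu": 40, "gpu": 60, "mem_ctrl": 20, "io": 10}
wires = {("cpu", "mem_ctrl"): 8, ("gpu", "mem_ctrl"): 12,
         ("cpu", "io"): 3, ("gpu", "io"): 2}

def partition_cost(group_a):
    """Score one side of a two-way split: area imbalance + weighted cut."""
    group_a = set(group_a)
    area_a = sum(blocks[b] for b in group_a)
    area_b = sum(blocks.values()) - area_a
    cut = sum(w for (u, v), w in wires.items()
              if (u in group_a) != (v in group_a))   # crosses the split?
    return abs(area_a - area_b) + 2 * cut            # arbitrary weighting

# Brute-force all proper subsets as one side of the partition.
names = list(blocks)
best = min((set(c) for r in range(1, len(names))
            for c in itertools.combinations(names, r)),
           key=partition_cost)
print(best, partition_cost(best))
```

A learned model would replace both the invented cost formula (with predictions calibrated on prior chiplet designs) and the exhaustive search (with a guided proposal of candidate partitions), which is exactly where the productivity gain would come from.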

                Generative AI offers immense potential to revolutionize the analog and digital chip design process, starting by tackling repetitive tasks efficiently and offering designers predictive capabilities for future designs. But this journey will involve a great deal of learning.

                In the next series of articles, we will cover ways generative AI can simulate future demand to adjust the manufacturing process to plan, in advance, the indicators that will drive changes. We call it the Simulated Future.

                So, we will conclude this article with three questions for our readers:

                1. When will generative AI be integral and essential to the chip design process?
                2. What challenges in our current design methodologies could hinder the adoption of generative AI?
                3. And how do we plan to get there?

                Authors

                Ravi Gupta

                Senior Director – Semiconductors Tech & Digital Industry

                Sanjiv Agarwal

                Global Semiconductor Lead, Capgemini
                With about 30 years of experience in the TMT sector, Sanjiv is experienced in enabling digital transformation journeys for customers using best-of-breed technology solutions and services. In his current role as a global semiconductor industry leader, he works closely with customers on producing sustainable technology, driving the use of AI/ML, digital transformation, and the global supply chain.

                Mourad Aberbour

                CTO for Silicon Engineering at Capgemini Engineering
                Mourad Aberbour is the CTO for Silicon Engineering at Capgemini Engineering. He has over 25 years of experience in the silicon domain and has managed large silicon organizations across the globe (Europe, America, and Asia-Pacific) in his successful tenure. Previously, Mourad held technical and senior executive leadership positions at Texas Instruments, Intel Corporation, and AMD, and led teams that have delivered over 1B silicon units across many semiconductor businesses: wireless modems, phones, tablets, etc.

                  How Gen AI will revolutionize the semiconductor industry

                  Ravi Gupta
                  May 30, 2024

                  The semiconductor industry drives every industry in the world, yet the semiconductor value chain is complex as components travel more than 25,000 miles and cross 70 borders before completion.

                  During COVID-19, the semiconductor industry got a big boost due to changing consumer habits – driven by remote work, distance learning, gaming, and online shopping – which increased the demand for consumer electronic devices. This unprecedented growth in semiconductor demand resulted in increased supply chain complexities, forcing companies to deploy makeshift solutions across their supply chain and manufacturing processes to handle the various challenges.

                  Then Gen AI happened

                  The launch of ChatGPT in late 2022 propelled the technology industry to take a closer look at generative AI, a broad field of artificial intelligence, bringing Gen AI into the zeitgeist. Truth be told, as in many industries, Artificial Intelligence (AI) and Machine Learning (ML) have long been used in semiconductors, but the availability of large language models (LLMs) and foundation models encouraged companies to quickly realign their technology roadmaps to include generative AI for enterprise use. At most companies, the initial focus was, first, to find ways to improve customer experience via marketing content creation and, second, to deploy touchless solutions that respond to customer queries efficiently and accurately. But as generative AI gained significant visibility and popularity, it became clear to consumers and enterprises across all industries, including the semiconductor industry, that its potential is much deeper and wider.

                  We see that generative AI is poised to influence the entire value chain of the semiconductor industry. On one hand, it will uncover applications that drive up growth in chip demand. On the other hand, it will drive up demand for generative-AI-enabled chips for processing specialized applications. As more and more companies launch new ‘AI-enabled chips’, we expect Gen AI to also change the way companies do business, with process and task automation and intelligence along the entire value chain. We call this “Simulated Futures” because it brings a whole new ease and intelligence to design and simulation at every turn.

                  The substantial benefits

                  The adoption of Gen AI promises substantial benefits for semiconductor value chain companies, including streamlined design workflows, accelerated exploration of design alternatives or layout optimization, efficient bug tracking, predictive analytics for manufacturing optimization, simulating manufacturing and supply chain challenges, improved equipment utilization, and yield improvement. These advancements translate into tangible improvements in Time-to-Market (TTM), cost reduction, and overall product quality, offering a competitive edge in the market.

                  At the same time, integrating Gen AI offers semiconductor companies numerous Go-to-Market (GTM) benefits, including competitive differentiation, automated product specs documentation, innovation opportunities, and the creation of valuable IP assets through collaborative ventures. By leveraging Gen AI across the enterprise, companies can enhance productivity, profitability, and operational efficiency, thereby solidifying their position in the industry.

                  The immense opportunities on the horizon

                  In this multi-part thought leadership series, we will talk about ways Gen AI is revolutionizing the semiconductor value chain. We’ll explore the comprehensive integration of AI throughout the silicon design workflow, from conception to high-volume manufacturing, and look at how it underscores the critical collaboration between AI specialists and semiconductor Subject Matter Experts (SMEs) in realizing its full potential. We’ll explore how the future can now be simulated.

                  Read on as we share what we already see today (Now Technology) and predict the near future (Near Technology): ways semiconductor ecosystem companies can leverage Gen AI for intelligent products and services, operations, innovation, and customer experience.

                  Our next article will focus on How Gen AI will Revolutionize the Chip Design Process.

                  Authors

                  Sanjiv Agarwal

                  Global Semiconductor Lead, Capgemini

                  Ravi Gupta

                  Senior Director – Semiconductors Tech & Digital Industry

                  Mourad Aberbour

                  CTO for Silicon Engineering at Capgemini Engineering