
Does generative AI dream of building apps? The disruptive potential of AI-enabled software engineering  

Terry Room
23 May 2023

Like many, I have been keenly watching the recent developments in AI, with the advent of next-generation large language models like ChatGPT. I recently attended a CTO roundtable event where we discussed the transformative power of AI.

One domain we centered on was software engineering and the disruptive potential of AI assistive tools such as GitHub Copilot and Amazon CodeWhisperer. 

Some of the questions that drove the debate included: 

How will these tools make the creation and support of software more efficient? 
Will we need fewer developers in the future? 
Will the lines between no-code, low-code and custom-code apps continue to blur? 
What governance should be in play, and what risks must we manage? 

Here are some reflections and follow-on thoughts. 

The magic of software 

Arthur C. Clarke, author and futurist, famously wrote:  

“Any sufficiently advanced technology is indistinguishable from magic.” 

Initial interactions with large language models (LLMs) such as ChatGPT can feel magical – a far cry from Googling and trawling through links to attempt to stitch together the required information to solve a particular problem. The “magic” taps deeply into our consciousness, into the natural language processing areas of our brain, the core of our cognition and how we communicate with the outside world. Interaction with “talking machines” which seem indistinguishable from a human being has simultaneously and rapidly captured our collective imagination and concern.  

As a sense check of the state of the art, I think back to the rise of natural language interfaces and bots a few years ago. In some cases, a useful and valuable means of enabling channel shift and accessibility – in others, an added source of digital frustration. By comparison, the believability of the new generation of language models is on a whole new level. 

For software engineering, seeing a machine generate code can feel even more magical. Especially when compared to the practices of the founders and pioneers of electronic computing (when bugs were actual bugs!). Prompt-based engineering approaches stand in stark contrast to punching holes in cards and then having to wait a day to be scheduled to see if your program ran without errors. 

But the hype around generative AI when applied to app creation requires closer inspection. And a good place to start is to look at some of the current offerings available.  


GitHub Copilot  

Copilot provides code suggestions based on prompts and code context. Recent updates have filtered out insecure recommendations such as hard-coded credentials and SQL injection vulnerabilities. GitHub’s research claims that the acceptance rate of recommended code has increased from 27% to 35% with these changes (with variance by language). The brand positioning is also worth taking note of – it is a “copilot” and not an “autopilot” (more on this later).  
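
To make that interaction pattern concrete, here is a minimal sketch of the prompt-and-suggest flow – the scenario and code are invented for illustration, not actual Copilot output:

```csharp
public static class PricingExample
{
    // The developer writes an intent-revealing comment and signature:
    // calculate an order total, applying a percentage discount when the
    // subtotal exceeds a threshold.
    public static decimal CalculateTotal(decimal subtotal, decimal discountPercent, decimal threshold)
    {
        // A Copilot-style tool would suggest a body like this, inferred
        // from the comment, the names, and the surrounding code context.
        if (subtotal > threshold)
        {
            return subtotal * (1 - discountPercent / 100m);
        }
        return subtotal;
    }
}
```

The decision to accept, adapt, or reject the suggestion – and accountability for it – stays with the developer, which is exactly the point of the “copilot” framing.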

CodeGPT is an interesting Visual Studio Code add-in, which has a rich set of generative features such as:  

  • Get Code (generates code based on natural language) 
  • Ask Code (ask any generative question – e.g., “write me C# which validates an email address using a regex”; see the sketch after this list) 
  • AskCodeSelected (ask a generative question of any selected code – e.g., “generate .md file with markdown” or “convert this to Python”) 
  • Explain 
  • Refactor (refactors code, with recommendations to reduce dependencies, improve readability, etc.) 
  • Document 
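
For the “Ask Code” example quoted above, the generated answer might plausibly look like the following – a sketch of typical output rather than a verbatim CodeGPT response (the class and method names are invented):

```csharp
using System.Text.RegularExpressions;

public static class EmailValidator
{
    // A deliberately simple pattern: one '@', no whitespace, and a dot in
    // the domain part. A human reviewer still owns the decision of whether
    // this is strict enough for the use case at hand.
    private static readonly Regex EmailPattern =
        new Regex(@"^[^@\s]+@[^@\s]+\.[^@\s]+$", RegexOptions.Compiled);

    public static bool IsValidEmail(string input) =>
        !string.IsNullOrWhiteSpace(input) && EmailPattern.IsMatch(input);
}
```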

Interestingly, it plugs into a number of LLM providers (OpenAI, Cohere, AI21, Anthropic). 

Visual Studio IntelliCode 

This is the AI-enabled next generation of IntelliSense. IntelliCode suggestions are based on the context of the code you are writing, not just an indexed list of overloads. It also has features for enforcing stylistic consistency across your code, and it will even flag potential typos, such as mistakes in variable use. Also of note is that it runs locally on your dev stack and does not call out to cloud-hosted APIs – a key requirement for highly secure code creation processes in regulated industries such as financial services and health.  
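
To illustrate the “mistakes in variable use” category, here is a contrived example of the kind of copy-paste slip such analysis is designed to catch:

```csharp
public static class BoundsCheck
{
    public static bool IsInside(int x, int y, int width, int height)
    {
        // Bug: 'x' is tested against 'height' in the final clause, where
        // 'y' was almost certainly intended – exactly the kind of
        // variable-use mistake an IntelliCode-style analyzer can flag.
        return x >= 0 && x < width
            && y >= 0 && x < height; // should be: y < height
    }
}
```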

Amazon CodeWhisperer 

Similar in intent to Copilot, providing contextual code-completion suggestions. 

CodePal 

https://codepal.ai/

An interesting service which offers various code generators, by language, as well as translators, unit test writers, query writers, schema resolvers, and security analyzers.

We’ve also seen generative AI being included in no-code/low-code platforms. Microsoft Power Apps, for example, provides an OpenAI-powered natural language interface (“create a workflow…”, “build a table…”) to take even more of the toil away from building this type of app, further blurring the boundaries between no-code, low-code, and custom. 

Evolution vs Revolution 

A little perspective can be helpful sometimes. 

Programming has evolved considerably from the first days of electronic computing to harness increasingly powerful hardware and has been applied to create an increasingly complex array of applications. We’ve gone from punch card mainframe systems to highly complex distributed systems.  

We evolved to client-server, to cloud, and from monoliths to distributed API-enabled microservice architectures.  

The tools, skills, processes, and practices for the successful creation of these systems have had to keep pace – in fact, it can be argued they have set the pace. Innovations in tooling and practices have gone hand in hand with hardware advances, providing the ability to exploit increasing computing power.  


The modern developer is massively enabled compared to the early pioneers – frameworks and languages, IntelliSense-style tools, static analysis and linting tools, managed runtimes, increasingly safe compilers, increasingly powerful code collaboration platforms, security analysis, automated build and release tools, and branch management tools. Yet the demands on the modern developer have increased proportionally – faster, higher quality with fewer bugs, and more secure against increasing cyber threats. And as an overall trend, in the last five years we have seen that demand accelerate, fueled by mass adoption of cloud computing and the digital transformation of many industries. 

So is generative AI a gamechanger, or a next natural step in the computing tradition of the last 50 years? 

Productivity 

It is also necessary to ask: where (and how) does “generative software development” play in the end-to-end value chain? I would start with this – if you are looking for productivity gains, you should start elsewhere! There is still a lot of productivity upside in many of today’s enterprises through a rightsized approach to standardization of architecture patterns, tools, platforms, templates, libraries, and methodologies. Furthermore, it is important to look holistically and to frame the problem that actually needs solving – for example: 

How can we deliver digital products which underpin our business strategy faster and at higher quality, whilst being robustly secure?  

Copilot, not autopilot 

In his best-selling book Outliers, Malcolm Gladwell cited the safety transformation story of Korean Air. Central to this transformation was not technology, but dealing with a cultural legacy – in particular, an inbuilt deference to one’s superiors, such as first officers deferring to the captain even when it was obvious that the captain was about to make a catastrophic error. Transforming from one of the worst safety records to one of the best was achieved by dealing with this cultural legacy – by training flight staff in clear and concise communication, and by empowering staff (and making it an imperative) to validate and challenge each other.  

Similarly, when we consider driverless cars, there are many legal and ethical issues to address before vehicles ever become totally driverless. While assistive driving technology has the potential to improve road safety in aggregate, there remains the gnarly question of who is responsible in the event of an accident. The driver, the other road user, the software, the model, or the hardware? Are we going to outsource this to our AI lawyers and AI insurance underwriters? Of course not – ultimately the driver must still be accountable.  

As a side note, on the state of the art in driverless tech, Ford’s system has just been approved for use on UK motorways. It monitors the driver whilst providing a “hands-off” driving experience. Think “next-gen cruise control” rather than fully driverless. 

Ford launches hands-free driving on UK motorways – BBC News 


The key point here is that AI should be considered assistive, and never in full control. Even as the AI takes increasing levels of control, the system should still have effective fail-safes built in. 

The same mindset should apply to the code you create. As the creator you own the app – intent, features, architecture, technology building blocks, and non-functional characteristics – a point made clear on the landing page of GitHub Copilot. And any AI-enabled app architecture must fail safe. 
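
What might “fail safe” mean in concrete terms? One plausible shape – sketched here against a generic, invented AI client interface rather than any specific product API – is to treat model output as untrusted input: validate it, and fall back to a deterministic path when validation (or the call itself) fails:

```csharp
using System;

// Invented stand-in for whatever AI completion service the app integrates.
public interface ISuggestionClient
{
    string? GetSuggestion(string prompt);
}

public sealed class FailSafeSummarizer
{
    private readonly ISuggestionClient _client;

    public FailSafeSummarizer(ISuggestionClient client) => _client = client;

    public string Summarize(string document)
    {
        string? aiSummary = null;
        try
        {
            aiSummary = _client.GetSuggestion($"Summarize: {document}");
        }
        catch (Exception)
        {
            // The AI path failed; the app must not fail with it.
        }

        // Treat model output as untrusted: validate before use.
        if (!string.IsNullOrWhiteSpace(aiSummary) && aiSummary.Length <= 500)
        {
            return aiSummary;
        }

        // Deterministic fallback keeps the system in a known-safe state.
        return document.Length <= 200 ? document : document[..200] + "...";
    }
}
```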

All apps != All code 

Another perspective on assistive AI software generation needs consideration. In the technology industry, we have a tendency to talk about code in a somewhat generic sense. To the lay person, code is code. But all apps are not equal. And all code is not equal. 

For illustration, consider a payment processing engine, or a software control system for a power station. Both have non-functional characteristics such as extreme availability, robust resiliency features such as idempotency and transaction management, and stringent security controls to protect processes and data from many threat vectors. The consequences of failure of such systems can be severe.  

By comparison, consider a field service app – maybe one enabled by a no-code, low-code app platform, where forms, processes, and data storage are generated, and appropriate security and data controls are built in. 

The end-to-end process for the construction of these apps is vastly different because they have vastly different required characteristics. Like comparing the creation of a 12-bar blues to a symphony. Yes, they are both still music, in the same way that the software in all systems is still code. The implication is that the mileage of generative software development will and should therefore vary based on the type of app or system that you are developing. 

Maintaining the craft of building apps 

But there is more to it than this. The actual creation of code is just one part of the process of constructing digital products, albeit an important one. But what about architecture (enterprise, solution, app, system, service, data), what about a security model based on threat models and regulatory compliance needs?  More fundamentally, generative AI will not identify the need for an app or platform – what needs does it service, what value does it create, and what investment is required? Nor will it manage the complexity of delivery and execution – which processes should we use, what does the team structure look like, what quality and assurance controls should we apply, how should it be operated, and (most importantly) how will we manage people (strengths, weaknesses, communication, culture, aspirations, hopes, fears)?  

It is safe to assume that generative AI will take away some of the toil of the code creation process, allowing the developer to spend their time on higher order and higher value tasks. But we must still endeavor to maintain the craft of building software and not outsource it all to the machines. This is because we still need to know what good looks like. We need to know what secure code looks like. We need to know whether the architecture of the system being created is appropriate and fit for purpose (with appropriate failsafe mechanisms built in). And, of course, we need to guard against “AI hallucinations.” Fundamentally, we should not sleepwalk into the enablement of our developer androids without the right controls being firmly in place – software that generates software (that generates software!?). We must continue to own the craft of creation. Anything else would be an abdication of responsibility. 

AI-enabled development futures 

It is clear that generative AI has high potential to offer efficiency gains and increase developer productivity, and to improve the developer experience.  

Whether we will need fewer (or more) developers is impossible to predict accurately (like most predictions about complex questions!), but it is clear that the developer experience will continue to evolve at pace. The opportunity is there to harness these productivity gains to create better software, and at greater speed. Will we see the rise of the Chief Prompt Engineer? Maybe. But code generation is just part of the story. If our “AI assistant developers” take away some of the toil from repetitive coding, we can focus on the creation of new classes of distributed systems and apps to help solve the pressing issues of our time (sustainability, the environment, food poverty, the health of a growing and aging population), backed by emergent capabilities such as machine learning and quantum (which your LLM deep learning model will probably not be able to help you with, by the way!). We can build apps more effectively, where cost and value are more in line, and where risks of delivery and operation are significantly reduced, even against an increased landscape of cyber threats.  

But there are many issues to address, and sooner rather than later. It took many (many!) years for the various policy and regulatory frameworks that govern the web to come into place (some would argue even this is by no means done yet). This time, governance needs to move faster, and to keep pace with the innovation in technology.  

“Copilot not autopilot” practices should be mandated – not just as safe practice, but backed practically, such as with “generative transparency” tools (where generated parts of a code base are clearly tagged, and therefore validated and tested as such), as well as by appropriate policy controls and regulation. What are the (Hippocratic) responsibilities of the developer (and architect) here? And intellectual property and copyright issues need greater clarity.  
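
As one sketch of what a “generative transparency” convention could look like in practice: .NET already ships a GeneratedCode attribute that code generators use to mark machine-produced members, and the same mechanism could plausibly be repurposed to tag AI-generated code so that review, test, and audit tooling can treat it differently. The tool name and policy here are hypothetical:

```csharp
using System.CodeDom.Compiler;

public static class ShippingCalculator
{
    // Hypothetical convention: members suggested by an AI assistant carry
    // the standard GeneratedCode attribute, recording an (invented) tool
    // name and version, so that analyzers, review dashboards, and coverage
    // gates can require explicit human sign-off and tests for tagged code.
    [GeneratedCode("ai-assistant", "2023.05")]
    public static decimal EstimateCost(decimal weightKg, decimal ratePerKg)
        => weightKg * ratePerKg;
}
```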

Sustainability needs to be addressed (training LLMs uses a lot of compute power, and Moore’s Law may be on the wane!). 

Models will need to be diversified, to align with the compliance needs of different industries. And the compliance needs of different industries will need to be redefined.   

Does the current generation of generative AI technology dream of building apps? Maybe, but the extent to which we allow that is entirely up to us. We need to maintain control. 

Terry Room

Global Cloud CTO
Terry Room is a Distinguished Architect with over 20 years of technology industry experience, including the delivery of several mission critical platforms to production. He currently works as a Global Cloud CTO, supporting Capgemini customers in the inception and delivery of digital and cloud transformation programmes of significant value and scale.

    Capgemini and Zendesk – making personalized customer experience a reality

    Stephen Barnett
    17 May 2023

    Capgemini’s customer interactions solution leverages Zendesk to deliver frictionless, connected, and personalized experiences to your customers that drive enhanced engagement and loyalty.

    In today’s digital age, customers have come to expect personalized and connected experiences from the brands they interact with, delivered via a range of channels.

    With the increasing reliance on digital channels for customer service, putting a proper customer interaction strategy in place that provides a frictionless omnichannel experience for your customers is more important than ever.

    But how can you make this a reality for your organization?

    Driving personalized, connected customer experiences

    At Capgemini, our Intelligent Customer Interactions solution leverages an enterprise-grade customer service and engagement platform – provided by Zendesk – that puts your customers at the heart of everything you do. This, in turn, delivers more frictionless, personalized, and connected digital experiences to your customers, increasing engagement and loyalty for your organization with minimal effort on your part.

    Our solution combines decades of customer interaction design experience with Zendesk’s powerful SaaS-based omnichannel ticket management and self-service tools. By combining their expertise, Capgemini and Zendesk create a next-generation digital contact center service solution that leverages AI augmentation to deliver a persona-influenced service design that integrates humans and machines.

    In turn, this enables you to drive a more meaningful, emotive, and frictionless relationship with your customers through delivering:

    • A more personalized customer experience: that creates a virtuous circle of satisfied customers, helping your business grow
    • An omnichannel customer journey: that leverages digital-first customer interactions across a range of channels, including phone, email, chatbots, social media, self-service platforms, and user portals
    • An improved net promoter score (NPS): that significantly enhances your brand value and loyalty
    • Enhanced customer engagement: that helps your experienced customer contact agents drive more meaningful conversations across digital channels with your customers.

    In short, enabled by Zendesk, our solution drives synergies across your upstream and downstream process value chain to ensure smoother customer interactions.

    Building lasting relationships with customers

    Capgemini’s and Zendesk’s partnership is built on designing and implementing software solutions that significantly improve customer relationships through personalization, while ensuring they are flexible enough to meet any business need.

    This approach helps you build lasting relationships with your customers by ensuring you put your customers’ needs at the heart of everything you do, enabling you to stay connected with them regardless of the challenges that come your way.

    To learn more about how Capgemini’s Intelligent Customer Interactions solution delivers enhanced customer experience through frictionless customer interactions, contact stephen.barnett@capgemini.com.

    Stephen Barnett is responsible for designing and implementing core digital customer operations technology solutions, leveraging his 20-plus years of BPO call center industry knowledge to improve our clients’ customer experience.

    Author

    Stephen Barnett

    Zendesk Global Offer Leader, Intelligent Customer Operations, Capgemini’s Business Services

      Aircraft of the future: perspectives from the Strategic Aerospace Seminar, December 2022

      Patrice Duboé
      17 May 2023

      I was recently fortunate enough to participate in discussions on the major innovation challenges for future aircraft, as part of the Strategic Aerospace Seminar. Its mission is to make the industry carbon-free and so ensure its survival. Here are my key takeaways from the event.

      The seminar brought together over 200 decision-makers and experts from across the ecosystem at the Safran Campus in the Paris region. The plane of the future was at the heart of the discussions, and it was also the topic of the roundtable I hosted.

      There has never been a more critical time to discuss innovation challenges in the aeronautical sector. For all the stakeholders in attendance, carbon-free aviation represents the primary challenge that will drive innovation for at least the next decade. Indeed, for Olivier Criou, Head of R&T & Lead Architect at Airbus, “the challenge is so big that we need to throw everything at it.”

      Reaching Net Zero is an industry requirement

      The well-known industry objective is to reach Net Zero by 2050. “We must do it… and we can!” reassured Jean-Paul Herteman, former CEO of Safran and current Honorary Chairman of GIFAS (the French aeronautic and space industry group). But time is running out. To meet its commitments, the industry must be ready to deploy zero-emission aircraft by 2035.

      And considering development timelines, the real deadline for being ready is much closer. By 2027/2028, manufacturers and their partners must have defined both incremental and breakthrough innovations that can be integrated into future aircraft, and the associated systems design, ahead of design, test, and certification programs. It is a tight and rigid timescale for a 360° revolution.

      According to Olivier Criou, everything must be reviewed, rethought, and optimized for Zero-Emissions aircraft, with three large areas of consideration and action:

      1. New fuels and energy sources
      2. Optimization and energy efficiency of the aircraft
      3. Optimization of operations

      1. We need to develop hydrogen and SAF value chains

      “Sustainability for airplanes is mostly an energy management issue.” – Olivier Criou, Head of R&T & Lead Architect at Airbus

      Batteries will not meet the power challenges of large aircraft. The future for many is seen in green hydrogen, and that will mean transformation across the entire value chain, from upstream production to distribution at airports. However, producing this energy in a carbon-free way requires a massive amount of renewable electricity. Air France estimates its fleet will need six nuclear reactors to produce the energy it needs. This poses questions about the future availability of this energy and its cost.

      Another avenue is SAF, Sustainable Aviation Fuels. These can be biofuels from green waste, biomass, or dedicated agricultural production, or synthetic chemical fuels. Each has drawbacks regarding availability and cost. An airline like Air France plans to consume barely 10% SAF by 2030, while existing aircraft can accept between 50% and 100% SAF, depending on their age.

      Innovation must also address the entire combustion cycle, beyond CO2. That will mean improving aviation’s NOX (nitrogen oxide) and fine particulate matter performance, as Axel Krein, Executive Director of The Clean Sky Joint Undertaking (Brussels) noted.

      Listing these constraints leads to a simple conclusion: substituting kerosene with a greener fuel alone will not be enough to meet the industry’s high ambitions around the environmental performance of aircraft. Manufacturers, suppliers, and integrators must work extensively on the entire aircraft.

      2. Aircraft fuel efficiency must increase dramatically

      In addition to new fuels, the sector has strategic plans and ambitious objectives to improve fuel efficiency. These include:

      • 50% increase for regional aircraft,
      • 30% initially for Small and Medium Sized Aircraft (i.e., the A320 and A330 families), before also rising to 50%.

      To reach these, aeronautical engineers must activate all the levers of innovation, starting with continuous incremental developments, particularly around materials and additive manufacturing to reduce weight, fluid dynamics to improve lift and further limit drag, and the electrification of new subsystems.

      But breakthrough innovations will also be essential, especially for engines. For example, Airbus and CFM (owned by GE Aviation and Safran) are looking at open fan engines. These involve counter-rotating fans and dispensing with the nacelle, increasing the flow of cooler air through the engine, which allows more thrust to be produced with less energy. Eliminating the nacelle also reduces weight.

      Whether continuous or breakthrough, these innovations will significantly impact plane system structure (implementing new open fan architectures, for example). Some future aircraft may involve a completely reinvented architecture, with revised engine positioning, or wingspans that alternate positions for flight and taxi phases.

      In this changing structural environment, only one element remains steadfast and critical: safety. That of the aircraft, its passengers, or the airports that host it. As different systems emerge, safety must be maintained at levels equivalent to or higher than at present to help pilots manage increased system complexity.

      3. We need to improve sustainability by rethinking flight plans and optimizing traffic

      Improving “flight management” and “traffic management” represent additional opportunities in the search for reducing aircraft fuel consumption. Even if eco-piloting now plays a part in pilot training, it can still be optimized further. For example, real-time weather data, provided by enhanced device connectivity, can encourage pilots to choose one route over another.

      Landing is also being looked at, with a view to improving the use of runways and tarmac to avoid long waits in the air for landing slots.

      The introduction of flights in special formations is considered an interesting option. Here a lead aircraft is followed by one or more aircraft that take advantage of the vortices generated by the lead aircraft to enhance their lift. The envisaged gain in fuel consumption could reach 5% or even 10%.

      Inventing new ways to work together

      These three key trends have been emerging over a long period and are starting to cause market disruption. However, we need to note that this innovation must be undertaken while maintaining what Olivier Criou described as “the economic affordability of traveling by plane for our customers and for the end customer”.

      To do this, the whole ecosystem must mobilize, coordinating and organizing efforts with many new entrants. These new entrants, often startups, position themselves in unaddressed niches or on breakthrough elements that complement offers from long-standing stakeholders. They work on the verticalization and integration of new systems such as EMS (Energy Management System) into aircraft. There are many technological building blocks that will interest key stakeholders in the future. In the aviation industry, the cross-fertilization of ideas between existing stakeholders and startups, to accelerate innovation and mitigate risk, has a bright future.

      Beyond breakthrough innovations, continuous innovation involves all stakeholders in the design sector, production lines, and the entire supply chain, as new methods of operating, collaborating, and delivering evolve. Digital continuity, digital twins, and artificial intelligence are essential tools to manage complexity and accelerate design and production cycles. The challenge we see right now at Capgemini is to translate the commitments of our Intelligent Industry approach into action.

      In the near future, AI will be embedded in aircraft that are “safety-critical environments.” That brings unknowns: how will AI be certified in future? How can safe virtual environments be built that allow for the training and deployment of AI pilots for aircraft or drones with passengers? More than ever, data must be relevant, validated, accessible, complete, and not corruptible, while responding to different global privacy regulations or, perhaps, a specific unified framework. Cybersecurity is already a prerequisite between manufacturing divisions, clients and contributors, users, and devices – and it will be even more so in the future.

      How Capgemini can help

      The strength of an integrated group like Capgemini lies in our ability to offer all the skills related to these future challenges to long-standing stakeholders, new entrants, and the two combined. Manufacturers and their partners will need to enlist expertise that is not always at the heart of their business models, but which exists in abundance at a company like ours. Similarly, the shortage of engineers in Europe makes it necessary to work in ecosystems, including pooling resources. This forces us to create new patterns of cooperation and to collectively re-value engineering streams in aeronautics. Without this work, the entire development of the aircraft of the future is likely to be slowed down, and the ambitious deadlines for Net Zero will be missed.

      Meet our expert

      Patrice Duboé

      CTO Global Aerospace and Defense, CTIO South and Central Europe
      Patrice Duboé has been working in innovation and technology for more than 20 years. He leads innovation and technology teams to deploy innovation at scale for global corporations and clients, with key partners and emerging startups.

        Telco insights

        Capgemini
        12 May 2023

        A revolution is happening in telco, one that has the potential to make the sector a truly intelligent industry.

        The dynamic and the opportunities arising from new technologies and innovation are huge.

        To stay close to the pulse of the trends in telco, our blog series Telco Insights puts a spotlight on new developments, new technologies, and current hot topics.

        Join us in reading insights from across 5G, metaverse, networks, digital transformation, customer experience, data and AI, cloud, sustainability, and more, to stay on top of telco’s connected future.

        What telcos can learn from consumer experiences

        Capgemini
        8 May 2023

        A point of view on what telcos can learn from consumer experiences and how to operationalize design as a strategic differentiator

        Customer expectations are being set by consumer-grade experiences from the likes of Airbnb, Uber, and DoorDash. Likewise, telco business customers bring these expectations to their work, and this is compounded by hyperscalers investing in frictionless experiences. These software-based, service-driven companies have built their very existence around customer experience.

        Telcos are not alone in the struggle to operationalize the design of customer experience, in part because “design” is not well understood amongst C-suite stakeholders. Design is frequently reduced to making a digital UI more elegant, often coming late in the go-to-market journey, rather than putting the core needs of customers front and center in the strategic planning process. Meeting today’s customer expectations requires “design” as a strategic enabler throughout the customer relationship. From onboarding to support, to the products and services offered, every interaction across the digital and retail experience can and should be viewed through the lens of design.

        There are high barriers for telcos, including ongoing infrastructure investments, technical debt, and high support costs. However, working with telcos across the world, we have seen a few ways telco organizations can operationalize design and see greater impact on their business.

        Considerations for Telco B2C and B2B Organizations

        1. Enable a strategic “design” function in the organization (design can be used to describe product and customer experience organizations)
          • Partner with stakeholders to establish a shared and unified definition of “design” across the organization that minimizes ambiguity and articulates how it drives business outcomes.
          • Build believers in your executive stakeholders by bringing them into the process. A former Verizon design executive brought their CFO close to a program which humanized net add and ARPU metrics, and enhanced collaboration in budget planning.
          • Close the gap between corporate and creative culture by establishing experience principles. In this podcast, Verizon discusses how they established these standards across the organization.
          • Develop a DesignOps capability internally and/or in partnership with external partners to extend the reach of your team and maximize the impact. Listen to this episode of frog’s Design Mind frogcast to hear how AT&T has delivered DesignOps at Scale.
        2. Integrate “design” into business strategy and product planning (i.e., design tools, methods, and frameworks). Report: How to Drive ROI in CX and Design
          • Leverage the wealth of institutional knowledge combined with bespoke market and customer research (qualitative and quantitative) to identify how to play and how to win.
          • Partner with the business with a customer-first mindset and methodology in CapEX planning and ongoing product development efforts.
          • Demystify the impact of design by modeling outcomes in business KPIs (customer acquisition, ARPU, retention, NPS, etc.) tied to the program, to secure funding and share success stories.
          • Place as much value on getting the product right (problem-solution fit) as with getting it to market quickly. Report: Sprinting Towards a Failed Product.
        3. Use Service Design in your organization to break down siloes and simplify the complex to realize better customer experience.
          • Align the purpose of the work to a clear and measurable problem. Is your telco business needing to diversify services so as not to be relegated to the connectivity pipe, or more acutely suffering low engagement or cost to deliver existing product and services?
          • Become intimate with the underlying constraints and opportunities in technology, process, people, and policy to bring together relevant stakeholders to deliver a competitive customer experience.
          • Prototype early and often to test with customers and socialize the results with key executives to secure commitment for next steps.
          • Consider hiring a leader with a Service Design degree or partnering with a reputable firm with service design capabilities. Report: Demystifying Service Design in the US.
        4. Position your design system as a unifying force across your organization, bridging upstream and downstream product teams. Report: How to Systematize a Design System for Success
          • Develop and govern Design Systems across various customer touchpoints including digital, physical product, packaging and self-install, and retail. Stc’s investment in a DLS for digital self-service experience improved brand reputation and service cost metrics.
          • Manage the Design System like a product (for building products) and ensure it represents the culture and brand it supports; consistency across customer touchpoints builds trust that you care about their experience.
          • Create design-to-code toolchains (Design Tokens) to enable consistency and product teams to focus their valuable time on growing and evolving the Design System.
          • Seek a business sponsor in each of the areas where a Design System can lower costs, increase speed and strengthen your brand across products and services.

        TelcoInsights is a series of posts about the latest trends and opportunities in the telecommunications industry – powered by a community of global industry experts and thought leaders.

        Author

        Courtney Brown

        VP, Business Development
        Courtney is a VP of Business Development based in Austin, Texas. She works with frog’s interdisciplinary teams in North America to help early-stage startups and Fortune 500 clients grow their business by reinventing the way customers experience their brand, product, and service.

          AI is useless without context

          Robert Engels
          May 8, 2023

          During my career in artificial intelligence I have been through the development, improvement, application, and fine-tuning of AI algorithms many, many times. At a certain point it became clear to me that the algorithms alone will never be able to solve your problem or use case outside of a lab setting.

          The reason? Context. AI models put to work in the real world have no way to relate to all possibilities across all dimensions of a real-world setting.

          So I started to work on context for AI. First with explicit modeling of context using rules (the if-this-then-that kind of thing). That did not work too well (too much work, I would say). So we aimed at describing the world and offering that as context. From the early 2000s I worked on Knowledge Graphs and their standards (and I still love them). They enabled modeling knowledge, but also flexibility through logical reasoning and inferencing, finding inconsistencies in our world, and much more. But they are not the final or only answer either (as nothing is, I guess). So when we started to work with deep learning, we thought part of the quest was solved. But it did not really work either. In real-world scenarios the AI models we made failed hopelessly at unexpected and unwanted moments. Why? They failed on context. Again.

          And so came ChatGPT. Featuring a kind of model we had seen (failing) before – becoming racist after only a few hours in the real world – but now with a wrapper that actually made it work… much better! And more reliably. Still not perfect, but hey, given the previous attempts: great improvements!

          And what was the trick – why did it work this time? The layer that OpenAI added was a stroke of genius: it added a context layer, able to interpret what was happening, able to stop unwanted outcomes to a large extent, and thus enabling the AI model to work in the real world.

          We are not there yet; this alone is not enough. But all the great work that has been done in recent years – on graph tech, on deep learning, on transformer models and, not least, this first actually working context layer – makes me very optimistic that we can look ahead with confidence and trust. Still a lot of work to do, but the basics for a great future with AI seem to be falling into place.

          The next thing to add to the equation? Let’s rock and allow these models to use their context awareness to solve the parts that language models cannot do – the knowledge parts: factuality, causality, planning, maths, physics, etc. The first approaches have popped up already; I cannot wait to see more integration of it all!

          Read this article on Medium.

          Meet the author

          Robert Engels

          CTIO, Head of AI Futures Lab
          Robert is an innovation lead and a thought leader in several sectors and regions, and holds the position of Chief Technology Officer for Northern and Central Europe in our Insights & Data Global Business Line. Based in Norway, he is a known lecturer, public speaker, and panel moderator. Robert holds a PhD in artificial intelligence from the Technical University of Karlsruhe (KIT), Germany.

            Green quantum computing

            Capgemini
            8 May 2023

            The hunger for computing power is ever-increasing, as complex problems and vast amounts of data require faster and more accurate processing.

            Quantum computing has the potential to be revolutionary in many computation-heavy areas, ranging from drug discovery to financial applications. The reason? Higher accuracy and faster computation times. However, one question is often neglected: at what cost? We’ve seen that supercomputers and data centres can consume an enormous amount of energy [1,2]. Will quantum computers be the next energy-thirsty technology, or are they instead the gateway to a green computing era?

            Quantum computing uses the most intriguing properties of quantum physics: entanglement, superposition, and interference. Quantum computers use these phenomena to do calculations in a completely different way than normal computers do. The result is an enormous speedup of calculations, the ability to achieve higher accuracy levels, and the ability to solve problems that are intractable for the classical computer.

            These quantum phenomena take place at a very small scale: the scale of an electron. As such, one computer calculation would barely cost any energy. However, to observe these potent quantum phenomena, the system must be completely isolated. Temperatures must be cooled to near absolute zero (-273 degrees Celsius). This comes with a large energy bill.

            The energy consumption of a quantum computer scales fundamentally differently from that of a classical computer. Classically, there is a linear scaling with problem size and complexity. For quantum computers, this may be very different. Insight into this new energy-consumption profile of a quantum computer is essential for a green future of quantum computing.

            The Scaling of the Energy Consumption

            Currently, the power consumption of a quantum computer is about 15–25 kW, due to the cryogenic refrigerator [3, 4, 5]. This is comparable to the power consumption of about 25 households. Note that this power is not only consumed when a calculation is performed but is continuously drawn by the quantum computer. This leads to a large energy bill.

            There is hope for the future. When a classical computer becomes twice as large, it requires twice as much energy. In the near future, a quantum computer, by contrast, may barely increase its energy consumption when scaling up. This is because the cooling volume barely increases, and the heat created by extra electronics is also not expected to be significant. The largest quantum computer today has 127 qubits, and scaling to 1,000 or even 10,000 qubits is possible with similar energy consumption.

            In the far future, we envision quantum computers with millions of qubits, situated in large data centres. It would be naive to assume that this does not add any energy costs. Recent research shows that the energy costs will scale with the number of qubits and operations at a point in the future. This is mostly due to increased cooling costs.

            There is another very important factor that positions quantum computers as potential candidates for green computing. The idea is as follows: if you must run a supercomputer for a month to solve a specific problem and a quantum computer can do it within minutes – this drastically reduces the energy cost. An example of how energy costs would scale differently for Monte Carlo simulations is shown in figure 1.

            Figure 1: The Energy Consumption of a Quantum Computer scales very differently than that of classical computers. When high accuracy or complexity is required, the quantum computer may become the more “sustainable” candidate.
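
            To put rough numbers on that intuition, here is a back-of-the-envelope comparison; all figures are illustrative assumptions, not values taken from the cited studies:

            ```csharp
            using System;

            public static class EnergyComparison
            {
                public static void Main()
                {
                    // Assumed figures: a supercomputer averaging 2 MW over a
                    // month-long workload vs. a cryo-cooled quantum computer
                    // drawing 25 kW (the top of the range quoted above) for a
                    // ten-minute run on an equivalent problem.
                    double supercomputerKw  = 2000;
                    double supercomputerHrs = 30 * 24;
                    double quantumKw        = 25;
                    double quantumHrs       = 10.0 / 60.0;

                    double superKwh   = supercomputerKw * supercomputerHrs; // 1,440,000 kWh
                    double quantumKwh = quantumKw * quantumHrs;             // ~4.2 kWh

                    Console.WriteLine($"Supercomputer: {superKwh:N0} kWh");
                    Console.WriteLine($"Quantum:       {quantumKwh:N1} kWh");
                    // Even after adding the refrigerator's continuous idle draw
                    // between runs, the gap remains orders of magnitude: the
                    // essence of a "quantum energy advantage".
                }
            }
            ```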

            Recent research shows a difference in energy consumption between quantum computers and classical computers of a factor of 10,000 (!) [4]. A clear quantum energy advantage, but for a toy problem favouring the quantum computer. The question remains whether this is applicable to more generic problems.

            Recently, an energy estimate for a more generic problem was made, namely breaking RSA encryption [6]. RSA is a very common encryption method for secure data transmission. The quantum computer is expected to consume about 1,000 times less energy than a classical computer. It must be noted that this estimate was based on futuristic full-stack quantum computers, which still require major advances in quantum hardware.

            Interestingly, this estimation also showed a timeframe in which a quantum computer might be slower but require less energy [6]. This gives a great perspective for the future. Before implementing quantum computers for their speedup, can we implement them for green computing?

            Green Computing for Financial Institutions

            At Capgemini, the Olive project researched the opportunity of using quantum computers for green computing in the financial industry. This is specifically applied to using quantum computers for pricing derivatives, based on a new algorithm that allows one to do this on a quantum computer [7,8]. (See more here)

            Green Computing is becoming increasingly important for financial institutions. Mischa Vos, Quantum Lead at Rabobank (one of the largest banks in The Netherlands), emphasises its importance for Rabobank:

            “At Rabobank, sustainability is an integral part of our corporate mission: ‘Growing a better world together.’ The focus is now on green coding and sustainable data centres. On top of that, Rabobank is investing in green computing technologies. Quantum computers would be an interesting new candidate.”

            Financial institutions use an enormous amount of computational power to ensure security, price financial products and perform risk management. Based on the insight about the “quantum energy advantage”, quantum computing can reduce the carbon impact of these computations. Would this be interesting for Rabobank?

            “This has great potential for Rabobank. Running these calculations, especially when Artificial Intelligence is involved, has a negative impact on the carbon footprint of Rabobank. Rabobank is dedicated to reducing this. At the same time, as a financial institution, we still need to perform accurate risk analysis and provide security. If quantum computing would allow us to combine the two, this would be very interesting.”  

            There may be a timeframe when the quantum computer is slower, but more energy efficient than classical computers. Would Rabobank already be interested in quantum computers at this stage?

            “There are certain batch-oriented calculations that Rabobank performs, and these would be ideal for this. For example, evaluating the risk portfolio of investments at a large scale, or certain fraud detection methods. There will definitely be opportunities where Rabobank can already use the slower, but more efficient quantum computers during this time frame.”

            A future scenario

            The current hardware limitations are the main bottleneck for practical quantum computing. However, it is important for financial institutions to be ready for implementing quantum computers when the time is right, especially when this can be important from a sustainability perspective.

            Phase 1. Research & Development

            The current hardware limitations are the main bottleneck. As such, firstly, the hardware challenges need to be overcome before it becomes feasible to run relevant calculations on quantum computers. The Quantum Energy Initiative points out it is important to already make conscious design choices during this phase to ensure an energy-efficient quantum computer [9,10]. This should not slow down technological progress but instead, prepare for long-term energy advances.

            Phase 2. Green Energy Advantage

            Due to slow quantum clock speeds, and intensive quantum error correction codes, the quantum computational advantage can take longer than the quantum energy advantage. As such, the first applications of quantum computers may be due to their energy efficiency. This will be dependent on the specific advances in quantum hardware.

            Phase 3. Overall Quantum Advantage

            Finally, both the quantum computational advantage and the quantum energy advantage are achieved. Here, it is important to make conscious choices in the usage of quantum computers and avoid the Jevons paradox. See for example this blog on quantum for sustainability. On the other hand, this is also the phase where quantum computers can really make a difference in sustainability – making better simulations leading to better material design, all the way to general climate crisis mitigation plans [11]. 

            Technology leaves an indelible mark on the environment. Capgemini is determined to play a leadership role in ensuring technology creates a sustainable future. Capgemini can help with implementing sustainable IT as the backbone of a company for a greener future.  It is important to consider the environmental footprint of emerging technologies. Capgemini’s Quantum Lab can help clients understand the future possibilities of quantum technologies and build their organization and strategy that will make the potential become a reality. With this project, more insight into the real environmental cost of quantum computers is acquired, as well as the opportunities that Quantum Computers can give for green computing.

            For more information on the results of Milou’s research, watch the webinar here

            References:

            [1] IEA, Data centres and data transmission networks, 2022. [Online]. Available: https://www.iea.org/reports/data-centres-and-data-transmission-networks.

            [2] A. S. Andrae and T. Edler, “On global electricity usage of communication technology: Trends to 2030,” Challenges, vol. 6, no. 1, pp. 117–157, 2015.

            [3] F. Arute, K. Arya, R. Babbush, et al., “Quantum supremacy using a programmable superconducting processor,” Nature, vol. 574, no. 7779, pp. 505–510, 2019.

            [4] B. Villalonga, D. Lyakh, S. Boixo, et al., “Establishing the quantum supremacy frontier with a 281 pflop/s simulation,” Quantum Science and Technology, vol. 5, no. 3, p. 034003, 2020.

            [5] Personal communication with Olaf Benningshof, Cryoengineer of QuTech, 2023.

            [6] M. Fellous-Asiani, J. H. Chai, Y. Thonnart, H. K. Ng, R. S. Whitney, and A. Auffèves, “Optimizing resource efficiencies for scalable full-stack quantum computers,” arXiv preprint arXiv:2209.05469, 2022.

            [7] P. Rebentrost, B. Gupt, and T. R. Bromley, “Quantum computational finance: Monte Carlo pricing of financial derivatives,” Physical Review A, vol. 98, no. 2, p. 022321, 2018.

            [8] N. Stamatopoulos, D. J. Egger, Y. Sun, et al., “Option pricing using quantum computers,” Quantum, vol. 4, p. 291, 2020.

            [9] A. Auffèves, “Quantum technologies need a quantum energy initiative,” PRX Quantum, vol. 3, no. 2, p. 020101, 2022.

            [10] Quantum Energy Initiative, quantum-energy-initiative.org.

            [11] C. Berger, et al., “Quantum technologies for climate change: Preliminary assessment,” arXiv preprint arXiv:2107.05362, 2021.

            Milou van Nederveen

            Master’s Student
            She is a master’s student in Applied Physics at the TU Delft, and is passionate about quantum computing and its real-world impact. Milou firmly believes that considering the environmental impact of quantum computing is crucial, and this is why she decided to join Capgemini’s Quantum Lab for her internship. She worked closely with her Capgemini supervisor, Camille de Valk, to explore the complicated question about the energy consumption of (future) quantum computers. In this blog, Milou shares her insights and findings, giving us a glimpse into the future of quantum computing and its role in creating a more sustainable world.

            Nadine van Son

            Senior Consultant Strategy, Innovation and Transformation | Financial Services
            As a consultant in the field of financial services, I am passionate about innovation and new technologies, which motivates me to look beyond the current standards and status quo. I find inspiration in combining insights, trends, and developments with their effect on society and how the business environment should navigate them. The effect of innovation on customer behaviour is a topic that inspires me specifically.

              Dark factories, bright future?

              Jacques Mezhrahid
              24 Apr 2023

              An automatic (or ‘dark’) factory can be defined as ‘a place where raw materials enter, and finished products leave with little or no human intervention’. One of the earliest descriptions of the automatic factory in fiction was Philip K. Dick’s 1955 short story ’Autofac’, a dystopian and darkly comic scenario in which entirely automated factories threaten to use up the planet’s resources, by continuing to produce things that people don’t need.

              The term ‘dark factory’ can be thought of as metaphorical – the factory might not actually be completely dark, since its machines may require some light if equipped with optical sensors.

              Dark factories are a part of the global digital transformation and move to the Industrial Internet of Things (IIoT), which is being driven by increasingly capable robotics and automation, AI and 5G connectivity. In this article, we’ll discuss the benefits, challenges, and how companies can move forward with this concept.

              Pros and opportunities

              Dark factories offer a number of benefits.

              • First among them is increased efficiency and productivity. Dark factories are favourable on classic efficiency drivers such as production output, for example, offering 24/7 capacity beyond traditional shift hours – and they are unaffected by the human need for breaks, vacations, or sick days. And a secondary benefit is that dark factories do not need to be located near a labor pool – which means they can be set up in other areas, exploiting opportunities like cheaper land prices or more attractive surroundings.
              • This also makes them more sustainable. Dark factories can be designed to be more energy-efficient and environmentally friendly than traditional ones; an obvious example of this is that they can do away with lighting and central heating.
              • All of that means decreased operating costs, due to a reduction in non-added value tasks and staff numbers, a benefit which is especially prominent in high labor cost areas.
              • It also improves worker safety. Fewer workers present means reduced risk of accidents and injuries in the workplace, a significant challenge in hazardous environments. Moreover, repetitive and physical tasks can be monitored (and assisted) to avoid safety issues or future physical disablement.  
               • Finally, all this can lead to improved quality as well as performance. Highly specialised machines monitored by a new generation of integrated industrial information systems work with a kind of efficiency that a human cannot match. They can also provide relevant recommendations to the operator, to avoid mistakes or support decisions (e.g., to recycle the product or anticipate corrective actions).  

              Cons and challenges

              There are, of course, some shortcomings.

              • Whether retrofitting an existing brownfield facility or building a greenfield one from scratch, the CAPEX required to create a dark factory is considerable – new infrastructure is required and existing infrastructure may require modification. As is obvious, there are a number of technological barriers to overcome also, for example – AI, ML, 5G, robotics and system integration. These questions should be addressed with a clear vision of the future industrial platform and/or footprint, in order to avoid any “techno push” (a risky approach in which new products and services are driven by new technology and not validated by existing market needs).
• Additionally, dark factories will necessitate new training and staffing requirements. It's clear that new specialist skills will be required in order to design, install, maintain, and operate the systems that will run these plants.
• Suitability, scalability, and over-specialization form another issue. Humans are still better at many tasks, and not all processes can be automated (yet); it may be a long time before dark factories are suitable for certain types of manufacturing. In particular, it's more difficult to build generalized (as opposed to specialized) automated systems and processes, which may limit a manufacturer's ability to respond quickly to changes in production requirements. This calls for AI sophisticated enough for generalized problem-solving without human aid – the automation of quality control is a particular challenge (the sketch after this list illustrates the point).
              • Technological dependence is another issue that must be planned for. Cyber-driven industrial espionage is already a serious problem in conventional factories. The sheer connectivity of dark factories creates security vulnerabilities that could be exploited by malicious actors. This could result in data breaches, production disruptions, or worse. In addition, any non-malicious technical failures could result in major production delays without rapid human intervention.
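
To make the over-specialization point concrete, here is a minimal, hypothetical sketch in Python of a specialized quality gate. It is not drawn from any real plant – the product, attributes, and tolerances are all invented for illustration. Note how easy the specialized check is to build, and how brittle it is: every tolerance is hard-coded for one product line, so any change to what the factory makes means a human rewriting the rules.

```python
from dataclasses import dataclass


@dataclass
class Widget:
    """One measured part coming off a (hypothetical) production line."""
    length_mm: float
    weight_g: float


# Tolerances hard-coded for a single product line. This is the essence of
# a *specialized* automated check: trivial to build, but every new product
# variant means a human rewriting this table.
SPEC = {
    "length_mm": (99.5, 100.5),
    "weight_g": (248.0, 252.0),
}


def passes_quality_gate(part: Widget) -> bool:
    """True if every measured attribute falls inside its tolerance band."""
    return all(
        low <= getattr(part, attr) <= high
        for attr, (low, high) in SPEC.items()
    )


if __name__ == "__main__":
    print(passes_quality_gate(Widget(length_mm=100.1, weight_g=250.3)))  # True
    print(passes_quality_gate(Widget(length_mm=101.2, weight_g=250.3)))  # False
```

A generalized system would instead have to learn or infer what 'good' looks like for an arbitrary new product – which is exactly the unsolved problem described above.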

The new human structure of the dark factory

              How might humans fit in this new environment?

              Lean manufacturing taught us that we could cut out much of middle management and improve the efficiency of operations. A dark factory could cut the bureaucracy further. Broadly speaking, the dark factory means fewer people in total, but more added value per person.

Consider the 'enhanced operator' – an XR-equipped human who makes periodic visits to the facility. Instead of a person with specialist skills on one part of an assembly line, this enhanced operator would be a generalist, with a very broad understanding of the factory's end-to-end processes and systems.

Headcount may reduce, but collaboration will still be key. First – collaboration between teams, to understand systems, engineering, impacts on manufacturing and operations, and how to handle complex situations. Second – collaboration between robots and humans, to perform complex tasks requiring both sets of capabilities.

              Darkening the factory: what now?

Implementing a dark factory (either from scratch or by retrofitting an existing facility) will not be easy, and the pace of transformation is sector-dependent. For example, it is easier to completely automate simple and repetitive tasks – ones in which every step in the end-to-end process is understood, down to the movement and the millimeter. But not all kinds of manufacturing are quite so straightforward. As companies pursue the concept, here are some steps to consider.

              A transformation roadmap and change management plan

              Identify the steps you need for your transformation roadmap. Is now the right time? Transitioning to (or constructing) a dark factory requires a significant investment of time, resources, and capital. It’s important to carefully evaluate the potential benefits and risks of this transition before making any decisions.

Conduct a thorough analysis of the existing manufacturing processes to determine which ones can be automated and which cannot. In light of this analysis, is the transition still worth it?

              If so, you may need to work with a recognized specialist company to determine which technologies will be most effective for your specific manufacturing process. The transition could also be phased – for example, a partially automated factory could run a ’dark shift’ overnight, which could provide a test or proof of concept.

And of course – build cyber security into the plan, not as an 'afterthought'; the dark factory's level of connectivity (and the potential vulnerabilities that result) demands it.

              Consider the human implications

              How can we keep humans safe in this new (mostly) non-human environment? What safety measures are required – for example, can you create areas that are safe for people to traverse? And how must people behave in a space built primarily for robots, not humans? 

Anticipate and prepare for workforce transformation: think about recruiting for the skills needed for tomorrow. What will be done about those who may lose their jobs to a robot – can they be retrained and retained?

              Consider future operations: flexibility and scalability

              As previously mentioned, people are more flexible than robots and machinery. As such, forward planning must consider how the infrastructure will flex and scale, in order to meet future market needs. Detailed monitoring and analytics can help here, identifying what systems can be optimized or replaced.
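
As one concrete illustration of such analytics: Overall Equipment Effectiveness (OEE) is a widely used manufacturing metric that multiplies availability, performance, and quality into a single score, making it a natural candidate for continuous, automated monitoring in a highly instrumented plant. Below is a minimal Python sketch of the standard OEE calculation; the shift figures are purely illustrative assumptions, not benchmarks.

```python
def oee(planned_time_min: float, run_time_min: float,
        ideal_cycle_time_min: float, total_count: int,
        good_count: int) -> dict[str, float]:
    """Standard OEE decomposition: availability x performance x quality."""
    availability = run_time_min / planned_time_min  # share of planned time actually running
    performance = ideal_cycle_time_min * total_count / run_time_min  # actual speed vs. ideal
    quality = good_count / total_count  # share of output that passes inspection
    return {
        "availability": availability,
        "performance": performance,
        "quality": quality,
        "oee": availability * performance * quality,
    }


# Illustrative shift: 8 planned hours, 60 minutes of downtime, a 0.5-minute
# ideal cycle time, 800 parts produced, of which 776 pass inspection.
metrics = oee(planned_time_min=480, run_time_min=420,
              ideal_cycle_time_min=0.5, total_count=800, good_count=776)
print({name: round(value, 3) for name, value in metrics.items()})
# -> {'availability': 0.875, 'performance': 0.952, 'quality': 0.97, 'oee': 0.808}
```

Tracking how each of the three factors moves over time is what points engineers at the right lever – downtime, speed losses, or defects.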

              Dark factories, bright future?

              The fragility of global supply chains has become increasingly apparent in recent years – Russia’s 2022 invasion of Ukraine, and the COVID-19 pandemic, in particular, have demonstrated the need to ‘onshore’ (bring back) manufacturing, so as not to be dependent on foreign sources of vital goods.

But manufacturing was, of course, originally 'offshored' because it was cheaper to do the work abroad. Dark factories could be an equalising force – bringing costs down so that goods can once again be produced at home.

It's also important to consider that fully automated factories have been tried before, with varying degrees of success. There are a few cautionary tales: IBM tried its own automated plant in the 1980s, but closed it because it wasn't able to respond to changing market needs. Apple also built such a plant in the 1980s, but closed it in the early 90s – likely because the plant was unable to handle increasingly small components. More recently, Tesla walked back some of the automation at its Fremont, CA facility when machines failed to meet its ambitious manufacturing targets. All of this shows the importance of flexibility and forward planning.

That said, successful dark factories do exist today. In perhaps the best example, the robotics manufacturer FANUC (Fuji Automatic NUmerical Control) operates a lights-out facility in Japan, where complex robots assemble other complex robots with zero human involvement in the manufacturing process.

As the previous examples demonstrate, success with a dark factory is difficult – but possible. Dark factories offer transformative benefits in terms of cost efficiency, sustainability, safety, and supply chain resilience. They also offer a considerable competitive advantage to those who 'get there first', who get it right and – returning to Philip K. Dick's 'Autofac' – keep control in human hands.

              Meet our expert

              Jacques Mezhrahid

              VP & CTO Industrial Information System at Capgemini Engineering
Jacques supports clients in Industry X.0 transformation. His interests include analyzing the impact of new technologies on the next wave of this transformation, and helping clients answer the business, societal, and human challenges it raises.

                The future of talent management

                Sylvia Preuschl
                5 May 2023

                How to unlock workforce agility with AI-based Talent Marketplaces

                Digitalization, automation, augmentation, robotics, advanced analytics – we are all part of the fourth industrial revolution as it introduces new ways of working and challenges current business models. The pace of technological and digital advancement has accelerated significantly during the last couple of years and continues to change the nature of work considerably. Accordingly, the Organization for Economic Co-operation and Development (OECD) reports that more than one billion jobs will be transformed by technology over the next 10 years.[1] Already today, we observe that new jobs with shifting skill sets are emerging, particularly in the field of data analytics, cybersecurity, or cloud computing, while others are disappearing (e.g., in administration).

However, as the Capgemini Research Institute study The fluid workforce revolution shows, in many companies the current workforce lacks the critical skills necessary to reach strategic goals. More precisely, in this research, 65% of executives agree that the gap between the skills their organization requires and the ones their people possess is widening. On top of that, with the labor market disrupted by demographic change and talent shortages, companies struggle to recruit the right talent with the right skills.

                Do you agree? If so, how do you ensure that your workforce is future-ready to meet business demands?

                Internal mobility helps organizations to re- and upskill, redeploy, and retain talents

The tight, competitive labor market is driving organizations to rethink their talent strategy. Consequently, many companies are beginning to recognize the importance of internal mobility, since it offers more advantages than just filling existing gaps.

                On the one hand, internal mobility enables organizations to become more agile and efficient in developing and redeploying the current workforce by means of re- and upskilling and lateral or vertical moves. On the other hand, employees get the chance to actively drive their professional development, leading to increased motivation and higher retention rates. As confirmed by our latest research The People Experience Advantage, for 65% of employees, learning and skill development is the most important aspect of their work. Correspondingly, companies need to create a culture where talents can grow skills and follow individual career aspirations.

As part of an agile response to business disruptions, talent mobility requires a mindset shift: instead of looking only at college degrees and former job experience, organizations must focus on a candidate's relevant skills. Transparency about the skills that are available and the skills that will be needed thus forms the basis of a successful talent mobility strategy. But many companies encounter difficulties when they attempt to identify, assess, and manage skills in an agile and adaptable way.

                Do you have a strategy to efficiently manage and develop your internal resources?

                Talent Marketplaces create visibility into available talents and possible development opportunities

                This is where Talent Marketplaces come into play. In simple terms, a Talent Marketplace can be defined as a powerful platform that uses AI to dynamically align employees’ skills with new career and development opportunities. By analyzing the current and potential workforce, Talent Marketplaces improve data-driven decision-making and enable organizations to better understand themselves. In fact, these platforms deliver real-time insights on which skills are available and which are missing but needed to meet business priorities.   

                Figure 1: Overview of the functionalities and benefits a Talent Marketplace platform can offer

As a first step, every employee creates a personal profile on this technology-supported platform, where they can both self-assess current skills and define career goals. Using AI, a person's existing or adjacent skills can be collected automatically from input data such as CVs, LinkedIn profiles, and HCM data. Based on this analysis of personal abilities and interests, the tool then matches employees to promising jobs within the company, builds customized career plans, and suggests the learning and development measures that will help them reach their defined goals. Here's how Josh Bersin describes this recent development:

                “In many ways, these are the “new talent management platforms” of the future, because they connect employees to learning, mentors, developmental assignments, and jobs. And unlike the old “pre-hire to retire” systems that tried to do this with competency models (Cornerstone, Saba, etc.), these are highly dynamic systems that can infer and import new skills, content, and assessments by design.”

Source: Bersin, J. (2023). HR Technology 2023: What's hot? What's not?

                Put this way, it is not hard to see the benefit of these dynamic systems. Once successfully implemented, employees gain new experiences as they move internally while organizations get to retain valuable knowledge.
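
To give a feel for the core matching idea, here is a deliberately simplified Python sketch. Real Talent Marketplace platforms use far richer AI – skill ontologies, embeddings, inference of adjacent skills – whereas this toy version only scores the overlap between an employee's skill set and a role's requirements, and reports the gaps that could feed a development plan. All names and skills below are invented for illustration.

```python
def skill_overlap(employee_skills: set[str], required_skills: set[str]) -> float:
    """Jaccard similarity: size of the intersection over size of the union."""
    union = employee_skills | required_skills
    return len(employee_skills & required_skills) / len(union) if union else 0.0


# Invented data. A real platform would infer these sets from CVs, HCM
# records, etc., and use richer models than plain set overlap.
employees = {
    "A. Lovelace": {"python", "sql", "machine learning"},
    "G. Hopper": {"project management", "sql", "stakeholder communication"},
}
openings = {
    "data analyst": {"sql", "python", "dashboards"},
    "scrum master": {"project management", "stakeholder communication", "agile"},
}

for role, required in openings.items():
    # Rank employees by overlap with the role's requirements.
    best = max(employees, key=lambda name: skill_overlap(employees[name], required))
    gaps = sorted(required - employees[best])
    print(f"{role}: best internal match is {best}; development needed in {gaps}")
```

Even this crude version produces the two outputs that matter: a ranked internal match for each opening, and the specific skill gaps that a learning recommendation would target.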

Select the best-fitting Talent Marketplace provider for your organization's individual requirements

Given the potential of these platforms, a growing number of vendors now offer solutions on the market.[2] A Capgemini Invent internal study compares the leading providers (e.g., Gloat, Eightfold.ai, HR Forecast, 365Talents, and ODEM). The study evaluates the functional strength of different Talent Marketplaces and shows that features vary between providers. Organizations must therefore choose a platform that meets their individual demands (e.g., in terms of needed functionalities, pricing, and cultural fit).

                Sound interesting? We will present a concrete use case in our next article, Talent Marketplaces: Train vs. Hire – The Cybersecurity Reskilling Solution.

                Until then, stay curious!

                At Capgemini Invent, we believe that Talent Marketplaces can be the right AI-based solution for companies seeking to manage talents more effectively, create an augmented workforce in an ever-changing environment, and gain competitive advantages in the “war for talent.”

                Let’s get in touch and discuss how we can help you to Reinvent Your Workforce by turning today’s talent and skill management challenges into great opportunities.


                [1] Zahidi, S. (2020). We need a global reskilling revolution – here’s why

                [2] Bersin, J. (2023). HR Technology 2023: What’s hot? What’s not?

                Contact our experts

                Sylvia Preuschl

                Vice President and Head of Workforce Transformation Germany, Capgemini Invent

                Nele Kammann

                Senior Manager, Workforce Transformation, Capgemini Invent

                Ines Lampen

                Consultant, Workforce Transformation, Capgemini Invent
