
Seven ways to foster long-term customer relationships

Marc De Forsanz
25 July 2022

Real-world examples from clients of Data-driven Customer Experience by Capgemini

Enterprises understand the value of creating long-term relationships with their customers. Whether they operate in B2C or B2B spaces, successful organizations have learned the importance of customer acquisition, engagement, and retention. In today’s environment, customers choose brands with the tap of a screen or the click of a mouse – and can access a vast amount of information and recommendations to make buying decisions.

To continue to thrive, companies must move towards making use of data and AI to drive the customer experience. The solution is to embrace a dynamic customer experience and journey management program – one that:

• Delivers personalized, contextualized engagement across all customer touchpoints
• Anticipates individual customer behaviors and addresses each customer’s needs
• Operates in real-time, to deliver impactful customer experiences when they’re needed, seamlessly across all channels
• Encompasses the entire enterprise ecosystem – from traditional CDP roles, such as marketing, to eCommerce, point-of-sale, customer service, R&D, production, supply-chain management, and shipping.

It’s easy to understand the value of these objectives, but the descriptions are quite high-level. When shopping for a customer-experience solution, enterprises want to know how they can accomplish these goals. To answer that, here are seven examples.

Better marketing – including suppression lists. Targeting and retargeting are important aspects of every marketing strategy. But targeting loyal customers with irrelevant advertisements not only annoys them, but also wastes precious marketing dollars. The ideal customer-experience solution will provide the marketing team with a suppression list created from systems across the enterprise. This will resolve multiple identities related to the same customer, to engage existing customers with relevant advertisements and exclude those customers from new-user acquisition campaigns.
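
To make this concrete, here is a minimal Python sketch of how a cross-system suppression list might be assembled. The record fields, the email-based matching rule, and the data are illustrative assumptions, not features of any particular product.

```python
# Minimal sketch: resolve duplicate identities across systems by a shared
# key (here, a normalized email), then emit a suppression list of known
# customers to exclude from new-user acquisition campaigns.
# All field names and records below are hypothetical.

def normalize_email(email: str) -> str:
    return email.strip().lower()

def build_suppression_list(records):
    """Collapse records that share an email into one identity and
    return the set of emails belonging to existing customers."""
    suppression = set()
    for record in records:
        if record.get("is_customer"):
            suppression.add(normalize_email(record["email"]))
    return suppression

crm = [{"email": "Ana@example.com ", "is_customer": True}]
ecommerce = [{"email": "ana@example.com", "is_customer": True},
             {"email": "new.lead@example.com", "is_customer": False}]

suppress = build_suppression_list(crm + ecommerce)
prospects = [r for r in ecommerce
             if normalize_email(r["email"]) not in suppress]
print(suppress)   # {'ana@example.com'} - one identity, not two
print(prospects)  # only the genuinely new lead remains targetable
```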

Customer trust. The first step towards building a loyal customer is to win customer trust. Brands need to be totally transparent about the data they collect and give each customer complete control of their own data. The ideal customer-experience solution will create a unified customer profile – including data the customer supplied during the registration process and data the organization has inferred or consolidated from other enterprise systems. It will then give full control of that data to the customer.

Managing consent. Consent is a large and evolving issue for enterprises to manage – especially as jurisdictions introduce different regulations. The well-planned customer journey will capture consent from the outset and automatically add the details to the customer’s profile. The company can then immediately act upon those preferences by ensuring customers are only included in engagement initiatives for which they have provided consent.
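
As a simple illustration, the sketch below stores per-channel consent on a unified profile and filters an engagement audience down to consented customers only. The channel names and profile fields are hypothetical.

```python
# Minimal sketch: capture consent on the customer's profile, then only
# include consented customers in engagement initiatives.
profiles = {
    "cust-001": {"consent": {"email": True, "sms": False}},
    "cust-002": {"consent": {"email": False, "sms": False}},
}

def record_consent(customer_id, channel, granted):
    """Capture a consent decision at the point it is given."""
    profiles[customer_id]["consent"][channel] = granted

def consented_audience(customer_ids, channel):
    """Return only the customers who opted in for this channel."""
    return [cid for cid in customer_ids
            if profiles[cid]["consent"].get(channel, False)]

record_consent("cust-002", "email", True)  # captured during registration
print(consented_audience(["cust-001", "cust-002"], "email"))
# ['cust-001', 'cust-002'] - nobody is contacted without consent
```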

Real-time conversion. Here’s a startling statistic: 70 percent of customers abandon their eCommerce carts before they check out. The ideal customer-experience solution will monitor the customer’s journey and predict when they are likely to become a “cart abandonist.” It will then initiate actions to encourage the customer to follow through with the sale, while the customer is still active on the channel.
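
One way such a prediction might work is sketched below: a classifier trained on past sessions scores the live session for abandonment risk. The features, toy data, and 0.5 threshold are all illustrative assumptions.

```python
# Minimal sketch: score a live session for cart-abandonment risk.
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [minutes_idle, pages_viewed, cart_value]
X = [[1, 12, 80], [15, 3, 40], [2, 9, 120], [20, 2, 60], [18, 4, 30]]
y = [0, 1, 0, 1, 1]  # 1 = the session ended in an abandoned cart

model = LogisticRegression().fit(X, y)

live_session = [[12, 3, 95]]
risk = model.predict_proba(live_session)[0][1]
if risk > 0.5:  # the threshold is a business decision, not a given
    print(f"abandonment risk {risk:.0%}: trigger an on-channel nudge")
```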

Identifying customers and prospects with high lifetime value. Repeat customers are not all created equal. Every company wants to identify customers with high lifetime value – and figure out how to acquire more of them. The right solution will classify existing customers, build profiles of them, and then use a “look-alike” approach to identify prospects who are also likely to become high-value customers.
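
A minimal sketch of that look-alike step, assuming customers are already profiled as feature vectors (the two features here are invented for illustration):

```python
# Minimal sketch: rank prospects by similarity to high-value customers.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical profiles: [engagement_score, spend_potential]
high_value = np.array([[0.90, 0.80], [0.85, 0.90], [0.95, 0.75]])
prospects = np.array([[0.88, 0.82],   # resembles the cohort
                      [0.20, 0.30]])  # does not

nn = NearestNeighbors(n_neighbors=1).fit(high_value)
distances, _ = nn.kneighbors(prospects)
for prospect, dist in zip(prospects, distances[:, 0]):
    # Closer to the high-value cohort means a higher look-alike score.
    print(prospect, "look-alike score:", round(1 / (1 + dist), 2))
```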

Next-best action for customer service agents. Solving problems and reducing churn is a key role for customer service agents, and these frontline workers do their jobs best when they have access to useful data. The ideal customer-experience solution will unify online and offline data to provide the agent with a complete and up-to-date view of the enterprise’s interactions with that customer. This data includes interactions such as:

  • browsing on eCommerce sites, the company’s social media, and other platforms
  • purchases, both online and in-store
  • fulfillment and delivery
  • warranty status
  • any contact with the customer, including compliments, questions, or complaints.

When a customer contacts an agent, a well-designed customer-experience solution should apply AI to the issue and the customer profile, then suggest to the agent the best action or actions to take in order to satisfy the customer and build loyalty.
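
The sketch below shows one simple way actions could be ranked from the profile and the issue; the action catalog, scoring rule, and thresholds are hypothetical placeholders rather than a prescribed model.

```python
# Minimal sketch: rank candidate actions for the agent from the unified
# profile and the live issue.
ACTIONS = {
    "expedite_replacement": {"resolves": "delivery_issue", "cost": 3},
    "goodwill_voucher":     {"resolves": "delivery_issue", "cost": 1},
    "warranty_repair":      {"resolves": "product_fault",  "cost": 2},
}

def next_best_actions(profile, issue):
    candidates = [n for n, a in ACTIONS.items() if a["resolves"] == issue]
    # Invest more in higher-value customers (a crude, illustrative rule).
    budget = 3 if profile["lifetime_value"] > 1000 else 1
    return sorted(candidates, key=lambda n: abs(ACTIONS[n]["cost"] - budget))

profile = {"lifetime_value": 2400, "recent_orders": 3}
print(next_best_actions(profile, "delivery_issue"))
# ['expedite_replacement', 'goodwill_voucher'] - stronger fix first
```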

Reading the signs to reduce churn. Any enterprise that relies upon a subscription model needs to reduce churn. The best customer-experience solution will employ AI to read signals – such as a call or email to customer service – and score each customer on their likelihood to cancel. AI can then blend this score with the customer’s lifetime value and make recommendations about how to better engage with that customer. This engagement – in the form of personalized content, delivered across the most appropriate channel – can reduce the potential for churn.
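
A minimal sketch of that blend, with invented scores and an “expected value at risk” rule standing in for whatever the real model would use:

```python
# Minimal sketch: combine churn risk with lifetime value to prioritize
# proactive retention outreach.
def retention_priority(churn_risk, lifetime_value):
    # Expected value at risk: likelihood of cancelling times what is lost.
    return churn_risk * lifetime_value

customers = [
    {"id": "A", "churn_risk": 0.7, "ltv": 5000},  # high risk, high value
    {"id": "B", "churn_risk": 0.9, "ltv": 200},   # high risk, low value
    {"id": "C", "churn_risk": 0.1, "ltv": 8000},  # low risk, high value
]
ranked = sorted(customers,
                key=lambda c: retention_priority(c["churn_risk"], c["ltv"]),
                reverse=True)
for c in ranked:
    print(c["id"], retention_priority(c["churn_risk"], c["ltv"]))
# A (3500) outranks C (800) and B (180): engage A first.
```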

These are just a few of the ways in which a well-designed solution can enhance the relationship between companies and customers. What’s more, these are all real-world examples – drawn from the experiences of clients who have deployed Data-driven Customer Experience by Capgemini.

Data-driven Customer Experience empowers enterprises to take full advantage of interconnected data while building trust, transparency, and long-term relationships with customers. It consolidates data from all relevant customer-first and enterprise functions into unified customer profiles. These profiles uniquely identify each customer and provide a personalized, global view of their relationship with an organization’s brands. It does this in real-time to turn raw data into more reliable, actionable insights. And it preserves privacy and ensures compliance with all relevant regulatory requirements.

With Data-driven Customer Experience, Capgemini’s Insights & Data professionals are ready to help organizations transform their customers’ journeys. Capgemini supports its clients through every step of the process – from creating the strategy and operating model to selecting the best mix of technology components and data platforms, to implementing the solution, training those who use it, and measuring its effectiveness.

Author

Marc De Forsanz

Insights & Data Global "Customer First" playing field Business Development and Portfolio leader. 

    5G and security: are you ready for what’s coming? 
    New risks and complex challenges require a comprehensive new strategy

    Chhavi Chaturvedi
    25 July 2022

    5G opens up all new avenues of attack – is your organization ready?

    Every tech revolution comes with risks, and 5G is no exception. From IoT applications to the 4G-to-5G transition, the scale of 5G usage is opening up an enormous surface area to potential attackers. The promise of high bandwidth and low latency in the coming years is extraordinary, but organizations that are slow to react to these threats are taking a gamble. Fortunately, there are a number of security measures that can substantially reduce these risks. Read on to learn how to keep pace with the security demands of 5G today.

    Security challenges

    Every promised benefit of 5G brings with it a corresponding risk. The number of connected IoT devices is growing at upwards of 18% per year, on course to pass 14 billion this year. Each new edge-computing device creates new vulnerabilities for bad actors to exploit. The decentralized nature of IoT products makes security measures difficult to implement at scale, while 5G’s greater bandwidth has the potential to fuel new DDoS attacks with the power to overwhelm organizations. And the expansive nature of 5G itself poses new risks. As the number of users increases into the millions and billions and networks expand to accommodate more devices, network visibility plummets. It becomes harder to track and prevent threats, especially against sophisticated attackers. Vulnerabilities across devices, the air interface, the RAN, backhaul, the 5G packet core and OAM, and the SGi/N6 and external roaming interfaces all need to be re-examined.

    Network Slicing is not enough 

    Many services today require specific performance guarantees – high throughput, low latency, high reliability – which network slicing provides by running multiple customized logical networks on shared infrastructure. In theory, network slicing should improve security – like the bulkheads on a ship, which contain a potential breach to one flooded compartment. This is the same logic behind IT network segmentation, an established best practice. However, just like network segmentation, network slicing alone does not guarantee that threats are contained. Without additional measures, they’re likely to pass seamlessly into the wider system. Network slicing also faces security challenges around resource sharing among slice tenants and slice security coordination, which are fairly straightforward to solve but do require attention.

    Mitigation Approaches 

    Businesses deploying 5G-connected equipment need an up-to-date set of security solutions capable of monitoring and protecting against the new generation of cyber threats. The specifics will vary for each organization, but the backbone of the new strategy may look something like the following:

    Security Edge Protection:

    Security edge protection is the foundation of 5G security, upon which all other strategic considerations rest. The following methods can help secure 5G edge installations:  

    • Encrypted tunnels, firewalls and access control to secure edge computing resources 
    • Automated patching to avoid outdated software and to reduce attack surface 
    • AI/ML technology to detect breaches and send alerts or act accordingly
    • Continuous maintenance and monitoring for the discovery of known and unknown vulnerabilities  
    • Securing the edge computing devices beyond the network layer 

    Zero trust architecture: never trust, always verify 

    Zero Trust Architecture (ZTA) eliminates implicit trust by continuously validating a set of actions at every step. Based on perimeter-less security principles, ZTA requires each asset to implement its own security controls. It includes security features such as: 

    • Continuous logging, continuous monitoring, alerts and metrics 
    • Threat detection and response 
    • Policies & permissions 
    • Infrastructure security & secure software deployment lifecycle (supply chain security) 
    • Data confidentiality from service providers of both hardware and software 
    • Container isolation 
    • Mutual authentication and TLS security

    Container-based technology

    Containers bring the potential benefits of efficiency, agility, and resiliency. Gartner expects that up to 15% of enterprise applications will run in a container environment by 2024, up from less than 5% in 2020. Containers are orchestrated from configurable central control planes, which are used to scale workloads up and down, collect logs and metrics, and monitor security. Containers bring a few unique security risks, but they are solvable.

    When containers run in privileged mode or as root, they provide attackers with direct access to the kernel, from which they can escalate their privileges and gain access to sensitive information. It is therefore essential to add role-based access control and limit permissions on deployed containers. It’s easy to run a container as a non-root user, simply by providing instructions in the Dockerfile. Two more ways to enhance container security are rejecting pods or containers that request privileged mode, or keeping privileged containers but limiting their access to namespaces.
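
    For illustration, here is a minimal sketch using the Docker SDK for Python (docker-py) to run a container as an unprivileged user with privileged mode refused; the image, UID, and capability choices are illustrative, and a running Docker daemon is assumed.

    ```python
    # Minimal sketch: run a container without root or kernel privileges.
    import docker

    client = docker.from_env()
    output = client.containers.run(
        "alpine:latest",
        "id",               # prints the effective user inside the container
        user="1000:1000",   # non-root UID:GID
        privileged=False,   # refuse privileged mode outright
        cap_drop=["ALL"],   # drop Linux capabilities the workload won't need
        remove=True,
    )
    print(output.decode())  # uid=1000 gid=1000 ... - not root
    ```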

    Automated operations and AI 

    The complexity of 5G infrastructure requires security applied at multiple levels. Handling security complexity – threats, risks, diverse devices, scaling, and more – manually is so difficult as to be impractical. Additionally, manual operations introduce an element of uncertainty which may in some cases be exploited. There is absolutely a place for human ingenuity. But increasingly the operations level needs to be automated.

    What about AI/ML technologies – are they helpful, or just hype? Currently, a bit of both. They already have a role in security, primarily in detecting irregularities. The next step in AI/ML-based security will involve deep learning, through which the system builds its own capabilities through experience – theoretically going so far as to predict threats before they’re deployed. Claims about revolutionary AI protection need to be considered very sceptically, but at the same time the potential for AI to fundamentally alter network security is real. This is a space to watch. 
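
    As a concrete example of the irregularity-detection role AI already plays, here is a minimal sketch using an off-the-shelf unsupervised model; the traffic features and numbers are invented for illustration.

    ```python
    # Minimal sketch: flag traffic samples that deviate from the baseline.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Hypothetical baseline: [packets_per_sec, unique_destinations]
    baseline = rng.normal(loc=[100, 5], scale=[10, 1], size=(500, 2))

    detector = IsolationForest(random_state=0).fit(baseline)

    samples = np.array([[102, 5],     # ordinary traffic
                        [900, 60]])   # volumetric burst, DDoS-like
    print(detector.predict(samples))  # [ 1 -1 ]: -1 marks the anomaly
    ```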

    Building on firm ground 

    The Capgemini Research Institute recently probed organizations’ preparedness for cyber-attacks and revealed a concerning disconnect: 51% of industrial organizations expect cyberattacks on smart factories to increase over the next 12 months, and yet nearly that same number (47%) report that cybersecurity is not a C-level concern. We see the lack of a comprehensive, system-wide approach to security as a serious long-term threat.

    It is tempting to describe security breaches as instantaneous, but in fact, an honest examination often reveals vulnerabilities that had been left out in the open for months or years, with no adequate security protection. Security you can rely on starts early, with solid fundamentals across people, process, and technology. It’s not easy, but it’s doable.  

    We can see the risks that come with 5G. Let’s put a security plan in place now. To learn more about our 5G security capabilities, contact us below. 

    TelcoInsights is a series of posts about the latest trends and opportunities in the telecommunications industry – powered by a community of global industry experts and thought leaders.

    Developing a quantum advantage includes joining an ecosystem of complementary partners

    Pascal Brier
    21 Jul 2022

    Business leaders with the right mindset will reap the rewards of an ecosystem approach to implement a successful quantum strategy and develop a long-term quantum advantage

    No single business can expect to manage every aspect of quantum exploration because of the scale of costs and skills involved. As we concluded in our last blog post, business leaders will need to look outside the organization to evaluate, prioritize, and implement a successful quantum strategy – and that’s where an ecosystem approach can pay dividends.

    Being part of an ecosystem is crucial to quantum success. Business leaders can use the ecosystem to create an amalgam of quantum expertise, to build the best possible platform, and to plug any skills gaps that appear.

    An ecosystem approach also ensures an investment in quantum is not overstretched. Financial resources will be a critical concern because quantum technology requires a broad range of expertise. It’s a pioneering area of technology led by specialists with knowledge in several key areas:

    Hardware specialists – Computers, sensors, and infrastructure, from tech giants such as IBM, AWS, and Google to smaller specialists such as Atom, Quandela, and Pasqal, including quantum cloud and hybrid architectures
    Software specialists – Quantum algorithm development, artificial intelligence, and machine learning from companies such as ApexQubit, which uses frontier technology in drug discovery
    Crypto agility and quantum cybersecurity specialists – Security protocols and standards supporting multiple quantum cryptographic primitives and algorithms, such as ISARA
    Startups and venture capitalists – Estimates suggest $5 billion of private capital has been invested in quantum technologies since 2002, with $3 billion of this in 2021 alone.
    Academic institutions – Technical institutes, such as Fraunhofer, and universities, including Cambridge and Instituto Superior Técnico, are providing cutting-edge research in quantum theory and practice.

    Strong industry examples point the way

    Multinational firms, including BMW and Goldman Sachs, are already working across these areas of expertise and developing an ecosystem approach to quantum. The result is an early foothold in a fast-evolving space that should result in a long-term advantage.

    Perhaps the most significant examples of a successful ecosystem approach so far are those that bring government and industry together to work on quantum challenges. Many of the applications that these partnerships are investigating concern issues with far-reaching consequences, such as climate change, environmental sustainability, industrial competitiveness, and economic growth.

    In these challenging scenarios, the consortium approach is becoming a norm, allowing a collection of knowledgeable parties to tackle intractable issues in partnership. These consortia are driven by core technology firms, blue-chip companies, governments, academic institutions, and other initiatives, such as Horizon Europe, which is the European Union’s research and innovation program.

    Take the example of EuroQCI, which is a European Commission initiative led by Airbus. This consortium includes a range of companies and research institutes that are exploring the future design of the European quantum communication network. Other examples of cross-institution consortia include the European Quantum Industry Consortium and the Quantum Economic Development Consortium.

    What’s more, a joined-up approach is already helping to forward research into quantum key distribution (QKD), which uses quantum mechanics to develop secure cryptographic protocols. The development of QKD technology is a significant challenge on a global scale. A large ecosystem of players, including hardware providers and software startups, is working to create quantum-based solutions.

    Across all these pioneering approaches, one thing remains constant: all organizations must ensure data is used in line with strong regulatory requirements. Standards-setting bodies, including NIST in the US and ETSI in Europe, are helping to set the agenda and create boundaries for the private and public sector organizations that are working together on quantum technologies.

    The right mindset for an ecosystem approach

    As outlined in our recent report Quantum technologies: How to prepare your organization for a quantum advantage now, business leaders who want to develop a long-term quantum advantage should explore how to become part of an ecosystem at the earliest opportunity. Successful companies will develop a proactive mindset that supports other ecosystem members as they work together to develop creative solutions to intractable challenges. We think the right mindset includes five key characteristics:

    Communication – Working with external partners and across internal business units, as quantum’s impact goes beyond R&D and affects operational activities and processes
    Trust – Sharing information, perhaps highly confidential and related to intellectual property, that might not normally be revealed
    Collaboration – Fostering a collegiate belief in the potential of complementary skills
    Imagination – A willingness to invest in meaningful research with a long-term view rather than just a focus on the top line or short-term revenue streams
    Participation – Being active in the wider community through initiatives such as industry events, meetups, and hackathons

    Business leaders with the right mindset will reap the rewards of an ecosystem approach.

    What new opportunities does the metaverse present for product and services companies?

    Dheeren Vélu
    20 Jul 2022

    A few years from now, we may test drive our new EV in a metaverse showroom. Once we buy it, we may also opt for a virtual version, which we can drive around virtual worlds. Perhaps the carmaker will throw in some sweeteners, such as exclusive access to live virtual events for its metaverse drivers.

    Predicting exactly how the metaverse will look is a fool’s game. But with $120bn of investment in 2022, we can be reasonably confident it will become a place where people interact and transact. McKinsey reckons that by 2030, over 50% of live events could be held in the metaverse, and 80% of commerce could happen at least partially there. That is a big opportunity for product and service companies.

    All this may sound scary, especially if you are a company that makes real things in the real world. You should not feel scared, for two reasons. Firstly, the metaverse is not as complicated as it sounds. Secondly, you will miss out on big wins if you let fear hold you back.

    What exactly is the metaverse?

    To understand the opportunity, we first need to agree on what the metaverse is. The concept will evolve, but we would describe it as follows:

    The metaverse is an umbrella term for a range of virtual environments – accessed via screens or headsets – in which multiple parties, represented as avatars, can interact and transact. In most cases, these environments are ‘persistent’, ie any change you make or ownership you acquire remains when you leave.

    Practically, these worlds are made up of an immersive front end – a 3D virtual landscape – and a backend infrastructure that validates transactions using tokens and blockchain to confer permanent records of ownership.

    What are the business opportunities in the metaverse?

    For product and service providers wishing to take advantage of this, there are two overlapping avenues. One is to build virtual equivalents of products – eg, a car or a sneaker – that can be used in the metaverse. The other is to buy ‘land’ and build your own space – a shop, a stadium, a village – where your customers can buy your products or experience your services.

    So, a sneaker company could offer a virtual add-on that lets buyers wear the shoes – with all the status they represent – as they go about different activities in the metaverse. But the digital world also offers the potential for more functionality – perhaps wearing a particular brand gives entry to exclusive virtual events, thus increasing its value in both worlds.

    You can also set up virtual shops to sell or showcase your products. The beauty of this virtual world is they can be anywhere – why sell luxury handbags in a busy high street when you can sell them from a tropical beach? Likewise, B2B companies could create virtual showrooms to demonstrate high-value equipment that would be hard to bring to a customer.

    Or you may want to branch out into whole new areas. We have seen sports brands offer fitness subscriptions via apps; the next step for them could be virtual fitness studios in the metaverse, where users can work out, buy clothes, and sit down with experts such as nutritionists or personal trainers.

    How to grow your business in the metaverse

    So, how should product and service companies – who are not themselves tech companies – go about benefitting from all this?

    The most important driver of success will be a value-focussed strategy. Don’t just jump into the technology without a plan. Decide what you are trying to achieve from the metaverse.

    For example, you could:

    • Create virtual versions of existing products and services that you can sell in the metaverse
    • Create virtual shops and showrooms to sell your real-world offer in more immersive ways
    • Reach a new generation of customers
    • Create complementary services that generate new revenue streams
    • Build virtual spaces, events, or communities where you can gather your audience
    • Form partnerships – eg, with virtual event promoters – that allow you to increase your exclusivity by giving your customers unique offers or opportunities

    Right now, the key will be experimentation. No one knows exactly how the metaverse will settle, so now is the time to get hands-on with the metaverse, brainstorm ideas for your business, test a few hypotheses, and develop proofs-of-concept for the most promising.

    Even if you don’t see high potential use cases immediately, they will come as you experiment and see what works. Like digitalization via IoT, products in the metaverse create the opportunity to collect detailed data on how users engage with your offer, allowing you to continually test and refine, kill failing projects quickly, and spot behavioral signals that hint at new opportunities.

    Experimentation with new and unproven technology may sound risky and costly. But a lot can be done cheaply at first. And if the tech revolution taught us anything, it is that focused experimentation and failing fast is the route to new revenue. You will pay one way or another – through the cost of experimenting, or the cost of falling behind.

    Don’t be afraid of the metaverse technology

    A major risk is that companies focus on the technology, get bogged down in technical complexities, and end up with no use cases. We urge you to focus on the business case and trust that the technology will support it.

    Metaverse tech is less complicated than it sounds. The building blocks – both of the virtual worlds (eg those provided by Unity or Roblox) and the contracts for virtual transactions – are becoming simpler and standardized. For more sophisticated offers, there is a growing pool of experts who can customize both. Decision makers should immerse themselves in the metaverse to get familiar with the possibilities, then contract technical people to make it happen.

    It is true that the metaverse still needs to mature. Right now, there are many sellers of digital land (eg, Decentraland, Sandbox) and knowing which one will be right for you is hard. This shouldn’t hold you back from picking the one that seems to be popular with your audience and experimenting. This will stand you in good stead as the industry evolves, and our prediction is that they will soon become interoperable, allowing your customers to take your virtual car or trainers between worlds.

    The next killer app

    Most of the ideas discussed above are ‘lift and shift’ – taking something that exists and creating a metaverse version. There are many examples already happening, and it is likely that most big businesses will have some metaverse offer by 2030.

    But the big opportunities will come from the things we haven’t even thought of yet. We don’t know what those are, any more than the early internet pioneers predicted Facebook or YouTube. But as a culture of innovation grows in the metaverse, someone – maybe you – will come up with the next killer app that changes the world.

    To discuss the opportunities or ideas outlined in this article, please contact the authors.

    Dheeren Vélu

    Head of Applied Innovation Exchange, AUNZ
    Dheeren Velu is Head of AIE and AI Leader at Capgemini ANZ, driving innovation at the intersection of technology and business. He leads the GenAI Task Force, delivering high-impact AI solutions. With deep expertise in AI and emerging tech, he’s a TEDx speaker, patent holder, and Chair of RMIT’s AI Industry Board, focused on transforming industries and the future of work.

    Nitin Dhemre

    Immersive Stream Lead of the Capgemini Metaverse Lab
    Nitin is Director, Innovation Strategy & Design, frog Growth Strategy Paris, and Immersive Stream Lead of the Capgemini Metaverse Lab

      How can the retail industry benefit from Edge IoT?

      Vijay Anand
      6 Jul 2022

      Today, shoppers waste a lot of time searching for products, and retailers must physically check the quantity of a product on the shelf at certain times during the day. If a product is out of stock, and this isn’t recognized until a customer comes looking for it, the retailer can take a big loss. The potential of the retail market is enormous, but the industry is well aware of the major drawbacks it faces today, such as understanding consumer habits, reducing checkout times, food spoilage, managing product shelves, preventing theft, monitoring goods, tracking energy utilization, managing in-floor navigation, and detecting crowded areas. Problems like burglary, worker theft, paperwork errors, vendor fraud, and monitoring shipments require quick, real-time analytics instead of storing data and analyzing it later with standard surveillance equipment. Another challenge is spoilage, especially in the food retail sector. For instance, one large retail chain reported a nearly $2 billion loss due to wasted and spoiled food caused by a legacy refrigeration system; specifically, alarms from controllers on refrigerators were slow to reach the operations and maintenance team.

      To address these challenges, there is a strong inclination towards the digitalization of day-to-day operations in retail stores, which has opened up new use cases for the industry. Many retailers have started the digital transformation from analog, brick-and-mortar locations to digitally enabled, immersive customer experiences at various locations across the globe. As shoppers’ preferences, based on their lifestyle choices, are changing rapidly in the retail market, it’s high time retailers adopted digital transformation. There are many ways for retailers to provide a fabulous experience for shoppers, allowing them to choose and buy products based on their preferences in-store, as shown in Fig 1. What’s more, the COVID-19 pandemic has made many retailers rethink their strategies and speed up digitalization.

      Fig.1: “Smart Retail – Evolving ecosystem” (Source: Internet/Public)

      To realize the vision of digital transformation in the retail industry, Consumer IoT (CIoT) is driving retail businesses to convert standard retail stores into smart retail stores that help retailers understand a customer’s tastes, needs, and habits in real time, while they shop. For example, retail store owners can integrate different forms of sensors in key zones of their stores and connect them to a retail gateway, also known as an Edge gateway, to perform real-time analysis of the product-sales data captured by these sensors. The CIoT, along with other emerging connected technologies such as data analytics, Artificial Intelligence (AI), and Machine Learning (ML), is changing the shape of the retail industry. Many retailers are ready to make the changes required to implement intelligence in their stores, helping them to improve store operations, enhance customer experience, drive more business conversions, and solve day-to-day problems. The CIoT can enable retailers to predict customers’ behavior and provide more details about products and services, making shoppers more inclined to purchase – which in turn helps retailers increase their daily sales and profits.

      To realize this vision, retailers can consider many emerging technologies to not only enhance consumer experience, but also expand data collection and support data-based operations management. Some emerging technologies such as Edge IoT, AI, ML, and analytics have the capability to break down boundaries between brands, products, and customers.  With many retail stores considering IoT, the market is expected to grow to $94.44 billion through 2025[1]. Current trends show most retail stores are focusing on leveraging these emerging technologies to create ‘connected retail’ services and better shopping experiences for their customers.

      The CIoT is revolutionizing the retail industry, facilitating a greater in-store shopping experience and enhanced business engagement. IoT technology for retail can enable retailers to boost sales, increase customer loyalty, and allow shoppers to enjoy a more convenient and satisfying shopping experience. As different retailers might focus differently on achieving a connected retail store, many technologies, protocols, and hardware platforms (as shown in Fig 2) have been identified to build a smart retail ecosystem. These enable use cases for an improved shopping experience, automated business processes, better stock inventory, and supply chain management.

      Fig.2: Smart Retail ecosystem covering different technologies, protocols, and hardware platforms (Source: Capgemini + internet)

      Millions of data points must be processed at the edge of the retail network, instead of consuming a large share of a CSP’s network bandwidth by transmitting data to its data center or cloud platform and back to the retail store. Hosting the computing at the edge of the network helps retailers engage with customers instantly, leading to much higher customer satisfaction: compute at the edge processes real-time data immediately and returns intelligence quickly. An Edge-based system design for retail stores includes Edge sensor nodes, an Edge gateway, and an Edge cloud (see Fig 3), based on many wireless technologies and relevant protocols. The Edge-based approach to retail infrastructure covers not only the Edge design itself but also the methodology for addressing key challenges and other important aspects: zero-touch provisioning of IoT devices, theft prevention, energy management, secure transmission of data within the retail network, effective data compression to manage in-store network bandwidth, and seamless network switching and bonding between wired (fiber optic) and wireless (4G/5G) networks, based on the 3GPP ATSSS (Access Traffic Steering, Switching and Splitting) standard, for remote access by retailers, allowing continuous interaction.

      Fig.3: Smart Retail Ecosystem based on EDGE Concept (Source: Capgemini)
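
      The filter-at-the-edge pattern above is easy to picture in code. Below is a minimal sketch of an Edge gateway reducing raw shelf-sensor readings to a handful of compact alerts before anything crosses the WAN link; the threshold and payload fields are illustrative.

      ```python
      # Minimal sketch: process sensor data at the edge, forward only alerts.
      RESTOCK_THRESHOLD = 3

      def process_at_edge(readings):
          """Reduce many raw readings to a few actionable alerts."""
          alerts = []
          for r in readings:
              if r["units_on_shelf"] <= RESTOCK_THRESHOLD:
                  alerts.append({"shelf": r["shelf"], "action": "restock",
                                 "units_left": r["units_on_shelf"]})
          return alerts

      raw = [{"shelf": "A1", "units_on_shelf": 12},
             {"shelf": "B4", "units_on_shelf": 2}]   # nearly out of stock
      print(process_at_edge(raw))  # only the B4 alert leaves the store
      ```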

      While the usage of Wi-Fi networks in retail stores offers simplicity, reduced administration, and greater cost-effectiveness, it suffers from issues around coverage, capacity, reliability, security, and handoff. As the 5G market emerges, retail networks will soon be enabled with a 5G radio interface. This 5G network will be a dedicated private wireless network for retailers and shoppers, delivering secure, reliable, and efficient communication services to the various products connected with the 5G network via an Edge gateway. In the world of retail, the adoption of new technology has never been more extensive, and retailers are left with no choice but to deliver a customized and seamless experience to customers, with varied services such as in-store navigation, digital signage, smart mirrors, quick checkout, video surveillance, and electronic shelf tags – the list is ever-growing. Using RFID tags, cameras, mobile apps, and other wireless technologies like LoRa, along with 5G, retail stores can approach 100% inventory accuracy, minimize unexpected out-of-stock incidents, enable end-to-end store inventory management, and increase sales margins. CIoT-based solutions can give retailers the ability to track collections of products, analyze product popularity, and check information on sold products at any time, including brand name, price, and description. With machine learning software along with 5G, retailers can automate operations, optimize various processes, reduce their operating costs, and deliver a new, personalized experience to their valuable customers.

      Smart retailing is the emergence of a new ecosystem for providing a newly personalized experience for shoppers. Retailers are working hard to super-charge the in-store experience and the convenience they can offer shoppers visiting their stores. Retailers across the globe have taken various steps to realize the vision of a smart retail store, looking at “hot” future directions. One of these, the Edge-based concept presented in this blog, covers Edge nodes, the Edge gateway, and the Edge cloud, which together enhance operational efficiency and customer experience to create smart and sustainable retail stores. The various killer applications identified as part of smart retail activities can be measured through a set of indicators via the Edge sensor nodes, Edge gateway, and Edge cloud to address different retail challenges and to provide the services shoppers and retailers expect. Today, every activity in the retail value chain – including logistics and warehousing, centralized procurement, marketing, operations, customer services, management, and cash flow – can be incorporated into a digitalized and intelligent platform, where Edge-based computing is just one of countless emerging technologies transforming mid-scale and large-scale retail stores. It has become a highly critical technology adoption cycle for retailers.

      In retail stores, Edge computing provides a new paradigm for running existing applications like POS and inventory management. At the same time, it is “oven ready” for other applications like augmented reality, smart shopping carts, smart shelves, surveillance, in-store navigation, energy management, digital signage, and many more. Edge computing and its associated software-defined concepts also give retailers the power to make things happen very quickly and to expand retail infrastructure with zero-touch provisioning, without having to rely on technical engineers in the retail store for the installation and manual configuration of IoT devices. The Edge-based solution design for retailers is all about transforming the way workloads are hosted and managed in retail stores, with various advantages such as:

      • Low latency for critical / time-sensitive applications
      • Reducing the dependence on network bandwidth from CSPs
      • Improving the data privacy and security of retail stores

      Fig.4: Retail Network Design based on 5G

      In summary, a 5G-based retail network can be enabled for various use cases, as shown in Fig 4, under three different categories (eMBB, mMTC, and URLLC) as per 3GPP R16/R17 standards. Key requirements like QoS, high-speed and seamless connectivity, low latency, mobility, and security play an important role and address many of the challenges foreseen in other technologies. Each retail store can have its own 5G RAN network, operating in a private mode without any external interference, and can use it exclusively. Only authorized people (retailers) and end devices are able to access the private 5G network, and any data generated within the store is processed locally within the dedicated 5G retail network, ensuring high security and data privacy.

      Emerging technologies such as 5G, AI, augmented reality, and Edge data analytics will soon become de-facto standards of the retail business model, managing various retail operations as well as supply chain integration. Retail network infrastructure will become more sophisticated and integrated with the combination of these smart technologies and will create more autonomous outlets like Amazon Go. Technologies like robotics and machine/deep learning mean retail employees will be able to focus more on customer expectations and improve their stores’ performance to create the store of the future – smart, responsive, connected, and secure.

      Vijay Anand

      Senior Director, Technology, and Chief IoT Architect, Capgemini Engineering
      Vijay plays a strategic leadership role in building connected IoT solutions in many market segments, including consumer and industrial IoT. He has over 25 years of experience and has published 19 research papers, including IEEE award-winning articles. He is currently pursuing a Ph.D. at the Crescent Institute of Science and Technology, India.

        Quantum’s a step change in technology with distinct challenges; take advantage of specialists

        Julian van Velzen
        14 Jul 2022

        Quantum is challenging. We believe a range of factors, including partners and collaborations, are important to understand for businesses that are starting to get involved in quantum

        As we discussed in our previous blog post, What if quantum technology could solve the world’s biggest challenges, quantum is predicted to have a huge impact on the world during the next decade and beyond. Quantum computing, for example, uses quantum properties and algorithms to perform complex computations exponentially faster than classical supercomputers, contributing to optimal molecular design in drug discovery, maximized fluid dynamics in aerospace, and more accurate financial risk models. And quantum sensing can be used to make advances in medical diagnosis and autonomous transport. Business leaders looking to secure the benefits of this quantum revolution will require an entirely new technology stack, with fresh partners, collaborations, and policies.

        Let’s be clear, quantum computing doesn’t fit neatly into the traditional model of classical computing. Just as deep learning evolved from a combination of processing power and neurobiological advances, and synthetic biology is evolving from a mix of computer science and nanotechnology, so quantum is being born from a new blend of disciplines, tools, and skills.

        This fresh combination of requirements is a step change in technology. It requires access to financial and research resources that are far beyond the means of a single enterprise. For now, the baton for quantum has been picked up by technology companies. These resource-rich giants, such as IBM, are leading the charge and developing nascent quantum computers.

        Riding the wave – quantum technology is not a single entity

        Yet it’s also important to recognize that quantum technology is far from a single entity. While computing receives the most attention, other quantum technologies – notably quantum sensing and quantum communication – are forming key elements of this revolution. However, each of these quantum technologies is at a different stage of maturity.

        That variability in development creates an issue for business leaders who might be keen to explore quantum. Yes, it will in time change how organizations find answers to intractable questions, but quantum is not yet mature enough to meet ever-growing expectations around its potential.

        So, when should business leaders dive into quantum? Is there a way to ride the crest of the quantum wave when it does start to mature? Our recent report, Quantum technologies: How to prepare your organization for a quantum advantage now, looks at the current uptake of quantum initiatives within business, and we believe there is a range of factors that are important to understand for businesses that are starting to get involved in quantum.

        Complex, interdisciplinary, and expensive

        Quantum is both multi- and inter-disciplinary. It draws on a range of science and technology areas, from quantum mechanics and quantum physics to advanced artificial intelligence, security, and infrastructure, through to mathematics. As quantum evolves and matures, and moves into an operational stage, it will become increasingly interconnected with an organization’s existing technology stack. Companies will need to address this issue as they move towards a hybrid infrastructure.

        Quantum technologies are also expensive and complex to design. They are built with precision, and they operate, in many cases, at very low temperatures. Qubits – or quantum bits, the basic units of information in quantum computing – are fragile, unstable, and prone to errors. The expense involved in creating complex quantum systems means deep pockets are crucial – and that’s where well-funded tech giants and venture capital-backed startups play a crucial role in leading the quantum charge.

        Developing relevant use cases

        While quantum exploration has been mainly focused on research work in the lab until now, the technology will slowly move into the enterprise, and this transition will rely on the identification of potential realistic use cases.

        However, it will not be easy to identify where the quantum advantage lies. It’s hard to assess the value of quantum because the technology remains at a nascent stage of development. And without identifying a strong sense of commercial or competitive value, business leaders will struggle to allocate the right level of research resources and funding.

        Further complexity comes from the fact that specialist skills are needed to develop potential use cases. This process relies on a combination of talented individuals, such as scientists, engineers, and technologists, and subject-matter experts from across industry. These highly capable specialists are in short supply because of their deep expertise across a range of specialized areas.

        Moving towards an ecosystem approach

        From technical concerns to cost considerations to skills gaps, business leaders who want to get involved in quantum face a range of complex considerations. No single company will be able to deal with these quantum challenges unless it creates an ecosystem of partners that can provide the necessary technical resources and deep expertise.

        In the final blog in this series, from Pascal Brier, our Chief Innovation Officer, we explain how business leaders can create an ecosystem of complementary specialist partners that will help their organization find its quantum advantage.

        Author

        Julian van Velzen

        Quantum CTIO, Head of Capgemini’s Quantum Lab
        I’m passionate about the possibilities of quantum technologies and proud to be putting Capgemini’s investment in quantum on the map. With our Quantum Lab, a global network of quantum experts, partners, and facilities, we’re exploring with our clients how we can apply research, build demos, and help solve business and societal problems that till now have seemed intractable. It’s exciting to be at the forefront of this disruptive technology, where I can use my background in physics and experience in digital transformation to help clients kick-start their quantum journey. Making the impossible possible!

          AI-infused innovation for automotive data

          Jean-Marie Lapeyre
          14 Jul 2022

          AI is integral to many current production-line automation initiatives designed to increase efficiency and quality.

          The automotive industry is innovating rapidly in response to multiple disruptions. To focus first on the customer perspective, vehicle buyers increasingly want the whole purchasing experience to happen online, from initial research right through to buying. The ownership model, too, looks set to change, with the increasing use of carpooling, ride-hailing services, short-term rentals, and community fleets. And relationships between manufacturers and customers are being extended: drivers will continue to receive services and over-the-air updates throughout their vehicle’s life.

          Frugal innovation through applying AI to existing data

          Innovation is critical to dealing with industry changes, but innovation does not necessarily mean invention; often, it’s about reusing an idea or a resource in a different context. This frugal approach to innovation, called “Jugaad” in India, is a theme of Capgemini’s TechnoVision for Automotive 2022 playbook. The application of AI is a prime example of how the automotive industry is innovating by making better use of what it has. Of course, AI needs data, and this is where the frugality comes in, because much of the necessary data often exists already. For example, data generated by designing and building a vehicle was often discarded after the product’s completion, but with software increasingly determining which options an individual vehicle offers, its value is clear. AI can help companies make the most of data in a range of contexts.

          Inside the vehicle: Perhaps the best-known use of AI in automotive is to automate the task of driving. Even if fully autonomous vehicles are still a few years away, Advanced Driver-Assistance Systems (ADAS) features are already appearing. AI can open up a whole world of seamless driver interactions and can support safety, reliability, and robustness.

          On the factory floor: AI is integral to many current production-line automation initiatives designed to increase efficiency and quality. AI-based systems can help to analyze camera outputs, carry out shop-floor quality checks on the assembly line, optimize truck loading to improve space utilization, or power augmented-reality goggles to minimize operator errors.

          In the design workshop: With AI, revolutionary propositions can emerge from data, including new elements for use by human designers. AI can also evaluate solutions generated by humans or machines and recommend the most promising.

          In the back office: For strategic planning purposes, AI-enabled processes can assist with rationalizing the choice of vehicle configurations. AI could even help the human resources function because, when talent is scarce, AI can make the most of the people you have.

          AI helps optimize ADAS

          General Motors is assessing the potential of an AI-enabled pattern-recognition technology to accelerate the design of ADAS. The Multi-node Evolutionary Neural Networks for Deep Learning rapidly evaluates convolutional neural networks for use in pattern recognition. This approach could, for instance, reveal ways for cars to quickly and accurately assess their surroundings in order to navigate safely through them.

          Sharing safety data

          Capgemini has been working with Volkswagen and Audi to demonstrate the value of the German Federal Government’s Mobility Data Space, of which Volkswagen Group is a founding member. An early use case is Local Hazard Information, which provides aggregated event data on traffic hazards collected from vehicle sensors in the Audi fleet. This data could be used by a navigation service to warn road users of upcoming danger spots in near real-time.

          Driving into the future with data and AI

          The industry needs to make sure that the data required to power AI is available in the right form, at the right place, and at the right time. Various data-sharing initiatives are underway to help this happen. Once we organize the data correctly and make it available for the right AI applications, the sky’s the limit – perhaps literally. AI-enabled progress in drones could lead to the development of cars that are not only autonomous but also hover in the air. Watch this space!

          Interesting read?

          Capgemini’s innovation publication, Data-powered Innovation Review | Wave 4, features 18 such articles crafted by leading Capgemini and partner experts, sharing inspiring examples of data-powered innovation – ranging from digital twins in the industrial metaverse, “humble” AI, and serendipity in user experiences, all the way up to permacomputing and the battle against data waste. In addition, several articles are written in collaboration with key technology partners such as Alation, Cognite, Toucan Toco, DataRobot, and The Open Group to reimagine what’s possible.

          Author

          Jean-Marie Lapeyre

          EVP and Chief Technology & Innovation Officer, Global Automotive Industry
          Jean-Marie Lapeyre works with automotive clients to develop and launch actionable technology strategies to help them succeed in a data and software-driven world.

            Next-Gen BSS evolution for 5G networks and IoT

            Ashutosh Shukla
            11 Jul 2022

            Previously, Business Support System (BSS) software supported a specific set of use cases and was designed as a monolithic package. In the future, 5G and IoT will allow Communication Service Providers (CSPs) to offer many critical use cases, which demand that BSS infrastructure be automated from both the front end and the back end.

            This is essential given that service providers will be working with several different industry verticals and seamlessly onboarding a high number of partners to monetize their 5G and IoT investments. With 5G demanding more automation to manage volume and scale, the BSS needs to offer flexible, agile, and simplified management. The CSP-as-a-platform vision is that partners will be able to self-design and order services on the platform, with all necessary network fulfillment, security, policy, charging, billing, and customer interaction provided as required. To accomplish this, CSPs must digitize their existing BSS. Applying a microservices approach to BSS, combined with DevOps practices, is the best way to achieve this transformation. A microservices architecture decomposes a monolithic application into a set of small autonomous services that are independently deployable. Each microservice runs in its own process and is designed around a single business capability, which provides the flexibility to offer new use cases in a short time.
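
            As a toy illustration of the microservices idea, the sketch below exposes one BSS capability – charging – as its own small service; the endpoint, data model, and choice of Flask are illustrative assumptions, not a reference architecture.

            ```python
            # Minimal sketch: one business capability (charging) as its own
            # independently deployable service.
            from flask import Flask, jsonify, request

            app = Flask(__name__)
            balances = {"subscriber-1": 10_000}  # megabytes left (toy data)

            @app.post("/charge/<subscriber_id>")
            def charge(subscriber_id):
                used = int(request.json["megabytes"])
                balances[subscriber_id] -= used  # the single capability
                return jsonify(remaining=balances[subscriber_id])

            if __name__ == "__main__":
                app.run(port=8080)  # deployed and scaled on its own
            ```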

            Evolved BSS enhancements and new capabilities provide the opportunity for operators to develop new business models and services for 5G and IoT. Along the way, this journey opens up substantial new revenue streams in verticals such as industrial automation, security, health care, and automotive.

            Service providers are required to enhance their existing BSS with cloud native and microservices architecture, along with an IT-based 5G core network. This transformation will not only help operators achieve agility, flexibility, and scalability, but also derive granular insights, which can be leveraged to deliver a superior customer experience and enable new business streams.

            The impact of AI on emotional intelligence in the workplace

            Jonathan Kirk
            11 Jul 2022

            The application of AI is increasing employee and organizational focus on unique human cognitive capabilities that machines simply cannot master. Emotional intelligence is one such area that AI and machines find hard to emulate – making it an essential skill set in today’s age.

            Artificial intelligence (AI) is becoming more prevalent in our lives – both at work and at home. While many traditional job roles within organizations have already been automated, more sophisticated AI and machines are supplementing human intelligence and helping the human workforce to evolve their skills and roles.

            One example of this is how AI can also be used to make our workforces more emotionally aware. Potential applications could include:

            • What is the best new role for an individual based on their experience?
            • How can we ensure the most efficient and slick retraining programs?
            • How can we provide a more personalized experience for our people and customers?

            How do we make AI emotionally aware?

            Given increasing customer demand for more meaningful and personalized experiences, we can easily see how customer and agent emotions could impact these experiences – considering not only what customers want, but also understanding how they feel in that moment, and modifying the customer journey based on those feelings.

            When we provide a recommendation based on a customer query, we anticipate a set of feelings and thoughts that govern that behavior and the actions we take. And behind these actions are thousands of emotionally aware judgments we make.

Currently, there are two ways an AI can learn:

            • The first is using known outcomes to train a model that finds patterns and data trends to give the best result (method 1).
• The second involves observing our environment and making decisions accordingly. The outcomes of these decisions teach us how to make better decisions, and so on (method 2). This is the way humans learn, and it's this flexibility that enables us to respond to new stimuli and make new decisions. Both methods are sketched below.
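To make the contrast concrete, here is a minimal sketch of both styles in Python – method 1 as learning from labeled outcomes, method 2 as epsilon-greedy learning from observed rewards. The toy features, labels, actions, and reward function are illustrative assumptions only.

import random
from collections import defaultdict

# Method 1: learn from known outcomes (labeled examples). A toy "model"
# that memorizes the majority label seen for each feature pattern.
def train_method1(labeled_examples):
    counts = defaultdict(lambda: defaultdict(int))
    for features, label in labeled_examples:
        counts[features][label] += 1
    return {f: max(labels, key=labels.get) for f, labels in counts.items()}

# Method 2: learn by acting and observing rewards (epsilon-greedy).
def run_method2(actions, reward_fn, steps=1000, epsilon=0.1):
    value = {a: 0.0 for a in actions}   # estimated reward per action
    pulls = {a: 0 for a in actions}
    for _ in range(steps):
        if random.random() < epsilon:
            a = random.choice(actions)         # explore
        else:
            a = max(value, key=value.get)      # exploit best-so-far
        reward = reward_fn(a)
        pulls[a] += 1
        value[a] += (reward - value[a]) / pulls[a]  # running average
    return value

# Toy demo: labels encode past human judgments; rewards encode live feedback.
model = train_method1([(("short", "caps"), "angry"),
                       (("short", "caps"), "angry"),
                       (("long", "lower"), "calm")])
prefs = run_method2(["discount", "apology", "callback"],
                    lambda a: 1.0 if a == "apology" else 0.3 * random.random())
print(model, prefs)

Note that method 1 can only reproduce the judgments encoded in its labels, while method 2 starts out ignorant and improves only as feedback accumulates.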

If we use method 1, the AI agent doesn't understand the customer's emotion and act on it; instead, it applies the emotional intelligence of previous human agents to a similar problem – and, therefore, isn't emotionally aware. Even if we train the AI to recognize emotions such as anger or happiness, the machine learns our interpretations of those emotions from the labels we assign, not from the data it is receiving. If we use method 2, until the AI agent has accumulated enough experience to learn effectively, talking to it would be like talking to a child.

A better approach is to combine methods 1 and 2 – build an AI agent that uses current outcomes and labeled examples, and then, as more data is collected, allow it to learn new patterns on its own. The AI doesn't need to know which responses come from angry people; it will associate all similar responses together and call the group whatever it likes. The AI then offers bespoke solutions based on similar behavior within the profile, learns from the responses, and records feedback to improve the outcomes each time.
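A minimal sketch of this hybrid, assuming scikit-learn and toy two-dimensional feature vectors: a model is seeded with labeled outcomes, and new, unlabeled responses are then clustered so the system forms its own groupings, with per-group feedback refining the offers over time.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Step 1 (method 1): seed with labeled outcomes recorded by human agents.
# The 2-D feature vectors and the 0/1 labels are illustrative assumptions.
X_labeled = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
y_labeled = np.array([0, 0, 1, 1])  # e.g., 0 = "escalate", 1 = "self-serve"
seed_model = LogisticRegression().fit(X_labeled, y_labeled)

# Step 2 (method 2): as unlabeled responses accumulate, group them. No
# human label such as "angry" is needed -- the system simply clusters
# similar responses and learns per-cluster which offer works best.
rng = np.random.default_rng(0)
X_new = rng.random((200, 2))
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_new)

# Route new interactions with the seeded model, then refine offers over
# time using recorded feedback per cluster (feedback loop not shown).
print(seed_model.predict(X_new[:5]), clusters[:5])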

            How is an emotionally aware AI engine better than what we have today?

            AI brings universal benefits such as consistency, repeatability, and scale – but it also enables us to understand empirically the role of emotions in customer interactions.

We can quickly change the offer to the customer if their emotion shifts mid-correspondence, and we can try out completely novel solutions and observe the emotional response to them. The result is a flexible, stable agent that can handle complex customers and understand how they feel.

            Going back to the impact of AI on the workplace, what does a human agent do when replaced by a non-human agent? They can either retrain to manage the AI, spend more time innovating solutions for the business, or be available if the customer wishes to speak to a human agent – all of which are higher-value activities.

Indeed, perhaps the most important capability of an AI agent is knowing when to revert to a human agent based on how the correspondence is progressing. After all, if the context is highly emotional, people prefer to talk to a human agent.

            As a closing thought, perhaps we are simply living through a transition period, which will end when we can no longer discriminate between AI and human agents?

            Author

            Jonathan Kirk, Data Scientist, I&D Insight Generation, Capgemini’s Insights & Data

            Jonathan Aston

            Data Scientist, AI Lab, Capgemini Invent
            Jonathan Aston specialized in behavioral ecology before transitioning to a career in data science. He has been actively engaged in the fields of data science and artificial intelligence (AI) since the mid-2010s. Jonathan possesses extensive experience in both the public and private sectors, where he has successfully delivered solutions to address critical business challenges. His expertise encompasses a range of well-known and custom statistical, AI, and machine learning techniques.

              Governments and the public sector lead the cloud sovereignty debate

              Stefan Zosel
              11 Jul 2022

              As volumes of cross-border data proliferate on cloud platforms, how are governments and public sector organizations both regulating and planning for their own use of sovereign clouds?

With worldwide spending on cloud services expected to cross the $1 trillion threshold by 2024 [1], cloud sovereignty is increasingly part of the strategic mix when defining cloud journeys. Who — or perhaps more pertinently, which organization — has control over the data held in public clouds? And where does responsibility lie when there are operational or technology issues in the face of political or geographical unrest?

In its report, The Journey to Cloud Sovereignty, the Capgemini Research Institute evaluates evolving trends, awareness, and readiness of organizations for cloud sovereignty. So, first let me clarify that a sovereign cloud is a cloud computing environment that is owned, deployed, governed, and managed locally or regionally within a single nation or jurisdiction.

              The report draws on the findings of a survey of senior executives working in different functions from 1,000 organizations across multiple sectors, including 200 in government and the public sector. What’s clear is that governments are leading the charge to cloud sovereignty, with a range of regulatory developments shaping the way forward.

              Further, as well as shaping regulations pertaining to cloud use, government and public sector bodies are among the leaders in pursuing (or considering) a sovereign cloud in their organizations. For example, 76% of government/public sector respondents to the survey believe a sovereign cloud will be adopted in their organization to ensure compliance with the regulations and standards of the nation/state/local government, versus 71% across all-sector respondents.

              Mitigating risks

Why the need for a sovereign cloud? The key idea is to respond to each organization's desire for control, choice, and autonomy as cloud adoption accelerates globally. In some instances, even the cloud provider is obligated to be of local origin to mitigate certain risks, among which, according to a recent European Commission communication, are threats to cybersecurity, supply vulnerabilities, and unlawful access to data by other countries or suppliers. Indeed, security or resilience-related concerns with public cloud providers were cited by 74% of public sector participants, marginally higher than the 73% all-sector average.

              Similarly, the threat posed by potential exposure to extra-territorial laws and/or the possibility of data access by foreign governments owing to a vendor’s location of origin was cited as a concern by 68% of public sector respondents. So, it is no surprise that 69% of government/public sector survey respondents believe a sovereign cloud will be adopted in their organization to ensure immunity from extra-territorial laws and regimes.

We can see provisions to mitigate such risks already in existence in some countries. In the US, for example, the 2018 CLOUD Act gave law-enforcement authorities access to data held by US-based cloud service providers, and this extends to non-US firms that are subsidiaries of a US cloud or IT service provider, even if headquartered outside the US.

              Strategies being developed

Although cloud sovereignty is being embedded as part of overall cloud strategies across all sectors, many organizations surveyed for the report feel some uncertainty about the next steps. For example, 28% of organizations say they need more clarity on this topic to form a cloud-sovereignty strategy, and only 3% of public sector organizations say they have a well-defined cloud-sovereignty approach.

Even the definition of cloud sovereignty is subject to different interpretations. Nearly 40% of government and public sector organizations see it as a combination of public and private cloud (including vendors of non-local origin) and data localization within a country or region's borders at locally approved data centers, whereas 15% view it as the exclusive use of cloud providers based in the same legal jurisdiction and storing data within a country or region's borders. This latter model might well be hindered by the thinking on the part of the 59% of government/public sector respondents who believe that current local cloud solutions have performance-related issues compared to existing global solutions.

Recommendations for unlocking benefits

What's in it for my organization? It's a question I'm often asked, and one that the survey also investigated. 68% of government/public sector organizations believe a sovereign cloud provides a trusted and safe environment for data, while 61% cite the ease of sharing data with trusted ecosystem partners as a benefit. An example of such an ecosystem is GAIA-X, the public/private consortium initiative established in 2020, comprising cloud suppliers, businesses, and the public sector, to create a unified ecosystem of cloud and data infrastructure and services for the European Union.

              So how can governments and the public sector accelerate their path to achieving benefits like these? We recommend building the ‘move-to-sovereign’ strategy on the following four pillars:

              • Define sovereignty objectives and compliance requirements:
                • identify your sovereignty objectives based on the three elements of cloud sovereignty (data sovereignty, operational sovereignty, and technical sovereignty)
                • understand the rules and regulations concerning sovereignty and the real facts behind them — the privacy and protection of different types of data will demand varied levels of security in the cloud depending on a range of factors, such as approaches to risk, innovation ecosystems, geo-political affiliations, digital readiness to share data, and more
                • track key developments in the cloud and data sovereignty space; continuously assess risk exposure; and set up a compliance organization.
• Assess cloud providers through a sovereignty lens, embracing data sovereignty (for data residency, controls, transparency, storage, back-ups, etc.), operational sovereignty (for security, compliance, and operational resilience), and technical sovereignty (to assess interoperability, migration features, and clear exit policies/processes).
• Align for a flexible cloud architecture. Identify your sensitive workloads and most viable use-cases — staying aware of the types of data that are hosted in the cloud and the importance of different data types for your organization, with some data points, such as citizen data, being highly sensitive. Consider end-to-end encryption, as well as key-management solutions (a minimal sketch follows this list). At the same time, evaluate hybrid options, and prepare for a multi-cloud architecture by understanding the potential as well as the challenges it brings.
• Develop the potential of sovereign cloud by exploring its value proposition in terms of trust, security, and collaboration through ecosystem participation. Within the "trusted cloud" environment, sovereign cloud can help in developing new solutions, especially in data-sensitive sectors like the public sector and healthcare. The French Government, for instance, is planning to digitize public-sector initiatives with citizens and institutions using its sovereign cloud infrastructure.
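On the encryption point above, here is a minimal sketch of the principle: data is encrypted with a locally held key before it ever reaches a public cloud, so the provider only sees ciphertext. This assumes Python's cryptography package; in practice the key would live in an organization-controlled KMS or HSM, and the helper names are illustrative.

from cryptography.fernet import Fernet

# Key generated and held inside the organization's own jurisdiction;
# in production it would sit in an organization-controlled KMS or HSM.
local_key = Fernet.generate_key()
cipher = Fernet(local_key)

def store_in_cloud(record: bytes) -> bytes:
    # Encrypt before upload: the cloud provider only ever sees ciphertext.
    return cipher.encrypt(record)

def retrieve_from_cloud(blob: bytes) -> bytes:
    # Decrypt locally after download, using the locally held key.
    return cipher.decrypt(blob)

ciphertext = store_in_cloud(b"citizen record: highly sensitive")
assert retrieve_from_cloud(ciphertext) == b"citizen record: highly sensitive"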

              Accelerating cloud adoption

There is no doubt that cloud adoption has ramped up enormously over the past two years. The much-publicized work-from-home model has seen organizations relying on cloud services to support remote workforce collaboration, productivity, resilience, and more. This acceleration in cloud adoption has also revealed critical vulnerabilities and emphasized the importance of secure data storage and control. The solution? Cloud sovereignty as a means of maintaining physical and digital control over strategic assets, including data, algorithms, and critical software.

We are still at an early stage on the sovereign cloud journey, with many questions remaining, such as those concerning data localization, ownership, traceability, and access controls, along with the role of an open-source approach in enabling transparency, and whether a solution allows applications and data to be moved from one cloud-computing environment to another with minimal disruption. These questions — and many more — are gradually being answered, and I am looking forward to the next stage of the journey.


              [1] IDC, “Cloud adoption and opportunities will continue to expand, leading to a $1 trillion market in 2024,” October 2020.

              Author

              Stefan Zosel

              Capgemini Government Cloud Transformation Leader
              “Sovereign cloud is a key driver for digitization in the public sector and unlocks new possibilities in data-driven government. It offers a way to combine European values and laws with cloud innovation, enabling governments to provide modern and digital services to citizens. As public agencies gather more and more data, the sovereign cloud is the place to build services on top of that data and integrate with Gaia-X services.”