
The growing need for private 5G networks in manufacturing plants

Vijay Anand
28 Jul 2022
capgemini-engineering

The IoT world is evolving today through the enablement of technologies such as data analytics, 5G, edge computing, cybersecurity, robotics, augmented and virtual reality, artificial intelligence, and machine learning. However, as shown in figure 1, there is a significant gap between the business performance potential these technologies offer and the pace at which industrial automation systems are actually evolving to adopt them.

Fig 1: Gaps seen in the manufacturing industry and the trends (Source: Internet (Images))

In the manufacturing industry, there is a growing need for autonomous networks with interconnected sensors and actuators, and for greater collaboration to improve productivity without the need for human intervention. To design such a factory based on autonomous technology, a 5G-enabled private, or dedicated, factory design is the better choice. 5G technology can allow communication between people, devices, and sensors belonging to the same factory, while a private 5G network provides specific services within the factory environment to run critical business operations.

To understand this, let’s consider the challenges faced in the manufacturing industry.

Problem Statement 1:

Accidents and injuries are a huge problem in today’s manufacturing industry: thousands of people are injured in factories around the world, costing hundreds of millions of dollars in treatment, rehabilitation, and downtime. As shown in figure 2, major accidents happen in many factory environments due to factors including:

  • Poor design of storage tanks for petrochemical products
  • Faulty cooling systems
  • Bad design of air circulation systems
  • Inadequate measures and parameters of factory operations
  • Poor safety awareness
  • Inadequate risk assessment response
  • Poor safety management processes
  • Insufficient knowledge of the chemical properties of petrochemical products especially during storage conditions
  • Inadequate emergency response procedures during machine breakdown
  • Safety protocols not being followed by key authorities during factory operations
Fig 2: Key challenges faced in the manufacturing industry (Source: Internet (Images))

For example, a toxic gas leak in a chemical factory can kill people and expose workers to poisonous gas. Exposure leads to a variety of health issues, such as breathing problems, asthma, gastrointestinal disorders, lung cancer, kidney and liver failure, and eye disorders, all of which can lead to further diseases.

Problem Statement 2:

To address problem statement 1, many factories require Condition-Based Monitoring (CBM) to measure and monitor the critical operational parameters of various assets and machines, in order to understand their behavior and how it changes in different critical situations. The CBM technique measures various process parameters, such as pressure, temperature, pH, noise, vibration, and flow, as well as samples such as oil condition, across a wide variety of equipment, including pumps, electric motors, combustion engines, gearboxes, fans, electrical control panels, compressed air, and hydraulic systems, as shown in figure 3. The measurements taken from this sophisticated equipment, called ‘conditional measurements’, are essential for regularly identifying the condition of equipment to avoid major accidents, and they require a large amount of data to be managed and processed in real-time systems.

Fig 3: Condition-Based Monitoring (Source: Internet (Images))

CBM enables manufacturing plants to handle both preventive and predictive maintenance and to manage and monitor a large amount of equipment and instruments, all of which generate a high volume of data. It is very important to take measurements at regular intervals in real time, and to quickly assess data generated close to that equipment to manage its critical operations. CBM therefore requires reliable, high-end wireless technology that can interface with different equipment to stream data in real time. This technology enables maintenance teams and other factory stakeholders to assess conditions with low latency, generate alerts, and trigger any maintenance activity that must be performed immediately to avoid major damage to a factory’s operations. Field technicians working in a manufacturing plant should be able to monitor the performance of a range of equipment to reduce the risk of downtime or failure. The collection and analysis of this critical data makes diagnosis more accurate for the factory’s maintenance and IT teams, allowing them to plan any action required to prevent failure and ensure the continuous operation of the equipment. This, in turn, helps to save money, improve the efficiency of day-to-day operations, and create new revenue streams based on the data being monitored and analyzed. CBM methodology in a factory environment enables field engineers and maintenance teams to measure the health, reliability, and integrity of equipment, and to predict and prevent major issues before failure occurs.
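
As a rough illustration of the alerting loop described above, here is a minimal Python sketch. The sensor-reading stub, thresholds, and equipment name are hypothetical stand-ins for a real CBM data feed, not part of any specific product.

```python
# Minimal condition-based monitoring loop (hypothetical sensor API and thresholds).
from statistics import mean

VIBRATION_LIMIT_MM_S = 7.1   # illustrative alarm threshold for vibration velocity
TEMPERATURE_LIMIT_C = 85.0   # illustrative alarm threshold for casing temperature

def read_sensor(equipment_id: str) -> dict:
    """Stub standing in for a real-time stream from a pump or motor."""
    return {"vibration_mm_s": 3.2, "temperature_c": 62.0}

def check_condition(equipment_id: str, window: list) -> list:
    """Append the latest reading to a rolling window and raise alerts on breaches."""
    reading = read_sensor(equipment_id)
    window = (window + [reading["vibration_mm_s"]])[-10:]   # keep last 10 samples
    if mean(window) > VIBRATION_LIMIT_MM_S or reading["temperature_c"] > TEMPERATURE_LIMIT_C:
        print(f"ALERT {equipment_id}: condition breach, schedule maintenance")
    return window

window = []
for _ in range(3):              # in practice, driven by the live real-time stream
    window = check_condition("pump-07", window)
```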

Problem Statement 3:

Across the globe, manufacturing plants have deployed dedicated, or private, networks for managing various items of equipment on their premises, designed and deployed with Wired Ethernet and/or Wi-Fi technologies. Wired Ethernet-based connectivity, as shown in figure 4(a), provides stable communication quality and performance, and the equipment itself is cheap. Because it is wired, however, it cannot provide mobility, the cabling cost is high, and construction time is long. Wi-Fi-based instrumentation devices, as shown in figure 4(b), are easy to deploy in the factory and keep network deployment cost low, but there are still drawbacks: wireless connections are unstable, the communication distance is short, latency runs to tens of milliseconds, the network is vulnerable to outside threats, and mobility is also very limited.

Traditionally, data generated from wired & Wi-Fi-based instrumentation devices installed in manufacturing plants are processed either on the local premises or in the public cloud to control the behavior of these devices. Typically, these devices require highly reliable connectivity for quick communications, a latency of less than 1ms, secure data management and data storage, proper traffic isolation between different critical applications running in the factory, and guaranteed QoS for day-to-day operations managed over the private network.

Fig 4a & 4b: Challenges faced in the factory based on wired (Ethernet) and Wi-Fi ecosystems (Source: Internet)

Methodology and Approach:

Today, digital transformation is happening in various manufacturing plants, focused on business agility, operational efficiency, and the enablement of technologies such as IIoT, cloud, edge computing, robotics, 3D printing, big data, digital twins, AI/ML/DL, and next-generation connectivity like 5G. It is clear, however, that a broad ‘one size fits all’ approach is NOT sufficient, and each factory must consider what digital transformation could look like for its key business requirements. Service providers (MNOs) traditionally offer wired and wireless connectivity to factory premises via inflexible, static connections, which means certain services get ‘overprovisioned’ for peak demand. As a result, factories pay more for unused capacity. Manufacturers are therefore looking for service providers as key partners who can supply ‘fit-for-purpose, next-generation connectivity solutions’ with the option to request, configure, and modify network resources ‘on demand’ for individual requirements and critical applications, meeting the expected QoS.

To manage production in a reliable manner, manufacturers must balance the growing demands of condition-based maintenance against the limitations of the traditional wired LAN or Wi-Fi networks used in the factory environment. These businesses have realized that to begin their digital transformation and scale up their business value and revenue, using devices like cobots, drones, and AR/VR/MR in manufacturing plants to enable more automation requires a dedicated private network with extremely high throughput, high data speeds, strong security, and low latency to handle the increase in connectivity and data transmission. A growing number of interconnected devices in the factory environment, spanning sensors, actuators, and Programmable Logic Controllers (PLCs), need to communicate with each other seamlessly in real time for data management, monitoring, and analytics.

To address the needs of factories, and with digital transformation happening in Industry 4.0 today, the establishment of 5G standards and the emergence of private 5G spectrum are opening bigger opportunities for factories to consider private 5G networks. Many large industrial manufacturing plants have their own wired and/or wireless networks, and there are many emerging technologies that demand a significant amount of computation and resources for operations within the factory environment. This has led many manufacturers around the world to seek a private 5G network as a preferred solution for their factory environment. But to realize the full potential of private 5G networks, each factory needs to have its own strategic approach based on its specific application and product requirements.

Fig 5: Private Factory Network based on 5G (Source: Capgemini + Internet (Images))

There are many implementation options for designing a private or dedicated 5G factory network that can be owned and managed either by an MNO or by the factory’s IT staff. A 5G network design based on the 3GPP standard, for example, can be built using licensed, license-shared, or unlicensed spectrum. Figure 5 shows the high-level deployment of 5G in a private network based on standalone mode.

There is NO one-size-fits-all approach today. Many manufacturing plants have a wide range of devices to handle: some industrial devices, like AR, VR, and factory surveillance, generate a high volume of data, while others, like motors and pumps, generate lower data rates. Because they cover a large volume of equipment, sensors in the factory network will operate at both high and low bit rates. A 5G private network architecture must therefore support low, mid, and high bands to interconnect 5G NR devices operating at different frequency bands, from sub-1 GHz (low band) to extremely high frequencies (25+ GHz). The lower the frequency, the farther the signal can travel in the factory environment; the higher the frequency, the more data it can carry over a short distance. Designing a 5G private network architecture for industrial automation systems that supports all bands, as well as emerging technologies like AR and VR, is not an easy task.
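
To make the range-versus-throughput trade-off concrete, here is a small illustrative Python mapping. The characteristics are rough rules of thumb and the device assignments are assumptions, not a specification.

```python
# Rough characteristics of 5G frequency bands (illustrative, not normative).
BANDS = {
    "low":  {"spectrum": "sub-1 GHz",        "reach": "widest, best penetration",
             "data_rate": "lowest",          "typical_use": "motors, pumps, simple sensors"},
    "mid":  {"spectrum": "1-6 GHz",          "reach": "moderate",
             "data_rate": "medium",          "typical_use": "PLCs, AGVs, general IIoT"},
    "high": {"spectrum": "25+ GHz (mmWave)", "reach": "short, line-of-sight",
             "data_rate": "highest",         "typical_use": "AR/VR, HD video surveillance"},
}

def pick_band(needs_high_rate: bool, needs_wide_coverage: bool) -> str:
    """Naive band choice mirroring the rule of thumb in the text."""
    if needs_high_rate and not needs_wide_coverage:
        return "high"
    return "low" if needs_wide_coverage else "mid"

print(pick_band(needs_high_rate=True, needs_wide_coverage=False))   # -> "high"
```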

With potentially hundreds of thousands of critical sensors and control systems in use across larger factory environments, 5G private network implementations are increasingly gaining ground. These networks will be powered by massive, distributed computing located close to sensors and machines, capable of applying artificial intelligence and machine/deep learning algorithms to huge amounts of critical industrial data within the factory environment. In a 5G factory, the private network design integrates 5G devices, RAN, and core into a complete end-to-end ecosystem. A private 5G network does not interface with, or draw resources and functionality from, the public 5G MNO network. A private 5G frequency is used when a factory builds its own private 5G network, whereas an MNO’s publicly licensed frequency can be used if the MNO builds a private 5G network for the factory. An MNO’s public 5G network can also serve as a backup to an existing private 5G network, connecting all the manufacturing equipment and devices installed in a factory environment to the public 5G network if the private 5G network fails for any reason.

For example, since 5G devices in a private network use the same technology as public 5G MNO networks, a private 5G network can hand devices over to a public 5G MNO network if they leave the private network’s coverage area. To enable a public 5G MNO network as a backup, manufacturers need to extend their private 5G network’s operation to a public 5G MNO, so that any 5G device in a manufacturing plant can use the same SIM in both the private network and the public MNO network. The factory can then, for instance, continue to monitor and control an automated forklift after it has crossed the street and moved out of range of the private 5G factory network, at which point the forklift switches over to the public MNO’s 5G network based on its inbuilt capability.

A private 5G network solution is a fully integrated ecosystem based on industrial-grade hardware with dedicated 5G radio access and 5G core software modules. It provides enhanced data security, broadband speeds, deterministic behavior, real-time response, and cost efficiencies to factories deploying on-premises, business-critical applications. As such, it forms part of the backbone of the smart factories of the future, as shown in Figure 6. Private 5G factory network deployments can replace wired (LAN) and Wi-Fi networks in manufacturing plants, where fixed communication infrastructure becomes more complex as the number of CNC (Computer Numerical Control) machines and industrial devices requiring additional wiring grows. 5G private factory networks are specifically designed to support Industrial IoT (IIoT) applications: a private 5G network can deliver ultra-low-latency, very high-bandwidth connections that support artificial intelligence-driven applications by serving a large number of sensors and varied instrumentation equipment.

Fig 6: Private 5G Network with Network Slicing (Source: Capgemini + Internet (Images))

A private 5G network offers a combination of essential security, reliability, and performance enhancements over other wireless technologies as it has been designed from the ground up to meet business and mission critical application needs in three main categories: Ultra-reliable and Low-latency Communications (uRLLC), Enhanced Mobile Broadband (eMBB), and Massive Machine Type Communications (mMTC). In addition, network function virtualization (NFV) and software-defined networking (SDN) technologies have also been considered as part of 5G network design for handling data communication. An attractive attribute of 5G networks is network slicing, which allows several applications or services to run on the same physical network by dividing the physical bandwidth among them. Each slice is created during the 5G private factory network design to meet the requirements of various critical applications. In a factory, for instance, one slice might serve video surveillance applications that demand higher speeds, while another would serve robots, for which lower latency is critical. In addition, network slices can be allocated to reduce bottlenecks and improve throughput as workload demands increase.
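
As a hedged sketch of how such slices might be catalogued: the SST values below follow the standardized 3GPP slice/service types (1 = eMBB, 2 = URLLC, 3 = mMTC), but the slice names and the latency and throughput targets are invented for illustration.

```python
# Hypothetical factory slice catalogue; targets are examples only.
# SST values per 3GPP TS 23.501: 1 = eMBB, 2 = URLLC, 3 = mMTC.
SLICES = [
    {"name": "surveillance-video", "sst": 1, "downlink_mbps": 500, "max_latency_ms": 50},
    {"name": "robot-control",      "sst": 2, "downlink_mbps": 10,  "max_latency_ms": 1},
    {"name": "sensor-telemetry",   "sst": 3, "downlink_mbps": 1,   "max_latency_ms": 100},
]

def slice_for(required_latency_ms: float, required_mbps: float) -> str:
    """Pick the least over-provisioned slice that still meets the application's needs."""
    feasible = [s for s in SLICES
                if s["max_latency_ms"] <= required_latency_ms
                and s["downlink_mbps"] >= required_mbps]
    return min(feasible, key=lambda s: s["downlink_mbps"])["name"] if feasible else "default"

print(slice_for(required_latency_ms=5, required_mbps=5))     # -> "robot-control"
print(slice_for(required_latency_ms=60, required_mbps=200))  # -> "surveillance-video"
```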

The distributed architecture of a 5G network design within a factory environment allows local data processing with machine learning algorithms to handle massive amounts of data, and takes care of the security and privacy of the factory network. Within a dedicated or private domain of a 5G network, real-time and non-real-time traffic can be managed at the edge of the factory network by the IT team in a more efficient manner.

Cutting through the noise – the future of marketing

Tim van der Galiën
28 Jul 2022
capgemini-invent

The brands we remember and love outshine their competition. It is not just because their appearance is attractive, or that their customer service is good. It is because they deliver a coherent experience in which the brand delivers a unique, individual journey that goes beyond ‘just’ a digital marketing experience; they cut through the noise, rather than simply adding to it. How do you cut through the noise? What are the key drivers of the future marketing experience? And how do you keep up with the fast-changing marketing organization as a marketing leader?

The rapidly changing face of marketing

The expectations of consumers are rising and the demand for engaging content across channels, in every type of format, in real-time is high. We all want a sublime Starbucks-like experience with a brand. In effect, we are witnessing a data and content explosion:

  • Consumers expect to interact with brands 24/7, at every touchpoint.
  • The one-way dialog from the brand to the consumer has been replaced by a two-way conversation.
  • Immediacy is the new watchword as brands are expected to respond now, not in a week’s time.
  • Personalization rests on the effective orchestration of contextualized content.
  • Relevancy is becoming the next battleground: delivering more personalized, well-timed initiatives requires more data capacity than ever.

What should we, as companies and especially as marketing leaders, do to manage these expectations? The keywords are data and connection.

“Customer data is at the heart of a great experience”.

Data-driven real-time marketing as a growth driver, now and in the future

As customers crave personalized engagement delivered through multiple channels, it is crucial to deliver the right experience, at the right time, at every touchpoint. Data is at the heart of a great experience. Marketing teams that were traditionally focused on brand awareness and demand generation are now becoming far more data-focused. Due to continuous interaction with brands, there is an exponential increase in the volume of data available. Think, for example, about your personalized interface or the suggestions you see after buying a product or service via a website. This real-time data enables us to gain insights into every touchpoint of the customer journey, allowing marketers to build an unprecedented level of customer insight and ultimately deliver customized experiences in real time and at scale.

“Data-driven real-time marketing lays the foundation for a longer relationship and brand loyalty with the consumer”.

Real-time marketing can process, analyze, and leverage data to swiftly enhance digital commerce campaigns and increase brand awareness, customer satisfaction, conversion rates, and customer retention. Indeed, data-driven real-time marketing is one of the most sought-after and fastest-growing areas in brand development.

  • It enables marketers to connect instantly with their customers through extensive personalization and contextualization.
  • Convergence of technology, data, and creativity helps to win in marketing.
  • Innovative technologies and increasing use of data must be managed to activate and captivate consumers.

Leading marketers are merging creative inspiration with real-time signals to create brand awareness and engagement at an unprecedented scale. The convergence of technology, creativity, and data is not only possible but essential; when the free-flowing creative process meets the precision of advanced data use with the right technological tools, they can create a powerful impact to make sure marketing wins and accelerates above the competition. Innovative technologies and exponentially increasing data must be managed to activate and captivate customers with data-driven, contextualized marketing experiences, eventually driving brand success.

The marketing leader is the orchestrator of the connected marketing experience

The marketing leader plays a key role in creating a connected marketing experience. They must answer important questions with regards to the interactions consumers have with their brand:

  • What do customers want and when do they want it?
  • What touchpoints are your customers using?
  • What are your customers saying about their experience?
  • How do you deliver a seamless end-to-end journey?

“It requires a bold approach for CMOs to connect the dots to achieve future-readiness with a connected, data-driven marketing ecosystem”.


It requires a bold approach for CMOs to connect the dots between this wide range of activities and requirements to achieve future readiness with an ecosystem that embraces data-driven real-time marketing and focuses on connecting with customers and driving sustainable growth. How do you do that?

  1. Marketing organization: Make sure you make people and culture part of the change within your organization. The success of a redesign of the marketing organization depends heavily on the culture and people within it. Highly engaged teams show up to 22% greater profitability and are almost five times more likely to perform their best work. It is also important to bring in agencies to support this.
  2. Customer activation: Organize yourself around the customer journey in an end-to-end operating model (holistic marketing organization) – do not only focus on the marketing department but connect all departments that are directly or indirectly connected to the customer journey: marketing, sales, services, IT, etc.
  3. Marketing technology: Design and deploy the right set of solutions and tools. Marketing technology (MarTech) is used to understand what customers want, why they want it, and what they intend to do next. MarTech drives customer and marketing-team interaction, measures effectiveness, and facilitates data-driven decisions. Marketing leaders that utilize 70% of their MarTech stack’s capabilities will achieve 20% better marketing ROI than their peers.

How should companies, or more specifically marketing leaders, take the lead as an orchestrator of a connected ecosystem? And what are the key building blocks of a connected marketing experience? We will help you out! In the upcoming three blogs we will dive into the three main topics above:

  • Customer Activation
  • Marketing Organization
  • Marketing Technology

About the Authors

Tim van der Galiën

Senior Manager, Connected Marketing at frog, part of Capgemini Invent
Tim is responsible for the strategic marketing offering within frog, part of Capgemini Invent. He is an expert in marketing transformation & customer strategy and helps brands build bridges between people, data and technology.

Ruth Bos

Digital Transformation and Innovation

    Key challenges in the KYC space and how to address them

    Manish Chopra
    5 July 2022

    Addressing KYC Challenges and Enhancing AML Compliance

    The KYC process is perhaps the most critical aspect of AML compliance, as it enables all other facets of AML including transaction monitoring, SAR filings, and sanctions screening. Indeed, FinCEN has deemed KYC’s alter ego, customer due diligence, the “5th Pillar” of AML Compliance and fundamental to a satisfactory AML program.  

    The challenges of implementing an effective KYC program, always high, have escalated since the pandemic, which accelerated the trend of opening new accounts completely online. With increasing competition from Fintechs, traditional financial institutions are pressed to offer speed-to-market solutions and a positive online customer experience in order to stay competitive, which are difficult tasks when balanced against KYC demands. Moreover, various new and significant regulatory requirements and expectations have arisen, presenting significant KYC challenges for banks.

    Emerging Needs in the KYC Space

    In response to this, various needs in the KYC space have emerged, including the following half dozen key ones:

    1. Integrate KYC with the new AML National Priorities

    Last year, FinCEN for the first time announced the following national AML priorities:

    1. corruption
    2. cybercrime, including relevant cybersecurity and virtual currency considerations
    3. foreign and domestic terrorist financing
    4. fraud
    5. transnational criminal organization activity
    6. drug trafficking organization activity
    7. human trafficking and human smuggling, and
    8. proliferation financing

    Expectations for assimilating these priorities into AML programs will be high. This includes the typologies and red flags provided by FinCEN. For example, on the KYC front, red flags that may indicate enhanced due diligence (EDD) is warranted include the location of the business, the presence of a politically exposed person (PEP), and the identification of beneficial owners who may be bad actors.

    In response, financial institutions need to identify how each of the priorities applies to them and then consider how policies, procedures, and their overall AML risk assessment should be amended. They should also perform a thorough threat and vulnerability assessment for each priority to identify the institution’s true risk, then examine their AML/KYC processes to determine how well they detect and report the criminal activity related to the priorities.

    2. Focus on upcoming Ultimate Beneficial Ownership (UBO) requirements

    Led by FATF, most countries are stressing the importance of UBO requirements and registries to identify bad actors hiding behind legal vehicles. Recent factors spurring emphasis on UBO include the Pandora Papers (a comprehensive exposé that revealed the shell accounts of over 100 world leaders, billionaires, and celebrities), new EU criminal liability for non-compliance with UBO requirements, and the Russia-Ukraine conflict, given that Russia is noted for its establishment and abuse of complex networks of shell and front companies and non-resident bank accounts.

    In the U.S., which is behind the EU and other countries in imposing a UBO regime, the Treasury Department recently released a National Risk Assessment that highlighted the abuse of legal entities. FinCEN has noted that its highest priority is establishing the national UBO registry and completing implementation of the beneficial ownership information reporting and collection regime. 

    Financial institutions can benefit from digital tools such as graph analytics to correctly identify complex beneficial ownership structures and calculate ownership percentages. They also should take advantage of enhanced data sources to keep track of new or changed legal entities globally. Internal policies and procedures need to be enhanced to include varying definitional criteria, ownership thresholds, and recordkeeping requirements. Ultimately, the entire KYC process will need to be geared to performing checks against UBO registries for a range of customers, employees, and third parties. Financial institutions will use digital tools to help verify registry information against other accessible sources to help ensure the accuracy of information and to cross-reference new information with existing data to potentially uncover suspicious activities. 
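
    As a minimal sketch of the graph-analytics idea: effective ownership through a chain of legal vehicles is the product of the stakes along each path, summed across paths, and the result can be compared against a threshold (25% is common in UBO rules). The company structure below is invented for illustration.

```python
# Effective ownership via graph traversal (entities and stakes are illustrative).
# OWNERSHIP[entity] -> list of (owner, fraction of the entity that owner holds)
OWNERSHIP = {
    "TargetCo":  [("HoldCo A", 0.60), ("Alice", 0.40)],
    "HoldCo A":  [("ShellCo B", 0.50), ("Bob", 0.50)],
    "ShellCo B": [("Alice", 1.00)],
}

def effective_ownership(entity: str, stake: float = 1.0, totals=None) -> dict:
    """Multiply stakes down each ownership path and sum them per ultimate owner."""
    totals = {} if totals is None else totals
    for owner, fraction in OWNERSHIP.get(entity, []):
        if owner in OWNERSHIP:                       # intermediate legal vehicle
            effective_ownership(owner, stake * fraction, totals)
        else:                                        # natural person: candidate UBO
            totals[owner] = totals.get(owner, 0.0) + stake * fraction
    return totals

# Alice holds 0.40 directly plus 0.60 * 0.50 * 1.00 indirectly = 0.70 in total,
# well above a 25% threshold; Bob holds 0.30.
print(effective_ownership("TargetCo"))   # {'Alice': 0.7, 'Bob': 0.3}
```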

    3. Move to eKYC and touchless due diligence

    It’s vital that automated KYC solutions be in place to allow machines to perform repetitive tasks such as routine data entry and collection, spreadsheet formatting and analysis, querying, and simple verifications, as well as to make straightforward rules-based decisions. This enables skilled KYC analysts to spend their time on work that adds the most value and makes the best use of their knowledge and expertise.

    Cutting-edge techniques now allow, for example, for the touchless processing of identity documents by extracting their data, checking security features, and comparing them against templates. Algorithms that draw together the results of these checks can indicate whether the document is authentic. Incorporating automation into the KYC operation reduces error-prone, manual methods and decreases costs. Integrating KYC solutions also reduces friction during onboarding, creating a better overall user experience. 
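
    A hedged sketch of how the results of such checks might be drawn together into a single authenticity decision follows; the check names, weights, and threshold are invented, not drawn from any particular verification product.

```python
# Combining independent document checks into one authenticity decision
# (check names, weights, and the 0.75 threshold are illustrative assumptions).
CHECK_WEIGHTS = {
    "mrz_checksum_valid":    0.500,   # machine-readable zone digits self-verify
    "font_matches_template": 0.250,
    "hologram_detected":     0.125,
    "photo_tamper_score_ok": 0.125,
}

def authenticity_score(results: dict) -> float:
    """Weighted sum of passed checks, in [0, 1]."""
    return sum(w for name, w in CHECK_WEIGHTS.items() if results.get(name))

results = {"mrz_checksum_valid": True, "font_matches_template": True,
           "hologram_detected": True, "photo_tamper_score_ok": False}
score = authenticity_score(results)
print(score, "pass" if score >= 0.75 else "route to manual review")   # 0.875 pass
```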

    Choosing the right technology involves several considerations. For example, the solution must support document types from diverse countries, allow for different languages, and be compliant with the regulatory requirements of the business’s jurisdictions. Financial institutions should be able to create customizable verification flows for different products and customers. Also, the solution should have short processing times and high verification speed, so users won’t need to wait long.  

    4. End the siloed approach to KYC Challenges

    As with all associated operations, one hand should know what the other hand is doing. Many large financial institutions are now working on consolidating related areas that are subsets of the overall KYC effort, such as the Customer Identification Program, risk ranking and ratings, PEP and adverse media screening, and UBO. This is because managing the complex KYC process in discrete operations that don’t coordinate with each other, or that use multiple IT platforms, is costly, inefficient, and error-prone, and complicates the effort to launch new products. A single unified solution that handles all KYC/AML requirements renders the entire compliance endeavor more effective and prevents information from being either insufficiently communicated or inefficiently replicated across redundant storage types and locations.

    In addition, given that CDD must be ongoing throughout the life of the relationship, “lifecycle management” is also a trend. In other words, the onboarding process is no longer divorced from managing the ongoing customer relationship. The key is to form and maintain a holistic view of a customer with data gleaned from disparate sources held in a unified database.

    5. Efficient KYC Management with Proper Data Governance

    Financial institutions’ underlying KYC data architectures are strained by the growing flood of structured and unstructured data volumes, their complexity and rate of change, and more advanced analytic demands. Thus, a critical need for every institution is maintaining an efficient method for capturing, storing, analyzing, and managing data and ensuring its quality, one that ideally also provides a competitive edge. For this reason, financial organizations continue to implement significant changes in data management, driven by supporting and exploiting big data technologies that blend new sources of data, such as social media, into the business decision-making process.  

    KYC is hampered when repositories store multiple copies of data sets in an inconsistent manner, resulting in overly complex mapping efforts. Data feeds into account opening systems frequently are designed only for specific functions. For example, customer risk scoring can be made difficult because integrating unstructured data sets, such as those related to adverse news media, is manually intensive and inefficient. Key data sets such as the one for beneficial ownership may not be appropriately mapped to the overall KYC and AML systems. All of this means that institutions are left without a comprehensive, “golden” record source of integrated data with which to comprehensively understand and perform due diligence on their customers.

    Proper data governance requires reaching into all the places where enterprise data hides (e.g., servers, desktops, tablets, smartphones, cloud apps), identifying all data categories, deriving ownership for common data environments, and establishing appropriate roles for data stewardship and other decision-making groups. Another key task is data enrichment, which in the KYC environment includes elements such as identifying UBOs within complex legal structures, integrating unstructured datasets derived from adverse news searches, utilizing parsing technology platforms such as NLP for keywords, and recognizing all possible PEPs and their kinships. Other necessary measures include establishing an exhaustive suite of data quality rules, ensuring proper data lineage across multiple systems, and the creation of useful MIS.  

    6. Strive to ensure a positive customer experience

    KYC requirements for data collection and analysis (such as UBO) can be cumbersome and add time, frustration, and friction to new account opening and onboarding. Financial institutions are increasingly leveraging data already obtained and otherwise transforming the old-fashioned, manual process of filling out paperwork, which can be tedious, set a poor tone for the customer relationship, result in high rejection rates, and be a heavy lift on the resources needed to input and manually correct information.

    For example, if customer information already exists in a CRM solution, the customer can simply verify that the information is correct and update as necessary. Customers can be empowered to start and stop a digital interview on different devices as needed and allow for the capture of information simultaneously from a spouse or other joint investor. The bottom line is to use all available means to ensure that customers spend less time on data collection for KYC. 

    Meet our expert

    Manish Chopra

    Global Head, Risk and Financial Crime Compliance
    Manish is the EVP and Global Head for Risk and Financial Crime Compliance for the Financial Services Business at Capgemini. A thought leader and business advisor, he partners with CXOs of financial services and Fintech/payments organizations to drive transformation in risk, regulatory and financial crime compliance.

      Why your data doesn’t need protecting (as much as it needs unleashing!)

      Gianfranco Cecconi - Collaborative data ecosystems lead
      21 Jul 2022
      capgemini-invent

      Whether in data management circles or in everyday conversations in the street, the concept of personal data protection is a sacred cow. But once an idea achieves this status, it stops being questioned. This is a mistake.  

      As I walked the sunlit streets of Helsinki, Finland, on the 20th of June, I was all set for a relaxing and informative day of data management presentations. Little did I know, I would be opening this year’s MyData Conference just after the keynote by activist journalist Julia Angwin, a person whose work I greatly admire. And then, my friend and colleague of many years, Esther Huyer, called to ask me to stand in for her after her journey to Helsinki was compromised. “Fear not, Esther,” I said. “You’re in good hands!” We brainstormed over the phone as she shared her original intention for the speech. In making it mine, I took the opportunity to do something different and say the quiet part out loud. In short: data does not need protection; data needs unleashing.   

      Don’t get me wrong, I’ve been working in this field long enough to know the necessity of data protection, whether the data describes individuals or organizations, to ensure privacy, confidentiality, or intellectual property. However, in recent years, I have realized that the balance between the need for data and the need for data protection has started to shift. Unprecedented global challenges are changing our priorities. We now need more and better data to tackle climate change, address the next health emergency, and make our economy equitable and sustainable for nine billion people. We still need data protection solutions, but they should not come at the expense of the immense value of using data in the first place. This is what I wanted to communicate at this year’s MyData conference.

      The data conundrum  

      It might sound like a marketing pitch, but it is true: data has the potential to address and mitigate, if not solve, many of society’s biggest problems. Despite the advantages, many of us still resist the opportunities. I’m not asking you to give up your privacy or to stop protecting sensitive data. I’m merely asking you to consider taking the risk of sharing your data to unlock its real value when you have the opportunity. Whether you like it or not, you are already part of collaborative data ecosystems. We all are. Our data feeds AI algorithms, drives visualizations that communicate and educate, and produces blueprints for change. We limit our collective progress when we limit ourselves to the data we think of as safe.

      “I’m not asking you to give up your privacy. I’m just asking you to consider the wider impact of what you make possible when you share your data.” 

      Think of it this way: the risk of driving doesn’t stop us from seeking the rewards of visiting new people and places. The pleasure and benefits vastly outweigh the risks. We put our seatbelts on and mitigate the risk as much as reasonable, and then off we go. We should think of data sharing in the same way. Security concerns and legal implications should not prevent us from unleashing the power of data, because there are “seatbelts” for data sharing, too. Technology is progressing continuously and significantly to reduce security risk. Evolving regulatory frameworks give you the means to manage the legal aspects.

      Enduring ideals 

      The good news is that thousands of civil servants, entrepreneurs, lawyers, and data scientists worldwide are working hard to deliver those seatbelts, the sort of safety solutions that encourage more sharing across all the dimensions relevant to this space, from technology to regulation. Many of these individuals attended the conference in Helsinki. More still are associated with MyData Global and contribute to its mission every day. We all have one thing in common: we welcome the progress that’s been made in cybersecurity, confidentiality, and privacy-preserving practices. Remember, even those among us who argue the loudest for change, the activists, the politicians, the technologists, have sensitive data of our own to protect. Even looking at technology alone, solutions like differential privacy and homomorphic encryption make data more secure than ever. Despite only recently becoming popular, the former was developed in the early 2000s and the latter was first conceptualized in the 1970s. This shows that privacy- and confidentiality-preserving technology has always gone hand in hand with data processing, and it always will. The difference today is that innovation is being accelerated by a collective shift in consciousness. Our changing sensitivity to data protection is moving the goalposts.
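
      To make one of those seatbelts concrete: in its simplest form, differential privacy adds calibrated noise to aggregate query results so that no individual record can be inferred. Here is a toy Python sketch; the epsilon value and the data are illustrative.

```python
# Toy differentially private count using the Laplace mechanism.
import random

def dp_count(values: list, predicate, epsilon: float = 0.5) -> float:
    """True count plus Laplace(0, 1/epsilon) noise; a count query has sensitivity 1."""
    true_count = sum(1 for v in values if predicate(v))
    # The difference of two Exponential(epsilon) draws is a Laplace(0, 1/epsilon) sample.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

ages = [34, 29, 41, 38, 52, 27]
print(dp_count(ages, lambda age: age > 30))   # noisy count near the true value of 4
```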

      “Civil servants, politicians, technologists, lawyers, activists, they are all working for you to give you seatbelts for your journey, as you share data.” 

      Much like with the decarbonization and net-zero goals, governmental regulators are stepping in for the benefit of society. Hundreds of civil servants and members of parliament in Europe contributed to writing and delivering the General Data Protection Regulation (GDPR), for example. It led to more precision and clarity in what businesses and organizations in general can and cannot do when processing data describing European Union residents. Far from being a burden – though it was perceived by many as such – legislation freed us. Like brakes for a car, it allowed us to drive faster within the constraints of sharp bends. Within the framework of the GDPR, individuals feel they can trust these organizations with the data that describes them.  

      More recently, the Data Governance Act was passed by likeminded people to establish a safer framework for sharing and reusing sensitive data. Similarly, only a few days ago, the Digital Markets Act and the Digital Services Act were approved, both further refining citizens’ rights in the digital space, often with implications on how data is used. Sure, progressing legislation is a long process, but the topics are far from being neglected. 

      The way in which our data is handled affects everyone, and when used ethically, it can change the world. As such, good deeds are no longer the sole domain of NGOs, governments, and researchers. But many businesses keen to unleash data need guidance from an experienced partner. 

      MyData: the movement committed to self-determination 

      Professionals who associate themselves with and support the MyData principles are human-centric when it comes to managing sensitive data. On one side, we know how concerned people are about how their data gets handled. On the other side, we know the handlers are equally concerned. It can be challenging to unpack all this new legislation, and talent is needed to master the technology. It’s so problematic that many organizations are reluctant to accept your data. Firstly, there could be significant legal consequences for any mistakes, even honest ones. Secondly, incidents can ruin an organization’s reputation and credibility. Of course, there’s nothing intrinsically wrong with legislation. Rather, the general lack of knowledge about how to use data within the legislative framework is the problem. Once educated in the field, many leaders quickly realize the framework is not a cage but scaffolding, not constraining their operations but exposing them to new opportunities. This is the benefit of joining a movement like MyData. It can help organizations develop the knowledge, tools, and solutions they need to unleash the true value of data. The movement’s annual conference is a chance for more people to begin catalyzing fair data solutions. 

      Writing the next chapter  

      I want to leave you with a word or two about personal agency. Given today’s challenges, we’ve never been so individually encouraged to do our part. Sure, there are movements like MyData dedicating a significant amount of bandwidth to helping you navigate this increasingly challenging space. But as with the other challenges we now face, there’s plenty you can do with data to help yourself and your community. To begin with, you can work on becoming data literate. No one’s asking you to be a data scientist, just literate. After that, you can help your friends and relatives do the same. You can learn about data licenses and your rights within the legal framework. You can take an interest in the introduction of new legislation and begin thinking about how to react. But why stop there? You can go one step further and advocate for the ideals you believe in, thereby ensuring they are turned into law by legislators. The European Commission, for example, regularly runs consultations, giving you the chance to have your say and thereby shape the future. And in the end, this is the kind of control we all want.   

      Learn more about Capgemini’s commitment to a sustainable and prosperous digital society: More Power to You: Capgemini Joins Data-sovereignty Pioneer MyData 

      Stay tuned for the next part of our series of reports on the MyData Conference 2022: Pierre-Adrien Hanania’s presentation on How Your Personal Data Helps to Achieve UN Sustainability Goals

        

      Intelligent transformation – driving a personalized approach to customer interactions

      Leigh Birkbeck, Regional Head of Intelligent Customer Interaction, Capgemini’s Business Services
      26 July 2022

      Implementing an intelligent transformation approach to drive meaningful customer interactions optimizes operational performance and delivers enhanced value to your business.

      The decision to outsource your contact center is not made overnight, by a single person – and it certainly isn’t made without considering all your options. This is a multi-stage process – from the initial concept phase, all the way through the design, build, and transition phases, right up to when your first customer is handled.

      Businesses do this for different reasons, whether it’s to reduce costs or simply because they don’t have the expertise or resources needed in-house. Whatever the reason, contact centers are still expected to have all the tools, techniques, and technology necessary to deliver a better service than their parent organization.

      This means there are a few key questions that need to be considered regarding these tools, techniques, and technologies. Does your contact center team have full mastery of them? Are they deployed in a way that gives the client the best value from them? And does the vendor you’re using really add value to your operations through better training and improved productivity, sales, retention, and cost rates?

      An intelligent transformation approach – why it’s important

      Driving success in customer interactions requires you to be truly committed to collaborating with your clients across the running and managing of their services, while also delivering a true transformational approach. But where do you start?

      Being bold with continuous improvement initiatives enables you to build longer-term commitments with your clients. However, this approach to transformation can’t be rushed. It has to anticipate customer needs, reimagine the client’s setup for better ways of working, and bring together the technology and methods needed to reduce waste, while creating experiences that add value to their day-to-day operations and customer experience.

      Achieving customer experience success requires modern automation techniques such as artificial intelligence (AI) and intelligent automation, underpinned by proven operational tools, to become the new norm within your business. Focusing on a quick fix won’t work, as it is simply a short-term answer to a long-term challenge.

      The question is then: how can you achieve success quickly and easily, while ensuring you can continue to focus on business-critical tasks – without disrupting how your business operates?

      Collaboration and longer-term commitments – the benefits

      The answer to these challenges lies in a solution that enables you to intelligently handle critical customer interaction activities together with your clients – while optimizing customer and operational performance, with minimal effort on the part of your customer service agents. This ensures you drive the best customer and business value, while making your services stand out from the crowd.

      To learn how Capgemini’s Intelligent Customer Operations solution enables you to drive a more meaningful, personal, and emotive relationship with your customers through delivering a frictionless customer experience, contact: leigh.birkbeck@capgemini.com

      About author

      Leigh Birkbeck

      Regional Head of Intelligent Customer Interactions, Capgemini’s Business Services
      Leigh is Regional Head of Intelligent Customer Interactions for Europe, with more than 20 years of transformation expertise working across a variety of industries.

        Implementing too many tools can impact customer experience

        Leigh Birkbeck, Regional Head of Intelligent Customer Interaction, Capgemini’s Business Services
        26 July 2022

        Tools are great – but they aren’t the complete answer to your customer experience challenges. Examining your customer-first processes helps you design meaningful customer journeys that drive enhanced business outcomes to your organization.

        How do you feel leaving voicemails or interacting with automated systems? Happy? Frustrated? Would you prefer to talk to a normal human being? Well, you’re not alone.

        While automated systems are growing in popularity within call centers due to their ability to reduce costs and workloads – these systems aren’t perfect. This is why complex customer queries are almost certainly answered by your agents, as they can listen, respond, and act better than any machine.

        However, with the right number of handpicked tools, you can answer customer queries faster and more accurately, which has a direct positive impact on customer experience. In short, quality matters more than quantity when it comes to tools to improve customer experience.

        Do more tools improve customer experience?

        Adding the right new products, platforms, and systems to support your growing or changing business is nothing new, but it comes with certain advantages and disadvantages. Yes, these tools and systems – and the features that come with them – can help your call center teams interact better with each customer. But these experiences are often implemented without a single source for capturing or seeking critical information.

        Leveraging a large number of tools creates operational challenges: your agents need more time to learn their roles, and coaching or briefing times grow longer, leading to more agent downtime. In turn, this increases the workloads of your agents, which directly impacts operational cost-to-serve. After all, fewer available agents means everything takes significantly longer – including handle, hold, and resolution times – which generates poorer overall experiences for your customers.

        Worse, these issues can make your customer interactions less fluent, leading to important information getting missed and mistakes being made. All of this can negatively impact your net promoter score (NPS), customer satisfaction (CSAT), and customer retention. This low performance can lead to a reduced employee experience, and can even increase agent attrition.

        In conclusion, a smaller number of handpicked tools sidesteps all of the issues stated above, so if you genuinely want to improve your customer experiences, the fewer tools you use the better.

        Fewer tools = happier customers

        To overcome these challenges you need to invest in the right customer interaction tools that give your teams the information they need to resolve customer queries quickly – all from one place, while providing a complete, end-to-end view of your customers.

        For example, integrating Zendesk Case Management within your toolset can deliver an easy-to-use, self-service interface to your customers, while collecting data and analysis that help your agents continuously improve by providing greater insight into their customers’ activity and issues, along with the operational oversight and data they usually lack.

        Good customer experience – more than just tools

        Keeping the number of tools you implement to a minimum reduces agent effort significantly, and helps drive customer experience and productivity further. This means ROI can often be achieved much sooner than expected.

        It’s important to note that tools are just one aspect of the bigger picture and won’t produce the customer experience results your organization is looking for on their own. To truly generate better customer experiences you need to examine the processes surrounding your customer journeys and the goals you want to achieve, while designing a proven roadmap that delivers these outcomes.

        This may seem like trying to scale Mount Everest with little or no supplies, and it can be difficult to know where to start. But the end goal will enable you to drive a more meaningful and frictionless relationship between your agents and your customers.

        To learn how Capgemini’s Intelligent Customer Operations solution enables you to drive a more personal and emotive relationship with your customers through delivering a frictionless customer experience, contact: leigh.birkbeck@capgemini.com

        About author

        Leigh Birkbeck

        Regional Head of Intelligent Customer Interactions, Capgemini’s Business Services
        Leigh is Regional Head of Intelligent Customer Interactions for Europe, with more than 20 years of transformation expertise working across a variety of industries.

          Seven ways to foster long-term customer relationships

          Marc De Forsanz
          25 July 2022

          Real-world examples from clients of Data-driven Customer Experience by Capgemini

          Enterprises understand the value of creating long-term relationships with their customers. Whether they operate in B2C or B2B spaces, successful organizations have learned the importance of customer acquisition, engagement, and retention. In today’s environment, customers choose brands with the tap of a screen or the click of a mouse – and can access a vast amount of information and recommendations to make buying decisions.

          To continue to thrive, companies must move towards making use of data and AI to drive the customer experience. The solution is to embrace a dynamic customer experience and journey management program – one that:

          • Delivers personalized, contextualized engagement across all customer touchpoints
          • Anticipates individual customer behaviors and addresses each customer’s needs
          • Operates in real-time, to deliver impactful customer experiences when they’re needed, seamlessly across all channels
          • Encompasses the entire enterprise ecosystem – from traditional CDP roles, such as marketing, to eCommerce, point-of-sale, customer service, R&D, production, supply-chain management, and shipping.

          It’s easy to understand the value of these objectives, but the descriptions are quite high-level. When shopping for a customer-experience solution, enterprises want to know how they can accomplish these goals. To answer that, here are seven examples.

          Better marketing – including suppression lists. Targeting and retargeting are important aspects of every marketing strategy. But targeting loyal customers with irrelevant advertisements not only annoys them, but also wastes precious marketing dollars. The ideal customer-experience solution will provide the marketing team with a suppression list created from systems across the enterprise. This will resolve multiple identities related to the same customer, to engage existing customers with relevant advertisements and exclude those customers from new-user acquisition campaigns.
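
          As a minimal sketch of the identity-resolution step behind such a suppression list (matching on a normalized email address is a deliberate simplification, and all record fields are invented):

```python
# Building a suppression list by resolving identities across systems
# (field names and the email-only match key are illustrative simplifications).
crm_customers = [{"name": "J. Smith", "email": "J.Smith@Example.com"}]
ecommerce_orders = [{"buyer_email": "j.smith@example.com"},
                    {"buyer_email": "new.prospect@example.com"}]

def normalize(email: str) -> str:
    """Crude identity key: trimmed, lower-cased email address."""
    return email.strip().lower()

existing = {normalize(c["email"]) for c in crm_customers}
seen_buying = {normalize(o["buyer_email"]) for o in ecommerce_orders}

# Existing customers are suppressed from new-user acquisition campaigns;
# genuinely new prospects are not.
suppression_list = seen_buying & existing
print(suppression_list)   # {'j.smith@example.com'}
```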

          Customer trust. The first step towards building a loyal customer is to win customer trust. Brands need to be totally transparent about data they collect and how they provide each customer with complete control of their own data. The ideal customer experience solution will create a unified customer profile – including data the customer supplied during the registration process and data the organization has inferred or consolidated from other enterprise systems. It will then give full control of that data to the customer.

          Managing consent. Consent is a large and evolving issue for enterprises to manage – especially as jurisdictions introduce different regulations. The well-planned customer journey will capture consent from the outset and automatically add the details to the customer’s profile. The company can then immediately act upon those preferences by ensuring customers are only included in engagement initiatives for which they have provided consent.

          Real-time conversion. Here’s a startling statistic: 70 percent of customers abandon their eCommerce carts before they check out. The ideal customer-experience solution will monitor the customer’s journey and predict when they are likely to become a “cart abandonist.” It will then initiate actions to encourage the customer to follow through with the sale, while the customer is still active on the channel.
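
          A hedged sketch of what such in-session prediction could look like; the signals, weights, and trigger threshold are invented for illustration, not a real model.

```python
# Naive in-session cart-abandonment risk score (features and weights invented).
def abandonment_risk(minutes_idle: float, cart_value: float, pages_since_cart: int) -> float:
    """Higher idle time, cart value, and post-cart browsing all raise the risk."""
    score = 0.04 * minutes_idle + 0.002 * cart_value + 0.10 * pages_since_cart
    return min(score, 1.0)

risk = abandonment_risk(minutes_idle=12, cart_value=180.0, pages_since_cart=3)
if risk > 0.7:   # act while the customer is still active on the channel
    print("trigger real-time nudge, e.g. free-shipping banner or chat prompt")
```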

          Identifying customers and prospects with high lifetime value. Repeat customers are not all created equally. Every company wants to identify customers with high lifetime value – and figure out how to acquire more of them. The right solution will classify existing customers, build profiles of them, and then use a “look-alike” approach to identify prospects that are likely to also become high-value customers.
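
          One way to picture the ‘look-alike’ step is a toy similarity calculation; the features and profiles below are invented, and a real system would use a properly trained model.

```python
# Toy look-alike scoring: compare prospects with the average high-LTV profile
# (features and numbers are invented; real systems would use trained models).
high_value_profiles = [(12, 420.0), (9, 380.0), (15, 510.0)]   # (orders/yr, avg order $)

# Centroid of known high-lifetime-value customers.
centroid = tuple(sum(vals) / len(vals) for vals in zip(*high_value_profiles))

def similarity(prospect: tuple) -> float:
    """Inverse Euclidean distance to the high-LTV centroid (higher = more alike)."""
    dist = sum((p - c) ** 2 for p, c in zip(prospect, centroid)) ** 0.5
    return 1.0 / (1.0 + dist)

prospects = {"P1": (11, 400.0), "P2": (1, 35.0)}
for pid, features in sorted(prospects.items(), key=lambda kv: similarity(kv[1]), reverse=True):
    print(pid, round(similarity(features), 4))   # P1 scores far higher than P2
```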

          Next-best action for customer service agents. Solving problems and reducing churn is a key role for customer service agents, and these frontline workers do their jobs best when they have access to useful data. The ideal customer-experience solution will unify online and offline data to provide the agent with a complete and up-to-date view of the enterprise’s interactions with that customer. This data includes interactions such as:

          • browsing on eCommerce sites, the company’s social media, and other platforms
          • purchases, both online and in-store
          • fulfillment and delivery
          • warranty status
          • any contact with the customer, including compliments, questions, or complaints.

          When a customer contacts an agent, a well-designed customer-experience solution should apply AI to the issue and the customer profile, then suggest to the agent the best action or actions to take in order to satisfy the customer and build loyalty.

          Reading the signs to reduce churn. Any enterprise that relies upon a subscription model needs to reduce churn. The best customer-experience solution will employ AI to read signals – such as a call or email to customer service – and score each customer on their likelihood to cancel. AI can then blend this score with the customer’s lifetime value and make recommendations about how to better engage with that customer. This engagement – in the form of personalized content, delivered across the most appropriate channel – can reduce the potential for churn.
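
          The blending step can be as simple as a weighted combination, as in this Python sketch; the weights and customer figures are assumptions for illustration:

          def engagement_priority(churn_risk, lifetime_value, max_ltv, w=0.5):
              """Equal-weight blend of churn risk (0-1) and normalized lifetime value."""
              return w * churn_risk + (1 - w) * (lifetime_value / max_ltv)

          customers = {"cust_a": (0.9, 200), "cust_b": (0.4, 5000), "cust_c": (0.8, 4000)}
          max_ltv = max(ltv for _, ltv in customers.values())
          ranked = sorted(customers,
                          key=lambda c: engagement_priority(*customers[c], max_ltv),
                          reverse=True)
          print(ranked)  # ['cust_c', 'cust_b', 'cust_a']: high risk AND high value first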

          These are just a few of the ways in which a well-designed solution can enhance the relationship between companies and customers. What’s more, these are all real-world examples – drawn from the experiences of clients who have deployed Data-driven Customer Experience by Capgemini.

          Data-driven Customer Experience empowers enterprises to take full advantage of interconnected data while building trust, transparency, and long-term relationships with customers. It consolidates relevant data from customer-first and other enterprise functions into unified customer profiles. These profiles uniquely identify each customer and provide a personalized, global view of their relationship with an organization’s brands. It does this in real-time to turn raw data into more reliable, actionable insights. And it preserves privacy and ensures compliance with all relevant regulatory requirements.

          With Data-driven Customer Experience, Capgemini’s Insights & Data professionals are ready to help organizations transform their customers’ journeys. Capgemini supports its clients through every step of the process – from creating the strategy and operating model to selecting the best mix of technology components and data platforms, to implementing the solution, training those who use it, and measuring its effectiveness.

          Author

          Marc De Forsanz

          Business Development and Portfolio Leader, Insights & Data Global “Customer First” playing field.

            5G and security: are you ready for what’s coming? 
            New risks and complex challenges require a comprehensive new strategy

            Chhavi Chaturvedi
            25 Jul 2022

            5G opens up whole new avenues of attack – is your organization ready?

            Every tech revolution comes with risks, and 5G is no exception. From IoT applications to the 4G-to-5G transition, the sheer scale of 5G usage is opening up an enormous attack surface. The promise of high bandwidth and low latency in the coming years is extraordinary, but organizations that are slow to react to the accompanying threats are taking a gamble. Fortunately, a number of security measures can substantially reduce these risks. Read on to learn how to keep pace with the security demands of 5G today.

            Security challenges

            Every promised benefit of 5G brings with it a corresponding risk. The number of connected IoT devices is growing at upwards of 18% per year, on course to pass 14 billion this year. Each new edge-computing device creates new vulnerabilities for bad actors to exploit. The decentralized nature of IoT products makes security measures difficult to implement at scale, while 5G’s greater bandwidth has the potential to fuel new DDoS attacks powerful enough to overwhelm organizations. And the expansive nature of 5G itself poses new risks. As the number of users climbs into the millions and billions and networks expand to accommodate more devices, network visibility plummets, and it becomes harder to track and prevent threats, especially from sophisticated attackers. Vulnerabilities across devices, the air interface, the RAN, backhaul, the 5G packet core and OAM, and the SGi/N6 and external roaming interfaces all need to be re-examined.

            Network Slicing is not enough 

            Many of today’s industrial services require specific performance guarantees – high throughput, low latency, high reliability, and so on – which can be achieved by network slicing: running multiple customized logical networks on shared infrastructure. In theory, network slicing should improve security, like the bulkheads on a ship, which contain a potential breach to one flood zone. This is the same logic behind IT network segmentation, an established best practice. However, just like network segmentation, network slicing alone does not guarantee that threats are contained. Without additional measures, they’re likely to pass seamlessly into the wider system. Network slicing also faces security challenges around resource sharing among slice tenants and slice security coordination – challenges that are fairly straightforward to solve, but do require attention.

            Mitigation Approaches 

            Businesses deploying 5G-connected equipment need an up-to-date set of security solutions capable of monitoring and protecting against the new generation of cyber threats. The specifics will vary from one organization to the next, but the backbone of the new strategy may look something like the following:

            Security edge protection

            Security edge protection is the foundation of 5G security, upon which all other strategic considerations rest. The following methods can help secure 5G edge installations:  

            • Encrypted tunnels, firewalls and access control to secure edge computing resources 
            • Automated patching to avoid outdated software and to reduce attack surface 
            • AI/ML technology to detect breaches, raise alerts, and take automated action where appropriate
            • Continuous maintenance and monitoring for the discovery of known and unknown vulnerabilities  
            • Securing the edge computing devices beyond the network layer 

            Zero trust architecture: never trust, always verify 

            Zero Trust Architecture (ZTA) eliminates implicit trust by continuously validating every action at every step. Based on perimeter-less security principles, ZTA requires each asset to implement its own security controls. It includes security features such as the following; a minimal sketch of the per-request verification loop appears after this list: 

            • Continuous logging, continuous monitoring, alerts and metrics 
            • Threat detection and response 
            • Policies & permissions 
            • Infrastructure security & secure software deployment lifecycle (supply chain security) 
            • Data confidentiality, even from the providers of the underlying hardware and software 
            • Container isolation 
            • Mutual authentication and TLS security 
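
            Here, as promised, is that sketch in Python; the token check, device-posture check, and policy table are hypothetical stand-ins for real identity, posture, and policy services:

            POLICY = {("analyst", "read:logs"): True}   # default-deny for everything else

            def verify_token(token):
                """Stand-in for real signature and expiry validation."""
                return token == "valid-token"

            def device_healthy(device):
                """Stand-in for posture checks such as patch level and attestation."""
                return device.get("patched", False)

            def authorize(request):
                """Every request is re-verified - no implicit trust from network location."""
                if not verify_token(request["token"]):
                    return "deny: bad credentials"
                if not device_healthy(request["device"]):
                    return "deny: unhealthy device"
                if not POLICY.get((request["role"], request["action"]), False):
                    return "deny: no matching policy"   # default-deny is the core principle
                return "allow"

            print(authorize({"token": "valid-token", "device": {"patched": True},
                             "role": "analyst", "action": "read:logs"}))   # allow
            print(authorize({"token": "valid-token", "device": {"patched": False},
                             "role": "analyst", "action": "read:logs"}))   # deny: unhealthy device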

            Container-based technology

            Containers bring the potential benefits of efficiency, agility, and resiliency. Gartner expects that up to 15% of enterprise applications will run in a container environment by 2024, up from less than 5% in 2020. Containers are orchestrated from configurable central control planes, which are used to scale workloads up and down, collect logs and metrics, and monitor security. Containers bring a few unique security risks, but they are solvable.

            When containers run in privileged mode or as root, they provide attackers with direct access to the kernel, from which they can escalate their privileges and gain access to sensitive information. It is therefore essential to add role-based access control and limit the permissions of deployed containers. It’s easy to run a container as a non-root user, simply by adding a USER instruction to the Dockerfile. Two more ways to enhance container security are to reject pods or containers that request privileged mode, or to keep privileged containers but limit their access to namespaces.
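
            To illustrate the first of those controls, here is a minimal Python sketch that checks a Kubernetes-style pod specification (expressed as plain dictionaries) and refuses privileged or root containers; it mimics what an admission policy would enforce, and is not a real admission-webhook implementation:

            def reject_privileged(pod_spec):
                """Refuse any pod whose containers run privileged or as root (UID 0)."""
                for c in pod_spec.get("spec", {}).get("containers", []):
                    ctx = c.get("securityContext", {})
                    if ctx.get("privileged") or ctx.get("runAsUser") == 0:
                        return False, f"container '{c.get('name')}' is privileged or runs as root"
                return True, "ok"

            pod = {"spec": {"containers": [
                {"name": "app",   "securityContext": {"runAsUser": 1000}},
                {"name": "debug", "securityContext": {"privileged": True}},
            ]}}
            print(reject_privileged(pod))  # (False, "container 'debug' is privileged or runs as root")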

            Automated operations and AI 

            The complexity of 5G infrastructure requires security applied at multiple levels. Handling security concerns such as threats, risks, device diversity, and scaling manually is so difficult as to be impractical. Additionally, manual operations introduce an element of uncertainty which may in some cases be exploited. There is absolutely a place for human ingenuity, but increasingly the operations level needs to be automated.

            What about AI/ML technologies – are they helpful, or just hype? Currently, a bit of both. They already have a role in security, primarily in detecting irregularities. The next step in AI/ML-based security will involve deep learning, through which the system builds its own capabilities through experience – theoretically going so far as to predict threats before they’re deployed. Claims about revolutionary AI protection need to be considered very sceptically, but at the same time the potential for AI to fundamentally alter network security is real. This is a space to watch. 
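
            As a minimal illustration of that irregularity-detection role, the following Python sketch trains scikit-learn’s IsolationForest on synthetic traffic features; the two features and the DDoS-like data point are invented for the example:

            import numpy as np
            from sklearn.ensemble import IsolationForest

            rng = np.random.default_rng(0)
            # synthetic baseline: columns are [packets per second, mean packet size]
            normal_traffic = rng.normal(loc=[1000, 512], scale=[50, 20], size=(500, 2))

            model = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

            suspect = np.array([[1010, 515],   # close to the baseline
                                [9000, 64]])   # flood of tiny packets, DDoS-like
            print(model.predict(suspect))      # 1 = looks normal, -1 = flagged as anomalous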

            Building on firm ground 

            The Capgemini Research Institute recently probed organizations’ preparedness for cyberattacks and revealed a concerning disconnect: 51% of industrial organizations expect cyberattacks on smart factories to increase over the next 12 months, and yet nearly as many (47%) report that cybersecurity is not a C-level concern. We see the lack of a comprehensive, system-wide approach to security as a serious long-term threat.

            It is tempting to describe security breaches as instantaneous, but an honest examination often reveals vulnerabilities that had been left out in the open for months or years, with no adequate protection. Security you can rely on starts early, with solid fundamentals across people, process, and technology. It’s not easy, but it’s doable.

            We can see the risks that come with 5G. Let’s put a security plan in place now. To learn more about our 5G security capabilities, contact us below. 

            TelcoInsights is a series of posts about the latest trends and opportunities in the telecommunications industry – powered by a community of global industry experts and thought leaders.

            Developing a quantum advantage includes joining an ecosystem of complementary partners

            Pascal Brier
            21 Jul 2022

            Business leaders with the right mindset will reap the rewards of an ecosystem approach to implement a successful quantum strategy and develop a long-term quantum advantage

            No single business can expect to manage every aspect of quantum exploration because of the scale of costs and skills involved. As we concluded in our last blog post, business leaders will need to look outside the organization to evaluate, prioritize, and implement a successful quantum strategy – and that’s where an ecosystem approach can pay dividends.

            Being part of an ecosystem is crucial to quantum success. Business leaders can use the ecosystem to create an amalgam of quantum expertise, to build the best possible platform, and to plug any skills gaps that appear.

            An ecosystem approach also ensures an investment in quantum is not overstretched. Financial resources will be a critical concern because quantum technology requires a broad range of expertise. It’s a pioneering area of technology led by specialists with knowledge in several key areas:

            Hardware specialists – Computers, sensors, and infrastructure, from tech giants such as IBM, AWS, and Google to smaller specialists such as Atom, Quandela, and Pasqal, including quantum cloud and hybrid architectures
            Software specialists – Quantum algorithm development, artificial intelligence, and machine learning from companies such as ApexQubit, which applies frontier technology to drug discovery
            Crypto agility and quantum cybersecurity specialists – Security protocols and standards supporting multiple quantum cryptographic primitives and algorithms, such as ISARA
            Startups and venture capitalists – Estimates suggest $5 billion of private capital has been invested in quantum technologies since 2002, with $3 billion of this coming in 2021 alone.
            Academic institutions – Technical institutes, such as Fraunhofer, and universities, including Cambridge and Instituto Superior Técnico, are providing cutting-edge research in quantum theory and practice.

            Strong industry examples point the way

            Multinational firms, including BMW and Goldman Sachs, are already working across these areas of expertise and developing an ecosystem approach to quantum. The result is an early foothold in a fast-evolving space that should result in a long-term advantage.

            Perhaps the most significant examples of a successful ecosystem approach so far are those that bring government and industry together to work on quantum challenges. Many of the applications that these partnerships are investigating concern issues with far-reaching consequences, such as climate change, environmental sustainability, industrial competitiveness, and economic growth.

            In these challenging scenarios, the consortium approach is becoming a norm, allowing a collection of knowledgeable parties to tackle intractable issues in partnership. These consortia are driven by core technology firms, blue-chip companies, governments, academic institutions, and other initiatives, such as Horizon Europe, which is the European Union’s research and innovation program.

            Take the example of EuroQCI, which is a European Commission initiative led by Airbus. This consortium includes a range of companies and research institutes that are exploring the future design of the European quantum communication network. Other examples of cross-institution consortia include the European Quantum Industry Consortium and the Quantum Economic Development Consortium.

            What’s more, a joined-up approach is already helping to forward research into quantum key distribution (QKD), which uses quantum mechanics to develop secure cryptographic protocols. The development of QKD technology is a significant challenge on a global scale. A large ecosystem of players, including hardware providers and software startups, is working to create quantum-based solutions.

            Across all these pioneering approaches, one thing remains constant: all organizations must ensure data is used in line with stringent regulatory requirements. Standards-setting bodies, including NIST in the US and ETSI in Europe, are helping to set the agenda and create boundaries for the private and public sector organizations working together on quantum technologies.

            The right mindset for an ecosystem approach

            As outlined in our recent report Quantum technologies: How to prepare your organization for a quantum advantage now, business leaders who want to develop a long-term quantum advantage should explore how to become part of an ecosystem at the earliest opportunity. Successful companies will develop a proactive mindset that supports other ecosystem members as they work together to develop creative solutions to intractable challenges. We think the right mindset includes five key characteristics:

            Communication – Working with external partners and across internal business units, as quantum’s impact goes beyond R&D and affects operational activities and processes
            Trust – Sharing information, perhaps highly confidential and related to intellectual property, that might not normally be revealed
            Collaboration – Fostering a collegial belief in the potential of complementary skills
            Imagination – A willingness to invest in meaningful research with a long-term view rather than just a focus on the top line or short-term revenue streams
            Participation – Being active in the wider community through initiatives such as industry events, meetups, and hackathons

            Business leaders with the right mindset will reap the rewards of an ecosystem approach.

            What new opportunities does the metaverse present for product and services companies?

            Dheeren Vélu
            20 Jul 2022
            capgemini-invent

            A few years from now, we may test drive our new EV in a metaverse showroom. Once we buy it, we may also opt for a virtual version, which we can drive around virtual worlds. Perhaps the carmaker will throw in some sweeteners, such as exclusive access to live virtual events for its metaverse drivers.

            Predicting exactly how the metaverse will look is a fool’s game. But with $120bn of investment in 2022, we can be reasonably confident it will become a place where people interact and transact. McKinsey reckons that by 2030, over 50% of live events could be held in the metaverse, and 80% of commerce could happen at least partially there. That is a big opportunity for product and services companies.

            All this may sound scary, especially if you are a company that makes real things in the real world. You should not feel scared, for two reasons. Firstly, the metaverse is not as complicated as it sounds. Secondly, you will miss out on big wins if you let fear hold you back.

            What exactly is the metaverse?

            To understand the opportunity, we first need to agree on what the metaverse is. The concept will evolve, but we would describe it as follows:

            The metaverse is an umbrella term for a range of virtual environments – accessed via screens or headsets – in which multiple parties, represented as avatars, can interact and transact. In most cases, these environments are ‘persistent’, ie any change you make or ownership you acquire remains when you leave.

            Practically, these worlds are made up of an immersive front end – a 3D virtual landscape – and a backend infrastructure that validates transactions using tokens and blockchain to confer permanent records of ownership.

            What are the business opportunities in the metaverse?

            For product and service providers wishing to take advantage of this, there are two overlapping avenues. One is to build virtual equivalents of products – eg, a car or a sneaker – that can be used in the metaverse. The other is to buy ‘land’ and build your own space – a shop, a stadium, a village – where your customers can buy your products or experience your services.

            So, a sneaker company could offer a virtual add-on that lets buyers wear the shoes – with all the status they represent – as they go about different activities in the metaverse. But the digital world also offers the potential for extra functionality – perhaps wearing a particular brand gives entry to exclusive virtual events, increasing its value in both worlds.

            You can also set up virtual shops to sell or showcase your physical products. The beauty of this virtual world is that they can be anywhere – why sell luxury handbags on a busy high street when you can sell them from a tropical beach? Likewise, B2B companies could create virtual showrooms to demonstrate high-value equipment that would be hard to bring to a customer.

            Or you may want to branch out into whole new areas. We have seen sports brands offer fitness subscriptions via apps; the next step for them could be virtual fitness studios in the metaverse, where users can work out, buy clothes, and sit down with experts such as nutritionists or personal trainers.

            How to grow your business in the metaverse

            So, how should product and service companies – who are not themselves tech companies – go about benefitting from all this?

            The most important driver of success will be a value-focussed strategy. Don’t just jump into the technology without a plan. Decide what you are trying to achieve from the metaverse.

            For example, you could:

            • Create virtual versions of existing products and services that you can sell in the metaverse
            • Create virtual shops and showrooms to sell your real-world offer in more immersive ways
            • Reach a new generation of customers
            • Create complementary services that generate new revenue streams
            • Build virtual spaces, events, or communities where you can gather your audience
            • Form partnerships – eg, with virtual event promoters – that allow you to increase your exclusivity by giving your customers unique offers or opportunities

            Right now, the key is experimentation. No one knows exactly how the metaverse will settle, so this is the time to get hands-on: brainstorm ideas for your business, test a few hypotheses, and develop proofs-of-concept for the most promising.

            Even if you don’t see high potential use cases immediately, they will come as you experiment and see what works. Like digitalization via IoT, products in the metaverse create the opportunity to collect detailed data on how users engage with your offer, allowing you to continually test and refine, kill failing projects quickly, and spot behavioral signals that hint at new opportunities.

            Experimentation with new and unproven technology may sound risky and costly. But a lot can be done cheaply at first. And if the tech revolution has taught us anything, it is that focused experimentation and failing fast is the route to new revenue. You will pay one way or another – through the cost of experimenting, or the cost of falling behind.

            Don’t be afraid of the metaverse technology

            A major risk is that companies will focus on the technology, get bogged down in technical complexities, and end up with no viable use cases. We urge you to focus on the business case and trust that the technology will support it.

            Metaverse tech is less complicated than it sounds. The building blocks – both of the virtual worlds (eg those provided by Unity or Roblox) and the contracts for virtual transactions – are becoming simpler and standardized. For more sophisticated offers, there is a growing pool of experts who can customize both. Decision makers should immerse themselves in the metaverse to get familiar with the possibilities, then contract technical people to make it happen.

            It is true that the metaverse still needs to mature. Right now, there are many sellers of digital land (eg, Decentraland, Sandbox) and knowing which one will be right for you is hard. This shouldn’t hold you back from picking the one that seems to be popular with your audience and experimenting. This will stand you in good stead as the industry evolves, and our prediction is that they will soon become interoperable, allowing your customers to take your virtual car or trainers between worlds.

            The next killer app

            Most of the ideas discussed above are ‘lift and shift’ – taking something that exists and creating a metaverse version. There are many examples already happening, and it is likely that most big businesses will have some metaverse offer by 2030.

            But the big opportunities will come from the things we haven’t even thought of yet. We don’t know what those are, any more than the early internet pioneers predicted Facebook or YouTube. But as a culture of innovation grows in the metaverse, someone – maybe you – will come up with the next killer app that changes the world.

            To discuss the opportunities or ideas outlined in this article, please contact the authors.

            Dheeren Vélu

            Head of Applied Innovation Exchange, AUNZ
            Dheeren Velu is Head of AIE and AI Leader at Capgemini ANZ, driving innovation at the intersection of technology and business. He leads the GenAI Task Force, delivering high-impact AI solutions. With deep expertise in AI and emerging tech, he’s a TEDx speaker, patent holder, and Chair of RMIT’s AI Industry Board, focused on transforming industries and the future of work.

            Nitin Dhemre

            Immersive Stream Lead of the Capgemini Metaverse Lab
            Nitin is Director, Innovation Strategy & Design, frog Growth Strategy Paris, and Immersive Stream Lead of the Capgemini Metaverse Lab.