
Unlocking the power of AI with data management

Capgemini
02 Mar 2022

Artificial intelligence is crucial to innovation and business growth in today’s digital world but, without data management, AI can be a black box that has unintended consequences.

This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

Written by:

Jitesh Ghai, Chief Product Officer, Informatica

In today’s data-driven economy, artificial intelligence (AI) and machine learning (ML) are powering digital transformation in every industry around the world. According to a 2021 World Economic Forum report (https://www.weforum.org/agenda/2021/01/here-s-how-to-flip-the-odds-in-favour-of-your-digital-transformation), more than 80 percent of CEOs say the pandemic has accelerated digital transformation. AI is top of mind for boardroom executives as a strategy to transform their businesses. AI and ML are critical to discovering new therapies in life sciences, reducing fraud and risk in financial services, and delivering personalized digital healthcare experiences, to name just a few examples that have helped the world as it emerges from the pandemic.

For business leaders, AI and ML may seem a bit like magic: their potential impact is clear but they may not quite understand how best to wield these powerful innovations. AI and ML are the underpinning technology for many new business solutions, be it for next-best actions, improved customer experience, efficient operations, or innovative products.

“AI IS MOST EFFECTIVE WHEN YOU THINK ABOUT HOW IT CAN HELP YOU ACCELERATE END-TO-END PROCESSES ACROSS YOUR ENTIRE DATA ENVIRONMENT.”

Machine learning in general, and especially deep learning, is data-hungry. For effective AI, we need to tap into a wide variety of data from inside and outside the organization. Doing AI and ML right requires answers to the following questions:

  • Is the data being used to train the model coming from the right systems?
  • Have we removed personally identifiable information and adhered to all regulations?
  • Are we transparent, and can we prove the lineage of the data that the model is using?
  • Can we document and be ready to show regulators or investigators that there is no bias in the data?

The answers require a foundation of intelligent data management. Without it, AI can be a black box that has unintended consequences.
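The second question above, removing personally identifiable information, can be sketched in miniature. This is a hypothetical illustration; the patterns and tags are invented, and real pipelines rely on dedicated data-masking tooling rather than regular expressions alone.

```python
import re

# Hypothetical illustration: masking common PII patterns before a record
# is used for model training. Real pipelines use dedicated masking tools,
# not regex alone.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace recognized PII spans with a type tag."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

record = "Contact jane.doe@example.com, SSN 123-45-6789"
print(mask_pii(record))  # Contact [EMAIL], SSN [SSN]
```

Answering the other questions (lineage, source systems, bias) requires metadata that no single script can supply, which is the point of the paragraph above.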

AI needs data management

The success of AI is dependent on the effectiveness of the models designed by data scientists to train and scale it. And the success of those models is dependent on the availability of trusted and timely data. If data is missing, incomplete, or inaccurate, the model’s behavior will be adversely affected during both training and deployment, which could lead to incorrect or biased predictions and reduce the value of the entire effort. AI also needs intelligent data management to quickly find all the features for the model; transform and prepare data to meet the needs of the AI model (feature scaling, standardization, etc.); deduplicate data and provide trusted master data about customers, patients, partners, and products; and provide end-to-end lineage of the data, including within the model and its operations.
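As a rough illustration of two of the preparation steps mentioned above, here is a minimal sketch of standardization (a common form of feature scaling) and deduplication. The function names are our own; this is not how any particular data-management product implements them.

```python
# Minimal sketch of two preparation steps named in the text:
# standardization (zero mean, unit variance) and deduplication.
# Illustrative only; production pipelines use data-management tooling.

def standardize(values):
    """Scale a numeric feature to zero mean and unit variance."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5
    return [(v - mean) / std for v in values]

def deduplicate(records, key):
    """Keep the first record seen for each key (e.g., a customer ID)."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

print(standardize([2.0, 4.0, 6.0]))  # [-1.224..., 0.0, 1.224...]
```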

Data management needs AI

AI and ML play a critical role in scaling the practices of data management. Due to the massive volumes of data needed for digital transformation, organizations must discover and catalog their critical data and metadata to certify the relevance, value, and security – and to ensure transparency. They must also cleanse and master this data. If data is not processed and made usable and trustworthy while adhering to governance policies, AI and ML models will deliver untrustworthy insights.

Don’t take a linear approach to an exponential challenge

Traditional approaches to data management are inefficient. Projects are implemented with little end-to-end metadata visibility and limited automation. There is no learning, the processing is expensive, and governance and privacy steps can’t keep pace with business demands. So how can organizations move at the speed of business, increase operational efficiency, and rapidly innovate?

This is where AI shines. AI can automate and simplify tasks related to data management across discovery, integration, cleansing, governance, and mastering. AI improves data understanding and identifies privacy and quality anomalies. AI is most effective when you think about how it can help you accelerate end-to-end processes across your entire data environment. That’s why we consider AI essential to data management and why Informatica has focused its innovation investments so heavily on the CLAIRE engine, its metadata-driven AI capability. CLAIRE leverages all unified metadata to automate and scale routine data management and stewardship tasks.

As a case in point, Banco ABC Brasil struggled to provide timely data for analysis due to slow manual processes. The bank turned to an AI-powered integration Platform-as-a-Service and automated data cataloging and quality to better understand its information using a full business glossary, and to run automated data quality checks to validate the inputs to the data lake. In addition, AI-powered cloud application integration automated Banco ABC Brasil’s credit-analysis process. Together, the automated processes reduced predictive model design and maintenance time by up to 70 percent and sharpened the accuracy of predictive models and insights with trusted, validated data. They also enabled analysts to build predictive models 50 percent faster, accelerating credit application decisions by 30 percent.

With comprehensive data management, AI and ML models can lead to effective decision-making that drives positive business outcomes. To counter the exponential challenge of ever-growing volumes of data, organizations need automated, metadata-driven data management.

INNOVATION TAKEAWAYS

Accelerate engineering
Data engineers can rapidly deliver trusted data using a recommender system for data integration, which learns from existing mappings.

Boost efficiency
AI can proactively flag outlier values and predict issues that may occur if not handled ahead of time.

Detect relationships among data
AI can detect relationships among data and reconstitute the original entity quickly, as well as identify similar datasets and make recommendations.

Automate data governance
In many cases, AI can automatically link business terms to physical data, minimizing errors and enabling automated data-quality remediation.
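The "boost efficiency" takeaway, proactively flagging outlier values, can be illustrated with a simple statistical rule. This z-score sketch is only a stand-in: metadata-driven engines of the kind described above learn data profiles rather than applying one fixed threshold.

```python
# Minimal sketch of the "flag outlier values" idea from the takeaways:
# a simple z-score rule. Real data-quality engines use learned profiles.

def flag_outliers(values, threshold=3.0):
    """Return indices of values more than `threshold` std deviations from the mean."""
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    if std == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

amounts = [100, 102, 98, 101, 99, 5000]
print(flag_outliers(amounts, threshold=2.0))  # [5]
```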

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible.

Proud to support the EcoBeautyScore consortium

Capgemini
28 Feb 2022

As part of our priority to shape a sustainable future, we are proud to partner with 30+ leading cosmetics brands in the breakthrough EcoBeautyScore Consortium initiative.

Consumer expectations are driving a transition towards more sustainable cosmetics, from ethically sourced ingredients and raw materials to eco-friendly packaging and production processes. Their purchasing decisions bring the influence to reduce the environmental footprint of the cosmetics industry and help build a regenerative future.

The EcoBeautyScore Consortium was established to provide increased transparency on the environmental impact of cosmetic products to enable consumers to more easily make such informed decisions.

“We are excited to be actively involved in this breakthrough EcoBeautyScore Consortium initiative, partnering with leading cosmetics brands to enable consumers to make more sustainable choices.”

– Emmanuel Fonteneau, Global Head of Consumer Products & Retail, Capgemini Invent

For more insights on this exciting initiative, please see below the latest press release from the EcoBeautyScore Consortium.

The EcoBeautyScore consortium is now Live with 36 industry players in a breakthrough initiative to enable more sustainable consumer choices

36 cosmetics and personal care companies as well as professional associations have joined forces to form the EcoBeautyScore Consortium, aiming to develop an industry-wide environmental impact assessment and scoring system for cosmetics products.

With small and large companies and associations from 4 continents, the EcoBeautyScore Consortium is truly global and inclusive. It remains open for other companies and associations to join.

The 36 members so far include: Amorepacific, Babor, Beiersdorf, Colgate-Palmolive, Cosmébio, COSMED, Cosmetic Valley, Cosmetics Europe, cosnova, Coty, The Estée Lauder Companies, Eugène Perma, FEBEA, The Fragrance Creators Association, Henkel, IKW Beauty Care, The International Fragrance Association, Johnson & Johnson Consumer Inc., JUST International AG, Kao, L’Oréal Groupe, LVMH, Nafigate, NAOS, Natrue, Natura &Co, NOHBA, Oriflame, P&G, Paragon Nordic, Puig, PZ Cussons, Shiseido, Sisley, STANPA, Unilever.

The purpose of the EcoBeautyScore Consortium: enable consumers to make sustainable choices through an environmental impact assessment and scoring system

The EcoBeautyScore Consortium is developing an industry-wide environmental impact assessment and scoring system for cosmetics products. The approach has a global scope and may help provide consumers with clear, transparent, and comparable environmental impact information, based on a common science-based methodology. This will contribute to meeting growing consumer demand for greater transparency about the environmental impact of cosmetics products (formula, packaging, and usage). Indeed, a significant proportion of consumers (42%) are interested in buying from brands that concentrate on circular and sustainable practices.

The work plan of the EcoBeautyScore Consortium: to co-build a scientific methodology for the environmental impact assessment and scoring system

The Consortium is working with the experienced sustainability consultancy Quantis to ensure a robust and scientific approach to co-build an assessment methodology and scoring system that are guided by and articulated around:

  1. A common method for measuring environmental impacts throughout the lifecycle of products, backed by the principles of the “Product Environmental Footprint” (the European Union’s PEF scientific method based on life cycle assessment (LCA) for quantifying the environmental footprint of products).
  2. A common database of environmental impact of standard ingredients and raw materials used in formulas and packaging, as well as during product usage.
  3. A common tool that enables the assessment of the environmental impact of individual products, usable by non-experts.
  4. A harmonized scoring system that enables companies, on a voluntary basis, to inform consumers about the environmental footprint of their cosmetic products. The methodology, database, tool, and scoring system will be verified by independent parties.
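To make the four-point plan concrete, here is a purely hypothetical sketch of how a lifecycle-based score might aggregate per-category impacts into one number. The categories, weights, and reference values below are invented for illustration; the Consortium's actual methodology is being co-built with Quantis and is not reproduced here.

```python
# Hypothetical illustration of a PEF-style aggregation: per-category lifecycle
# impacts are normalized against a reference product and weighted into one
# score. Categories, weights, and reference values are invented, not the
# Consortium's real methodology.

WEIGHTS   = {"climate_change": 0.5, "water_use": 0.3, "resource_use": 0.2}
REFERENCE = {"climate_change": 2.0, "water_use": 10.0, "resource_use": 5.0}

def environmental_score(impacts: dict) -> float:
    """Weighted sum of impacts normalized against a reference product."""
    return sum(WEIGHTS[c] * impacts[c] / REFERENCE[c] for c in WEIGHTS)

shampoo = {"climate_change": 1.0, "water_use": 12.0, "resource_use": 5.0}
print(round(environmental_score(shampoo), 3))  # 0.81
```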

Operationally, the EcoBeautyScore Consortium is also supported by Capgemini Invent (project management) and Mayer Brown (legal counsel).

The EcoBeautyScore Consortium next steps

The 36 members of the EcoBeautyScore Consortium have started to work together, organized in thematic working groups. A footprinting and scoring prototype is targeted for the end of 2022, providing environmental scoring for a selection of product categories at first. It will then be verified by independent parties.

The EcoBeautyScore Consortium is calling on cosmetics and personal care companies and professional associations to join this unique initiative

This Consortium is open to all cosmetics and personal care companies, regardless of their size or resources. Other stakeholders will be informed and consulted throughout the process. All companies will benefit from the pre-existing work and are invited to contribute with their own experience. The Consortium will also consult external experts, including scientists, academics, and NGOs to make sure the process is as inclusive as possible. The work developed by the Consortium will be published and may be used on a strictly voluntary basis by both Consortium participants and all other interested parties.

Cosmetics and personal care companies and professional associations wishing to know more are invited to contact: contact@ecobeautyscore-consortium.org.

Connected marketing – best practice

Abha Singh, Senior Director, Capgemini Business Process Outsourcing
25 Feb 2022

In the previous article, we looked at the increasing responsibilities of chief marketing officers (CMOs), and at how smart, integrated approaches to data can help them attain their goals. In this article, we’re taking a closer look at what connected, data-driven marketing looks like, and what it can achieve.

In its recent report, “A New Playbook for Chief Marketing Officers,” the Capgemini Research Institute (CRI) found that data-driven marketers outperform their counterparts in other organizations in four key areas:

1. Data-driven marketers can make the most of real-time marketing

For instance, 88% of highly data-driven marketers say they can adapt and change content based on real-time data (versus only 38% of other marketers), and 79% also say that they can deliver content based upon real-time understanding of customer needs (compared to 38% of other marketers). In addition, 77% say they can decide the next best course of marketing action based upon data and insights collected (against 48% of other marketers).

2. They realize better business outcomes from real-time marketing

Data-driven marketers also report better performance against key metrics for real-time marketing campaigns. These metrics include improved brand awareness/consideration; improved customer satisfaction; an increase in conversion rates of prospects to customers; and an increase in customer retention.

3. They have well-rounded talent

Data-driven marketers have a greater supply of data and technology talent. For example, almost three-quarters of data-driven marketers (72%) say they have the data analytics and data-science skills they need (against 40% for others). They also have a greater supply of core marketing skills, as well as soft qualities and skills such as empathy, collaboration, and emotional intelligence.

4. They foster creativity

Creativity and data are often considered opposites. Creativity is seen as requiring a more artistic and emotional mindset, while data skills are regarded as needing a more analytical and methodological viewpoint.

However, data – especially the insights obtained from first-party customer data – can be used to enhance the creativity of marketers. The CRI research finds that data-driven marketers nurture creativity, which can take one of several forms:

  • Building quick responses for changing trends
  • Syncing data and creativity in customer engagement:
    • Understanding consumer intent across different channels
    • Delivering new ideas for personalized content
    • Driving hyper-targeting in customer engagement
  • Pairing data and creative talent.

Recommended practices

Drawing on its research and experience, the CRI identified six focus areas in its report that are critical to ensuring CMOs are prepared for the future in a data-driven marketing environment:

  • Create a clear vision for the marketing strategy – ensure data-driven capabilities are at its core, and define the roadmap for transformation
  • Implement a framework-driven data-collection process – consider data from emerging digital touchpoints, and unify internal data silos
  • Ensure talent is equipped with data, creative skills, and specialists – focus on developing an analytical mindset in your team, and upskill on digital and performance marketing. Establish a center of excellence – and, in general, develop a learning culture
  • Accelerate collaboration across the marketing ecosystem – collaborate with key functions, such as IT, sales, and finance, and also with external partners
  • Reimagine the customer journey with real-time engagement – implement a customer data platform, and make use of listening tools to understand customer intent. Have a clear content management strategy, with appropriate solutions, and use automation tools for delivery
  • Integrate long-term brand building and short-term marketing engagements – allocate separate budgets for long-term and short-term marketing engagements.

Taking stock

Data is growing at explosive rates. It’s being driven by the quickening pace of digitalization, and also by the rise of e-commerce, which is itself accelerating because of the global pandemic.

This data growth is enabling marketing to achieve its potential. Marketing has never been more integral to business, as the CMO role has broadened and become more holistic, with many CMOs now responsible for customer experience and growth strategy. Given the need for marketers to understand how customers interact with brands and companies, and to know when and where to engage with them, real-time data will be a critical enabler for CMOs to deliver their broadened remit.

Successful organizations are reaping benefits ranging from more effective decision-making to better business outcomes and the ability to perform the real-time marketing that consumers increasingly expect.

In short, it is critical that today’s marketing teams be data-led, so they can drive sustainable growth.

To learn more about how Capgemini’s Connected Marketing Operations unlocks enhanced brand value and revenue impact through frictionless, digitally-augmented marketing operations, contact: abha.singh@capgemini.com

Read the full CRI report, “A New Playbook for Chief Marketing Officers,” to learn why CMOs should enable real-time marketing to drive sustainable growth.

About the author

Abha Singh
Senior Director, Capgemini Business Process Outsourcing
Abha drives large transformation and consultative sales, presales, and marketing projects for Capgemini’s clients, bringing innovation into the core of every area of her work.

    Innovation Nation | Summer 2022 edition

    Innovation Nation is much more than a magazine – it’s a zoom on what’s been happening in the last six months across the world of Intelligent Business Operations.

    5G cybersecurity: fusing automation with the human factor

    Geert van der Linden
    11 Mar 2022

    To ensure end-to-end collaboration to effectively manage complexity, firms need an organization with global capabilities

    What’s the difference between 5G and 4G? Simple: one is quicker than the other. It’s much more challenging, however, to answer the real question: what does this mean for organizations? Leaders must know the answer if they want to protect their businesses in the new threat landscape.

    So, before we look at tackling cybersecurity in the 5G era, let’s take a step back and consider why this network upgrade changes everything.

    Navigating the new threat landscape

    5G calls for significant changes in the uses of computing power and how apps are constructed. Once upon a time, an endpoint was typically a single PC terminal connected to a mainframe that took on all the computing. Entry points were easier to identify and secure.

    With 5G, the possible number of devices connected to the network is substantially higher, and so the volume of endpoints increases exponentially, forming what is called the internet of things (IoT). Research by Statista estimates that the total number of connected devices worldwide is set to triple from 8.7 billion in 2020 to 24.4 billion in 2030. It will no longer be necessary for the computing to be done within these new devices, which will shift to Edge networks, where much of the intelligence will be held. Unfortunately, new tides of data mean more entry points for cyberattacks.
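A quick calculation shows what the Statista projection cited above implies as an annual growth rate:

```python
# Growth rate implied by the Statista figures: 8.7 billion connected
# devices in 2020 growing to 24.4 billion in 2030.
devices_2020 = 8.7e9
devices_2030 = 24.4e9
years = 10

cagr = (devices_2030 / devices_2020) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # Implied annual growth: 10.9%
```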

    The federated nature of 5G infrastructure adds complexity to the security landscape that leaves even security professionals scratching their heads. A key question we must answer is: who is responsible for user security? First and foremost, responsibility lies with the endpoint provider – producers of 5G-enabled devices such as smartphones or cars, for instance. Take a truck manufactured today or in the near future: 50% of its parts are connected to the internet. Failure to secure any one of these parts against potential cyberattacks could seriously affect the safety of the driver and create related risks for the organization.

    Although their responsibility is smaller than the endpoint provider’s, telcos must also ensure that their core network is protected as more traffic flows through their networks.

    And finally, a degree of responsibility is shared with the user too, especially where sensitive data is involved.

    No choice but to automate

    If this all sounds unmanageable, that’s because it is. At least by humans, who are the weakest link in 5G cybersecurity. Why? Because multiplied scale and interconnectivity make traditional security measures, such as encrypting a laptop’s hard disk, almost redundant.

    Vulnerability managers, confronted with a vast multiplication of assets that require scanning at pace, won’t be able to find enough hours in a day to do so. For this reason, there must be a high level of automation in cybersecurity programs. The quicker organizations and security professionals understand this, the sooner they’ll be able to adapt to and succeed in the 5G connected world.

    Employing the human factor

    But we cannot remove humans from the cybersecurity equation completely, and many people are understandably concerned about leaving security in the hands of automated systems. One example is when the endpoint is a machine in an intensive care unit with responsibility for someone’s life. In such cases, some level of human input will always be required.

    It is, of course, essential that we also have humans who truly understand the new tools, and it may not surprise you that there’s a battle going on to recruit them. In 2013, there were 1.5 million unfilled cybersecurity jobs, whereas today, there are an estimated 3.5 million. The lack of skilled professionals is a problem that’s clearly not going away anytime soon. Whether it’s 5G or quantum computing, every new technology requires two kinds of specialists: one group with a deep knowledge of the overarching concept and another with deep knowledge of specific security considerations.

    This critical combination of factors has serious implications for the type of provider that wishes to deliver good-enough services to meet new requirements. Clients generally enjoy the intimacy of local security services and, until now, may have relied upon smaller firms or in-house personnel.

    With 5G, this will no longer suffice. To ensure end-to-end collaboration to effectively manage complexity, firms need an organization with global capabilities.

    Keeping pace with the new threat landscape can – and will – feel like an impossible task if you don’t act quickly. 5G offers extraordinary opportunities to level up an organization but seizing them requires robust protection that fuses the power of automation with the all-important human factor.

    Contact Capgemini today to find out how our network of global Cyber Defense Centers can help your organization embrace this.

    The rise of the cloud-native network data platform

    Yannick Martel
    25 Feb 2022

    Cloud-native network data platforms are changing the game for telco data. Learn how cloud can help you embed Agile and DevOps throughout your company in our new blog.

    Growing demand for network data

    CSPs are facing challenges on many fronts – rolling out new network technologies such as FTTH (Fiber to the Home) and 5G, improving customer satisfaction, while at the same time trying to reduce operating costs. The one link between these disparate goals is the need for network data. The variety of these data and their volume (often tens or hundreds of terabytes per day) make them valuable to many operational processes, and enable CSPs to:

    • build personal, one-to-one interactions with the customer, relying on a deep understanding of her behavior and experience;
    • improve service assurance, gaining insight into the quality of service that customers experience, without the need for surveys or complaints;
    • help engineering teams make smart network investments through insights into the nature and quality of services, value of customers, and bandwidth and latency appetite.

    In these cases and many others, readily available network data would help a lot. The question is: what’s the best way for CSPs to gather, collate and use their network data?

    An outdated dilemma: buying a platform or building your own?

    To most CSPs this is nothing new; indeed, they’ve been amongst the first adventurers in the deployment of big data. But up until recently they’ve been choosing between two strategies, neither of which fully meets their needs:

    1. Buying an off-the-shelf, proprietary solution. On the plus side, this option is pre-built with integrated telecom expertise. Unfortunately, data and derived insights are too often confined within the limits of a proprietary environment – tied to a specific data model, which is owned by the vendor and not the CSP. This makes it difficult or impossible to use your data in new ways, which may not have been considered when the platform was designed. It can also be very difficult to find the expertise necessary to manage these platforms, further limiting the potential uses of valuable data.
    2. Building one’s own big data platform with open-source technologies. This has proven quite effective at capturing massive amounts of data, but it demands significant resources. It’s difficult to evolve; it relies on dedicated expertise; and like the off-the-shelf option, it’s difficult to scale. It ties up valuable resources that you would prefer to allocate to solving business problems!

    All in all, both options have proven to be more expensive than initially expected, especially in the long run. They both lack flexibility and fail to exploit the full potential of data across the many dimensions of the organization.

    Why cloud-native technologies are taking the lead

    CSPs have already embraced cloud-native technologies to support their data transformations, with the first initiatives focusing on the corporate and customer domains. More recently, many have embraced the cloud for their network data initiatives, and for good reasons:

    • AI – artificial intelligence is becoming a ubiquitous tool, using data to improve operational processes as well as quality of service and relationships with clients. To implement and operationalize AI, CSPs and their data scientists need a wide choice of advanced tools and techniques, plus access to large datasets and large computational power at an economical cost when needed for training models. Cloud-native data platforms deliver these advanced AI capabilities, and with a much lower price tag than most in-house solutions;
    • Value – open source and cloud provide a wide range of other advanced capabilities as well, in a way that’s easy to use and highly cost-effective. Under constant pressure to improve their operations and invest wisely, CSPs can rely on cloud to make sure every penny counts;
    • Ownership – many CSPs have recently announced plans to become more “software-oriented” – in a sense more like the big Internet companies – developing software on their own, within their own control. They realize the criticality of their data and associated software, which manages, extracts value from and activates those data. It benefits CSPs greatly to own their code and make it a core asset. Cloud aids massively in the creation of agile software solutions, and in later adaptations and scaling.
    • Uniqueness – behind a façade of high normalization, networks and service platforms actually differ substantially from CSP to CSP, due to the histories of the particular organizations, the choices of architecture, and the combination of vendors and technologies. Thus, any off-the-shelf solution requires significant amounts of adaptation to meet the individual needs of a given CSP (due to the innate limitations of its design). Cloud-native data platforms make it possible to collect, process and massage data according to the unique needs of each operator. One size does not fit all when managing complex networks and extracting information out of them!

    In a word, cloud offers agility. It makes it possible to experiment, adjust, pivot, personalize and scale with a freedom that’s simply not practical with off-the-shelf or in-house platforms. And with the increasingly central role of data, this freedom helps to enable Agility and DevOps throughout your organization.

    Functions expected from the Network Data Platform

    Cloud-native data technologies are making the dream of network data democratization come true, while helping CSPs address many of the challenges they’re facing, including the three mentioned at the outset. How?

    • One-to-one interactions – a cloud data platform renders network and service usage data more accessible – breaking down silos while ensuring security and privacy. This makes it possible to share the data necessary to drive the personalized, one-to-one customer interactions CSPs strive for.
    • Service assurance automation – a combined edge / cloud data platform collects CPE (Customer Premises Equipment) and network telemetry data efficiently – filtering, aggregating and correlating to obtain real-time insights into the operation of services. CSPs can thus spot potential problems early and identify root causes.
    • Smart investments – a cloud data platform collects high volumes of data on network, usage, and quality of service (typically from cell towers and transmission equipment), aggregates them, and applies advanced analytics and machine learning to anticipate future consumption patterns, identify revenue opportunities, and prepare optimal investment allocation.
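The filter-and-aggregate step described in the service-assurance bullet can be sketched as follows. The telemetry field names and the signal threshold are hypothetical:

```python
# Minimal sketch of the filter-then-aggregate step described for service
# assurance: raw CPE telemetry is reduced at the edge to per-device
# aggregates before being shipped onward. Field names are hypothetical.
from collections import defaultdict

def aggregate_window(samples, min_signal=-90):
    """Drop unusable samples, then average latency per device for the window."""
    kept = [s for s in samples if s["signal_dbm"] >= min_signal]  # filter
    sums = defaultdict(lambda: [0.0, 0])
    for s in kept:                                                # aggregate
        acc = sums[s["device_id"]]
        acc[0] += s["latency_ms"]
        acc[1] += 1
    return {d: total / n for d, (total, n) in sums.items()}

window = [
    {"device_id": "cpe-1", "latency_ms": 20.0, "signal_dbm": -60},
    {"device_id": "cpe-1", "latency_ms": 30.0, "signal_dbm": -70},
    {"device_id": "cpe-2", "latency_ms": 80.0, "signal_dbm": -95},  # filtered out
]
print(aggregate_window(window))  # {'cpe-1': 25.0}
```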

    Capgemini as partner for building cloud network data platforms

    At Capgemini we have experience deploying network data platforms for our clients, starting with on-premises big data platforms and then migrating them to the cloud, or starting with a cloud-native use case and expanding. We expect to see more CSPs taking advantage of the possibilities of cloud network data platforms to unleash the power of their data, remain in control, extract value and become data masters in combining network and customer data. Contact us to learn more.

    Interested in cloud network data platforms?

    Read about Capgemini’s Cloud Platform

    TelcoInsights is a series of posts about the latest trends and opportunities in the telecommunications industry – powered by a community of global industry experts and thought leaders.

    Ubiquitous Edge-based telematics solution for proactive maintenance of the connected car

    Capgemini
    25 Feb 2022

    Proactive maintenance uses high-level software to monitor and analyze the condition of the vehicle’s systems, share the data with the service center, and schedule a cost-effective service.

    The automobile industry follows a periodic maintenance model for servicing cars, defined by the OEM based on travel distance or time. During the inspection, certain critical parts are examined visually by the OEM service center’s maintenance team to ensure they are functioning as designed. Sometimes, the team will identify car parts that need replacing to avoid major breakdowns. However, one downside of periodic maintenance is that it can lead to higher service costs, as parts are replaced to maintain performance even though they are still functioning.

    A better way to manage car servicing is proactive maintenance that utilizes intelligence embedded in the car to monitor and assess the working condition of parts. The car collects sensor data at regular intervals and sends the relevant data to the service center. The maintenance team uses the data to predict failures in advance and provides quick service as needed to meet the car owner’s service level agreement.

    The transition to proactive maintenance is part of the automotive industry’s digital transformation journey. Cars now have cellular connectivity (e.g., 4G and 5G), enabling them to share telematics data with the service provider over the internet for real-time monitoring and to schedule a service. The shared data from the car include speed, idling time, harsh acceleration, harsh braking, fuel consumption, mileage per gallon/liter, coolant temperature, level of coolant, maximum speed, engine oil level, fuel level, and distance traveled. (See Figure 1.)
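    The shared fields listed above can be modeled as a simple record type for transmission. A minimal sketch, assuming hypothetical field names and units (this is illustrative, not an OEM schema):

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical telematics record mirroring the fields described in the text;
# the names and units are illustrative assumptions, not an OEM specification.
@dataclass
class TelematicsRecord:
    speed_kmh: float
    idling_time_s: int
    harsh_acceleration: bool
    harsh_braking: bool
    fuel_level_pct: float
    coolant_temp_c: float
    engine_oil_level_pct: float
    odometer_km: float

    def to_payload(self) -> str:
        """Serialize the record for transmission to the service center."""
        return json.dumps(asdict(self))

record = TelematicsRecord(82.5, 340, False, True, 61.0, 92.4, 88.0, 15230.7)
payload = record.to_payload()
```

    A typed record like this keeps the car-side collector and the service-center consumer in agreement about what each field means.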

    The data collected can trigger an emergency or routine service. In addition, managing the runtime/real-time data can lower the cost of the traditional pre-set service. For example, if a specific part needs immediate replacement, it can be addressed before failure. The alternative is to risk a breakdown that would be significantly more expensive for the car owner.

    Telematics data captured from the car network
    Source: Capgemini Engineering

    Real-time monitoring continuously assesses the connected car’s health and predicts faults and component failure. The data collected from the car can be used to create personalized plans for vacation car travel, improve driving enjoyment, ensure safety and reliability, and reduce the chance of a significant breakdown during vacation. The car collects sensor data over the controller area network (CAN) installed in the car and sends it to the service center, where it informs decisions about part replacement and predictions of the resulting performance improvement.

    The biggest challenge of monitoring real-time car data is data management. Data generated by the various sensors – cameras, lidar, radar, ultrasonic sensors, and the Global Navigation Satellite System – can reach 25 GB per hour or more.[1] Analyzing the data in real time therefore requires a mechanism to analyze and manage it at the network’s edge – in the car’s e-cockpit platform – because 25 GB per hour cannot be transmitted over the mobile network operator’s 3G or 4G LTE cellular network to the OEM’s datacenter in the cloud for analysis.
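    A quick back-of-envelope calculation illustrates the gap; the 10 Mbit/s sustained uplink figure is an assumption for illustration, not a measured network property:

```python
# Back-of-envelope check: sensor output rate vs. an assumed cellular uplink.
sensor_rate_gb_per_h = 25
# 25 GB/h -> megabits per second: GB * 8 (bits) * 1000 (MB->GB) / 3600 (s/h)
sensor_rate_mbit_s = sensor_rate_gb_per_h * 8 * 1000 / 3600   # ~55.6 Mbit/s
uplink_mbit_s = 10            # assumed sustained LTE uplink (illustrative)
shortfall = sensor_rate_mbit_s / uplink_mbit_s                # ~5.6x over capacity
print(f"Sensors produce ~{sensor_rate_mbit_s:.1f} Mbit/s, "
      f"~{shortfall:.1f}x an assumed {uplink_mbit_s} Mbit/s uplink")
```

    Even under generous assumptions, the raw stream exceeds the uplink several times over, which is why filtering must happen at the edge.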

    The car data monitoring system (CDMS) performs data collection, processing, and analytics inside the car network at the edge. After processing, filtering, and analyzing the data at the edge, only essential data is transmitted securely to the OEM datacenter to avoid hackers tampering with it. (See Figure 2.) Always-on connectivity is required to avoid any interruption when transferring the data.
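    A minimal sketch of the edge-side filtering step described above, with invented thresholds and field names (a real CDMS would use calibrated, model-specific ranges):

```python
# Edge-side filtering sketch: only out-of-range readings are forwarded
# to the OEM datacenter. Thresholds and field names are illustrative.
THRESHOLDS = {
    "coolant_temp_c": (70.0, 105.0),       # (min, max) normal operating range
    "engine_oil_level_pct": (20.0, 100.0),
    "fuel_level_pct": (5.0, 100.0),
}

def filter_essential(readings: dict) -> dict:
    """Return only the readings that fall outside their normal range."""
    essential = {}
    for key, value in readings.items():
        lo, hi = THRESHOLDS.get(key, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            essential[key] = value
    return essential

sample = {"coolant_temp_c": 112.3, "engine_oil_level_pct": 85.0, "fuel_level_pct": 48.0}
to_transmit = filter_essential(sample)   # only the overheating coolant reading
```

    Dropping in-range readings at the edge is what reduces a 25 GB/h raw stream to a payload the cellular link can carry.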

    Using CDMS in the connected-car ecosystem is a growing trend for monitoring the wear and tear of car parts when in drive mode based on data collected from various sensors. The service center uses the information to predict any upcoming maintenance needs, which helps prevent drivers from getting stuck on the side of the road facing significant repair costs. Establishing secure, always-on connectivity with the service center and sharing critical data can potentially enrich the car’s performance in the long run and avoid major breakdowns.

    The benefits of CDMS in the car’s e-cockpit platform include:

    • Lower cost of maintaining, repairing, and inspecting the car ecosystem
    • Less downtime
    • Better performance
    • Improved reliability, availability, and maintainability
    • Delivering intelligence by evaluating the relevant data and using it to achieve longer service life and lower service costs

    The CDMS enables the service center to monitor the car’s condition remotely. When servicing the car, staff can account for how the car is driven to estimate the car’s lifespan. The car telematics data is collected, processed, and analyzed to predict possible failures, and the data is used to inform the type of maintenance work the car needs. If an error is found, service staff can take action to delay or prevent failure.

    The CDMS can diagnose faults before they become critical and predict the life of the components to ensure operational effectiveness, thereby reducing maintenance overhead. The CDMS addresses this challenge by measuring different parameters and providing support for proactive analysis and other statistics to display the present status of car components by evaluating the car’s current health and the driver’s safety data. (See Figure 3.)

    There are five key elements required for the CDMS to handle the data in the e-cockpit platform:

    1. Data collection and processing
    2. Data monitoring and management
    3. Data traffic optimization (compression)
    4. Secure data transmission
    5. Seamless connectivity
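    Elements 3 and 4 can be sketched together: compress the filtered payload, then attach an integrity tag so the datacenter can detect tampering. The shared key is hypothetical, and a production system would rely on TLS and proper key management rather than this toy scheme:

```python
import hashlib
import hmac
import json
import zlib

# Illustrative shared secret; real deployments would provision keys securely.
SHARED_KEY = b"demo-key-not-for-production"

def prepare_for_transmission(readings: dict) -> dict:
    """Compress the payload (element 3) and attach an HMAC tag (element 4)."""
    raw = json.dumps(readings).encode()
    compressed = zlib.compress(raw, level=9)
    tag = hmac.new(SHARED_KEY, compressed, hashlib.sha256).hexdigest()
    return {"body": compressed, "hmac": tag}

def verify_and_decode(message: dict) -> dict:
    """Datacenter side: reject tampered payloads, then decompress."""
    expected = hmac.new(SHARED_KEY, message["body"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["hmac"]):
        raise ValueError("payload tampered with in transit")
    return json.loads(zlib.decompress(message["body"]))

msg = prepare_for_transmission({"coolant_temp_c": 112.3, "odometer_km": 15230.7})
assert verify_and_decode(msg) == {"coolant_temp_c": 112.3, "odometer_km": 15230.7}
```

    The same round-trip shape applies regardless of transport; elements 1, 2, and 5 would wrap this in collection loops and connection management.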

    The connected car market is emerging and can significantly boost revenues for OEMs, Tier-1 suppliers, and service providers in the coming years. Four trends will define tomorrow’s cars: connected, autonomous, shared, and electric. If a car breaks down or suffers equipment failure while in use, it can transmit its telematics data securely to a roadside assistance technician, who can quickly trace the problem, fix it, and get the driver on their way.

    Telematics will be a vital component of the car ecosystem in the coming years. For instance, it will provide immediate service information to the driver by constantly monitoring the car’s parameters. The CDMS framework embedded in the e-cockpit platform will collect and process data received from the CAN bus, enabling seamless switching over a 4G or 5G cellular network. With fast and stable access to the internet, the driver can access the menu of services offered by the OEM service center.

    With advanced technology in the connected car ecosystem, the OEM service center will monitor the car’s parts while it is being driven and determine the exact time for the next inspection and replacement of critical components based on their actual condition. The CDMS framework will also monitor the car’s systems and the owner’s driving habits to suggest ways to improve situational awareness, displayed on the e-cockpit dashboard. In addition to analyzing the data, the CDMS will manage the car’s data security, data compression, and high-speed data mobility (i.e., seamless connectivity with persistence) to improve the end-user experience, addressing all network-related challenges.

    [1] Simon Wright, “Autonomous cars generate more than 300 TB of data per year,” Jul 2, 2021, Tuxera https://www.tuxera.com/blog/autonomous-cars-300-tb-of-data-per-year

    For an in-depth analysis, download the white paper “OPTIMIZE VEHICLE SERVICE WITH EDGE-BASED TELEMATICS“.

    Author: Vijay Anand, Senior Director, Technology, and Chief IoT Architect, Capgemini Engineering

    Vijay plays a strategic leadership role in building connected IoT solutions in many market segments, including consumer and industrial IoT. He has over 25 years of experience and has published 19 research papers, including IEEE award-winning articles. He is currently pursuing a Ph.D. at the Crescent Institute of Science and Technology, India.

    Connected marketing – faster, higher, stronger, together

    Abha Singh Senior Director, Capgemini Business Process Outsourcing
    25 February 2022

    The marketing function has become central to the success of the enterprise. Real-time, data-driven marketing enables marketers to be more proactive in engaging customers, making decisions and enhancing the e-commerce customer experience.

    Last summer – in 2021 – Tokyo played host to the 2020 Olympic and Paralympic Games. It’s a sign of the times in which we’ve been living. Since then, a new normal has set in – and the world has embraced virtual digital experiences across every aspect of life.

    The world of retail is another case in point. Pre-pandemic, we were seeing significant growth in e-commerce sales: in 2019, they grew 20.2% year-on-year to reach $3.35 trillion. But in 2020, under lockdown, that growth accelerated breathtakingly – by 27.6%, to $4.28 trillion.

    eMarketer, “Worldwide ecommerce will approach $5 trillion this year,” January 2021: quoted in “A New Playbook for Chief Marketing Officers,” published by the Capgemini Research Institute

    The parallel between the two struck me recently. The traditional motto of the Olympic Games is “citius, altius, fortius” – “faster, higher, stronger.” Retail organizations in general, and their chief marketing officers (CMOs) in particular, are experiencing broader, more sustained, and more rapid growth than ever before – and the same is true of CMOs’ roles.

    Their function is expanding beyond brand-building and scheduled promotions to include a wide range of other activities, including data analysis, mar-tech deployment, business strategy, business growth, supply chain integration for fulfilment, and customer experience. Faster, higher, stronger demands, indeed.

    A strategic partner to driving business growth

    It’s a trend that’s corroborated by a recent report from the Capgemini Research Institute (CRI). “A New Playbook for Chief Marketing Officers” tells us that over half (57%) of marketers agree that their C-suite executives now see marketing not as a cost center, but as a strategic partner in driving business growth.

    Given the increasing breadth of their responsibilities, you might think that CMOs are seeking greater support from external marketing specialists. That’s not entirely the case. While the CRI report does show that most marketers say their teams work in partnership with agencies on activities such as branding, marketing strategy, and digital marketing, it also reveals that in the next two to three years, 43% of them plan to bring this work in-house.

    In short, marketing has become central to the success of the enterprise – and, given the growing importance of e-commerce and the need for marketers to understand how customers interact with brands and companies (and to know when and where to engage with them), it increasingly needs to respond in real time.

    Real-time, data-driven marketing

    Real-time marketing enables marketers to collect relevant customer data, make quick decisions along the customer journey, be more proactive in engaging customers, support customized content, and enhance the e-commerce experience. It depends not just on gathering data, but on interpreting it and acting upon it quickly. Data-driven marketers process, analyze, and leverage data to fine-tune their campaigns, their content, and their other marketing outputs. By taking a data-driven approach, they also gain deeper understanding of consumers and trends, and target customers with personalized and relevant offers and services.

    Analysis in the CRI report shows that fewer than half of marketers can turn their data to good advantage. More of them would like to be able to develop data-driven go-to-market strategies.

    Opportunity for progress

    There is an opportunity here to make real progress. To take it, CMOs need access to a data platform that provides a unified view of the customer. They also need AI tools and skills to automate their customer segmentation and grouping. The best way they can do all this is to bring together the people and the processes in their organizations, removing obstacles, and creating what we at Capgemini call the Frictionless Enterprise.

    Capgemini’s own Connected Marketing Operations offer provides one such solution. It acts as a central hub, and aggregates knowledge, so organizations can see both the bigger and the smaller picture; so they can extend their channel reach, and increase the effectiveness of campaigns; and so they can improve their operational efficiency at the same time.

    Integrated – and smart

    Just before the Tokyo Olympics last July, the International Olympic Committee added a word to that famous motto. Translated from Latin, it now reads: “Faster, higher, stronger – together.” In today’s fast-paced, rapidly growing marketing environment, that sense of togetherness is just as important. When everything comes together in a single enterprise, with a common platform sharing data that can be interpreted and actioned in real-time – that’s when the magic can happen.

    In the second article in this short series, we’ll be looking more closely at what data-driven marketing looks like, and what it can achieve.

    To learn more about how Capgemini’s Connected Marketing Operations unlocks enhanced brand value and revenue impact through frictionless, digitally-augmented marketing operations, contact: abha.singh@capgemini.com

    Read the full CRI “A New Playbook for Chief Marketing Officers” report to learn why CMOs should enable real-time marketing to drive sustainable growth.

        About the author

    Abha Singh Senior Director, Capgemini Business Process Outsourcing

    Abha drives large transformation and consultative sales, presales, and marketing projects for Capgemini’s clients, bringing innovation into the core of every area of her work.

      For telcos, reaching net zero means mastering energy efficiency

      Vincent de Montalivet
      18 Feb 2022

      Find out how telcos are leveraging technology to reach their net-zero targets.

      Pulled in two directions

      Over the next five years, energy efficiency will become a core focus in the telco industry, as two sets of forces pull in opposite directions. First, energy needs will rise. A phenomenal rise of data and energy-hungry tech will continue to drive energy usage, which already constitutes 20 – 40% of telco OPEX. Second, industries around the world are beginning to respond in earnest to calls for more sustainable practices, which are coming not just from governments and the public, but increasingly from enterprise clients. It’s no wonder that upwards of one-third of operators have committed to net-zero goals. Telcos that fail in this challenge will soon find themselves operating with higher costs, degraded reputations and fewer clients. But the opportunity for those who successfully increase their energy efficiency will be profound. With enviable cost savings and well-earned ESG status, these telcos will operate with a powerful competitive advantage.

      Exponentially increasing data needs

      Data has been on the rise for years, and will continue accelerating in the near future. Technologies such as holograms, video-led digital experiences and new, digitally-generated realities such as the metaverse will require data on a scale that dwarfs anything we’ve seen. Even basic services will become increasingly data-heavy. Worldwide, average mobile subscriber data usage is expected to increase from 11.4 GB/month today to 34 GB/month by 2025 and 53 GB/month by 2027. Although 5G is proportionally more energy efficient than 4G, that difference will be swamped by the amount of data surging through networks. Additionally, 5G sites require more energy than their 4G equivalents, and 5G depends upon large numbers of closely-linked data stations. Today telcos consume between 2 and 3% of the world’s energy. Consumption will rise over the coming years; there’s little doubt of that. Telcos would be wise to put their energy efficiency strategy into action before this rise in data and energy becomes unmanageable.

      And many are doing just that. A recent GSMA survey of mobile operators found that 92% rated energy efficiency and sustainability as very or extremely important. But how to make that change is another matter. New Science Based Targets initiative (SBTi) standards limit carbon offsetting to just 5%. Going beyond that requires analyzing large amounts of data to identify critical network weaknesses, but for most operators it is precisely these capabilities that are incomplete. In fact, many active and passive network equipment elements are not currently set up to measure energy consumption, let alone optimize it.

      Where to start

      The number one place to look for improvements is in the networks. Network energy usage swallows up a huge share of a mobile operator’s OPEX, and 70-90% of that energy is consumed by the RAN. These massive networks were built to maximize connectivity, not to minimize energy usage. And that means plenty of opportunity to go back in and make improvements.

      From our white paper, 5G Network Energy Efficiency

      Some telcos have begun to find ways to cut the emissions from their networks. Our team recently worked with one industry leader on a project to increase spectral efficiency. This project has demonstrated that, through AI-based RAN optimization, there is strong potential to reduce the number of sites and lower emissions. Other telco players have found success implementing remote sites powered by self-sufficient renewable energy sources – solar, wind and hydrogen. Vodafone, for example, has been launching eco towers across the UK, and Telia has launched a self-sufficient tower in the scenic Trollstigen region to address the coverage need in this remote but trafficked area.

      The transition from 4G to 5G also provides an interesting opportunity to optimize for energy saving. Many operators are in the midst of sunsetting their 2G and 3G networks, which is beneficial from an energy efficiency perspective, especially when equipment can be reused or recycled. A few additional steps operators might take include:

      • Modernizing the network across all network nodes (especially transport, data-center and RAN)
      • Implementing energy measurement and saving features, such as AI-powered MIMO sleep mode, which in turn also improves network performance
      • Adopting virtualization-based architectures across the network, including virtualized RAN
      • Utilizing AI/ML concepts and O-RAN-based architecture to improve network efficiency

      Nokia and Telefónica have made great strides in the last category, and are working together to build green 5G networks, as well as developing smart energy network infrastructure and AI/ML technologies to improve sustainability and performance. Despite traffic tripling since 2015, Telefónica succeeded in reducing its energy consumption last year by 1%. Ericsson has also run successful trials applying reinforcement learning (a type of machine learning) to the remote electrical tilt of antennas, resulting in a 20% decrease in downlink (DL) transmission power without affecting performance. And the trend is spreading. Research finds that fully half of CSPs expect to save 10-20% in energy costs in the coming years – that’s 50% of CSPs that will be operating with significantly lower costs, and with a powerful sales advantage.
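      The reinforcement-learning idea can be illustrated with a toy epsilon-greedy bandit that learns a downtilt setting. The reward model below is invented purely for illustration; it is not Ericsson’s actual formulation, which is far richer:

```python
import random

# Toy epsilon-greedy sketch of learning a remote electrical tilt setting.
# Candidate angles and the reward function are illustrative assumptions.
random.seed(0)
TILTS = [0, 2, 4, 6, 8]            # candidate downtilt angles (degrees)

def reward(tilt: int) -> float:
    """Pretend environment: net benefit peaks at 4 degrees, with noise."""
    return -abs(tilt - 4) + random.gauss(0, 0.1)

q = {t: 0.0 for t in TILTS}        # running value estimate per tilt
counts = {t: 0 for t in TILTS}
epsilon = 0.1                      # exploration rate

for step in range(2000):
    if random.random() < epsilon:              # explore a random tilt
        tilt = random.choice(TILTS)
    else:                                      # exploit best estimate so far
        tilt = max(q, key=q.get)
    r = reward(tilt)
    counts[tilt] += 1
    q[tilt] += (r - q[tilt]) / counts[tilt]    # incremental mean update

best_tilt = max(q, key=q.get)
```

      The same explore/exploit loop, scaled up with real KPIs as the reward signal, is the core of how such trials tune antenna parameters without degrading performance.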

      The triple bottom line

      Tackling inefficiency in networks is a crucial step, and many telcos already have a strategy in place. Where they face difficulty is primarily in tracking and measuring their progress along their net zero roadmaps. In addition to reducing their own carbon footprints, CSPs need to develop greener services and products in order to reduce their scope-3 emissions. These scope-3 emissions loom large on the horizon – accounting for around 75% of the carbon footprint of a typical telco, according to some estimates. Reducing them will require impressive data capabilities and new levels of industry collaboration. Let’s not fool ourselves: reaching net zero will not be an easy task for mobile operators, other companies, governments or private individuals. Lasting change depends on a mindset change in all layers of the organization. The benefits are a triple bottom line – bringing value to telecom operators, customers and our shared environment.

      The fact is, what you can’t measure, you can’t manage. That’s where Capgemini’s capabilities are playing a crucial role in CSPs’ journey to net zero. To learn more about the ways we can help you leverage your data and equipment to reduce emissions, contact us below. With the power of intelligent platforms, the future of telecommunications will be efficient, sustainable and bright.

      #TelcoInsights is a series of posts about the latest trends and opportunities in the telecommunications industry – powered by a community of global industry experts and thought leaders.

      Authors

      Ane-Marte Weng

      Expert in Media & Entertainment, Telecom
      Shamik Mishra

      CTO of Connectivity, Capgemini Engineering
      Shamik Mishra is the Global CTO for Connectivity at Capgemini Engineering. An experienced technology and innovation executive, he drives growth through technology innovation, strategy, roadmaps, architecture and R&D in the telecommunications and software domains. He has rich experience in wireless, platform software and cloud computing, leading offer development and new product introduction for 5G, edge computing, virtualization and intelligent network operations.
      Vincent de Montalivet

      Senior Director – Sustainability Transformation, Data & AI Portfolio, Capgemini
      Vincent leads the Global Sustainability Practice for Data & AI Portfolio along with North America Sustainability GTM. With a background in Sustainability Strategy, Engineering and Architecture, Vincent has been instrumental in driving sustainability digitalization efforts and transformation programs within major organizations across industries while ensuring tangible business outcomes.

        How cross-industry data collaboration powers innovation

        Eve Besant
        2022-02-18

        This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

        Written by Eve Besant SVP, Worldwide Sales Engineering, Snowflake

        Innovation doesn’t happen in a vacuum. The development of new products, services, and solutions involves input and information from a multitude of sources. Increasingly, many of these sources are not only beyond an organization’s borders but also beyond the organization’s industry. According to a 2020 research paper on cross-sector partnerships, “Cross-industry innovation is becoming more relevant for firms, as this approach often results in radical innovations.” But developing innovations through cross-industry partnerships must involve coordinated data collaboration. “Firms can only benefit from cross-industry innovation if they are open to external knowledge sources and understand how to explore, transform, and exploit cross-industry knowledge,” the paper’s authors noted. “Firms must establish certain structures and processes to facilitate and operationalize organizational learning across industry boundaries.”

        “WE’VE SEEN AN INCREASE IN THE NUMBER OF CUSTOMERS WHO WANT TO COLLABORATE ON DATA FROM OTHER INDUSTRIES TO SPUR NEW IDEAS.”

        Examples of cross-industry data collaboration

        There is a multitude of examples of how organizations across industries have spurred innovation through collaboration.

        • In financial services, institutions that must prevent and detect fraud use cross-industry data sharing to better understand the profile of fraudsters and fraudulent transaction patterns.
        • In manufacturing, companies are using AI to manage supply-chain disruptions. Using data from external sources on weather, strikes, civil unrest, and other factors, they can acquire a full view of supply-chain issues to mitigate risks early.
        • In energy, smart meters in individual homes open new doors for data collaboration, transmitting information about energy consumption.
        • In education, school systems, local governments, businesses, and community organizations work together to improve educational outcomes for students.
        • In healthcare, during the COVID-19 pandemic, hospitals relied on information from health agencies and drug companies regarding the progression and transmission behavior of diseases. Governments followed data from scientists and healthcare professionals to create guidance for the public. Retailers heeded guidance from the public and healthcare sectors to create new in-store policies and shift much of their business online.

        The role of cross-industry data collaboration in innovation during the pandemic is perhaps nowhere better exemplified than in the COVID-19 Research Database, involving a cross-industry consortium of organizations. The database, which can be accessed by academic, scientific, and medical researchers, holds billions of de-identified records including unique patient claims data, electronic health records, and mortality data. This has enabled academic researchers in medical and scientific fields as well as public health and policy researchers to use real-world data to combat the COVID-19 pandemic in novel ways.
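        The de-identification described above can be sketched as salted pseudonymization: direct identifiers are replaced by one-way hashes before records are shared. The field names, salt, and truncation length below are invented for illustration, and a real pipeline would follow formal de-identification guidance (e.g., HIPAA) rather than this minimal scheme:

```python
import hashlib

# Illustrative consortium-wide salt; a real system would manage this secret
# carefully, since it determines whether pseudonyms can be linked across sources.
SALT = b"consortium-wide-secret-salt"

def pseudonymize(record: dict, identifier_fields=("patient_id", "name")) -> dict:
    """Replace direct identifiers with stable, non-reversible pseudonyms."""
    out = {}
    for key, value in record.items():
        if key in identifier_fields:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()[:16]
            out[key] = digest              # same input -> same pseudonym
        else:
            out[key] = value               # clinical fields pass through
    return out

shared = pseudonymize({"patient_id": "MRN-884", "name": "J. Doe", "diagnosis": "U07.1"})
```

        Because the pseudonyms are deterministic, records for the same patient can still be joined across contributing organizations without exposing who the patient is.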

        Best practices for cross-industry collaboration

        As the examples above show, organizations that have developed cross-industry data collaboration capabilities can more easily foster innovation, leading to a competitive advantage. Here are some of the considerations and best practices that enable sharing and collaborating on knowledge across industries.

        • A single, governed source for all data:
          Each industry – and indeed, each company – stores and formats its data in different ways and places. Housing data in one governed location makes it easier to gather, organize, and share structured and semi-structured data securely.
        • Simplified data sharing:
          The relevant data must be easily accessible and shareable by all partners. Data is stored in different formats and types, and it can be structured, semi-structured, or unstructured. It can be siloed in specific departments and difficult or slow to move, or inaccessible to the outside world. What processes and tools are in place to transform cross-industry knowledge into a shareable, usable format?
        • Secure data sharing:
          Data privacy is of the utmost importance in today’s society. Data must be shareable securely and in compliance with privacy regulations. Cross-industry data sharing often involves copying and moving data, which immediately opens up security risks. There may also be different data protection and privacy regulations in different industries.
        • Inexpensive data management:
          Data must be shareable, and budgets kept in mind. Centralizing, organizing, securing, and sharing data is often resource-intensive, so organizations need to find ways to manage and share their data more efficiently.
        • Democratized data:
          While data security and privacy are paramount, companies must “democratize” data so that it is accessible and shareable in a way that allows non-technical users in both internal and external parties to use it easily.
        • Advanced analytics:
          Technologies such as AI and machine learning can help companies glean deeper insights from data. This requires a data foundation and tools that can analyze all types of data. Technological tools are making it easier for organizations to follow and gain ROI from these best practices.

        For example, Snowflake’s Data Cloud enables the seamless mobilization of data across public clouds and regions, empowering organizations to share live, governed, structured, semi-structured, and unstructured data (in public preview) externally without the need for copying or moving. Snowflake enables compliance with government and industry regulations, and organizations can store near-unlimited amounts of data and process it with exceptional performance using a “pay only for what you use” model. They can also use Snowflake’s robust partner ecosystem to analyze the data for deeper insights and augment their analysis with external data sets.

        “We’ve seen an increase in the number of customers who want to collaborate on data from other industries to spur new ideas,” Snowflake’s Co-Founder and President of Products Benoit Dageville said, “to foster innovation, to be able to freely collaborate within and outside of their organization, without added complexity or cost.”

        The future of mass collaboration

        In the future, cross-sector data collaboration will only play a larger role in innovation as technology becomes more ubiquitous and the public grows more comfortable with sharing data. We could see worldwide consortiums that collaborate on data to solve some of humanity’s biggest problems: utilizing medical and scientific information to tackle global health crises, enabling more-efficient use of resources to fight poverty and climate change, and combating misinformation.

        Organizations such as the World Bank are already working on such initiatives. Its Data Innovation Fund is working to help countries benefit from new tools and approaches to produce, manage, and use data. According to a recent World Bank blog post, “Collaboration between private organizations and government entities is both possible and critical for data innovation. National and international organizations must adopt innovative technologies in their statistical processes to stay current and meet the challenges ahead.”

        To unlock the potential of innovation through data collaboration, organizations must make sure their data management and sharing capabilities are up to date. A robust, modern data platform can go a long way. But what’s also needed is an audit of internal processes and tools to ensure that barriers to data sharing and analysis are not impeding innovation and growth.

        INNOVATION TAKEAWAYS

        COLLABORATION NEEDS BEST PRACTICES

        Organizations that implement best practices in cross-industry data collaboration can foster innovation, leading to a competitive advantage.

        DATA CAPABILITIES MUST BE UP TO DATE

        Organizations must make sure their data management and sharing capabilities are current, to unlock the potential of innovation through data collaboration.

        TECHNOLOGY AND PLATFORMS TO THE RESCUE

        Dedicated tools and data platforms make it easier for organizations to gain cross-sector data-collaboration capabilities much quicker.

        Interesting read?

        Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

        2022 Key trends in Tax

        Simon Pearson - VP, Global Tax and Trade
        Simon Pearson
        2022-02-18

        In many ways tax authorities must become disrupters and innovators to keep pace with changing user expectations and the opportunities enabled by adjacent industries such as retail banking, fintech, payments and connected supply chains. Using advances in intelligent industry, digital, data and cloud will make tax much easier to administer for businesses, citizens and tax authorities alike.

        As governments scramble to respond to the enormous challenges facing societies, economies and our planet, speed and agility are now essential attributes for public authorities. During the pandemic, national treasuries often had to set aside traditional structures and processes in order to release the huge sums of money so urgently needed to maintain social cohesion.

In turn, many tax and customs authorities are transforming too, embracing new and innovative ways to keep essential tax revenues flowing – ways that respond to changes in society and to the financial imperatives of the health and climate emergencies, while maintaining security and compliance.

        Digital technologies, data and cloud are providing the transformational tools required. Automation and AI are replacing manual processes, producing more agile, service-driven organizations, able to meet customer demands for convenience, speed, and ease of use. Data and analytics are informing decision making and financial planning, as well as nudging citizens towards the right behaviors, while helping with their entitlements and obligations. Skilled tax professionals are becoming active change agents, creating more flexible and technology-enabled tax regimes that help drive key social, economic, and environmental policies.

        1. Building trust and security will help transform tax authorities’ place in the economy and society

        As tax authorities continue their fightback against cyber criminals by bolstering their defenses with increasingly robust and sophisticated cybersecurity measures, not only are they protecting vital national resources and infrastructure, but they are also building that priceless commodity – trust.

        Trust is a critical component in the ongoing evolution of tax authorities, from enforcers to business enablers and active participants for good, providing the resources that deliver governments’ key social, economic, and environmental policies.

        Trust can be truly transformational in the tax world. When people trust their tax authority, they are more likely to pay their taxes in full and on time. When citizens feel that their tax system is fair, secure, transparent, and operating in the best interests of society, they are more likely to share their data; more likely to adopt digital processes and modern payment mechanisms; and more likely to use technologies such as cognitive care when they need support.

In these circumstances businesses are more likely to see tax authorities as potential partners and participants in rich data ecosystems, collaborating and sharing information on their tax affairs while bringing benefits to society by tracking ethical practices such as the living wage or adherence to modern slavery legislation. This extends the tax authority’s reach far beyond its traditional role.

        As these new relationships – and the trust that sits at their core – become established and grow, so that spirit of collaboration can extend throughout economies and societies, driving sustainable economic growth, supporting businesses, and achieving social responsibility goals.

        Enhanced cybersecurity has also enabled tax authorities to successfully embrace hybrid working during COVID-19, at a time of unique risk and vulnerability, with criminals eager to exploit any loopholes as public sector organizations scrambled to formulate their responses to the pandemic. This must remain a focus as criminals become more inventive in their exploitation of weaknesses, with great emphasis on supporting and protecting users in their critical tasks through education, new processes and technology enablers.

        2. User-centric products and services, combined with technology, will drive participation

        Today’s customers, whether consuming services from their mobile phone company, clothing retailer or tax authority, expect a fast, frictionless, and personalized multi-modal digital experience, informed by an understanding of life events and, in the case of their own tax situation, precise information about tax obligations and entitlements.

        In 2022, the drive for hyper-personalization will accelerate, with tax authorities adopting best practices from across the economy to apply user-centricity to all stages of the customer journey, to increase trust, confidence and compliance with tax laws and obligations, while also reducing the need for costly agents and accountants.

        Digitally native customers will adopt self-sovereign data practices, ensuring that the data that tax authorities hold on them and their businesses is accurate, while also deciding who else they wish to share it with. This will give rise to new forms of data sharing and consent across geographical boundaries, facilitating ease of movement, and also improving overall tax compliance by making pre-populated tax returns and payments an easy process. In 2021 the UK’s HMRC launched the world’s first public sector Open Banking payment initiation system, enabling payments to be made directly from bank payment accounts to payee bank accounts, without the use of cards.

        Meanwhile, advances in mobile technology, 5G and edge computing will enable more media and AI-enabled applications for tax administration to become available, serving the needs of all taxpayers, but in particular younger people for whom smart devices are instinctive and the default choice. By providing a feature-rich user experience, new taxpayers can be better informed about the role of tax in society and be confident to manage their tax affairs and share their data from the palm of their hand.

        3. Data sharing and data sovereignty will deliver choice and control

Real-time data will determine tax obligations and welfare entitlements at the point of transaction, driven by closer integration with customers’ third-party applications and by voluntary compliance through integration with their banking and platform lives.

        The importance of real-time data will be amplified across Europe as e-invoicing and VAT standards are mandated, enabling both more accurate data capture and AI-driven repayments, based upon risk and provenance. This will promote stronger economic activity with greatly reduced friction.

        Combining rich data from Open Banking, payment and other third-party data will allow AI and pattern recognition to enable early identification of business vulnerabilities, allowing customers to declare their risks, seek support and prevent unrecoverable business debt and individual hardship. Early warnings will enable tax authorities to make better decisions about compliance and debt management interventions as early as possible.
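The early-warning idea described above can be illustrated with a deliberately simple sketch. The field names and thresholds below are made up for illustration – real systems would combine far richer Open Banking and payment signals – but the principle is the same: declining receipts and lengthening supplier-payment delays both push a business’s risk score up, prompting earlier intervention.

```python
from statistics import mean, stdev

def risk_score(monthly_receipts, days_to_pay_invoices):
    """Toy early-warning score: falling receipts and lengthening
    payment delays both push the score up (0.0 = no warning signs)."""
    score = 0.0
    # Revenue trend: compare the latest three months to the prior baseline.
    baseline = mean(monthly_receipts[:-3])
    recent = mean(monthly_receipts[-3:])
    if recent < baseline:
        score += (baseline - recent) / baseline  # relative revenue decline
    # Payment behaviour: flag a delay more than one standard deviation
    # above the business's own historical average.
    if days_to_pay_invoices[-1] > mean(days_to_pay_invoices) + stdev(days_to_pay_invoices):
        score += 0.5
    return round(score, 2)

# Steady receipts and stable payment terms produce no warning...
healthy = risk_score([100, 102, 98, 101, 99, 100], [30, 31, 29, 30, 32, 30])
# ...while shrinking receipts and slipping payments raise a flag.
at_risk = risk_score([100, 95, 90, 70, 65, 60], [30, 32, 35, 45, 55, 70])
```

In practice such scores would feed a case-management workflow, so that tax authorities can offer support or adjust debt-management interventions before hardship becomes unrecoverable.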

        Meanwhile the use of common data spaces and ecosystems, driven by standards in Open Finance, will allow tax-related data to be shared, with consent, to recover tax in a more transparent and frictionless manner. There will also be an ongoing focus on closed ecosystems sharing critical, cross-border financial information to close gaps in financial crime and tax evasion.

        4. Demographic shifts will produce growth in indirect taxes – and automation and AI

Demographic studies reveal growing social and economic challenges facing industrialized nations, caused by rapidly aging populations. The UN predicts that the number of people over 65 years of age will more than double, from 727 million in 2020 to over 1.5 billion by 2050.
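As a quick back-of-envelope check on the scale of that UN projection, the implied compound annual growth rate of the 65-and-over population works out to roughly 2.4 percent, sustained for three decades:

```python
# Implied compound annual growth rate (CAGR) of the 65+ population,
# from the UN figures cited above: 727 million (2020) to 1.5 billion (2050).
start, end, years = 727e6, 1.5e9, 30
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 2.4% per year
```

For comparison, overall world population growth over the same period is projected to be well under 1 percent per year, which is why the fiscal pressure falls so unevenly on the working-age population.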

        Among the many consequences of this trend are a reduction in the working-age population, rising healthcare and pension costs, and increased demand in the economy for products and services for older citizens. As a result, 2022 will see governments, through their tax authorities, continuing the trend towards more indirect tax regimes, where citizens will pay for the things they use and the assets they own, rather than contributing to national budgets through income or business taxes.

        At the same time, similar effects are being experienced by tax authorities themselves as older, skilled and experienced tax professionals retire, with lower numbers of experts available to replace them.

        Here, 2022 will see further extensions in hybrid, more sustainable working models and intelligent industry techniques, using data to allocate tasks to the most appropriate resources, deploying automation, AI and collaboration tools to enhance productivity, reduce errors and enable smaller teams to work on higher value tasks.

        5. Tax will be increasingly used to drive consumer behavior

        Humanity must achieve the most fundamental change in its behavior, in the shortest period of time in its history, if Net Zero 2050 is to become a reality.  Although by common consent we’re starting to fall behind in the race to Net Zero, even at this late stage, all is not lost.

        By redoubling our efforts and taking fast, effective and coordinated action, the line on the graph can still be reset to the required trajectory, towards the 2030 targets that we must hit to achieve 2050.

        To achieve the mass consumer participation that brings Net Zero into range, more and more products and services produced by sustainable means must be affordable, easy to access and simple to use, for the overwhelming majority of consumers.  Currently, uncompetitive prices, lack of availability and perceived complexity are still pushing too many consumers in the direction of high-carbon, unsustainable solutions.

        In 2022, tax authorities will have an increasingly important role to play in enabling more and more consumers to contribute to the global effort, deploying a variety of tax policies to encourage citizens to make the vital personal changes in lifestyle and purchasing decisions – electric cars versus petrol or diesel power for example – that are essential if we are to deliver a brighter future for all.

        Further reading

        For information about Capgemini’s tax and customs services, visit here.

        Our look at 2022 trends in tax and customs was compiled in conversation with:

Simon Pearson

        VP, Capgemini, Global Tax and Trade Cluster Leader
        “While tax brings essential funds to economies, compliance depends on the perception of tax justice. Authorities must ensure fairness by closing the tax gap and bearing down on the non-compliant. This is both a national and cross-border issue and tax authorities are recognizing the value of data sharing and tackling new forms of evasion with innovative detection capabilities.”