
Is talent management dead? Long live employee experience!

Capgemini
22 Mar 2021

Talent management is evolving into employee experience as organizations recognize that prioritizing the workforce is key to achieving better business outcomes. Driving this change is now a strategic imperative embracing not only HR’s role in enabling workforce satisfaction but also the CMO’s skills in creating a brand and purpose that employees value and want to be part of.

Talent management is dead! Don’t get me wrong: people still need to be managed but changing employee expectations demand a new approach to engaging, nurturing, and valuing your workforce. A shortage of talent with the right skills also means your future employees can afford to be choosy. If you don’t match what candidates are looking for, plenty of other employers will. Further, a Capgemini study found that more than 65% of executives agreed the gap between the skills their organizations needed and the ones that people possessed was widening.

So, what does today’s talent expect? The prevailing view is that employees are looking for more than just ‘a job’. They want an experience, not management; a purpose, not just profit. This quest begins long before they join a company. Potential new hires will check out social media and read reviews from current and former employees to get a feel for a company. They’ll also explore the brand’s publicly stated commitments to the environment, social impact, diversity, and more.

In the end, they want to work somewhere they can grow and learn; where they feel part of a broader purpose. And where they have freedom to shape their own way of working. This is employee experience (EX) and now, more than ever, it has become an important differentiator in the talent market.

Better business outcomes

Shifting from talent management to employee experience has wider ramifications for the business beyond just recruitment and retention. We believe that to win more customers in the market, employers must first win their internal customers — their employees. Indeed, research suggests that companies with highly engaged employees outperform their competitors by 147 percent. The following quote from Sir Richard Branson, entrepreneur and founder of the Virgin Group, reinforces that belief:

“Clients do not come first. Employees come first. If you take care of your employees, they will take care of the clients.”

Thus, the successful companies of tomorrow will be those that prioritize their workforce as a crucial component of growth. Analyst firm Forrester supports this in its report ‘It’s Time for CMOs to Pay More Attention to Employees’, saying:

“Our latest data clearly demonstrates that there is a correlation between EX, CX (customer experience), and business performance: Engaged employees are more likely to become brand ambassadors.”

To this end, talent experience needs to mirror the customer experience — on demand, any time, anywhere. Organizations that invest in EX successfully attract, retain, and empower employees to do their best work, achieving 50 percent higher productivity and being ranked as an in-demand employer six times more frequently.

A formula for success

Employee experience can negatively influence operational performance and customer experience when employees are not personally invested in their jobs and organization. To capture this relationship, we’ve given it a formula of its own: CX = Strategy^EX.

So, how should organizations make the shift to an employee experience strategy, as opposed to more traditional talent management approaches? At Capgemini Invent, we advocate broadening the responsibility for talent beyond that of HR to embrace the Marketing function in an alliance of the Chief Marketing Officer and Chief People Officer. This strategic and operational alignment will increase the success of both. How? Because EX will accelerate the performance of CX — our CX = Strategy^EX formula.

Forrester too expounds on this in its report, saying:

“EX leaders should leverage the talents of the CMO to amplify their efforts to activate company values and develop employees’ relationship with the brand for which they work.”

We believe that the CMO and CPO can benefit from each other’s skills in this respect. Interestingly, their skills and outcomes are similar. For example, while the CMO is the creator of data-driven insights, the CPO is the planner of a data-driven workforce; the CMO is a brand builder and storyteller, while the CPO shapes culture and connects the workforce. Finally, the CMO is responsible for customer satisfaction and engagement, while the CPO has the same responsibility for employees.

Tying this alignment back to business outcomes, the Forrester report continues:

“EX has become a critical component of business success, so a coordinated effort between marketing and employee functions is necessary to sustain this success long term. Employee engagement is entering a new era, sitting at the crossroads of HR, IT, and Marketing, which means CMOs need to foster these relationships.”

New structures and a new cultural mindset

In some organizations we’re already seeing a merging of EX and CX responsibilities, or of People and Marketing functions coming together under a single business leader. This enables a more strategic focus on employee experience as it aligns with customer experience. Of course, this also suggests a need for some restructuring and, at Capgemini Invent, we help our clients redesign their organizations — see ‘A new employee experience model’, below.

A new cultural mindset is also part of the shift to employee experience. Today’s workforce is looking for a more personalized experience at work. For example, there is an expectation for self-led, continuous, and personalized development. Getting this right has an impact on reducing talent churn. Some 94 percent of employees would stay longer if the company invested in their learning and development (L&D). Managers should be provided with learning that equips them with EX behaviors and practices so that they can better understand their teams and tailor L&D around individual needs.

The role of leaders in shaping employee experience is also implied in our report ‘The Future of Work’, which assesses new hybrid working models enabled by digital and data. It asserts that in defining ‘authentic leadership’, businesses should:

“Encourage autonomy, empathy, and transparency. Redefine the role of leaders to empower employees to make data-driven decisions, use data to manage the remote workforce, and enhance remote leadership skills, such as empathy, active listening, and adaptability, etc.”

At Capgemini Invent, we put this thinking into practice for German railway company Deutsche Bahn when we helped to reshape the job profiles of its leaders. Managers had been getting lost in administrative tasks, leaving little time for good leadership. We worked with the company’s leadership team to prototype future leadership roles, enabling managers to reduce administrative and technical activities and invest up to 14 hours per week in focused management tasks.

A new employee experience model

We have identified the following components of the new employee experience model and work with our clients to bring them to life:

  • Organizational Design: modifying governance and process to support EX initiatives and routinely measure EX throughout the employee lifecycle.
  • Workspace: transforming the workspace environment to empower employees, drive innovation, and optimize employee experience.
  • Culture & Purpose: adapting to changing workplace expectations and embedding EX as a core value in the organization.
  • Leadership: establishing an EX-driven leadership mindset that creates and maintains a strong employee-first approach promoting employee well-being.
  • Technology & Tools: providing innovative, intuitive and fit-for-purpose digital tools that empower employees to do their best work within a ‘frictionless enterprise’.
  • Career & Growth: establishing mechanisms for professional growth that ensure employee satisfaction, motivation, and retention.

Several of the above components came together in a project that we delivered at Siemens. Its Supply Chain Management organization recognized that it needed to look closely at how digital transformation was changing its leadership. We helped to develop a Learning Journey for Digital Leaders, and a #Digital Leaders training product is now being implemented throughout the Siemens Group.

When enterprise computing company Cisco sought to shake up internal silos and systems in order to better foster a design-led thinking mindset, it turned to our creative agency Idean. Together, Cisco and Idean developed and launched the Cisco Design Thinking program giving employees, partners, and clients a new set of tools and methods to work better together. Design thinking has since become a practice shared by thousands of Cisco employees across the organization, including product, sales, and services teams, with both team performance and deal size increasing as a result.

Creating competitive advantage

Of course, we know that adopting an employee experience model isn’t going to happen overnight. Nonetheless, it is a necessary transformation. To quote author and futurist Jacob Morgan:

“In a world where money is no longer the primary motivating factor for employees, focusing on the employee experience is the most promising competitive advantage that organizations can create.”

So, while it is tempting to focus on financial metrics to help meet ambitious goals, you first need employees who are excited and care about their work. Improving your team’s performance (and its emotional well-being) begins with ensuring what you say, how you say it, and your metrics communicate one simple message: Your work matters.

I recommend that every leader shift the communication focus from internal metrics to customer outcomes. The result? A next level of customer empathy and value, yielding higher employee satisfaction and performance.

Why blockchain is here to stay on the roadmap of digital healthcare

Shyamsree Nandi
March 22, 2021

A decade ago, we could never have contemplated clinical and medical health records leaving the network of payer and provider infrastructures. Cloud was a plausible future but there were several concerns around safety, rightful use of patient data, and health data security in general. Those concerns are still real but mitigated on many fronts by using the right technology. In 2012, when Estonia became the torchbearer in eHealth and launched Blockchain for Healthcare, much of the perception was that the initiative was too far ahead of its time. Today, 99% of health data is digitized and 100% of billing is done electronically in a country with a population of 1.32 million (2019 data).

Most payers have a strategy in place to become all-digital health plans within the next three to four years. While efforts are well into mature stages of enabling the best possible member experience, there exist some significant deterrents. The most daunting challenge is enabling seamless care coordination, given that healthcare is managed by so many different stakeholders across the value cycle, with care transition steps along the way. Improving trust levels between healthcare entities and scalable, distributed data exchange driven by interoperability are vital to successful digital health programs.

Blockchain is inherently wired with three central schemes – zero-knowledge proof (ZKP), distributed digital ledgers, and immutability of records – all of which are amenable to the building blocks of an efficient care coordination service mosaic. Consider the fundamental concern about PHI security and privacy related to medical records. Leveraging ZKP (built on mathematical constructs such as quadratic equations), a verifier can confirm specific elements without the whole transaction being revealed, making it impossible for any actor in the healthcare value chain to fully reconstruct the clinical history of a member. This is a big leap that the healthcare industry needs today to drive the ethical use of members’ most private health data. Likewise, distributed ledgers can give ownership of isolated segments of a member’s health journey without requiring the physical record to move anywhere beyond its current location, thus enabling transparency while simultaneously preserving the dynamism of the health journey.

The immutability aspect guarantees that the record can at no point in time be altered without being transparently visible to all other stakeholders in the chain. That coupled with audit and traceability of every action would make healthcare fraud or data breaches extremely difficult.
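The immutability guarantee comes from cryptographic hash chaining: each record embeds the hash of the record before it, so altering any historical entry invalidates every hash that follows. A minimal Python sketch of the idea (illustrative only, not a production ledger; the record fields are invented for the example):

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 hash of a record's canonical JSON form."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, payload: dict) -> None:
    """Append an entry that embeds the hash of the previous entry."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"payload": payload, "prev_hash": prev}
    entry["hash"] = record_hash({"payload": payload, "prev_hash": prev})
    chain.append(entry)

def verify_chain(chain: list) -> bool:
    """Recompute every hash; a tampered entry breaks all links after it."""
    prev = "0" * 64
    for entry in chain:
        expected = record_hash({"payload": entry["payload"], "prev_hash": prev})
        if entry["prev_hash"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

chain = []
append_record(chain, {"member": "M-001", "event": "lab_result_added"})
append_record(chain, {"member": "M-001", "event": "claim_submitted"})
assert verify_chain(chain)

# Tampering with an earlier record is immediately detectable
chain[0]["payload"]["event"] = "claim_deleted"
assert not verify_chain(chain)
```

In a real distributed ledger, every stakeholder holds a copy of the chain, so the altered entry fails verification on every node, which is what makes fraud and silent data changes so difficult.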

Another interesting dynamic of blockchain is that it is built for interoperability and has the capability of unchaining clinical systems that reside in their own silos. Top that off with the ability to use smart contracts, which could define immutable transaction logs of all known exchanges of a patient’s health record. These could be further augmented through FHIR consents, which could be represented in smart contracts too, offering the member full flexibility to modify the consent levels as needed.

Some of the longstanding problems in the industry, such as Medicaid churn or coverage continuity as beneficiaries constantly exit and re-enter due to eligibility changes, can be addressed to a large extent by defining smart health profiles. In the longer run, artificial intelligence could influence the inception of completely new healthcare financing models on the back of medical blockchain. Although still nascent, blockchain has increasing claim to fame, given the focus on digital health and more federated healthcare services with an openness to embrace newer care delivery and coordination models. Blockchain is here to stay!

For more details, contact healthcare@capgemini.com

No one likes waiting. With Continuous Delivery, now you don’t have to!

Venky Chennapragada
March 22, 2021

There’s a long – almost continuous – list of reasons why organizations should move to continuous delivery (CD). In this blog, I’ll discuss the benefits of doing so – for both developers and users. I’ll also look at the best way to get started with CD.

These days, businesses expect IT to deliver new features or fix defects faster and more consistently using Agile and DevSecOps methods. In order to accomplish this in two-to-three-week sprints, it’s crucial to automate CD tasks. In DevSecOps workflows, end-to-end functional testing and user acceptance testing in QA to pre-prod environments are conducted in the CD phase. These types of testing take time, so focusing on the CD phase and automating CD tasks is critical for successful DevSecOps. Moreover, unit testing is done in the development environment by a developer during continuous integration (CI), and after the code merges to the main branch, independent validation is done in the CD phase. Therefore, implementing CD is critical for the quality of deliverables.

Out of patience and on to Continuous Delivery

Continuous delivery offers developers three main benefits. First of all, there is “no waiting for environments.” As part of a CD environment, provisioning is automated, and environments are created on demand using Infrastructure as Code (IaC). Developers don’t have to wait for the creation of new environments or rework due to manual provisioning. The second benefit is that there is “no waiting for testing feedback.” During the CD phase, testing teams write automated test cases for functional testing through to UAT. Developers can fail early and make the necessary fixes, as they receive faster feedback from testers. Finally, the third benefit is “code quality and security scans.” These are integrated into CD workflows – and any critical or major vulnerabilities can be detected and fixed early.

No waiting, no defects, no problem

CD offers even more benefits to users. Since it recommends testing in lower environments and adopting a culture of failing fast, code is promoted with fewer (or no) defects to higher environments, resulting in better quality of deliverables. CD also leads to reduced technical debt, since it recommends a test coverage of over 95%, meaning that no new debt is added. As part of sprint planning and automation, teams gain the bandwidth to fix technical debt. With CD, development, testing, release, and environment provisioning are all automated. This reduces process and wait times for resources and feedback – and in turn – lowers development and testing costs.

CD requires designing, implementing, and enforcing stringent quality gates for code promotion through to production. This improves the stability of production environments and reduces tickets and incidents, while lowering operational costs. Finally, with improved deliverable quality, production environments are stable, highly available, and newer features are released more frequently. This leads to substantially improved customer satisfaction and loyalty.

There are many reasons to adopt CD – both for users and developers – and the best way to get started is to choose a CD orchestration tool for building the workflow. Next, it is important to create production-like environments and implement continuous testing. Then, you have to design and implement promotion quality gates from quality assurance through to production environments. Finally, it’s time to automate workflows. Throughout the process, it’s important to avoid certain common mistakes. The first is not having the required environments or production-like environments. Many newcomers also do not conduct enough automated testing. We’ve also seen a lack of CD skills, such as release automation, setting up code quality gates, and security testing (static, dynamic, etc.), all of which can decrease the chances of a successful transformation.
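A promotion quality gate ultimately reduces to a set of automated pass/fail checks that run before a build moves to the next environment. The sketch below shows the idea in Python; the metric names and thresholds are illustrative assumptions, not the API of any particular CD tool:

```python
from dataclasses import dataclass

@dataclass
class BuildMetrics:
    test_coverage: float           # fraction of code covered by automated tests
    critical_vulnerabilities: int  # open findings from security scans
    failed_functional_tests: int   # failures from the functional/UAT suites

def passes_promotion_gate(m: BuildMetrics) -> tuple:
    """Return (promote, reasons): whether to promote the build, and why not."""
    reasons = []
    if m.test_coverage < 0.95:          # CD recommends test coverage over 95%
        reasons.append(f"coverage {m.test_coverage:.0%} is below the 95% gate")
    if m.critical_vulnerabilities > 0:  # security scans must be clean
        reasons.append(f"{m.critical_vulnerabilities} critical vulnerabilities open")
    if m.failed_functional_tests > 0:   # functional tests must all pass
        reasons.append(f"{m.failed_functional_tests} functional tests failing")
    return (not reasons, reasons)

ok, why = passes_promotion_gate(BuildMetrics(0.97, 0, 0))
# ok is True: the build may be promoted toward production
ok, why = passes_promotion_gate(BuildMetrics(0.91, 2, 0))
# ok is False: both the coverage and security gates block promotion
```

In practice the same checks would be encoded in the orchestration tool’s pipeline definition, but the logic is exactly this: measurable thresholds, enforced automatically, with the failure reasons fed back to developers.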

To maximize your success with Continuous Delivery, don’t wait – check out ADMnext and get in touch with me here.

Digital twin within the supply chain – the benefits

Capgemini
March 18, 2021

In the first blog in this series, we looked at the concept of leveraging a digital twin approach to optimize your supply chain. In this blog, we look at how Capgemini has helped a client to efficiently minimize the friction in the order processing area, and delve into the benefits that a digital twin can bring to your supply chain organization.

15–20% FTE reduction with minimal change

One of our global clients was concerned about the low scores it received in a customer satisfaction survey, which was caused by inconsistencies and errors in its order validations. On top of this, the client’s huge, unmapped data lake wasn’t helping its increasing need for end-to-end supply chain visibility – especially due to these order-related concerns.

To address this, Capgemini took an industrialized approach to quickly identify the cause of the problem, leveraging our Digital Global Enterprise (D-GEM) transformation platform to create a digital twin of the client’s order management operations. The steps we took are outlined below:

  • Business mining – the client’s data was fed into a process mining application to understand what was actually happening in its order management operations. This revealed that the client’s order managers were confirming orders without the proper downstream validation, which was the main cause of the low customer satisfaction scores. It also revealed that, despite a lack of order validation, customers experienced issues in only 40% of cases
  • Modeling – the process mining outcomes were translated into a virtual model of the “as-is” operations and an optimized scenario leveraging solutions that would accelerate order processing operations
  • Simulating – digitally-isolated, offline simulations revealed that the model outcomes had high queues in a number of downstream tasks – no matter what the order confirmation step sequence was. At this stage, we also verified the transformation by inputting the relevant values into our offline model
  • Improving – process re-engineering analysis revealed that a specific downstream order validation process could be robotized due to its low complexity and high repeatability. The implementation of robotic process automation (RPA) enabled manual processes to be eliminated, leading to a 91% improvement in overall processing time.
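The simulating step above can be pictured as a small discrete-event model of the order flow: orders arrive at random intervals and queue for a validation step whose duration differs between the as-is manual process and the RPA-assisted one. A toy Python sketch (all arrival rates and task durations are invented for illustration, not the client’s actual figures):

```python
import random

def simulate_order_flow(n_orders: int, validation_minutes: float, seed: int = 42) -> float:
    """Average end-to-end minutes per order through a single validation queue."""
    random.seed(seed)
    server_free_at = 0.0
    total_time = 0.0
    arrival = 0.0
    for _ in range(n_orders):
        arrival += random.expovariate(1 / 5)   # an order arrives every ~5 minutes
        start = max(arrival, server_free_at)   # wait if the validator is busy
        service = random.expovariate(1 / validation_minutes)
        server_free_at = start + service
        total_time += server_free_at - arrival # waiting time + processing time
    return total_time / n_orders

manual = simulate_order_flow(1000, validation_minutes=4.5)  # as-is manual validation
robot = simulate_order_flow(1000, validation_minutes=0.5)   # RPA-assisted validation
print(f"manual: {manual:.1f} min/order, automated: {robot:.1f} min/order")
```

Running "what-if" scenarios like this offline, against a model calibrated with process-mining data, is what lets the digital twin predict queue build-up and processing-time improvements before anything is changed in the live operation.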

Capgemini’s D-GEM platform confirmed the digital twin model predictions, delivering a 15–20% reduction in order-to-invoice FTEs and a 94% improvement in order completion time.

What digital twins bring to the table

As the use case above shows, implementing a digital twin across your supply chain can deliver a range of tangible benefits, including:

  • Enhanced bottleneck identification – provides a continuous, end-to-end view of frictions and bottlenecks across your supply chain, which enables faster problem resolution with minimal human intervention
  • Improved process design testing – mitigates business continuity and transformation risks before they occur
  • Improved outcome prediction – enables benefits, savings, and potential ROI to be calculated before transformation occurs, maximizing your transformation positives while minimizing the negatives that often come with them
  • Enhanced risk monitoring and emergency simulation – discovers the best course of action for emergency situations through proven, digitally-isolated testing techniques that can significantly improve your organizational stability.

Tangible, real-world benefits

In short, implementing a digital twin can improve efficiency and reduce supply chain costs by simulating business process outcomes in a virtual environment, enabling your organization to make the right investments and decisions for a guaranteed return.

As digital twin technology matures and developers gain a wider appreciation for how it can be applied in the supply chain world, these tangible, real-world benefits will only increase. The only question remaining is: “Have you digitally twinned your processes yet to ensure a successful transformation?”

To learn more about how Capgemini’s Digital Supply Chain practice can help your organization leverage a digital twin to build a resilient, agile, and frictionless supply chain, contact: joerg.junghanns@capgemini.com

Jörg Junghanns leverages innovation and a strategic and service mindset to help clients transform their supply chain operations into a growth enabler.

Frictionless accounts payable – happy customer, happy employee

Capgemini
March 18, 2021

Here’s a philosophical question for you – well, sort of. Is business all about money?

Let’s take the accounts payable (AP) function. Data discrepancies or errors can be costly – but they are also bad in other ways. For instance, they can be damaging to supplier relationships, and to brand image. They can take up time, too, because those errors are going to need rectifying.

So, it’s not just about money, then. Except, well, maybe it is. Because damage to supplier relationships and to brand image can affect demand, as well as supply – and damage to either can affect sales. Which means money. And fixing those errors isn’t free, either, because as we all know, time is money, too. So, yes. At least as far as AP is concerned, maybe it really is all about money.

Sources of AP friction…

If businesses want to save that money, they’ll need to know where to look for potential problems, so they can stop them happening. Here are some possible areas of friction:

  • Supplier onboarding – manual updating of master data is a request-driven process that can involve many people, many steps, and often unsatisfactory results, leading to a high lead time – and to unhappy suppliers
  • Invoice processing – manual, paper-based invoicing systems or manual-driven exception processes can lead to late payments, errors, internal process issues, disputes, and strained relationships with suppliers, in addition to added costs from multiple areas. In a word, friction
  • Payment – erroneous payments and duplicate payments cripple the invoice-to-pay process. Inconsistencies in manually entered supplier information, invoice amounts, or coding can cause a single invoice to be paid twice. Companies may also accidentally make double payments if they use multiple financial applications instead of a single integrated system.
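Catching the double payments described above usually comes down to normalizing the supplier and invoice fields before matching, so that trivially different entries ("Acme Corp" vs "acme corp ", invoice "00123" vs "123") collide on the same key. A minimal Python illustration (the field names are hypothetical, not a specific ERP’s schema):

```python
from collections import defaultdict

def normalize(invoice: dict) -> tuple:
    """Key an invoice by cleaned supplier name, invoice number, and amount."""
    supplier = invoice["supplier"].strip().lower().replace(" ", "")
    number = invoice["invoice_no"].strip().lstrip("0").upper()
    return (supplier, number, round(float(invoice["amount"]), 2))

def find_duplicates(invoices: list) -> list:
    """Group invoices whose normalized keys collide: candidate double payments."""
    groups = defaultdict(list)
    for inv in invoices:
        groups[normalize(inv)].append(inv["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

invoices = [
    {"id": 1, "supplier": "Acme Corp", "invoice_no": "00123", "amount": "450.00"},
    {"id": 2, "supplier": "acme corp ", "invoice_no": "123",  "amount": "450.0"},  # paid twice
    {"id": 3, "supplier": "Beta Ltd",  "invoice_no": "777",   "amount": "99.95"},
]
print(find_duplicates(invoices))  # → [[1, 2]]
```

Rule-based matching like this catches the easy cases; the machine-learning pattern matching discussed later in this piece extends the same idea to fuzzier duplicates and fraud signatures that exact keys miss.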

… and the Frictionless Enterprise

In short, what’s needed in AP is an approach that we at Capgemini unsurprisingly call the Frictionless Enterprise.

The Frictionless Enterprise enables a smooth and seamless flow of information and collaboration between employees, their departments, and those with whom they work. It also encompasses their relationship with customers, with partners, and in the case of AP, obviously with suppliers, too.

Achieving the Frictionless Enterprise doesn’t mean the arbitrary application of technology, rules, or processes. It entails whole new, digital ways of thinking and working, combined with the capacity to adapt constantly to new contexts.

Frictionless AP benefits

There are several benefits when the AP function is part of a smart, seamless operation.

For example, supplier onboarding becomes a smooth process, with AI-enabled zero-touch validations. Businesses can also set up supplier portals with self-service voicebot and chatbot options, thereby removing the hassle from invoice submission, making it possible to automate the exchange of certain kinds of data, and enabling suppliers to get a sneak peek at the status of their invoices.

Invoice processing can become paperless, and frictionless, with a seamless integration of workflow from procurement through to accounts payable. Machine learning (ML) pattern matching systems can automate approvals, and process controls can be automated, too.

Fraud detection can be improved. This is another area in which ML pattern matching can help. Companies can also set up autonomous data set scans to identify patterns, and detection of discrete error types that bypass traditional controls and audits can be automated.

The payment process is also improved, with automated alerts for early or dynamic discounting; automated pay schedules that are integrated into the ERP system; automated remittance advice; and protection of working capital by automating the identification of overpayments and fraud before the pay run.

Finally, artificial intelligence (AI) can be brought to key AP processes. Service desk functions can be automated, using natural language processing (NLP), ML, and other smart technologies to resolve supplier queries automatically. Also, AI can be incorporated into an AP Control Tower that can, among other things, measure performance efficiency benchmarks, provide prescriptive analytics and strategic analysis, and enable real-time insights on payables data, including retrospective reporting, spend analytics, and data modeling.

The pursuit of happiness

So, then – to return to the question with which I started. Is business all about money?

Well, it’s true that in accounts payable, everything could indeed be interpreted that way, and it’s equally true that a Frictionless Enterprise approach to finance can help to lower costs, protect sales, and so, ultimately, maintain and even boost margins.

But in fact, and in spite of what I said at the outset, it’s not just about money. Sure, you could measure supplier and customer goodwill in purely financial terms – but this goodwill also has emotional value. It’s good for a business to know it’s doing things right, and it’s good to know that it’s treating people well. It’s good, too, to know that by streamlining processes and removing hassle, it’s also making life better for employees.

Business is about more than money. And the Frictionless Enterprise is about more than efficient processes. It’s about making, and keeping, people happy.

To learn more about how Capgemini’s Frictionless Finance can help you start your frictionless journey towards enhanced accounts payable processes and improved customer satisfaction, contact: mahalakshmi.r@capgemini.com

Mahalakshmi Ramakrishnan leads multi-national, multi-cultural teams and transformation projects across the accounts payable function.

Alternate data can streamline the underwriting evaluation process

Shane Cassidy
March 17, 2021

As a financial services consultant, I often talk with life insurers facing similar challenges. They say they want to reduce underwriting costs and shorten approval cycle times. And they want to improve accuracy without adding new processes that some policyholders consider tedious and invasive.

Alternate data changes the scene

Underwriting accuracy is directly proportional to the data underwriters use. Until recently, insurers relied almost exclusively on information they collected through customer applications and medical exams. Underwriters evaluated an applicant’s age, gender, height, weight, driving record, and family medical history to calculate a premium rate. Blood/urine tests, drug screening, and hypertension/cholesterol readings were often required.

Now, insurers can access multiple alternate sources for a range of customer information, including:

  • Data retrieved through digital channels such as biometric wearables, social media, telemetric devices, or genomics websites
  • Electronic health record (EHR) information via partnerships with ecosystem specialists, historical clinical lab test data, or data from governments or open API platforms.

These days, many life insurers leverage claim reports and AI-based systems to generate insights from EHRs and prescription databases to accelerate underwriting decisions.

The role of advanced data management technologies

Alternate data has significant advantages, although managing the high volume, velocity, and veracity of information can be daunting. However, advances in technology enable firms to get the best from alternate data. For instance, cloud infrastructure provides convenient data storage, on-demand computing power to process vast data volumes, and advanced machine learning algorithms to gain insights from unstructured data.

Quicker underwriting

As more ecosystem players collaborate, life insurers gain a supply of consistent high-volume data to speed and scale up the underwriting process, with an enhanced critical view of the risks to be insured.

Haven Life, a digital life insurance agency and subsidiary of MassMutual, uses artificial intelligence algorithms to assess applicants’ historical lab results and medical claims data to generate instant coverage quotes and dispense with in-person medical exams.

More accurate underwriting

From a risk management perspective, life insurers can leverage alternate data to better identify new risk parameters and price risk.

India-based Max Life Insurance leverages predictive analytics during the underwriting stage to spot the likelihood of an early claim or potential fraud and routes questionable applications for additional verification.

Life insurers are turning to alternate data sources for continuous underwriting. In Poland, AXA launched a life insurance policy for senior citizens. The policy comes with an intelligent medical wristband to continuously monitor policyholders’ vital health parameters.

Data-driven risk management will help insurance firms enhance their underwriting processes, reduce losses, and provide a better customer experience.

Benefits of using alternate data for underwriting

Source: Capgemini Financial Services Analysis, 2021

Create new value-added services

With alternate data, insurers can identify emerging risks and create personalized services because they thoroughly understand each customer’s risk profile.

In the United States, John Hancock developed Aspire, a wellness program, to help people with diabetes manage and improve their health. The firm uses data from wearables to provide coaching, clinical support, education, and rewards for healthy behavior.

In South Korea, Kyobo Life Insurance developed a fraud prediction system based on internal and external data sources to identify potential fraud before it happens.

When used together, alternate data and advanced data processing can help life insurers:

  • Significantly reduce application processing time
  • Free prospective policyholders from invasive medical test procedures
  • Make policy approval decisions via virtual meetings and examinations
  • Enable continuous underwriting and accurate risk assessment

Download a free copy of Capgemini’s Top Trends in Life Insurance: 2021 to learn more about how life insurers can use alternate data.

To discuss an alternate data strategy for your firm and to exchange ideas about this topic, feel free to connect with me.

CMOs are leveraging data and compliance to augment their marketing ecosystem

Capgemini
March 17, 2021

Data. It’s either a marketer’s dream or a compliance nightmare. Whichever way you look at it, data and compliance in combination are a strategically important topic for today’s CMOs.

New technology and channels, along with rising volumes of data, provide new ways of engaging with customers. But with this data come challenges relating to evolving and inconsistent regulatory and legal environments. Surprisingly, rather than fixing the issues and investing in a compliant way of handling data as a vital business asset, many CMOs are simply allocating budget to paying fines for non-compliance. They are looking in the wrong direction.

At Capgemini, we believe that getting to grips with your data compliance offers an extraordinary opportunity to build trust and make privacy both a brand differentiator and growth enabler. In this blog, we look at the data and compliance challenge from the two perspectives of the customer and the CMO.

CMOs are well placed to understand what their customers want and target engagement in a way that addresses any concerns regarding customer privacy.

Making customer data count – the CMO challenge

CMOs must keep up with new channels and focus on becoming leaders of change instead of being disrupted. It’s up to the CMO to establish the Marketing/IT shared goals and represent the voice of the customer inside the company – embedding data and compliance every step of the way.

There are several internal hurdles the CMO must clear to transform data into an asset:

  • Build a comprehensive strategy for a rapidly evolving business context, covering governance and compliance, internal collaboration, and external partners.
  • Handle scattered and siloed customer data with a robust data management system that ensures high data quality and cross-device connectivity.
  • Move beyond regulations and ethics perceived as constraints and tackle data regulation across geographies.

This list of data- and compliance-driven challenges is not insurmountable. Addressing them gives the CMO control and an overview of all data relevant to Marketing activities. In turn, this clarity enables a seamless and holistic approach to customer data privacy, building trust in the brand.

Data and compliance are Marketing gold

Consumers want to know their data privacy is being taken seriously. CMOs have an opportunity to stop paying non-compliance penalties and invest the budget in improving brand differentiation. It’s surely a win-win situation.

So, how do you make it happen? How can CMOs enable and/or establish compliant, data-driven Marketing that makes the customer happy?

CMOs need to focus on the three pillars of customer need: Transparency, Accountability, and Empowerment.

  • Transparency – create a value-adding database
  • Accountability – ensure thorough consent management
  • Empowerment – give people greater control over their data and actions
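
As a sketch of what thorough consent management can look like in practice, the data model below records consent per processing purpose and lets a later withdrawal override an earlier grant. The field names and purposes are illustrative assumptions, not any specific regulation’s schema:

```python
# Minimal purpose-based consent ledger: the latest record per purpose wins,
# so a withdrawal overrides an earlier grant and every decision is auditable.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentRecord:
    purpose: str           # e.g. "email_marketing", "profiling" (illustrative)
    granted: bool
    recorded_at: datetime  # when the choice was captured (accountability)

@dataclass
class CustomerConsent:
    customer_id: str
    records: list = field(default_factory=list)

    def allows(self, purpose: str) -> bool:
        matching = [r for r in self.records if r.purpose == purpose]
        if not matching:
            return False  # no consent on file means no processing
        # The most recent record for a purpose decides the outcome
        return max(matching, key=lambda r: r.recorded_at).granted
```

A grant followed by a later withdrawal makes `allows()` return False, while the full history stays on file for audits – supporting transparency, accountability, and customer empowerment at once.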

Data and compliance-led transformation

At Capgemini, we have helped many leading brands to transform their approaches to privacy, data, and compliance. The outcome is typically stronger customer centricity and brand trust built around a transformed marketing data landscape.

To reach this outcome, we have developed an approach that enables the compliant handling of customer data and consent. This will become an important strategic need in the coming years.

What we do

Our proven and standardized approach enables us to gain an understanding of our clients’ data organizations, assure results, and safeguard their customers’ personally identifiable information (PII).

Here’s how:

  • We analyze all the information relating to how data privacy/compliance is organized.
  • We identify potential threats to compliance from the CMO’s perspective.
  • We define safeguards/actions to tackle the identified threats.
  • We improve long-term compliant data handling and drive efficiency.

What direction will you take?

We hope this blog persuades you that data and compliance are a prerequisite for marketing success. While earmarking budget for possible fines might seem a quick fix, it won’t turn your customer data into the valuable asset it could be.

Rethinking your data, tracking it across every marketing touchpoint, safeguarding it, and using it responsibly will reap rewards. How? By ensuring your marketing content is managed compliantly, targeted relevantly, and has a defined purpose.

At Capgemini, we combine our marketing and data compliance experience to identify the threats, put in place measures enabling compliance, and ensure our clients are trusted to capture and use customer data appropriately.

To know more, read our detailed point of view paper on Connected Marketing – Data and Compliance.

This blog was authored by:

Marian Meyer-Tischler

Senior Manager | Insights Driven Enterprise

Svetlana Ollivier

Manager | Insights Driven Enterprise

Securing critical infrastructure environments, no matter their size

Capgemini
March 16, 2021

When assessing critical infrastructure environments, security teams should always ask: What is considered an acceptable risk? Is there a limit on the danger to life? Is it one life, 100 lives, 1,000 lives? According to recent news reports, a hacker gained unauthorized entry to the system controlling the water treatment plant of Oldsmar, Florida, a city of 15,000, and tried to taint the water supply by increasing the level of a caustic chemical, lye. This act exposes a danger that has grown as systems become more computerized and accessible via the internet and remote connectivity.

Was this an act of domestic terror or a prank gone terribly wrong? Is this a target worthy of a nation-state bad actor? Maybe not, but it is surely a target worth protecting from a hacker who can stumble upon an open remote session and do damage. This is a small municipal water treatment plant, one that probably didn’t consider itself a target. The plant has since disabled remote access, its management team having realized that such an attack could happen again. Over the years, many other municipal providers have assumed they are not targets of potential attacks. This case shows the risk is now very real.

What was amazing about the water hack is that a supervisor saw the mouse cursor moving on its own and stopped the attack. This is great threat detection, but with all the remote work being performed, what are the chances that the hacker could have accessed a system that wasn’t well monitored? Would the lye have made it into the water supply? How many of the 15,000 residents could have been injured? The fact that the remote session was hacked is significant and points to a basic lack of cybersecurity in some critical infrastructure networks. Taking down the power grid may be problematic, but contaminating the water supply can be deadly. The hacker reportedly briefly increased the amount of sodium hydroxide more than a hundredfold (from 100 parts per million to 11,100 parts per million); the numbers were changed, and that fact equals danger! Are there backup controls? I am sure there are, but if a hacker has access to your internal systems, there is potential for anything to be overridden.

The Oldsmar water plant also appeared in two prior data breaches dating back to 2017, so its credentials may have been exploitable for some time. Although smaller organizations do not have large security budgets, there is a need for more effective account monitoring and control, which does not necessarily require a big investment. Had the plant been more in tune with this important security control, uncontrolled access could have been eliminated back in 2017, leaving the plant more secure.

What should be done to protect municipalities? Power plants that provide 1,500 MW of power are subject to NERC CIP regulations, but what about smaller power plants or water treatment plants such as the one in Oldsmar, FL? This is not to suggest that these smaller providers should be more formally governed (regulations drive compliance, not security), but public utilities should be held responsible for the communities they serve. These smaller plants are often underfunded and understaffed, which increases risk. Ensuring security requires a commitment not just in words but also in the funding of effective cyber controls.

Aside from regular risk compliance assessments, there is a need to ensure municipalities are providing necessary controls that address cybersecurity in their operations too.

Key steps to ensure basic cybersecurity in ICS environments:

  • First and foremost, develop a cybersecurity policy that aligns with best practices (for reference, apply the CIS Controls to Industrial Control System environments), and reevaluate it regularly.
  • Enforce strong passwords with frequent changes (at least every 90 days). This might have prevented the unauthorized access in the Oldsmar case. In any case, change your passwords, folks!
  • Consult your OEM automation providers and maintain their recommended updates. Almost every OEM provides its customers with a list of approved updates. They will try to sell you automated solutions to push these updates, but consider investing in a resource to maintain them instead. Don’t shortchange basic cybersecurity best practices.
  • Use change control for everything. This is a common practice; keep it going.
  • Patch your machines. Security patches are free but do require resources. Again, use the approved patch list, and invest in the resource; this could be a contracted resource whose sole purpose is to come in once a month and manage your security updates.
  • Do not overlook your network equipment: lock down basic ports and keep your firmware updated.
  • Use secure remote access methods; this may require some research or professional consultancy. With the increasing remote workforce, remote access is a growing reality. Take the time and money to secure this important resource.
  • Use and update anti-malware on any machine that will accept it. Again, consult the OEM for a list of approved updates for your systems.
  • Back up your critical machines and test those backups regularly. These machines don’t change their configuration often, but they should be backed up and tested at least quarterly.
  • Again, do not ignore your network equipment: back it up, compare configurations with the last backup, and investigate any changes. If your network switch, router, or firewall configuration has changed in the last month, be suspicious.
  • Set basic limits on controls that would reduce the risk of hazardous outcomes; check with your OEM provider to set these limits so you don’t affect production.
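
The network-equipment check above can be partly automated. As a minimal illustration, using only the Python standard library (the device name and config lines are made up), comparing the current configuration against the last backup flags any drift for investigation:

```python
# Diff a device's current config against its last backup; any output
# means something changed since the backup and should be investigated.
import difflib

def config_drift(backup_lines, current_lines, device="switch-01"):
    return list(difflib.unified_diff(
        backup_lines, current_lines,
        fromfile=f"{device}/backup", tofile=f"{device}/current",
        lineterm=""))

backup = ["hostname switch-01", "snmp-server community public RO"]
current = ["hostname switch-01", "snmp-server community public RO",
           "ip route 0.0.0.0 0.0.0.0 203.0.113.9"]  # unexpected new route

drift = config_drift(backup, current)  # non-empty list: raise an alert
```

An empty result means the configuration matches the backup; anything else warrants a closer look before assuming the change was legitimate.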

This list can serve as a foundational guide: every plant, no matter its size, has a critical obligation to keep its community safe. Additionally, cybersecurity needs to be considered in every budgeting cycle and cannot be shortchanged. If your organization needs help addressing these issues, get professional guidance, have a risk assessment done, and follow basic cyber best practices to keep your community safe.

Follow me on LinkedIn.

To find out more about how we can help you, visit our Secure IoT/OT Services page.

Author


Larry Alls
OT Solution Architect | NA Cyber Center of Excellence
Experienced Senior Solutions Architect with a demonstrated history of working in the oil & energy industry. Skilled in Firewalls, Network Engineering, Network Security, Wireless Networking, and Cross-functional Team Leadership. Strong engineering professional with an AAS focused in Computer Science from Tampa Technical Institute.

How artificial intelligence can accelerate a brand’s eco-responsibility

Vincent-de-Montalivet
Vincent de Montalivet
March 16, 2021

A study conducted by the CSA[1] in May 2020 showed that a brand’s positive contribution to societal issues is the main criterion of brand loyalty for 57% of people in France. At a time when consumers are particularly keen for brands to prove their usefulness and commitment to social, economic, and environmental issues, this figure lines up with the fact that consumers would be indifferent if 77% of brands disappeared altogether.[2]

Although awareness is important today and has been increasing year after year (seven out of ten people in France consume organic foods at least once a month)[3], when it comes to sustainability, the market and expectations are still difficult to gauge. Organic, carbon neutral, recycling, recycled, zero-plastic, circular economy, green, vegan, natural products, etc. – the list of characteristics is often long and complicates the notion of eco-responsibility.

Today, companies can take many measures to support their eco-responsible approach, both in their operations and in the design phase.

The good news is that a brand that pursues sustainable development also improves its economic performance. Nearly 80% of brands say that doing so increases customer loyalty, and 63% report that it directly contributes to an increase in revenues.[4]

Artificial intelligence (AI) has already reduced greenhouse gases by about 13% among manufacturers and retailers and can help brands reach 45% of their carbon-reduction targets by 2030.[5] So how, concretely, can it help brands accelerate their eco-responsible initiatives?

1. Investing in the optimized use of resources: operational excellence in global logistics

AI is involved in many stages of the value chain, from supplying materials to shipping products. Once the units are produced, the right products must be sent to the right stores while avoiding error and optimizing distribution. Inventory management, logistics, and overall processes are improved through algorithms. At the warehouse level, AI solutions have helped Amazon reduce packaging requirements by 33%, saving more than 915,000 tons of packaging materials, equivalent to eliminating 1.6 billion shipping cartons.[6] On the shipping side, Capgemini has deployed a tool that generates a complete cost-benefit analysis, detailing estimated fuel consumption, fuel costs, and CO2 emissions, and identifies different scenarios to optimize delivery strategies.[7]

Manufacturing only the number of products sold is the dream of any organization. In addition to the economic gains associated with reducing the number of unsold items, it is also a great way to lower the carbon footprint (fewer raw materials used, less energy consumed, etc.). However, for several decades, sales forecasts have been based on legacy methods, such as historical projections or expected sales objectives. Today, new algorithms can achieve unprecedented accuracy that translates into strong gains in economic and environmental performance.

In early 2018, Swedish multinational clothing company H&M announced that it had a stock of unsold clothing worth more than USD4 billion,[8] highlighting the increasing complexity of sales forecasting in a context of tight-flow logistics. Ongoing investment in AI makes it possible to better predict in-store demand based not only on past sales but also on external data such as weather forecasts or scheduled sports and cultural events, enabling much more accurate predictions at the individual item and store level.

Carrefour, with the help of Capgemini, is using AI to optimize inventory management and reduce waste by integrating the SAS solution into its supply chain. By collecting and processing data from stores, warehouses, and e-commerce sites, Carrefour can better anticipate demand and refine orders from suppliers. Through smarter management of its supply chain, Carrefour has reduced both stockouts and overstock in stores and warehouses.[9]

The stakes for the agri-food sector are immense; reducing food waste through AI could generate nearly USD127 billion.[10] Specific applications include using image recognition to determine when fruit is ripe, more effectively matching food supply and demand, and improving the value of food by-products.

Retailers and manufacturers now have the tools and data to optimize their operations for environmental as well as economic benefit. By implementing these eco-responsible measures, brands can improve their carbon footprint: less transport saves fuel, optimized stock requires lower heating costs, optimized packaging decreases the carbon weight of products, and the reduction of unsold items eliminates unnecessary production and lowers the manufacturing impact (lower consumption of scarce resources, water, energy, etc.).

2. Creating new eco-responsible products for a circular economy

The development of sustainable products is a challenge for companies that pursue traditional development focused solely on cost optimization. The product design phase is a major driver in enabling cycles of reuse, repair, refurbishment, and recycling of materials. Because the tools to assess the environmental impact of eco-responsible products are complex and not very user friendly, designers and marketers often have to outsource this assessment.

Data mining[11] makes it possible to automatically recover key lifecycle data in the decision-making process. With AI, it is possible to predict not only the cost of a product but also the carbon cost that must be incorporated into the design phase in order to develop optimized scenarios – for example, sourcing local products to reduce the carbon footprint associated with transport or substituting products during the manufacturing phase.[12]

AI can improve and accelerate the development of new products, components, and materials[13] that are suitable for a circular economy through machine-learning-assisted iterative design processes that enable rapid prototyping and continuous testing.[14] It can also facilitate the implementation of new economic models based on the circular economy. For example, AI makes it possible to simplify the resale process in the second-hand market. Capgemini and its partners have developed a “circular” customer journey:[15] customers bring in used clothing and photograph the items with a store camera; the solution scans and analyzes the clothing and automatically generates a product page, listing defects, holes, stains, or scratches; it then estimates the value of the item based on brand, authenticity, description, size, condition, and materials.

Decision makers are increasingly relying on the circular economy and sustainable development. Recently, Ikea announced that it was creating a store 100% dedicated to second-hand sales.[16] In northern Europe, a shopping mall was built to focus solely on second-hand resale. In France, La Redoute launched its second-hand online store, “La Reboucle,”[17] and major retailers are now putting second-hand departments in their stores.

The acceleration of these changes has spread to regulators, who strive to make the ecological impact of our consumption more transparent. Much like nutritional quality, which is evaluated with a nutri-score, a “carbon score” that informs consumers about the environmental impact of the products they buy and helps guide their purchasing behavior is becoming the new normal.

We can no longer afford to evaluate companies based solely on financial criteria. Instead, the carbon standard must be integrated into the entire value chain, allowing consumers to choose not only their preferred brand but also the product most respectful of the planet.[18]

The “Yuka of CO2” is becoming widespread.[19] Companies that do not leverage their environmental data to make the carbon footprint of their products transparent risk being rated on non-affiliated channels, beyond their control. It is therefore up to brands to use the latest technologies to activate the right levers and assert themselves as resolutely eco-responsible entities.

Capgemini seeks to serve governments and public organizations in their quest to build sustainable territories by leveraging data, AI, and analytics expertise. Want to know more? Check out our AI4Environment point of view here.

Please reach out to the author for more information.

How test and learn delivers value in data-driven marketing

Capgemini
Capgemini
15 Mar 2021
capgemini-invent

Data has changed the game for marketers. It informs campaigns, helps to secure budgets, maximizes the return on investment, and drives a better customer experience. But data alone is not enough. Test and Learn (Experimentation) is an essential element of ensuring your data delivers the results demanded by the business – and there are some useful steps to take when running tests.

Gone are the days when marketers made decisions relying purely on intuition and experience. With data and analytics proliferating, marketers are increasingly looking to data to inform every decision, running tests to understand and validate what works and what doesn’t. A test could be as simple as changing the color of a button on the mobile app, or as complex as using different machine learning algorithms to send a personalized offer. In both these cases, tests enable marketers to experiment by applying the change on a small percentage of the overall customer base and measuring the effect.
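
One common way to expose only a small percentage of the customer base to a change is deterministic hash-based bucketing. The sketch below is illustrative: the 5% split, the “button_color” test name, and the customer IDs are made-up assumptions, not a specific platform’s API:

```python
# Deterministically assign a fraction of customers to a test variant by
# hashing their ID, so each customer always lands in the same bucket.
import hashlib

def in_treatment(customer_id: str, test_name: str, pct: float) -> bool:
    digest = hashlib.sha256(f"{test_name}:{customer_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to [0, 1]
    return bucket < pct

# Roughly 5% of customers see the hypothetical new button color
exposed = sum(in_treatment(str(i), "button_color", 0.05) for i in range(10_000))
```

Hashing on the test name plus the customer ID keeps assignments stable across sessions and independent between different tests.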

According to Gartner survey analysis, organizations that significantly outperform their competitors are almost twice as likely to make testing and experimentation a marketing priority. Running tests helps marketers on their journey to drive superior customer experience, growth, and return on investment (ROI). Almost half of the companies in an Optimizely survey credited experimentation with driving a 10% uplift in revenue. Even a failed test provides learning and insight into customer behavior.

Three steps to good testing outcomes

Running a test that provides valuable learning without impacting the business adversely is a mix of science and art. A well-conceived test involves three steps, as follows:

Design

  • Understanding business goals — Goals help to ensure that tests (aka experiments) are aligned to business objectives, which could be increasing customer engagement, improving campaign efficiency, or increasing revenue. They provide structure and scope to experimentation. Goals also enable prioritization when multiple tests are needed but cannot be run simultaneously.
  • Formulation of hypothesis — A hypothesis explains clearly “what” change is being made, “why” we want to make the change, and the expected outcome. For example, for an ecommerce business, using a more creative email subject line (“what”) might resonate better with its younger customer base (“why”), resulting in an increase in email open rate and higher customer spend (expected outcome). Running the test will confirm or reject the hypothesis. An if-then-because statement is a good way of framing a hypothesis. Insights from prior tests, analysis, intuition, and business knowledge, as well as creative thinking, help create hypotheses.
  • Definition of the success metrics and alignment on next steps – It is important to have clear and precise KPIs (primary and secondary), ensuring that different individuals interpret the test results in the same way. This also ensures that the hypothesis is measurable before the test is run. Getting alignment with key stakeholders on the success metrics at the design stage ensures that the learnings from the test will be subsequently used.
  • Determination of sample size and duration of the test — A large treatment size (i.e. subjecting the change to a large group of customers) provides greater ability to detect small changes at the desired significance level. However, having a large treatment size could be costly if the change being tested is detrimental to the business. For example, using a new algorithm that provides discount-rich, personalized offers to a large customer base will result in lower profitability and poor marketing ROI. Hence, it is essential to determine the minimum sample size that provides a confident (statistically significant) test result. This is done using power analysis, a common practice in scientific experiments that can be implemented using packages available in Python and other programming languages.
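
As a concrete illustration, the minimum per-group sample size for a two-proportion test can be computed with the standard power-analysis formula using only the Python standard library. The 20% baseline open rate and 2-point expected lift below are made-up numbers:

```python
# Minimum sample size per group to detect a lift in email open rate from
# 20% to 22% with 5% significance and 80% power (illustrative numbers).
import math
from statistics import NormalDist

def min_sample_size(p1: float, p2: float,
                    alpha: float = 0.05, power: float = 0.8) -> int:
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # threshold for desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)       # sum of Bernoulli variances
    n = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
    return math.ceil(n)

n_per_group = min_sample_size(0.20, 0.22)  # roughly 6,500 customers per group
```

Libraries such as statsmodels implement the same calculation; the point is that the required sample grows quickly as the effect you want to detect shrinks.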

Execution

  • Deployment — This is an operational step to run the test. Care needs to be taken to ensure that the sample size determined during the design phase is adhered to and the right population is exposed to the test.

Analysis and results shareout

  • Post-test analysis and measurement of metrics — While every effort may have been taken to ensure that treatment and control (holdout) population groups are truly random, it’s possible some bias might have crept in that needs to be eliminated. Using regression modeling, it is possible to increase the quality of the results (e.g. adjusting for prior spend bias between the treatment group targeted by the marketing campaign vs. a holdout group that was not targeted). For tests that are run over a long period, having a mid-test result readout provides a kill switch if required.
  • Results shareout — Quite often, teams run tests to present themselves as data driven, but when the result shows that the hypothesis is incorrect, decisions are still made using the original hypothesis. Aligning with decision makers at the start on how the results will be used ensures commitment and proper use of the organization’s resources.
  • Maintenance of a learning repository — This includes the goal, hypothesis, treatment and control details, sizing, duration, success metrics, result and next steps involved in the test. In large organizations, as people move in and out of teams, having a learning repository helps new members understand what has worked and what hasn’t, helping in ideation and formulation of new hypotheses for future use.
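
To make the bias-adjustment idea concrete, the sketch below uses the Frisch–Waugh partialling approach to estimate a treatment lift on spend while controlling for each customer’s prior spend. All spend figures are made up for illustration:

```python
# Estimate treatment lift on post-campaign spend while adjusting for
# prior-spend bias between treatment and holdout groups (illustrative data).
from statistics import mean

def _residualize(y, x):
    """Residuals of y after removing its linear dependence on x."""
    mx, my = mean(x), mean(y)
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    return [yi - my - beta * (xi - mx) for xi, yi in zip(x, y)]

def adjusted_lift(pre_spend, post_spend, treated):
    # Partial prior spend out of both the outcome and the treatment
    # indicator, then regress residual on residual (Frisch-Waugh-Lovell).
    ry = _residualize(post_spend, pre_spend)
    rt = _residualize([float(t) for t in treated], pre_spend)
    return sum(a * b for a, b in zip(ry, rt)) / sum(b * b for b in rt)

# Treated customers already spent more before the campaign (selection bias):
pre = [10, 20, 30, 40, 50, 60]
post = [10, 20, 30, 45, 55, 65]  # true lift for treated customers is 5
flags = [False, False, False, True, True, True]

naive = (mean(p for p, t in zip(post, flags) if t)
         - mean(p for p, t in zip(post, flags) if not t))
```

On this toy data the naive treatment-minus-holdout difference is 35, wildly overstating the effect, while the regression-adjusted estimate recovers the true lift of 5.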

Creating a test and learn culture

For organizations to truly build a test and learn culture, management support is critical. It takes courage for marketers to acknowledge and work with the unpredictability of human behavior. Great managers recognize this and are focused on outlining goals and providing direction to the team. They also understand the importance of having the right infrastructure and tools to enable employees to run tests at scale with speed. To conclude, a quote by Amazon’s Jeff Bezos sums up the importance that leading businesses place on test and learn: “Our success is a function of how many experiments we do per year, per month, per week, per day.”

Find out more

Want to know how to get started, assess your current test and learn capabilities, or accelerate your journey? Get in touch and we can help you explore your options.

Author

Darshit Tolia

Managing Consultant, Insights-Driven Enterprise (IDE),
North America

Capgemini Invent