
Who’s got talent?

Amit Paul
04 May 2022

Why taking a fresh look at your skill management is critical in attracting, developing, and retaining the right people to take your business forward in this new normal.

In our previous post, we delved into the importance of innovation in the ADM space – and the value that a collaborative idea incubation center can deliver across your operations. But now, we’re looking at something even more important than innovation. We’re talking about talent – the people who create ideas and bring new offerings to life with innovative technologies.

At the core of every successful modern business, you’ll always find two Ts – talent and technology. We need good talent to develop and make the best use of the current and emerging technology that’s essential for starting, running, or transforming businesses. Making the best use of technology is just as important as creating new technology – and for success with both, nurturing your talent is the key factor.

Continuous change, new technologies and techniques, and new ways of working

In today’s chaotic world, certainty has almost entirely lost its meaning – and as innovators in the technology space, we must always be on our toes. This means adapting to continuous change to innovate, adopting new technologies and techniques to transform, and implementing new ways of working to meet business demands:

  • Adapting to continuous change: business needs are changing faster than we can anticipate – be it effects stemming from the pandemic, tense geo-political situations, or climate change
  • Adopting new technologies and techniques: these are essential in augmenting your ability to respond to change. You can see Moore’s Law in action every day with new surprises on the technology front – be it innovation in Cloud, Data with AI, ML and IoT, blockchain, transformation with data fabric, composable applications, MLOps, etc.
  • Implementing new ways of working: this is essential to keep pace with changes in business priorities and technology innovation. Agile and DevOps are the new guidance systems in responding on time and making your business viable.

And as you may have guessed, the key to adapting to continuous change, adopting new technologies and techniques, implementing new ways of working – and ultimately succeeding in this new normal – lies within your ability to attract, develop, and retain talent. But how should you get started here?

From “I” to “Pi” – Taking a fresh look at skill management, involvement and ownership, and effective training

As Agile ways of working mature, it’s become clear that the successful execution of projects and service management requires a totally new mindset – and a fresh look at skill management.

The main objective of a team should always be the solving of business challenges – rather than the fulfilment of specific skills. In the technology space, a business challenge could be the creation of a new application or platform, the efficient and cost-effective running of an application landscape, or the transformation of a legacy environment. Time and again, we have seen that individuals with broader knowledge bases (in addition to specializations) are better equipped to handle complex problems and can quickly adapt to new environments.

There’s a real need to nurture your talent to meet the expectations of this new normal, which should include fostering speed, agility, and out-of-box thinking. To accomplish this, it’s critical to take a fresh look at the skill management of your people on their journeys to Agile maturity – from I to T to Pi-shaped skill sets.

Personal ownership and involvement of individuals within teams

Addressing and aligning with the career aspirations and life goals of individuals is crucial. Some people may thrive on the familiarity of an environment and take pride in maintaining the stability and efficiency of business as usual, while others may be more adventurous in terms of learning new skills and trying out different things to come up with innovative solutions. Nurturing both these types of individuals is essential for your teams.

Additionally, enabling individuals to choose their own mode of engagement is essential – whether time-boxed (depending on when value gets delivered) or more fluid. Meanwhile, onboarding needs to show every individual the plan for the success of the team, project, and customer engagement, and convey how the team is going to achieve its goals – which in turn relates to how we are going to solve future business problems.

Training is critical – but effective training is the real challenge

It’s important to get away from stereotyping or typecasting: a developer can learn the necessary skills to address business expectations appropriately, while a business analyst could learn the basics of microservices to visualize the changes required in a business-function workflow.

Rather than long, rigorous learning regimes, breaking information down into small, bite-sized chunks that are relevant and interesting enough to prompt learners to try things out and attain a sense of achievement can play a huge role. Training designed to nurture T and Pi-shaped skill sets helps teams address pressing business issues without feeling forced by compliance measures or the need to add another certification.

On-the-job training can help develop prototypes for solutions to a variety of business issues, as team members actively learn the skills required to solve specific problems. For example, maintaining applications hosted in the Cloud versus on premise (data center) requires different collaboration, coordination, and networking skills, in addition to technical and functional skills.

Enabling your people with the right training and tools can help transform your applications development to innovate faster, work smarter, improve operational efficiency and TCO, meet specific business goals in less time, and enhance collaboration between business and IT. Within the Application Development and Maintenance (ADM) space, Low code/No code with ADMnext can bring your people the right tools to seamlessly create applications using a graphical interface, with virtually no programming experience required.

Overall, Capgemini’s ADMnext provides a comprehensive platform with the tools and techniques necessary to adopt and effectively use technology to address business challenges. ADMnext empowers your people to step outside of a ticket-focused mentality and into a value-based mentality by developing heightened human and business connections – and adding direct value to customers.

In our next post of this series, we’ll look at Cloud within the ADM space and how the right Cloud modernization strategy can take your business to new heights.

In the meantime, to learn more about how Low code/No code with ADMnext can help nurture your talent – and about the ADMnext offering as a whole – shoot me a message here.

Decarbonization: How data is critical to realizing your net zero ambition

Vincent de Montalivet
28 April 2022

While many businesses and governments have set net zero targets, data-powered intelligence is key to bridging the gap between net zero ambition and action.

Decarbonization is now firmly at the top of the C-suite agenda. Legislation is evolving fast, and civil society is increasingly sensitive to the carbon catastrophe we face. Citizens, customers, and the whole of society are demanding climate action, right now.

Consumers, investors, and employees expect organizations to be accountable and transparent about climate action. Greenwashing is a major issue, for example, with fossil fuels being marketed as carbon neutral [1], and 59% of sustainability claims by fashion brands having been found to be greenwashing [2].

Promises and platitudes are no longer enough. The carbon cost of every activity is scrutinized intensely, since reports on climate risks and social impacts are now expected to be disclosed in routine corporate accounting.

Finance and asset managers are using ESG performance as a decision-making criterion. During the height of the global pandemic in 2020, large funds with ESG criteria outperformed the broader market. In fact, the most carbon virtuous companies can expect positive impacts on corporate financials, enjoying more favorable financing terms and being seen as more resilient in times of crisis.

Increasingly, organizations face legal demands to act and, even more importantly, to prove their actions. In May 2021, a Dutch court ruling ordered Shell to reduce its carbon emissions by 45% by 2030 for failing to deliver formal proof that it was keeping its commitments. This legal thunderclap warns all companies that lip service is no longer enough. Action is everything.

We are entering a new era of carbon accounting, and carbon is our new currency. There is no doubt that the enterprise needs to be as fully equipped for carbon accounting as it already is for financial accounting.

The critical role of data in reporting on ESG commitments

While many businesses and governments have set net-zero targets, data-powered intelligence is key to bridging the gap between net-zero ambition and action. The shift towards action – and proof of action – demands a super refined level of ESG data management.

Simple in theory, less so in real life. First, we make sure that all the data is complete, consistent, and compliant with the relevant taxonomy and frameworks, and catalog it to build a trusted foundation.

Next, we move from a one-off, batch collection logic to a recurring collection logic, or even continuous integration. This allows us to create a real data platform dedicated to the net zero program, designed for analytics and for optimizing its value using data science – without forgetting, throughout the process, to implement the governance required to manage the program over the long term.

Net zero intelligence supercharges decarbonization

Evolving frameworks, regulations, and standards require that organizations make their emissions data transparent and visible. Not so long ago, this data was reported annually. Today environmental data is a new parameter that feeds into real time decision-making processes, enabling us to make the optimum compromise between cost, time, quality, and now carbon.

Data, AI and analytics are key levers to secure and execute the enterprise sustainability agenda.

Data is an essential lever to build resilience and reduce climate and business risks by addressing three main objectives:

  • Measure to steer progress
  • Improve to reduce impact
  • Anticipate and adjust the climate action plan

Data for Net Zero

Capgemini has developed a seminal net zero program, underpinned by an enviable track record in data analysis, governance, and the deployment of data solutions and products. These experiences have convinced us of the power of data to fuel the decarbonization process through the creation of Data for Net Zero.

We translate the carbon assessment into tangible insights to monitor and report your ESG commitments at scale through industrialized measurement, powered by our trusted data and AI platforms.

Data for Net Zero enables simulations and advanced analytics that provide centralized real time enhanced insights. These enable organizations to transform their ESG commitments into a pragmatic and viable action plan.

Create a data strategy for net zero

First, your data vision needs to be seamlessly integrated into your overall net zero trajectory. This means breaking down your net zero objectives into key data projects and indicators, then sharing them right across your business.

To anchor your data challenges, you’ll need to review the best calculation methodology for GHG emissions and define the optimum organizational model and parameters of governance. To achieve your data ambition, you’ll also need to select the right technologies and solutions. And finally, you’ll need to create and nurture the optimum data partner ecosystem, which means focusing on seamless data collaboration.
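To make the calculation-methodology point concrete, here is a minimal sketch of the standard activity-data-times-emission-factor approach used in GHG accounting. The factor values, activity names, and scope assignments below are purely illustrative placeholders, not validated figures.

```python
# Minimal, illustrative GHG estimation sketch: emissions = activity data x emission factor.
# All factor values, activity names, and scope assignments are hypothetical placeholders.

# kg CO2e per unit of activity (illustrative only; real factors come from published databases)
EMISSION_FACTORS = {
    "grid_electricity_kwh": ("scope2", 0.40),
    "natural_gas_kwh": ("scope1", 0.18),
    "road_freight_tonne_km": ("scope3", 0.11),
}

def estimate_emissions(activity_data: dict[str, float]) -> dict[str, float]:
    """Aggregate kg CO2e per scope from raw activity quantities."""
    totals = {"scope1": 0.0, "scope2": 0.0, "scope3": 0.0}
    for activity, quantity in activity_data.items():
        scope, factor = EMISSION_FACTORS[activity]
        totals[scope] += quantity * factor
    return totals

if __name__ == "__main__":
    # Example monthly activity data for one site (fictional numbers)
    print(estimate_emissions({
        "grid_electricity_kwh": 120_000,
        "natural_gas_kwh": 45_000,
        "road_freight_tonne_km": 30_000,
    }))
```

In practice, the factors would come from a governed emission-factor database and be versioned alongside the activity data – exactly the kind of trusted foundation the sustainability data hub described below is meant to provide.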

For example, a large American retailer asks its suppliers to formalize the improvements they have made, from one year to the next, on key environmental indicators, as a condition of their partnership. For large industrial companies, fulfilling this simple request would usually take months of work, collecting information and creating an appropriate response. A foundation of data collaboration focused on sustainability, activated across a data ecosystem, revolutionizes the process, quickly identifying those business needs and organizing data and systems accordingly.

Establish a sustainability data hub

It’s crucial that businesses set up a Data for Net Zero nerve center at the crossroads of their enterprise functions. Creating a Sustainability Data Hub will enable you to identify granular data to feed your data hub, from sources such as operational data and operating systems, as well as external sources, including emission factor databases and suppliers’ carbon data.

We’ll help you design and set up the optimum technological platform for sustainability-related data, based on your current data estate. You’ll be able to measure data-founded insights and report the environmental impacts of your activities, including scope 3. And you’ll soon be packaging data models to enhance advanced analytics and help business functions simulate reduction paths to shrink their footprint, powered by AI-driven use cases.

In the automotive industry, one of our clients is applying this analytical platform to its inbound and outbound logistics activities. Until now, it reported its carbon footprint annually. Today, in its decision-making, sustainable development is on the same level as the safety of the vehicles it designs.

Activate ESG data performance

Your ESG data performance can be harnessed and put to work as a corporate asset. First, you’ll need to set up a cross-organizational ESG performance steering infrastructure and choose the relevant ESG reporting framework to meet mandatory disclosure processes from the SEC, EU, HKEX, TCFD, and ISSB, to name a few. Then you’ll need to measure the ESG insights of all your activities, projects, and transactions, as well as those of third parties.

Once this is in place, you can industrialize and automate ESG reporting to comply with evolving regulations. And in the process, you’ll be able to extract a specific environmental dataset to meet and exceed the increasing expectations of investors, customers, and other stakeholders.

Cross-functional projects require cross-functional skills

It’s a fact that net zero is a cross-functional responsibility that needs a holistic approach. In addition to creating a solid data and AI foundation, it’s critical that senior managers in information systems management, data, and corporate social responsibility (CSR) are fully invested and committed to the cause. You’ll also need the full buy-in of decision makers right across the business, in operations, purchasing, supply chain, and sales, who need to track their respective sustainability performance.

Going forward, collaboration with an external partner like Capgemini, with cross-functional expertise in strategy and operations transformation, industrial process engineering, data management and AI, ESG, and sustainability, will transform your approach to putting decarbonization into action at scale. We’ll facilitate cross-functional exchanges, helping you to technically embrace sustainable performance and measurement in your data roadmap.

In truth, ESG is no longer an “optional extra” or a “nice to have.” Instead, it’s now a given that businesses will deliver clear and transparent ESG reporting. A business that doesn’t deliver a comprehensive ESG program is likely to experience poor investor satisfaction, as well as a negative impact on its financial results.

Wherever you are on your ESG journey, Capgemini is the perfect partner to help you reach the first stage of compliance maturity level, or to help you accelerate towards high value creation.

In short, we’ll ensure you maximize the full benefits of decarbonization, far beyond delivering basic annual carbon footprint statistics.

The future of ADM and your business: Fresh perspectives on the increasing maturity of ADM services

Ritesh Mehta
20 April 2022

Looking at innovation, talent, and cloud – and how your strategy for all three within the ADM space can dramatically affect the growth of your future business

ADM services are maturing rapidly, and their ability to lead and drive change – and ultimately grow the future potential of your business – keeps expanding. In this blog series, we’ll delve into the key elements affecting ADM today – innovation, talent, and Cloud technology – and the tangible tools and strategies you can utilize to lead the market.

Creating creativity: A collaborative idea incubation center for positive impacts across your operations, IT, and customer experience

Innovation is one of the most frequently used words in IT – especially now, when IT is seen by many as not just an enabler but the core of the business itself. Innovation and operational efficiency sit on opposite sides of the business equation. Innovation requires a healthy mix of space, playfulness, and curiosity, while operational efficiency thrives more on dexterity and discipline.

Evaluate, build, kill

Unlike operational efficiency, experimentation and the typical trial and error that follows are essential for a successful innovation learning process and your desired outcomes – unique and productive ideas that can be rapidly applied to grow your business. A “kill-build” factory model is ideal here. This entails the careful evaluation of ideas through an objective selection process – where the rejection of ideas is celebrated as part of an organizational evolution process in shifting to an innovative growth mindset.

Capgemini’s Innovation Offering: Creating ideas you can touch – and change you can feel

Capgemini’s Innovation Offering is a collaborative idea incubation center where fresh ideas are sparked together – with and for our clients. Throughout the creation process, ideas are sourced, recorded, deliberated on, and augmented using a fresh outside-in perspective. Ideas that make the cut are rapidly transformed from the imaginary into innovations that you can touch and changes your business can implement.

Capgemini’s Cloud Modernization with ADMnext supports the implementation of these ideas by rapidly setting up servers, allocating the right resources, and proliferating a viable product throughout your organization. Meanwhile, our unique Customer Experience (CX) offering combines multiple business and user touchpoints to translate business benefits into experience benefits – because, in the end, selected ideas need to be user tested to assess their impacts on operations, IT efficiencies, customer experience, and overall business outcomes.

Innovation in action: Ideation for heightened customer experience and better business outcomes

While this all sounds very promising – what does the actual ideation process look like with Capgemini’s Innovation Offering – and how can it create positive impacts across your operations, IT, customer experience, and overall business?

Well, one prime example is a French multinational utility client of Capgemini’s that wanted a platform to test ideas for improved, data-driven decision making. This required massive resources for storage and computing functions. In applying Capgemini’s Innovation Offering, it was agreed that Hyperscalers could be utilized to rapidly set up the platform to massage, transport, and analyze a vast amount of data to promote and select the best ideas to build out into full projects.

In today’s markets, innovation and big ideas are almost always inextricably linked to technology as the core enabler. However, some ideas can be too big to swallow all at once and need to be sliced into smaller pieces. For example, a Dutch dairy producer’s existing Global Trade Service (GTS) system required frequent manual interventions for its over 7,000 declarations. So, we came up with a system based on a two-step Minimum Viable Product (MVP). This entailed the automation of pro-forma statements, followed by exception handling. Each step, in turn, had several smaller incremental developments to meet the expected target. This reduced manual interventions by over 50% – avoiding product spoilage and making operations paperless.

True transformation ideas cut across businesses by providing integrated solutions that span the entire globe. For example, a major furniture manufacturer in Sweden was mulling over a spatial planning optimization solution for loading cargo and simplifying the existing cumbersome process. We first proposed the creation of a visual representation of the problem, which brought snap-to-fit and drag-and-drop functionalities. This idea was developed via Cloud native with microservices to enable comprehensive partner integration. And as they say, the proof is in the pudding – this idea generated an increase of over 10% in cargo.

Monoliths present their own set of challenges and require ideas that first enable key stakeholders to find faith in the potential change and then gradually build on it. For example, a large railway company in France was seeking to modernize disconnected legacy systems and streamline task flows into a simplified, proven process. As a first step, we developed an MVP to demonstrate the potential for an “all-in-one” solution. This is what I refer to as the first “believing step.” Once this was achieved, we quickly moved on to implementing more MVPs that integrated the flow of tasks, which ultimately resulted in a mobile interface that displayed the task lifecycle from end-to-end. Additionally, along the way, MVPs were moved to full-fledged projects that resulted in reduced revenue loss and effort for the client.

In our next post of this series, my colleague Amit Paul will lay out how you can nurture the only element that’s more important than ideas – your talent – the people that create the actual innovation and drive your business forward.

In the meantime, to learn more about Capgemini’s Innovation Offering and ADMnext, and how we can work together to create impactful ideas like those above for your business, drop me a line here.

Digital cloud platform for restaurants

Capgemini
2022-04-20

The restaurant industry is experiencing an unprecedented transformational change. Business disruptions from the global health crisis prompted the industry to restructure and diversify its operating model and digital portfolio in response to the shift of market trends and customer preferences.

Elevated expectations for cleanliness, health, and safety are accelerating the adoption of contactless technologies, curbside pickup options, and mobile payments to minimize human interaction. Gartner predicts most organizations will leverage contactless technologies for up to 80 percent of their ordering and replenishment activities by 2024.

Restaurant businesses must prepare to meet the operational requirements of the new normal and drive supply-chain efficiency. Furthermore, restaurants should strive to maintain brand loyalty by providing a seamless and differentiated omnichannel experience through enhanced digital capabilities.

Capgemini’s Digital Cloud Platform (DCP) for Restaurants empowers the industry to accomplish these objectives by reimagining digital transformation through compelling customer interactions and streamlined operations, while reducing costs and improving efficiency.

Capgemini is an AWS Premier Tier Services Partner and Managed Service Provider (MSP) with a multicultural team of 325,000 people in nearly 55 countries. Capgemini has more than 12,000 AWS accreditations and over 4,900 active AWS Certifications.

Solution overview

Capgemini’s Digital Cloud Platform (DCP) for Restaurants is a high-velocity software engine that enables rapid innovation for restaurant businesses.

Leveraging the breadth and depth of Amazon Web Services (AWS), DCP provides a comprehensive collection of cloud-native built-in accelerators for building an end-to-end restaurant experience. DCP’s library of customizable software components reduces time-to-market and improves the return on investment (ROI) by up to 50 percent.

Furthermore, DCP enables end-to-end connectivity and integration between existing systems, data, and new business capabilities. Over 75,000 restaurants across 120 countries leverage DCP capabilities for digital transformation.

The following diagram shows the business processes provided by DCP across the restaurant value chain, ranging from omnichannel customer experience, personalization, menu and kitchen management, and online ordering, to loyalty, promotion, and system integration with point-of-sale (PoS), payment, and logistics.

Figure 1 – DCP business processes

In this section, let’s look at how restaurants can easily configure a fully functional online presence using DCP.

  1. Log in to the DCP console and choose a solution blueprint, such as online ordering. The visual library consists of Capgemini’s view of business processes for online ordering, including product management, store management, ordering, marketing, payments, and customer management.
    Figure 2 – DCP solution blueprints
  2. Next, expand the selected business process such as product management and choose relevant business-function components. For example, the product-management business process provides functions to create a product catalog, search products, and manage real-time menu updates.
    Figure 3 – DCP business functions
  3. Business functions are exposed as pre-built REST APIs following microservice principles. For example, the product association API manages the metadata for the fields of association types (an illustrative request sketch follows this list).
    Figure 4 – DCP business-function microservices
  4. Repeat steps one to three to add more business functions. Users can customize the online storefront using pre-built APIs from the DCP catalog.
    Figure 5 – Sample online storefront built with DCP
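To give a feel for step 3, here is a minimal sketch of calling such a pre-built business-function API from Python. The endpoint URL, payload fields, and bearer-token authentication are hypothetical placeholders rather than the actual DCP contract.

```python
# Illustrative only: calling a hypothetical DCP product-management API.
# The URL, fields, and bearer-token auth are placeholders, not the real DCP interface.
import requests

BASE_URL = "https://dcp.example.com/api/product-management/v1"  # hypothetical endpoint
TOKEN = "<access-token>"

def create_menu_item(name: str, price: float, category: str) -> dict:
    """Create a product in the catalog and return the API response body."""
    response = requests.post(
        f"{BASE_URL}/products",
        json={"name": name, "price": price, "category": category, "available": True},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(create_menu_item("Margherita Pizza", 11.50, "mains"))
```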

Solution benefits

Designed with restaurant industry needs in mind, DCP provides a multifaceted solution that enables restaurants to launch a fully functional, cost-effective, and omnichannel food ordering experience with faster time-to-market.

Capturing actionable insights is essential to gain a holistic view of the end-to-end guest experience. Built with microservices and cloud-native architecture, DCP provides a scalable and extensible platform to support future business growth, including multiple brands management, geographic expansion, and analytics enabled by artificial intelligence.

Solution architecture

The following architecture diagram provides a high-level outline of key services used in DCP.

Figure 6 – DCP architecture on AWS
  1. The user logs in to an admin application hosted on Amazon Simple Storage Service (Amazon S3) through a fast and secured custom domain managed by Amazon CloudFront, Amazon Route 53, and AWS Certificate Manager.
  2. The user selects the desired business functions from the admin interface. Then, appropriate microservices and resources will be deployed to the user’s AWS account through Amazon Elastic Container Registry (Amazon ECR) and AWS CloudFormation.
  3. The admin application sends requests to the business-function microservices running on AWS Lambda, Amazon Elastic Kubernetes Service (Amazon EKS), and Amazon Elastic Container Service (Amazon ECS) through Amazon API Gateway (a minimal handler sketch follows this list).
  4. Business-function APIs leverage core services, including:
    • Cost-effective and scalable document store on Amazon S3
    • In-memory data cache with sub-millisecond latency on Amazon ElastiCache for Redis
    • Fully managed messaging and email service by Amazon Simple Notification Service (Amazon SNS) and Amazon Simple Email Service (Amazon SES) for mass transactional and marketing communication
    • Identity management for secure user sign-up and sign-in across the web and mobile apps using Amazon Cognito that scales to millions of users and supports integration with social and enterprise identity providers via SAML 2.0 and OpenID Connect (OIDC)
    • Search, index, visualize, and analyze petabytes of text and unstructured data using Amazon OpenSearch Service
    • Automate employee identity verification and workplace safety with facial recognition and computer vision APIs using Amazon Rekognition
    • Conversational AI chatbots using Amazon Lex that allows customers to place online food orders, make a table reservation, and request delivery status
    • Real-time personalized recommendations using Amazon Personalize, including product ranking, specific product recommendation, and customized direct marketing
    • Derive valuable customer insights using Amazon Comprehend, including sentiment analysis and product reviews.
  5. Business function microservices follow event-driven architecture. Real-time interactions between services are facilitated by Amazon Managed Streaming for Apache Kafka (Amazon MSK).
  6. DCP uses relevant database services, including Amazon DocumentDB, Amazon DynamoDB, and Amazon RDS for PostgreSQL, based on data types used by business functions.
  7. Business function APIs and core services are monitored by Amazon CloudWatch for unified operational health and complete observability of the AWS cloud environment.
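As a rough illustration of item 3 above, the sketch below shows what a small business-function microservice behind Amazon API Gateway might look like as an AWS Lambda handler that persists an order to DynamoDB and publishes a notification to SNS. The table name, topic ARN, and event shape are assumptions for the example, not DCP’s actual implementation.

```python
# Illustrative Lambda handler for an order-capture business function behind API Gateway.
# Table name, topic ARN, and payload shape are hypothetical; error handling is minimal.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

ORDERS_TABLE = os.environ.get("ORDERS_TABLE", "dcp-orders")   # assumed table name
ORDER_TOPIC_ARN = os.environ.get("ORDER_TOPIC_ARN", "")       # assumed topic ARN

def handler(event, context):
    """API Gateway proxy integration: store the order, then notify downstream services."""
    order = json.loads(event.get("body") or "{}")

    # Persist the order document (the DynamoDB/DocumentDB choice depends on the data type)
    dynamodb.Table(ORDERS_TABLE).put_item(Item={
        "order_id": order["order_id"],
        "store_id": order["store_id"],
        "items": order["items"],
        "status": "RECEIVED",
    })

    # Emit an event so kitchen, loyalty, and logistics functions can react asynchronously
    if ORDER_TOPIC_ARN:
        sns.publish(TopicArn=ORDER_TOPIC_ARN, Message=json.dumps(order))

    return {"statusCode": 201, "body": json.dumps({"order_id": order["order_id"]})}
```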

Running a successful business in a rapidly evolving and competitive restaurant industry takes more than creating a profitable menu and choosing an ideal location.

Restaurant businesses must optimize financial performance by regularly evaluating key performance indicators, including sales, food and labor costs, time per table turn, online ordering, sentiment analysis, inventory, and operational efficiency.

Furthermore, restaurants must transform and enhance the digital guest experience with innovative technologies to adapt and gain a competitive advantage in future restaurant models.

With extensive restaurant industry experience and a strong relationship with AWS, Capgemini helps multiple major restaurant brands re-envision and transform their business models with innovative technologies at an accelerated pace.

In this post, we covered how Capgemini’s Digital Cloud Platform (DCP) for Restaurants enables innovation at scale and drives actionable data insights by developing customer-focused innovation. We provided an overview of DCP, including features, benefits, and reference architecture.

Please reach out if you have any questions, or you can learn more about DCP on the website for Capgemini restaurant solutions on AWS.

Authors:

Cher Simon, Principal Partner Solutions Architect, Data/ML, AWS;
Rahul Khandelwal, Director – Solutions, Capgemini
Ejas Ahamed, Senior Director, Capgemini

Creating a data-powered culture

2022-04-20

Data provides answers, but people drive change

“Culture eats strategy for breakfast.” Peter Drucker’s famous quote is a perfect call-out here: no matter how detailed and solid your data-powered vision and strategy are, if the people executing them don’t nurture a data culture, your journey is likely to fail. Becoming a data-driven organization isn’t just about data or technology; it is about transforming the way decisions are made, based on deep analysis of facts rather than intuition and emotion. Organizations are made up of individuals, and the people working there determine the success of a data-driven transformation. Therefore, any transformation journey needs to have organizational culture at the root of any change it wants to effect.

The Capgemini Research Institute report, The data-powered enterprise, found that a majority (75%) of data masters invest in a collaborative and innovation-driven data culture to build a data-first culture.

Companies that follow a data-powered culture not only stay resilient but thrive in disruptions. Data forms the lynchpin of their flywheel model of operations that drives customer-centricity, innovation, and adoption of advanced technologies. Data is ingrained in their DNA, guiding all decision-making and helping them be nimble, agile, and able to adapt to adversities.

But how do you successfully do that?

To build a data-first culture, we have adopted a structured approach of advocacy and adoption

In our AI & Data Activate programs for clients across the globe and industries, there is usually a technology stream to modernize a data platform, a business stream to implement use cases, an organizational/process stream to streamline data governance, and a change-management stream to support the people dimension. It is about winning the hearts and minds of all stakeholders within (and often even outside) organizations to create and/or strengthen a data-driven mindset.

Hofstede depicts organizational culture as layers of an onion, with values at its core and practices permeating each layer. Driving a data-driven culture within an organization starts with values. This means that data-driven decision-making within an organization should be seen as the default. This requires commitment at all levels in the organization, amplified through communication, training, and management attention. At a global consumer-products company that we worked with for many years, executives at the CxO level would not even consider proposals not underpinned by a thorough data-driven assessment, and they considered their analytical prowess a unique differentiator in the market.

After values, it is important to embed a data-driven culture in rituals, which are mostly processes, meetings, and ways of working. The next layer is heroes – champions within the organization who have adopted the new data-driven way of working. They are the key advocates who exemplify the new way of working by leading by example. These examples are supported by success stories. Hence, PR and communication play a crucial role in amplifying these stories. At the outer layer of culture are symbols. Marketing and expanding the reach of a data-driven transformation means creating brand awareness within an organization, through PR management and publishing assets, images, and stories.

Finally, every layer of the culture model is permeated by practices or, simply put, what you do. Without a strong data-powered culture, evidence-based decision-making gets relegated to only a few areas of operations, and organizations fall back on tried-and-tested strategies for important decision-making.

The power of habit: Cultivating data-driven behaviors across the enterprise

For the best data-powered organizations, data has become a habit. In “The Power of Habit,” Charles Duhigg writes about why companies and people do what they do. He looks at habits from a scientific perspective and identifies a three-step habit loop with Cues, Routines, and Rewards. Changing a habit to support culture change often focuses on changing the routines ingrained within an organization to promote a data-driven culture. Instilling new data-driven habits means creating new cues with new routines and associated rewards. To make data the cornerstone of their organization, companies need to invest across four operational pillars: people, platforms, partners, and processes. A relationship that is less transactional and more strategic between business groups and IT and BI teams would enable data to permeate the organization and become an enterprise-wide priority.

Finally, when making any change in an organization, make sure user-centricity is at the core of everything you do. Every design of a new tool or process needs to be ruthlessly user-centric. We’re all marketers and behavioral scientists. If you desire to augment decision-making with data, cognition is key. Less is more, with simplicity being the ultimate sophistication – to quote Leonardo Da Vinci.

Data success

An international consumer goods manufacturer inculcated data-powered thinking and used meaningful and relevant data from across the organization to connect closely with its one billion customers. A data incubator was set up and launched to initiate a transformation that put information and insights at the heart of all decision-making. The needs of the customers were defined. The insights and analytics supported the human decisions to improve desired outcomes, leading to demonstrable business benefits by surfacing opportunity, supporting human creativity, and increasing penetration, effectiveness, and revenues. These success stories were actively shared, champions were promoted, stories were told, and new ways of working were designed with a pure user-centric lens. Becoming more data-driven became powerful, simple to do, and a daily habit.

While every company aspires to utilize data to make better decisions consistently, many fall short due to old habits or not having a clear approach. To infuse data-powered culture in an organization’s DNA, business leaders must take a step-by-step approach to win the hearts and minds of everyone.

INNOVATION TAKEAWAYS

VALUES ARE AT THE CORE

Data-driven decision-making within an organization should be seen as the default, accepting nothing less through all the ranks.

WE COULD BE HEROES

Recognize, profile, and support cultural role models that show a data-powered mindset in their daily work.

SYMBOLS LEAD THE WAY

PR management and publishing assets, images, and stories all help to brand and market a data-powered culture.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

How cross-industry data collaboration powers innovation

Capgemini
2022-04-20

This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

Written by:

Eve Besant, SVP, Worldwide Sales Engineering, Snowflake

Innovation doesn’t happen in a vacuum. The development of new products, services, and solutions involves input and information from a multitude of sources. Increasingly, many of these sources are not only beyond an organization’s borders but also beyond the organization’s industry. According to a 2020 research paper on cross-sector partnerships, “Cross-industry innovation is becoming more relevant for firms, as this approach often results in radical innovations.” But developing innovations through cross-industry partnerships must involve coordinated data collaboration. “Firms can only benefit from cross-industry innovation if they are open to external knowledge sources and understand how to explore, transform, and exploit cross-industry knowledge,” the paper’s authors noted. “Firms must establish certain structures and processes to facilitate and operationalize organizational learning across industry boundaries.”

“WE’VE SEEN AN INCREASE IN THE NUMBER OF CUSTOMERS WHO WANT TO COLLABORATE ON DATA FROM OTHER INDUSTRIES TO SPUR NEW IDEAS.”

Examples of cross-industry data collaboration

There is a multitude of examples of how organizations across industries have spurred innovation through collaboration.

  • In financial services, institutions that must prevent and detect fraud use cross-industry data sharing to better understand the profile of fraudsters and fraudulent transaction patterns.
  • In manufacturing, companies are using AI to manage supply-chain disruptions. Using data from external sources on weather, strikes, civil unrest, and other factors, they can acquire a full view of supply-chain issues to mitigate risks early.
  • In energy, smart meters in individual homes open new doors for data collaboration, transmitting information about energy consumption.
  • In education, school systems, local governments, businesses, and community organizations work together to improve educational outcomes for students.
  • In healthcare, during the COVID-19 pandemic, hospitals relied on information from health agencies and drug companies regarding the progression and transmission behavior of diseases. Governments followed data from scientists and healthcare professionals to create guidance for the public. Retailers heeded guidance from the public and healthcare sectors to create new in-store policies and shift much of their business online.

The role of cross-industry data collaboration in innovation during the pandemic is perhaps nowhere better exemplified than in the COVID-19 Research Database, involving a cross-industry consortium of organizations. The database, which can be accessed by academic, scientific, and medical researchers, holds billions of de-identified records including unique patient claims data, electronic health records, and mortality data. This has enabled academic researchers in medical and scientific fields as well as public health and policy researchers to use real-world data to combat the COVID-19 pandemic in novel ways.

Best practices for cross-industry collaboration

As the examples above show, organizations that have developed cross-industry data collaboration capabilities can more easily foster innovation, leading to a competitive advantage. Here are some of the considerations and best practices that enable sharing and collaborating on knowledge across industries.

  • A single, governed source for all data:
    Each industry – and indeed, each company – stores and formats its data in different ways and places. Housing data in one governed location makes it easier to gather, organize, and share semi-structured and structured data securely.
  • Simplified data sharing:
    The relevant data must be easily accessible and shareable by all partners. Data is stored in different formats and types, and it can be structured, semi-structured, or unstructured. It can be siloed in specific departments and difficult or slow to move, or inaccessible to the outside world. What processes and tools are in place to transform cross-industry knowledge into a shareable, usable format?
  • Secure data sharing:
    Data privacy is of the utmost importance in today’s society. Data must be shareable securely and in compliance with privacy regulations. Cross-industry data sharing often involves copying and moving data, which immediately opens up security risks. There may also be different data protection and privacy regulations in different industries.
  • Inexpensive data management:
    Data must be shareable, and budgets kept in mind. Centralizing, organizing, securing, and sharing data is often resource-intensive, so organizations need to find ways to manage and share their data more efficiently.
  • Democratized data:
    While data security and privacy are paramount, companies must “democratize” data so that it is accessible and shareable in a way that allows non-technical users, both internal and external, to use it easily.
  • Advanced analytics:
    Technologies such as AI and machine learning can help companies glean deeper insights from data. This requires a data foundation and tools that can analyze all types of data. Technological tools are making it easier for organizations to follow and gain ROI from these best practices.

For example, Snowflake’s Data Cloud enables the seamless mobilization of data across public clouds and regions, empowering organizations to share live, governed, structured, semistructured, and unstructured data (in public preview) externally without the need for copying or moving. Snowflake enables compliance with government and industry regulations, and organizations can store near-unlimited amounts of data and process it with exceptional performance using a “pay only for what you use” model. They can also use Snowflake’s robust partner ecosystem to analyze the data for deeper insights and augment their analysis with external data sets.
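As an indicative sketch of that “share without copying” idea, the snippet below uses the Snowflake Python connector to create a share over a sales schema and grant it to a partner account. All identifiers (account, database, schema, table, partner account) are placeholders, and the statements follow Snowflake’s documented secure data sharing flow rather than any specific customer setup.

```python
# Illustrative Snowflake secure data sharing setup via the Python connector.
# All identifiers below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account",   # placeholder account identifier
    user="DATA_ADMIN",             # placeholder user
    password="...",                # use a secrets manager in practice
    role="ACCOUNTADMIN",
)

statements = [
    "CREATE SHARE IF NOT EXISTS retail_collab_share",
    "GRANT USAGE ON DATABASE sales_db TO SHARE retail_collab_share",
    "GRANT USAGE ON SCHEMA sales_db.public TO SHARE retail_collab_share",
    "GRANT SELECT ON TABLE sales_db.public.store_transactions TO SHARE retail_collab_share",
    # The consumer account then creates a read-only database from this share on its side
    "ALTER SHARE retail_collab_share ADD ACCOUNTS = partner_org.partner_account",
]

with conn.cursor() as cur:
    for sql in statements:
        cur.execute(sql)

conn.close()
```

The point of the sketch is the design choice: the provider grants access to live, governed objects, so the consumer queries the same data without copies being shipped between parties.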

“We’ve seen an increase in the number of customers who want to collaborate on data from other industries to spur new ideas,” Snowflake’s Co-Founder and President of Products Benoit Dageville said, “to foster innovation, to be able to freely collaborate within and outside of their organization, without added complexity or cost.”

The future of mass collaboration

In the future, cross-sector data collaboration will only play a larger role in innovation as technology becomes more ubiquitous and the public grows more comfortable with sharing data. We could see worldwide consortiums that collaborate on data to solve some of humanity’s biggest problems: utilizing medical and scientific information to tackle global health crises, enabling more-efficient use of resources to fight poverty and climate change, and combating misinformation.

Organizations such as the World Bank are already working on such initiatives. Its Data Innovation Fund is working to help countries benefit from new tools and approaches to produce, manage, and use data. According to a recent World Bank blog post, “Collaboration between private organizations and government entities is both possible and critical for data innovation. National and international organizations must adopt innovative technologies in their statistical processes to stay current and meet the challenges ahead.”

To unlock the potential of innovation through data collaboration, organizations must make sure their data management and sharing capabilities are up to date. A robust, modern data platform can go a long way. But what’s also needed is an audit of internal processes and tools to ensure that barriers to data sharing and analysis are not impeding innovation and growth.

INNOVATION TAKEAWAYS

COLLABORATION NEEDS BEST PRACTICES

Organizations that implement best practices in cross-industry data collaboration can foster innovation, leading to a competitive advantage.

DATA CAPABILITIES MUST BE UP TO DATE

Organizations must make sure their data management and sharing capabilities are current, to unlock the potential of innovation through data collaboration.

TECHNOLOGY AND PLATFORMS TO THE RESCUE

Dedicated tools and data platforms make it easier for organizations to gain cross-sector data-collaboration capabilities much quicker.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini and partner experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

Unlocking the power of AI with data management

Capgemini
2022-04-20

This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

Written by:

Jitesh Ghai, Chief Product Officer, Informatica

In today’s data-driven economy, artificial intelligence (AI) and machine learning (ML) are powering digital transformation in every industry around the world. According to a 2021 World Economic Forum report, more than 80 percent of CEOs say the pandemic has accelerated digital transformation. AI is top of mind for boardroom executives as a strategy to transform their businesses. AI and ML are critical to discovering new therapies in life sciences, reducing fraud and risk in financial services, and delivering personalized digital healthcare experiences, to name just a few examples that have helped the world as it emerges from the pandemic.

For business leaders, AI and ML may seem a bit like magic: their potential impact is clear but they may not quite understand how best to wield these powerful innovations. AI and ML are the underpinning technology for many new business solutions, be it for next-best actions, improved customer experience, efficient operations, or innovative products.

“AI IS MOST EFFECTIVE WHEN YOU THINK ABOUT HOW IT CAN HELP YOU ACCELERATE END-TO-END PROCESSES ACROSS YOUR ENTIRE DATA ENVIRONMENT.”

Machine learning in general, and especially deep learning, is data-hungry. For effective AI, we need to tap into a wide variety of data from inside and outside the organization. Doing AI and ML right requires answers to the following questions:

  • Is the data being used to train the model coming from the right systems?
  • Have we removed personally identifiable information and adhered to all regulations?
  • Are we transparent, and can we prove the lineage of the data that the model is using?
  • Can we document and be ready to show regulators or investigators that there is no bias in the data?

The answers require a foundation of intelligent data management. Without it, AI can be a black box that has unintended consequences.

AI needs data management

The success of AI is dependent on the effectiveness of the models designed by data scientists to train and scale it. And the success of those models is dependent on the availability of trusted and timely data. If data is missing, incomplete, or inaccurate, the model’s behavior will be adversely affected during both training and deployment, which could lead to incorrect or biased predictions and reduce the value of the entire effort. AI also needs intelligent data management to quickly find all the features for the model; transform and prepare data to meet the needs of the AI model (feature scaling, standardization, etc.); deduplicate data and provide trusted master data about customers, patients, partners, and products; and provide end-to-end lineage of the data, including within the model and its operations.
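As a small, generic illustration of those preparation steps (and not any specific vendor tool), here is one way deduplication and feature scaling might look before model training; the column names, rules, and thresholds are assumptions for the example.

```python
# Generic illustration of pre-training data preparation: dedup, basic quality gate, scaling.
# Column names and rules are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler

def prepare_training_data(raw: pd.DataFrame) -> pd.DataFrame:
    # Deduplicate so one customer contributes one record to training
    df = raw.drop_duplicates(subset=["customer_id"]).copy()

    # Simple quality gate: drop rows missing the features the model needs
    df = df.dropna(subset=["age", "income", "tenure_months"])

    # Standardize numeric features (zero mean, unit variance) for the model
    features = ["age", "income", "tenure_months"]
    df[features] = StandardScaler().fit_transform(df[features])
    return df

if __name__ == "__main__":
    raw = pd.DataFrame({
        "customer_id": [1, 1, 2, 3],
        "age": [34, 34, 51, None],
        "income": [42_000, 42_000, 88_000, 60_000],
        "tenure_months": [12, 12, 60, 24],
    })
    print(prepare_training_data(raw))
```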

Data management needs AI

AI and ML play a critical role in scaling the practices of data management. Due to the massive volumes of data needed for digital transformation, organizations must discover and catalog their critical data and metadata to certify the relevance, value, and security – and to ensure transparency. They must also cleanse and master this data. If data is not processed and made usable and trustworthy while adhering to governance policies, AI and ML models will deliver untrustworthy insights.

Don’t take a linear approach to an exponential challenge

Traditional approaches to data management are inefficient. Projects are implemented with little end-to-end metadata visibility and limited automation. There is no learning, the processing is expensive, and governance and privacy steps can’t keep pace with business demands. So how can organizations move at the speed of business, increase operational efficiency, and rapidly innovate?

This is where AI shines. AI can automate and simplify tasks related to data management across discovery, integration, cleansing, governance, and mastering. AI improves data understanding and identifies privacy and quality anomalies. AI is most effective when you think about how it can help you accelerate end-to-end processes across your entire data environment. That’s why we consider AI essential to data management and why Informatica has focused its innovation investments so heavily on the CLAIRE engine, its metadata-driven AI capability. CLAIRE leverages all unified metadata to automate and scale routine data management and stewardship tasks.

As a case in point, Banco ABC Brasil struggled to provide timely data for analysis due to slow manual processes. The bank turned to an AI-powered integration Platform-as-a-Service and automated data cataloging and quality to better understand its information using a full business glossary, and to run automated data quality checks to validate the inputs to the data lake. In addition, AI-powered cloud application integration automated Banco ABC Brasil’s credit-analysis process. Together, the automated processes reduced predictive model design and maintenance time by up to 70 percent and sharpened the accuracy of predictive models and insights with trusted, validated data. They also enabled analysts to build predictive models 50 percent faster, accelerating credit application decisions by 30 percent.
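To make the “automated quality checks” idea tangible, here is a minimal, generic sketch of flagging outlier values in an incoming batch before it reaches a data lake, using an isolation forest. It is only an illustration of the pattern, not Informatica’s CLAIRE or the bank’s actual pipeline; the column names and contamination rate are assumptions.

```python
# Generic outlier-flagging sketch for incoming data (illustrative; not CLAIRE).
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

def flag_outliers(batch: pd.DataFrame, columns: list[str], contamination: float = 0.05) -> pd.DataFrame:
    """Mark rows whose numeric values look anomalous so a data steward can review them."""
    model = IsolationForest(contamination=contamination, random_state=42)
    batch = batch.copy()
    batch["quality_flag"] = np.where(
        model.fit_predict(batch[columns]) == -1, "review", "ok"
    )
    return batch

if __name__ == "__main__":
    batch = pd.DataFrame({
        "loan_amount": [12_000, 15_500, 9_800, 2_000_000],  # last value looks suspicious
        "annual_income": [55_000, 61_000, 48_000, 50_000],
    })
    # High contamination only because this toy batch has four rows
    print(flag_outliers(batch, ["loan_amount", "annual_income"], contamination=0.25))
```

Flagged rows would then be routed to stewardship or remediation rather than silently feeding the predictive models downstream.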

With comprehensive data management, AI and ML models can lead to effective decision-making that drives positive business outcomes. To counter the exponential challenge of ever-growing volumes of data, organizations need automated, metadata-driven data management.

INNOVATION TAKEAWAYS

Accelerate engineering
Data engineers can rapidly deliver trusted data using a recommender system for data integration, which learns from existing mappings.

Boost efficiency
AI can proactively flag outlier values and predict issues that may occur if not handled ahead of time.

Detect relationships among data
AI can detect relationships among data and reconstitute the original entity quickly, as well as identify similar datasets and make recommendations.

Automate data governance
In many cases, AI can automatically link business terms to physical data, minimizing errors and enabling automated data-quality remediation.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini and partner experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, A21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

A case for context awareness in AI

Capgemini
2022-04-20

Does applied AI have the necessary insights to tackle even the slightest (unlearned or unseen) change in context of the world surrounding it? In discussions, AI often equals deep-learning models. Current deep-learning methods heavily depend on the presumption of “independent and identically distributed” data to learn from, something which has serious implications for the robustness and transferability of models. Despite very good results on classification tasks, regression, and pattern encoding, current deep-learning methods are failing to tackle the difficult and open problem of generalization and abstraction across problems. Both are prerequisites for general learning and explanation capabilities.

There is great optimism that deep-learning algorithms, as a specific type of neural network, will be able to close in on “real AI” if only they are further developed and scaled up enough (Yoshua Bengio, 2018). Others feel that current AI approaches are merely a smart encoding of a general distribution into a deep-learning network’s parameters, and regard this as insufficient for acting independently within the real world. So, where are the real intelligent behaviors – the ability to recognize problems, plan for solving them, and understand physics, logic, causality, and analogy?

“THERE IS A NEED FOR CONTEXTUAL KNOWLEDGE IN ORDER TO MAKE APPLIED AI MODELS TRUSTABLE AND ROBUST IN CHANGING ENVIRONMENTS.”

Understanding the real world

What is needed is a better understanding by machines of their context, as in the surrounding world and its inner workings. Only then can machines capture, interpret, and act upon previously unseen situations. This will require the following:

  • Understanding of logical constructs such as causality (as opposed to correlation). If it rains, you put on a raincoat, but putting on a raincoat does not stop the rain. Current ML struggles to learn causality. Being able to represent and model causality will, to a large extent, facilitate better explanations and understanding of decisions made by ML models.
  • The ability to tackle counterfactuals, such as “if a crane has no counterweight, it will topple over.”
  • Transferability of learned “knowledge” across/between domains; current transfer learning only works on small tasks with large domain overlap between them, which means similar tasks in similar domains.
  • The ability to withstand adversarial attacks. Even small random changes in the input data (deliberate or not) can make the results of connectionist models highly unreliable. Abstraction mechanisms might be a solution to this issue.
  • Reasoning on possible outcomes, finding problematic ones, and then either a) planning to avoid them while reaching the goal, or b) if that is not possible, finding alternative goals and trying to reach those.

In the first edition of this review, we already made the case for extending the context in which AI models are operating, using a specific type of model that can benefit from domain knowledge in the form of knowledge graphs. From the above, it follows that knowledge alone probably will not be enough. Higher-level abstraction and reasoning capabilities are also needed. Current approaches aim at combining “connectionist” approaches with logical theory.

  1. Some connectionists feel that abstraction capability will follow automatically from scaling up networks, adding computing power, and using more data. But it seems that deep-learning models cannot abstract or generalize beyond learning general distributions; the output will at most be a better encoding, still without symbolic abstraction, causality, or reasoning capabilities.
  2. Symbolic AI advocates concepts as abstracted symbols, logic, and reasoning. Symbolic methods allow for learning and understanding human-made social constructs like law, jurisprudence, country, state, religion, and culture. Could connectionist methods be “symbolized” to provide the capabilities mentioned above?
  3. Several innovative directions can be found in trying to merge methods into hybrid approaches consisting of multiple layers or capabilities (a minimal sketch of such a layered setup follows this list):
  • Intuition layer: Let deep-learning algorithms take care of the low-level modeling of intuition or tacit skills shown by people having performed tasks over a long time, like a good welder who can hardly explain how she makes the perfect weld after years of experience.
  • Rationality layer: Skill-based learning in which explicit learning – conveying rules and symbols to a “learner” – plays a role, as in a child told by her mother not to get too close to the edge. A single example, not even experienced, might be enough to learn for life. Assimilating such explicit knowledge can steer and guide execution cycles which, “through acting,” can create “tacit skills” in an execution domain different from the original one.
  • Logical layer: Logic to represent causality and analogy, and to provide explanations
  • Planning and problem-solving layer: A problem is understood, a final goal is defined, and the problem is divided into sub-domains/problems which lead to a chain of ordered tasks to be executed, monitored (with intuition and rationality), and adapted.
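
Purely as an illustration of how such layers might be stitched together, the toy pipeline below lets a stand-in “intuition” component emit symbols with confidences, while a symbolic “rationality/logic” component applies explicit rules over them to reach an explainable conclusion. The perception stub, rule set, and threshold are invented for this sketch; in a real system the first stage would be a trained deep-learning model and the second a proper reasoning engine.

```python
from dataclasses import dataclass

@dataclass
class Percept:
    symbol: str        # abstract symbol emitted by the "intuition" layer
    confidence: float  # how confident the neural component is

def intuition_layer(raw_input) -> list[Percept]:
    """Stand-in for a trained deep-learning model mapping raw input to symbols."""
    # A real system would run a neural network here; this stub is hypothetical.
    return [Percept("crane", 0.97), Percept("counterweight_missing", 0.85)]

# Rationality/logic layer: explicit, human-readable rules over symbols.
RULES = [
    ({"crane", "counterweight_missing"}, "unstable: crane may topple over"),
]

def logic_layer(percepts: list[Percept], threshold: float = 0.8) -> list[str]:
    """Apply symbolic rules to the facts the neural layer is confident about."""
    facts = {p.symbol for p in percepts if p.confidence >= threshold}
    return [conclusion for premises, conclusion in RULES if premises <= facts]

if __name__ == "__main__":
    percepts = intuition_layer(raw_input=None)
    for conclusion in logic_layer(percepts):
        print("Conclusion:", conclusion)  # traceable to its rule and supporting facts
```

Because every conclusion can be traced back to a rule and the facts that triggered it, the symbolic layer supplies the explanation that the neural layer alone cannot.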

In general, ML models that incorporate or learn structural knowledge of an environment have been shown to be more efficient and to generalize better. Good examples of applications are not difficult to find: the Neuro-Symbolic AI work by the MIT-IBM Watson AI Lab demonstrates how hybrid approaches (NSQA in this case) can learn in a connectionist way while preserving and utilizing the benefits of first-order logic for enhanced query answering in knowledge-intensive domains like medicine. The NSQA system allows for complex query answering, learns as it goes, and understands relations and causality while being able to explain its results.

The latest developments in applied AI show that we get far by learning from observations and empirical data, but there is a need for contextual knowledge in order to make applied AI models trustable and robust in changing environments.

INNOVATION TAKEAWAYS

HYBRID APPROACHES are needed to model and use causality, counterfactual thinking, problem solving, and structural knowledge of context.

NEURAL-SYMBOLIC PROCESSING combines the benefits of connectionist and symbolic approaches to solve issues of trust, proof, and explainability.

CONTEXTUAL KNOWLEDGE is needed so that AI can model more of the world and understand the physics, logic, causality, and analogy in its surroundings.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini and partner experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

Using AI augmentation to empower database administrators

Capgemini
2022-04-20

This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

Written by:


Arvind Rao
Partner Architect Advisor
Google Cloud

Most enterprises already have the talent in-house to start using AI to unlock the full potential of their data. They are the database administrators. They know the data, they know the organization, and they are trusted advisors – they just need a little help from data-platform vendors.

The world’s largest organizations generally understand that to continue to succeed in today’s competitive environment, they need to become data-powered enterprises. They acknowledge that it’s imperative to modernize their data and harness the full power of tools such as AI to derive actionable insights. However, many of these companies have also learned that human resources are a major challenge in making this transformation.

In short, there are not enough data scientists – those who create the solutions that leverage state-of-the-art technologies such as AI. Based on my experience in data analytics over the past couple of decades, in an ideal world data scientists would account for 10 to 15 percent of the staff at a data-powered organization. Yet the majority of organizations – including most successful technology enterprises – have not achieved that ideal.

“ORGANIZATIONS UNDERSTAND IT’S IMPERATIVE TO MODERNIZE THEIR DATA AND HARNESS THE FULL POWER OF AI WHILE HUMAN RESOURCES POSE A MAJOR CHALLENGE.”

DBAs to the rescue

The good news is most enterprises already have the talent to successfully make this transformation. Database administrators (DBAs) – those who manage a company’s data warehouses and similar data platforms – are the backbone of most IT operations. These professionals understand the data an enterprise has collected, where it’s stored, and how to use it. They ensure authorized people have access to the data they need. And since data is sensitive and valuable, they control who has access to it to keep it safe from misuse or theft.

Knowledge and trust

As a result, database administrators know more about their company’s data than anyone else in their organization. They certainly know more than the data scientists who work for the technology vendors that develop the data platforms upon which modern enterprises rely.

At the same time, database administrators are trusted advisors within their enterprise. They’re the go-to source for help when a business user needs to derive insights – whether that’s a salesperson looking to improve lead generation, a service manager trying to spot potential customer satisfaction issues, or an executive seeking market predictions for the coming year.

It, therefore, makes sense to ensure database administrators can leverage the insights and capabilities of AI-augmented data platforms.

The Lake House

The majority of data-platform vendors have been working towards the concept of a Lake House – a convergence of databases, data warehouses, and data lakes – that makes the platform usable and accessible to everyone, everywhere. With data scientists increasingly focused on creating these new platforms, vendors have fewer resources to dedicate to building, managing, and maintaining the – often highly customized – tools required by business users. That’s why it’s important that data-platform vendors augment their solutions with AI. It’s also why these AI augmentations must be easy to use in the DBA’s day-to-day role: DBAs should not have to invest huge amounts of time learning data science to take advantage of these tools. Enterprises are increasingly demanding this simplicity of their suppliers – whether they are vendors of databases, data platforms, analytics, or cloud-based solutions.

At Google, we’ve developed a number of solutions that help bridge the gap and create data warehouses infused with AI/ML that work for all users – not just data scientists.

  • Vertex AI brings together Google Cloud services for building machine learning in a unified user interface and API. With Vertex AI, a database administrator can easily train and compare models using AutoML or custom code training. All models are stored in one central repository and can be deployed in ways that allow DBAs and other non-data scientists to start using AI/ML in their day-to-day work, with very little training.
  • Dataplex is an intelligent data fabric that breaks apart silos. It provides a single pane of glass that allows database administrators to centrally manage, monitor, and govern an organization’s data – including ingestion, storage, analytics, AI/ML, and reporting. It does this across any type of platform – including data lakes, data warehouses, and data marts – with consistent controls that provide access to trusted data and power analytics at scale.
  • BigQuery is a serverless, cost-effective, multi-cloud data warehouse designed for business agility. BigQuery democratizes insights with a secure and scalable platform to perform functions such as anomaly detection, customer segmentation, product recommendation, and predictive forecasting. It features built-in machine learning to derive business insights using a flexible, multi-cloud analytics solution and adapts to data at any scale from bytes to petabytes with zero operational overhead. Most importantly, database administrators can learn BigQuery quickly and easily incorporate it into their tasks (a brief sketch follows this list).
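
As a rough indication of how approachable this can be for a DBA, the sketch below trains and applies a simple BigQuery ML model using nothing but SQL submitted through the standard Python client. The dataset, table, and column names are placeholders invented for illustration; the general pattern (CREATE MODEL, then ML.PREDICT) is the point, not the specific schema.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # picks up the project configured in your environment

# Train a model with plain SQL – no separate ML infrastructure required.
# `my_dataset`, its tables, and columns are hypothetical placeholders.
client.query("""
    CREATE OR REPLACE MODEL `my_dataset.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `my_dataset.customers`
""").result()

# Score new rows with the trained model, again in SQL.
rows = client.query("""
    SELECT customer_id, predicted_churned
    FROM ML.PREDICT(MODEL `my_dataset.churn_model`,
                    (SELECT customer_id, tenure_months, monthly_spend, support_tickets
                     FROM `my_dataset.new_customers`))
""").result()

for row in rows:
    print(row.customer_id, row.predicted_churned)
```

The same statements can be run directly in the BigQuery console, so a DBA who is comfortable with SQL can start experimenting without writing any Python at all.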

The smart data-warehouse platform

Looking ahead, I envision a future in which most successful organizations deploy a smart data-warehouse platform that provides a number of important benefits. These include:

  • Easy access to the organization’s data, public data, and other business data – without worrying about what kind it is or where it’s stored
  • Serverless tools to access data in real time and to mine it and infuse AI/ML capabilities. These would be scalable on demand, set a strong foundation for building AI models, and be cost-effective.
  • Reporting tools that showcase analytics in real time – in a safe, secure, and scalable way
  • Modern data-warehouse capabilities that equip all users with the tools and resources they need to do their jobs efficiently and effectively, and that provide CXOs with the tools they need to keep their staff motivated

As enterprises work to achieve this goal, leveraging AI to empower database administrators in their day-to-day work is something they can do now, and do cost-effectively. They just need the right tools from their vendors.

Giving DBAs easy-to-learn AI-powered tools will enhance the value they already provide to the enterprise. It can also help keep these knowledgeable team members – the organization’s trusted advisors on all matters IT-related – relevant as the enterprise embraces a new, more powerful, and innovative data-powered future.

INNOVATION TAKEAWAYS

A VALUABLE RESOURCE

Database administrators know the company’s data and are trusted by its people. They have important roles to play in an organization’s transformation into a data-powered enterprise.

SHARE THE LOAD

Database administrators bridge the gap between the data scientists who are creating the next generation of AI-powered analytics tools and the business users who will benefit from the insights such tools provide.

VENDORS MUST HELP

Data-platform vendors must incorporate easy-to-learn AI tools into their products so database administrators can take full advantage of these state-of-the-art solutions.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini and partner experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

Data and the sustainability ecosystem

Capgemini
2022-04-20

Written by:

Philip Harker
VP, Sustainability Lead, Insights & Data
Capgemini

Courtney Holm
VP, Sustainability Solutions
Capgemini Invent

“Data isn’t oil,” says Zhiwei Jiang, CEO of Capgemini’s Insights & Data global business line. “At least not anymore. Data is more like sunshine, abundant and unlimited in its potential. It has a positive impact on the environment. And critically, sunshine wants to be shared – not hoarded.” Carbon and its equivalents (CO2e) have become a currency, and with carbon-reduction ambitions and even net-zero initiatives, it is essential that organizations commit to change, act in ways that reduce carbon across the business, and monitor and report carbon emissions across Scope 1, 2, and 3.

A new currency is born

With the same intent and discipline as money, carbon should be treated as a currency flowing through the organization – and, as with money, with the scrutiny, controls, and systems to avoid diversion and waste. Hence the need for systemized and structured data that gives the transparency and insights not only to support decisions for reporting on Scope 1, 2, and 3, but also to empower planning and modeling for business execution.
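
To illustrate what systemized, structured carbon data can look like in practice, here is a minimal sketch that applies the standard activity-data × emission-factor calculation and rolls the results up per GHG Protocol scope, much as a financial ledger would. The factors and activity figures are invented for illustration and are not authoritative values.

```python
from collections import defaultdict

# Hypothetical emission factors in kgCO2e per unit of activity (illustrative only –
# real factors come from published databases, suppliers, and measured data).
EMISSION_FACTORS = {
    "natural_gas_kwh":      {"scope": 1, "factor": 0.18},
    "grid_electricity_kwh": {"scope": 2, "factor": 0.23},
    "road_freight_tkm":     {"scope": 3, "factor": 0.11},
}

# Activity data as it might arrive from ERP, facilities, and logistics systems
activities = [
    ("natural_gas_kwh", 120_000),
    ("grid_electricity_kwh", 340_000),
    ("road_freight_tkm", 95_000),
]

def emissions_by_scope(activities):
    """Roll up kgCO2e per GHG Protocol scope: activity amount x emission factor."""
    totals = defaultdict(float)
    for activity, amount in activities:
        meta = EMISSION_FACTORS[activity]
        totals[meta["scope"]] += amount * meta["factor"]
    return dict(totals)

print(emissions_by_scope(activities))
# {1: 21600.0, 2: 78200.0, 3: 10450.0}
```

The hard part is not the arithmetic but getting consistent, trusted activity data and factors to flow into such a ledger from across (and beyond) the organization – which is exactly the data challenge described here.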

Equally, data residing in silos isn’t going to answer enterprise requirements for sustainability reporting; data needs to be accessible and available in a form fit for consumption. Much of the potential lies with the software industry and its ability to make data flow and be shared through enterprise adoption. Here are some examples. Salesforce has an established service, Sustainability Cloud, which leverages its proven approach of rapid templating to capture and interpret greenhouse gas (GHG) data. This, coupled with MuleSoft for data integration and APIs, will not only ease the burden of the market-wide shortage of sustainability data skills but also speed the rolling deployment of GHG reporting for Scope 1, 2, and potentially Scope 3 emissions.

“WITH THE SAME INTENT AND DISCIPLINE AS MONEY, CARBON SHOULD BE TREATED LIKE A CURRENCY, FLOWING THROUGH THE ORGANIZATION, AND WITH IT THE SCRUTINY, CONTROLS, AND SYSTEMS.”

SAP is also building on its core capability and has a unique position in this space, with a perhaps more strategic stance: that of being the guardian of transactional data. Will it become the “ERP for carbon,” as is expected? As organizations mature their carbon reporting to become more accountable, they will evolve towards a sustainable P&L, where the business understands, measures, and reduces the environmental, social, and financial impact of its operations. This in itself plays to the strengths of SAP’s system-of-record heritage. It has offerings with Product Footprint, circular economy, and enterprise reporting, which will surely be harmonized into a holistic suite of products.

Then there is Microsoft, with a different advantage. Many organizations today report on their efforts in an ad-hoc fashion, building custom reports to meet their obligations or regulations. With Microsoft providing the de-facto toolset for extracting, cleansing, sorting, and reporting data – 365, Azure, and Power BI – its ubiquitous use will be a significant liberator, bringing transparency to data locked in legacy silos and making reporting easier. Certainly, in the short term, while enterprise-class solutions are still being developed, Microsoft may well have an advantage, and the imminent release of Microsoft Cloud for Sustainability will be a testament to its tooling.

A new dawn is approaching for sustainability solutions, as evidenced by these three examples (and there are many others). With all of these comes the need for contextual data and sustainability expertise to implement the technology, instill confidence in the solution, and drive business outcomes. This is an evolution, not a revolution.

Darwinism: adapt or else…

Consider that businesses rarely operate in isolation. A consumer packaged-goods (CPG) company making shampoo, for example, will understand the ingredients needed to make the product and the manufacturing and packaging processes, and this data most likely resides in its core ERP and PLM systems. The products are shipped from the factory to a warehouse for distribution, to retailers, and, naturally, on to consumers. Data will need to coalesce and be harmonized along this journey – measuring actuals or defined proxies – to report the impact. These data points will reside outside the CPG organization, with the logistics providers and retailers. Data can also come from open data sources that standardize routes and traffic and are periodically revised. These supply chains have been built over a long period, typically through the lens of a financial business case. Will the currency of carbon and the need to report on Scope 3 emissions trigger a recalibration of these supply chains?

Indeed, in logistics, routes are typically optimized by distance, traffic, and cost – so what about optimizing routes based on their impact on the environment, such as avoiding a school or a designated low-emissions zone? Here is where open data sources from Google, governments, academic institutions, and not-for-profits can be combined to form innovative solutions on a #DataForGood basis.
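
As a toy illustration of routing on environmental impact rather than distance alone, the sketch below models a tiny road network and compares the distance-optimal path with the path that minimizes an (invented) per-edge CO2e cost, into which a penalty for passing a school or low-emissions zone has been folded. Node names, weights, and the penalty are all hypothetical.

```python
import networkx as nx  # pip install networkx

G = nx.DiGraph()
# Each edge carries a distance (km) and an emission cost (kgCO2e) – all figures invented.
G.add_edge("depot", "A", km=4.0, co2e=1.2)
G.add_edge("A", "store", km=3.0, co2e=0.9)
G.add_edge("depot", "B", km=5.5, co2e=1.4)
G.add_edge("B", "store", km=2.5, co2e=0.7)

# The depot->A leg passes a school / low-emissions zone, so its emission cost is penalized.
G["depot"]["A"]["co2e"] += 2.0

shortest_by_distance = nx.shortest_path(G, "depot", "store", weight="km")
lowest_emissions = nx.shortest_path(G, "depot", "store", weight="co2e")

print("Shortest by distance:", shortest_by_distance)  # ['depot', 'A', 'store']
print("Lowest emissions:    ", lowest_emissions)      # ['depot', 'B', 'store']
```

The interesting work is in populating such edge weights from open and shared data sources; once that data flows, switching the optimization target is straightforward.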

Tooling for the (new) trade

This is the reality that will evolve, and suddenly solutions get complex. Couple this with supply-chain transparency initiatives like digital twins and blockchain, and there is indeed another level of complexity. So simplification is needed, and it will come with standards: data systemized and shared openly in a governed and secure ecosystem. Expectations will evolve, and with them the need to trust the inputs and outputs of these data ecosystems, along with the need for organizations to share data with many other parties – and to do so for mutual benefit. Core to this is the need for data mastery, and in our research paper we discuss the various models for collaboration and the opportunities to serve and monetize data. One thing is for sure: expertise in climate, energy, and data lifecycles will be paramount for these ecosystems to succeed and evolve at pace.

INNOVATION TAKEAWAYS

CARBON DATA IS A CURRENCY
And to manage it effectively, just as with any other currency, its data needs to flow freely within and between organizations.

COLLABORATION CHANGES BUSINESS
On their journey to reducing carbon emissions, organizations find new ways to partner and collaborate on data, innovating their business models while doing so.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!