
How cross-industry data collaboration powers innovation

Eve Besant
2022-02-18

This article first appeared on Capgemini’s Data-powered Innovation Review | Wave 3.

Written by Eve Besant, SVP, Worldwide Sales Engineering, Snowflake

Innovation doesn’t happen in a vacuum. The development of new products, services, and solutions involves input and information from a multitude of sources. Increasingly, many of these sources are not only beyond an organization’s borders but also beyond the organization’s industry. According to a 2020 research paper on cross-sector partnerships, “Cross-industry innovation is becoming more relevant for firms, as this approach often results in radical innovations.” But developing innovations through cross-industry partnerships must involve coordinated data collaboration. “Firms can only benefit from cross-industry innovation if they are open to external knowledge sources and understand how to explore, transform, and exploit cross-industry knowledge,” the paper’s authors noted. “Firms must establish certain structures and processes to facilitate and operationalize organizational learning across industry boundaries.”

“WE’VE SEEN AN INCREASE IN THE NUMBER OF CUSTOMERS WHO WANT TO COLLABORATE ON DATA FROM OTHER INDUSTRIES TO SPUR NEW IDEAS.”

Examples of cross-industry data collaboration

There is a multitude of examples of how organizations across industries have spurred innovation through collaboration.

  • In financial services, institutions that must prevent and detect fraud use cross-industry data sharing to better understand the profile of fraudsters and fraudulent transaction patterns.
  • In manufacturing, companies are using AI to manage supply-chain disruptions. Using data from external sources on weather, strikes, civil unrest, and other factors, they can acquire a full view of supply-chain issues to mitigate risks early.
  • In energy, smart meters in individual homes open new doors for data collaboration, transmitting information about energy consumption.
  • In education, school systems, local governments, businesses, and community organizations work together to improve educational outcomes for students.
  • In healthcare, during the COVID-19 pandemic, hospitals relied on information from health agencies and drug companies regarding the progression and transmission behavior of diseases. Governments followed data from scientists and healthcare professionals to create guidance for the public. Retailers heeded guidance from the public and healthcare sectors to create new in-store policies and shift much of their business online.

The role of cross-industry data collaboration in innovation during the pandemic is perhaps nowhere better exemplified than in the COVID-19 Research Database, involving a cross-industry consortium of organizations. The database, which can be accessed by academic, scientific, and medical researchers, holds billions of de-identified records including unique patient claims data, electronic health records, and mortality data. This has enabled academic researchers in medical and scientific fields as well as public health and policy researchers to use real-world data to combat the COVID-19 pandemic in novel ways.
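The de-identification step such a database depends on can be sketched as a salted one-way hash of direct identifiers. The sketch below is a minimal illustration in Python; the field names, salt, and sample record are invented, and this is not the consortium’s actual pipeline:

```python
import hashlib

# In practice the salt would be a secret agreed by the consortium,
# never published alongside the data.
SALT = b"shared-secret-salt"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256(SALT + patient_id.encode()).hexdigest()

def deidentify(record: dict) -> dict:
    """Drop direct identifiers, keep analytic fields, coarsen age to a band."""
    decade = (record["age"] // 10) * 10
    return {
        "patient_key": pseudonymize(record["patient_id"]),
        "diagnosis_code": record["diagnosis_code"],
        "age_band": f"{decade}-{decade + 9}",
    }

record = {"patient_id": "MRN-0012345", "diagnosis_code": "U07.1", "age": 47}
print(deidentify(record))
```

In practice, consortiums layer further protections on top of hashing, such as removing quasi-identifiers and suppressing small cohorts, since a hash alone does not guarantee anonymity.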

Best practices for cross-industry collaboration

As the examples above show, organizations that have developed cross-industry data collaboration capabilities can more easily foster innovation, leading to a competitive advantage. Here are some of the considerations and best practices that enable sharing and collaborating on knowledge across industries.

  • A single, governed source for all data:
    Each industry – and indeed, each company – stores and formats its data in different ways and places. Housing data in one governed location makes it possible to gather, organize, and share structured and semi-structured data easily and securely.
  • Simplified data sharing:
    The relevant data must be easily accessible and shareable by all partners. Data is stored in different formats and types, and it can be structured, semi-structured, or unstructured. It can be siloed in specific departments and difficult or slow to move, or inaccessible to the outside world. What processes and tools are in place to transform cross-industry knowledge into a shareable, usable format?
  • Secure data sharing:
    Data privacy is of the utmost importance in today’s society. Data must be shareable securely and in compliance with privacy regulations. Cross-industry data sharing often involves copying and moving data, which immediately opens up security risks. There may also be different data protection and privacy regulations in different industries.
  • Inexpensive data management:
    Data must be shareable without breaking the budget. Centralizing, organizing, securing, and sharing data is often resource-intensive, so organizations need to find ways to manage and share their data more efficiently.
  • Democratized data:
    While data security and privacy are paramount, companies must “democratize” data so that it is accessible and shareable in a way that allows non-technical users, both internal and external, to use it easily.
  • Advanced analytics:
    Technologies such as AI and machine learning can help companies glean deeper insights from data. This requires a data foundation and tools that can analyze all types of data. Technological tools are making it easier for organizations to follow and gain ROI from these best practices.

For example, Snowflake’s Data Cloud enables the seamless mobilization of data across public clouds and regions, empowering organizations to share live, governed, structured, semi-structured, and unstructured data (in public preview) externally without the need for copying or moving. Snowflake enables compliance with government and industry regulations, and organizations can store near-unlimited amounts of data and process it with exceptional performance using a “pay only for what you use” model. They can also use Snowflake’s robust partner ecosystem to analyze the data for deeper insights and augment their analysis with external data sets.

“We’ve seen an increase in the number of customers who want to collaborate on data from other industries to spur new ideas,” Snowflake’s Co-Founder and President of Products Benoit Dageville said, “to foster innovation, to be able to freely collaborate within and outside of their organization, without added complexity or cost.”

The future of mass collaboration

In the future, cross-sector data collaboration will only play a larger role in innovation as technology becomes more ubiquitous and the public grows more comfortable with sharing data. We could see worldwide consortiums that collaborate on data to solve some of humanity’s biggest problems: utilizing medical and scientific information to tackle global health crises, enabling more-efficient use of resources to fight poverty and climate change, and combating misinformation.

Organizations such as the World Bank are already working on such initiatives. Its Data Innovation Fund is working to help countries benefit from new tools and approaches to produce, manage, and use data. According to a recent World Bank blog post, “Collaboration between private organizations and government entities is both possible and critical for data innovation. National and international organizations must adopt innovative technologies in their statistical processes to stay current and meet the challenges ahead.”

To unlock the potential of innovation through data collaboration, organizations must make sure their data management and sharing capabilities are up to date. A robust, modern data platform can go a long way. But what’s also needed is an audit of internal processes and tools to ensure that barriers to data sharing and analysis are not impeding innovation and growth.

INNOVATION TAKEAWAYS

COLLABORATION NEEDS BEST PRACTICES

Organizations that implement best practices in cross-industry data collaboration can foster innovation, leading to a competitive advantage.

DATA CAPABILITIES MUST BE UP TO DATE

Organizations must make sure their data management and sharing capabilities are current, to unlock the potential of innovation through data collaboration.

TECHNOLOGY AND PLATFORMS TO THE RESCUE

Dedicated tools and data platforms make it easier for organizations to gain cross-sector data-collaboration capabilities much quicker.

Interesting read?

Data-powered Innovation Review | Wave 3 features 15 such articles crafted by leading Capgemini experts in data, sharing their life-long experience and vision in innovation. In addition, several articles are in collaboration with key technology partners such as Google, Snowflake, Informatica, Altair, AI21 Labs, and Zelros to reimagine what’s possible. Download your copy here!

2022 Key trends in Tax

Simon Pearson - VP, Global Tax and Trade
2022-02-18

In many ways tax authorities must become disrupters and innovators to keep pace with changing user expectations and the opportunities enabled by adjacent industries such as retail banking, fintech, payments and connected supply chains. Using advances in intelligent industry, digital, data and cloud will make tax much easier to administer for businesses, citizens and tax authorities alike.

As governments scramble to respond to the enormous challenges facing societies, economies and our planet, speed and agility are now essential attributes for public authorities. During the pandemic, national treasuries often had to set aside traditional structures and processes in order to release the huge sums of money so urgently needed to maintain social cohesion.

In turn, many tax and customs authorities are transforming too, embracing new and innovative ways to keep essential tax revenues flowing that respond to changes in society and to the financial imperatives of the health and climate emergencies, while maintaining security and compliance.

Digital technologies, data and cloud are providing the transformational tools required. Automation and AI are replacing manual processes, producing more agile, service-driven organizations, able to meet customer demands for convenience, speed, and ease of use. Data and analytics are informing decision making and financial planning, as well as nudging citizens towards the right behaviors, while helping with their entitlements and obligations. Skilled tax professionals are becoming active change agents, creating more flexible and technology-enabled tax regimes that help drive key social, economic, and environmental policies.

1. Building trust and security will help transform tax authorities’ place in the economy and society

As tax authorities continue their fightback against cyber criminals by bolstering their defenses with increasingly robust and sophisticated cybersecurity measures, not only are they protecting vital national resources and infrastructure, but they are also building that priceless commodity – trust.

Trust is a critical component in the ongoing evolution of tax authorities, from enforcers to business enablers and active participants for good, providing the resources that deliver governments’ key social, economic, and environmental policies.

Trust can be truly transformational in the tax world. When people trust their tax authority, they are more likely to pay their taxes in full and on time. When citizens feel that their tax system is fair, secure, transparent, and operating in the best interests of society, they are more likely to share their data; more likely to adopt digital processes and modern payment mechanisms; and more likely to use technologies such as cognitive care when they need support.

In these circumstances businesses are more likely to see tax authorities as potential partners, participants in rich data ecosystems, collaborating and sharing information on their tax affairs while bringing benefits to society by tracking ethical practices such as the living wage or adherence to modern slavery legislation. This extends a tax authority’s reach far beyond its traditional role.

As these new relationships – and the trust that sits at their core – become established and grow, so that spirit of collaboration can extend throughout economies and societies, driving sustainable economic growth, supporting businesses, and achieving social responsibility goals.

Enhanced cybersecurity has also enabled tax authorities to successfully embrace hybrid working during COVID-19, at a time of unique risk and vulnerability, with criminals eager to exploit any loopholes as public sector organizations scrambled to formulate their responses to the pandemic. This must remain a focus as criminals become more inventive in their exploitation of weaknesses, with great emphasis on supporting and protecting users in their critical tasks through education, new processes and technology enablers.

2. User-centric products and services, combined with technology, will drive participation

Today’s customers, whether consuming services from their mobile phone company, clothing retailer or tax authority, expect a fast, frictionless, and personalized multi-modal digital experience, informed by an understanding of life events and, in the case of their own tax situation, precise information about tax obligations and entitlements.

In 2022, the drive for hyper-personalization will accelerate, with tax authorities adopting best practices from across the economy to apply user-centricity to all stages of the customer journey, to increase trust, confidence and compliance with tax laws and obligations, while also reducing the need for costly agents and accountants.

Digitally native customers will adopt self-sovereign data practices, ensuring that the data that tax authorities hold on them and their businesses is accurate, while also deciding who else they wish to share it with. This will give rise to new forms of data sharing and consent across geographical boundaries, facilitating ease of movement, and also improving overall tax compliance by making pre-populated tax returns and payments an easy process. In 2021 the UK’s HMRC launched the world’s first public sector Open Banking payment initiation system, enabling payments to be made directly from bank payment accounts to payee bank accounts, without the use of cards.

Meanwhile, advances in mobile technology, 5G and edge computing will enable more media and AI-enabled applications for tax administration to become available, serving the needs of all taxpayers, but in particular younger people for whom smart devices are instinctive and the default choice. By providing a feature-rich user experience, new taxpayers can be better informed about the role of tax in society and be confident to manage their tax affairs and share their data from the palm of their hand.

3. Data sharing and data sovereignty will deliver choice and control

Real-time data will determine tax obligations and welfare entitlements at the point of transaction, driven by closer integration with customers’ third-party applications and by voluntary compliance through integration with their banking and platform lives.

The importance of real-time data will be amplified across Europe as e-invoicing and VAT standards are mandated, enabling both more accurate data capture and AI-driven repayments, based upon risk and provenance. This will promote stronger economic activity with greatly reduced friction.

Combining rich data from Open Banking, payment and other third-party data will allow AI and pattern recognition to enable early identification of business vulnerabilities, allowing customers to declare their risks, seek support and prevent unrecoverable business debt and individual hardship. Early warnings will enable tax authorities to make better decisions about compliance and debt management interventions as early as possible.

Meanwhile the use of common data spaces and ecosystems, driven by standards in Open Finance, will allow tax-related data to be shared, with consent, to recover tax in a more transparent and frictionless manner. There will also be an ongoing focus on closed ecosystems sharing critical, cross-border financial information to close gaps in financial crime and tax evasion.

4. Demographic shifts will produce growth in indirect taxes – and automation and AI

Demographic studies reveal growing social and economic challenges facing industrialized nations, caused by rapidly aging populations. The UN predicts that the number of people over 65 years of age will more than double, from 727 million in 2020 to over 1.5 billion by 2050.

Among the many consequences of this trend are a reduction in the working-age population, rising healthcare and pension costs, and increased demand in the economy for products and services for older citizens. As a result, 2022 will see governments, through their tax authorities, continuing the trend towards more indirect tax regimes, where citizens will pay for the things they use and the assets they own, rather than contributing to national budgets through income or business taxes.

At the same time, similar effects are being experienced by tax authorities themselves as older, skilled and experienced tax professionals retire, with lower numbers of experts available to replace them.

Here, 2022 will see further extensions in hybrid, more sustainable working models and intelligent industry techniques, using data to allocate tasks to the most appropriate resources, deploying automation, AI and collaboration tools to enhance productivity, reduce errors and enable smaller teams to work on higher value tasks.

5. Tax will be increasingly used to drive consumer behavior

Humanity must achieve the most fundamental change in its behavior, in the shortest period of time in its history, if Net Zero 2050 is to become a reality. Although by common consent we’re starting to fall behind in the race to Net Zero, even at this late stage, all is not lost.

By redoubling our efforts and taking fast, effective and coordinated action, the line on the graph can still be reset to the required trajectory, towards the 2030 targets that we must hit to achieve 2050.

To achieve the mass consumer participation that brings Net Zero into range, more and more products and services produced by sustainable means must be affordable, easy to access and simple to use, for the overwhelming majority of consumers. Currently, uncompetitive prices, lack of availability and perceived complexity are still pushing too many consumers in the direction of high-carbon, unsustainable solutions.

In 2022, tax authorities will have an increasingly important role to play in enabling more and more consumers to contribute to the global effort, deploying a variety of tax policies to encourage citizens to make the vital personal changes in lifestyle and purchasing decisions – electric cars versus petrol or diesel power for example – that are essential if we are to deliver a brighter future for all.

Further reading

For information about Capgemini’s tax and customs services, visit here.

Our look at 2022 trends in tax and customs was compiled in conversation with:

Simon Pearson

VP, Capgemini, Global Tax and Trade Cluster Leader
“While tax brings essential funds to economies, compliance depends on the perception of tax justice. Authorities must ensure fairness by closing the tax gap and bearing down on the non-compliant. This is both a national and cross-border issue and tax authorities are recognizing the value of data sharing and tackling new forms of evasion with innovative detection capabilities.”

    Next steps towards the total hybrid experience

    Capgemini
    2022-02-17

    The news that hybrid working is the future of work surely comes as music to the ears of many. For office-based workers, it means a flexible start each morning, or the opportunity to make the most of ‘digital nomad’ visas. Organizations have benefited too, with as many as 70% of them seeing improvements in productivity and reductions in facility-management costs. More importantly, hybrid working bodes well for diversity and inclusion.

    This rapid shift to hybrid working means employees want to be able to work seamlessly across locations, devices, and on the move. Organizations now need to revisit their digital operations model and address the technical and operational debt accrued over the years. With the fundamental infrastructure in place, it is imperative that organizations take the next steps to deliver the Total Employee Experience.

    In 2022, there are two aspects to hybrid workplace leadership: optimizing the base and preparing for transformation. Here we look at key trends that we expect to accelerate this year.

    Narrowing the ‘Digital Dexterity’ Gap

    Digital dexterity denotes the ability of an employee to take full advantage of the technology at hand. In plain language, can I use the tools I’ve been given to their full capability? Or is it proving a challenge? The answer to these questions is a bit tricky, as we have four generations in the workplace – Baby Boomers and Generations X, Y, and Z – with differing technology skillsets. Failure to manage this disparity and maximize capabilities directly impacts productivity, and with more connected technologies entering the workplace this year, issues will only snowball if left unaddressed.

    Narrowing the gap starts with training. Employing external trainers or utilizing in-house IT teams to run employees through the ‘whats’ and ‘hows’ might sound obvious, but few organizations have done it. Productivity tools like Microsoft Teams are more than messaging services – they are platforms for a more efficient, collaborative way of working. Optimizing familiar programs is essential; introducing productivity augmenting capabilities with automation and emerging technologies is the next step.

    Patching Vulnerabilities

    With the introduction of IoT-connected workplaces and with many employees working from home, data ecosystems are getting larger and more complex. Both multiply the entry points for cyber attackers, making robust cybersecurity more important than ever.

    With a lens on addressing new vulnerabilities, leaders must evaluate how this will affect the employee experience. To address this, we must step away from bolted-on security towards built-in capabilities while ensuring partners and supply chains are protected. The raised threat level outstrips the capabilities of small or in-house security providers, and so leaders must engage with larger, global organizations to manage security. This does not spell the end of in-house security but should accelerate its evolution into a more employee-centric service.

    Achieving True Flexibility

    We are much closer to understanding what flexibility means to employees. It’s the freedom to work wherever, whenever, and however you can be most productive and attentive to your personal wellbeing. Leaders must make informed decisions about the degree to which this is offered, but what is clear is that flexibility is desired and desirable. A recent study found that flexible working delivered a £37 billion-a-year boost to the UK economy and with 50% more uptake, its value could rise to £55 billion.

    The next step is operational flexibility. We are already seeing a shift in how customers engage with services. Agile contracts from niche service providers are increasingly in demand as they enable organizations to respond faster – as opposed to being weighed down by locked-in, long-term commitments. The surge of new providers delivering this ‘Netflix’ model will be important in addressing industry-specific needs by offering verticalized solutions.

    Augmenting Inclusivity and Sustainability

    Gartner Research into hybrid working has found that it can boost inclusion by 24%. The technology keeping us connected has enabled organizations to widen the pool of candidates by extending geographical range. This reflects the importance of delivering a workplace experience that goes beyond simple connection. Increasingly, employees expect consumer-grade experiences that are both personalized and intuitive to their needs.

    Technologies such as Virtual Reality and Augmented Reality can be used to enhance training, collaboration, and recruitment. Engaging in simulations of day-to-day interactions, for instance, is an effective way to inculcate a culture of diversity and inclusion.

    Digital adoption is and will be fundamental to delivering this. Therefore, it is important to design a workplace that interconnects technology with people and workspaces. If, for example, an employee’s laptop is coming to the end of its lifecycle, new predictive technology should be able to anticipate this and replace it before it malfunctions. The ‘self-healing’ workplace is just one of many ways new technology can be leveraged to make employees feel more valued and engaged. In addition, AI and data-driven technologies can automate workloads, freeing up employees’ time to innovate or engage in tasks higher up the value chain.
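    The predictive replacement idea described above can be illustrated with a toy end-of-life check. The thresholds and signals below are invented for illustration, not a real workplace-analytics product:

```python
# Toy end-of-life predictor for fleet laptops: flag a device for
# proactive replacement when its projected remaining life drops
# below a buffer. All numbers here are illustrative assumptions.

EXPECTED_LIFE_MONTHS = 48
REPLACE_BUFFER_MONTHS = 3

def needs_replacement(age_months: float, battery_health: float) -> bool:
    """battery_health: 1.0 = new, 0.0 = dead; degradation shortens projected life."""
    projected_life = EXPECTED_LIFE_MONTHS * max(battery_health, 0.5)
    return projected_life - age_months <= REPLACE_BUFFER_MONTHS

print(needs_replacement(age_months=46, battery_health=1.0))  # True: within the buffer
print(needs_replacement(age_months=20, battery_health=0.9))  # False: plenty of life left
```

    A production system would learn these thresholds from fleet telemetry rather than hard-coding them.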

    As well as inclusivity, we will see sustainability goals accelerated by hybrid working and new technology this year. We at Capgemini have a two-fold ambition of delivering a net zero future to our clients as well as within our group. By leveraging the right workplace technology and tools we seek to help our clients save 10 million carbon tonnes by 2030. As a group, we have an internal target of becoming carbon neutral by 2025 and net zero by 2030. Global carbon emissions dipped by 7% in 2020, attributed to reduction in business travel and lower office energy bills for electricity, heating, and cooling. With the right Digital Workplace tools and platforms, there’s no reason why this trend shouldn’t continue.

    Putting Experience in the Driving Seat

    At Capgemini, experience drives everything. This might once have been misconstrued as fluffy, but the last two years have shown us that connecting employees is far more nuanced than providing essential technology. Experience is a vital measurement of how people interact with what they are working with and how it interacts with them. We are investing in technologies such as the Metaverse, quantum computing, and applied innovation to design a future-ready workplace that delivers frictionless, consumer-grade experience for all. The global pandemic has taught us the importance of resilience and business agility. Therefore, it’s fundamental that we optimize the base and secure the workplace so that we can accelerate on a firm footing. The future workplace is responsive to human emotion, motivates employees to adopt technologies, is inclusive by nature, and foregrounds agility.

    Looking to enable hybrid working in your organization? Visit our website to learn more about our connected employee experience offer, or get in touch with our experts.

    Author


     Alan Connolly
    Global Head of Digital Workplace Services, Cloud Infrastructure Services

    A deeper level of personalization is the new strategy

    Capgemini
    2022-02-16

    What does it mean to be a customer-first brand to your consumers? It means making everything feel personal, from the products customers buy to the services they use – and from the way products are designed to how employees speak to customers. Customer-first makes the painful feel painless, the complex seem simple, and every moment feel intuitively right. When customers feel valued, the brand feels valuable to them.

    Knowing what your customers want is the key to successfully becoming a customer-first brand – and the answer is in the data. CMOs are leaning in heavily to maximize these important insights and, while it has never been particularly easy to predict what customers will think and want at any given moment, taking steps to closely evaluate historical patterns and trends makes these predictions far more accurate.

    Personalization initiatives have been incorporated into marketing strategies for many years. However, consumers are now demanding nuanced, hyper-personal interactions from their favorite brands. Forced to play a never-ending game of catch-up based on the latest news cycles, restrictions, and publicly available health information, brands are still struggling to keep pace with changing consumer behaviors.

    However, a deeper layer of personalization can solve this problem. Here are four emerging marketing trends that can help brands activate a customer-first strategy and deliver a more personalized experience.

    Real-time analytics for evolving purchasing patterns

    Consumers are disrupting marketing spending and strategies, and brands are responding with real-time analytics to monitor and align with their emerging behaviors. The latest technology platforms can determine automated next-best actions when real-time interactions take place, such as through two-way conversations on social media or other service channels. This level of sophistication requires individualization to optimize the customer journey – but it must be used carefully to ensure customers don’t push back. Marketers who take a thoughtful approach to real-time interactions will break through the noise and win loyal customers with perhaps the most useful benefit they can offer: relevance.

    Micro-segmenting at scale

    Recent advances in data and marketing tools are enabling brands to embrace micro-segmenting – a trend that identifies very specific persona groups to convert into new customers by delivering hyper-targeted messaging and content at key moments in their shopping journeys. Data-gathering technology helps organizations see results quickly and easily shift strategies if a certain type of content doesn’t resonate with the audience. Social-listening tools are also being utilized to unlock behavior trends and measure brand sentiment. Micro-segmenting is still under-leveraged but is ripe with opportunities to drive conversions and build loyalty among customers who haven’t previously made purchases with a particular brand.
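    As a loose illustration of what micro-segmenting looks like in code, the sketch below assigns customers to hypothetical persona groups with hand-written rules; the persona names and profile fields are invented, and real systems typically learn segments from behavioral data rather than hard-coding them:

```python
# Hypothetical rule-based micro-segmentation: each rule pairs a persona
# name with a predicate over a customer profile. First match wins.

SEGMENTS = [
    ("lapsed-premium", lambda c: c["lifetime_spend"] > 1000 and c["days_since_order"] > 180),
    ("new-browser",    lambda c: c["orders"] == 0 and c["site_visits_30d"] >= 3),
    ("loyal-regular",  lambda c: c["orders"] >= 10),
]

def segment(customer: dict) -> str:
    """Return the first matching persona, or a default bucket."""
    for name, rule in SEGMENTS:
        if rule(customer):
            return name
    return "general"

customer = {"lifetime_spend": 1500, "days_since_order": 200,
            "orders": 12, "site_visits_30d": 1}
print(segment(customer))
```

    A hyper-targeted campaign would then map each persona to its own messaging and timing, and the rules would be revised as results come in.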

    CMOs embracing data and storytelling

    CMOs recognize that they must be increasingly nimble with their approaches. That includes how they leverage data to make decisions and support the business – being open to small experiments without abandoning traditional marketing tactics. This fine balance leads to a greater prioritization of knowing their customers by combining data and storytelling. Naturally, CMOs are looking to better understand the most effective approach to connect with customers – such as the right marketing channel, the best time for outreach, the products to promote, etc. But the best data still won’t make a strong impact if the insights aren’t integrated into an effective story or experience that resonates with customers.

    Data-Driven Customer Experiences

    Organizations need to have the right capabilities in place to merge experience data from devices and channels with enterprise data to activate 1-on-1 experiences. Previously, most organizations focused on collecting enterprise data and used it to build their analytics. But to create effective 1-on-1 experiences with customers, they need to collect, store, and process experience data – coming from the customer’s browsing pattern, individual preferences, and time spent. Harmonizing and orchestrating experiences with enterprise data becomes critical in helping to drive consistent experiences across the entire customer journey of content, product, services, and experiences. This allows AI models built with both experience data and enterprise data to enable experiences that make customers feel like a brand truly knows who they are and understands them. Combine that with a commitment to data privacy and trust, and customer loyalty forms through a seamless experience that’s safe and reliable.
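    The merge of experience data with enterprise data described above can be sketched as a simple key-based join. The records and field names below are invented for illustration:

```python
# Joining experience data (browsing events) with enterprise data
# (account attributes) on a shared customer key, in-memory.

enterprise = {
    "c1": {"segment": "premium", "lifetime_orders": 14},
    "c2": {"segment": "new", "lifetime_orders": 1},
}

experience_events = [
    {"customer_id": "c1", "page": "running-shoes", "seconds": 42},
    {"customer_id": "c2", "page": "running-shoes", "seconds": 7},
]

def enrich(events, accounts):
    """Attach enterprise attributes to each experience event."""
    return [{**e, **accounts.get(e["customer_id"], {})} for e in events]

for row in enrich(experience_events, enterprise):
    print(row)
```

    At scale this harmonization happens in a customer data platform or warehouse, but the principle is the same: a shared customer key ties behavioral events to enterprise attributes so downstream models see both.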

    While the past two years have been an enormous challenge for CMOs, the increased inventory of consumer data has created opportunities for never-before-seen levels of personalization and focus, and a customer-first experience that’s being fine-tuned in real-time. Marketers have always been adaptable in the face of change but focusing on micro-segmentation, real-time analytics, and the combination of data-driven customer experiences and storytelling will put them in the best position to succeed in this ever-changing marketplace.

    How do you ensure you can trust software?

    Capgemini
    2022-02-16

We go about our everyday lives without fully understanding the systems that keep us safe and secure. That is because safety is vested in software. Gone are the days of knowing how your car works. Modern vehicles rely on around 100 million lines of code to make them work. Of course, most of that software runs the navigation and entertainment systems and heats the seats. However, some of it is tasked with keeping us safe, such as making sure the braking and engine management systems are working.

    But how can we know for sure that the software is doing its job? The short answer is, we can’t. So we have no choice but to trust the people who designed and built it.

    Moving critical functionality from hardware into software is a well-trodden path as complex industries mature. Arguably, the aviation industry was the original pioneer with fly-by-wire systems as far back as the 1960s. It also drove early work in programming language design and international software quality standards.

    A rocky road

    Of course, the pace of change has accelerated in the last fifty years. Today’s critical software functionality that ensures your safety and security is not just in planes and cars. It is in medical devices, home automation, the electrical grid, gas meters, etc.

    The transition to trusted software has not always been smooth. Consider these three examples:

• Between 2010 and 2014, over 700 UK postal service employees were prosecuted – and some jailed – for fraud because the new computer system was adding up the numbers incorrectly
    • Toyota’s well-documented US safety issues with its car braking system in 2014 resulted in a $1.2 billion criminal penalty
• In 2018, the UK National Health Service reported that computer software kills between 100 and 900 people a year in the UK alone

    The fact that software plays a crucial role in keeping us safe is not disputed. Yet cases like these give the perception that software cannot be written without bugs. It’s just too hard to make it work all the time. Consumers still get told to “turn it off and on again” to clear the problem. And for companies with software problems, this excuse can be the screen they hide behind.

But it’s a myth – a false perception that needs correcting. For many reasons, the software industry has not led the way in correcting it. Now, as the examples above show, law courts around the world are challenging best practices in software production.

    The way forward

It has been argued that a solution to this problem is to make all source code “open” and thus available for independent scrutiny. Consider the Heartbleed bug in the security library used for online internet transactions. Despite the library being used by millions of people, the bug went undetected in open-source software. This shows us that just because something is open, it’s not necessarily bug-free. Software quality is orthogonal to software ownership, visibility, or business model.

    The good news is that we know how to use engineering discipline and rigor to write correct software. Software that you can rely on to keep you secure and alive. When automotive companies say it is inevitable that autonomous cars will have many software faults, I point to air travel. Air traffic control, autopilot, automatic landing systems, etc., perform their jobs daily with minimal fuss because the software does exactly what it was designed and written to do.

    The Capgemini Engineering approach

Capgemini Engineering has over 35 years of experience building software systems for demanding can’t-fail environments across industries as diverse as air traffic control, aircraft avionics, defense, railway signaling and train control, nuclear and renewable power generation, and banking and finance. The common theme across these sectors is that whatever the function, and whoever the end user, the software must work first time, every time.

    Cross-pollination of good ideas from one industry to another is an integral part of the solution, with aviation leading the way. Using the right processes and tools is equally important. It is essential to train and empower staff to understand the implications of the software they write and their responsibilities.

    Into the future

    As manufacturers realize the technical complexity of their products has moved from hardware to software, they will need to step up their game. But, for industries with little regulation, consumers will drive accountability through the courts.

Rather than starting from square one, manufacturers can look to the aviation industry for guidance. No other industry has comparable depth and breadth of experience in producing high-quality software. We should not be shy about our achievements. We know that applying engineering discipline to software development produces a reliable product.

    The hard reality today is that software is responsible for our safety. So, we need to make sure we build the software correctly. The aerospace industry has a lot of mature processes, tools, and culture that it can share with sectors that are new to the challenge of producing safety-related software.

    Capgemini Engineering is proud of its track record in taking these tools and processes from aerospace and adapting them to work in other industries.

    Neil White

    Author: Neil White, Director, High Integrity Software, Capgemini Engineering

    Neil has over 25 years of experience building software systems for environments that cannot tolerate failure. He has worked in industries as diverse as aviation, power generation, railway infrastructure, defense, and banking security.

Data masters in action: Unveiling the chief data officer’s most wanted asset

Capgemini
    2022-02-16

    A Q&A with Marc de Forsanz, Global Head of Customer First, Insights & Data; Padmashree Shagrithaya, Global Head, Analytics & Data Science; and Naresh Khanduri, Vice President, Digital Customer Experience at Capgemini

A happy customer is a repeat customer – and a repeat customer results in higher revenues, larger profits, and lower costs for customer acquisition and retention. A happy customer also feels safe about transacting with the organization, and such customers deliver greater lifetime value to the enterprise. Every business knows this, but improving customer satisfaction – staying relevant and meaningful, aligning with customer values, and appealing to their beliefs – is an ever-challenging goal.

Marc de Forsanz, Padmashree Shagrithaya, and Naresh Khanduri are responsible for Capgemini Data-driven CX – an AI-powered offering that helps companies deliver next-level customer experiences. They discuss the need for Chief Data Officers to help organizations manage the customer journey seamlessly and in real time across a range of channels, and how Data-driven CX builds on the work that many companies have already undertaken with customer-data platforms.

    What’s the elevator pitch for Data-driven CX?

de Forsanz: Data-driven CX enables clients to take full advantage of related customer data (transactional, behavioral, product) from all channels to impact customer acquisition, retention, and advocacy in real time. In short, it’s an AI-augmented customer-data ecosystem. Customers interact with brands through a variety of channels – including marketing, ecommerce, customer service centers, and in-store visits. Data-driven CX aims to enhance the experience of customers across all channels through data harmonization and activation with AI and ML. Specifically for the Chief Data Officer, it ensures the data is high quality so there’s trust in the data, and that the data is collected, stored, and used in compliance with all relevant privacy laws and regulations.

    How much of an issue is this, really?

de Forsanz: Customer data is both strategic for a company (the scheduled end of third-party cookies, the shift to first-party data) and complex to manage (multiple data sources, data quality, PII topics, legal regulations). Whether the starting point is marketing or broader areas such as e-commerce and customer service, it is essential to partner with a firm that is versed in turning data strategy into a competitive advantage. Capgemini’s own research has shown there’s plenty of room for improvement. To highlight a couple of challenges, 57 percent of marketers we asked admitted they’re missing important data points required to obtain a full view of their customers. And only 45 percent of firms believe they have the data they need to understand the connection between online points of contact and in-store behavior.

    How is Data-driven CX different from a customer-data platform?

Shagrithaya: Our Data-driven CX shapes the customer experience and drives analytics and engagement. It harmonizes the data about each customer. But it also makes that data contextual. Many of our clients have walked the path of building what’s known as a Customer 360 – but despite this, they find they’re not able to engage with the customer in a meaningful way. Too often, the approach is to gather as much information about the customer as possible, whether or not it is relevant – and whether or not it intrudes on their privacy. Data-driven CX combines knowledge of the customer with the contextual information the company needs to deliver the most satisfying experience through responsible personalization. So, for example, if a customer visits a company’s ecommerce site because they have recently bought something, the context is “Why are they visiting?” Are they looking for a service, or for an add-on for the product, or for a different product entirely? Understanding their intent within context and then servicing that intent is important because it generates a positive outcome for the company and provides a meaningful experience to the consumer.

    How does AI help with this?

    Khanduri: AI is all about understanding the customer’s intent. Behavior on the channel should be driven by the insights generated by the AI engine, not by some rule that somebody came up with because “it feels right.” Data-driven CX allows enterprises to examine each customer’s data (with their consent) and learn from it, and then let the AI help with the next interaction with that customer. It allows companies to anticipate their customer’s intent based on data, not gut feelings. The outcome is that the customer has a better experience – they believe that the organization truly understands their needs – and they become more loyal to the brand. That’s the ultimate goal – a loyal customer who advocates for the brand.

    And that’s an outcome with many financial advantages for a company…

    Khanduri: Exactly. From the perspective of engaging with customers, the biggest investment a company makes is the cost to acquire that customer in the first place. Companies invest in traditional and online advertising, marketing campaigns, search-engine optimization, and so on – all to encourage that customer to make their first visit to a channel. And today, if the chosen channel does not deliver a personalized, unique experience, the company can lose that customer – often, for good. The bottom line is, it’s more cost effective to serve a customer you already have than to acquire a new one – and Data-driven CX is designed to help companies build that loyalty.

Shagrithaya: It can also reduce the total cost of operation for the enterprise. Some companies have implemented a customer-data platform but have not thought through the underlying questions of how to treat data from customers coming in through multiple channels in a consistent and coordinated manner, or how to activate it through appropriate AI/ML algorithms. By not addressing these questions up front, they have made the whole process more expensive than it needs to be. Data-driven CX helps bring those costs under control by providing the CDO with the tools to properly manage all aspects of their company’s data.

    How does a company get started on this journey?

de Forsanz: Data scientists, data analysts, and business users need reliable customer-related data to build analytics, AI use cases, and an optimal omnichannel experience. Every story begins with the use cases that will drastically increase the performance of a department – or, when the vision is shared, the whole enterprise. I encourage CDOs to identify their pain points with customer data right now. They should identify which departments are using customer data, how they’re using it, what they would like to do with it, and whether they’re achieving those goals.

    Khanduri: We try to determine the use cases they want to activate – but also encourage them to adopt a holistic picture of their needs. They can start with a single business case – but at the same time, they should be planning to support other potential cases. If they’re trying to improve a single use case but not planning for others, they will not be able to solve all of their business challenges. For example, if they’re trying to convert sales leads into sales, they should also be looking at how they will improve after-sales service for those new customers.

    As they look to enhance their customer experiences with a solution such as Data-driven CX, what can CDOs do to maximize their success?

    Khanduri: One thing that has always surprised me is that few people actually ask, “Do I have the right data?” They’ve collected whatever data is available and they’ve created a customer profile with it, but they haven’t actually figured out if it’s the data they need to achieve their business objectives. I encourage CDOs to have that conversation with those who use data in their company – to examine the data they have in the context of their business objectives and the experiences they want to deliver to ensure they are collecting the information they need.

Shagrithaya: CDOs sometimes look at what they’ve already done and wonder why it’s not working for them. They’ll point out to me that they’ve invested in CRM, they’ve invested in Customer 360, they have a customer-data platform – and they were told this was all going to help their sales team – but they’re still not able to move the needle on their KPIs, for example, increasing the lifetime value of their customers. That’s a common starting place when we’ve had conversations with potential clients. We typically start with an architecture review to understand the existing landscape and propose an appropriate end-state architecture, with existing investments in mind, so that our clients need not shelve their current investments altogether to move to the new platform.

    de Forsanz: That’s one of the things Data-driven CX does really well. It builds a perfect asset for the CDO to manage customer data, so all departments in the enterprise can unlock the value in the company’s data.

    Authors

Padmashree Shagrithaya

In a diversified career spanning over 25 years, she has crafted and led multiple large and complex transformation programs delivering strong business outcomes for many clients, leveraging data, technology, machine learning, and artificial intelligence.

    Marc de FORSANZ

Marc has solid knowledge in digital – website development, CRO, UX/UI design, brand content, and traffic generation (Google SEO certification, Openclassrooms SEA certification, blogger outreach). He is also well versed in data science and machine learning capabilities.

    Naresh Khanduri

In his current role as Data-Driven CX Lead, he helps clients maximize and scale business value across CX channels. He specializes in combining experience data with enterprise data and applying advanced analytics and artificial intelligence to build immersive experiences.

    Sharing (data) is caring

    Pierre-Adrien Hanania
    2022-02-15

    Conversation with Pierre-Adrien Hanania, Global Offer Leader Data & AI for Public Sector

    Though there are legitimate reasons why governments tend to be cautious about making big changes in their storage and processing of data, it’s still possible – and even critical – to find ways to break the logjam.

“It’s definitely slower than other sectors because the innovation is less rooted in the structures. Indeed, the right data culture is key here, given the sensitivity of the data. In many cases we are dealing with critical data such as patient data and social benefit data, and sometimes we are dealing with security-relevant data.”

In order to thrive on openness and make knowledge available, the public sector has made big strides in rolling out Open Data initiatives. Particularly in Europe, Open Data portals are providing third parties with ready access to government data – and with this they are changing the digital culture of organizations and how they leverage information. While this is a first step, it’s a common misconception to equate Open Data with data ecosystems.

    “Open Data is one part of the ecosystem, but it’s only one part.”

In many data ecosystems, the data is not necessarily open to the public. Instead, data ecosystems utilize advances in security and partitioning to allow private and non-public data to be safely shared between citizens, businesses, and governments. Whether that data is about transportation, health, education, or social services, data ecosystems enable more and richer data to be shared. In doing so, the data ecosystem allows different stakeholders to collaborate for the first time. The data is put into a common virtual playground where they can all leverage it.

“Data sharing in the public sector – a special kind of business.”

    Often the difficulties start with the sensitivity of the data amassed by public sector agencies. This creates hurdles around finding the right ways to anonymize data and navigate privacy rules. But it also means finding ways to talk about the value of data ecosystems that are distinct from those that apply to the private sector.

    “Whenever we talk on things around data, I read about the topic of data monetization. But of course, when we deal with the public sector, we cannot approach it the same way. Because we’re not talking about the business only, we are often simply talking about rights and duties in regard to citizen engagement and societal mandates safeguarded by governments.”

    The public sector needs to identify the distinct motivations that apply to its needs and obligations. Where executives in the private sector might be driven by the specter of competition, governments aren’t necessarily facing rivals.

Instead, the catalysts are more civic-minded. Data ecosystems are a way for governments to develop better relationships with citizens by leveraging their data. Through advances in UI, accessibility, and speed, consumers are being subtly trained by the private sector to expect and demand high-quality experiences in their digital interactions and transactions – whether for tax collection, homecare journeys, or cross-border experiences. This includes robust personalization and responsiveness. When they encounter unintuitive government websites that offer few features, a poor process flow, and little flexibility, they are bound to start grumbling.

    “When we use apps on our phones to buy shoes or pick an Uber for going home, that experience sets the standard. So, then you try changing your address for tax collections and it’s frustrating that the experience is so poor and complicated.”

    While governments can’t necessarily measure outcomes by the bottom line, they can still be held accountable for costs and efficiencies. In that respect, data ecosystems can help reduce operating expenses through consolidating infrastructure spending.

    Sharing is winning – but what exactly?

In the field of insights, data sharing can also improve security. Bad actors are increasingly using tools like AI to commit fraud even as the good guys are using it to improve products and services. When it comes to fraud detection, data sharing enables the public sector to react more quickly and collectively to threats that know no borders – and to draw on the richer analysis that comes from expanding the pool of data shared with a variety of public and private partners.

And it’s not just about improving operations. Data sharing can make it easier to find the right and relevant information for both government employees and citizens. Indeed, this can be a way to improve insights and the way governments serve the public.

For instance, bringing together different stakeholders’ perspectives via data enables greater personalization. If several actors from different geographies and structures take part in a process, it provides a 360-degree view that allows for richer solutions. Imagine a government offering intelligent job matching powered by data from the national employment agency, a specific job-providing organization, and dynamic market data. Job seekers and employers would be far better served.

Consider, for example, a hospital having trouble filling job openings because it lacks information on candidates. Joining a skills data ecosystem that evolves dynamically with the market, together with the unemployment agency, would be a natural way to leverage the shared data to help the hospital and recruits connect more efficiently – with a focus on available skills rather than only on long-term schemes.

The path to achieving these breakthroughs starts with a full inventory of data. Once they have achieved real data mastery, these agencies can turn outside and start connecting with other organizations in the public sector.

    Government agencies must also be clear about any data compliance issues that determine which types of data they can leverage. While there can be limitations, particularly on data that identifies individuals, the good news is that technology is available to help address many of these issues through anonymization techniques such as synthetic data creation, differential privacy and zero knowledge proof concepts.
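To make one of these anonymization techniques concrete, the sketch below illustrates the Laplace mechanism, the standard building block of differential privacy: an agency publishes an aggregate statistic (here, a simple count) with calibrated random noise, so that no individual's presence in the underlying data can be inferred from the published figure. This is a minimal, hypothetical illustration of the general technique, not part of any specific Capgemini solution; the function names and parameters are our own.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample from a Laplace(0, scale) distribution via the inverse CDF
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the result by at most 1, so the noise scale is 1/epsilon
    sensitivity = 1.0
    return true_count + laplace_noise(sensitivity / epsilon)

# Hypothetical query: "how many residents in district X receive benefit Y?"
# The agency publishes the noisy value instead of the exact count.
random.seed(42)
print(private_count(1280, epsilon=0.5))
```

The key design choice is the noise scale: the query's sensitivity (how much one person can change the result) divided by the privacy budget epsilon, so stronger privacy (smaller epsilon) means noisier published counts.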

    In data culture we trust!

Beyond the technological part, culture can often become the most crucial element of the transformation.

The public sector can address this in part through training, education, and upskilling of employees and managers. It helps to show them real-world examples of other agencies that have used tools like AI to reinvent their services and their relationship with citizens. It also puts them in the driver’s seat, acknowledging what potential they want to activate with the available data – and what use case fields they don’t want to explore, due to technical or cultural decisions and values.

In general, the first discovery steps in data ecosystems should be combined with a cultural change plan – for instance by appointing a Chief Data Officer, or by setting up re- and upskilling training to involve public servants in this digital journey. Capgemini has recently helped the German Federal Agency for Migration and Refugees take this a step further by helping to build a Data and AI competency center that allows managers to assess the maturity and potential of data cases they may want to pursue.

    Capgemini embraces data sharing across geographies – and puts the citizen at its center

In other areas, data ecosystems supported by Capgemini have taken many forms: in healthcare with the French Health Data Hub, in the UK with the Data & Analytics Facility for National Infrastructure (DAFNI), and at EU level with the European Open Data Portal.

    Many of the components a data ecosystem must bring to the table are coming together in an ambitious Smart City that Capgemini is leading with the city of Dijon, France.

Capgemini helped design and build a new digital platform for Dijon that connects citizens, utilities, and businesses with government services such as waste removal, street cleaning, smart parking, and traffic regulation. The project was a true collaboration between partners such as SUEZ and EDF, who helped the city administration and Capgemini make the best use of a platform nurtured by a multitude of data endpoints.

    One of the keys to success was indeed federating the data involved by also including everyone from big utilities to local government to startups to citizens. Specifically, citizens can benefit from the system but are also important contributors. For example, they can use an app to report a bike accident and that data flows immediately to a central command to dispatch emergency crews far more quickly. Earlier this year, the city also launched a “Data Challenge”, calling participants of the ecosystem to bring ideas in relation to specific city challenges, like waste management and citizen engagement in the public area.

“We embraced the citizen as a consumer but also the producer of data. We made the citizen the heart of the data ecosystem.”

    It’s more than just creating efficiency. This is the key to changing that relationship so that residents go from being passive consumers of information and become smart citizens. This is absolutely a critical lesson about the ability of data ecosystems in the public sector to profoundly improve and reshape society in the decades to come.

    Know more

We demonstrated our support for the United Nations’ 2030 Agenda for Sustainable Development again last year by joining the AI for Good global summit on Nov 16, 2021. In this event, Capgemini experts, public organizations, society, and business stakeholders explored how data sharing can enable us to monitor and understand the development of the SDGs. Watch the replay of the sessions here.

    Security chain cooperation without having to share data

Capgemini
    2022-02-15

• Privacy and classification pose many challenges for public security organizations
• Data sharing can take place without providing protected information
• There are no safety obligations or risks for data that is not there
• Zero-Knowledge Proof cryptography is a gamechanger
• Information-driven work increases efficiency and safety

Within the public security and safety domain, data sharing is important but also risky and difficult. New technology can make all the difference.

    Old problems, new solutions

    Since its introduction in 2018, the EU General Data Protection Regulation (GDPR) has governed the security and use of personal information by all organizations in Europe. For those organizations involved in public security and safety, there is another important body of information that must be protected in the interests of society as a whole: classified information.

It can be difficult or even impossible to exchange information within the public security domain, where information comes from multiple sources across different organizations and security chain partners. Violating privacy rules, or being unable to communicate due to unequal levels of classification, poses a problem. The consequence is a difficult balance: too much information can harm the organization or an investigation, while too little has an inhibiting, if not destructive, effect.

    Security chain interactions without revealing data

    Today’s reality forces organizations to look further ahead. Because if data lasts forever, then data that is leaked or stolen lasts forever as well. Once data has been revealed, it cannot be hidden again. For those in the public security domain in particular, the damage done at every level — up to and including the strategic level — can be substantial, both to the organization and to individuals.

When it comes to data exchange, there are also obligations around guaranteeing integrity, security and, above all, reliability. Traditional integration technologies such as an Enterprise Service Bus (ESB) or a Service-Oriented Architecture (SOA) have played a crucial role here, but they cannot provide the desired integration solution to modern standards without becoming very complex and time-consuming. Current architectures are mainly based on intensive customization in which security is guaranteed by authorizations for the users and security requirements for the data carriers.

    Introducing a new innovative technology, such as the use of cryptography, can offer a solution. A special type of cryptography that is now maturing due to the rise of blockchain technology is Zero-Knowledge Proof cryptography. This form of cryptography is mainly used as a means to ensure privacy, while the correctness of the transaction can still be verified.

    For example, core register surveys in security chains could (start to) interact with each other based on cryptographic evidence.

    This technology has now made its appearance within the financial world[i] and could play a role within the security domain.

    What is Zero-Knowledge Proof cryptography?

A Zero-Knowledge Proof, or Zero-Knowledge Proof protocol, is a method by which one party can prove to another that they know a value x, without conveying any information apart from the fact that they know it. It enables the parties to determine through a number of interactions whether certain data is known, without revealing anything about that data.

Underlying this cryptography are mathematical principles that make it simple to generate and verify proof that someone knows a value x, but computationally infeasible to recover what x is.

    A Zero-Knowledge Proof should have the following three properties:

    1. Completeness: if the claim is true, an authentic verifier (i.e. one who follows the protocol correctly) will be convinced of this fact by an authentic evidence provider
2. Soundness: if the claim is false, a cheating evidence provider will be unable to convince an authentic verifier
    3. Zero-Knowledge: if the claim is true, a verifier will obtain no information other than the fact that the claim is true.
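To illustrate these three properties, here is a toy version of the classic Schnorr identification protocol, one well-known Zero-Knowledge Proof construction: the prover convinces a verifier that it knows the discrete logarithm x of a public value y = g^x mod p, without revealing x. This is a deliberately insecure sketch for intuition only (the parameters are far too small for real use), and it is not a description of any specific production solution.

```python
import random

# Insecure toy parameters: p = 2q + 1 with q prime; g generates the order-q subgroup
p, q, g = 23, 11, 2

x = 7                 # prover's secret
y = pow(g, x, p)      # public value: y = g^x mod p

def schnorr_round(secret: int) -> bool:
    # Prover commits to a random nonce r
    r = random.randrange(q)
    t = pow(g, r, p)
    # Verifier issues a random challenge c
    c = random.randrange(q)
    # Prover responds with s = r + c*secret (mod q); s alone leaks nothing about x
    s = (r + c * secret) % q
    # Verifier checks g^s == t * y^c (mod p)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# Completeness: an honest prover convinces the verifier every time
print(all(schnorr_round(x) for _ in range(100)))  # prints True
```

Completeness shows in the check always passing for an honest prover; soundness follows because a prover who does not know x can only answer one challenge value out of q, so repeated rounds expose a cheat; and zero-knowledge holds because the transcript (t, c, s) reveals nothing about x beyond the fact that the prover knows it.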

    Solving a complicated problem with ease

    Technology with such simple basics could easily contribute to highly complex systems. In the current landscape, we have many forms of classification, from the national level to NATO and EU level, each with its own degree of classification. Access to data is based on the provision of authorizations. This is a time-consuming process, for which the procedure must ensure data security. In many cases, the transfer of classified information from one system to another is difficult (i.e. manual) or impossible. In addition, for classified information held in documents rather than in systems, there is always a danger that the processing will be unsecured (via open mail, by telephone, or on paper).

    Zero-Knowledge Proof cryptography is viewed as one of the most important developments in the field of privacy-enhancing computation[ii]. Solutions based on Zero-Knowledge Proof cryptography can make the jungle of authorizations unnecessary and increase security at the same time. In many cases this is because the purpose of the transmission of privacy-sensitive or classified information is not to transfer the complete package of information, but only to indicate whether something is correct or not. Zero-Knowledge Proof cryptography can provide that evidence.

    Where the technology stands now

    Within its Applied Innovation Exchange (AIE), Capgemini has developed a solution in which provable interactions between parties are carried out based on data, without any sensitive data being shared and/or replicated. In a proof-of-concept, this has reduced the registration process for a rental property, which normally involves sharing personal data and financial information, to a minimal set of (cryptographic) interactions between the parties involved. Several specific APIs have been developed that generate and verify cryptographic evidence. Framed with a number of additional techniques, such as digital signature of evidence and a rule-management system to prevent abuse and exploitation, the solution provides secure and reliable data interaction with just a few simple ‘API interactions’ between source systems. This proof-of-concept shows that, with the right technology, it is possible to significantly reduce opportunities for fraud, privacy risks and security risks in security chain cooperation. At present, the AIE is working on use cases in the security domain.

    Use cases reveal the potential

    Within this domain, Zero-Knowledge Proof cryptography offers the potential to support organizations both at a security level and to relieve them of the management of sensitive data. This potential is already being realized, as the following use cases demonstrate.

    The MIT (Multi-disciplinary Intervention Team) is a recent partnership between organizations in the Netherlands — the Police, the Royal Netherlands Military Constabulary, the Public Prosecutor’s Office, the Dutch Fiscal Intelligence and Investigation Service, and the Tax and Customs Administration. Here, the organizations involved are working together to build up a joint information position for the purpose of effective security intervention. The use of Zero-Knowledge Proof cryptography can enrich the information position of the MIT because it allows for data to be integrated that otherwise cannot and/or may not be used. This could have a direct impact on the efficiency and decisiveness of this partnership.

    Also in the Netherlands, the collaboration agreement on counter terrorism — Counter-Terrorism Information (CTI) — involves the Ministry of Defense, the Tax and Customs Administration, the General Intelligence and Security Service, the Royal Netherlands Military Constabulary, the Dutch Fiscal Intelligence and Investigation Service and the Inspection Service of the Ministry of Social Affairs and Employment. In this type of exchange of information, the data is so secret that cooperation is seriously hampered. If one of the parties wants to look up data about someone, other parties must be prevented from finding out that an investigation into this person may be ongoing and taking their own steps (which could hinder the initial investigation). In this case, information is ‘leaked’ the moment the data is requested because it shows that there is an interest in that person in the context of counter terrorism. Zero-Knowledge Proof cryptography allows for these searches to be ‘packaged’ and obscured. In some cases, one of the parties might have certain data but is not legally allowed to act on it or inform other parties. Zero-Knowledge Proof cryptography can also play a role in facilitating secure notification of another body, without revealing data about a person or potential investigation.
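    The idea of a "packaged" search can be illustrated with a simplified Diffie-Hellman private set intersection. This is not a full Zero-Knowledge Proof system, but it shows how two agencies can learn which identifiers they share without either side seeing the other's raw list. The prime, the hash-to-group mapping, and the identifiers below are illustrative assumptions.

```python
import hashlib
import secrets

P = 2**255 - 19   # a large prime; modular exponentiation commutes

def to_group(identifier: str) -> int:
    # Map an identifier into [2, P-2] via a hash (illustrative mapping)
    h = int(hashlib.sha256(identifier.encode()).hexdigest(), 16)
    return 2 + h % (P - 3)

def blind(elements, secret):
    # Raise every element to a private exponent, hiding the originals
    return {pow(e, secret, P) for e in elements}

a = secrets.randbelow(P - 2) + 1   # agency 1's private exponent
b = secrets.randbelow(P - 2) + 1   # agency 2's private exponent

agency_1 = {"person-17", "person-42"}
agency_2 = {"person-42", "person-99"}

# Each agency blinds its own set, then the other applies its secret on top.
# Because (e^a)^b == (e^b)^a mod P, only genuine matches coincide.
double_1 = blind(blind({to_group(i) for i in agency_1}, a), b)
double_2 = blind(blind({to_group(i) for i in agency_2}, b), a)
matches = double_1 & double_2      # exactly one shared identifier here
```

    Neither agency can recover the other's non-matching identifiers from the blinded values, so a lookup no longer "leaks" interest in a person.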

    In a broader context, security chain interactions already exist in the security domain where the unwanted release of information would have major consequences in the context of information security or privacy assurance. Consider, for example, the communication of medical data of military personnel to/from conflict areas where both the personal data and the medical condition of a soldier must not be revealed. Zero-Knowledge Proof cryptography offers new potential. It allows for the design of security chain interactions that are currently not possible or permitted at all. This is a new world for the security domain to discover.

    Conclusion

    Systems that communicate based on Zero-Knowledge Proof cryptography can ensure a high degree of security in the exchange of data and thus bring peace of mind both to public security organizations and to society. What is not known cannot simply end up on the street or in the wrong hands. Systems connected to each other based on Zero-Knowledge Proof cryptography can solve the challenge of unequal classification levels without having to make a direct connection. In addition, setting up such processes in day-to-day operations can help to reduce the jungle of authorizations and relieve the pressure on the security services, while increasing overall security.

    Find out more

    This article has been adapted from a chapter in the Trends in Safety 2021-2022 report giving European leaders insight into the safety and security trends affecting citizens in the Netherlands.

    • The full report in Dutch can be found here.
    • An executive summary in English can be found here.

    For information on Capgemini’s Public Security and Safety solutions, visit our website here

    Authors

    Joop Koster
    Chief Architect
    Daan Verwaaij
    Business Analyst. Daan specializes in recognizing, translating, and responding to the challenges in the field of IT and beyond faced by the public safety and security domain every day.
    Email: daan.verwaaij@capgemini.com
    Michael Kolenbrander
    Domain Architect. Michael specializes in the realization of systems for information provision in the public safety and security domain, in particular in the context of information as intelligence. He is the creator and co-author of the Whiteflag Protocol and an expert in emerging technologies, such as blockchain technology.
    Email: michael.kolenbrander@capgemini.com

    [i] ING launches Zero-Knowledge Range Proof solution, a major addition to blockchain technology

    [ii] Gartner: Top Strategic Technology Trends 2021

    Driving a frictionless sales experience within the MedTech space

    Capgemini
    Capgemini
    2022-02-14

    Otto von Bismarck, the famous Prussian statesman, once said: “Laws are like sausages, it is better not to see them being made.” But laws and sausages are a good thing, right? The same is true for data. Within the MedTech industry, data is not only seen as a very good thing, it is often treated as a precious commodity – right from when a medical device is approved for sale, all the way to the assembly line.

    However, as most MedTech organizations have made investments in tailor-made tools and applications for specific sub-processes, it’s difficult to harmonize and leverage data for holistic sales management purposes. This is a problem for MedTech salespeople: as owners of their customer relationships, they already face a variety of complex compliance and technological challenges.

    This is precisely why MedTech salespeople would benefit from more meaningful insights and greater transparency across the sales cycle – generated in real-time. In short, they want to know how the sausage is made, so to speak.

    The current state of pricing in the sales cycle

    Although discovering buyer behavior and examining price rationality across segments helps align MedTech product pricing decisions, this pricing discovery is typically the outcome of a standalone exercise – often after a sale. Worse still, most pricing analytics projects see winning prices as the optimal outcome of complex pricing analytics algorithms, which are typically separate from how any MedTech salesperson navigates the deal cycle with their customers.

    This means that most pricing algorithms designed and used by data analysts end up being “black boxes.” They only allow salespeople to see the prices they produce, without letting them see how those prices are actually generated or why the proposed price is the optimal one available. As a result, most pricing within the MedTech sector is currently done in isolation, with the salesperson being a consumer of the output of this process – rather than being informed on how the output was generated or on the rationale behind their price offering.

    In short, this setup undercuts any MedTech salesperson’s relationship with their customer, as these analytically generated prices often don’t consider the unique factors present in every customer relationship. These prices are never clearly explained or outlined to any MedTech salesperson, leaving them unable to explain why their offering costs more or less than other offerings on the market. This damages trust and can cause the customer to see the salesperson as unprofessional or even inept.

    Making the frictionless sales experience a reality

    However, making pricing analytics a key component of an integrated decision support mechanism overcomes the challenges outlined above, enabling MedTech salespeople to make better-informed decisions while keeping their proprietary knowledge, valuable insight, and pre-existing customer relationships intact.

    Ensuring pricing analytics becomes a fully-fledged decision support mechanism also enables sales managers to get better insights into how and why deals are approved. They receive more context and information surrounding any deal in real-time, which helps make the frictionless sales experience a reality.
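    What such decision support could look like can be sketched in a few lines. The example below is a hypothetical, deliberately simple linear pricing model in which every factor's contribution to the recommended price is exposed to the salesperson rather than hidden in a black box; the factor names, weights, and values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Factor:
    name: str
    weight: float   # policy-set or learned influence on price
    value: float    # the deal-specific measurement

    @property
    def contribution(self) -> float:
        return self.weight * self.value

def recommend_price(list_price: float, factors: list) -> tuple:
    """Return a price plus a human-readable breakdown of how it was built."""
    price = list_price + sum(f.contribution for f in factors)
    explanation = [f"{f.name}: {f.contribution:+.2f}" for f in factors]
    return round(price, 2), explanation

factors = [
    Factor("volume discount", -0.05, 120.0),   # larger orders lower the price
    Factor("service bundle", 1.0, 35.0),       # added services raise it
]
price, why = recommend_price(1000.0, factors)
# price is 1029.0, and 'why' lists each factor's signed contribution
```

    Even a toy breakdown like this gives the salesperson something they can explain to a customer, which is exactly what a black-box price denies them.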

    Finally, it’s important to remember that proper pricing analytics change management also requires accepting and incorporating how salespeople work into the pricing process. Pricing is critical to the sales process, and salespeople should be at the core of any pricing conversations.

    Getting a frictionless sales experience is easier than you think…

    To ensure salespeople introduce a frictionless sales experience into their sales cycle, a provider needs to be capable of dealing with technological change, complex compliance requirements, regulatory demands, and the power of relationships during direct sales interactions. Price is a key variable in all sales discussions, and it can be spectacularly disenfranchising for a salesperson if they are not actively engaged in this critical sales activity.

    Capgemini enables salespeople to drive productivity and growth within their teams – backed by whatever digital transformation or long-term operational support they might require. Capgemini also integrates deal pricing data into how MedTechs apply price differentiators to their customers, which optimizes margin and win-rates and drives sales productivity gains and sales engagement. All of this helps make the frictionless sales experience a reality.

    To learn how to make the frictionless sales experience a reality for your MedTech organization – and understand how the sausage is made from end-to-end – reach out to deepak.bhootra@capgemini.com

    Deepak Bhootra is an established executive with two decades of global leadership experience. He delivers process excellence and sales growth for clients by optimizing processes and delivering seamless business transformation.

    Innovation Nation | Summer 2022 edition

    Innovation Nation is much more than a magazine – it’s a close look at what’s been happening in the last six months across the world of Intelligent Business Operations.

    Can we stop intelligent technology from making mistakes?

    Capgemini
    2022-02-10

    Self-driving cars – both fully autonomous prototypes and increasingly capable ADAS (Advanced Driver Assistance Systems) – rely on cameras and lidar to collect data on their surroundings. Both use AI to make high-stakes decisions, such as ‘should I brake, swerve, or accelerate?’ The wrong choice has real consequences for other vehicles and pedestrians.

    This is perhaps the most high-profile example of a new technology that must take in its surroundings and make high-stakes decisions. Another example would be medical diagnostic tools, which capture subtle information in an MRI scan, a DNA sample, or real-time patient monitoring and use AI to advise on life or death interventions.

    When we talk about AI, we mean the mathematical and statistical models that transform data into real-world decisions. If these models are presented with a large number and wide variety of training scenarios, they can become accurate and versatile. But they struggle with any scenario outside their learning. In unfamiliar situations, an AI can make bad decisions with human consequences.

    For these technologies to be accepted, we need to build in mechanisms to recognize the limits of their knowledge and ensure appropriate action is taken when they step outside it – whether that is defaulting to a safe mode or handing over control to humans.

    The limits of machines when processing the real world

    AIs can ingest data on various scenarios, such as other cars on the road swerving, braking, skidding, and tailgating. They can learn to recognize these scenarios and react instantly, sometimes better than human experts. This capability is transforming many industries.

    But a self-driving car is not really learning to recognize “erratic driving”—a human-centric concept. It is learning to recognize a series of sensor data in a high-dimensional space, a specific trajectory that corresponds to a type of event. This trajectory will have a slightly different shape for every example of a scenario. The AI can learn a signature within this cloud of data that permits it to say: ‘a car has swerved in front of me, and I need to activate a response’.
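    This "signature in a cloud of data" can be illustrated with a toy example: a short window of lateral-acceleration readings compared against a learned template trajectory using cosine similarity. The template values, the live readings, and the 0.95 threshold are all invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length sensor windows."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

swerve_signature = [0.0, 0.4, 1.2, 1.8, 1.1, 0.3]   # learned pattern (toy)
live_window      = [0.1, 0.5, 1.1, 1.7, 1.0, 0.2]   # incoming sensor data

# A high similarity to the learned signature triggers the response
if cosine(live_window, swerve_signature) > 0.95:
    response = "a car has swerved in front of me: activate a response"
else:
    response = "no known event recognized"
```

    Real systems learn such signatures in spaces of thousands of dimensions, but the principle is the same: the AI matches shapes in sensor data, not human concepts like "erratic driving".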

    The industry’s enormous challenge is to make these learnings generalizable, so a car or a diagnostic tool can apply its knowledge to scenarios outside its training. We want a self-driving car that can make correct decisions even if it finds itself in a different country, in different weather, or surrounded by different types of cars and roads.  In other words, we want the AI to be able to say, ‘these new data look similar enough to my training data that I confidently recognize a car swerving’.

    Right now, AI is bad at this. The data are just data, and the AI cannot separate the fact of a swerving car from other contextual factors, at least not without a great deal of human guidance during training and a larger amount of road test data than is economically feasible. Further, the AI will try to classify the situation as a known scenario even if the similarity is only partial.

    As soon as an AI is presented with a scenario that does not appear in its training data, it becomes erratic. Just one sensor delivering an unexpected stream of values can make the current driving scenario unrecognizable. We have seen ADAS tricked by something as simple as stickers on road signs. For a more commonplace example of this problem, consider virtual assistants: they are good at following voice instructions to play songs or set timers but struggle with unusual requests.

    How can we stop machines from making bad decisions in unfamiliar situations?

    Maybe one day, AI will learn to recognize and navigate the human-centric context surrounding their data. Until then, if we want to use AI for high-risk/high-reward applications such as autonomous driving or medical diagnosis, we need to manage how they respond to scenarios outside their training.

    One solution – and an area where we are actively conducting research – is understanding the limits of what the AI has learned. When the AI is trained, it builds an internal model of what it is trying to recognize or predict – but this model is only valid inside the cloud of data points shown. By drawing a boundary around that cloud, we can limit an application to classifying incoming scenarios that the AI can recognize.

    A very “tight” boundary around the point cloud would mean that the AI only recognizes scenarios that are exactly the same as its training data. On the other hand, we might want some leeway to reflect real world variability, allowing the AI to make decisions for examples that are somewhat different from those it learned on. The skill is in managing this envelope for your desired application.

    We can decide what the AI should do when encountering a scenario outside its boundary. A diagnostic or predictive maintenance tool might be designed to say: ‘I don’t know what this is, please seek expert human input’. An assistive driving tool cannot afford to wait for human validation in real time, so instead, it may require the driver to take the wheel as soon as the environment becomes even a little unfamiliar.
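    The boundary-plus-fallback idea from the last two paragraphs can be sketched very simply: flag an input as out-of-distribution when it lies farther from every training point than a threshold derived from the data itself, and defer to a human in that case. The slack factor trades a tight envelope against real-world variability; the data points, model, and slack value are all illustrative.

```python
import math

def fit_threshold(train, slack=1.5):
    """Derive a boundary radius from each training point's nearest neighbour."""
    nearest = [min(math.dist(p, q) for q in train if q != p) for p in train]
    return slack * max(nearest)   # slack > 1 loosens the envelope

def decide(x, train, threshold, model):
    if min(math.dist(x, p) for p in train) > threshold:
        return "defer to human"   # outside the known envelope
    return model(x)               # familiar territory: trust the AI

# Toy training cloud and toy model (illustrative values)
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
model = lambda x: "brake" if x[0] > 0.5 else "steady"
threshold = fit_threshold(train)

decide((0.4, 0.5), train, threshold, model)    # inside: the model decides
decide((10.0, 10.0), train, threshold, model)  # far outside: defer to human
```

    Production systems would use far richer novelty detectors than nearest-neighbour distance, but the design question is the same one raised above: how tight should the envelope be, and what happens at its edge.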

    Managing the boundary of known examples is not a complete solution for autonomous cars, which need to make split-second decisions. However, thinking about the shape of the sensor data in high dimensions can help design training simulations by ensuring that the data are as diverse as possible while still representative of real-world road tests.

    We also need to consider how humans will respond. An AI can offer a degree of certainty alongside its prediction, for example: ‘I’m 60% sure this is a positive diagnosis, but a human expert needs to check’. This approach could backfire if the tool makes a highly confident but wrong diagnosis based on an incorrect rationale. A human who has come to trust the AI, in this case, may not even be aware that edge cases where the AI fails completely are possible. Further, “60% certainty” to an AI may mean 60% overlap of data points, which is not necessarily the same as a 60% chance of cancerous tissue. Rather than expressing a degree of certainty, sometimes it’s better that the machine holds up its hands and says ‘I don’t know, please defer to someone who does’.

    Implementing safety constraints on high-risk technologies

    In case you feel we are being unduly harsh on these new technologies, it is important to note that a well-trained AI will be right almost all of the time when deployed under familiar conditions. We should continue to build better and more powerful software based on AI. But we need to remember that these models are always limited by their training data, so we need systems to handle failure before deploying them in high-stakes applications.

    By describing the shape of the training data, we can build quality assurance into this emerging class of complex autonomous technologies. This would mean that we can fully trust their decisions during the 99% of cases when the data are familiar because we have confidence that the AI will alert us when it is operating in unknown territory.

    Capgemini Engineering’s Hybrid Intelligence team is actively researching how Trusted AI can be deployed within high-stakes automated decision making in autonomous transport and healthcare.

    Benjamin Mathiesen

    Author: Benjamin Mathiesen, Lead Data Scientist, Hybrid Intelligence, Part of Capgemini Engineering

    Ben has 20 years of research experience in data modeling, scientific programming, numerical analysis, and statistics, with a modern focus on data science and AI. He directs client projects and internal R&D related to knowledge modeling, natural language processing, and trusted AI within the Hybrid Intelligence group.