
Procurement – the challenges and its changing role

Greg Bateup
July 04, 2022

Procurement is becoming an integrated, cognitive function that supports your end-to-end value chain and business ecosystem of customers and suppliers.

The key grip in a film crew. The back four of a soccer team. The quartermaster in an army unit.

What do they all have in common? It’s this: while they may all lack the glamor of the stars of their respective shows, they are all of them vital to collective success.

Take the last one, for instance. Quartermasters don’t train the troops, or lead them into combat – but without them, there would be no food, fuel, or water. Or clothing. Or field services.

Procurement in business is just the same. It’s a behind-the-scenes function, but it’s vital – and it faces challenges.

Cinderella, disconnected

Those challenges fall into two broad categories.

The first is this back-room, Cinderella status. It’s not glamorous, and so the value it brings to the business is not understood by the rest of the organization. It’s seen as functional or maybe tactical, rather than strategic. In many cases, it has to compete for talent with other parts of the enterprise, which leads to retention issues for key members of staff, who may feel their career prospects would be better elsewhere.

The second can be summarized as disconnect. Procurement’s role makes it prey to multiple, non-integrated processes and systems, making it hard to share data or to achieve visibility across the organization. Nor is it only a lack of integration: frequently, systems are also outdated and leave gaps in the process, leading to excessive manual processing and a poor experience for end users.

The disconnect extends to supplier interaction. Procurement often has only limited visibility of the picture beyond its major suppliers. There are challenges in interactions between procurement and other parts of the business, such as operations, business, IT, and finance. As a result, business stakeholders don’t keep the procurement team in the loop when they begin to work with suppliers. This means that, in turn, there is less collaboration with, and management of, the supplier portfolio than there might be, and weaker risk processes, too.

A new and better future

Many organizations are waking up to these challenges. They see that a new and better future for procurement is possible.

This is a future in which the procurement function is digitally woven not just into the operation, but the strategy, of the business. A centralized organization model can, for example, facilitate innovation and competitiveness, rather than merely act in a fulfillment capacity. This same model can incorporate the increased use of shared services and business process outsourcing (BPO) for service delivery, maintaining a sustainable set of processes and metrics for managing the external team. A Center of Excellence can carry out strategic activity, and digitization can make the most of the efficiency and value of sourcing and procurement tools and enablers.

A new role for procurement will also see supplier relationships improve. When data is better managed, suppliers can be better profiled – but also, innovation can be more easily initiated, and better monitored. Better supplier management also makes it likely that procurement can play a greater role in meeting the organization’s corporate social responsibility (CSR) commitments.

Also, right now, the skills of the procurement team are frequently misaligned with the changing needs of the business, especially in the areas of digitization and analytics – but the growing awareness of procurement’s potential is increasing the pressure to implement bespoke training and digital skills programs. This is another way in which the procurement role is evolving.

Procurement is changing in order to address broader issues, too. The global pandemic has, of course, made an impact on businesses’ ability to meet their own supply requirements, and has created a greater need for flexibility.

In addition, and finally, the drive for sustainability, which is now pretty much universal, means that organizations are collaborating ever more closely with their suppliers to reduce the ecological footprint of product sourcing.

Integrated procurement – frictionless and intelligent

In short, we are witnessing a significant change in the role of procurement within the enterprise. In response to new as well as long-standing challenges, there is pressure for it to become an integrated, cognitive function that supports the end-to-end value chain, is fully embedded into business processes, and increasingly supports customers, suppliers, and the business as a whole.

In the next article in this short series, we’ll take a look at how digitization can facilitate this change. A frictionless, cognitive approach to procurement, incorporating technologies such as cloud, artificial intelligence (AI), robotic process automation (RPA), and blockchain, will tackle the disconnect issues, and increase the efficiency and value that procurement can bring to the business.

As we’ll see, when all this happens, and procurement goes frictionless, then, perhaps, it will get the recognition it deserves.

To find out how Capgemini’s Cognitive Procurement Services offer can transform your organization to drive effective, sustainable, and frictionless procurement, contact: greg.bateup@capgemini.com

Author

Greg Bateup

Head of Cognitive Procurement Services, Capgemini’s Business Services
Greg Bateup has worked with clients to deliver business transformation and BPO services for almost 30 years. For the last few years, Greg has focused on the digital transformation of the source-to-pay function, and how organizations can not only drive efficiencies in the procurement function, but also drive compliance and savings.

    Procurement – going frictionless

    Greg Bateup
    July 04, 2022

    When everything is cognitive and seamless, procurement ceases to be an operational function, and evolves into something much more strategic, providing a dependable, tailored way to manage supplier performance.

    In the first article in this short series, we considered the challenges facing the procurement function, and also the changing nature of its role. This time, we’ll be looking at integrating it into a cognitive, enterprise-wide, digital model.

    The Frictionless Enterprise

    Let’s start with that very point. If procurement is to address its challenges and meet changing responsibilities, it does indeed need to be properly plugged into the organization as a whole.

    But that’s not enough. It’s not just about procurement tapping into other business functions – it’s about all of those functions tapping into one another.

    This is a concept that we at Capgemini call the Frictionless Enterprise. It’s an approach that seamlessly connects processes and people, intelligently, and as and when needed. Everything works together.

    It’s an approach that dynamically adapts to the circumstances of individual organizations, and that addresses each and every point of potential operational friction – regardless of whether that friction is between individual departments, between functions, or apps, or data sources, or devices, or involving something else altogether.

    Cohesion and improvement across the function

    When the entire enterprise is cognitive and cohesive in this way, procurement becomes a constituent part of it, like any other. It may perhaps have been previously regarded by some as a backroom function, lacking glamor – but no longer. More importantly, when everything else, from finance through marketing to logistics, is working seamlessly, procurement can start to deliver far greater value than was ever possible before.

    Intelligent automation and analytics can streamline processes, and save time and money across the whole function. Smart insight can deliver consolidation and improvement in:

    • Demand management – request validation; purchase order processing; expediting, receiving, and returns; invoice exception management; and procurement support
    • Intelligent sourcing – tail spend management; tactical procurement; strategic sourcing; category management; and contract and relationship management
    • Supplier management – supplier performance; supplier enablement; contract management; compliance management; and supplier support
    • Accounts payable – invoice receipt; invoice processing; invoice issue management; payment; and payables support
    • Risks and insights – spend analytics; working capital analytics; risk management and compliance; market intelligence; and corporate social responsibility (CSR) and sustainability

    Taking stock

    When everything is smart and seamless, procurement ceases to be an operational function, and evolves into something much more strategic. It becomes predictive, responsible, and reliable, delivering actionable insight. It provides a dependable way to manage supplier performance, tailoring it to the needs of the business while ensuring that the suppliers’ own needs and expectations are also met. And it establishes a means of continuous feedback in contract management compliance and risk management.

    In fact, several of these and other benefits are quantifiable. Working within an enterprise-wide digital model, frictionless procurement can:

    • Enhance compliance and risk mitigation: cognitive, connected models can achieve over 90% procurement policy compliance, and deliver significant reductions in what might be termed “maverick spend”
    • Increase productivity by up to 50%, by enabling changes such as automation of purchase orders and dynamic channel switching
    • Deliver an unobtrusive user experience – because why should doing the day job be any less straightforward for someone than, say, shopping online in the evening?
    • Enhance transparency and insights: when friction between functions and systems is removed, visibility can be increased across the enterprise, and across the supply chain in particular. This can deliver up to 26% identified supplier consolidation savings, reduce risk, and – once again – cut back on that “maverick spend”
    • Start paying back promptly. We’ve seen reductions of up to 50% in back-office costs, and savings of 15% on spend.

    These, then, are benefits that can be achieved in principle. But there’s no substitute for practice. In the third and final article in this short series, we’ll take a look at some cognitive, integrated procurement implementations in the real world. The results are pretty impressive…

    To find out how Capgemini’s Cognitive Procurement Services offer can transform your organization to drive effective, sustainable, and frictionless procurement, contact: greg.bateup@capgemini.com

    Author

    Greg Bateup

    Head of Cognitive Procurement Services, Capgemini’s Business Services
    Greg Bateup has worked with clients to deliver business transformation and BPO services for almost 30 years. For the last few years, Greg has focused on the digital transformation of the source-to-pay function, and how organizations can not only drive efficiencies in the procurement function, but also drive compliance and savings.

      Procurement – real-world transformational benefits

      Greg Bateup
      July 04, 2022

      Procurement is not merely a fulfillment service, but an important contributor to the strategic and tactical success of an organization, giving it the flexibility, cost-effectiveness, and resources it needs in order to go out and win.

      In the first article in this short series, we headlined the challenges facing the procurement function. We also considered the changing nature of its role. In the second article, we looked at how procurement could be integrated into a cognitive, enterprise-wide, digital model, as part of what we at Capgemini call the Frictionless Enterprise.

      In this, the third and final article in the series, we’re going to assess some real-world implementations of this cognitive, integrated approach to procurement. In each case, you’ll see some impressive – and measurable – operational benefits; but you’ll also see the extent to which new, intelligent procurement models can make a significant contribution to business strategy.

      Case #1 – financial services

      This multinational financial services company sought to increase procurement efficiency in general, and in particular to improve purchasing compliance in its global insurance business.

      A digital global approach was introduced that included an outsourcing model, a user-friendly buying portal, intelligent automation, a closed loop process for compliance and change management, and a Command Center concept to provide greater visibility into process bottlenecks.

      In fact, visibility was improved not just in this respect, but across the entire procure-to-pay (P2P) function. Processes were harmonized globally across business units, and scalable, fit-for-purpose platforms maintained compliance, and locked in savings.

      As a result, the organization achieved 30% productivity gains over five years. There was a 90% increase in purchase order (PO) compliance, savings of over 10% in tail-spend management, an increase in no-touch POs to 80%, and a significant improvement in end-user satisfaction.

      Case #2 – food

      One of the world’s largest food companies was experiencing delays in processing purchase requisitions, which led to internal customer dissatisfaction, delayed internal projects, and lost revenue. PO compliance was at only 40%, limiting control over purchasing, and 30% of purchases required multiple touches, which further delayed on-time payment to suppliers. In addition, poor data visibility meant it was difficult to identify savings.

      A global managed service process model was introduced, with standardized desktop procedures, including a catalog generated from improved content based on analysis of repeat spend, supported by training. Functions including PO processing, PO cancellations and changes, and invoice exception handling were monitored against new SLA-based metrics.

      The new model reduced the number of interactions per transaction, leading to a significant improvement in on-time supplier payment. In fact, over three years, there was an increase in touchless POs of up to 63%, as well as a 90% increase in PO compliance. There was a 75% improvement in invoice block resolution, and a year-on-year rise of 10% in the productivity of full-time employees.

      Procurement – key to strategic success

      As I said in the introduction to this article, while all these stats are impressive, it’s not just about measurable benefits. In this last case, for instance, what is perhaps more important than any one operational improvement is that the procurement function is now much more closely aligned to the company’s business objectives.

      We need to see procurement for what it is – not merely as a fulfillment service, but as an important contributor to the strategic and tactical success of an organization. When it’s part of a Frictionless Enterprise, procurement can give businesses the flexibility, the cost-effectiveness, and the resources they need in order to go out and win.

      To find out how Capgemini’s Cognitive Procurement Services offer can transform your organization to drive effective, sustainable, and frictionless procurement, contact: greg.bateup@capgemini.com

      Author

      Greg Bateup

      Head of Cognitive Procurement Services, Capgemini’s Business Services
      Greg Bateup has worked with clients to deliver business transformation and BPO services for almost 30 years. For the last few years, Greg has focused on the digital transformation of the source-to-pay function, and how organizations can not only drive efficiencies in the procurement function, but also drive compliance and savings.

        Why securing the SAP landscape is a business essential

        Marieke Van De Putte
        1 Jul 2022

        Looking at the current cybersecurity landscape calls to mind the fable The Boy Who Cried Wolf. Many businesses are fully aware of the threat of the wolf, but with repeated calls to look at a multitude of dangers, businesses are fast becoming disoriented as to where the danger really lies.

        Where it differs from the tale is that many of the cries are not unfounded – and yet, they might not always be as fatal as they’re made out to be. With SAP security, however, the risks are real, and the consequences of failing to act on the dangers can be serious.

        Used by the vast majority of multinationals around the world, SAP (systems, applications, and products) security protects business processes and data of high value, such as sales, finance, and personnel information. Traditionally, businesses would lock critical information in a data center, protected by a proverbial lock and key, with peace of mind. SAP, however, is primarily focused on authorization management and segregation of duties.

        While this approach was once enough, today it is not. The digitization and movement of assets in the cloud means that online threats are increasing in number, diversity, and impact, leaving organizations under a far greater threat of opportunistic attacks. In 2019, research found that nearly two-thirds of organizations reported an ERP (enterprise resource planning) system breach over a 24-month period from attackers after critical data. More often than not, this is a result of unsuitable SAP risk management solutions.

        Shifting sands

        In 2021, SAP and Onapsis issued an intelligence report warning organizations to take immediate action and review and monitor their SAP landscape. This was aimed at companies using old and vulnerable SAP versions, which are difficult to patch by today’s security standards.

        Just the year before, the RECON vulnerability left tens of thousands of customers’ data exposed to attackers; if exploited, an unauthenticated user would have been able to create a new SAP profile with maximum privileges to circumvent access and authorization controls. It was patched soon after, but it exposed the risks associated with relying on a system without proper monitoring.

        Today’s leap to the digital economy creates opportunities for companies to transform and scale, and SAP encourages customers to move to SAP S/4HANA – a cloud-based ERP system – to reap such benefits. This migration is a necessary modernization for many businesses, but it’s worth bearing in mind that it doesn’t guarantee out-of-the-box security. It still requires continuous monitoring to identify threats and vulnerabilities.

        Getting ahead

        Securing SAP platforms demands a proactive approach. With more and more critical infrastructure connected to the internet, possible entry points multiply every day. Research by Statista estimates that the number of connected devices is set to triple from 8.7 billion in 2020 to 24.4 billion in 2030, and this number will only continue to grow.

        Hackers generally want to create the greatest impact possible, which is why we’re seeing increasing attacks on critical sectors such as life sciences and energy and utilities. Take the example of medicine: if an attacker can bypass security and enter through the back door, there’s a chance they could edit essential information related to the product make-up at the production stage. This goes beyond business disruption to affect consumers, sometimes with life-threatening consequences.

        Ideally, SAP applications provide businesses with a way to manage their departments effortlessly. But nothing should be taken for granted, especially as businesses move data and even applications into the cloud. Whether you have public or private clouds, you must have security measures in place so that you know who is taking care of what when it comes to security. To understand what they are or aren’t doing, it’s important to be proactive by examining, monitoring, and assessing.

        Assessment and management

        Locking down SAP security may seem like a complex task – and it is without the right practical processes in place.

        This is why, by implementing effective vulnerability assessment and vulnerability management, you’ll be able to identify new threats and weaknesses in security configurations and prioritize vulnerability remediation for mission-critical SAP systems.

        To cover these bases, Capgemini has developed a unique holistic solution that shields your system from data breaches and potential losses. With quick and real-time insights, our Vulnerability Assessment reviews, evaluates, identifies, and reports on SAP weaknesses. But with the landscape always changing, it cannot end there, which is why we combine continuous management to monitor and identify new vulnerabilities in the environment.

        For businesses to seize the advantages of cloud-based security, it is essential they understand the SAP landscape better. To navigate complexity, improve compliance, and seize the potential of cloud-based digitization they must be able to know which cry of the wolf to prioritize. This is no small feat, but through better collaboration, it is more than possible.

        Contact Capgemini today to find out how our network of global SAP experts can help your organization embrace this new generation of cybersecurity.

        Author

        Marieke Van De Putte

        Global Domain Lead Cyber Compliance | SAP & Cyber | NL Service Line Lead Security & Compliance 
        Specialized in developing practical approaches to security, risk and compliance, and applying automation possibilities. Contributing our team’s expertise to digital transformation projects, like IT outsourcing and cloud migration.

        Mark Sampson

        Principal Enterprise Architect SAP and Cloud Centers of Excellence
        Has over twenty years’ experience in both IT consultancy and end-user positions, with a specialization in SAP design, architecture, and delivery. Since 2012 he has been focused on delivering SAP solutions onto hyperscale cloud providers (AWS/Azure).

          Project Bose: a smart way to enable sustainable 5G networks

          Capgemini
          30 Jun 2022

          Energy consumption has always been a significant consideration for service providers as it is one of the highest operating costs. But it is becoming even more important due to climate change and sustainability considerations.

          As per the GSMA, energy costs account for 20-25% of a network’s total cost of ownership for 4G operators. Operators globally spend approximately USD 17 billion on energy every year. With the pending deployment of more NR base stations in 5G networks – such as small base stations and massive-MIMO base stations in high bands – energy consumption is expected to increase by as much as 140% in some 5G deployment scenarios. This calls for immediate action.

          As operators aim to decrease power consumption in 5G networks through different energy-saving techniques, the complexity of non-standardized energy-saving mechanisms is becoming a blocker to achieving the desired energy-reduction targets.
          However, AI/ML intelligence, combined with technology enablers like NWDAF/RIC in 5G, has unlocked various innovation paths for the telco industry to design and execute such energy-efficiency solutions in an economical and standard way, overcoming the challenges encountered in current methods.

          Project Bose represents a great example of such innovative use cases. Its objective is to create a sustainable 5G network – and beyond – using a data-driven approach. To achieve this, it has introduced five energy-saving levers – directional UE paging, MICO mode, energy aware NF placement, smart UPF selection, and intelligent CPU tuning – that leverage analytics information from an underlying Capgemini NWDAF framework, and work in tandem to optimize energy consumption in the network and associated IoT devices.

          This results in significant CO2 emission reductions and cost savings, with no negative impact on end users’ QoE. Project Bose also provides a platform on which to build new energy-saving levers in the future.
          Thanks to a collaboration with Intel for its observability framework, Project Bose is able to capture a wide variety of network and infrastructure metrics at different levels in the 5G ecosystem, providing a holistic energy-saving solution for operators.

          In tests conducted with this solution in the lab, we have observed an average energy saving of 18%, resulting in around a 14% reduction in CO2 emissions. It is a very impressive first step. But it is just the beginning. Project Bose still has many exciting innovations to come.

          Download our white paper to learn more about Project Bose.

          Providing accurate T&E information through an AI-based chatbot

          Bartosz Grochowski
          28 Jun 2022

          Capgemini’s award-winning chatbot leverages natural language processing and machine learning to deliver more accurate travel and expenses information. This is helping our people travel more easily and safely in the post-pandemic world.

          Organizations today need to provide up-to-date travel information to consulting professionals traveling on business, at speed – including accurate information on local hotel rates, food allowances, and travel entitlements.

          This process is typically handled manually, with requests often sent to a third-party system that can lengthen response times significantly. In addition, requests are often sent to teams outside of office hours, which means questions frequently go unanswered until the next business day – much to the frustration of traveling professionals.

          Enable instant and accurate travel information

          What travel and expenses (T&E) teams need is a tool that delivers simple, quick, and user-friendly support to their business travelers, without any specific training, implementation costs, or additional software installation.

          On top of this, the tool would ideally leverage an extensive knowledge repository, simple plug-and-play architecture, an intuitive user interface, and intelligent automation to enable finance teams to respond accurately to a wide range of T&E, HR, IT, finance, and policy-related queries.

          NLP and machine learning drive speed and accuracy

          Enter Capgemini’s Microsoft-based chatbot. With natural language processing (NLP) embedded at its core, the chatbot recalls all previous user conversations, while leveraging machine learning to predict future queries and inputs. This ensures that the accuracy and breadth of the travel information it provides continuously improve over time.

          We’re extremely proud that our Microsoft-based chatbot recently won the “Chatbot Innovation” category at the 2022 AI Breakthrough Awards – for the fourth year running. Not only that, the tool is also currently being implemented across many of our internal engagements, which is helping our people to travel safely in a post-pandemic world.

          To learn how Capgemini can help you provide up-to-date, accurate travel information to your people through leveraging Intelligent Process Automation, contact: bartosz.grochowski@capgemini.com

          The growth power of artificial intelligence

          Dr. Lobna Karoui
          28 Jun 2022

          The rise of AI

          Since the first century BC, humans have been concerned with creating machines capable of imitating human reasoning. Artificial intelligence (AI) has been defined by Arthur Samuel as the field of study that gives computers the ability to learn without being explicitly programmed. More broadly, we can define AI as a process of mimicking human intelligence based on the creation and application of algorithms executed in a dynamic computing environment. Its final goal is to enable computers to think and act like human beings. In 1956, John McCarthy and his collaborators organized a conference called the Dartmouth Summer Research Project on Artificial Intelligence, which gave birth to machine learning, deep learning, predictive analytics, and more recently, prescriptive analytics. In 2007, McCarthy published an important paper titled “What is Artificial Intelligence,” in which he clearly answered multiple questions about AI and its precise branches (pattern recognition, ontology, inference, search, etc.). Also, in the last decade, data science has emerged as a new area of study.

          The current rise of AI was made possible by four enabling conditions:

          • With the advent of the internet and the development of connected objects, tremendous quantities of data are now available. In 2020, 1.7 MB of data was created each second by every person. In the past two years alone, an astonishing 90% of the world’s data has been created. Every day, 95 million photos and videos are shared on Instagram, 306.4 billion emails are sent, and 5 million tweets are made. IDC predicted that the global datasphere will reach 175 ZB by 2025.
          • New technologies such as cloud computing have emerged, and we are witnessing an exponential increase in storage capacity and computing power.
          • We are witnessing growing progress in the algorithms made available by researchers. Libraries like TensorFlow (Google) or scikit-learn (Inria: National Institute for Research in Digital Science and Technology), which contain major AI algorithms, are available with no fees. Many communities, like Stack Overflow, are helping AI developers solve problems.
          • The support from industries is growing. Many business sectors have come to understand the importance of AI and are investing massively in this exponential technology.

          The importance of data in AI

          Data is the new oil, and some big companies like the GAFA (Google, Amazon, Facebook, and Apple) are monetizing it. Today, one of the main challenges faced by business leaders is how to improve productivity and increase profits by using their data assets efficiently.

          Then comes the question of what data policy to implement. An efficient data strategy must ensure that good-quality data sets are collected and can be used, shared, and moved easily from one system to another. The objective is to make information usable at the right time, in the right place, and by the right person, to bring added value to the organization.

          AI business applications

          Many AI applications have already been deployed in diverse sectors of activity, with great impact on our daily lives as users, consumers, customers, and more. In the following paragraphs, we propose categories of AI usage, illustrated with concrete examples based on our experience in AI development.

          Customer first

          Customer segmentation

          It is a targeted advertising approach. Customer data is used to suggest homogeneous groups for marketing. This classification approach is based on common characteristics, such as demographics (age, geography, urbanization, income, family, job type, etc.) or behaviors (basket size, share of wallet, long-term loyalty). Customer segmentation is popular because it helps you market and sell more effectively: you can develop a better understanding of your customers’ needs and desires. Clustering algorithms are key techniques in building a personalized customer experience. In a Capgemini Research Institute report about customer experience, we found that “66% of consumers want to be made aware when they interact with an AI system.” Being able to implement AI in such processes while preserving the human intelligence part of it is essential.

          Weekly churn prediction

          Churn is when a customer stops doing business or ends a relationship with a company. It’s a common problem across a variety of industries, and one of the most well-known AI applications in the customer relationship management (CRM) and marketing areas. A company that predicts churn can take proactive action to retain valuable customers and get ahead of the competition. Consumer characteristics and history are used to deliver a weekly churn score to marketing leaders via the cloud.

          Real-time chatbot

          We’ve all had to deal with a voice server at least once. Behind this technology, you may have a real-time chatbot. It’s a conversational robot that communicates with users in natural language. It is a permanent point of contact for customers, users, or employees. It acts as a virtual assistant and sends them the right information in real time. For a well-performing virtual assistant, the benefits are reduced human-interaction costs and increased user satisfaction through immediate, 24/7 responses. Natural language understanding (NLU) algorithms and cloud infrastructure are used here. Note that not all instant messaging and virtual assistants are based on AI techniques utilizing NLP and NLU; some of them are mainly rules-based.

          Intelligent industry

          Prescriptive maintenance

          With the emergence of the industrial internet of things (IIoT), the field of maintenance is connecting tools, software, and sensors to collect, store, and analyze multiple data sources in one place. Those tools are already unlocking predictive maintenance, where sensors and software predict future failures. However, many maintenance leaders are looking towards a near future based on prescriptive maintenance, where AI machines not only predict failures but also identify solutions. Prescriptive maintenance uses AI with IIoT to make specific recommendations for equipment maintenance. It combines technologies that analyze histories, make assumptions, and test and retest data freely. Complex AI algorithms enable software to automatically identify and learn from trends, recognize data patterns, and apply the best maintenance plan. This AI application, which uses reinforcement learning, helps to reduce maintenance costs.

          Real-time anomaly detection

          There are three commonly accepted types of anomalies in statistics and data science:

          • Global outliers represent rare events that have likely never happened before.
          • Contextual outliers represent events that fall within a normal range in a global sense but are abnormal in the context of seasonal patterns.
          • Collective outliers represent events that on their own do not fall outside of the standard expected behavior, but when combined represent an anomaly.

          Anomalies within a company’s data set can represent opportunities and threats to the business. Real-time detection of anomalies empowers enterprises to make the right decisions to seize revenue opportunities and avoid potential losses. Data from production is used to detect anomalies in a plant in real time thanks to unsupervised learning and a SCADA (Supervisory Control And Data Acquisition) system.
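          As a purely illustrative sketch (a simplified statistical stand-in with made-up readings, not the unsupervised-learning setup described above), the following Java class flags global outliers in a stream of sensor values by comparing each new reading against a rolling window:

          import java.util.ArrayDeque;
          import java.util.Deque;

          // Flags a reading that deviates strongly from the recent window (a "global outlier").
          public class RollingOutlierDetector {

              private final int windowSize;
              private final double threshold;
              private final Deque<Double> window = new ArrayDeque<>();

              public RollingOutlierDetector(int windowSize, double threshold) {
                  this.windowSize = windowSize;
                  this.threshold = threshold;
              }

              public boolean isOutlier(double value) {
                  if (window.size() < windowSize) {
                      window.addLast(value);
                      return false; // not enough history yet
                  }
                  double mean = window.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
                  double variance = window.stream()
                          .mapToDouble(v -> (v - mean) * (v - mean))
                          .average().orElse(0.0);
                  double stdDev = Math.sqrt(variance);
                  boolean outlier = stdDev > 0 && Math.abs(value - mean) / stdDev > threshold;
                  window.removeFirst();
                  window.addLast(value);
                  return outlier;
              }

              public static void main(String[] args) {
                  RollingOutlierDetector detector = new RollingOutlierDetector(3, 3.0);
                  double[] readings = {20.1, 20.3, 19.8, 20.0, 55.7}; // e.g., a SCADA sensor feed
                  for (double r : readings) {
                      System.out.println(r + " -> outlier: " + detector.isOutlier(r));
                  }
              }
          }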

          Forecast methods

          Business forecasting is the process of using time series data to estimate and predict future developments in areas such as sales, revenue, and demand for resources and inventory. Business forecasting can be divided into two main categories:

          • Demand forecasting: Anticipate demand for inventory, products, service calls, and much more.
          • Growth forecasting: Anticipate revenue growth, expenses, cash flow, and other KPIs.

          Time-series algorithms are designed for these categories. These methods are widely used to estimate the evolution of the Covid-19 pandemic.

          Many other AI applications based on computer vision and deep learning algorithms are being used to discover drugs, identify cancer cells, and sort devices in factories.

          Enterprise management

          Monthly KPI dashboard

          Financial data is used to display important KPIs for top managers every month in a slideshow. An automation system is set up to guarantee the quality of the data and the results. The technologies used are ETL (extract, transform, load), Analytics, and Dataviz. In the context of enterprise management, connecting siloed data across the sales, finance, supply chain, and services domains and embedding AI is key for better and smarter decisions. Such achievements help large organizations reduce costs, optimize operating performance, and harness the power of data.

          Career management

          Digitalization entered HR departments several years ago, and AI has logically become part of this evolution. For many organizations, it is today part of all career management processes.

          • In recruitment, AI significantly reduces delays thanks to intelligent CV sorting. Some HR departments go so far as to carry out a first interview with chatbots.
          • For training, AI allows employees to benefit from an ideal training plan for their skills development.
          • When it comes to internal mobility, today there are solutions for finding the best profile corresponding to a position that needs to be filled within a company.

          Across its many successful applications in the HR field, AI is bringing productivity gains, more reliable procedures, improved HR processes, and greater responsiveness in career management.

          Conclusion

          Artificial intelligence has so far delivered many benefits and is a huge economic growth accelerator. It embraces many sectors of activity and is already impacting our daily lives. It also raises questions about embedding humans in the process, sharing the benefits, fairness, employment, data confidentiality, privacy, violation of ethical values, and trust in results. These concerns need to be addressed through global regulations, certifications of AI models, and more. In a coming blog article, we will address these necessary fundamentals around trusted AI.

          A case for building digital trust for a more sustainable world

          Jean-Baptiste Perrin
          28 Jun 2022

          Trust in technology is dropping in an increasingly digital world


          The use of technology to build the future business model of many businesses has been increasing exponentially – it helps them get closer to their customers, facilitates daily operations, and increases corporate opportunities. At the same time, digital technology is vulnerable to issues like breach of security, misuse of personal data, or even algorithmic bias and lack of transparency. Close to 60% of organizations, for example, attracted legal scrutiny, and 22% faced customer backlash in the two to three years prior to 2020 due to decisions made by their AI systems.

          It is therefore not surprising that public trust in technology is dropping significantly. The 2021 Edelman Trust in Technology report shows that globally, trust in technology has plummeted to 68% in 2021 from 77% in 2012. Customers believe organizations are not doing enough about key ethical issues: a Capgemini Research Institute study reveals that the share of customers who believe that organizations are being fully transparent about how they are using their personal data fell from 76% in 2019 to 62% in 2020.

          In an unfortunate world where, in a fashion straight out of Orwell’s 1984, we have real examples of authoritarian states using technology to subjugate their citizens – how can businesses ensure customers understand and trust organizations’ digital transformation agendas?

          Elements of transparency, security, privacy, and responsibility constitute digital trust

          Decisions taken by technology affect a major part of an individual’s life today. We have cars driven by AI, insurance premiums based on big data analytics, nursing homes staffed with robots, and targeted ads and recommendations based on the information we put on the internet. With all the data that organizations have amassed, consumer apprehensions are not only expected but warranted.

          However, organizations are also becoming increasingly sensitive to ethical issues. In a 2020 study, 62% of organizations said that they adhere to all data protection regulations applicable in their region (e.g., the GDPR in Europe), vs. 48% in 2019.

          To alleviate consumer concerns about technology and become ambassadors of trust, businesses must continue to demonstrate a mature stance towards digital ethics and security. To earn digital trust, organizations need to focus on the four areas of ethics and responsibility, privacy and data governance, transparency, and security.

          The role of technology in advancing sustainability

          A lack of trust in any technology leads to its adoption being hindered. This is truly a shame when the same technology could potentially be used in a multitude of ways to improve our world. Big data analytics, artificial intelligence, machine learning, and digital twins are among the many digital tools that could help us become more sustainable in various ways.

          According to a study, AI-enabled use cases helped organizations across industries reduce GHG emissions by 13% and improve power efficiency by 11% from 2018 to 2020. AI use cases have also helped reduce waste and deadweight assets by improving their utilization by 12%. In another study, the Capgemini Research Institute found that 34% of organizations with ongoing digital twin programs have implemented the technology, at scale, to understand and predict energy consumption and emissions, and that digital twins deliver an average 16% improvement in sustainability.

          The figure below highlights some of the use cases of technologies like AI/ML and digital twins to help boost sustainability.

          Source:
          1) Capgemini Research Institute, Climate AI, October 2020, https://www.capgemini.com/research/climate-ai/;  
          2) Capgemini Research Institute, Digital Twins: Adding Intelligence to the Real World, May 2022, https://www.capgemini.com/insights/research-library/digital-twins/.           

          Digital trust can result in increased sustainability, faster

          The need for digital transformation to accelerate and scale solutions to mitigate the impact of climate change and to help disadvantaged communities is evident. We have the technologies we need and we want to change the game. But there is still resistance to adopting them and scaling them. We need to build digital solutions that are perceived as enablers rather than intrusive – respecting digital human rights and opening new frontiers for society.   

          So, where do you stand? Do you want to participate in building trust in technology for a more sustainable world? Connect with us and help us get the future we all want!

          Green software engineering – Back to the roots!

          Thilo Hermann
          27 Jun 2022

          Sustainability is one of the hottest topics in IT currently. It’s obvious that software engineering also has an impact on our environment. The effect of mitigating this impact might not be as big as that of optimizing the steel industry, but there is still value in taking a closer look.

          Green software engineering is an emerging discipline with principles and competencies to define, develop, and run sustainable software applications. The result of green software engineering will be green and more sustainable applications.

          Green applications are typically cheaper to run, more performant, and more optimized – but that’s just a welcome addition, and I will explain the reason for this correlation later. The key thing is that developing applications in such a manner will have a positive impact on the planet. So, let’s have a closer look at the principles. Please note that green software engineering is just one part of sustainability in IT, but in this blog, I will focus on it!

          According to https://principles.green/, the following principles are essential when building green applications:

          1. Carbon: Build applications that are carbon efficient.
          2. Electricity: Build applications that are energy efficient.
          3. Carbon Intensity: Consume electricity with the lowest carbon intensity.
          4. Embodied Carbon: Build applications that are hardware efficient.
          5. Energy Proportionality: Maximize the energy efficiency of hardware.
          6. Networking: Reduce the amount of data and distance it must travel across the network.
          7. Demand Shaping: Build carbon-aware applications.
          8. Measurement & Optimization: Focus on step-by-step optimizations that increase the overall carbon efficiency.

          You should take code and architectural changes into account that reduce the carbon emissions and energy consumption produced by your application. Please note that most of the examples are based on Java and cloud technologies like containers.

          Just another NFR?!

          When I read these principles, it came to my mind that they should be reflected in non-functional requirements (NFRs) for the application. If you treat them like this, it’s obvious that you must find the right balance, and typically there is a price tag attached.

          The good news is that green principles are also related to well-known non-functional requirements like “performance efficiency” (see ISO 25010 https://iso25000.com/index.php/en/iso-25000-standards/iso-25010).

          Those NFRs regarding performance and efficiency can typically be fulfilled by optimizing your code. We often have challenging performance requirements, and once you optimize your algorithms (e.g., moving from a “bubblesort” with a complexity of O(n^2) to a “quicksort” with a complexity of O(n*log(n))), you will reduce CPU utilization, especially for huge data sets. Those optimizations also have a positive effect on energy consumption and thus on CO2 emissions.
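          To make the point concrete, here is a minimal, hypothetical Java sketch (my own illustration, not taken from any engagement) contrasting a hand-rolled O(n^2) bubble sort with the JDK’s built-in Arrays.sort, which uses an O(n*log(n)) dual-pivot quicksort for primitive arrays. The class and data are made up purely for demonstration:

          import java.util.Arrays;
          import java.util.Random;

          public class SortEnergyDemo {

              // O(n^2): acceptable for tiny inputs, wasteful for large ones
              static void bubbleSort(int[] a) {
                  for (int i = 0; i < a.length - 1; i++) {
                      for (int j = 0; j < a.length - 1 - i; j++) {
                          if (a[j] > a[j + 1]) {
                              int tmp = a[j];
                              a[j] = a[j + 1];
                              a[j + 1] = tmp;
                          }
                      }
                  }
              }

              public static void main(String[] args) {
                  int[] data = new Random(42).ints(20_000).toArray();

                  int[] slow = data.clone();
                  long t0 = System.nanoTime();
                  bubbleSort(slow);          // O(n^2) burns CPU cycles – and energy
                  long t1 = System.nanoTime();

                  int[] fast = data.clone();
                  long t2 = System.nanoTime();
                  Arrays.sort(fast);         // O(n*log(n)) dual-pivot quicksort
                  long t3 = System.nanoTime();

                  System.out.printf("bubble sort: %d ms, Arrays.sort: %d ms%n",
                          (t1 - t0) / 1_000_000, (t3 - t2) / 1_000_000);
              }
          }

          Even for 20,000 random integers the gap is orders of magnitude, and every saved CPU cycle is energy not consumed.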

          On the other hand, a brute force solution by keeping the inefficient algorithms and just adding additional hardware (e.g., CPU, RAM) might work for the performance NFR, but not for efficiency, and thus this “lazy” approach will have a negative impact on your sustainability targets! Especially in the cloud with “unlimited” scaling this solution could be tempting for developers.

          You might remember your computer science lectures around “complexity theory” and especially about “big O notation,” and you might have wondered what those are good for … now you know that those are key for a sustainable world! A green software engineer must be a master in implementing highly efficient algorithms!

          You should be aware that the NFRs are sometimes conflicting, and you must make compromises. It’s a best practice to document your design decisions and their impact on NFRs. With this, it’s easy to visualize the impact and conflicts. Once you know those it’s the right time to make decisions!

          One question still to be answered: when and how to optimize your green application?

          Let’s start with two quotes from pioneers of computer science:

          First rule of optimization: don’t do it.
          Second rule of program optimization (for experts only!): Don’t do it yet – that is, not until you have a perfectly clear and unoptimized solution. — Michael Jackson

          Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. — Donald Knuth

          To rephrase it: “know your enemy” and optimize according to the following algorithm.

          1. define your target
          2. measure as accurately as possible
          3. identify the bottlenecks
          4. optimize the biggest bottleneck (and only one at a time!)
          5. measure again
          6. check if you reached your target:
            a. Yes – you’re done.
            b. No – go back to step 2.

          With this approach, you should be able to avoid micro-optimization and premature optimization which is the “root of all evil” (see https://stackify.com/premature-optimization-evil/).
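          As a small illustration of step 2 (“measure as accurately as possible”), the hypothetical helper below times a workload with System.nanoTime and keeps the best of several runs to dampen JIT warm-up and GC noise. For serious work you would reach for a dedicated benchmarking harness, but the principle is the same: measure before and after every optimization.

          import java.util.function.Supplier;

          public class SimpleTimer {

              // Runs the workload several times and reports the best observed time,
              // which reduces (but does not eliminate) JIT warm-up and GC noise.
              static <T> long bestOfRunsNanos(Supplier<T> workload, int runs) {
                  long best = Long.MAX_VALUE;
                  for (int i = 0; i < runs; i++) {
                      long start = System.nanoTime();
                      workload.get();
                      best = Math.min(best, System.nanoTime() - start);
                  }
                  return best;
              }

              // Stand-in for the code path you suspect is the bottleneck
              static long heavyComputation(int n) {
                  long sum = 0;
                  for (int i = 0; i < n; i++) sum += (long) i * i;
                  return sum;
              }

              public static void main(String[] args) {
                  long ns = bestOfRunsNanos(() -> heavyComputation(1_000_000), 10);
                  System.out.printf("best run: %.2f ms%n", ns / 1_000_000.0);
              }
          }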

          To move on: it’s not only the next NFR, but rather a mindset change to strive for resource efficiency wherever appropriate.

          Abstraction as environment killer?

          Since I started my career in computer science, the level of abstraction has grown over time. In the beginning I was still learning machine/assembly language (for the famous 6502 microprocessor), moved on to C/C++, later to Java, and finally to low code (e.g., Mendix, OutSystems, Microsoft Power Apps). This made life as a programmer much easier and more efficient, but the drawback is that with all those abstractions the actual resource usage is hidden and typically went up.

          The obvious conclusion of this is that we should move back to machine language to build highly efficient green applications. “Unfortunately,” this is an aberration for several reasons:

          • It’s really tough and time consuming to implement in machine/assembly languages
          • Complex, huge systems are tricky to implement, even with a higher level of abstraction
          • Cost most probably will explode
          • Time to market will be much longer

          On the other hand, compilers have improved a lot in the last years, and they often optimize the code better than an average programmer is able to. Techniques like loop optimization, inline expansion, and dead store elimination, to name a few, will improve the efficiency and thus lead to a greener application.

          In some cases, it might be worth choosing a lower level of abstraction (e.g., use C) to optimize to the max, but this must be an explicit decision for the really heavily used code parts. As already shown above, you need to “know your enemy” and only optimize according to this pattern where you expect a huge impact!

          Impact of chosen programming language

          As mentioned, the chosen programming language can have a major impact on energy consumption (see https://thenewstack.io/which-programming-languages-use-the-least-electricity/). As expected, once more C seems to be the benchmark in this area. The match “compiler vs. interpreter” is definitely won by the compilers. The optimizations done on the Java JVM seem to also have a positive impact on energy consumption and performance.

          Bad news for all scripting languages, as they are at the lower end of the “ranking” for power usage. If you’re using JavaScript or even TypeScript heavily for computation tasks, you should look for better options. As always, this might change over time as interpreters can get optimized, as we saw for Java in the past. Techniques such as just-in-time compilers (JIT) and Java Hotspot JIT compilers had a major impact for Java, and this most probably could also be done for JavaScript and TypeScript.

          As Python is for a lot of topics (incl. AI) the language of choice, I would encourage the experts to optimize it for energy consumption. For the time being, Python is second to last, and only Perl is worse. Its energy footprint is approximately 75 times higher than that of C, and thus there should be a lot of potential for optimization.

          Parallelization – Use your cores!

          Modern hardware architectures with multiple cores on one chip are utilized best by parallelized algorithms. So once more it helps to know the theory and best practices in this area. As parallel computing is an old topic, you should watch out for parallelized algorithms for a given problem.

          To be honest, we nowadays have so many abstraction layers between the actual CPU and the code we’re running on them that the impact is really hard to measure and to calculate, and thus I wouldn’t invest too much of my time in parallelization of my algorithms as this is a really tricky and error prone task!

          Complex frameworks/COTS vs. lightweight alternatives

          Besides abstraction, the usage of frameworks has an impact on the green features of your application. Middleware COTS products like application servers (e.g., IBM WebSphere, Oracle WebLogic, JBoss, …) introduce complexity which might not be needed in every case. Lightweight alternatives like Tomcat – or, going further, Spring Boot (see https://spring.io/projects/spring-boot), Micronaut (see https://micronaut.io/), or Quarkus (see https://quarkus.io/) – will greatly reduce the memory footprint, startup time, and CPU usage. So, once more, you should check if you really need the complex and feature-rich frameworks or if the lightweight alternatives are good enough. For most of the cases I was involved with, the lightweight alternatives were perfectly fine!
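          To illustrate how little is needed with such a lightweight stack, here is a minimal Spring Boot service – a generic sketch with made-up names, not a reference configuration: one class starts an embedded server and exposes a single REST endpoint, with no separate application server to install, patch, and keep running.

          import org.springframework.boot.SpringApplication;
          import org.springframework.boot.autoconfigure.SpringBootApplication;
          import org.springframework.web.bind.annotation.GetMapping;
          import org.springframework.web.bind.annotation.RestController;

          // One class: auto-configuration, an embedded server, and a single endpoint
          @SpringBootApplication
          @RestController
          public class LightweightServiceApplication {

              public static void main(String[] args) {
                  SpringApplication.run(LightweightServiceApplication.class, args);
              }

              @GetMapping("/status")
              public String status() {
                  return "up";
              }
          }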

          I strongly recommend selecting the “lightest” framework that still fits the requirements. Don’t stick to given or even mandatory standards and be willing to fight for a greener alternative!

          Architecture – Microservices vs. monoliths

          Architecture can have a huge impact. Over the years, architectural patterns have evolved, and currently microservices are an often-used pattern. When you compare legacy monoliths with a modern microservice architecture, there are positive and negative effects on resource usage:

          Positive:

          • Scaling: With a microservice architecture it’s easily possible to scale only where needed. Mechanisms like auto-scaling on container platforms (e.g., Kubernetes, OpenShift, AWS EKS, MS Azure AKS) do this only on demand and thus reduce the overall resource usage.
          • Best technology: You can choose the best fitting technology (e.g., databases, programming languages) for your purpose, thus reducing the complexity and level of abstraction for every microservice independently. This will lead to a more efficient application if done in a proper manner.

          Negative:

          • Network traffic: Within a microservice architecture the number of “external” calls is much higher than in a monolithic application. Even with lightweight protocols this introduces an overhead, and thus the resource usage will go up. If there is an API management tool involved, the overhead is even bigger.
          • Data replication: It’s quite common to replicate data in a microservice architecture to enable independence between the services. This leads to higher storage demand, and propagating changes via events (e.g., event sourcing, command query responsibility segregation (CQRS)) increases network traffic and CPU utilization.

          You need a case-by-case evaluation to determine if the chosen architecture has a positive effect on the green principles or not!

          … and now for something completely different: no blog is complete without referring to the KISS principle (see https://en.wikipedia.org/wiki/KISS_principle).

          KISS – Reduce to the max

          For all green software developers, the KISS principle shall be applied on the following dimensions: CPU – RAM – DISK – NETWORK

          Now let’s have a look at what are typical measures to achieve those reductions:

          • Efficient algorithms: That’s obvious and already explained in the beginning of this blog. The better the algorithm works (e.g., CPU utilization, memory consumption), the faster you will reach your target. The reuse of existing libraries that include optimized solutions for common problems is a best practice.
          • Caching: Caches can reduce the amount of external service calls to a minimum. This includes database, file system, external service, and web-content calls. Be aware that when introducing caches, you must make sure that the functional requirements are still fulfilled; you might get eventual consistency as a drawback (see the small sketch after this list).
          • Compression: Data compression can be applied on several dimensions. Communication protocols shall be optimized with respect to size. For example, moving from SOAP to REST will already reduce the size and the marshalling and unmarshalling effort. You can even go further and use binary formats (e.g., gRPC, ActiveJ). The drawback of binary protocols is that they are not easily human readable, and some lack interoperability with other programming languages. Besides protocols, you should also check if you can reduce the resolution of graphics used within your (web) application, or better still move to textual representations. Tree shaking is another approach which is often used in JavaScript to reduce the amount of code transferred/compiled in the web browser. In general, if you compress and decompress during runtime, you need to calculate this overhead and check if it’s a real efficiency improvement.
          • Scream tests: The concept of a scream test is quite simple – remove the application/service and wait for the screams. If someone screams, put it back online. This helps to get rid of unused applications/services. If you are lucky, no one screams and thus this will reduce resource consumption.
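
          As a small illustration of the caching bullet above, the following sketch memoizes a remote lookup with Python’s built-in functools.lru_cache. The URL and the fetch_product() function are invented placeholders; a real system would also need a cache-invalidation strategy:

            import functools
            import urllib.request

            # Repeated lookups for the same product are served from memory instead of
            # triggering another network call; the trade-off is potentially stale data.
            @functools.lru_cache(maxsize=1024)
            def fetch_product(product_id: str) -> bytes:
                url = f"https://example.com/products/{product_id}"  # placeholder endpoint
                with urllib.request.urlopen(url) as response:
                    return response.read()

            # First call hits the network; subsequent calls with the same id do not.
            catalog_entry = fetch_product("42")
            catalog_entry_again = fetch_product("42")  # served from the cache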
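
          In the same spirit, for the compression bullet, a quick size comparison with Python’s built-in gzip module shows whether compressing a payload is worth the extra CPU time; the JSON payload below is synthetic:

            import gzip
            import json

            # Build a synthetic JSON payload and compare raw vs. compressed size.
            payload = json.dumps(
                {"items": [{"id": i, "name": f"item-{i}"} for i in range(1000)]}
            ).encode("utf-8")
            compressed = gzip.compress(payload)

            print(f"raw: {len(payload)} bytes, gzip: {len(compressed)} bytes")
            # Compressing and decompressing costs CPU time, so measure whether the
            # network savings outweigh that overhead in your specific setup.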

          Recurring fully automated tests

          With the rise of automated testing in CI/CD pipelines as the standard for engagements, electricity consumption has gone up. If you take DevOps seriously, you need highly automated tests with high test coverage that are executed frequently. This should include functional and non-functional tests.

          Should we get rid of those tests for the sake of sustainability?

          Certainly not, but we might need to optimize our testing strategy. Instead of running all tests, we should focus on running the relevant ones. To do that, you need to know the dependencies and the impact of the changes you made. Existing tools support you in this, so you can avoid the “brute-force” approach of testing everything after every build (a minimal sketch follows below).
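
          Purpose-built tools (for example pytest-testmon or the test-impact features of your build tool) do this properly; the sketch below only illustrates the idea by mapping files changed in the last commit to test modules via an assumed tests/test_<module>.py naming convention:

            import subprocess
            from pathlib import Path

            # Ask git which files changed since the previous commit.
            changed = subprocess.run(
                ["git", "diff", "--name-only", "HEAD~1"],
                capture_output=True, text=True, check=True,
            ).stdout.splitlines()

            # Map each changed Python file to its test module (naming convention assumed).
            impacted_tests = sorted({
                f"tests/test_{Path(source).stem}.py"
                for source in changed
                if source.endswith(".py") and Path(f"tests/test_{Path(source).stem}.py").exists()
            })

            if impacted_tests:
                subprocess.run(["pytest", *impacted_tests], check=True)
            else:
                print("No impacted test modules found - skipping the full suite this time.")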

          … and now for something completely different – AI?!

          Machine learning and AI are interesting topics. Training a single neural network model can emit as much carbon as five cars over their lifetimes, and the amount of computational power required to run large AI training models has been increasing exponentially in recent years, with a 3.4-month doubling time (see https://hbr.org/2020/09/how-green-is-your-software). On the other hand, you might save a lot of carbon and/or electricity by using those models. It is in some ways an investment, and one needs to calculate a “business case” to make the right decision. This seems to fall into the well-known “it depends” category of the consulting business.

          It’s obvious that we need optimized algorithms in this area in the future (i.e., both for training neural networks and for using them), and research on this is only just beginning.
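
          If you want to put numbers behind such a business case, one option is to measure a training run’s estimated emissions, for example with the open-source codecarbon package. The sketch below assumes that library is installed and uses a placeholder train_model() function:

            from codecarbon import EmissionsTracker

            def train_model() -> None:
                # Placeholder for your actual training loop.
                pass

            # The tracker estimates energy use of CPU, GPU, and RAM and converts it
            # to CO2-equivalent based on the local grid's carbon intensity.
            tracker = EmissionsTracker()
            tracker.start()
            try:
                train_model()
            finally:
                emissions_kg = tracker.stop()

            print(f"Estimated training emissions: {emissions_kg:.3f} kg CO2eq")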

          Finally, money!

          In a service business, money is always important. The old rule that “each non-functional requirement will cost you something” is also true for those imposed by sustainability. You can spend money only once, so you need to decide whether to spend it on an additional feature or on optimizing the footprint. This discussion must be carried out with your client, and you might learn how important this topic is to them! Please note that an efficient, green application will save money over its lifecycle by reducing the run-time costs. In the long run, the payback of the investment may lead to a positive business case.
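
          A back-of-the-envelope payback calculation is often enough to start that discussion with the client. All numbers in the following sketch are invented and only show the shape of the calculation:

            # One-off optimization effort vs. yearly savings in run-time (cloud/energy) costs.
            optimization_cost = 40_000.0       # one-off engineering effort (illustrative)
            yearly_runtime_savings = 12_000.0  # reduced compute, storage, energy per year
            lifetime_years = 5

            payback_years = optimization_cost / yearly_runtime_savings
            net_benefit = yearly_runtime_savings * lifetime_years - optimization_cost

            print(f"Payback after {payback_years:.1f} years; "
                  f"net benefit over {lifetime_years} years: {net_benefit:,.0f}")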

          Know the basics, learn from the past…

          Green software engineering also reminds me of the typical waves we see in IT, and it adds another one: efficient application vs. efficient development.

          Finally, I would state the following: every experienced software engineer with a strong focus on performance and efficiency is well equipped to become a green software engineer!

          And sometimes the best solution for the environment would be to get rid of an application completely! For whatever reason, Bitcoin comes to my mind while I’m writing this ;-)

          An approach for sustainable IT implementation

          Marius Vöhringer
          27 Jun 2022

          Sustainability is becoming an integral part of corporate agendas

          The current German IT-Trends study by Capgemini shows that the reporting obligation on sustainability is expected to be significantly expanded from 2023 onwards. It is planned that all companies in Europe with more than 250 employees, as well as small and medium-sized capital-market-oriented companies, will then have to submit a report showing concrete measures to promote sustainability. Nearly 71% of companies intend to reduce their annual greenhouse gas emissions by an average of almost 37% by 2026, and the vast majority consider this target to be realistic.

          IT is still neglected in the sustainability agenda

          At the same time, as another study by Capgemini shows, IT still plays no or only a minor role in the sustainability strategies of most companies (“Sustainable IT: Why it’s time for a Green Revolution for your organization’s IT”). Although half have their own sustainability strategy, only one in five companies includes IT in their sustainability agenda.

          IT itself causes a CO2 footprint but can also make a profitable contribution to CO2 savings. However, only 43% of executives know the carbon footprint of their corporate IT and only 18% have defined a comprehensive strategy with clearly defined goals and timelines.

          CO2 footprint reductions of IT can be achieved primarily by optimizing and streamlining IT landscapes and architectures and require a corporate social responsibility (CSR) strategy that addresses the levers for reducing environmental impacts in the various business areas.

          Introducing a framework for sustainable IT implementation

          We recommend the following process to implement a sustainable IT strategy with clear and measurable goals, and defined milestones which also indicate necessary adjustments to the organization, processes, and culture.

          1. Sustainable IT Employees
            Promote a sustainability culture that includes cooperation with cloud hyperscalers like AWS, Microsoft, or Google. Adapt teams and corporate culture accordingly to achieve and accelerate sustainability goals: everyone must take responsibility for the environmental cost of cloud use.
          2. Sustainable IT Strategy
            Define a corporate sustainability strategy including your vision of sustainable IT, embedded into a broader approach to CSR, and conduct a baseline assessment of the IT’s overall environmental footprint and sustainability maturity. For implementation, establishing a Sustainable IT Lead helps to control activities by KPIs and sets up the necessary governance.
          3. Digitalization for Sustainable Business
            Analyze the value chain and target areas of the current business to identify measures and focus areas for improving the value chain, e.g., through disintermediation. Another way to achieve added value in the business is a significant consolidation of extensive application landscapes, with the targeted shutdown of applications.
          4. Assess and Calculate
            Based on data-driven business and IT assessments, a digital twin of the IT and application landscape helps to determine current CO2 emissions and simulate the CO2 reduction potential. Decision criteria, defined in a joint approach, help to identify the potential for improvements within the IT landscapes and architectures; the effect of measures can be extrapolated and simulated. Combine this with a sustainability assessment of on-premises facilities (DCs), and consider leveraging the better power usage effectiveness (PUE) of cloud data centers through cloud transformation (a minimal carbon-estimation sketch follows after this list).
          5. Design and Plan
            Architecture is the key for data center setup, landing zones, network, and communication. Building a sustainable IT platform allows significant efficiency gains by moving workloads and storage to the cloud, which consumes less energy.
            By using the cloud and automating the scaling of the necessary hardware resources, large energy savings can be achieved. Cloud-based application landscapes achieve compute resource utilization rates of more than 60%, whereas classic on-premises data centers typically achieve utilization rates of less than 20%.
            While cloud computing – compared to the use of traditional data centers – can play a significant role in reducing energy consumption, it causes extra energy consumption for the required networking and communication. A sustainable architecture will therefore strive to reduce network transfers and use efficient data transfer mechanisms, including the deployment of edge computing. This overall approach forms the foundation for holistic planning based on portfolio analysis, with a focus on sustainable modernization paths.
          6. Development and Green Coding
            In addition to switching to green electricity, data center emissions can also be significantly reduced by modernizing workloads through improved software architecture, since the design of the software architecture determines the required hardware sizing and thus the electrical energy consumption. Sustainable application development and application transformation with a focus on green coding therefore need to be part of your approach to sustainable IT. Some companies have already cut the energy demand of optimized applications by 50% and reduced CO2 emissions accordingly.
          7. Operate
            DevOps can contribute to better energy management if DevOps processes and automation are used for continuous optimization. Establish site reliability engineering to create efficient, scalable, and highly reliable software systems through automation and continuous integration and delivery. Use modern DevOps tools and technologies, automated CI/CD pipelines, and tests to detect defects early in the process, before they hit production. Use monitoring systems and act on alerts before they turn into incidents. In this way, you avoid wasting resources that could be used more effectively for other efforts.
          8. Re-Calculate and FinOps
            Use calculators for on-premises and cloud-based infrastructure to determine the carbon footprint of your IT and establish greenhouse gas emission dashboards. Make sustainability part of FinOps to achieve maximum business value within the financial and environmental management of IT/cloud systems by shaping the use of resources.
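
          To make steps 4 and 8 more tangible, here is a deliberately simplified carbon estimate that multiplies IT energy consumption by PUE and grid carbon intensity. It ignores embodied carbon, and all figures are illustrative placeholders rather than measured values:

            # Simplified model: emissions = IT energy * PUE * grid carbon intensity.
            it_energy_kwh_per_year = 120_000   # energy drawn by the workload's servers/storage/network
            pue_on_prem = 1.8                  # assumed PUE of a legacy on-premises data center
            pue_cloud = 1.2                    # assumed PUE of a modern cloud region
            grid_intensity_kg_per_kwh = 0.4    # assumed kg CO2e per kWh of the local grid

            for label, pue in [("on-premises", pue_on_prem), ("cloud", pue_cloud)]:
                emissions_tonnes = it_energy_kwh_per_year * pue * grid_intensity_kg_per_kwh / 1000
                print(f"{label}: {emissions_tonnes:.1f} t CO2e per year")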