Perspective from Vincent de Montalivet

Green or mean? Separating the myth of AI’s carbon impact from the reality

Given that AI, like any other technology, requires an energy supply, how do we assess its impact on the organization’s carbon footprint? Careful attention is needed to separate the myths attached to AI’s energy impact from the actual reality.

In organizations today, AI has yet to reach maturity: only 13% of companies have implemented AI on an industrial scale. However, while its use may not yet be widespread, the fact remains that AI consumes energy like any other technology, and it will continue to do so once it matures and is routinely used across organizations:

  • AI needs large volumes of data in order to learn the behaviors required for automated responses. Machine learning algorithms based on mathematical models rely on sample, or “training,” data to make predictions or decisions without being explicitly programmed to do so. It is through training that an AI application becomes “intelligent.”
  • Training smart systems in this way can require considerable processing power, depending on the number of operations to be performed (a minimal training sketch follows this list).
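
To make the idea of “training” concrete, here is a minimal sketch using scikit-learn; the synthetic dataset and the choice of model are illustrative assumptions rather than details of any specific project discussed here. The point it illustrates is that fitting a model means iterating over sample data, so compute grows with data volume and model complexity.

```python
# Minimal, illustrative training sketch: a model "learns" by iterating
# over sample data (the dataset and model here are assumed examples).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic "training" data standing in for historical business records.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fitting the model is the compute-intensive step: its cost grows with the
# number of samples, features, and optimization iterations.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Once trained, the model makes predictions without being explicitly
# programmed with rules.
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```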

This is where the carbon footprint cost can emerge:

  • Researchers at the Allen Institute for AI have calculated that training an AI system to generate or recognize words and sentences similar to human language (natural language processing, or NLP) can result in as much carbon dioxide as five American cars emit over their lifetimes (including the manufacture of the cars themselves).
  • Driving AI innovation and new levels of performance can come at a cost: performance gains are achieved through larger volumes of data, and therefore through larger models and more calculations. In May 2020, OpenAI announced the largest AI model in history. Known as GPT-3, it took months of training and has 175 billion parameters! (A back-of-envelope sketch of the computation this implies follows this list.)
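
To give a sense of why larger models mean more calculations, the back-of-envelope sketch below applies a commonly used rule of thumb for dense models of this kind, roughly six floating-point operations per parameter per training token; the training-token count used here is an illustrative assumption, not a figure reported in this discussion.

```python
# Back-of-envelope sketch (the figures below are assumptions for
# illustration): training compute for a large dense model is often
# approximated as ~6 floating-point operations per parameter per token.
parameters = 175e9          # GPT-3-scale parameter count
training_tokens = 300e9     # assumed token budget, for illustration only

total_flops = 6 * parameters * training_tokens
print(f"Approximate training compute: {total_flops:.2e} FLOPs")
# ~3e23 operations, which is why training runs for months across many
# accelerators, and why model size drives energy use.
```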

The energy impact of business AI: reality check

But to build a precise picture of the energy consumption of AI in business, we need to bear in mind a number of reality checks.

There are many ways to implement AI, and some projects do not involve the training of models at all; in fact, they only require a dozen parameters to be adjusted. Thanks to statistical and descriptive analyses, coupled with a good understanding of business requirements, it is now possible, using techniques such as regression or clustering, to achieve a range of goals, from optimizing warehouse stock to detecting fraud in finance or public services.

In our recently published research, we tried to ascertain the GHG footprint of some of the popular AI use cases. Our analysis shows, for instance, that the GHG emissions produced in training and executing these AI systems amount to only a few kilograms (1–10) of CO2 equivalent.5 This is very small in comparison with the overall GHG emissions of large organizations, which typically run into the millions of tons of CO2 equivalent per year.

We must keep in mind that the need to train complex models only applies to a few of the AI solutions now deployed at scale. When more complex neural network techniques must be used, such as in image recognition, the models used are often open-source ones that have already been trained. Transfer learning techniques are then applied so that the results obtained with the original training data are adapted to the client’s data, which means the model does not need to be retrained from scratch (a minimal sketch of this approach follows below). This technological “recycling” helps to limit the impact on energy resources of a massive deployment of artificial intelligence projects in companies.
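
As an illustration of this kind of “recycling,” the sketch below reuses an open-source image model that has already been trained and adapts only its final layer to new data; the specific model (a torchvision ResNet-18) and the five-class task are assumptions chosen for the example, not choices taken from any project described here.

```python
# Illustrative transfer-learning sketch (the model and task are assumed):
# reuse an already-trained open-source network and adapt only its final
# layer to the client's data.
import torch
from torch import nn
from torchvision import models

# Load a model pre-trained on ImageNet instead of training from scratch.
base = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained layers so they are not retrained.
for param in base.parameters():
    param.requires_grad = False

# Replace only the final classification head for a hypothetical 5-class task.
base.fc = nn.Linear(base.fc.in_features, 5)

# Only the small new head is optimized, so training compute (and energy)
# is a fraction of what full retraining would require.
optimizer = torch.optim.Adam(base.fc.parameters(), lr=1e-3)
```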

Broadly speaking, AI solutions, digitization, and increasing data volumes (including the production of new data) rightly raise questions about the future energy impact of tech and data innovations. But one reality is clear: the carbon footprint of technologies does not follow the same
growth curve as data volume,6 not least because significant advances in energy efficiency have helped to limit the impact.

Towards “sustainable AI” and a convergence of technological and ecological transitions

With a clear sense of what is myth and what is reality in terms of AI’s energy impact, we can turn to the bigger question: how can we accelerate the age of green AI? Researchers and engineers around the world are working to optimize the energy consumption of AI solutions. This ambition is driven by a clear purpose: for AI to achieve the same level of performance as human intelligence (i.e., being capable of performing thousands of trillions of operations per second) while consuming only 20 watts of energy.

There are many initiatives underway to achieve that goal:

  • A new eco-designed chip that meets the heavy computational demands of deep neural network models with more energy-efficient processing.
  • Using innovative and carbon-efficient methods for training and running certain neural networks to avoid default configurations that may not be optimal from an environmental standpoint. Several versions of reinforcement learning environments have been developed
    which reduce run-times significantly.
  • The availability of “mother” models that have already been trained. Researchers at MIT developed an AI system that improved computational efficiency in some key ways, cutting the carbon emissions involved, in some cases, to the low triple digits of pounds.
  • Making and using tools for the automatic reporting of consumption and impact measurements available as open source. For instance, a group of researchers from the University of Montreal created a “Machine Learning Emissions Calculator” to estimate the environmental impact of training machine learning models. It takes into account the location and energy grid of the servers used for training, the length of training, and the make and model of the servers, to estimate the amount of carbon released into the atmosphere.11 (A simplified estimate in the same spirit is sketched after this list.)
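
An estimate in that spirit can be reproduced in a few lines. The sketch below multiplies hardware power draw, training time, and the carbon intensity of the local grid; every figure in it is an illustrative assumption rather than a value taken from the calculator itself.

```python
# Simplified emissions estimate (all figures are illustrative assumptions):
# energy = power draw x time, emissions = energy x grid carbon intensity.

gpu_power_watts = 300          # assumed average draw of one accelerator
num_gpus = 8                   # assumed size of the training cluster
training_hours = 72            # assumed length of the training run
grid_kgco2_per_kwh = 0.4       # assumed carbon intensity of the local grid

energy_kwh = (gpu_power_watts * num_gpus / 1000) * training_hours
emissions_kg = energy_kwh * grid_kgco2_per_kwh

print(f"Estimated energy use: {energy_kwh:.0f} kWh")
print(f"Estimated emissions: {emissions_kg:.0f} kg CO2e")
# Running the same job in a region with a low-carbon grid (e.g. 0.05
# kg CO2e/kWh) would cut the estimated emissions here by roughly 8x.
```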

As well as minding its own footprint, AI is a transformational technology that has the power to positively influence sustainable development. With an almost 80% positive impact on sustainable development goals,12 AI is a critical technology for implementing climate strategies that reduce organizations’ greenhouse gas emissions by an average of 13%.

This focus on positive outcomes for society as a whole reflects the importance of ethical AI. Key stakeholders – from governments to academic experts – agree on the need to adopt an ethical approach14 to trusted artificial intelligence. This means improving people’s lives while not exacerbating existing problems or creating new ones. In other words, the colossal power of artificial intelligence must be placed at the service of sustainable development.

This points to a reality where AI is used across the value chain to help companies achieve their sustainability goals: designing new environmentally responsible products, calculating carbon impact from resource extraction to distribution, optimizing logistics, improving energy efficiency at factories and warehouses, and reducing inefficiencies and waste through the promotion of recycling and the circular economy. The reality is that while AI has an energy impact, the opportunity for ethical and green AI to drive a sustainable future far outweighs any immediate cost.

Just as AI must be ethical, it must be sustainable.
