Green Coding and IT Energy Consumption: The Energy-Intensiveness of Artificial Intelligence

Green coding series Part 4
You can find part 1 here, part 2 here & part 3 here.

Welcome back to our ongoing exploration of green coding practices and their impact on IT energy consumption. 

In our previous post, we explored various forms of waste and their impact on energy consumption in software development. We examined issues ranging from redundant software and improper use to algorithmic inefficiencies and inefficient programming languages. 

In this fourth installment, we delve into the world of Artificial Intelligence (AI), which has been making significant strides in various fields. However, an often-overlooked aspect of AI is its substantial energy consumption.

The AI Revolution: Expectations and Realities

In 2022, AI stepped into the limelight, with user-friendly AI models like ChatGPT and DALL-E making AI more accessible to a broader audience. This accessibility opened the floodgates of expectations regarding AI’s business impact. Predictions abound, with Gartner forecasting that by 2024, a whopping 40% of enterprise applications will integrate generative AI capabilities, and by 2027, 15% of applications will be authored by AI itself.

But beneath the glossy surface of AI’s transformative potential lies a less discussed concern – its substantial energy consumption. Just like any other software, AI requires energy to operate. However, quantifying this energy use isn’t straightforward, as AI developers rarely disclose it publicly. AI applications are often cloud-based, making it challenging to pinpoint their real energy consumption behind the curtain of service billing.

Estimates suggest that AI contributes significantly to the energy consumption of tech giants. For instance, it’s believed that AI operations account for 10-15% of Google’s electricity usage. In 2021, Google consumed a massive 18.3 terawatt-hours of electricity, implying AI consumption ranging from 1.8 to 2.7 terawatt-hours, more than a month’s output of Europe’s most powerful nuclear reactor, Olkiluoto Unit 3 in Finland.

It’s vital to note that these figures are rough estimates with broad margins of error, underscoring the need to address AI’s energy consumption proactively.
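
As a back-of-envelope check, the range above follows directly from the quoted share and Google’s reported 2021 consumption. Here is a minimal Python sketch, using only the figures cited in this post (estimates, not measured or official values):

```python
# Back-of-envelope check of the AI share of Google's 2021 electricity use.
# Both inputs are the estimates quoted above, not measured figures.
GOOGLE_2021_TWH = 18.3                     # reported total electricity consumption (TWh)
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15   # assumed share attributed to AI workloads

low_twh = GOOGLE_2021_TWH * AI_SHARE_LOW    # ~1.8 TWh
high_twh = GOOGLE_2021_TWH * AI_SHARE_HIGH  # ~2.7 TWh
print(f"Estimated AI electricity use: {low_twh:.1f}-{high_twh:.1f} TWh per year")
```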

The Anatomy of AI Energy Consumption

Understanding AI’s energy usage involves breaking it down into three phases:

Compiling and structuring training data: AI requires curated training data for learning. Collecting, organising, and cleaning this vast amount of data demands automation, which, in itself, consumes energy.

Teaching AI: The volume and format of training data affect energy consumption during AI’s learning phase. For example, training AI on video or image-based data consumes more energy per unit of information than text-based data. The model’s size and complexity also impact energy efficiency. For instance, GPT-3 with 175 billion parameters consumed an estimated 1,287 megawatt-hours (MWh) during training, emitting 552 tons of carbon dioxide.

Energy use during AI usage: Like other software, AI consumes energy when performing tasks. A single ChatGPT query, for example, is estimated to consume 1.7 to 2.6 watt-hours (Wh). With around ten million daily queries, daily energy consumption reaches 17 to 26 megawatt-hours (MWh). Additionally, loading the AI model into server memory requires substantial energy, considering the model’s size, which can be as much as 800 gigabytes. The arithmetic behind these training and usage figures is sketched below.
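
To make these numbers concrete, here is a small Python sketch that reproduces the arithmetic behind the training and usage estimates above. All inputs are the published estimates quoted in this post, not measurements of our own:

```python
# Rough arithmetic behind the GPT-3 training and ChatGPT usage estimates above.

# Training: GPT-3 (175 billion parameters)
TRAINING_ENERGY_MWH = 1_287   # estimated training energy (MWh)
TRAINING_EMISSIONS_T = 552    # estimated emissions (tonnes of CO2)
implied_intensity = (TRAINING_EMISSIONS_T * 1_000) / (TRAINING_ENERGY_MWH * 1_000)
print(f"Implied grid carbon intensity: {implied_intensity:.2f} kg CO2 per kWh")  # ~0.43

# Usage: ChatGPT queries
WH_PER_QUERY = (1.7, 2.6)     # estimated energy per query (Wh)
QUERIES_PER_DAY = 10_000_000
daily_mwh = [wh * QUERIES_PER_DAY / 1_000_000 for wh in WH_PER_QUERY]  # Wh -> MWh
print(f"Daily inference energy: {daily_mwh[0]:.0f}-{daily_mwh[1]:.0f} MWh")  # 17-26 MWh
```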

In summary, AI solutions can be energy-intensive, and their consumption is often hidden within cloud services. Substantial differences exist in energy consumption among AI models and solutions, emphasising the need for informed decisions.

Navigating the Green AI Landscape

Given AI’s potential to drastically increase an application’s energy consumption, it’s crucial to approach its usage with caution. Here are some recommendations to steer through the green AI landscape:

Assess the necessity of AI: Determine whether AI is genuinely necessary to solve a particular problem. Many issues can be resolved using traditional methods, and opting for simpler solutions can reduce energy consumption.

Define constraints: If AI is indeed required, set clear constraints on its usage. Balance business needs, computational accuracy, and energy consumption to strike the right equilibrium.

Choose the right AI model: Select the AI model that best suits your specific requirements. If it’s a cloud-based solution, opt for data centres powered by clean energy sources.

Mind the training data: Ensure the training data aligns with business needs, minimises biases, and is curated efficiently to reduce energy consumption.

Optimise model configuration: Experiment with different parameters to find the right balance between computational efficiency and results; a measurement sketch follows this list.
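
To support the last two recommendations, energy and emissions can be measured rather than guessed. Below is a minimal sketch, assuming the open-source codecarbon package is installed and using a hypothetical train_model() placeholder that stands in for your own workload:

```python
# Sketch: compare the estimated emissions of two model configurations with codecarbon.
# train_model() is a hypothetical placeholder; replace it with your actual training
# or inference routine. Figures reported by codecarbon are estimates, not meter readings.
from codecarbon import EmissionsTracker

def train_model(num_layers: int, epochs: int) -> None:
    """Hypothetical training routine standing in for a real workload."""
    pass

configurations = [
    {"num_layers": 4, "epochs": 3},    # smaller, cheaper model
    {"num_layers": 12, "epochs": 3},   # larger, more accurate model
]

for config in configurations:
    tracker = EmissionsTracker(project_name=f"model-{config['num_layers']}-layers")
    tracker.start()
    train_model(**config)
    emissions_kg = tracker.stop()      # estimated kg of CO2-equivalent for this run
    print(f"{config}: ~{emissions_kg:.3f} kg CO2e")
```

Comparing configurations this way turns “balance business needs, computational accuracy, and energy consumption” from a principle into a measurable trade-off.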

Lastly, let’s share our experiences and knowledge about AI usage and its energy consumption. The scarcity of information on this vital topic highlights the urgency of building a collective repository of best practices.

In our next blog post, we’ll journey into the world of blockchain technology and cryptocurrencies, exploring their captivating yet energy-intensive nature. We’ll uncover the intricacies of blockchain, its decentralised structure, and the environmental impact of cryptocurrencies. Stay tuned as we dive into this fascinating, energy-hungry realm.

Thoughts by

Janne Kalliola

Chief Growth Officer

24.11.2023
