
Is AI the superhero or supervillain of the energy transition?

by Engagement Director and Growth Lead
by Engagement Manager

AI is widely regarded as the greatest technological advance of this generation, and in the climate space we're seeing the emergence of many AI-enabled solutions. However, there have been widespread reports over the last year regarding AI's high energy consumption, raising concerns that AI is becoming a leading contributor to the climate crisis.

Earlier this year, the International Energy Agency published its projections up to 2026 for worldwide electricity consumption associated with data centres, cryptocurrency, and artificial intelligence. In 2022, these three sectors consumed 460 TWh of electricity worldwide, almost 2% of total global electricity demand. The baseline projection for 2026 estimates that this demand will almost double, to just over 800 TWh. This increase is equivalent to the electricity consumption of an entire country, such as Sweden or Germany.

A secret supervillain?  

This is no surprise, given that one of the main drivers of current energy demand in this sector is computing (40%). Generally, AI's energy usage comes from one of two types of computing: training, where the model "learns" from the data fed into it, and inference, where the trained model responds to queries. Training compute, a metric commonly used by researchers as a proxy for the energy demands of AI, has increased by a factor of 10 billion since 2010 and currently doubles roughly every six months.

However, most of an AI workload comes from inference, the carbon footprint of which can vary by task. For example, according to researchers from Hugging Face and Carnegie Mellon, generating 1,000 image responses with AI is 62 times more energy intensive than generating 1,000 text responses. Strikingly, it takes roughly the same amount of energy to generate a single image as it does to charge a smartphone.

A tamed beast?   

The amount of training compute may decrease. Epoch AI, which produced the study on the historical increase in compute, has noted in a recent article that the growth of training compute is slowing, likely due to diminishing opportunities for improvement. There may also be less need to train models in the first place; so-called foundation models are trained on broad data sets such that they can be applied across a wide range of use cases, and then later optimised for more specific applications.

Beyond this, there is potential for more computationally efficient training of AI models. MIT's Lincoln Laboratory Supercomputing Center (LLSC) has developed numerous energy-saving techniques for data centres, all of which have minimal impact on performance. One technique, developed in collaboration with Northeastern University, stands out: a tool that uses the rate of learning to predict a model's likely final performance and stops the training of underperforming models early. This could save up to 80% of the energy spent on training, along with a significant amount of time practitioners spend waiting for their code to run.
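As a rough illustration of the principle (not the LLSC/Northeastern tool itself), the sketch below extrapolates a model's early learning curve and abandons runs that are unlikely to beat the best result seen so far. The power-law fit, the loss figures, and the margin are all assumptions made for the example:

```python
# Minimal sketch of learning-curve-based early stopping, in the spirit of the
# LLSC/Northeastern tool but not its actual implementation. Assumes validation
# loss roughly follows a power law, loss ~ a * epoch^b, over the epochs seen.
import numpy as np

def predict_final_loss(epochs, losses, final_epoch):
    """Fit loss ~ a * epoch^b to the observed prefix and extrapolate."""
    b, log_a = np.polyfit(np.log(epochs), np.log(losses), 1)
    return np.exp(log_a) * final_epoch ** b

def should_stop_early(epochs, losses, final_epoch, best_loss, margin=0.95):
    """Abort a run whose extrapolated final loss won't clearly beat the
    best fully-trained model seen so far."""
    return predict_final_loss(epochs, losses, final_epoch) > best_loss * margin

# Hypothetical losses from the first five epochs of a candidate model.
epochs = np.arange(1, 6)
losses = [2.3, 1.9, 1.7, 1.6, 1.55]
if should_stop_early(epochs, losses, final_epoch=100, best_loss=0.6):
    print("Unpromising run: stop now and save the remaining training energy.")
```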

Training is only one side of the story, given that most AI energy use comes from the ever-growing number of inferences made, and that GenAI outputs are becoming increasingly complex and thus energy demanding. LLSC researchers have also designed an optimiser that matches models with the most carbon-efficient mix of hardware, for example assigning low-power CPUs rather than high-power GPUs (two types of computational processing unit) to the less demanding aspects of inference, saving 10-20% of energy.
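The underlying idea can be sketched simply: for each inference job, pick the device that minimises energy while still meeting a latency target. The device names, power draws, and throughputs below are hypothetical, and this greedy rule is far simpler than the actual LLSC optimiser:

```python
# Minimal sketch of carbon-aware hardware assignment for inference. The device
# figures are hypothetical and chosen only to show the trade-off: a CPU can be
# cheaper in energy for small jobs, while a GPU is needed to meet tight deadlines.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    watts: float           # average power draw while serving this workload
    tokens_per_sec: float  # throughput for this workload

def pick_device(devices, tokens, latency_budget_s):
    """Choose the lowest-energy device that still meets the latency budget."""
    feasible = [d for d in devices if tokens / d.tokens_per_sec <= latency_budget_s]
    if not feasible:  # nothing meets the budget: fall back to the fastest device
        return max(devices, key=lambda d: d.tokens_per_sec)
    # Energy (J) = power (W) x time (s); minimise it over the feasible devices.
    return min(feasible, key=lambda d: d.watts * tokens / d.tokens_per_sec)

devices = [Device("low-power CPU", watts=30, tokens_per_sec=50),
           Device("high-power GPU", watts=400, tokens_per_sec=600)]
print(pick_device(devices, tokens=200, latency_budget_s=10).name)    # low-power CPU
print(pick_device(devices, tokens=20000, latency_budget_s=60).name)  # high-power GPU
```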

However, to fully rob our supervillain of its destructive appetite, we need to address some key myths within the industry, one of which is the idea that AI can solve all problems. We need to consider, for any given application, whether using AI is truly necessary or whether the same problem can be solved with a simpler method.

We also need to dispel the "bigger is better" approach to AI development. A key force driving competition and thus growth in AI is the overwhelming belief that more parameters and more data are essential for AI performance, yet designing smaller, more efficient models may be the way to reduce both training and inference energy costs, while maintaining quality of service.

A misunderstood superhero?

However, it could be argued that considering only the energy use of AI gives an incomplete picture of its sustainability, as using AI in certain applications can have a positive environmental impact. For example, smart design and monitoring of buildings can reap significant energy savings. This applies to old buildings as well as new ones: over the last 10 years, the Empire State Building, now almost 100 years old, has been retrofitted with smart sensors and an automated building management system. The aim of this endeavour was to reduce its energy consumption by 38% and to return savings of $4.4 million per year on energy; the building has consistently beaten its efficiency targets over this period.

In addition, several AI technologies are now optimising wind farms by monitoring turbine performance and modifying turbine settings such as the blade pitch and yaw angles. Within industry, there are also solutions using AI to predict changing process conditions and optimise maintenance events accordingly. In these and many other cases, AI is not used alone, but is combined with intelligent algorithms that define the underlying logic (often physics) – this combination is critical for effective operation.
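To give a flavour of this physics-plus-AI pattern, the sketch below uses a textbook physics approximation (turbine power loss scaling roughly with the cube of the cosine of the yaw misalignment) as a baseline, and flags turbines whose measured output falls well below it, which is where a learned model would take over for diagnosis. All turbine figures are invented for illustration:

```python
# Minimal sketch of the physics-plus-ML pattern for wind turbine monitoring.
# Uses the common cos^3 approximation for yaw-misalignment power loss; the
# tolerance and fleet data are hypothetical, for illustration only.
import math

def physics_expected_power(rated_kw, wind_frac, yaw_error_deg):
    """Physics baseline: power scales with the cube of the wind speed fraction
    and falls off roughly as cos^3 of the yaw misalignment angle."""
    return rated_kw * wind_frac**3 * math.cos(math.radians(yaw_error_deg))**3

def flag_underperformers(turbines, tolerance=0.9):
    """Flag turbines producing well below the physics prediction; a learned
    model would then diagnose likely causes (wake effects, blade wear, ...)."""
    return [t["id"] for t in turbines
            if t["measured_kw"] < tolerance * physics_expected_power(
                t["rated_kw"], t["wind_frac"], t["yaw_error_deg"])]

# Hypothetical fleet snapshot: T2's yaw error alone doesn't explain its low
# output, so it gets flagged for closer attention.
fleet = [{"id": "T1", "rated_kw": 3000, "wind_frac": 0.8, "yaw_error_deg": 5, "measured_kw": 1500},
         {"id": "T2", "rated_kw": 3000, "wind_frac": 0.8, "yaw_error_deg": 5, "measured_kw": 900}]
print(flag_underperformers(fleet))  # -> ['T2']
```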

A green giant?   

The companies that are building data centres en masse (Amazon, Google, Microsoft) all have strong decarbonisation targets. This means that they can act as heavyweights in shifting the industry to invest in early-stage technologies that decarbonise data centres, and to adopt them at scale. These include:

  1. Renewables – beyond wind and solar, there is also opportunity for tidal power, which provides a consistent, predictable energy source four times a day
  2. Long Duration Energy Storage – to enable data centres to run 24/7 in all seasons
  3. Thermal efficiency and reuse – minimising cooling energy requirements and putting waste heat to productive use
  4. Demand balancing – creating incentives to only train or run AI models when there is excess energy on the grid (a sketch of this idea follows this list)
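As a sketch of what such demand balancing could look like in practice (the carbon-intensity feed, threshold, and polling interval here are all hypothetical placeholders):

```python
# Minimal sketch of carbon-aware scheduling: hold a training job until grid
# carbon intensity falls below a threshold. The feed, threshold, and polling
# interval are all hypothetical placeholders.
import time

CARBON_THRESHOLD = 100.0  # gCO2/kWh: run only when the grid is relatively clean

def get_grid_carbon_intensity():
    """Placeholder: in practice, query a real-time grid signal (e.g. a grid
    operator's API or a service such as Electricity Maps)."""
    return 80.0  # dummy value so the sketch runs

def run_when_grid_is_clean(train_fn, poll_interval_s=900):
    """Poll the carbon signal and launch training only in a low-carbon window."""
    while get_grid_carbon_intensity() >= CARBON_THRESHOLD:
        time.sleep(poll_interval_s)  # grid is dirty: wait and re-check
    return train_fn()

run_when_grid_is_clean(lambda: print("Training started on a clean grid."))
```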

Looking even further up the supply chain, perhaps the company with the most potential to create change in this space is NVIDIA, which may hold as much as 98% of the data centre GPU market. With an industry leader of this scale backing these technologies, they could grow to a point where they become even more commercially viable in other industries as well.

Conclusion  

Modern narratives around AI can give us the impression that it is a ridiculously powerful, intangible entity – a superhero or supervillain. In reality, AI is neither friend nor foe: it is a tool, crafted and used by intelligent people around the world. This is not to take away from the consequences of AI's use or misuse for our lives and our planet – rather, what this article attempts to highlight is the responsibility we humans have to steer technological developments in the right direction, and to encourage those leading the sustainable growth of AI. To quote Spider-Man: with great power comes great responsibility.