AI could consume as much electricity as the Netherlands by 2027. We need game-changing innovation to keep up with its growth.

 

SwissCognitive Guest Blogger: HennyGe Wichers, PhD – “AI Is Huge – And So Is Its Energy Consumption”


 


Replacing every Google search with an LLM interaction would use as much electricity as Ireland, Alex de Vries writes in a commentary published in the journal Joule on October 10, 2023. The PhD candidate at Vrije Universiteit Amsterdam is raising concerns about the environmental impact of artificial intelligence.

Large models like ChatGPT (OpenAI), Bard (Google), and Claude (Anthropic) consume a lot of energy. Take, for example, ChatGPT. The chatbot is built on a model called GPT-3, which used an estimated 1,287 MWh of electricity during its training phase. According to the chatbot, that’s the equivalent of driving a Tesla Model 3 around the Earth’s equator 21,500 times.


ChatGPT puts 1,287 MWh in perspective

ChatGPT suggested using household energy consumption for scale first. It was a good idea, but its answer was clearly wrong: 1,287 MWh cannot power 1,468 American homes for a year yet only 122 for a month – the same energy should supply twelve times as many homes for a month as for a year, not fewer. The machine politely apologised when I pointed this out.

That’s nice, but I no longer trusted the AI and had to verify its second suggestion – with a Google search. For now, LLMs are probably increasing rather than replacing traditional search traffic. But I digress.
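
Out of curiosity, I ran the numbers myself. The sketch below rests on two assumptions that aren’t in the commentary – an average American home consuming roughly 10,600 kWh a year (the EIA’s figure) and a Model 3 using about 15 kWh per 100 km – so treat it as a back-of-the-envelope check rather than gospel.

```python
# Sanity-check ChatGPT's two conversions of GPT-3's 1,287 MWh training energy.
TRAINING_KWH = 1_287 * 1_000           # 1,287 MWh expressed in kWh

US_HOME_KWH_PER_YEAR = 10_600          # assumed average annual US household use
homes_per_year = TRAINING_KWH / US_HOME_KWH_PER_YEAR
print(f"Homes powered for a year: {homes_per_year:,.0f}")    # ~121, not 1,468

MODEL3_KWH_PER_KM = 0.15               # assumed Model 3 consumption
EQUATOR_KM = 40_075                    # length of the Earth's equator
laps = TRAINING_KWH / (MODEL3_KWH_PER_KM * EQUATOR_KM)
print(f"Equator laps in a Model 3: {laps:,.0f}")             # ~214, not 21,500
```

Under those assumptions, 1,287 MWh powers roughly 120 homes for a year – or about 1,460 for a month, which suggests ChatGPT simply swapped its two figures – and takes a Model 3 around the equator closer to 214 times than 21,500. More grist for the scepticism above.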



It’s well-known that training an LLM uses a lot of energy, yet that’s only the first step in the process. The second step is inference, a phase few of us have heard of but many of us have participated in.

For ChatGPT, inference began when it was launched: every query from the public prompts the trained model to generate a live response. The chatbot’s estimated energy consumption during this phase was 564 MWh daily. That’s close to half the electricity consumed in training (44%) – but used every single day.
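
Put differently, daily inference overtakes the one-off training cost almost immediately. A quick calculation using only the two figures above:

```python
# Compare ChatGPT's estimated daily inference energy with GPT-3's training total.
DAILY_INFERENCE_MWH = 564   # estimated inference consumption per day (article)
TRAINING_MWH = 1_287        # estimated one-off training consumption (article)

print(f"One day of inference: {DAILY_INFERENCE_MWH / TRAINING_MWH:.0%} of training")
print(f"Inference matches training after {TRAINING_MWH / DAILY_INFERENCE_MWH:.1f} days")
```

After a little over two days of operation, the chatbot had consumed as much electricity answering queries as it had in its entire training run.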

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” de Vries commented in an interview.

ChatGPT exploded as soon as it launched, registering an incredible 100 million users in just two months and igniting a chain reaction of artificial intelligence products. Unfortunately, we can’t skip the inference phase for new AIs because, without it, the model would never answer a single user query.

From 2019 to 2021, a whopping 60% of Google’s AI-related energy bill was for inference. But progress is being made. Hugging Face, a relative newcomer founded in 2016, developed the open-source alternative BLOOM, which uses significantly less energy in the inference phase relative to its training phase.

Let’s take a look at the energy consumption of a single user request. Comparing the different systems, the graph looks as follows.


Fig 1: Energy consumption per user request (replicated from Joule)

The last two bars show estimates for AI-powered Google Search from two independent research firms, New Street Research and SemiAnalysis. It’s very costly, at 20 to 30 times the energy of a regular Google search. That’s not an immediate problem, however, because NVIDIA can’t supply the hardware required.
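
To put that multiple in absolute terms: Google’s last published figure for a standard search was about 0.3 Wh per request – a 2009 number and my own assumption here, not one from the commentary. Scaling it by the factor above gives a rough per-request range:

```python
# Rough absolute scale for AI-powered search, assuming ~0.3 Wh per standard
# Google search (Google's 2009 figure) and the 20-30x factor reported above.
STANDARD_SEARCH_WH = 0.3
low, high = 20 * STANDARD_SEARCH_WH, 30 * STANDARD_SEARCH_WH
print(f"AI-powered search: ~{low:.0f}-{high:.0f} Wh per request")  # ~6-9 Wh
```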

HARDWARE CONSTRAINTS

Google would need 512,821 of NVIDIA’s AI servers to make every search an LLM interaction. That’s more than five times the company’s expected production for 2023, when it should deliver around 100,000 servers. The gap is enormous. Moreover, NVIDIA holds around 95% market share, so no alternative supplier exists today.
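
That server count also explains the Ireland comparison in the opening paragraph. Assuming each server draws roughly 6.5 kW – NVIDIA’s spec for a DGX A100 system, an assumption on my part rather than a figure from the article – a quick sketch gives:

```python
# The hardware gap, and what the full fleet would consume if it existed.
SERVERS_NEEDED = 512_821      # servers for LLM-powered Google Search (article)
SERVERS_PER_YEAR = 100_000    # NVIDIA's expected 2023 output (article)
KW_PER_SERVER = 6.5           # assumed draw, per NVIDIA's DGX A100 spec

print(f"Years of production required: {SERVERS_NEEDED / SERVERS_PER_YEAR:.1f}")
twh_per_year = SERVERS_NEEDED * KW_PER_SERVER * 8_760 / 1e9   # kWh -> TWh
print(f"Fleet consumption, run flat out: ~{twh_per_year:.0f} TWh/year")
```

At around 29 TWh a year, the hypothetical fleet lands almost exactly on Ireland’s annual electricity consumption.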

Chips are an issue, too. NVIDIA’s chip supplier, TSMC, is struggling to expand its chip-on-wafer-on-substrate (CoWoS) packaging technology, which is essential for the chips NVIDIA needs. TSMC is investing in a new plant, but it will only begin producing at volume in 2027. By then, NVIDIA could face demand for 1.5 million of its AI servers.
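
Carrying the same per-server assumption forward to that projected demand shows where the headline figure comes from:

```python
# The 2027 scenario: 1.5 million AI servers at an assumed ~6.5 kW each.
SERVERS_2027 = 1_500_000
KW_PER_SERVER = 6.5
twh = SERVERS_2027 * KW_PER_SERVER * 8_760 / 1e9   # kWh -> TWh
print(f"~{twh:.0f} TWh/year")   # ~85 TWh
```

Roughly 85 TWh a year is the order of magnitude that invites comparison with the annual electricity consumption of a country like the Netherlands.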

Hardware will remain a bottleneck for several more years. Still, without hardware constraints, we’d encounter problems further upstream. Building permits for data centres take time, and so does construction. Not to mention, the energy grid needs to expand to deliver the electricity required to run and cool them.

INNOVATION

But these constraints will drive innovation. We will find more efficient models and ways to operationalise them. A breakthrough in quantum computing could change everything, for both the supply of and the demand for AI.

De Vries points out, “The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it.”

He’s referring to Jevons’ Paradox, which occurs when an increase in efficiency causes costs to fall and demand to increase to the point where we use the tools more than we would have without the improvement.

The paradox was formulated in 1865 and has been observed many times since. LED lighting is a nice example: running LEDs is so cheap that we’ve covered the planet with them and now use more electricity for lighting than ever before. If that is hard to imagine, just take a look at the Sphere in Las Vegas.


How amazing is that? But we wouldn’t have built it if innovation hadn’t graduated to LEDs. AI might follow a similar path. Nevertheless, without game-changing innovation to reduce energy use and secure a sustainable supply, we must think twice about where we deploy it.

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy-intensive, so we don’t want to put it in all kinds of things where we don’t actually need it,” de Vries offers as a parting thought.

Still, the very AI that poses the risk may also help us solve climate change and sustainability challenges.

Source: Joule via EurekAlert!


About the Author:

HennyGe Wichers is a technology science writer and reporter. For her PhD, she researched misinformation in social networks. She now writes more broadly about artificial intelligence and its social impacts.