Watt’s Up? Generative AI’s Disruption of Energy Demand

Generative AI models like ChatGPT are revolutionizing what is possible, but their energy needs risk undermining progress on climate change and clean-energy adoption. Data centres currently use only around 2% of global electricity, yet generative AI poses new challenges because its graphics processors are “energy addicts”: a single AI server rack can consume 40–60 kilowatts, compared with 10–15 kilowatts for a traditional one.

Training huge foundation models like GPT-4 also consumes enormous amounts of energy, and as applications proliferate, so too may electricity demand. The International Energy Agency estimates that data centres could double their energy use by 2026, reaching roughly the level of Japan’s total consumption. The chief executive of chip designer Arm has warned that AI data centres could consume a quarter of American power by 2030, up from less than 5% today.

This surge in demand coincides with economic growth, the spread of electric vehicles, and broader electrification efforts amid the transition to renewables. It is “not easy for utilities to build new renewable capacity quickly” because of supply-chain bottlenecks, high project costs, and lengthy approval times for new transmission lines.

Making GPUs more efficient will help, but greater efficiency may “simply stimulate more usage.” Hyperscalers could invest to ease utilities’ grid constraints, particularly at seasonal peaks when air conditioning drives up overall demand. Relying on temporary gas plants, however, risks “undermining the cloud providers’ climate commitments.”

If clean-energy supplies fall short, costs will rise both for generative AI’s expansion and for the broader electrification agenda. As The Economist’s Schumpeter columnist notes, “No one knows yet how generative AI will make money. What people do know is that the cost of acquiring GPUs is rocketing. If the energy costs of running them soar, too, it could put the brakes on expansion.” Careful policy is urgently needed.

By The Economist

Photo by Joel Filipe
