
Why frugal AI alone won't fix AI's energy problem

24.03.2025

A robot arrives on stage as Nvidia CEO Jensen Huang introduces new products during a keynote session at the SAP Center in San Jose, California, March 18. AFP-Yonhap

PARIS – While the rise of AI could revolutionize numerous sectors and unlock unprecedented economic opportunities, its energy intensity has raised serious environmental concerns. In response, tech companies promote frugal AI practices and support research focused on reducing energy consumption. But this approach falls short of addressing the root causes of the industry’s growing demand for energy.

Developing, training, and deploying large language models (LLMs) is an energy-intensive process that requires vast amounts of computational power. With the widespread adoption of AI driving a surge in data centers' electricity consumption, the International Energy Agency projects that AI-related energy demand will double by 2026.

Data centers already account for 1-2 percent of global energy consumption – roughly the same as the entire airline industry. In Ireland, data centers accounted for a whopping 21 percent of total electricity consumption in 2023. As industries and citizens shift toward electrification to reduce greenhouse-gas emissions, rising AI demand places enormous strain on both power grids and the energy market.

© The Korea Times