Regulating AI use could stop its runaway energy expansion
Generative AI promises to help solve everything from climate change to poverty. But behind every chatbot response lies a deep environmental cost.
Current AI technology relies on large datacentres stationed around the world, which together draw enormous amounts of power and consume millions of litres of water to stay cool. By 2030, datacentres are expected to consume as much electricity as all of Japan, according to the International Energy Agency, and AI could account for 3.5% of global electricity use, according to one consultancy report.
Continued rapid expansion of AI use, and the growing energy demand that comes with it, would make it much harder for the world to cut carbon emissions by switching from fossil fuels to renewable electricity.
So, we are left with pressing questions. Can we harness the benefits of AI without accelerating environmental collapse? Can AI be made truly sustainable – and if so, how?
We are at a critical juncture. The environmental cost of AI is rising fast and goes largely unreported by the firms involved. What the world does next could determine whether AI innovation aligns with our climate goals or undermines them.
At one end of the policy spectrum is the path of complacency. In this scenario, tech companies continue unchecked, expanding datacentres and powering them with private nuclear microreactors, dedicated energy grids or even reviving mothballed coal plants.
Some of this infrastructure may instead run on renewables, but there’s no binding requirement that AI must avoid using fossil fuels. Even if more renewables are installed to power AI, they........
© The Conversation
