Let’s not panic about AI’s energy use just yet
Consider the transistor, the basic unit of computer processors. Transistors can be tiny, down to single-digit nanometers in size. Billions can fit on a computer chip.
Though they have no moving parts, they devour electricity as they store and modify bits of information. “Ones and zeros are encoded as these high and low voltages,” said Timothy Sherwood, a computer science professor at the University of California, Santa Barbara. “When you do any computation, what’s happening inside the microprocessor is that there’s some one that transitions to a zero, or a zero that transitions to one. Every time that happens, a little bit of energy is used.”
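To put rough numbers on that “little bit of energy”: the standard textbook model (not a figure from the article) treats each transition as charging or discharging a tiny capacitance. The capacitance and voltage below are illustrative assumptions, not measurements of any particular chip.

```latex
% Textbook CMOS dynamic-switching model.
% C (gate capacitance) and V (supply voltage) are illustrative assumptions.
E_{\text{switch}} \approx \tfrac{1}{2} C V^2
\qquad \text{e.g.} \qquad
\tfrac{1}{2}\,(1\,\text{fF})\,(0.7\,\text{V})^2 \approx 0.25\,\text{fJ}
```

A quarter of a femtojoule is almost nothing, but billions of transistors switching billions of times per second multiply it into real power draw.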
Add that up across the billions of transistors on each chip, and then the billions of chips in computers and server farms, and computing claims a significant and growing share of humanity’s energy appetite.
According to the International Energy Agency, computing and data storage currently account for somewhere between 1 and 1.5 percent of global electricity demand.
With the growth of artificial intelligence and cryptocurrencies that rely on industrial-scale data centers, that share is poised to grow. For instance, a typical Google search uses about 0.3 watt-hours, while a ChatGPT query consumes 2.9 watt-hours, nearly ten times as much. In 2024, the amount of data center capacity under construction in the US jumped 70 percent compared to 2023. Some of the tech companies leaning into AI have seen their greenhouse gas emissions surge and are finding it harder to meet their own environmental goals.
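Those per-query figures lend themselves to a back-of-the-envelope comparison. The sketch below uses only the numbers quoted above; the daily query volume is a made-up illustrative parameter, not a reported figure.

```python
# Back-of-the-envelope comparison of per-query energy use.
# The per-query figures (0.3 Wh, 2.9 Wh) come from the text above;
# the daily query volume is a purely illustrative assumption.

GOOGLE_SEARCH_WH = 0.3   # watt-hours per typical Google search (per the article)
CHATGPT_QUERY_WH = 2.9   # watt-hours per ChatGPT query (per the article)

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT query uses about {ratio:.1f}x the energy of a Google search")

# Hypothetical volume: 100 million queries per day (assumed, not from the article).
queries_per_day = 100_000_000
daily_mwh = queries_per_day * CHATGPT_QUERY_WH / 1_000_000  # Wh -> MWh
print(f"At {queries_per_day:,} queries/day, that is about {daily_mwh:,.0f} MWh per day")
```

At those assumed volumes, the per-query difference compounds into hundreds of megawatt-hours a day, which is why the data center buildout matters.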
How much more electricity will this computation need?
