Ultra-efficient AI won’t solve data centers’ climate problem. This might.
Despite DeepSeek’s AI efficiency gains, data centers are still expected to gobble up huge amounts of U.S. electricity.
https://www.washingtonpost.com/climate-solutions/2025/02/07/deepseek-ai-efficiency-climate-change/
Inside internal operations at a data center in Ashburn, Virginia. (Amanda Andrade-Rhoades/for The Washington Post)
When Chinese AI start-up DeepSeek announced a chatbot that matched the performance of cutting-edge models such as ChatGPT with a fraction of the computing power, it sparked a glimmer of hope that AI might be less of an energy hog than people had feared.
But making AI cheaper and more efficient could just prompt people to use more AI, meaning data centers will still wind up using a lot more electricity, according to computer scientists, energy experts and tech investors.
“People ask, ‘Can we just forget about this? Is AI still an energy problem?’” said Vijay Gadepally, a senior scientist at the MIT Lincoln Laboratory who studies ways to make AI more sustainable. “The answer is a resounding, ‘Yes, it is.’”
One solution would be for tech companies to use cleaner energy, Gadepally said. In the United States, most electricity is still generated by fossil fuels, but data centers could monitor local power grids and slow down when power plants are burning the dirtiest fuels. Tech companies, including Google and Microsoft, have started experimenting with this idea.
Why efficiency won’t solve AI’s energy problem
The data centers that train and run AI algorithms can use as much electricity as small cities — and tech companies are racing to build thousands more in the next few years, prompting power companies to burn more planet-warming fossil fuels to keep up.
Even though AI has become more efficient over time, its energy use has only gone up.
This phenomenon has played out before. In the 19th century, economist William Stanley Jevons documented how England’s coal consumption shot up after the invention of a steam engine that used less coal. The new technology made running a coal-powered engine so cheap that companies all over England started doing it, creating even more demand for the fuel.
Since DeepSeek unveiled its extra-efficient chatbot this month, tech and energy experts — including Microsoft CEO Satya Nadella — have been citing the Jevons paradox to argue that efficiency gains won’t reduce AI’s overall energy use. Microsoft is the biggest investor in ChatGPT-maker OpenAI.
Regardless of whether that prediction comes true, the tech CEOs behind the data center boom show no signs of slowing planned construction — including Project Stargate, an OpenAI-led plan to invest $500 billion in up to 20 new data centers over the next four years.
Plans for the first Stargate data center campus include a 360-megawatt natural gas power plant, which could produce enough electricity for up to 170,000 average U.S. homes and as much planet-warming pollution as about 75,000 cars, based on industry averages from the Energy Information Administration and the U.S. Environmental Protection Agency.
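As a rough illustration of that arithmetic, here is a back-of-envelope sketch in Python. Every constant is an assumption drawn from commonly cited EIA and EPA averages, not the exact inputs behind the figures above, so the output is a ballpark rather than a reproduction of those numbers.

```python
# Back-of-envelope sketch: what a 360-megawatt natural gas plant could mean
# in homes powered and cars' worth of CO2. All constants are assumptions
# based on commonly cited EIA/EPA averages, not the article's exact inputs.

PLANT_CAPACITY_MW = 360        # Stargate's planned natural gas plant
CAPACITY_FACTOR = 0.55         # assumed share of the year it runs at full power
HOURS_PER_YEAR = 8760

HOME_KWH_PER_YEAR = 10_500     # assumed average U.S. household use (EIA-style figure)
GAS_KG_CO2_PER_KWH = 0.41      # assumed emissions of gas generation (EPA-style figure)
CAR_TONNES_CO2_PER_YEAR = 4.6  # assumed typical passenger car (EPA-style figure)

annual_kwh = PLANT_CAPACITY_MW * 1_000 * HOURS_PER_YEAR * CAPACITY_FACTOR
homes = annual_kwh / HOME_KWH_PER_YEAR
tonnes_co2 = annual_kwh * GAS_KG_CO2_PER_KWH / 1_000
cars = tonnes_co2 / CAR_TONNES_CO2_PER_YEAR

print(f"~{annual_kwh / 1e9:.2f} billion kWh per year")
print(f"~{homes:,.0f} homes' worth of electricity")
print(f"~{tonnes_co2:,.0f} tonnes of CO2, about {cars:,.0f} cars' worth")
```

Changing the assumed capacity factor or emissions factor moves these results substantially, which is why such comparisons are always approximate.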
Data centers already gobble up more than 4 percent of all U.S. electricity, according to the Lawrence Berkeley National Laboratory. Even if more efficient AI keeps data center construction at the slower end of experts’ predictions, the facilities would still devour more power over the next few years.
“Whatever we do, energy usage is likely going to go up,” Gadepally said. “That train has left the station.”
Cutting AI’s carbon emissions
The biggest tech companies — including Amazon, Google and Microsoft — pay for clean-energy credits to offset their energy use and invest in green-power projects, such as restarting the Three Mile Island nuclear plant, developing a new kind of geothermal energy or building fusion reactors. But, for the most part, they still use the same electricity as everyone else, which often comes from fossil fuels.
Tech companies could work around that by slowing down their data centers at moments when there’s a lot of fossil fuel energy on the local grid, or shifting the work to data centers in parts of the world where renewable energy is more plentiful at that hour. They could also use more powerful, energy-hungry versions of their AI models when the grid is clean and less powerful models when it’s dirty.
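Here is what that scheduling logic might look like, in a minimal Python sketch. The grid_carbon_intensity function is a hypothetical stand-in for a live signal from a service such as WattTime or Electricity Maps, and the threshold is an arbitrary illustrative cutoff, not any company’s actual policy.

```python
import random

CARBON_THRESHOLD_G_PER_KWH = 300  # assumed cutoff; a real system would tune this per grid

def grid_carbon_intensity(region: str) -> float:
    # Hypothetical stand-in for a live feed from a service such as
    # WattTime or Electricity Maps; here we just simulate a reading.
    return random.uniform(100, 600)  # grams of CO2 per kWh

def pick_model(region: str) -> str:
    """Route to the accurate, energy-hungry model only when the grid is clean."""
    if grid_carbon_intensity(region) < CARBON_THRESHOLD_G_PER_KWH:
        return "state_of_the_art"  # best accuracy, highest energy draw
    return "older_lighter"         # slightly worse accuracy, far less energy

if __name__ == "__main__":
    for hour in range(6):
        print(hour, pick_model("PJM"))  # "PJM" is just an example region label
```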
In a 2023 study, Gadepally tested the idea on an image recognition AI model. When the sun was shining and the wind was blowing, Gadepally used a state-of-the-art version of the AI. But when the grid used more fossil fuels, he switched to an older version that performs slightly worse but hogs less energy. He compared the performance of this green approach with what would happen if he used just the energy-hogging state-of-the-art system all the time.
Over the course of two days, the green AI cut carbon emissions 80 percent compared with the standard version — and its scores on image recognition tests were only 3 percent worse. He ran a similar test using a language model last year and found that the green version cut carbon emissions 40 percent with no difference in performance.
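To make the shape of that comparison concrete, here is an illustrative simulation in the same spirit. The per-task energy figures and hourly grid intensities below are invented placeholders, not numbers from Gadepally’s experiments.

```python
# Illustrative re-creation of the study's comparison, with made-up numbers:
# always run the big model (baseline) vs. switch to a lighter model when
# the grid is dirty (green). One task per hour over a simulated day.

LARGE_MODEL_KWH_PER_TASK = 0.010  # assumed energy per task, big model
SMALL_MODEL_KWH_PER_TASK = 0.002  # assumed energy per task, lighter model
THRESHOLD = 300                   # g CO2/kWh dividing "clean" from "dirty" hours

# Assumed hourly grid carbon intensity over one day (g CO2/kWh).
hourly_intensity = [450, 420, 400, 380, 350, 300, 250, 200,
                    180, 170, 160, 170, 190, 220, 260, 300,
                    350, 400, 450, 480, 500, 490, 470, 460]

baseline = green = 0.0
for g in hourly_intensity:
    baseline += LARGE_MODEL_KWH_PER_TASK * g  # always run the big model
    kwh = LARGE_MODEL_KWH_PER_TASK if g < THRESHOLD else SMALL_MODEL_KWH_PER_TASK
    green += kwh * g                          # switch when the grid is dirty

print(f"baseline: {baseline:.1f} g CO2 over the simulated day")
print(f"green:    {green:.1f} g CO2 ({100 * (1 - green / baseline):.0f}% less)")
```

The real study measured actual hardware energy draw against live grid data, but the accounting reduces to the same comparison: energy per task multiplied by the grid’s carbon intensity at the moment the task runs.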
Companies could put a dent in AI’s carbon emissions using these strategies, said Benjamin Lee, an electrical and systems engineering professor at the University of Pennsylvania who was not involved in the research — but only if they were willing to accept that their AI might perform slightly worse or take more time to train.
“The challenge is not implementing these techniques but rather convincing AI companies and users that some accuracy loss … is worth the carbon savings,” he said.