How AI Is Fueling a Boom in Data Centers and Energy Demand


While AI could change the world in many unforeseen ways, it’s already having one massive impact: a voracious consumption of energy. Generative AI does not simply float upon ephemeral intuition. Rather, it gathers strength via thousands of computers in data centers across the world, which run constantly at full blast. In January, the International Energy Agency (IEA) forecast that global data center electricity demand will more than double from 2022 to 2026, with AI playing a major role in that increase.

AI industry insiders say the world has plenty of energy capacity to absorb this increased demand, and that technological efficiency improvements could offset these increases. But others are skeptical about the industry’s promises, and worry its aggressive consumption of fossil fuel energy sources will hamper collective efforts to combat climate change. 

Here’s how we arrived at this point, and what the rise of data centers could mean for communities on a local and global level. 

What are data centers and why are they expanding so rapidly? 

When someone’s phone data gets backed up to the “cloud,” it really gets stored in data centers: massive facilities filled with thousands of computer servers running on a constant basis. In the era of 5G and cloud-based storage, data centers have become essential infrastructural cogs, supporting everything from financial transactions, to social media, to government operations. Data centers need a continuous and stable supply of energy to operate. They now account for more than 1% of global electricity use, according to the IEA.

Data centers were already vastly increasing in number before AI. The bitcoin mining industry played a role in this increase: a report from the Energy Information Administration found that bitcoin mining was responsible for 2% of total U.S. electricity demand in 2023.

But the tech industry’s hard pivot into AI has escalated data center construction and usage even more dramatically. That’s because training AI models is extremely energy-intensive, burning through power at a far higher rate than traditional data center activities. A ChatGPT query, for example, uses ten times more energy than a standard Google query, says David Porter, a vice president at the Electric Power Research Institute. Porter says that while 10-20% of data center energy in the U.S. is currently consumed by AI, that percentage will likely “increase significantly” going forward.

This energy usage has been exacerbated by the stiff competition between major tech companies, who are racing to build more powerful generative AI models. Researchers recently found that the cost of the computational power required to train these models is doubling every nine months, with no slowdown in sight. As a result, the IEA predicts that in two years, data centers could consume the same amount of energy as Sweden or Germany. Relatedly, researchers at UC Riverside estimated that global AI demand could cause data centers to consume over 1 trillion gallons of fresh water by 2027. 

Despite these staggering figures, the exact energy consumption of many AI models remains opaque. “For the longest time, companies like Google and Meta were pretty transparent: up until ChatGPT came out, quite honestly,” says Sasha Luccioni, an AI researcher and the climate lead at the AI platform Hugging Face. “In the last year and a half, they’ve gotten a lot more secretive about data sources, training time, hardware, and energy.”


Endangering climate goals and energy infrastructure 

This rapid increase in energy usage threatens to derail the climate pledges that major tech companies set for themselves before the AI hype cycle. In 2020, for example, Google set a goal to run on carbon-free energy 24/7 by 2030. Microsoft made a similar pledge the same year, vowing to become carbon negative in a decade. 

But last year, Microsoft took a massive step backwards in that regard by increasing its greenhouse gas emissions by 30%, mostly due to its ambitious AI pursuits. “In 2020, we unveiled what we called our carbon moonshot. That was before the explosion in artificial intelligence,” Microsoft president Brad Smith told Bloomberg. “So in many ways the moon is five times as far away as it was in 2020, if you just think of our own forecast for the expansion of AI and its electrical needs.”

Microsoft, which has invested billions in OpenAI, has spent more than $10 billion in recent quarters on cloud-computing capacity and is planning to double its data center capacity. In Goodyear, Arizona, which faces a water shortage, Microsoft’s data centers are expected to consume more than 50 million gallons of drinking water every year.

Some locales are pushing back against data center construction. There is currently a de facto moratorium against them in Dublin, as they already consume nearly a fifth of Ireland’s electricity. But data centers are ramping up massively elsewhere—including Northern Virginia, colloquially known as Data Center Alley. Just outside of Washington, D.C., historically residential tracts of land are quickly being rezoned as industrial to make room for data centers, drawing the ire of local residents. Because it is cheaper for companies to build data centers in places with robust power sources and existing infrastructure, many of them cluster together.

Data centers are now the “number one issue we hear about from our constituents,” says Ian Lovejoy, a Republican state delegate in Virginia. Aside from quality of life concerns, he says that local politicians and residents are worried about data centers threatening electricity and water access, as well as the idea that taxpayers may have to foot the bill for future power lines. 

“Today, there isn't enough power generation capacity or transmission capacity to fuel the data centers that are in the pipeline,” Lovejoy says.  “Everyone's just kind of gambling that we will build up the infrastructure and everything will be fine. But if the data centers outpace power generation, you could see brownouts, and a real constrained power situation growing.” 

Of the 8,000 data centers that exist globally, about a third are in the U.S., compared to 16% in Europe and almost 10% in China. The Hong Kong-based think tank China Water Risk estimates that data centers in China consume 1.3 billion cubic meters of water per year—nearly double the volume that the city of Tianjin, home to 13.7 million people, uses for households and services. 

Potential solutions

The very companies consuming all of this energy say they are working on solutions. Many AI boosters argue that their technology will be crucial to combating climate change, in part by rooting out inefficiencies. In 2016, for example, Google announced that its DeepMind AI had helped reduce the company’s data center cooling energy usage by “up to 40 percent.”

Efficiency improvements in chip hardware could also have a big impact in reducing energy usage. This year, NVIDIA rolled out a new line of GPUs (graphics processing units) with 25 times lower energy consumption than its previous models. However, Luccioni believes that any hardware efficiency gains may be offset by the Jevons paradox, named after a 19th-century British economist who noticed that as steam engines became more efficient, Britain’s appetite for coal actually increased. “The more a resource becomes more efficient, the more people will use it,” she says.

Industry insiders also argue that they are leading the way in the energy transition, by spurring the construction of renewable energy sources, particularly in wind, solar and nuclear. But many new data centers are still powered by fossil fuels, especially because it takes a while for new energy sources to come online. “Unfortunately, a lot of renewables don’t meet the needs of high speed compute, in their demand for consistent and high quality energy,” says Paul Prager, the CEO of the bitcoin mining company Terawulf, which is also expanding into AI. “And you can’t snap your fingers and have a power plant tomorrow. I think we’re in for a near term period over the next three to four years where we’ll see energy costs escalate a little bit on a local level.”  

Regulation may be on the way. Singapore announced a sustainability standard for data centers in tropical countries last year, and the European Commission moved towards regulating data center sustainability across the EU. In the U.S., Massachusetts Senator Ed Markey introduced a bill to study the environmental impacts of AI in February, and the House Committee on Energy and Commerce held a hearing about the energy usage of AI in June.

“Transparency is currently our main hurdle, because many people don’t realize the environmental impacts of AI: and not only consumers, but lawmakers and companies,” Luccioni says. “When we establish transparency and start having a little bit more information, then we can start trying to regulate.” 
