These functions are increasingly integrated into everyday Internet use, through tools like Google’s Gemini, Amazon’s Rufus and other AI assistants. “We’ve seen over the last couple of decades a fairly steady increase in the amount of electricity that’s used for computing,” says Tony Dutzik of the Frontier Group, a think tank focused on the environment.
Functions such as video streaming and cloud storage have already increased electricity demands. But those increases have come at more or less predictable increments. AI tools like ChatGPT rely on large language models trained to respond to queries in ways that are legible to humans, which increases the complexity of the task. “AI is the place where you begin to see these projections of exponential growth,” Dutzik says.
All this computing requires servers housed in physical facilities. Data centers are a growing feature of the built environment in the U.S. They are linked to power generation sources and require transmission hookups and on-site backup generators. They also need intensive cooling systems to keep servers from overheating, which consumes large amounts of water. About half of U.S. states offer tax incentives that benefit data centers.
Researchers are still working to understand the full scope of energy demands created by AI and cryptocurrency. Much of the technology is run by private companies, which share only partial information about their energy use. Even with expected improvements to the energy efficiency of nascent technology like generative AI, industry observers say the energy impact of AI will grow exponentially in the coming years.
This story first appeared in Governing, part of e.Republic, Government Technology's parent company. It can also be found in Governing's Winter 2025 magazine issue.