AI Is an Energy Hog, and Government Needs to Be Aware

The environmental costs of using artificial intelligence tools are an area of growing concern for government technology officials. Transparency from vendors can shed light on their energy and water usage.

From left, Leila Doty, privacy and AI analyst in the San José Information Technology Department; Tamara Kneese, who directs a new climate, tech and justice program at the Data and Society Research Institute; Brian Pascal, policy adviser for Supervisorial District 5 in Santa Clara County, Calif.; Irina Raicu, director of the Internet Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University; and Kris Schaffer, senior director at Lumen Technologies, discuss "Social Impact: A.I. and the Environment" at the GovAI Coalition Summit in San Jose, Calif., Dec. 4.
Skip Descant/Government Technology
SAN JOSE, Calif. — Local and state governments can consider using their procurement processes to require technology vendors to disclose more information about the environmental impacts of artificial intelligence (AI) tools.

“Ask them to disclose their energy usage and water usage,” Irina Raicu, director of the Internet Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University, said Wednesday at the GovAI Coalition Summit* in San Jose.

“Put it in a contract, and see what happens. Is that company really going to not want to work with you, rather than disclose its usage?” she said during a summit panel discussion centered on the environmental costs associated with emerging AI tools used by government and consumers. “If a company is saying ‘probably,’ think about what that means. Think about what it means if a company says, ‘We would rather not work with you than disclose that information.’”
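
As a rough illustration of the kind of disclosure Raicu described, the sketch below lays out fields a procurement team might ask a vendor to report on a recurring basis. It is a minimal Python sketch; the field names, units and example values are assumptions made for illustration, not language from the summit or from any existing contract.

    # Illustrative only: fields a procurement team might ask a vendor to report,
    # in the spirit of Raicu's suggestion to put disclosure terms in the contract.
    # The field names, units and example values below are assumptions.
    from dataclasses import dataclass

    @dataclass
    class VendorEnvironmentalDisclosure:
        vendor_name: str
        reporting_period: str          # e.g., "2025-Q1"
        energy_use_kwh: float          # energy attributed to the contracted service
        water_use_liters: float        # cooling and other water use for the same period
        data_center_locations: list[str]
        renewable_energy_share: float  # 0.0-1.0, portion of energy from renewables

    # The kind of record a contract clause could require each reporting period.
    example = VendorEnvironmentalDisclosure(
        vendor_name="ExampleAI Inc.",  # hypothetical vendor
        reporting_period="2025-Q1",
        energy_use_kwh=1_250_000.0,
        water_use_liters=3_400_000.0,
        data_center_locations=["us-west", "us-central"],
        renewable_energy_share=0.42,
    )
    print(example)

A template like this does not settle whether the reported numbers are acceptable, but it makes a refusal to report them, the scenario Raicu warned about, visible in the procurement record.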

The emergence of AI tools like ChatGPT and the rising number of AI-enabled applications used every day by government and consumers at all levels are fueling growth in data centers, along with an appetite for the energy and water needed to support those centers. Other environmental concerns are associated with activities like the development of the computer chips and related devices needed in today's supercomputing world, experts said.

“All of those ChatGPT prompts that are helping us work more efficiently, they come at not just a financial cost, but also at a cost to our environment, from the energy and the water that’s needed to power the data centers that make GenAI [generative AI] tools possible,” said Leila Doty, privacy and AI analyst in the San Jose Information Technology Department, who served as panel moderator.

The energy needed for a single query to a large language model in ChatGPT is six to 10 times higher than a traditional web search using Google, she added. And training a large language model can lead a data center to consume millions of liters of fresh water.
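
For a rough sense of scale, the minimal sketch below turns that six-to-10-times figure into watt-hours and then into a daily total. The roughly 0.3 watt-hour baseline for a conventional web search and the 10,000-queries-a-day department are illustrative assumptions, not figures cited on the panel.

    # Back-of-envelope sketch of the six-to-10-times figure cited above.
    # ASSUMPTIONS: the ~0.3 Wh-per-web-search baseline and the 10,000 daily
    # queries are illustrative, not figures from the panel.

    WEB_SEARCH_WH = 0.3        # assumed energy per traditional web search, watt-hours
    LLM_MULTIPLIER = (6, 10)   # range cited by the panel moderator

    def llm_query_energy_wh(multiplier_range=LLM_MULTIPLIER, baseline_wh=WEB_SEARCH_WH):
        """Return low/high estimates of energy per LLM query, in watt-hours."""
        low, high = multiplier_range
        return baseline_wh * low, baseline_wh * high

    low_wh, high_wh = llm_query_energy_wh()
    daily_queries = 10_000     # hypothetical city department's daily usage
    print(f"Per query: {low_wh:.1f}-{high_wh:.1f} Wh")
    print(f"Per day: {low_wh * daily_queries / 1000:.0f}-{high_wh * daily_queries / 1000:.0f} kWh")

At that assumed baseline, a single query lands in the range of about 2 to 3 watt-hours, and 10,000 queries a day works out to roughly 18 to 30 kilowatt-hours: small for one department, but a figure that scales with every agency that adopts the tools.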

These are the kinds of concerns starting to trickle through city departments, particularly as they make decisions about not only employee use of AI tools but whether cities should choose to purchase these technology tools in the first place.

“When vendors are selling you technology, they’re selling you what it could do, and they’re obfuscating the effort and kind of background work that goes into spitting that out,” said Brian Pascal, policy adviser for Supervisorial District 5 in Santa Clara County, Calif. (Pascal was careful to note that his comments on the panel were his own, and not the county’s.)

“It’s not free,” he said, cautioning others in local, county and state government. “There’s a cost associated with all of this,” he added, a cost often hidden by private-sector developers.

“There’s this huge disconnect between the externalities and downsides of the technology, and the purported upsides that got you to buy it in the first place,” Pascal said. “The whole point of government is to try to get a handle on these externalities … and try to rein in the companies.”

Government’s use of AI tools — and its green-lighting of data center development — can quickly eat away at sustainability goals, Raicu said.

“If you’re talking about local government, and people who care about sustainability and environmental impacts … every time you use generative AI, you have to know that you are pushing in the opposite direction,” she said. “It might be worth it, in certain cases. But you need to understand that you are making a tradeoff every time you use it. And as long as people don’t understand that, they can’t make ethical decisions about when to use it, and when not.”

*The GovAI Coalition Summit was hosted by Government Technology in partnership with the GovAI Coalition and the city of San Jose.
Skip Descant writes about smart cities, the Internet of Things, transportation and other areas. He spent more than 12 years reporting for daily newspapers in Mississippi, Arkansas, Louisiana and California. He lives in downtown Yreka, Calif.