Sam Altman Claims ChatGPT’s Water and Energy Consumption Is Negligible. He’ll Have to Prove It

  • In a recent blog post, the OpenAI CEO raises expectations about AI once again.

  • Altman seeks to dispel the notion that using AI is costly in terms of energy and water consumption.

Sam Altman

Javier Pastor

Senior Writer
  • Adapted by:

  • Alba Mora


A recent study by researchers at the University of California concluded that a 100-word email generated by GPT-4 consumes 519 milliliters of water. However, OpenAI CEO Sam Altman has released his own estimate regarding the water and energy consumption of each ChatGPT query, which is surprisingly different.

1,000 times less. According to Altman, an average ChatGPT query uses far less energy and water than earlier studies suggested. His figures are compelling, and he uses engaging analogies to explain them:

“As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity. (People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.)”
Consumption | A previous study by Epoch AI corroborates Altman’s claims.
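Altman’s analogies can be sanity-checked with basic unit conversions. A minimal sketch in Python (the oven and bulb wattages are assumptions, since his post doesn’t state them):

```python
# Sanity-check Altman's per-query figures with simple unit conversions.
QUERY_WH = 0.34        # watt-hours per query, per Altman
QUERY_GAL = 0.000085   # gallons of water per query, per Altman

ML_PER_GALLON = 3785.41
ML_PER_TEASPOON = 4.92892

water_ml = QUERY_GAL * ML_PER_GALLON
teaspoons = water_ml / ML_PER_TEASPOON
print(f"water per query: {water_ml:.2f} mL (~1/{1 / teaspoons:.0f} teaspoon)")

# How long each appliance would take to use 0.34 Wh
# (these wattages are assumptions, not figures from Altman's post).
OVEN_W = 1000  # assumed oven element draw
LED_W = 10     # assumed high-efficiency bulb
print(f"oven: {QUERY_WH / OVEN_W * 3600:.1f} s")
print(f"bulb: {QUERY_WH / LED_W * 60:.1f} min")
```

With a roughly 1 kW oven element, the figure works out to about 1.2 seconds, and a 10 W bulb runs for about two minutes, consistent with Altman’s phrasing. The water figure comes to about a fifteenth of a teaspoon, matching his claim.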

Evidence. However, there’s a notable problem with Altman’s data: his figures lack visible backing. He presents them without citing sources or explaining how they were calculated, which makes them difficult to verify. By comparison, a Meta executive previously said that it would take only two nuclear power plants to power AI inference.

Previous studies. Despite Altman’s lack of specific evidence, earlier studies support his assertions. In February, researchers at Epoch AI published a study estimating that an average ChatGPT query using GPT-4 consumes just 0.3 watt-hours. This is “ten times less than the older estimate” cited by researcher Alex de Vries. Much has happened since then.

Too pessimistic. According to the Epoch AI study, the efficiency of these models has improved significantly since 2023, when de Vries conducted his research. The same applies to the hardware running them. Additionally, Epoch AI argues that the older estimate took an “overly pessimistic” approach.

Notably, the Epoch AI study claims that “it is possible that many or most [ChatGPT] queries are actually much cheaper [in terms of energy] still.”

Data Centers | The water consumption in data centers for several online activities, according to an independent study from 2025.

Further studies. An independent study published in January 2025 concluded that “using ChatGPT is not bad for the environment.” This study used data from EPRI from May 2024, which estimated a high energy consumption of 2.9 Wh per ChatGPT query. Moreover, an analysis by Sunbird on water consumption in data centers indicated that it’s considerably lower than for other online activities.
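The gap between EPRI’s 2.9 Wh figure and the newer estimates compounds quickly at scale. An illustrative comparison, assuming a hypothetical volume of 1 billion queries per day (the volume is an assumption for illustration, not a figure from any of these studies):

```python
# Annual energy implied by each per-query estimate, at an assumed query volume.
QUERIES_PER_DAY = 1_000_000_000  # hypothetical volume, for illustration only

estimates_wh = {
    "EPRI (May 2024)": 2.9,
    "Altman (2025)": 0.34,
    "Epoch AI (Feb 2025)": 0.3,
}

for source, wh in estimates_wh.items():
    gwh_per_year = wh * QUERIES_PER_DAY * 365 / 1e9
    print(f"{source:>20}: {gwh_per_year:,.0f} GWh/year")
```

At that volume, the EPRI figure implies roughly 1,050 GWh per year, while the Altman and Epoch AI figures imply closer to 110 to 125 GWh, which is why the per-query estimate matters so much.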

15 queries per teaspoon of water. According to Altman’s estimate, a typical ChatGPT query requires only “0.000085 gallons of water; roughly one fifteenth of a teaspoon.” This suggests that the water needed for cooling data centers processing these queries is much less than previously thought.

What about training? These estimates primarily focus on AI inference, specifically, the use of ChatGPT when processing queries. Altman doesn’t clarify whether this includes the energy and water costs associated with training AI models, which can be substantial.

Training these models demands thousands of GPUs running at full capacity for months, resulting in significant water usage for cooling the components that generate considerable heat. Researcher Ethan Mollick highlighted that GPT-4 likely consumed more than 50 GWh of electricity during its training, enough energy to power 5,500 homes for an entire year.
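Mollick’s comparison can be checked arithmetically. Reading the figure as 50 GWh of energy (gigawatts measure power, not energy), a quick sketch:

```python
# Cross-check: spread 50 GWh of training energy across 5,500 homes for a year.
TRAINING_GWH = 50
HOMES = 5_500

kwh_per_home = TRAINING_GWH * 1e6 / HOMES  # GWh -> kWh
print(f"{kwh_per_home:,.0f} kWh per home per year")
# Roughly 9,100 kWh, broadly in line with typical annual US household
# electricity use (~10,000 kWh, an outside reference, not from the article).
```

So the homes comparison holds up, assuming typical US household consumption.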

Definitive data is still lacking. Altman’s claims are striking, but the absence of clear evidence makes them hard to trust fully. Recent studies do point to declining energy and water costs associated with AI use.

However, there’s currently no established consensus or accepted standards on the true impact of energy and water usage when employing ChatGPT and other AI models.

Image | TechCrunch

Related | We Finally Know How Thirsty Artificial Intelligence Is: It Needs a Small Bottle of Water to Send a 100-Word Email
