Each ChatGPT Query Uses About 0.000085 Gallons of Water

OpenAI CEO Sam Altman has revealed for the first time the average energy and water consumption per ChatGPT query—significantly lower than earlier estimates.
In a blog post titled "The Gentle Singularity", Altman said each query uses about 0.34 watt-hours of energy and 0.000085 gallons of water—roughly one-fifteenth of a teaspoon.
This contrasts sharply with prior studies suggesting ChatGPT could consume up to 0.5 litres of water per extended conversation, and that each query used around 2.9 watt-hours—more than eight times Altman's figure.
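The figures quoted above can be sanity-checked with some simple unit conversions. The sketch below is illustrative only; the per-query numbers are the ones cited in this article, and the conversion factors are standard US units.

```python
# Back-of-the-envelope check of the per-query figures quoted above.
GALLON_TO_TSP = 768       # 1 US gallon = 768 US teaspoons
GALLON_TO_ML = 3785.41    # 1 US gallon is about 3785.41 millilitres

water_per_query_gal = 0.000085   # Altman's figure
energy_per_query_wh = 0.34       # Altman's figure
earlier_energy_wh = 2.9          # figure from earlier studies

water_tsp = water_per_query_gal * GALLON_TO_TSP
water_ml = water_per_query_gal * GALLON_TO_ML
energy_ratio = earlier_energy_wh / energy_per_query_wh

print(f"Water per query: {water_tsp:.3f} tsp (~1/{round(1 / water_tsp)} of a teaspoon)")
print(f"Water per query: {water_ml:.2f} mL")
print(f"Earlier energy estimate is ~{energy_ratio:.1f}x Altman's figure")
```

Running this confirms the article's framing: roughly a fifteenth of a teaspoon of water per query, and an earlier energy estimate about 8.5 times higher than Altman's.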
Despite the low per-query footprint, Altman acknowledged the growing cumulative impact, given ChatGPT’s 5.56 billion monthly visits.
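To see how a tiny per-query footprint still adds up, the rough sketch below scales the per-query figures by the article's 5.56 billion monthly visits. The assumption that one visit equals one query is hypothetical and conservative; real sessions typically involve multiple prompts, so treat the result as a lower bound.

```python
# Rough cumulative-impact estimate under a hypothetical one-query-per-visit assumption.
MONTHLY_VISITS = 5.56e9          # from the article
ENERGY_PER_QUERY_WH = 0.34       # from the article
WATER_PER_QUERY_GAL = 0.000085   # from the article

queries_per_visit = 1            # assumed; scale up for multi-prompt sessions

monthly_queries = MONTHLY_VISITS * queries_per_visit
monthly_energy_gwh = monthly_queries * ENERGY_PER_QUERY_WH / 1e9
monthly_water_gal = monthly_queries * WATER_PER_QUERY_GAL

print(f"Monthly energy: ~{monthly_energy_gwh:.1f} GWh")
print(f"Monthly water:  ~{monthly_water_gal:,.0f} gallons")
```

Even at one query per visit, that works out to roughly 1.9 GWh of electricity and nearly half a million gallons of water per month.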
Previously, he acknowledged that polite user interactions like "please" and "thank you" have cost OpenAI tens of millions of dollars in electricity.
As scrutiny of AI's environmental impact intensifies, Altman predicts the cost of intelligence will eventually converge toward the cost of electricity as datacentre production becomes automated.
However, according to a new analysis by Epoch AI, the computational power of AI supercomputers has been doubling every nine months, while their hardware costs and power demands have doubled annually.
Drawing from a dataset of around 500 AI supercomputers built between 2019 and 2025, the report warns that if these trends continue, the top AI supercomputer by 2030 could require two million AI chips, cost $200 billion, and consume 9 GW of electricity, roughly the output of nine nuclear reactors.
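As a rough illustration of how those doubling rates compound, the sketch below extrapolates power draw and hardware cost five years forward. The doubling periods are the ones quoted from the Epoch AI report; the 2025 baseline values (about 300 MW and $6 billion for a leading system) are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative projection of the Epoch AI doubling trends cited above.
def project(baseline, years, doubling_period_years):
    """Extrapolate a quantity that doubles every `doubling_period_years` years."""
    return baseline * 2 ** (years / doubling_period_years)

YEARS_AHEAD = 5              # 2025 -> 2030

# Assumed 2025 baseline for a leading AI supercomputer (illustrative only).
baseline_power_gw = 0.3      # ~300 MW
baseline_cost_busd = 6       # ~$6 billion in hardware

power_2030 = project(baseline_power_gw, YEARS_AHEAD, 1.0)   # power doubles yearly
cost_2030 = project(baseline_cost_busd, YEARS_AHEAD, 1.0)   # cost doubles yearly

print(f"Projected 2030 power: ~{power_2030:.1f} GW")
print(f"Projected 2030 cost:  ~${cost_2030:.0f}B")
```

With annual doubling, a 300 MW system in 2025 reaches roughly 9-10 GW by 2030, and a $6 billion hardware bill grows to around $190 billion, broadly consistent with the report's headline projection.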