Google reveals how much energy a Gemini query uses

Google has become the first major tech company to publicly release detailed energy consumption data for its AI systems, revealing that an average Gemini text prompt uses 0.24 watt-hours of energy and emits 0.03 grams of carbon dioxide equivalent. The transparency milestone comes as AI’s environmental impact faces increasing scrutiny, with 61% of Americans expressing concern about AI electricity usage according to recent polling.
What you should know: Google’s methodology provides the most comprehensive view of AI energy consumption to date, accounting for factors typically overlooked in public estimates.
- The company tracked not just active computing power but also idle machines, host CPUs, RAM, data center overhead, cooling systems, and water consumption.
- A single Gemini query consumes “0.26 milliliters (or about five drops) of water,” while its energy use is, in Google’s words, equivalent to “watching TV for less than nine seconds.”
- Google estimated Gemini had 350 million monthly users in March, representing almost half of ChatGPT’s user base; a rough scaling sketch using these per-prompt figures follows this list.
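To give a sense of scale, the short Python sketch below multiplies the per-prompt figures by an assumed query volume. The prompts-per-user-per-day value is purely an illustrative assumption and does not come from Google’s report; the per-prompt numbers and the user estimate are taken from the figures above.

```python
# Rough scaling of Google's per-prompt Gemini figures (from this article).
# PROMPTS_PER_USER_PER_DAY is an illustrative assumption, not a disclosed number.

ENERGY_WH_PER_PROMPT = 0.24       # watt-hours per median text prompt
CO2E_G_PER_PROMPT = 0.03          # grams of CO2-equivalent per prompt
WATER_ML_PER_PROMPT = 0.26        # milliliters of water per prompt

MONTHLY_USERS = 350_000_000       # Google's March estimate
PROMPTS_PER_USER_PER_DAY = 5      # hypothetical usage rate, for illustration only

daily_prompts = MONTHLY_USERS * PROMPTS_PER_USER_PER_DAY
daily_energy_mwh = daily_prompts * ENERGY_WH_PER_PROMPT / 1e6    # Wh -> MWh
daily_co2e_tonnes = daily_prompts * CO2E_G_PER_PROMPT / 1e6      # g  -> tonnes
daily_water_m3 = daily_prompts * WATER_ML_PER_PROMPT / 1e6       # mL -> cubic meters

print(f"Assumed prompts per day: {daily_prompts:,}")
print(f"~{daily_energy_mwh:,.0f} MWh of energy, ~{daily_co2e_tonnes:,.0f} t CO2e, "
      f"~{daily_water_m3:,.0f} m^3 of water per day")
```

Under that assumed usage rate, daily Gemini text prompting lands in the hundreds of megawatt-hours; the totals scale linearly with whatever query volume one plugs in.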
The big picture: While Google’s numbers are higher than simplified calculations, they’re “substantially lower than many public estimates” of AI resource consumption.
- Google’s comprehensive approach revealed 0.24 Wh of energy use, compared to a “non-comprehensive” estimate of 0.10 Wh for the same query; a quick arithmetic check on these figures appears after this list.
- “The energy consumption, carbon emissions, and water consumption were actually a lot lower than what we’ve been seeing in some of the public estimates,” said Savannah Goodman, head of Google’s advanced energy labs.
- Over a recent 12-month period, the energy footprint and carbon footprint of the median Gemini text prompt dropped “33x and 44x, respectively.”
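Two ratios help put these numbers in perspective. They are simple arithmetic on the figures quoted in this article, not values Google reported directly.

```python
# Derived ratios from the published per-prompt figures; Google did not
# report these directly, they are arithmetic on the numbers quoted above.

comprehensive_wh = 0.24        # full accounting: idle machines, host CPU/RAM, overhead
non_comprehensive_wh = 0.10    # the narrower estimate cited in the report
co2e_g_per_prompt = 0.03       # grams of CO2e per median text prompt

overhead_factor = comprehensive_wh / non_comprehensive_wh           # ~2.4x
implied_intensity = co2e_g_per_prompt / (comprehensive_wh / 1000)   # gCO2e per kWh

print(f"Comprehensive vs. narrow accounting: ~{overhead_factor:.1f}x")
print(f"Implied emissions intensity: ~{implied_intensity:.0f} gCO2e/kWh")  # ~125
```

The roughly 2.4x gap shows how much of the footprint the narrower accounting misses, and the implied intensity of about 125 gCO2e per kWh is one way to compare these figures against published grid-intensity data.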
Industry context: Google’s disclosure contrasts sharply with the opacity of other AI companies regarding their environmental impact.
- OpenAI CEO Sam Altman claimed in June that ChatGPT queries use “about 0.34 watt-hours” but provided no methodology or supporting data.
- Meta’s data centers reportedly consume massive amounts of water, but the company hasn’t shared specifics.
- Major players like Anthropic have remained silent on their AI systems’ resource consumption.
Why this matters: The lack of concrete AI energy data has fueled environmental concerns as the industry rapidly expands infrastructure and power demands.
- Google’s own energy usage has more than doubled in four years despite a 12% reduction in data center emissions.
- President Trump recently pledged $92 billion toward AI infrastructure in Pennsylvania, extending the $500 billion Stargate initiative announced in January.
- The Trump administration’s AI Action Plan aims to “reject radical climate dogma” and expedite environmental permitting for new data centers.
What they’re saying: Google emphasized the importance of comprehensive measurement in understanding AI’s true environmental footprint.
- “We believe this is the most complete view of AI’s overall footprint,” the company stated in its report.
- However, Google acknowledged that neither the data nor claims had been vetted by independent third parties.
Looking ahead: Google’s disclosure could establish industry standards and create competitive pressure for other AI companies to share similar environmental data.
- More transparency could help users and businesses factor emissions into their AI model selection decisions.
- The effectiveness of such measures depends on whether the industry prioritizes renewable energy investment amid rapid growth and shifting regulatory priorities.