The rapidly expanding use of artificial intelligence, particularly large language models like ChatGPT, is creating unprecedented demands on water resources as data centers struggle to cool their increasingly powerful systems.
Updated water consumption data: Recent research from the University of California, Riverside reveals that ChatGPT’s water consumption is four times higher than previously estimated, with 10-50 queries consuming approximately two liters of water.
- The original study, “Making AI Less Thirsty,” based its calculations on 2020 OpenAI figures but has been revised following new data from Microsoft
- Professor Shaolei Ren, the study’s author, indicates that energy consumption and associated water usage are significantly higher than initial estimates
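For rough intuition, the revised figure works out to somewhere between about 40 ml and 200 ml of water per query, depending on where in the 10-50 query range a workload falls. A minimal back-of-envelope sketch in Python (the two-liter total and the query range come from the study summary above; the per-query split is simple arithmetic, not a figure from the researchers):

```python
# Back-of-envelope: per-query water use implied by the revised UC Riverside estimate.
# Assumes roughly 2 liters of water per 10-50 ChatGPT queries, as cited above.
WATER_LITERS = 2.0
QUERY_COUNTS = (10, 50)

for queries in QUERY_COUNTS:
    per_query_ml = WATER_LITERS / queries * 1000  # liters -> milliliters
    print(f"{queries} queries -> ~{per_query_ml:.0f} ml of water per query")
```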
Technical infrastructure challenges: Modern AI server racks generate unprecedented levels of heat, requiring extensive cooling systems that consume massive amounts of water.
- Nvidia’s latest AI servers draw 120kW of power in a single rack, giving off heat equivalent to roughly 120 traditional space heaters (see the back-of-envelope sketch after this list)
- Data centers require drinking-quality water for cooling systems, as impurities can damage sensitive equipment
- The United States hosts over 5,000 data centers, compared to approximately 600 in the UK
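The space-heater comparison is essentially a unit conversion: nearly all of the electrical power a rack draws is dissipated as heat that the cooling plant must remove. A minimal sketch of that arithmetic, assuming a typical 1 kW household space heater (the heater wattage is an assumption for illustration, not a figure from the article):

```python
# Back-of-envelope: why a 120 kW rack is compared to ~120 space heaters.
# Nearly all power drawn by the servers ends up as heat the cooling system must remove.
RACK_POWER_KW = 120.0     # per-rack draw cited above for Nvidia's latest AI servers
HEATER_POWER_KW = 1.0     # assumption: a typical household space heater is about 1 kW

equivalent_heaters = RACK_POWER_KW / HEATER_POWER_KW
print(f"A {RACK_POWER_KW:.0f} kW rack gives off heat comparable to "
      f"~{equivalent_heaters:.0f} space heaters at {HEATER_POWER_KW:.0f} kW each")
```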
Environmental impact and corporate response: Major tech companies are reporting significant increases in water consumption while pledging to address environmental concerns.
- Google, Microsoft, and Meta have reported 17-22.5% increases in water consumption
- Tech giants have committed to becoming “water positive” by 2030, promising to return more water than they consume
- Water replacement often occurs in different locations from where the water was extracted, so water-stressed areas may see little direct benefit
Global implications: The rapid expansion of AI infrastructure is creating environmental tensions in various regions worldwide.
- Chile has become a flashpoint for environmental activism, with Google forced to redesign a $200 million data center to use air-based cooling
- Irish data centers now consume 21% of the country’s electricity, up from 5% in 2015
- Water UK projects that new data centers could require as much water as a city the size of Liverpool
Innovation and solutions: Companies are developing various approaches to reduce water consumption in data centers.
- Digital Realty sources 43% of its water from non-drinking sources, including rainwater harvesting
- Iceotope’s precision liquid cooling technology eliminates the need for water-based cooling
- Google DeepMind has developed AI systems that reduce cooling energy consumption by 40%
Looking ahead: While AI systems present significant environmental challenges, they may also offer solutions to improve resource efficiency, creating a complex balance between technological advancement and environmental stewardship.
- AI tools are being deployed to optimize data center water usage
- Environmental concerns are driving innovation in cooling technologies and infrastructure design
- The industry faces increasing pressure to balance rapid growth with sustainable practices