Hassan Taher on Data Centers’ Growing Water Crisis in the AI Era

Artificial intelligence development consumes enormous quantities of electricity, a fact widely reported and debated. Less visible but equally consequential is AI’s demand for water, a resource that data centers require in massive volumes to prevent servers from overheating. As AI workloads proliferate, this hidden consumption threatens to strain water systems already facing scarcity pressures.
Hassan Taher, a Los Angeles-based AI consultant and author, has examined how infrastructure requirements shape AI’s environmental footprint. Through his firm Taher AI Solutions, Taher has advised organizations on implementing AI systems while considering resource constraints. His perspective on data center water usage reflects broader concerns about whether AI’s expansion can proceed sustainably.
Projections indicate that by 2025, half the global population will inhabit water-stressed regions. Data centers’ growing thirst compounds pressures on systems already struggling to meet agricultural, industrial, and residential needs. Understanding how and why these facilities consume water provides context for evaluating potential solutions.
Why Data Centers Require Continuous Water Supply
Data centers house thousands of servers generating substantial heat during operation. Without effective cooling, equipment fails within minutes as temperatures exceed safe operating thresholds. Water-based cooling systems provide the thermal management necessary for continuous operation.
The most common approach employs evaporative cooling through chilled water systems. Heated water travels to cooling towers where contact with outside air allows heat dissipation through evaporation before the water returns for another cooling cycle. This method proves particularly effective for hyperscale facilities operating tens of thousands of servers.
Hassan Taher has written about infrastructure efficiency in his book The Future of Work in an AI-Powered World. “Large-scale computing requires systems that can dissipate heat continuously and reliably,” he noted. “Water cooling achieves this at scales where air cooling becomes impractical.”
Data centers also require water for humidity control, maintaining levels between 40% and 60% relative humidity. This prevents static electricity buildup that could damage sensitive electronics. Additional water supports fire suppression systems and general facility operations.
Beyond direct on-site usage, data centers carry indirect water footprints through their supply chains. Thermoelectric power plants supplying electricity to facilities consume water to generate steam. Manufacturing processes for AI chips and servers demand ultrapure water for cleaning and rinsing components, adding substantial upstream consumption.
Quantifying the Scale of Water Demand
The magnitude of data center water usage varies considerably based on facility size, cooling system design, and climate conditions. However, aggregate projections indicate substantial and accelerating consumption.
Analyses estimate that U.S. data centers will consume between 731 million and 1,125 million cubic meters of water annually by the end of the decade. This volume equals the household consumption of 6 million to 10 million Americans. Total annual on-site water consumption by U.S. facilities in 2028 could reach 150 billion to 280 billion liters, potentially double or quadruple 2023 levels.
Facility type significantly influences water requirements. Hyperscale data centers operated by companies like Google average approximately 550,000 gallons daily, totaling roughly 200 million gallons annually. Wholesale and retail facilities in regions like Northern Virginia average about 18,000 gallons daily or 6.57 million gallons yearly. The largest data centers may consume up to 5 million gallons daily, approaching 1.8 billion gallons annually.
Hassan Taher has addressed resource intensity in AI development through his consulting work with technology companies. “Organizations often underestimate the infrastructure requirements of AI workloads,” he explained in a 2023 discussion of AI implementation challenges. “Computing costs represent just one dimension; water, electricity, and space requirements create additional constraints.”
AI workloads particularly intensify water demands. Research indicates that training GPT-3 in Microsoft’s U.S. data centers consumed approximately 5.4 million liters of water, including 700,000 liters of direct on-site usage. Generative AI inference, using trained models to produce responses, also proves water-intensive: depending on deployment location and timing, every 10 to 50 medium-sized GPT-3 requests consume roughly enough water to fill a 500-milliliter bottle.
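Put in per-request terms, the cited bounds imply a small but nontrivial volume per interaction. A back-of-envelope sketch (the 10-to-50 range comes from the research above):

```python
BOTTLE_ML = 500.0  # one 500-milliliter bottle of water

# Cited range: one bottle per 10 to 50 medium-sized GPT-3 requests
for requests_per_bottle in (10, 50):
    ml_per_request = BOTTLE_ML / requests_per_bottle
    print(f"{requests_per_bottle} requests per bottle -> {ml_per_request:.0f} ml per request")
```

At the scale of billions of requests, even the low end of 10 milliliters per request aggregates into tens of millions of liters.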
These figures illustrate why AI’s expansion raises environmental concerns extending beyond electricity consumption. Water represents a finite resource with competing demands, and data center usage occurs predominantly in regions where population growth and climate change already stress supplies.
Measuring Water Efficiency Through WUE
The industry employs Water Usage Effectiveness (WUE) as a standardized metric for assessing data center water efficiency. The Green Grid developed this measurement in 2011 to provide consistent evaluation across facilities.
WUE calculates the ratio of annual water consumption in liters to total IT equipment energy consumption in kilowatt-hours, expressed as liters per kilowatt-hour (L/kWh). Lower WUE values indicate greater water efficiency. Industry averages reportedly stand at 1.80 L/kWh, though individual facilities vary substantially based on design and operation.
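The metric itself is a simple ratio; a minimal sketch, using hypothetical facility figures:

```python
def wue(annual_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water per kWh of IT energy.

    Lower values indicate greater water efficiency.
    """
    return annual_water_liters / it_energy_kwh

# Hypothetical facility: 100 million liters/year against 60 GWh of IT load
print(round(wue(100e6, 60e6), 2))  # 1.67 L/kWh, near the reported industry average
```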
Microsoft reported a global average WUE of 0.30 L/kWh in its most recent fiscal year, significantly below industry averages. This reflects investments in water-efficient cooling technologies and operational practices that the company has implemented across its data center portfolio.
Hassan Taher has emphasized the importance of standardized metrics in evaluating technology’s environmental impact. “Measurement provides the foundation for accountability,” he stated in discussing AI ethics in his book AI and Ethics: Navigating the Moral Maze. “Organizations cannot manage what they do not measure with rigor and consistency.”
WUE calculations distinguish between water withdrawal (the total volume taken from sources) and water consumption (withdrawal minus discharge). This difference proves crucial because evaporative cooling systems lose approximately 80% of withdrawn water to evaporation. This consumed water does not return to local water systems and represents genuine depletion of available supplies.
Understanding this distinction matters when evaluating different cooling approaches. Systems that withdraw large volumes but discharge most back to sources differ fundamentally from those that consume similar volumes through evaporation, even if withdrawal figures appear comparable.
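The withdrawal-versus-consumption distinction can be made concrete with illustrative numbers. Both systems below are hypothetical; the roughly 80% evaporative loss follows the figure above:

```python
def consumption(withdrawal_l: float, discharge_l: float) -> float:
    """Water consumed = volume withdrawn minus volume returned to the source."""
    return withdrawal_l - discharge_l

# Evaporative tower: withdraws 1,000,000 L, loses ~80% to evaporation
evaporative = consumption(1_000_000, 200_000)
# Once-through system: withdraws the same volume, discharges ~95% back
once_through = consumption(1_000_000, 950_000)

# Identical withdrawal figures, very different depletion of local supplies
print(evaporative, once_through)  # 800000 50000
```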
Technological Approaches to Reducing Water Usage
Several major technology companies have committed to becoming water positive by 2030, pledging to replenish more water than their operations consume. Achieving these goals requires deploying innovative cooling methods that minimize or eliminate water consumption.
Microsoft announced in August 2024 that its next-generation data center design would use zero water for cooling. This approach employs chip-level cooling solutions that recycle water through closed-loop systems. The company expects this design to reduce WUE to near zero for facilities incorporating these technologies.
Closed-loop cooling systems continuously recycle water, requiring only small makeup volumes to compensate for minor leaks. These systems can reduce freshwater consumption by up to 70% compared to traditional once-through designs. While involving higher upfront capital costs, closed-loop systems offer long-term operational savings and reduced environmental impact.
Liquid cooling technologies represent another advancement gaining adoption. Direct-to-chip liquid cooling circulates coolant directly to heat-generating components, while immersion cooling submerges entire servers in dielectric fluid. Both methods transfer heat more efficiently than air cooling, requiring substantially less water. These approaches prove particularly valuable in water-scarce regions where conventional cooling creates unsustainable demands.
Hassan Taher has written about technology’s role in addressing environmental challenges in his forthcoming book examining AI’s potential contributions to climate solutions. “Innovation in cooling technology demonstrates how engineering solutions can reduce resource intensity,” he noted. “The question becomes whether adoption occurs rapidly enough to offset growing demand from expanding AI workloads.”
Free-air cooling and adiabatic cooling offer additional strategies in suitable climates. Free cooling utilizes outside air directly when ambient temperatures remain cool enough for effective heat dissipation. Adiabatic cooling supplements outside air with minimal water spray during warmer conditions, using far less water than evaporative systems. These approaches work effectively in cooler climates or during winter months, though they provide limited benefits in hot, humid regions. If successful, these initiatives could make current AI workloads more efficient and even open the door to advancements like quantum AI, whose hardware must operate at temperatures near absolute zero.
Alternative Water Sources and Management Practices
Beyond technological innovations in cooling systems, data centers can reduce environmental impact by diversifying water sources and implementing recycling practices. Many facilities continue relying on potable water supplies despite availability of alternatives requiring less treatment.
Using non-potable water, including treated greywater from municipal systems or collected rainwater, reduces pressure on drinking water supplies. Google reports that over 25% of its data center campuses utilize reclaimed or non-potable water sources. This approach requires appropriate treatment systems and regulatory approvals but offers substantial benefits in water-stressed regions.
Water recycling and reuse extend the utility of withdrawn volumes. Effective treatment enables water circulation through cooling systems multiple times before discharge. Spent cooling water can receive additional treatment and serve secondary purposes including irrigation or toilet flushing, further reducing freshwater demand.
Energy sourcing decisions also influence overall water footprints. Renewable electricity from solar or wind generation requires substantially less water than fossil fuel power plants. Coal facilities in particular consume enormous water volumes to generate steam. Data centers powered by renewables therefore carry lower indirect water footprints even when on-site consumption remains constant.
Hassan Taher has advocated for holistic environmental assessment in his consulting work. “Organizations must evaluate their full resource footprint, including indirect consumption through supply chains and electricity sourcing,” he explained in a 2023 article examining AI’s environmental impacts. “Focusing exclusively on direct operations provides incomplete understanding of total environmental cost.”
Operational Strategies and Transparency Initiatives
Beyond physical infrastructure changes, operational practices offer opportunities for water conservation. Spatial-temporal scheduling represents one promising approach, involving dynamic placement and timing of AI workloads to exploit geographical and hourly variations in water efficiency.
This strategy directs computing tasks to facilities and time periods when WUE reaches optimal levels. Implementation requires sophisticated orchestration systems that balance water efficiency against other factors including electricity costs, carbon intensity, and latency requirements. Conflicts may arise: minimizing water footprint could increase carbon emissions, or vice versa, requiring careful optimization across multiple environmental dimensions.
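A toy version of such a scheduler might weigh per-site water and carbon intensity when placing a workload. Site names and figures below are hypothetical, and a real orchestrator would also weigh electricity cost and latency:

```python
# Hypothetical per-site intensities: WUE in L/kWh, carbon in gCO2/kWh
sites = {
    "site_a": {"wue": 1.8, "carbon": 300},
    "site_b": {"wue": 0.3, "carbon": 450},
    "site_c": {"wue": 1.1, "carbon": 200},
}

def pick_site(sites, water_weight=0.5, carbon_weight=0.5):
    """Choose the site minimizing a weighted blend of water and carbon
    intensity, each normalized to its maximum across sites."""
    max_wue = max(s["wue"] for s in sites.values())
    max_carbon = max(s["carbon"] for s in sites.values())

    def score(s):
        return (water_weight * s["wue"] / max_wue
                + carbon_weight * s["carbon"] / max_carbon)

    return min(sites, key=lambda name: score(sites[name]))

print(pick_site(sites))  # balances both objectives
print(pick_site(sites, water_weight=1.0, carbon_weight=0.0))  # "site_b", lowest WUE
```

Shifting the weights illustrates the conflict described above: optimizing purely for water selects a different site than a balanced objective does.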
Transparency in water consumption reporting provides another lever for driving improvement. Several researchers and advocacy organizations have called for tracking and disclosing both scope-1 (on-site) and scope-2 (off-site electricity) water consumption in AI model cards and cloud service dashboards. This visibility would enable customers to make informed decisions about which providers and services align with their environmental commitments.
Hassan Taher has consistently emphasized transparency’s role in responsible AI development. “Disclosure creates accountability and enables informed decision-making by customers and policymakers,” he stated in a 2023 interview discussing AI ethics. “Organizations that voluntarily report environmental metrics demonstrate commitment beyond minimum compliance requirements.”
Corporate water stewardship programs represent another approach gaining traction. These initiatives involve companies assessing their water risks, setting reduction targets, investing in local watershed health, and collaborating with communities on water management. Such programs acknowledge that data center operations affect and depend upon regional water systems, creating mutual interests in sustainable management.
Balancing Growth Against Resource Constraints
The tension between AI’s expansion and data centers’ water consumption creates dilemmas without simple resolutions. AI capabilities continue advancing, driving demand for ever-larger models requiring more computing power. Simultaneously, climate change intensifies water scarcity in many regions where data centers concentrate.
Hassan Taher’s analysis suggests that technology improvements alone may prove insufficient if deployment scales dramatically. “Efficiency gains matter, but they can be overwhelmed by exponential growth in absolute demand,” he noted in examining AI industry trajectories. “Sustainable AI requires both technological innovation and thoughtful decisions about which applications justify resource consumption.”
This perspective raises questions about AI development priorities. Not all applications deliver equivalent value; using AI for medical diagnosis differs fundamentally from generating marketing content or entertainment. Yet data center water consumption occurs regardless of whether workloads address critical needs or trivial applications.
Geographic distribution of data centers relative to water availability presents additional challenges. Concentrations in regions like Northern Virginia or the Southwest United States place facilities in areas already facing water stress. Relocating capacity to water-abundant regions could reduce local impacts but involves substantial costs and may conflict with other priorities including proximity to users or electricity sources.
The path toward sustainable AI infrastructure requires coordinated action across multiple dimensions: deploying water-efficient cooling technologies, utilizing alternative water sources, improving operational practices, and maintaining transparency about consumption. These measures collectively can reduce water intensity per unit of computing, though whether reductions sufficiently offset growing demand remains uncertain.
Hassan Taher’s work examining AI’s societal implications suggests that addressing environmental costs represents part of ensuring technology serves broader public interests. Data centers’ water consumption, though less visible than electricity usage or carbon emissions, warrants equivalent attention as AI reshapes economic and social systems. How effectively the industry responds to this challenge will help determine whether AI’s benefits justify its environmental costs.