AI’s Dual Threat: Revenue Gains Fuel Data‑Center Power and Water Surge
AI drives revenue while pushing global data‑center electricity consumption to 485 billion kWh in 2025 and U.S. data‑center water use to 66 billion liters in 2023.

TL;DR
AI adoption turns every customer into both a profit source and a training signal, pushing global data‑center electricity use toward 950 billion kWh by 2030 and U.S. water use past 66 billion liters.
Context

Companies rush to embed AI copilots, generative video tools, and autonomous agents to stay competitive. Founder Andrea Pignataro warns that this collective push creates a “tragedy of the commons”: firms profit while simultaneously feeding the data that powers future AI, a dynamic that amplifies resource strain.
Key Facts

- Global data‑center electricity consumption reached 485 billion kilowatt‑hours (kWh) in 2025, roughly 3% of worldwide demand, and could double to 950 billion kWh by 2030.
- In the United States, data‑center power use grew from 176 billion kWh in 2023 (4.4% of national electricity) to a projected 325–580 billion kWh by 2028, potentially consuming up to 12% of U.S. power.
- Direct water use for cooling U.S. data centres rose from 21.2 billion liters in 2014 to 66 billion liters in 2023, while indirect water footprints from electricity generation approach 800 billion liters.
- A single AI‑server rack may demand peak power comparable to 65 homes by 2027, and its heat output can match dozens of gas boilers.
- AI‑focused data centres grew 50% in one year, outpacing typical grid‑planning cycles.
- Emissions from U.S. data‑center electricity in 2023 equated to about 61 billion kg CO₂e, and global data‑center emissions are projected to hit 350 million metric tons by 2035.
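To put the trajectories above in perspective, a minimal sketch that backs out the compound annual growth rate implied by the cited figures (the formula is standard; the year spans are read directly from the dates in the facts above):

```python
# Implied compound annual growth rate (CAGR) for the figures cited above.
# Inputs come from the article; the computation itself is the standard
# CAGR formula and makes no further assumptions.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate taking `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# Global data-center electricity: 485 billion kWh (2025) -> 950 billion kWh (2030)
power_rate = cagr(485, 950, 2030 - 2025)

# U.S. direct cooling water: 21.2 billion liters (2014) -> 66 billion liters (2023)
water_rate = cagr(21.2, 66, 2023 - 2014)

print(f"Electricity: ~{power_rate:.1%} per year")  # roughly 14% annually
print(f"Water:       ~{water_rate:.1%} per year")  # roughly 13% annually
```

Both curves compound at well over 10% a year, which is the sense in which the article says growth is outpacing typical grid‑planning cycles.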
What It Means

The economic model that treats each client as both revenue and a training sample accelerates infrastructure expansion. As AI workloads multiply, electricity demand will strain grids, especially where renewable capacity lags. Water consumption for cooling will intensify pressure on local basins, forcing trade‑offs between energy‑efficient but water‑heavy cooling methods and water‑saving but power‑hungry alternatives.
Beyond the environmental toll, the surge in hardware requirements risks widening chip supply bottlenecks and increasing e‑waste, with only about a fifth of global electronic waste formally recycled.
Policymakers and industry leaders must confront this dual threat by mandating transparency on AI‑driven resource use, incentivising low‑water cooling technologies, and aligning AI development with decarbonisation pathways. The next few years will reveal whether AI can pivot from a resource‑intensive growth engine to a catalyst for broader energy efficiency.
Watch next: emerging standards for “sustainable AI” reporting and the impact of upcoming power‑grid upgrades on AI‑heavy data‑center clusters.