The Hidden Cost of Intelligence: How AI Is Reshaping Our Energy and Water Landscape

An environmental explainer on the real-world footprint of artificial intelligence

Introduction

Illustration showing a hand pressing a Generate button on a device connected by power lines to massive data center infrastructure with glowing GPU chips, cooling towers releasing steam, and a small residential neighborhood dwarfed below

A split-screen editorial illustration reveals the hidden infrastructure behind AI. On the left, a hand hovers over a glowing "Generate" button on a tablet or smartphone — the moment of casual creation. A translucent connection line traces from the device across power transmission towers to the right side of the frame, where massive data center infrastructure dominates the landscape. Server racks glow orange with heat from GPU chips. Cooling towers release steam and water droplets into the air. A small residential neighborhood sits tiny at the bottom right, dwarfed by the industrial scale of the AI infrastructure looming above it. The image makes visible what is normally invisible: the physical cost — in electricity, water, and community impact — of a single AI generation request.

I've been creating content about swimming for years, and I needed a visual. I opened an AI image generator, typed a prompt, and thirty seconds later had exactly what I needed — a swimmer mid-stroke, clean light, perfect composition.

A few years ago, I wouldn't have thought twice about it.

After conversations with friends and family, I started researching what that actually costs.



Artificial intelligence is transforming industries, accelerating research, and reshaping how we work. But every AI query answered, every image generated, and every model trained carries a physical cost — one measured not just in code, but in electricity and water. The main concerns are not limited to model training. They extend to the enormous resources required to run large data centers continuously.[1,2,3]

This article explores two of the most pressing environmental dimensions of the AI boom: energy consumption and water use. It also examines the community-level effects of data center expansion and asks whether stronger government policy is needed to manage this growth responsibly.




 

Energy Consumption: Powering the AI Revolution

How Much Power Do AI Data Centers Use?

AI-optimized data centers consume significantly more power than their traditional counterparts. They rely on energy-intensive chips — primarily graphics processing units (GPUs) — and operate continuously for both training and real-time inference. One widely cited projection estimates that data centers will use twice as much electricity by the late 2020s as they do today.[2,3]
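As a back-of-envelope illustration (the specific years are assumptions for the calculation, not figures from the cited sources), a doubling of demand between 2024 and 2030 implies a compound growth rate of roughly 12% per year:

```python
# If data center electricity demand doubles over a 6-year horizon
# (e.g. 2024 to 2030 -- illustrative years, not from the sources),
# what annual growth rate does that imply?
years = 2030 - 2024                      # assumed 6-year horizon
annual_growth = 2 ** (1 / years) - 1     # compound rate that doubles demand
print(f"~{annual_growth:.1%} per year")  # prints "~12.2% per year"
```

Sustained double-digit annual growth is far faster than typical electricity demand growth, which is why grid planners treat these projections as a significant challenge.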

The Carbon Problem

When that electricity comes from fossil fuels, AI growth translates directly into increased greenhouse gas emissions. Research has found that AI expansion can substantially raise CO₂ output across regions,[1,4] particularly in areas where coal or natural gas remain dominant in the energy mix. The carbon footprint of a single large model training run can be significant — sometimes comparable to the lifetime emissions of several automobiles.

Infographic showing three dimensions of AI data center energy impact: GPU heat, carbon emissions, and grid pressure

Three panels illustrate the hidden costs of powering AI. The left panel shows a GPU chip ablaze with heat, surrounded by dense server racks — representing some of the most energy-intensive hardware in computing. The center panel depicts the carbon equation: fossil fuel plants and CO₂ emissions contrasted against wind turbines and solar panels, with a single car at the bottom to show that training a large AI model can equal a car's lifetime emissions. The right panel shows the community impact — a residential neighborhood dwarfed by a hyperscale data center and power transmission lines, with utility meters spinning and a family discussing a $980 electricity bill. Together, the panels reveal that AI's power demand is not abstract: it's a direct trade-off between energy sources, carbon output, and real costs passed to residents who never consented to host this infrastructure.

Community Energy Impacts

The effects are not abstract. Local communities near data center clusters are already experiencing pressure on their electricity grids. In fast-growing data center regions, the power demands of AI facilities can compete directly with households and small businesses for grid capacity.[7,9] Residents may face delayed grid upgrades, increased congestion, and upward pressure on utility pricing — even if they receive little direct benefit from the AI services being powered nearby.

A 2026 CNBC report highlighted growing backlash from ratepayers in several U.S. states, where utility commissions are being asked to weigh whether AI infrastructure costs should be passed on to ordinary electricity customers.[9]

What Companies Are Doing

Major technology companies have responded to criticism with several mitigation strategies:

  • Renewable energy contracts: Many firms are signing long-term agreements to source power from wind, solar, and other clean sources, or building dedicated carbon-free power strategies.[1,6]

  • Efficiency improvements: Investment in more efficient chips, smarter workload scheduling, and improved data center design to reduce wasted computation.[2,3]

  • Strategic siting: Placing facilities where power is cheaper, cleaner, or where cooling is naturally easier — such as colder climates — to lower the overall environmental cost.[3,7]

  • AI for sustainability: Some companies are deploying AI itself to optimize logistics, energy systems, and infrastructure — using the same technology to address broader environmental challenges.[6,8]

 

Water Consumption: The Less-Told Story

Why Data Centers Need So Much Water

Water consumption is often the overlooked dimension of AI's environmental footprint. Data centers generate tremendous heat, and removing that heat requires substantial water — primarily through evaporative cooling systems that consume water directly rather than recirculating it.[1,2,3]

The scale of use is striking. A medium-sized data center may use approximately 110 million gallons of water per year for cooling — roughly the annual water use of about 1,000 households.[1] At the larger end, a hyperscale AI facility can use up to 5 million gallons per day at peak — comparable to the daily water demand of a town of 10,000 to 50,000 people.[1,2]

Across the United States, data centers collectively used approximately 449 million gallons per day as of 2021 — a figure that has grown substantially since, driven by AI expansion.[1]

Illustration showing a data center with cooling towers connected by pipes to 1.5 Olympic-size swimming pools being filled, with human figures for scale.

One data center. Daily water consumption at scale: 1 million gallons per day is enough to fill 1.5 Olympic pools — every single day.[1,2]

Putting It in Perspective

To make these numbers concrete:

  • A large hyperscale data center using 3 million gallons per day equals the daily water use of approximately 10,000 households (at 300 gallons per household per day).[1,2,3]

  • A facility at peak demand — 5 million gallons per day — is equivalent to the water footprint of a mid-sized town of 10,000 to 50,000 people.[1,2]

  • A single site drawing 500,000 gallons per day can represent as much as 10% of a county's total water supply in some regions.[6]

  • Water use varies widely by facility size and design: medium-sized data centers may use around 300,000 gallons per day, while the largest hyperscale AI facilities can reach 3 to 5 million gallons daily depending on climate, cooling systems, and workload.[1,3,4]
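The comparisons above are simple arithmetic, and it can be useful to check them. The sketch below redoes the conversions using rounded assumptions of roughly 660,000 gallons per Olympic-size pool and 300 gallons per household per day:

```python
# Sanity-check the article's water comparisons with rounded constants.
GALLONS_PER_OLYMPIC_POOL = 660_000     # approximate pool volume in gallons
GALLONS_PER_HOUSEHOLD_DAY = 300        # assumed daily household use

hyperscale_daily = 3_000_000           # gallons/day, large hyperscale facility
medium_annual = 110_000_000            # gallons/year, medium-sized data center

# 3M gal/day at 300 gal/household/day -> 10,000 households
households = hyperscale_daily / GALLONS_PER_HOUSEHOLD_DAY
# 1M gal/day -> roughly 1.5 Olympic pools
pools = 1_000_000 / GALLONS_PER_OLYMPIC_POOL
# 110M gal/year -> roughly 1,000 households
medium_households = medium_annual / 365 / GALLONS_PER_HOUSEHOLD_DAY

print(f"{households:,.0f} households")            # prints "10,000 households"
print(f"{pools:.1f} Olympic pools per day")       # prints "1.5 Olympic pools per day"
print(f"~{medium_households:,.0f} households")    # roughly 1,000
```

The results line up with the figures cited earlier, which suggests the sources are using household and pool assumptions in this same ballpark.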

The Local Dimension

The biggest concern is not the aggregate national figure, but where that water comes from. A data center in a water-stressed community can have an outsized impact on local supply even if its national share is small.[1,2,6] When AI facilities are built in drought-prone or arid regions, they compete for the same water supply as homes, farms, and businesses — often without adequate public oversight or community input.

 
Infographic showing three approaches to reduce water use in AI data centers: direct-to-chip liquid cooling, siting facilities in cooler climates, and closed-loop water recirculation systems.

Three isometric panels illustrate how AI data centers are reducing water consumption. The left panel shows direct-to-chip liquid cooling, where coolant flows directly to GPU chips rather than cooling the surrounding air. The center panel depicts geographic siting strategy — placing facilities in colder climates to reduce cooling demand — with snowflakes and low-temperature gauges visible. The right panel demonstrates closed-loop water recirculation systems that capture evaporated water and recycle it rather than releasing it to the environment. Each approach reduces or eliminates the need for large-scale evaporative cooling, which consumes millions of gallons daily.

Indirect Water Use

Even data centers that use less water on-site may have an indirect water footprint through the electricity they consume. Many power plants — including natural gas and nuclear facilities — also use significant amounts of water for cooling, meaning the water cost of AI extends beyond the data center fence line.[2,3]

What Companies Are Doing

To reduce direct water consumption, companies are adopting:

  • Direct-to-chip and immersion cooling, which transfer heat without evaporating large volumes of water.[1,5]

  • Outdoor air cooling in cooler climates, reducing or eliminating the need for water-based systems entirely.[3,5]

  • Better airflow design and water recirculation systems to reduce waste.[1,5]



 

Beyond Energy and Water: E-Waste and Rebound Effects

Two additional environmental concerns deserve mention. First, AI systems require frequent hardware upgrades, which drives electronic waste and the resource-intensive manufacturing of chips and servers.[5] The environmental cost of producing a single high-end GPU — including raw materials, fabrication, and transport — is substantial before it ever processes a single query.

Second, efficiency gains from AI do not always translate into reduced overall consumption. When AI makes processes cheaper or faster, it can also encourage greater use — a phenomenon known as the rebound effect.[1,5] A more efficient logistics network, for example, may simply enable more shipments rather than reducing the total environmental load.




Infographic showing three policy levers for AI data center oversight: environmental review before construction, community protection from utility bill increases, and transparency through visible energy and water metering.

Three panels outline a balanced regulatory framework for AI infrastructure. The left panel depicts environmental review—a checklist and magnifying glass examining a data center before construction begins, representing due diligence and community input before bulldozers arrive. The center panel shows community protection as a shield over a residential neighborhood with schools and businesses, with a bill marked approved—symbolizing utility-rate safeguards that prevent residents from absorbing data center growth costs. The right panel illustrates transparency and disclosure: prominent meters on a data center's exterior, visible to the public, with community members gathered to observe them. Together, the three approaches frame a policy vision that protects communities, maintains grid reliability, and still allows productive AI innovation—ensuring that local costs don't flow to residents while benefits accrue primarily to corporations.

The Policy Question

There is a strong case for better government oversight of where AI data centers are built and how they are powered. Policies could require environmental impact reviews, local grid-impact studies, water-use disclosure, and clean energy commitments before new facilities receive approval.[1,4,7,9] Governments may also need rules protecting residents from absorbing data center growth costs through higher electricity bills.

A balanced policy framework would aim to protect communities, maintain grid reliability, and still allow productive innovation. That could include zoning rules, utility-rate protections for residents, mandatory efficiency standards, and greater transparency around energy and water use.[3,4,7,9] Transparency is especially important: most AI companies do not currently disclose the specific locations or resource consumption of their inference infrastructure.

Without such guardrails, communities may bear the costs while the benefits of AI flow primarily to the companies building it — a distributional problem that merits serious public attention.



Where am I at?

I'm still going to use AI tools. Probably including image generation. Not because I'm unaware of the cost, but because the tool solves a real problem for me — it lets me explore visual ideas before committing them to canvas. But I think about the meter now. The one that isn't there.

Using a tool responsibly doesn't mean not using it. It means using it with clear eyes about what it actually costs, and supporting the policy frameworks that distribute those costs more fairly.

 

Conclusion

AI's environmental footprint is real, growing, and unevenly distributed. Electricity and water are not abstract resources — they are shared public goods that communities depend on. As AI infrastructure scales rapidly, the decisions made now about siting, energy sourcing, efficiency standards, and transparency will shape the long-term relationship between artificial intelligence and the natural world.

The technology itself is not inherently damaging — but unchecked, unregulated expansion into resource-constrained communities is a pattern that history suggests we should approach with caution and deliberate planning.

I'm curious: do you factor infrastructure cost into your AI use? And if you work in policy or energy planning — what would actually shift this dynamic?


References & Sources

The following sources informed this article:

[1] Environmental & Energy Study Institute — Data Centers and Water Consumption. https://www.eesi.org/articles/view/data-centers-and-water-consumption

[2] Pew Research Center — What We Know About Energy Use at U.S. Data Centers Amid the AI Boom. https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/

[3] Lincoln Institute of Land Policy — Land and Water Impacts of Data Centers. https://www.lincolninst.edu/publications/land-lines-magazine/articles/land-water-impacts-data-centers/

[4] Cornell University — Roadmap Shows Environmental Impact of AI Data Center Boom. https://news.cornell.edu/stories/2025/11/roadmap-shows-environmental-impact-ai-data-center-boom

[5] Earth.org — The Green Dilemma: Can AI Fulfil Its Potential Without Harming the Environment? https://earth.org/the-green-dilemma-can-ai-fulfil-its-potential-without-harming-the-environment/

[6] Sentinel Earth — Data Centers Now Drink More Water Than Entire Cities. https://www.sentinelearth.com/post/data-centers-now-drink-more-water-than-entire-cities

[7] Brookings Institution — Confronting and Addressing Rising Energy Bills Linked to Data Centers. https://www.brookings.edu/articles/confronting-and-addressing-rising-energy-bills-linked-to-data-centers/

[8] Kogod School of Business — How AI Innovation Is Revolutionizing Sustainability. https://kogod.american.edu/news/how-ai-innovation-is-revolutionizing-sustainability

[9] CNBC — AI Data Centers, Electricity Prices, Backlash, and Ratepayer Protection. https://www.cnbc.com/2026/03/13/ai-data-centers-electricity-prices-backlash-ratepayer-protection.html

[10] Brookings Institution — AI Data Centers and Water. https://www.brookings.edu/articles/ai-data-centers-and-water/

[11] Stanford — Thirsty for Power and Water: AI Data Centers Sprout Across the West. https://andthewest.stanford.edu/2025/thirsty-for-power-and-water-ai-crunching-data-centers-sprout-across-the-west/

[12] CBC — AI Data Centre Canada Water Use. https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684
