If the US Has to Build Data Centers, Here’s Where They Should Go

Summary

A new Nature Communications analysis models how the US data-center buildout for AI could affect energy, water use, and emissions through the end of the decade. The study combines projected AI chip demand with regional electricity mixes and water-scarcity data to identify which states could host future facilities with the lowest environmental footprint. It finds that Texas, Montana, Nebraska, and South Dakota strike the best balance of low water stress and cleaner grids, in contrast with the current concentration of development in places such as Virginia, Northern California, and Arizona.

Key Points

  • The study projects the environmental impacts of future AI data-center growth using demand for AI chips together with regional electricity and water data.
  • States best suited for new AI server installations include Texas, Montana, Nebraska, and South Dakota, owing to favorable combinations of energy and water availability.
  • Historic hotspots such as Virginia and Northern California remain popular because of connectivity, talent, and incentives, but may face resource and policy tensions.
  • Water is a crucial constraint: states like Arizona face severe water scarcity that can limit sustainable expansion.
  • If grid decarbonization stalls while AI demand outpaces efficiency gains, the US buildout could add up to ~44 million tonnes of CO2e annually in worst-case scenarios.
  • Technological advances (cooling, model efficiency, onsite energy, even new nuclear) and policy choices can substantially alter outcomes.
  • The authors call for greater transparency on emissions and resource use from the companies driving the buildout, to inform smarter siting decisions.

Content summary

The research models multiple scenarios and warns that corporate net-zero pledges are likely to be undermined unless data-center siting, energy policy, and technology improvements are aligned. Historically attractive regions have advantages (connectivity, workforce, tax breaks) but may not be optimal environmentally. The paper emphasizes that where data centers are built matters as much as how they are built; policy and utility choices (for example, investing in gas versus wind) will significantly influence outcomes. The authors and outside experts agree the projections are uncertain, but the central message is clear: smart siting, cleaner grids, and better technology can greatly reduce the climate and water impacts of AI infrastructure growth.

Context and relevance

For policymakers, utilities, planners, and tech companies, this analysis frames a near-term problem: an AI-driven construction boom that could lock in emissions and water demand unless sites are chosen with emissions and resource limits in mind. It connects to broader trends, including rapid AI adoption, grid-decarbonization debates, and local competition for economic development deals, and highlights the tension between attracting investment and meeting climate and water goals. The study offers actionable prioritization (which states are better candidates) while reminding readers that rapid changes in technology or energy policy could shift the picture.

Why should I read this?

Short version: if you care about AI, climate, or where the next big tech hubs will sprout, this is worth five minutes. The piece cuts through the hype and shows which US regions actually make sense for expanding AI infrastructure, and why building in the wrong places could blow a hole in emissions targets or drain scarce water. We've done the heavy lifting: this tells you the states to watch and the policy levers that matter.

Source

Source: https://www.wired.com/story/heres-where-to-build-data-centers-to-keep-emissions-down/

Article Date: 2025-11-10T17:40:27+00:00