The emissions risks of AI data center buildout
Commitments to massively expand infrastructure for artificial intelligence (AI) have accelerated significantly within the past year. The January 21st announcement of Project Stargate — a four-year, $500 billion push to scale AI infrastructure in the US by OpenAI, Microsoft, NVIDIA, SoftBank, and others — represents an unprecedented scale of infrastructure investment in a single technology sector.
It’s likely that power-hungry AI infrastructure will continue to grow and will need to be served by electricity generation in some form, existing or new, renewable or fossil-fueled, on-site or across town. If managing the climate impacts of this growth is a priority, then emissions impacts should be considered when evaluating the options for how to meet the growing AI demand. Already, AI-driven electricity demand has increased the emissions of large companies like Google and is threatening their climate goals.
Forecasts for data centers’ ballooning electricity demand
Lawrence Berkeley National Laboratory (LBNL) recently projected that data centers could consume between 6.7% and 12% of US electricity by 2028, a 2-3x increase from 2023. The corresponding load growth from data centers alone is in the range of 145-400 TWh, which may require some 33-91 GW of new generation capacity to be built by 2028. That's a massive amount of new electricity generation. And that's just for the US, a leading player in the AI landscape, but far from the only one.
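As a rough sanity check on how those TWh figures map to GW of new capacity, here's a back-of-the-envelope conversion. This is a sketch: the blended capacity factor is an assumption on our part, not a figure from the LBNL report, chosen to show the arithmetic is consistent.

```python
# Back-of-the-envelope: how much new generating capacity does
# 145-400 TWh/year of new load imply?

HOURS_PER_YEAR = 8760
ASSUMED_CAPACITY_FACTOR = 0.50  # blended across gas, wind, solar (assumption)

for load_twh in (145, 400):
    # GW of capacity = annual energy in GWh / (hours * capacity factor)
    gw = load_twh * 1000 / (HOURS_PER_YEAR * ASSUMED_CAPACITY_FACTOR)
    print(f"{load_twh} TWh/yr -> ~{gw:.0f} GW of new capacity")
# Output: ~33 GW and ~91 GW, matching the range above.
```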
The LBNL report is one to take seriously. It was created to fulfill a 2020 request from Congress by researchers, some of whom have been loudly skeptical of AI. Its numbers are in line with, or even conservative relative to, other recent projections. Meanwhile, Chinese AI startup DeepSeek's announcement last week that it produced a powerful model with a fraction of the compute used by leading AI companies is a reminder that efficiency gains are likely. These numbers are constantly in flux.
Estimating the emissions implications of data center load growth
The implications of how new electricity is supplied are massive. If all of LBNL's projected US data center load growth through 2028 were served exclusively by natural gas generation (currently the most prevalent source of generation in the US), it would result in about 180 million tonnes of additional CO2 emissions annually. If those emissions belonged to a single country, it would rank as the world's 45th largest emitter.
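That figure is easy to reproduce. In the sketch below, the gas emissions factor is an assumption in line with typical US gas fleet averages, and the ~180 Mt result corresponds to the high end of the LBNL load range:

```python
GAS_TCO2_PER_MWH = 0.45  # tonnes CO2 per MWh of gas generation (assumed fleet average)

for load_twh in (145, 400):
    # 1 TWh = 1e6 MWh, and 1e6 tonnes = 1 Mt, so Mt CO2 = TWh * tCO2/MWh
    mt_co2 = load_twh * GAS_TCO2_PER_MWH
    print(f"{load_twh} TWh/yr on gas -> ~{mt_co2:.0f} Mt CO2/yr")
# Output: ~65 Mt at the low end, ~180 Mt at the high end.
```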
The emissions impact of this massive buildout depends entirely on choices made today about where to site these facilities and how to power them. A particular computing load, depending on time and location, could cause coal to be burned in Wyoming, or gas to be burned in California, or could cause no emissions at all if it absorbs surplus wind power in Kansas or Texas.
The speed demanded by AI development timelines creates pressure to choose quick solutions over optimal ones. Co-located gas-fired power might seem like an expedient and economically wise choice, particularly given current political signals or the desire to avoid long interconnection queues. But it also carries future fuel cost risk and creates long-term emissions lock-in that will be increasingly difficult to unwind as climate pressures mount.
Ultimately, the emissions impacts are dependent on which power plants will serve the new electrical demand of these facilities. When a new grid-connected data center is switched on, one or more existing power plants will ramp up to meet the demand. In some cases, new power plants will be built to make sure enough generation capacity is available to serve the new load.
The emissions caused when existing power plants respond to changes in load (or when new plants are built) are measured by marginal emissions rates. We can use grids' marginal emissions rates to compare the climate outcomes of different data center scenarios: the emissions induced by siting a given data center on one grid versus another, and the emissions avoided by any new clean generation built in association with that load.
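To make that accounting concrete, here is a minimal sketch of the calculation: induced emissions are the sum, over time, of the facility's load multiplied by the grid's marginal emissions rate in that interval. The values below are placeholders, not real grid data.

```python
def induced_emissions(load_mw, mer_tco2_per_mwh):
    """Tonnes of CO2 induced by an hourly load profile on a given grid."""
    return sum(load * mer for load, mer in zip(load_mw, mer_tco2_per_mwh))

# A flat 100 MW load over three hours with varying marginal emissions rates:
# a fossil-heavy hour, a mixed hour, and an hour of surplus renewables.
print(induced_emissions([100, 100, 100], [0.55, 0.40, 0.0]))  # -> 95.0 tCO2
```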
Siting new data centers to cause fewer emissions
Let's compare two significant data center hubs: Northern Virginia's Data Center Alley (in the PJM grid) and Texas's emerging AI corridor (in ERCOT), where the first Stargate data center is being constructed in Abilene. The induced emissions of a grid-connected 100 MW data center operating at 95% capacity differ substantially between the two locations.
A 100 MW data center in Northern Virginia would result in about 463,000 tonnes of CO2 emissions annually, while the same facility in Texas would produce about 386,000 tonnes (17% lower).
These, of course, aren’t the only places where new data centers could get built, and in fact, neither location represents the optimal case from an induced emissions perspective.
For example, the same 100 MW facility built in Kansas (in SPP) would produce about 358,000 tonnes of CO2 annually (23% lower than in Virginia). Further, building it in Northern California (CAISO) would produce about 309,000 tonnes (an even greater 33% reduction vs. Virginia). What Kansas and California have in common is an oversupply of clean and renewable energy for many hours of the year — wind in one and solar in the other.
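These regional figures can be reproduced from the facility's annual energy use and a per-grid average marginal emissions rate. In the sketch below, the rates are back-solved from the tonnages above; they are assumptions for illustration, not published grid data:

```python
# Annual induced emissions for a grid-connected 100 MW facility at 95% utilization.
MW, UTILIZATION, HOURS = 100, 0.95, 8760
annual_mwh = MW * UTILIZATION * HOURS  # 832,200 MWh/yr

implied_mer = {  # tonnes CO2 per MWh, implied by the figures above
    "Northern Virginia (PJM)": 0.556,
    "Texas (ERCOT)": 0.464,
    "Kansas (SPP)": 0.430,
    "Northern California (CAISO)": 0.371,
}

baseline = annual_mwh * implied_mer["Northern Virginia (PJM)"]
for region, mer in implied_mer.items():
    tonnes = annual_mwh * mer
    print(f"{region}: ~{tonnes/1000:,.0f} kt CO2/yr "
          f"({(baseline - tonnes)/baseline:.0%} below Virginia)")
# Output: ~463, ~386, ~358, and ~309 kt CO2/yr (0%, 17%, 23%, 33% below Virginia).
```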
These calculations assume constant operation near maximum capacity throughout the year, typical for large data centers with critical workloads. While actual emissions would vary based on specific operating patterns and grid conditions, these numbers illustrate the massive emissions implications of siting decisions for new AI infrastructure.
Siting data centers in grids with lower marginal emissions rates can cut the potential induced emissions by up to half. That’s massive.

Building new clean power where it can avoid more emissions
As new data centers increasingly look to bring or procure their own clean power, this opens the question of where that new clean generation should be built, not just to meet data center load growth but also to avoid the most fossil emissions. (In practice, which power plants get built where has many influences, including the capacity needs of a specific balancing area, interconnection queues, transmission constraints, and other factors. But for now, let's assume total freedom to choose your location.)
With this in mind, siting new data centers on grids with lower marginal emissions rates is only half the story. The electricity generation supply side of the equation is the other.
There are two dominant ways new power plants could get built for data centers: a) co-located on the same grid balancing area as the data center itself, or b) sited on grids with higher marginal emissions rates, where new wind or solar can avoid more fossil emissions.
Continuing our earlier example of would-be new data centers in either Northern Virginia’s Data Center Alley or Texas’s emerging AI corridor, let’s look at a few scenarios for emissions implications, depending on where new wind or solar capacity gets built in association with the new data center load. For example:
- Building a new data center in VA or TX, and co-locating new renewables there, too
- Building a new data center in a grid region with lower marginal emissions rates, such as Northern CA or KS, and co-locating new renewables there
- Building a new data center in a grid region with lower marginal emissions rates, and siting new renewables in other grid regions with higher marginal emissions rates and where wind or solar would avoid more emissions (such as Colorado-Wyoming or Kenya)
We see a clear pattern. When renewables sized to 100% of data center load are procured from within the same grid as the data center, they have a more modest avoided-emissions effect relative to the data center load's induced emissions. On the other hand, siting data centers on grids with lower marginal emissions rates, and then investing in new renewable capacity on grids with higher marginal emissions rates (where wind and solar displace more fossil fuel generation), can generate substantial net reductions in total emissions. This approach of building renewable energy in the most impactful places, regardless of where data centers are built, is already being used by Amazon, Meta, Apple, and Salesforce.
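Here is a minimal sketch of the accounting behind that pattern: net emissions are the load's induced emissions minus the renewables' avoided emissions, each valued at the marginal emissions rate of the grid they sit on. All rates below are illustrative placeholders, not measured grid data.

```python
# net = (data center load x MER of its grid)
#     - (renewable generation x MER of the grid the renewables are on)
# Renewables are sized to 100% of the data center's annual energy.

ANNUAL_MWH = 100 * 0.95 * 8760  # the same 100 MW facility as above

def net_emissions(dc_mer, ren_avoided_mer):
    """Net tonnes CO2/yr: induced by the load minus avoided by renewables."""
    return ANNUAL_MWH * dc_mer - ANNUAL_MWH * ren_avoided_mer

# Renewables' avoided rate is set below the load's induced rate because wind
# and solar generate most in hours when the margin is already cleaner.
scenarios = {
    "DC + renewables both in VA (PJM)":          (0.556, 0.45),
    "DC + renewables both in N. CA (CAISO)":     (0.371, 0.30),
    "DC in N. CA, renewables on a high-MER grid": (0.371, 0.65),
}
for name, (dc_mer, ren_mer) in scenarios.items():
    print(f"{name}: net ~{net_emissions(dc_mer, ren_mer)/1000:+,.0f} kt CO2/yr")
# Output: roughly +88, +59, and -232 kt CO2/yr: only the cross-grid
# scenario yields a net reduction under these placeholder rates.
```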

The role of compute load shifting to further reduce emissions
While siting decisions have the largest impact on emissions, there's also potential to reduce emissions of data center use through smart load management. Data centers, particularly those running AI training workloads, require extremely reliable, constant power. Once a training run starts, interruptions can waste days or weeks of compute time. This makes them less flexible than other types of new electricity demand like EV charging, where timing can be shifted to match clean energy availability.
But many data center workloads, like batch processing and cooling, are timing-flexible, so shifting that energy use to times of day when marginal emissions are lower, such as when renewable generation would otherwise be curtailed, can achieve large emissions reductions.
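Here's a simple sketch of that idea: given forecast hourly marginal emissions rates, a deferrable job runs in the cleanest hours of the day instead of at a fixed start time. The hourly rates and job parameters are made up for illustration.

```python
# Hourly marginal emissions rates (tCO2/MWh), placeholder values with a
# midday renewable surplus; hours 0-23.
hourly_mer = [0.55, 0.52, 0.50, 0.48, 0.45, 0.40,
              0.30, 0.15, 0.05, 0.00, 0.00, 0.05,
              0.10, 0.20, 0.35, 0.45, 0.50, 0.55,
              0.60, 0.62, 0.60, 0.58, 0.56, 0.55]

JOB_HOURS, JOB_MW = 6, 20  # a 6-hour, 20 MW deferrable batch workload

# Pick the 6 cleanest hours (a real scheduler would also handle
# contiguity constraints and deadlines).
cleanest = sorted(range(24), key=lambda h: hourly_mer[h])[:JOB_HOURS]
shifted = sum(hourly_mer[h] * JOB_MW for h in cleanest)
naive = sum(hourly_mer[h] * JOB_MW for h in range(JOB_HOURS))  # start at midnight

print(f"naive: {naive:.0f} t CO2, shifted: {shifted:.0f} t CO2")
# Output: naive: 58 t CO2, shifted: 7 t CO2
```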
While load shifting alone won't solve data center emissions, it represents another tool for reducing emissions impact, particularly for facilities handling flexible workloads beyond AI model training. These techniques are already being used by Microsoft, UBS, and other members of the Green Software Foundation.
Data centers will require massive amounts of energy, even if we don’t know precisely how much. And the combination of high reliability requirements and constant load patterns means careful planning is crucial — rushed infrastructure decisions could lock in unnecessarily high emissions for decades.
Conclusion
At a moment when electrification is accelerating across the economy, from vehicles to buildings, the surge in AI infrastructure presents both a challenge and an opportunity. By making smart decisions now about where to build data centers, how to power them, and how to operate them, we can ensure the revolution in compute drives, rather than hinders, the clean energy transition.
The unprecedented scale of AI infrastructure investment — from Stargate's $500 billion commitment to the broader industry — represents the largest concentrated buildout of computing power in history. Every decision about where and how to build this infrastructure matters more than ever.