In just a few years, AI technology has become a legitimate threat to the earth’s environment. Without guardrails in place, the AI competition between the U.S. and China could outstrip the capacity of current energy infrastructure.
Part 1: Surging Demand, Hidden Costs
Local communities in the U.S. are beginning to feel the AI strain: in Memphis, Tennessee, residents discovered that Elon Musk’s new xAI supercomputer center had stealthily installed dozens of gas-fired generators without permits. These gas-powered generators accelerated the training of the Grok chatbot to compete with OpenAI’s ChatGPT and other leading GenAI products – but allegedly increased local smog by 30-60%, driving a sharp spike in residents’ hospital visits for respiratory issues. Musk’s AI company subsequently downplayed the controversy.
City officials now face the delicate task of balancing big-tech job creation against the harmful pollution of power-hungry AI firms. On a bigger scale, the allegations around xAI highlight a pressing question: can the AI race between the U.S. and China continue without wrecking environmental sustainability?

The xAI supercomputer data center Colossus in Memphis – powered by dozens of on-site methane gas turbines – became a flashpoint for community protests over air pollution and lack of oversight. Such examples underscore the often-overlooked environmental toll of the AI arms race.
Data Centers: The New Power Hogs
There are roughly 11,000 data centers globally: about half are located in the U.S., a quarter in Europe, and a quarter in China. In 2023, data centers in the United States drew about 19 gigawatts (GW) of power on average – roughly 4% of all the electricity produced in the country, equivalent to powering about 15 million homes for a year. That figure has doubled since 2017, mainly because new technology like AI needs much more energy. Experts expect data centers to keep using more electricity every year; by 2030 they could consume up to 12% of all U.S. electricity. Morgan Stanley, a bank, puts the resulting supply gap at 45 GW by 2028. Bernstein, a broker, estimates America’s potential power shortfall at 17 GW by 2030 if training chips become more efficient – and at 62 GW if they do not.
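A quick back-of-the-envelope check makes these equivalences concrete; the ~11,000 kWh per household per year used below is an assumed average, close to commonly cited U.S. figures:

```python
# Back-of-envelope check of the data-center power figures above.
avg_draw_gw = 19                    # average U.S. data-center draw, 2023
hours_per_year = 8760
annual_twh = avg_draw_gw * hours_per_year / 1000    # GWh -> TWh

home_kwh_per_year = 11_000          # assumed U.S. household average
homes_powered = annual_twh * 1e9 / home_kwh_per_year

print(f"{annual_twh:.0f} TWh/year")               # ~166 TWh
print(f"~{homes_powered / 1e6:.0f} million homes")  # ~15 million homes
```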
Every AI prompt served has a very real footprint. In 2023, American data centers emitted an estimated 105 million metric tons of CO₂ – roughly equivalent to the annual emissions of about 14 million U.S. homes, or about 2.2% of total U.S. emissions. Unfortunately, 56% of the power fueling U.S. server farms still comes from fossil fuels – a major source of greenhouse gases. Chinese data centers, mainly located in remote regions such as Tibet, Guizhou, and Hebei, emitted around 84 million metric tons of CO₂ in 2023, according to CAICT, a China-based technology think tank. Relative to its lower energy consumption, China’s emissions run high because its energy supply, still driven by coal, has a high carbon intensity. By 2030, Chinese AI data centers’ carbon emissions might double if China does not build the right sustainable AI development infrastructure.
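Two of the quoted U.S. figures can be cross-checked against each other, and both implied values land near commonly cited averages (roughly 7.5 tons of CO₂ per household per year, and a U.S. national total near 5 billion tons):

```python
# Consistency check on the U.S. data-center emissions figures above.
dc_emissions_t = 105e6                  # 2023 estimate, metric tons CO2
per_home_t = dc_emissions_t / 14e6      # implied t CO2 per home per year
implied_us_total_gt = dc_emissions_t / 0.022 / 1e9  # implied U.S. total

print(per_home_t)            # ~7.5 t CO2/home -> plausible household average
print(implied_us_total_gt)   # ~4.8 Gt -> close to total U.S. CO2 emissions
```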
AI’s hunger for energy and water
Data centers are where most of the AI training takes place. Large language models (LLMs) like OpenAI’s GPT series and China’s new DeepSeek models demand massive electricity for training, water for cooling, and specialized hardware (thousands of GPUs or AI chips).
For example, OpenAI’s GPT-3 consumed on the order of 1.2–1.3 million kWh of electricity during training – enough to power about 120 American homes for a year – and emitted an estimated 550 metric tons of CO₂, roughly as much as several round-trip flights on a Boeing 777.
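The same back-of-the-envelope style verifies both equivalences; the household consumption and grid carbon intensity used below are assumed averages, close to commonly cited U.S. values:

```python
# Sanity-check the GPT-3 training figures quoted above.
training_kwh = 1.3e6                 # upper end of the quoted range
home_kwh_per_year = 11_000           # assumed U.S. household average

print(training_kwh / home_kwh_per_year)  # ~118 -> "about 120 homes"

emissions_kg = 550 * 1000
print(emissions_kg / training_kwh)       # ~0.42 kg CO2/kWh, close to the
                                         # average U.S. grid intensity
```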
How does this compare to China’s latest contender? DeepSeek-R1, launched in early 2025, is a cutting-edge “reasoning” model from Chinese startup DeepSeek that rivals the capabilities of GPT-4. DeepSeek’s founders claim they achieved similar performance at a fraction of the cost: according to the company, DeepSeek-R1 is 20–50× cheaper to operate per task than OpenAI’s models. This suggests enormous potential energy savings – indeed, DeepSeek said it trained its smaller V3 model on less than $6 million worth of Nvidia AI chips, pointing to higher efficiency.
Training GPT-3 also consumed water: cooling its heat-intensive servers drank up an estimated 700,000 liters – more than a quarter of an Olympic swimming pool – for a single training run, and much of that water left the cooling system as warm wastewater. Indeed, developing a state-of-the-art model often involves many iterative runs and experiments, adding significantly to the carbon and water footprint of AI development.
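The pool comparison is easy to verify: a regulation Olympic pool is 50 m × 25 m with a 2 m minimum depth, or about 2,500 cubic meters:

```python
# Checking the Olympic-pool comparison above.
water_liters = 700_000
pool_liters = 50 * 25 * 2 * 1000   # 2,500 m^3 -> 2.5 million liters

print(water_liters / pool_liters)  # ~0.28 -> just over a quarter of a pool
```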
From One-Off Training to Perpetual Inference
While training a single model is resource-intensive, the usage phase (inference) can quickly eclipse it. Unlike training, which happens occasionally, inference – deploying the model to answer billions of user queries every day – runs continuously across millions of prompts and could consume 25 times the energy used to train the model.
Every time you chat with an AI, data center servers rev up to generate the response. And because more advanced models like GPT-5 and R1 rely on multi-step reasoning, they consume more energy to produce more accurate answers.
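A rough sketch shows why inference at scale dwarfs a one-off training run. The per-query energy below is an assumption, not a measured figure – published estimates for chatbot queries range from roughly 0.3 to 3 watt-hours:

```python
# Why continuous inference eclipses one-off training (rough sketch).
queries_per_day = 1e9        # "billions of user queries every day"
wh_per_query = 1.0           # assumed middle value; estimates vary widely

annual_gwh = queries_per_day * wh_per_query * 365 / 1e9  # Wh -> GWh
training_gwh = 1.3                                       # GPT-3, from above

print(annual_gwh)                 # ~365 GWh/year of inference
print(annual_gwh / training_gwh)  # ~280x the energy of one training run
```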
Northern Virginia’s power grid, home to the world’s densest cluster of data centers, is groaning under AI-driven demand. The cost to secure future electricity has soared tenfold in two years. Utilities face a race to build enough capacity, pushing residential bills in the Mid-Atlantic 30-60% higher by 2030. With over 90% of new demand expected from data centers, consumers may soon find themselves footing the bill for the AI boom.
The outputs of training and inference at this scale are tangible: carbon emissions, power grid strain, and enormous water use – often returned as heated wastewater that can stress local ecosystems. Going forward, AI sustainability literacy should perhaps become the responsibility of the companies creating that demand.
A need to reframe the problem: a race to build sustainable AI infrastructure
The AI race between the U.S. and China is accelerating, but without a sustainability framework, it risks becoming a race to the bottom. Energy-hungry data centers, water-intensive cooling, and the arms race for chips are outpacing environmental safeguards. Both nations face shared resource constraints, yet treat them as separate problems. My next piece, “Different Roads to Sustainable AI Systems,” explores how divergent policy paths could still lead to smarter, greener infrastructure.
