
Key Takeaways
- Training GPT-3 generated 552 tons of CO2 equivalent, equal to driving 123 gasoline-powered cars for a year. A single generative AI query uses 7-10 times more electricity than a standard Google Search (MIT News, 2025).
- Google's data centers consumed 30.8 TWh of electricity in 2024, more than double the 14.4 TWh they used in 2020. US data centers as a whole consumed 176 TWh in 2023, or 4.4% of all US electricity.
- The IEA projects global data center electricity to reach 945 TWh by 2030. All four major hyperscalers (Google, Microsoft, Meta, Amazon) have made carbon commitments, but their AI-driven electricity consumption is growing faster than their renewable energy capacity.
AI has a real environmental impact. That is the honest answer to whether AI is bad for the environment. Training large models generates hundreds of tons of carbon dioxide equivalent per run, inference at scale draws electricity on the order of a small nation's consumption, and cooling those data centers evaporates billions of gallons of fresh water each year. The impact is measurable, it is growing, and the industry's current mitigation commitments are not keeping pace with its expansion.
The numbers are larger than most people expect. Google's data centers consumed 30.8 TWh of electricity in 2024, more than double their 2020 figure of 14.4 TWh (Google Environmental Report, 2024). That doubling happened in four years, driven almost entirely by AI workload growth. And Google, with its historically strong efficiency record, is one of the better performers in the industry.
This article covers the specific energy, carbon, and water figures for AI infrastructure; what Google, Microsoft, Meta, and Amazon are actually doing about it; and whether the efficiency gains and renewable commitments are enough to change the trajectory. There is also a genuine other side to this story: AI is being used to reduce energy consumption in ways that will not show up in data center power bills. Both are true at the same time.
What is AI's environmental impact?
AI's environmental impact comes from three distinct sources, each with a different scale and time profile: the electricity consumed during model training, the electricity consumed during inference (running the model to answer queries), and the hardware manufacturing footprint of the GPUs and servers that make both possible.
Training is intensive and infrequent. Building a large language model like GPT-3 or Llama 3 requires weeks of continuous computation on thousands of GPUs. That single run generates a fixed carbon cost. Inference is lighter per query but happens billions of times per day, which makes the cumulative total comparable to or larger than the training cost over time. Hardware manufacturing adds a third layer that rarely appears in corporate sustainability reports: fabricating a high-end GPU generates roughly 150-200 kg of CO2 equivalent before it ever powers on.
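To see why inference can overtake training, it helps to run the arithmetic. The sketch below uses the rough per-query emission range cited later in this article and an estimated query volume; both are published estimates rather than measured values, so treat the output as an order-of-magnitude illustration.

```python
# Order-of-magnitude sketch: cumulative inference emissions vs one training run.
# Inputs are rough published estimates, not measured values.
TRAINING_CO2E_TONS = 552        # one GPT-3-scale training run
QUERIES_PER_DAY = 500_000_000   # estimated ChatGPT query volume
CO2E_PER_QUERY_KG = 0.001       # mid-range value from the 0.0003-0.003 kg/query range

daily_inference_tons = QUERIES_PER_DAY * CO2E_PER_QUERY_KG / 1000
days_to_match = TRAINING_CO2E_TONS / daily_inference_tons

print(f"Inference: ~{daily_inference_tons:,.0f} tons CO2e per day")
print(f"Matches the one-time training cost in ~{days_to_match:.1f} days")
```

At that mid-range assumption, a ChatGPT-scale service emits roughly as much CO2e from inference in a single day as the original GPT-3 training run did in total, which is why lifetime footprints end up dominated by inference.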
The resource consumption that matters for understanding AI's environmental footprint breaks down this way:
- Electricity: Powers both training clusters and inference servers. The majority of AI's ongoing environmental impact comes from inference at scale.
- Water: Used to cool data center hardware through evaporative cooling towers and direct liquid cooling systems. Generative AI workloads produce significantly more heat per rack than conventional server workloads.
- Carbon: Determined by electricity consumption multiplied by the carbon intensity of the local grid. A data center running on Wyoming coal power has a different carbon footprint than one running on Norwegian hydroelectric, even with identical hardware (see the sketch after this list).
- Hardware waste: GPU generations turn over every 2-3 years. Discarded servers and accelerators contain hazardous materials including mercury and lead, which create disposal challenges.
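Because the carbon line item is just electricity multiplied by grid intensity, the Wyoming-versus-Norway contrast above is easy to make concrete. The sketch below uses representative public intensity figures and a hypothetical 100 GWh/year facility; none of these numbers come from an actual data center.

```python
# Carbon footprint = electricity consumed x carbon intensity of the local grid.
# Intensity values are representative public figures, not exact grid data.
GRID_KG_CO2E_PER_KWH = {
    "coal-heavy grid (e.g. Wyoming)": 0.95,
    "US average grid": 0.39,
    "hydro-heavy grid (e.g. Norway)": 0.02,
}

FACILITY_KWH_PER_YEAR = 100_000_000  # hypothetical 100 GWh/year data center

for grid, intensity in GRID_KG_CO2E_PER_KWH.items():
    tons = FACILITY_KWH_PER_YEAR * intensity / 1000
    print(f"{grid}: {tons:,.0f} tons CO2e/year")
```

The same hypothetical facility spans roughly 2,000 to 95,000 tons CO2e per year depending on nothing but its grid connection.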
The key thing to understand about AI's environmental position is that it sits inside the broader data center sector, which itself sits inside the global electricity system. Data centers as a whole are not uniquely dirty compared to other industries, but they are growing fast at a time when the grid is still substantially carbon-intensive.
| Source | Scale | Timing |
|---|---|---|
| Model training | Hundreds to thousands of tons CO2e per run | Infrequent (months apart) |
| Inference at scale | Continuous, comparable to mid-size cities | Every day |
| Hardware manufacturing | 150-200 kg CO2e per GPU | Every 2-3 years per device |
| E-waste | Hazardous materials (Hg, Pb) | End-of-life, 3-5 year cycles |
How much electricity does AI use?
A generative AI query uses 7-10 times more electricity than a standard Google web search (MIT News, January 2025). That multiplier adds up quickly at scale. ChatGPT alone handles an estimated 500 million queries per day. If each query uses 0.3 watt-hours of electricity, the daily inference load is around 150 MWh. At the 7-10x multiplier versus search, the equivalent conventional search load would be only 15-21 MWh. The difference is real and large.
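Those figures are simple to verify. The sketch below assumes the 0.3 Wh-per-query estimate and the 7-10x search multiplier quoted above; both are estimates, not disclosed measurements.

```python
# Daily inference electricity for a ChatGPT-scale service,
# using the estimates quoted above (not disclosed measurements).
QUERIES_PER_DAY = 500_000_000
WH_PER_AI_QUERY = 0.3          # per-query estimate
SEARCH_MULTIPLIERS = (7, 10)   # AI query vs conventional web search

ai_load_mwh = QUERIES_PER_DAY * WH_PER_AI_QUERY / 1_000_000
print(f"AI inference load: ~{ai_load_mwh:.0f} MWh/day")

for m in SEARCH_MULTIPLIERS:
    print(f"Same volume as web searches (1/{m}): ~{ai_load_mwh / m:.0f} MWh/day")
```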
For data centers overall, the Lawrence Berkeley National Laboratory estimated that US data centers consumed 176 TWh of electricity in 2023, representing 4.4% of all US electricity generation. That figure was already elevated by early AI workload growth. The projection from the same research is 325-580 TWh by 2028, or 6.7-12% of US electricity.
Google's figures are the most transparent available because their annual Environmental Report discloses actual consumption data. Their data centers used 14.4 TWh in 2020 and 30.8 TWh in 2024. That is a 114% increase in four years, driven by AI compute expansion.
The Number Most Guides Don't Show
Google's data center electricity grew at a compound annual rate of approximately 21% between 2020 and 2024. If that rate held to 2028, Google alone would consume around 65 TWh. The entire country of Denmark, population 5.9 million, uses roughly 34 TWh per year. One company's AI infrastructure expansion would, by that measure, consume nearly twice Denmark's national electricity output. That extrapolation is not a forecast: efficiency improvements and renewable energy buildout will modify the actual trajectory. But the directional scale is accurate, and it explains why power utilities in Virginia, Texas, and the US Southeast are renegotiating long-term supply contracts specifically for hyperscaler AI campuses.
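The callout's growth math can be reproduced in a few lines. As the callout itself notes, the 2028 figure is an extrapolation, not a forecast.

```python
# Compound annual growth of Google's data center electricity, 2020-2024,
# extrapolated illustratively (not forecast) to 2028.
TWH_2020, TWH_2024 = 14.4, 30.8
DENMARK_TWH = 34  # approximate national annual consumption

cagr = (TWH_2024 / TWH_2020) ** (1 / 4) - 1
twh_2028 = TWH_2024 * (1 + cagr) ** 4

print(f"CAGR 2020-2024: {cagr:.1%}")              # ~21%
print(f"Extrapolated 2028: ~{twh_2028:.0f} TWh")  # ~66 TWh
print(f"Multiple of Denmark: {twh_2028 / DENMARK_TWH:.1f}x")
```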
Global data center electricity consumption was approximately 460 TWh in 2022, about 2% of worldwide electricity. The IEA's Electricity 2025 report projects global data centers will consume 945 TWh by 2030, roughly equal to Japan's total annual electricity consumption today.
| Year | US Data Center Electricity | % of US Grid | Source |
|---|---|---|---|
| 2023 | 176 TWh | 4.4% | Lawrence Berkeley NL |
| 2024 | ~200 TWh (est.) | ~5% | IEA projection |
| 2028 | 325-580 TWh | 6.7-12% | Lawrence Berkeley NL |
| 2030 | ~945 TWh (global total) | N/A | IEA |
The carbon footprint of AI: training vs inference
Training a large AI model once generates a fixed, measurable carbon cost. Researchers at the University of Massachusetts Amherst calculated that training GPT-3 consumed 1,287 megawatt-hours of electricity and produced 552 tons of carbon dioxide equivalent. That is roughly equivalent to driving 123 gasoline-powered passenger vehicles for a year, or taking 300 round-trip flights between New York and San Francisco.
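Those equivalences follow directly from per-unit emission factors. A quick check, assuming the EPA and ICAO per-unit figures from the comparison table below:

```python
# Reconcile the GPT-3 training figure with the quoted equivalences,
# using the per-unit factors from the comparison table below.
TRAINING_TONS = 552
CAR_TONS_PER_YEAR = 4.5       # average US passenger vehicle (EPA)
FLIGHT_TONS_ROUND_TRIP = 1.8  # NYC-San Francisco round trip (ICAO average)

print(f"Car-years: ~{TRAINING_TONS / CAR_TONS_PER_YEAR:.0f}")                # ~123
print(f"NYC-SF round trips: ~{TRAINING_TONS / FLIGHT_TONS_ROUND_TRIP:.0f}")  # ~307
```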
GPT-3 was a 2020-era model. Current frontier models are larger, trained longer, and often trained multiple times during development. The carbon cost of training GPT-4 or a comparable 2024-2025 model is estimated at multiples of the GPT-3 figure, though OpenAI and other labs have not disclosed the specific numbers. Training costs are a one-time expense per model version. Inference costs are ongoing.
This matters because inference, not training, now accounts for the majority of AI's total carbon footprint over a model's lifetime. A model trained once is queried billions of times. The per-query carbon cost is small, but the cumulative total across a year of operation at ChatGPT scale is substantial. For a deeper look at how training and inference differ in their resource requirements, see our AI training vs inference explainer.
"Computing power required to train cutting-edge AI models has doubled every 3.4 months since 2012." (MIT News, 2025)
That doubling rate, sustained for 10 years, represents roughly 35 doublings, an increase of about ten orders of magnitude in compute requirements at the frontier. Hardware efficiency has improved significantly over the same period, which is why energy consumption hasn't grown at anything close to that rate. But efficiency gains are being absorbed by more compute, more queries, and more models, rather than translating into lower total consumption.
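A doubling every 3.4 months compounds brutally, and the exponent is easy to get wrong. This short calculation shows what the quoted rate implies over a decade:

```python
# What "doubling every 3.4 months" implies over 10 years.
import math

months = 10 * 12
doubling_period = 3.4  # months

doublings = months / doubling_period
factor = 2 ** doublings

print(f"Doublings in 10 years: {doublings:.1f}")          # ~35.3
print(f"Total growth factor: {factor:.2e}")               # ~4e10
print(f"Orders of magnitude: {math.log10(factor):.1f}")   # ~10.6
```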
A 2024 paper in Scientific Reports added a counterintuitive finding: for programming tasks specifically, evaluated AI systems produced 5-19 times more carbon emissions than human programmers completing the same work. A different paper in the same journal found the opposite for writing tasks, with AI producing 130-2,900 times lower emissions. The right answer depends entirely on what task you are measuring and how you define the scope of the comparison.
| Activity | CO2 Equivalent | Source |
|---|---|---|
| Train GPT-3 once | 552 tons CO2e | UMass Amherst, 2019 |
| Drive average US car for 1 year | 4.5 tons CO2e | EPA |
| Round-trip flight, NYC to San Francisco | ~1.8 tons CO2e | ICAO average |
| Single ChatGPT query | ~0.0003-0.003 kg CO2e (varies by grid) | MIT estimates, 2025 |
| Human programmer, 1 year | ~7 tons CO2e (office + commute) | Scientific Reports, 2024 |
AI's water footprint
Every AI query also costs water. Data center cooling systems remove heat from GPU racks using air, water, or a combination of the two. Evaporative cooling, the most common approach in large US data centers, pushes heated water through cooling towers where it partially evaporates. That evaporated water leaves the facility and does not return.
According to a 2023 study from the University of California Riverside, responding to 20-50 ChatGPT prompts consumes approximately 500ml of fresh water in cooling. That figure is highly location-dependent. A data center in a hot, dry climate runs its cooling systems harder and evaporates more water per unit of compute than a facility in Iceland running on naturally cold air. For a detailed breakdown of how AI water consumption works across different facilities and cooling technologies, see our full AI water usage guide.
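Dividing the UC Riverside figure through gives a per-prompt range, which can then be scaled to a ChatGPT-sized query volume. The daily query count below is the same rough estimate used earlier in this article, so the totals are illustrative only.

```python
# Per-prompt water from the UC Riverside figure (500 ml per 20-50 prompts),
# scaled to an estimated ChatGPT-sized query volume. Illustrative only.
ML_PER_BATCH = 500
PROMPTS_PER_BATCH = (50, 20)   # best case and worst case
QUERIES_PER_DAY = 500_000_000
LITERS_PER_GALLON = 3.785

for prompts in PROMPTS_PER_BATCH:
    ml_per_prompt = ML_PER_BATCH / prompts
    liters_per_day = ml_per_prompt * QUERIES_PER_DAY / 1000
    gallons_per_day = liters_per_day / LITERS_PER_GALLON
    print(f"{ml_per_prompt:.0f} ml/prompt -> ~{gallons_per_day / 1e6:.1f} million gallons/day")
```

At 10-25 ml per prompt, a 500-million-query day evaporates roughly 1.3 to 3.3 million gallons of cooling water, concentrated at whichever facilities serve the traffic.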
Microsoft disclosed that its data centers in Iowa evaporated 13.4 million gallons of water in August 2022 training a single large model. The contrast with air-cooled facilities is striking. Google's air-cooled data center in Texas used roughly 10,000 gallons across all of 2024. Its water-cooled Iowa facility used approximately 1 billion gallons in the same period. That 100,000x difference in water consumption came from the same AI workloads processed through different cooling architectures.
The water concern is not about global freshwater depletion at current AI scales. It is about local stress on municipal water supplies near major data center clusters. Areas around Loudoun County, Virginia (the world's largest data center hub); Maricopa County, Arizona; and the Dallas-Fort Worth corridor have seen water negotiations with hyperscalers become municipal policy issues.
"AI's water consumption strains municipal water supplies and can disrupt local ecosystems near data center campuses." (MIT News, January 2025)
What Google, Microsoft, Meta, and Amazon are doing
All four major hyperscalers have made public carbon commitments. None are currently on track when measured against the pace of their AI infrastructure expansion.
Google has been carbon-neutral since 2007, achieved by purchasing renewable energy credits to match its total electricity consumption. The more ambitious target is 24/7 carbon-free energy by 2030, meaning every hour of electricity use matched to carbon-free sources in the same hour and region, not just on an annual average. In 2024, Google achieved 66% of its electricity consumption hour-matched to carbon-free energy, up from lower figures in prior years but still short of the 100% target. Regional performance varies sharply: 92% in Latin America, 5% in the Middle East and Africa.
"The end game was 24/7 carbon-free energy around the clock everywhere we operate at all times." (Michael Terrell, Google Head of Advanced Energy, 2025)
Microsoft has committed to being carbon-negative by 2030 and to removing all the carbon it has historically emitted by 2050. The company is targeting a Power Usage Effectiveness (PUE) of 1.125 across its data centers by 2025. PUE measures data center efficiency as total facility power divided by the power delivered to computing equipment: a score of 1.0 means all electricity goes to computing; a score of 1.125 means overhead like cooling and power distribution adds 12.5% on top of the computing load, or roughly 11% of total electricity. For reference, Google's current fleet-wide PUE is approximately 1.10.
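PUE arithmetic is a common source of confusion because the overhead is expressed relative to the computing load, not the total draw. A minimal sketch making the distinction explicit:

```python
# PUE = total facility power / power delivered to IT equipment.
# Overhead as a share of *total* power differs from the PUE excess over 1.0.
def overhead_shares(pue: float):
    overhead_vs_it = pue - 1.0             # overhead relative to IT load
    overhead_vs_total = (pue - 1.0) / pue  # overhead relative to total draw
    return overhead_vs_it, overhead_vs_total

for pue in (1.125, 1.10):
    vs_it, vs_total = overhead_shares(pue)
    print(f"PUE {pue}: overhead = {vs_it:.1%} of IT load, {vs_total:.1%} of total")
```

At Microsoft's 1.125 target, overhead is 12.5% of the computing load but about 11.1% of total electricity; at Google's 1.10, the corresponding figures are 10% and about 9.1%.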
Amazon has committed to net-zero carbon across all operations by 2040 through its Climate Pledge. Meta has committed to net-zero emissions across its value chain by 2030.
The challenge all four companies face is that their AI infrastructure buildout is expanding faster than their renewable energy procurement. Google's electricity consumption doubled in four years. Renewable energy capacity does not scale at the same speed.
| Company | Carbon Commitment | Timeline | 2024 Progress |
|---|---|---|---|
| Google | 24/7 carbon-free energy | 2030 | 66% hour-matched |
| Microsoft | Carbon-negative | 2030 | PUE 1.125 target |
| Amazon | Net-zero (Climate Pledge) | 2040 | Largest renewable purchaser by volume |
| Meta | Net-zero value chain | 2030 | 100% renewable matching since 2020 |
Does AI help the environment?
AI is simultaneously one of the largest new sources of industrial electricity demand and one of the most effective tools available for reducing energy consumption in other sectors. Both are true, and the net effect depends heavily on which applications scale fastest.
The positive cases are substantial. Google's DeepMind division applied reinforcement learning to the cooling systems of Google's own data centers and reduced cooling energy consumption by approximately 30%. At the scale of Google's global data center footprint, a 30% reduction in cooling overhead is significant. The same approach is being applied to industrial facilities and power grid management.
AI is also used in climate modeling, allowing researchers to run more scenarios faster and identify tipping points in Earth systems that would otherwise take decades of observational data to detect. Materials discovery is another area where AI has demonstrated genuine efficiency gains: identifying potential compounds for more efficient solar panels, better batteries, and lower-carbon cement formulations faster than conventional laboratory methods.
The honest accounting looks like this: the AI applications that reduce environmental impact tend to be narrow, specialized, and slow to deploy at grid scale. The AI applications that consume the most energy tend to be broad consumer products running billions of queries per day. The efficiency gains from AI-optimized cooling at Google benefit one company's data centers. The energy cost of ChatGPT's inference load is distributed across a global fleet of GPU servers.
Whether AI net-helps or net-hurts the environment is therefore not a question with a single answer. It is a question of which use cases dominate over the next decade, and whether the efficiency gains from AI applied to energy systems scale faster than the direct energy consumption of AI infrastructure itself.
The 2030 outlook: where AI energy use is heading
The trajectory is clear even if the exact endpoint is not. The IEA's 2025 Electricity report projects global data center electricity consumption will reach 945 TWh by 2030. Germany, one of the largest economies in the world, currently uses about 480 TWh per year. By 2030, the world's data center infrastructure would consume nearly twice Germany's national electricity output.
The AI subset of that total is projected at 200-400 TWh globally by 2030 (IEA analysis, 2025), representing 35-50% of all data center electricity. AI workloads growing from a small fraction of data center compute to the majority driver in under a decade is not typical infrastructure growth. It is a structural shift in what the internet is for and what it costs to run.
Two forces compete with that growth trajectory. The first is hardware efficiency. Each new generation of NVIDIA GPUs delivers significantly more computation per watt than its predecessor. The H100 delivers roughly 3-4x the AI training throughput of the A100 at similar power draw. The B200 continues that trend. If the efficiency curve holds, the compute required for a given AI task will cost less energy in 2028 than it does in 2025.
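If throughput per generation rises at similar power draw, the energy cost of a fixed task falls roughly in proportion. The sketch below works through that arithmetic using the H100-versus-A100 multiples quoted above; the per-GPU power figure and the task size are illustrative assumptions, not NVIDIA specifications.

```python
# Energy for a fixed AI task as per-generation throughput improves.
# Speedup multiples are from the text; power draw and task size are
# illustrative assumptions, not NVIDIA specifications.
POWER_KW_PER_GPU = 0.7            # assumed similar draw across generations
TASK_GPU_HOURS_ON_A100 = 10_000   # hypothetical fixed training task

for name, speedup in [("A100", 1.0), ("H100 (3x)", 3.0), ("H100 (4x)", 4.0)]:
    gpu_hours = TASK_GPU_HOURS_ON_A100 / speedup
    energy_mwh = gpu_hours * POWER_KW_PER_GPU / 1000
    print(f"{name}: {gpu_hours:,.0f} GPU-hours, ~{energy_mwh:.1f} MWh")
```

The Jevons point in the next paragraph is that these per-task savings have historically been spent on more tasks, not banked as lower total consumption.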
The second force is what economists call the Jevons paradox: efficiency gains tend to be absorbed by more consumption rather than lower total use. Cheaper compute means more AI applications get built and deployed. The net effect on total electricity consumption has, historically, been positive growth despite efficiency improvements.
What would actually change the trajectory: data centers sited near abundant renewable generation rather than near population centers, liquid cooling replacing evaporative cooling to cut water use, and AI chip architectures purpose-built for inference efficiency rather than training peak performance. The industry is moving in all three directions. At current rates, the gap between AI electricity demand and renewable supply is widening, not closing.
Frequently Asked Questions
Is AI bad for the environment?
AI has a measurable environmental impact through electricity consumption, water use, and hardware manufacturing. Whether it is net-negative depends on the application. Training large models generates hundreds of tons of CO2 equivalent per run. Inference at scale consumes electricity comparable to mid-size cities continuously. At the same time, AI is being used to reduce energy use in data center cooling, power grid optimization, and materials discovery. The most honest answer is that AI's direct environmental costs are real and growing, while the environmental benefits are real but slower to scale.
How much energy does a ChatGPT query use?
A generative AI query like a ChatGPT response uses approximately 7-10 times more electricity than a standard Google web search (MIT News, 2025). The absolute figure depends on the model, query length, and data center efficiency, but typical estimates range from 0.1-0.3 watt-hours per query. At 500 million queries per day, ChatGPT's inference load alone is estimated at 50-150 MWh daily, or roughly 18-55 GWh annually, before accounting for other OpenAI products.
How much CO2 does training an AI model produce?
Training GPT-3 produced 552 tons of CO2 equivalent, equal to driving 123 gasoline-powered cars for a year or taking 300 round-trip flights between New York and San Francisco (University of Massachusetts Amherst). That figure translates to roughly 1.2 million pounds of CO2. Current frontier models are larger and trained for longer periods. The actual training carbon cost for GPT-4 and similar 2024-2025 models has not been publicly disclosed, but is estimated at multiples of the GPT-3 figure.
How much water does AI use?
According to a 2023 study from the University of California Riverside, approximately 500ml of fresh water is consumed per 20-50 ChatGPT prompts through data center evaporative cooling. At the facility level, Microsoft's Iowa data center evaporated 13.4 million gallons training a single large model in August 2022. Google's water-cooled Iowa facility used approximately 1 billion gallons across 2024. Air-cooled facilities use far less: Google's Texas data center used only around 10,000 gallons in all of 2024. For a detailed breakdown, see our full guide to how much water AI uses.
Are Google and Microsoft carbon neutral?
Google has been carbon-neutral since 2007 by purchasing renewable energy credits to match its total annual electricity consumption. Its more ambitious target is 24/7 carbon-free energy by 2030. In 2024, Google achieved 66% hour-matching to carbon-free energy, varying from 92% in Latin America to 5% in the Middle East and Africa. Microsoft has committed to being carbon-negative by 2030. Both companies' electricity consumption is growing faster than their renewable energy procurement, driven by AI infrastructure expansion.
How much electricity will AI use by 2030?
The IEA projects global data center electricity consumption will reach 945 TWh by 2030, roughly equivalent to Japan's current annual electricity use. The AI subset of that total is projected at 200-400 TWh globally (IEA analysis, 2025), representing 35-50% of all data center electricity. In the United States specifically, Lawrence Berkeley National Laboratory projects data center consumption of 325-580 TWh by 2028, or 6.7-12% of all US electricity. AI workloads are the primary driver of that growth.
Can AI help solve climate change?
AI has demonstrated genuine environmental benefits in specific applications. Google's DeepMind applied reinforcement learning to data center cooling systems and reduced cooling energy use by approximately 30%. AI is used in climate modeling, materials discovery for cleaner technologies, and power grid optimization. The challenge is scale and timing: the AI applications that reduce environmental impact tend to be narrow and slow to deploy, while the AI applications consuming the most energy are broad consumer products running billions of queries daily.