How Much Energy Does AI Use? The Real Numbers for 2026

Key Takeaways
1. A single generative AI query uses approximately 10 times more electricity than a Google Search (MIT News, 2025). At 500 million ChatGPT queries per day, that inference load alone is equivalent to powering 17,000 to 50,000 US homes per year.
2. US data centers consumed 176 TWh of electricity in 2023, or 4.4% of all US electricity (Lawrence Berkeley National Laboratory). That figure is projected to reach 325 to 580 TWh by 2028, or up to 12% of the US grid.
3. Training GPT-3 required 1,287 MWh of electricity (UMass Amherst, 2019). The IEA projects AI workloads specifically will consume 200 to 400 TWh globally by 2030, equal to 35 to 50% of all data center electricity worldwide.
A single generative AI query uses approximately 10 times more electricity than a standard Google Search. Scale that up to hundreds of millions of queries per day, and the cumulative cost becomes measurable in gigawatt-hours per year, not kilowatt-hours.
The numbers are larger than most people expect. Lawrence Berkeley National Laboratory estimated that US data centers consumed 176 terawatt-hours of electricity in 2023, representing 4.4% of all US electricity generation. That was before the AI infrastructure buildout accelerated through 2024 and 2025. The same research projects consumption reaching 325 to 580 TWh by 2028, which would put data centers at 6.7 to 12% of US electricity, roughly the annual output of every power plant in California.
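Those percentages imply specific grid totals, and the arithmetic is easy to reproduce. A minimal sanity check in Python, using only the figures cited above (the implied totals are derived from those figures, not independently sourced):

```python
# Back-of-envelope consistency check on the LBNL figures cited above.
dc_2023, share_2023 = 176, 0.044      # TWh and fraction of US electricity, 2023
dc_2028_lo, share_lo = 325, 0.067     # low end of the 2028 projection
dc_2028_hi, share_hi = 580, 0.12      # high end of the 2028 projection

print(f"Implied 2023 US total: {dc_2023 / share_2023:,.0f} TWh")    # ~4,000 TWh
print(f"Implied 2028 US total: {dc_2028_lo / share_lo:,.0f} to "
      f"{dc_2028_hi / share_hi:,.0f} TWh")                          # ~4,850 TWh
# The projection therefore assumes the overall grid grows, not just the
# data center share of today's grid.
```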
This article covers what AI energy consumption actually consists of, how it breaks down per query and per training run, which companies account for the largest share, and where the trajectory is heading by 2030.
In This Article
1. What counts as AI energy use?
2. How much energy does a single AI query use?
3. How much energy does training an AI model use?
4. How much energy do US and global data centers use?
5. Which AI companies use the most energy?
6. How does AI energy use compare to other activities?
7. What are AI companies doing about energy consumption?
8. Where is AI energy use heading by 2030?
What counts as AI energy use?
AI energy consumption comes from two distinct activities with very different scales and timelines: training and inference.
Training is the process of building a model. It requires running thousands of GPUs continuously for weeks or months, adjusting billions of parameters against massive datasets. Training is energy-intensive but infrequent. GPT-3 required 1,287 megawatt-hours to train once (University of Massachusetts Amherst, 2019). To put that in household terms: 1,287 MWh is roughly what 120 average US homes consume in a year. That cost is a one-time fixed figure per model version. OpenAI trains a new major version roughly every 12 to 18 months.
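The household conversion is simple back-of-envelope arithmetic. A minimal sketch, using the US EIA average of 10,791 kWh per home per year cited later in this article:

```python
# Convert a training run's electricity into "US household-years".
US_HOME_KWH_PER_YEAR = 10_791   # EIA average annual household use, 2022

def household_years(mwh: float) -> float:
    """Number of average US homes the given energy could power for a year."""
    return mwh * 1_000 / US_HOME_KWH_PER_YEAR

print(f"GPT-3 training (1,287 MWh): ~{household_years(1_287):.0f} homes for a year")
# -> ~119 homes, consistent with the 'roughly 120' figure above
```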
Inference is the process of running a trained model to answer queries. It uses far less energy per event than training, but it happens billions of times per day across all AI products combined. Over the lifetime of a deployed model, cumulative inference energy typically exceeds training energy by a large margin.
"Both phases are very energy-intensive, and we don't really know what the energy ratio there is. Historically, with Google, the balance was 60 percent inference, 40 percent training." (Alex de Vries, data scientist, Washington Post, 2024)
A third factor that rarely appears in corporate energy reports is hardware manufacturing. Fabricating a high-end AI GPU generates roughly 150 to 200 kg of CO2 equivalent before the chip runs a single computation. GPU generations turn over every two to three years. At the scale of hundreds of thousands of units deployed globally, this manufacturing footprint adds up.
| Activity | Energy scale | Frequency |
|---|---|---|
| Model training (GPT-3 scale) | 1,287 MWh per run | Once per model version |
| Model training (frontier 2025 scale) | Est. 10,000 to 100,000+ MWh | Once per model version |
| Inference at ChatGPT scale | ~500,000 kWh per day | Continuous |
| GPU manufacturing (H100 class) | 150-200 kg CO2e per unit | Every 2-3 years per device |
How much energy does a single AI query use?
A generative AI query uses approximately 0.001 to 0.003 kilowatt-hours of electricity. A standard Google Search uses roughly 0.0003 kWh. That puts a typical ChatGPT or Claude response at 3 to 10 times the energy cost of a search, depending on the model and query length (MIT News, January 2025).
The range matters because AI queries vary enormously in complexity. A simple factual question on a small, efficient model uses far less energy than a long reasoning task on a frontier model like ChatGPT 4.1, Claude 4.7, or ChatGPT 5. Epoch AI found that inference optimizations on modern ChatGPT models brought typical query energy down toward 0.3 Wh, roughly a tenfold improvement over earlier GPT-4 estimates. The MIT figure of roughly 10 times a Google Search still holds for longer, more demanding tasks, and frontier models like ChatGPT 5 handle significantly more complex queries by default, which pushes energy per session higher even as energy per token falls.
| Activity | Energy per event (kWh) | Source |
|---|---|---|
| Google Search | 0.0003 | Google, various |
| Simple AI query (optimized model) | 0.0003-0.001 | Epoch AI, 2024 |
| Standard ChatGPT query | 0.001-0.003 | MIT News, 2025 |
| 100-word GPT-4 email generation | 0.14 | Washington Post study |
| 1 hour HD video streaming | 0.036-0.07 | IEA estimates |
| Standard email (no AI) | 0.0004 | Various |
The video streaming comparison is worth sitting with. An hour of HD Netflix uses roughly 0.036 to 0.07 kWh. A complex GPT-4 task generating a short email at 0.14 kWh uses as much electricity as two to four hours of streaming. Per interaction, generative AI is among the most electricity-intensive computing activities available to consumers.
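The multiples above reduce to a few divisions; this illustrative check uses only the table's own figures:

```python
# Per-event energy in kWh, taken from the table above.
google_search = 0.0003
ai_query_lo, ai_query_hi = 0.001, 0.003
gpt4_email = 0.14
streaming_hd_lo, streaming_hd_hi = 0.036, 0.07   # per hour of HD video

print(f"AI query vs search: {ai_query_lo / google_search:.0f}x to "
      f"{ai_query_hi / google_search:.0f}x")
# -> 3x to 10x
print(f"GPT-4 email = {gpt4_email / streaming_hd_hi:.1f} to "
      f"{gpt4_email / streaming_hd_lo:.1f} hours of HD streaming")
# -> 2.0 to 3.9 hours
```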
How much energy does training an AI model use?
Training GPT-3 consumed 1,287 megawatt-hours of electricity and produced 552 tons of CO2 equivalent (University of Massachusetts Amherst, 2019). GPT-3 is now several model generations old. Current frontier models are larger, trained for longer periods, and often trained multiple times during development as researchers test different configurations before settling on a final architecture.
OpenAI has not disclosed the energy cost of training GPT-4. Researcher estimates, based on model size and reported compute budgets, range from 50,000 to 100,000 MWh for the full training process including development runs and fine-tuning. That figure, if accurate, puts GPT-4 at roughly 40 to 80 times the energy cost of GPT-3. Anthropic, Google, and Meta similarly do not publish training energy figures.
To make that concrete: GPT-4 trained once at 50,000 MWh would consume the annual electricity of approximately 4,600 average US homes. Training one frontier model is energetically comparable to taking an entire suburban neighborhood off the grid for a year.
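Applying the same household arithmetic to the GPT-4 estimates (unconfirmed researcher figures, as noted above):

```python
# Household-year equivalents for the unconfirmed GPT-4 training estimates.
US_HOME_KWH_PER_YEAR = 10_791          # EIA average, 2022

for est_mwh in (50_000, 100_000):      # researcher estimates, not confirmed by OpenAI
    homes = est_mwh * 1_000 / US_HOME_KWH_PER_YEAR
    print(f"{est_mwh:,} MWh -> ~{homes:,.0f} household-years")
# -> ~4,634 and ~9,267: a suburban neighborhood to a small town for a year
```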
Hardware efficiency has improved. The NVIDIA H100 delivers roughly 3 to 4 times the AI training throughput of the A100 at similar power consumption, and the B200, released in 2025, continues that curve. The problem is those gains get absorbed immediately: larger models, more training runs, more configurations tested before launch. Efficiency has not translated into lower total consumption because the compute requirements keep expanding to meet it.
| Model | Training electricity | CO2 equivalent | Source |
|---|---|---|---|
| GPT-3 (2020) | 1,287 MWh | 552 tons CO2e | UMass Amherst, 2019 |
| GPT-4 (2023) | Not disclosed; est. 50,000+ MWh | Not disclosed | OpenAI policy; researcher estimates |
| Llama 3 70B (2024) | Not disclosed | Not disclosed | Meta policy |
| Google Gemini Ultra | Not disclosed | Not disclosed | Google policy |
How much energy do US and global data centers use?
US data centers consumed 176 terawatt-hours of electricity in 2023, equal to 4.4% of all US electricity generation. That figure comes from Lawrence Berkeley National Laboratory, which has tracked US data center energy use for two decades. The same research projects US data center consumption reaching 325 to 580 TWh by 2028, representing 6.7 to 12% of US electricity. At the high end of that range, data centers would consume more electricity than every residential customer in California combined.
Globally, data centers used approximately 460 TWh in 2022, around 2% of world electricity. The IEA's 2025 Electricity Report projects global data center electricity will reach 945 TWh by 2030, roughly equal to Japan's total current annual electricity consumption. The AI subset of that total is projected at 200 to 400 TWh specifically for AI workloads by 2030, representing 35 to 50% of all data center electricity.
The Number Most Guides Don't Show
ChatGPT handles an estimated 500 million queries per day (OpenAI, 2023). At 0.001 kWh per query, the conservative estimate, that is 500,000 kWh per day, or 182.5 GWh per year, from ChatGPT inference alone. At 0.003 kWh per query, the figure triples to 547.5 GWh annually. The US Energy Information Administration puts average household electricity at 10,791 kWh per year (2022 data). That means ChatGPT's inference load is equivalent to the annual electricity use of 17,000 to 50,000 US homes. That is one AI product from one company. Microsoft, Google, Meta, and Amazon each run AI inference operations at comparable or larger scale across dozens of products simultaneously.
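The callout's figures follow directly from its stated inputs. A minimal reproduction:

```python
# Reproduce the callout's arithmetic from its stated inputs.
QUERIES_PER_DAY = 500_000_000      # OpenAI estimate
US_HOME_KWH_PER_YEAR = 10_791      # EIA average, 2022

for kwh_per_query in (0.001, 0.003):            # low and high per-query estimates
    kwh_per_year = QUERIES_PER_DAY * kwh_per_query * 365
    homes = kwh_per_year / US_HOME_KWH_PER_YEAR
    print(f"{kwh_per_query} kWh/query -> {kwh_per_year / 1e6:,.1f} GWh/year "
          f"(~{homes:,.0f} US homes)")
# -> 182.5 GWh (~16,913 homes) and 547.5 GWh (~50,737 homes)
```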
| Year | US Data Center Electricity | % of US Grid | Source |
|---|---|---|---|
| 2020 | ~73 TWh | ~2% | LBNL |
| 2023 | 176 TWh | 4.4% | Lawrence Berkeley NL |
| 2028 | 325-580 TWh | 6.7-12% | Lawrence Berkeley NL |
| 2030 (global) | 945 TWh total | ~4% of global | IEA, 2025 |
Which AI companies use the most energy?
Google publishes the most detailed energy data of any AI company through its annual Environmental Report. Its data centers consumed 14.4 TWh in 2020 and 30.8 TWh in 2024, a 114% increase in four years driven by AI workload expansion (Google Environmental Report, 2024). Google's fleet-wide average Power Usage Effectiveness is approximately 1.10, meaning that for every 100 units of electricity powering computation, roughly 10 more go to cooling and power distribution.
Microsoft targets a PUE of 1.125 across its data center fleet by 2025. Amazon Web Services reported a fleet-wide PUE of approximately 1.14 in 2023. Meta targets 1.10 for its newer AI-optimized facilities. Lower PUE means more efficient use of every kilowatt entering the building.
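PUE is defined as total facility energy divided by IT equipment energy, so the overhead share of total consumption is 1 - 1/PUE. A small sketch using the figures above:

```python
# PUE = total facility energy / IT equipment energy.
def overhead_share(pue: float) -> float:
    """Fraction of total facility electricity spent on overhead
    (cooling, power distribution) rather than computation."""
    return 1 - 1 / pue

for name, pue in [("Google", 1.10), ("Microsoft target", 1.125),
                  ("AWS", 1.14), ("Meta target", 1.10)]:
    print(f"{name}: PUE {pue} -> {overhead_share(pue):.1%} of total is overhead")
# Google at 1.10 -> 9.1%: ten units of overhead per hundred units of compute
```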
Epoch AI projects that NVIDIA-powered AI servers will consume 85.4 to 134.0 terawatt-hours of electricity annually by 2027. That is the NVIDIA GPU installed base across all customers globally, not just one company. The lower bound of 85 TWh already exceeds the combined annual electricity consumption of Portugal (50 TWh) and Ireland (30 TWh); the upper bound of 134 TWh would exceed Portugal, Ireland, and New Zealand (40 TWh) together.
"Computing power required to train cutting-edge AI models has doubled every 3.4 months since 2012." (MIT News, January 2025)
That doubling rate applies to compute, not directly to energy, because hardware efficiency has improved substantially over the same period. The direction of electricity demand is harder to argue with, though. More compute means more products, more queries, more users paying for access. Power draw follows.
The fact that Google discloses annual TWh figures while Microsoft, Amazon, and Meta do not is not accidental. Google, with its historically strong efficiency record, benefits from transparency. The companies whose AI electricity footprint is growing fastest have the least incentive to publish it.
| Company | Data Center Electricity | Year | PUE | Source |
|---|---|---|---|---|
| Google | 30.8 TWh | 2024 | 1.10 | Google Environmental Report |
| Microsoft | Not disclosed | 2024 | 1.125 target | Microsoft Sustainability |
| Amazon (AWS) | Not disclosed | 2023 | 1.14 | AWS Sustainability Report |
| Meta | Not disclosed | 2024 | 1.10 target | Meta Sustainability |
| NVIDIA AI server fleet (all customers) | 85-134 TWh | 2027 projected | N/A | Epoch AI |
How does AI energy use compare to other activities?
The IEA's projection of 945 TWh for global data centers by 2030 equals Japan's current annual electricity use. Germany uses approximately 480 TWh per year. By 2030, global data center infrastructure would consume roughly twice Germany's national output.
At the individual level, a single ChatGPT query uses more electricity than sending 2 to 7 standard emails. A complex GPT-4 task generating a short document uses more energy than two hours of HD video streaming. These comparisons are instructive but not alarming in isolation. They become significant at the scale of hundreds of millions of daily users.
| Activity | Approximate annual electricity equivalent | Source |
|---|---|---|
| ChatGPT inference (500M queries/day, low estimate) | 182 GWh/year | Derived from OpenAI/MIT data |
| ChatGPT inference (500M queries/day, high estimate) | 548 GWh/year | Derived from OpenAI/MIT data |
| Denmark (entire country) | ~34 TWh/year | IEA |
| Bitcoin mining (global) | ~120-150 TWh/year | Cambridge Centre for Alternative Finance, 2023 |
| US data centers (2023) | 176 TWh/year | Lawrence Berkeley NL |
| Japan (entire country) | ~940 TWh/year | IEA |
| Global data centers (2030 projection) | 945 TWh/year | IEA, 2025 |
Global Bitcoin mining consumed approximately 120 to 150 TWh in 2023 (Cambridge Centre for Alternative Finance). US data centers at 176 TWh already surpass that figure. By 2028, US data center electricity will likely be roughly two to five times the current global Bitcoin mining total, and unlike Bitcoin, AI adoption is still in its early phases.
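The comparison reduces to two divisions (figures from the Cambridge estimate and the LBNL projection above):

```python
# 2028 US data center projection vs. current global Bitcoin mining, in TWh.
btc_lo, btc_hi = 120, 150     # Cambridge CCAF, 2023
dc_lo, dc_hi = 325, 580       # LBNL 2028 projection

print(f"Ratio range: {dc_lo / btc_hi:.1f}x to {dc_hi / btc_lo:.1f}x")
# -> roughly 2.2x to 4.8x current global Bitcoin mining
```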
For the water dimension of this same infrastructure, see our guide to how much water AI uses. For the broader environmental picture, including carbon emissions and what tech companies are doing, see is AI bad for the environment.
What are AI companies doing about energy consumption?
Google, Microsoft, Amazon, and Meta have all made carbon commitments. None are currently on track when measured against the pace of their AI electricity consumption growth.
The most concrete action came in 2024 with a wave of nuclear energy agreements. Microsoft signed a 20-year power purchase agreement with Constellation Energy in September 2024 to restart Unit 1 of Three Mile Island, bringing 835 megawatts of carbon-free generation back online specifically to supply its data centers. Google signed an agreement with Kairos Power in October 2024 to purchase approximately 500 megawatts from small modular reactors by 2035. Amazon acquired a nuclear-powered data center campus from Talen Energy and signed separate SMR agreements with X-energy, both in 2024.
Nuclear agreements address the 24/7 carbon-free power problem that solar and wind cannot fully solve. A nuclear plant runs continuously at full output regardless of weather, which matches the constant load profile of a data center. Solar generates nothing at night. Wind is intermittent. The practical limitation is time: small modular reactors are years from commercial deployment, and restarting a closed reactor takes 18 to 24 months at minimum.
On efficiency, the NVIDIA H100 delivers roughly 3 to 4 times the AI training throughput of the A100 at similar power draw. Google's TPU v5e delivers 2 to 3 times the performance per watt of v4 for inference. These are genuine improvements. They are also being absorbed by more users and more capable models, which means total electricity use continues rising even as the cost per computation falls.
"The end game was 24/7 carbon-free energy around the clock everywhere we operate at all times." (Michael Terrell, Google Head of Advanced Energy, 2025)
Google achieved 66% of its electricity hour-matched to carbon-free sources in 2024. Its own target of 100% by 2030 would require closing the remaining 34 percentage points in six years while simultaneously growing its AI infrastructure at a compound annual rate above 20%.
Where is AI energy use heading by 2030?
The IEA's 2025 Electricity Report projects AI-specific electricity demand at 200 to 400 TWh globally by 2030, representing 35 to 50% of all data center electricity. The broader data center total reaches 945 TWh. Both figures assume continued hardware efficiency improvements from each GPU generation.
Two forces compete with that growth trajectory. Hardware efficiency is improving: the B200 delivers roughly 4 times the inference throughput of the H100 at similar power draw. Against that, the Jevons paradox applies. Cheaper, more efficient compute encourages more AI applications to be built and deployed, and more users to access them. Historical computing efficiency gains have consistently produced higher total consumption, not lower.
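A toy model makes the Jevons dynamic concrete. The efficiency and growth multipliers below are illustrative assumptions, not forecasts:

```python
# Toy Jevons-paradox model (illustrative assumptions, not a forecast).
energy_per_query = 0.003        # kWh, today's high-end estimate
queries_per_day = 500_000_000   # ChatGPT-scale starting point

efficiency_gain = 10            # assume queries get 10x cheaper...
demand_growth = 20              # ...while usage grows 20x

before = energy_per_query * queries_per_day
after = (energy_per_query / efficiency_gain) * (queries_per_day * demand_growth)
print(f"Daily inference load: {before / 1e6:.1f} GWh -> {after / 1e6:.1f} GWh")
# -> 1.5 GWh -> 3.0 GWh: total consumption doubles despite a 10x efficiency gain
```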
Epoch AI's projection of 85 to 134 TWh for the NVIDIA AI server fleet by 2027 illustrates the scale of what is already in motion. That figure represents installed hardware across all customers globally in two years. The entire United Kingdom uses approximately 280 TWh per year. NVIDIA's AI server fleet by 2027 would consume the equivalent of 30 percent to nearly half of Britain's national output.
What would actually change the trajectory: data centers sited near abundant renewable or nuclear generation rather than near population centers, liquid cooling replacing evaporative cooling, and AI chip architectures optimized for inference rather than training peak performance. Progress is visible on all three. The gap between AI electricity demand and available clean supply is still widening.
For a fuller picture of how the AI training and inference workloads driving this consumption actually differ in their infrastructure requirements, see our AI training vs inference explainer.
Frequently Asked Questions
How much energy does AI use per query?
A typical generative AI query uses approximately 0.001 to 0.003 kilowatt-hours, or 1 to 3 watt-hours. That is 3 to 10 times the energy cost of a standard Google Search, which uses roughly 0.0003 kWh (MIT News, January 2025). Optimized current models like ChatGPT 4.1 handle simple queries at the lower end of that range. Longer reasoning tasks, document generation, or complex multi-step queries sit at the higher end or beyond.
How much energy does AI use per year?
US data centers consumed 176 TWh of electricity in 2023, representing 4.4% of all US electricity (Lawrence Berkeley National Laboratory). That figure is projected to reach 325 to 580 TWh by 2028, or 6.7 to 12% of US electricity. Globally, the IEA projects data centers will consume 945 TWh by 2030, with AI workloads specifically accounting for 200 to 400 TWh of that total, or 35 to 50% of all data center electricity.
How much energy did it take to train ChatGPT?
OpenAI has not disclosed the energy required to train GPT-3.5 (the model behind the original ChatGPT) or GPT-4. The preceding model, GPT-3, required 1,287 megawatt-hours of electricity and generated 552 tons of CO2 equivalent (University of Massachusetts Amherst, 2019). Current frontier models are significantly larger. Researcher estimates for GPT-4 training range from 50,000 to 100,000 MWh, though OpenAI has not confirmed any figure.
How does AI energy use compare to a Google Search?
A standard Google Search uses approximately 0.0003 kilowatt-hours. A generative AI query uses 0.001 to 0.003 kWh, making it 3 to 10 times more energy-intensive (MIT News, 2025). At ChatGPT's estimated scale of 500 million queries per day, the inference load is equivalent to the annual electricity consumption of 17,000 to 50,000 average US homes, from a single product. Google processes approximately 8.5 billion searches per day at a much lower energy cost per event.
How much of US electricity does AI use?
US data centers as a whole consumed 176 TWh in 2023, representing 4.4% of all US electricity (Lawrence Berkeley National Laboratory). AI workloads are the primary driver of data center growth and account for a growing share of that total. By 2028, Lawrence Berkeley NL projects US data center consumption will reach 325 to 580 TWh, or 6.7 to 12% of all US electricity, driven almost entirely by AI infrastructure expansion.
How much energy will AI use by 2030?
The IEA projects global data center electricity consumption will reach 945 TWh by 2030, roughly equal to Japan's current annual electricity use (IEA 2025 Electricity Report). AI workloads specifically are projected to account for 200 to 400 TWh of that total, or 35 to 50% of all data center electricity. In the United States, Lawrence Berkeley National Laboratory projects data center consumption reaching 325 to 580 TWh by 2028, with AI as the primary growth driver.
Is AI becoming more energy efficient?
AI hardware efficiency improves meaningfully with each GPU generation. The NVIDIA H100 delivers roughly 3 to 4 times the AI throughput of the A100 at similar power consumption. The B200 continues that improvement. However, those efficiency gains are being absorbed by larger models and more users rather than translating into lower total energy use. The Jevons paradox applies: cheaper, more efficient compute enables more AI applications, which drives higher total electricity consumption even as the cost per query falls.