AI and the Environment: What Every Business Owner Needs to Know

AUC1 Consulting


Separating Fact from Hype, and Understanding the Real Costs

Introduction: Why This Matters to Your Business

Artificial intelligence is everywhere. It’s in your email inbox, your customer service tools, your marketing platforms, and increasingly, in conversations about your bottom line. If you run a business in 2026, you’re either already using AI or you’re being told you should be.

But here’s what most people selling AI tools won’t tell you: every AI interaction has a physical cost. It runs on real hardware, in real buildings, drawing real electricity and real water. And the scale of that infrastructure is growing at a pace that has serious implications for our energy grid, our water supply, and our climate.

This guide is designed to give you—a business owner, not a data scientist—an honest, grounded understanding of AI’s environmental footprint. We’ll walk through the real numbers, put them in proper context, call out where the online discourse gets it wrong, and give you practical steps to use AI responsibly.

Because here’s the thing: AI’s environmental story is more nuanced than most headlines suggest. Some of the most alarming statistics you’ve seen online are exaggerated or outright wrong. Some of the most important costs are being ignored entirely. And the people best positioned to drive responsible AI adoption are business owners like you—people who make daily decisions about which tools to buy, which vendors to trust, and how to run operations efficiently.

The Real Numbers: AI’s Energy, Water, and Carbon Footprint

Energy Consumption

The energy appetite of AI and the data centers that power it has become one of the defining infrastructure challenges of the decade. Data centers are enormous facilities—some exceeding one million square feet—that house the servers, storage, and networking equipment required to train and run AI models.

U.S. data centers consumed approximately 183 terawatt-hours (TWh) of electricity in 2024, representing more than 4% of total national electricity consumption.[1][2] To put that in perspective, 183 TWh exceeds the entire annual electricity consumption of countries like the Netherlands. By 2030, that figure is projected to grow by 133%, reaching approximately 426 TWh.[2]

Globally, the International Energy Agency projects data center electricity demand will more than double by 2030, reaching around 945 TWh—slightly more than Japan’s total electricity consumption.[1] A typical AI-focused data center uses as much electricity as 100,000 households, and the largest facilities under development will consume 20 times that amount.[10]

What about individual usage? In February 2026, OpenAI CEO Sam Altman disclosed that a standard ChatGPT query consumes approximately 0.34 watt-hours of energy.[15] Google has published data showing a median Gemini text prompt uses about 0.24 watt-hours.[13] That’s roughly equivalent to running a microwave for one second, or a games console for about six seconds.[14] A family could run 1,000 AI queries per day and it would still amount to less than 1% of their household electricity use.[14]
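These per-query figures are easy to sanity-check. The sketch below scales them to a heavy usage pattern, assuming an average U.S. household draws about 29 kWh per day (an assumption of this sketch, not a figure from the cited sources); whether the share lands just under or just over 1% depends on that baseline and on which per-query figure you use.

```python
# Back-of-the-envelope check of the per-query energy claims.
# Assumption: a typical U.S. household uses ~29 kWh of electricity
# per day (roughly the EIA national average; not a cited figure).

WH_PER_QUERY_CHATGPT = 0.34   # OpenAI's disclosed figure [15]
WH_PER_QUERY_GEMINI = 0.24    # Google's published median [13]
HOUSEHOLD_KWH_PER_DAY = 29.0  # assumed average, for scale only

def daily_share(queries_per_day: int, wh_per_query: float) -> float:
    """Fraction of daily household electricity used by AI queries."""
    kwh = queries_per_day * wh_per_query / 1000.0
    return kwh / HOUSEHOLD_KWH_PER_DAY

for label, wh in [("ChatGPT", WH_PER_QUERY_CHATGPT),
                  ("Gemini", WH_PER_QUERY_GEMINI)]:
    print(f"1,000 {label} queries/day ≈ "
          f"{daily_share(1000, wh):.1%} of household electricity")
```

Run against the Gemini figure, the share comes out below 1%; against the higher ChatGPT figure, slightly above. Either way, the order of magnitude supports the same conclusion: individual usage is trivial, aggregate usage is not.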

The environmental cost of AI is not in any single query. It’s in the aggregate—billions of queries, across hundreds of millions of users, running on infrastructure that is expanding at unprecedented speed.

Water Consumption

Data centers need large quantities of water for cooling. For every kilowatt-hour of energy consumed, a data center typically requires about two liters of water.[3] Cornell University researchers project that by 2030, U.S. AI data centers could drain 731 to 1,125 million cubic meters of water annually—equivalent to the household water usage of 6 to 10 million Americans.[4]
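The two-liters-per-kWh cooling ratio can be scaled against the national consumption figure above as a rough cross-check. Treating all 183 TWh of 2024 U.S. data center electricity as if it carried that ratio is a deliberate simplification of this sketch; real ratios vary by site, climate, and cooling design.

```python
# Rough cross-check: scale the cited cooling ratio to the cited
# national data center electricity figure.

LITERS_PER_KWH = 2.0          # cited cooling ratio [3]
US_DATACENTER_TWH_2024 = 183  # cited U.S. consumption [1][2]

kwh = US_DATACENTER_TWH_2024 * 1e9  # 1 TWh = 1 billion kWh
water_liters = kwh * LITERS_PER_KWH
print(f"Implied water demand: {water_liters / 1e9:.0f} billion liters/year")
```

The result, roughly 366 billion liters per year, lands in the same ballpark as the 312 to 765 billion liter range from the peer-reviewed study cited below, which suggests the headline figures are at least internally consistent.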

A peer-reviewed study published in late 2025 estimated that AI systems’ water footprint could reach 312 to 765 billion liters in 2025 alone—approaching the scale of global annual bottled water consumption.[5] However, as we’ll discuss later in this article, the way water usage is measured and reported varies enormously, and many of the most alarming figures conflate very different types of water consumption.

Carbon Emissions

The carbon footprint of AI is tied directly to the energy sources powering data centers. According to Goldman Sachs Research, roughly 60% of increasing electricity demand from data centers will be met by burning fossil fuels, adding approximately 220 million tons of carbon emissions globally.[6][25]

Cornell researchers found that by 2030, AI growth could produce 24 to 44 million metric tons of CO2 annually—the equivalent of adding 5 to 10 million cars to U.S. roads.[4] The IEA notes that data centers represent one of the few sectors where emissions are projected to grow, while most others are expected to decline.[9]

Major tech companies have already experienced rising emissions. Amazon reported in 2024 that its emissions grew to 68.25 million metric tons—its first increase since 2021, driven primarily by data center expansion.[7] Separately, Google reported a 48% increase in greenhouse gas emissions since 2019, largely attributable to data center growth.[8]


Wait—How Much of What You’ve Read Online Is Actually Accurate?

If you’ve seen alarming headlines about AI’s environmental impact, you’re not alone. But here’s something critical: a significant portion of the data circulating online about AI’s environmental costs is misleading, exaggerated, or lacking essential context. Before you make business decisions based on what you’ve read, you need to understand the gaps.

The Training vs. Inference Confusion

This is the single biggest source of confusion in the public discourse. AI has two distinct operational phases, and they have very different energy profiles:

Training is the “learning” phase, where a model processes massive datasets to build its capabilities. Training GPT-4 reportedly consumed around 50 gigawatt-hours of energy—enough to power San Francisco for three days.[12] But training happens only when a model, or a new version of it, is built. It is a front-loaded cost, paid once per model version rather than per use.

Inference is the “using” phase—every time you or your employees send a prompt and receive a response. Individually, inference is far less energy-intensive. But because inference happens millions of times per day, every day, it now accounts for over 80% of total AI electricity consumption.[17]

Most alarming headlines are built on training numbers, because those figures make for dramatic statistics. But the environmental question for your daily business use of AI is primarily an inference question—and the per-query energy cost of inference is dramatically smaller than the training figures would suggest.[18]

To further complicate things, the energy profile of inference varies wildly depending on the model. A 2025 benchmark across 30 commercial AI models found that some long, complex prompts on reasoning models exceeded 33 watt-hours per query, while short prompts on optimized models used as little as 0.4 watt-hours.[27] That’s a range of nearly 100x. So when someone tells you “an AI query uses X amount of energy,” the first question should always be: which model, doing what?

The “Gallons Per Query” Myth

You may have seen claims that a single ChatGPT query consumes “gallons of water.” In February 2026, Sam Altman called these figures “completely untrue,” saying they have “no connection to reality.”[15]

Google’s published methodology for measuring Gemini inference reports that a median text prompt uses about 0.26 milliliters of water—roughly five drops.[13] A widely circulated early claim about AI energy consumption per query was later found to have been exaggerated by a factor of approximately 90.[14]

This doesn’t mean AI has zero water cost. The aggregate water demand of data centers is real and growing. But the per-query panic is built on bad numbers, and using those numbers to make business decisions would be a mistake.

The Scale and Measurement Problem

There are currently no standardized global metrics for measuring AI’s environmental impact.[19] Different studies measure fundamentally different things, which is why the numbers can vary by orders of magnitude:

  • Some measure only direct, on-site water and energy use at the data center (Scope 1)
  • Others include indirect energy from the power grid supplying the data center (Scope 2)
  • A few attempt the full lifecycle: hardware manufacturing, mineral mining, transport, and construction (Scope 3)
  • Most don’t distinguish between AI workloads and non-AI workloads in the same data center[5]
  • Water calculations may include on-site cooling, off-site power generation water, and embodied water from hardware manufacturing—three very different numbers often lumped together[22][23]

Two studies can examine the same data center and produce wildly different numbers depending on what they choose to measure. A policy memo from the Federation of American Scientists noted that most companies use outdated or narrow measures like Power Usage Effectiveness (PUE) and purchase renewable energy credits that can obscure their true environmental footprint.[19]

The Bottom Line on Data Quality

Be skeptical of any single number you see online about AI’s environmental cost. The real answer to “how much water or energy does AI use?” is almost always: “It depends.” It depends on the model, the provider, the data center location, the cooling technology, the energy grid mix, the time of day, and whether we’re talking about training or inference. Anyone giving you a single, definitive number without context is oversimplifying.

Errors in Popular Books and Media

Even well-researched publications get this wrong. A detailed analysis of Karen Hao’s 2025 book Empire of AI found that a key claim about data center water usage was off by a factor of roughly 4,500—the book stated a data center was using 1,000 times as much water as a city of 88,000 people, when in reality it was using about 0.22 times the city’s consumption.[16]

Similarly, a widely cited 2019 academic study claimed that training Google’s BERT model produced carbon emissions equivalent to a cross-country flight. That estimate was later found to be overstated by a factor of 88, but continued to be referenced in reputable outlets for years afterward.[14]

The pattern is consistent: dramatic initial claims get widespread coverage, corrections get very little. The lesson for business owners: always ask where the number came from, what exactly it measures, and whether it has been updated since publication.


Putting It in Context: Agriculture, Corn, and the Water Question

When discussing AI’s water usage, it is essential to place it in the context of how water is actually consumed in the broader economy. The comparison to agriculture is instructive—and largely absent from the headlines.

Agriculture Is the Elephant in the Room

Agriculture accounts for approximately 70% of all global freshwater withdrawals—about 3 trillion cubic meters per year.[21] That figure is roughly 600 times Cornell’s estimate of AI’s projected annual water requirements by 2027.[21] In the United States, irrigation alone accounts for 42% of total freshwater withdrawals, with corn and soybeans now the dominant irrigated crops—primarily grown for livestock feed and biofuel, not direct human consumption.[20]

At the AI industry’s current rate of expansion, it would need approximately half a century to reach agriculture’s current water consumption levels.[21]

The Corn Comparison

Corn is one of America’s most water-intensive crops, and the comparison to data center water use is striking.

In Iowa, corn requires between 0.1 and 0.2 inches of water per acre per day to grow. One inch of water spread across a single acre equals 27,154 gallons.[16] During GPT-4’s training, Microsoft’s Iowa data centers reportedly consumed about 11.5 million gallons of water in a single month—roughly 380,000 gallons per day.[16]

That sounds like a staggering amount of water. But a single Iowa corn farm of roughly 70 to 140 acres uses the same volume daily. Iowa has over 12.9 million acres of corn planted annually.
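The corn arithmetic checks out against the standard conversion of about 27,154 gallons per acre-inch (a textbook figure, stated here as an assumption of this sketch), combined with the 0.1 to 0.2 inches-per-day demand from the text:

```python
# Sanity check of the Iowa corn comparison: how many acres of corn
# use as much water per day as the reported data center draw?

GALLONS_PER_ACRE_INCH = 27_154  # standard conversion (assumption)
DC_GALLONS_PER_DAY = 380_000    # reported Microsoft Iowa usage [16]

for inches_per_day in (0.1, 0.2):
    gallons_per_acre = inches_per_day * GALLONS_PER_ACRE_INCH
    acres = DC_GALLONS_PER_DAY / gallons_per_acre
    print(f"At {inches_per_day} in/day, ~{acres:.0f} acres of corn "
          f"match the data center's daily draw")
```

Either end of the range, set against Iowa’s 12.9 million planted acres, makes the scale difference plain.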

In Maricopa County, Arizona—one of America’s major data center hubs—data centers are projected to use 905 million gallons of water in 2025. For context, the county’s golf courses consume 29 billion gallons per year. Data centers account for 0.12% of the county’s total water usage. Golf courses account for 3.8%—and generate one-fiftieth the tax revenue per unit of water consumed.[16]

In Spain’s Aragon region, Amazon’s new data centers are licensed to use 755,720 cubic meters of water annually—enough to irrigate 576 acres of corn, one of the region’s main crops.[22]

Why This Context Matters

None of this means AI’s water usage is irrelevant. It absolutely matters—especially in water-stressed regions where data centers compete directly with residential and agricultural needs. But the framing matters enormously. When headlines suggest AI is “draining our water supply,” the honest context is that agriculture uses hundreds of times more. AI is a new, growing demand that needs responsible management. It is not, by any defensible measure, the primary driver of water scarcity.

The Real Water Risk Is Local, Not Global

The genuine concern around AI and water is geographic concentration. Cornell’s Fengqi You explains that water requirements for the same AI workload vary enormously based on local climate, cooling technology, and the regional energy mix.[23] Two-thirds of new data centers built or in development since 2022 are located in areas already experiencing water stress.[22]

The problem isn’t that AI uses “too much” water in absolute global terms. The problem is that data centers are sometimes being built in exactly the wrong locations—without adequate planning for local water constraints, and without sufficient transparency about their actual consumption. Communities in places like northern Virginia, Phoenix, and parts of Oregon are already pushing back, and their concerns are legitimate.[10][28]


What You Can Actually Do About It

You don’t need to stop using AI. But you should be using it with intention and awareness. Here are concrete, actionable steps for any business owner.

  1. Audit your AI usage. Identify every AI tool your team uses. Is each one providing measurable value, or are people defaulting to AI because it’s available? Eliminating unnecessary usage is the simplest and most direct way to reduce your footprint—and your costs.
  2. Right-size your models. Not every task needs the most powerful AI model available. Smaller, more efficient models handle many routine tasks at a fraction of the energy cost. MIT researchers note that within a few years, tasks requiring today’s most powerful models will likely be achievable with dramatically smaller ones.[25]
  3. Choose providers who disclose sustainability data. Ask your AI and cloud vendors for their PUE, WUE, and carbon intensity metrics. Providers committed to renewable energy and water efficiency publish these numbers. Those who won’t should raise a red flag.[24]
  4. Look for ISO certifications. ISO 14001 (environmental management) and ISO 42001 (AI ethics and governance) demonstrate concrete commitment to responsible practices—not just marketing promises.[24]
  5. Include AI in your sustainability tracking. If you track your business’s environmental impact in any capacity, your AI and cloud usage should be part of that accounting. Tools now exist to estimate the carbon footprint of cloud service usage.
  6. Use AI to improve your own sustainability. AI excels at optimizing energy usage, logistics routes, inventory management, and supply chain tracking. The goal is to ensure the AI you deploy reduces more waste than it creates.
  7. Educate your team. Most employees have no idea their AI usage has a physical cost. A brief internal session can shift daily behavior—keeping prompts concise, batching work efficiently, and avoiding AI-generated content when existing materials would suffice.[3][24]
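For steps 1 and 5, even a crude internal estimate beats having no number at all. Here is a minimal sketch, assuming OpenAI’s disclosed per-query energy figure and an average grid carbon intensity of roughly 0.4 kg CO2 per kWh (the intensity figure is an assumption of this sketch, not from the cited sources):

```python
# Minimal estimator: translate a team's AI query volume into
# annual energy use and CO2. Both constants are rough; swap in
# your provider's published figures where available.

WH_PER_QUERY = 0.34   # OpenAI's disclosed per-query figure [15]
KG_CO2_PER_KWH = 0.4  # assumed average grid carbon intensity

def team_ai_footprint(employees: int, queries_per_employee_per_day: int,
                      workdays_per_year: int = 250) -> tuple[float, float]:
    """Return (kWh/year, kg CO2/year) for a team's AI query usage."""
    queries = employees * queries_per_employee_per_day * workdays_per_year
    kwh = queries * WH_PER_QUERY / 1000.0
    return kwh, kwh * KG_CO2_PER_KWH

kwh, co2 = team_ai_footprint(employees=25, queries_per_employee_per_day=40)
print(f"25-person team: ~{kwh:.0f} kWh/year, ~{co2:.0f} kg CO2/year")
```

For most small teams the result is tiny next to, say, office HVAC—which is exactly why tracking it is useful: knowing the number lets you focus effort where your footprint actually is.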

The Other Side: AI as an Environmental Tool

The picture is not all negative. AI is also being deployed to solve environmental problems at a scale that simply was not possible before.

  • UNEP’s International Methane Emissions Observatory uses AI to analyze satellite data and detect methane leaks—notifying companies and governments in near-real-time.[11]
  • AI-driven climate models are improving early warning systems and optimizing renewable energy deployment across multiple continents.[26]
  • The IEA estimates that digitalization enabling precise energy management could cut global building energy consumption by up to 10% by 2040.[11]
  • AI-powered ESG reporting tools are automating carbon footprint calculations and sustainability disclosures, reducing what used to take 30 hours of manual work to five.[24]

With the right governance and renewable energy infrastructure, AI can reduce more emissions than it generates. But that outcome requires intentional choices at every level—from the companies building the infrastructure down to individual business owners deciding which tools to deploy.


The Honest Take

AI is not destroying the planet. But it is adding a meaningful and fast-growing new demand on our energy and water systems at exactly the moment when we need to be reducing those demands. The question is not whether to use AI—it’s how to use it with the same rigor and intentionality you’d apply to any other significant business investment.

Here’s what I tell my clients: be skeptical of anyone who says AI is an environmental catastrophe, and be equally skeptical of anyone who says it has no environmental cost at all. The truth is in the middle, and it depends heavily on the specific tools, providers, and use cases involved.

The businesses that will be best positioned in the coming decade are those that adopt AI intentionally—understanding both its power and its costs—and hold their technology partners accountable for sustainability. That’s not just good ethics. It’s good business.

What Now?

If this article raised questions about how your business is using AI—or whether you’re getting honest answers from your technology vendors—that’s a good sign. Here’s how AUC1 Consulting can help:

  • AI Readiness Assessments — an honest, no-hype evaluation of whether AI fits your needs, including environmental and cost considerations
  • Implementation Guidance — the right tools, providers, and architecture for your business, with sustainability built in
  • Vendor Accountability — we help you ask the right questions about energy, water, and emissions, and interpret the answers

15+ years of experience at AWS and Microsoft.
Enterprise-grade judgment. SMB-scale decisions. No hype.

Get in Touch →

Sources and Citations

All sources accessed and verified as of March 2026.

  1. International Energy Agency (IEA), “Energy and AI,” April 2025. iea.org
  2. Pew Research Center, “What We Know About Energy Use at U.S. Data Centers Amid the AI Boom,” October 2025. pewresearch.org
  3. MIT News, “Explained: Generative AI’s Environmental Impact,” January 2025. news.mit.edu
  4. Cornell University / Nature Sustainability, Xiao & You et al., “Roadmap Shows the Environmental Impact of AI Data Center Boom,” November 2025.
  5. ScienceDirect, “The Carbon and Water Footprints of Data Centers and What This Could Mean for AI,” December 2025.
  6. Goldman Sachs Research, “Data Center Energy Demand Analysis,” August 2025. Referenced in MIT News, September 2025.
  7. Amazon, 2024 Sustainability Report. Referenced in Smithsonian Magazine, September 2025.
  8. Google, 2023 Environmental Report. Referenced in Smithsonian Magazine, September 2025.
  9. Carbon Brief, “AI: Five Charts That Put Data-Centre Energy Use and Emissions Into Context,” September 2025. carbonbrief.org
  10. NPR, “Data Centers Are Booming. But There Are Big Energy and Environmental Risks,” October 2025. npr.org
  11. UN Environment Programme (UNEP), “AI Has an Environmental Problem. Here’s What the World Can Do About That.” unep.org
  12. MIT Technology Review, “We Did the Math on AI’s Energy Footprint,” May 2025. technologyreview.com
  13. Google Cloud Blog, “Measuring the Environmental Impact of AI Inference,” August 2025. cloud.google.com
  14. Transformer News, “We’re Getting the Argument About AI’s Environmental Impact All Wrong,” September 2025. transformernews.ai
  15. CNBC, “Sam Altman Defends AI Resource Usage,” February 2026. cnbc.com
  16. Andy Masley, “Empire of AI Is Wildly Misleading on AI Water Use,” Substack, November 2025. andymasley.substack.com
  17. Arbor.eco, “AI’s Environmental Impact: Calculated and Explained,” 2025. arbor.eco
  18. Online Learning Consortium, “The Real Environmental Footprint of Generative AI: What 2025 Data Tell Us,” December 2025.
  19. Federation of American Scientists, “Measuring AI’s Energy/Environmental Footprint to Assess Impacts,” June 2025. fas.org
  20. Sentient Media, “How Agriculture and Data Centers Compete for the Great Lakes’ Most Precious Resource,” September 2025.
  21. TechPolicy.Press, “Artificial Intelligence, Water Consumption and the Trillion-Radish Conundrum,” November 2025.
  22. EESI, “Data Centers and Water Consumption.” eesi.org
  23. Undark, “How Much Water Do AI Data Centers Really Use?,” December 2025. undark.org
  24. World Economic Forum, “How to Cut the Environmental Impact of Your Company’s AI Use,” June 2025. weforum.org
  25. MIT News, “Responding to the Climate Impact of Generative AI,” September 2025.
  26. UN Global Compact, “Artificial Intelligence and the Sustainable Development Goals.” unglobalcompact.org
  27. Jegham et al., “How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference,” arXiv, May 2025.
  28. Stanford / &theWest, “Thirsty for Power and Water, AI-Crunching Data Centers Sprout Across the West,” April 2025.

© 2026 AUC1 Consulting. This article may be shared with attribution.
