AI’s Dirty Secret Isn’t the Data Center
It’s Who Controls the Value.
The conversation around AI infrastructure has drifted into a familiar pattern. Outrage travels faster than evidence. Numbers circulate without context. And data centers are being framed as environmental villains while far larger systems escape scrutiny.
There is truth inside the critique. There is also distortion. If we want a serious conversation about AI’s impact on people, labor, and the planet, we have to separate infrastructure facts from power dynamics. These are not the same problem.
Over the past few months, I have seen a wave of claims repeated with increasing certainty. AI queries are being described as equivalent to boiling kettles of water. Single prompts are framed as massive carbon events. Data centers are portrayed as uniquely reckless consumers of energy and water, often without acknowledging how energy accounting actually works or how shared infrastructure is allocated across millions of users and services.
I have also seen posts suggesting that everyday use of AI tools is morally irresponsible, while streaming video, cloud storage, real-time collaboration platforms, and always-on social media are treated as neutral or invisible. The implication is that AI is an outlier, when in reality it sits inside the same digital ecosystem most of us already rely on daily.
What gets lost in this framing is scale, proportionality, and systems thinking.
Per-query energy use is rarely discussed. Regional energy mixes are ignored. Improvements in efficiency are dismissed. And most importantly, infrastructure is conflated with intent. A server does not make decisions. Governance does. Labor policy does. Business incentives do.
If the concern is environmental impact, then we owe it to ourselves to be precise. If the concern is power concentration, labor displacement, or unchecked corporate behavior, then we should name those issues directly instead of laundering them through infrastructure fear.
This conversation matters. But it only works if we are willing to slow it down, look at the actual mechanics, and stop treating complexity as a moral shortcut.
I Care About Sustainability. I Care More About Facts.
I care deeply about sustainability and environmental impact. I have spent years working alongside event teams, technology partners, and infrastructure providers who are trying to reduce waste, energy use, and unnecessary consumption.
What I care less about is fear-based storytelling that replaces facts with outrage.
Right now, data centers and AI infrastructure are being positioned as environmental villains. The narrative sounds convincing, until you slow it down and actually look at the numbers. When you do, a very different picture emerges.
This is not a defense of unchecked growth. It is a call for accuracy.
Because if we get the problem wrong, we solve the wrong thing.
The myth of “wasteful AI interactions”
A common belief is that every AI interaction is energy-intensive, irresponsible, and inherently harmful. That belief does not hold up under scrutiny.
The environmental impact of AI is not driven by a single query. It is driven by scale, and those two things are not interchangeable.
To make this tangible, let’s talk in everyday terms.
A typical AI or large language model query has been estimated to produce around 0.03 grams of CO₂. That number sounds abstract, so here’s what it looks like in real life.
• Running a modern LED lightbulb for about one minute uses more energy
• Watching 9 seconds of television uses more energy
• Sending a single email with a large attachment can use more energy
• Driving a gas car about 10 feet emits more CO₂
Even a standard web search, at roughly 0.2 grams of CO₂, is still tiny. For perspective, driving to your local library to look something up instead of searching online can easily generate hundreds of grams of CO₂.
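The comparison above is easy to sanity-check with back-of-envelope arithmetic. The sketch below uses the per-query figures already cited, plus two assumptions of mine: roughly 400 grams of CO₂ per mile for a typical gas car and a 2-mile round trip to the library. Treat it as an order-of-magnitude illustration, not a measurement.

```python
# Back-of-envelope CO2 comparison. Per-query figures are the estimates
# cited above; the car figure and trip length are ballpark assumptions.

AI_QUERY_G = 0.03      # grams CO2 per AI query (estimate cited above)
WEB_SEARCH_G = 0.2     # grams CO2 per web search (estimate cited above)
CAR_G_PER_MILE = 400   # grams CO2 per mile, typical gas car (assumption)

library_round_trip_miles = 2               # assumed short local trip
trip_g = CAR_G_PER_MILE * library_round_trip_miles

print(f"Driving to the library: ~{trip_g:.0f} g CO2")
print(f"Equivalent AI queries:   ~{trip_g / AI_QUERY_G:,.0f}")
print(f"Equivalent web searches: ~{trip_g / WEB_SEARCH_G:,.0f}")
```

Even with generous error bars on every input, the trip lands in the hundreds of grams while the query stays in the hundredths.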
That does not mean AI has no footprint. It means the footprint is not where people think it is.
Household energy perspective
Another way to understand this is at the household level.
An average U.S. household uses roughly 30 kilowatt-hours of electricity per day. A single AI query consumes a microscopic fraction of that. You would need tens of thousands of AI interactions to equal the energy used by running a dishwasher cycle, doing a load of laundry, or cooking dinner on an electric stove.
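Here is where that "tens of thousands" figure comes from. The sketch backs out a per-query energy number from the 0.03-gram CO₂ estimate, assuming a grid intensity of about 400 grams of CO₂ per kWh and a 1.5 kWh dishwasher cycle; both assumed values are rough global and household averages, not measurements.

```python
# Derive rough per-query energy from the 0.03 g CO2 estimate,
# assuming ~400 g CO2 per kWh grid intensity (rough global average).

GRID_G_PER_KWH = 400                      # assumed grid carbon intensity
query_kwh = 0.03 / GRID_G_PER_KWH         # ~0.000075 kWh per query

HOUSEHOLD_KWH_PER_DAY = 30                # household figure cited above
DISHWASHER_KWH = 1.5                      # typical cycle (assumption)

print(f"Queries per dishwasher cycle: ~{DISHWASHER_KWH / query_kwh:,.0f}")
print(f"Queries per household-day:    ~{HOUSEHOLD_KWH_PER_DAY / query_kwh:,.0f}")
```

Under these assumptions, one dishwasher cycle buys on the order of twenty thousand queries, and a single day of household electricity buys hundreds of thousands.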
The climate impact of AI does not come from individual usage. It comes from billions of interactions, running constantly, at global scale.
That distinction matters because it changes who is responsible and how solutions should be designed.
Why centralized data centers matter
If we are going to use large amounts of compute, the way we do it matters.
Hyperscale data centers are far more efficient than scattered, on-premise systems. A key metric here is Power Usage Effectiveness, which measures how much energy actually goes to computing versus cooling and overhead.
• Typical enterprise on-prem systems: 1.5 to 1.8 PUE
• Leading hyperscale data centers: 1.2 or lower
That difference represents massive energy savings at scale.
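Because PUE is total facility energy divided by IT energy, the gap translates directly into overhead. A quick sketch using the benchmark ranges above and an arbitrary workload size (the 1,000 kWh figure is purely illustrative):

```python
# PUE sketch: for the same IT workload, total facility energy
# scales linearly with PUE. Ratios below are the benchmark ranges
# cited above; the workload size is arbitrary.

def facility_energy(it_kwh: float, pue: float) -> float:
    """Total energy a facility draws to deliver it_kwh of compute."""
    return it_kwh * pue

it_load = 1_000.0                            # kWh of compute, illustrative
onprem = facility_energy(it_load, 1.7)       # mid-range enterprise PUE
hyperscale = facility_energy(it_load, 1.2)   # leading hyperscale PUE

savings = (onprem - hyperscale) / onprem
print(f"On-prem: {onprem:.0f} kWh, hyperscale: {hyperscale:.0f} kWh")
print(f"Total facility energy saved: {savings:.0%}")
```

Moving the same workload from a 1.7 PUE facility to a 1.2 PUE facility cuts total energy draw by nearly a third, before accounting for any difference in grid mix.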
Concentrating compute in modern, optimized facilities powered increasingly by renewable energy is materially better for the environment than thousands of smaller, inefficient setups pulling from dirtier grids.
If we want to reduce emissions while meeting real demand, efficiency beats fragmentation every time.
Water use deserves context, not panic
Water is where emotion tends to spike, and again, context matters.
U.S. data centers are estimated to use around 17 billion gallons of water per year for cooling. That is not nothing. It also needs to be compared honestly.
For reference, U.S. golf courses use roughly 500 billion gallons of water annually.
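Putting the two national figures side by side makes the proportion explicit. Both numbers are the estimates quoted above, so the ratio inherits their uncertainty:

```python
# Proportional comparison of the two national water estimates above.

DATA_CENTER_GAL = 17e9   # U.S. data center cooling, gallons/year (estimate)
GOLF_GAL = 500e9         # U.S. golf course irrigation, gallons/year (estimate)

ratio = DATA_CENTER_GAL / GOLF_GAL
print(f"Data center cooling is ~{ratio:.1%} of golf course irrigation")
```

On these figures, all U.S. data center cooling amounts to a few percent of what golf courses irrigate with each year.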
That does not mean local water impact should be ignored. In some communities, a single data center can account for more than 10 percent of local water usage, and that requires serious planning and transparency.
What it does mean is that water outrage is often applied selectively, without proportional reasoning.
The real sustainability issue is not the server rack
The environmental math around data centers is improving. Efficiency is improving. Renewable integration is increasing.
The harder, more uncomfortable conversation sits elsewhere.
Who captures the economic value of AI. Who absorbs the psychological and physical labor costs. Who bears local environmental strain. Who makes the decisions, and who has no seat at the table.
Those are governance and accountability problems, not infrastructure problems.
Blocking data centers does not fix labor exploitation. It does not fix value concentration. It does not fix supply chain opacity.
Good planning, regulation, transparency, and accountability do.
Facts first, then action
Caring about the environment means caring about evidence. It means resisting narratives that feel good but solve nothing. And it means being willing to say two things at once.
Yes, AI and data centers consume resources. Yes, those impacts must be managed responsibly. And no, fear-based claims are not helping us do that.
If we want a sustainable future, we need fewer viral soundbites and more grounded conversations about scale, efficiency, power, and accountability.
Facts are not the enemy of sustainability.
They are the starting point.
If you care about sustainability, you should care about accuracy.
AI is not impact-free. Neither is email. Streaming. Cloud storage. Or the systems we already rely on every day without questioning their footprint. The difference is that AI is visible right now, so it absorbs fear faster than facts.
The real work is learning how to evaluate technology based on evidence, tradeoffs, and outcomes, not headlines or guilt-based narratives.
That is what I work on with teams, leaders, and organizations. Translating technical reality into decisions people can stand behind. Cutting through confusion. Replacing anxiety with clarity.
If you are navigating AI adoption, policy decisions, or sustainability concerns and want a grounded, fact-based conversation, book time with me*. We will separate signal from noise and build a plan that actually holds up under scrutiny.
*The first 20 minutes of our call are complimentary. That time is for context, alignment, and clarity. It allows us to understand your situation, your goals, and whether I am the right fit to support you.

If we move into active consulting beyond that point, where I am analyzing workflows, advising on platforms, mapping strategy, or helping you make decisions, that is paid time. What you are accessing is not just minutes on a calendar. It is two decades of industry experience, technical depth, strategic pattern recognition, and the research work I do so you do not have to. That level of support is what businesses hire me for.

Clarity has value. Direction has value. Experience has value. If you are ready to get out of decision paralysis and into forward motion, book the intro call. Because momentum beats overthinking. And execution beats endless comparison.
Sources and further reading
Data center energy use and efficiency
International Energy Agency (IEA)
What it is: The global authority on energy data and projections.
Why it matters: Provides the most cited estimates on data centers’ share of global electricity use and emissions.
Key reference: IEA, Data centres and data transmission networks
https://www.iea.org/reports/data-centres-and-data-transmission-networks
This is where the “around 1 percent of global emissions” framing comes from. Data centers account for roughly 1–1.5% of global electricity use, with emissions depending heavily on grid mix and efficiency.
Uptime Institute
What it is: The industry standard for data center performance metrics.
Why it matters: Source of Power Usage Effectiveness (PUE) benchmarks.
Key reference: Uptime Institute Global Data Center Survey
https://uptimeinstitute.com/research-and-reports
This supports the claim that hyperscale facilities operate around 1.2 PUE versus higher ratios for on-prem systems.
Server energy efficiency insights https://uptimeinstitute.com/resources/research-and-reports/server-energy-efficiency-five-key-insights
IT efficiency and sustainability fundamentals https://uptimeinstitute.com/resources/research-and-reports/it-efficiency-the-critical-core-of-digital-sustainability
How resiliency choices affect cloud carbon emissions https://uptimeinstitute.com/resources/research-and-reports/how-resiliency-drives-cloud-carbon-emissions
Per-query energy use and AI efficiency
Google Environmental Reports and AI disclosures
What it is: Google’s annual environmental impact reporting.
Why it matters: Primary source for per-query energy and CO₂ estimates.
Key reference:
https://www.gstatic.com/gumdrop/sustainability/google-2023-environmental-report.pdf
This is where the web search and Gemini per-query comparisons originate.
Electricity pricing and grid impact
Lawrence Berkeley National Laboratory (LBNL)
What it is: A U.S. Department of Energy national lab.
Why it matters: Best neutral research on grid economics and load growth.
Key reference: Electricity Load Growth and Retail Rates
https://emp.lbl.gov/publications
Their work supports the claim that large, steady loads can reduce average retail prices by spreading fixed grid costs.
Academic and Neutral Measurement References
Berkeley Lab data center energy research https://datacenters.lbl.gov/
Carbon intensity of electricity by region (for grounding comparisons) https://www.iea.org/data-and-statistics/data-tools/emissions-factors
Golf course comparison
Primary source: US Geological Survey and EPA irrigation studies
https://www.usgs.gov/mission-areas/water-resources
This is where the roughly 500 billion gallon annual irrigation figure comes from.
Automation and workforce impact
McKinsey Global Institute
What it is: One of the most cited economic research bodies on automation.
Why it matters: Source of the “30 percent of work hours” and “400 to 800 million workers” estimates.
Key reference: Jobs lost, jobs gained
https://www.mckinsey.com/featured-insights/future-of-work
Important nuance: McKinsey talks about hours and tasks, not job elimination headlines.
