By Daisy Simmons
Is AI saving the world or breaking it? As the era-defining technology leapfrogs from what-if to what-next, it can be hard for us humans to know what to make of it all. You might be hopeful and excited, or existentially concerned, or both.
AI can track Antarctic icebergs 10,000 times faster than humans and optimize renewable energy grids in real time – capabilities that could help us fight climate change. But it also consumes incredible amounts of energy, and ever more of it, creating a whole new level of climate pollution that threatens to undermine those benefits.
All that dizzying transformation isn’t just the stuff of news headlines. It’s playing out in daily conversations for many of us.
“Have I told you what Chatty and I came up with yesterday?” My dad and I talk every Sunday. “It’s an environmental detective show – you’ll star in it, of course.”
He’s mostly retired and spends a lot of time at home while my stepmom is at work, so he’s happy to have found an exciting new hobby: storytelling sessions with his AI pals (the above-referenced ChatGPT, as well as Claude and “Gemmy,” aka Gemini). This is a good thing, I think. He should be having some fun in his sunset years.
But then the conversation turned to a much less fun AI story: I told my dad my sixth grader said he’d felt pressured to dumb down an essay at school because a classmate got heat for using AI. What made the teacher suspect the kid? She flagged it for college-level vocabulary. “Well, that just ain’t right,” said my dad. Agreed.
Grim laughter was my brother-in-law’s reaction to the subject of my son’s essay. Once a rock star graphic designer (literally for rock bands), he said AI has killed creative career prospects for all our kids. But who knows, he said, maybe it will solve climate change – or maybe it will only make it worse.
That tension is what brought me here. The more I read and heard, the more I saw that he and I are not alone in struggling with this topic. To help make sense of the complexity, I asked Ann Bostrom, the chair of the National Academies of Sciences' Roundtable on Artificial Intelligence and Climate Change, what she thought of my brother-in-law's comment. In a nutshell: Is AI good or bad for the climate? The answer is decidedly not straightforward.
“Right now, there is serious uncertainty about what can or might happen with AI,” she said. “But that’s partially because it’s a new tool we’re developing – AI is a tool. So what it does, or what it can do, is a function of what we do with it.”
AI isn’t a single technology but a vast toolbox containing many specialized tools, each with different purposes and environmental footprints. While dinner table conversations often focus on ChatGPT and similar systems, these represent just one part of a rapidly evolving landscape that’s difficult to neatly categorize.
The broader toolset includes everything from systems that analyze medical scans, predict weather patterns, and monitor coral reef health to those that generate text, optimize supply chains, and power autonomous vehicles. Large language models like ChatGPT and Claude represent just one branch of this diverse ecosystem, and they’re frequently updated with new versions, making it challenging to track their evolving capabilities and impacts. This constant iteration reflects a broader pattern across AI development – systems are continuously refined, retrained, and reimagined.
But here’s the thing about any AI tool: Despite their differences, they all share an insatiable appetite for energy – lots of it. And as they scale up, their hunger only grows. Early machine learning systems ran comfortably on desktop computers with minimal power consumption. Some of today’s most prominent AI systems use 100,000 GPUs (the specialized chips that crunch AI calculations), drawing as much electricity as a small city and filling server farms that span several football fields. For perspective, Meta’s flagship AI system relied on about 16,000 of these chips, a setup that would fit in a single, much smaller facility. And clusters of more than 300,000 GPUs are already on the drawing board.
Today, there are upward of 8,000 data centers worldwide – a number projected to double by 2026. The scale is getting so massive that in an extreme scenario, U.S. data centers could consume 12% of U.S. electricity, with one study estimating the extra energy demand will equal whole countries the size of Sweden or Argentina.
This surge in power consumption carries profound implications for our climate goals.
Every step of AI computing comes with a carbon cost. According to new analysis from MIT Technology Review, data centers now consume 4.4% of all U.S. electricity, with projections showing AI alone could use as much electricity as 22% of U.S. households by 2028. These centers typically use electricity that’s 48% more carbon-intensive than the U.S. average.
The training process – where AI systems learn by digesting huge datasets – requires astronomical amounts of energy. Training GPT-4, for its part, burned through enough energy to power San Francisco for three days, at a cost of over $100 million.
And training accounts for just 10-20% of AI’s energy use. The real energy hog is inference – what happens every time someone asks a question, generates an image, or gets an AI recommendation. The MIT Technology Review study found that a simple text query uses about as much energy as riding six feet on an e-bike, while generating a five-second video burns the equivalent of a 38-mile ride.
The catch: These numbers represent snapshots based on highly specific parameters – particular models, data centers, energy grids, and time frames – making them tough to apply across the fast-shifting tech landscape.
“There’s a lot of discussion about how hard it is to get data,” Bostrom says. “And there’s not a common method of disclosing data.”
In other words, outside observers are working with fragments of a puzzle that companies often keep scattered. And most firms don’t track emissions at the granular level it would take to assess the relative impacts of different uses of AI, like search or ChatGPT queries.
What’s more, the available data also typically lacks key context, like where and when emissions were produced. For example, training a model on renewable energy in Sweden leaves a very different footprint than doing the same work on a coal-powered grid in West Virginia, but many reports treat these scenarios as equal. Competitive corporate secrecy only compounds the problem.
Spotty, unreliable, and missing data make it incredibly hard to accurately assess AI’s true climate impact and energy needs, let alone figure out what to do about it.
Existing regulatory frameworks have yet to catch up. Current accounting standards are patchy and still evolving. While recent rules like the European Union’s AI Act and the SEC’s climate disclosure requirements show progress, neither mandates detailed AI emissions reporting. Companies still get to decide what they disclose, often leading to selective reporting or none at all.
Political headwinds aren’t helping. The Trump administration has aimed to block AI regulation, and California’s SB 1047, a bill that would have required large AI developers to provide basic documentation, was vetoed following heavy industry opposition.
Here’s where the data gaps become yet more problematic: AI’s environmental impact extends far beyond carbon emissions, creating a web of consequences that’s even harder to track.
Take water, for instance. By some estimates, just 15 ChatGPT queries guzzle half a liter of clean water needed to cool those massive data centers, and two-thirds of new facilities are being built in water-scarce regions. Then there’s embodied carbon – the emissions required to manufacture all that hardware, mining rare earth minerals for GPUs, and shipping components around the globe. And because AI development moves at breakneck speed, perfectly good equipment becomes obsolete fast, creating a growing mountain of electronic waste.
Meanwhile, AI infrastructure pumps pollutants into vulnerable communities – by 2030, U.S. data centers could cause 1,300 premature deaths and 600,000 asthma cases. Elon Musk’s Colossus AI supercomputer in Memphis operates 35 unlicensed methane gas turbines in neighborhoods already struggling with poor air quality. These harms fall disproportionately on communities already bearing climate change’s heaviest burdens, deepening climate injustice through AI’s expansion.
What’s more, AI systems risk undermining climate action by generating convincing but scientifically inaccurate climate information, potentially spreading misinformation that delays urgent policy responses. For example, Grok, the chatbot created by xAI, has reportedly been promoting climate denial talking points.
Given AI’s complex environmental footprint, it’s easy to focus only on the costs. But there’s another side to this story: AI can also be a tool for tackling climate change itself.
Some of these climate-focused tools are already making headway on multiple fronts, from optimizing power grids to predicting disasters before they strike.
“There’s a lot of work on using AI to improve predictions of extreme weather,” Bostrom says. “Given those are severe impacts of climate change that people are already worried about, improving predictions can definitely help protect people.”
The most compelling cases share a common trait: They deliver outsize climate benefits relative to their computational demands – applications like grid optimization, extreme weather forecasting, and ecosystem monitoring could clear that bar.
Yes, the climate potential is legit. But that doesn’t make it simple.
“I think we’re at an inflection point,” Bostrom says. “Right now, it’s really hard to distinguish the hype from the realistic expectations.”
As it turns out, the same rapid development that creates new opportunities also introduces problems that can undermine the benefits.
For starters, unreliable outputs create dangerous inefficiencies. AI hallucinations – when systems generate false but confident-sounding information – can threaten any number of climate applications. Wrong information about weather predictions could lead to poor disaster preparedness. Faulty energy optimization recommendations could increase rather than reduce emissions.
Security vulnerabilities also threaten critical infrastructure. As AI becomes more integrated into climate-critical systems like power grids and weather monitoring, it’s proving a high-value target for cyberthreats like data poisoning – attacks that corrupt training data to make systems less reliable. And the more we rely on AI for climate solutions, the more these security risks multiply.
Perhaps most fundamentally, a scale-at-all-costs mindset compounds every problem. The AI development culture treats scale as an end in itself. As Bostrom points out, many AI tools are now incorporated into everyday platforms by default – you have to opt out rather than opt in. Case in point: Google now serves AI-generated search results as the standard.
“It’s similar to organ donation,” she said. “You get way more participation if people have to opt out than if they opt in.”
This design choice means climate costs accumulate from widespread AI usage that users never actively chose.
These opt-out settings aren’t accidental design decisions.
“Systems-level decisions are being made to benefit commercial interests, and often at the expense of potential public good,” Bostrom says.
Companies compete on model size and capability rather than efficiency, with each new generation growing exponentially larger and demanding exponentially more energy – often for only marginal improvements in usefulness.
This obsession with scale creates a vicious cycle. Bigger models require more data and processing. More powerful models enable more applications, driving more usage. More usage creates demand for even more powerful models – and that translates into physical expansion: more chips, more data centers, more electricity use.
A runaway growth pattern is creating its own problems. As the climate costs become more visible, Bostrom sees a concerning trend toward polarization: “There’s stigmatization of AI going on – people are like, ‘AI is evil, it uses a lot of energy and is killing the planet.'”
But shutting down dialogue would prevent the nuanced thinking needed to harness AI’s genuine climate potential while addressing its real costs.
The good news? We’re not passive observers in this story. AI isn’t some unstoppable force of nature. It’s a tool that people are actively building right now – which means we still have the power to steer how it develops.
“We need to be thinking about solutions, or ways of keeping a system that would be fair for people and benefit society more broadly – a public good system,” she said. “That’s not the way it is right now.”
So where does that leave us? Is AI good or bad for the climate? The honest answer: It depends.
“It’s situationally specific – the context matters,” Bostrom said. She draws a parallel to electric vehicles: “If you have an EV on the West Coast where there’s a lot more hydropower, that’s very different from an EV running on a fossil fuel-heavy grid elsewhere.”
The same principle applies to AI – what matters are the specific applications, energy sources, and whether the outcomes justify the environmental costs.
For me, I’d say my dad’s joy in his storytelling sessions with “Chatty” isn’t the problem; it represents the kind of meaningful use that could warrant AI’s energy costs. If a model helps accelerate lifesaving research or reduces the need for resource-intensive travel, the climate trade-off may be worth it. But spinning up massive models for mundane tasks is starting to look like a high-emissions, low-rewards shortcut for things we once handled with far less energy.
Ultimately, the problem isn’t individual users making thoughtful choices – it’s an industry that treats scale as success, training ever-larger models for increasingly trivial purposes while communities face water shortages and polluted air. My brother-in-law’s grim laughter captures where we are: caught between promise and peril, unsure whether AI will help solve climate change or make it worse.
But that uncertainty also means the path forward isn’t fixed. What happens next depends on the choices we make today – and whether we can steer this technology toward its best climate potential rather than its worst.