AI: Is It Ruining the Environment? - Science Vs Recap
Podcast: Science Vs
Published: 2025-11-13
Duration: 38 min
Guests: James O'Donnell, Casey Crownhart, Shaolei Ren
Summary
The episode investigates the environmental impact of AI, specifically focusing on the energy and water consumption of data centers. It examines whether AI's environmental footprint is as significant as other common activities.
What Happened
The episode begins with Rose Rimmler discussing growing concerns about AI's massive energy and water consumption, particularly through data centers. These centers draw significant electricity from a grid that is still mostly powered by fossil fuels, contributing to climate change. The conversation highlights how some local communities are opposing the construction of new data centers because of their environmental impact.
Rose and editor Blythe Terrell delve into the mechanics of AI energy consumption, explaining that AI requires more energy than regular computing because it runs on GPUs instead of CPUs. A GPU is likened to a cluster of paintball guns: it performs many small tasks simultaneously, drawing more power than a CPU, which handles tasks one at a time.
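The paintball-gun analogy can be made concrete with a toy timing model. This is a minimal sketch: the core count and per-task time below are illustrative assumptions, not figures from the episode.

```python
import math

def sequential_time(n_tasks: int, secs_per_task: float) -> float:
    """CPU-style: one task at a time, so total time grows linearly."""
    return n_tasks * secs_per_task

def parallel_time(n_tasks: int, secs_per_task: float, n_cores: int) -> float:
    """GPU-style: many tasks per wave, limited by how many cores run at once."""
    waves = math.ceil(n_tasks / n_cores)
    return waves * secs_per_task

# Illustrative numbers: 10,000 small tasks, 1 ms each.
tasks, per_task = 10_000, 0.001
cpu = sequential_time(tasks, per_task)               # one task at a time
gpu = parallel_time(tasks, per_task, n_cores=1000)   # 1,000 "paintball guns"

print(f"sequential: {cpu:.1f} s, parallel: {gpu:.3f} s")
```

The parallel version finishes in a fraction of the time, but keeping a thousand units busy simultaneously is exactly why the chip draws more power while it runs.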
James O'Donnell and Casey Crownhart, journalists from MIT Technology Review, share their research on AI energy consumption. They note that energy use varies significantly between AI models, with larger models consuming more energy. For instance, a small AI model might use energy equivalent to a tenth of a second in a microwave, while a larger model might use up to eight seconds.
The discussion moves to the cumulative impact of AI on energy consumption, with data centers' electricity use tripling from 2014 to 2023. Predictions suggest that by 2028, AI data centers could consume as much electricity as a quarter of U.S. households. This increase is largely due to the integration of AI into various sectors, as highlighted by a survey showing 78% of organizations now use AI.
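To put "a quarter of U.S. households" in perspective, a back-of-the-envelope estimate helps. This is a sketch: the ~131 million households and ~10,500 kWh/year average are approximate U.S. figures I am assuming, not numbers from the episode.

```python
US_HOUSEHOLDS = 131_000_000       # approximate U.S. household count (assumed)
KWH_PER_HOUSEHOLD_YEAR = 10_500   # approximate U.S. average usage (assumed)

quarter_households = US_HOUSEHOLDS / 4
annual_twh = quarter_households * KWH_PER_HOUSEHOLD_YEAR / 1e9  # kWh -> TWh

print(f"~{annual_twh:.0f} TWh per year")
```

Under these assumptions, the 2028 projection works out to several hundred terawatt-hours a year, a scale comparable to the annual output of hundreds of large power plants.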
The episode then addresses AI's water usage, with Professor Shaolei Ren explaining how data centers use water for cooling. While only a small portion of this is potable water, the broader impact on local water supplies can be significant, especially in regions facing water scarcity. Ren's research, which went viral, found that a series of interactions with an AI model like ChatGPT can consume roughly a bottle's worth of water.
Ren emphasizes the regional variability in the impact of data centers on water supplies, noting that some areas may be more affected than others. Despite this, the episode points out that overall, data centers currently consume about 0.3% of the nation's water supply.
The episode concludes with Rose and Blythe reflecting on how AI's environmental footprint compares with other activities like flying or eating meat. They suggest that while AI's impact is real, the broader issue lies in the grid's reliance on fossil fuels. The conversation ends with a call for more thoughtful use of AI and a push toward renewable energy sources.
Key Insights
- AI data centers have tripled their electricity usage from 2014 to 2023, with projections indicating they could consume as much electricity as a quarter of U.S. households by 2028.
- GPUs, which AI models use for processing, consume more energy than CPUs because they perform many tasks simultaneously, akin to a cluster of paintball guns.
- AI models vary significantly in energy consumption; a small model might use energy equivalent to a tenth of a second in a microwave, while larger models can use up to eight seconds.
- Data centers currently account for about 0.3% of the nation's water supply, with regional impacts varying significantly, especially in areas facing water scarcity.