Summary: The training process for a single AI model, such as an LLM, can consume thousands of megawatt hours of electricity and emit hundreds of tons of carbon. AI model training can also lead to the evaporation of an astonishing amount of freshwater into the atmosphere for data center heat rejection, potentially exacerbating stress on our already limited freshwater resources. These environmental impacts are expected to escalate considerably, and there remains a widening disparity in how different regions and communities are affected. The ability to flexibly deploy and manage AI computing across a network of geographically distributed data centers offers substantial opportunities to tackle AI’s environmental inequality by prioritizing disadvantaged regions and equitably distributing the overall negative environmental impact.
The adoption of artificial intelligence has been rapidly accelerating across all parts of society, bringing the potential to address shared global challenges such as climate change and drought. Yet underlying the excitement surrounding AI’s transformative potential are increasingly large and energy-intensive deep neural networks, and the growing resource demands of these complex models are raising concerns about AI’s environmental impact.
Importantly, beyond their global climate impact, the environmental effects of AI have significant implications at the local and regional levels. While recent initiatives present promising steps for sustainable AI, they often prioritize easily measurable environmental metrics such as the total amount of carbon emissions and water consumption. They do not give enough attention to environmental equity — the imperative that AI’s environmental costs be equitably distributed across different regions and communities.
Even putting aside the environmental toll of chip manufacturing and supply chains, the training process for a single AI model, such as a large language model, can consume thousands of megawatt hours of electricity and emit hundreds of tons of carbon. This is roughly equivalent to the annual carbon emissions of hundreds of households in America. Furthermore, AI model training can lead to the evaporation of an astonishing amount of fresh water into the atmosphere for data center heat rejection, potentially exacerbating stress on our already limited freshwater resources.
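To see roughly how those figures connect, here is a back-of-the-envelope calculation; the specific numbers below are illustrative assumptions chosen for easy arithmetic, not data from this article.

```latex
% Illustrative arithmetic with assumed round numbers (not figures from the article)
\[
1{,}300\ \text{MWh (assumed training energy)} \times 0.4\ \tfrac{\text{t CO}_2}{\text{MWh}}\ \text{(assumed grid intensity)} \approx 520\ \text{t CO}_2
\]
\[
\frac{520\ \text{t CO}_2}{\sim 4\ \text{t CO}_2\ \text{per household per year (assumed, electricity only)}} \approx 130\ \text{household-years}
\]
```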
All these environmental impacts are expected to escalate considerably, with global AI energy demand projected to increase at least tenfold from its current level and to exceed the annual electricity consumption of a small country like Belgium by 2026. In the United States, rapidly growing AI demand is poised to drive data center energy consumption to about 6% of the nation’s total electricity usage in 2026, adding further pressure on grid infrastructure and highlighting the urgent need for sustainable solutions to support continued AI advancement.
The generation of electricity, particularly through fossil fuel combustion, results in local air pollution, thermal pollution in water bodies, and the production of solid waste, including hazardous materials. Elevated carbon emissions in a region come with localized social costs, potentially leading to higher levels of ozone, particulate matter, and premature mortality. Furthermore, the strain that AI’s substantial water consumption places on local freshwater resources, both directly for onsite server cooling and indirectly for offsite electricity generation, can worsen prolonged droughts in water-stressed regions like Arizona and Chile.
In an era of heightened environmental consciousness, multifaceted initiatives aimed at advancing AI’s environmental sustainability and ensuring its net positive contribution to mitigating climate change have been gaining traction.
Advancements in data center power and cooling infrastructures have made substantial strides in reducing the once-hefty energy overheads of AI computing, as evidenced by the decline in power usage effectiveness (PUE) from 2.0 to 1.1 or even lower in cutting-edge data center facilities.
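For reference, PUE is defined as the ratio of a facility’s total energy use to the energy delivered to its IT equipment, so the drop from 2.0 to 1.1 corresponds to shrinking the non-computing overhead from roughly 100% of the IT load to about 10%:

```latex
\[
\mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}},
\qquad
\text{overhead fraction} = \mathrm{PUE} - 1
\]
% PUE = 2.0  =>  overhead = 1.0 (cooling, power delivery, etc. equal to the IT load)
% PUE = 1.1  =>  overhead = 0.1 (about 10% of the IT load)
```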
Other key innovations include the design of efficient AI model architectures, optimization algorithms to accelerate AI training and inference, techniques like weight pruning and quantization to reduce model sizes, and the creation of energy-efficient GPUs and accelerators.
At the system level, holistic management of both computing and non-computing resources is essential for building sustainable AI systems. For instance, geographical load balancing, a well-established technique, can dynamically align energy demand with real-time grid operating conditions and carbon intensities across a network of distributed data centers. Its effectiveness in mitigating environmental impact has been demonstrated in real-world systems, such as Google’s carbon-intelligent computing platform.
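As a minimal sketch of the idea (this is not Google’s carbon-intelligent platform; the site names, carbon intensities, and capacities below are hypothetical placeholders), a scheduler for flexible batch workloads might simply fill the currently cleanest sites first:

```python
# Minimal sketch of carbon-aware geographical load balancing.
# Site names, carbon intensities (gCO2/kWh), and spare capacities (MW)
# are hypothetical placeholders, not real operating data.

def balance_load(total_demand_mw, sites):
    """Greedily place flexible compute demand at the lowest-carbon sites first."""
    allocation = {}
    remaining = total_demand_mw
    for name, carbon_intensity, capacity_mw in sorted(sites, key=lambda s: s[1]):
        if remaining <= 0:
            break
        assigned = min(capacity_mw, remaining)
        allocation[name] = assigned
        remaining -= assigned
    if remaining > 0:
        raise ValueError(f"{remaining} MW of demand could not be placed")
    return allocation

sites = [("finland", 30, 40), ("iowa", 350, 60), ("singapore", 480, 50)]
print(balance_load(100, sites))   # -> {'finland': 40, 'iowa': 60}
```

In practice, production schedulers also weigh latency, data locality, and grid constraints rather than carbon intensity alone, but the core mechanism is this kind of spatial and temporal shifting of flexible demand.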
Furthermore, data center operators have pursued diverse strategies to achieve “net zero” emissions, including the development of large-scale solar farms and the procurement of renewable energy credits. Likewise, in acknowledgment of fresh water as a vital social resource, industry leaders have set an ambitious goal of becoming “water positive” by 2030, replenishing more water than they consume.
Unfortunately, there remains a widening disparity in how different regions and communities are affected by AI’s environmental impacts. In many cases, adverse environmental impacts of AI disproportionately burden communities and regions that are particularly vulnerable to the resulting environmental harms. For instance, in 2022, Google operated its data center in Finland on 97% carbon-free energy; that number drops to 4–18% for its data centers in Asia. This highlights a significant disparity in the local consumption of fossil fuels and the creation of air pollution. Similarly, the water-consumption rate for data center heat rejection can be disproportionately higher in drought-stricken regions such as Arizona due to their hotter climates.
Moreover, existing approaches to deploying and managing AI computing often exacerbate environmental inequity, which is compounded by persistent socioeconomic disparities between regions. For instance, geographical load balancing that prioritizes total energy costs or carbon footprint may inadvertently increase the water footprint of data centers in water-stressed regions, further straining local freshwater resources. It could also disproportionately add to grid congestion and raise locational marginal prices for electricity, potentially leading to increased utility rates and unfairly burdening local residents with higher energy costs.
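One way to make that trade-off explicit, offered as a sketch rather than a description of any deployed system (the symbols and weights below are hypothetical), is to fold each region’s water stress into the placement objective instead of minimizing carbon alone:

```latex
\[
\min_{x_1,\dots,x_n}\ \sum_{i=1}^{n} x_i \bigl( c_i + \lambda\, w_i s_i \bigr)
\quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = D, \qquad 0 \le x_i \le C_i
\]
% x_i : load placed at data center i        c_i : grid carbon intensity at i
% w_i : water-consumption rate at i         s_i : local water-stress factor at i
% D   : total flexible demand               C_i : capacity of site i
% lambda weights water equity against carbon.
```

Setting λ = 0 recovers the carbon-only policy criticized above; larger values steer load away from drought-stricken regions even when their grids are relatively clean.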
The troubling rise of AI’s environmental inequality has impeded progress toward environmentally responsible AI. This issue has garnered increasing public attention, prompting urgent calls for […]
Full article: hbr.org