
Sustainable AI: How Can AI Reduce its Environmental Footprint?

We’re living in an era where artificial intelligence (AI) is grabbing more attention and funding than ever before. Following the breakthrough developments of tools like ChatGPT, Google Gemini, and a range of other generative AI models, millions of people are utilizing these powerful technologies for a variety of purposes. Whether it’s summarizing documents, answering questions, generating explanations, or even creating content like code, music, and marketing campaigns, the applications are vast. If you’re curious about this AI boom, you can explore more about it in our AI Fundamentals Skill Track.

With all the buzz around AI’s impressive capabilities, it’s easy to overlook some of its downsides. While much has been debated about the existential risks AI might pose to humanity, another serious issue—its environmental impact—has received far less attention, despite mounting warnings from AI researchers and climate activists.

It’s crucial to align AI development with environmental objectives to help tackle the climate crisis. Yes, using AI to solve climate-related problems is important, but this won’t be enough if the AI industry doesn’t take serious steps to reduce its own growing environmental footprint.

In this article, we’re diving into the environmental costs of AI. We’ll look at how resource-heavy AI is, consider the ethical questions around prioritizing AI over other critical societal needs, and explore ways to reduce the environmental toll of AI while making it more sustainable.


Calculating AI’s Environmental Impact: Methodological Challenges

Since ChatGPT’s launch in late 2022, the AI industry has exploded. Statista reports that the AI market surpassed $184 billion in 2024 and is expected to soar past $826 billion by 2030.

But before we get carried away by these ambitious projections, we need to stop and evaluate AI’s environmental costs. The stakes couldn’t be higher as the world is running out of time to prevent a full-blown climate catastrophe.

Assessing AI’s environmental impact is no easy task. As we explained in our piece on the Environmental Impact of Digital Technologies and Data, this is a complicated endeavor.

While research is progressing, it’s still not possible to comprehensively measure AI’s environmental footprint. Here are a few of the main challenges:

Lack of transparency and data gaps
AI companies are often reluctant to share critical information about their products, particularly when it comes to environmental data. This creates gaps and blind spots that make it difficult for researchers to produce reliable estimates. As Sasha Luccioni, Climate Lead at Hugging Face, points out:

“Not a single company that offers AI tools, that I know of, provides energy usage and carbon footprint information. We don’t even know how big models like GPT are. Nothing is divulged; everything is a company secret.”
— Sasha Luccioni, Climate Lead at Hugging Face

What counts as an environmental footprint?
Most studies focus on energy consumption or carbon emissions (i.e., the carbon footprint). However, we also need to consider other non-energy impacts like water and mineral usage, which contribute to AI’s overall environmental footprint.

Looking at the whole AI lifecycle
Many studies zero in on AI’s footprint during the training phase. However, as Luccioni and colleagues pointed out in a 2023 paper, other stages in the AI lifecycle should also be considered, including material extraction, manufacturing, deployment, and eventual disposal.


The Environmental Footprint of AI

Despite the difficulties in fully estimating AI’s environmental costs, we do know that it demands significant natural resources. Here, we’ll focus on AI’s carbon and water footprints, but remember, more research is needed to understand its impacts on mineral consumption and biodiversity loss.

AI’s Carbon Footprint
To work its magic, AI consumes a ton of energy, which typically leads to carbon emissions.

There are two primary categories of energy consumption in AI:

  1. Operational energy: This is the energy required to run AI systems, including both training and inference.
  2. Hardware manufacturing: This refers to the energy needed to build the equipment, such as GPUs (graphics processing units), used in AI systems.

Most of the existing studies have focused on AI’s carbon footprint during the training phase because it’s the part of the lifecycle with the most available data—especially for open-source models.

For example, Strubell et al., in a 2019 article, estimated that training a 213-million-parameter Transformer model (using neural architecture search) can emit 626,155 pounds of CO2, nearly the same as the lifetime emissions of five cars, including fuel.

But don’t forget, the energy bill during inference (the phase where models actually perform the tasks they were trained for) can be just as high—or even higher. In a 2023 study, Luccioni and colleagues calculated that deploying a 176-billion-parameter BLOOM model over 18 days consumed an average of 40.32 kWh of energy daily (the equivalent of around 1,110 smartphone charges) and emitted roughly 19 kg of CO2 daily.
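To see how energy figures like these translate into emissions, here’s a minimal back-of-envelope sketch in Python. The daily energy number comes from the BLOOM figures above; the grid carbon intensity is an assumed illustrative value, not the one used in the study, and real intensities vary widely by region and time of day.

```python
# Back-of-envelope estimate: emissions = energy consumed x grid carbon intensity.
# The carbon intensity below is an assumed illustrative value; real grids range
# roughly from under 0.1 kg CO2e/kWh (hydro- or nuclear-heavy) to over 0.7 (coal-heavy).

daily_energy_kwh = 40.32   # average daily energy reported for the BLOOM deployment
carbon_intensity = 0.45    # kg CO2e per kWh (assumption)
deployment_days = 18

daily_emissions = daily_energy_kwh * carbon_intensity   # kg CO2e per day
total_emissions = daily_emissions * deployment_days     # kg CO2e over the deployment

print(f"~{daily_emissions:.1f} kg CO2e per day")
print(f"~{total_emissions:.0f} kg CO2e over {deployment_days} days")
```

With this assumed intensity, the daily estimate lands close to the roughly 19 kg reported above; the actual figure depends on the energy mix powering the data center’s grid.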

They also found that image generation tasks are the most energy-intensive.

Given all this, it’s no wonder that electricity consumption at data centers—where most AI computing happens—could double by 2026, according to a report by the International Energy Agency.

This surge in energy usage is not good news for tech companies aiming for net-zero emissions. In Google’s latest sustainability report, the company admitted that its emissions grew by 13% in 2023 compared to the previous year, a 48% increase since 2019, mostly due to energy demands in its data centers. Microsoft faced similar challenges, with a 29.1% increase in emissions since 2020, mainly from the energy needed to run AI.

AI’s Water Footprint
More researchers are beginning to highlight the alarming water consumption required by AI. Data centers, where AI computing takes place, need billions of gallons of water to cool the servers that power these calculations. A 2021 study revealed that the average data center uses as much water as three mid-sized hospitals.

And this water consumption is likely to rise as new data centers are built to support AI’s growth. Researchers from UC Riverside estimate that global AI demand could account for between 4.2 and 6.6 billion cubic meters of water withdrawal by 2027—more than half of the UK’s total annual water withdrawal.

Faced with increasing social and regulatory pressure, tech companies are finally beginning to disclose their water usage. For instance, Google’s latest sustainability report showed a 17% year-over-year rise in its water consumption, totaling 6.1 billion gallons—enough to irrigate 41 golf courses annually.

While data centers account for most of AI’s water footprint, more research is needed to understand the water costs of other phases of AI’s lifecycle, especially hardware manufacturing, such as the fabrication of the semiconductors used in GPUs.


3 Techniques to Reduce AI’s Environmental Footprint

With AI’s environmental costs on the rise, companies must adopt strategies to shrink their footprint. Below, we explore some of the top solutions to promote more sustainable AI.

  1. Optimizing Model Efficiency
    One of the most straightforward ways to cut AI’s carbon footprint is to improve the efficiency of AI models. This means achieving the same accuracy with smaller, less complex models.
    Hardware advancements can also help. If engineers and microchip makers can maintain the pace of Moore’s Law—which predicts that the number of transistors on microchips will double roughly every two years—then we’ll see faster, smaller, and more efficient GPUs in the coming years. However, some experts predict the end of Moore’s Law around 2025. If that happens, emerging technologies like quantum computing and new materials may become critical to AI’s future efficiency. Innovation in data center design and operation will also be essential for lowering energy use.
    On the software side, researchers are developing techniques to make AI models more efficient during training and inference (see the code sketch after this list), including:
    • Model pruning: This involves cutting out unnecessary parameters from a neural network, making the model smaller and more efficient during inference.
    • Quantization: This reduces the precision of the numbers in a model’s calculations, making them faster and less resource-heavy.
    • Knowledge distillation: This technique transfers the knowledge from a large model to a smaller one, enabling efficient performance with fewer resources.
  2. Using Renewable Energy
    Another approach is increasing investment in renewable energy sources. Solar panels, wind turbines, and hydropower generate electricity without releasing carbon emissions, which is exactly what the AI industry needs.
    Major AI players like Google, Microsoft, and Meta are investing heavily in clean energy infrastructure. However, as Sasha Luccioni points out:
    “Renewable energy is definitely growing. The problem is it’s not growing fast enough to keep up with AI’s growth.”
    — Sasha Luccioni, Climate Lead at Hugging Face
    Because of this, tech companies often rely on carbon credits or renewable energy certificates (RECs) to meet their green energy targets.
  3. Leveraging Cloud-Based Solutions
    Cloud providers can help AI companies train and deploy models more efficiently. Their hyperscale data centers are specifically designed to handle energy-intensive tasks more quickly and at a lower cost. Plus, they’re optimized to reduce water usage for cooling.
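As a concrete illustration of the first technique, here is a minimal PyTorch sketch of pruning and dynamic quantization applied to a toy model. The model size, sparsity level, and data are arbitrary assumptions, and knowledge distillation is omitted because it requires a full teacher-student training loop.

```python
# Minimal sketch of pruning and dynamic quantization with PyTorch.
# The toy model stands in for a much larger network; all numbers are illustrative.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Pruning: zero out the 30% smallest-magnitude weights in each Linear layer.
# Note: unstructured pruning reduces nonzero parameters, but real speedups
# usually require structured pruning or sparse-aware kernels.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# Quantization: run Linear layers with 8-bit integer weights at inference time.
quantized_model = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    output = quantized_model(torch.randn(1, 512))
print(output.shape)  # torch.Size([1, 10])
```

Fewer and lower-precision weights reduce memory traffic and compute per inference, which is where much of a deployed model’s energy goes.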

Sustainable AI Practices in Industry

If used wisely, AI can be a game-changer in addressing the climate crisis. Let’s look at some real-world examples where AI is helping to promote environmental sustainability:

  • Extreme Weather Modeling: AI models like GraphCast, developed by Google, are improving weather forecasting with greater accuracy, helping predict cyclones, atmospheric rivers, and extreme temperatures.
  • Greenhouse Gas Monitoring: Projects like Climate TRACE use AI and remote sensing to monitor global GHG emissions more accurately than ever before.
  • Optimizing Power Grids: AI is playing a crucial role in managing the complexity of modern power grids, helping balance the supply and demand of renewable energy sources like solar and wind.
  • Material Discovery: AI models are speeding up the discovery of new materials that could revolutionize green technologies, from better batteries for electric cars to more efficient computing.

Ethical Considerations in AI Development

As AI continues to advance, every industry must evaluate its impact on the planet, and the AI industry is no exception. While AI offers promise in driving environmental goals, the industry must also take responsibility for its own environmental footprint.

AI Trade-offs
We need to consider what’s sacrificed when we prioritize AI development over other societal needs. For example, while tech companies are investing in clean energy to power their data centers, that energy could be used to heat homes or provide electricity to under-resourced communities.

Another issue is that green technologies often rely on minerals and rare earth elements that are difficult to extract and require a significant amount of energy, which negatively impacts the environment.

While the carbon footprint of AI is a global issue, the water footprint is often a local one. The water used by data centers can deplete resources needed by nearby communities and ecosystems.

Potential Rebound Effects
Another area needing further research is the potential rebound effects of AI efficiency improvements. History shows that efficiency gains in one area often lead to increased overall use, counteracting those gains.

For instance, Goldman Sachs reports that a ChatGPT query requires about 10 times the electricity of a Google search. As ChatGPT becomes more efficient, usage is likely to increase, potentially leading to even greater overall energy consumption.
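To make the rebound effect concrete, here’s a toy calculation. The per-query figure echoes the Goldman Sachs comparison above; the current query volume, efficiency gain, and usage growth are assumptions chosen purely for illustration.

```python
# Toy rebound-effect (Jevons paradox) calculation: per-query energy falls,
# yet total energy rises because usage grows faster than efficiency improves.
# All scenario numbers below are assumptions for illustration.

wh_per_query = 2.9        # approx. electricity per chatbot query today (Wh)
queries_per_day = 200e6   # assumed current daily query volume

efficiency_gain = 0.5     # assume per-query energy is cut in half
usage_growth = 4.0        # assume query volume quadruples as the tool gets cheaper

energy_today = wh_per_query * queries_per_day / 1e6  # convert Wh to MWh per day
energy_future = wh_per_query * (1 - efficiency_gain) * queries_per_day * usage_growth / 1e6

print(f"Today:  {energy_today:,.0f} MWh/day")
print(f"Future: {energy_future:,.0f} MWh/day, despite a 50% per-query improvement")
```

In other words, efficiency alone doesn’t guarantee a smaller footprint unless total demand is also kept in check.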

AI and the Fossil Fuel Industry
It’s also important to note the close ties between AI and the fossil fuel industry. Big tech companies like Microsoft are providing AI tools to help oil and gas companies optimize their operations, which could delay the transition to renewable energy.


The Future of Sustainable AI

What does the future hold for AI and sustainability?

Emerging Trends in Green AI
Research into AI’s environmental costs is growing rapidly. As social concerns rise, AI companies will face increasing pressure to reduce their environmental footprints and adopt greener practices.

Policy and Regulation
Regulators will also play a crucial role in driving sustainable AI practices. The EU AI Act, for example, includes provisions for reducing the energy consumption of high-risk AI systems and for promoting energy efficiency in general-purpose AI models such as GPT.

While the Act’s final environmental provisions were less ambitious than initially proposed, this is just the beginning. Further regulatory initiatives will likely emerge to promote transparency and encourage AI companies to assess their environmental impacts.


Conclusion

The AI revolution is unfolding alongside the most urgent challenge humanity has ever faced: the climate crisis. While AI can be a powerful tool to advance sustainability goals, the industry still has a long way to go in addressing its environmental impacts.

At DataCamp, we’re committed to raising awareness about the environmental impact of technology and working with organizations dedicated to sustainability. Through our DataCamp Donates program, we’ve partnered with climate-focused NGOs and clean energy companies to help their teams develop the skills needed to make a positive impact.

We invite nonprofit organizations with environmental missions to apply to DataCamp Donates or schedule a demo with our Sales team to explore how DataCamp can support their work.