Opinion

Visitors gather near a statue at a concept shop for Chinese electric car maker Jidu in Beijing, on Feb. 14, 2023. Mark Schiefelbein

Tushar Sharma is an assistant professor at Dalhousie University.

The proliferation of artificial intelligence, especially generative models, is evident in all facets of our lives. While this advancement drives innovation, it comes with massive direct and indirect costs.

Modern AI models consume enormous computational resources, resulting in significant energy usage and carbon emissions. Their carbon footprint will spiral in the absence of mitigation measures, yet hardly any state-of-the-art models incorporate significant mitigation strategies. Even China’s DeepSeek models, which their developers say were trained with a fraction of the traditional computing power, do not address every dimension of this critical challenge.

Training a relatively old Nvidia Megatron-LM model consumed enough energy to power three American households for a year. This unsustainable trend continues: the computational resources required to train a best-in-class AI model are doubling every 100 days.

On the usage side, a single prompt to a modern image-generation AI model can consume nearly as much energy as fully charging a smartphone. The cumulative energy consumption becomes staggering when millions of such requests occur daily across popular AI platforms.

The rapid evolution of AI models drives exponential growth in complexity. The size of large models has grown from OpenAI GPT-3’s 175 billion parameters to Google AI PaLM’s 540 billion, and further to GPT-4’s estimated trillions. Each leap forward in model capacity amplifies energy consumption and environmental impact.

In 2024, data centres consumed approximately 4 per cent of U.S. electricity, with projections from American Electric Power Co. anticipating a dramatic surge from eight terawatt-hours in 2024 to 652 TWh in 2030. This massive energy demand has prompted major tech companies such as Amazon Web Services and Google to invest significantly in nuclear energy solutions.

The AI boom’s environmental impact extends beyond carbon emissions and electricity consumption to a threatening scale of water use, with data centres requiring massive water resources for cooling. Projections indicate AI water usage will reach 1.7 trillion gallons by 2027 – exceeding half the total annual water consumption of the United Kingdom.

The Green AI movement explores solutions to reduce resource consumption of AI models while maintaining their performance. These initiatives aim to develop more efficient approaches to AI development and deployment, minimizing environmental impact through optimized resource usage.

DeepSeek models, according to their developers, have recently demonstrated that innovative techniques can significantly reduce energy footprint and computing resources without compromising performance. These green solutions include model compression through pruning, quantization and knowledge distillation, along with efficient model training via mixed-precision training, dynamic batch sizing and gradient sparsification.
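To make one of these compression techniques concrete, the sketch below (an illustrative example in Python with NumPy, not drawn from DeepSeek’s or any vendor’s actual code) shows post-training 8-bit quantization: 32-bit floating-point weights are mapped to 8-bit integers plus a single scale factor, shrinking the tensor’s memory footprint roughly fourfold at the cost of a small, bounded rounding error.

```python
import numpy as np

# Simulate a layer's weights as stored in float32 (4 bytes per value).
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.02, size=(1024, 1024)).astype(np.float32)

# Symmetric quantization: choose a scale so the largest-magnitude
# weight maps to 127, the top of the signed 8-bit range.
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# At inference time, dequantize to approximate the original values.
deq = q_weights.astype(np.float32) * scale

compression = weights.nbytes / q_weights.nbytes  # 4 bytes -> 1 byte
max_error = np.abs(weights - deq).max()          # bounded by scale / 2

print(f"compression: {compression:.1f}x, max error: {max_error:.6f}")
```

Served at scale, this kind of reduction compounds: a model that is four times smaller needs less memory bandwidth and less energy per request, which is precisely the trade-off the Green AI movement is pursuing.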

Energy-efficient architectures using neural architecture search and sparse designs further reduce resource usage. Additionally, choosing task-specific energy-efficient programming languages and machine-learning frameworks has shown promise in reducing an AI system’s environmental impact.

The real question is: Will these efforts be sufficient in helping us reduce AI’s increasing carbon emissions?

Despite promising efficiency improvements in AI, current efforts to reduce its environmental impact fall short. Technical solutions alone cannot overcome the exponential growth in AI’s resource consumption. More importantly, the AI community’s intense focus on performance metrics overshadows environmental concerns.

Major tech companies prioritize model capabilities over sustainability when developing large-scale AI systems. It’s the classic Jevons paradox. Increased efficiency induces more demand and does little to reduce overall consumption. For meaningful change, sustainability must be a fundamental consideration and a core design principle rather than an afterthought.

We can’t remain passive in the face of this alarming challenge. Each stakeholder in the AI ecosystem has a responsibility it must sincerely assume.

Large AI companies must shift from purely performance-driven development to sustainable practices. The current AI race incentivizes scaling models for marginal performance gains while neglecting environmental costs. As part of corporate social responsibilities, companies must implement transparent energy consumption reporting through public and open datasets and adopt mechanisms such as carbon credit systems to avoid rampant unsustainable growth.

The Green AI research community must drive innovation through technical standards development and evaluation framework reform. This includes establishing energy-aware architecture principles, standardized efficiency metrics, and automated monitoring and reporting tools and techniques. Research evaluation must evolve to incorporate efficiency alongside performance through integrated energy metrics and dedicated sustainability leaderboards.

Government and policy makers must create frameworks incentivizing sustainable practices through economic and regulatory measures. This includes implementing AI-specific carbon pricing mechanisms and energy-based pricing models reflecting environmental costs. Regulations may consider mandating efficiency standards, energy consumption caps, standardized impact reporting and progressive renewable energy targets while promoting innovation and accountability.

Growth without sustainability is a mirage. While AI accelerates our progress, only sustainable AI ensures we can keep enjoying it.
