Pages

Feb. 24, 2026

Making an image with generative AI uses as much energy as charging your phone




Each time you use AI to generate an image, write an email, or ask a chatbot a question, it comes at a cost to the planet.

In fact, generating an image using a powerful AI model takes as much energy as fully charging your smartphone, according to a new study by researchers at the AI startup Hugging Face and Carnegie Mellon University. However, they found that using an AI model to generate text is significantly less energy-intensive: generating text 1,000 times uses only about 16% of the energy of a full smartphone charge.

Their work, which is yet to be peer reviewed, shows that while training massive AI models is incredibly energy intensive, it's only one part of the puzzle. Most of a model's carbon footprint comes from its actual use.

The study is the first time researchers have calculated the carbon emissions caused by using an AI model for different tasks, says Sasha Luccioni, an AI researcher at Hugging Face who led the work. She hopes understanding these emissions could help us make informed decisions about how to use AI in a more planet-friendly way. 


Luccioni and her team looked at the emissions associated with 10 popular AI tasks on the Hugging Face platform, such as question answering, text generation, image classification, captioning, and image generation. They ran the experiments on 88 different models. For each of the tasks, such as text generation, Luccioni ran 1,000 prompts, and measured the energy used with a tool she developed called Code Carbon. Code Carbon makes these calculations by looking at the energy the computer consumes while running the model. The team also calculated the emissions generated by doing these tasks using eight generative models, which were trained to do different tasks. 

Generating images was by far the most energy- and carbon-intensive AI-based task. Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, is responsible for roughly as much carbon dioxide as driving the equivalent of 4.1 miles in an average gasoline-powered car. In contrast, the least carbon-intensive text generation model they examined was responsible for as much CO2 as driving 0.0006 miles in a similar vehicle. Stability AI, the company behind Stable Diffusion XL, did not respond to a request for comment.
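The driving comparisons above can be turned back into grams of CO2 with a quick back-of-envelope conversion. A minimal sketch, assuming the EPA's commonly cited figure of roughly 404 g CO2 per mile for an average gasoline car (a constant not stated in the article):

```python
# Convert the study's driving-distance comparisons into grams of CO2.
# The per-mile figure is an assumption (EPA's ~404 g CO2/mile for an
# average gasoline passenger car), not a number from the study itself.

G_CO2_PER_MILE = 404  # assumed average gasoline-car emissions, g/mile

def driving_to_co2_grams(miles: float) -> float:
    """Convert miles driven into grams of CO2 emitted."""
    return miles * G_CO2_PER_MILE

# From the article: 1,000 images ~ 4.1 miles of driving; the least
# carbon-intensive text model ~ 0.0006 miles per 1,000 generations.
images_co2 = driving_to_co2_grams(4.1)    # per 1,000 images
text_co2 = driving_to_co2_grams(0.0006)   # per 1,000 text generations

print(f"1,000 images : ~{images_co2:.0f} g CO2")
print(f"1,000 texts  : ~{text_co2:.2f} g CO2")
print(f"ratio        : ~{images_co2 / text_co2:,.0f}x")
```

The ratio between the two driving distances is what matters here: image generation comes out several thousand times more carbon-intensive per output than the most efficient text model tested.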

The study provides useful insights into AI’s carbon footprint by offering concrete numbers and reveals some worrying upward trends, says Lynn Kaack, an assistant professor of computer science and public policy at the Hertie School in Germany, where she leads work on AI and climate change. She was not involved in the research.

These emissions add up quickly. The generative-AI boom has led big tech companies to integrate powerful AI models into many different products, from email to word processing. These generative AI models are now used millions if not billions of times every single day.

The team found that using large generative models to create outputs was far more energy intensive than using smaller AI models tailored for specific tasks. For example, using a generative model to classify movie reviews according to whether they are positive or negative consumes around 30 times more energy than using a fine-tuned model created specifically for that task, Luccioni says. The reason generative AI models use much more energy is that they are trying to do many things at once, such as generate, classify, and summarize text, instead of just one task, such as classification. 

Luccioni says she hopes the research will encourage people to be choosier about when they use generative AI and opt for more specialized, less carbon-intensive models where possible. 

“If you’re doing a specific application, like searching through email … do you really need these big models that are capable of anything? I would say no,” Luccioni says.

The energy consumption associated with using AI tools has been a missing piece in understanding their true carbon footprint, says Jesse Dodge, a research scientist at the Allen Institute for AI, who was not part of the study. 

Comparing the carbon emissions from newer, larger generative models and older AI models is also important, Dodge adds. “It highlights this idea that the new wave of AI systems are much more carbon intensive than what we had even two or five years ago,” he says.

Google once estimated that an average online search used 0.3 watt-hours of electricity, equivalent to driving 0.0003 miles in a car. Today, that number is likely much higher, because Google has integrated generative AI models into its search, says Vijay Gadepally, a research scientist at MIT Lincoln Laboratory, who did not participate in the research.

Not only did the researchers find emissions for each task to be much higher than they expected, but they discovered that the day-to-day emissions associated with using AI far exceeded the emissions from training large models. Luccioni tested different versions of Hugging Face’s multilingual AI model BLOOM to see how many uses would be needed to overtake training costs. It took over 590 million uses to reach the carbon cost of training its biggest model. For very popular models, such as ChatGPT, it could take just a couple of weeks for such a model’s usage emissions to exceed its training emissions, Luccioni says. 
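The break-even logic described above is simple division: one-time training emissions divided by per-request usage emissions gives the number of uses at which usage overtakes training. A minimal sketch, where both input figures are illustrative assumptions (roughly sized so the break-even lands near the ~590 million uses the article reports), not numbers taken from the study:

```python
# Break-even between one-time training emissions and cumulative
# per-use emissions. Both constants below are illustrative
# assumptions, not figures from the Hugging Face/CMU study.

def break_even_uses(training_co2_kg: float, per_use_co2_kg: float) -> float:
    """Number of uses at which usage emissions equal training emissions."""
    return training_co2_kg / per_use_co2_kg

def days_to_break_even(uses: float, uses_per_day: float) -> float:
    """Days until cumulative usage reaches the break-even use count."""
    return uses / uses_per_day

# Assumed: ~25,000 kg CO2 to train, ~4.2e-5 kg CO2 per request.
uses = break_even_uses(training_co2_kg=25_000, per_use_co2_kg=4.2e-5)
print(f"break-even after ~{uses:,.0f} uses")

# At an assumed 10 million prompts a day:
print(f"~{days_to_break_even(uses, 10e6):.0f} days to break even")
```

The takeaway scales directly with traffic: the more heavily a model is used, the faster its cumulative usage emissions dwarf the one-time training cost.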

This is because large AI models get trained just once, but then they can be used billions of times. According to some estimates, popular models such as ChatGPT have up to 10 million users a day, many of whom prompt the model more than once. 

Studies like these make the energy consumption and emissions related to AI more tangible and help raise awareness that there is a carbon footprint associated with using AI, says Gadepally, adding, “I would love it if this became something that consumers started to ask about.”

Dodge says he hopes studies like this will help us to hold companies more accountable about their energy usage and emissions. 

“The responsibility here lies with a company that is creating the models and is earning a profit off of them,” he says. 


From FLOPs to Footprints: The Resource Cost of Artificial Intelligence

As computational demands continue to rise, assessing the environmental footprint of AI requires moving beyond energy and water consumption to include the material demands of specialized hardware. This study quantifies the material footprint of AI training by linking computational workloads to physical hardware needs. The elemental composition of the Nvidia A100 SXM 40 GB graphics processing unit (GPU) was analyzed using inductively coupled plasma optical emission spectroscopy, which identified 32 elements. The results show that AI hardware consists of about 90% heavy metals and only trace amounts of precious metals. The elements copper, iron, tin, silicon, and nickel dominate the GPU composition by mass. In a multi-step methodology, we integrate these measurements with computational throughput per GPU across varying lifespans, accounting for the computational requirements of training specific AI models at different training efficiency regimes. Scenario-based analyses reveal that, depending on Model FLOPs Utilization (MFU) and hardware lifespan, training GPT-4 requires between 1,174 and 8,800 A100 GPUs, corresponding to the extraction and eventual disposal of up to 7 tons of toxic elements. Combined software and hardware optimization strategies can reduce material demands: increasing MFU from 20% to 60% lowers GPU requirements by 67%, while extending lifespan from 1 to 3 years yields comparable savings; implementing both measures together reduces GPU needs by up to 93%. Our findings highlight that incremental performance gains, such as those observed between GPT-3.5 and GPT-4, come at disproportionately high material costs. The study underscores the necessity of incorporating material resource considerations into discussions of AI scalability, emphasizing that future progress in AI must align with principles of resource efficiency and environmental responsibility.
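The abstract's scenario analysis reduces to one formula: GPUs needed = total training FLOPs ÷ (per-GPU peak throughput × MFU × lifespan in seconds). A minimal sketch of that logic, where the A100 peak of ~312 TFLOP/s (BF16) and the GPT-4 budget of ~2.1e25 FLOPs are assumptions taken from public estimates, not figures disclosed by the paper or by OpenAI:

```python
# Scenario sketch: GPUs required to deliver a fixed training compute
# budget at a given Model FLOPs Utilization (MFU) and hardware lifespan.
# Constants are assumptions: A100 peak ~312 TFLOP/s (BF16), GPT-4
# training budget ~2.1e25 FLOPs (a common public estimate).

SECONDS_PER_YEAR = 365 * 24 * 3600

def gpus_needed(total_flops: float, peak_flops: float,
                mfu: float, years: float) -> float:
    """GPUs required to supply total_flops over the given lifespan."""
    flops_per_gpu = peak_flops * mfu * years * SECONDS_PER_YEAR
    return total_flops / flops_per_gpu

worst = gpus_needed(2.1e25, 312e12, mfu=0.20, years=1)
best = gpus_needed(2.1e25, 312e12, mfu=0.60, years=3)
print(f"20% MFU, 1-year lifespan: ~{worst:,.0f} GPUs")
print(f"60% MFU, 3-year lifespan: ~{best:,.0f} GPUs")
```

Because GPU count is inversely proportional to both MFU and lifespan, tripling MFU alone cuts the count by two thirds (matching the abstract's 67% figure), and combining a 3x MFU gain with a 3x lifespan extension cuts it by a factor of nine.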


https://arxiv.org/abs/2512.04142

From Efficiency Gains to Rebound Effects: The Problem of Jevons' Paradox in AI's Polarized Environmental Debate



As the climate crisis deepens, artificial intelligence (AI) has emerged as a contested force: some champion its potential to advance renewable energy, materials discovery, and large-scale emissions monitoring, while others underscore its growing carbon footprint, water consumption, and material resource demands. Much of this debate has concentrated on direct impacts—energy and water usage in data centers, e-waste from frequent hardware upgrades—without addressing the significant indirect effects. This paper examines how the problem of Jevons’ Paradox applies to AI, whereby efficiency gains may paradoxically spur increased consumption. We argue that understanding these second-order impacts requires an interdisciplinary approach, combining lifecycle assessments with socio-economic analyses. Rebound effects undermine the assumption that improved technical efficiency alone will ensure net reductions in environmental harm. Instead, the trajectory of AI’s impact also hinges on business incentives and market logics, governance and policymaking, and broader social and cultural norms. We contend that a narrow focus on direct emissions misrepresents AI’s true climate footprint, limiting the scope for meaningful interventions. We conclude with recommendations that address rebound effects and challenge the market-driven imperatives fueling uncontrolled AI growth. By broadening the analysis to include both direct and indirect consequences, we aim to inform a more comprehensive, evidence-based dialogue on AI’s role in the climate crisis.

The Hidden Cost of AI


 

Dr. Sasha Luccioni Projects

  • Evaluating the carbon emissions of AI models – my longstanding project is getting a better idea of how much carbon is emitted by AI models and what factors influence it – see my “BLOOM” and “Counting Carbon” articles.
  • Stable Diffusion Bias Explorer – a demo for exploring the biases in text-to-image models like Stable Diffusion and Dall-E 2.
  • The Data Measurements Tool – a tool for exploring and analyzing common datasets used for training and evaluating Machine Learning models.
  • This Climate Does Not Exist – in which we use Generative Adversarial Networks (GANs) to visualize the potential future impacts of climate change.
  • CodeCarbon – I am contributing to creating a calculator to quantify the CO2 emissions produced during the training of AI algorithms.
  • Big Science – BigScience is a year-long research workshop on very large language models as used and studied in the field of Natural Language Processing and, more generally, Artificial Intelligence research. I am co-chairing the carbon footprint working group within the project.


Measuring the environmental impact of delivering AI at Google Scale

The transformative power of AI is undeniable - but as user adoption accelerates, so does the need to understand and mitigate the environmental impact of AI serving. However, no studies have measured AI serving environmental metrics in a production environment. This paper addresses this gap by proposing and executing a comprehensive methodology for measuring the energy usage, carbon emissions, and water consumption of AI inference workloads in a large-scale, AI production environment. Our approach accounts for the full stack of AI serving infrastructure - including active AI accelerator power, host system energy, idle machine capacity, and data center energy overhead. Through detailed instrumentation of Google's AI infrastructure for serving the Gemini AI assistant, we find the median Gemini Apps text prompt consumes 0.24 Wh of energy - a figure substantially lower than many public estimates. We also show that Google's software efficiency efforts and clean energy procurement have driven a 33x reduction in energy consumption and a 44x reduction in carbon footprint for the median Gemini Apps text prompt over one year. We identify that the median Gemini Apps text prompt uses less energy than watching nine seconds of television (0.24 Wh) and consumes the equivalent of five drops of water (0.26 mL). While these impacts are low compared to other daily activities, reducing the environmental impact of AI serving continues to warrant important attention. Towards this objective, we propose that a comprehensive measurement of AI serving environmental metrics is critical for accurately comparing models, and to properly incentivize efficiency gains across the full AI serving stack.
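The abstract's everyday equivalences can be sanity-checked with simple arithmetic. A minimal sketch, where the ~96 W television and ~0.05 mL water drop are assumed constants (the abstract quotes the equivalences but not the constants behind them):

```python
# Sanity-check the equivalences quoted in the Gemini paper abstract.
# The TV wattage and drop volume are assumptions for illustration;
# only the per-prompt figures (0.24 Wh, 0.26 mL) come from the paper.

PROMPT_WH = 0.24        # median Gemini Apps text prompt, per the paper
PROMPT_WATER_ML = 0.26  # per the paper

TV_WATTS = 96           # assumed television power draw
DROP_ML = 0.05          # assumed volume of one water drop

tv_seconds = PROMPT_WH / TV_WATTS * 3600  # Wh -> watt-seconds / watts
drops = PROMPT_WATER_ML / DROP_ML

print(f"~{tv_seconds:.0f} s of TV, ~{drops:.0f} drops of water per prompt")
```

Under these assumed constants, the arithmetic reproduces the abstract's "nine seconds of television" and "about five drops of water" per median prompt.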




Feb. 16, 2026

The dark side of green technology: what do electric vehicles really cost?

 



https://www.nature.com/articles/d41586-026-00385-3


The Elements of Power: A Story of War, Technology, and the Dirtiest Supply Chain on Earth Nicolas Niarchos Penguin & William Collins (2026)

You probably don’t think about the Democratic Republic of the Congo (DRC) when scrolling on your phone. Or about the millions of people worldwide whose job it is to dig up and sell vast quantities of metals such as cobalt, copper or tungsten. But you ought to. Electronic devices have turned the metals used in batteries into strategic resources; green technologies such as electric vehicles have accelerated the scramble for them. Metal-rich nations, from Chile to Indonesia, have been pulled into a contest between governments, multinational corporations and armed groups.

In The Elements of Power, journalist Nicolas Niarchos refuses to let the realities of the critical-mineral supply chain be overlooked. He weaves together many seemingly disparate threads, from the DRC’s colonial history to how the mineral-extraction industry has grown in several nations to battery development in leading laboratories around the world. He lays out clearly the emergence of resource nationalism and superpower competition to secure dependable supplies. Rather than a dull account of business deals, Niarchos shares a vivid story of how the greed of a handful of high-ranking individuals has hurt millions of people.