Under the hood of generative AI

Based on an infographic created by the Estampa team entitled “Cartography of generative AI”, EPFL’s sustainability team has created an interactive workshop that explores the various environmental and social impacts of generative artificial intelligence. A description and list of references for each impact are provided here.

Energy consumption

Generative AI relies on large-scale computation, particularly GPU- and TPU-accelerated hardware, for both training and inference. Training a large language model can consume gigawatt-hours of electricity, comparable to the annual electricity use of hundreds of households. Despite optimization and efficiency gains, overall consumption keeps rising as models grow larger and usage increases. In some territories, the arrival of data centers is also driving up electricity prices.

  • Strubell, E., Ganesh, A., McCallum, A. Energy and Policy Considerations for Deep Learning in NLP. ACL, 2019.
  • Masanet, E. et al. Recalibrating global data center energy-use estimates. Science, 2020.
  • International Energy Agency. Energy and AI. IEA Report, April 2025.
  • Patterson, D. et al. The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink. IEEE Computer, 2022.
  • Saul, J. et al. AI Data Centers Are Sending Power Bills Soaring. Bloomberg Technology, Sept. 30, 2025.

Carbon footprint

The carbon footprint of generative AI varies with energy sources and location. Training a single large model may emit hundreds of tonnes of CO₂-equivalent, particularly when powered by fossil-fuel-based grids. Transparency around emissions remains limited, and because electricity worldwide still comes mostly from fossil fuels, AI usage remains carbon intensive.
While the usage phase remains the main source of AI's carbon impact, the manufacturing of servers is often far from negligible and adds to the overall environmental cost.

  • Lacoste, A. et al. Quantifying the Carbon Emissions of Machine Learning. NeurIPS Workshop, 2019.
  • Patterson, D. et al. Carbon Emissions and Large Neural Network Training. arXiv, 2022.
  • Bender, E. et al. On the Dangers of Stochastic Parrots. FAccT, 2021.
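The magnitude of such estimates follows from simple arithmetic: emissions scale with the energy consumed and the carbon intensity of the grid supplying it. A minimal sketch in Python, where the grid intensities and the 1 GWh training budget are illustrative assumptions, not measurements of any real model:

```python
# Training emissions = energy consumed x carbon intensity of the grid.
# All figures below are illustrative assumptions, not measured values.

GRID_INTENSITY_KG_PER_KWH = {
    "coal-heavy grid": 0.80,      # assumed kg CO2-eq per kWh
    "world average": 0.45,
    "hydro-dominated grid": 0.02,
}

def training_emissions_tonnes(energy_mwh: float, kg_per_kwh: float) -> float:
    """Convert a training energy budget (MWh) into tonnes of CO2-equivalent."""
    return energy_mwh * 1000 * kg_per_kwh / 1000  # MWh -> kWh, then kg -> t

energy_mwh = 1000  # hypothetical training run consuming 1 GWh
for grid, intensity in GRID_INTENSITY_KG_PER_KWH.items():
    tonnes = training_emissions_tonnes(energy_mwh, intensity)
    print(f"{grid}: {tonnes:.0f} t CO2-eq")
```

Under these assumed intensities, the same 1 GWh run spans roughly 20 to 800 tonnes of CO₂-equivalent depending on the grid, which is why location matters as much as model size.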

Cooling needs

High-density AI workloads generate substantial heat, increasing reliance on energy-intensive cooling systems, including liquid cooling and evaporative cooling in hyperscale data centers. In the best case, the waste heat can be reused in district heating networks (households, public swimming pools, etc.).

  • Shehabi, A. et al. 2024 United States Data Center Energy Usage Report. Lawrence Berkeley National Laboratory, 2024.
  • Li, P. et al. Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. Nature Climate Change, 2023.

Water consumption

AI-driven data centers consume large volumes of freshwater for cooling, sometimes competing with local water needs. Some estimates put consumption at millions of litres per day for large facilities. In territories with abundant water resources (e.g. Switzerland, Canada), using a small portion for cooling servers may not pose a significant issue. However, in territories with existing water stress (Spain, Uruguay, Arizona in the US), it can worsen difficulties in accessing drinking water.

  • Shehabi, A. et al. United States Data Center Energy Usage Report. LBNL, 2018.
  • Li, P. et al. Making AI Less “Thirsty”. Nature Climate Change, 2023.
  • The Guardian. AI data centers are draining water resources. 2023.
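The "millions of litres per day" order of magnitude can be reproduced with a back-of-envelope calculation based on water usage effectiveness (WUE, litres of water consumed per kWh of IT energy). A hedged sketch, where the 100 MW facility size and the WUE value are illustrative assumptions:

```python
# Daily cooling-water consumption = IT energy per day x WUE.
# Facility size and WUE below are illustrative assumptions.

def daily_water_litres(it_power_mw: float, wue_l_per_kwh: float) -> float:
    """Litres of water consumed per day for a constant IT load."""
    kwh_per_day = it_power_mw * 1000 * 24  # MW -> kW, times 24 hours
    return kwh_per_day * wue_l_per_kwh

# Hypothetical 100 MW facility with an assumed WUE of 1.8 L/kWh
print(f"{daily_water_litres(100, 1.8):,.0f} litres per day")
```

With these assumptions the facility consumes over four million litres of water per day, consistent with the order of magnitude cited above.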

End of life for equipment

Rapid hardware turnover (GPUs, servers) shortens the lifespan of AI infrastructure, accelerating electronic waste. GPUs and servers are replaced frequently (although rising GPU market prices and the DRAM shortage may reverse this trend), while recycling rates remain low and resource recovery inefficient. In 2022, only 22% of electronic waste was properly collected and recycled.

  • Forti, V. et al. The Global E-waste Monitor 2020. United Nations University, 2020.
  • Shilov, A. Datacenter GPU Service Life Can Be Surprisingly Short. Tom’s Hardware, October 24, 2024.
  • Bordage, F. Sobriété numérique. GreenIT.fr, 2019.
  • Baldé, C. et al. The Global E-waste Monitor 2024. UNITAR and ITU, Nov. 2024.

Noise and air pollution

Although less significant, noise and air pollution remain important issues for the largest data centers (also known as hyperscalers) when they are located close to residential areas. Air pollution is mainly linked to the energy mix of the grid, but some recent data centers include on-site gas-fired power plants for electricity generation, which degrades air quality in the vicinity of these facilities.

  • Data Centers Turn to Ex-Airliner Engines as AI Power Crunch Bites. Tom’s Hardware, Oct. 25.
  • Cooper, S. P. Noisy, Hungry Data Centers Are Catching Communities by Surprise. The New York Times.

Supply chain of raw materials

AI hardware depends on critical raw materials such as cobalt, lithium, and rare earths, often extracted under environmentally and socially harmful conditions. These dependencies increase geopolitical vulnerability and instability.

  • Sovacool, B. K. et al. Sustainable minerals. Nature Energy, 2020.
  • International Energy Agency. The Role of Critical Minerals in Clean Energy Transitions. 2021.
  • SystExt reports.

Human labour

Generative AI often requires human labour for large-scale data labeling and content moderation. This work frequently relies on low-paid, precarious workers, often outsourced to the Global South, with documented psychological and social harms.

  • Gray, M. L., Suri, S. Ghost Work. Houghton Mifflin Harcourt, 2019.
  • Couldry, N., Mejias, U. The Costs of Connection. Stanford University Press, 2019.
  • Time Magazine. The Hidden Workers Behind AI. 2023.

Underwater pollution

Generative AI services rely heavily on global digital infrastructures, notably submarine fiber-optic cables, which carry more than 95% of intercontinental data traffic. Cable installation and repair can disturb seabed habitats through dredging, anchoring, and trenching, affecting benthic ecosystems and sensitive marine species.

  • Carter, L., Burnett, D., Drew, S. et al. Submarine Cables and the Oceans: Connecting the World. UNEP-WCMC & ICPC, 2009.
  • Clare, M., Wopschall, R., et al. Submarine Cable Protection and the Environment, ICPC, 2024
  • Duarte, C. M. et al. The soundscape of the Anthropocene ocean. Science, 2021.

Private-sector bottleneck and geopolitical tensions

Compute infrastructure and foundation models are controlled by a small number of firms and countries, creating strategic bottlenecks and geopolitical tensions over access to AI capabilities, both in terms of raw-material supply (see the section Supply chain of raw materials) and the actual manufacturing of GPU cards and their components (TSMC, for instance, holds a virtual monopoly on leading-edge chip fabrication). For dynamic random-access memory (DRAM) components, the situation is even more concentrated, with only four to five manufacturers worldwide.

  • Crawford, K. Atlas of AI. Yale University Press, 2021.
  • OECD. AI Compute and Climate. 2023.

Planned obsolescence

Generative AI accelerates planned obsolescence through rapid innovation cycles in hardware (GPUs, accelerators) and tightly coupled software ecosystems. New AI models are often optimized for the latest architectures, reducing the economic and technical viability of older hardware. This shortens server lifespans, increases electronic waste, and reinforces dependency on frequent upgrades. Proprietary stacks and cloud-based access further limit reuse, repair, and interoperability. As a result, efficiency gains are often offset by increased material consumption and resource extraction.

  • European Commission. Electronics and Sustainability – Right to Repair. 2022.
  • Forti, V. et al. The Global E-waste Monitor. United Nations University, 2020.
  • Crawford, K. Atlas of AI. Yale University Press, 2021.

Data extractivism

Generative AI systems are trained on massive datasets collected from the web, scientific repositories, and cultural archives, often without explicit consent or compensation, both during the training phase and the use phase. This practice is described as data extractivism, drawing parallels with historical resource extraction: value is appropriated from individuals and communities while benefits concentrate among a few private actors. Data scraping can undermine creators’ rights, distort knowledge production, and reinforce global inequalities. The opacity of training datasets and limited governance mechanisms exacerbate these ethical and legal concerns.

  • Couldry, N., Mejias, U. The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press, 2019.
  • Crawford, K. Atlas of AI. Yale University Press, 2021.
  • Birhane, A., Prabhu, V. Large image datasets: A pyrrhic win for computer vision. WACV, 2021.
  • Le Monde. Les données au cœur des conflits sur l’IA générative [Data at the heart of conflicts over generative AI]. 2023.
  • The New York Times. The Authors Guild Sues OpenAI. 2023.

Job reshape

… under construction …

AI investment capital

… under construction …

Policy reshape

… under construction …

AI social influence

Generative AI increasingly shapes social interactions, information ecosystems, and cultural production. By generating text, images, and videos at scale, it can amplify misinformation, reinforce biases, and influence public opinion, particularly through social media and automated content generation. Recommendation systems and AI-generated content blur the line between human and machine expression, affecting trust and democratic processes. At the same time, generative AI reshapes creativity, education (particularly learning processes for students), and communication practices, raising questions about authorship, cultural diversity, and power asymmetries in the digital public sphere.