Based on “Cartography of generative AI”, an infographic by the Estampa team, EPFL’s sustainability team has developed an interactive workshop that explores the various environmental and social impacts of generative artificial intelligence. A description and a list of references for each impact are provided here.
Energy consumption
Generative AI relies on large-scale computation, particularly on GPU- and TPU-accelerated hardware, for both training and inference. Training a large language model can consume gigawatt-hours of electricity, comparable to the annual electricity use of hundreds of households. Despite efficiency gains, demand keeps rising as models grow larger and usage increases.
- Strubell, E., Ganesh, A., McCallum, A. Energy and Policy Considerations for Deep Learning in NLP. ACL, 2019.
- Masanet, E. et al. Recalibrating global data center energy-use estimates. Science, 2020.
- International Energy Agency. Energy and AI. IEA Report, April 2025.
Carbon footprint
The carbon footprint of Gen AI varies with energy sources and location. Training a single large model may emit hundreds of tonnes of CO₂-equivalent, particularly when powered by fossil-based grids, and transparency around emissions remains limited. Because most grid electricity still comes from fossil fuels, AI usage remains carbon-intensive.
While usage remains the main source of carbon impact, the manufacturing of servers is often not negligible and adds to the environmental cost of AI.
- Lacoste, A. et al. Quantifying the Carbon Emissions of Machine Learning. NeurIPS Workshop, 2019.
- Patterson, D. et al. Carbon Emissions and Large Neural Network Training. arXiv, 2022.
- Bender, E. et al. On the Dangers of Stochastic Parrots. FAccT, 2021.
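The dependence on grid mix described above can be illustrated with a back-of-envelope calculation: emissions equal energy consumed times the grid’s carbon intensity. The sketch below uses illustrative, assumed figures (a hypothetical 1,300 MWh training run and two indicative grid intensities), not measured values for any real model.

```python
# Back-of-envelope estimate of training emissions (illustrative figures):
# emissions = energy consumed x grid carbon intensity.

def training_emissions_tco2e(energy_mwh: float, grid_kgco2e_per_kwh: float) -> float:
    """Return estimated emissions in tonnes of CO2-equivalent."""
    kwh = energy_mwh * 1000            # MWh -> kWh
    kg = kwh * grid_kgco2e_per_kwh     # kWh x kgCO2e/kWh
    return kg / 1000                   # kg -> tonnes

# Hypothetical 1,300 MWh training run on two different grids:
fossil_heavy = training_emissions_tco2e(1300, 0.7)   # assumed coal-heavy grid
low_carbon = training_emissions_tco2e(1300, 0.03)    # assumed hydro/nuclear grid

print(f"fossil-heavy grid: {fossil_heavy:.0f} tCO2e")  # → 910 tCO2e
print(f"low-carbon grid:   {low_carbon:.0f} tCO2e")    # → 39 tCO2e
```

The roughly twenty-fold spread between the two results shows why the same training run can emit hundreds of tonnes of CO₂-equivalent on one grid and only a few dozen on another.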
Cooling needs
High-density AI workloads generate substantial heat, increasing reliance on energy-intensive cooling systems, including liquid cooling and evaporative cooling in hyperscale data centers. In the best case, this heat can be reused in urban heating networks (households, public swimming pools, etc.).
- Shehabi, A. et al. 2024 United States Data Center Energy Usage Report. Lawrence Berkeley National Laboratory, 2024.
- Li, P. et al. Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. Nature Climate Change, 2023.
Water consumption
AI-driven data centers consume large volumes of freshwater for cooling, sometimes competing with local water needs; some estimates reach millions of liters per day for large facilities. Where water is abundant (Switzerland, Canada), dedicating a portion of it to cooling servers raises few issues. But in regions already facing water stress (Spain, Uruguay, Arizona in the US), it can make access to drinking water harder for residents.
- Shehabi, A. et al. United States Data Center Energy Usage Report. LBNL, 2018.
- Li, P. et al. Making AI Less “Thirsty”. Nature Climate Change, 2023.
- The Guardian. AI data centers are draining water resources. 2023.
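The order of magnitude quoted above can be sketched with a simple product: on-site water evaporated roughly equals IT energy times a water usage effectiveness (WUE) factor. The facility size and WUE value below are assumptions for illustration, not data about any specific data center.

```python
# Rough sketch of data-center cooling water use (assumed figures):
# water evaporated ~ IT energy x water usage effectiveness (WUE, L/kWh).

def cooling_water_liters(it_energy_kwh: float, wue_l_per_kwh: float = 1.8) -> float:
    """Estimate on-site water evaporated for cooling, in liters.
    1.8 L/kWh is an often-cited industry-average WUE; real values vary
    widely with climate and cooling technology."""
    return it_energy_kwh * wue_l_per_kwh

# A hypothetical 20 MW facility running for one day:
daily_kwh = 20_000 * 24                                 # 480,000 kWh
print(f"{cooling_water_liters(daily_kwh):,.0f} L/day")  # → 864,000 L/day
```

Scaling the same arithmetic to a hypothetical 100 MW hyperscale site yields several million liters per day, consistent with the estimates cited above.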
End of life for equipment
Rapid hardware turnover (GPUs, servers) shortens the lifespan of AI infrastructure, accelerating electronic waste. GPUs and servers are replaced frequently, while recycling rates remain low and resource recovery inefficient. In 2022, only 22% of electronic waste was properly collected and recycled.
- Forti, V. et al. The Global E-waste Monitor 2020. United Nations University, 2020.
- Bordage, F. Sobriété numérique. GreenIT.fr, 2019.
- Baldé, C. et al. The Global E-waste Monitor 2024. UNITAR and ITU, Nov. 2024.
Noise and air pollution
Although less significant, noise and air pollution remain important issues for the largest data centers (hyperscalers) when they are located close to residential areas. Air pollution is mainly linked to the energy mix of the grid, but some recent data centers are being built with on-site gas-fired generation, which degrades air quality in the vicinity of these facilities.
- Data centers turn to ex-airliner engines as AI power crunch bites. Tom’s Hardware, Oct. 2025.
- Cooper, S. P. Noisy, Hungry Data Centers Are Catching Communities by Surprise. The New York Times.
Supply chain of raw materials
AI hardware depends on critical raw materials such as cobalt, lithium, and rare earths, often extracted under environmentally and socially harmful conditions. These dependencies increase geopolitical vulnerability.
- Sovacool, B. K. et al. Sustainable minerals. Nature Energy, 2020.
- International Energy Agency. The Role of Critical Minerals in Clean Energy Transitions. 2021.
- SystExt reports.
Human labour
Gen AI depends on hidden human labor: large-scale data labeling and content moderation often rely on low-paid, precarious workers, frequently outsourced to the Global South, with documented psychological and social harms.
- Gray, M. L., Suri, S. Ghost Work. Houghton Mifflin Harcourt, 2019.
- Couldry, N., Mejias, U. The Costs of Connection. Stanford University Press, 2019.
- Time Magazine. The Hidden Workers Behind AI. 2023.
Underwater pollution
Generative AI services rely heavily on global digital infrastructures, notably submarine fiber-optic cables, which carry more than 95% of intercontinental data traffic. The rapid growth of cloud computing and Gen AI increases data flows, indirectly intensifying the deployment, maintenance, and upgrading of these undersea cables. Cable installation and repair can disturb seabed habitats through dredging, anchoring, and trenching, affecting benthic ecosystems and sensitive marine species.
- Carter, L., Burnett, D., Drew, S. et al. Submarine Cables and the Oceans: Connecting the World. UNEP-WCMC & ICPC, 2009.
- Clare, M., Wopschall, R., et al. Submarine Cable Protection and the Environment, ICPC, 2024
- Duarte, C. M. et al. The soundscape of the Anthropocene ocean. Science, 2021.
Private-sector bottleneck and geopolitical tensions
Compute infrastructure and foundation models are controlled by a small number of firms and countries, creating strategic bottlenecks and geopolitical tensions over access to AI capabilities, both in raw material supply and in the manufacturing of GPUs, where NVIDIA (chip design) and ASML (lithography equipment) hold virtual monopolies. For dynamic random-access memory (DRAM), the situation is even worse, with only four or five manufacturers worldwide.
- Crawford, K. Atlas of AI. Yale University Press, 2021.
- OECD. AI Compute and Climate. 2023.
Planned obsolescence
Generative AI accelerates planned obsolescence through rapid innovation cycles in hardware (GPUs, accelerators) and tightly coupled software ecosystems. New AI models are often optimized for the latest architectures, reducing the economic and technical viability of older hardware. This shortens server lifespans, increases electronic waste, and reinforces dependency on frequent upgrades. Proprietary stacks and cloud-based access further limit reuse, repair, and interoperability. As a result, efficiency gains are often offset by increased material consumption and resource extraction.
- European Commission. Electronics and Sustainability – Right to Repair. 2022.
- Forti, V. et al. The Global E-waste Monitor. United Nations University, 2020.
- Crawford, K. Atlas of AI. Yale University Press, 2021.
Data extractivism
Generative AI systems are trained on massive datasets collected from the web, scientific repositories, and cultural archives, often without explicit consent or compensation, during both the training and usage phases. This practice is described as data extractivism, drawing parallels with historical resource extraction: value is appropriated from individuals and communities while the benefits concentrate among a few private actors. Data scraping can undermine creators’ rights, distort knowledge production, and reinforce global inequalities. The opacity of training datasets and limited governance mechanisms exacerbate these ethical and legal concerns.
- Couldry, N., Mejias, U. The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press, 2019.
- Crawford, K. Atlas of AI. Yale University Press, 2021.
- Birhane, A., Prabhu, V. Large image datasets: A pyrrhic win for computer vision. WACV, 2021.
- Le Monde. Les données au cœur des conflits sur l’IA générative. 2023.
- The New York Times. The Authors Guild Sues OpenAI. 2023.
Job reshape
… under construction …
AI investment capital
… under construction …
Policy reshape
… under construction …
AI social influence
Generative AI increasingly shapes social interactions, information ecosystems, and cultural production. By generating text, images, and videos at scale, Gen AI can amplify misinformation, reinforce biases, and influence public opinion, particularly through social media and automated content generation. Recommendation systems and AI-generated content blur the line between human and machine expression, affecting trust and democratic processes. At the same time, Gen AI reshapes creativity, education (particularly students’ learning processes), and communication practices, raising questions about authorship, cultural diversity, and power asymmetries in the digital public sphere.