ChatGPT: tackle the growing carbon footprint of generative AI

Research output: Journal article publication › Comment/debate/erratum

55 Citations (Scopus)

Abstract

Microsoft, Google and Meta are investing billions of dollars in generative artificial intelligence (AI) such as the large language model ChatGPT, released by OpenAI in San Francisco, California. But the features that make these models so much more powerful than their predecessors also impose a much heavier toll on the environment. We propose a framework for more sustainable development of generative AI.

A surge in complexity allows large language models to produce intelligent text, but consumes substantially more electricity than previous versions did. Based on the number of specialized GPUs (graphics processing units) shipped in 2022, this could amount to about 9,500 gigawatt-hours of electricity — comparable to Google’s energy consumption in 2018. Ever-more-sophisticated hardware will increase the speed and capacity for training on huge data sets, spurring growth of generative AI models that consume yet more energy.
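The order of magnitude of such an estimate can be checked with a back-of-envelope calculation. The numbers below (GPU shipments, average power draw, utilization) are illustrative assumptions, not figures from the article; they merely show that plausible inputs land in the same range as the ~9,500 GWh cited above.

```python
# Hypothetical back-of-envelope estimate of annual electricity use by AI GPUs.
# All inputs are assumptions for illustration, not data from the article.
gpus_shipped = 4_000_000   # assumed specialized GPUs shipped in 2022
avg_power_kw = 0.4         # assumed average draw per GPU, in kilowatts
utilization = 0.7          # assumed fraction of the year the GPUs are running
hours_per_year = 8_760

# kW × hours → kWh; divide by 1e6 to convert kWh to GWh
energy_gwh = gpus_shipped * avg_power_kw * hours_per_year * utilization / 1e6
print(f"{energy_gwh:,.0f} GWh")  # prints "9,811 GWh"
```

With these assumed inputs the total comes out near 9,800 GWh — the same order of magnitude as the article's ~9,500 GWh figure, illustrating how sensitive the estimate is to the assumed shipment count, power draw and utilization.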

When developing generative AI, carbon emissions can be cut by tailoring the structure of the model, by promoting energy-efficient hardware and by using clean energy sources. Optimizing the operation of AI models will also reduce the number of inefficient steps (see, for example, go.nature.com/3jh3nzy and go.nature.com/3jbtfga).
Original language: English
Pages (from-to): 586
Number of pages: 1
Journal: Nature
Volume: 615
Issue number: 7953
DOIs
Publication status: Published - 23 Mar 2023
Externally published: Yes

Keywords

  • Energy
  • Machine learning
  • Renewable energy

ASJC Scopus subject areas

  • General
