ChatGPT consumes more than 500,000 kWh of electricity per day. Is energy the bottleneck for AI development?

The New Yorker recently reported that ChatGPT's daily electricity consumption may exceed 500,000 kWh, roughly 17,000 times the daily consumption of an average American household. Elon Musk has likewise predicted that power shortages will become the main constraint on AI development within the next two years. But is this really the case?

At present, views on AI's power consumption are based mainly on estimates rather than measured data. A report released by the Information Technology and Innovation Foundation (ITIF), a US think tank, argues that some inaccurate studies have greatly overestimated AI's energy consumption. Such claims may adversely affect AI development and hinder AI from realizing its potential to save energy, cut emissions, and address environmental challenges. Industry insiders are calling for greater information transparency across the industry and less misuse of AI technology.

The challenge of assessing AI energy consumption: many influencing factors

ITIF is a non-profit think tank headquartered in Washington, D.C. In its report "Rethinking Concerns About AI's Energy Use," ITIF points out that energy consumption and carbon emissions differ enormously across AI models and are affected by many factors, including chip design, cooling systems, data center design, software techniques, workload, and power source.

As a result, different studies estimating AI's energy consumption have reached very different conclusions. A 2019 preprint by a team at the University of Massachusetts Amherst estimated that training BERT, Google's then-leading large language model, for 79 hours emitted about 1,438 pounds (roughly 652 kilograms) of carbon dioxide, about as much as one passenger's flight between New York and San Francisco. The study also estimated the emissions of other techniques, such as neural architecture search (NAS). The paper has been cited nearly 3,000 times on Google Scholar and received extensive media coverage.

However, companies and institutions that develop AI have reached very different conclusions. In 2021, Google and the University of California, Berkeley released a preprint study arguing that the earlier work had overestimated Google's AI carbon emissions by a factor of 88. This study, however, received far less attention, with only about 500 citations.

Because the media and the public tend to focus on negative information, research with alarming conclusions spreads more easily. The words and actions of technology-industry celebrities also amplify the message that "AI consumes enormous amounts of energy." Musk has predicted that "the factor restricting the development of AI will change from 'silicon shortage' to 'power shortage'", and OpenAI CEO Sam Altman has likewise said that AI will consume more electricity and has made high-profile investments in nuclear fusion.

Evaluating AI energy consumption from a whole-life-cycle perspective

The ITIF report points out that many current studies and policies focus on the AI training phase, but research has shown that AI consumes more energy during inference, that is, when people actually use AI to produce outputs. In addition, different AI models handle different types of inference tasks, and their energy consumption varies greatly: per 1,000 requests, image classification consumes about 0.007 kWh, while image generation consumes about 2.907 kWh.
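To make these figures concrete, here is a minimal back-of-the-envelope sketch in Python, using only the per-1,000-request numbers quoted above, to express them per request:

```python
# Per-request inference energy, derived from the per-1,000-request figures above.
kwh_per_1000_requests = {
    "image_classification": 0.007,  # kWh per 1,000 requests
    "image_generation": 2.907,      # kWh per 1,000 requests
}

for task, kwh in kwh_per_1000_requests.items():
    kwh_per_request = kwh / 1000             # energy for a single request, in kWh
    wh_per_request = kwh_per_request * 1000  # same value expressed in Wh
    print(f"{task}: ~{wh_per_request:.3f} Wh per request")

ratio = kwh_per_1000_requests["image_generation"] / kwh_per_1000_requests["image_classification"]
print(f"Image generation uses roughly {ratio:.0f}x the energy of classification per request")
```

On these figures, a single image-generation request uses roughly 415 times the energy of a classification request, which is why per-task measurements matter more than any single headline number.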

The report's authors point out that AI training is a one-time event, whereas its use is an ongoing process; when discussing AI energy consumption, the focus should be on long-term impact rather than short-term explosive growth. Moreover, judging from the history of technology, the growth of AI and of its energy consumption will be constrained by the following four factors:

1. Infrastructure construction costs will limit the rapid growth of AI

The conclusion that "ChatGPT's daily electricity consumption may exceed 500,000 kWh" comes from an estimate by Alex de Vries, the author of the technology blog Digiconomist. De Vries has also predicted that, in the worst case, Google's AI electricity consumption could match that of the entire country of Ireland, reaching 29.3 TWh (terawatt-hours) per year. But he also pointed out that reaching such a scale would require Google to invest roughly $100 billion in chips, on top of billions of dollars in data center operations and electricity costs. If the operating costs of AI remain high, profit-seeking companies will naturally slow down and scale back their investment.
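As a rough consistency check of these estimates, the sketch below (assuming an average US household uses about 29 kWh per day, roughly 10,500 kWh per year) relates the ChatGPT figure to the household comparison in the opening and to de Vries' worst-case Google figure:

```python
# Back-of-the-envelope check of the estimates quoted in this article.
chatgpt_kwh_per_day = 500_000            # estimated ChatGPT daily consumption
us_household_kwh_per_day = 29            # assumed average US household (~10,500 kWh/year)
google_worst_case_twh_per_year = 29.3    # de Vries' worst-case estimate for Google AI

households = chatgpt_kwh_per_day / us_household_kwh_per_day
print(f"ChatGPT's daily draw ~= {households:,.0f} average US households")   # ~17,000

google_kwh_per_day = google_worst_case_twh_per_year * 1e9 / 365  # TWh/year -> kWh/day
print(f"Worst-case Google AI: ~{google_kwh_per_day / 1e6:.0f} GWh/day, "
      f"about {google_kwh_per_day / chatgpt_kwh_per_day:.0f}x the ChatGPT estimate")
```

The numbers are internally consistent, but they remain estimates layered on estimates, which is precisely the point the ITIF report makes.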

2. AI performance gains face diminishing returns

In recent years, AI has made breakthroughs in many fields, which suggests it may soon hit a bottleneck: the returns from developing and operating ever-larger models will keep shrinking, and further accuracy gains will become harder to achieve. Optimizing existing AI models may therefore be the next direction for research and development.

3. Improved performance of software and hardware will reduce AI energy consumption

The optimization of AI models and advances in hardware are expected to reduce AI energy consumption. A study published in the journal Science found that between 2010 and 2018, the computing capacity of global data centers increased by 550% and storage capacity by 2,400%, yet their power consumption rose by only 6%. Innovations in hardware, virtualization, and data center design improved energy efficiency and made it possible to scale up cloud computing.

Similarly, techniques such as pruning, quantization, and distillation are expected to yield better algorithms, making AI models faster and more energy-efficient. The Google and University of California, Berkeley research team noted that, thanks to such advances, AI's energy consumption within Google has remained stable in recent years even though machine learning has grown to account for 70% to 80% of the company's computing workload.
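As one concrete illustration of these techniques (a minimal sketch, not tied to Google's internal methods), the snippet below uses PyTorch's dynamic quantization to store a toy model's linear-layer weights as 8-bit integers instead of 32-bit floats, a common way to cut memory traffic and inference energy; the model itself is a hypothetical stand-in:

```python
import io
import torch
import torch.nn as nn

# A small float32 model standing in for a much larger network (illustrative only).
model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 256),
)

# Dynamic int8 quantization: Linear-layer weights are stored as int8
# and dequantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def serialized_size_bytes(m: nn.Module) -> int:
    """Rough serialized size of a model's parameters, in bytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes

print(f"float32 model: ~{serialized_size_bytes(model) / 1e6:.1f} MB")
print(f"int8 model:    ~{serialized_size_bytes(quantized) / 1e6:.1f} MB")
```

Smaller weights mean less data moved per inference; pruning and distillation attack the same cost from different directions, by removing parameters or by training a smaller model to imitate a larger one.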

4. The application of AI will eventually reduce carbon emissions in some areas

Overall, when people send email instead of paper letters, or watch DVDs and streaming services instead of going to the cinema, the digitization of these activities reduces carbon emissions. AI is expected to continue this trend, for example by improving the video-call experience so that more meetings can be held remotely. AI can also be used for power grid dispatching and for analyzing climate data, making it an aid in combating climate change.

Accurately assessing AI energy consumption to promote healthy development of the field

From a historical perspective, such concerns are not new. In the 1990s, some predicted that half of future electricity generation would go toward Internet activity, and similar views resurfaced when streaming media emerged. Today we know those fears never materialized. The ITIF report argues that rushing to restrict AI energy consumption before it is properly understood may hinder performance improvements and limit AI's development potential. For example, making AI models eliminate bias and hate speech and avoid harmful output requires additional inference, which itself increases energy consumption.

To address public concerns about AI's energy consumption, the report recommends that policymakers:

① Establish corresponding standards to make AI model energy consumption open and transparent;

② Encourage the industry to proactively disclose AI model energy consumption information so that the public can make informed choices;

③ Consider the unintended consequences of AI regulations on energy use;

④ Use AI to achieve low-carbon government operations.

Some organizations are already pushing for AI information disclosure. In December 2023, the European Union reached agreement on the Artificial Intelligence Act, the world's first comprehensive AI regulation. The act urges AI developers to make models energy-efficient and sustainable, and requires corresponding information disclosure.

De Vries has also called for more information to be made public, and he hopes AI development will be restrained in a timely manner, just as cryptocurrency technology was. The Digiconomist blog states: "The development of emerging technologies such as AI and previously blockchain has been accompanied by a great deal of enthusiasm and fear of missing out (FOMO), which often leads to applications that have little benefit to end users."

In The New Yorker, the noted science writer Elizabeth Kolbert wrote: "Every time ChatGPT spits out information (or writes a high school essay for someone), a lot of computing is required. It is estimated that ChatGPT responds to about 200 million requests a day and consumes more than 500,000 kilowatt-hours of electricity." In other words, to deal with the AI energy consumption "crisis," perhaps we can start by limiting the use of AI to do homework.
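Taking the two numbers in that quote at face value, the implied per-request energy is easy to work out (a rough illustration, not a measured figure):

```python
# Implied per-request energy from the figures quoted in the New Yorker piece.
daily_kwh = 500_000           # estimated ChatGPT daily electricity use
daily_requests = 200_000_000  # estimated daily requests

wh_per_request = daily_kwh / daily_requests * 1000  # kWh -> Wh
print(f"~{wh_per_request:.1f} Wh per request")      # ~2.5 Wh
```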

References

[1] Elizabeth Kolbert. "The Obscene Energy Demands of A.I." The New Yorker. https://www.newyorker.com/news/daily-comment/the-obscene-energy-demands-of-ai. <2024-03-09/2024-03-09>

[2] Loz Blain. "Elon Musk: AI will run out of electricity and transformers in 2025." New Atlas. https://newatlas.com/technology/elon-musk-ai/. <2024-03-01/2024-03-12>

[3] Daniel Castro. "Rethinking Concerns About AI's Energy Use." Center for Data Innovation, ITIF. https://itif.org/publications/2024/01/29/rethinking-concerns-about-ai-energy-use/. <2024-01-29/2024-03-04>

[4] Strubell, Emma, Ananya Ganesh, and Andrew McCallum. "Energy and Policy Considerations for Deep Learning in NLP." arXiv preprint arXiv:1906.02243 (2019). https://arxiv.org/pdf/1906.02243.pdf

[5] Patterson, David, et al. "Carbon Emissions and Large Neural Network Training." arXiv preprint arXiv:2104.10350 (2021). https://arxiv.org/abs/2104.10350

[6] "OpenAI CEO Altman says at Davos future AI depends on energy breakthrough." Reuters. https://www.reuters.com/technology/openai-ceo-altman-says-davos-future-ai-depends-energy-breakthrough-2024-01-16/. <2024-01-16/2024-03-12>

[7] de Vries, Alex. "The growing energy footprint of artificial intelligence." Joule 7.10 (2023): 2191-2194. https://www.cell.com/joule/pdf/S2542-4351(23)00365-3.pdf

[8] Eric Masanet et al. "Recalibrating Global Data Center Energy-Use Estimates." Science 367, no. 6481 (February 28, 2020): 984-986. https://doi.org/10.1126/science.aba3758

[9] Patterson, David, et al. "The carbon footprint of machine learning training will plateau, then shrink." Computer 55.7 (2022): 18-28.

[10] "Artificial intelligence act: Council and Parliament strike a deal on the first rules for AI in the world." Council of the EU. https://www.consilium.europa.eu/en/press/press-releases/2023/12/09/artificial-intelligence-act-council-and-parliament-strike-a-deal-on-the-first-worldwide-rules-for-ai/. <2023-12-09/2024-03-12>

[11] "Powering AI could use as much electricity as a small country." Digiconomist. https://digiconomist.net/powering-ai-could-use-as-much-electricity-as-a-small-country/. <2023-10-10/2024-03-12>

Author: Maya Blue, popular science creator

Reviewer: Yu Yang, Head of Tencent Xuanwu Lab
