
Generative AI’s Energy Problem Today Is Foundational

Author: IEEE Spectrum

Source: https://spectrum.ieee.org/ai-energy-consumption


While the origins of artificial intelligence (AI) can be traced back more than 60 years to the mid-20th century, the explosion of generative AI products like ChatGPT and Midjourney in the past two years has brought the technology to a new level of popularity. And that popularity comes at a steep energy cost, a reality of operations today that is often shunted to the margins and left unsaid.

Alex de Vries is a Ph.D. candidate at VU Amsterdam and founder of the digital sustainability blog Digiconomist. In a report published earlier this month in Joule, de Vries has analyzed trends in AI energy use and predicted that current AI technology could be on track to annually consume as much electricity as the country of Ireland (29.3 terawatt-hours per year).

Many generative AI tools rely on a type of natural language processing called large language models (LLMs) to first learn, and then make inferences about, language and language-like structures (like code or legal case prediction) in our world. While the training process of these LLMs typically receives the brunt of environmental concern (models can consume many terabytes of data and use over 1,000 megawatt-hours of electricity), de Vries’ report highlights that in some cases the electricity consumed while making inferences may be even higher. “You could say that a single LLM interaction may consume as much power as leaving a low brightness LED lightbulb on for one hour,” de Vries says.

Roberto Verdecchia is an assistant professor at the University of Florence and the first author of a paper published earlier this year on developing green AI solutions. He says that de Vries’ predictions may even be conservative when it comes to the true cost of AI, especially considering the non-standardized regulation surrounding this technology. “I would not be surprised if also these predictions will prove to be correct, potentially even sooner than expected,” he says. “Considering the general IT environmental sustainability trends throughout the years, and the recent popularization of LLMs, the predictions might be even deemed as conservative.”

AI’s energy problem has historically been approached through optimizing hardware, says Verdecchia. However, continuing to make microelectronics smaller and more efficient is becoming “physically impossible,” he says.
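To put de Vries’ lightbulb comparison and the Ireland-sized total in perspective, here is a minimal back-of-envelope sketch in Python. Every numeric input is an illustrative assumption (the article does not give a per-query energy figure or a bulb wattage), and the scaling ignores training, networking, and data-center overhead.

```python
# Back-of-envelope check of the lightbulb comparison and the Ireland-scale figure.
# All inputs below are illustrative assumptions, not values reported in the article.

WH_PER_QUERY = 3.0           # assumed energy per LLM inference, in watt-hours
LED_BULB_WATTS = 3.0         # assumed draw of a low-brightness LED bulb, in watts
IRELAND_TWH_PER_YEAR = 29.3  # annual consumption figure cited in the article

# One query versus one bulb-hour of electricity.
bulb_hours = WH_PER_QUERY / LED_BULB_WATTS
print(f"One query is roughly {bulb_hours:.1f} bulb-hour(s) of electricity")

# How many daily queries would it take to reach an Ireland-sized annual total?
queries_per_day = IRELAND_TWH_PER_YEAR * 1e12 / (WH_PER_QUERY * 365)
print(f"About {queries_per_day:.1e} queries per day at {WH_PER_QUERY} Wh each")
```

At these assumed values it would take tens of billions of queries a day to reach that total, which is why the projections center on AI being folded into very high-volume services rather than on any individual user’s habits.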
In their paper, published in the journal WIREs Data Mining and Knowledge Discovery, Verdecchia and his colleagues highlight several algorithmic approaches that experts are taking instead of hardware optimization. These include improving data collection and processing techniques, choosing more efficient libraries, and improving the efficiency of training algorithms. “The solutions report impressive energy savings, often at a negligible or even null deterioration of the AI algorithms’ precision,” Verdecchia says.

Yet even with work underway to improve the sustainability of AI products, de Vries says that these solutions may only succeed in helping the reach of AI grow even further. We need to consider rebound effects, de Vries says, such as “increasing efficiency leading to more consumer demand, leading to an increase in total resource usage and the fact that AI efficiency gains may also lead to even bigger models requiring more computational power.”

Ultimately, de Vries and Verdecchia concur that human self-regulation could play an equally important role in curbing the slope of AI’s energy consumption. For example, developers will need to decide whether eking out another precision point from their model is worth the jump in that model’s environmental impact, Verdecchia says. Unfortunately, this kind of self-restraint may be easier said than done, particularly when the market demands newer and better products. “In the race to produce faster and more accurate AI models, environmental sustainability is often regarded as a second-class citizen,” Verdecchia says.

De Vries argues that developers should also think critically about which products really need AI integration. For example, de Vries’ paper estimates that it would cost Google $100 billion in server costs alone if the search engine were to incorporate AI inference into every single one of its web searches (a rough sketch of how such an estimate scales appears below). “I think the biggest responsibility is with institutions that are currently forcing AI on all kinds of solutions regardless of whether it is the best fit, [because they’re] influenced by hype and fear of missing out,” he says. “It will be crucial to realize that AI is not a miracle cure and has its own limitations.”

As for users asking ChatGPT to write silly stories or generate fantastical photos, Verdecchia says that these individual habits are not going to make or break the environmental impact of these products. That said, thinking and speaking critically about the impact of these products could help move the needle in the right direction as developers work behind the scenes. “Pushing for a clear, transparent, and comparable monitoring and reporting of AI sustainability is the first step required to make AI more environmentally sustainable,” Verdecchia says.
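For readers curious where headline numbers like the Google estimate come from, the sketch below shows how such a figure can be assembled. The search volume, per-server throughput, server cost, and server power draw are illustrative assumptions, not values taken from de Vries’ paper.

```python
# Sketch of how an "AI in every search" cost and energy estimate can be assembled.
# All four inputs are illustrative assumptions, not figures from de Vries' paper.

SEARCHES_PER_DAY = 9e9               # assumed daily Google search volume
QUERIES_PER_SERVER_PER_DAY = 17_500  # assumed inference throughput of one AI server
SERVER_COST_USD = 200_000            # assumed price of one multi-GPU server
SERVER_POWER_KW = 6.5                # assumed average power draw of that server

servers_needed = SEARCHES_PER_DAY / QUERIES_PER_SERVER_PER_DAY
server_capex_usd = servers_needed * SERVER_COST_USD
annual_energy_twh = servers_needed * SERVER_POWER_KW * 24 * 365 / 1e9  # kWh -> TWh

print(f"Servers needed: {servers_needed:,.0f}")
print(f"Server cost:    ${server_capex_usd / 1e9:,.0f} billion")
print(f"Electricity:    {annual_energy_twh:.1f} TWh per year")
```

With these assumed inputs the sketch lands in the same ballpark as the figures quoted in the article (on the order of $100 billion in servers and roughly 29 TWh per year), which also illustrates how sensitive such projections are to assumptions about query volume and server efficiency.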
