In an interview cited by The Wall Street Journal earlier this week, Rene Haas, CEO of Arm, warned of AI’s “insatiable” thirst for electricity, saying AI datacenters, which currently account for roughly 4% of U.S. power grid usage, could grow to consume as much as 25% of it.

Haas himself may have been citing an International Energy Agency report from January stating that ChatGPT consumes roughly 2.9 watt-hours of electricity per request, about 10 times as much as a standard Google search. Thus, if Google made the full hardware and software switch for its search engine, it would consume at least 11 terawatt-hours of electricity per year, up from its current 1 TWh.
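For anyone who wants to sanity-check that scaling, here is a minimal back-of-envelope sketch in Python. The 0.3 Wh-per-search baseline is an assumption implied by the IEA's "ten times" comparison, not a figure stated in this article.

```python
# Back-of-envelope check of the IEA-derived figures above. The 0.3 Wh
# per-search baseline is an assumption implied by the "ten times" gap,
# not a number stated in this article.
WH_PER_GOOGLE_SEARCH = 0.3   # Wh per standard search (assumed)
WH_PER_CHATGPT_QUERY = 2.9   # Wh per request, per the IEA report
CURRENT_SEARCH_TWH = 1.0     # Google search's estimated annual consumption

scale_factor = WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH  # ~9.7x
projected_twh = CURRENT_SEARCH_TWH * scale_factor           # ~10 TWh/year

print(f"Per-query increase: ~{scale_factor:.1f}x")
print(f"Projected annual usage: ~{projected_twh:.0f} TWh (vs. ~11 TWh cited)")
```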


The original report notes that 2.9 watt-hours is roughly enough to run a 60-watt lightbulb for just under three minutes. Mirroring that roughly tenfold gap between a ChatGPT query and a standard search, industry-wide power demand for artificial intelligence is also expected to increase tenfold.
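The lightbulb comparison checks out with trivial arithmetic, using only the report's own numbers:

```python
# The report's lightbulb illustration: how long 2.9 Wh runs a 60 W bulb.
QUERY_WH = 2.9   # Wh per ChatGPT request, per the IEA report
BULB_WATTS = 60  # a standard incandescent bulb

minutes = QUERY_WH / BULB_WATTS * 60  # Wh / W gives hours; x60 for minutes
print(f"~{minutes:.1f} minutes")      # ~2.9 minutes, i.e. just under three
```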

These statements were made ahead of an expected U.S. and Japanese partnership in AI and alongside recent developments like OpenAI’s Sora, the current version of which Factorial Funds estimates to require at least one Nvidia H100 GPU running for an hour to generate five minutes of video. Grok 3 has also been estimated to require 100,000 Nvidia H100s just for training. A single 700-watt Nvidia H100 can consume roughly 3,740 kilowatt-hours per year.
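The per-GPU figure is easy to reconstruct from the H100's 700 W rating. Note that the roughly 61% average utilization in the sketch below is an inference needed to land on the ~3,740 kWh/year estimate; it is not a number stated in the article.

```python
# Rough annual energy draw of a single Nvidia H100 at its 700 W rating.
# The ~61% utilization figure below is an inference needed to reach the
# ~3,740 kWh/year estimate above; it is not stated in the article.
H100_WATTS = 700
HOURS_PER_YEAR = 24 * 365                      # 8,760 hours

max_kwh = H100_WATTS * HOURS_PER_YEAR / 1000   # ~6,132 kWh if run flat-out
implied_utilization = 3740 / max_kwh           # ~0.61

print(f"Flat-out: ~{max_kwh:,.0f} kWh/year")
print(f"Implied utilization for 3,740 kWh/year: ~{implied_utilization:.0%}")
```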


Without great improvements to efficiency and/or greatly increased government regulation, Haas says, the current trend is “hardly very sustainable,” and he might be correct.

The US Energy Information Administration (EIA) reported that the United States generated a total of 4.24 trillion kilowatt-hours, or 4,240 terawatt-hours, of electricity in 2022, with only 22% of that coming from renewables. Total consumption came to about 3.9 trillion kWh, or 3,900 terawatt-hours, of the roughly 4,240 available.

An AI-powered Google search alone would account for roughly 11 of the approximately 340 terawatt-hours of headroom left at current generation levels, and that is before the broader growth the AI industry seems to be aiming for over the next decade. Any sustainability calculation also has to account for the growing demands of other industries and the balance of renewable to non-renewable generation. Given that the cost of power has nearly doubled since 1990 (per Statista), perhaps calls for more regulation are justified.
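Putting the EIA figures together, the headroom argument looks like this (a simple sketch using only the numbers cited above):

```python
# Headroom math using the EIA 2022 figures cited above (all in TWh).
GENERATED_TWH = 4240
CONSUMED_TWH = 3900
RENEWABLE_SHARE = 0.22
AI_SEARCH_TWH = 11                                # projected AI-powered-search figure

headroom_twh = GENERATED_TWH - CONSUMED_TWH       # ~340 TWh of slack
share_of_headroom = AI_SEARCH_TWH / headroom_twh  # ~3% taken by search alone
renewable_twh = GENERATED_TWH * RENEWABLE_SHARE   # ~933 TWh from renewables

print(f"Headroom at current generation: ~{headroom_twh} TWh")
print(f"AI-powered search alone: ~{share_of_headroom:.0%} of that headroom")
print(f"Renewable generation: ~{renewable_twh:,.0f} TWh of the total")
```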


Of course, outlets like The New York Times are also outright suing OpenAI and Microsoft, so it’s not as though the current AI industry is without existing legal challenges. Rene Haas expressed hope that the international partnership between Japan and the U.S. may yet bring these dramatically high power projections down. However, corporate greed and compute demand are also international, so only time will tell.

Christopher Harper has been a successful freelance tech writer specializing in PC hardware and gaming since 2015, and ghostwrote for various B2B clients in high school before that. Outside of work, Christopher is best known to friends and rivals as an active competitive player in various esports (particularly fighting games and arena shooters) and a purveyor of music ranging from Jimi Hendrix to Killer Mike to the Sonic Adventure 2 soundtrack.