Today’s most powerful new data center GPUs for AI workloads can consume as much as 700 watts apiece. At 61% annual utilization, a single GPU would account for about 3,740,520 Wh, or 3.74 MWh, per year, fueling concerns about the availability of power and environmental impacts, especially when we zoom out and look at the total number of GPUs sold last year alone. Nvidia sold 3.76 million data center GPUs last year, accounting for 98% of the market. Adding the remaining 2% from Intel, AMD, and other players brings the total to over 3,836,000 GPUs delivered to data centers in 2023.

Multiply that per-GPU figure by the total data center GPU deliveries last year, and you get 14,348.63 GWh of electricity used in a year. To put this in context, the average American home uses 10,791 kWh per year, meaning the data center GPUs sold last year consume the same amount of power that 1.3 million households use annually. At the state level, the California Energy Commission reported that California produced 203,257 GWh in 2022, meaning these GPUs would consume about 7% of the state’s annual production. (A short sketch that reproduces these numbers appears at the end of this article.)

However, remember that these are just the data center GPUs. The figures do not include the CPUs, cooling systems, and other equipment data centers need to run properly. For example, most RTX 4090s recommend a minimum 850-watt power supply for the whole system, with some requiring 1,000 or even 1,200 watts. If we apply the same 61% utilization math to an 850-watt per-system draw, the machines built around last year’s data center GPUs would require more than 17,000 GWh annually. These numbers do not even include data centers from 2022 and earlier, nor the many more coming online this year.

Industry analysts estimate that the data center GPU market will grow by 34.6% year-on-year until 2028, meaning we will likely see even more data center GPUs pumped out in the coming years. Furthermore, Nvidia’s next generation of AI GPUs is expected to draw more power than the current 700-watt H100. Even if data center computers were to hold their power consumption steady in the coming years (they won’t), power demand should still increase proportionally with the market’s growth.

This unprecedented rise of data centers is raising concerns about our power infrastructure. In fact, the U.S. government is already in talks with tech companies about their AI electricity demands, especially as these new data centers could put undue pressure on the grid. Meta founder Mark Zuckerberg even says that limited power will constrain AI growth, especially as Enerdata noted that global power production only rose by 2.5% per year over the last decade.

Nevertheless, tech companies are not blind to this issue. Microsoft, traditionally a software company, is even looking to invest in small modular nuclear reactors for its data centers. That is especially important as it has partnered with OpenAI to build a $100 billion AI supercomputer, which would require a tremendous amount of power.

The rise of AI in our data-driven society means we need a lot of electricity to power our computing requirements. Furthermore, we mustn’t forget other upcoming technologies that also need a lot of juice, like EVs. Unless we find a way to develop chips (and motors) that deliver more performance while consuming less power, we’ll likely have to add more power production facilities and upgrade the supporting infrastructure to deliver that power where it’s needed.
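For readers who want to double-check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. All inputs are the figures cited above; the 61% utilization rate and the flat 700-watt draw are simplifying assumptions rather than measured values, and the closing loop merely scales shipments by the analysts’ projected 34.6% annual growth rate.

```python
# Back-of-the-envelope check of the article's estimates.
# 61% utilization and a flat 700 W draw are simplifying assumptions.

TDP_WATTS = 700                # per-GPU draw (e.g., Nvidia's H100)
UTILIZATION = 0.61             # assumed annual utilization
HOURS_PER_YEAR = 8760

# Annual energy per GPU, in watt-hours
wh_per_gpu = TDP_WATTS * UTILIZATION * HOURS_PER_YEAR
print(f"Per GPU: {wh_per_gpu:,.0f} Wh (~{wh_per_gpu / 1e6:.2f} MWh)")
# -> Per GPU: 3,740,520 Wh (~3.74 MWh)

GPUS_SHIPPED_2023 = 3_836_000  # Nvidia's ~3.76M plus ~2% from AMD, Intel, et al.
fleet_gwh = wh_per_gpu * GPUS_SHIPPED_2023 / 1e9
print(f"2023 fleet: {fleet_gwh:,.2f} GWh per year")
# -> 2023 fleet: 14,348.63 GWh per year

HOUSEHOLD_KWH = 10_791         # average annual US household consumption
print(f"Household equivalent: {fleet_gwh * 1e6 / HOUSEHOLD_KWH:,.0f} homes")
# -> ~1.33 million homes (the article's ~1.3 million)

CA_GENERATION_GWH = 203_257    # California's 2022 in-state generation
print(f"Share of California's 2022 output: {fleet_gwh / CA_GENERATION_GWH:.1%}")
# -> ~7.1%

# Rough extrapolation: if shipments track the projected 34.6% annual market
# growth and per-GPU draw stays flat (it won't), each year's new GPUs add:
for year in range(2024, 2029):
    fleet_gwh *= 1.346
    print(f"{year}: ~{fleet_gwh:,.0f} GWh from that year's shipments")
```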



Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, focusing on tech hardware and consumer electronics.
