SilverStone’s most powerful PSU, the HELA 2050R, just received a 450-watt upgrade. The new model, called the HELA 2500R, has four 12V-2x6 power connectors and a maximum rated continuous power output of 2,500 watts. That means you’ll need a special 16-amp, 240-volt outlet to run this beast, as the average 110-volt household outlet can only accommodate 1,800 watts.
You’re able to attach up to four RTX 4090s to the power supply with the four native 16-pin connectors, making it easier to run these high-powered devices without using up the eight-pin slots. Even if you fill every 16-pin slot with Nvidia’s top GPUs, the PSU can comfortably run them all: their combined 1,800-watt TDP (450 watts each) still leaves 700 watts of headroom for the rest of your PC. Aside from the four 12V-2x6 GPU slots, you also get four SATA, seven 8-pin PCIe, and the motherboard connectors.
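The power-budget arithmetic above can be sketched in a few lines. The wattage figures come from the article; everything else is just subtraction:

```python
# Back-of-the-envelope check of the HELA 2500R's GPU power budget.
# Figures (2,500 W PSU, 450 W per RTX 4090) are from the article.
PSU_WATTS = 2500
RTX_4090_TDP = 450
NUM_GPUS = 4

gpu_draw = NUM_GPUS * RTX_4090_TDP   # total draw of four cards
headroom = PSU_WATTS - gpu_draw      # what's left for CPU, drives, fans

print(f"GPU draw: {gpu_draw} W, headroom: {headroom} W")
# → GPU draw: 1800 W, headroom: 700 W
```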
This is SilverStone’s most potent PSU ever, but it’s not the highest-output unit you can buy today. That’s because Super Flower also launched a 2,800-watt PSU that has the same number of connectors as the HELA 2500R, but with more SATA / PERIF slots.
These new, high-powered PSUs aren’t made for gamers with deep pockets (although they can certainly use them). Instead, they cater to institutions and professionals that need multiple high-end GPUs for their computing needs, like AI training.
AI processing requires a lot of specialized hardware that you typically won’t find in CPUs. And even though we recently saw the launch of Snapdragon X processors with onboard NPUs that can hit 45 TOPS, Nvidia says that this is not enough performance to handle advanced AI tasks. The graphics card company claims that its GPUs still perform better when it comes to AI processing, which is why Nvidia is seeing massive revenues and is quickly becoming one of the most valuable companies in the world.
However, graphics cards are power-hungry, meaning companies that want to use on-device AI processing require these monster PSUs to power their workstations. In fact, Meta founder Mark Zuckerberg claims that AI growth will be constrained by power limits. If a business runs this 2,500-watt machine for just 50% of the time, it will consume almost 11,000 kWh per year — which is 500 kWh more than the average annual U.S. household power consumption.
Of course, any business that invests thousands of dollars in high-powered AI systems will want to get its money’s worth. So, you can expect these PSUs to run 24/7, meaning just one HELA 2500R has the power requirements of two households. And if a company invests in ten of these systems, then we’re looking at 20 households’ worth of power consumption at a single premises.
With experts seeing AI technologies and applications consuming a quarter of America’s power production, we might soon see data centers putting up their own nuclear power plants to deliver their electrical needs. Unless the national grid keeps up with the jump in power demand in the near future, the U.S. runs the risk of falling behind other countries in the AI race.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has written for several tech publications since 2021, focusing on tech hardware and consumer electronics.