A.I. Could Soon Need as Much Electricity as an Entire Country


In 2022, the data centers that power the world’s computing, including Amazon’s cloud and Google’s search engine, used about 1 to 1.3 percent of the world’s electricity. That figure excludes cryptocurrency mining, which used another 0.4 percent, though some of those resources are now being redeployed to run A.I.

Alex de Vries, a Ph.D. student at Vrije Universiteit Amsterdam, founded the research company Digiconomist, which publishes the Bitcoin Energy Consumption Index. It’s impossible to quantify A.I.’s energy use exactly, because companies like OpenAI disclose very few details, including how many specialized chips they need to run their software. So de Vries came up with a way to estimate electricity consumption from projected sales of Nvidia A100 servers, the hardware estimated to be used by 95 percent of the A.I. market. “Each of these Nvidia servers, they are power-hungry beasts,” de Vries said.

He started with a recent projection that Nvidia could ship 1.5 million of these servers by 2027 and multiplied that number by the servers’ electricity use: 6.5 kilowatts for Nvidia’s DGX A100 servers, for example, and 10.2 kilowatts for its DGX H100 servers. He noted several caveats: customers might use the servers at less than 100 percent capacity, which would lower electricity consumption, but server cooling and other infrastructure would push the total higher.
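The multiplication de Vries describes can be sketched as a back-of-envelope calculation. This uses only the figures given above (1.5 million servers, 6.5 and 10.2 kilowatts per server) and assumes, for simplicity, continuous operation at full power with no cooling overhead, which is exactly the caveat de Vries flags in both directions:

```python
# Back-of-envelope estimate of annual electricity use for projected
# Nvidia A.I. server shipments, following the approach described above.
# Assumption (ours, for illustration): servers run around the clock at
# 100 percent capacity, with cooling and other infrastructure excluded.

HOURS_PER_YEAR = 8760  # 24 hours x 365 days

def annual_twh(servers: int, kw_per_server: float) -> float:
    """Annual electricity consumption in terawatt-hours."""
    kwh = servers * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9  # kilowatt-hours -> terawatt-hours

low = annual_twh(1_500_000, 6.5)    # DGX A100 power draw
high = annual_twh(1_500_000, 10.2)  # DGX H100 power draw
print(f"{low:.0f}-{high:.0f} TWh per year")  # prints 85-134 TWh per year
```

For scale, roughly 85 to 134 terawatt-hours a year is on the order of the annual electricity consumption of a mid-sized country, which is the comparison in the headline.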
