- The world’s data centres are on course to use more electricity than India by 2030; finding ways to head off that projected tripling of energy use is paramount if AI is to achieve its promise.
- For AI systems to get better, they will need more training — a stage that involves bombarding the software with data — and that’s going to run up against the limits of energy capacity.
The world’s data centres are on course to use more electricity than India, the world’s most populous country, by 2030, Arm Holdings Plc Chief Executive Officer Rene Haas said.
“AI’s voracious need for computing power is threatening to overwhelm energy sources, and finding ways to head off that projected tripling of energy use is paramount if artificial intelligence is going to achieve its promise,” he told Bloomberg.
The annual electricity report from the International Energy Agency (IEA) says data centres consumed 460 TWh in 2022, a figure that could rise to more than 1,000 TWh by 2026 in a worst-case scenario.
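As a rough back-of-the-envelope check (not a figure from the IEA report, which models scenarios rather than a single growth rate), moving from 460 TWh in 2022 to more than 1,000 TWh in 2026 would imply data centre electricity demand compounding at roughly 20 per cent a year:

```python
# Back-of-the-envelope sketch: the implied compound annual growth rate
# between the IEA's 2022 estimate and its 2026 worst-case figure.
# Assumes steady year-on-year growth, which the report itself does not claim.

consumption_2022_twh = 460     # IEA estimate for 2022
consumption_2026_twh = 1_000   # worst-case projection for 2026
years = 2026 - 2022

cagr = (consumption_2026_twh / consumption_2022_twh) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # roughly 21% per year
```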
According to the report, compute power and cooling are the two most energy-intensive processes within data centres. The rapid growth of artificial intelligence-related services over the past 12 months means providers have been investing in power-hungry GPUs.
The rate at which electricity usage will increase by 2026 depends on “the pace of deployment, range of efficiency improvements, as well as artificial intelligence and cryptocurrency trends,” the report said.
Haas joins a growing number of people raising alarms about the toll AI could take on the world’s infrastructure. But he also has an interest in the industry shifting more to Arm chip designs, which are gaining a bigger foothold in data centres. The company’s technology — already prevalent in smartphones — was developed to use energy more efficiently than traditional server chips.
More training needed
“We are still incredibly in the early days in terms of the capabilities. For AI systems to get better, they will need more training — a stage that involves bombarding the software with data — and that’s going to run up against the limits of energy capacity,” Haas said.
Arm, which began trading on the Nasdaq last year in 2023’s largest US initial public offering, sees AI and data centre computing as one of its biggest growth drivers.
AWS, Microsoft Corp. and Alphabet Inc. are using Arm’s technology as the basis of in-house chips that help run their server farms. As part of that shift, they’re decreasing reliance on off-the-shelf parts made by Intel Corp. and Advanced Micro Devices Inc.
By using more custom-built chips, companies can lessen bottlenecks and save energy, according to Haas. Such a strategy could reduce data centre power use by more than 15 per cent, he said.
“There needs to be broad breakthroughs,” he said. “Any piece of efficiency matters.”