
With AI forcing data centers to consume more energy, software that hunts for clean electricity across the globe gains currency

Tech giants are racing to ward off a carbon time bomb caused by the massive data centers they’re building around the world.

A technique pioneered by Google is gaining currency as more power-hungry artificial intelligence comes online: Using software to hunt for clean electricity in parts of the world with excess sun and wind on the grid, then ramping up data center operations there. Doing so could cut carbon and costs.

There’s an urgent need to figure out how to run data centers in ways that maximize renewable energy usage, said Chris Noble, co-founder and chief executive officer of Cirrus Nexus, a cloud-computing manager tapping data centers owned by Google, Microsoft and Amazon.

The climate risks sparked by AI-driven computing are far-reaching — and will worsen without a big shift from fossil fuel-based electricity to clean power. Nvidia Corp. Chief Executive Officer Jensen Huang has said AI has hit a “tipping point.” He has also said that the cost of data centers will double within five years to power the rise of new software.

Already, data centers and transmission networks each account for up to 1.5% of global electricity consumption, according to the International Energy Agency. Together, they’re responsible for emitting about as much carbon dioxide as Brazil annually.

Hyperscalers — as the biggest data center owners like Google, Microsoft and Amazon are known — have all set climate goals and are facing internal and external pressure to deliver on them. Those lofty targets include decarbonizing their operations.

But the rise of AI is already wreaking havoc on those goals. Graphics processing units, which have been key to the rise of large language models, use more electricity than the central processing units that power other forms of computing. Training a single AI model consumes more electricity than 100 households use in a year, according to IEA estimates.

“The growth in AI is far outstripping the ability to produce clean power for it,” Noble said.

Moreover, AI’s energy consumption is volatile, more akin to a sawtooth graph than the smooth line most data center operators are used to. That makes decarbonization a challenge, to say nothing of ensuring grid stability.

AI’s growth is being driven by North American companies, keeping computing power — and energy usage — concentrated there, said Dave Sterlace, account director for global data centers at Hitachi Energy. That’s a trend he didn’t expect two years ago.

To lower data center CO2 emissions, hyperscalers and other data center providers have financed massive amounts of solar and wind capacity and used credits to offset emissions. (In the case of credits, some have failed to have a meaningful impact on emissions.)

But that alone won’t be enough, especially as AI use ticks up. That’s why operators are turning to the strategy employed by Alphabet Inc. unit Google called load shifting. The idea: Lower emissions by upending the way data centers function.

Today, most data centers seek to operate in a “steady state,” such that their energy consumption is fairly stable. That leaves them at the mercy of the grid they’re connected to and whatever that day’s mix of natural gas, nuclear and renewable generation happens to be, given the scarcity of transmission lines between regions. To break their reliance on dirtier grids, tech giants are looking for opportunities to shift daily or even hourly data center operations around the world in an effort to soak up excess renewable energy production.
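
For a concrete picture of the idea, here is a minimal sketch of carbon-aware region selection, written in Python. It is not Google’s or any other company’s actual scheduler; the region names, intensity figures and capacity numbers are invented for illustration.

# A minimal sketch of carbon-aware load shifting. Not any company's real
# scheduler; regions and numbers below are purely illustrative.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    carbon_intensity_g_per_kwh: float  # grid CO2 intensity right now
    spare_capacity_mw: float           # headroom available for flexible jobs

def pick_region(regions, required_mw):
    """Choose the region with the cleanest grid that can still host the job."""
    eligible = [r for r in regions if r.spare_capacity_mw >= required_mw]
    if not eligible:
        return None  # nowhere can take the load; keep it where it is
    return min(eligible, key=lambda r: r.carbon_intensity_g_per_kwh)

regions = [
    Region("netherlands", 120.0, 40.0),  # sunny afternoon, lots of solar
    Region("california", 300.0, 25.0),   # sun not yet up on the West Coast
    Region("virginia", 450.0, 60.0),     # gas-heavy grid at this hour
]
best = pick_region(regions, required_mw=10.0)
print(best.name if best else "hold load in place")

A production scheduler would also have to weigh latency, data-residency rules and the cost of moving data between regions, which is part of why the approach remains hard in practice.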

Google launched the first effort to match its power usage at certain data centers with zero-carbon power on an hourly basis in a bid to get its machines running on clean energy 24/7. No one has fully achieved that goal yet. And, to be sure, the strategy of shifting loads around the world might be complicated by countries pushing for data sovereignty policies that attempt to restrict and safeguard the flow of data across borders. But what Cirrus Nexus and Google are testing could still be a critical piece of the puzzle for cutting emissions.

Manhattan-based Cirrus Nexus scours the world’s power grids and measures emissions in five-minute increments to find the least polluting computing resources for itself and its clients in industries that range from pharmaceuticals to accounting. The company had a chance to put that search into practice last summer.

The Netherlands was in the midst of its sunniest June on record, causing the cost of solar power on the grid to drop. That made it cheaper and less carbon-intensive to run servers. Cirrus Nexus then shifted its computing load to California once the sun went down in the Netherlands, allowing it to draw on solar power just coming online for the day in the Golden State.

By chasing the sun from Europe to the US West Coast and back again, the company was able to slash computing emissions for certain workloads, for itself and its clients, by 34% compared with relying on servers in either location alone, according to company data shared with Bloomberg Green. Making operations flexible enough to do that comes with both benefits and risks.
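
To see how chasing the sun can cut emissions on paper, consider the toy calculation below. The diurnal carbon-intensity curves, workload size and resulting percentage are made up for illustration and are not Cirrus Nexus’s data; the point is simply that always running in whichever region is cleanest at a given hour lowers the total.

import math

# Toy diurnal carbon-intensity curves (gCO2/kWh), purely illustrative:
# the grid gets cleaner near local solar noon as solar output peaks.
def intensity(region, hour_utc):
    solar_noon = {"netherlands": 12, "california": 20}[region]  # rough UTC values
    solar = max(0.0, math.cos((hour_utc - solar_noon) / 24 * 2 * math.pi))
    return 420 - 300 * solar

LOAD_KWH_PER_HOUR = 500  # hypothetical flexible workload

# Baseline: run everything in one place all day.
fixed = sum(intensity("netherlands", h) * LOAD_KWH_PER_HOUR for h in range(24))

# Sun-chasing: each hour, run in whichever region is cleaner.
shifted = sum(
    min(intensity("netherlands", h), intensity("california", h)) * LOAD_KWH_PER_HOUR
    for h in range(24)
)

print(f"fixed-site emissions:  {fixed / 1000:.0f} kg CO2")
print(f"sun-chasing emissions: {shifted / 1000:.0f} kg CO2")
print(f"reduction:             {100 * (1 - shifted / fixed):.0f}%")

In reality, the hourly decision also has to respect data-sovereignty rules and agreements with grid operators, as the article notes.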

Being able to pursue spare zero-carbon megawatts can help reduce stress on grids, such as during a heat wave or frigid winter storm. But data centers need to cooperate with utilities and grid operators because big swings in demand can throw electric systems into disarray, boosting the odds of blackouts. Dominion Energy, which is seeing data center demand soar at its Virginia utility, is working on a program to harness load shifting at data centers to ease stress on the grid during extreme weather.

In recent years, Google and Amazon have tested shifting data center use for their own operations and for clients that use their cloud services. (Cirrus Nexus, for instance, uses cloud services offered by Amazon, Microsoft and Google.) In Virginia, Microsoft inked a deal with Constellation Energy Corp. that guarantees more than 90% of the power for its data center in the area will be zero-carbon energy. Reaching 100%, though, remains a formidable goal for it and other hyperscalers.

Google’s data centers run on carbon-free energy about 64% of the time globally, with 13 of its regional sites reaching 85% and seven topping 90%, said Michael Terrell, who leads Google’s 24/7 carbon-free energy strategy.

“But if you’re not displacing fossil assets, then you’re not completely achieving your climate goals,” said Terrell.
