National leaders and tech industry pioneers from the likes of Google, Meta and Microsoft – not to mention the ubiquitous multibillionaire Elon Musk – this week gathered in southern England for what's billed as a first-of-its-kind summit on the implications for humanity of artificial intelligence (AI).
AI is the shiny new toy of the 21st century, having moved from the realm of science fiction to being irrevocably entwined in our lives. As the 28 nations that signed the 'Bletchley Declaration' – named after the famous country house that nurtured some of the world's earliest computing breakthroughs – noted, AI has vast potential for good – and for "serious, even catastrophic, harm".
Those fears mostly focus on malign uses among AI's almost boundless applications. But as its use has boomed, concerns are growing that its ravenous power demands will leave green energy supply trailing in its wake, or consume renewable power needed to decarbonise national grids and other industries in the fight against climate change.
“I definitely think that AI is the ‘it technology’ right now,” said Sasha Luccioni, a researcher in ethical and sustainable AI, continuing that companies are “scrambling” to launch new products “as fast as possible” and use them in their business models.
A corporate AI arms race has seen the technology rolled out to improve our search engines, emails and even grocery shopping. Intel plans to use AI in “every product that we build,” an attitude that reflects the mood of the moment in Big Tech.
Deep Jariwala, a professor in electrical and systems engineering at the University of Pennsylvania, said that the energy demands of AI could “spiral out of control” if it is integrated into “every sphere of life and touches every person’s lives dozens of times a day” with no efforts to make it more sustainable.
There are predictions that by 2030 computing generally could use up to a fifth of electricity globally. By 2040, Jariwala says others think computing and communications “will gobble up most of the world's energy”.
Brookfield Renewable CEO Connor Teskey recently predicted that tech giants like Google could in the future need as much green power as the entire UK to meet their AI-driven energy demands.
This may be “somewhat of an overestimate,” said Jariwala, and using “precise timelines” is important. Google used 15TWh of electricity in 2020, while the UK uses around 300TWh annually.
But he added that the electricity demands of AI and computing will rise “much faster” than those of Western economies. Electricity usage in the US has stayed largely flat since 2010, while in the UK it has steadily fallen from a high of 400TWh in 2005.
Veil of secrecy
One difficulty with assessing the energy demands of AI is that its biggest users are secretive about sharing information.
“It is incredibly difficult to get any information,” said Luccioni. Especially in the last six months, she said companies have “really cracked down in terms of sharing any details about their AI models,” whether that be on the “data used, their size, or anything else.”
Several tech giants did not respond to requests for comment for this article. Google would only provide information on background. Intel was initially keen, but after reviewing questions, including on whether building AI into all its products is responsible, said it didn’t have “anyone available.”
While these companies release some information on their overall energy usage, it is difficult to disentangle how much is being burned by AI compared to other computational demands.
The OECD bemoaned this difficulty in a report on AI energy usage last November. It did, however, include a revelation from Google that 15% of its power needs over the last three years have gone to machine learning, a subfield of AI.
Based on Google's 15TWh of electricity usage in 2020, that would put AI’s share at 2.25TWh at the very least. Not quite up to UK levels, but only a little less than the electricity used in 2019 by Malta, a country of over half a million people.
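That back-of-envelope calculation can be sketched in a few lines of Python. The figures are simply those quoted above (Google's reported 2020 usage and the 15% machine-learning share), not independently verified:

```python
# Back-of-envelope check using the figures quoted in the article.
google_total_twh = 15.0   # Google's reported 2020 electricity use, in TWh
ml_share = 0.15           # share attributed to machine learning (per OECD report)
malta_2019_twh = 2.4      # rough order of magnitude for Malta's annual usage

ml_twh = google_total_twh * ml_share
print(f"Machine learning share: {ml_twh} TWh")  # 2.25 TWh
print(f"Comparable to Malta? {ml_twh / malta_2019_twh:.0%} of its annual use")
```

The 2.25TWh figure is a floor, since it counts only Google and only the machine-learning slice of its workload.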
Data centres key
Data centres – the physical manifestations of the internet, some so vast that staff use scooters to get around buildings larger than aircraft carriers – are where the vast majority of this energy is burned.
Benjamin Lee, another electrical systems expert at the University of Pennsylvania, says that hyperscale data centres (there are around 600 globally) typically use between 20MW and 40MW of power. The largest ones use over 100MW, enough to power 80,000 homes in the US.
A yet-to-be-published meta-analysis conducted by Lee and his colleagues of such data centres based on their sustainability reports suggests that their energy usage increased by around 25% annually between 2015-21.
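To illustrate what roughly 25% annual growth compounds to over that window, here is a short sketch; the starting value of 1.0 is an arbitrary index, since the meta-analysis itself is unpublished:

```python
# Illustrative only: compound ~25% annual growth over 2015-21,
# the window covered by Lee's (unpublished) meta-analysis.
energy_index = 1.0                 # arbitrary baseline for 2015
for year in range(2015, 2021):     # six year-on-year steps to 2021
    energy_index *= 1.25

print(round(energy_index, 2))  # 3.81 – nearly a fourfold rise in six years
```

In other words, steady 25% growth implies hyperscale energy use almost quadrupling over the period studied.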
In 2021, Amazon, Google, Microsoft and Meta’s data centres together used 72TWh of electricity, according to the International Energy Agency (IEA) – more than double their 2017 total.
Some studies have found that computational energy demands have “grown much more slowly” than expected, he said. This may be due to the rise of cloud computing, with companies closing their small individual data centres and instead renting space in the more efficient hyperscale centres run by tech giants.
However, Lee adds that this was a “one-time paradigm shift,” and it may be that the “low-hanging fruit in system efficiency have been picked.”
Francisco Mingorance, secretary general of trade group Cloud Infrastructure Services Providers in Europe, argues it is “misleading” to focus only on the energy consumption of data centres without considering their output.
The “increase in workloads” of such centres has, he said, “massively outstripped the increase in power consumed.”
Large data centres can be “easy targets” as they do use lots of energy, said Mingorance, who is also on the board of the Climate Neutral Data Centre Pact. “However, in most cases they are replacing scattered data rooms and on-premise data centres that collectively consume more power, less efficiently.”
This might be why, despite the energy use of hyperscale data centres rising, the total energy use of data centres has held relatively steady for the last decade, according to the IEA.
Big Tech is easily the biggest private buyer of renewable energy. It took home half of the record 31GW of corporate green energy procured in 2021, with Amazon alone making up a fifth of that.
But companies are being impeded from reaching their net zero goals due to limited green energy power options and prohibitive costs, corporate renewables body the RE100 found last year.
RE100 head Ollie Wilson tells Recharge there are concerns that governments are “woefully ill-prepared” for the increasing energy demands of AI, data centres and other technologies.
“This is not just an environmental problem, there is a major economic problem brewing for these manufacturing economies,” he said, adding that the only solution is to “dramatically scale up” the rollout of renewables.
Currently, there are “real concerns" that this rollout is "not keeping pace with demand," which he partly blames on “government foot-dragging”.
Torjus Bolkesjø, head of global energy market drivers at Norwegian renewables giant Statkraft, paints a different picture. Although energy demand for digitalisation has “increased substantially,” he said that, “for the first time ever,” its growth is being matched by that of green energy.
AI to help solve its own problem
While it is easy to get caught up in how much power AI uses, there is no doubt it also holds the key to making radical improvements to energy efficiency.
AI can adjust lighting and heating depending on how many people are in a room, optimise the use of rooftop solar when it’s sunny, and coordinate when to sell unused electricity back to the grid, said RE100’s Wilson.
When it comes to data centres, although they will have to adapt to the higher energy demands of AI, Mingorance said that AI-managed facilities could in the future be “hyper-efficient and thus radically reduce energy consumption required to execute workloads.”
Pennsylvania professor Jariwala adds that, while the computing demand of the global population “keeps rising at a brisk pace, there are many people like us, all over the world who are working hard to reduce the energy required per bit at every level in the computing technology stack.”