
The global AI market is exploding, and with it a new resource is becoming critical: electricity. Louis-David Benyayer, scientific co-director of the MSc in Big Data and Business Analytics at ESCP Business School, explores why the battle over AI will probably also be a battle over energy, one that may be even fiercer because of the current global situation.
- AI’s ability to manage and assess massive and complex data sets means it will play a significant role in addressing climate change.
- The AI market is growing rapidly: valued at $93.5 billion in 2021, it is projected to grow at a compound annual rate of nearly 40% from 2022 to 2030, and as it grows, so does its energy consumption.
- Given the importance of energy to developing AI-based processes, could global giants price everyone else out of the market?
The global artificial intelligence (AI) market is projected by Precedence Research to post a CAGR (compound annual growth rate) of 38.1% and reach close to $1.6 trillion by 2030. “The current wave of growth in the AI industry is as much about the abundant availability of big data as it is about software and hardware,” according to Statista’s In-depth Artificial Intelligence Report 2021.
“The amount of big data being generated by today’s increasingly digitised economy is growing at a rate of 40% each year and is expected to reach 163 trillion gigabytes by 2025. This growth in big data is driving the improvement of AI algorithms.”
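A quick back-of-the-envelope check shows how the two Precedence Research figures above fit together. The numbers are the ones quoted in this article; the calculation is only a rough compounding sketch, not a reproduction of the research firm’s methodology.

```python
# Rough check: $93.5 billion in 2021 compounding at a 38.1% CAGR over 2022-2030.
base_2021_usd_bn = 93.5      # market size in 2021, in billions of US dollars
cagr = 0.381                 # compound annual growth rate cited above
years = 9                    # 2022 through 2030 inclusive

projected_2030_usd_bn = base_2021_usd_bn * (1 + cagr) ** years
print(f"Projected 2030 market: ~${projected_2030_usd_bn / 1000:.1f} trillion")
# -> ~$1.7 trillion, the same order of magnitude as the ~$1.6 trillion projection cited above
```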
The power consumption of the data and AI value chain is also growing: in 2021, Bitcoin’s electricity consumption equaled that of Argentina, and, according to AlgorithmWatch’s very interesting report on sustainable AI, data centres are responsible for 20% of all electricity consumption in Frankfurt, Germany’s financial centre.
But neural networks (i.e. artificial neural networks, the computing systems inspired by the biological neural networks of the human brain and used by most modern AI algorithms) are particularly power-hungry: “The general trend in AI is going in the wrong direction for power consumption,” warns Kate Saenko, associate professor of computer science at Boston University. “AI is getting more expensive in terms of power to train the newer models.”
In a nutshell, the power consumption of neural networks comes from two main sources: the computing used to train the model and the computing used to run inference on new data. “AI is more computationally intensive because it needs to read through lots of data until it learns to understand it – that is, is trained,” adds Kate Saenko. A lot more data than the human brain needs… According to researchers associated with AI research and deployment company OpenAI, the computing used to train the largest models increases roughly tenfold each year.
“A language processing model might be able to understand 95% of what people say, but wouldn’t it be great if it could handle exotic words that hardly anyone uses? More importantly, your autonomous vehicle must be able to stop in dangerous conditions that rarely ever arise,” The Register explains.
Source: OpenAI
Neural networks require a lot of data to be trained, which means a lot of electricity to store that data and to build and use the model. According to OpenAI, the computing power used to train the largest models grew 300,000-fold between 2012 and 2018 and now doubles every 3.4 months. That is so much power that some predict machine learning is on track to consume all the energy that can be supplied.
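As a rough consistency check of those figures (a sketch only, using nothing beyond the 3.4-month doubling time quoted above), such a doubling rate implies roughly an order-of-magnitude increase per year and reaches a 300,000-fold increase in a little over five years:

```python
import math

doubling_time_months = 3.4                      # doubling time quoted above (OpenAI)

annual_factor = 2 ** (12 / doubling_time_months)
print(f"Growth per year: ~{annual_factor:.1f}x")            # ~11.5x, i.e. roughly tenfold

years_to_300k = math.log2(300_000) * doubling_time_months / 12
print(f"Years to grow 300,000-fold: ~{years_to_300k:.1f}")  # ~5.2 years
```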
“Exact figures on how much this all costs are scarce,” The Economist explains. “But a paper published in 2019 by researchers at the University of Massachusetts Amherst estimated that training one version of ‘Transformer’, another big language model, could cost as much as $3 million. Jerome Pesenti, Facebook’s head of AI, says that one round of training for the biggest models can cost ‘millions of dollars’ in electricity consumption.”
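To see how a single training run can end up costing millions in electricity, here is a purely illustrative back-of-the-envelope sketch. Every number in it (cluster power draw, run duration, data-centre overhead, electricity price) is an assumption made for the sake of the example, not a figure from this article or the studies cited.

```python
# Hypothetical estimate of the electricity bill for one large training run.
# All parameters below are illustrative assumptions and vary enormously in practice.
cluster_power_kw = 5_000       # assumed average power draw of the training cluster (kW)
training_days = 30             # assumed wall-clock duration of the run
pue = 1.5                      # assumed power usage effectiveness (cooling, networking, ...)
price_per_kwh_usd = 0.10       # assumed electricity price (USD per kWh)

energy_kwh = cluster_power_kw * 24 * training_days * pue
cost_usd = energy_kwh * price_per_kwh_usd
print(f"Energy: {energy_kwh:,.0f} kWh, electricity cost: ~${cost_usd:,.0f}")
# With these assumptions: 5,400,000 kWh and roughly $540,000 for a single run;
# repeated runs, or a larger cluster, quickly push the bill into the millions.
```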
As this consumption explodes, the ability to pay the energy bill and to secure the most efficient sources of electricity becomes critical for the heaviest users of neural networks.
This topic is not new: Big Tech companies started investing years ago in their own electricity production facilities (“at the end of 2018, the Giant Five owned or had contracted for […] more electric generation capacity than the Los Angeles Department of Water and Power”) and in technologies to reduce electricity costs and consumption in their data centres. These investments can be interpreted from several angles:
- First, as electricity costs grow, optimising them becomes strategic.
- Second, access to a cheaper or more efficient energy solution than a direct competitor’s is a source of competitive advantage.
- Third, developing these solutions and capabilities widens the gap with non-digital companies, which have no choice but to partner with Big Tech firms on their terms to access AI capabilities.
A cynical observer would even conclude that promoting the use of neural networks is a way for Big Tech firms to make non-digital companies dependent on a critical resource that Big Tech masters far better than they do.
The stakes may be more than just economic or environmental; they could be political too: “You have to buy the energy to train these models, and the only people that can realistically afford that will be Google and Microsoft and the 100 biggest corporations,” posits Lukas Biewald, CEO and co-founder of Weights and Biases, a machine learning platform for developers to build better models faster.
In AI, the battle over data, talent and chips is already fierce. A new front has opened with the battle over electricity, with several data centre projects already paused as a result of the current situation. In this battle, the companies that dominated non-digital markets before the arrival of Big Tech have some valuable cards to play: they have been implementing strategies for resource control for decades.