Susan Rakov
Managing Director, Frontier Group; Senior Vice President, The Public Interest Network
We need to use a lot less energy to reach climate goals. Instead, we're using more. Here's how much crypto and AI are contributing.
America and the world have an unprecedented opportunity to replace fossil fuels with cleaner alternatives. Solar and wind power cost less than ever, sales of new polluting internal combustion engine vehicles could end within 20 years, sales of electric heat pumps are growing rapidly and new clean energy innovations are emerging all the time.
As renewable electricity grows, however, demand for electricity is poised to grow as well. As part of decarbonizing our economy, we will need to rely more heavily on electricity to power transportation and buildings. Producing enough clean electricity to meet all of that demand will be a challenge.
Over the past two decades, end-use energy consumption in the U.S. has been essentially flat, which has made it easier to deploy renewable energy to replace fossil fuel consumption. Limiting the growth of energy consumption going forward will be essential to a successful energy transition.
Computing is one sector in which energy demand is growing – in part because of the continued digitization of much of the economy, but also due to the emergence of new products and services such as cryptocurrency and artificial intelligence (AI).
Each additional use of electricity that we adopt brings with it the potential to make the clean energy transition harder. To overcome those challenges, we need to have a better idea of what future electricity demand from computing might look like – and make smart decisions about managing that demand.
Modern computing relies on powerful data centers located around the world. Data centers house data and IT infrastructure – both software and hardware. Data centers consumed 240-340 terawatt-hours of electricity in 2022, or 240-340 billion kilowatt-hours – about 1.0-1.3% of total global electricity use. That’s comparable to the electricity consumption of the entire United Kingdom.
In 2020, the information and communication technology sector as a whole, including data centers, networks and user devices, consumed about 915 TWh of electricity, or 4-6% of all electricity used in the world. That’s comparable to the energy consumption of 86 million typical American homes – about two-thirds of all U.S. households.
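As a rough sanity check on that comparison – a minimal sketch assuming an average U.S. household uses about 10,600 kWh of electricity per year, in line with recent EIA figures – the arithmetic works out as follows:

```python
# Back-of-the-envelope check: how many typical U.S. homes does 915 TWh equal?
# Assumption: an average U.S. household uses roughly 10,600 kWh of electricity per year.
ict_use_twh = 915                             # global ICT electricity use, 2020 (TWh)
kwh_per_home_per_year = 10_600                # assumed average household use (kWh/yr)

ict_use_kwh = ict_use_twh * 1e9               # 1 TWh = 1 billion kWh
homes = ict_use_kwh / kwh_per_home_per_year
print(f"about {homes / 1e6:.0f} million homes")   # ≈ 86 million homes
```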
The electricity demand of the IT sector may reach 3,200 TWh by 2030. That's more than the amount of electricity experts estimate it would take to power every electric vehicle in the world in 2050, assuming automakers sell only EVs from 2040 onward.
Crypto and AI increase energy demand by making energy-intensive computing activities more commonplace.
Cryptocurrencies are virtual payment systems powered by encryption algorithms. Cryptocurrency wallets store encrypted “keys” that link users to their currency. Crypto uses blockchain technology, a method of recording and verifying transactions that makes data unchangeable over time.
The electricity demand of cryptocurrency activity is large and growing.
Verifying a single Bitcoin transaction takes about 775.23 kWh of electricity – comparable to the electricity demand of an American home over 27 days. Bitcoin users make hundreds of thousands of these transactions every day.
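The same household assumption used above (roughly 10,600 kWh per year, or about 29 kWh per day) makes the per-transaction comparison easy to check:

```python
# Rough check: how many days of typical U.S. household electricity use
# does one Bitcoin transaction (about 775 kWh) represent?
# Assumption: ~10,600 kWh per household per year.
kwh_per_transaction = 775.23
kwh_per_home_per_day = 10_600 / 365                         # ≈ 29 kWh/day
print(round(kwh_per_transaction / kwh_per_home_per_day))    # ≈ 27 days
```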
Some cryptocurrencies use more or less efficient algorithms to mine currency and verify transactions; Bitcoin, the most popular cryptocurrency, uses “proof of work” verification, a more energy-intensive option than the alternative, “proof of stake” verification.
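The energy intensity of proof of work comes from brute-force guessing: miners hash slight variations of a candidate block over and over until one hash happens to fall below a difficulty target, and every failed guess is discarded work. Here is a minimal, purely illustrative sketch of that loop (not Bitcoin's actual implementation, which hashes 80-byte block headers with double SHA-256):

```python
import hashlib

def mine(block_data: str, difficulty_bits: int):
    """Toy proof-of-work loop: find a nonce whose SHA-256 hash has
    `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)      # valid hashes must fall below this value
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest               # the nonce is the "proof" that work was done
        nonce += 1                             # every failed guess is discarded computation

# Each extra bit of difficulty roughly doubles the expected number of guesses,
# and therefore the electricity burned per block.
nonce, digest = mine("example block", difficulty_bits=16)
print(nonce, digest)
```

Proof of stake replaces this guessing race with a process that selects validators based on the currency they put up as collateral, which is why it requires only a small fraction of the energy.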
Globally, cryptocurrency mining consumed 110 TWh of electricity in 2022. This year, Bitcoin alone may consume 125 TWh[1] of electricity – more than Argentina, a nation of 46 million people, uses in a year.
Artificial intelligence models “learn” information, patterns and associations from source data. They then use that training to respond to inputs.
Generative AI, which many users associate with the popular chatbot ChatGPT, produces complex data – like sentences and images – based on what it has “learned” from the large collections of data developers used to train it.
In the case of ChatGPT, training data included books, articles, web pages and all of Wikipedia. In total, it took about 300 billion words to train ChatGPT.
AI models are built out of parameters. Parameters are all the values that the model can change as it learns and applies knowledge. Think of playing a game of 20 questions: Each question is a parameter; every time you ask one, your understanding of the thing gets a little clearer, a little more precise. And every time you ask a question, you have to remember one more answer.
The more parameters, the larger and more sophisticated the model, the more precise and complex the output, and the more energy it takes. In short, better AI consumes more energy.
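To make "parameter" a bit more concrete, here is a small illustrative sketch (with made-up layer sizes) that counts the adjustable weights and biases in a tiny fully connected network; the same bookkeeping, scaled up billions of times, is what makes large models so computationally hungry:

```python
# Illustrative only: counting the parameters of a tiny fully connected network.
# Every weight and bias is one value the model can adjust as it "learns".
layer_sizes = [512, 256, 128, 10]        # hypothetical layer widths

total_params = 0
for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    weights = n_in * n_out               # one weight per input-output connection
    biases = n_out                       # one bias per output unit
    total_params += weights + biases

print(f"{total_params:,} parameters")    # 165,514 parameters
```

Every one of those values has to be stored, moved and multiplied each time the model runs, so parameter counts translate fairly directly into energy use.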
Some AI applications are accomplishing things we’ve never been able to do before. For instance, AI can easily classify and filter vast quantities of unstructured data. Other AI applications are simply adding speed or ease to tasks we can already complete. Using ChatGPT to find information or write emails for you falls into this category.
In most cases, it takes more energy to use AI than it does to query a traditional search engine or draft an email using your own wit. For instance, every ChatGPT query consumes four to five times as much energy as a query on a traditional search engine.
Machine learning models have been around for years, but they are now larger, more complex and more easily embedded into computing activities of all sorts than ever before. As a result, AI can perform more complex tasks and improve the functionality of other systems – but to achieve that sophisticated performance, it must use many more parameters than in the past.
Deep Blue – the algorithm that famously defeated a human chess champion in 1997 – had 8,150 parameters. GPT-3, released in 2020, has about 175 billion. Google’s Pathways Language Model, launched in 2023, has 540 billion.
Compared to something like Bitcoin, the energy use of training one AI model is relatively small; the electricity it took for developers to train GPT-3 could only have powered about 70 American homes for one year. However, there is a lot more to AI than just training chatbots, and in the years to come AI will likely become even more widespread than it is now. ChatGPT alone may not cause a significant shift in electricity demand, but millions of ChatGPT-like models running everything from search engines to appliances to industrial machines certainly will.
We already know of a few ways to make AI more energy efficient, and some developers are working to discover more. But unless we’re intentional about which applications we adopt and make those we choose as energy efficient as possible, AI will only get more energy-intensive as it advances.
To decarbonize, we need to manage our demand for energy even as we replace dirty sources of energy with clean ones. But global demand for energy is rising – for computing and other purposes. Before jumping to the question of how to meet that demand, it’s important to first ask whether we can make our uses of energy more efficient. Ultimately, we will also need to have a society-wide conversation about what’s worth using energy on at all.
[1] The Cambridge Bitcoin Electricity Consumption Index estimates annualized Bitcoin energy use based on current power demand. Figure used is as of 10/02/2023. https://ccaf.io/cbnsi/cbeci.
Susan directs Frontier Group, the research and policy development center for The Public Interest Network. Frontier Group’s work informs the public discussion about degradations to the environment and public health, threats to consumer rights and democracy, and the available routes to a better future. Susan lives in Santa Barbara, California; she has two children, a husband, and a dog, and is an amateur singer/songwriter.