
Is electricity the endgame of AI? UK National Grid chief: data center power demand expected to rise roughly sixfold over the next 10 years

Zhitong Finance ·  Mar 26 21:53

New technologies, above all rapidly developing AI, will drive large-scale growth in electricity consumption.

The Zhitong Finance App learned that the head of National Grid Plc said on Tuesday local time that, because the boom in artificial intelligence requires a sharp expansion of data center computing capacity, electricity demand from UK data centers is expected to rise roughly sixfold over the next 10 years. This will add significant strain on the country's grid, which must carry large volumes of renewable power from remote Scottish wind farms to AI data centers around London. The grid is already under heavy pressure from the accelerating electrification of home heating, transport and industry.

“Future trends in foundational technologies such as artificial intelligence and quantum computing will mean a larger, energy-intensive computing infrastructure,” John Pettigrew, CEO of National Grid, said at a conference in Oxford on Tuesday.

Other European countries face similar electricity-demand outlooks and are struggling with how to finance the enormous cost of expanding generation. According to the latest forecast from the International Energy Agency (IEA), electricity demand from energy-intensive sectors such as data centers, artificial intelligence and cryptocurrency worldwide is likely to at least double over the next three years, which would be roughly equivalent to the total electricity demand of Germany, Europe's largest economy.

In the UK, National Grid is studying the addition of an ultra-high-voltage transmission network of up to 800 kV, about twice the capacity of the current network.

Pettigrew said the new grid system would enable bulk power transmission across the UK, connecting large-scale energy sources to centers of electricity demand. It would mean moving away from today's sprawling patchwork of fragmented individual connection projects, each requiring its own approvals and infrastructure, toward a small number of very high-capacity hubs.

The proposal comes on top of the previously announced UK grid investment plan of about £112 billion (roughly $142 billion) intended to keep the country on the path to net zero emissions.

“Projects like this are really expensive,” said UK Energy Minister Graham Stuart. “The biggest obstacle my department faces in this country is building a power grid.”

As human society enters the AI era, electricity consumption may surge, and securing electricity resources is becoming a top priority

ChatGPT took the world by storm in 2023, OpenAI's Sora text-to-video model arrived in 2024, and Nvidia, the “shovel seller” of the AI boom, has posted extraordinary results for four consecutive quarters. Together these suggest that human society is stepping into the AI era from 2024 onward, with electricity playing a central role as a core component of AI infrastructure.

Alex de Vries, a data scientist at the Dutch central bank, estimates that ChatGPT, OpenAI's chatbot, consumes more than 500,000 kilowatt-hours of electricity a day to handle about 200 million user requests, more than 17,000 times the daily electricity use of an average US household.
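As a rough sanity check, the sketch below works through the arithmetic these figures imply. The ~29 kWh/day figure for an average US household is an assumption used only for illustration, not a number from the article.

```python
# Back-of-envelope check of the ChatGPT electricity figures cited above.
# Assumption (not from the article): an average US household uses ~29 kWh per day.

daily_consumption_kwh = 500_000        # ChatGPT's estimated daily electricity use
daily_requests = 200_000_000           # estimated user requests per day
household_kwh_per_day = 29             # assumed average US household daily use

energy_per_request_wh = daily_consumption_kwh * 1_000 / daily_requests
household_multiple = daily_consumption_kwh / household_kwh_per_day

print(f"Energy per request: {energy_per_request_wh:.1f} Wh")                 # ~2.5 Wh
print(f"Household-days of electricity per day: {household_multiple:,.0f}")   # ~17,000
```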

Google's search engine also requires a great deal of electricity. According to de Vries' estimates, if Google used AIGC (generative artificial intelligence) in every search, its annual electricity consumption would rise to about 29 billion kilowatt-hours, more than the annual electricity consumption of many countries, including Kenya, Guatemala and Croatia.
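Purely as an illustration, the calculation below shows the per-search energy the 29 billion kWh projection would imply. The figure of roughly 9 billion Google searches per day is an assumption, not a number from the article.

```python
# Per-search energy implied by the projected 29 billion kWh per year.
# Assumption (not from the article): Google handles roughly 9 billion searches per day.

annual_kwh = 29_000_000_000            # projected annual consumption with AIGC in every search
searches_per_day = 9_000_000_000       # assumed daily search volume

wh_per_search = annual_kwh * 1_000 / (searches_per_day * 365)
print(f"Implied energy per AI-assisted search: {wh_per_search:.1f} Wh")  # ~8.8 Wh
```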

According to forecasts from the Uptime Institute, a data center standards organization, artificial intelligence's share of global data center electricity consumption is expected to jump from less than 2% today to more than 10% by 2025.

Data centers' share of US electricity consumption is expected to roughly triple by 2030, with their consumption rising from 126 terawatt-hours in 2022 to 390 terawatt-hours in 2030, according to Boston Consulting Group forecasts. The terawatt-hour is among the largest units used to measure electricity, typically reserved for national-level energy statistics and the planning and evaluation of large-scale energy projects; even a very large industrial facility, such as a major steel plant, may consume less than 10 terawatt-hours in a year.
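For a sense of scale, the short sketch below converts the Boston Consulting Group figures and computes the growth rate they imply; it only restates the numbers quoted above.

```python
# Scale and growth implied by the Boston Consulting Group forecast cited above.

twh_2022 = 126                  # US data center consumption in 2022, TWh
twh_2030 = 390                  # forecast for 2030, TWh
kwh_per_twh = 1_000_000_000     # 1 TWh = 1 billion kWh

growth_multiple = twh_2030 / twh_2022
annual_growth = growth_multiple ** (1 / (2030 - 2022)) - 1

print(f"2030 vs. 2022: {growth_multiple:.1f}x")                  # ~3.1x
print(f"Implied average annual growth: {annual_growth:.1%}")      # ~15%
print(f"390 TWh expressed in kWh: {twh_2030 * kwh_per_twh:,}")    # 390,000,000,000
```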

Musk predicts that within the next two years the AI industry will shift from being “short of silicon” to “short of electricity,” which could hold back the development of artificial intelligence. In his view, an electricity shortage could have consequences as serious as the chip shortage that has constrained the technology industry.


