
Musk: The next big model, Grok 3, will require 100,000 Nvidia H100 GPUs

wallstreetcn ·  Apr 10 11:28

Source: Wall Street News

Besides GPUs, the biggest obstacle to AI development is electricity. Musk predicts that the next big model, Grok 3, will require 100,000 Nvidia H100 GPUs, with a potential power draw equivalent to the electricity consumption of a small city.

Tesla (TSLA.US) CEO Elon Musk recently made bold predictions about the development of artificial general intelligence (AGI) in an interview.

In his opinion:

AGI is likely to surpass human intelligence within the next two years, but getting there will require a huge number of GPUs and a great deal of electricity.

According to Musk, his artificial intelligence company xAI is currently training the second-generation large language model Grok 2 and is expected to complete the next training phase in May. Training Grok 2 has already consumed about 20,000 Nvidia (NVDA.US) H100 GPUs. Developing the future advanced version, Grok 3, may require up to 100,000 Nvidia H100 GPUs.

Musk stated:

The development of AI technology currently faces two major constraints. The first is the short supply of high-end GPUs such as the Nvidia H100; acquiring 100,000 of them quickly is not easy. The second is the enormous demand for electricity. A single Nvidia H100 GPU consumes about 700 watts at full load, so 100,000 such GPUs would draw up to 70 megawatts. Once the servers and cooling systems are accounted for, the power consumption of a data center housing 100,000 Nvidia H100 processors would likely reach 100 megawatts, equivalent to the electricity consumption of a small city.
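The arithmetic behind these figures can be checked in a few lines. This is a back-of-the-envelope sketch, not an engineering estimate: the 700 W per GPU and 100,000-GPU figures come from the article, while the ~1.43x overhead factor for servers and cooling (a rough power usage effectiveness, or PUE, value) is our assumption chosen to match the quoted 100 MW total.

```python
H100_WATTS = 700      # full-load draw of one H100, per the article
GPU_COUNT = 100_000   # GPUs Musk estimates Grok 3 will need
PUE = 1.43            # assumed overhead factor for servers and cooling

# Power for the GPUs alone, converted from watts to megawatts
gpu_mw = H100_WATTS * GPU_COUNT / 1e6

# Total facility draw including server and cooling overhead
total_mw = gpu_mw * PUE

print(f"GPUs alone: {gpu_mw:.0f} MW")       # 70 MW
print(f"With overhead: {total_mw:.0f} MW")  # 100 MW
```

For scale, 100 MW of continuous draw is roughly 2.4 GWh per day, which is why the article compares it to a small city's electricity consumption.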

These two major constraints highlight the challenges of scaling AI technology to meet growing computing demands.

Nevertheless, advances in computing and storage technology will make it possible to train even larger language models in the next few years. The Blackwell B200 GPU platform unveiled by Nvidia at GTC 2024 is designed to support large language models scaling to trillions of parameters, marking a key step toward AGI.

Musk expects:

An artificial intelligence smarter than the smartest humans will emerge within the next one to two years. If you define AGI as being smarter than the smartest humans, I think it is likely to happen within the next year or two.

editor/tolk

