
Opinion | Nvidia's biggest risk is Nvidia itself?

wallstreetcn ·  Mar 23 17:38

Nvidia benefits from strong market demand for GPUs, but its high prices also drive up AI companies' costs, hampering the rollout of downstream applications — how much longer can Nvidia's growth story run?

High costs are becoming the “number one anxiety” for AI companies.

Recently, Citi analyst Yitchuin Wong and colleagues attended the Gartner Data & Analytics Summit in Orlando. In its post-conference summary report, Citi noted that AI still faces many challenges, above all high costs, and that many enterprise customers are beginning to refocus on ROI (return on investment).

Why is it so expensive? That brings us to the "king of AI chips" — NVIDIA (NVDA).

Amid the boom in AI, the graphics processing units (GPUs) Nvidia produces can meet the enormous computing demands of AI training. They have long been in short supply, and surging chip prices have repeatedly pushed the company's stock to record highs.

Nvidia's upstream hardware suppliers have also benefited from the frenzied trading that bets on the "AI faith." According to media reports, Micron Technology, one of Nvidia's main suppliers and a manufacturer of high-bandwidth memory (HBM) chips, has already sold out its supply through 2025.

In a media interview, Micron put it bluntly:

"I have never in history seen memory allocated 18 months in advance. This is entirely driven by Nvidia and the many other companies in this AI game."

High costs hamper the rollout of downstream applications

An H100 GPU already cost more than $30,000 in 2022. By last year, H100 prices on eBay had soared past $40,000, and its average selling price was more than four times that of AMD's competing MI300X.

And GPU deployment is far from over. Zuckerberg said earlier this year that Meta will invest heavily in GPUs, acquiring up to 350,000 Nvidia "Hopper" H100s along with other hardware, to reach "compute equivalent to nearly 600,000 H100s" by year's end.

On one hand, technological demand has produced a persistent chip shortage; on the other, GPU prices remain steep. Companies are being forced to weigh whether AI can actually be commercialized and monetized. Will market demand for GPUs stay this strong?

According to Citi, enthusiasm for generative AI is cooling, and it is still "too early" for large-scale projects to land.

The report said that although generative AI remains the focus for most executives in attendance, actual projects are still small in scale and narrow in use case (e.g., text and image generation), rather than large-scale transformative change.

According to media reports, Amazon Web Services and other generative AI providers have lowered sales expectations, and Cohere, one of OpenAI's main competitors, posted minimal revenue last year. The company said that during its fundraising process, customers proved cautious about costs and were still evaluating what the technology can do.

Citi noted that, according to IT research and consulting firm Gartner, about one-third of projects will fail at the proof-of-concept (POC) stage because they were launched prematurely.

Narasimhan, an executive at Micron Technology, and Mukesh Khare, general manager of IBM's semiconductor research division, both said the cost of AI far exceeds that of traditional computing, and only by bringing costs down first can it attract more enterprise customers.

Micron Technology said:

"Currently, the costs involved in large language models are quite high — perhaps acceptable for large enterprises, but not for the general public."

"Frankly, I think investment is changing today... I don't want to use the word hype... Too many people are excited about it, and if you have the budget, you're probably prioritizing investment in generative AI servers over anything else."

As for when costs will drop, Micron called it "a billion-dollar question," but added: "It will drop; that competition is inevitable."

In addition, Citi pointed out that enterprise customers are increasingly concerned about data governance. To improve data and AI literacy, data governance and quality will become a higher priority for continued enterprise investment.

Editor/Somer
