

Who is the biggest buyer of NVIDIA's AI chips? This tech giant tops the list, far ahead of its peers.

cls.cn ·  16:39

① Microsoft has become the largest buyer of NVIDIA's Hopper chips, purchasing 485,000 units this year, far more than Meta's 224,000; ② Jensen Huang said demand for Hopper chips remains strong, with the company's third-quarter revenue up 94% year-on-year and net income up 109%.

The latest data from market research firm Omdia shows that $Microsoft (MSFT.US)$ has become the largest buyer of $NVIDIA (NVDA.US)$'s flagship Hopper chips, far outpacing its peers in the technology sector.

Omdia's analysts estimate that Microsoft purchased 485,000 Hopper chips this year. The second-largest American customer, $Meta Platforms (META.US)$, purchased 224,000 units, less than half of Microsoft's total.

Omdia says its calculations are based on companies' disclosed capital expenditures, server shipments, supply-chain intelligence, and other factors. According to Omdia, ByteDance and Tencent each ordered approximately 230,000 NVIDIA chips this year, slightly more than Meta.

Although $Amazon (AMZN.US)$ and $Alphabet-C (GOOG.US)$ are trying to deploy their own custom alternatives, the two still purchased 196,000 and 169,000 Hopper units, respectively. The data also shows that $Tesla (TSLA.US)$ and xAI, both under Musk's management, together purchased slightly more chips than Amazon.

Last month, NVIDIA CEO Jensen Huang said on the company's earnings call that although the next-generation Blackwell chips are set to start shipping this quarter, the current Hopper chips remain very popular, driven by foundation-model developers' heavy use of them for pre-training, post-training, and inference.

Since the debut of the ChatGPT chatbot two years ago, major technology companies have poured billions of dollars into AI infrastructure, setting off an unprecedented investment boom that has made NVIDIA's AI chips one of the hottest commodities in Silicon Valley.

Jensen Huang has said repeatedly, "The demand for NVIDIA products is very strong; everyone wants to be the first to receive the goods, and everyone wants to receive the most." NVIDIA's third-quarter results, announced last month, showed revenue up 94% year-on-year and net income up 109%.

Compared to other technology companies, Microsoft is arguably the most aggressive in building out infrastructure, because it needs data centers not only to run its own AI services (such as Copilot) but also to rent out computing power to cloud customers through its Azure division.

Omdia estimates that Microsoft's NVIDIA chip purchases in 2024 will be three times its 2023 total. Microsoft Azure executive Alistair Speirs told the media, "Good data center infrastructure is a very complex, capital-intensive project that requires years of planning."

Speirs added, "Therefore, it is important to forecast our growth while leaving some room." Omdia estimates that global spending by technology companies on servers will reach $229 billion in 2024, with Microsoft accounting for $31 billion and Amazon for $26 billion.

Vlad Galabov, head of cloud and data center research at Omdia, said that about 43% of server spending in 2024 will go to NVIDIA: "NVIDIA GPUs have a very high share, but we expect this may be close to its peak."

For one thing, NVIDIA's main competitor in the GPU field, AMD (Advanced Micro Devices), is making progress. Omdia reports that Meta purchased 173,000 AMD MI300 chips this year, while Microsoft bought 96,000.

At the same time, large technology companies have stepped up their use of in-house chips. Google has been developing its Tensor Processing Units (TPUs) for ten years, and Meta has launched its own "MTIA" chip, with each deploying about 1.5 million units.

Amazon is also investing in its Trainium and Inferentia processors, deploying about 1.3 million such chips this year. Earlier this month, Amazon announced plans to use its latest Trainium chips to build a new cluster for its partner, Anthropic.

By comparison, Microsoft has been relatively conservative with its first in-house chip, "Maia," deploying only about 200,000 units this year.

Editor/rice


