
"Big GPU buyer" Microsoft: nearly 500,000 GPUs purchased this year, more than double Meta's total

wallstreetcn ·  15:27

According to Omdia's analysis, Microsoft purchased an estimated 485,000 NVIDIA Hopper chips this year, significantly outpacing Meta (224,000), ByteDance (230,000), Tencent (230,000), Amazon (196,000), Google (169,000), and other buyers.

Microsoft is currently on a chip-buying spree...

On December 17, the Financial Times reported that Microsoft bought far more NVIDIA AI chips this year than any of its competitors as it accelerates investment in AI infrastructure. Microsoft's NVIDIA chip orders this year are more than three times the number of same-generation NVIDIA AI processors it purchased in 2023.

By analyzing publicly disclosed capital expenditure, server shipment volumes, and supply chain information, analysts at technology research firm Omdia estimated that Microsoft purchased 485,000 NVIDIA Hopper chips this year, far ahead of Meta (224,000), ByteDance (230,000), Tencent (230,000), Amazon (196,000), Google parent Alphabet (169,000), and other companies.

Analysts believe that, with NVIDIA GPUs in short supply over the past two years, Microsoft's chip stockpile has given it an edge in the race to build the next generation of AI systems.

Tech giants spent hundreds of billions of dollars on data centers this year. Omdia estimates that global tech companies will spend about $229 billion on servers in 2024, with Microsoft's capital expenditure at $31 billion and Amazon's at $26 billion, and that the world's top ten data center infrastructure buyers, now including xAI and CoreWeave, will account for 60% of global computing-power investment.

As OpenAI's largest investor, Microsoft has been the most aggressive in building out data center infrastructure, both to run its own AI services (such as the Copilot assistant) and to rent out capacity to customers through its Azure division.

Microsoft's Azure cloud infrastructure is currently being used to train OpenAI's latest models, and the company is competing with Google, startups such as xAI and Anthropic, and rivals outside the US for dominance in next-generation computing.

Alistair Speirs, senior director of Azure global infrastructure at Microsoft, told the Financial Times:

"A good Datacenter infrastructure is a very complex and capital-intensive project, requiring years of planning, so it is important to forecast our growth and allow for some flexibility."

Vlad Galabov, Omdia's director of cloud and data center research, also pointed out that about 43% of server spending in 2024 will go to NVIDIA chips, but tech giants have stepped up deployment of their own AI chips this year to reduce their reliance on NVIDIA; Google and Meta, for example, deployed about 1.5 million of their own chips this year.

Microsoft, however, is still at an early stage on this front, having installed only about 200,000 of its self-developed Maia chips this year.

Speirs said that Microsoft currently relies mainly on NVIDIA's chips, but that the company needs to invest heavily in its own technology to offer customers "unique" services:

"When building AI infrastructure, based on our experience, it's not just about having the best chips; it also requires the right storage components, the right infrastructure, the right Software layer, the right host management layer, error correction, and all other components to build this system."

Editor/ping
