Source: Cailian Press
Author: Zhou Ziyi
① Meta, Microsoft and other companies said at an AMD investor event that they will use AMD's latest artificial intelligence chip, the Instinct MI300X; ② Interestingly, Meta and Microsoft are also this year's two biggest buyers of Nvidia's H100 GPU; ③ Many technology companies are currently looking for alternatives to Nvidia GPUs to reduce costs.
$Meta Platforms (META.US)$, $Microsoft (MSFT.US)$, and OpenAI said at an $Advanced Micro Devices (AMD.US)$ investor event on Wednesday (12/6) that they will use AMD's latest artificial intelligence (AI) chip, the Instinct MI300X.
Graphics processing units (GPUs) are critical to building and deploying artificial intelligence programs such as OpenAI's ChatGPT. At a time when $NVIDIA (NVDA.US)$ GPUs dominate the artificial intelligence market, many technology companies are actively looking for alternatives to reduce costs. The latest statements from Meta, Microsoft, and other companies are a clear example of this trend.
The first product AMD launched at the event was the Instinct MI300X accelerator platform, which combines eight MI300X GPUs and provides up to 1.5TB of HBM3 memory.
Compared with Nvidia's H100 HGX, AMD says the Instinct MI300X platform delivers significantly higher throughput and lower latency when running large language model inference, and outperforms it across a range of AI and HPC workloads.
Analysts believe that if AMD's latest high-end chip performs well enough to meet the needs of the technology companies and cloud service providers building AI models when it begins shipping early next year, it will inevitably put competitive pressure on Nvidia's soaring AI chip sales.
AMD CEO Lisa Su also said on Wednesday, "All the interest from the outside world is focused on large processors and large GPUs for cloud computing."
Challenges facing AMD
The main question facing AMD now is whether companies that have built their AI computing on Nvidia chips will invest the time and money to add another GPU supplier. Su acknowledged that AMD still has work to do to win them over.
Price also matters. AMD did not reveal MI300X pricing at the event, but Su told reporters that AMD's chips would have to cost less than Nvidia's to convince customers to buy them. Nvidia's GPU chips reportedly sell for about 40,000 US dollars each.
On the software side, AMD told investors and partners that it has improved its ROCm software suite to address a key shortcoming, which has been one of the main reasons AI developers currently prefer Nvidia.
AMD announced the latest version of its open-source ROCm 6 software platform at the event, which brings significant improvements in AI acceleration performance.
In addition, AMD revealed that it has signed agreements with some of the companies with the greatest GPU demand to use the chip.
Many companies turned out in support at the AMD investor event, including Microsoft, Meta, OpenAI, and $Oracle (ORCL.US)$.
Meta said at the conference that it will use the MI300X GPU to handle artificial intelligence inference workloads.
Microsoft Chief Technology Officer Kevin Scott said the company will offer access to the MI300X chip through its Azure cloud service.
OpenAI said it will support AMD's GPUs in its software product Triton. Triton is not a large language model like GPT; it is an open-source, Python-like programming language for writing GPU code.
In addition, Oracle said its cloud computing service will also use the chip.
Interestingly, however, a recent report from market research firm Omdia Research showed that Meta and Microsoft each purchased 150,000 H100 GPUs from Nvidia this year, tying them for first place in purchase volume.
This shows that the artificial intelligence GPU market is fiercely competitive.
AMD forecast on Wednesday that the total market for artificial intelligence GPUs could grow to 400 billion US dollars over the next four years (through 2027), double the company's previous forecast.
Su also told reporters that AMD does not believe it needs to beat Nvidia to do well in the market; capturing a solid share of that 400 billion US dollar market would itself be a good result.