
The storage war spreads to AI! SK Hynix, Micron, and Samsung kick off the “HBM3E battle”

Zhitong Finance ·  Mar 19 15:16

Source: Zhitong Finance

South Korean memory giant SK Hynix said on Tuesday that it has begun full-scale mass production of its next-generation HBM memory for artificial intelligence (AI) chip systems, which means it will now compete head-on with Micron in the HBM3E segment.

Media reports citing people familiar with the matter added that SK Hynix's first shipments will be delivered to AI chip leader Nvidia (NVDA.US) this month, and that the supercomputing systems built around the B100 and H200, the two new AI GPUs Nvidia will soon mass-produce, will be equipped with SK Hynix's next-generation HBM.

In the global memory market, this new high-end memory, known as “HBM3E,” is the focal point of fierce competition. Demand for HBM has exploded since 2023 on the back of AI chips. Micron, like SK Hynix, has become an HBM supplier for Nvidia's new AI GPUs, and the parts both companies supply are the latest “HBM3E.”

Last month, Micron Technology (MU.US), one of SK Hynix's biggest rivals in memory chips, said it had begun large-scale mass production of its new HBM3E and had started delivering it to Nvidia.

Although the HBM3E developed by Samsung Electronics, itself a leader in memory chips, has not yet officially received Nvidia's certification, the company recently said it has developed the industry's first 12-layer stacked 36GB high-performance HBM3E. In addition, Samsung's HBM3 has made it onto Nvidia's list of AI chip component suppliers, with supply expected to begin in the fourth quarter. That timeline is later than SK Hynix and Micron, both of which have already obtained Nvidia's HBM3E certification.

HBM is a high-bandwidth, low-power memory technology designed for high-performance computing and graphics processing. In an HBM package, multiple DRAM dies are stacked and connected through fine Through-Silicon Vias (TSVs), enabling high-speed, high-bandwidth data transmission.
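
As a rough illustration of where HBM's headline numbers come from, a stack's peak bandwidth is simply its interface width multiplied by the per-pin data rate. Below is a minimal Python sketch, assuming the standard 1,024-bit per-stack HBM interface (an industry figure not stated in this article) and the 9.2 GT/s pin speed cited for Micron's HBM3E later in the piece:

# Back-of-envelope peak bandwidth of a single HBM stack.
# Assumptions: a 1024-bit interface per stack (standard across HBM
# generations through HBM3E, though not stated in this article) and
# the 9.2 GT/s per-pin data rate quoted for Micron's HBM3E below.
INTERFACE_BITS = 1024   # data pins per stack
DATA_RATE_GTS = 9.2     # giga-transfers per second, per pin

bandwidth_gb_s = INTERFACE_BITS * DATA_RATE_GTS / 8  # convert bits to bytes
print(f"Peak per-stack bandwidth: {bandwidth_gb_s:.1f} GB/s")
# -> 1177.6 GB/s, in line with the 1.18-1.2 TB/s figures quoted below.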

HBM is used mainly in high-performance graphics cards, AI acceleration, high-performance computing, and data center servers. Its high bandwidth, very low latency, and high energy efficiency let processors access memory faster, greatly improving computing performance and efficiency. In AI infrastructure, HBM is paired with Nvidia's H100 AI server systems and with systems based on the upcoming B100 and H200.

According to forecasts from the research firm Mordor Intelligence, the HBM market is expected to surge from about US$2.52 billion in 2024 to US$7.95 billion in 2029, a compound annual growth rate of 25.86% over the forecast period (2024-2029).
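
Those endpoints and the growth rate are mutually consistent, which is easy to verify. A minimal Python sketch using only the figures above:

# Sanity-check the Mordor Intelligence forecast: compound US$2.52B
# (2024) at 25.86% per year for five years and compare with US$7.95B.
start_b, cagr, years = 2.52, 0.2586, 5

projected_b = start_b * (1 + cagr) ** years
print(f"Projected 2029 market size: ${projected_b:.2f}B")  # ~ $7.96B

# Inverting instead: the CAGR implied by the two endpoints.
implied_cagr = (7.95 / 2.52) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.2%}")                 # ~ 25.83%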

SK Hynix is the undisputed leader in HBM! Micron aims to catch up from behind

It is undeniable that although Samsung Electronics is the absolute leader in DRAM overall, SK Hynix's HBM3 is currently the only HBM paired with Nvidia's H100, an AI GPU in extremely high demand, so SK Hynix can fairly be described as leading the global HBM market. Statistics show that in 2023 Nvidia held up to 90% of the market for AI training/inference chips.

In the HBM market, as of 2022 the shares of the three HBM makers were roughly SK Hynix 50%, Samsung Electronics about 40%, and Micron about 10%. SK Hynix was first into HBM, entering the field as early as 2016, which is why it holds the bulk of the market. Some industry insiders say the 2023 split was roughly the same as in 2022. Micron came to HBM relatively late, but recent industry news suggests that this latecomer, holding only about 10% of the market, intends to overtake the incumbents.

Furthermore, just as SK Hynix and Micron were successively finalizing HBM3E supply certification with AI chip leader Nvidia, Samsung was conspicuously absent. According to reports, Samsung's latest HBM3E has yet to pass Nvidia's HBM quality tests.

Micron began mass production and delivery of its new HBM3E products in early 2024 (around February), and Nvidia was revealed to be one of the main customers for them. The company also emphasized that its new HBM parts are drawing significant interest from across the industry, suggesting that Nvidia may not end up being the only major customer for Micron's HBM3E. Micron clearly has high hopes for HBM3E, which could let it keep taking market share from SK Hynix and Samsung Electronics and lift the company's revenue and profits across the board.

According to some industry insiders, Micron's HBM3E will go into Nvidia's Grace Hopper GH200 supercomputing system (which pairs an H100 GPU with a Grace CPU). This shows that Micron's latest moves in HBM are not only a broad technological breakthrough but also a sign of its deepening cooperation with Nvidia. Industry sources say Nvidia's upcoming supercomputing systems based on the B100 and H200 GPUs may also be equipped with Micron's HBM3E.

The Micron HBM3E module stacks eight 24-Gbit memory dies manufactured on the company's 1-beta (1β) process. The modules run at data rates of up to 9.2 GT/s, bringing peak bandwidth per stack to 1.2 TB/s, 44% higher than the fastest HBM3 modules available today. And the company won't stop at 8-Hi parts built on 24-Gbit dies: it has announced plans to launch an ultra-high-capacity 36 GB 12-Hi HBM3E stack in 2024, following the start of mass production of the 8-Hi 24 GB stack.
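
The capacity figures follow directly from the stack arithmetic. A minimal Python sketch checking the 24 GB and 36 GB numbers against the die count and the 24-Gbit per-die density quoted above:

# Verify the quoted HBM3E stack capacities: dies per stack multiplied
# by the 24-Gbit per-die density, converted to gigabytes.
DIE_DENSITY_GBIT = 24  # per-die capacity cited in the article

def stack_capacity_gb(num_dies: int) -> float:
    return num_dies * DIE_DENSITY_GBIT / 8  # 8 bits per byte

print(f"8-Hi stack:  {stack_capacity_gb(8):.0f} GB")   # -> 24 GB
print(f"12-Hi stack: {stack_capacity_gb(12):.0f} GB")  # -> 36 GB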

Micron also said its latest HBM3E consumes about 30% less power than rival products, helping meet the growing memory demands of AI chips that power large-scale generative AI applications.

SK Hynix aims to consolidate its HBM leadership

“We hope for a successful mass-production ramp of HBM3E, and, drawing on our experience as the industry's first HBM3 supplier, to solidify our leadership in AI memory,” SK Hynix said in a statement.

In terms of performance, the new HBM3E recently launched by SK Hynix, the world's second-largest memory chip maker, improves heat dissipation by 10% over the previous generation and processes data at up to 1.18 terabytes per second.

Some industry analysts say SK Hynix's 2024 HBM production capacity may already be fully booked, as explosive demand for AI chipsets from data centers and technology companies worldwide drives strong demand for the high-end HBM used inside them.

Coincidentally, Micron executives also said demand for Micron's HBM3E will be extremely strong in 2024. CEO Sanjay Mehrotra said on the fiscal first-quarter earnings call that demand for the high-end memory chips data centers use to develop AI software and applications is extremely strong, and emphasized that Micron has completely sold out the HBM it expects to be able to produce in 2024. “This is the type of ultra-fast access chip needed for computers that create artificial intelligence software. The high-revenue, high-profit opportunities that AI has brought us are just beginning,” Mehrotra said.

From a market strategy perspective, SK Hynix has cemented its lead in the HBM market by being the sole supplier of the HBM3 currently paired with Nvidia's AI GPUs, and its 2024 HBM production capacity is already fully booked. Micron, for its part, is trying to win an edge in high-end memory through the energy efficiency of its HBM3E and performance that leads global peers, and its products have drawn strong attention and interest across the industry.

“SK Hynix can be said to hold the leading position in the HBM market, and the output of its high-end memory is also expected to grow the most aggressively among chipmakers,” said IBK Investment & Securities analyst Kim Un-ho.

Nvidia unveiled its latest flagship AI GPU, the B200, on Monday US Eastern time, a part said to be nearly 30 times faster than its predecessor on certain tasks, as it moves to defend its absolute dominance in AI chips. The Blackwell architecture behind the B200 is claimed to improve cost and energy consumption by up to 25 times over the previous generation. Billed as the most powerful AI GPU in the world, it packs 208 billion transistors, is built on TSMC's 4NP process, and supports AI training and real-time large language model (LLM) inference for models with up to 10 trillion parameters.

SK Hynix's stock price has doubled over the past 12 months on the strength of its lead in HBM chips. As of Friday's close, Micron had risen more than 70% over the past 12 months, and earlier this month, as analysts highlighted the memory giant's key position in the AI boom, its stock hit a record high.
