Driven by the AI wave, market demand for HBM continues to rise. How long will the supply shortage last?
On April 14, Bank of America released a research report on global memory technology, surveying the latest developments and trends in HBM (High Bandwidth Memory) and DRAM (Dynamic Random Access Memory).
According to the report, owing to low yields, long manufacturing cycles, and continued strong orders, HBM is estimated to require 10%-20% more DRAM wafer capacity than currently forecast. Bank of America had previously projected global DRAM wafer capacity of 182,000 wafers per month in 2024 and 257,000 wafers per month in 2025.
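To put the 10%-20% figure in concrete terms, it can be applied directly to Bank of America's baseline monthly capacity forecasts. The percentages and totals below come from the report; the calculation itself is only an illustration:

```python
# Bank of America's baseline monthly DRAM wafer capacity forecasts (wafers/month)
baseline = {2024: 182_000, 2025: 257_000}

# The report estimates HBM needs 10%-20% more DRAM wafer capacity than forecast
for year, wafers in baseline.items():
    low, high = wafers * 0.10, wafers * 0.20
    print(f"{year}: an extra {low:,.0f} to {high:,.0f} wafers/month")
```

On these numbers, the shortfall would amount to roughly 18,000-36,000 additional wafers per month in 2024 and 26,000-51,000 in 2025.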
The higher the demand for HBM, the greater the DRAM consumption
HBM can be considered the flagship product of DRAM's evolution from traditional 2D layouts to three-dimensional (3D) stacking. Because each HBM stack is built from 8-12 DRAM dies, it consumes more DRAM wafer capacity.
Specifically, the report stated:
Currently, the average HBM yield is probably only around 70%, and it is difficult to push above 90%;
HBM's front-end and back-end processes usually take more than 5 months to complete, longer than the roughly 4 months originally anticipated. This may reduce production efficiency and further increase wafer demand;
New orders are expected to strengthen in the second half of the year and into 2025, driven by demand from Nvidia and other large buyers.
This means that producing sufficient high-quality HBM requires committing more DRAM wafer capacity. According to the report, more than 30% of wafers may have to be discarded in the process, above the earlier estimate of less than 30%.
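A back-of-envelope sketch shows why low yield combined with 8-12-die stacks inflates wafer demand so sharply. The 70% HBM yield, the stack height, and the 90% reference point come from the article; comparing the wafer area consumed per good HBM stack against a single good conventional die is a hypothetical illustration:

```python
# Hypothetical illustration of HBM's wafer consumption relative to conventional DRAM.
dies_per_stack = 12        # article: HBM stacks 8-12 DRAM dies (upper end used here)
hbm_yield = 0.70           # article: average HBM yield is around 70%
conventional_yield = 0.90  # article: 90%+ is hard for HBM to reach

# Wafer area (in die-equivalents) consumed per good unit of output:
hbm_cost = dies_per_stack / hbm_yield        # good HBM stack
conv_cost = 1 / conventional_yield           # good conventional die
print(f"one good HBM stack consumes ~{hbm_cost / conv_cost:.0f}x "
      f"the wafer area of one good conventional die")
```

Even before accounting for HBM dies being physically larger, a 12-high stack at 70% yield consumes roughly fifteen times the wafer area of a single conventional die at 90% yield, which is why HBM orders bite so deeply into DRAM capacity.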
Non-HBM DRAM supply is facing a shortage
At the same time, Bank of America pointed out that this increased demand for DRAM wafer capacity may cause a shortage of conventional (non-HBM) DRAM in 2025.
According to public information, the DRAM dies used in HBM are completely different from typical commodity DRAM ICs such as DDR4 and DDR5: not only do they require more testing, but their memory and data architectures have also been redesigned.
HBM DRAM dies must provide a wide interface, making them physically larger and more expensive than conventional DRAM ICs. According to media reports, Micron CEO Sanjay Mehrotra has said:
“The HBM3E chip is approximately twice the size of DDR5 with the same capacity. HBM products include logic interface chips and have more complex package stacks, which affects yield. As a result, HBM3 and 3E demand will absorb a significant portion of the industry's wafer supply.”
“The increase in HBM3 and 3E production will reduce the overall growth in DRAM bit supply across the industry, particularly the supply impact on non-HBM products, as more capacity will be transferred to address HBM opportunities.”
As the “new favorite” of the AI era, HBM is expected to remain in short supply. Goldman Sachs also released a research report earlier projecting that the HBM market will grow roughly tenfold from 2022 to 2026, from 2.3 billion US dollars to 23 billion US dollars (a four-year compound annual growth rate of 77%).
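The 77% compound annual growth rate is consistent with tenfold growth over four years, as a quick check confirms (the dollar figures are Goldman Sachs's; the arithmetic is just verification):

```python
# Verify the four-year CAGR implied by growth from $2.3B (2022) to $23B (2026)
start, end, years = 2.3, 23.0, 4
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR = {cagr:.1%}")  # 77.8%, matching the report's ~77%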