
Micron earnings: revenue surges 82%, but guidance fails to impress Wall Street

Zhitong Finance ·  Jun 27 08:15

Wall Street held very high expectations for this report and its outlook; Micron's stock had already surged 67% this year heading into the release.

According to the Zhitong Finance app, shares of Micron Technology (MU.US), the largest US maker of computer memory chips, fell more than 9% in after-hours trading following the release of its latest financial report. The quarterly results and outlook pointed to an exceptionally strong financial footing, as memory demand enters a rapid-growth phase amid the global rush to invest heavily in AI infrastructure. But Wall Street's expectations for this report and outlook were extremely high: although Micron's core metrics all beat consensus, its guidance for the next quarter fell short of the loftiest forecasts from some Wall Street investment firms.

The report shows that in the fiscal 2024 third quarter, which ended May 30, Micron's total revenue grew a significant 82% to $6.81 billion. The memory-chip giant, headquartered in Boise, Idaho, reported non-GAAP earnings of $0.62 per share, versus a loss of $1.43 per share a year earlier and earnings of $0.42 per share in the prior quarter. Wall Street analysts on average had expected revenue of about $6.67 billion and earnings of $0.50 per share, so Micron's actual results came in well above consensus.

In a slide presentation, the company said it expects PC sales to recover with low single-digit percentage growth in 2024, and smartphone sales with low-to-mid single-digit growth. Micron also expects AI features to help spur a large-scale replacement cycle for smartphones and PCs by 2025, signaling a new growth phase for DRAM and NAND demand from which the memory-focused company stands to benefit fully.

On guidance, the company said in the outlook section of its report that it expects fiscal fourth-quarter revenue of $7.4 billion to $7.8 billion, essentially in line with the analyst consensus of about $7.58 billion. Some analysts, however, had expected more than $8 billion, which is a key reason the stock plunged after the outlook was announced. Citigroup, for example, had listed Micron Technology as a "top pick" and forecast fourth-quarter revenue above $8 billion. Excluding certain items, Micron expects fourth-quarter non-GAAP earnings of about $1.08 per share, plus or minus $0.08, above the Wall Street consensus of $1.02.
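The gap between the guidance and consensus is small in arithmetic terms; a quick sanity check of the figures cited in this article (not independently sourced) makes the point:

```python
# Sanity check of Micron's fiscal Q4 guidance cited above:
# revenue of $7.4B-$7.8B vs. an analyst consensus of ~$7.58B.
# All figures are taken from the article, not an independent source.
low, high, consensus = 7.4, 7.8, 7.58  # billions of USD

midpoint = (low + high) / 2   # guidance midpoint
gap = midpoint - consensus    # distance above consensus

print(f"guidance midpoint: ${midpoint:.2f}B")  # $7.60B
print(f"vs. consensus:     {gap:+.2f}B")       # +$0.02B, essentially in line
```

The midpoint sits only about $20 million above consensus, which is why the range reads as "basically consistent" with expectations while still disappointing analysts who had modeled more than $8 billion.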

Although Micron has benefited greatly from the global corporate rush into AI, demand in its traditional markets, such as personal computers and smartphones, has not grown strongly; those markets have only just begun to recover from last year's historic demand slump. The AI PCs and AI smartphones expected to launch starting in 2024 should push memory demand in these two traditional markets into a phase of explosive growth.

After the latest quarterly results and outlook were announced, the stock fell more than 9% in after-hours US trading before the decline narrowed to around 7%. Heading into the report, Micron's shares had surged 67% this year on the AI boom, as investors expect the company to be one of the main beneficiaries of spending on artificial intelligence. Top Wall Street firms have also sharply raised their 12-month price targets for Micron, the most optimistic reaching $225 (Micron closed at $142.36 on Wednesday).

Micron CEO Sanjay Mehrotra reiterated his optimistic expectations for the storage industry, stating that 2024 will mark the beginning of a major rebound in the storage chip industry, with sales expected to reach record levels in 2025.

The AI boom will broadly drive demand for expensive HBM memory. These 3D-stacked chip systems are very difficult to manufacture and consume a large share of chipmakers' production capacity, so output is hard to ramp quickly. That should significantly reduce the risk of future oversupply, long the scourge of the memory industry. Micron's CEO emphasized that surging HBM demand will accelerate HBM capacity expansion, which in turn will have far-reaching effects on broader DRAM and NAND capacity and pricing: DRAM and NAND supply will gradually fail to keep up with demand, and prices are expected to rise steadily.

In the fiscal quarter just reported, Micron sold about $100 million of HBM3E memory and expects total HBM sales of several hundred million dollars this quarter. By the end of fiscal 2025 (August of that year), sales of HBM, a subcategory of DRAM, are expected to grow to several billion dollars.

Micron's HBM memory, a key component of AI hardware infrastructure, along with the broad range of DRAM and NAND products that AI infrastructure requires, has benefited fully from this unprecedented surge in AI spending. HBM is paired with Nvidia's H100/H200/GB200 AI GPUs, the most essential hardware behind heavyweight applications such as ChatGPT and Sora. With near-endless demand for its full range of AI GPUs, Nvidia has become the world's most valuable chip company. HBM delivers data faster, helping to develop and run large AI models.

Training a large AI model typically involves data-hungry software and dense matrix operations, can involve tens of trillions of parameters, and relies heavily on HBM. AI inference workloads, meanwhile, involve massively parallel computing patterns that likewise depend on HBM for high-bandwidth, low-latency, energy-efficient memory. To avoid compute bottlenecks and keep expensive processors running at full speed, Micron and its competitors developed HBM to communicate with other components faster than conventional memory.

South Korea is home to the world's two largest memory-chip makers, SK Hynix and Samsung. Global HBM leader SK Hynix has become Nvidia's core HBM supplier: its HBM is used in Nvidia's H100 AI GPU, and its latest-generation HBM3E will also equip Nvidia's H200 and the new Blackwell-based B200/GB200 AI GPUs. The other major HBM3E supplier is US memory giant Micron, whose HBM3E is highly likely to be used in Nvidia's H200 and the extremely powerful B200/GB200 as well.

In HBM, the memory industry's hottest segment, market shares as of 2022 were roughly 50% for SK Hynix, 40% for Samsung Electronics, and 10% for Micron. Having entered the HBM field first, back in 2016, SK Hynix has taken the lion's share of the market. Some industry insiders say its HBM share likely grew to around 55% by the end of 2023, an absolutely dominant position.

It's not just the demand for HBM that's exploding! DRAM and NAND demand is also rising steadily.

HBM is a high-bandwidth, low-power memory technology designed for high-performance computing and graphics processing. It stacks multiple DRAM dies using 3D stacking technology and transfers data through microscopic through-silicon vias (TSVs), enabling high-speed, high-bandwidth transfers. Stacking also sharply reduces the memory system's physical footprint and the energy consumed moving data, while the high bandwidth greatly improves transfer efficiency, letting heavyweight AI models run around the clock more efficiently.

HBM also offers very low latency, responding quickly to data-access requests. Heavyweight generative AI models such as GPT-4 frequently access large datasets and run extremely heavy inference workloads, so that low latency greatly improves the overall efficiency and responsiveness of AI systems. In AI infrastructure, HBM is tightly coupled with Nvidia H100/H200 AI GPU server systems, as well as the B200/GB200 AI GPU server systems about to begin shipping.

According to data released by Statistics Korea, South Korea's semiconductor inventories fell 33.7% year over year in April, the steepest drop since the end of 2014. That largely reflects surging demand for memory chips, especially the HBM produced by the country's two largest memory giants, Samsung and SK Hynix, which together contribute nearly 15% of South Korea's GDP; such demand is growing far faster than supply.

In a recent research report, Goldman Sachs significantly raised its estimate of the total HBM market size, citing extremely strong enterprise demand for generative AI (Gen AI), which is driving higher AI server shipments and higher HBM density per AI GPU. The bank now expects the market to grow tenfold from 2022 to 2026, from $2.3 billion to $23 billion, a compound annual growth rate of 77% over four years. Goldman predicts HBM will remain in short supply in the coming years, with major players SK Hynix, Samsung, and Micron continuing to benefit.
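Goldman's "tenfold" and "77% CAGR" figures are mutually consistent; a small cross-check using only the numbers quoted in this article:

```python
# Cross-check of the Goldman Sachs HBM market estimate cited above:
# $2.3B (2022) growing to $23B (2026). Figures are from the article.
start, end, years = 2.3, 23.0, 4  # billions of USD, 2022 -> 2026

multiple = end / start              # total growth multiple
cagr = multiple ** (1 / years) - 1  # implied compound annual growth rate

print(f"growth multiple: {multiple:.0f}x")  # 10x
print(f"implied CAGR:    {cagr:.1%}")       # 77.8%, matching the cited ~77%
```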

Samsung, the other major South Korean memory giant and the world's largest supplier of DRAM and NAND chips, is also striving to become one of Nvidia's suppliers of HBM and next-generation HBM3E. In DDR-series memory such as DDR4 and DDR5, one of DRAM's mainstream applications, and in SSDs, one of NAND's mainstream applications, Samsung's market share is far ahead of other memory makers, with Micron likely second only to Samsung in DRAM and NAND. Unlike HBM, which is widely used in AI data centers, DDR memory mainly serves as main memory in PC systems, providing the capacity and bandwidth for multitasking and consumer workloads, while the low-power LPDDR series is used in smartphones.

South Korean chip export data make the demand for memory chips even clearer. Early trade figures show chip sales rose 50.2% year over year in the first 20 days of June, continuing to lead export growth, driven by memory demand from smartphone makers, data-center operators, and AI developers, as well as by higher selling prices.

Since the AI boom swept global businesses in 2023, demand for AI servers has skyrocketed. Top data-center server makers such as Dell Technologies (DELL.US) and Super Micro Computer (SMCI.US) typically use Samsung's and Micron's DDR products; Samsung and Micron SSDs, a mainstream NAND application, are widely used as servers' main storage; and SK Hynix's HBM is bundled with Nvidia's AI GPUs. This is the core logic behind the surge in demand for HBM and for DRAM and NAND as a whole.

DRAM mainly serves as a computing system's main memory, holding temporary data and intermediate results for CPUs and GPUs and supporting data loading and preprocessing. NAND is slower to read and write than DRAM or HBM, but its large capacity and low cost make it ideal for long-term storage. In generative AI systems, NAND typically stores large training and inference datasets and trained models; when a training or inference workload runs, the data is loaded into DRAM or HBM for high-speed processing.

The trend of integrating large-scale AI models led by Apple Intelligence into consumer electronics is likely to drive the surge in demand for DRAM and NAND, which is also the core logic why Micron's stock price continued to rise following Apple's WWDC.

Morgan Stanley believes that if the on-device model stays at about 3 billion parameters, the base iPhone 16's DRAM is likely to be upgraded from the iPhone 15's 6GB to 8GB (possibly the minimum configuration needed to run Apple's edge AI model); and given the limited memory density of M2 chips (192GB), Apple's growing AI-server needs will consume large amounts of LPDDR5.

Apple's grand AI plan presented at WWDC signals that, starting in 2024, large edge AI models will gradually be integrated into consumer terminals such as PCs, smartphones, and smartwatches, and may even reach humanoid robots in the near future, ushering in an era of embodied AI. Memory demand from these terminals may grow exponentially, which is why some research institutions have repeatedly raised their memory-chip demand forecasts since Apple's WWDC.

In a recent research report, Morgan Stanley emphasized that surging AI demand, combined with memory makers' severe underinvestment over the past two years during the demand downturn, has set up an unprecedented "supercycle" in the memory market. Mizuho Securities predicts that from 2025 onward, the AI upgrade cycle in smartphones and personal computers may require additional memory capacity and the market could face a serious supply shortfall, with HBM undersupplied by 11% and the overall DRAM market by 23%.

The latest industry outlook from World Semiconductor Trade Statistics (WSTS) shows the global semiconductor market staging a very strong recovery in 2024. WSTS now forecasts a 2024 market size of $611 billion, a significant 16% increase over the prior year and a notable upward revision from its forecast at the end of 2023.

WSTS said the revised 2024 outlook reflects strong performance over the past two quarters, especially in the computing end market. After the market contracted sharply in 2023, WSTS expects two core product categories to drive double-digit sales growth in 2024: logic chips, including CPUs and GPUs, up 10.7%, and memory chips, dominated by DRAM and NAND, surging 76.8%.

Looking ahead to 2025, WSTS forecasts global semiconductor sales of $687 billion, growth of about 12.5% on top of 2024's already strong recovery. WSTS still expects memory and logic to drive this growth, with the two categories expected to soar to over $200 billion in 2025 on the back of the AI boom. Versus the prior year, WSTS predicts that sales of memory chips, dominated by DRAM and NAND, will grow more than 25% in 2025 and that sales of logic chips, including CPUs and GPUs, will grow more than 10%, while all other segments, such as discrete devices, optoelectronics, sensors, and analog semiconductors, grow at single-digit rates.
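The 2025 growth rate follows directly from the two WSTS sales forecasts; as a check on the arithmetic (figures from the article):

```python
# Check of the WSTS forecasts cited above: $611B (2024) -> $687B (2025).
sales_2024, sales_2025 = 611.0, 687.0  # billions of USD

growth_2025 = sales_2025 / sales_2024 - 1
print(f"implied 2025 growth: {growth_2025:.1%}")  # 12.4%, i.e. "about 12.5%"
```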

Micron, one of the "AI Three Knights", may not be done with its stock-price surge yet!

International bank SMBC Nikko Securities recently published a research report saying that while Nvidia is the undisputed leader in generative AI, Ethernet-chip giant Broadcom and memory giant Micron have also recently delivered outstanding results and stock performance. In particular, the memory chips Micron supplies for AI training and inference systems are important enough to stand alongside Nvidia's AI GPUs, the three companies together forming the "AI Three Knights".

Before Micron released its report and outlook, Wall Street firms were broadly bullish that the stock would keep setting record highs over the next 12 months. Among them, Rosenblatt, the well-known firm that gave Nvidia a price target as high as $200, reiterated its "buy" rating on Micron with a Street-high target of $225.

Wolfe Research and Citigroup both maintained their bullish outlooks on Micron Technology ahead of the report. Wolfe Research kept its bullish rating on the stock and raised its target price from $150 to $200; the firm's analysts said they raised their estimates because the memory industry is in good shape and they remain optimistic about the company's HBM sales prospects.

However, Wolfe Research's analysts added that Micron's near-term results are not the core of their bullish thesis. They believe all memory suppliers have worked to limit supply in recent years so that customers cannot build inventory ahead of expected price increases, and they expect Micron's EPS to be pushed to $20 per share in fiscal 2025/2026, with HBM contributing about $3 per share.

Meanwhile, Citigroup reiterated its buy rating on Micron and raised its target price from $150 to $175, with its analysts arguing that given the broad improvement in DRAM and the company's growing exposure to HBM, the stock should continue to trade above its historical range. Bank of America, another major Wall Street bank, also reiterated its buy rating, raising its target from $144 to $170 and noting that Micron will be the biggest beneficiary of HBM's growing share of the memory market.


