
AI reshapes the PC industry chain: Microsoft opens the battlefield, Qualcomm, Intel, and AMD battle over CPUs, and memory benefits most

wallstreetcn ·  May 9 13:08

Source: Wall Street News · Author: Zhao Yuhe

According to Morgan Stanley's research report, to run Copilot properly on a PC, an AI PC needs 45 TOPS (trillion operations per second) of computing power, which will drive CPU and memory upgrades. Memory will be the key beneficiary: 32GB is expected to become the default configuration for AI PCs, and rising AI PC penetration will further tighten memory supply.

Artificial intelligence is transforming the PC sector, making it smarter, faster, and more personalized. Morgan Stanley recently published a research report arguing that AI PCs will dominate the future PC market and drive upgrades across CPUs and memory.

Where the AI PC shift comes from: 45 TOPS of computing power to meet Microsoft Copilot's requirements

According to the research report, most existing PCs are aging, and artificial intelligence can change how we work. Morgan Stanley defines an AI PC as a personal computer with specific SoC (system-on-chip) capabilities that can run generative AI tasks locally. Since most current CPUs and GPUs are not optimized to run AI workloads efficiently, AI PCs equipped with neural processing units (NPUs) will be able to handle these tasks efficiently. To date, shipping PC chips carry NPUs delivering less than 40 TOPS (trillion operations per second), such as Intel's Core Ultra series, AMD's Ryzen 7040 series, Qualcomm's Snapdragon 8cx series, and Apple's M series.

Currently, Microsoft is working to deliver Copilot and other AI features in the latest version of Windows, with third-party developers to follow with related applications. To run Copilot properly on a PC, an AI PC needs 45 TOPS of computing power. Upcoming AI features are likely to depend more heavily on device hardware, which means Microsoft will push its manufacturing partners to set a minimum of 16GB of memory in new notebooks and desktops that include Copilot and other Windows 11 AI features. Upgrading memory is one of the easiest ways to improve PC performance, and this would be the first major capacity upgrade the PC memory industry has seen since 2012.

With the launch of Windows 12, Morgan Stanley's global PC hardware team expects the penetration rate of AI laptops to grow rapidly in 2025. The research report estimates AI PC penetration will reach 30% in 2025 and 50% in 2026 (up from 8% in 2024). AI PCs will not only bring higher profit margins to system manufacturers such as Dell, but also raise the semiconductor content of each PC by 20%-30%, mainly through CPU and memory upgrades (adding roughly US$30 billion to global semiconductor industry revenue).

CPU wars: x86 architecture vs. Arm architecture

There are many AI-capable PC processors. Qualcomm's (QCOM.US) Snapdragon X Elite platform is the first product to meet Microsoft's Copilot standard: expected to ship in the second half of 2024, it will provide approximately 45 TOPS of computing power, exceeding the 40 TOPS AI PC requirement defined by Microsoft. AMD's Ryzen 8000 series (Strix Point) and Intel's Meteor Lake, released in December 2023, offer 34 TOPS of combined CPU+GPU+NPU computing power and do not yet meet Microsoft's standard, but Intel's upcoming Lunar Lake processor is expected to break the 40 TOPS threshold by the end of this year.

According to the research report, competition between the x86 CPU architectures of Intel and AMD and Qualcomm's Arm CPU architecture will intensify in the AI PC market. Qualcomm was the first to meet Microsoft's requirements, positioning it to seize the first wave of AI PCs: major PC original equipment manufacturers such as Dell, HP Inc (HPQ.US), Lenovo, Asus, and Acer will all launch models with Qualcomm CPUs in 2024, challenging the Intel-led x86 camp.

Why memory benefits most

According to the research report, memory plays a critical role in an AI PC's overall system performance and the efficiency of AI task execution. AI workloads move large amounts of data for training and deep learning, and this data is staged in memory. A trained AI model contains a large number of parameters, and as model complexity grows, so does the parameter count, so sufficient memory capacity is required to hold those parameters. High-speed, large-capacity memory therefore provides faster data access and helps speed up the model training process.
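The parameters-to-capacity relationship is simple arithmetic. As a rough sketch (not from the report, and the model sizes and precisions below are illustrative assumptions), the memory needed just to hold a model's weights is the parameter count times the bytes per parameter:

```python
def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Memory (GiB) to store the weights alone, ignoring activations and caches."""
    return num_params * bytes_per_param / 2**30

# Hypothetical local models at two common precisions:
for label, params in [("3B", 3e9), ("7B", 7e9), ("13B", 13e9)]:
    fp16 = weight_memory_gib(params, 2)    # 16-bit floats: 2 bytes/param
    int4 = weight_memory_gib(params, 0.5)  # 4-bit quantization: 0.5 bytes/param
    print(f"{label}: {fp16:.1f} GiB at fp16, {int4:.1f} GiB at int4")
```

Under these assumptions, a 7-billion-parameter model at 16-bit precision needs about 13 GiB for weights alone, before the operating system and applications claim their share, which illustrates why 16GB reads as a floor and 32GB as the comfortable default.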

According to the research report, memory benefits greatly from the competition between x86 and Arm. AI PCs are driving a significant increase in average PC memory capacity and a shift toward higher-spec LPDDR in PC memory. AI acceleration is highly memory-dependent: large language models (LLMs) require large amounts of fast, frequently accessed memory. That is why Microsoft has set the memory floor for AI PCs at 16GB, but most consumers and businesses are expected to choose 32GB as the default configuration for better performance and future-proofing.

The 16GB is used not only to accelerate local tasks but also to support Copilot's cloud AI functions. This capacity is roughly 50% higher than the 2023 average of about 10.5GB of memory per PC. Morgan Stanley predicts that AI PCs will accelerate demand growth for PC memory, and the consumer upgrade trend may further strengthen that trajectory.

Furthermore, most AI PC CPUs will use LPDDR5x instead of today's mainstream DDR SO-DIMM modules, since bandwidth and energy efficiency are equally important for AI PCs. The shift to low-power memory is driven by the need for faster data transfer: DDR5 runs at 4.8-5.6 Gbps per pin, while LPDDR5x reaches 7.5-8.5 Gbps, enough to meet an AI PC's needs for fast language processing and response. Compared with DDR5, LPDDR5x cuts power consumption by about 50% during active use and by 86% during self-refresh, while offering higher per-pin data rates (LPDDR5 reaches 6.4 Gb/s versus 4.8 Gb/s for DDR5). Morgan Stanley estimates that LPDDR will account for 30-35% of PC memory demand this year, with future growth driven by higher AI PC penetration and continued adoption by CPU makers.
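The per-pin rates translate into system bandwidth once multiplied by the bus width. A back-of-envelope comparison (my own arithmetic, not from the report; the 128-bit bus is an assumption, though it is a common laptop configuration):

```python
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Per-pin rates cited for each memory type, on an assumed 128-bit bus:
for name, rate in [("DDR5-4800", 4.8), ("LPDDR5-6400", 6.4), ("LPDDR5x-8533", 8.533)]:
    print(f"{name}: {peak_bandwidth_gbs(rate, 128):.1f} GB/s")
```

On those assumptions, moving from DDR5-4800 (~77 GB/s) to LPDDR5x-8533 (~137 GB/s) nearly doubles peak bandwidth, which is the headroom token-by-token LLM inference consumes.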

The growing popularity and penetration of AI PCs is ultimately positive for the overall PC replacement cycle, especially in the commercial segment. Morgan Stanley's Greater China hardware team predicts AI PC penetration will reach 53% by the end of 2027, meaning AI PC memory will account for 10% of global memory shipments that year. Although AI PC memory makes up only 3% of the global memory market in 2024, the increase in memory capacity will still exacerbate already tight memory supply. Morgan Stanley estimates that the PC memory supply surplus will fall from the current 2% to -9%, and the overall global memory supply-demand balance will fall by 2 percentage points to -5%.

Where will the killer apps come from?

Morgan Stanley believes Microsoft's Copilot is currently the main killer AI application, and that third-party developers will join in as penetration rises.

Copilot is an AI-driven productivity tool that combines large language models with content in Microsoft (MSFT.US) Graph to assist users in their daily work. All users can access it through the Copilot website, Bing Chat, Copilot in Microsoft Edge, and Copilot in Windows; it is also available via the Copilot, Bing, Edge, Microsoft Start, and Microsoft 365 mobile apps.

At the same time, Microsoft is embedding Copilot into its major applications, including Word, Excel, PowerPoint, OneNote, Teams, and Outlook. For example, Copilot in PowerPoint can turn an existing Word document into a presentation; Copilot in Excel helps analyze and explore data, surface the best decisions, and generate charts to visualize opportunities; and Copilot in Word can write, edit, summarize, make suggestions, and co-create content. It can also generate meeting minutes from conversations and video meetings in Teams.

Copilot currently offers free and Pro tiers for individual users; the Pro tier gives users priority access to GPT-4 and GPT-4 Turbo during peak hours for faster performance, along with the ability to build custom Copilot GPTs tailored to individual needs and interests. Although Copilot runs in the cloud today, Morgan Stanley expects AI PCs to gain offline versions in the Windows 12 era, which would ease concerns about data security and network slowdowns.

Editor: lambor


