
NVIDIA invests heavily in emerging AI players as the "NVIDIA AI Empire" grows ever stronger

Zhitong Finance ·  Sep 12 23:33

With its powerful AI GPU + CUDA ecosystem, Nvidia has established a dominant position in the AI chip field, and its "NVIDIA AI Empire" keeps growing stronger.

Nvidia (NVDA.US) CEO Huang Renxun (Jensen Huang) said at a technology conference organized by Goldman Sachs on Wednesday that demand for Nvidia's next-generation Blackwell architecture AI GPUs is so strong that it has frustrated some large customers who were unable to obtain the product in time. This humblebrag underscores Nvidia's near-monopoly in AI infrastructure construction and the market's continued fervent demand for Nvidia's AI GPUs. In addition, statistics show that since 2023, Nvidia has been aggressively investing in AI startups worldwide to consolidate its absolute dominance in generative AI infrastructure.

There is no doubt that NVIDIA is one of the biggest winners in the global AI investment frenzy so far. Thanks to strong demand for its high-performance AI GPUs from companies and government institutions worldwide, NVIDIA's stock price has skyrocketed over 700% since early 2023, and more than 900% since its October 2022 low. As NVIDIA's market value and revenue soar, management continues to deepen the penetration of the "NVIDIA hardware-software ecosystem" across the AI industry. The pace of investment in AI startups is also accelerating: more than half of the company's startup investments since 2005 have occurred in the past two years.

According to statistics, the AI chip giant, once crowned the "world's most valuable listed company," invested a total of over $1.5 billion in AI startups in early 2024, up sharply from $300 million a year earlier. According to Crunchbase, in 2024 alone the chip giant participated in more than 10 financing rounds for AI startups exceeding $100 million each. Since 2023, NVIDIA has invested in more than 50 AI startups, including important AI companies such as Perplexity AI and Hugging Face.

In addition, NVIDIA is considering investing in the upcoming financing round of ChatGPT developer OpenAI. These latest developments from NVIDIA all indicate that the 'NVIDIA AI Empire' established by its powerful AI GPU+CUDA ecosystem in the field of AI chips is becoming stronger.

AI startups are crucial for the development of the global AI industry, especially the flourishing enterprise AI application market. Unlike global cloud computing giants like Amazon AWS, Microsoft Azure, and Google Cloud Platform, which focus on building AI application development ecosystems or underlying AI infrastructure, these AI startups focus on various specific AI application scenarios that are crucial for improving enterprise efficiency or enhancing the work or learning efficiency of global end-users.

For example, Perplexity AI, an AI startup from the United States, focuses on the cutting-edge field of 'AI search'; French AI startup Bioptimus focuses on fully integrating the latest AI technology with medical science and biotechnology; AI startup Cognition has launched what is considered the world's first 'fully autonomous virtual AI software engineer.' This virtual engineer possesses powerful programming and software development capabilities and can assist programmers or independently complete large-scale software development projects in multiple cutting-edge technology areas.

The following are AI startups in which Nvidia has invested and holds a significant position.

Perplexity AI

Huang Renxun makes no secret of his fondness for Perplexity AI. Often called a "Google killer," the tool has unexpectedly become the Nvidia CEO's favorite AI product. In an interview this year, when asked, "Do you frequently use ChatGPT or Google's AI chatbots, or do you use other products?", Huang replied: "I generally use Perplexity, and I use it almost every day. For example, when I recently wanted to understand how AI can assist drug development, I used Perplexity for the relevant searches."

He has also backed Perplexity with concrete action. Nvidia participated in an approximately $62.7 million financing round for Perplexity AI in April, which valued the startup at roughly $1 billion; prominent investors including Daniel Gross and Amazon founder Jeff Bezos also participated. This was not the first time Nvidia supported the company: the chip giant also joined a January financing round in which the startup raised up to $73.6 million.

Hugging Face

Hugging Face is an AI startup that provides open-source AI models and an application development platform, and it has a long-standing relationship with Nvidia. In August 2023, the chip giant participated in a financing round of up to $235 million for Hugging Face, after which the company was valued at approximately $4.5 billion. Other corporate investors in that round included Google, Amazon, Intel, AMD, and Salesforce.

Hugging Face has long incorporated Nvidia's hardware systems and CUDA software tools and libraries into its shared resources. In May, the startup launched a new program committing up to $10 million worth of Nvidia GPU compute for free use by AI developers.

Adept AI

Unlike the generative AI chatbots made by well-known AI startups such as OpenAI and Anthropic, Adept AI's main product is not centered on text or image generation. Instead, the startup focuses on building an AI assistant that can perform tasks on a computer, such as generating reports or browsing web pages, and can operate software tools. Nvidia is also involved, having participated in a financing round of up to $350 million in March 2023.

Databricks

Last autumn, Databricks earned a whopping $43 billion valuation, making it one of the most valuable AI startups in the world. As expected, this data analytics software provider has extensively adopted Nvidia's AI GPUs and has received support from the chip giant, as well as from venture capital firms such as Andreessen Horowitz and Capital One Ventures, all of which participated in a $500 million financing round in September 2023. "Databricks is leveraging Nvidia's software and hardware technology to do incredible work in accelerating data processing and large-scale AI model generation," Huang Renxun said in a statement at the time.

Cohere

Well-known Canadian AI startup Cohere is a strong competitor to OpenAI and Anthropic, specializing in AI models built exclusively for businesses. The company's growth over the past five years has attracted support from major technology backers such as Nvidia, Salesforce, and Cisco, which funded Cohere in a financing round in July. Nvidia also participated in a May 2023 round that brought approximately $270 million in funding to the startup.

"The NVIDIA AI Empire" is becoming increasingly strong.

When Nvidia invests in AI startups focused on different application areas, these companies essentially spend most of the investment funds on Nvidia AI GPUs to establish or expand their AI training and inference infrastructure. AI startups need vast computing resources to train their deep learning models, and Nvidia's GPUs (such as the H100, H200, and the upcoming Blackwell GPUs) are the industry standard for performance, making Nvidia's products the natural choice.

Nvidia's CUDA is a highly optimized parallel computing platform and programming model that is deeply integrated with Nvidia's GPU hardware. AI startups that accept Nvidia's investment effectively allocate significant funding to advanced CUDA acceleration tooling, further deepening their reliance on Nvidia's ecosystem. This "lock-in effect" ensures that when these startups develop AI applications or iterate on large models, they almost inevitably continue to use Nvidia's hardware and software tools.

In the future, when enterprises use the AI models and applications these startups develop, they will have to keep relying on Nvidia's full-stack hardware-software platform during the inference and deployment stages in order to optimize those models and applications. This allows Nvidia to further expand its market share through these startups.

In addition, many enterprises prefer to deploy AI applications on cloud platforms such as AWS, Microsoft Azure, and Oracle OCI. When the enterprises actually using these startup-built AI models and applications run their training or inference workloads in the cloud, those cloud giants must in turn keep purchasing Nvidia's latest AI GPUs and configuring CUDA's advanced acceleration tools and libraries to meet the enormous computing demand. Driven by these forces, this hardware-software stack ecosystem keeps pushing the "NVIDIA AI Empire" to grow even stronger.

Among these advantages, the CUDA ecosystem is Nvidia's strongest moat. Nvidia has been deeply involved in global high-performance computing for many years, and its CUDA platform has become the first choice for hardware-software systems in high-performance computing workloads such as AI training and inference. CUDA is a proprietary parallel computing acceleration platform and programming model developed by Nvidia that allows software developers and engineers to use Nvidia GPUs to accelerate general-purpose parallel computing; it supports only Nvidia GPUs and is not compatible with GPUs from vendors such as AMD and Intel.

CUDA is arguably the platform that generative AI applications such as ChatGPT depend on most heavily, and its importance is no less than that of the hardware itself: it is critical for developing and deploying large-scale AI models. With its high technical maturity, clear performance-optimization advantages, and extensive ecosystem support, CUDA has become the most widely used platform in AI research and commercial deployment.
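As a minimal illustration (not from the article) of how this vendor lock-in arises in practice, the sketch below shows a CUDA vector-addition kernel. The `__global__` qualifier, the `<<<blocks, threads>>>` launch syntax, and `cudaMallocManaged` are all NVIDIA-specific extensions: the code compiles only with NVIDIA's `nvcc` compiler and runs only on NVIDIA GPUs.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements. This is the core CUDA idea:
// the same kernel function runs in parallel across thousands of threads.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                // one million elements
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;

    // Unified memory, accessible from both CPU and GPU.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();              // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);          // each element should be 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Porting code like this to AMD or Intel GPUs requires rewriting it against a different stack (for example HIP or SYCL), which is the practical substance of the "lock-in effect" described above.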

According to Nvidia's official website, using Nvidia GPUs with CUDA for general-purpose accelerated computing, along with some basic tools, is free. However, enterprise-scale CUDA applications and support (such as NVIDIA AI Enterprise), or renting Nvidia computing power on cloud platforms such as Amazon AWS, Google Cloud, and Microsoft Azure, may require subscription-based premium CUDA microservices for developing AI systems, at additional cost. Beyond the enormous revenue from GPU hardware tied to CUDA and from enterprise-scale CUDA deployments, the software business derived from CUDA is also a major revenue source for Nvidia.

Building on the powerful, deeply penetrated CUDA platform and its AI GPUs, Nvidia has continued to strengthen its hardware-software ecosystem. At the GTC conference in March, Nvidia officially launched "NVIDIA NIM," a cloud-native microservice billed by GPU usage time and focused on inference optimization. It aims to shorten the time-to-market for applications built on large AI models and to simplify their deployment across clouds, data centers, and GPU-accelerated workstations. This lets enterprises deploy AI applications on Nvidia's GPU cloud inference capacity with CUDA-platform acceleration, as Nvidia seeks to build a full-stack AI application development ecosystem exclusive to its GPUs.

Wall Street is calling for "buying the dip" in Nvidia stock, arguing the shares have been sold off "excessively."

This is also why Rosenblatt, a well-known Wall Street investment firm, is more bullish on the revenue growth of Nvidia's CUDA-centered software business than on its AI GPU revenue. In a research report, Rosenblatt chip analyst Hans Mosesmann raised the firm's 12-month price target for Nvidia from $140 to a striking $200 per share, the highest target among Wall Street's expectations for the stock.

Mosesmann said that, given the potential boom in NVIDIA's CUDA-based software business, the chip giant's stock can keep rising over the next 12 months even after its enormous run-up over the past year. In other words, beyond the huge GPU revenue that CUDA ties tightly to NVIDIA's hardware, the software business built around CUDA is also an engine of NVIDIA's revenue growth.

Regarding the recent sharp drop in NVIDIA's stock price, with nearly $400 billion of market value evaporating over the past week, top Wall Street institutions such as Goldman Sachs have said investors have been "overselling" NVIDIA. Goldman Sachs analyst Toshiya Hari recently maintained a "buy" rating on NVIDIA, stating: "NVIDIA's recent share-price performance has not been great, but we still believe in this stock. The recent sell-off is clearly excessive. First of all, global demand for accelerated computing is still very strong. We tend to focus on the hyperscalers, global giants such as Amazon, Google, and Microsoft, but what you will see is that the scope of demand is expanding to enterprises, and even sovereign nations."

As competition in AI intensifies among large tech companies such as Microsoft and Amazon, international bank UBS recently predicted that these tech giants' overall AI capital expenditure may grow by 47% in 2024 and 16.5% in 2025, reaching $218 billion and $254 billion respectively. However, UBS also noted that the large tech companies' overall capital-expenditure intensity (capital expenditure divided by revenue) remains below historical peaks. UBS expects these giants to deliver profit growth of around 15-20% over the next few quarters as generative AI monetization accelerates, and projects that their total free cash flow may rise from $413 billion in 2024 to $522 billion in 2025.

On Wall Street, "buy the dip" sentiment is particularly strong. Bullish investors firmly believe this round of correction has squeezed out most of the "AI bubble," and that tech companies able to keep profiting from the AI wave, popular chip stocks such as NVIDIA, AMD, Taiwan Semiconductor, and Broadcom, are poised for a new "major upswing." Chips are the indispensable core infrastructure for popular generative AI tools such as ChatGPT, making these chip stocks arguably the biggest winners of the AI boom, and NVIDIA above all, whose combination of "CUDA ecosystem + high-performance AI GPU" forms an exceptionally strong moat.

In addition to Goldman Sachs, analysts at Bank of America, Morgan Stanley, and other major banks are also optimistic about NVIDIA's stock and are calling it a good opportunity to "buy the dip." Among them, Bank of America analyst Vivek Arya recently reiterated his "buy" rating on NVIDIA, calling it the "best industry choice," and said the decline in NVIDIA's shares provides a good entry point. He raised his price target on NVIDIA from $150 to $165, compared with NVIDIA's Wednesday closing price of $116.91. The analyst stressed that there is no need to doubt the potential of artificial intelligence at least until 2026.

Demand for AI chips is indeed extremely strong at present and is expected to remain so for a long time to come. TSMC's management recently said at an earnings conference that the shortage of CoWoS, the advanced packaging required for AI chips, is expected to persist through 2025, with a slight easing possible in 2026. Industry insiders recently revealed that, due to extremely strong global demand for NVIDIA's upcoming Blackwell architecture AI GPUs, NVIDIA has increased its AI GPU foundry orders with chip giant TSMC by at least 25%.

Huang Renxun said at Wednesday's conference: "Demand for AI GPUs is very strong; everyone wants to be the first to receive the goods, and everyone wants to receive the most." "We probably have more emotional customers now, which is natural. Things are a bit tense at the moment, and we are doing our best." He added that demand for the latest-generation Blackwell GPUs is strong and that suppliers are catching up.

In addition, when asked whether massive AI spending provides a return on investment for customers - which has always been a concern during this wave of AI boom - Huang Renxun stated that companies have no choice but to embrace "accelerated computing." He pointed out that NVIDIA's technology speeds up traditional workload processing and can also handle AI tasks that existing technologies cannot cope with.

Editor/ping


