
Riding the wave of generative AI, Nvidia's market capitalization heads toward $1 trillion

Sina Technology · May 30, 2023 17:43


On May 30, Beijing time, Nvidia (NVDA.US) rose nearly 4% in pre-market trading to $404.52, pushing its market capitalization past $1 trillion. The sustained rally is credited to the explosion of generative artificial intelligence (AI), as Nvidia dominates the market for the GPUs that handle its complex computations.

In 2022, chipmaker Nvidia released the H100, one of its most powerful and most expensive processors, costing around $40,000 each. The launch seemed badly timed, as major companies were cutting spending in the face of rising inflation.

Then, in November, ChatGPT went live. Nvidia CEO Jensen Huang said of the moment: "We had a very difficult year last year. But with the advent of the OpenAI chatbot ChatGPT, we were able to turn the tide overnight. ChatGPT created huge demand in a very short time."

ChatGPT's sudden popularity sparked an arms race among the world's leading tech companies and startups, which began scrambling to buy the H100 because it was "the world's first computer chip created for generative AI." Generative AI refers to systems that can generate text, images, and other content nearly as fast as a human.

Having the right product at the right time can create real value. Last Thursday, Nvidia forecast that sales in its second fiscal quarter, ending at the end of July, would reach $11 billion, more than 50% above Wall Street's previous estimate. The forecast was driven mainly by a rebound in data-center spending by major tech companies and a sharp increase in demand for Nvidia's AI chips.

Investors reacted positively to the optimistic outlook. The following day, Nvidia's market capitalization rose by $184 billion, bringing what was already the world's most valuable chip company close to a $1 trillion valuation.

Generative AI could reshape every industry, deliver huge productivity gains, and displace millions of jobs. Nvidia is an early winner in the technology's rise, and the H100 is accelerating the leap forward. The H100 is built on a new Nvidia chip architecture called "Hopper," named after the American programming pioneer Grace Hopper, and it is fast becoming the hottest product in Silicon Valley.

Huang said, "The business opportunity arrived just as we were ramping up Hopper. ChatGPT debuted a few weeks after we began mass production."

Huang is confident Nvidia can capture this continued surge in demand, partly because the company is working with chipmaker Taiwan Semiconductor (TSM.US) to expand H100 production to meet exploding demand from cloud providers such as Microsoft (MSFT.US), Amazon (AMZN.US), and Alphabet (GOOGL.US), from internet companies such as Meta, and from enterprise customers.

CoreWeave, an AI-focused cloud infrastructure startup, was one of the first companies to receive H100s earlier this year. "It's one of the scarcest engineering resources on Earth," said Brannin McBee, CoreWeave's chief strategy officer and founder.

Some customers have waited as long as six months to get the thousands of H100 chips they need to train their massive models. Some AI startups say they worry the H100 will remain in short supply just as demand is taking off.

Recently, Tesla (TSLA.US) CEO Elon Musk founded a new AI startup, X.ai, and purchased thousands of Nvidia chips.

Musk also said that beyond forming partnerships, winning the AI race requires enormous financial backing for computing power alone: building a generative AI system demands an investment of at least $250 million in server hardware. "The computational costs have reached astronomical amounts," he said.

Today, the H100 is increasingly popular both with large technology companies such as Microsoft and Amazon and with generative AI startups such as OpenAI, Anthropic, Stability AI, and Inflection AI. The former are building entire data centers around AI workloads; the latter are drawn by the chip's promise of higher performance, which can speed up product launches or reduce training costs over time.

Meanwhile, Ian Buck, head of Nvidia's hyperscale and high-performance computing business, faces the difficult task of increasing H100 supply to meet enormous market demand. "The market demand is very strong, and some big customers are looking to buy tens of thousands of GPUs," he said.

The H100 is a large "accelerator" chip designed specifically for data centers. It packs 80 billion transistors, five times as many as the processor in the latest iPhone. Although it costs twice as much as its predecessor, the A100 released in 2020, early adopters say the H100 delivers at least three times the performance.

Stability AI co-founder and CEO Emad Mostaque said, "The H100 solves the scalability problem that has long plagued AI model builders. That matters because it lets all of us train larger models faster as we move from research to engineering."

Although the H100's timing was ideal, Nvidia's breakthrough in AI traces back to two decades of software innovation, not just hardware. In 2006, Nvidia created CUDA, software that allows GPUs to be repurposed for workloads other than graphics. Then, around 2012, in Buck's words, "AI discovered us."

Canadian researchers were among the first to realize that GPUs are ideal for building neural networks, a form of AI inspired by the way neurons interact in the human brain. Neural networks subsequently became the new focus of AI development. "It took us almost 20 years to get to where we are today," Buck said.

Today, Nvidia has more software engineers than hardware engineers. That balance enabled it to support the many different AI frameworks that emerged in the years that followed, and to make its chips more efficient at the statistical computation needed to train AI models.

Hopper is the first architecture optimized for "transformers," the AI approach underlying OpenAI's "generative pre-trained transformer" chatbot. Nvidia's close collaboration with AI researchers allowed it to spot the emergence of the transformer in 2017 and begin tuning its software accordingly.

Nathan Benaich, partner at AI startup investment firm Air Street Capital, said, "It can be said that Nvidia saw the future earlier than everyone else, and its strategy was to make GPUs programmable. Nvidia spotted the business opportunity, placed big bets, and stayed ahead of the competition."

Benaich expects Nvidia to stay about two years ahead of its rivals, but he adds, "In terms of hardware and software, Nvidia's position is not impregnable."

Mostaque agreed. "Next-generation chips from Google, Intel (INTC.US), and others are catching up," he said. "And as software becomes standardized, Nvidia's CUDA isn't unbreakable."

To some in the AI industry, Wall Street's recent enthusiasm, which has driven Nvidia's stock surge, looks overly optimistic. Even so, said Jay Goldberg, founder of chip consulting firm D2D Consulting, "For now at least, it seems Nvidia will take it all in the AI market's 'semifinals.'"

Editor/jayden


