
Jensen Huang: The Future of AI Lies in "Reasoning," and a Sharp Drop in Chip Costs Is the Key

cls.cn ·  Oct 9 22:57

In his latest remarks, Jensen Huang said that the future of artificial intelligence will be services capable of "reasoning," but that getting there first requires driving down the cost of computing. He noted that Nvidia plans to boost chip performance two to three times a year at the same cost and power consumption, laying the groundwork for that shift.

Cailian Press news, October 9 (editor Zhao Hao): Nvidia CEO Jensen Huang said recently that the future of artificial intelligence (AI) will be services capable of "reasoning," but that reaching this stage first requires lowering the cost of computing.

On Wednesday (October 9), chip designer Arm posted on its official website audio of a conversation between Arm CEO Rene Haas and Jensen Huang, covering Huang's entrepreneurial journey, the future of AI, and Nvidia's distinctive corporate culture.

Source: Arm Official Website

Before joining Arm in 2013, Haas spent seven years at Nvidia, rising to vice president and general manager of its computing products business, which made him a long-time colleague of Huang's. It is also worth noting that Nvidia once attempted to acquire Arm.

Huang told Haas on the podcast that next-generation tools will be able to respond to human questions by working through hundreds or thousands of steps and reflecting on their own conclusions.

Huang said this will give future software the ability to reason, which is fundamentally different from OpenAI's chatbot ChatGPT, the tool he uses every day.

He said Nvidia plans to boost chip performance two to three times a year while holding cost and power consumption steady, laying the groundwork for these advances.

That would change the way AI systems handle inference: identifying patterns or trends in data and drawing reasonable conclusions from them.

"We are able to drive significant cost reductions in ai, and we all recognize the value of this. If we can significantly reduce costs, we can do some reasoning in inference."

Data show that Nvidia has already captured more than 90% of the current AI chip market. Last week, Huang said external demand for the company's next-generation AI chip, Blackwell, was "insane."

Back in March this year, Huang said Blackwell would be priced between $30,000 and $40,000. However, Nvidia will not sell the GPU on its own, preferring to offer it as part of a package that includes networking equipment, software, and services.

Buoyed by these factors, Nvidia's stock has risen more than 10% this month to $133 at the time of writing, approaching its record high of around $140. Its market capitalization has also overtaken Microsoft's, making it the world's second most valuable listed company after Apple.

Since the start of the year, Nvidia's stock has surged roughly 170%, pushing Huang close to the top ten of the global rich list.

Nvidia Stock Price Daily Chart

Still, Nvidia faces the threat of weakening control over the market: some major customers are developing in-house alternatives, and its main rival AMD (Advanced Micro Devices) plans to unveil its latest AI products at an event on Thursday.


