
Tech giants keep launching “de-Nvidia” AI chips! This time it's Meta, with another MTIA update

wallstreetcn ·  Apr 11 07:08

Source: Hard AI · Author: Zhao Yuhe


$Meta Platforms (META.US)$ announced on Wednesday that it is deploying a self-developed chip to help support its artificial intelligence services. Analysts believe the move is intended to reduce its dependence on chips from $NVIDIA (NVDA.US)$ and other external suppliers.

Meta released its first Training and Inference Accelerator (MTIA) product last year. Reportedly, the self-developed chip announced on Wednesday is the latest version of MTIA and helps rank and recommend content on Facebook and Instagram.

Meta's push into AI services has increased its demand for computing power. Last year, Meta released its own AI model to compete with OpenAI's ChatGPT, and it is adding new generative AI features to its social apps, including custom stickers and chatbot characters with celebrity likenesses.

Last October, Meta said it would invest up to $35 billion to support AI, including data centers and hardware. At the time, Meta CEO Zuckerberg said, “AI will be our biggest investment area in 2024.”

Reportedly, a significant portion of this spending may still go to Nvidia for the currently popular H100 GPUs that power AI models. Earlier this year, Zuckerberg said the company would buy 350,000 such chips, each costing tens of thousands of dollars.

However, more and more tech giants are starting to develop their own chips. Meta now joins $Amazon (AMZN.US)$'s AWS, $Microsoft (MSFT.US)$, and Google in trying to shed dependence on high-cost AI chips. Analysts believe, however, that this process will not be completed quickly: the AI industry is still hungry for Nvidia's AI accelerators, and so far the tech giants' efforts have barely made a dent.

Currently, the AI boom has helped Nvidia become the world's third-largest technology company by market capitalization, after Microsoft and Apple. Its data center revenue reached $47.5 billion in fiscal year 2024, a significant increase from the previous year's $15 billion, and analysts predict that figure will double again in fiscal year 2025.

Meta shares closed up 0.57% at $519.83 on Wednesday.

Nvidia shares closed up 1.76% at $870.39 on Wednesday.

Judging from comments online, the new chip Meta announced was well received. Some commenters said that now everyone is starting to make their own chips, calling it the right direction and urging Meta to speed up and not stop.

Some said that if Meta does this well, the more competitive pressure Nvidia feels, the faster humanity will reach artificial general intelligence.

Others said that humanity has reached a “technological singularity” (meaning technological development makes nearly unbounded progress in a very short period), with major advances arriving every day; just keeping up with AI will soon become a full-time job.

However, some commenters analyzed the chip from a technical perspective and found it plain and unremarkable. One said that when you don't care about messy general-purpose computing workloads and only want to perform 8-bit integer operations, you can strip away a great deal: the chip is just an overgrown SIMD core with a rudimentary cache system. Everything then depends on the compiler, and if they can't build on LLVM, they will have a genuinely hard time.
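The kind of workload that comment describes can be sketched as an 8-bit integer dot product accumulated at wider precision, which is the basic operation such inference accelerators repeat across many SIMD lanes. This is a toy illustration of the idea, not Meta's actual instruction set or kernel:

```python
def int8_dot(a, b):
    """Toy int8 dot product: multiply 8-bit inputs, accumulate at wider width.

    Mirrors what one SIMD lane of an inference accelerator does in hardware,
    purely as an illustration; MTIA's real design is not public at this level.
    """
    acc = 0  # hardware would typically use a 32-bit accumulator per lane
    for x, y in zip(a, b):
        # int8 operands must lie in [-128, 127]
        assert -128 <= x <= 127 and -128 <= y <= 127, "inputs must fit in int8"
        acc += x * y
    return acc

print(int8_dot([1, -2, 3], [4, 5, -6]))  # 1*4 + (-2)*5 + 3*(-6) = -24
```

Because the datapath only needs narrow multipliers and a wide adder, most of a general-purpose core's complexity can indeed be dropped, which is the commenter's point about everything then hinging on the compiler.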

One commenter noted that the new MTIA has a small die area and a thermal design power (TDP) of 90 W (typical training accelerators run 350-500 W, and multi-chip modules (MCMs) run 700-1,000 W), while delivering only about one-third of the H100's floating-point throughput (TFLOPs). On balance that may be a win, resembling Google's strategy of scaling out with many small TPUs. Still, this is a great time for chips!
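The comment's back-of-envelope reasoning can be made explicit. Taking the figures it quotes (90 W TDP, one-third of H100 throughput) and assuming an H100 SXM board power of roughly 700 W (an assumption for illustration, not from the article):

```python
# Back-of-envelope performance-per-watt comparison using the figures
# quoted in the comment above. H100_TDP_W is an assumed value.
H100_TDP_W = 700.0       # assumed H100 SXM board power
MTIA_TDP_W = 90.0        # new MTIA TDP, per the comment
FLOPS_RATIO = 1.0 / 3.0  # MTIA throughput relative to H100, per the comment

h100_perf_per_watt = 1.0 / H100_TDP_W           # normalize H100 throughput to 1
mtia_perf_per_watt = FLOPS_RATIO / MTIA_TDP_W

advantage = mtia_perf_per_watt / h100_perf_per_watt
print(f"MTIA perf/W advantage: {advantage:.2f}x")  # ~2.59x under these assumptions
```

Under these assumed numbers the chip comes out roughly 2.6x ahead on throughput per watt, which is why a "many small, cool chips" scale-out strategy can beat fewer large ones for a fixed power budget.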

Others asked: if you can't even buy this chip, what's the point of worrying about it?

From a capital-markets perspective, some commenters said that in any case, this is good news for $Taiwan Semiconductor (TSM.US)$...

edit/lambor


