Fintech News App has learned that Amazon (AMZN.US) is expanding its artificial intelligence product lineup, launching powerful new chip arrays and large language models that it claims can compete with those of its biggest rivals.
The Seattle-based company is assembling hundreds of thousands of its Trainium2 chips into clusters so that its partner Anthropic can more easily train the large language models needed for generative artificial intelligence and other machine learning tasks. Amazon said the new arrays will give the startup five times its current processing power.
At its annual re:Invent conference, Amazon announced that its cloud computing division AWS began offering the new chips to customers on Tuesday.
In addition, Amazon CEO Andy Jassy introduced a new artificial intelligence model called Nova that can generate text, images, and video, the company's latest effort to compete with OpenAI and other builders of the large language models that power chatbots and other generative artificial intelligence tools.
AWS is the largest seller of rented computing power, operating vast numbers of servers that other companies rent to train artificial intelligence applications. AWS also offers customers models built by other companies, including Anthropic's Claude and Meta (META.US)'s Llama. However, the company has not yet produced a large language model widely seen as competitive with OpenAI's most advanced GPT models.
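For context, a minimal sketch of how an AWS customer might call one of these hosted third-party models through Amazon Bedrock's runtime API is shown below. The model ID and request body format are illustrative assumptions (Anthropic's messages format on Bedrock); the available models and schemas vary by region and account, so this is not an official Amazon example.

import json

import boto3  # AWS SDK for Python

# Bedrock's runtime client handles model invocation.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body follows Anthropic's messages format on Bedrock (assumed here).
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize AWS re:Invent in one sentence."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    body=json.dumps(body),
)

# The response body is a streaming blob containing JSON.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])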
Amazon has released several generations of models under the Titan name over the past two years, but their capabilities have been limited. Some of the Nova models are available now, and others will be released next year, including a "multimodal-to-multimodal" version that can take text, speech, images, and video as input and generate responses in any of those modes.
Jassy said Amazon will continue to develop its own models while also offering models built by other companies. "We will provide you with the broadest and best features you can find anywhere," he said.
Amazon announced last month that it will invest an additional $4 billion in Anthropic. As part of the deal, Anthropic said it will use Amazon's cloud and chips to develop its most advanced models.
Gadi Hutt, who works with customers at Amazon's Annapurna Labs chip unit, said in an interview that the new chip cluster, known as Project Rainier, will contain "significantly more than" 100,000 chips. Amazon said the cluster is expected to be the largest collection of dedicated artificial intelligence computing hardware in the world.
Amazon hopes its third-generation artificial intelligence chip can compete with Nvidia's (NVDA.US) products, giving AWS customers an alternative for building generative artificial intelligence products. For most companies, Nvidia's GPUs are the default hardware for such tasks, but they are expensive and often in short supply.
Amazon said that starting early next year, it will offer customers computing capacity powered by Nvidia's new Blackwell chips.