
Kunlun Tech Open-Sources Skywork-MoE, a 200-Billion-Parameter Sparse Model and the World's First to Run Inference on RTX 4090 GPUs

Zhitong Finance ·  Jun 3 17:45
According to Kunlun Tech's official WeChat account, on June 3, 2024, the company announced the open-sourcing of Skywork-MoE, a powerful 200-billion-parameter sparse large model with reduced inference costs. Skywork-MoE fully applies MoE Upcycling technology and was trained by expanding checkpoints of Kunlun Tech's previously open-sourced Skywork-13B model. It is also the first open-source hundred-billion-parameter-scale MoE model that can run inference on a single server equipped with RTX 4090 GPUs.


