On the afternoon of November 27, at a media event co-hosted by Orion Star and JunCloud Technology, Orion Star announced Orion-MoE8×7B, its self-developed open-source Mixture-of-Experts (MoE) large model, and, in partnership with JunCloud Technology, introduced AirDS (AI-Ready Data Service), an AI data service built on the model. According to the announcement, Orion-MoE8×7B has 8×7 billion parameters, adopts a generative Mixture-of-Experts architecture, and supports multiple languages including Chinese, English, Japanese, and Korean. On mainstream public benchmarks, Orion-MoE8×7B outperforms base large models of the same parameter scale across evaluation metrics, and it also surpasses dense models of the same parameter scale in inference speed. (Sina Technology)
Orion Star officially announces the release of its MoE large model and launches the AirDS data service.