Xiaohongshu makes a major move! Its all-new open-source large model dots.llm1 makes a stunning debut with 142 billion parameters!

Recently, Xiaohongshu's hi lab team officially released its first open-source text large model, dots.llm1. The model has drawn wide attention in the industry for its strong performance and large parameter count. dots.llm1 is a large-scale Mixture of Experts (MoE) language model with 142 billion total parameters, of which 14 billion are activated per token. Trained on 11.2 trillion tokens of high-quality data, it delivers performance comparable to Alibaba's Qwen2.5-72B.
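To illustrate why a 142-billion-parameter MoE model only activates about 14 billion parameters per token, here is a minimal, hypothetical sketch of a generic top-k routed MoE layer. It is not the dots.llm1 implementation; the dimensions, expert count, and top-k value are made-up illustration values.

```python
# Illustrative sketch of a generic top-k Mixture of Experts layer.
# NOT the dots.llm1 architecture; all sizes below are placeholder values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=1024, d_ff=4096, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                       # (batch, seq, n_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)          # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the top-k experts run for each token; the rest stay idle,
        # so compute scales with activated (not total) parameters.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e           # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Quick usage check
layer = MoELayer()
y = layer(torch.randn(2, 16, 1024))
print(y.shape)  # torch.Size([2, 16, 1024])
```

In this sketch, total parameters grow with the number of experts, but each token only passes through its top-k experts, which is the general mechanism that lets an MoE model keep inference cost close to a much smaller dense model.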
6/16/2025 9:48:52 AM