At the close of the National Holiday🇨🇳, Ant Group drops a new SoTA model.
Ling-1T 🔥 the trillion-parameter flagship of the Ling 2.0 series.
inclusionAI/Ling-1T
✨1T total / 50B active params per token
✨20T+ reasoning-dense tokens (Evo-CoT)
✨128K context via YaRN
✨FP8 training: 15%+ faster, same precision as BF16
✨Hybrid Syntax-Function-Aesthetics reward for front-end & visual generation
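On the "128K context via YaRN" bullet: a minimal sketch of the YaRN (NTK-by-parts) idea of scaling RoPE frequencies, where only low-frequency components are interpolated, high-frequency ones are kept, and the region in between is blended. The parameters here (`orig_len=4096`, `alpha`, `beta`, head dim) are illustrative assumptions, not Ling-1T's actual configuration.

```python
import math

def rope_inv_freq(dim, base=10000.0):
    """Standard RoPE inverse frequencies for a head of dimension `dim`."""
    return [base ** (-2.0 * i / dim) for i in range(dim // 2)]

def yarn_inv_freq(dim, base=10000.0, orig_len=4096, scale=32.0,
                  alpha=1.0, beta=32.0):
    """YaRN-style NTK-by-parts scaling (sketch, not Ling-1T's code):
    interpolate low-frequency components by `scale`, keep high-frequency
    ones unchanged, and blend smoothly in between. `alpha`/`beta` bound
    the blend region, measured in full rotations over `orig_len`."""
    out = []
    for f in rope_inv_freq(dim, base):
        wavelen = 2.0 * math.pi / f
        rotations = orig_len / wavelen  # rotations within original context
        # gamma = 1 -> keep original frequency; gamma = 0 -> divide by scale
        gamma = min(1.0, max(0.0, (rotations - alpha) / (beta - alpha)))
        out.append(gamma * f + (1.0 - gamma) * f / scale)
    return out

scaled = yarn_inv_freq(128, orig_len=4096, scale=32.0)
```

With `scale=32`, a 4K original window extrapolates to roughly 128K positions: the fastest-rotating dimensions are untouched (they already complete many rotations within the original window), while the slowest are stretched by the full factor.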