AdinaY posted an update 25 days ago
At the close of the National Holiday 🇨🇳, Ant Group drops a new SoTA model.

Ling-1T 🔥 the trillion-parameter flagship of the Ling 2.0 series.

inclusionAI/Ling-1T

✨1T total / 50B active params per token
✨20T+ reasoning-dense tokens (Evo-CoT)
✨128K context via YaRN
✨FP8 training: 15%+ faster, same precision as BF16
✨Hybrid Syntax-Function-Aesthetics reward for front-end & visual generation
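A minimal sketch (not from the post) to peek at these settings straight from the released config.json on the Hub. The field names below are assumptions based on common MoE configs and may differ from Ling-1T's actual config:

```python
# Fetch only the config file and print the MoE / context-length fields, if present.
# Field names ("num_experts", "num_experts_per_tok", ...) are assumptions.
import json
from huggingface_hub import hf_hub_download

config_path = hf_hub_download("inclusionAI/Ling-1T", "config.json")
with open(config_path) as f:
    cfg = json.load(f)

for key in ("max_position_embeddings", "rope_scaling",
            "num_experts", "num_experts_per_tok"):
    print(key, "->", cfg.get(key))
```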

50B active params per token isn't very efficient... Wonder if we could make this 4: https://huggingface.co/inclusionAI/Ling-1T/blob/main/config.json#L22
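A rough sketch of that idea, purely illustrative: override the routed-experts-per-token in the config before instantiating the model. The field name `num_experts_per_tok` is an assumption (it's whatever sits at config.json#L22 in the repo), and quality would almost certainly degrade since the router was trained with the original top-k:

```python
# Hypothetical override of the number of routed experts per token.
# "num_experts_per_tok" is an assumed field name, not confirmed from the repo.
from transformers import AutoConfig, AutoModelForCausalLM

cfg = AutoConfig.from_pretrained("inclusionAI/Ling-1T", trust_remote_code=True)
if hasattr(cfg, "num_experts_per_tok"):
    cfg.num_experts_per_tok = 4  # down from whatever the release uses

model = AutoModelForCausalLM.from_pretrained(
    "inclusionAI/Ling-1T", config=cfg, trust_remote_code=True
)
```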