Update README.md
README.md CHANGED
```diff
@@ -28,7 +28,7 @@ At its core, OpenModel-1T leverages an **Evolutionary Chain-of-Thought (Evo-CoT)
 
 ## ⚙️ Key Features
 
-* 🧩 **1T Total
+* 🧩 **1T Total | 50B Active MoE Design:** Trillion-parameter scale with sparse activation for exceptional throughput efficiency.
 * 🧠 **Evo-CoT Training:** Evolutionary chain-of-thought reinforcement — model learns to reason *about* its own reasoning.
 * 📚 **20T+ Token Corpus:** Pre-trained on a curated, reasoning-dense dataset spanning code, math, science, multilingual text, and human reasoning.
 * ⏱️ **128K Context Window:** Long-context comprehension for entire projects, books, or datasets.
@@ -104,7 +104,7 @@ The model does **not** produce or endorse harmful, biased, or illegal content.
 | **Total Parameters** | 1 Trillion |
 | **Active Parameters** | 50 Billion |
 | **Architecture** | Transformer-MoE with Evo-CoT |
-| **Training Tokens** | 20 Trillion
+| **Training Tokens** | 20+ Trillion |
 | **Context Length** | 128K |
 | **Precision** | FP8 / BF16 hybrid |
 | **License** | Apache-2.0 with AI-Responsible Use Addendum |
```
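The spec table's total and active parameter counts imply the sparsity that the "1T Total | 50B Active MoE Design" bullet refers to. A quick sanity check of that ratio (figures taken from the spec table above, not from any published model config):

```python
# Per-token activation ratio implied by the spec table:
# 1 trillion total parameters, 50 billion active per token.
total_params = 1_000_000_000_000
active_params = 50_000_000_000

ratio = active_params / total_params
print(f"Active fraction per token: {ratio:.0%}")  # → 5%
```

So only about 5% of the weights participate in any single forward pass, which is where the claimed throughput efficiency comes from.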