Update README.md
README.md CHANGED
@@ -53,7 +53,7 @@ This model retains the efficient MoE architecture of Ling 2.0, completed pre-tra
 
 __Ring-1T__ remains under continuous training.
 While the preview version already demonstrates powerful natural language reasoning capabilities, it still exhibits issues such as language mixing, repetitive reasoning and identity misperception.
-
+__We look forward to community exploration and feedback to collectively accelerate the iterative refinement of this trillion-parameter foundation.__
 
 
 