Update README.md
README.md CHANGED
@@ -10,6 +10,9 @@ tags:
 
 **Virtuoso-Medium-v2 (32B)** is our next-generation, 32-billion-parameter language model that builds upon the original Virtuoso-Medium architecture. This version is distilled from Deepseek-v3, leveraging an expanded dataset of 5B+ tokens worth of logits. It achieves higher benchmark scores than our previous release (including surpassing Arcee-Nova 2024 in certain tasks).
 
+### GGUF
+Quantizations available [here](https://huggingface.co/arcee-ai/Virtuoso-Lite-GGUF)
+
 ### Model Details
 - **Architecture Base:** Qwen-2.5-32B
 - **Parameter Count:** 32B
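
Below is a minimal sketch (not part of the commit) of how the GGUF quantizations linked in the added section might be downloaded and run locally with `huggingface_hub` and `llama-cpp-python`. The specific `.gguf` filename, context size, and prompt are assumptions, since the README only links the repo; list the repo files first and substitute the quantization you actually want.

```python
# Hypothetical usage sketch: pull one of the GGUF quantizations from the repo
# referenced in the README and run it with llama-cpp-python.
from huggingface_hub import hf_hub_download, list_repo_files
from llama_cpp import Llama

repo_id = "arcee-ai/Virtuoso-Lite-GGUF"  # repo linked in the README section above

# The README does not state the quantization filenames, so inspect the repo
# listing and pick one (the first match is used here as a placeholder).
gguf_files = sorted(f for f in list_repo_files(repo_id) if f.endswith(".gguf"))
print(gguf_files)

# Download the chosen quantization to the local Hugging Face cache.
model_path = hf_hub_download(repo_id=repo_id, filename=gguf_files[0])

# Load the model; n_gpu_layers=-1 offloads all layers to GPU when one is available.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

out = llm("Summarize what knowledge distillation is in one sentence.", max_tokens=96)
print(out["choices"][0]["text"])
```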