Update README.md
**THIS MODEL IS CURRENTLY IN PREVIEW. IT HAS ONLY UNDERGONE 10% OF ITS INSTRUCTION TRAINING.**
GRaPE Mini is a 1.5 billion parameter, dense, instruction-tuned language model designed for high-quality reasoning, robust coding, and agentic capabilities. It is built upon the powerful **Qwen2.5** architecture and has been meticulously fine-tuned on a specialized blend of datasets to achieve a unique balance of helpfulness and controllable alignment.
Along with GRaPE Mini, a 7B MoE (Mixture of Experts) model based on OlMoE will be made once benchmark and safety tests for GRaPE Mini (beta) have concluded. In the meantime, enjoy this model!