---
base_model:
- ConicCat/Mistral-Small-3.2-AntiRep-24B
- CrucibleLab/M3.2-24B-Loki-V1.3
- zerofata/MS3.2-PaintedFantasy-v2-24B
- Gryphe/Codex-24B-Small-3.2
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

# MS3.2-24B-Fiery-Lynx
Instruct template: Mistral V7
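
Below is a minimal loading sketch with transformers, assuming the tokenizer ships the Mistral V7 chat template; the repo id is a placeholder for wherever this merge is hosted.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/MS3.2-24B-Fiery-Lynx"  # placeholder; substitute the real Hub path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

# The Mistral V7 instruct format is applied via the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a creative roleplay partner."},
    {"role": "user", "content": "Describe the lynx watching the campfire."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```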
## Merge Details

### Merge Method
This model was merged using the Linear DELLA merge method, with ConicCat/Mistral-Small-3.2-AntiRep-24B as the base.
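
Roughly, DELLA stochastically prunes each donor model's delta (its difference from the base), with keep probabilities tied to parameter magnitude, rescales the survivors, and combines the pruned deltas linearly. The sketch below is illustrative only, not mergekit's actual implementation: here `density` is the average keep fraction, `epsilon` the spread of keep probabilities across magnitude ranks, and `lam` the final delta scale.

```python
import torch

def magprune(delta: torch.Tensor, density: float, epsilon: float) -> torch.Tensor:
    """Drop low-magnitude delta entries stochastically, rescaling survivors."""
    flat = delta.flatten()
    # Rank entries by magnitude: 0 for the smallest, 1 for the largest.
    ranks = flat.abs().argsort().argsort().float() / max(flat.numel() - 1, 1)
    # Keep probabilities span density +/- epsilon/2; larger magnitude -> more likely kept.
    keep_prob = (density - epsilon / 2 + ranks * epsilon).clamp(0.0, 1.0)
    mask = torch.bernoulli(keep_prob)
    # Divide survivors by their keep probability so the expected delta is unchanged.
    return (flat * mask / keep_prob.clamp_min(1e-8)).reshape(delta.shape)

def della_linear(base: torch.Tensor, finetuned: list[torch.Tensor],
                 weights: list[float], densities: list[float],
                 epsilons: list[float], lam: float = 0.9,
                 normalize: bool = True) -> torch.Tensor:
    """Linear combination of magnitude-pruned deltas, added back onto the base."""
    merged = torch.zeros_like(base)
    for ft, w, d, e in zip(finetuned, weights, densities, epsilons):
        merged += w * magprune(ft - base, d, e)
    if normalize:
        merged = merged / sum(weights)
    return base + lam * merged
```

In the configuration below, Loki-V1.3 carries twice the weight of the other two donors while keeping a slightly sparser delta (density 0.4 versus 0.5).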
### Models Merged

The following models were included in the merge:
- CrucibleLab/M3.2-24B-Loki-V1.3
- zerofata/MS3.2-PaintedFantasy-v2-24B
- Gryphe/Codex-24B-Small-3.2
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: zerofata/MS3.2-PaintedFantasy-v2-24B
    parameters:
      weight: 0.2
      density: 0.5
      epsilon: 0.4
  - model: Gryphe/Codex-24B-Small-3.2
    parameters:
      weight: 0.2
      density: 0.5
      epsilon: 0.4
  - model: CrucibleLab/M3.2-24B-Loki-V1.3
    parameters:
      weight: 0.4
      density: 0.4
      epsilon: 0.3
merge_method: della_linear
base_model: ConicCat/Mistral-Small-3.2-AntiRep-24B
parameters:
  lambda: 0.9
  normalize: true
dtype: bfloat16
tokenizer:
  source: union
```
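
To reproduce the merge, saving this config (e.g. as `config.yaml`) and running mergekit's CLI entry point, `mergekit-yaml config.yaml ./MS3.2-24B-Fiery-Lynx`, should regenerate the model, assuming enough memory to hold the 24B source checkpoints.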
