MN-12B-Azure-Veil


This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the Passthrough merge method.
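The passthrough method copies the selected layer ranges from each donor model verbatim and stacks them in order; no weights are averaged or interpolated. A toy sketch (plain Python, not mergekit internals; model names abbreviated) of how the slices in the configuration below compose:

```python
# Toy illustration of a passthrough merge: layer ranges from each donor
# model are concatenated in order to form the merged model's layer stack.

def passthrough(slices):
    """Concatenate the given layer ranges from donor models, in order."""
    merged = []
    for model, (start, end) in slices:
        merged.extend(f"{model}:layer{i}" for i in range(start, end))
    return merged

# Mirrors the slices in this model's YAML configuration.
merged = passthrough([
    ("magnum-v4-12b", (0, 15)),
    ("Impish_Nemo_12B", (15, 20)),
    ("MN-Slush", (20, 32)),
    ("Moonlit-Shadow-12B", (32, 40)),
])
print(len(merged))  # 15 + 5 + 12 + 8 = 40 layers in the final model
```

Note that the final stack has 40 layers, matching the upper bound of the last slice's layer_range.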

Models Merged

The following models were included in the merge:

- anthracite-org/magnum-v4-12b
- SicariusSicariiStuff/Impish_Nemo_12B
- crestf411/MN-Slush
- Vortex5/Moonlit-Shadow-12B

Configuration

The following YAML configuration was used to produce this model:


slices:
- sources:
  - model: anthracite-org/magnum-v4-12b
    layer_range: [0, 15]

- sources:
  - model: SicariusSicariiStuff/Impish_Nemo_12B
    layer_range: [15, 20]
    
- sources:
  - model: crestf411/MN-Slush
    layer_range: [20, 32]

- sources:
  - model: Vortex5/Moonlit-Shadow-12B
    layer_range: [32, 40]
merge_method: passthrough
dtype: bfloat16
tokenizer:
  source: anthracite-org/magnum-v4-12b
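Assuming the YAML above is saved as config.yaml, the merge should be reproducible with mergekit's command-line entry point (the output path is illustrative; the listed donor models are downloaded from the Hugging Face Hub, so substantial disk space is required):

```shell
# Install mergekit, then run the passthrough merge described above.
pip install mergekit
mergekit-yaml config.yaml ./MN-12B-Azure-Veil
```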
Model size: 12B parameters (BF16, Safetensors)