---
base_model:
  - TareksLab/Fifth-Dimension-V1-LLaMa-70B
  - TareksLab/Zero-Dimension-F-LLaMa-70B
  - TareksLab/Second-Dimension-V1-LLaMa-70B
  - TareksLab/First-Dimension-V1-LLaMa-70B
  - TareksLab/Sixth-Dimension-V1-LLaMa-70B
  - TareksLab/Third-Dimension-V1-LLaMa-70B
library_name: transformers
tags:
  - mergekit
  - merge
---

# MERGE1

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method (`della_linear`), with [TareksLab/Second-Dimension-V1-LLaMa-70B](https://huggingface.co/TareksLab/Second-Dimension-V1-LLaMa-70B) as the base.
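For intuition (a hedged reading of the DELLA paper and the mergekit docs, not this repo's code): each non-base model contributes a delta from the base; DELLA stochastically drops low-magnitude delta entries, with `density` setting the expected kept fraction and `epsilon` the magnitude-dependent spread of drop probabilities, rescales the survivors, then combines the deltas linearly using the per-model `weight` values and scales the sum by `lambda`. A toy NumPy sketch of that idea:

```python
# Toy sketch of the della_linear idea (a simplification for illustration,
# not mergekit's actual implementation).
import numpy as np

rng = np.random.default_rng(0)

def della_prune(delta, density=0.7, epsilon=0.2):
    """Magnitude-adaptive stochastic pruning of a flattened task vector."""
    n = delta.size
    # Rank entries by |delta|: rank 0 = smallest magnitude.
    ranks = np.argsort(np.argsort(np.abs(delta)))
    base_drop = 1.0 - density
    # Drop probability varies around (1 - density): higher-magnitude
    # entries get lower drop probability.
    drop_p = base_drop + epsilon * (0.5 - ranks / max(n - 1, 1))
    drop_p = np.clip(drop_p, 0.0, 1.0)
    keep = rng.random(n) >= drop_p
    # Rescale survivors so the expected delta is preserved.
    return np.where(keep, delta / np.maximum(1.0 - drop_p, 1e-8), 0.0)

def della_linear(base, deltas, weights, lam=1.1, density=0.7, epsilon=0.2):
    """Weighted linear combination of pruned deltas, scaled by lambda."""
    merged = sum(w * della_prune(d, density, epsilon)
                 for w, d in zip(weights, deltas))
    return base + lam * merged

# Example with random toy tensors and the first weight column from the config.
base = rng.normal(size=1000)
deltas = [rng.normal(size=1000) * 0.1 for _ in range(5)]
merged = della_linear(base, deltas, weights=[0.5, 0.2, 0.1, 0.1, 0.1])
```

Note that the `weight` entries in the configuration below are lists; in mergekit, list-valued parameters are interpolated as a gradient across the model's layers, so each model's influence shifts with depth.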

### Models Merged

The following models were included in the merge:

* [TareksLab/First-Dimension-V1-LLaMa-70B](https://huggingface.co/TareksLab/First-Dimension-V1-LLaMa-70B)
* [TareksLab/Third-Dimension-V1-LLaMa-70B](https://huggingface.co/TareksLab/Third-Dimension-V1-LLaMa-70B)
* [TareksLab/Zero-Dimension-F-LLaMa-70B](https://huggingface.co/TareksLab/Zero-Dimension-F-LLaMa-70B)
* [TareksLab/Fifth-Dimension-V1-LLaMa-70B](https://huggingface.co/TareksLab/Fifth-Dimension-V1-LLaMa-70B)
* [TareksLab/Sixth-Dimension-V1-LLaMa-70B](https://huggingface.co/TareksLab/Sixth-Dimension-V1-LLaMa-70B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: TareksLab/First-Dimension-V1-LLaMa-70B
    parameters:
      weight: [0.5, 0.2, 0.1, 0.1, 0.1]
      density: 0.7
      epsilon: 0.2
  - model: TareksLab/Third-Dimension-V1-LLaMa-70B
    parameters:
      weight: [0.2, 0.4, 0.2, 0.1, 0.1]
      density: 0.7
      epsilon: 0.2
  - model: TareksLab/Zero-Dimension-F-LLaMa-70B
    parameters:
      weight: [0.1, 0.2, 0.4, 0.2, 0.1]
      density: 0.7
      epsilon: 0.2
  - model: TareksLab/Fifth-Dimension-V1-LLaMa-70B
    parameters:
      weight: [0.1, 0.1, 0.2, 0.4, 0.2]
      density: 0.7
      epsilon: 0.2
  - model: TareksLab/Sixth-Dimension-V1-LLaMa-70B
    parameters:
      weight: [0.1, 0.1, 0.1, 0.2, 0.5]
      density: 0.7
      epsilon: 0.2
merge_method: della_linear
base_model: TareksLab/Second-Dimension-V1-LLaMa-70B
parameters:
  lambda: 1.1
  normalize: false
dtype: float32
out_dtype: bfloat16
chat_template: llama3
tokenizer:
  source: TareksLab/Zero-Dimension-F-LLaMa-70B
  pad_to_multiple_of: 8
```
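The merge can be reproduced by saving the YAML above as `config.yaml` and running mergekit's `mergekit-yaml config.yaml ./output-model` entry point. For inference, a minimal sketch with transformers follows, assuming the merge is published as `TareksLab/MERGE1` (inferred from the title, not confirmed; substitute the actual repo id):

```python
# Minimal inference sketch. Assumptions: the merged weights live at
# "TareksLab/MERGE1" (inferred from the card title), and enough GPU memory
# (or an offloading setup) is available for a 70B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TareksLab/MERGE1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",
)

# chat_template: llama3 in the config means the tokenizer carries a
# Llama 3 chat template, so conversations can be formatted directly.
messages = [{"role": "user", "content": "Summarize DELLA merging in one line."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```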