
Official AQLM quantization of `meta-llama/Llama-2-13b-hf`.

For this quantization, we used 2 codebooks of 8 bits each.

Selected evaluation results for this and other models:

| Model       | Quantization | WikiText-2 PPL | Model size, GB |
|-------------|--------------|----------------|----------------|
| Llama-2-13b | -            | 4.57           | 26.0           |
| Llama-2-13b | 2x8          | 5.63           | 3.8            |
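The "2x8" scheme above can be read as 2 codebooks of 8 bits per group of weights. As a rough sanity check on the reported size, here is a minimal sketch, assuming (this is an assumption, not stated in the card) that AQLM encodes groups of 8 weights per code:

```python
# Estimate effective bits per weight for the 2x8 AQLM scheme.
num_codebooks = 2
codebook_bits = 8
group_size = 8  # weights covered by one code group (assumed, not from the card)

bits_per_weight = num_codebooks * codebook_bits / group_size
print(bits_per_weight)  # 2.0

# Rough compressed size of a 13B-parameter model, counting codes only.
params = 13e9
size_gb = params * bits_per_weight / 8 / 1e9
print(round(size_gb, 2))  # 3.25
```

The ~3.25 GB estimate covers only the codes; codebooks, embeddings, and unquantized layers account for the gap to the 3.8 GB reported above.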