---
library_name: ExLlama2
pipeline_tag: text-generation
base_model: openai-community/gpt2
base_model_relation: quantized
---
# Exl2 quants for [gpt2](https://huggingface.co/openai-community/gpt2)
## Automatically quantized using the auto quant script from [hf-scripts](https://huggingface.co/anthonyg5005/hf-scripts)
### BPW (bits per weight):
[6.0](https://huggingface.co/Anthonyg5005/gpt2-exl2/tree/6.0bpw)\
[6.5](https://huggingface.co/Anthonyg5005/gpt2-exl2/tree/6.5bpw)\
[8.0](https://huggingface.co/Anthonyg5005/gpt2-exl2/tree/8.0bpw)\
[measurement.json](https://huggingface.co/Anthonyg5005/gpt2-exl2/blob/main/measurement.json)
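
Each BPW variant lives on its own branch, so the branch name has to be passed as the `revision` when fetching the files. Below is a minimal sketch using `huggingface_hub.snapshot_download`; the chosen branch and local directory are just example values.

```python
# Sketch: download one BPW branch of this repo with huggingface_hub.
# Assumes `pip install huggingface_hub`; local_dir is an arbitrary example path.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="Anthonyg5005/gpt2-exl2",
    revision="6.5bpw",            # branch name = desired bits per weight
    local_dir="gpt2-exl2-6.5bpw",
)
print(f"Model files downloaded to: {local_path}")
```

The downloaded folder can then be loaded with an exl2-capable backend such as ExLlamaV2.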