Quantized model for vLLM
- tool: AutoAWQ (4-bit)
- calibration data: Japanese Wikipedia
For details, see the base model card:
https://huggingface.co/nitky/RoguePlanet-DeepSeek-R1-Qwen-32B
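A rough sketch of how a quantization like this can be produced with AutoAWQ. The hyperparameters, output path, and calibration sampling below are illustrative assumptions, not settings taken from this card:

```python
# Sketch of 4-bit AWQ quantization with AutoAWQ.
# All hyperparameters and paths are illustrative assumptions.
quant_config = {
    "zero_point": True,   # common AutoAWQ defaults
    "q_group_size": 128,
    "w_bit": 4,           # 4-bit weights, as stated on this card
    "version": "GEMM",
}

def quantize(base_model: str, calib_texts, out_dir: str):
    # Imports kept inside the function so the config above can be
    # inspected without AutoAWQ installed.
    from awq import AutoAWQForCausalLM
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(base_model)
    model = AutoAWQForCausalLM.from_pretrained(base_model)
    # calib_texts: e.g. a list of passages sampled from Japanese Wikipedia
    model.quantize(tokenizer, quant_config=quant_config, calib_data=calib_texts)
    model.save_quantized(out_dir)
    tokenizer.save_pretrained(out_dir)
```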
Model: fujisan/RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki
Base model: nitky/RoguePlanet-DeepSeek-R1-Qwen-32B
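A minimal sketch of serving this checkpoint with vLLM. vLLM typically detects the AWQ format from the checkpoint config, but it can also be set explicitly; the context-length flag here is an example, not a recommendation from this card:

```shell
# Serve the AWQ checkpoint with vLLM (flags are illustrative)
vllm serve fujisan/RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki \
    --quantization awq \
    --max-model-len 8192
```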