befox/WAN2.2-14B-Rapid-AllInOne-GGUF (GGUF model repository, revision 22c1b8a)
Folder: WAN2.2-14B-Rapid-AllInOne-GGUF / Mega-v2
Size: 146 GB. Contributors: 2. History: 12 commits.
Latest commit: b973da3 (verified, about 2 months ago) by befox: "Upload Mega-v2/wan2.2-rapid-mega-aio-nsfw-v2-Q8_0.gguf with huggingface_hub"
File                                       Size     Last updated
wan2.2-rapid-mega-aio-nsfw-v2-Q2_K.gguf    6.7 GB   about 2 months ago
wan2.2-rapid-mega-aio-nsfw-v2-Q3_K.gguf    8.62 GB  about 2 months ago
wan2.2-rapid-mega-aio-nsfw-v2-Q4_K.gguf    11.6 GB  about 2 months ago
wan2.2-rapid-mega-aio-nsfw-v2-Q5_K.gguf    13 GB    about 2 months ago
wan2.2-rapid-mega-aio-nsfw-v2-Q6_K.gguf    14.5 GB  about 2 months ago
wan2.2-rapid-mega-aio-nsfw-v2-Q8_0.gguf    18.7 GB  about 2 months ago
wan2.2-rapid-mega-aio-v2-Q2_K.gguf         6.7 GB   about 2 months ago
wan2.2-rapid-mega-aio-v2-Q3_K.gguf         8.62 GB  about 2 months ago
wan2.2-rapid-mega-aio-v2-Q4_K.gguf         11.6 GB  about 2 months ago
wan2.2-rapid-mega-aio-v2-Q5_K.gguf         13 GB    about 2 months ago
wan2.2-rapid-mega-aio-v2-Q6_K.gguf         14.5 GB  about 2 months ago
wan2.2-rapid-mega-aio-v2-Q8_0.gguf         18.7 GB  about 2 months ago

Each file was committed as "Upload Mega-v2/<filename> with huggingface_hub"; all files use Xet-backed storage.
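The commit messages show these files were uploaded with the huggingface_hub Python library; the same library can fetch them. A minimal download sketch, assuming huggingface_hub is installed (`hf_hub_download` is the real API; the helper function names and the filename pattern derived from the listing above are my own):

```python
# Sketch: fetching one Mega-v2 quantization from this repository.
REPO_ID = "befox/WAN2.2-14B-Rapid-AllInOne-GGUF"
REVISION = "22c1b8a"  # revision shown in the listing above


def mega_v2_filename(quant: str, nsfw: bool = False) -> str:
    """Build the in-repo path for a Mega-v2 file, e.g. quant='Q4_K' or 'Q8_0'.

    Pattern inferred from the file table: wan2.2-rapid-mega-aio[-nsfw]-v2-<quant>.gguf
    """
    tag = "-nsfw" if nsfw else ""
    return f"Mega-v2/wan2.2-rapid-mega-aio{tag}-v2-{quant}.gguf"


def download_mega_v2(quant: str, nsfw: bool = False) -> str:
    # Imported lazily so the filename helper works even without the package.
    from huggingface_hub import hf_hub_download

    return hf_hub_download(
        repo_id=REPO_ID,
        filename=mega_v2_filename(quant, nsfw),
        revision=REVISION,
    )


if __name__ == "__main__":
    # Note: this downloads ~11.6 GB; files are cached under ~/.cache/huggingface.
    print(download_mega_v2("Q4_K"))
```

Pinning `revision` to the commit shown above keeps the download reproducible even if the repository is updated later.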