QuantStack/Wan2.1_I2V_14B_FusionX-GGUF
Image-to-Video · GGUF · English · quantized · License: apache-2.0
Wan2.1_I2V_14B_FusionX-GGUF · 31.3 GB · 2 contributors · History: 7 commits
Latest commit: 339e576 (verified) · YarvixPA · Rename Wan2.1_14B_I2V_FusionX_fp16-Q2_K.gguf to Wan2.1_14B_I2V_FusionX-Q2_K.gguf · 5 months ago
| File | Size | Last commit | Updated |
|------|------|-------------|---------|
| .gitattributes | 1.89 kB | Rename Wan2.1_14B_I2V_FusionX_fp16-Q2_K.gguf to Wan2.1_14B_I2V_FusionX-Q2_K.gguf | 5 months ago |
| README.md | 1.99 kB | Create README.md | 5 months ago |
| Wan2.1_14B_I2V_FusionX-Q2_K.gguf | 6.01 GB (xet) | Rename Wan2.1_14B_I2V_FusionX_fp16-Q2_K.gguf to Wan2.1_14B_I2V_FusionX-Q2_K.gguf | 5 months ago |
| Wan2.1_14B_I2V_FusionX_fp16-Q3_K_M.gguf | 8.09 GB (xet) | Upload Wan2.1_14B_I2V_FusionX_fp16-Q3_K_M.gguf | 5 months ago |
| Wan2.1_14B_I2V_FusionX_fp16-Q3_K_S.gguf | 7.43 GB (xet) | Upload Wan2.1_14B_I2V_FusionX_fp16-Q3_K_S.gguf | 5 months ago |
| Wan2.1_14B_I2V_FusionX_fp16-Q4_0.gguf | 9.76 GB (xet) | Upload Wan2.1_14B_I2V_FusionX_fp16-Q4_0.gguf | 5 months ago |
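
For reference, the GGUF files above can be fetched individually rather than cloning the whole 31.3 GB repository. The snippet below is a minimal sketch using the `huggingface_hub` Python client (a general-purpose tool, not something prescribed by this repo); the repo ID and filename are taken directly from the listing above, and the chosen quant (Q2_K) is only an example.

```python
# Minimal sketch: download a single quantized GGUF weight file from this repo.
# Requires the huggingface_hub package (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

# Repo ID and filename as shown in the file listing above; swap the filename
# for another quant (e.g. the Q3_K_S or Q4_0 file) as needed.
local_path = hf_hub_download(
    repo_id="QuantStack/Wan2.1_I2V_14B_FusionX-GGUF",
    filename="Wan2.1_14B_I2V_FusionX-Q2_K.gguf",
)

# hf_hub_download returns the path of the cached file on disk.
print(local_path)
```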