Update README.md
README.md CHANGED

tags:
- mochi
- t5
- gguf-comfy
- gguf-node
widget:
- text: >-
    tracking camera
  output:
    url: samples/ComfyUI_00001_.webp
- text: same prompt as 1st one <metadata inside>
  output:
    url: samples/ComfyUI_00002_.webp
---

# **gguf quantized version of t5xxl encoder with mochi (test pack)**

## **setup (once)**
- drag mochi_fp8.safetensors ([10GB](https://huggingface.co/calcuis/mochi/blob/main/mochi_fp8.safetensors)) to > ./ComfyUI/models/diffusion_models
- drag t5xxl_fp16-q4_0.gguf ([2.9GB](https://huggingface.co/calcuis/mochi/blob/main/t5xxl_fp16-q4_0.gguf)) to > ./ComfyUI/models/text_encoders
- drag mochi_vae_scaled.safetensors ([725MB](https://huggingface.co/calcuis/mochi/blob/main/mochi_vae_scaled.safetensors)) to > ./ComfyUI/models/vae
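
if you prefer to script the setup rather than drag the files by hand, below is a minimal sketch using the `huggingface_hub` client; it assumes `huggingface_hub` is installed and that your ComfyUI install lives at `./ComfyUI` — adjust the target folders if yours differs

```python
# minimal sketch: download the three files above into the matching ComfyUI folders
# assumptions: `pip install huggingface_hub` has been run, ComfyUI lives at ./ComfyUI
from huggingface_hub import hf_hub_download

targets = {
    "mochi_fp8.safetensors": "./ComfyUI/models/diffusion_models",
    "t5xxl_fp16-q4_0.gguf": "./ComfyUI/models/text_encoders",
    "mochi_vae_scaled.safetensors": "./ComfyUI/models/vae",
}

for filename, folder in targets.items():
    # local_dir drops the file directly into the given folder
    path = hf_hub_download(repo_id="calcuis/mochi", filename=filename, local_dir=folder)
    print("saved:", path)
```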

## **run it straight (no installation needed)**
- run the .bat file in the main directory (assuming you are using the gguf-node pack below)
- drag the workflow json file (below) to > your browser

### **workflow**
- example workflow (with [gguf](https://huggingface.co/calcuis/mochi/blob/main/workflow-mochi-gguf.json) encoder)
- example workflow ([safetensors](https://huggingface.co/calcuis/mochi/blob/main/workflow-mochi-safetensors.json))
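
for anyone who would rather queue the example workflow from a script than drag the json into the browser, here is a rough sketch against ComfyUI's built-in HTTP endpoint; it assumes the server is already running on the default port 8188 and that the workflow has been re-exported in API format (the linked files are drag-and-drop workflows, so they may need a `Save (API Format)` export first) — the filename `workflow-mochi-gguf-api.json` is only a placeholder

```python
# sketch: queue a workflow via ComfyUI's HTTP API (server assumed at 127.0.0.1:8188)
# the json must be an API-format export; "workflow-mochi-gguf-api.json" is a placeholder name
import json
import urllib.request

with open("workflow-mochi-gguf-api.json", "r", encoding="utf-8") as f:
    prompt_graph = json.load(f)  # node graph keyed by node id

payload = json.dumps({"prompt": prompt_graph}).encode("utf-8")
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))  # response includes a prompt_id when accepted
```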

### **review**
- t5xxl gguf works fine as text encoder
- mochi gguf file might not work; if so, please wait for the code update

### **reference**
- base model from [genmo](https://huggingface.co/genmo/mochi-1-preview)
- comfyui from [comfyanonymous](https://github.com/comfyanonymous/ComfyUI)
- comfyui-gguf from [city96](https://github.com/city96/ComfyUI-GGUF)
- gguf-comfy [pack](https://github.com/calcuis/gguf-comfy/releases)
- gguf-node ([pypi](https://pypi.org/project/gguf-node)|[repo](https://github.com/calcuis/gguf)|[pack](https://github.com/calcuis/gguf/releases))

### prompt test