Spaces: Running on Zero

removes flash attention from requirements

requirements.txt (+0 -1)
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,4 +1,3 @@
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 gradio>=5.38.2
 torch>=2.0.0
 transformers>=4.54.0
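
With the pinned flash-attn wheel gone (it was built against CUDA 11.8 and an older torch ABI, at odds with the torch>=2.0.0 pin), transformers will fall back to its default attention backend. If the app still wants flash attention opportunistically, a minimal sketch is below; MODEL_ID is a hypothetical placeholder and the causal-LM setup is an assumption, neither comes from this commit:

import importlib.util

import torch
from transformers import AutoModelForCausalLM

MODEL_ID = "your-org/your-model"  # hypothetical placeholder, not part of the commit

# Use flash attention 2 only when the optional flash-attn package is
# importable; otherwise fall back to PyTorch's built-in scaled-dot-product
# attention, which needs no extra wheel.
attn_impl = (
    "flash_attention_2"
    if importlib.util.find_spec("flash_attn") is not None
    else "sdpa"
)

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    attn_implementation=attn_impl,
)

This way the requirements file stays free of platform-specific wheel URLs, and the Space degrades gracefully on hardware (such as ZeroGPU) where flash-attn is not installed.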