#18: Upload flash_attn-2.8.3+cu129torch2.8.0cxx11abiTRUE-cp313-cp313-win_amd64.whl (opened 1 day ago by Bollty)
#17: Request for Python 3.12 wheel: flash-attn for CUDA 13 + PyTorch 2.9 (opened 14 days ago by razvanab)
#16: Request for Python 3.12 wheel: flash-attn for CUDA 12.8 + PyTorch 2.8.0 (opened 2 months ago by Consider20241010)
#15: 12.8 version for Windows (opened 2 months ago by gjnave)
#14: 1.0.9 build for Windows with cuda 12.4 and torch 2.5 please? (opened 2 months ago by seedmanc)
#13: Could we please get a build that works with pytorch 2.8? (opened 3 months ago by phazei)
#9: PyTorch support? (opened 5 months ago by AliceBeta)
#8: flash_attn-2.7.4.post1+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64 (opened 5 months ago by mythicvision)
#6: Save a lot of time (opened 6 months ago by AekDevDev)
#5: CMD says it's successfully installed, but cannot find (opened 7 months ago by czx850)
#4: RTX 5090 and Torch V 2.8.0dev Cu128 options (opened 7 months ago by Astroburner)
#3: Open to collaborating on building wheels? (opened 8 months ago by robinjhuang)
#2: not run (opened 9 months ago by rakmik)