jp6/cu129/: flash-attn versions
Because this project isn't in the `mirror_whitelist`, no releases from `root/pypi` are included.
The latest version on this stage is 2.8.4.
Flash Attention: Fast and Memory-Efficient Exact Attention
| Index | Version | Documentation |
|---|---|---|
| jp6/cu129 | 2.8.4 | |
| jp6/cu129 | 2.8.3 | |
| jp6/cu129 | 2.8.2 | |
| jp6/cu129 | 2.7.4.post1 | |
| jp6/cu129 | 2.7.2.post1 | |
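To install one of the versions listed above from this index, point pip at the stage with `--extra-index-url`. The URL below is an assumption for illustration (the actual stage URL is not given on this page); substitute your index's real address.

```shell
# Install flash-attn 2.8.4 from the jp6/cu129 stage.
# NOTE: the index URL here is a hypothetical placeholder -- replace it
# with the real URL of the jp6/cu129 index you are using.
pip install flash-attn==2.8.4 \
    --extra-index-url https://example-index.invalid/jp6/cu129
```

Pinning the exact version (`==2.8.4`) is advisable here: since releases from `root/pypi` are excluded, an unpinned install resolves only against the versions this stage carries.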