Package Details: python-flash-attention 2.7.4-1
| Git Clone URL | https://aur.archlinux.org/python-flash-attention.git (read-only) |
|---|---|
| Package Base | python-flash-attention |
| Description | Fast and memory-efficient exact attention |
| Upstream URL | https://github.com/Dao-AILab/flash-attention |
| Keywords | ai |
| Licenses | Apache-2.0 |
| Submitter | daskol |
| Maintainer | daskol |
| Last Packager | daskol |
| Votes | 4 |
| Popularity | 0.51 |
| First Submitted | 2023-06-21 07:27 (UTC) |
| Last Updated | 2025-04-21 13:31 (UTC) |
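
To build the package locally from the clone URL above, the standard AUR workflow applies; a minimal sketch, assuming git and the base-devel group are installed:

```bash
# Clone the read-only AUR repository and build/install the package.
git clone https://aur.archlinux.org/python-flash-attention.git
cd python-flash-attention
makepkg -si   # -s pulls missing repo dependencies, -i installs the built package
```

Note that `makepkg -s` only resolves dependencies from the official repositories; AUR dependencies such as python-einops have to be built separately or handled by an AUR helper.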
Dependencies (9)
- python-einops (AUR)
- python-pytorch-cuda (python-pytorch-cxx11abi-opt-cuda (AUR), python-pytorch-opt-cuda)
- ninja (ninja-kitware (AUR), ninja-fuchsia-git (AUR), ninja-git (AUR), ninja-mem (AUR), ninja-noemacs-git (AUR), ninja-jobserver (AUR)) (make)
- python-build (make)
- python-installer (make)
- python-packaging (make)
- python-psutil (make)
- python-setuptools (make)
- python-wheel (make)
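
The make dependencies point to a standard PEP 517 build: python-build produces the wheel, python-installer stages it into the package directory, and ninja drives the CUDA compilation. A sketch of what the build() and package() functions of such a PKGBUILD typically look like; the actual PKGBUILD in this repository may differ, and both the source directory name and the MAX_JOBS value are assumptions:

```bash
build() {
  cd flash-attention-$pkgver          # source directory name is illustrative
  # Limit parallel nvcc/ninja jobs; flash-attention builds are memory-hungry.
  MAX_JOBS=4 python -m build --wheel --no-isolation
}

package() {
  cd flash-attention-$pkgver
  python -m installer --destdir="$pkgdir" dist/*.whl
}
```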
Latest Comments
lubosz commented on 2025-10-13 20:46 (UTC)
Feel free to take my patches that drop GCC 13, fix the OOM, and bump to 2.8.3:
https://github.com/lubosz/python-flash-attention/commits/v2.8.3/
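
One way to pick those patches up while the AUR package lags behind, sketched with an illustrative cherry-pick (inspect the branch and choose the commits yourself):

```bash
# Add lubosz's fork as a remote of the AUR checkout and cherry-pick from its v2.8.3 branch.
git clone https://aur.archlinux.org/python-flash-attention.git
cd python-flash-attention
git remote add lubosz https://github.com/lubosz/python-flash-attention.git
git fetch lubosz
git log lubosz/v2.8.3          # review the GCC 13, OOM, and 2.8.3 commits
git cherry-pick <commit>...    # apply the ones you want, then rebuild with makepkg
```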
lubosz commented on 2025-10-13 13:11 (UTC)
v2.8.3 was released two months ago.
https://github.com/Dao-AILab/flash-attention/releases/tag/v2.8.3
hottea commented on 2025-04-21 13:28 (UTC)
The arch should be x86_64, and according to SPDX, the license should be Apache-2.0.
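
For reference, the two fields the comment refers to would read as follows in the PKGBUILD (illustrative snippet matching the comment and the Licenses field above):

```bash
arch=('x86_64')         # CUDA builds are x86_64-only
license=('Apache-2.0')  # SPDX license identifier
```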