Package Details: python-flash-attn 2.8.3-1
| Git Clone URL: | https://aur.archlinux.org/python-flash-attn.git (read-only) |
|---|---|
| Package Base: | python-flash-attn |
| Description: | Fast and memory-efficient exact attention |
| Upstream URL: | https://github.com/Dao-AILab/flash-attention |
| Licenses: | BSD-3-Clause |
| Provides: | python-flash-attention |
| Submitter: | Smoolak |
| Maintainer: | Smoolak |
| Last Packager: | Smoolak |
| Votes: | 0 |
| Popularity: | 0.000000 |
| First Submitted: | 2025-12-11 01:40 (UTC) |
| Last Updated: | 2025-12-11 01:40 (UTC) |
Dependencies (11)
- python-einops (AUR)
- python-pytorch-cuda (python-pytorch-cxx11abi-opt-cuda (AUR), python-pytorch-cuda12.9 (AUR), python-pytorch-opt-cuda12.9 (AUR), python-pytorch-opt-cuda)
- cuda (cuda11.1 (AUR), cuda-12.2 (AUR), cuda12.0 (AUR), cuda11.4 (AUR), cuda-12.5 (AUR), cuda-12.9 (AUR), cuda-pascal (AUR)) (make)
- git (git-git (AUR), git-gl (AUR)) (make)
- ninja (ninja-git (AUR), ninja-mem (AUR), ninja-noemacs-git (AUR), ninja-kitware (AUR), ninja-fuchsia-git (AUR)) (make)
- python-build (make)
- python-installer (make)
- python-packaging (make)
- python-psutil (make)
- python-setuptools (make)
- python-wheel (make)
Required by (2)
- python-kt-kernel (optional)
- python-nanotron (optional)
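Since this package lives in the AUR rather than the official repositories, it has to be built from source. A minimal sketch of the standard manual AUR workflow, using the clone URL listed above (an AUR helper such as `paru` or `yay` would automate the same steps; building requires the CUDA toolchain and the make-dependencies listed above, and compilation of flash-attention kernels is resource-intensive):

```shell
# Clone the package's build files from the AUR (read-only repository)
git clone https://aur.archlinux.org/python-flash-attn.git
cd python-flash-attn

# Inspect the PKGBUILD before building -- always recommended for AUR packages
less PKGBUILD

# Build and install, pulling in the declared dependencies.
# -s installs missing dependencies, -i installs the built package,
# -r removes make-only dependencies afterwards.
makepkg -sir
```

After installation, the module is importable as `flash_attn` in the system Python; the package also satisfies the `python-flash-attention` virtual provide listed above.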