Package Details: python-flash-attention 2.7.4-1
| Git Clone URL: | https://aur.archlinux.org/python-flash-attention.git (read-only) |
|---|---|
| Package Base: | python-flash-attention |
| Description: | Fast and memory-efficient exact attention |
| Upstream URL: | https://github.com/Dao-AILab/flash-attention |
| Keywords: | ai |
| Licenses: | Apache-2.0 |
| Submitter: | daskol |
| Maintainer: | daskol |
| Last Packager: | daskol |
| Votes: | 2 |
| Popularity: | 0.000630 |
| First Submitted: | 2023-06-21 07:27 (UTC) |
| Last Updated: | 2025-04-21 13:31 (UTC) |
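As an AUR package, this is built from source rather than installed from the official repositories. A typical manual workflow, assuming `git` and the `base-devel` group are already installed:

```sh
# Clone the package's AUR repository (read-only) and build it.
# makepkg -s installs missing repo dependencies via pacman;
# -i installs the built package when the build succeeds.
git clone https://aur.archlinux.org/python-flash-attention.git
cd python-flash-attention
makepkg -si
```

Note that `makepkg -s` cannot resolve dependencies that are themselves in the AUR (such as python-einops below); those must be built first, or the whole chain handled by an AUR helper.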
Dependencies (9)
- python-einops [AUR]
- python-pytorch-cuda (python-pytorch-cxx11abi-opt-cuda [AUR], python-pytorch-opt-cuda)
- ninja (ninja-kitware [AUR], ninja-fuchsia-git [AUR], ninja-git [AUR], ninja-jobserver [AUR], ninja-mem [AUR]) (make)
- python-build (make)
- python-installer (make)
- python-packaging (make)
- python-psutil (make)
- python-setuptools (make)
- python-wheel (make)
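Given the python-pytorch-cuda dependency, a quick post-install smoke test is to confirm that PyTorch sees a GPU and that the module imports. This assumes the package ships the upstream import name `flash_attn`, which exposes `__version__`:

```sh
# Verify CUDA availability and that flash_attn imports cleanly.
python -c "import torch, flash_attn; print(torch.cuda.is_available(), flash_attn.__version__)"
```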
Latest Comments
hottea commented on 2025-04-21 13:28 (UTC)
`arch` should be x86_64, and according to SPDX, `license` should be Apache-2.0
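The comment refers to two arrays in the package's PKGBUILD. A hypothetical excerpt showing the suggested values (the actual PKGBUILD lives in the AUR repository and may differ):

```sh
# Hypothetical PKGBUILD excerpt illustrating the comment's suggestion.
arch=('x86_64')         # CUDA builds target x86_64 only
license=('Apache-2.0')  # SPDX identifier, matching the Licenses field above
```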