Package Details: python-flash-attention 2.7.4-1

Git Clone URL: https://aur.archlinux.org/python-flash-attention.git (read-only; see the build steps below)
Package Base: python-flash-attention
Description: Fast and memory-efficient exact attention
Upstream URL: https://github.com/Dao-AILab/flash-attention
Keywords: ai
Licenses: Apache-2.0
Submitter: daskol
Maintainer: daskol
Last Packager: daskol
Votes: 2
Popularity: 0.000630
First Submitted: 2023-06-21 07:27 (UTC)
Last Updated: 2025-04-21 13:31 (UTC)
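
A minimal sketch of fetching and building this package from the clone URL above, using the standard AUR makepkg workflow (directory name follows the package base listed on this page):

    # Clone the read-only AUR repository
    git clone https://aur.archlinux.org/python-flash-attention.git
    cd python-flash-attention

    # Build the package, install missing build dependencies (-s),
    # and install the resulting package (-i)
    makepkg -si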

Latest Comments

hottea commented on 2025-04-21 13:28 (UTC)

arch should be x86_64, and according to SPDX, the license should be Apache-2.0
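
For reference, a sketch of how the two fields mentioned in the comment would look in the PKGBUILD; the values are taken from the comment and the package metadata above, not verified against the current PKGBUILD:

    # PKGBUILD excerpt (illustrative; the actual file may differ)
    arch=('x86_64')
    license=('Apache-2.0')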