Package Details: python-flash-attn 2.8.3-1

Git Clone URL: https://aur.archlinux.org/python-flash-attn.git (read-only)
Package Base: python-flash-attn
Description: Fast and memory-efficient exact attention
Upstream URL: https://github.com/Dao-AILab/flash-attention
Licenses: BSD-3-Clause
Provides: python-flash-attention
Submitter: Smoolak
Maintainer: Smoolak
Last Packager: Smoolak
Votes: 0
Popularity: 0.000000
First Submitted: 2025-12-11 01:40 (UTC)
Last Updated: 2025-12-11 01:40 (UTC)
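
The page itself only lists the clone URL; the commands below are a sketch of the standard AUR workflow (clone the build files, then build and install with makepkg), not instructions stated on this page. Building flash-attention compiles CUDA kernels, so expect it to require a CUDA toolchain and substantial time and memory.

```shell
# Fetch the PKGBUILD and related files from the AUR (URL from this page).
git clone https://aur.archlinux.org/python-flash-attn.git
cd python-flash-attn

# Inspect the PKGBUILD before building -- always recommended for AUR packages.
less PKGBUILD

# Build and install: -s installs missing dependencies, -i installs the
# resulting package with pacman.
makepkg -si
```

An AUR helper such as paru or yay can perform the same steps in one command (e.g. `paru -S python-flash-attn`), at the cost of less direct PKGBUILD review.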