Package Details: python-flash-attention 2.3.6-1
| Git Clone URL: | https://aur.archlinux.org/python-flash-attention.git (read-only) |
|---|---|
| Package Base: | python-flash-attention |
| Description: | Fast and memory-efficient exact attention |
| Upstream URL: | https://github.com/HazyResearch/flash-attention |
| Keywords: | ai |
| Licenses: | Apache |
| Submitter: | daskol |
| Maintainer: | None |
| Last Packager: | daskol |
| Votes: | 2 |
| Popularity: | 0.015421 |
| First Submitted: | 2023-06-21 07:27 (UTC) |
| Last Updated: | 2023-12-24 08:36 (UTC) |
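For context on what the packaged library provides, here is a minimal usage sketch, assuming the upstream flash-attn 2.x Python API (`flash_attn_func`, which takes `(batch, seqlen, nheads, headdim)` tensors in fp16/bf16 on a CUDA device), PyTorch 2.x, and a working GPU. The comparison against `torch.nn.functional.scaled_dot_product_attention` illustrates the "exact attention" claim in the description: the kernel computes the same result as a dense reference, just faster and with less memory.

```python
import torch
from flash_attn import flash_attn_func

# flash-attn expects (batch, seqlen, nheads, headdim) in fp16/bf16 on CUDA
q, k, v = (torch.randn(2, 128, 8, 64, dtype=torch.float16, device="cuda")
           for _ in range(3))

out = flash_attn_func(q, k, v, causal=True)

# Reference path: PyTorch SDPA uses (batch, nheads, seqlen, headdim), so transpose
ref = torch.nn.functional.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2), is_causal=True
).transpose(1, 2)

# "Exact attention": results match the dense reference up to fp16 rounding
print(out.shape, torch.allclose(out, ref, atol=2e-3))
```

Since the package is currently orphaned (Maintainer: None), building it follows the usual AUR workflow: clone the Git URL above and run `makepkg -si` in the checkout.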
Dependencies (9)
- python-einops (AUR)
- python-pytorch-cuda (python-pytorch-mkl-cuda-git (AUR), python-pytorch-cxx11abi-opt-cuda (AUR), python-pytorch-opt-cuda)
- cutlass (AUR) (make)
- ninja (ninja-kitware (AUR), ninja-mem (AUR), ninja-fuchsia-git (AUR), ninja-git (AUR), ninja-jobserver (AUR)) (make)
- python-build (make)
- python-installer (python-installer-git (AUR)) (make)
- python-packaging (make)
- python-setuptools (make)
- python-wheel (make)
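Once the package is built and installed alongside the runtime dependencies above, a short sanity check can confirm that the environment is usable. The `flash_attn.__version__` attribute is an assumption based on upstream flash-attn releases, hence the guarded lookup.

```python
# Post-install sanity check: confirms the runtime dependencies above are
# importable and that a CUDA device is visible to PyTorch.
import torch
import einops
import flash_attn

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
# __version__ is assumed to exist, as in upstream flash-attn 2.x releases
print("flash-attn:", getattr(flash_attn, "__version__", "unknown"))
```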