Package Details: python-flash-attn 2.7.4.post1-1
| Field | Value |
|---|---|
| Git Clone URL | https://aur.archlinux.org/python-flash-attn.git (read-only) |
| Package Base | python-flash-attn |
| Description | Fast and memory-efficient exact attention |
| Upstream URL | https://github.com/Dao-AILab/flash-attention |
| Licenses | BSD-3-Clause |
| Submitter | hottea |
| Maintainer | hottea |
| Last Packager | hottea |
| Votes | 0 |
| Popularity | 0.000000 |
| First Submitted | 2025-02-24 15:07 (UTC) |
| Last Updated | 2025-02-24 15:13 (UTC) |
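The clone URL above can be used with the standard AUR workflow to build and install the package manually (an AUR helper such as `paru` or `yay` would work as well; `makepkg` is shown here as the usual manual approach):

```shell
# Clone the read-only AUR repository
git clone https://aur.archlinux.org/python-flash-attn.git
cd python-flash-attn

# Build and install: -s resolves the make dependencies listed below,
# -i installs the resulting package with pacman
makepkg -si
```

Note that building flash-attention compiles CUDA kernels, so expect a long build time and substantial memory use.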
Dependencies (9)
- python-einops (AUR)
- python-pytorch-opt-cuda
- git (make; AUR alternatives: git-git, git-gl)
- ninja (make; AUR alternatives: ninja-kitware, ninja-mem, ninja-fuchsia-git, ninja-git, ninja-jobserver)
- python-build (make)
- python-installer (make)
- python-psutil (make)
- python-setuptools (make)
- python-wheel (make)
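For context, the packaged library implements exact scaled dot-product attention, softmax(QK^T / sqrt(d)) V, as a fused, memory-efficient CUDA kernel. A minimal NumPy sketch of the reference computation it accelerates (illustration only: single head, no batch dimension, and none of the tiling that makes flash-attention fast):

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

The flash-attention kernel produces the same result as this reference (it is exact, not an approximation) but avoids materializing the full scores matrix in GPU memory.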