Package Details: python-flash-attn 2.7.4.post1-1

Git Clone URL: https://aur.archlinux.org/python-flash-attn.git (read-only)
Package Base: python-flash-attn
Description: Fast and memory-efficient exact attention
Upstream URL: https://github.com/Dao-AILab/flash-attention
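Once installed, the package provides the flash_attn Python module from the upstream project. A minimal usage sketch, assuming a CUDA-capable GPU and PyTorch with fp16 tensors (tensor shapes here are illustrative, not mandated by the package):

```python
# Sketch only: requires a CUDA GPU; flash-attn operates on fp16/bf16 tensors.
import torch
from flash_attn import flash_attn_func

# q, k, v laid out as (batch, seqlen, num_heads, head_dim) on the GPU
q = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(2, 1024, 8, 64, dtype=torch.float16, device="cuda")

# Exact attention computed without materializing the full attention matrix
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # same shape as q
```

The call computes standard (exact) scaled-dot-product attention; the speed and memory savings come from tiling the computation rather than approximating it.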
Licenses: BSD-3-Clause
Submitter: hottea
Maintainer: hottea
Last Packager: hottea
Votes: 0
Popularity: 0.000000
First Submitted: 2025-02-24 15:07 (UTC)
Last Updated: 2025-02-24 15:13 (UTC)
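Like any AUR package, this one is built locally with the standard clone-and-makepkg workflow; a sketch, using the clone URL above:

```shell
# Clone the AUR repository for the package
git clone https://aur.archlinux.org/python-flash-attn.git
cd python-flash-attn

# Inspect the PKGBUILD before building (standard AUR practice)
less PKGBUILD

# Build the package and install it, pulling dependencies via pacman
makepkg -si
```

Note that flash-attention compiles CUDA kernels from source, so the build can take a long time and needs the CUDA toolchain available.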