⬆️ Update flash-attn requirement from ~=2.6.3 to >=2.6.3,<2.8.0
Updates the requirements on [flash-attn](https://github.com/Dao-AILab/flash-attention) to permit the latest version.

- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](Dao-AILab/flash-attention@v2.6.3...v2.7.0.post2)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
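For context, `~=2.6.3` is PEP 440 compatible-release syntax, equivalent to `>=2.6.3,<2.7.0`; the widened specifier `>=2.6.3,<2.8.0` now admits the 2.7.x releases while still excluding 2.8.0. A minimal sketch of the resulting change, assuming the pin lives in a `requirements.txt` (the actual file path is not shown in this commit summary):

```diff
 # hypothetical requirements.txt hunk; exact file path assumed
-flash-attn~=2.6.3
+flash-attn>=2.6.3,<2.8.0
```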