⬆️ Update flash-attn requirement from ~=2.6.3 to >=2.6.3,<2.8.0
Updates the requirements on [flash-attn](https://github.com/Dao-AILab/flash-attention) to permit the latest version.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](https://github.com/Dao-AILab/flash-attention/compare/v2.6.3...v2.7.0.post2)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] authored Dec 2, 2024
1 parent 9695597 commit 653cc1c
Showing 1 changed file with 1 addition and 1 deletion.
pyproject.toml (1 addition, 1 deletion)
@@ -41,7 +41,7 @@ dependencies = [
"accelerate>=0.33,<1.1",
"sentencepiece~=0.2.0",
"peft>=0.12,<0.14",
"flash-attn~=2.6.3; sys_platform != 'darwin'",
"flash-attn>=2.6.3,<2.8.0; sys_platform != 'darwin'",
"einops~=0.8.0",
"timm~=1.0.9",
"typer~=0.12.5",
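For context: the old `~=2.6.3` compatible-release specifier expands to `>=2.6.3, ==2.6.*`, so it only accepted 2.6.x patch releases; the new `>=2.6.3,<2.8.0` range also admits the 2.7.x line (e.g. v2.7.0.post2) while staying below 2.8.0. A minimal sketch of the difference, assuming the third-party `packaging` library (not a dependency declared above):

```python
# Sketch: compare what the old and new flash-attn specifiers permit.
# Assumes the third-party `packaging` library is installed.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

old = SpecifierSet("~=2.6.3")          # compatible release: >=2.6.3, ==2.6.*
new = SpecifierSet(">=2.6.3,<2.8.0")   # widened range covering 2.7.x

for v in ("2.6.3", "2.7.0.post2", "2.8.0"):
    print(v, Version(v) in old, Version(v) in new)
# Expected: 2.6.3 satisfies both, 2.7.0.post2 only the new range,
# and 2.8.0 neither (the upper bound is exclusive).
```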
