Actions: ecmwf/anemoi-models

Code Quality checks for PRs

123 workflow runs

docs: update changelog
Code Quality checks for PRs #73: Commit b7b8f2e pushed by theissenhelen
October 3, 2024 11:20 · 1m 52s · feature/44-make-flash-attention-configurable

fix: annotation error
Code Quality checks for PRs #71: Commit 6c12dda pushed by theissenhelen
October 3, 2024 09:58 · 1m 54s · feature/44-make-flash-attention-configurable

Feature/44 make flash attention configurable
Code Quality checks for PRs #70: Pull request #47 synchronize by theissenhelen
October 3, 2024 09:45 · 1m 55s · feature/44-make-flash-attention-configurable

fix: import annotations from future
Code Quality checks for PRs #69: Commit c841324 pushed by theissenhelen
October 3, 2024 09:45 · 1m 57s · feature/44-make-flash-attention-configurable

fix: softcap optional
Code Quality checks for PRs #67: Commit ed07e34 pushed by theissenhelen
October 3, 2024 09:37 · 2m 19s · feature/44-make-flash-attention-configurable

fix: bias shape
Code Quality checks for PRs #65: Commit 22623cc pushed by theissenhelen
October 3, 2024 09:05 · 2m 0s · feature/44-make-flash-attention-configurable

Feature/44 make flash attention configurable
Code Quality checks for PRs #64: Pull request #47 synchronize by theissenhelen
October 3, 2024 08:55 · 1m 57s · feature/44-make-flash-attention-configurable

docs: update docstrings
Code Quality checks for PRs #63: Commit 6523b47 pushed by theissenhelen
October 3, 2024 08:55 · 4m 41s · feature/44-make-flash-attention-configurable

Feature/mask NaNs in training loss function
Code Quality checks for PRs #62: Pull request #56 synchronize by sahahner
October 2, 2024 14:45 · 1m 59s · feature/mask-NaNs-in-training-loss-function

changelog
Code Quality checks for PRs #61: Commit 87647b7 pushed by sahahner
October 2, 2024 14:45 · 2m 2s · feature/mask-NaNs-in-training-loss-function

Feature/44 make flash attention configurable
Code Quality checks for PRs #58: Pull request #47 synchronize by theissenhelen
October 2, 2024 13:09 · 1m 56s · feature/44-make-flash-attention-configurable

feat: get alibi_slopes
Code Quality checks for PRs #57: Commit c04e641 pushed by theissenhelen
October 2, 2024 13:09 · 1m 57s · feature/44-make-flash-attention-configurable

Feature/mapper chunking (#46)
Code Quality checks for PRs #56: Commit f96bcf9 pushed by japols
October 2, 2024 13:00 · 2m 0s · develop

Add shouldhasattr
Code Quality checks for PRs #55: Commit 084f3cc pushed by HCookie
October 1, 2024 14:06 · 2m 3s · feature/experimental/attribute_warning

Feature/mapper chunking
Code Quality checks for PRs #54: Pull request #46 synchronize by japols
October 1, 2024 13:55 · 3m 8s · feature/mapper_chunking

Merge remote-tracking branch 'origin/develop' into feature/mapper_chu…
Code Quality checks for PRs #53: Commit b6e34b9 pushed by japols
October 1, 2024 13:54 · 2m 12s · feature/mapper_chunking

Feature/mapper chunking
Code Quality checks for PRs #52: Pull request #46 synchronize by japols
October 1, 2024 09:49 · 2m 17s · feature/mapper_chunking

docs: add chunking documentation, rename ANEMOI_INFERENCE_NUM_CHUNKS
Code Quality checks for PRs #51: Commit 7c0daa7 pushed by japols
October 1, 2024 09:48 · 1m 47s · feature/mapper_chunking

Feature/44 make flash attention configurable
Code Quality checks for PRs #50: Pull request #47 synchronize by theissenhelen
October 1, 2024 08:10 · 2m 21s · feature/44-make-flash-attention-configurable

chore(deps): remove flash-attn
Code Quality checks for PRs #49: Commit 0eb5c50 pushed by theissenhelen
October 1, 2024 08:10 · 2m 31s · feature/44-make-flash-attention-configurable