
Commit 025d89e: Polish NEWS

dfalbel committed Jan 30, 2025
1 parent 8cbd3f2

Showing 1 changed file with 10 additions and 6 deletions.

NEWS.md
@@ -1,18 +1,22 @@
-# torch (development version)
+# torch 0.14.0
 
+## Breaking changes
+
+- Updated to LibTorch v2.5.1 (#1204) -- potentially breaking change!
+
+## New features
+
+- Feature: Faster optimizers (`optim_ignite_<name>()`) are available: Adam, AdamW, Adagrad, RMSprop, SGD.
+  These can be used as drop-in replacements for `optim_<name>()` but are considerably
+  faster, as they wrap the LibTorch implementation of the optimizer.
+  The biggest speed differences can be observed for complex optimizers such as `AdamW`.
+
+## Bug fixes
+
 - `torch_iinfo()` now supports all integer dtypes (#1190 @cregouby)
 - Fixed float key_padding_mask in `nnf_multi_head_attention_forward()` (#1205)
-- Updated to LibTorch v2.5.1 (#1204)
 - Fix French translation (#1176 @cregouby)
 - Traced modules now respect 'train' and 'eval' mode (#1211)
-- Feature: Faster optimizers (`optim_ignite_<name>()`) are available: Adam, AdamW, Adagrad, RMSprop, SGD.
-  These can be used as drop-in replacements for `optim_<name>()` but are considerably
-  faster, as they wrap the LibTorch implementation of the optimizer.
-  The biggest speed differences can be observed for complex optimizers such as `AdamW`.
 * Fix: Avoid name clashes between multiple calls to `jit_trace` (#1246)
 
 # torch 0.13.0
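The optimizer entry above is easiest to see in code. A minimal sketch, assuming the AdamW variant follows the `optim_ignite_<name>()` pattern as `optim_ignite_adamw()`; the model, data, and hyperparameters are illustrative only:

```r
library(torch)

# illustrative model and data
model <- nn_linear(10, 1)
x <- torch_randn(100, 10)
y <- torch_randn(100, 1)

# drop-in swap: where you previously called optim_adamw(),
# call optim_ignite_adamw() with the same arguments
opt <- optim_ignite_adamw(model$parameters, lr = 0.01)

for (step in 1:20) {
  opt$zero_grad()
  loss <- nnf_mse_loss(model(x), y)
  loss$backward()
  opt$step()
}
```

Since the ignite variants delegate parameter updates to LibTorch, the per-step R-level overhead shrinks, which is presumably why the gap is largest for heavier optimizers such as AdamW.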

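A quick sketch of the `torch_iinfo()` fix (#1190); the `$min`/`$max` accessors below are an assumption, mirroring PyTorch's `torch.iinfo`:

```r
library(torch)

# torch_iinfo() now accepts any integer dtype, not just a subset
info <- torch_iinfo(torch_int64())
info$min  # smallest representable int64
info$max  # largest representable int64

torch_iinfo(torch_uint8())$max  # expected: 255
```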
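And a sketch of the traced-module fix (#1211), assuming `jit_trace()` accepts an `nn_module` plus example inputs; the point is that the train/eval mode set before tracing now carries over to the traced module:

```r
library(torch)

net <- nn_sequential(
  nn_linear(4, 4),
  nn_dropout(p = 0.5)  # behaves differently in train vs. eval mode
)

net$eval()  # switch to eval mode *before* tracing
traced <- jit_trace(net, torch_randn(1, 4))

# with the fix, the traced module keeps eval-mode behavior:
# dropout is a no-op, so repeated calls on the same input agree
x <- torch_randn(1, 4)
torch_allclose(traced(x), traced(x))  # expected: TRUE
```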