@@ -2,11 +2,39 @@ Changelog
 ----------
 
 vNext
-----
-(Please add entries here with your changes for the next version)
+------
+Add new changes here
+
+v0.3
+-----
+Linen is now out of Alpha (flax.nn is being deprecated)!
+
 - `flax.core.apply` and linen `Module.apply` will now only return the variable
 collections that were specified as mutable.
-- ...
+- Fixed handling of multiple separate subclasses of a Module.
+- We now allow assignment of mixed Module pytrees in setup.
+- Refactored collection creation to fail early when modifying an undefined
+collection; previously, a non-existing non-mutable collection was silently ignored.
+- Added the silu activation function.
+- Added an offset argument to the Adafactor optimizer for fine-tuning schedules.
+- Relaxed the limit on calling methods on unbound modules.
+- Relaxed the parameter attribute check.
+- Added a centered version of RMSProp.
+- Added a GCE getting started kit.
+- Renamed -gpu_type to -accelerator_type.
+- Fixed a bug in MultiOptimizer causing it to throw away an empty dictionary.
+
+### Improvements
+- Made the FrozenDict constructor freeze correctly.
+- Made freeze a synonym of the FrozenDict constructor.
+- Optimized freezing of FrozenDicts by sharing immutable internal state.
+- Simplified __setattr__ handling of trees with Modules.
+- Added a dtype specification to the Embed layer, made Adafactor use float32
+state consistently, and added a broadcasting option to the Dropout layer.
+- Improved FrozenDict performance.
+- Massive documentation improvements.
+- Added end-to-end benchmarks.
+- Updated the examples to Linen.
 
 
 v0.2.2
 ----
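The first v0.3 entries describe a contract for `apply`: only collections listed as mutable come back, and writing to any other collection fails early instead of being silently ignored. A rough standalone sketch of that contract, in plain Python (the `apply` and `put` helpers here are hypothetical illustrations, not Flax's actual implementation):

```python
def apply(fn, variables, mutable=()):
    """Toy sketch of the mutable-collections contract.

    Runs `fn` on a working copy of `variables` and returns (output, mutated),
    where `mutated` contains ONLY the collections named in `mutable`.
    """
    scratch = {name: dict(coll) for name, coll in variables.items()}

    def put(collection, key, value):
        # Fail early: writing to a collection not marked mutable is an
        # error, not a silent no-op.
        if collection not in mutable:
            raise ValueError(f"collection {collection!r} is not mutable")
        scratch.setdefault(collection, {})[key] = value

    out = fn(scratch, put)
    # Only the collections explicitly marked as mutable are returned.
    mutated = {name: scratch[name] for name in mutable if name in scratch}
    return out, mutated


variables = {"params": {"w": 3.0}, "batch_stats": {"mean": 0.0}}

def forward(vars_, put):
    put("batch_stats", "mean", 0.5)  # allowed: listed in `mutable`
    return vars_["params"]["w"] * 2

out, mutated = apply(forward, variables, mutable=("batch_stats",))
# `mutated` holds only "batch_stats"; "params" is not returned
```

Writing to `"params"` here without listing it in `mutable` raises immediately, mirroring the fail-early refactor described above.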
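The FrozenDict entries under Improvements (freeze as a constructor synonym, sharing immutable internal state) can be sketched in plain Python. This is an illustrative toy, not Flax's real `FrozenDict`:

```python
class FrozenDict:
    """Toy immutable mapping (illustrative sketch only)."""
    __slots__ = ("_data",)

    def __new__(cls, data):
        # Share internal state: freezing an already-frozen dict returns
        # the same object instead of making a copy.
        if isinstance(data, FrozenDict):
            return data
        obj = super().__new__(cls)
        object.__setattr__(obj, "_data", dict(data))
        return obj

    def __getitem__(self, key):
        return self._data[key]

    def __setattr__(self, name, value):
        raise TypeError("FrozenDict is immutable")


def freeze(data):
    # `freeze` is just a synonym for the constructor.
    return FrozenDict(data)


fd = freeze({"a": 1})
assert freeze(fd) is fd  # no copy: immutable state is shared
```

Because freezing an already-frozen dict is an identity operation, repeated freezes cost nothing, which is the performance point behind the shared-internal-state entry.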