From 03ac63dedd3cb3460f098b5fc36253bd83a6337e Mon Sep 17 00:00:00 2001
From: danielward27
Date: Fri, 24 Jan 2025 09:15:51 +0000
Subject: [PATCH] Add paramax

---
 README.md     | 1 +
 docs/index.md | 1 +
 2 files changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 20d04589..e24717cf 100644
--- a/README.md
+++ b/README.md
@@ -69,6 +69,7 @@ If you found this library useful in academic research, please cite: [(arXiv link
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 
 [Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
diff --git a/docs/index.md b/docs/index.md
index 73fed50e..8987a9f7 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -57,6 +57,7 @@ Have a look at the [Getting Started](./usage/getting-started.md) page.
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 [Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.
 
 **Scientific computing**
 
 [Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
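
As context for the added link (not part of the patch itself): a minimal sketch of what "parameterizations and constraints for PyTrees" looks like in practice, assuming paramax's `Parameterize`/`unwrap` API as described in that repository. A parameter is stored on an unconstrained scale and wrapped so that unwrapping applies the constraint anywhere in a PyTree.

```python
# Illustrative sketch only, assuming paramax exposes Parameterize and unwrap.
import jax.numpy as jnp
import paramax

# Store log(scale); unwrapping applies jnp.exp, guaranteeing positivity.
scale = paramax.Parameterize(jnp.exp, jnp.log(jnp.ones(3)))

# unwrap traverses an arbitrary PyTree, replacing wrapped leaves.
tree = {"scale": scale, "other": 1.0}
print(paramax.unwrap(tree))  # {'scale': Array([1., 1., 1.], ...), 'other': 1.0}
```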