HippoTrainer is a PyTorch-compatible library for gradient-based hyperparameter optimization, implementing cutting-edge algorithms that leverage automatic differentiation to efficiently tune hyperparameters.
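Conceptually, the hypergradient is obtained by differentiating a validation loss back through the training update itself. The snippet below is a minimal plain-PyTorch sketch of that idea for a single weight-decay hyperparameter; it only illustrates the mechanism and is not HippoTrainer's API (the toy ridge setup and all names are assumptions).

```python
import torch

torch.manual_seed(0)
# Toy regression data: a training split and a validation split.
x_tr, y_tr = torch.randn(64, 10), torch.randn(64, 1)
x_va, y_va = torch.randn(64, 10), torch.randn(64, 1)

w = torch.zeros(10, 1, requires_grad=True)        # model parameters
log_wd = torch.tensor(-3.0, requires_grad=True)   # hyperparameter: log weight decay
lr, hyper_lr = 0.1, 0.05

for step in range(200):
    # Inner objective: training loss plus a differentiable L2 penalty.
    train_loss = ((x_tr @ w - y_tr) ** 2).mean() + log_wd.exp() * (w ** 2).sum()
    (grad_w,) = torch.autograd.grad(train_loss, w, create_graph=True)
    w_next = w - lr * grad_w                      # one unrolled SGD step

    # Outer objective: validation loss at the updated parameters,
    # differentiated back through the update to obtain the hypergradient.
    val_loss = ((x_va @ w_next - y_va) ** 2).mean()
    (hypergrad,) = torch.autograd.grad(val_loss, log_wd)

    with torch.no_grad():                         # gradient descent on both levels
        log_wd -= hyper_lr * hypergrad
        w.copy_(w_next)
```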
- Algorithm Zoo: T1-T2, Billion Hyperparameters, HOAG, DrMAD
- PyTorch Native: Direct integration with `torch.nn.Module`
- Memory Efficient: Checkpointing & implicit differentiation (a checkpointing sketch follows this list)
- Scalable: From laptop to cluster with PyTorch backend
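The memory cost of unrolled hypergradients comes from storing the graph of every inner step. A common way around this is checkpointing: run the inner loop graph-free, keep a few weight snapshots, and rebuild each segment differentiably during the reverse pass. The sketch below shows this idea in plain PyTorch on a toy weight-decay setup similar to the one above; it is illustrative only, not HippoTrainer's implementation.

```python
import torch

torch.manual_seed(0)
x_tr, y_tr = torch.randn(128, 10), torch.randn(128, 1)
x_va, y_va = torch.randn(128, 10), torch.randn(128, 1)

log_wd = torch.tensor(-3.0, requires_grad=True)   # hyperparameter: log weight decay
lr, T, k = 0.1, 20, 5                              # 20 inner steps, checkpoint every 5

def sgd_steps(w, n_steps, create_graph):
    """Run n_steps of SGD on the penalized training loss starting from w."""
    for _ in range(n_steps):
        loss = ((x_tr @ w - y_tr) ** 2).mean() + log_wd.exp() * (w ** 2).sum()
        (g,) = torch.autograd.grad(loss, w, create_graph=create_graph)
        w = w - lr * g
        if not create_graph:                       # drop the graph between steps
            w = w.detach().requires_grad_(True)
    return w

# Forward pass: graph-free, keeping only one weight checkpoint per segment.
checkpoints = [torch.zeros(10, 1, requires_grad=True)]
for _ in range(T // k):
    checkpoints.append(sgd_steps(checkpoints[-1], k, create_graph=False))

# Reverse pass: recompute each segment with a graph and backpropagate through it.
w_final = checkpoints[-1]
val_loss = ((x_va @ w_final - y_va) ** 2).mean()
(g_w,) = torch.autograd.grad(val_loss, w_final)

hypergrad = torch.zeros_like(log_wd)
for w_start in reversed(checkpoints[:-1]):
    w_end = sgd_steps(w_start, k, create_graph=True)
    g_wd, g_ws = torch.autograd.grad(w_end, (log_wd, w_start), grad_outputs=g_w)
    hypergrad += g_wd                              # accumulate d val_loss / d log_wd
    g_w = g_ws                                     # cotangent for the previous segment

print("hypergradient of validation loss w.r.t. log weight decay:", hypergrad.item())
```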
- T1-T2 (Paper): Unrolled optimization with explicit gradient computation
- Billion Hyperparameters (Paper): Large-scale optimization with PyTorch fusion
- HOAG (Paper): Implicit differentiation via conjugate gradient (Daniil Dorin); a sketch of the idea follows this list
- DrMAD (Paper): Memory-efficient piecewise-linear backprop
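To make the HOAG entry above concrete, here is a self-contained plain-PyTorch sketch of implicit differentiation: solve the inner problem approximately, solve a Hessian-vector linear system with conjugate gradient, and assemble the hypergradient. It illustrates the general technique on an assumed toy ridge problem and is not HippoTrainer's implementation.

```python
import torch

torch.manual_seed(0)
x_tr, y_tr = torch.randn(256, 10), torch.randn(256, 1)
x_va, y_va = torch.randn(256, 10), torch.randn(256, 1)

log_wd = torch.tensor(-2.0, requires_grad=True)    # hyperparameter: log weight decay

def train_loss(w):
    return ((x_tr @ w - y_tr) ** 2).mean() + log_wd.exp() * (w ** 2).sum()

# 1) Approximately solve the inner problem (plain SGD on a ridge model).
w = torch.zeros(10, 1, requires_grad=True)
for _ in range(500):
    (g,) = torch.autograd.grad(train_loss(w), w)
    with torch.no_grad():
        w -= 0.1 * g

# 2) Gradient of the validation loss at the (approximately) optimal weights.
val_loss = ((x_va @ w - y_va) ** 2).mean()
(v,) = torch.autograd.grad(val_loss, w)

# 3) Solve H q = v with conjugate gradient, where H = d^2 L_train / dw^2,
#    using Hessian-vector products instead of forming H explicitly.
(g_tr,) = torch.autograd.grad(train_loss(w), w, create_graph=True)

def hvp(p):
    return torch.autograd.grad((g_tr * p).sum(), w, retain_graph=True)[0]

q = torch.zeros_like(v)
r, p = v.clone(), v.clone()
rs = (r * r).sum()
for _ in range(50):
    Hp = hvp(p)
    alpha = rs / (p * Hp).sum()
    q, r = q + alpha * p, r - alpha * Hp
    rs_new = (r * r).sum()
    if rs_new.sqrt() < 1e-8:
        break
    p, rs = r + (rs_new / rs) * p, rs_new

# 4) Hypergradient: the direct term dL_val/d log_wd is zero here (the validation
#    loss has no explicit weight-decay term), leaving the implicit correction
#    -(d^2 L_train / d log_wd dw)^T q, obtained by one more differentiation.
(hypergrad,) = torch.autograd.grad((g_tr * q).sum(), log_wd)
hypergrad = -hypergrad
print("HOAG-style hypergradient w.r.t. log weight decay:", hypergrad.item())
```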
- Daniil Dorin (Basic code writing, Final demo, Algorithms)
- Igor Ignashin (Project wrapping, Documentation writing, Algorithms)
- Nikita Kiselev (Project planning, Blog post, Algorithms)
- Andrey Veprikov (Tests writing, Documentation writing, Algorithms)
- We welcome contributions!
HippoTrainer is MIT licensed. See LICENSE for details.