Releases: team-boomeraang/cs107-FinalProject
Version 2.0
This release contains a complete forward-mode implementation of automatic differentiation (AD), along with the boomdiff optimization library. Our AD class supports scalar and vector operations for functions of many variables. Optimization methods include gradient descent, gradient descent with momentum, and Adam. New to this release: tutorials for getting started with boomdiff and AD objects, an introduction to optimization with boomdiff, and tutorials for three common applications of optimization methods (linear regression, logistic regression, and neural networks).
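To make the forward-mode idea concrete, here is a minimal dual-number sketch in plain Python. It illustrates the technique only; the `Dual` class and its methods are hypothetical and are not boomdiff's actual API, so see the getting-started tutorial for the real interface.

```python
import math

class Dual:
    """Hypothetical dual number for forward-mode AD: each object carries
    a value and the derivative of that value with respect to one seed."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# Differentiate f(x) = x*sin(x) + 3x at x = 2 by seeding dx/dx = 1.
x = Dual(2.0, 1.0)
f = x * sin(x) + 3 * x
print(f.value)  # f(2)  = 2*sin(2) + 6
print(f.deriv)  # f'(2) = sin(2) + 2*cos(2) + 3
```

The optimizers build on exactly this derivative information. A plain gradient-descent loop using the sketch above (again, an illustration rather than boomdiff's optimizer API) might look like:

```python
# Minimize g(theta) = (theta - 5)^2 with plain gradient descent,
# using the Dual class above to get g'(theta) at each step.
theta, lr = 0.0, 0.1
for _ in range(100):
    u = Dual(theta - 5.0, 1.0)   # seed d(theta)/d(theta) = 1
    g = u * u                    # g.deriv = 2*(theta - 5)
    theta -= lr * g.deriv
print(theta)  # converges toward the minimizer 5.0
```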
Version 1.3
Version 1.3 of boomdiff adds support for multiple functions and vector operations, enabling automatic differentiation and optimization of functions of many variables.
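For the many-variable case this release describes, forward mode recovers a full gradient by seeding one input variable at a time. A short sketch reusing the hypothetical `Dual` class from the 2.0 notes above (not boomdiff's actual interface):

```python
# Gradient of f(x, y) = x*y + x*x via two forward passes,
# seeding one input variable at a time.
def f(x, y):
    return x * y + x * x

x0, y0 = 2.0, 3.0
df_dx = f(Dual(x0, 1.0), Dual(y0, 0.0)).deriv  # y0 + 2*x0 = 7.0
df_dy = f(Dual(x0, 0.0), Dual(y0, 1.0)).deriv  # x0       = 2.0
print([df_dx, df_dy])  # the gradient [7.0, 2.0]
```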