This release contains a complete forward-mode implementation of automatic differentiation (AD), as well as the boomdiff optimization library. The AD class supports scalar and vector operations for functions of many variables, and the optimization module provides gradient descent, momentum, and Adam. New to this release: tutorials for getting started with boomdiff and AD objects, an introduction to optimization with boomdiff, and tutorials for three common applications of optimization methods (linear regression, logistic regression, and neural networks). A rough usage sketch is shown below.
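
As a rough sketch of the workflow the tutorials walk through, the example below builds AD objects for a two-variable function and minimizes a simple loss with gradient descent. The constructor form `AD(value, {'x': 1.0})`, the `func_val` and `partial_dict` attributes, and the `optimize.GD` / `minimize` names are illustrative assumptions, not confirmed API; see the getting-started tutorial for the exact interface.

```python
# Hypothetical usage sketch -- the constructor signature, the
# func_val / partial_dict attributes, and optimize.GD / minimize
# are assumptions for illustration; consult the tutorials for the real API.
from boomdiff import AD, optimize

# Forward-mode AD: each variable carries a value and a seed derivative.
x = AD(1.0, {'x': 1.0})
y = AD(2.0, {'y': 1.0})

# Build a scalar function of two variables; derivatives propagate automatically.
f = x**2 + 3 * x * y + AD.sin(y)
print(f.func_val)      # value of f at (x, y) = (1, 2)
print(f.partial_dict)  # partial derivatives, e.g. {'x': ..., 'y': ...}

# Optimization: minimize a quadratic loss expressed through AD variables.
opt = optimize.GD(learning_rate=0.1)       # gradient descent (assumed name)
loss = lambda: (x - 3)**2 + (y + 1)**2     # minimum at x = 3, y = -1
opt.minimize(loss, [x, y], steps=100)      # hypothetical minimize call
print(x.func_val, y.func_val)              # should approach 3 and -1
```

The same pattern applies to the momentum and Adam optimizers covered in the optimization tutorial; only the optimizer constructor changes.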