Pseudo-Inverse, Gradient-Stochastic-Steepest Descent, Logistic Regression and LDA-QDA
Updated Nov 19, 2019 - Jupyter Notebook
This repo contains implementations of the Steepest Descent algorithm with inexact line search and Newton's method, applied to test functions such as the Trid function, Three-Hump Camel, Styblinski-Tang, and Rosenbrock functions.
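As a minimal sketch of the idea (not the notebook's exact code), steepest descent with an inexact (backtracking/Armijo) line search can be written as follows, here minimizing the Rosenbrock function; the function names and parameter defaults are illustrative assumptions:

```python
import numpy as np

def rosenbrock(x):
    # Rosenbrock function: f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    # Analytic gradient of the Rosenbrock function
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Shrink alpha until the Armijo sufficient-decrease condition holds:
    # f(x + alpha*d) <= f(x) + c * alpha * grad(x)^T d
    while f(x + alpha * d) > f(x) + c * alpha * grad(x) @ d:
        alpha *= rho
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=50000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # steepest-descent direction: negative gradient
        alpha = backtracking_line_search(f, grad, x, d)
        x = x + alpha * d
    return x

# Starting from the classic point (-1.2, 1), the iterates slowly
# approach the global minimum at (1, 1).
x_star = steepest_descent(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

Steepest descent is known to converge slowly on ill-conditioned valleys like Rosenbrock's, which is why the repo also compares against Newton's method, whose curvature information gives much faster local convergence.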