# optimizer-SUG-torch

Adaptive stochastic gradient method based on the universal gradient method. At each step, the universal method adjusts its estimate of the Lipschitz constant of the gradient so that the loss function is majorized by a quadratic model.
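
As an illustration of this backtracking idea (not the repository's actual SUG optimizer), here is a minimal PyTorch sketch of one universal-gradient step; the function and variable names are hypothetical:

```python
import torch

def universal_gradient_step(f, x, L, max_backtracks=30):
    """One gradient step with backtracking on the Lipschitz estimate L.

    f: callable mapping a parameter tensor to a scalar loss
    x: parameter tensor with requires_grad=True
    L: current estimate of the Lipschitz constant of the gradient
    """
    loss = f(x)
    (grad,) = torch.autograd.grad(loss, x)

    with torch.no_grad():
        for _ in range(max_backtracks):
            x_new = x - grad / L
            diff = x_new - x
            # Quadratic model: f(x) + <grad, diff> + (L/2) * ||diff||^2
            model = loss + (grad * diff).sum() + 0.5 * L * diff.pow(2).sum()
            if f(x_new) <= model:
                break        # the quadratic majorizes the loss at x_new: accept
            L *= 2.0         # estimate too small: increase it and retry
    # Halving L lets the estimate shrink again on the next step.
    return x_new.requires_grad_(True), L / 2.0

# Toy usage on a quadratic loss
x = torch.tensor([3.0, -2.0], requires_grad=True)
f = lambda v: (v ** 2).sum()
L = 1.0
for _ in range(20):
    x, L = universal_gradient_step(f, x, L)
```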

Please use https://nbviewer.jupyter.org/github/sverdoot if you have problems rendering the .ipynb files.