This repo is outdated and will no longer be maintained.
Unofficial implementation of the lookahead mechanism for optimizers.
```bash
pip install git+https://github.com/cyberzhg/keras-lookahead.git
```
Arguments:
- `optimizer`: the original optimizer.
- `sync_period`: the `k` in the paper, i.e. the synchronization period.
- `slow_step`: the `α` in the paper, i.e. the step size of the slow weights.
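For context, here is a minimal NumPy-style sketch of the update rule these two arguments control (illustrative only, not the library's internals): the wrapped optimizer updates the fast weights every step, and every `sync_period` steps the slow weights move toward them by `slow_step`, after which the fast weights restart from the slow weights.

```python
def lookahead_sketch(fast_weights, inner_step, num_steps, sync_period=5, slow_step=0.5):
    """Illustrative sketch: `fast_weights` is a list of NumPy arrays and
    `inner_step` applies one in-place update of the wrapped optimizer."""
    slow_weights = [w.copy() for w in fast_weights]
    for step in range(1, num_steps + 1):
        inner_step(fast_weights)                 # fast weights follow the inner optimizer
        if step % sync_period == 0:              # every k steps...
            for slow, fast in zip(slow_weights, fast_weights):
                slow += slow_step * (fast - slow)  # slow <- slow + alpha * (fast - slow)
                fast[...] = slow                   # fast weights restart from the slow weights
    return slow_weights
```

Basic usage: the wrapper accepts the string identifier or instance of an existing Keras optimizer.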
```python
from keras_lookahead import Lookahead

optimizer = Lookahead('adam', sync_period=5, slow_step=0.5)
```
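The resulting object can then be passed wherever a Keras optimizer is expected. The model and loss below are placeholders for illustration, assuming the standalone `keras` package:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras_lookahead import Lookahead

# Placeholder model; any Keras model is compiled the same way.
model = Sequential([
    Dense(32, activation='relu', input_shape=(16,)),
    Dense(1),
])

model.compile(optimizer=Lookahead('adam', sync_period=5, slow_step=0.5), loss='mse')
```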
Custom optimizers can also be used:
```python
from keras_radam import RAdam
from keras_lookahead import Lookahead

optimizer = Lookahead(RAdam())
```
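Any configured optimizer instance can be wrapped the same way; for example (the hyperparameter values here are arbitrary, and `keras.optimizers.Adam` is assumed):

```python
from keras.optimizers import Adam
from keras_lookahead import Lookahead

# The wrapped optimizer keeps its own hyperparameters; Lookahead only adds
# the slow-weight synchronization on top of its updates.
optimizer = Lookahead(Adam(lr=1e-3), sync_period=6, slow_step=0.5)
```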