lmflow.optim.adabound#

Classes#

AdaBound

Implements AdaBound algorithm.

Module Contents#

class lmflow.optim.adabound.AdaBound(params, lr: float = 0.001, betas=(0.9, 0.999), final_lr: float = 0.1, gamma: float = 0.001, eps: float = 1e-08, weight_decay: float = 0, amsbound: bool = False)[source]#

Bases: torch.optim.optimizer.Optimizer

Implements AdaBound algorithm.

It has been proposed in `Adaptive Gradient Methods with Dynamic Bound of Learning Rate <https://arxiv.org/abs/1902.09843>`_.

Note:

Reference code: Luolc/AdaBound

base_lrs[source]#
__setstate__(state) -> None[source]#
step(closure=None)[source]#

Performs a single optimization step.

Arguments:

closure (callable, optional): A closure that reevaluates the model and returns the loss.
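To illustrate what a single step does, here is a minimal, self-contained sketch of the AdaBound update rule for one scalar parameter, following the paper's formulation: an Adam-style step whose effective per-parameter learning rate is clipped between dynamic lower and upper bounds that both converge to ``final_lr``. This is an illustrative reimplementation, not the class's actual code; the function name ``adabound_step`` and its scalar interface are assumptions for the example.

```python
import math

def adabound_step(param, grad, m, v, t, lr=0.001, betas=(0.9, 0.999),
                  final_lr=0.1, gamma=0.001, eps=1e-8):
    """One AdaBound update for a single scalar parameter (illustrative sketch)."""
    beta1, beta2 = betas
    # Adam-style exponential moving averages of the gradient and its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias corrections, as in Adam
    bias_c1 = 1 - beta1 ** t
    bias_c2 = 1 - beta2 ** t
    step_size = lr * math.sqrt(bias_c2) / bias_c1
    # Dynamic bounds: both converge to final_lr as t grows, so the
    # optimizer smoothly transitions from Adam-like to SGD-like behavior
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    # Clip the effective per-parameter learning rate into [lower, upper]
    effective_lr = min(max(step_size / (math.sqrt(v) + eps), lower), upper)
    param = param - effective_lr * m
    return param, m, v

# A few steps with a constant positive gradient move the parameter downhill
p, m, v = 1.0, 0.0, 0.0
for t in range(1, 6):
    p, m, v = adabound_step(p, grad=0.5, m=m, v=v, t=t)
```

With the default ``gamma=0.001``, the bounds start very loose (near 0 and near ``final_lr / gamma``) and tighten over time, which is why early training behaves like Adam while late training behaves like SGD with learning rate ``final_lr``.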