lmflow.optim.adabelief#
Classes#
AdaBelief | Implements the AdaBelief algorithm. Modified from Adam in PyTorch.
Module Contents#
- class lmflow.optim.adabelief.AdaBelief(params, lr=0.001, betas=(0.9, 0.999), eps=1e-16, weight_decay=0, amsgrad=False, weight_decouple=True, fixed_decay=False, rectify=True, degenerated_to_sgd=True, print_change_log=True)[source]#
Bases: torch.optim.optimizer.Optimizer
Implements the AdaBelief algorithm, modified from Adam in PyTorch. Reference: "AdaBelief Optimizer: Adapting Stepsizes by the Belief in Observed Gradients", NeurIPS 2020.
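The core idea can be illustrated with a minimal sketch of the update rule for a single scalar parameter. This is not the lmflow implementation (which wraps the logic in `torch.optim.Optimizer` with parameter groups, decoupled weight decay, AMSGrad, and rectification options); it only shows how AdaBelief differs from Adam: the second moment tracks the squared deviation of the gradient from its EMA prediction, rather than the squared gradient itself. The function name `adabelief_step` and the scalar setup are illustrative assumptions.

```python
# Illustrative sketch of one AdaBelief update for a scalar parameter.
# Defaults mirror the class signature above (lr=0.001, betas=(0.9, 0.999),
# eps=1e-16); weight decay, AMSGrad, and rectification are omitted.

def adabelief_step(theta, grad, m, s, t, lr=0.001,
                   betas=(0.9, 0.999), eps=1e-16):
    beta1, beta2 = betas
    # First moment: EMA of gradients (identical to Adam).
    m = beta1 * m + (1 - beta1) * grad
    # Second moment: EMA of the squared *deviation* of the observed
    # gradient from its prediction m -- the "belief" in the gradient.
    # Adam would accumulate grad**2 here instead.
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2 + eps
    # Bias correction, as in Adam.
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (s_hat ** 0.5 + eps)
    return theta, m, s

# Usage: minimize f(x) = x**2 starting from x = 1.0.
x, m, s = 1.0, 0.0, 0.0
for t in range(1, 201):
    grad = 2.0 * x  # f'(x)
    x, m, s = adabelief_step(x, grad, m, s, t, lr=0.01)
```

When the loss surface is smooth and the gradient matches its prediction, `(grad - m) ** 2` is small, so AdaBelief takes larger steps than Adam; in noisy regions the deviation grows and steps shrink.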