lmflow.optim.adabelief
======================

.. py:module:: lmflow.optim.adabelief


Classes
-------

.. autoapisummary::

   lmflow.optim.adabelief.AdaBelief


Module Contents
---------------

.. py:class:: AdaBelief(params, lr=0.001, betas=(0.9, 0.999), eps=1e-16, weight_decay=0, amsgrad=False, weight_decouple=True, fixed_decay=False, rectify=True, degenerated_to_sgd=True, print_change_log=True)

   Bases: :py:obj:`torch.optim.optimizer.Optimizer`

   Implements the AdaBelief algorithm, modified from the Adam implementation in PyTorch.

   Reference: AdaBelief Optimizer, adapting stepsizes by the belief in observed gradients, NeurIPS 2020.


   .. py:attribute:: degenerated_to_sgd


   .. py:attribute:: weight_decouple


   .. py:attribute:: rectify


   .. py:attribute:: fixed_decay


   .. py:method:: __setstate__(state)


   .. py:method:: reset()


   .. py:method:: step(closure=None)

      Performs a single optimization step.

      Arguments:
          closure (callable, optional): A closure that reevaluates the model
              and returns the loss.
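
A minimal usage sketch, assuming the class is importable from :py:mod:`lmflow.optim.adabelief` as listed above and follows the standard :py:class:`torch.optim.Optimizer` interface it inherits from; the model, loss, and data below are placeholders, and the non-default keyword values are illustrative only.

.. code-block:: python

   import torch
   import torch.nn as nn

   from lmflow.optim.adabelief import AdaBelief

   # Placeholder model and loss; any torch.nn.Module works the same way.
   model = nn.Linear(10, 1)
   criterion = nn.MSELoss()

   # Constructor arguments mirror the documented signature above.
   optimizer = AdaBelief(
       model.parameters(),
       lr=1e-3,
       betas=(0.9, 0.999),
       eps=1e-16,
       weight_decay=0,
       weight_decouple=True,
       rectify=True,
   )

   # One training step: compute the loss, backpropagate, then update parameters.
   inputs = torch.randn(32, 10)
   targets = torch.randn(32, 1)

   optimizer.zero_grad()
   loss = criterion(model(inputs), targets)
   loss.backward()
   optimizer.step()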