lmflow.optim.adam#

Module Contents#

class lmflow.optim.adam.Adam(params, lr=0.001, betas=(0.9, 0.999), eps=1e-08)[source]#

Implements the Adam optimization algorithm. Following the standard Adam parameterization: params is an iterable of parameters to optimize, lr is the learning rate, betas are the exponential decay rates for the first- and second-moment estimates, and eps is a small constant added to the denominator for numerical stability.

Bases: torch.optim.optimizer.Optimizer

step(closure=None)[source]#

Performs a single optimization step. closure is an optional callable that re-evaluates the model and returns the loss, per the torch.optim.Optimizer interface.
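To illustrate what a single step() computes, here is a minimal, dependency-free sketch of the standard Adam update rule with the same default hyperparameters (lr=0.001, betas=(0.9, 0.999), eps=1e-08). The function adam_step and its plain-list state dictionary are hypothetical helpers for exposition only; the actual class operates on torch tensors through the Optimizer API.

```python
import math

def adam_step(params, grads, state, lr=0.001, betas=(0.9, 0.999), eps=1e-08):
    """One Adam update over a list of scalar parameters (illustrative only)."""
    beta1, beta2 = betas
    state["t"] += 1  # global step counter used for bias correction
    t = state["t"]
    updated = []
    for i, (p, g) in enumerate(zip(params, grads)):
        # Biased first- and second-moment estimates (exponential moving averages).
        state["m"][i] = beta1 * state["m"][i] + (1 - beta1) * g
        state["v"][i] = beta2 * state["v"][i] + (1 - beta2) * g * g
        # Bias-corrected moments.
        m_hat = state["m"][i] / (1 - beta1 ** t)
        v_hat = state["v"][i] / (1 - beta2 ** t)
        # Parameter update.
        updated.append(p - lr * m_hat / (math.sqrt(v_hat) + eps))
    return updated

# Demo: minimize f(x) = x^2 starting from x = 1.0; x approaches the minimum at 0.
state = {"t": 0, "m": [0.0], "v": [0.0]}
x = [1.0]
for _ in range(2000):
    grad = [2.0 * x[0]]  # f'(x) = 2x
    x = adam_step(x, grad, state, lr=0.05)
```

Note that on the very first step the bias-corrected ratio m_hat / sqrt(v_hat) equals g / |g|, so the initial update magnitude is approximately lr regardless of the gradient's scale.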