Package ai.djl.training.optimizer
Class Nag
- java.lang.Object
-
- ai.djl.training.optimizer.Optimizer
-
- ai.djl.training.optimizer.Nag
-
public class Nag extends Optimizer
Nag is a Nesterov accelerated gradient optimizer. This optimizer updates each weight by:
\( state = momentum * state + grad + wd * weight \)
\( weight = weight - lr * (grad + momentum * state) \)
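The two update equations above can be sketched in plain Java, independent of the DJL API. The hyperparameter values (lr, momentum, wd) below are arbitrary illustration choices, not DJL defaults:

```java
public class NagSketch {

    /**
     * Applies one NAG update step in place, following the two
     * equations from the class documentation.
     */
    static void nagStep(float[] weight, float[] grad, float[] state,
                        float lr, float momentum, float wd) {
        for (int i = 0; i < weight.length; i++) {
            // state = momentum * state + grad + wd * weight
            state[i] = momentum * state[i] + grad[i] + wd * weight[i];
            // weight = weight - lr * (grad + momentum * state)
            weight[i] -= lr * (grad[i] + momentum * state[i]);
        }
    }

    public static void main(String[] args) {
        float[] weight = {1.0f, -2.0f};
        float[] grad = {0.5f, 0.25f};
        float[] state = new float[2]; // momentum buffer, starts at zero
        // assumed hyperparameters for illustration only
        nagStep(weight, grad, state, 0.1f, 0.9f, 0.01f);
        System.out.printf("%.4f %.4f%n", weight[0], weight[1]);
    }
}
```

Note that the momentum state enters the weight update a second time (the `momentum * state` term), which is what distinguishes Nesterov momentum from plain SGD with momentum.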
-
-
Nested Class Summary
Nested Classes
Modifier and Type: static class
Class: Nag.Builder
Description: The Builder to construct a Nag object.
Nested classes/interfaces inherited from class ai.djl.training.optimizer.Optimizer
Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
-
-
Field Summary
-
Fields inherited from class ai.djl.training.optimizer.Optimizer
clipGrad, rescaleGrad
-
-
Constructor Summary
Constructors
Modifier: protected
Constructor: Nag(Nag.Builder builder)
Description: Creates a new instance of Nag optimizer.
-
Method Summary
All Methods | Instance Methods | Concrete Methods
Modifier and Type: void
Method: update(java.lang.String parameterId, NDArray weight, NDArray grad)
Description: Updates the parameters according to the gradients.
Methods inherited from class ai.djl.training.optimizer.Optimizer
adadelta, adagrad, adam, adamW, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
-
-
-
-
Constructor Detail
-
Nag
protected Nag(Nag.Builder builder)
Creates a new instance of Nag optimizer.
- Parameters:
builder - the builder to create a new instance of Nag optimizer
-
-