Package ai.djl.training.optimizer
Class Adagrad
- java.lang.Object
-
- ai.djl.training.optimizer.Optimizer
-
- ai.djl.training.optimizer.Adagrad
-
public class Adagrad extends Optimizer
Adagrad is an AdaGrad Optimizer. This class implements
the AdaGrad update rule, which updates the weights using:
\( grad = clip(grad * rescale_grad, clip_grad) + wd * weight \)
\( history += grad^2 \)
\( weight -= lr * grad / (sqrt(history) + epsilon) \)
where grad represents the gradient, wd represents the weight decay, and lr represents the learning rate.
- See Also:
- The D2L chapter on Adagrad
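The three update formulas above can be sketched in plain Java, independent of DJL's NDArray API. This is a minimal illustration of the rule, not the library's implementation; the class name AdagradSketch and the use of float[] arrays are assumptions made for the example.

```java
// Minimal sketch of the AdaGrad update rule documented above.
// Field names (lr, wd, rescaleGrad, clipGrad, epsilon, history)
// mirror the symbols in the formulas; this is not DJL code.
public class AdagradSketch {
    private final float lr;          // learning rate
    private final float epsilon;     // numerical-stability term
    private final float wd;          // weight decay
    private final float rescaleGrad; // gradient rescaling factor
    private final float clipGrad;    // clip threshold; <= 0 means no clipping
    private final float[] history;   // accumulated squared gradients

    public AdagradSketch(int size, float lr, float epsilon,
                         float wd, float rescaleGrad, float clipGrad) {
        this.lr = lr;
        this.epsilon = epsilon;
        this.wd = wd;
        this.rescaleGrad = rescaleGrad;
        this.clipGrad = clipGrad;
        this.history = new float[size];
    }

    /** Applies one AdaGrad step to {@code weight} in place. */
    public void update(float[] weight, float[] grad) {
        for (int i = 0; i < weight.length; i++) {
            // grad = clip(grad * rescale_grad, clip_grad) + wd * weight
            float g = grad[i] * rescaleGrad;
            if (clipGrad > 0) {
                g = Math.max(-clipGrad, Math.min(clipGrad, g));
            }
            g += wd * weight[i];
            // history += grad^2
            history[i] += g * g;
            // weight -= lr * grad / (sqrt(history) + epsilon)
            weight[i] -= lr * g / ((float) Math.sqrt(history[i]) + epsilon);
        }
    }
}
```

Because history grows monotonically, the effective per-parameter step size shrinks over time, which is the defining behavior of AdaGrad.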
-
-
Nested Class Summary
Nested Classes
static class Adagrad.Builder: The Builder to construct an Adagrad object.
-
Nested classes/interfaces inherited from class ai.djl.training.optimizer.Optimizer
Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
-
-
Field Summary
-
Fields inherited from class ai.djl.training.optimizer.Optimizer
clipGrad, rescaleGrad
-
-
Constructor Summary
Constructors
protected Adagrad(Adagrad.Builder builder): Creates a new instance of the Adagrad optimizer.
-
Method Summary
All Methods Static Methods Instance Methods Concrete Methods
static Adagrad.Builder builder(): Creates a builder to build an Adagrad.
void update(java.lang.String parameterId, NDArray weight, NDArray grad): Updates the parameters according to the gradients.
-
Methods inherited from class ai.djl.training.optimizer.Optimizer
adadelta, adagrad, adam, adamW, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
-
-
-
-
Constructor Detail
-
Adagrad
protected Adagrad(Adagrad.Builder builder)
Creates a new instance of the Adagrad optimizer.
- Parameters:
builder - the builder to create a new instance of the Adagrad optimizer
-
-
Method Detail
-
update
public void update(java.lang.String parameterId, NDArray weight, NDArray grad)
Updates the parameters according to the gradients.
-
builder
public static Adagrad.Builder builder()
Creates a builder to build an Adagrad.
- Returns:
- a new builder
-
-