Package ai.djl.training.optimizer

Contains classes for optimizing a neural network Block.

It contains the main Optimizer interface and the various optimizers that extend it. Helpers for scheduling learning rates are in ai.djl.training.tracker.
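To make concrete what an Optimizer does each step, here is a minimal sketch of a plain SGD weight update with a fixed learning rate. This is illustrative standalone Java, not the ai.djl API; the class and method names are invented for the example.

```java
// Illustrative sketch only, not ai.djl code: one SGD-style update step.
// w_new = w - lr * grad  (plain SGD, no momentum or weight decay)
public class SgdSketch {

    static float[] step(float[] weights, float[] grads, float lr) {
        float[] out = new float[weights.length];
        for (int i = 0; i < weights.length; i++) {
            // Move each weight against its gradient, scaled by the learning rate.
            out[i] = weights[i] - lr * grads[i];
        }
        return out;
    }

    public static void main(String[] args) {
        float[] w = {1.0f, -2.0f};
        float[] g = {0.5f, 0.25f};
        float[] updated = step(w, g, 0.1f);
        System.out.println(updated[0] + " " + updated[1]);
    }
}
```

In the real library, the learning rate would come from a Tracker in ai.djl.training.tracker rather than a fixed constant, which is what allows schedules such as decay or warmup.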
Class Summary

- Adadelta: Adadelta is an Adadelta Optimizer.
- Adadelta.Builder: The Builder to construct an Adadelta object.
- Adagrad: Adagrad is an AdaGrad Optimizer.
- Adagrad.Builder: The Builder to construct an Adagrad object.
- Adam: Adam is a generalization of the AdaGrad Optimizer.
- Adam.Builder: The Builder to construct an Adam object.
- AdamW: AdamW is a variant of Adam that decouples weight decay from the gradient update.
- AdamW.Builder: The Builder to construct an AdamW object.
- Nag: Nag is a Nesterov accelerated gradient optimizer.
- Nag.Builder: The Builder to construct a Nag object.
- Optimizer: An Optimizer updates the weight parameters to minimize the loss function.
- Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>: The Builder to construct an Optimizer.
- RmsProp: The RMSProp Optimizer.
- RmsProp.Builder: The Builder to construct an RmsProp object.
- Sgd: Sgd is a Stochastic Gradient Descent (SGD) optimizer.
- Sgd.Builder: The Builder to construct an Sgd object.
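Each optimizer above is paired with a Builder class, following the builder pattern. The sketch below shows the general shape of that pattern with a hypothetical SgdConfig class; the names (SgdConfig, optLearningRate, optMomentum) are invented for illustration and are not the actual ai.djl signatures.

```java
// Minimal sketch of the Builder pattern used by the *.Builder classes.
// All names here are illustrative, not part of the ai.djl API.
public class OptimizerBuilderSketch {

    static final class SgdConfig {
        final float learningRate;
        final float momentum;

        private SgdConfig(Builder b) {
            this.learningRate = b.learningRate;
            this.momentum = b.momentum;
        }

        static Builder builder() {
            return new Builder();
        }

        static final class Builder {
            // Defaults are illustrative placeholders.
            private float learningRate = 0.01f;
            private float momentum = 0.0f;

            Builder optLearningRate(float lr) {
                this.learningRate = lr;
                return this; // returning this enables method chaining
            }

            Builder optMomentum(float m) {
                this.momentum = m;
                return this;
            }

            SgdConfig build() {
                return new SgdConfig(this);
            }
        }
    }

    public static void main(String[] args) {
        SgdConfig cfg = SgdConfig.builder()
                .optLearningRate(0.1f)
                .optMomentum(0.9f)
                .build();
        System.out.println(cfg.learningRate + " " + cfg.momentum);
    }
}
```

The builder keeps the optimizer immutable once constructed: optional settings are supplied through chained calls, and build() produces the finished object.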