Class Adagrad


  • public class Adagrad
    extends Optimizer
    Adagrad is an AdaGrad Optimizer.

    This class implements the AdaGrad algorithm, which updates the weights using:

    \( grad = \mathrm{clip}(grad \cdot resc\_grad,\ clip\_grad) + wd \cdot weight \)
    \( history \mathrel{+}= grad^2 \)
    \( weight \mathrel{-}= lr \cdot grad / (\sqrt{history} + \epsilon) \)

    where grad represents the gradient, resc_grad and clip_grad represent the gradient rescaling factor and clipping threshold, wd represents weight decay, lr represents the learning rate, and \( \epsilon \) is a small constant added for numerical stability.

    See Also:
    The D2L chapter on Adagrad
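
    As a concrete illustration of the formulas above, the following is a minimal sketch of one AdaGrad step over plain Java arrays. The names (lr, wd, epsilon, rescaleGrad, clipGrad) mirror the symbols in the formulas and are illustrative only; they are not fields of this class.

        static void adagradStep(float[] weight, float[] grad, float[] history,
                                float lr, float wd, float epsilon,
                                float rescaleGrad, float clipGrad) {
            for (int i = 0; i < weight.length; i++) {
                // grad = clip(grad * resc_grad, clip_grad) + wd * weight
                float g = Math.max(-clipGrad, Math.min(clipGrad, grad[i] * rescaleGrad))
                        + wd * weight[i];
                // history += grad^2
                history[i] += g * g;
                // weight -= lr * grad / (sqrt(history) + epsilon)
                weight[i] -= lr * g / ((float) Math.sqrt(history[i]) + epsilon);
            }
        }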
    • Constructor Detail

      • Adagrad

        protected Adagrad​(Adagrad.Builder builder)
        Creates a new instance of Adagrad optimizer.
        Parameters:
        builder - the builder to create a new instance of the Adagrad optimizer
    • Method Detail

      • update

        public void update​(java.lang.String parameterId,
                           NDArray weight,
                           NDArray grad)
        Updates the parameters according to the gradients.
        Specified by:
        update in class Optimizer
        Parameters:
        parameterId - the parameter to be updated
        weight - the weights of the parameter
        grad - the gradients
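
        In practice, update is normally driven by a Trainer rather than called directly, but a direct call looks like the following. This is a hedged sketch: it assumes the DJL NDManager and NDArray factory methods shown here and a no-argument build() on Adagrad.Builder; the parameter id "myParam" is made up for illustration.

            // assumes imports of ai.djl.ndarray.NDManager and ai.djl.ndarray.NDArray
            try (NDManager manager = NDManager.newBaseManager()) {
                NDArray weight = manager.create(new float[] {0.5f, -0.3f});
                NDArray grad = manager.create(new float[] {0.1f, 0.2f});
                Adagrad optimizer = Adagrad.builder().build();
                // Applies one AdaGrad step; weight is modified in place.
                optimizer.update("myParam", weight, grad);
            }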
      • builder

        public static Adagrad.Builder builder()
        Creates a builder to build an Adagrad.
        Returns:
        a new builder
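
        For example, the builder can be used to create an optimizer for a training configuration. This is a hedged sketch: it assumes Adagrad.Builder follows the usual DJL builder pattern with a build() method, and that DefaultTrainingConfig exposes optOptimizer, as it does for other DJL optimizers.

            // assumes imports of ai.djl.training.DefaultTrainingConfig,
            // ai.djl.training.TrainingConfig, and ai.djl.training.loss.Loss
            Optimizer adagrad = Adagrad.builder().build();
            TrainingConfig config =
                    new DefaultTrainingConfig(Loss.l2Loss()).optOptimizer(adagrad);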