Package org.nd4j.linalg.learning
Class AdaGradUpdater
- java.lang.Object
- org.nd4j.linalg.learning.AdaGradUpdater
- All Implemented Interfaces:
GradientUpdater<AdaGrad>
public class AdaGradUpdater extends Object implements GradientUpdater<AdaGrad>
-
Field Summary
Fields
- static String GRAD_STATE
- INDArray historicalGradient
- protected double learningRate
- protected int numIterations
- int[] shape
-
Constructor Summary
Constructors
- AdaGradUpdater(AdaGrad config)
-
Method Summary
Methods
- void applyUpdater(INDArray gradient, int iteration, int epoch)
  Gets feature-specific learning rates. AdaGrad keeps a history of the gradients passed in.
- Map<String,INDArray> getState()
- void setState(Map<String,INDArray> stateMap, boolean initialize)
- void setStateViewArray(INDArray viewArray, long[] gradientShape, char gradientOrder, boolean initialize)
  For the internal updater state (if any): set this to use the provided array.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.nd4j.linalg.learning.GradientUpdater
getConfig
-
-
Field Detail
-
GRAD_STATE
public static final String GRAD_STATE
- See Also:
- Constant Field Values
-
historicalGradient
public INDArray historicalGradient
-
shape
public int[] shape
-
learningRate
protected double learningRate
-
numIterations
protected int numIterations
-
-
Constructor Detail
-
AdaGradUpdater
public AdaGradUpdater(AdaGrad config)
-
Method Detail
-
setState
public void setState(Map<String,INDArray> stateMap, boolean initialize)
- Specified by:
setState in interface GradientUpdater<AdaGrad>
-
getState
public Map<String,INDArray> getState()
- Specified by:
getState in interface GradientUpdater<AdaGrad>
-
setStateViewArray
public void setStateViewArray(INDArray viewArray, long[] gradientShape, char gradientOrder, boolean initialize)
Description copied from interface: GradientUpdater
For the internal updater state (if any): set this to use the provided array. Used during initialization, and when restoring the updater state (after serialization, for example).
- Specified by:
setStateViewArray in interface GradientUpdater<AdaGrad>
- Parameters:
viewArray - Array (that is a view of a larger array) to use for the state
initialize - If true: the updater must initialize the view array. If false: no change to view array contents
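The "view array" idea above can be sketched in plain Java. This is a hypothetical illustration, not ND4J code: the trainer owns one flat state buffer, and each updater is handed a slice of it (offset and length here stand in for ND4J's actual view mechanism; `StateViewSketch` and its fields are invented names).

```java
import java.util.Arrays;

// Hypothetical sketch of the "state view" pattern: the updater's state
// lives inside a larger buffer owned by the trainer, not in its own array.
public class StateViewSketch {
    private double[] buffer; // the trainer's flat state buffer
    private int offset;      // where this updater's slice begins
    private int length;      // how many entries belong to this updater

    // Analogous in spirit to setStateViewArray: point the updater at its
    // slice, and optionally zero-initialize it (assumed initialization).
    public void setStateViewArray(double[] buffer, int offset, int length,
                                  boolean initialize) {
        this.buffer = buffer;
        this.offset = offset;
        this.length = length;
        if (initialize) {
            // Only this updater's slice is touched; the rest of the
            // buffer belongs to other updaters and stays intact.
            Arrays.fill(buffer, offset, offset + length, 0.0);
        }
    }
}
```

Because the state is a view, serializing or restoring the whole training run only has to copy one flat buffer rather than gather per-updater arrays.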
-
applyUpdater
public void applyUpdater(INDArray gradient, int iteration, int epoch)
Gets feature-specific learning rates. AdaGrad keeps a history of the gradients passed in; each gradient becomes adapted over time, hence the op name "adagrad".
- Specified by:
applyUpdater in interface GradientUpdater<AdaGrad>
- Parameters:
gradient - the gradient to get learning rates for
iteration -
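The per-feature adaptation described above follows the standard AdaGrad rule: accumulate the squared gradient per parameter, then scale the step by the inverse square root of that history. The sketch below uses plain arrays rather than INDArrays, and the epsilon term and in-place update are assumptions for illustration, not ND4J's exact implementation.

```java
// Plain-array sketch of the AdaGrad update rule that applyUpdater applies
// to an INDArray gradient. Names and the eps constant are illustrative.
public class AdaGradSketch {
    private final double learningRate;
    private final double eps = 1e-6;           // numerical-stability term (assumed)
    private final double[] historicalGradient; // running sum of squared gradients

    public AdaGradSketch(double learningRate, int size) {
        this.learningRate = learningRate;
        this.historicalGradient = new double[size];
    }

    /** Rewrites `gradient` in place into the AdaGrad-scaled update. */
    public void applyUpdater(double[] gradient) {
        for (int i = 0; i < gradient.length; i++) {
            // Accumulate squared gradient: parameters that have seen large
            // gradients so far get proportionally smaller future steps.
            historicalGradient[i] += gradient[i] * gradient[i];
            gradient[i] = learningRate * gradient[i]
                    / (Math.sqrt(historicalGradient[i]) + eps);
        }
    }
}
```

This is why the class keeps a `historicalGradient` field and why `getState()`/`setState(...)` must round-trip it: losing the history would reset every per-feature learning rate.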
-