Class Stochastic

  • All Implemented Interfaces:
    HillClimber

    public final class Stochastic
    extends AbstractClimber
    Stochastic gradient descent (SGD) optimizer.

    w(t+1) = w(t) - alpha * dL/dw(t), where:

    • w(t) is the current window size
    • alpha is the learning rate (step size)
    • dL/dw(t) is the gradient of the hit-rate curve with respect to the window size
    • w(t+1) is the new window size configuration
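    The update rule above can be sketched as a single arithmetic step. This is an
    illustrative sketch only; the variable names (window, alpha, gradient) are
    assumptions for the example, not fields of this class.

    ```java
    // Illustrative sketch of the plain SGD window-size update described above;
    // the names and values here are hypothetical, not Caffeine's internals.
    final class SgdUpdateSketch {
      public static void main(String[] args) {
        double window = 64.0;     // w(t): current window size
        double alpha = 0.05;      // alpha: learning rate (step size)
        double gradient = -12.5;  // dL/dw(t): estimated gradient of the curve

        // w(t+1) = w(t) - alpha * dL/dw(t)
        double next = window - alpha * gradient;
        System.out.println(next); // steps the window against the gradient
      }
    }
    ```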

    SGD may be enhanced using momentum, either classical or Nesterov's. For details, see:

    • https://towardsdatascience.com/10-gradient-descent-optimisation-algorithms-86989510b5e9
    • http://ruder.io/optimizing-gradient-descent/index.html#momentum
    • http://cs231n.github.io/neural-networks-3/#sgd
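    The two momentum variants discussed in the links above can be sketched as
    follows. This is a generic illustration on a toy quadratic loss, with assumed
    names (alpha, beta, velocity); it is not this class's implementation.

    ```java
    // Hedged sketch of classical vs. Nesterov momentum on a toy quadratic
    // loss (w - 100)^2, whose minimum is at w = 100. Names are illustrative.
    import java.util.function.DoubleUnaryOperator;

    final class MomentumSketch {
      public static void main(String[] args) {
        DoubleUnaryOperator gradient = w -> 2 * (w - 100); // dL/dw of (w - 100)^2
        double alpha = 0.1;  // learning rate
        double beta = 0.9;   // momentum coefficient

        // Classical momentum: velocity accumulates a decaying sum of gradients.
        double w = 0, velocity = 0;
        for (int i = 0; i < 200; i++) {
          velocity = beta * velocity - alpha * gradient.applyAsDouble(w);
          w += velocity;
        }
        System.out.println(w); // approaches the minimum at w = 100

        // Nesterov momentum: evaluate the gradient at the look-ahead point.
        double wn = 0, vn = 0;
        for (int i = 0; i < 200; i++) {
          vn = beta * vn - alpha * gradient.applyAsDouble(wn + beta * vn);
          wn += vn;
        }
        System.out.println(wn); // approaches the minimum faster, with less overshoot
      }
    }
    ```

    Nesterov's variant differs only in where the gradient is evaluated: at the
    look-ahead position w + beta * v rather than at the current w, which damps
    the oscillation that classical momentum can exhibit.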
    • Constructor Detail

      • Stochastic

        public Stochastic(Config config)
    • Method Detail

      • adjust

        protected double adjust(double hitRate)
        Description copied from class: AbstractClimber
        Returns the amount to adapt by.
        Specified by:
        adjust in class AbstractClimber
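    How adjust(hitRate) might map the SGD formula onto a hit-rate observation can
    be sketched as below. This is a hypothetical illustration: the field names,
    the finite-difference gradient estimate, and the sign convention (climbing
    the hit-rate curve rather than descending a loss) are all assumptions, not
    AbstractClimber's actual contract.

    ```java
    // Hypothetical sketch of an adjust(hitRate) computation in the spirit of
    // the SGD formula above; not the real Stochastic/AbstractClimber code.
    final class AdjustSketch {
      private double previousHitRate = 0.50; // hit rate seen before the last step
      private double previousStep = 8.0;     // size of the last window adjustment
      private final double learningRate = 0.05; // alpha

      /** Returns the amount to adapt by: alpha times an estimated gradient. */
      double adjust(double hitRate) {
        // Finite-difference estimate of d(hitRate)/d(window) from the last step.
        double gradient = (hitRate - previousHitRate) / previousStep;
        double step = learningRate * gradient; // move along the hit-rate slope
        previousHitRate = hitRate;
        if (step != 0) {
          previousStep = step;
        }
        return step;
      }

      public static void main(String[] args) {
        AdjustSketch sketch = new AdjustSketch();
        System.out.println(sketch.adjust(0.54)); // a small positive adjustment
      }
    }
    ```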