Class GradientDescentOptimizer

  • All Implemented Interfaces:
    IGradientBasedOptimizer

    public class GradientDescentOptimizer
    extends java.lang.Object
    implements IGradientBasedOptimizer
    An optimizer based on the gradient descent method [1]. This is the naive implementation: it computes the gradient in every step and updates the parameters in the direction of the negative gradient. Given a suitable step size, this method converges to the global optimum if the underlying function is convex.

    At some point in the future, faster methods should probably be implemented, for example the accelerated gradient methods described at http://www.seas.ucla.edu/~vandenbe/236C/lectures/fgrad.pdf.

    [1] Jonathan Barzilai and Jonathan M. Borwein, "Two-point step size gradient methods", in: IMA Journal of Numerical Analysis, 8.1 (1988), pp. 141-148.
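    The update rule described above can be sketched in isolation. The following is a minimal, self-contained illustration of the naive scheme, not the class's actual implementation; the fixed step size, iteration budget, and one-dimensional setting are illustrative assumptions:

```java
import java.util.function.DoubleUnaryOperator;

public class NaiveGradientDescent {

    /**
     * Naive gradient descent: repeatedly step in the direction of the
     * negative gradient. Step size and iteration count are assumptions
     * made for this sketch, not values taken from the library.
     */
    static double minimize(DoubleUnaryOperator gradient,
                           double initialGuess,
                           double stepSize,
                           int maxIterations) {
        double x = initialGuess;
        for (int i = 0; i < maxIterations; i++) {
            double g = gradient.applyAsDouble(x);
            x -= stepSize * g; // update into the negative gradient direction
        }
        return x;
    }

    public static void main(String[] args) {
        // Convex example: f(x) = (x - 3)^2 with gradient f'(x) = 2 (x - 3);
        // the minimum is at x = 3.
        double x = minimize(v -> 2 * (v - 3), 0.0, 0.1, 200);
        System.out.println(x);
    }
}
```

    With a convex objective such as the quadratic above, the iterate contracts toward the minimizer at every step, which is the convergence behavior the class description refers to.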
    • Constructor Detail

      • GradientDescentOptimizer

        public GradientDescentOptimizer()
    • Method Detail

      • optimize

        public org.api4.java.common.math.IVector optimize​(IGradientDescendableFunction descendableFunction,
                                                          IGradientFunction gradient,
                                                          org.api4.java.common.math.IVector initialGuess)
        Description copied from interface: IGradientBasedOptimizer
        Optimize the given function based on its derivative.
        Specified by:
        optimize in interface IGradientBasedOptimizer
        Parameters:
        descendableFunction - the function to optimize
        gradient - the first-order derivative of the function
        initialGuess - the initial guess for the parameters that shall be optimized
        Returns:
        the optimized vector
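        The cited reference [1] motivates a possible refinement of this optimizer: instead of a fixed step size, the two-point (Barzilai-Borwein) rule chooses the step from the last two iterates and gradients. A minimal sketch of the BB1 variant on plain double arrays, using a hypothetical minimize helper rather than this class's API:

```java
import java.util.Arrays;
import java.util.function.Function;

public class BarzilaiBorweinSketch {

    /**
     * Gradient descent with the BB1 two-point step size [1]:
     * alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
     * y = g_k - g_{k-1}. The initial step size is an assumption.
     */
    static double[] minimize(Function<double[], double[]> gradient,
                             double[] x0, double initialStep, int iterations) {
        double[] x = x0.clone();
        double[] g = gradient.apply(x);
        double alpha = initialStep;
        for (int k = 0; k < iterations; k++) {
            // Take a gradient step with the current step size.
            double[] xNew = new double[x.length];
            for (int i = 0; i < x.length; i++) {
                xNew[i] = x[i] - alpha * g[i];
            }
            double[] gNew = gradient.apply(xNew);
            // Compute s^T s and s^T y from the last two iterates.
            double sTs = 0, sTy = 0;
            for (int i = 0; i < x.length; i++) {
                double s = xNew[i] - x[i];
                double y = gNew[i] - g[i];
                sTs += s * s;
                sTy += s * y;
            }
            if (sTy != 0) {
                alpha = sTs / sTy; // BB1 step size for the next iteration
            }
            x = xNew;
            g = gNew;
        }
        return x;
    }

    public static void main(String[] args) {
        // Quadratic f(x) = x1^2 + 2 x2^2 with gradient (2 x1, 4 x2);
        // the minimum is at the origin.
        double[] x = minimize(v -> new double[] {2 * v[0], 4 * v[1]},
                              new double[] {5.0, -3.0}, 0.1, 50);
        System.out.println(Arrays.toString(x));
    }
}
```

        Compared to the fixed-step scheme, the two-point rule adapts the step to the local curvature and typically needs far fewer iterations on quadratic-like objectives, which is why the class description flags it as a candidate improvement.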