- All Implemented Interfaces:
- IGradientBasedOptimizer
public class GradientDescentOptimizer
extends java.lang.Object
implements IGradientBasedOptimizer
An optimizer based on the gradient descent method [1]. This optimizer is the
naive implementation that calculates the gradient in every step and makes an
update in the negative direction of the gradient.
This method is known to converge to the optimum if the underlying function is convex.
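The update rule described above can be sketched as follows. This is a minimal illustration only; the method and parameter names here are hypothetical and do not reflect the actual IGradientBasedOptimizer API.

```java
import java.util.function.Function;

// Illustrative sketch of naive gradient descent; not the actual
// GradientDescentOptimizer implementation.
public class GradientDescentSketch {

    // Repeatedly steps in the negative gradient direction:
    // x_{k+1} = x_k - learningRate * grad f(x_k)
    public static double[] minimize(Function<double[], double[]> gradient,
                                    double[] x0, double learningRate, int steps) {
        double[] x = x0.clone();
        for (int i = 0; i < steps; i++) {
            double[] g = gradient.apply(x);
            for (int j = 0; j < x.length; j++) {
                x[j] -= learningRate * g[j]; // update in the negative gradient direction
            }
        }
        return x;
    }

    public static void main(String[] args) {
        // Example: f(x) = (x - 3)^2 is convex, with gradient 2(x - 3);
        // gradient descent converges to the minimum at x = 3.
        double[] result = minimize(
                x -> new double[]{ 2 * (x[0] - 3) },
                new double[]{ 0.0 }, 0.1, 100);
        System.out.println(result[0]); // close to 3.0
    }
}
```

For a convex, differentiable objective with a suitably small step size, this iteration converges to the global optimum, which is the guarantee the class documentation refers to.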
At some point in the future, we should probably implement faster methods,
such as the accelerated gradient methods described in
http://www.seas.ucla.edu/~vandenbe/236C/lectures/fgrad.pdf
[1] Jonathan Barzilai and Jonathan M. Borwein, "Two-point step size gradient
methods", in: IMA Journal of Numerical Analysis, 8.1 (1988), pp. 141-148.