class LBFGS[T] extends OptimMethod[T]
This implementation of L-BFGS relies on a user-provided line search function (the lineSearch constructor argument). If this function is not provided, a simple learningRate is used to produce fixed-size steps. Fixed-size steps are much less costly than line searches, and can be useful for stochastic problems.
The learning rate is used even when a line search is provided. This is also useful for large-scale stochastic problems, where opfunc is a noisy approximation of f(x). In that case, the learning rate allows a reduction of confidence in the step size.
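The role the learning rate plays in a fixed-size step can be sketched in a few lines of plain Scala. This is an illustration only, not the BigDL API; fAndGrad and fixedStep are hypothetical names:

```scala
// Minimal sketch: why a learning rate still matters for step size.
// f(x) = x^2, so df/dx = 2x; the minimum is at x = 0.
def fAndGrad(x: Double): (Double, Double) = (x * x, 2 * x)

// One fixed-size step along the negative gradient, scaled by learningRate.
def fixedStep(x: Double, learningRate: Double): Double = {
  val (_, g) = fAndGrad(x)
  x - learningRate * g
}

val x0 = 1.0
val full = fixedStep(x0, 1.0)    // overshoots to -1.0 (f is unchanged)
val damped = fixedStep(x0, 0.5)  // lands at the minimum 0.0
println(f"full=$full%.1f damped=$damped%.1f")
```

When opfunc is a noisy approximation of f(x), a learningRate below 1.0 damps each step in the same way, reducing confidence in any single (noisy) gradient evaluation.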
- By Inheritance
- LBFGS
- OptimMethod
- Serializable
- Serializable
- AnyRef
- Any
Instance Constructors
-
new
LBFGS(maxIter: Int = 20, maxEval: Double = Double.MaxValue, tolFun: Double = 1e-5, tolX: Double = 1e-9, nCorrection: Int = 100, learningRate: Double = 1.0, verbose: Boolean = false, lineSearch: Option[LineSearch[T]] = None, lineSearchOptions: Option[Table] = None)(implicit arg0: ClassTag[T], ev: TensorNumeric[T])
- maxIter
Maximum number of iterations allowed
- maxEval
Maximum number of function evaluations
- tolFun
Termination tolerance on the first-order optimality
- tolX
Termination tolerance on progress in terms of function/parameter changes
- lineSearch
A line search function
- lineSearchOptions
Options for the line search function; if no line search is provided, a fixed step size is used
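As a rough illustration of how tolFun- and tolX-style tolerances typically terminate an optimizer loop, here is a hedged sketch in plain Scala. converged is a hypothetical helper, not a member of this class:

```scala
// Illustrative sketch (not BigDL internals): stop when either the
// objective value or the parameters have effectively stopped changing.
def converged(fOld: Double, fNew: Double,
              step: Array[Double],
              tolFun: Double = 1e-5, tolX: Double = 1e-9): Boolean = {
  val funChange = math.abs(fNew - fOld)     // first-order progress on f
  val maxStep = step.map(math.abs).max      // largest parameter change
  funChange < tolFun || maxStep < tolX
}

println(converged(1.0, 1.0 + 1e-6, Array(0.1)))  // f barely changed
println(converged(1.0, 0.5, Array(1e-10)))       // x barely moved
println(converged(1.0, 0.5, Array(0.1)))         // neither: keep iterating
```

The defaults above mirror the constructor's tolFun = 1e-5 and tolX = 1e-9.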
Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
clearHistory(): Unit
Clear the history information in the OptimMethod state
- Definition Classes
- LBFGS → OptimMethod
-
def
clone(): OptimMethod[T]
clone OptimMethod
- Definition Classes
- OptimMethod → AnyRef
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
finalize(): Unit
- Attributes
- protected[java.lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
getHyperParameter(): String
Get hyper parameter from config table.
- Definition Classes
- OptimMethod
-
def
getLearningRate(): Double
Get the learning rate
- Definition Classes
- LBFGS → OptimMethod
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- var learningRate: Double
- var lineSearch: Option[LineSearch[T]]
- var lineSearchOptions: Option[Table]
-
def
loadFromTable(config: Table): LBFGS.this.type
Load optimMethod parameters from a Table
- Definition Classes
- LBFGS → OptimMethod
- var maxEval: Double
- var maxIter: Int
- var nCorrection: Int
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
def
optimize(opfunc: (Tensor[T]) ⇒ (T, Tensor[T]), x: Tensor[T]): (Tensor[T], Array[T])
Optimize the model parameter
- opfunc
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
- x
the initial point
- returns
the new x vector and the list of function values, evaluated before the update. x: the new x vector, at the optimal point. f: a table of all function values: f[1] is the value of the function before any optimization and f[#f] is the final fully optimized value, at x*
- Definition Classes
- LBFGS → OptimMethod
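The search direction that an L-BFGS optimize() step is built around can be sketched with the standard two-loop recursion, which approximates -H⁻¹·g from up to nCorrection recent (s, y) correction pairs. This is a plain-Scala illustration of the textbook algorithm, not BigDL's actual implementation:

```scala
// Hedged sketch of the classic L-BFGS two-loop recursion.
// s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i, newest pairs last;
// the number of stored pairs is bounded by nCorrection.
def dot(a: Array[Double], b: Array[Double]): Double =
  a.zip(b).map { case (x, y) => x * y }.sum

def twoLoop(grad: Array[Double],
            s: Seq[Array[Double]],
            y: Seq[Array[Double]]): Array[Double] = {
  val q = grad.clone()
  val alphas = new Array[Double](s.length)
  // First loop: newest to oldest, subtract out curvature contributions.
  for (i <- s.indices.reverse) {
    val rho = 1.0 / dot(y(i), s(i))
    alphas(i) = rho * dot(s(i), q)
    for (j <- q.indices) q(j) -= alphas(i) * y(i)(j)
  }
  // Initial Hessian scaling gamma = s·y / y·y (a standard choice).
  val gamma =
    if (s.isEmpty) 1.0 else dot(s.last, y.last) / dot(y.last, y.last)
  val r = q.map(_ * gamma)
  // Second loop: oldest to newest, add corrections back in.
  for (i <- s.indices) {
    val rho = 1.0 / dot(y(i), s(i))
    val beta = rho * dot(y(i), r)
    for (j <- r.indices) r(j) += (alphas(i) - beta) * s(i)(j)
  }
  r.map(x => -x)  // descent direction -H_k * grad
}
```

With no history the direction reduces to plain steepest descent (-grad); the step along this direction is then chosen by the line search, or scaled by learningRate when none is provided.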
-
def
save(path: String, overWrite: Boolean = false): LBFGS.this.type
save OptimMethod
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
toString(): String
- Definition Classes
- AnyRef → Any
- var tolFun: Double
- var tolX: Double
-
def
updateHyperParameter(): Unit
Update hyper parameter. The hyper parameter is already updated inside the optimize() method. But in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter is unchanged. This method is used to update the hyper parameter on the driver side.
- Definition Classes
- OptimMethod
- def verbose(msg: String): Unit
- var verbose: Boolean
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @throws( ... )
Deprecated Value Members
-
def
clearHistory(state: Table): Table
Clear the history information in the state
- Definition Classes
- OptimMethod
- Annotations
- @deprecated
- Deprecated
(Since version 0.2.0) Please use clearHistory() instead
-
def
getHyperParameter(config: Table): String
Get hyper parameter from config table.
- config
a table containing the hyper parameters.
- Definition Classes
- OptimMethod
- Annotations
- @deprecated
- Deprecated
(Since version 0.2.0) Please use getHyperParameter() instead
-
def
optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table = null): (Tensor[T], Array[T])
Optimize the model parameter
- feval
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
- parameter
the initial point
- config
a table with configuration parameters for the optimizer
- state
a table describing the state of the optimizer; after each call the state is modified
- returns
the new x vector and the list of function values, evaluated before the update
- Definition Classes
- OptimMethod
- Annotations
- @deprecated
- Deprecated
(Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing table
-
def
updateHyperParameter(config: Table, state: Table): Unit
Update hyper parameter. The hyper parameter is already updated inside the optimize() method. But in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter is unchanged. This method is used to update the hyper parameter on the driver side.
- config
config table.
- state
state Table.
- Definition Classes
- OptimMethod
- Annotations
- @deprecated
- Deprecated
(Since version 0.2.0) Please use updateHyperParameter() instead