trait OptimMethod[T] extends Serializable
Similar to a torch optim method, an OptimMethod is used to update the model parameters.
Linear Supertypes: Serializable, Serializable, AnyRef, Any
Abstract Value Members
- abstract def clearHistory(): Unit
  Clear the history information in the OptimMethod state.
- abstract def getLearningRate(): Double
  Get the learning rate.
- abstract def loadFromTable(config: Table): OptimMethod.this.type
  Load OptimMethod parameters from a Table.
- abstract def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T]): (Tensor[T], Array[T])
  Optimize the model parameter.
  - feval: a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
  - parameter: the initial point
  - returns: the new x vector and the function list, evaluated before the update
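As a rough illustration of the feval contract, the sketch below substitutes Array[Double] for Tensor and implements a plain gradient-descent step. The names sgdStep and learningRate are illustrative stand-ins, not part of the BigDL API.

```scala
// Schematic sketch of the optimize() contract, using Array[Double] as a
// stand-in for Tensor[T] and a plain gradient-descent update rule.
object OptimizeSketch {
  // feval: given a point x, return (loss, gradient), mirroring
  // (Tensor[T]) => (T, Tensor[T]) in the real signature.
  type FEval = Array[Double] => (Double, Array[Double])

  // One optimization step: updates the parameter in place and returns
  // (updated parameter, losses evaluated before the update).
  def sgdStep(feval: FEval, parameter: Array[Double],
              learningRate: Double): (Array[Double], Array[Double]) = {
    val (loss, grad) = feval(parameter)
    var i = 0
    while (i < parameter.length) {
      parameter(i) -= learningRate * grad(i) // x := x - lr * df/dx
      i += 1
    }
    (parameter, Array(loss))
  }

  def main(args: Array[String]): Unit = {
    // Minimize f(x) = x^2 starting from x = 4.0.
    val feval: FEval = x => (x(0) * x(0), Array(2 * x(0)))
    var p = Array(4.0)
    for (_ <- 0 until 50) p = sgdStep(feval, p, learningRate = 0.1)._1
    println(f"x after 50 steps: ${p(0)}%.4f") // converges toward 0
  }
}
```

A concrete OptimMethod such as SGD plays the role of sgdStep here, with its hyper parameters held in the method's own state rather than passed as arguments.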
Concrete Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): OptimMethod[T]
  Clone the OptimMethod.
  - Definition Classes: OptimMethod → AnyRef
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[java.lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def getHyperParameter(): String
  Get the hyper parameter from the config table.
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- def save(path: String, overWrite: Boolean = false): OptimMethod.this.type
  Save the OptimMethod to the given path.
  - path: the path to save to
  - overWrite: whether to overwrite an existing file at the path
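Since OptimMethod extends Serializable, saving it amounts to persisting the object graph; the sketch below shows that round trip with standard Java serialization on a hypothetical TinyState stand-in (save/load here are illustrative, not the BigDL implementation).

```scala
import java.io._
import java.nio.file.Files

// Hypothetical stand-in for an OptimMethod's serializable state.
case class TinyState(learningRate: Double, iteration: Int) extends Serializable

object SaveSketch {
  // Persist the state via Java serialization; refuse to clobber an
  // existing file unless overWrite is set, mirroring save()'s flag.
  def save(state: TinyState, path: String, overWrite: Boolean = false): Unit = {
    val f = new File(path)
    if (f.exists() && !overWrite)
      throw new IllegalStateException(s"$path exists and overWrite = false")
    val out = new ObjectOutputStream(new FileOutputStream(f))
    try out.writeObject(state) finally out.close()
  }

  // Read the state back from the same path.
  def load(path: String): TinyState = {
    val in = new ObjectInputStream(new FileInputStream(path))
    try in.readObject().asInstanceOf[TinyState] finally in.close()
  }

  def main(args: Array[String]): Unit = {
    val path = Files.createTempFile("optim", ".obj").toString
    save(TinyState(0.01, 42), path, overWrite = true)
    println(load(path)) // the state survives the round trip
  }
}
```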
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- def updateHyperParameter(): Unit
  Update the hyper parameter. The hyper parameter is already updated in optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter stays unchanged. This method is used to update the hyper parameter on the driver side.
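The driver/executor split above can be sketched with a toy decaying-learning-rate method; DecayingSgd and optimizeStep are illustrative names, not BigDL API. The executor advances the schedule inside its optimize step, while the driver, which never calls optimize, re-derives the learning rate from the shared iteration count.

```scala
// Toy optim method whose learning rate decays with the iteration count.
class DecayingSgd(val baseLr: Double, val decay: Double) {
  var iteration: Int = 0
  var learningRate: Double = baseLr

  // Executor side: every optimize step advances the hyper parameter.
  def optimizeStep(): Unit = { iteration += 1; updateHyperParameter() }

  // Driver side: recompute the hyper parameter from the state alone.
  def updateHyperParameter(): Unit =
    learningRate = baseLr / (1 + decay * iteration)
}

object DriverSync {
  def main(args: Array[String]): Unit = {
    val executor = new DecayingSgd(0.1, 0.01)
    (1 to 100).foreach(_ => executor.optimizeStep())

    val driver = new DecayingSgd(0.1, 0.01)  // stale copy on the driver
    driver.iteration = executor.iteration    // state shipped back
    driver.updateHyperParameter()            // sync without optimizing
    println(driver.learningRate == executor.learningRate)
  }
}
```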
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @throws( ... )
Deprecated Value Members
- def clearHistory(state: Table): Table
  Clear the history information in the state.
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use clearHistory() instead
- def getHyperParameter(config: Table): String
  Get the hyper parameter from the config table.
  - config: a table containing the hyper parameters
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use getHyperParameter() instead
- def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table = null): (Tensor[T], Array[T])
  Optimize the model parameter.
  - feval: a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
  - parameter: the initial point
  - config: a table with configuration parameters for the optimizer
  - state: a table describing the state of the optimizer; the state is modified after each call
  - returns: the new x vector and the function list, evaluated before the update
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing a table
- def updateHyperParameter(config: Table, state: Table): Unit
  Update the hyper parameter. The hyper parameter is already updated in optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter stays unchanged. This method is used to update the hyper parameter on the driver side.
  - config: the config table
  - state: the state Table
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use updateHyperParameter() instead