class Adamax[T] extends OptimMethod[T]
An implementation of Adamax, a variant of Adam based on the infinity norm, described in http://arxiv.org/pdf/1412.6980.pdf
Linear Supertypes: OptimMethod[T], Serializable, Serializable, AnyRef, Any
Instance Constructors
- new Adamax(learningRate: Double = 0.002, beta1: Double = 0.9, beta2: Double = 0.999, Epsilon: Double = 1e-38)(implicit arg0: ClassTag[T], ev: TensorNumeric[T])
- learningRate: learning rate
- beta1: first moment coefficient
- beta2: second moment coefficient
- Epsilon: small constant for numerical stability
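These constructor parameters map directly onto the Adamax update rule from the paper. The sketch below is a minimal, self-contained illustration, not this class's implementation: plain `Array[Double]` stands in for BigDL's `Tensor[T]`, and the names `AdamaxSketch` and `step` are hypothetical.

```scala
object AdamaxSketch {
  // One Adamax step over a flat parameter vector.
  // m holds the first moment, u the exponentially weighted infinity norm;
  // both must be zero-initialized arrays of the same length as param.
  def step(
      param: Array[Double], grad: Array[Double],
      m: Array[Double], u: Array[Double], t: Int,
      lr: Double = 0.002, beta1: Double = 0.9,
      beta2: Double = 0.999, eps: Double = 1e-38): Unit = {
    val biasCorr = 1 - math.pow(beta1, t) // first-moment bias correction
    for (i <- param.indices) {
      m(i) = beta1 * m(i) + (1 - beta1) * grad(i)      // first moment estimate
      u(i) = math.max(beta2 * u(i), math.abs(grad(i))) // infinity-norm estimate
      param(i) -= lr / biasCorr * m(i) / (u(i) + eps)  // parameter update
    }
  }
}
```

A useful property this makes visible: at t = 1, the bias correction cancels the (1 - beta1) factor and u equals |grad|, so each weight moves by roughly learningRate in the direction of -sign(gradient).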
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- var Epsilon: Double
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- var beta1: Double
- var beta2: Double
- def clearHistory(): Unit
  Clear the history information in the OptimMethod state
  - Definition Classes: Adamax → OptimMethod
- def clone(): OptimMethod[T]
  Clone the OptimMethod
  - Definition Classes: OptimMethod → AnyRef
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[java.lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def getHyperParameter(): String
  Get hyper parameter from config table.
  - Definition Classes: OptimMethod
- def getLearningRate(): Double
  Get the learning rate
  - Definition Classes: Adamax → OptimMethod
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- var learningRate: Double
- def loadFromTable(config: Table): Adamax.this.type
  Load optimMethod parameters from a Table
  - Definition Classes: Adamax → OptimMethod
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T]): (Tensor[T], Array[T])
  Perform one Adamax optimization step (http://arxiv.org/pdf/1412.6980.pdf)
  - feval: a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
  - parameter: the initial point
  - returns: the new x vector and the function list {fx}, evaluated before the update
  - Definition Classes: Adamax → OptimMethod
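The feval contract can be illustrated without BigDL. In this hedged sketch, `Array[Double]` stands in for `Tensor[T]`, a quadratic f(x) = Σ xᵢ² plays the role of the model loss, and a plain gradient step substitutes for the Adamax update; the names `FevalSketch`, `feval`, and `minimize` are hypothetical, not part of this API.

```scala
object FevalSketch {
  // feval returns the pair (f(X), df/dX) at the evaluation point X,
  // matching the (T, Tensor[T]) pair that optimize() expects.
  def feval(x: Array[Double]): (Double, Array[Double]) =
    (x.map(v => v * v).sum, x.map(_ * 2))

  // Repeatedly query feval and descend, mimicking repeated optimize() calls.
  def minimize(x0: Array[Double], lr: Double, steps: Int): Array[Double] = {
    var x = x0
    for (_ <- 1 to steps) {
      val (_, dfdx) = feval(x) // f(x) is evaluated before the update
      x = x.zip(dfdx).map { case (v, g) => v - lr * g } // plain gradient step
    }
    x
  }
}
```

As with the real method, the loss returned by feval describes the point before the update, which is why the returned {fx} values lag the returned parameter vector by one step.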
- def save(path: String, overWrite: Boolean = false): Adamax.this.type
  Save the OptimMethod to the given path; overWrite controls whether an existing file is replaced
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- def updateHyperParameter(): Unit
  Update the hyper parameter. The hyper parameter is already updated in optimize(), but in DistriOptimizer the optimize() method is called only on the executor side, leaving the driver's hyper parameter unchanged. This method updates the hyper parameter on the driver side.
  - Definition Classes: OptimMethod
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @throws( ... )
Deprecated Value Members
- def clearHistory(state: Table): Table
  Clear the history information in the state
  - Definition Classes: OptimMethod
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use clearHistory() instead
- def getHyperParameter(config: Table): String
  Get hyper parameter from config table.
  - config: a table that contains the hyper parameter
  - Definition Classes: OptimMethod
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use getHyperParameter() instead
- def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table = null): (Tensor[T], Array[T])
  Optimize the model parameter
  - feval: a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
  - parameter: the initial point
  - config: a table with configuration parameters for the optimizer
  - state: a table describing the state of the optimizer; after each call the state is modified
  - returns: the new x vector and the function list, evaluated before the update
  - Definition Classes: OptimMethod
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing a table
- def updateHyperParameter(config: Table, state: Table): Unit
  Update the hyper parameter. The hyper parameter is already updated in optimize(), but in DistriOptimizer the optimize() method is called only on the executor side, leaving the driver's hyper parameter unchanged. This method updates the hyper parameter on the driver side.
  - config: the config table
  - state: the state Table
  - Definition Classes: OptimMethod
  - Annotations: @deprecated
  - Deprecated: (Since version 0.2.0) Please use updateHyperParameter() instead