| Package | Description |
|---|---|
| org.nd4j.autodiff.samediff | |
| org.nd4j.weightinit | |
| org.nd4j.weightinit.impl | |
| Modifier and Type | Method and Description |
|---|---|
| SDVariable | SameDiff.var(@NonNull String name, @NonNull LongShapeDescriptor shape, WeightInitScheme weightInitScheme): Creates an SDVariable with the given shape and name. The underlying array will be initialized using the specified weight initialization scheme. This is a VARIABLE type SDVariable, i.e. it must be floating point and is a trainable parameter. |
| SDVariable | SameDiff.var(@NonNull String name, @NonNull VariableType variableType, WeightInitScheme weightInitScheme, DataType dataType, long... shape): Variable initialization with a specified WeightInitScheme. This method creates a VARIABLE type SDVariable, i.e. it must be floating point and is a trainable parameter. |
| SDVariable | SameDiff.var(@NonNull String name, @NonNull WeightInitScheme weightInitScheme, DataType dataType, long... shape): Variable initialization with a specified WeightInitScheme. This method creates a VARIABLE type SDVariable, i.e. it must be floating point and is a trainable parameter. |
| SDVariable | SameDiff.var(@NonNull String name, @NonNull WeightInitScheme weightInitScheme, long... shape): Variable initialization with a specified WeightInitScheme. |
| SDVariable | SameDiff.var(WeightInitScheme weightInitScheme, DataType dataType, long... shape): Creates an SDVariable with the specified shape and a generated name. |
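As a usage sketch of the overloads above (assuming the ND4J/SameDiff dependency is on the classpath; the variable names, shapes, and fan-in/fan-out values here are illustrative, not prescribed by the API):

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.weightinit.impl.XavierInitScheme;
import org.nd4j.weightinit.impl.ZeroInitScheme;

public class WeightInitExample {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();

        // Trainable VARIABLE type SDVariable: a 784x10 float weight matrix,
        // initialized with the Xavier scheme ('c' is the array ordering;
        // fanIn = 784, fanOut = 10 for this illustrative layer).
        SDVariable w = sd.var("w", new XavierInitScheme('c', 784, 10),
                DataType.FLOAT, 784, 10);

        // Bias initialized to zero via ZeroInitScheme.
        SDVariable b = sd.var("b", new ZeroInitScheme('c'),
                DataType.FLOAT, 1, 10);

        System.out.println(sd.summary());
    }
}
```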
| Modifier and Type | Class and Description |
|---|---|
| class | BaseWeightInitScheme: Abstract base class for weight initialization schemes. |
| Modifier and Type | Class and Description |
|---|---|
| class | ConstantInitScheme: Initialize the weight to a constant value. |
| class | DistributionInitScheme: Initialize the weights by sampling from a given Distribution. |
| class | IdentityInitScheme: Initialize the weight to the identity matrix. |
| class | LecunUniformInitScheme: Initialize the weight to: U(-range, range), range = 3.0 / Math.sqrt(fanIn) |
| class | NDArraySupplierInitScheme: Initialize the weights from a supplied INDArray. |
| class | OneInitScheme: Initialize the weight to one. |
| class | ReluInitScheme: Initialize the weight to: randn(shape) // N(0, 2/nIn) |
| class | ReluUniformInitScheme: Initialize the weight to: U(-sqrt(6/fanIn), sqrt(6/fanIn)) |
| class | SigmoidUniformInitScheme: Initialize the weight to: U(-range, range), range = 4.0 * Math.sqrt(6.0 / (fanIn + fanOut)) |
| class | UniformInitScheme: Initialize the weight to: U(-range, range), range = 1.0 / Math.sqrt(fanIn) |
| class | VarScalingNormalFanAvgInitScheme: Initialize the weight to: U / sqrt((fanIn + fanOut) / 2) |
| class | VarScalingNormalFanInInitScheme: Initialize the weight to: U / sqrt(fanIn) |
| class | VarScalingNormalFanOutInitScheme: Initialize the weight to: U / sqrt(fanOut) |
| class | VarScalingNormalUniformFanInInitScheme: Initialize the weight to: U(-range, range), range = 3.0 / Math.sqrt(fanIn) |
| class | VarScalingNormalUniformFanOutInitScheme: Initialize the weight to: U(-range, range), range = 3.0 / Math.sqrt(fanOut) |
| class | VarScalingUniformFanAvgInitScheme: Initialize the weight to: U(-range, range), range = 3.0 / Math.sqrt((fanIn + fanOut) / 2) |
| class | XavierFanInInitScheme: Initialize the weight to: randn(shape) // N(0, 2/nIn) |
| class | XavierInitScheme: Initialize the weight to: U * sqrt(2.0 / (fanIn + fanOut)) |
| class | XavierUniformInitScheme: Initialize the weight to: U(-range, range), range = Math.sqrt(6.0 / (fanIn + fanOut)) |
| class | ZeroInitScheme: Initialize the weight to zero. |
Copyright © 2021. All rights reserved.