Class Activation
- java.lang.Object
-
- ai.djl.nn.Activation
-
public final class Activation extends java.lang.Object

Utility class that provides activation functions and blocks.

Many networks make use of the Linear block and other similar linear transformations. However, any number of composed linear transformations still results in just another linear transformation (\( f(x) = W_2(W_1x) = (W_2W_1)x = W_{combined}x \)). In order to represent non-linear data, non-linear functions called activation functions are interspersed between the linear transformations. This allows the network to represent non-linear functions of increasing complexity.

See Wikipedia for more details.
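For example, a small multi-layer perceptron can be built by interleaving Linear blocks with activation blocks from this class. The snippet below is an illustrative sketch (the layer sizes are arbitrary and the Linear.builder().setUnits(...) builder calls are assumptions, not part of this class):

import ai.djl.nn.Activation;
import ai.djl.nn.Block;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;

// Two Linear layers separated by a ReLU block; without the activation,
// the two linear transformations would collapse into a single linear map.
Block mlp = new SequentialBlock()
        .add(Linear.builder().setUnits(64).build())
        .add(Activation.reluBlock())
        .add(Linear.builder().setUnits(10).build());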
-
-
Method Summary
All methods in this class are static.

- static NDArray elu(NDArray array, float alpha): Applies ELU activation on the input NDArray.
- static NDList elu(NDList arrays, float alpha): Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
- static Block eluBlock(float alpha): Creates a LambdaBlock that applies the ELU activation function in its forward function.
- static NDArray gelu(NDArray array): Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
- static NDList gelu(NDList arrays): Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
- static Block geluBlock(): Creates a LambdaBlock that applies the GELU activation function in its forward function.
- static NDArray leakyRelu(NDArray array, float alpha): Applies Leaky ReLU activation on the input NDArray.
- static NDList leakyRelu(NDList arrays, float alpha): Applies Leaky ReLU activation on the input singleton NDList.
- static Block leakyReluBlock(float alpha): Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
- static NDArray mish(NDArray array): Applies Mish activation on the input NDArray.
- static NDList mish(NDList arrays): Applies Mish activation on the input singleton NDList.
- static Block mishBlock(): Creates a LambdaBlock that applies the Mish activation function in its forward function.
- static Block preluBlock(): Returns a Prelu block.
- static NDArray relu(NDArray array): Applies ReLU activation on the input NDArray.
- static NDList relu(NDList arrays): Applies ReLU activation on the input singleton NDList.
- static NDArray relu6(NDArray array): Applies ReLU6 activation on the input NDArray.
- static NDList relu6(NDList arrays): Applies ReLU6 activation on the input singleton NDList.
- static Block relu6Block(): Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- static Block reluBlock(): Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- static NDArray selu(NDArray array): Applies Scaled ELU activation on the input NDArray.
- static NDList selu(NDList arrays): Applies Scaled ELU activation on the input singleton NDList.
- static Block seluBlock(): Creates a LambdaBlock that applies the SELU activation function in its forward function.
- static NDArray sigmoid(NDArray array): Applies Sigmoid activation on the input NDArray.
- static NDList sigmoid(NDList arrays): Applies Sigmoid activation on the input singleton NDList.
- static Block sigmoidBlock(): Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- static NDArray softPlus(NDArray array): Applies softPlus activation on the input NDArray.
- static NDList softPlus(NDList arrays): Applies softPlus activation on the input singleton NDList.
- static Block softPlusBlock(): Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- static NDArray softSign(NDArray array): Applies softSign activation on the input NDArray.
- static NDList softSign(NDList arrays): Applies softSign activation on the input singleton NDList.
- static Block softSignBlock(): Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- static NDArray swish(NDArray array, float beta): Applies Swish activation on the input NDArray.
- static NDList swish(NDList arrays, float beta): Applies Swish activation on the input singleton NDList.
- static Block swishBlock(float beta): Creates a LambdaBlock that applies the Swish activation function in its forward function.
- static NDArray tanh(NDArray array): Applies Tanh activation on the input NDArray.
- static NDList tanh(NDList arrays): Applies Tanh activation on the input singleton NDList.
- static Block tanhBlock(): Creates a LambdaBlock that applies the Tanh activation function in its forward function.
-
-
-
Method Detail
-
relu
public static NDArray relu(NDArray array)
Applies ReLU activation on the input NDArray. ReLU is defined by: \( y = max(0, x) \)
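For instance (an illustrative sketch; the values in the comment follow from the definition above):

import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

try (NDManager manager = NDManager.newBaseManager()) {
    NDArray x = manager.create(new float[] {-2f, -0.5f, 0f, 1f, 3f});
    NDArray y = Activation.relu(x); // negative entries clamp to 0: [0, 0, 0, 1, 3]
}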
-
relu
public static NDList relu(NDList arrays)
Applies ReLU activation on the input singleton NDList. ReLU is defined by: \( y = max(0, x) \)
-
relu6
public static NDArray relu6(NDArray array)
Applies ReLU6 activation on the input NDArray. ReLU6 is defined by: \( y = min(6, max(0, x)) \)
-
relu6
public static NDList relu6(NDList arrays)
Applies ReLU6 activation on the input singleton NDList. ReLU6 is defined by: \( y = min(6, max(0, x)) \)
-
sigmoid
public static NDArray sigmoid(NDArray array)
Applies Sigmoid activation on the input NDArray. Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
sigmoid
public static NDList sigmoid(NDList arrays)
Applies Sigmoid activation on the input singleton NDList. Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
tanh
public static NDArray tanh(NDArray array)
Applies Tanh activation on the input NDArray. Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
tanh
public static NDList tanh(NDList arrays)
Applies Tanh activation on the input singleton NDList. Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
softPlus
public static NDArray softPlus(NDArray array)
Applies softPlus activation on the input NDArray. softPlus is defined by: \( y = log(1 + e^x) \)
-
softPlus
public static NDList softPlus(NDList arrays)
Applies softPlus activation on the input singleton NDList. softPlus is defined by: \( y = log(1 + e^x) \)
-
softSign
public static NDArray softSign(NDArray array)
Applies softSign activation on the input NDArray. softSign is defined by: \( y = x / (1 + |x|) \)
-
softSign
public static NDList softSign(NDList arrays)
Applies softSign activation on the input singleton NDList. softSign is defined by: \( y = x / (1 + |x|) \)
-
leakyRelu
public static NDArray leakyRelu(NDArray array, float alpha)
Applies Leaky ReLU activation on the input NDArray. Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
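For example, with alpha = 0.1 (an illustrative sketch; assumes an NDManager named manager as in the relu example above):

NDArray x = manager.create(new float[] {-2f, 0f, 2f});
NDArray y = Activation.leakyRelu(x, 0.1f); // negatives scaled by alpha: [-0.2, 0, 2]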
-
leakyRelu
public static NDList leakyRelu(NDList arrays, float alpha)
Applies Leaky ReLU activation on the input singleton NDList. Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
elu
public static NDArray elu(NDArray array, float alpha)
Applies ELU activation on the input NDArray. ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
elu
public static NDList elu(NDList arrays, float alpha)
Applies ELU (Exponential Linear Unit) activation on the input singleton NDList. ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
selu
public static NDArray selu(NDArray array)
Applies Scaled ELU activation on the input NDArray. Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
-
selu
public static NDList selu(NDList arrays)
Applies Scaled ELU activation on the input singleton NDList. Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
-
gelu
public static NDArray gelu(NDArray array)
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
-
gelu
public static NDList gelu(NDList arrays)
Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
-
swish
public static NDArray swish(NDArray array, float beta)
Applies Swish activation on the input NDArray. Swish is defined as \( y = x * sigmoid(beta * x) \)
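Because Swish is defined in terms of the sigmoid, the same value can be composed from the sigmoid helper above, which can serve as a sanity check (an illustrative sketch; assumes an NDManager named manager):

float beta = 1.0f;
NDArray x = manager.create(new float[] {-1f, 0f, 2f});
NDArray viaSwish = Activation.swish(x, beta);
NDArray manual = x.mul(Activation.sigmoid(x.mul(beta))); // x * sigmoid(beta * x)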
-
swish
public static NDList swish(NDList arrays, float beta)
Applies Swish activation on the input singleton NDList. Swish is defined as \( y = x * sigmoid(beta * x) \)
-
mish
public static NDArray mish(NDArray array)
Applies Mish activation on the input NDArray. Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as introduced by Diganta Misra in the paper "Mish: A Self Regularized Non-Monotonic Neural Activation Function".
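Since softPlus(x) = ln(1 + e^x), Mish can also be composed from the helpers above, which can serve as a sanity check (an illustrative sketch; assumes an NDManager named manager):

NDArray x = manager.create(new float[] {-1f, 0f, 2f});
NDArray viaMish = Activation.mish(x);
NDArray manual = x.mul(Activation.softPlus(x).tanh()); // x * tanh(ln(1 + e^x))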
-
mish
public static NDList mish(NDList arrays)
Applies Mish activation on the input singleton NDList. Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as introduced by Diganta Misra in the paper "Mish: A Self Regularized Non-Monotonic Neural Activation Function".
-
reluBlock
public static Block reluBlock()
Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU activation function
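Conceptually, the returned block wraps the NDList overload of relu in a LambdaBlock, roughly as sketched below (illustrative, not the exact implementation):

import ai.djl.nn.Block;
import ai.djl.nn.LambdaBlock;

Block relu = new LambdaBlock(Activation::relu); // forwards each input NDList through relu(NDList)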
-
relu6Block
public static Block relu6Block()
Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU6 activation function
-
sigmoidBlock
public static Block sigmoidBlock()
Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- Returns: the LambdaBlock that applies the Sigmoid activation function
-
tanhBlock
public static Block tanhBlock()
Creates a LambdaBlock that applies the Tanh activation function in its forward function.
- Returns: the LambdaBlock that applies the Tanh activation function
-
softPlusBlock
public static Block softPlusBlock()
Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softPlus(NDList) activation function
-
softSignBlock
public static Block softSignBlock()
Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softSign(NDList) activation function
-
leakyReluBlock
public static Block leakyReluBlock(float alpha)
Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the LeakyReLU activation function
-
eluBlock
public static Block eluBlock(float alpha)
Creates a LambdaBlock that applies the ELU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the ELU activation function
-
seluBlock
public static Block seluBlock()
Creates a LambdaBlock that applies the SELU activation function in its forward function.
- Returns: the LambdaBlock that applies the SELU activation function
-
geluBlock
public static Block geluBlock()
Creates a LambdaBlock that applies the GELU activation function in its forward function.
- Returns: the LambdaBlock that applies the GELU activation function
-
swishBlock
public static Block swishBlock(float beta)
Creates a LambdaBlock that applies the Swish activation function in its forward function.
- Parameters: beta - a hyper-parameter
- Returns: the LambdaBlock that applies the Swish activation function
-
mishBlock
public static Block mishBlock()
Creates a LambdaBlock that applies the Mish activation function in its forward function.
- Returns: the LambdaBlock that applies the Mish activation function
-
-