Package deepboof.forward
Interface ActivationReLU<T extends Tensor>
- All Superinterfaces:
Function<T>
- All Known Implementing Classes:
ActivationReLU_F32, ActivationReLU_F64
Rectified Linear Unit (ReLU) activation function. Used in [1] as an alternative to tanh, with the claim
that it converges considerably faster.
f(x) = 0 if x < 0
f(x) = x if x ≥ 0
[1] Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. "Imagenet classification with deep convolutional neural networks." Advances in neural information processing systems. 2012.
-
Method Summary
Methods inherited from interface deepboof.Function
getOutputShape, getParameters, getParameterShapes, getTensorType, initialize
-
Method Details
-
setParameters
Can be skipped; no parameters are required.
- Specified by:
setParameters in interface Function<T extends Tensor>
- Parameters:
parameters - No parameters required
-
forward
Applies the ReLU operator to each element of the input tensor and stores the results in the output tensor. Tensors of any shape are allowed.
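The element-wise operation performed by forward can be sketched as below. This is a minimal illustration of the f(x) definition above using plain float arrays, not DeepBoof's actual Tensor-based implementation; the class and method names here are hypothetical.

```java
// Hypothetical sketch: element-wise ReLU over a flat float array,
// mirroring what forward() does for each element of a tensor.
public class ReLUSketch {

    // f(x) = 0 if x < 0; f(x) = x if x >= 0
    static void relu(float[] input, float[] output) {
        for (int i = 0; i < input.length; i++) {
            output[i] = input[i] < 0 ? 0 : input[i];
        }
    }

    public static void main(String[] args) {
        float[] in = {-2.5f, -1.0f, 0.0f, 3.0f};
        float[] out = new float[in.length];
        relu(in, out);
        for (float v : out) {
            System.out.println(v); // negatives clamp to 0, non-negatives pass through
        }
    }
}
```

Since the operation is purely element-wise, the tensor's shape never matters; the real implementation only needs to iterate over every stored element, which is why any shape is allowed.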
-