Interface ActivationReLU<T extends Tensor>

All Superinterfaces:
Function<T>
All Known Implementing Classes:
ActivationReLU_F32, ActivationReLU_F64

public interface ActivationReLU<T extends Tensor>
extends Function<T>
Rectified Linear Unit (ReLU) activation function. Used in [1] as an alternative to tanh, with the claim that it converges much faster.
 f(x) = 0  if x < 0
        x  if x ≥ 0
 

[1] Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. "Imagenet classification with deep convolutional neural networks." Advances in neural information processing systems. 2012.
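Equivalently, f(x) = max(0, x). A minimal scalar sketch in plain Java (independent of this library's Tensor API) illustrates the definition:

```java
public final class ReLUDemo {
    // f(x) = max(0, x): zero for negative inputs, identity otherwise
    static float relu(float x) {
        return Math.max(0f, x);
    }

    public static void main(String[] args) {
        System.out.println(relu(-3f));  // 0.0
        System.out.println(relu(2.5f)); // 2.5
    }
}
```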

  • Method Details

    • setParameters

      void setParameters​(java.util.List<T> parameters)
      May be skipped; ReLU has no learnable parameters.
      Specified by:
      setParameters in interface Function<T extends Tensor>
      Parameters:
      parameters - Ignored; ReLU takes no parameters.
    • forward

      void forward​(T input, T output)
      Applies the ReLU operator to each element in the input tensor and saves the results in the output tensor. Any shape is allowed.
      Specified by:
      forward in interface Function<T extends Tensor>
      Parameters:
      input - Input to the function. Any shape.
      output - Output tensor. Same shape as input. Modified.
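
The elementwise behavior of forward can be sketched as follows. The real interface operates on the library's Tensor type (e.g. in ActivationReLU_F32), so the float-array version below is only an illustrative stand-in, not the actual implementation:

```java
public final class ReLUForwardSketch {
    // Applies ReLU to each element of input, writing results into output.
    // Mirrors forward(T input, T output): shapes (here, lengths) must match,
    // and output is modified in place.
    static void forward(float[] input, float[] output) {
        if (input.length != output.length)
            throw new IllegalArgumentException("input and output must have the same shape");
        for (int i = 0; i < input.length; i++) {
            output[i] = input[i] < 0f ? 0f : input[i];
        }
    }

    public static void main(String[] args) {
        float[] in = {-1f, 0f, 2f, -3.5f};
        float[] out = new float[in.length];
        forward(in, out);
        System.out.println(java.util.Arrays.toString(out)); // [0.0, 0.0, 2.0, 0.0]
    }
}
```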