public class Mlp
extends ai.djl.nn.SequentialBlock
A multilayer perceptron (MLP) is a feedforward artificial neural network that generates a set of outputs from a set of inputs. An MLP is characterized by several layers of input nodes connected as a directed graph between the input and output layers. An MLP uses backpropagation to train the network.
MLPs are widely used for solving problems that require supervised learning, as well as in research on computational neuroscience and parallel distributed processing. Applications include speech recognition, image recognition, and machine translation.
| Constructor and Description |
|---|
| `Mlp(int input, int output, int[] hidden)` Create an MLP NeuralNetwork using RELU. |
| `Mlp(int input, int output, int[] hidden, java.util.function.Function<ai.djl.ndarray.NDList,ai.djl.ndarray.NDList> activation)` Create an MLP NeuralNetwork. |
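A minimal usage sketch of the first constructor, assuming the class lives at `ai.djl.basicmodelzoo.basic.Mlp` (its package in recent DJL releases) and that the DJL API artifacts are on the classpath. The layer sizes (784 inputs, 10 outputs, two hidden layers) are illustrative choices for a 28x28 image classifier, not values mandated by the API:

```java
import ai.djl.basicmodelzoo.basic.Mlp;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.DataType;
import ai.djl.ndarray.types.Shape;
import ai.djl.training.ParameterStore;

public class MlpExample {
    public static void main(String[] args) {
        // 784 input features, 10 output classes, hidden layers of 128 and 64 units.
        // This constructor wires RELU between the layers.
        Mlp mlp = new Mlp(784, 10, new int[] {128, 64});

        try (NDManager manager = NDManager.newBaseManager()) {
            // Initialize parameters for a batch of one 784-element vector.
            mlp.initialize(manager, DataType.FLOAT32, new Shape(1, 784));

            // Run a single forward pass; training=false for inference.
            NDList input = new NDList(manager.ones(new Shape(1, 784)));
            NDList output = mlp.forward(new ParameterStore(manager, false), input, false);

            // The output shape should be (1, 10): one score per class.
            System.out.println(output.singletonOrThrow().getShape());
        }
    }
}
```

Because `Mlp` extends `SequentialBlock`, the resulting block can be set directly on a `Model` via `model.setBlock(mlp)` and trained with DJL's `Trainer` like any other block.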
Methods inherited from class ai.djl.nn.SequentialBlock:
add, add, addAll, addAll, forward, getChildren, getDirectParameters, getOutputShapes, getParameterShape, initialize, loadParameters, removeLastBlock, replaceLastBlock, saveParameters, toString

Methods inherited from class ai.djl.nn.AbstractBlock:
beforeInitialize, cast, clear, describeInput, getParameters, isInitialized, readInputShapes, saveInputShapes, setInitializer, setInitializer

public Mlp(int input, int output, int[] hidden)
Create an MLP NeuralNetwork using RELU.
Parameters:
input - the size of the input vector
output - the size of the output vector
hidden - the sizes of all of the hidden layers

public Mlp(int input, int output, int[] hidden, java.util.function.Function<ai.djl.ndarray.NDList,ai.djl.ndarray.NDList> activation)
Create an MLP NeuralNetwork.
Parameters:
input - the size of the input vector
output - the size of the output vector
hidden - the sizes of all of the hidden layers
activation - the activation function to use
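The second constructor swaps the default RELU for any `Function<NDList,NDList>`. A sketch using the `sigmoid` overload on `ai.djl.nn.Activation` as a method reference (the single 256-unit hidden layer is an arbitrary illustrative choice):

```java
import ai.djl.basicmodelzoo.basic.Mlp;
import ai.djl.ndarray.NDList;
import ai.djl.nn.Activation;
import java.util.function.Function;

public class CustomActivationMlp {
    public static void main(String[] args) {
        // Activation.sigmoid(NDList) matches Function<NDList, NDList>,
        // so it can be passed directly as a method reference.
        Function<NDList, NDList> sigmoid = Activation::sigmoid;

        // Same layer sizing as before, but sigmoid between layers instead of RELU.
        Mlp mlp = new Mlp(784, 10, new int[] {256}, sigmoid);
        System.out.println(mlp);
    }
}
```

Any lambda of the right shape works here as well, e.g. `ndList -> Activation.leakyRelu(ndList, 0.1f)`, which makes it easy to experiment with activations without subclassing.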