Package deepboof.backward
Interface DFunctionDropOut<T extends deepboof.Tensor<T>>
- All Superinterfaces:
deepboof.DFunction<T>, deepboof.Function<T>
- All Known Implementing Classes:
DFunctionDropOut_F64
public interface DFunctionDropOut<T extends deepboof.Tensor<T>>
extends deepboof.DFunction<T>
Drop out is a technique introduced by [1] for regularizing a network; it helps prevent overfitting. It works
by randomly selecting neurons and forcing them to be off. The chance of a neuron being turned off is specified
by the drop rate. Its behavior differs between learning and evaluation modes. In learning mode each neuron is
dropped with probability drop_rate, where drop_rate ranges from 0 to 1.0, inclusive.
In evaluation mode it scales each input by 1.0 - drop_rate.
[1] Srivastava et al. "Dropout: A Simple Way to Prevent Neural Networks from Overfitting"
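The two modes described above can be sketched in plain Java. This is an illustrative stand-alone example of the drop-out semantics, not the DeepBoof implementation; the class and method names here are invented for the sketch.

```java
import java.util.Random;

// Illustrative sketch (hypothetical class, not part of DeepBoof):
// learning mode zeroes each element with probability dropRate;
// evaluation mode scales every element by (1.0 - dropRate).
public class DropOutSketch {
    public static double[] forward(double[] input, double dropRate,
                                   boolean learning, Random rand) {
        double[] output = new double[input.length];
        for (int i = 0; i < input.length; i++) {
            if (learning) {
                // drop this neuron with probability dropRate
                output[i] = rand.nextDouble() < dropRate ? 0.0 : input[i];
            } else {
                // evaluation: deterministic scaling so the expected
                // activation matches what was seen during learning
                output[i] = input[i] * (1.0 - dropRate);
            }
        }
        return output;
    }
}
```

For example, with a drop rate of 0.5, evaluation mode halves every input, while learning mode zeroes roughly half of the neurons at random.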
-
Method Summary
Modifier and Type | Method | Description
double | getDropRate() | Returns a number from 0 to 1 indicating the likelihood of a neuron being dropped.
Methods inherited from interface deepboof.DFunction
backwards, evaluating, isLearning, learning
Methods inherited from interface deepboof.Function
forward, getOutputShape, getParameters, getParameterShapes, getTensorType, initialize, setParameters
-
Method Details
-
getDropRate
double getDropRate()
Returns a number from 0 to 1 indicating the likelihood of a neuron being dropped. 0 = 0% chance and 1 = 100% chance.
- Returns:
- drop rate
-