public interface DFunctionBatchNorm<T extends deepboof.Tensor<T>> extends DBatchNorm<T>
Implementation of Batch Normalization for training networks. Its behavior is distinctly different from forward-only implementations: in this learning implementation, the statistics of the input are recomputed every time forward(T, T) is invoked, while in a forward-only implementation those statistics are already known and are not recomputed.
This change in behavior also changes how parameters are specified. Mean and variance are no longer input parameters but are computed dynamically in the forward pass.
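To make the training-mode behavior concrete, here is a minimal plain-Java sketch (not the DeepBoof API; the class and method names are hypothetical) showing per-variable statistics being recomputed from the current mini-batch on every forward call:

```java
// Sketch of training-mode batch normalization: for each variable i, the mean
// and variance are recomputed from the mini-batch on every forward() call.
public class TrainingBatchNormSketch {
    /** Normalizes in[n][i] across the mini-batch dimension n (size N). */
    public static double[][] forward(double[][] in, double eps) {
        int N = in.length, D = in[0].length;
        double[][] out = new double[N][D];
        for (int i = 0; i < D; i++) {
            // statistics are recomputed from the batch each call, not stored
            double mean = 0;
            for (int n = 0; n < N; n++) mean += in[n][i];
            mean /= N;
            double var = 0;
            for (int n = 0; n < N; n++) {
                double d = in[n][i] - mean;
                var += d * d;
            }
            var /= N;
            double stdev = Math.sqrt(var + eps); // eps guards against divide-by-zero
            for (int n = 0; n < N; n++) out[n][i] = (in[n][i] - mean) / stdev;
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] batch = {{1, 10}, {3, 30}};
        double[][] out = forward(batch, 1e-8);
        // each column of out now has zero mean and (near) unit variance
        System.out.printf("%.3f %.3f%n", out[0][0], out[1][0]);
    }
}
```

A forward-only implementation would instead read a stored mean and variance supplied as parameters and skip the two accumulation loops.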
| Modifier and Type | Method and Description |
|---|---|
| void | forward(T input, T output) Applies batch normalization to each variable in the input. |
| void | setParameters(java.util.List&lt;T&gt; parameters) See forward(T, T) for a description of parameters. |
Methods inherited from interface DBatchNorm: getMean, getVariance

void forward(T input, T output)
Applies batch normalization to each variable in the input.
A parameter tensor is present only if BatchNorm.hasGammaBeta() returns true. If true, gamma and beta are encoded in a single tensor in an interleaved fashion (gamma, beta).
Summary Table:

| Tensor | Shape |
|---|---|
| Input | (N, d[i], ... , d[k]) |
| Output | (N, d[i], ... , d[k]) |
| Params | (d[i], ... , d[k], 2) |

N = size of the mini-batch; d[i] = length of a dimension.
NOTE: Interleaving is used in the parameters instead of multiple tensors to improve memory locality, which reduces cache misses.
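The interleaved layout can be sketched in plain Java (not the DeepBoof API; names are illustrative), with the parameter tensor flattened to a double[] so that gamma[i] and beta[i] sit at adjacent indices:

```java
// Sketch of the interleaved (gamma, beta) parameter layout:
// params = [gamma[0], beta[0], gamma[1], beta[1], ...]
public class InterleavedParamsSketch {
    /** Applies y[i] = gamma[i]*xhat[i] + beta[i] using the interleaved layout. */
    public static double[] scaleShift(double[] xhat, double[] params) {
        double[] y = new double[xhat.length];
        for (int i = 0; i < xhat.length; i++) {
            double gamma = params[2 * i];     // even index: gamma
            double beta  = params[2 * i + 1]; // odd index: beta
            // gamma and beta for variable i are adjacent in memory,
            // so both are usually on the same cache line
            y[i] = gamma * xhat[i] + beta;
        }
        return y;
    }

    public static void main(String[] args) {
        double[] xhat = {0.5, -0.5};
        double[] params = {2.0, 1.0, 3.0, -1.0}; // (gamma0, beta0, gamma1, beta1)
        double[] y = scaleShift(xhat, params);
        System.out.println(y[0] + " " + y[1]); // 2.0 -2.5
    }
}
```

Storing gamma and beta in two separate tensors would touch two distant memory regions per variable; the interleaving trades that for a single sequential scan.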
void setParameters(java.util.List<T> parameters)
See forward(T, T) for a description of parameters.
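A minimal sketch of how such a setter might consume the parameter list (plain Java, not the DeepBoof API; the field and accessors are hypothetical), assuming the list holds the single interleaved tensor when gamma/beta are learned and is empty otherwise:

```java
import java.util.List;

// Sketch: setParameters() stores the one interleaved (gamma, beta) tensor,
// here modeled as a double[]; an empty list means hasGammaBeta() is false.
public class SetParametersSketch {
    private double[] gammaBeta; // interleaved storage, null if unused

    public void setParameters(List<double[]> parameters) {
        gammaBeta = parameters.isEmpty() ? null : parameters.get(0);
    }

    public double gamma(int i) { return gammaBeta[2 * i]; }
    public double beta(int i)  { return gammaBeta[2 * i + 1]; }

    public static void main(String[] args) {
        SetParametersSketch bn = new SetParametersSketch();
        bn.setParameters(List.of(new double[]{1.5, 0.25, 0.9, -0.1}));
        System.out.println(bn.gamma(1) + " " + bn.beta(0)); // 0.9 0.25
    }
}
```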