public class SoftmaxCrossEntropyLoss extends Loss
SoftmaxCrossEntropyLoss is a type of Loss that calculates the softmax cross
entropy loss.
If `sparse_label` is true (the default), `label` should contain integer category indices; then \(L = -\sum_i \log p_{i,\,{label}_i}\). If `sparse_label` is false, `label` should contain a probability distribution, and its shape should match the shape of `prediction`; then \(L = -\sum_i \sum_j {label}_{ij} \log p_{ij}\).
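The sparse-label formula above can be sketched in plain Java. This is a minimal illustration of the math for a single sample, not the DJL implementation; the class and method names here are hypothetical:

```java
// Minimal sketch of the sparse-label case: for one sample,
// L = -log p_{label}, where p is the softmax of the logits.
// Hypothetical helper class, not part of DJL.
public class SparseSoftmaxCE {

    // Numerically stable softmax: subtract the max logit before exponentiating.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double x : logits) max = Math.max(max, x);
        double sum = 0.0;
        double[] p = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            p[i] = Math.exp(logits[i] - max);
            sum += p[i];
        }
        for (int i = 0; i < p.length; i++) p[i] /= sum;
        return p;
    }

    // Loss for a single sample with an integer class label.
    static double loss(double[] logits, int label) {
        return -Math.log(softmax(logits)[label]);
    }

    public static void main(String[] args) {
        // Uniform logits over two classes give p = 0.5, so L = log 2.
        System.out.println(loss(new double[] {0.0, 0.0}, 0));
    }
}
```

The batch loss in the formula is just this per-sample quantity summed over the batch axis.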
| Constructor and Description |
|---|
| `SoftmaxCrossEntropyLoss()` Creates a new instance of SoftmaxCrossEntropyLoss with default parameters. |
| `SoftmaxCrossEntropyLoss(float weight, int batchAxis, int classAxis, boolean sparseLabel, boolean fromLogit)` Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters. |
| Modifier and Type | Method and Description |
|---|---|
| `NDArray` | `getLoss(NDList label, NDList prediction)` Calculates the loss between the label and the prediction. |
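For the dense-label case (`sparseLabel` set to false), each label row is itself a probability distribution over classes. A minimal plain-Java sketch of that variant for a single sample (hypothetical names, not the DJL implementation):

```java
// Minimal sketch of the dense-label case: for one sample,
// L = -sum_j label_j * log p_j, where p is the softmax of the logits.
// Hypothetical helper class, not part of DJL.
public class DenseSoftmaxCE {

    // Numerically stable softmax, same as in the sparse case.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double x : logits) max = Math.max(max, x);
        double sum = 0.0;
        double[] p = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            p[i] = Math.exp(logits[i] - max);
            sum += p[i];
        }
        for (int i = 0; i < p.length; i++) p[i] /= sum;
        return p;
    }

    // Loss for a single sample whose label is a probability distribution.
    static double loss(double[] logits, double[] label) {
        double[] p = softmax(logits);
        double l = 0.0;
        for (int j = 0; j < p.length; j++) {
            l -= label[j] * Math.log(p[j]);
        }
        return l;
    }

    public static void main(String[] args) {
        // A one-hot label reduces to the sparse case: here L = log 2.
        System.out.println(loss(new double[] {0.0, 0.0}, new double[] {1.0, 0.0}));
    }
}
```

A one-hot label makes the inner sum collapse to a single term, recovering the sparse-label formula.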
Methods inherited from class Loss:
`duplicate, excludeBatchAxis, getValue, hingeLoss, hingeLoss, l1Loss, l1Loss, l2Loss, l2Loss, reset, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, update`

Other inherited methods:
`checkLabelShapes, checkLabelShapes, getName`

Constructor Detail:

`public SoftmaxCrossEntropyLoss(float weight, int batchAxis, int classAxis, boolean sparseLabel, boolean fromLogit)`

Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.

- `weight` - the weight to apply on the loss value, default 1
- `batchAxis` - the axis that represents the mini-batch, default 0
- `classAxis` - the axis that represents the class probabilities, default -1
- `sparseLabel` - whether labels are an integer array or probabilities, default true
- `fromLogit` - whether labels are log probabilities or un-normalized numbers

`public SoftmaxCrossEntropyLoss()`

Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.