Package ai.djl.training.loss
Class SoftmaxCrossEntropyLoss
- java.lang.Object
  - ai.djl.training.evaluator.Evaluator
    - ai.djl.training.loss.Loss
      - ai.djl.training.loss.SoftmaxCrossEntropyLoss
public class SoftmaxCrossEntropyLoss extends Loss
SoftmaxCrossEntropyLoss is a type of Loss that calculates the softmax cross entropy loss.

If sparse_label is true (default), label should contain integer category indicators. Then, \(L = -\sum_i \log p_{i, label_i}\). If sparse_label is false, label should be a one-hot encoding or a probability distribution, and its shape should be the same as the shape of prediction. Then, \(L = -\sum_i \sum_j {label}_j \log p_{ij}\).
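The two formulas above can be sketched in plain Java. This is a simplified, self-contained illustration of the math only, not the DJL implementation: the class and method names are hypothetical, and prob is assumed to already hold softmax probabilities.

```java
public class SoftmaxCrossEntropyDemo {

    // Sparse labels: L = -sum_i log p_{i, label_i}
    // labels[i] is the integer class index for example i.
    static double sparseLoss(double[][] prob, int[] labels) {
        double loss = 0;
        for (int i = 0; i < prob.length; i++) {
            loss -= Math.log(prob[i][labels[i]]);
        }
        return loss;
    }

    // Dense labels: L = -sum_i sum_j label_ij log p_ij
    // labels[i] is a one-hot vector or probability distribution over classes.
    static double denseLoss(double[][] prob, double[][] labels) {
        double loss = 0;
        for (int i = 0; i < prob.length; i++) {
            for (int j = 0; j < prob[i].length; j++) {
                loss -= labels[i][j] * Math.log(prob[i][j]);
            }
        }
        return loss;
    }

    public static void main(String[] args) {
        double[][] prob = {{0.7, 0.2, 0.1}, {0.1, 0.8, 0.1}};
        int[] sparse = {0, 1};
        double[][] oneHot = {{1, 0, 0}, {0, 1, 0}};
        // With one-hot dense labels, both forms give the same value,
        // -ln(0.7) - ln(0.8), approximately 0.5798.
        System.out.println(sparseLoss(prob, sparse));
        System.out.println(denseLoss(prob, oneHot));
    }
}
```

When the dense labels happen to be one-hot, the inner sum picks out exactly the log-probability of the labeled class, which is why the two formulas agree in that case.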
Field Summary
Fields inherited from class ai.djl.training.evaluator.Evaluator
totalInstances
Constructor Summary
Constructors:
- SoftmaxCrossEntropyLoss(): Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
- SoftmaxCrossEntropyLoss(java.lang.String name): Creates a new instance of SoftmaxCrossEntropyLoss with the given name and default parameters.
- SoftmaxCrossEntropyLoss(java.lang.String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit): Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.
Method Summary
- NDArray evaluate(NDList label, NDList prediction): Calculates the evaluation between the labels and the predictions.
Methods inherited from class ai.djl.training.loss.Loss
addAccumulator, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, getAccumulator, hingeLoss, hingeLoss, hingeLoss, l1Loss, l1Loss, l1Loss, l1WeightedDecay, l1WeightedDecay, l1WeightedDecay, l2Loss, l2Loss, l2Loss, l2WeightedDecay, l2WeightedDecay, l2WeightedDecay, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, quantileL1Loss, quantileL1Loss, resetAccumulator, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, updateAccumulator
Methods inherited from class ai.djl.training.evaluator.Evaluator
checkLabelShapes, checkLabelShapes, getName
Constructor Detail
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss()
Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss(java.lang.String name)
Creates a new instance of SoftmaxCrossEntropyLoss with the given name and default parameters.
Parameters:
name - the name of the loss
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss(java.lang.String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit)
Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.
Parameters:
name - the name of the loss
weight - the weight to apply on the loss value, default 1
classAxis - the axis that represents the class probabilities, default -1
sparseLabel - whether labels are a rank-1 integer array of shape [batch_size] (true) or a rank-2 one-hot encoding or probability distribution of shape [batch_size, n-class] (false), default true
fromLogit - if true, the inputs are assumed to be pre-softmax scores (logits), and logSoftmax will be applied to the input, default true
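What fromLogit and weight imply can be sketched in plain Java for a single example with sparse labels. The class and method names below are hypothetical illustrations of the math, not DJL API; logits are assumed to be raw pre-softmax scores.

```java
public class FromLogitDemo {

    // When fromLogit is true, logSoftmax is applied along the class axis.
    // Computed in a numerically stable way by subtracting the max logit.
    static double[] logSoftmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) {
            max = Math.max(max, v);
        }
        double sum = 0;
        for (double v : logits) {
            sum += Math.exp(v - max);
        }
        double logSum = max + Math.log(sum);
        double[] out = new double[logits.length];
        for (int j = 0; j < logits.length; j++) {
            out[j] = logits[j] - logSum;
        }
        return out;
    }

    // Weighted sparse loss for one example: -weight * logSoftmax(logits)[label]
    static double loss(double[] logits, int label, float weight) {
        return -weight * logSoftmax(logits)[label];
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        System.out.println(loss(logits, 0, 1f));
    }
}
```

Exponentiating the logSoftmax output recovers a valid probability distribution, and weight simply scales the resulting loss value linearly.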