Package ai.djl.training.loss
Class MaskedSoftmaxCrossEntropyLoss
- java.lang.Object
  - ai.djl.training.evaluator.Evaluator
    - ai.djl.training.loss.Loss
      - ai.djl.training.loss.MaskedSoftmaxCrossEntropyLoss
Field Summary
-
Fields inherited from class ai.djl.training.evaluator.Evaluator
totalInstances
-
Constructor Summary
Constructors:
- MaskedSoftmaxCrossEntropyLoss(): Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
- MaskedSoftmaxCrossEntropyLoss(java.lang.String name): Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
- MaskedSoftmaxCrossEntropyLoss(java.lang.String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit): Creates a new instance of MaskedSoftmaxCrossEntropyLoss with the given parameters.
-
Method Summary
All Methods / Instance Methods / Concrete Methods:
- NDArray evaluate(NDList labels, NDList predictions): Calculates the evaluation between the labels and the predictions.
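A minimal sketch of calling evaluate. Note the exact packing of the label array and the mask (valid lengths per sequence) into the labels NDList, and the shapes used here, are illustrative assumptions rather than part of this page:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;
import ai.djl.training.loss.MaskedSoftmaxCrossEntropyLoss;

public class EvaluateSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            // predictions: [batch_size, sequence_length, n_classes]
            NDArray predictions = manager.randomUniform(0f, 1f, new Shape(2, 4, 5));
            // sparse integer labels: [batch_size, sequence_length]
            NDArray labels = manager.zeros(new Shape(2, 4));
            // assumed mask input: valid length per sequence, so padded
            // positions beyond these lengths do not contribute to the loss
            NDArray validLengths = manager.create(new float[] {3f, 2f});

            MaskedSoftmaxCrossEntropyLoss loss = new MaskedSoftmaxCrossEntropyLoss();
            NDArray result = loss.evaluate(
                    new NDList(labels, validLengths), new NDList(predictions));
            System.out.println(result.getShape());
        }
    }
}
```

The try-with-resources block ensures all NDArrays created by the manager are released when evaluation finishes.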
Methods inherited from class ai.djl.training.loss.Loss
addAccumulator, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, getAccumulator, hingeLoss, hingeLoss, hingeLoss, l1Loss, l1Loss, l1Loss, l1WeightedDecay, l1WeightedDecay, l1WeightedDecay, l2Loss, l2Loss, l2Loss, l2WeightedDecay, l2WeightedDecay, l2WeightedDecay, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, quantileL1Loss, quantileL1Loss, resetAccumulator, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, updateAccumulator
-
Methods inherited from class ai.djl.training.evaluator.Evaluator
checkLabelShapes, checkLabelShapes, getName
Constructor Detail
-
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss()
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
-
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss(java.lang.String name)
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
Parameters:
name - the name of the loss
-
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss(java.lang.String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit)
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with the given parameters.
Parameters:
name - the name of the loss
weight - the weight to apply on the loss value, default 1
classAxis - the axis that represents the class probabilities, default -1
sparseLabel - whether labels are a 1-D integer array of [batch_size] (true) or 2-D probabilities of [batch_size, n-class] (false), default true
fromLogit - if true, the inputs are assumed to be the values before softmax is applied; logSoftmax will then be applied to the input, default false
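The parameters above can also be supplied explicitly instead of relying on the defaults. A hedged sketch (the name string and comment interpretations are illustrative):

```java
import ai.djl.training.loss.Loss;
import ai.djl.training.loss.MaskedSoftmaxCrossEntropyLoss;

public class ConstructorSketch {
    public static void main(String[] args) {
        // Explicit configuration mirroring the documented defaults.
        Loss loss = new MaskedSoftmaxCrossEntropyLoss(
                "MaskedSoftmaxCrossEntropyLoss", // name (illustrative)
                1.0f,   // weight applied to the loss value
                -1,     // classAxis: last axis holds the class probabilities
                true,   // sparseLabel: labels are 1-D integer class indices
                false); // fromLogit: inputs are not raw pre-softmax values
        System.out.println(loss.getName());
    }
}
```

Passing the resulting Loss to a DJL TrainingConfig works the same way as with any other Loss subclass.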