public abstract class Loss extends TrainingMetric
| Constructor and Description |
|---|
| Loss(java.lang.String name) — Base class for a metric with abstract update methods. |
| Modifier and Type | Method and Description |
|---|---|
| Loss | duplicate() — Creates and returns a copy of this object. |
| protected int[] | excludeBatchAxis(NDArray loss, int batchAxis) — Gets all axes except the batch axis, because loss functions require reduction on every axis except the batch axis. |
| abstract NDArray | getLoss(NDList label, NDList prediction) — Calculates the loss between the label and the prediction. |
| float | getValue() — Calculates the metric value. |
| static HingeLoss | hingeLoss() — Returns a new instance of HingeLoss with default arguments. |
| static HingeLoss | hingeLoss(int margin, float weight, int batchAxis) — Returns a new instance of HingeLoss with the given arguments. |
| static L1Loss | l1Loss() — Returns a new instance of L1Loss with the default weight and batch axis. |
| static L1Loss | l1Loss(float weight, int batchAxis) — Returns a new instance of L1Loss with the given weight and batch axis. |
| static L2Loss | l2Loss() — Returns a new instance of L2Loss with the default weight and batch axis. |
| static L2Loss | l2Loss(float weight, int batchAxis) — Returns a new instance of L2Loss with the given weight and batch axis. |
| void | reset() — Resets the metric values. |
| static SigmoidBinaryCrossEntropyLoss | sigmoidBinaryCrossEntropyLoss() — Returns a new instance of SigmoidBinaryCrossEntropyLoss with default arguments. |
| static SigmoidBinaryCrossEntropyLoss | sigmoidBinaryCrossEntropyLoss(float weight, int batchAxis, boolean fromSigmoid) — Returns a new instance of SigmoidBinaryCrossEntropyLoss with the given arguments. |
| static SoftmaxCrossEntropyLoss | softmaxCrossEntropyLoss() — Returns a new instance of SoftmaxCrossEntropyLoss with default arguments. |
| static SoftmaxCrossEntropyLoss | softmaxCrossEntropyLoss(float weight, int batchAxis, int classAxis, boolean sparseLabel, boolean fromLogit) — Returns a new instance of SoftmaxCrossEntropyLoss with the given arguments. |
| void | update(NDList labels, NDList predictions) — Updates the training metric based on an NDList of labels and predictions. |
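The update()/getValue()/reset() trio follows the usual training-metric accumulation contract: update folds in each batch, getValue reports the aggregate, and reset clears it. A minimal plain-Java sketch of that pattern (the RunningLoss class and its mean-over-instances behavior are illustrative assumptions, not DJL's internals):

```java
// Illustrative sketch of the update()/getValue()/reset() accumulation
// pattern; this is NOT DJL's implementation.
public class RunningLoss {
    private float totalLoss;
    private int totalInstances;

    // Accumulate one batch's summed loss and its instance count.
    public void update(float batchLossSum, int batchSize) {
        totalLoss += batchLossSum;
        totalInstances += batchSize;
    }

    // Mean loss over all instances seen since the last reset.
    public float getValue() {
        return totalInstances == 0 ? Float.NaN : totalLoss / totalInstances;
    }

    public void reset() {
        totalLoss = 0f;
        totalInstances = 0;
    }
}
```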
Methods inherited from class TrainingMetric: checkLabelShapes, checkLabelShapes, getName

public Loss(java.lang.String name)
Parameters:
name - the display name of the Loss

public abstract NDArray getLoss(NDList label, NDList prediction)
Calculates the loss between the label and the prediction.
Parameters:
label - the true label
prediction - the predicted label
public static L1Loss l1Loss()
Returns a new instance of L1Loss with the default weight and batch axis.
Returns: L1Loss

public static L1Loss l1Loss(float weight, int batchAxis)
Returns a new instance of L1Loss with the given weight and batch axis.
Parameters:
weight - the weight to apply on the loss value, default 1
batchAxis - the axis that represents the mini-batch, default 0
Returns: L1Loss
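Conceptually, L1 loss is a weighted mean absolute error between label and prediction. A plain-Java sketch of that formula (the L1LossSketch class and its exact reduction are illustrative assumptions, not DJL's implementation):

```java
// Sketch of the L1 (mean-absolute-error) loss formula:
//   weight * mean(|label - prediction|)
public class L1LossSketch {
    static float l1Loss(float[] labels, float[] predictions, float weight) {
        float sum = 0f;
        for (int i = 0; i < labels.length; i++) {
            sum += Math.abs(labels[i] - predictions[i]);
        }
        return weight * sum / labels.length;
    }
}
```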
public static L2Loss l2Loss()
Returns a new instance of L2Loss with the default weight and batch axis.
Returns: L2Loss

public static L2Loss l2Loss(float weight, int batchAxis)
Returns a new instance of L2Loss with the given weight and batch axis.
Parameters:
weight - the weight to apply on the loss value, default 1
batchAxis - the axis that represents the mini-batch, default 0
Returns: L2Loss
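L2 loss is the squared-error counterpart. A plain-Java sketch of one common form of the formula (the class name and the choice of mean reduction are illustrative assumptions; DJL may apply its own scaling constants):

```java
// Sketch of the L2 (squared-error) loss formula:
//   weight * mean((label - prediction)^2)
public class L2LossSketch {
    static float l2Loss(float[] labels, float[] predictions, float weight) {
        float sum = 0f;
        for (int i = 0; i < labels.length; i++) {
            float diff = labels[i] - predictions[i];
            sum += diff * diff;
        }
        return weight * sum / labels.length;
    }
}
```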
public static SigmoidBinaryCrossEntropyLoss sigmoidBinaryCrossEntropyLoss()
Returns a new instance of SigmoidBinaryCrossEntropyLoss with default arguments.
Returns: SigmoidBinaryCrossEntropyLoss

public static SigmoidBinaryCrossEntropyLoss sigmoidBinaryCrossEntropyLoss(float weight, int batchAxis, boolean fromSigmoid)
Returns a new instance of SigmoidBinaryCrossEntropyLoss with the given arguments.
Parameters:
weight - the weight to apply on the loss value, default 1
batchAxis - the axis that represents the mini-batch, default 0
fromSigmoid - whether the input is the output of a sigmoid, default false
Returns: SigmoidBinaryCrossEntropyLoss
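The fromSigmoid flag changes what the input means: a probability that already passed through a sigmoid, or a raw logit. A plain-Java sketch of binary cross-entropy for a single element, using the standard numerically stable logit form in the second branch (the method name is illustrative; this is not DJL's implementation):

```java
// Sketch of binary cross-entropy for one label/prediction pair.
public class SigmoidBCESketch {
    // If fromSigmoid is true, `pred` is a probability in (0, 1);
    // otherwise it is a raw logit, handled with the numerically
    // stable form: max(x, 0) - x*y + log(1 + exp(-|x|)).
    static float bce(float label, float pred, boolean fromSigmoid) {
        if (fromSigmoid) {
            return (float) -(label * Math.log(pred)
                    + (1 - label) * Math.log(1 - pred));
        }
        return (float) (Math.max(pred, 0) - pred * label
                + Math.log(1 + Math.exp(-Math.abs(pred))));
    }
}
```

Both branches agree where they overlap: a probability of 0.5 and a logit of 0 describe the same prediction and yield the same loss, ln 2.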
public static SoftmaxCrossEntropyLoss softmaxCrossEntropyLoss()
Returns a new instance of SoftmaxCrossEntropyLoss with default arguments.
Returns: SoftmaxCrossEntropyLoss

public static SoftmaxCrossEntropyLoss softmaxCrossEntropyLoss(float weight, int batchAxis, int classAxis, boolean sparseLabel, boolean fromLogit)
Returns a new instance of SoftmaxCrossEntropyLoss with the given arguments.
Parameters:
weight - the weight to apply on the loss value, default 1
batchAxis - the axis that represents the mini-batch, default 0
classAxis - the axis that represents the class probabilities, default -1
sparseLabel - whether the labels are an integer array or probabilities, default true
fromLogit - whether the labels are log probabilities or un-normalized numbers
Returns: SoftmaxCrossEntropyLoss
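For the common sparse-label case, softmax cross-entropy reduces to the negative log of the softmax probability of the true class. A plain-Java sketch of that formula for one sample, using the log-sum-exp trick for numerical stability (class and method names are illustrative, not DJL's implementation):

```java
// Sketch of softmax cross-entropy with a sparse (integer) label:
//   -log(softmax(logits)[classIndex]) = logSumExp(logits) - logits[classIndex]
public class SoftmaxCESketch {
    static float softmaxCrossEntropy(float[] logits, int classIndex) {
        // Subtract the max logit before exponentiating so large
        // logits do not overflow (log-sum-exp trick).
        float max = Float.NEGATIVE_INFINITY;
        for (float l : logits) max = Math.max(max, l);
        double sumExp = 0;
        for (float l : logits) sumExp += Math.exp(l - max);
        double logSumExp = max + Math.log(sumExp);
        return (float) (logSumExp - logits[classIndex]);
    }
}
```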
public static HingeLoss hingeLoss()
Returns a new instance of HingeLoss with default arguments.
Returns: HingeLoss

public static HingeLoss hingeLoss(int margin, float weight, int batchAxis)
Returns a new instance of HingeLoss with the given arguments.
Parameters:
margin - the margin in hinge loss, default 1.0
weight - the weight to apply on the loss value, default 1
batchAxis - the axis that represents the mini-batch, default 0
Returns: HingeLoss
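Hinge loss penalizes predictions that fall inside the margin of the decision boundary; with labels in {-1, +1}, the standard formula is max(0, margin - label * prediction). A plain-Java sketch (illustrative, not DJL's implementation):

```java
// Sketch of the hinge loss formula for one label/prediction pair,
// with labels encoded as -1 or +1:
//   max(0, margin - label * prediction)
public class HingeLossSketch {
    static float hingeLoss(float label, float pred, float margin) {
        return Math.max(0f, margin - label * pred);
    }
}
```

A confidently correct prediction (label * prediction >= margin) contributes zero loss; a wrong or under-confident one grows linearly.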
public Loss duplicate()
Creates and returns a copy of this object.
Overrides: duplicate in class TrainingMetric

public void update(NDList labels, NDList predictions)
Updates the training metric based on an NDList of labels and predictions.
This is a synchronized operation. You should only call it at the end of a batch or epoch.
Overrides: update in class TrainingMetric
Parameters:
labels - an NDList of labels
predictions - an NDList of predictions
public void reset()
Resets the metric values.
Overrides: reset in class TrainingMetric

public float getValue()
Calculates the metric value.
Overrides: getValue in class TrainingMetric
Returns: the pair of metric name and value

protected int[] excludeBatchAxis(NDArray loss, int batchAxis)
Gets all axes except the batch axis, because loss functions require reduction on every axis except the batch axis.
Parameters:
loss - the loss NDArray
batchAxis - the axis that represents the mini-batch
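Excluding the batch axis means the loss tensor is reduced to one value per sample rather than one value overall. A plain-Java sketch of that reduction for a 2-D loss array with the batch on axis 0 (the mean reduction and names are illustrative assumptions, not DJL's implementation):

```java
// Sketch of reducing over every axis except the batch axis (axis 0
// here): each row is one sample, and we collapse its remaining
// elements to a single per-sample loss value.
public class BatchReduceSketch {
    static float[] meanExceptBatch(float[][] loss) {
        float[] out = new float[loss.length];
        for (int i = 0; i < loss.length; i++) {
            float sum = 0f;
            for (float v : loss[i]) sum += v;
            out[i] = sum / loss[i].length;
        }
        return out;
    }
}
```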