public class SigmoidBinaryCrossEntropyLoss extends Loss
SigmoidBinaryCrossEntropyLoss is a type of Loss.
Sigmoid binary cross-entropy loss is defined by: \(L = -\sum_i \left[label_i \cdot \log(prob_i) \cdot posWeight + (1 - label_i) \cdot \log(1 - prob_i)\right]\) where \(prob_i = \frac{1}{1 + e^{-pred_i}}\)
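The formula can be checked with a small stand-alone sketch in plain Java (not the DJL API; the class and method names here are illustrative only):

```java
// Stand-alone numeric check of the sigmoid binary cross-entropy formula.
public class SigmoidBceDemo {

    // prob = 1 / (1 + e^(-pred))
    public static double sigmoid(double pred) {
        return 1.0 / (1.0 + Math.exp(-pred));
    }

    // L = -sum_i [label_i * log(prob_i) * posWeight + (1 - label_i) * log(1 - prob_i)]
    public static double loss(double[] pred, double[] label, double posWeight) {
        double sum = 0.0;
        for (int i = 0; i < pred.length; i++) {
            double prob = sigmoid(pred[i]);
            sum += label[i] * Math.log(prob) * posWeight
                    + (1.0 - label[i]) * Math.log(1.0 - prob);
        }
        return -sum;
    }

    public static void main(String[] args) {
        double[] pred = {2.0, -1.0};  // raw logits
        double[] label = {1.0, 0.0};  // binary targets
        System.out.printf("%.6f%n", loss(pred, label, 1.0)); // prints 0.440190
    }
}
```

With `posWeight` above 1, positive (label = 1) terms are penalized more heavily, which is the usual remedy for class imbalance.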
| Constructor and Description |
|---|
| `SigmoidBinaryCrossEntropyLoss()` Performs sigmoid cross-entropy loss for binary classification. |
| `SigmoidBinaryCrossEntropyLoss(float weight, int batchAxis, boolean fromSigmoid)` Performs sigmoid cross-entropy loss for binary classification. |
| Modifier and Type | Method and Description |
|---|---|
| `NDArray` | `getLoss(NDList label, NDList prediction)` Calculates loss between the label and prediction. |
Methods inherited from class `Loss`: `duplicate`, `excludeBatchAxis`, `getValue`, `hingeLoss`, `l1Loss`, `l2Loss`, `reset`, `sigmoidBinaryCrossEntropyLoss`, `softmaxCrossEntropyLoss`, `update`, `checkLabelShapes`, `getName`

`public SigmoidBinaryCrossEntropyLoss(float weight, int batchAxis, boolean fromSigmoid)`

Parameters:
- `weight` - the weight to apply on the loss value, default 1
- `batchAxis` - the axis that represents the mini-batch, default 0
- `fromSigmoid` - whether the input is from the output of sigmoid, default false

`public SigmoidBinaryCrossEntropyLoss()`
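To illustrate what the `fromSigmoid` flag distinguishes: with `fromSigmoid = false` the prediction is treated as raw logits and the sigmoid is folded into the loss (typically via the standard numerically stable rewrite), while with `fromSigmoid = true` the prediction is treated as already-sigmoided probabilities. A hedged stand-alone sketch of the two equivalent computations (plain Java, not DJL internals):

```java
public class FromSigmoidDemo {

    // fromSigmoid = false: loss from a raw logit x and label z, using the
    // standard numerically stable rewrite of -[z*log(p) + (1-z)*log(1-p)].
    public static double lossFromLogit(double x, double z) {
        return Math.max(x, 0.0) - x * z + Math.log(1.0 + Math.exp(-Math.abs(x)));
    }

    // fromSigmoid = true: loss from an already-sigmoided probability p.
    public static double lossFromProb(double p, double z) {
        return -(z * Math.log(p) + (1.0 - z) * Math.log(1.0 - p));
    }

    public static void main(String[] args) {
        double x = 2.0;                        // raw logit
        double z = 1.0;                        // binary label
        double p = 1.0 / (1.0 + Math.exp(-x)); // sigmoid(x)
        System.out.printf("%.6f%n", lossFromLogit(x, z)); // prints 0.126928
        System.out.printf("%.6f%n", lossFromProb(p, z));  // prints 0.126928
    }
}
```

The logit form avoids computing `log(1 - p)` when `p` is near 1, which is why passing raw logits (the default) is generally preferred over applying the sigmoid yourself.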