| AbstractCompositeLoss |
AbstractCompositeLoss is a Loss class that can combine other Losses
together to make a larger loss.
|
| ElasticNetWeightDecay |
ElasticNetWeightDecay calculates the elastic net penalty (L1 + L2) of a set of parameters.
|
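The elastic net penalty described above is the weighted sum of an L1 term and an L2 term. A minimal sketch of the formula (not the DJL implementation; `lambda1` and `lambda2` are illustrative hyperparameter names):

```java
public class ElasticNetSketch {
    // penalty = lambda1 * sum(|w|) + lambda2 * sum(w^2)
    public static double penalty(double[] w, double lambda1, double lambda2) {
        double l1 = 0, l2 = 0;
        for (double v : w) {
            l1 += Math.abs(v); // L1 term: sum of absolute values
            l2 += v * v;       // L2 term: sum of squares
        }
        return lambda1 * l1 + lambda2 * l2;
    }
}
```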
| HingeLoss |
HingeLoss calculates the hinge (max-margin) loss between label and prediction.
|
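For a single prediction, the hinge loss is zero once the prediction clears the margin on the correct side. A sketch of the standard formulation with labels in {-1, +1} (the margin of 1 is the usual default, not necessarily DJL's):

```java
public class HingeSketch {
    // hinge = max(0, margin - label * prediction), label in {-1, +1}
    public static double hinge(double label, double pred, double margin) {
        return Math.max(0.0, margin - label * pred);
    }
}
```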
| IndexLoss |
A wrapper for a Loss that evaluates on only a particular NDArray in the
predictions and/or labels NDLists.
|
| L1Loss |
L1Loss calculates L1 loss between label and prediction.
|
| L1WeightDecay |
L1WeightDecay calculates L1 penalty of a set of parameters.
|
| L2Loss |
L2Loss calculates the L2 loss between label and prediction, a.k.a. mean squared error (MSE).
|
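As a formula, the L2 loss averages the squared differences between labels and predictions. A plain mean-squared-error sketch (the library version may apply a configurable weight factor, so treat this as the formula only):

```java
public class L2Sketch {
    // mse = mean((label - prediction)^2)
    public static double mse(double[] labels, double[] preds) {
        double sum = 0;
        for (int i = 0; i < labels.length; i++) {
            double d = labels[i] - preds[i];
            sum += d * d;
        }
        return sum / labels.length;
    }
}
```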
| L2WeightDecay |
L2WeightDecay calculates L2 penalty of a set of parameters.
|
| Loss |
Loss functions (or Cost functions) are used to evaluate the model predictions against true labels
for optimization.
|
| MaskedSoftmaxCrossEntropyLoss |
MaskedSoftmaxCrossEntropyLoss is an implementation of Loss that only considers a
specific number of values for the loss computations, and masks the rest according to the given
sequence.
|
| QuantileL1Loss |
QuantileL1Loss calculates the Weighted Quantile Loss between labels and predictions.
|
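The quantile (pinball) loss penalizes under- and over-prediction asymmetrically according to the target quantile `q`. A sketch of the per-element formula (illustrative only, not the DJL implementation):

```java
public class QuantileSketch {
    // pinball(y, p, q) = max(q * (y - p), (q - 1) * (y - p)), q in (0, 1)
    public static double pinball(double label, double pred, double q) {
        double diff = label - pred;
        return Math.max(q * diff, (q - 1) * diff);
    }
}
```

With `q = 0.9`, under-predicting by one unit costs 0.9 while over-predicting by one unit costs only 0.1, which pushes the model toward the 90th percentile.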
| SigmoidBinaryCrossEntropyLoss |
SigmoidBinaryCrossEntropyLoss calculates the binary cross-entropy loss on sigmoid-activated predictions.
|
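Applying the sigmoid and the cross-entropy separately can overflow for large logits, so the two are usually fused into one numerically stable expression. A sketch of that standard trick on a single logit (the formula, not DJL's code):

```java
public class BceSketch {
    // Stable BCE from a raw logit x and label y in [0, 1]:
    // loss = max(x, 0) - x * y + log(1 + exp(-|x|))
    public static double bceFromLogits(double logit, double label) {
        return Math.max(logit, 0.0) - logit * label
                + Math.log1p(Math.exp(-Math.abs(logit)));
    }
}
```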
| SimpleCompositeLoss |
SimpleCompositeLoss is an implementation of the Loss abstract class that can
combine different Loss functions by adding the individual losses together.
|
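The "adding the individual losses together" idea can be sketched with a small container that sums component loss functions; this mirrors the concept only and is not DJL's `SimpleCompositeLoss` API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.ToDoubleBiFunction;

public class CompositeSketch {
    private final List<ToDoubleBiFunction<double[], double[]>> parts = new ArrayList<>();

    // Register one component loss; returns this for chaining.
    public CompositeSketch add(ToDoubleBiFunction<double[], double[]> loss) {
        parts.add(loss);
        return this;
    }

    // Total loss = sum of all component losses on the same inputs.
    public double evaluate(double[] labels, double[] preds) {
        double total = 0;
        for (ToDoubleBiFunction<double[], double[]> p : parts) {
            total += p.applyAsDouble(labels, preds);
        }
        return total;
    }
}
```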
| SingleShotDetectionLoss |
SingleShotDetectionLoss computes the loss for training a Single Shot Detection (SSD) model.
|
| SoftmaxCrossEntropyLoss |
SoftmaxCrossEntropyLoss is a type of Loss that calculates the softmax cross
entropy loss.
|
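Softmax cross-entropy is typically computed from raw logits via a stable log-sum-exp rather than an explicit softmax. A sketch for a single example with an integer class label (formula only, not the DJL implementation):

```java
public class SoftmaxCeSketch {
    // loss = logSumExp(logits) - logits[classIndex]
    //      = -log(softmax(logits)[classIndex])
    public static double crossEntropy(double[] logits, int classIndex) {
        double max = Double.NEGATIVE_INFINITY;
        for (double l : logits) max = Math.max(max, l);  // shift for stability
        double sum = 0;
        for (double l : logits) sum += Math.exp(l - max);
        double logSumExp = max + Math.log(sum);
        return logSumExp - logits[classIndex];
    }
}
```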
| TabNetClassificationLoss |
Calculates the loss for TabNet in classification tasks.
|
| TabNetRegressionLoss |
Calculates the loss for TabNet in regression tasks.
|
| YOLOv3Loss |
YOLOv3Loss computes the loss for training a YOLOv3 object-detection model.
|
| YOLOv3Loss.Builder |
A builder to construct a YOLOv3Loss.
|