BCEWithLogitsLSLoss

class torch_uncertainty.losses.BCEWithLogitsLSLoss(weight=None, reduction='mean', label_smoothing=0.0)

Binary Cross Entropy with Logits Loss with label smoothing.

PyTorch's native BCEWithLogitsLoss does not support label smoothing. This class extends BCEWithLogitsLoss with a label-smoothing option.

Parameters:
  • weight (Tensor, optional) – A manual rescaling weight given to the loss of each batch element. If given, has to be a Tensor of size “nbatch”. Defaults to None.

  • reduction (str, optional) – Specifies the reduction to apply to the output: ‘none’ | ‘mean’ | ‘sum’. ‘none’: no reduction will be applied, ‘mean’: the sum of the output will be divided by the number of elements in the output, ‘sum’: the output will be summed. Defaults to ‘mean’.

  • label_smoothing (float, optional) – The label smoothing factor. Defaults to 0.0.
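To illustrate what the label_smoothing parameter does, here is a minimal pure-Python sketch of binary label smoothing combined with the BCE-with-logits formula. The smoothing rule shown (moving each binary target toward 0.5 by the smoothing factor, i.e. `t * (1 - ls) + 0.5 * ls`) follows the convention PyTorch uses for CrossEntropyLoss; the exact torch_uncertainty implementation may differ in detail, and this helper function is hypothetical, not part of the library.

```python
import math

def bce_with_logits_ls(logit, target, label_smoothing=0.0):
    """Per-element BCE with logits, with binary label smoothing.

    Hypothetical sketch: smooths the 0/1 target toward 0.5, then applies
    the standard binary cross-entropy on the sigmoid of the logit.
    """
    # Smoothed target: 1 becomes 1 - ls/2, 0 becomes ls/2.
    t = target * (1.0 - label_smoothing) + 0.5 * label_smoothing
    p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid
    return -(t * math.log(p) + (1.0 - t) * math.log(1.0 - p))

# With label_smoothing=0.0 this reduces to plain BCE with logits;
# with label_smoothing > 0, confident correct predictions are
# penalized slightly, discouraging overconfidence.
plain = bce_with_logits_ls(2.0, 1.0, label_smoothing=0.0)
smoothed = bce_with_logits_ls(2.0, 1.0, label_smoothing=0.1)
```

In the library itself one would instead instantiate the loss directly, e.g. `criterion = BCEWithLogitsLSLoss(label_smoothing=0.1)` and call it on logits and targets like any other PyTorch loss module.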