Disagreement

class torch_uncertainty.metrics.Disagreement(reduction='mean', **kwargs)[source]

The Disagreement metric, which estimates the confidence of an ensemble of estimators from how often their predictions differ.

Parameters:
  • reduction (str, optional) –

    Determines how to reduce over the \(B\)/batch dimension:

    • 'mean' [default]: Averages the scores across samples

    • 'sum': Sums the scores across samples

    • 'none' or None: Returns the score for each sample

  • kwargs – Additional keyword arguments, see Advanced metric settings.

Inputs:
  • probs: \((B, N, C)\)

where \(B\) is the batch size, \(C\) is the number of classes and \(N\) is the number of estimators.
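As a sketch of what such a metric computes, here is a minimal pure-Python version assuming the common pairwise formulation of disagreement (the fraction of estimator pairs whose argmax predictions differ, per sample); the library's exact implementation may differ. Plain nested lists stand in for a \((B, N, C)\) tensor:

```python
from itertools import combinations


def disagreement(probs, reduction="mean"):
    """Pairwise disagreement of an ensemble's argmax predictions.

    probs: B samples, each a list of N estimators' class-probability
    vectors of length C. This is an illustrative sketch, not the
    library's implementation.
    """
    scores = []
    for sample in probs:  # sample: N estimators x C class probabilities
        # argmax prediction of each estimator
        preds = [p.index(max(p)) for p in sample]
        pairs = list(combinations(preds, 2))
        # fraction of estimator pairs that predict different classes
        scores.append(sum(a != b for a, b in pairs) / len(pairs))
    if reduction == "mean":
        return sum(scores) / len(scores)
    if reduction == "sum":
        return sum(scores)
    if reduction in ("none", None):
        return scores
    raise ValueError(f"Invalid reduction: {reduction}")


# B=2 samples, N=3 estimators, C=2 classes
probs = [
    [[0.9, 0.1], [0.8, 0.2], [0.7, 0.3]],  # all agree on class 0
    [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]],  # predictions 0, 1, 0
]
print(disagreement(probs, reduction="none"))  # per-sample scores
```

The first sample yields a score of 0 (full agreement, high confidence), the second 2/3 (two of three estimator pairs disagree), illustrating why higher disagreement corresponds to lower confidence.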

Note

A higher disagreement indicates a lower ensemble confidence.

Warning

Make sure that the probabilities in probs are normalized to sum to one over the class dimension.

Raises:

ValueError – If reduction is not one of 'mean', 'sum', 'none' or None.

compute()[source]

Compute the disagreement based on the inputs passed in to update.

update(probs)[source]

Update the metric state with prediction probabilities.

Parameters:

probs (torch.Tensor) – Probabilities from the ensemble of models, of shape \((B, N, C)\).