mc_dropout#

torch_uncertainty.models.mc_dropout(model, num_estimators, last_layer=False, on_batch=True)[source]#

MC Dropout wrapper for a model.

Parameters:
  • model (nn.Module) – model to wrap

  • num_estimators (int) – number of estimators (stochastic forward passes) to use.

  • last_layer (bool, optional) – whether to apply dropout to the last layer only. Defaults to False.

  • on_batch (bool) – if True, increase the batch size to perform MC-Dropout in a single forward pass; otherwise, loop over the estimators to reduce the memory footprint. Defaults to True.

Warning

When on_batch=True, the effective batch size is multiplied by num_estimators, which can cause out-of-memory errors when not enough memory is available.
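As a rough illustration of what the wrapper does, the sketch below implements the MC-Dropout idea in plain PyTorch: keep dropout layers active at inference time and run num_estimators stochastic forward passes, either as one enlarged batch (on_batch=True) or in a loop. The MCDropout class here is hypothetical and not torch_uncertainty's actual implementation.

```python
import torch
from torch import nn


class MCDropout(nn.Module):
    """Illustrative MC-Dropout wrapper (not torch_uncertainty's internals)."""

    def __init__(self, model: nn.Module, num_estimators: int, on_batch: bool = True):
        super().__init__()
        self.model = model
        self.num_estimators = num_estimators
        self.on_batch = on_batch

    def eval(self):
        # Put the model in eval mode, then re-enable every dropout layer
        # so predictions stay stochastic at inference time.
        self.model.eval()
        for m in self.model.modules():
            if isinstance(m, nn.Dropout):
                m.train()
        return self

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.on_batch:
            # One large forward pass: repeat the batch num_estimators times.
            x = x.repeat(self.num_estimators, *([1] * (x.dim() - 1)))
            return self.model(x)
        # Lower memory footprint: one forward pass per estimator.
        return torch.cat([self.model(x) for _ in range(self.num_estimators)], dim=0)


model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Dropout(0.5), nn.Linear(16, 2))
mc_model = MCDropout(model, num_estimators=8).eval()
out = mc_model(torch.randn(3, 4))
# out has shape (num_estimators * batch_size, 2) = (24, 2); averaging the
# estimator dimension gives the predictive mean, its spread the uncertainty.
```

Note how on_batch=True trades memory for speed: the model sees a batch eight times larger, which is exactly why the warning above applies.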