MCDropout

class torch_uncertainty.models.MCDropout(model, num_estimators, last_layer, on_batch)[source]

MC Dropout wrapper for a model containing nn.Dropout modules.

Parameters:
  • model (nn.Module) – model to wrap

  • num_estimators (int) – number of stochastic forward passes (estimators) drawn during evaluation

  • last_layer (bool) – whether to apply dropout to the last layer only.

  • on_batch (bool) – whether to perform MC-Dropout on the batch dimension, repeating the input num_estimators times in a single forward pass. If False, run the forward passes in a for loop instead, which is useful when memory is constrained.

Warning

This module will work only if you apply dropout through modules declared in the constructor (__init__).

Warning

The last_layer option disables all but the most recently initialized dropout module during evaluation: make sure that the last dropout is either functional or a module of its own.
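The behavior described above can be sketched in plain PyTorch. The snippet below is a minimal illustration of the wrapper's evaluation-time semantics, not the library's implementation: it keeps the network in eval mode while forcing the nn.Dropout modules (declared in the constructor, as the warning requires) to stay stochastic.

```python
import torch
from torch import nn

# A small model whose dropout is declared in __init__, as the warning requires.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(8, 2),
)

# Manual equivalent of MC-Dropout at evaluation time: the network is in
# eval mode, but the dropout modules keep sampling random masks.
model.eval()
for m in model.modules():
    if isinstance(m, nn.Dropout):
        m.train()

num_estimators = 8
x = torch.randn(3, 4)                     # batch of B=3 inputs
out = model(x.repeat(num_estimators, 1))  # on_batch=True style: one large pass
print(out.shape)                          # (num_estimators * B, 2) = (24, 2)
```

Because the dropout masks differ across the repeated copies, the 8 predictions for each input disagree, and their spread can be read as a predictive-uncertainty estimate.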

forward(x)[source]

Forward pass of the model.

During training, the forward pass is the same as that of the core model. During evaluation, the forward pass is repeated num_estimators times, either on the batch dimension or in a for loop depending on on_batch.

Parameters:

x (Tensor) – input tensor of shape (B, …)

Returns:

output tensor of shape (num_estimators * B, …)

Return type:

Tensor
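The two evaluation strategies produce outputs of the same shape; they differ only in the memory/speed trade-off. A hedged sketch in plain PyTorch (not the library's code) contrasting the on_batch=True and on_batch=False behaviors:

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Dropout(0.5), nn.Linear(8, 2))

# Keep dropout stochastic at evaluation time.
model.eval()
for m in model.modules():
    if isinstance(m, nn.Dropout):
        m.train()

num_estimators, B = 4, 3
x = torch.randn(B, 4)

# on_batch=True: repeat the input along the batch dimension, single pass.
out_batched = model(x.repeat(num_estimators, 1))

# on_batch=False: loop over estimators (lower peak memory), then concatenate.
out_looped = torch.cat([model(x) for _ in range(num_estimators)], dim=0)

# Both yield the documented output shape (num_estimators * B, ...).
assert out_batched.shape == out_looped.shape == (num_estimators * B, 2)
```

The looped variant materializes only one batch of activations at a time, which is why the docs recommend on_batch=False when memory is constrained.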

train(mode=True)[source]

Override the default train method to set the training mode of each submodule to be the same as the module itself except for the selected dropout modules.

Parameters:

mode (bool, optional) – whether to set the module to training mode. Defaults to True.
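The train() override can be illustrated with a small reimplementation. This is a sketch under assumptions: the attribute names (core_model, filtered_modules) and the mask-selection logic are hypothetical stand-ins, not the library's actual internals, but the override shows the documented idea of propagating mode to every submodule except the selected dropout modules.

```python
from torch import nn


class MCDropoutSketch(nn.Module):
    """Illustrative (hypothetical) reimplementation of the train() override."""

    def __init__(self, model: nn.Module):
        super().__init__()
        self.core_model = model  # hypothetical attribute name
        # Select every dropout module declared in the wrapped model.
        self.filtered_modules = [
            m for m in model.modules() if isinstance(m, nn.Dropout)
        ]

    def train(self, mode: bool = True):
        # Propagate the requested mode to the wrapper and every submodule...
        self.training = mode
        self.core_model.train(mode)
        # ...but force the selected dropout modules to remain in training
        # mode, so they keep sampling masks during evaluation.
        for m in self.filtered_modules:
            m.train(True)
        return self


model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(0.5))
wrapped = MCDropoutSketch(model)
wrapped.train(False)  # i.e. wrapped.eval()
print(model[0].training, model[1].training)  # False True
```

Calling wrapped.eval() therefore puts the linear layer in eval mode while the dropout stays active, which is exactly what makes the repeated forward passes stochastic.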