SILog
- class torch_uncertainty.metrics.regression.SILog(sqrt=False, lmbda=1.0, **kwargs)
Computes the Scale-Invariant Logarithmic Loss metric.
The Scale-Invariant Logarithmic Loss (SILog) is a metric designed for depth estimation tasks.
\[\text{SILog} = \frac{1}{N} \sum_{i=1}^{N} \left(\log(y_i) - \log(\hat{y_i})\right)^2 - \lambda \left(\frac{1}{N} \sum_{i=1}^{N} \left(\log(y_i) - \log(\hat{y_i})\right) \right)^2,\]
where \(N\) is the batch size, \(y_i\) are the target values, \(\hat{y_i}\) are the predicted values, and \(\lambda\) is the regularization weight on the variance of the error (the lmbda argument). This metric evaluates the scale-invariant error between predicted and target values in log-space. It accounts for both the variance of the error and the mean log difference between predictions and targets. Set the sqrt argument to True to return the square root of the SILog value. A direct computation of this formula is sketched after the parameter list below.
- Parameters:
sqrt – If True, return the square root of the metric. Defaults to False.
lmbda – The regularization parameter on the variance of error. Defaults to 1.0.
kwargs – Additional keyword arguments, see Advanced metric settings.
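The parameters map directly onto the equation above. As a sanity check, here is a minimal sketch that recomputes SILog by hand in plain PyTorch, assuming the formula as written (with lmbda weighting the squared mean term); the variable names are illustrative and not part of the torch_uncertainty API.
import torch

preds = torch.tensor([1.5, 2.0, 3.5, 5.0])
target = torch.tensor([1.4, 2.2, 3.3, 5.2])
lmbda = 1.0

# Element-wise log difference between targets and predictions
log_diff = torch.log(target) - torch.log(preds)

# Mean squared log difference minus lmbda times the squared mean log difference
silog = torch.mean(log_diff**2) - lmbda * torch.mean(log_diff) ** 2

print(f"SILog: {silog.item():.4f}")               # ~0.0047
print(f"sqrt(SILog): {silog.sqrt().item():.4f}")  # ~0.0686, matching the metric example below with sqrt=True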
- Reference:
[1] Depth Map Prediction from a Single Image using a Multi-Scale Deep Network, NeurIPS 2014.
[2] From big to small: Multi-scale local planar guidance for monocular depth estimation.
Example:
from torch_uncertainty.metrics.regression import SILog
import torch

# Initialize the SILog metric with sqrt=True
silog_metric = SILog(sqrt=True, lmbda=1.0)

# Example predictions and targets
preds = torch.tensor([1.5, 2.0, 3.5, 5.0])
target = torch.tensor([1.4, 2.2, 3.3, 5.2])

# Update the metric state
silog_metric.update(preds, target)

# Compute the Scale-Invariant Logarithmic Loss
result = silog_metric.compute()
print(f"SILog: {result.item():.4f}")
# Output: SILog: 0.0686
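Since SILog is a TorchMetrics-style stateful metric, update() can be called once per batch and compute() aggregates over everything accumulated so far. The snippet below is a usage sketch under that assumption; the batch split is arbitrary.
import torch
from torch_uncertainty.metrics.regression import SILog

silog_metric = SILog(sqrt=True, lmbda=1.0)

# Feed the same data as above in two batches; state accumulates across update() calls
batches = [
    (torch.tensor([1.5, 2.0]), torch.tensor([1.4, 2.2])),
    (torch.tensor([3.5, 5.0]), torch.tensor([3.3, 5.2])),
]
for preds, target in batches:
    silog_metric.update(preds, target)

# Aggregated over both batches; expected to match the single-batch result above (~0.0686)
print(f"SILog: {silog_metric.compute().item():.4f}")

# Clear the accumulated state before the next evaluation run
silog_metric.reset()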