SigmoidFocalLoss

class dragon.vm.torch.nn.SigmoidFocalLoss(
  alpha=0.25,
  gamma=2.0,
  weight=None,
  size_average=None,
  negative_index=None,
  reduce=None,
  reduction='valid'
)[source]

Compute the sigmoid focal loss with sparse labels. [Lin et al., 2017].

The FocalLoss function is defined as:

\[\text{FocalLoss}(p_{t}) = -(1 - p_{t})^{\gamma}\log(p_{t}) \]

Examples:

m = torch.nn.SigmoidFocalLoss()
logits = torch.randn(2, 2)
targets = torch.tensor([0, 1])
loss = m(logits, targets)
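As a minimal sketch of the computation above, the snippet below evaluates the focal loss in pure Python. It treats each logit column as an independent sigmoid classifier, weights the positive class by `alpha` and the negative classes by `1 - alpha` (the convention from Lin et al., 2017; the formula shown above omits the alpha term), and averages over all entries. Dragon's actual kernel, its `negative_index` handling, and its `'valid'` reduction may differ in detail; `sigmoid_focal_loss` is a hypothetical helper name, not part of the Dragon API.

```python
import math

def sigmoid_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Reference sketch of sigmoid focal loss with sparse labels.

    `logits` is a list of rows (one per example); `targets` holds the
    positive class index for each row. Assumes a 'mean' reduction over
    all logit entries.
    """
    losses = []
    for row, label in zip(logits, targets):
        for j, x in enumerate(row):
            p = 1.0 / (1.0 + math.exp(-x))  # sigmoid probability
            if j == label:
                pt, a = p, alpha            # rare (positive) class
            else:
                pt, a = 1.0 - p, 1.0 - alpha
            # FocalLoss(pt) = -a * (1 - pt)^gamma * log(pt)
            losses.append(-a * (1.0 - pt) ** gamma * math.log(pt))
    return sum(losses) / len(losses)

# Uniform logits: every sigmoid outputs 0.5.
print(sigmoid_focal_loss([[0.0, 0.0]], [0]))
```

Note how the `(1 - pt)^gamma` modulating factor shrinks the loss for well-classified entries: a confidently correct row such as `[[5.0, -5.0]]` with target `0` contributes far less than the uniform case above.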

__init__

SigmoidFocalLoss.__init__(
  alpha=0.25,
  gamma=2.0,
  weight=None,
  size_average=None,
  negative_index=None,
  reduce=None,
  reduction='valid'
)[source]

Create a SigmoidFocalLoss module.

Parameters:
  • alpha (float, optional, default=0.25) – The scale factor on the rare class.
  • gamma (float, optional, default=2.0) – The exponential decay factor on the easy examples.
  • weight (dragon.vm.torch.Tensor, optional) – The weight for each class.
  • size_average (bool, optional) – True to set the reduction to ‘mean’.
  • negative_index (int, optional) – The negative label index.
  • reduce (bool, optional) – True to set the reduction to ‘sum’ or ‘mean’.
  • reduction ({'none', 'mean', 'sum', 'valid'}, optional) – The reduction method.