SigmoidFocalLoss

class dragon.vm.torch.nn.SigmoidFocalLoss(
  alpha=0.25,
  gamma=2.0,
  weight=None,
  size_average=None,
  start_index=0,
  reduce=None,
  reduction='mean'
)[source]

Compute the sigmoid focal loss [Lin et al., 2017].

The FocalLoss function is defined as:

\[\text{FocalLoss}(p_{t}) = -(1 - p_{t})^{\gamma}\log(p_{t}) \]
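With the default \(\gamma = 2\), the modulating factor \((1 - p_{t})^{\gamma}\) sharply reduces the contribution of well-classified examples. As a worked example:

\[(1 - 0.9)^{2} = 0.01, \qquad (1 - 0.5)^{2} = 0.25 \]

so an example predicted with \(p_{t} = 0.9\) contributes 100x less loss than under plain cross-entropy, while an uncertain example with \(p_{t} = 0.5\) is only down-weighted by a factor of 4. The alpha argument additionally scales the terms of the rare class, following the alpha-balanced variant described by Lin et al., 2017.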

Examples:

from dragon.vm import torch

m = torch.nn.SigmoidFocalLoss()
logits = torch.randn(2, 2)      # per-class logits for 2 examples
targets = torch.tensor([0, 1])  # integer class indices
loss = m(logits, targets)       # reduced to a scalar by the default 'mean' reduction
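The computation can be approximated with standard PyTorch-style ops. The sketch below illustrates the formula above and is not Dragon's implementation; the handling of alpha, the one-hot encoding of targets, and the numerical clamping are assumptions made for illustration:

import torch
import torch.nn.functional as F

def sigmoid_focal_loss_ref(logits, targets, alpha=0.25, gamma=2.0, reduction='mean'):
    # Reference sketch: -alpha_t * (1 - p_t)^gamma * log(p_t) over per-class sigmoids.
    num_classes = logits.shape[-1]
    one_hot = F.one_hot(targets, num_classes).type_as(logits)    # (N, C) binary targets
    p = torch.sigmoid(logits)
    p_t = p * one_hot + (1.0 - p) * (1.0 - one_hot)              # prob of the true label
    alpha_t = alpha * one_hot + (1.0 - alpha) * (1.0 - one_hot)  # rare-class scale factor
    loss = -alpha_t * (1.0 - p_t).pow(gamma) * p_t.clamp(min=1e-12).log()
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    return loss  # reduction='none'

loss_ref = sigmoid_focal_loss_ref(torch.randn(2, 2), torch.tensor([0, 1]))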

__init__

SigmoidFocalLoss.__init__(
  alpha=0.25,
  gamma=2.0,
  weight=None,
  size_average=None,
  start_index=0,
  reduce=None,
  reduction='mean'
)[source]

Create a SigmoidFocalLoss module.

Parameters:
  • alpha (float, optional, default=0.25) The scale factor on the rare class.
  • gamma (float, optional, default=2.0) The exponential decay factor on the easy examples.
  • weight (dragon.vm.torch.Tensor, optional) The weight for each class.
  • size_average (bool, optional) True to set the reduction to 'mean'.
  • start_index (int, optional, default=0) The start value of the target classes.
  • reduce (bool, optional) True to set the reduction to 'sum' or 'mean'.
  • reduction ({'none', 'mean', 'sum', 'valid'}, optional) The reduction method.
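A hypothetical usage of the constructor arguments; the values passed to alpha, gamma, and reduction below are chosen only for illustration, and the per-element output for reduction='none' is an assumption:

from dragon.vm import torch

m = torch.nn.SigmoidFocalLoss(alpha=0.5, gamma=1.0, reduction='none')
logits = torch.randn(4, 3)            # 4 examples, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # integer class indices, counted from start_index
per_element = m(logits, targets)      # unreduced, per-element losses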