SCEWithLogitsLoss

class dragon.vm.torch.nn.SCEWithLogitsLoss(
  weight=None,
  size_average=None,
  reduce=None,
  reduction='mean',
  pos_weight=None
)

Compute the softmax cross entropy with dense targets, i.e. targets given as per-class probabilities rather than sparse class indices.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t}) \]

where \(p_{t}\) is the softmax probability assigned to the target class.

Examples:

from dragon.vm import torch

m = torch.nn.SCEWithLogitsLoss()
logits = torch.randn(2, 2)
targets = torch.tensor([[1, 0], [0, 1]], dtype='float32')
loss = m(logits, targets)
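
For reference, the loss can be recomputed by hand from the formula above. This is a minimal sketch, not part of the documented API; it assumes Dragon's torch-compatible log_softmax and tensor reductions behave as in PyTorch:

from dragon.vm import torch
from dragon.vm.torch.nn import functional as F

logits = torch.tensor([[2.0, 0.5], [0.1, 1.5]], dtype='float32')
targets = torch.tensor([[1.0, 0.0], [0.0, 1.0]], dtype='float32')

# Module result (mean reduction by default).
loss = torch.nn.SCEWithLogitsLoss()(logits, targets)

# Manual equivalent: p = softmax(logits), then the mean over rows
# of -sum(targets * log(p)).
manual = -(targets * F.log_softmax(logits, dim=1)).sum(1).mean()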

__init__

SCEWithLogitsLoss.__init__(
  weight=None,
  size_average=None,
  reduce=None,
  reduction='mean',
  pos_weight=None
)

Create a SCEWithLogitsLoss module.

Parameters:
  • weight (dragon.vm.torch.Tensor, optional) – The weight for each class.
  • size_average (bool, optional) – True to set the reduction to ‘mean’.
  • reduce (bool, optional) – True to set the reduction to ‘sum’ or ‘mean’.
  • reduction ({'none', 'mean', 'sum'}, optional) – The reduction method.
  • pos_weight (dragon.vm.torch.Tensor, optional) – The weight for each positive class.
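
The reduction argument controls how per-example losses are aggregated. The sketch below is a minimal illustration (the one-hot targets are example data only):

from dragon.vm import torch

logits = torch.randn(4, 3)
targets = torch.tensor(
    [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype='float32')

# 'none' keeps one loss per example; 'mean' and 'sum' reduce to a scalar.
per_example = torch.nn.SCEWithLogitsLoss(reduction='none')(logits, targets)  # shape: (4,)
mean_loss = torch.nn.SCEWithLogitsLoss(reduction='mean')(logits, targets)
sum_loss = torch.nn.SCEWithLogitsLoss(reduction='sum')(logits, targets)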