CrossEntropyLoss

class dragon.vm.torch.nn.CrossEntropyLoss(
  weight=None,
  size_average=None,
  ignore_index=None,
  reduce=None,
  reduction='valid'
)[source]

Compute the softmax cross entropy with sparse labels.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t}) \]

where \(p_{t}\) is the softmax probability assigned to the target class.

Examples:

from dragon.vm import torch

m = torch.nn.CrossEntropyLoss()
logits = torch.randn(2, 2)
targets = torch.tensor([0, 1])
loss = m(logits, targets)
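The definition above can be cross-checked with a small plain-Python sketch that does not use Dragon; `softmax_cross_entropy` is an illustrative helper, not part of the API:

```python
import math

def softmax_cross_entropy(logits, target):
    """Cross entropy with a sparse label: -log(softmax(logits)[target])."""
    m = max(logits)  # subtract the max logit for numerical stability
    exps = [math.exp(x - m) for x in logits]
    p_t = exps[target] / sum(exps)  # softmax probability of the target class
    return -math.log(p_t)

# With the same logits, the loss is lower when the target
# matches the larger logit.
print(softmax_cross_entropy([2.0, 0.5], 0))  # small loss
print(softmax_cross_entropy([2.0, 0.5], 1))  # large loss
```

Note that the two losses differ by exactly the logit gap (1.5 here), since \(-\log(p_1) + \log(p_0) = \log(p_0 / p_1)\) and the softmax normalizer cancels.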

__init__

CrossEntropyLoss.__init__(
  weight=None,
  size_average=None,
  ignore_index=None,
  reduce=None,
  reduction='valid'
)[source]

Create a CrossEntropyLoss module.

Parameters:
  • weight (dragon.vm.torch.Tensor, optional) – The weight for each class.
  • size_average (bool, optional) – True to set the reduction to 'mean'.
  • ignore_index (int, optional) – The label index to ignore.
  • reduce (bool, optional) – True to set the reduction to 'sum' or 'mean'.
  • reduction ({'none', 'mean', 'sum', 'valid'}, optional) – The reduction method.
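To make the reduction modes concrete, here is a hedged plain-Python sketch of how they might combine per-element losses; `reduce_losses` is a hypothetical helper, and the exact normalization ('mean' dividing by all elements vs. 'valid' dividing by only the non-ignored ones) is an assumption inferred from the `ignore_index` parameter:

```python
def reduce_losses(losses, targets, reduction='valid', ignore_index=None):
    """Illustrative sketch of the documented reduction modes (assumption)."""
    # Entries whose target equals ignore_index are assumed to contribute 0.
    if ignore_index is not None:
        kept = [l for l, t in zip(losses, targets) if t != ignore_index]
    else:
        kept = list(losses)
    if reduction == 'none':
        return losses                       # per-element losses, unreduced
    if reduction == 'sum':
        return sum(kept)                    # total over non-ignored entries
    if reduction == 'mean':
        return sum(kept) / len(losses)      # average over ALL entries (assumption)
    if reduction == 'valid':
        return sum(kept) / max(len(kept), 1)  # average over non-ignored entries
    raise ValueError('unknown reduction: ' + reduction)

# Per-element losses for 3 positions; the last target is ignored.
print(reduce_losses([0.2, 1.7, 0.5], [0, 1, -1], 'valid', ignore_index=-1))
```

Under this sketch, 'valid' (the default) behaves like 'mean' when no labels are ignored, but does not dilute the average with ignored positions when some are.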