CrossEntropyLoss¶
class dragon.vm.torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=None, reduce=None, reduction='mean')[source]¶
Compute the softmax cross entropy.
The CrossEntropy function is defined as:
\[\text{CrossEntropy}(p_{t}) = -\log(p_{t})\]
Examples:
m = torch.nn.CrossEntropyLoss()
logits = torch.randn(2, 2)
targets = torch.tensor([0, 1])
loss = m(logits, targets)
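As a plain-Python illustration of the formula above (not part of the dragon API — the function name below is purely for demonstration), the per-example loss is the negative log of the softmax probability assigned to the target class:

```python
import math

def softmax_cross_entropy(logits, target):
    # Softmax with max-subtraction for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # CrossEntropy(p_t) = -log(p_t), where p_t is the probability
    # the softmax assigns to the target class.
    return -math.log(probs[target])

# Uniform logits over two classes give p_t = 0.5, so the loss is log(2).
loss = softmax_cross_entropy([0.0, 0.0], 0)
```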
__init__¶
CrossEntropyLoss.__init__(weight=None, size_average=None, ignore_index=None, reduce=None, reduction='mean')[source]¶
Create a CrossEntropyLoss module.
- Parameters:
- weight (dragon.vm.torch.Tensor, optional) – The weight for each class.
- size_average (bool, optional) – True to set the reduction to 'mean'.
- ignore_index (int, optional) – The target value to ignore.
- reduce (bool, optional) – True to set the reduction to 'sum' or 'mean'.
- reduction ({'none', 'mean', 'sum', 'valid'}, optional) – The reduction method.