CrossEntropyLoss
class dragon.vm.torch.nn.CrossEntropyLoss(
  weight=None,
  size_average=None,
  ignore_index=None,
  reduce=None,
  reduction='mean'
)
Compute the softmax cross entropy.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t})\]

Examples:

```python
m = torch.nn.CrossEntropyLoss()
logits = torch.randn(2, 2)
targets = torch.tensor([0, 1])
loss = m(logits, targets)
```
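To make the formula concrete, the following is a minimal sketch of the underlying computation in plain Python: softmax the logits, then take \(-\log\) of the probability assigned to the target class. The helper name `cross_entropy` is illustrative, not part of the library API.

```python
import math

def cross_entropy(logits, target):
    # Softmax over the logits, then -log of the target class probability,
    # i.e. CrossEntropy(p_t) = -log(p_t).
    exps = [math.exp(x) for x in logits]
    p_t = exps[target] / sum(exps)
    return -math.log(p_t)

# Uniform logits over 2 classes give p_t = 0.5, so the loss is -log(0.5).
loss = cross_entropy([0.0, 0.0], 0)
print(round(loss, 4))  # 0.6931
```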
__init__
CrossEntropyLoss.__init__(
  weight=None,
  size_average=None,
  ignore_index=None,
  reduce=None,
  reduction='mean'
)
Create a CrossEntropyLoss module.

Parameters:
- weight (dragon.vm.torch.Tensor, optional) – The weight for each class.
- size_average (bool, optional) – True to set the reduction to 'mean'.
- ignore_index (int, optional) – The target value to ignore when computing the loss.
- reduce (bool, optional) – True to set the reduction to 'sum' or 'mean'.
- reduction ({'none', 'mean', 'sum', 'valid'}, optional) – The reduction method.
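A plain-Python sketch of how these parameters could interact, assuming 'mean' averages over the full batch while 'valid' averages only over non-ignored targets; the helper names and the exact 'valid' semantics are assumptions for illustration, not the library's implementation.

```python
import math

def softmax_ce(logits, target):
    # Per-sample softmax cross entropy: -log(p_t).
    exps = [math.exp(x) for x in logits]
    return -math.log(exps[target] / sum(exps))

def cross_entropy_loss(batch_logits, targets, ignore_index=None, reduction='mean'):
    # Hypothetical sketch, not the dragon implementation.
    # Drop samples whose target equals ignore_index.
    kept = [softmax_ce(l, t) for l, t in zip(batch_logits, targets)
            if t != ignore_index]
    if reduction == 'none':
        return kept                        # per-sample losses
    if reduction == 'sum':
        return sum(kept)
    if reduction == 'valid':
        return sum(kept) / len(kept)       # average over non-ignored targets
    return sum(kept) / len(targets)        # 'mean': average over the full batch
```

For example, with a batch of two samples where one target is set to `ignore_index`, 'valid' divides by 1 while 'mean' divides by 2.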