sparse_softmax_cross_entropy

dragon.losses.sparse_softmax_cross_entropy(
  inputs,
  axis=1,
  ignore_index=None,
  reduction='valid',
  **kwargs
)

Compute the softmax cross entropy with sparse labels.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t})\]

where \(p_{t}\) is the softmax probability assigned to the target class.

Examples:

logit = dragon.constant([[0.5, 0.5], [0.3, 0.7]])
label = dragon.constant([1, 0])
print(dragon.losses.sparse_softmax_cross_entropy([logit, label]))  # 0.8030813
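
The value above can be verified by hand. As a minimal NumPy sketch (not Dragon's implementation, just the arithmetic), compute the softmax probability of each target class and average the per-example losses:

import numpy as np

logit = np.array([[0.5, 0.5], [0.3, 0.7]])
label = np.array([1, 0])
# Softmax over axis 1, then take the probability of each labeled class.
prob = np.exp(logit) / np.exp(logit).sum(axis=1, keepdims=True)
loss = -np.log(prob[np.arange(len(label)), label])  # [0.6931, 0.9130]
print(loss.mean())  # 0.8030813, matching the example above
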
Parameters:
  • inputs (Sequence[dragon.Tensor]) – The logit and label tensors.
  • axis (int, optional, default=1) – The axis to apply softmax, can be negative.
  • ignore_index (int, optional) – The label index to ignore.
  • reduction ({'none', 'sum', 'mean', 'valid'}, optional) – The reduction method; 'valid' averages over the non-ignored elements (see the sketch after the Returns block).
Returns:

dragon.Tensor – The loss tensor.
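
For intuition about the reduction modes, here is a hedged sketch with hypothetical per-element losses, assuming ignored positions contribute zero loss, that 'mean' divides by all elements, and that 'valid' divides only by the non-ignored ones:

import numpy as np

# Hypothetical per-element losses; the last position is assumed to match
# ignore_index, so its contribution has been zeroed out.
loss = np.array([0.69, 0.91, 0.0])
num_valid = 2

print(loss)                    # reduction='none': per-element losses
print(loss.sum())              # reduction='sum'
print(loss.sum() / loss.size)  # reduction='mean' (assumed: divide by all)
print(loss.sum() / num_valid)  # reduction='valid': divide by non-ignored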