softmax_cross_entropy

dragon.losses.softmax_cross_entropy(
  inputs,
  axis=1,
  reduction='mean',
  **kwargs
)

Compute the softmax cross entropy with dense (contiguous) targets, i.e. one-hot vectors or probability distributions rather than class indices.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t})\]

where \(p_{t}\) is the softmax probability assigned to the target class.
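For reference, \(p\) is obtained from the logits \(x\) by the standard softmax along the chosen axis:

\[p_{j} = \frac{\exp(x_{j})}{\sum_{k}\exp(x_{k})}\]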

Examples:

import dragon

logit = dragon.constant([[0.5, 0.5], [0.3, 0.7]])
target = dragon.constant([[0., 1.], [1., 0.]])
print(dragon.losses.softmax_cross_entropy([logit, target]))  # 0.8030813
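
The value above can be reproduced with plain NumPy (a minimal sketch of the same computation, independent of the Dragon API):

import numpy as np

logit = np.array([[0.5, 0.5], [0.3, 0.7]])
target = np.array([[0., 1.], [1., 0.]])

# Softmax along axis=1 turns each row of logits into probabilities.
p = np.exp(logit) / np.exp(logit).sum(axis=1, keepdims=True)

# -log of the probability at the target class, averaged over the
# batch (the default reduction='mean').
print((-(target * np.log(p)).sum(axis=1)).mean())  # ~0.8030813
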
Parameters:
  • inputs (Sequence[dragon.Tensor]) – The logit and target tensors.
  • axis (int, optional, default=1) – The axis along which to apply the softmax; may be negative.
  • reduction ({'none', 'sum', 'mean'}, optional) – The reduction method.
Returns:

dragon.Tensor – The loss tensor.
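
Setting reduction='none' skips the averaging and keeps one loss value per example; assuming the same logit and target as in the example above, this sketch should print the per-row losses:

print(dragon.losses.softmax_cross_entropy(
    [logit, target], reduction='none'))  # ~[0.6931472, 0.9130153]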