softmax

dragon.nn.softmax(
  inputs,
  axis=-1,
  inplace=False,
  **kwargs
)

Compute the softmax result along the given axis.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})} \]
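
For reference, here is a minimal NumPy sketch of the same formula (NumPy is used for illustration only and is not part of Dragon); subtracting the per-axis maximum first is the standard trick to keep exp from overflowing:

import numpy as np

def softmax_ref(x, axis=-1):
    # Shift by the max along the axis; softmax is invariant to this
    # shift, and it prevents overflow in exp for large inputs.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)

print(softmax_ref(np.ones((1, 4), dtype='float32')))  # [[0.25 0.25 0.25 0.25]]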

The argument axis can be negative:

x = dragon.ones((1, 4), dtype='float32')
print(dragon.nn.softmax(x, 1))  # [[0.25 0.25 0.25 0.25]]
print(dragon.nn.softmax(x, -1))  # Equivalent
Parameters:
  • inputs (dragon.Tensor) – The input tensor.
  • axis (int, optional, default=-1) – The axis to compute softmax along.
  • inplace (bool, optional, default=False) – Whether to compute in-place instead of returning a new tensor.
Returns:
  dragon.Tensor – The output tensor.
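
A minimal usage sketch for the inplace flag (assuming, as is typical for such flags, that the input buffer is reused rather than a new tensor being allocated; the doc above does not spell this out):

x = dragon.ones((2, 3), dtype='float32')
y = dragon.nn.softmax(x, inplace=True)  # output may share x's storage (assumed)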