softmax

dragon.vm.tensorflow.nn.softmax(
  logits,
  axis=-1,
  name=None,
  **kwargs
)[source]

Apply the softmax function.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{e^{x_{i}}}{\sum_{j} e^{x_{j}}} \]
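As a sanity check of the definition, the same values can be reproduced directly with NumPy; this is only an illustrative sketch and does not call the operator itself:

import numpy as np

x = np.array([1., 2., 3.])
# Exponentiate each element and normalize by the sum, matching the definition above.
y = np.exp(x) / np.exp(x).sum()
print(y)  # [0.09003057 0.24472847 0.66524096]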

The argument axis can be negative:

from dragon.vm import tensorflow as tf

x = tf.ones((1, 4), dtype='float32')
print(tf.nn.softmax(x, 1))   # [[0.25 0.25 0.25 0.25]]
print(tf.nn.softmax(x, -1))  # Equivalent
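On a 2-D input, the chosen axis determines which dimension is normalized. A minimal sketch reusing only the ops shown above (the exact printed formatting may differ):

x = tf.ones((2, 3), dtype='float32')
print(tf.nn.softmax(x, 0))   # Columns sum to 1: [[0.5 0.5 0.5], [0.5 0.5 0.5]]
print(tf.nn.softmax(x, -1))  # Rows sum to 1: [[0.3333 0.3333 0.3333], ...]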
Parameters:
  • logits (dragon.Tensor) – The input tensor.
  • axis (int, optional, default=-1) – The axis along which softmax is computed.
  • name (str, optional) – An optional name for the operation.
Returns:

dragon.Tensor – The output tensor.