softmax

dragon.vm.tensorflow.keras.activations.softmax(
  x,
  axis=-1,
  **kwargs
)

Apply the softmax function to the input.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{e^{x_{i}}}{\sum_{j} e^{x_{j}}} \]
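
For example, for the input \((-1, 0, 1)\) used below, the denominator is \(e^{-1} + e^{0} + e^{1} \approx 4.086\), giving an output of approximately \((0.090, 0.245, 0.665)\).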

Examples:

import dragon.vm.tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.activations.softmax(x, inplace=False))

Parameters:
  • x (dragon.Tensor) – The tensor \(x\).
  • axis (int, optional, default=-1) – The axis along which the softmax is applied.
Returns:

dragon.Tensor – The output tensor.
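
As a further illustration (a minimal sketch, not from the library's own examples), the snippet below assumes the same tf alias for dragon.vm.tensorflow as above and shows that axis selects the dimension that is normalized; with the default axis=-1, every row of a 2-D input sums to 1:

import dragon.vm.tensorflow as tf

x = tf.constant([[1, 2, 3], [1, 1, 1]], 'float32')
# Normalize along the last axis; each row of the output sums to 1.
y = tf.keras.activations.softmax(x, axis=-1)
print(y)  # first row ≈ [0.0900, 0.2447, 0.6652], second row ≈ [0.3333, 0.3333, 0.3333]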