log_softmax

dragon.vm.tensorflow.nn.log_softmax(
  logits,
  axis=-1,
  name=None
)

Apply the composition of logarithm and softmax.

The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right)\]
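As a sanity check of this definition, a minimal NumPy sketch (independent of Dragon, with the usual max-shift for numerical stability) computes the same quantity:

import numpy as np

def log_softmax(x, axis=-1):
    # Shift by the max for numerical stability; softmax is shift-invariant.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

x = np.array([[1.0, 2.0, 3.0]])
print(log_softmax(x))  # [[-2.4076 -1.4076 -0.4076]]
print(np.log(np.exp(x) / np.exp(x).sum(axis=-1, keepdims=True)))  # same values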

The argument axis can be negative:

import dragon.vm.tensorflow as tf  # assuming the standard Dragon alias

x = tf.random.uniform((2, 3), -0.1, 0.1)
print(tf.nn.log_softmax(x, 1))   # Reduce along axis 1
print(tf.nn.log_softmax(x, -1))  # Equivalent: -1 refers to the last axis
Parameters:
  • logits (dragon.Tensor) – The input tensor.
  • axis (int, optional, default=-1) – The axis to reduce.
  • name (str, optional) – The operation name.
Returns:
  dragon.Tensor – The output tensor.
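
As a usage note, the operator matches composing softmax and log explicitly, only with better numerical stability. The sketch below assumes dragon.vm.tensorflow also exposes tf.nn.softmax and tf.math.log, mirroring TensorFlow's API:

import dragon.vm.tensorflow as tf  # assuming the standard Dragon alias

x = tf.random.uniform((2, 3), -0.1, 0.1)
y1 = tf.nn.log_softmax(x)
y2 = tf.math.log(tf.nn.softmax(x))  # same values, less numerically stable
print(y1)
print(y2)  # approximately equal to y1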