log_softmax

dragon.nn.log_softmax(
  inputs,
  axis=-1,
  inplace=False,
  **kwargs
)

Compute the composite of logarithm and softmax.

The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right) \]
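Evaluated naively, this fraction can overflow or underflow for inputs of large magnitude. A standard log-sum-exp rewriting (general numerical practice, not stated in the Dragon docs) subtracts the maximum entry first:

\[\text{LogSoftmax}(x_{i}) = (x_{i} - m) - \log \sum_{j} \exp(x_{j} - m), \qquad m = \max_{k} x_{k} \]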

The argument axis can be negative:

import dragon

x = dragon.random.uniform((2, 3), -0.1, 0.1)
print(dragon.nn.log_softmax(x, 1))
print(dragon.nn.log_softmax(x, -1))  # Equivalent for a 2-D input
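To see what the call computes, here is a minimal NumPy sketch of the same definition, independent of Dragon (log_softmax_ref is a hypothetical reference helper, not part of the Dragon API):

import numpy as np

def log_softmax_ref(x, axis=-1):
    # Shift by the per-axis max before exponentiating, for numerical stability.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

x = np.random.uniform(-0.1, 0.1, size=(2, 3))
print(log_softmax_ref(x, 1))
print(log_softmax_ref(x, -1))  # Same values; exp() of each row sums to 1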
Parameters:
  • inputs (dragon.Tensor) – The input tensor.
  • axis (int, optional, default=-1) – The axis to reduce.
  • inplace (bool, optional, default=False) – Call in-place or return a new tensor.
Returns:

dragon.Tensor – The output tensor.
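A brief usage sketch of the inplace option (assuming, per the parameter description above, that inplace=True performs the call in place rather than returning a new tensor):

x = dragon.random.uniform((2, 3), -0.1, 0.1)
y = dragon.nn.log_softmax(x, inplace=True)  # Assumed to overwrite x's contents instead of allocating output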