log_softmax

dragon.vm.torch.nn.functional.log_softmax(
  input,
  dim,
  inplace=False
)

Apply the composite of logarithm and softmax to the input.

The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right)\]

Parameters:
  • input (dragon.vm.torch.Tensor) The input.
  • dim (int) The dimension along which LogSoftmax is computed.
  • inplace (bool, optional, default=False) Whether to do the operation in-place.
Returns:

dragon.vm.torch.Tensor The output tensor.
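
Example:

A minimal usage sketch; the tensor constructor and values below are illustrative and assume dragon.vm.torch mirrors PyTorch's tensor-creation API:

from dragon.vm import torch
from dragon.vm.torch.nn import functional as F

x = torch.tensor([[1., 2., 3.], [4., 4., 4.]])  # assumed torch-compatible constructor
y = F.log_softmax(x, dim=1)  # log(exp(x_i) / sum_j exp(x_j)) along dim 1
# exp(y) sums to 1 along dim 1; equivalently, y equals x minus its log-sum-exp along that dim.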