LogSoftmax

class dragon.vm.torch.nn.LogSoftmax(
  dim,
  inplace=False
)[source]

Apply the composite of logarithm and softmax.

The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right) \]
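The definition above can be checked numerically with plain Python, without Dragon itself. This is a sketch of the math only; the max-subtraction trick is the standard numerically stable way to evaluate the formula and is not necessarily how Dragon implements it:

```python
import math

def log_softmax(x):
    # Subtract the max before exponentiating for numerical stability;
    # this leaves the result unchanged because the shift cancels in the ratio.
    m = max(x)
    shifted = [v - m for v in x]
    log_sum = math.log(sum(math.exp(v) for v in shifted))
    return [v - log_sum for v in shifted]

x = [1.0, 2.0, 3.0]
y = log_softmax(x)
# Exponentiating the outputs recovers the softmax, which sums to 1.
total = sum(math.exp(v) for v in y)
```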

Examples:

from dragon.vm import torch

m = torch.nn.LogSoftmax(dim=1)
x = torch.randn(2, 3)
y = m(x)

__init__

LogSoftmax.__init__(
  dim,
  inplace=False
)[source]

Create a LogSoftmax module.

Parameters:
  • dim (int) The dimension along which softmax is computed.
  • inplace (bool, optional, default=False) Whether to do the operation in-place.
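With dim=1 on a 2 x 3 input, the normalization runs over each row independently. A plain-Python sketch of that behavior (illustrative only, not Dragon code):

```python
import math

def log_softmax_rows(mat):
    # dim=1 on a 2-D input: normalize each row on its own.
    out = []
    for row in mat:
        m = max(row)  # max-shift for numerical stability
        log_sum = m + math.log(sum(math.exp(v - m) for v in row))
        out.append([v - log_sum for v in row])
    return out

x = [[1.0, 2.0, 3.0], [0.5, 0.5, 0.5]]
y = log_softmax_rows(x)
# Each output row exponentiates back to a probability distribution,
# so a uniform input row maps to -log(n) in every position.
```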