LogSoftmax

class dragon.vm.torch.nn.LogSoftmax(dim)[source]

Apply the composition of logarithm and softmax.

The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right) \]

Examples:

from dragon.vm import torch

m = torch.nn.LogSoftmax(dim=1)
x = torch.randn(2, 3)
y = m(x)
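
Since y holds log-probabilities, exponentiating it recovers softmax values that sum to one along the chosen dimension. The check below is a sketch that assumes the tensor exposes PyTorch-style exp and sum methods:

probs = y.exp()      # Back to softmax probabilities.
print(probs.sum(1))  # Each row should sum to ~1.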

__init__

LogSoftmax.__init__(dim)[source]

Create a LogSoftmax module.

Parameters:
  • dim (int) – The dimension to reduce.
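
For reference, the formula above can be implemented directly as x minus its log-sum-exp along the given dimension. The sketch below is an illustrative NumPy version, not part of this module; the helper name log_softmax_ref and the use of NumPy are assumptions for illustration only. It subtracts the per-row maximum before normalizing, a standard trick for numerical stability.

import numpy as np

def log_softmax_ref(x, dim):
    # log(exp(x_i) / sum_j exp(x_j)) along the given dimension,
    # computed as x - logsumexp(x) with a max-shift for stability.
    shifted = x - np.max(x, axis=dim, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=dim, keepdims=True))

x = np.random.randn(2, 3)
print(log_softmax_ref(x, dim=1))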