LogSoftmax
- class dragon.vm.torch.nn.LogSoftmax(dim, inplace=False)
- Apply the composite of logarithm and softmax. The LogSoftmax function is defined as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\right)\]

- Examples:

```python
m = torch.nn.LogSoftmax(dim=1)
x = torch.randn(2, 3)
y = m(x)
```
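The sketch below illustrates the formula above: applying LogSoftmax is equivalent to taking the softmax along `dim` and then the logarithm, and exponentiating the output recovers a distribution that sums to 1 along that dimension. It is written against the standard PyTorch API, which `dragon.vm.torch` mirrors; the `torch.allclose` check is assumed to be run with stock PyTorch.

```python
# Minimal sketch (assumes stock PyTorch): LogSoftmax(x) matches log(softmax(x)),
# computed here along dim=1.
import torch

x = torch.randn(2, 3)

log_softmax = torch.nn.LogSoftmax(dim=1)
softmax = torch.nn.Softmax(dim=1)

y = log_softmax(x)             # fused log-softmax along dim=1
y_ref = torch.log(softmax(x))  # naive composition of log and softmax

print(torch.allclose(y, y_ref, atol=1e-6))  # True, up to float tolerance

# exp(LogSoftmax(x)) == Softmax(x), so each row sums to 1.
print(y.exp().sum(dim=1))  # tensor([1., 1.])
```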
