Softmax

class dragon.vm.torch.nn.Softmax(
  dim,
  inplace=False
)[source]

Apply the softmax function.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})} \]
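The definition above can be sketched in plain NumPy (this is an illustrative sketch of the formula, not Dragon's implementation; the max is subtracted before exponentiating, a standard numerical-stability trick that leaves the result unchanged):

```python
import numpy as np

def softmax(x):
    # exp(x - c) / sum(exp(x - c)) == exp(x) / sum(exp(x)) for any constant c,
    # so subtracting the max avoids overflow without changing the output.
    e = np.exp(x - np.max(x))
    return e / e.sum()

y = softmax(np.array([1.0, 2.0, 3.0]))
print(y)            # larger inputs get larger probabilities
print(y.sum())      # the outputs sum to 1
```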

Examples:

import dragon.vm.torch as torch

m = torch.nn.Softmax(dim=1)
x = torch.randn(2, 3)
y = m(x)

__init__

Softmax.__init__(
  dim,
  inplace=False
)[source]

Create a Softmax module.

Parameters:
  • dim (int) – The dimension along which softmax is computed (every slice along this dimension sums to 1).
  • inplace (bool, optional, default=False) – Whether to do the operation in-place.
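To see what dim selects, here is a minimal NumPy sketch (an illustration of the parameter's meaning, not Dragon's implementation): with a (2, 3) input and dim=1, softmax normalizes each row, so every row of the output sums to 1.

```python
import numpy as np

def softmax(x, dim):
    # Normalize along the chosen axis; keepdims preserves shape for broadcasting.
    e = np.exp(x - x.max(axis=dim, keepdims=True))
    return e / e.sum(axis=dim, keepdims=True)

x = np.random.randn(2, 3)
y = softmax(x, dim=1)
print(y.sum(axis=1))  # each of the 2 rows sums to 1
```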