Softmax

class dragon.vm.torch.nn.Softmax(
  dim=None,
  inplace=False
)[source]

Apply the softmax function.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{e^{x_{i}}}{\sum_{j} e^{x_{j}}} \]
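The formula above can be sketched in plain NumPy, independent of Dragon. Note the max-subtraction step is a standard numerical-stability trick that leaves the result unchanged (it cancels between numerator and denominator); it is an implementation detail, not part of the definition:

```python
import numpy as np

def softmax(x, dim=-1):
    # Shift by the max along ``dim`` for numerical stability;
    # e^{x - c} / sum(e^{x - c}) equals e^{x} / sum(e^{x}).
    z = x - x.max(axis=dim, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=dim, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [1.0, 1.0, 1.0]])
y = softmax(x, dim=1)  # each row sums to 1
```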

Examples:

import dragon.vm.torch as torch

m = torch.nn.Softmax(dim=1)
x = torch.randn(2, 3)
y = m(x)  # Each row of ``y`` sums to 1

__init__

Softmax.__init__(
  dim=None,
  inplace=False
)[source]

Create a Softmax module.

Parameters:
  • dim (int, required) – The dimension along which Softmax is computed.
  • inplace (bool, optional, default=False) – Whether to do the operation in-place.