Dragon v0.3.0 API

Softmax

class dragon.vm.torch.nn.Softmax(dim, inplace=False)

Apply the softmax function.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\]

Examples:

m = torch.nn.Softmax(dim=1)
x = torch.randn(2, 3)
y = m(x)

See also: torch.nn.functional.softmax(…)

__init__

Softmax.__init__(dim, inplace=False)

Create a Softmax module.

Parameters:
- dim (int) – The dimension along which softmax is computed.
- inplace (bool, optional, default=False) – Whether to do the operation in-place.
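The formula above can be sketched in plain NumPy (an illustration of the math only, not Dragon's implementation; the max-subtraction trick is a standard stability measure, not stated in this page):

```python
import numpy as np

def softmax(x, dim=-1):
    # Subtract the per-slice max for numerical stability; this leaves the
    # result unchanged because Softmax(x) == Softmax(x + c) for any constant c.
    e = np.exp(x - np.max(x, axis=dim, keepdims=True))
    # Normalize so that each slice along `dim` sums to 1.
    return e / np.sum(e, axis=dim, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [0.5, 0.5, 0.5]])
y = softmax(x, dim=1)
print(y.sum(axis=1))  # each row sums to 1
```

Note that softmax normalizes along the chosen dimension: with dim=1 each row becomes a probability distribution, which mirrors how the module's dim parameter is used in the example above.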