silu¶
dragon.vm.torch.nn.functional.silu(input)[source]¶
Apply the sigmoid linear unit to input. [Hendrycks & Gimpel, 2016].

The SiLU function is defined as:

\[\text{SiLU}(x) = x \cdot \frac{1}{1 + \exp(-x)}\]

Parameters:
- input (dragon.vm.torch.Tensor) – The input tensor.
 
Returns:
dragon.vm.torch.Tensor – The output tensor.
See also: torch.nn.SiLU
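
For example, the function is applied elementwise (a minimal usage sketch; the tensor constructor is assumed to mirror PyTorch's, as elsewhere in Dragon's torch-compatible API):

from dragon.vm import torch

x = torch.tensor([-1.0, 0.0, 1.0])
y = torch.nn.functional.silu(x)  # elementwise x * sigmoid(x)
print(y)  # approximately [-0.2689, 0.0000, 0.7311]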
