SiLU
class dragon.vm.torch.nn.SiLU

Apply the sigmoid linear unit. [Hendrycks & Gimpel, 2016].
The SiLU function is defined as:
\[\text{SiLU}(x) = x \cdot \frac{1}{1 + \exp(-x)}\]

Examples:
m = torch.nn.SiLU()
x = torch.randn(2, 3)
y = m(x)
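As a sanity check, the module's output can be compared against the definition above computed directly; a minimal sketch, assuming dragon.vm.torch exposes torch.sigmoid as in the standard PyTorch-style API:

import dragon.vm.torch as torch

m = torch.nn.SiLU()
x = torch.randn(2, 3)

# SiLU(x) = x * sigmoid(x); the module and the formula should agree.
y_module = m(x)
y_manual = x * torch.sigmoid(x)  # assumes torch.sigmoid is available in this API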