silu

dragon.nn.silu(
  inputs,
  **kwargs
)[source]

Apply the sigmoid linear unit [Hendrycks & Gimpel, 2016].

The SiLU function is defined as:

\[\text{SiLU}(x) = x \cdot \frac{1}{1 + \exp(-x)} \]
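
As a quick sanity check of this definition, here is a minimal NumPy sketch of the same formula (NumPy is used purely for illustration here and is an assumption, not part of the Dragon API):

import numpy as np

def silu_reference(x):
    # Reference SiLU: x scaled by the logistic sigmoid of x.
    return x * (1.0 / (1.0 + np.exp(-x)))

x = np.array([-2.5, -1.0, 0.0, 1.0, 2.5])
print(silu_reference(x))  # e.g. silu(0) == 0, silu(1) ≈ 0.7311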

Examples:

import dragon

x = dragon.constant([-2.5, -1.0, 0.0, 1.0, 2.5])
print(dragon.nn.silu(x))
Parameters:

inputs (dragon.Tensor) – The input tensor.

Returns:

dragon.Tensor – The output tensor.