silu

dragon.vm.tensorflow.nn.silu(features)

Apply the sigmoid linear unit [Hendrycks & Gimpel, 2016].

The SiLU function is defined as:

\[\text{SiLU}(x) = x \cdot \frac{1}{1 + \exp(-x)} \]

Examples:

from dragon.vm import tensorflow as tf

x = tf.constant([-2.5, -1.0, 0.0, 1.0, 2.5])
print(tf.nn.silu(x))
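
For reference, the same values can be reproduced directly from the definition above. The following sketch uses plain NumPy (an assumption; it does not depend on Dragon) to evaluate x * sigmoid(x):

import numpy as np

# Evaluate SiLU(x) = x * sigmoid(x) from the definition above (NumPy only).
x = np.array([-2.5, -1.0, 0.0, 1.0, 2.5])
print(x / (1.0 + np.exp(-x)))  # approx. [-0.1896, -0.2689, 0., 0.7311, 2.3104]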
Parameters:

features (dragon.Tensor) – The input tensor.

Returns:

dragon.Tensor – The output tensor.