selu¶
dragon.vm.tensorflow.keras.activations.selu(
 x,
 **kwargs
 )[source]¶
Apply the scaled exponential linear unit to input. [Klambauer et al., 2017].

\[\text{SELU}(x) = 1.0507 * \begin{cases} x, & \text{ if } x \geq 0 \\ 1.67326 * (\exp(x) - 1), & \text{ otherwise } \end{cases}\]

Examples:

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.activations.selu(x, inplace=False))

- Parameters:
- x (dragon.Tensor) – The input tensor.
 
 - Returns:
- dragon.Tensor – The output tensor. 
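
As a rough cross-check of the formula above, the snippet below reproduces it in plain NumPy; the selu_reference helper is a hypothetical sketch for illustration, not part of the Dragon API, and the constants 1.0507 and 1.67326 follow the definition given above.

import numpy as np

def selu_reference(x, scale=1.0507, alpha=1.67326):
    # Hypothetical reference implementation of the SELU formula above,
    # not Dragon's actual kernel.
    x = np.asarray(x, dtype='float32')
    return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1))

print(selu_reference([-1, 0, 1]))  # approx. [-1.1113, 0.0, 1.0507]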
 
