LeakyReLU
class dragon.vm.tensorflow.keras.layers.LeakyReLU(alpha=0.3, **kwargs)

Layer to apply the leaky rectified linear unit. [Maas et al., 2013].
The LeakyReLU function is defined as:
\[\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \alpha * x, & \text{ otherwise } \end{cases}\]

Examples:
from dragon.vm import tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.layers.LeakyReLU(alpha=0.3)(x))  # [-0.3, 0., 1.]
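For intuition, the output can be checked against a plain NumPy version of the formula above (a minimal sketch; the helper name leaky_relu and the NumPy dependency are illustrative assumptions, not part of this API):

import numpy as np

def leaky_relu(x, alpha=0.3):
    # Elementwise: keep non-negative inputs, scale negative inputs by alpha.
    return np.where(x >= 0, x, alpha * x)

print(leaky_relu(np.array([-1, 0, 1], 'float32')))  # [-0.3  0.   1. ]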