ReLU
class dragon.vm.tensorflow.keras.layers.ReLU(max_value=None, negative_slope=0, **kwargs)[source]
Layer to apply the rectified linear unit. [Nair & Hinton, 2010].
The ReLU function is defined as:
\[\text{ReLU}(x) = \begin{cases} \min(x, v_{max}), & \text{ if } x \geq 0 \\ \alpha * x, & \text{ otherwise } \end{cases}\]
where \(v_{max}\) corresponds to max_value and \(\alpha\) to negative_slope.

Examples:
# The TensorFlow-compatible API is exposed under dragon.vm.
from dragon.vm import tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.layers.ReLU(inplace=False)(x))  # [0., 0., 1.]
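The max_value and negative_slope arguments from the signature above map onto \(v_{max}\) and \(\alpha\) in the definition. A minimal sketch combining both (variable names are illustrative; the expected values follow from the piecewise formula):

from dragon.vm import tensorflow as tf

# Scale negative inputs by 0.1 and clip positive inputs at 2,
# following the piecewise definition above.
x = tf.constant([-2., -1., 0., 1., 3.], 'float32')
leaky_clipped = tf.keras.layers.ReLU(max_value=2., negative_slope=0.1)
print(leaky_clipped(x))  # expected: [-0.2, -0.1, 0., 1., 2.]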