relu¶
dragon.vm.tensorflow.keras.activations.relu(x, alpha=0, max_value=None, **kwargs)[source]¶
Apply the rectified linear unit to input [Nair & Hinton, 2010].
The ReLU function is defined as:
\[\text{ReLU}(x) = \begin{cases} \min(x, v_{max}), & \text{ if } x \geq 0 \\ \alpha * x, & \text{ otherwise} \end{cases}\]
Examples:
import dragon.vm.tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.activations.relu(x, inplace=False))
- Parameters:
- x (dragon.Tensor) – The input tensor.
- alpha (number, optional, default=0) – The value of \(\alpha\).
- max_value (number, optional) – The value of \(v_{max}\).
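To illustrate how alpha and max_value change the output, here is a minimal sketch; it assumes the frontend is imported as dragon.vm.tensorflow and that tensors print eagerly, as in the example above:

import dragon.vm.tensorflow as tf

x = tf.constant([-2., -1., 0., 3., 8.], 'float32')

# Default: negative inputs are zeroed, positive inputs pass through.
print(tf.keras.activations.relu(x))             # [0., 0., 0., 3., 8.]

# alpha=0.2 applies a leaky slope to negative inputs.
print(tf.keras.activations.relu(x, alpha=0.2))  # [-0.4, -0.2, 0., 3., 8.]

# max_value=6 clips positive outputs at v_max.
print(tf.keras.activations.relu(x, max_value=6))  # [0., 0., 0., 3., 6.]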