ReLU

class dragon.vm.tensorflow.keras.layers.ReLU(
  max_value=None,
  negative_slope=0,
  **kwargs
)[source]

Layer to apply the rectified linear unit. [Nair & Hinton, 2010].

The ReLU function is defined as:

\[\text{ReLU}(x) = \begin{cases} \min(x, v_{max}), & \text{ if } x \geq 0 \\ \alpha * x, & \text{ otherwise } \end{cases} \]

Examples:

x = tf.constant([-1, 0, 1], 'float32')
print(tf.keras.layers.ReLU(inplace=False)(x))  # [0., 0., 1.]
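To see the effect of negative_slope in the formula above, here is a minimal sketch; it assumes, as in the example above, that tf refers to the dragon.vm.tensorflow module:

x = tf.constant([-2, 0, 2], 'float32')
print(tf.keras.layers.ReLU(negative_slope=0.2)(x))  # negatives scaled by 0.2: [-0.4, 0., 2.]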

__init__

ReLU.__init__(
  max_value=None,
  negative_slope=0,
  **kwargs
)[source]

Create a ReLU layer.

Parameters:
  • max_value (number, optional) The value of \(v_{max}\).
  • negative_slope (float, optional, default=0.) The value of \(\alpha\).
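
For illustration, a capped unit in the style of ReLU6 can be obtained by setting max_value; this is a minimal sketch under the same assumption that tf aliases dragon.vm.tensorflow:

x = tf.constant([-1, 3, 8], 'float32')
print(tf.keras.layers.ReLU(max_value=6)(x))  # positives clipped at 6: [0., 3., 6.]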