relu

dragon.nn.relu(
  inputs,
  **kwargs
)[source]

Apply the rectified linear unit [Nair & Hinton, 2010].

The ReLU function is defined as:

\[\text{ReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ 0, & \text{ otherwise } \end{cases} \]
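Equivalently, ReLU(x) = max(0, x). As a minimal sketch of this definition outside of Dragon, the plain NumPy snippet below (the relu helper is hypothetical, written only for illustration) checks the piecewise form against the closed form:

import numpy as np

def relu(x):
    # Piecewise definition: keep x where x >= 0, clamp to 0 otherwise.
    return np.where(x >= 0, x, 0)

x = np.array([-1, 0, 1], dtype='float32')
print(relu(x))           # [0. 0. 1.]
print(np.maximum(0, x))  # closed form, identical result: [0. 0. 1.]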

Examples:

x = dragon.constant([-1, 0, 1], 'float32')
print(dragon.nn.relu(x, inplace=False))
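Assuming eager execution, the call above should print a float32 tensor with values [0., 0., 1.]: the negative entry is clamped to zero while non-negative entries pass through unchanged.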

Parameters:

inputs (dragon.Tensor) – The input tensor.

Returns:

dragon.Tensor – The output tensor.