Relu

class dragon.vm.tensorlayer.layers.Relu(
  inplace=False,
  name=None
)[source]

The layer to apply the rectified linear unit. [Nair & Hinton, 2010].

The ReLU function is defined as:

\[\text{ReLU}(x) = \max(0, x)\]
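For example, \(\text{ReLU}(-2) = 0\) and \(\text{ReLU}(3) = 3\): negative inputs are zeroed while non-negative inputs pass through unchanged.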

Examples:

x = tl.layers.Input([10, 5])
y = tl.layers.Relu()(x)

__init__

Relu.__init__(
  inplace=False,
  name=None
)[source]

Create a Relu layer.

Parameters:
  • inplace (bool, optional, default=False) – Whether to do the operation in-place.
  • name (str, optional) – The optional layer name.
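
A minimal end-to-end sketch, assuming the compatibility module is imported as tl (as in the example above) and that the layer is called on an input tensor in the usual TensorLayer style; the layer name 'relu1' is illustrative:

from dragon.vm import tensorlayer as tl

x = tl.layers.Input([10, 5])                         # input tensor of shape (10, 5)
relu = tl.layers.Relu(inplace=False, name='relu1')   # construct the layer
y = relu(x)                                          # apply ReLU element-wise to x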