Relu
class dragon.vm.tensorlayer.layers.Relu(inplace=False, name=None)

Layer to apply the rectified linear unit. [Nair & Hinton, 2010].
The ReLU function is defined as:

\[\text{ReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ 0, & \text{otherwise} \end{cases}\]

Examples:
import dragon.vm.tensorlayer as tl

x = tl.layers.Input([10, 5])
y = tl.layers.Relu()(x)
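As a quick sanity check of the definition above, here is a minimal NumPy sketch (the name relu_reference is illustrative, not part of the Dragon API):

import numpy as np

# Reference implementation of the piecewise definition above:
# ReLU(x) = x if x >= 0, else 0.
def relu_reference(x):
    return np.where(x >= 0, x, 0)

print(relu_reference(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]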