ReLU

class dragon.vm.caffe.layers.ReLU(layer_param)[source]

Apply the rectified linear unit [Nair & Hinton, 2010].

The ReLU function is defined as:

\[\text{ReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ 0, & \text{ otherwise } \end{cases} \]
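The formula above can be sketched in NumPy (a minimal illustration, not Dragon's actual implementation; the `negative_slope` argument mirrors the layer's `relu_param.negative_slope`, which scales negative inputs instead of zeroing them):

```python
import numpy as np

def relu(x, negative_slope=0.0):
    # Elementwise ReLU; with negative_slope > 0 this is the "leaky" variant,
    # with the default 0.0 it matches the piecewise definition above.
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))          # [0. 0. 0. 1. 3.]
print(relu(x, 0.1))     # [-0.2  -0.05  0.    1.    3.  ]
```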

Examples:

layer {
    type: "ReLU"
    bottom: "conv2"
    top: "conv2/relu"
    relu_param {
        negative_slope: 0.0
    }
}