PReLU
class dragon.vm.torch.nn.PReLU(num_parameters=1, init=0.25)

Apply the parametric rectified linear unit. [He et al., 2015].
The PReLU function is defined as:
\[\text{PReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{weight} * x, & \text{ otherwise} \end{cases}\]

Examples:
# Use a single parameter to scale all channels
# Typically known as the ``channel-shared`` style
m = torch.nn.PReLU(num_parameters=1)
x = torch.randn(2, 3)
y = m(x)

# Use a different parameter for each channel
mm = torch.nn.PReLU(num_parameters=3)
z = mm(x)
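As a sanity check, the piecewise definition above can be reproduced in plain Python. This is a minimal standalone sketch of the formula, not the library's implementation; the function name `prelu` and the scalar signature are illustrative only:

```python
def prelu(x, weight=0.25):
    # PReLU(x) = x if x >= 0, else weight * x
    # weight defaults to 0.25, matching the module's init=0.25
    return x if x >= 0 else weight * x

print(prelu(2.0))   # positive input passes through unchanged: 2.0
print(prelu(-2.0))  # negative input is scaled by weight: -0.5
```

With `num_parameters` greater than 1, the module instead learns one such `weight` per channel.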