PReLU

class dragon.vm.torch.nn.PReLU(
  num_parameters=1,
  init=0.25
)[source]

Apply the parametric rectified linear unit. [He et al., 2015].

The PReLU function is defined as:

\[\text{PReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \text{weight} * x, & \text{otherwise} \end{cases}\]

Examples:

import dragon.vm.torch as torch

# Use a single parameter to scale all channels
# Typically known as the ``channel-shared`` style
m = torch.nn.PReLU(num_parameters=1)
x = torch.randn(2, 3)
y = m(x)

# Use a different parameter for each channel
# Here dim 1 of ``x`` provides the 3 channels
mm = torch.nn.PReLU(num_parameters=3)
z = mm(x)
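
The channel-wise style also applies to higher-dimensional inputs. A minimal sketch, assuming dim 1 is treated as the channel axis as in PyTorch:

# Channel-wise scaling on a 4-d input of shape (N, C, H, W)
# ``num_parameters`` should match the size of dim 1 (here, C = 3)
x4 = torch.randn(2, 3, 4, 4)
m4 = torch.nn.PReLU(num_parameters=3)
y4 = m4(x4)  # same shape as ``x4``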

__init__

PReLU.__init__(
  num_parameters=1,
  init=0.25
)[source]

Create a PReLU module.

Parameters:
  • num_parameters (int, optional, default=1) The number of parameters, either 1 or the number of input channels.
  • init (float, optional, default=0.25) The initial value of the parameters.
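
As a quick sketch of how ``init`` takes effect (assuming the learned slope is exposed as a ``weight`` parameter, as in PyTorch):

# ``init`` sets the starting value of the learnable slope
m = torch.nn.PReLU(num_parameters=1, init=0.1)
print(m.weight)  # one learnable parameter, initialized to 0.1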