prelu

dragon.vm.torch.nn.functional.prelu(
  input,
  weight
)[source]

Apply the parametric rectified linear unit to input. [He et al., 2015].

The PReLU function is defined as:

\[\text{PReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{weight} * x, & \text{ otherwise } \end{cases}\]
Parameters:

- **input** (dragon.vm.torch.Tensor) – The input tensor.
- **weight** (dragon.vm.torch.Tensor) – The weight of negative slope.

Returns:

dragon.vm.torch.Tensor – The output tensor.
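The piecewise definition above can be sketched with plain NumPy (a minimal reference implementation for illustration, not the Dragon kernel itself; the function name and scalar `weight` are assumptions for the example):

```python
import numpy as np

def prelu_ref(x, weight):
    # PReLU(x) = x if x >= 0, else weight * x
    x = np.asarray(x, dtype=np.float64)
    return np.where(x >= 0, x, weight * x)

out = prelu_ref([-2.0, -0.5, 0.0, 3.0], weight=0.25)
# → [-0.5, -0.125, 0.0, 3.0]
```

In the actual API, `weight` is a tensor so the negative slope can be learned per channel rather than fixed as in leaky ReLU.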