prelu

dragon.nn.prelu(
  inputs,
  data_format='NCHW',
  **kwargs
)[source]

Apply the parametric rectified linear unit. [He et al., 2015].

The PReLU function is defined as:

\[\text{PReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{weight} * x, & \text{ otherwise } \end{cases} \]

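For reference, the piecewise definition above can be written directly in NumPy. This is an illustrative sketch of the math only, not the dragon kernel:

import numpy as np

def prelu_reference(x, weight):
    # Keep non-negative values; scale negative values by the learned slope.
    return np.where(x >= 0, x, weight * x)
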
Examples:

import dragon

x = dragon.constant([[-1., 0., 1.]])
w = dragon.fill((3,), value=0.25, dtype=x.dtype)  # Learnable slope for negative values.
print(dragon.nn.prelu([x, w]))  # Negative entry scaled by 0.25: [[-0.25, 0., 1.]]

Parameters:
  • inputs (Sequence[dragon.Tensor]) The input and weight.
  • data_format (str, optional, default='NCHW') 'NCHW' or 'NHWC'.
Returns:
  dragon.Tensor The output tensor.
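
When the weight is learned per channel, data_format selects which axis of the input is treated as the channel axis. A hedged sketch, assuming a per-channel weight that broadcasts along the 'NCHW' channel axis:

import dragon

x = dragon.constant([[[[-1.]], [[2.]]]])            # Shape (1, 2, 1, 1): two channels.
w = dragon.fill((2,), value=0.1, dtype=x.dtype)     # One slope per channel (assumed layout).
print(dragon.nn.prelu([x, w], data_format='NCHW'))  # Channel 0: -1 * 0.1 = -0.1; channel 1: 2.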