LeakyReLU

class dragon.vm.torch.nn.LeakyReLU(
  negative_slope=0.01,
  inplace=False
)[source]

Apply the leaky rectified linear unit.

The LeakyReLU function is defined as:

\[\text{LeakyReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{otherwise} \end{cases}\]
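For reference, the piecewise definition above can be sketched in NumPy (an illustrative sketch, not the library's implementation):

import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Elementwise: keep non-negative values, scale negative ones.
    return np.where(x >= 0, x, negative_slope * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # [-0.02  0.    3.  ]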

Examples:

from dragon.vm import torch

m = torch.nn.LeakyReLU()
x = torch.randn(2, 3)
y = m(x)

See also

torch.nn.functional.leaky_relu(…) - Apply the leaky rectified linear unit.

__init__

LeakyReLU.__init__(
  negative_slope=0.01,
  inplace=False
)[source]

Create a LeakyReLU module.

Parameters:
  • negative_slope (float, optional, default=0.01) – The slope of the negative side.
  • inplace (bool, optional, default=False) – Whether to do the operation in-place.
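
For example, setting inplace=True writes the activation back into the input buffer, which can save memory at the cost of the original values (a usage sketch; the slope value 0.2 is illustrative):

from dragon.vm import torch

m = torch.nn.LeakyReLU(negative_slope=0.2, inplace=True)
x = torch.randn(2, 3)
y = m(x)  # with inplace=True, the output reuses the storage of x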