LeakyReLU
class dragon.vm.torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)

Apply the leaky rectified linear unit.
The LeakyReLU function is defined as:

\[\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{ otherwise } \end{cases}\]

Examples:
from dragon.vm import torch

m = torch.nn.LeakyReLU()
x = torch.randn(2, 3)
y = m(x)
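As a quick sanity check of the piecewise definition above, the module output can be compared against an element-wise computation. This is a minimal sketch, assuming dragon.vm.torch mirrors the standard torch.tensor and torch.where APIs:

from dragon.vm import torch

# A slope of 0.1 makes the negative branch easy to see.
m = torch.nn.LeakyReLU(negative_slope=0.1)
x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

# Manual piecewise form: x if x >= 0, else negative_slope * x.
expected = torch.where(x >= 0, x, 0.1 * x)

print(m(x))      # [-0.2, -0.05, 0.0, 1.5]
print(expected)  # matches the module output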