leaky_relu

dragon.nn.leaky_relu(
  inputs,
  alpha=0.2,
  inplace=False,
  **kwargs
)
Apply the leaky rectified linear unit.

The LeakyReLU function is defined as:

\[\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \alpha * x, & \text{ otherwise } \end{cases}\]

Examples:

```python
x = dragon.constant([-1., 0., 1.])
print(dragon.nn.leaky_relu(x))  # [-0.2, 0., 1.] with the default alpha=0.2
```

A plain-NumPy sketch of this element-wise mapping is given after the parameter list below.

- Parameters:
- inputs (dragon.Tensor) – The input tensor.
- alpha (number, optional, default=0.2) – The value of \(\alpha\).
- inplace (bool, optional, default=False) – Call in-place or return a new tensor.
 
 - Returns:
- dragon.Tensor – The output tensor. 
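
As a sanity check on the definition above, the same mapping can be reproduced outside Dragon. A minimal sketch, assuming NumPy is available (`leaky_relu_ref` is a hypothetical reference helper, not part of Dragon's API):

```python
import numpy as np

def leaky_relu_ref(x, alpha=0.2):
    # Element-wise: keep x where x >= 0, otherwise scale it by alpha.
    return np.where(x >= 0, x, alpha * x)

print(leaky_relu_ref(np.array([-1., 0., 1.])))  # -> [-0.2  0.  1.]
```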
 

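A usage sketch combining the documented parameters; the values in the comments follow directly from the definition, and the in-place behavior is as described by the `inplace` flag above:

```python
x = dragon.constant([-3., -1., 2.])

# Custom negative slope: yields [-0.3, -0.1, 2.] with alpha=0.1.
y = dragon.nn.leaky_relu(x, alpha=0.1)

# inplace=True reuses the input tensor for the output
# instead of allocating a new one.
z = dragon.nn.leaky_relu(x, inplace=True)
```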