elu

dragon.nn.elu(
  inputs,
  alpha=1.0,
  **kwargs
)

Apply the exponential linear unit (ELU). [Clevert et al., 2015].

The ELU function is defined as:

\[\text{ELU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \alpha \cdot (\exp(x) - 1), & \text{otherwise} \end{cases}\]
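
For intuition, here is a minimal NumPy sketch of this piecewise definition (an illustrative reference only, not Dragon's implementation; the name elu_reference is hypothetical):

import numpy as np

def elu_reference(x, alpha=1.0):
    # Identity for x >= 0; exponential saturation toward -alpha otherwise.
    x = np.asarray(x, dtype=np.float32)
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

print(elu_reference([-1.0, 0.0, 1.0]))  # approx [-0.632, 0., 1.]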

Examples:

import dragon

x = dragon.constant([-1, 0, 1], 'float32')
# Default alpha=1.0; maps [-1, 0, 1] to approx [-0.632, 0., 1.]
print(dragon.nn.elu(x, inplace=False))

Parameters:
  • inputs (dragon.Tensor) – The input tensor.
  • alpha (float, optional, default=1.) – The value of \(\alpha\).
Returns:

dragon.Tensor – The output tensor.
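
A brief sketch of how alpha scales the negative regime (the chosen value 2.0 is illustrative; negatives now saturate toward -2 instead of -1):

import dragon

x = dragon.constant([-1, 0, 1], 'float32')
print(dragon.nn.elu(x, alpha=2.0))  # approx [-1.264, 0., 1.]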