elu

dragon.vm.tensorflow.keras.activations.elu(
  x,
  alpha=1.0,
  **kwargs
)

Apply the exponential linear unit to the input. [Clevert et al., 2015].

The ELU function is defined as:

\[\text{ELU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \alpha(\exp(x) - 1), & \text{otherwise} \end{cases}\]
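
For reference, the piecewise definition can be written out directly. The NumPy sketch below only illustrates the math; it is not Dragon's implementation, and the helper name elu_reference is hypothetical:

import numpy as np

def elu_reference(x, alpha=1.0):
    # Piecewise ELU: identity for x >= 0, alpha * (exp(x) - 1) otherwise.
    x = np.asarray(x, dtype='float32')
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1))

print(elu_reference([-1, 0, 1]))  # approx. [-0.6321, 0., 1.]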

Examples:

import dragon.vm.tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
# ELU(-1) = exp(-1) - 1 ≈ -0.632, ELU(0) = 0, ELU(1) = 1.
print(tf.keras.activations.elu(x, inplace=False))

Parameters:
  • x (dragon.Tensor) The input tensor.
  • alpha (float, optional, default=1.0) The value of \(\alpha\).
Returns:

dragon.Tensor The output tensor.
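
As a further usage sketch, the alpha argument scales the negative branch of the function; the snippet below assumes the usual import dragon.vm.tensorflow as tf:

import dragon.vm.tensorflow as tf

x = tf.constant([-1, 0, 1], 'float32')
# With alpha=0.5, the negative branch is 0.5 * (exp(x) - 1),
# so ELU(-1) ≈ -0.316; non-negative inputs pass through unchanged.
print(tf.keras.activations.elu(x, alpha=0.5))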