selu
dragon.nn.selu(inputs, alpha=1.67326, gamma=1.0507, inplace=False, **kwargs)[source]
Apply the scaled exponential linear unit [Klambauer et al., 2017].
The SELU function is defined as:
\[\text{SELU}(x) = \gamma * \begin{cases} x, & \text{ if } x \geq 0 \\ \alpha * (\exp(x) - 1), & \text{ otherwise } \end{cases}\]
Examples:
x = dragon.constant([-1., 0., 1.])
print(dragon.nn.selu(x))
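For reference, the values produced by this example can be checked against a plain NumPy transcription of the formula above. The `selu_reference` helper below is illustrative only and is not part of the Dragon API.

import numpy as np

def selu_reference(x, alpha=1.67326, gamma=1.0507):
    """Elementwise SELU: gamma * (x if x >= 0 else alpha * (exp(x) - 1))."""
    x = np.asarray(x, dtype=np.float64)
    return gamma * np.where(x >= 0, x, alpha * (np.exp(x) - 1.))

print(selu_reference([-1., 0., 1.]))  # approx. [-1.1113  0.      1.0507]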
- Parameters:
- inputs (dragon.Tensor) – The input tensor.
- alpha (float, optional, default=1.67326) – The value of \(\alpha\).
- gamma (float, optional, default=1.0507) – The value of \(\gamma\).
- inplace (bool, optional, default=False) – Whether to perform the operation in place rather than returning a new tensor.
- Returns:
dragon.Tensor – The output tensor.
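When inplace=True, the parameter description indicates the result is written back to the input rather than returned in a newly allocated tensor. A minimal sketch of that call, assuming the same eager-mode setup as the example above:

x = dragon.constant([-1., 0., 1.])
# Per the ``inplace`` flag, reuse the input buffer instead of allocating a new tensor.
y = dragon.nn.selu(x, inplace=True)
print(y)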