gelu

dragon.nn.gelu(inputs, approximate=False, **kwargs)

Apply the Gaussian error linear unit (GELU). [Hendrycks & Gimpel, 2016].
The GELU function is defined as:
\[\text{GELU}(x) = x \cdot \frac{1}{2}\left[1 + \text{erf}\left(x / \sqrt{2}\right)\right]\]
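With approximate=True, the erf term is typically replaced by the tanh approximation from the same paper (assumed here; verify against the implementation):

\[\text{GELU}(x) \approx \frac{1}{2}\,x\left[1 + \tanh\left(\sqrt{2/\pi}\left(x + 0.044715\,x^{3}\right)\right)\right]\]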
Examples:

x = dragon.constant([-1., 0., 1.])
print(dragon.nn.gelu(x))
- Parameters:
- inputs (dragon.Tensor) – The input tensor.
- approximate (bool, optional, default=False) – Whether to use the approximate (tanh-based) computation instead of the exact erf form.
- Returns:
dragon.Tensor – The output tensor.
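For reference, a minimal NumPy sketch of both variants. The helper names gelu_exact and gelu_tanh are hypothetical, not part of dragon.nn, and the approximate path assumes the tanh form shown above:

import math
import numpy as np

def gelu_exact(x):
    # Exact form: x * 0.5 * (1 + erf(x / sqrt(2)))
    return np.array([v * 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in x])

def gelu_tanh(x):
    # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))

x = np.array([-1.0, 0.0, 1.0])
print(gelu_exact(x))  # approx. [-0.1587, 0., 0.8413]
print(gelu_tanh(x))   # approx. [-0.1588, 0., 0.8412]

The two variants agree closely near zero; the tanh form trades a small amount of accuracy for cheaper elementwise operations.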