gelu

dragon.vm.tensorflow.nn.gelu(
  features,
  approximate=False,
  name=None
)

Apply the Gaussian error linear unit. [Hendrycks & Gimpel, 2016].

The GELU function is defined as:

\[\text{GELU}(x) = x\cdot\frac{1}{2}[1 + \text{erf}(x / \sqrt{2})] \]
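
The approximate form replaces the erf term with a tanh-based expression from the same paper. As a minimal NumPy/SciPy sketch of both forms (illustrative only, not Dragon's implementation; the tanh form is what approximate=True conventionally selects in TensorFlow-style APIs):

import numpy as np
from scipy.special import erf

def gelu_exact(x):
    # Exact form: x * 0.5 * (1 + erf(x / sqrt(2)))
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))

x = np.array([-1.0, 0.0, 1.0])
print(gelu_exact(x))  # approx. [-0.1587  0.      0.8413]
print(gelu_tanh(x))   # nearly identical values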

Examples:

import dragon.vm.tensorflow as tf

x = tf.constant([-1., 0., 1.])
print(tf.nn.gelu(x))

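The same call with approximate=True (assuming it selects the tanh approximation, as in TensorFlow's tf.nn.gelu) yields values close to the exact form:

print(tf.nn.gelu(x, approximate=True))
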
Parameters:
  • features (dragon.Tensor) – The input tensor.
  • approximate (bool, optional, default=False) – Whether to use the approximate (tanh-based) computation.
  • name (str, optional) – The operation name.
Returns:

dragon.Tensor – The output tensor.