gelu

dragon.vm.torch.nn.functional.gelu(input, approximate='none')
Apply the Gaussian error linear unit to input. [Hendrycks & Gimpel, 2016].

The GELU function is defined as:

\[\text{GELU}(x) = x \cdot \frac{1}{2}\left[1 + \text{erf}(x / \sqrt{2})\right]\]

Parameters:
- input (dragon.vm.torch.Tensor) – The input tensor.
- approximate (str, optional, default='none') – The approximation algorithm to use.
 
Returns:
- dragon.vm.torch.Tensor – The output tensor.
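A minimal usage sketch, assuming Dragon's torch-style API mirrors PyTorch's tensor creation and that approximate='tanh' selects the common tanh-based approximation, \(\frac{1}{2}x\left[1 + \tanh\!\left(\sqrt{2/\pi}\,(x + 0.044715x^{3})\right)\right]\); check the Dragon source if unsure:

```python
import dragon.vm.torch as torch
import dragon.vm.torch.nn.functional as F

# A small example tensor (assumes Dragon mirrors torch.tensor()).
x = torch.tensor([-1.0, 0.0, 1.0])

# Exact GELU: x * 0.5 * (1 + erf(x / sqrt(2))).
y = F.gelu(x)

# Tanh-based approximation, assuming approximate='tanh' is
# accepted here as it is in PyTorch.
y_tanh = F.gelu(x, approximate='tanh')
```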
