LayerNorm
- class dragon.vm.torch.nn.LayerNorm(normalized_shape, eps=1e-05, elementwise_affine=True)
Apply layer normalization. [Ba et al., 2016]
The normalization is defined as:
\[y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\]
Examples:
x = torch.randn(2, 3, 4)
m = torch.nn.LayerNorm(x.size()[1:])
y = m(x)
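As a rough sanity check of the formula above, the output can be recomputed by hand; this is a minimal sketch assuming dragon's PyTorch-compatible mean() and tensor arithmetic (the import alias is illustrative):

import dragon.vm.torch as torch

x = torch.randn(2, 3, 4)
m = torch.nn.LayerNorm(x.size()[1:])  # gamma = 1, beta = 0 at initialization
y = m(x)

# Statistics are taken over the normalized (trailing) dimensions, here dims 1 and 2.
mean = x.mean((1, 2), keepdim=True)
var = ((x - mean) ** 2).mean((1, 2), keepdim=True)  # biased variance
y_ref = (x - mean) / (var + 1e-5) ** 0.5  # matches y up to numerical error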
__init__
LayerNorm.__init__(normalized_shape, eps=1e-05, elementwise_affine=True)
Create a LayerNorm module.
- Parameters:
- normalized_shape (Union[int, Sequence[int]]) – The size of the last dimensions to normalize over (see the sketch after this list).
- eps (float, optional, default=1e-5) – The value of \(\epsilon\) added to the variance.
- elementwise_affine (bool, optional, default=True) – True to apply an affine transformation.
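A brief sketch of the two accepted forms of normalized_shape, plus disabling the affine transform; the shapes are illustrative and the calls assume the PyTorch-compatible constructor documented above:

import dragon.vm.torch as torch

x = torch.randn(2, 3, 4)

# An int normalizes over the last dimension only.
m1 = torch.nn.LayerNorm(4)
y1 = m1(x)

# A sequence normalizes over the same number of trailing dimensions.
m2 = torch.nn.LayerNorm((3, 4))
y2 = m2(x)

# elementwise_affine=False omits the learnable gamma/beta parameters.
m3 = torch.nn.LayerNorm(4, elementwise_affine=False)
y3 = m3(x)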