# batch_norm

```python
dragon.vm.torch.nn.functional.batch_norm(
    input,
    running_mean,
    running_var,
    weight,
    bias,
    training=False,
    momentum=0.1,
    eps=1e-05
)
```

Apply batch normalization to input [Ioffe & Szegedy, 2015].

The normalization is defined as:

$y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta$

The moving averages of the batch statistics are updated as:

$x_{\text{moving}} \leftarrow (1 - \text{momentum}) * x_{\text{moving}} + \text{momentum} * x_{\text{stat}}$
Parameters:

- **input** (dragon.vm.torch.Tensor) – The input tensor.
- **running_mean** (dragon.vm.torch.Tensor) – The running mean tensor.
- **running_var** (dragon.vm.torch.Tensor) – The running variance tensor.
- **weight** (dragon.vm.torch.Tensor) – The weight tensor ($\gamma$).
- **bias** (dragon.vm.torch.Tensor) – The bias tensor ($\beta$).
- **training** (bool, optional, default=False) – Compute the batch statistics instead of using the running ones.
- **momentum** (float, optional, default=0.1) – The momentum of the moving average.
- **eps** (float, optional, default=1e-05) – The value added to the variance for numerical stability.

Returns:

dragon.vm.torch.Tensor – The output tensor.
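To make the two formulas above concrete, here is a minimal NumPy reference sketch of the same computation. It is not the Dragon implementation; the function name and the channels-at-axis-1 layout are illustrative assumptions.

```python
import numpy as np

def batch_norm_ref(x, running_mean, running_var, weight, bias,
                   training=False, momentum=0.1, eps=1e-5):
    """Reference batch normalization with channels on axis 1 (illustrative)."""
    # Reduce over every axis except the channel axis.
    axes = (0,) + tuple(range(2, x.ndim))
    if training:
        mean = x.mean(axis=axes)
        var = x.var(axis=axes)
        # x_moving <- (1 - momentum) * x_moving + momentum * x_stat
        running_mean *= 1 - momentum
        running_mean += momentum * mean
        running_var *= 1 - momentum
        running_var += momentum * var
    else:
        mean, var = running_mean, running_var
    # Broadcast the per-channel statistics back over the input shape.
    shape = (1, -1) + (1,) * (x.ndim - 2)
    y = (x - mean.reshape(shape)) / np.sqrt(var.reshape(shape) + eps)
    # y = normalized(x) * gamma + beta
    return y * weight.reshape(shape) + bias.reshape(shape)
```

In training mode the batch statistics normalize the input and the running buffers are updated in place; in inference mode the running statistics are used directly, matching `training=False` being the default in the signature above.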