batch_normalization

dragon.vm.tensorflow.nn.batch_normalization(
  x,
  moving_mean,
  moving_variance,
  offset,
  scale,
  axis=-1,
  momentum=0.9,
  variance_epsilon=1e-05,
  trainable=False,
  name=None
)

Apply batch normalization [Ioffe & Szegedy, 2015].

The normalization is defined as:

\[y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta \]

The moving averages of the statistics are calculated as:

\[x_{moving} \leftarrow momentum * x_{moving} + (1 - momentum) * x_{stat} \]
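The following is a small NumPy sketch (not Dragon code) that mirrors the two formulas above; the array shapes and initial moving statistics are illustrative assumptions:

```python
import numpy as np

x = np.random.randn(8, 4).astype('float32')   # batch of 8 samples, 4 channels
gamma, beta = np.ones(4), np.zeros(4)          # scale and offset
eps, momentum = 1e-5, 0.9

# Per-channel E[x] and Var[x], then the normalization formula.
mean, var = x.mean(axis=0), x.var(axis=0)
y = (x - mean) / np.sqrt(var + eps) * gamma + beta

# Moving-average update applied to the running statistics.
moving_mean, moving_var = np.zeros(4), np.ones(4)
moving_mean = momentum * moving_mean + (1 - momentum) * mean
moving_var = momentum * moving_var + (1 - momentum) * var
```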
Parameters:
  • x (dragon.Tensor) – The input tensor.
  • moving_mean (dragon.Tensor) – The moving mean.
  • moving_variance (dragon.Tensor) – The moving variance.
  • offset (dragon.Tensor) – The \(\beta\) tensor.
  • scale (dragon.Tensor) – The \(\gamma\) tensor.
  • axis (int, optional, default=-1) – The channel axis.
  • momentum (float, optional, default=0.9) – The momentum for the moving average.
  • variance_epsilon (float, optional, default=1e-5) – The value of epsilon.
  • trainable (bool, optional, default=False) – The training mode flag.
  • name (str, optional) – The operation name.
Returns:

dragon.Tensor – The output tensor.
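Below is a minimal usage sketch. It assumes the tensor-creation helper tf.constant mirrors the TensorFlow API in this virtual machine; the shapes and values are illustrative only:

```python
import numpy as np
from dragon.vm import tensorflow as tf

# Input with 4 channels on the last axis, plus per-channel statistics.
x = tf.constant(np.random.randn(2, 3, 4).astype('float32'))
moving_mean = tf.constant(np.zeros(4, dtype='float32'))
moving_variance = tf.constant(np.ones(4, dtype='float32'))
offset = tf.constant(np.zeros(4, dtype='float32'))  # beta
scale = tf.constant(np.ones(4, dtype='float32'))    # gamma

# Normalize over the channel axis using the moving statistics.
y = tf.nn.batch_normalization(
    x,
    moving_mean,
    moving_variance,
    offset,
    scale,
    axis=-1,
    variance_epsilon=1e-5,
)
```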