BatchNorm

class dragon.vm.tensorlayer.layers.BatchNorm(
  decay=0.9,
  epsilon=1e-05,
  act=None,
  beta_init='zeros',
  gamma_init='ones',
  moving_mean_init='zeros',
  moving_var_init='ones',
  num_features=None,
  data_format='channels_first',
  name=None
)

Layer that applies batch normalization to its input. [Ioffe & Szegedy, 2015].

Examples:

x = tl.layers.Input([None, 32, 50, 50])
y = tl.layers.BatchNorm()(x)

__init__

BatchNorm.__init__(
  decay=0.9,
  epsilon=1e-05,
  act=None,
  beta_init='zeros',
  gamma_init='ones',
  moving_mean_init='zeros',
  moving_var_init='ones',
  num_features=None,
  data_format='channels_first',
  name=None
)

Create a BatchNorm layer.

Parameters:
  • decay (float, optional, default=0.9) – The decay factor for moving average.
  • epsilon (float, optional, default=1e-5) – The small value added to the variance to avoid dividing by zero.
  • act (callable, optional) – The optional activation function.
  • beta_init (Union[callable, str], optional) – The initializer for beta.
  • gamma_init (Union[callable, str], optional) – The initializer for gamma.
  • moving_mean_init (Union[callable, str], optional) – The initializer for moving_mean.
  • moving_var_init (Union[callable, str], optional) – The initializer for moving_var.
  • num_features (int, optional) – The number of input features.
  • data_format ({'channels_first', 'channels_last'}, optional) – The position of the channel axis in the input.
  • name (str, optional) – The optional layer name.
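To make the roles of decay and epsilon concrete, here is a minimal NumPy sketch of one training step of batch normalization over the batch axis. This is an illustrative reference only, not the layer's actual implementation (which runs a fused kernel and handles 4D channels_first/channels_last inputs); the function name batch_norm_train is hypothetical.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, moving_mean, moving_var,
                     decay=0.9, epsilon=1e-5):
    """Illustrative batch-norm step over axis 0 (not the real kernel)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize with batch statistics; epsilon guards against zero variance.
    y = gamma * (x - mean) / np.sqrt(var + epsilon) + beta
    # Blend batch statistics into the moving averages using the decay factor.
    moving_mean = decay * moving_mean + (1 - decay) * mean
    moving_var = decay * moving_var + (1 - decay) * var
    return y, moving_mean, moving_var

x = np.array([[1.0, 2.0], [3.0, 4.0]])
y, mm, mv = batch_norm_train(x, gamma=np.ones(2), beta=np.zeros(2),
                             moving_mean=np.zeros(2), moving_var=np.ones(2))
```

At inference time the layer instead normalizes with the accumulated moving_mean and moving_var, which is why their initializers ('zeros' and 'ones') matter before any training has occurred.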