BatchNormalization

class dragon.vm.tensorflow.keras.layers.BatchNormalization(
  axis=-1,
  momentum=0.99,
  epsilon=0.001,
  center=True,
  scale=True,
  beta_initializer='zeros',
  gamma_initializer='ones',
  moving_mean_initializer='zeros',
  moving_variance_initializer='ones',
  beta_regularizer=None,
  gamma_regularizer=None,
  name=None,
  **kwargs
)

The batch normalization layer. [Ioffe & Szegedy, 2015].
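For reference, the transformation the layer applies is the standard batch normalization of Ioffe & Szegedy (this summary restates the paper's definition; it is not quoted from Dragon's source):

```latex
y = \gamma \cdot \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} + \beta
```

During training, E[x] and Var[x] are the statistics of the current batch, and the running estimates are updated as moving_mean = momentum * moving_mean + (1 - momentum) * batch_mean (and likewise for the variance). At inference time the stored moving statistics are used instead of batch statistics.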

__init__

BatchNormalization.__init__(
  axis=-1,
  momentum=0.99,
  epsilon=0.001,
  center=True,
  scale=True,
  beta_initializer='zeros',
  gamma_initializer='ones',
  moving_mean_initializer='zeros',
  moving_variance_initializer='ones',
  beta_regularizer=None,
  gamma_regularizer=None,
  name=None,
  **kwargs
)

Create a BatchNormalization layer.

Parameters:
  • axis (int, optional, default=-1) – The channel axis.
  • momentum (float, optional, default=0.99) – The momentum for the moving averages of mean and variance.
  • epsilon (float, optional, default=1e-3) – The small value added to the variance for numerical stability.
  • center (bool, optional, default=True) – Whether to learn the shift beta; False freezes beta.
  • scale (bool, optional, default=True) – Whether to learn the scale gamma; False freezes gamma.
  • beta_initializer (Union[callable, str], optional) – The initializer for beta.
  • gamma_initializer (Union[callable, str], optional) – The initializer for gamma.
  • moving_mean_initializer (Union[callable, str], optional) – The initializer for moving_mean.
  • moving_variance_initializer (Union[callable, str], optional) – The initializer for moving_variance.
  • beta_regularizer (Union[callable, str], optional) – The regularizer for beta.
  • gamma_regularizer (Union[callable, str], optional) – The regularizer for gamma.
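The parameters above can be seen at work in a minimal NumPy sketch of the training-mode computation: normalize with batch statistics, apply gamma (scale) and beta (center), and fold the batch statistics into the moving averages with momentum. This illustrates the standard batch-norm recurrences, not Dragon's actual implementation.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, moving_mean, moving_var,
                     momentum=0.99, epsilon=1e-3, axis=-1):
    """Batch normalization in training mode (illustrative sketch)."""
    # Reduce over every axis except the channel axis.
    reduce_axes = tuple(i for i in range(x.ndim) if i != axis % x.ndim)
    mean = x.mean(axis=reduce_axes, keepdims=True)
    var = x.var(axis=reduce_axes, keepdims=True)
    # Normalize with the batch statistics, then scale and shift.
    y = gamma * (x - mean) / np.sqrt(var + epsilon) + beta
    # Update the running estimates in place, as the layer does per step.
    moving_mean *= momentum
    moving_mean += (1.0 - momentum) * mean.squeeze()
    moving_var *= momentum
    moving_var += (1.0 - momentum) * var.squeeze()
    return y
```

With center=False or scale=False the layer simply keeps beta at zero or gamma at one. At inference, the same formula is evaluated with moving_mean and moving_var in place of the batch statistics.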