BatchNormalization
- class dragon.vm.tensorflow.keras.layers.BatchNormalization(
 axis=-1,
 momentum=0.99,
 epsilon=0.001,
 center=True,
 scale=True,
 beta_initializer='zeros',
 gamma_initializer='ones',
 moving_mean_initializer='zeros',
 moving_variance_initializer='ones',
 beta_regularizer=None,
 gamma_regularizer=None,
 name=None,
 **kwargs
 )
- Batch normalization layer. [Ioffe & Szegedy, 2015].
__init__
- BatchNormalization.__init__(
 axis=-1,
 momentum=0.99,
 epsilon=0.001,
 center=True,
 scale=True,
 beta_initializer='zeros',
 gamma_initializer='ones',
 moving_mean_initializer='zeros',
 moving_variance_initializer='ones',
 beta_regularizer=None,
 gamma_regularizer=None,
 name=None,
 **kwargs
 )
- Create a BatchNormalization layer.
- Parameters:
- axis (int, optional, default=-1) – The channel axis.
- momentum (float, optional, default=0.99) – The decay factor of the running average.
- epsilon (float, optional, default=1e-3) – The small value added to the variance to avoid dividing by zero.
- center (bool, optional, default=True) – False to freeze the beta anyway.
- scale (bool, optional, default=True) – False to freeze the gamma anyway.
- beta_initializer (Union[callable, str], optional) – The initializer for beta tensor.
- gamma_initializer (Union[callable, str], optional) – The initializer for gamma tensor.
- moving_mean_initializer (Union[callable, str], optional) – The initializer for moving mean tensor.
- moving_variance_initializer (Union[callable, str], optional) – The initializer for moving variance tensor.
- beta_regularizer (Union[callable, str], optional) – The regularizer for beta tensor.
- gamma_regularizer (Union[callable, str], optional) – The regularizer for gamma tensor.
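
Examples (a minimal usage sketch, not taken from this page: it assumes Dragon mirrors the Keras calling convention, and the tf.zeros helper and the NHWC input shape are only illustrative):

from dragon.vm import tensorflow as tf

# Normalize over the last (channel) axis of an NHWC feature map.
bn = tf.keras.layers.BatchNormalization(axis=-1, momentum=0.99, epsilon=1e-3)

x = tf.zeros([8, 32, 32, 64], dtype='float32')  # dummy input batch (assumed helper)
y = bn(x)  # output has the same shape as x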
 
 
