LayerNormalization
- class dragon.vm.tensorflow.keras.layers.LayerNormalization(
   axis=-1,
   epsilon=0.001,
   center=True,
   scale=True,
   beta_initializer='zeros',
   gamma_initializer='ones',
   beta_regularizer=None,
   gamma_regularizer=None,
   name=None,
   **kwargs
  )
- Layer normalization layer. [Ba et al., 2016]
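
  Concretely, layer normalization standardizes each example across its channel axis and then applies a learned scale and shift (the standard formulation from Ba et al., 2016; gamma and beta correspond to the `gamma`/`beta` tensors described below):

    y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} \cdot \gamma + \beta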
__init__
- LayerNormalization.__init__(
   axis=-1,
   epsilon=0.001,
   center=True,
   scale=True,
   beta_initializer='zeros',
   gamma_initializer='ones',
   beta_regularizer=None,
   gamma_regularizer=None,
   name=None,
   **kwargs
  )
- Create a LayerNormalization layer. See the usage sketch after the parameter list.

  Parameters:
- axis (int, optional, default=-1) – The channel axis.
- epsilon (float, optional, default=1e-3) – The epsilon value.
- center (bool, optional, default=True) – If False, freeze the beta (shift) tensor.
- scale (bool, optional, default=True) – If False, freeze the gamma (scale) tensor.
- beta_initializer (Union[callable, str], optional) – The initializer for the beta tensor.
- gamma_initializer (Union[callable, str], optional) – The initializer for the gamma tensor.
- beta_regularizer (Union[callable, str], optional) – The regularizer for the beta tensor.
- gamma_regularizer (Union[callable, str], optional) – The regularizer for the gamma tensor.
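
  A minimal usage sketch, assuming the Keras-style call interface and that Dragon's TensorFlow shim exposes `ones` as in stock TensorFlow; the input shape here is purely illustrative:

    import dragon.vm.tensorflow as tf

    # Create a layer that normalizes over the last (channel) axis.
    layer = tf.keras.layers.LayerNormalization(axis=-1, epsilon=1e-3)

    # A toy [batch, channels] input; any floating tensor works.
    x = tf.ones([2, 8])

    # Each example is standardized across its 8 channels,
    # then scaled by gamma and shifted by beta.
    y = layer(x)
    print(y.shape)  # (2, 8)

  Passing center=False or scale=False would keep beta or gamma frozen instead of learning them.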
 
 
