Optimizer

class dragon.optimizers.Optimizer(
  scale=1,
  clip_norm=0,
  weight_decay=0
)[source]

The base class of optimizers.

__init__

Optimizer.__init__(
  scale=1,
  clip_norm=0,
  weight_decay=0
)[source]

Create an Optimizer.

Parameters:
  • scale (float, optional, default=1) – The scaling factor applied to gradients.
  • clip_norm (float, optional, default=0) – The maximum L2 norm used to clip gradients.
  • weight_decay (float, optional, default=0) – The L2 penalty factor applied to weights.
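
A minimal construction sketch using only the documented arguments; the values below are illustrative assumptions, not recommendations, and in practice a concrete optimizer subclass would normally be instantiated instead of the base class:

import dragon

# Scale accumulated gradients by 1/8, clip their L2 norm at 5,
# and apply an L2 penalty of 1e-4 to the weights.
optimizer = dragon.optimizers.Optimizer(
  scale=0.125,
  clip_norm=5.0,
  weight_decay=0.0001,
)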

Methods

apply_gradients

Optimizer.apply_gradients(grads_and_vars)[source]

Apply the gradients to the variables.

Parameters:
  • grads_and_vars (Sequence[Sequence[dragon.Tensor]]) – The sequence of update pairs.
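
A hedged usage sketch: grads_and_vars pairs each gradient tensor with the variable it updates. The dragon.zeros creation op and the (gradient, variable) pair ordering are assumptions made for illustration; consult the concrete optimizer's documentation for the exact convention:

import dragon

# Stand-in tensors for a variable and its gradient (assumed creation op).
var = dragon.zeros([3, 3], dtype='float32')
grad = dragon.zeros([3, 3], dtype='float32')

optimizer = dragon.optimizers.Optimizer(clip_norm=5.0)

# Each inner sequence is one update pair; ordering assumed to follow
# the common (gradient, variable) convention.
optimizer.apply_gradients([(grad, var)])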