gradients

dragon.gradients(
  ys,
  xs,
  grad_ys=None
)[source]

Compute the symbolic derivatives of ys with respect to xs.

By default, we will fill the gradient of ys with ones:

x = dragon.ones(shape=(1,))
y = x * 2
dx = dragon.gradients(y, x)  # [2,]

You can set grad_ys to seed the gradient with an existing constant:

dy = dragon.constant([2], dtype=x.dtype)
dx = dragon.gradients(y, x, dy)  # [4,]
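As a plain-Python sketch (not the Dragon API), the two results above follow from the chain rule: y = x * 2 has a local derivative of 2, and the returned gradient is that local derivative scaled by the seed, which is ones by default or the grad_ys you pass. The helper name below is hypothetical, used only for illustration:

```python
# Hypothetical sketch of how the seed scales the result of gradients().
def backprop_scale(local_grad, upstream_grad):
    """Return the local derivative scaled by the upstream seed."""
    return local_grad * upstream_grad

# y = x * 2, so the local derivative is 2.
print(backprop_scale(2.0, 1.0))  # default seed of ones  -> 2.0
print(backprop_scale(2.0, 2.0))  # grad_ys = [2]         -> 4.0
```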

Do not call this method under eager execution:

# Wrong usage
with dragon.eager_mode():
    x = dragon.ones(shape=(1,))
    y = x * 2
    dx = dragon.gradients(y, x)

# Correct usage
with dragon.eager_mode():
    x = dragon.ones(shape=(1,))
    with dragon.GradientTape() as tape:
        y = x * 2
    dx = tape.gradient(y, x)

Parameters:
  • ys (Sequence[dragon.Tensor]) – The tensors to be differentiated.
  • xs (Sequence[dragon.Tensor]) – The tensors to differentiate with respect to.
  • grad_ys (Sequence[dragon.Tensor], optional) – The gradients used to seed ys; defaults to ones.
Returns:

Sequence[dragon.Tensor] – The sum of the derivatives of ys with respect to each tensor in xs.
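When ys contains several targets that depend on the same source, the returned entry for that source is the accumulated (summed) derivative. A minimal plain-Python sketch of this accumulation, with a hypothetical helper that is not part of the Dragon API:

```python
# Hypothetical sketch of the summation semantics of gradients():
# with y1 = x * 2 and y2 = x * 3, d(y1)/dx = 2 and d(y2)/dx = 3,
# so gradients([y1, y2], [x]) would return the accumulated [5].
def accumulate_grads(partials):
    """Sum the per-target derivatives flowing into one source."""
    total = 0.0
    for p in partials:
        total += p
    return total

dy1_dx = 2.0  # local derivative of y1 = x * 2
dy2_dx = 3.0  # local derivative of y2 = x * 3
print(accumulate_grads([dy1_dx, dy2_dx]))  # -> 5.0
```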