GradientTape

class dragon.GradientTape(persistent=False)[source]

Record the operations for auto differentiation.

You should enter the tape before the operations to be differentiated are executed:

with dragon.eager_mode():
    x = dragon.ones(shape=(2, 3))
    with dragon.GradientTape() as tape:
        y = x + 1
    print(tape.gradient(y, x))  # None, as ``x`` is not watched

    with dragon.GradientTape() as tape:
        tape.watch(x)
        y = x + 1
    print(tape.gradient(y, x))  # Ok

__init__

GradientTape.__init__(persistent=False)[source]

Create a GradientTape.

Parameters:
  • persistent (bool, optional, default=False) – Whether to retain the graph after gradient(...) is called.
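To illustrate the effect of persistent, here is a minimal pure-Python sketch (a hypothetical MiniTape, not dragon's implementation): a non-persistent tape is consumed by its first gradient() call, while persistent=True keeps the recording alive for repeated calls.

```python
# Conceptual sketch only (hypothetical MiniTape, not dragon's code).
class MiniTape:
    def __init__(self, persistent=False):
        self.persistent = persistent
        self._consumed = False

    def gradient(self):
        # A non-persistent tape releases its graph after the first call.
        if self._consumed and not self.persistent:
            raise RuntimeError("non-persistent tape already consumed")
        self._consumed = True
        return 1.0  # toy derivative: d(x + 1)/dx is always 1

tape = MiniTape(persistent=True)
tape.gradient()
tape.gradient()  # fine: the graph was retained

tape = MiniTape()
tape.gradient()
# a second tape.gradient() here would raise RuntimeError
```

A persistent tape trades memory for flexibility: the recorded graph is kept around until the tape is garbage-collected.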

Methods

gradient

GradientTape.gradient(
  target,
  sources,
  output_gradients=None
)[source]

Compute the derivatives of target w.r.t. sources.
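As a hedged illustration of the chain-rule mechanics behind this method (not dragon's implementation), the sketch below records each operation's local derivative on a tiny tape and walks the records backwards; its output_gradient argument plays the role of the output_gradients seed.

```python
# Minimal reverse-mode sketch (illustrative only, not dragon's code).
class Node:
    def __init__(self, value):
        self.value = value

class TinyTape:
    def __init__(self):
        self.records = []  # (input_node, output_node, local_derivative)

    def apply(self, x, fn, dfn):
        # Record the op together with its local derivative dfn(x).
        y = Node(fn(x.value))
        self.records.append((x, y, dfn(x.value)))
        return y

    def gradient(self, target, source, output_gradient=1.0):
        # Seed the target, then apply the chain rule in reverse order.
        grads = {target: output_gradient}
        for x, y, d in reversed(self.records):
            if y in grads:
                grads[x] = grads.get(x, 0.0) + grads[y] * d
        return grads.get(source)

tape = TinyTape()
x = Node(2.0)
y = tape.apply(x, lambda v: v * v, lambda v: 2 * v)  # y = x**2
z = tape.apply(y, lambda v: v + 1, lambda v: 1.0)    # z = y + 1
print(tape.gradient(z, x))  # dz/dx = 2x = 4.0
```

Passing output_gradient=2.0 doubles the result, which is how an upstream gradient is threaded through a chained computation.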

reset

GradientTape.reset()[source]

Destroy the tape and push a new one.
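Conceptually, reset() behaves as if the tape were exited and re-entered: everything recorded so far is discarded and a fresh recording begins. A minimal sketch (hypothetical ResettableTape, not dragon's code):

```python
# Conceptual sketch only: reset() drops the old recording.
class ResettableTape:
    def __init__(self):
        self.ops = []

    def record(self, name):
        self.ops.append(name)

    def reset(self):
        self.ops = []  # start a new, empty recording

tape = ResettableTape()
tape.record("y = x + 1")
tape.reset()
tape.record("z = x * 2")
print(tape.ops)  # only the post-reset op remains
```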

stop_recording

GradientTape.stop_recording()[source]

Temporarily stop the recording.
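Operations executed while recording is stopped are not traced, which is useful for side computations (e.g. metrics) that should not consume tape memory. The semantics can be sketched with a pause flag and a context manager (hypothetical PausableTape, not dragon's code):

```python
# Conceptual sketch only: pause tracing inside a context manager.
from contextlib import contextmanager

class PausableTape:
    def __init__(self):
        self.recording = True
        self.ops = []

    def record(self, name):
        if self.recording:
            self.ops.append(name)

    @contextmanager
    def stop_recording(self):
        # Pause tracing, then restore the previous state on exit.
        prev, self.recording = self.recording, False
        try:
            yield
        finally:
            self.recording = prev

tape = PausableTape()
tape.record("y = x + 1")
with tape.stop_recording():
    tape.record("metric = y.mean()")  # not traced
tape.record("z = y * 2")
print(tape.ops)  # the paused op is absent
```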

watch

GradientTape.watch(tensor)[source]

Ensure the input tensor(s) will be traced.

Parameters:
  • tensor (Sequence[dragon.Tensor]) – The tensor(s) to be traced.