qml.math.in_backprop

in_backprop(tensor, interface=None)

Returns True if the tensor is considered to be in a backpropagation environment. It works for the Autograd, TensorFlow, and JAX interfaces. Unlike requires_grad(), which only checks whether the tensor is differentiable, it checks whether a gradient is actually being computed.

Parameters
  • tensor (tensor_like) – input tensor

  • interface (str) – The name of the interface. Will be determined automatically if not provided.

Example

>>> import tensorflow as tf
>>> from pennylane.math import in_backprop
>>> x = tf.Variable([0.6, 0.1])
>>> in_backprop(x)
False
>>> with tf.GradientTape() as tape:
...     print(in_backprop(x))
True
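
The check behaves analogously for the JAX interface. Below is a minimal sketch (assuming PennyLane is installed alongside JAX): inside a jax.grad trace the argument is an abstract tracer, so in_backprop reports True, while a concrete array outside any trace reports False.

>>> import jax
>>> import jax.numpy as jnp
>>> import pennylane as qml
>>> x = jnp.array([0.6, 0.1])
>>> qml.math.in_backprop(x)
False
>>> def cost(x):
...     # x is a tracer while jax.grad differentiates, so the check succeeds
...     print(qml.math.in_backprop(x))
...     return jnp.sum(x ** 2)
>>> _ = jax.grad(cost)(x)
True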

See also

requires_grad()
