qml.gradients.finite_diff

finite_diff(tape, argnum=None, h=1e-07, approx_order=1, n=1, strategy='forward', f0=None, validate_params=True)[source]

Transform a QNode to compute the finite-difference gradient of all gate parameters with respect to its inputs.
- Parameters
tape (pennylane.QNode or QuantumTape) – quantum tape or QNode to differentiate
argnum (int or list[int] or None) – Trainable parameter indices to differentiate with respect to. If not provided, the derivatives with respect to all trainable parameters are returned.
h (float) – step size of the finite-difference method
approx_order (int) – The approximation order of the finite-difference method to use.
n (int) – compute the \(n\)-th derivative
strategy (str) – The strategy of the finite-difference method. Must be one of "forward", "center", or "backward". For the "forward" strategy, the finite-difference shifts occur at the points \(x_0, x_0+h, x_0+2h, \dots\), where \(h\) is some small step size. The "backward" strategy is similar, but in reverse: \(x_0, x_0-h, x_0-2h, \dots\). Finally, the "center" strategy results in shifts symmetric around the unshifted point: \(\dots, x_0-2h, x_0-h, x_0, x_0+h, x_0+2h, \dots\). See the coefficient sketch after this parameter list.
f0 (tensor_like[float] or None) – Output of the evaluated input tape. If provided, and the gradient recipe contains an unshifted term, this value is used, saving a quantum evaluation.
validate_params (bool) – Whether to validate the tape parameters or not. If True, the Operation.grad_method attribute and the circuit structure will be analyzed to determine if the trainable parameters support the finite-difference method. If False, the finite-difference method will be applied to all parameters.
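The interplay of approx_order, n, and strategy can be inspected without running a circuit. As a minimal sketch, assuming the companion helper qml.gradients.finite_diff_coeffs in the same module, the coefficients and shift multiples that define each rule can be generated directly:

>>> qml.gradients.finite_diff_coeffs(n=1, approx_order=1, strategy="forward")
array([[-1.,  1.],
       [ 0.,  1.]])
>>> qml.gradients.finite_diff_coeffs(n=1, approx_order=2, strategy="center")
array([[-0.5,  0.5],
       [-1. ,  1. ]])

Here the first row holds the coefficients \(c_i\) and the second row the shift multiples \(s_i\), so the \(n\)-th derivative is approximated as \(\sum_i c_i f(x_0 + s_i h) / h^n\).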
- Returns
If the input is a QNode, a tensor representing the output Jacobian matrix of size (number_outputs, number_gate_parameters) is returned.
If the input is a tape, a tuple is returned containing a list of generated tapes, in addition to a post-processing function to be applied to the evaluated tapes.
- Return type
tensor_like or tuple[list[QuantumTape], function]
Example
This transform can be registered directly as the quantum gradient transform to use during autodifferentiation:
>>> dev = qml.device("default.qubit", wires=2)
>>> @qml.qnode(dev, gradient_fn=qml.gradients.finite_diff)
... def circuit(params):
...     qml.RX(params[0], wires=0)
...     qml.RY(params[1], wires=0)
...     qml.RX(params[2], wires=0)
...     return qml.expval(qml.PauliZ(0)), qml.var(qml.PauliZ(0))
>>> params = np.array([0.1, 0.2, 0.3], requires_grad=True)
>>> qml.jacobian(circuit)(params)
tensor([[-0.38751725, -0.18884792, -0.38355708],
        [ 0.69916868,  0.34072432,  0.69202365]], requires_grad=True)
Usage Details
This gradient transform can also be applied directly to QNode objects:

>>> @qml.qnode(dev)
... def circuit(params):
...     qml.RX(params[0], wires=0)
...     qml.RY(params[1], wires=0)
...     qml.RX(params[2], wires=0)
...     return qml.expval(qml.PauliZ(0)), qml.var(qml.PauliZ(0))
>>> qml.gradients.finite_diff(circuit)(params)
tensor([[-0.38751725, -0.18884792, -0.38355708],
        [ 0.69916868,  0.34072432,  0.69202365]], requires_grad=True)
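The transform's keyword arguments can be forwarded in the same way when it is applied directly. As a hedged sketch (the result should agree with the Jacobian above up to the accuracy of the chosen rule), a second-order centered difference with a larger step size could be requested via:

>>> qml.gradients.finite_diff(circuit, h=1e-4, approx_order=2, strategy="center")(params)  # approximately the same Jacobian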
This quantum gradient transform can also be applied to low-level QuantumTape objects. This will result in no implicit quantum device evaluation. Instead, the processed tapes and a post-processing function, which together define the gradient, are directly returned:

>>> with qml.tape.QuantumTape() as tape:
...     qml.RX(params[0], wires=0)
...     qml.RY(params[1], wires=0)
...     qml.RX(params[2], wires=0)
...     qml.expval(qml.PauliZ(0))
...     qml.var(qml.PauliZ(0))
>>> gradient_tapes, fn = qml.gradients.finite_diff(tape)
>>> gradient_tapes
[<QuantumTape: wires=[0], params=3>,
 <QuantumTape: wires=[0], params=3>,
 <QuantumTape: wires=[0], params=3>,
 <QuantumTape: wires=[0], params=3>]
This can be useful if the underlying circuits representing the gradient computation need to be analyzed.
The output tapes can then be evaluated and post-processed to retrieve the gradient:
>>> dev = qml.device("default.qubit", wires=2)
>>> fn(qml.execute(gradient_tapes, dev, None))
[[-0.38751721 -0.18884787 -0.38355704]
 [ 0.69916862  0.34072424  0.69202359]]
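Because the default "forward" strategy includes an unshifted term, a result that has already been computed can be passed via f0 to save one evaluation. A minimal sketch, assuming f0 accepts the raw result of executing the unshifted tape:

>>> f0 = qml.execute([tape], dev, None)[0]
>>> gradient_tapes, fn = qml.gradients.finite_diff(tape, f0=f0)
>>> fn(qml.execute(gradient_tapes, dev, None))  # same Jacobian, one fewer tape to execute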