Optimizers¶
Optimizers are objects that automatically update the parameters of a quantum or hybrid machine learning model. Which optimizer you should use depends on your choice of classical interface (NumPy, PyTorch, or TensorFlow), and the optimizers are available from different access points.
Regardless of their origin, all optimizers provide the same core functionality, and PennyLane is fully compatible with all of them.
NumPy Interface¶
When using the standard NumPy interface, PennyLane offers some built-in optimizers.
Some of these are specific to quantum optimization, such as the QNGOptimizer, LieAlgebraOptimizer, RotosolveOptimizer, RotoselectOptimizer, and ShotAdaptiveOptimizer.
- AdagradOptimizer: Gradient-descent optimizer with past-gradient-dependent learning rate in each dimension.
- AdamOptimizer: Gradient-descent optimizer with adaptive learning rate, first and second moment.
- GradientDescentOptimizer: Basic gradient-descent optimizer.
- LieAlgebraOptimizer: Lie algebra optimizer.
- MomentumOptimizer: Gradient-descent optimizer with momentum.
- NesterovMomentumOptimizer: Gradient-descent optimizer with Nesterov momentum.
- QNGOptimizer: Optimizer with adaptive learning rate, via calculation of the diagonal or block-diagonal approximation to the Fubini-Study metric tensor.
- RMSPropOptimizer: Root mean squared propagation optimizer.
- RotosolveOptimizer: Rotosolve gradient-free optimizer.
- RotoselectOptimizer: Rotoselect gradient-free optimizer.
- ShotAdaptiveOptimizer: Optimizer where the shot rate is adaptively calculated using the variances of the parameter-shift gradient.
PyTorch Interface¶
If you are using the PennyLane PyTorch interface, you should use one of the native PyTorch optimizers (found in torch.optim).
TensorFlow Interface¶
When using the PennyLane TensorFlow interface, you will need to leverage one of the TensorFlow optimizers (found in tf.keras.optimizers).