# Notebook downloads

In addition to the qubit rotation, Gaussian transformation, and hybrid quantum optimization tutorials in the documentation, we also provide a selection of Jupyter notebooks that walk through some more advanced optimizations made possible with PennyLane, including supervised learning and quantum generative adversarial networks.

To open the interactive notebooks, launch the Jupyter notebook environment by clicking on the ‘Jupyter notebook’ shortcut in the start menu (Windows), or by running the following in the Anaconda Prompt/Command Prompt/Terminal:

```
jupyter notebook
```

Your web browser should open with the Jupyter notebook home page; simply click the ‘Upload’ button, browse to the tutorial file you downloaded below, and upload the file. You will now be able to open it and work through the tutorial.

Alternatively, you can also view the notebook contents on GitHub, without interactivity.

## Qubit notebooks

**Qubit rotation** (`Q1_qubit-rotation.ipynb` / view on GitHub)

Use PennyLane to optimize two rotation gates to flip a single qubit from state \(\ket{0}\) to state \(\ket{1}\). This notebook follows the same process as the qubit rotation tutorial, but with more emphasis on the optimization procedure, exploring the optimization landscape and different optimizers.

**Variational quantum eigensolver** (`Q2_variational-quantum-eigensolver.ipynb` / view on GitHub)

This notebook demonstrates the principle of a variational quantum eigensolver (VQE) [R7]. To showcase the hybrid computational capabilities of PennyLane, we first train a quantum circuit to find the parameterized state \(\ket{\psi_v}\) that minimizes the squared energy expectation of a Hamiltonian \(H\):

\[\langle \psi_v | H | \psi_v \rangle^2 = \big( 0.1\, \langle \psi_v | \mathbb{I} \otimes \sigma_x | \psi_v \rangle + 0.5\, \langle \psi_v | \mathbb{I} \otimes \sigma_y | \psi_v \rangle \big)^2,\]

before training a second variational quantum circuit \(f(v_1, v_2)\) to minimize the energy expectation of a fixed quantum state \(\ket{\psi}\):

\[\langle \psi | H | \psi \rangle^2 = \big( v_1\, \langle \psi | \mathbb{I} \otimes \sigma_x | \psi \rangle + v_2\, \langle \psi | \mathbb{I} \otimes \sigma_y | \psi \rangle \big)^2.\]

**Variational classifier** (`Q3_variational-classifier.ipynb` / view on GitHub)

In this notebook we show how to use PennyLane to implement variational quantum classifiers: quantum circuits that can be trained from labeled data to classify new data samples. This optimization example demonstrates how to encode binary inputs into the initial state of the variational circuit, which is simply a computational basis state.

We then show how to encode real vectors as amplitude vectors (amplitude encoding) and train the model to recognize the first two classes of flowers in the Iris dataset.

**Quantum generative adversarial network (QGAN)** (`Q4_quantum-GAN.ipynb` / view on GitHub)

This demo constructs a quantum generative adversarial network (QGAN) [R23][R22] using two subcircuits: a generator and a discriminator. The generator attempts to generate synthetic quantum data to match a pattern of “real” data, while the discriminator tries to discern real data from fake data. The gradient of the discriminator’s output provides a training signal for the generator to improve its fake generated data.

## Continuous-variable notebooks

**Photon redirection** (`CV1_photon-redirection.ipynb` / view on GitHub)

Starting with a photon in mode 0 of a variational quantum optical circuit, the goal is to use PennyLane to optimize a beamsplitter to redirect the photon to mode 1. This notebook follows the same process as the photon redirection tutorial, but with more emphasis on the optimization procedure, comparing the use of the gradient-descent optimizer with and without momentum.

**Quantum neural networks** (`CV2_quantum-neural-net.ipynb` / view on GitHub)

In this notebook, we show how a continuous-variable quantum neural network model [R3] can be trained on noisy samples of a one-dimensional function to learn a fit for that function. Specifically, the variational quantum circuit is trained to fit a sine function from noisy data.
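Running the model itself requires a Fock-capable simulator (its layers use non-Gaussian gates), but the training data described above, noisy samples of a sine function, can be generated in a few lines (the sample count and noise scale here are illustrative assumptions, not the notebook's):

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 training inputs in [-pi, pi] with noisy sine targets
X = rng.uniform(-np.pi, np.pi, size=50)
Y = np.sin(X) + rng.normal(scale=0.1, size=50)
```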