Lecture 5.1 - Building a Quantum Classifier
- URL:: https://learn.qiskit.org/summer-school/2021/lec5-1-building-quantum-classifier
- Links:: 2021 Qiskit Global Summer School
- Instructor:: Amira Abbas
# Notes
- 🕒 00:58 Hilbert Space is a big space!
- Quantum computing gives us access to an exponentially large state space in which we can apply computations.
- With just $275$ qubits, we can represent more states than the number of atoms in the observable universe.
- 🕒 01:55 Interference is more important/interesting than the exponential state space alone.
- 🕒 03:07 Quantum Machine Learning paradigms
- 🕒 05:15 How can quantum computers fit into the framework of classical machine learning?
- 🕒 06:35 Near-term vs fault-tolerant
- Near-term devices are the noisy devices available today.
- Fault-tolerant devices are those that will, hopefully, be error-corrected in the future.
- 🕒 07:33 What can we do now with NISQ devices?
- 🕒 07:58 Variational models
- A quantum circuit containing parameters that we can train/optimize.
- 🕒 09:33 Names of Variational models
- 🕒 10:27 A first attempt at QML
- We can use a variational circuit as a classifier.
- 🕒 12:09 Task: Train a quantum circuit on labelled samples in order to predict labels for new data
- Steps
- 🕒 12:46 Step 1: Encode the classical data into a quantum state
- 🕒 13:09 Step 2: Apply a parameterized model
- 🕒 13:26 Step 3: Measure the circuit to extract labels
- 🕒 13:42 Step 4: Use optimization techniques (like gradient descent) to update model parameters
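- The four steps can be sketched end-to-end with a tiny single-qubit simulation. This is plain NumPy rather than a real quantum SDK; all function names are illustrative, and a finite-difference gradient stands in for the parameter-shift rule discussed later:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def predict(x, theta):
    """Steps 1-3: angle-encode x into a qubit, apply a trainable RY,
    and read out P(measure 1) as the predicted label probability."""
    state = ry(theta) @ (ry(x) @ np.array([1.0, 0.0]))
    return state[1] ** 2

def train(data, labels, theta=0.1, lr=0.5, epochs=200):
    """Step 4: gradient descent on the squared error (finite differences
    used here as a stand-in for the parameter-shift rule)."""
    eps = 1e-5
    for _ in range(epochs):
        grad = 0.0
        for x, y in zip(data, labels):
            loss = lambda t: (predict(x, t) - y) ** 2
            grad += (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
        theta -= lr * grad / len(data)
    return theta

# Toy task: features near 0 get label 0, features near pi get label 1.
data, labels = [0.1, 0.2, 3.0, 3.1], [0, 0, 1, 1]
theta = train(data, labels)
print(predict(0.1, theta), predict(3.1, theta))  # low vs. high probability
```

- On this toy data, angle encoding plus a single trainable RY is already enough to separate the two classes.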
- 🕒 14:13 Data encoding
- How to encode classical data into a quantum system?
- This is still an open question/problem.
- It depends on the problem.
- 🕒 15:18 Basis encoding
- Encode classical data into computational basis states.
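- A minimal sketch of basis encoding on state vectors (plain NumPy, no quantum SDK; `basis_encode` is an illustrative name):

```python
import numpy as np

def basis_encode(bits):
    """Map a classical bit string to the corresponding computational
    basis state vector, e.g. '10' -> |10> = (0, 0, 1, 0)."""
    index = int(bits, 2)                 # binary string -> basis-state index
    state = np.zeros(2 ** len(bits))
    state[index] = 1.0
    return state

print(basis_encode("10"))  # [0. 0. 1. 0.]
```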
- 🕒 17:15 Amplitude encoding
- Encode classical data into amplitude vectors.
- Use a parameterized gate such that the resulting state encodes the features into its amplitudes.
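- A sketch of the classical preprocessing for amplitude encoding: normalize the feature vector (and pad to a power-of-two dimension) so it is a valid amplitude vector. Plain NumPy; `amplitude_encode` is an illustrative name:

```python
import numpy as np

def amplitude_encode(x):
    """Normalize a feature vector so it forms a valid quantum amplitude
    vector, padding with zeros up to the next power of two."""
    x = np.asarray(x, dtype=float)
    dim = 1 << (len(x) - 1).bit_length()   # next power of two
    padded = np.zeros(dim)
    padded[:len(x)] = x
    return padded / np.linalg.norm(padded)

state = amplitude_encode([3.0, 4.0])
print(state)               # [0.6 0.8]
print(np.sum(state ** 2))  # sums to 1, up to floating point
```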
- 🕒 20:05 Angle encoding
- Encode classical data into angles with which we rotate qubits.
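- A sketch of angle encoding via NumPy state vectors: each feature becomes an RY rotation angle on its own qubit (illustrative names, no quantum SDK):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def angle_encode(features):
    """Encode each feature as an RY rotation angle on its own qubit;
    the full state is the tensor product of the rotated qubits."""
    state = np.array([1.0])
    for x in features:
        qubit = ry(x) @ np.array([1.0, 0.0])   # rotate |0>
        state = np.kron(state, qubit)
    return state

print(angle_encode([np.pi, 0.0]))  # approximately the basis state |10>
```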
- 🕒 22:53 Higher order encoding
- Here we encode higher-order terms of our data by using products of features as rotation angles.
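- As a simplified sketch of a second-order feature map: collect the original features plus all pairwise products, each of which would then be used as a rotation angle (plain Python, illustrative names):

```python
def higher_order_angles(features):
    """Return first-order features plus pairwise products, all usable
    as rotation angles in a second-order encoding circuit."""
    feats = list(features)
    angles = list(feats)                      # first-order terms
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            angles.append(feats[i] * feats[j])  # second-order terms
    return angles

print(higher_order_angles([0.1, 0.2, 0.3]))
# three first-order terms followed by three pairwise products
```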
- 🕒 25:34 Applying a variational model
- How to design circuits? This is an open question/problem.
- 🕒 29:34 A highly expressive variational circuit is not always advantageous.
- 🕒 30:18 Example of a typical variational model
- 🕒 33:32 How to extract labels from the model?
- This is also an open question.
- 🕒 34:20 Parity post-processing
- This is for binary classification.
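- A sketch of parity post-processing on measurement counts: each measured bitstring maps to a binary label via the parity of its 1s, and the counts aggregate into a label probability (plain Python; the counts dictionary is a hypothetical example):

```python
def parity_label(bitstring):
    """Map a measured bitstring to a binary label via its parity:
    an even number of 1s -> label 0, odd -> label 1."""
    return bitstring.count("1") % 2

def predicted_probability(counts):
    """Aggregate measurement counts into P(label = 1) using parity."""
    shots = sum(counts.values())
    ones = sum(n for b, n in counts.items() if parity_label(b) == 1)
    return ones / shots

counts = {"00": 480, "11": 320, "01": 120, "10": 80}  # hypothetical shot counts
print(predicted_probability(counts))  # 0.2
```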
- 🕒 39:42 Measuring the first qubit
- Alternatively, we can measure only the first qubit and interpret the probability of obtaining 1 as the predicted label probability.
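- On a simulated state vector, the probability of the first qubit measuring 1 is the summed squared amplitude over basis states whose leading bit is 1 (plain NumPy sketch, illustrative names):

```python
import numpy as np

def prob_first_qubit_one(state):
    """P(first qubit = 1): sum |amplitude|^2 over all basis states
    whose most-significant bit is 1."""
    n = int(np.log2(len(state)))        # number of qubits
    return float(sum(abs(state[i]) ** 2
                     for i in range(len(state)) if (i >> (n - 1)) & 1))

# Example on the 2-qubit state (|00> + |10>) / sqrt(2):
state = np.array([1, 0, 1, 0]) / np.sqrt(2)
print(prob_first_qubit_one(state))  # approximately 0.5
```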
- 🕒 43:58 Optimization
- How do we compute the gradient of a quantum circuit?
- With the parameter-shift rule: evaluate the circuit at shifted parameter values and combine the results to obtain an exact gradient.
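- A sketch of the parameter-shift rule for a single RY rotation, simulated in NumPy: for gates of this form, evaluating the circuit at θ ± π/2 gives the exact gradient, df/dθ = [f(θ + π/2) − f(θ − π/2)] / 2 (function names are illustrative):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """<Z> of RY(theta)|0> -- the kind of expectation value a circuit
    outputs. Analytically this equals cos(theta)."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2       # <Z> = |a0|^2 - |a1|^2

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    """Gradient from two shifted circuit evaluations:
    df/dtheta = [f(theta + s) - f(theta - s)] / (2 sin s), with s = pi/2."""
    return (f(theta + shift) - f(theta - shift)) / (2 * np.sin(shift))

theta = 0.7
print(parameter_shift_grad(expectation_z, theta))  # matches -sin(theta)
```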
- 🕒 47:29 Is this advantageous?
- 🕒 48:40 Data encoding
- We map our original data into a higher-dimensional space.
- So we can think of this as a feature map, transforming our data from one space to another higher-dimensional space.
- 🕒 51:05 We can think of quantum classifiers as linear classifiers in a feature space (recall support vector machines and their linear decision function)
- At the moment, this doesn’t provide any known advantage.
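- The feature-map picture can be sketched as a kernel: the inner product between two encoded states, |⟨φ(x₁)|φ(x₂)⟩|², is exactly the quantity a linear classifier such as an SVM would use in the feature space. A toy NumPy version with single-feature angle encoding (illustrative names):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def feature_map(x):
    """Angle-encode a single feature: phi(x) = RY(x)|0>."""
    return ry(x) @ np.array([1.0, 0.0])

def quantum_kernel(x1, x2):
    """Kernel entry |<phi(x1)|phi(x2)>|^2 between two encoded states."""
    return float(np.dot(feature_map(x1), feature_map(x2)) ** 2)

print(quantum_kernel(0.3, 0.3))  # approximately 1 for identical inputs
```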
- 🕒 52:13 Recap
- 🕒 53:32 What’s next?