The pace of development in quantum computing mirrors the recent rapid advances in machine learning and artificial intelligence. It is natural to wonder how quantum technologies could be used to boost machine learning algorithms: this field of research is called quantum machine learning. The objective of the course is to outline, from both a theoretical and an applied perspective, what benefits current and near-future quantum technologies may bring to machine learning.
Two different paradigms of quantum computing are presented: analogue approaches based on quantum annealing and digital approaches based on quantum logic gates. The course will include practical sessions where real machine learning problems will be solved using a cloud quantum annealer and its open source frameworks, as well as discussions on the potential and limitations of the current technology.
The content is organized in four parts.
- The first part of the course introduces the main concepts of quantum computing from a theoretical standpoint.
- The second part of the course focuses on the quantum gate model, introducing its mathematical framework as well as the available open-source software tools.
- The third part of the course focuses on the quantum annealing model: its comparison with classical simulated annealing, its mathematical framework, and in particular the relation between the Hamiltonian of a quantum system and a QUBO (quadratic unconstrained binary optimization) formulation of NP-hard problems.
- In the fourth, hands-on part, students learn how to represent NP-hard problems in QUBO formulation and solve them with a quantum annealer. Students write and run machine learning code on actual quantum computers. Notebooks are available with exercises based on the D-Wave Ocean Suite targeting the D-Wave quantum annealer.
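To make the Hamiltonian-to-QUBO relation mentioned above concrete, the sketch below converts a small Ising Hamiltonian (the form in which a quantum annealer's energy function is usually written) into an equivalent QUBO via the substitution s_i = 2x_i - 1, and checks that the two energy functions agree on every configuration. This is a minimal illustration in plain Python with hypothetical coefficients; in practice the conversion utilities of the D-Wave Ocean Suite would be used instead.

```python
import itertools

def ising_to_qubo(h, J):
    """Convert an Ising Hamiltonian E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j
    (spins s_i in {-1, +1}) into a QUBO E(x) = sum_{i<=j} Q_ij x_i x_j + offset
    (bits x_i in {0, 1}) via the substitution s_i = 2 x_i - 1."""
    Q = {}
    offset = -sum(h.values())
    for i, hi in h.items():
        Q[(i, i)] = Q.get((i, i), 0.0) + 2.0 * hi
    for (i, j), Jij in J.items():
        # J_ij s_i s_j = 4 J_ij x_i x_j - 2 J_ij x_i - 2 J_ij x_j + J_ij
        Q[(i, j)] = Q.get((i, j), 0.0) + 4.0 * Jij
        Q[(i, i)] = Q.get((i, i), 0.0) - 2.0 * Jij
        Q[(j, j)] = Q.get((j, j), 0.0) - 2.0 * Jij
        offset += Jij
    return Q, offset

def ising_energy(h, J, s):
    return (sum(hi * s[i] for i, hi in h.items())
            + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))

def qubo_energy(Q, offset, x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items()) + offset

# small instance with hypothetical coefficients, for illustration only
h = {0: 0.5, 1: -1.0, 2: 0.25}
J = {(0, 1): -1.0, (1, 2): 0.75}
Q, offset = ising_to_qubo(h, J)

# the two energy functions agree on every one of the 2^3 configurations
for bits in itertools.product([0, 1], repeat=3):
    x = dict(enumerate(bits))
    s = {i: 2 * b - 1 for i, b in x.items()}
    assert abs(ising_energy(h, J, s) - qubo_energy(Q, offset, x)) < 1e-12
```

The exhaustive check at the end is exactly the kind of sanity test used in the notebooks before handing a QUBO to a solver.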
OBJECTIVES OF THE COURSE
Students will become familiar with the main concepts of quantum computing, for both the quantum gate and the quantum annealing architectures. Students will be able to assess which quantum architecture is best suited to a given machine learning problem, and to write and run a machine learning optimization algorithm on a quantum computer.
PREREQUISITES
- Foundations of Operations Research is strongly suggested
- At least one course among: Machine Learning, Recommender Systems
- Programming skills in Python are also strongly suggested
LEARNING OUTCOMES
- Understanding different quantum computing models and assessing their limiting factors.
- Solving simple machine learning optimization problems on quantum computing architectures.
- Assessing different classes, implementations and limiting factors of quantum computers.
- Applying the knowledge and tools acquired during the course to more advanced machine learning problems.
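As a point of reference for the classical simulated annealing baseline that the course compares against quantum annealing, here is a minimal single-flip simulated annealing sketch for minimizing a QUBO. The cooling schedule, sweep count and QUBO coefficients are all hypothetical choices for illustration, and this is not how a quantum annealer works internally.

```python
import math
import random

def qubo_energy(Q, x):
    """Energy of a binary assignment x under a QUBO given as a dict (i, j) -> coefficient."""
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def simulated_annealing(Q, n, sweeps=2000, t_start=2.0, t_end=0.05, seed=0):
    """Single-flip Metropolis simulated annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = qubo_energy(Q, x)
    for k in range(sweeps):
        t = t_start * (t_end / t_start) ** (k / (sweeps - 1))
        i = rng.randrange(n)
        x[i] ^= 1  # propose flipping one bit
        e_new = qubo_energy(Q, x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new  # accept the move
        else:
            x[i] ^= 1  # reject: undo the flip
    # finish with a greedy descent so the result is at least a single-flip local minimum
    improved = True
    while improved:
        improved = False
        for i in range(n):
            x[i] ^= 1
            e_new = qubo_energy(Q, x)
            if e_new < e:
                e, improved = e_new, True
            else:
                x[i] ^= 1
    return x, e

# toy QUBO with hypothetical coefficients; its optimum is x = (1, 1, 0) with energy -2.5
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0, (0, 1): -0.5, (1, 2): 1.0}
best_x, best_e = simulated_annealing(Q, 3)
```

Quantum annealing replaces the thermal fluctuations of this loop with quantum tunnelling, but the QUBO energy function being minimized is the same.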
SCHEDULE
- March 15th – 10:00-13:00: Introduction to the course. Main concepts of quantum computing from a theoretical standpoint: the axioms of quantum mechanics and the mathematical background.
- March 17th – 10:00-13:00: Quantum algorithms in the oracle model.
- March 19th – 10:00-13:00: Phase estimation and amplitude amplification.
- March 22nd – 10:00-13:00: The QRAM model and the quantum toolkit for machine learning.
- March 24th – 10:00-12:00: Quantum algorithms for machine learning.
- March 26th – 10:00-12:00: Quantum annealing model. Classical simulated annealing and quantum-inspired annealing.
- March 29th – 10:00-13:00: Relation between the Hamiltonian of a quantum system and a QUBO representation for NP-hard problems.
- March 31st – 10:00-13:00: Writing NP-complete and NP-hard problems in QUBO formulation. Hands-on part: the D-Wave Python environment, the QPU and its architecture.
- April 2nd – 10:00-13:00: Hands-on part: using the annealer to solve simple problems via minor embedding. Example problems: graph partitioning, feature selection, etc.
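As a preview of the kind of exercise covered in the hands-on sessions, the sketch below encodes a tiny balanced graph partitioning instance as a QUBO (cut size plus a quadratic balance penalty) and minimizes it by brute force. The penalty weight alpha = 2 and the six-node graph are illustrative choices; in the practical sessions the same QUBO would instead be sampled with a D-Wave annealer through the Ocean Suite.

```python
import itertools

def partition_energy(edges, x, n, alpha=2.0):
    """QUBO energy for balanced two-way graph partitioning:
    cut size plus a quadratic penalty on the imbalance between the two sides."""
    cut = sum(x[i] + x[j] - 2 * x[i] * x[j] for i, j in edges)  # 1 per edge crossing the cut
    imbalance = sum(x) - n / 2
    return cut + alpha * imbalance ** 2

# toy instance: two triangles joined by the single bridge edge (2, 3)
n = 6
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]

# brute force over all 2^n assignments; an annealer samples low-energy states of the same QUBO
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: partition_energy(edges, x, n))
best_e = partition_energy(edges, best, n)
# optimum separates the two triangles: one cut edge, perfectly balanced, energy 1
```

Brute force stops being an option beyond a few dozen variables, which is precisely where annealing heuristics (classical or quantum) become interesting.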