Quantum Kernels

On my fall vacation, I spent some time trying to understand the scope and goals of quantum machine learning. For that purpose I walked through IBM’s Quantum Machine Learning course as well as the 2021 Qiskit Global Summer School. A great alternative view is offered by Xanadu’s PennyLane QML tutorials. Finally, I selected two recent papers by Google Quantum AI about learning about quantum systems. In this review I summarize the main concepts around supervised learning with quantum kernels, since they are fundamental to much of current quantum machine learning research. - November 6, 2022


Summary

This summary is my main takeaway from IBM Qiskit’s Quantum Machine Learning course, which is why most of the links refer to that course.

In quantum machine learning, a first class of methods implements parameterized quantum circuits that can be used for two things:

  1. To encode data, where the parameters are determined by the data being encoded.

  2. As a quantum model, where the parameters are determined by an optimization process.

Thus the circuits are used to generate the separating hyperplane.
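The dual role of parameters (items 1 and 2 above) can be sketched in plain NumPy by simulating a single qubit. The circuit layout and the sign-of-⟨Z⟩ decision rule below are my own illustrative choices, not the course’s exact construction:

```python
import numpy as np

# Single-qubit rotation gates as 2x2 matrices.
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(phi):
    return np.diag([np.exp(-1j * phi / 2), np.exp(1j * phi / 2)])

def circuit_state(x, theta):
    """Toy one-qubit circuit: a data-encoding layer whose parameter is the
    data point x, followed by a model layer whose parameters are trainable."""
    state = np.array([1.0, 0.0], dtype=complex)   # |0>
    state = ry(x) @ state                         # 1. encoding: parameter = data
    state = rz(theta[0]) @ ry(theta[1]) @ state   # 2. model: parameters = optimized
    return state

def predict(x, theta):
    """Expectation value <Z> = P(0) - P(1); its sign gives the class label,
    so <Z> = 0 plays the role of the separating hyperplane."""
    probs = np.abs(circuit_state(x, theta)) ** 2
    return probs[0] - probs[1]
```

With the model layer switched off (`theta = [0, 0]`), `predict(x, theta)` reduces to `cos(x)`, i.e. the decision function is shaped entirely by the encoding; training then bends it via `theta`.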

Training quantum circuits can be performed using the Quantum Natural Gradient, which uses the Quantum Fisher Information to adapt the steepest-descent direction.
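Such a natural-gradient step can be sketched numerically: for a pure state, the Quantum Fisher Information is four times the Fubini–Study metric, which can be approximated by finite differences of the statevector. The toy two-parameter single-qubit circuit and the finite-difference scheme here are my own illustrative choices:

```python
import numpy as np

def ry(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(t):
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def state(theta):
    """|psi(theta)> = RZ(theta[1]) RY(theta[0]) |0>."""
    return rz(theta[1]) @ ry(theta[0]) @ np.array([1, 0], dtype=complex)

def fubini_study_metric(theta, eps=1e-5):
    """g_ij = Re(<d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi>),
    estimated with central finite differences. The QFI is F = 4 g."""
    psi = state(theta)
    d = []
    for i in range(len(theta)):
        shift = np.zeros_like(theta)
        shift[i] = eps
        d.append((state(theta + shift) - state(theta - shift)) / (2 * eps))
    n = len(theta)
    g = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            g[i, j] = np.real(np.vdot(d[i], d[j])
                              - np.vdot(d[i], psi) * np.vdot(psi, d[j]))
    return g

def natural_gradient_step(theta, grad, eta=0.1, reg=1e-8):
    """theta <- theta - eta * F^{-1} grad, with F the QFI (regularized for inversion)."""
    F = 4 * fubini_study_metric(theta)
    return theta - eta * np.linalg.solve(F + reg * np.eye(len(theta)), grad)
```

For this circuit the metric works out to diag(1/4, sin²(θ₀)/4), so the natural gradient rescales the RZ direction strongly when θ₀ is small, which is exactly the kind of adaptation plain gradient descent misses.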

The second class of methods makes use of quantum circuits to estimate kernels.

The quantum kernels can be integrated in two ways:

  1. Application 1: The feature map is known, but is classically intractable. This is demonstrated with a hybrid algorithm combining a quantum feature map with a classical SVM.

  2. Application 2: We want to optimize a parameterized quantum feature map to minimize classification error. This technique is called kernel alignment. See also the Quantum kernel trainer tutorial.
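Application 1 can be sketched end-to-end without quantum hardware: simulate the feature map as a statevector, evaluate the fidelity kernel, and hand the Gram matrix to any classical SVM (e.g. scikit-learn’s `SVC(kernel='precomputed')`). The one-qubit angle-encoding map below is a deliberately trivial stand-in for a classically intractable feature map:

```python
import numpy as np

def feature_state(x):
    """Angle-encoding feature map on one qubit: |phi(x)> = RY(x)|0>.
    (A stand-in for a hard-to-simulate map such as Qiskit's PauliFeatureMap.)"""
    return np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)

def quantum_kernel(x1, x2):
    """Fidelity kernel k(x1, x2) = |<phi(x1)|phi(x2)>|^2. A quantum computer
    estimates this by sampling; here it is computed exactly from statevectors."""
    return np.abs(np.vdot(feature_state(x1), feature_state(x2))) ** 2

def gram_matrix(xs):
    """Kernel (Gram) matrix to pass to a classical kernel method."""
    return np.array([[quantum_kernel(a, b) for b in xs] for a in xs])
```

The resulting matrix is symmetric and positive semidefinite with unit diagonal, so it is a valid kernel for the classical SVM stage; only the kernel evaluation would run on the quantum device.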

In Havlicek (2019), the authors propose a family of quantum feature maps that are conjectured to be hard to simulate classically, and can be implemented as short-depth circuits on near-term quantum devices. Qiskit implements these as the PauliFeatureMap.

Insightful PennyLane tutorials:

References

  • Farhi & Neven, Classification with Quantum Neural Networks on Near Term Processors, 2018, arXiv:1802.06002.
    • Introduces the Variational classifier.

    • “We introduce a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning. […] We introduce parameter dependent unitaries that can be adapted by supervised learning of labeled data. […] We then discuss presenting the data as quantum superpositions of computational basis states corresponding to different label values. […] Our work is exploratory and relies on the classical simulation of small quantum systems.”

  • Havlicek, Supervised learning with quantum enhanced feature spaces, 2019, arXiv:1804.11326, doi:10.1038/s41586-019-0980-2.
    • Two methods of classification are presented:

      • Quantum variational classification: a circuit implements the classification task directly, i.e. it generates the separating hyperplane. Both the data and the optimization parameters are used to parameterize the circuit.

      • Quantum kernel estimation: the quantum circuit is used to implement the kernel, while the classification is performed using a classical SVM.

    • The method for data encoding is described in detail and introduces a family of feature maps, whose overlap is conjectured to be hard to estimate on a classical computer.

    • Kernel-based training is shown to find quantum models that are better than or as good as those found by variational circuit training, while using less quantum processing.

  • Schuld & Killoran, Quantum machine learning in feature Hilbert spaces, 2019, arXiv:1803.07128, doi:10.1103/PhysRevLett.122.040504.
  • Lloyd, Schuld et al., Quantum embeddings for machine learning, 2020, arXiv:2001.03622.
    • A technique called quantum metric learning is introduced, which enables effective quantum kernel alignment.

  • Matsuo et al., Problem-specific Parameterized Quantum Circuits of the VQE Algorithm for Optimization Problems, 2020, arXiv:2006.05643.

Learning about quantum systems

  • Huang et al., Power of data in quantum machine learning, 2021, arXiv:2011.01938, doi:10.1038/s41467-021-22539-9.
    • QKSVM (quantum kernel SVM) is used to quantify the computational power of data in quantum machine learning algorithms and to understand the conditions under which quantum models will be capable of outperforming classical ones.

    • Summary [all of this is about learning quantum models]

      • Data can elevate classical [machine learning] models to rival quantum models, even when the quantum circuits generating the data are hard to compute classically.

      • Following these constructions, in numerical experiments, we find that a variety of common quantum models in the literature perform similarly or worse than classical ML on both classical and quantum datasets due to a small geometric difference.

      • With the large geometric difference endowed by the projected quantum model, we are able to construct engineered datasets to demonstrate large prediction advantage over common classical ML models.

    • The quantum model considered here is also referred to as a quantum neural network (QNN) [variational classifier as in Farhi (2018)]. In this work, we focus on both classical and quantum ML models based on kernel functions k(xi, xj). The quantum kernel used is Tr[rho(x_i) rho(x_j)].

    • Our foundation is a general prediction error bound for training classical/quantum ML models to predict some quantum model [= learning quantum models].

    • The potential advantage for one ML algorithm defined by K1 to predict better than another ML algorithm defined by K2 depends on the largest possible separation between s^K1 and s^K2 for a dataset [s is the model complexity of the trained function. Small complexity means good generalization]. The quantity to measure this separation is the asymmetric geometrical difference between models.

    • Uses Fashion-MNIST preprocessed with PCA to reduce the dimensionality.

    • Associated Supplementary Information

      • Formal equivalence of an arbitrary-depth neural network with a quantum kernel method built from the original quadratic quantum kernel.

      • Constructing dataset to separate quantum and classical model [i.e. redefine the targets y_i for each x_i].

    • Associated TensorFlow Quantum Data tutorial.
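For pure states, the quantum kernel Tr[rho(x_i) rho(x_j)] used in this paper reduces to the state overlap |⟨ψ_i|ψ_j⟩|², which a short NumPy check confirms (the two example states are arbitrary normalized single-qubit states):

```python
import numpy as np

def rho(psi):
    """Density matrix |psi><psi| of a pure state."""
    return np.outer(psi, psi.conj())

# Two arbitrary normalized single-qubit states.
psi1 = np.array([np.cos(0.4), np.sin(0.4)], dtype=complex)
psi2 = np.array([np.cos(1.1), np.exp(1j * 0.2) * np.sin(1.1)])

# For pure states, Tr[rho_i rho_j] equals the fidelity |<psi_i|psi_j>|^2.
k_trace = np.trace(rho(psi1) @ rho(psi2)).real
k_overlap = np.abs(np.vdot(psi1, psi2)) ** 2
```

The density-matrix form is the one that generalizes to mixed states, which is why the paper states the kernel as a trace rather than as an overlap.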
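The asymmetric geometric difference between two kernels can be computed directly. This sketch assumes the definition g(K1‖K2) = sqrt(‖√K2 K1⁻¹ √K2‖_∞) from Huang et al. (2021), with ‖·‖_∞ the spectral norm and both kernel matrices normalized to Tr K = N; the small regularizer for the inversion is my own numerical-stability choice:

```python
import numpy as np

def psd_sqrt(K):
    """Matrix square root of a symmetric PSD kernel matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(K)
    vals = np.clip(vals, 0.0, None)   # clip tiny negative eigenvalues from round-off
    return (vecs * np.sqrt(vals)) @ vecs.T

def geometric_difference(K1, K2, reg=1e-9):
    """g(K1||K2) = sqrt(|| sqrt(K2) K1^{-1} sqrt(K2) ||_inf), spectral norm."""
    n = K1.shape[0]
    s2 = psd_sqrt(K2)
    M = s2 @ np.linalg.solve(K1 + reg * np.eye(n), s2)
    return np.sqrt(np.linalg.norm(M, ord=2))
```

A sanity check: g(K‖K) = 1 for any well-conditioned kernel, so values much larger than 1 signal a dataset on which the K2-model could generalize much better than the K1-model.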

  • Huang et al., Quantum advantage in learning from experiments, 2022, doi:10.1126/science.abn7293.
    • The first demonstration of a provable exponential advantage in learning about quantum systems that is robust even on today’s noisy hardware.

    • Combines quantum computing and quantum sensing to squeeze out more accuracy when measuring quantum systems.

    • Recipe: entangle multiple samples of the measurement (by transducing data from a physical system to a stable quantum memory) and process them with a quantum agent: quantum PCA, quantum learning.

    • Associated Google AI Blog.

    • See also the PennyLane tutorial.

Further reading

  • About Quantum Natural Gradient:
    • Stokes, Quantum Natural Gradient, 2020, arXiv:1909.02108.

    • Gacon, Simultaneous Perturbation Stochastic Approximation of the Quantum Fisher Information, 2021, arXiv:2103.09232.

  • Hubregtsen et al., Training quantum embedding kernels on near-term quantum computers, 2022, arXiv:2105.02276, doi:10.1103/PhysRevA.106.042431.
    • Quantum embedding kernels (QEKs), constructed by embedding data into the Hilbert space of a quantum computer, are a particular quantum kernel technique that allows one to gather insights into learning problems and that is particularly suitable for noisy intermediate-scale quantum devices.

    • We further show under which conditions noise from device imperfections influences the predicted kernel and provide a strategy to mitigate these detrimental effects which is tailored to quantum embedding kernels.

  • Glick et al., Covariant quantum kernels for data with group structure, 2022, arXiv:2105.03406, aps:S37.00007
    • Quantum kernels exist that, subject to computational hardness assumptions, cannot be computed classically. It is an important challenge to find quantum kernels that provide an advantage in the classification of real-world data. We introduce a class of quantum kernels that can be used for data with a group structure.

  • Liu et al., A rigorous and robust quantum speed-up in supervised machine learning, 2021, arXiv:2010.02174, doi:10.1038/s41567-021-01287-z.
    • Proposes a machine learning problem based on discrete logarithm which is assumed to be hard for any classical machine learning algorithm.

    • QKSVM is proven to provide a speed up over classical methods for certain specific input data classes.

  • Abbas et al., The power of quantum neural networks, 2021, arXiv:2011.00027, doi:10.1038/s43588-021-00084-1.
    • Expressibility of quantum and classical models is analyzed in terms of the Fisher information.

    • Quantum neural networks can show resilience to barren plateaus.