Quantum Generative Models¶
Ising models and associative memory,
Restricted Boltzmann Machines as the first working prototypes of deep networks,
What about Quantum Boltzmann Machines?
…  May 19, 2023
Work in progress…
Literature: RBM¶
 Hinton et al., A fast learning algorithm for deep belief nets, 2006, pdf:fastnc, doi:10.1162/neco.2006.18.7.1527  Univ. Toronto, Univ. Singapore.
“Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.”
 Hinton, A Practical Guide to Training Restricted Boltzmann Machines, 2010, pdf:guideTR, doi:10.1007/978-3-642-35289-8_32
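The CD-1 recipe from Hinton's guide fits in a few lines of NumPy. The sketch below is a toy illustration, not a reference implementation: the layer sizes, learning rate, and binary patterns are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny RBM: n_v visible units, n_h hidden units, all binary {0, 1}.
n_v, n_h = 6, 3
W = 0.01 * rng.standard_normal((n_v, n_h))  # weights
a = np.zeros(n_v)                           # visible biases
b = np.zeros(n_h)                           # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.1):
    """One CD-1 update on a batch of visible vectors v0, shape (batch, n_v)."""
    global W, a, b
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to a reconstruction.
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    # Approximate log-likelihood gradient: data stats minus reconstruction stats.
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return np.mean((v0 - pv1) ** 2)  # reconstruction error as a progress proxy

# Toy data: two repeated complementary binary patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 8, dtype=float)
errs = [cd1_step(data) for _ in range(200)]
print(f"reconstruction error: {errs[0]:.3f} -> {errs[-1]:.3f}")
```

Note that the reconstruction error is only a rough proxy; Hinton's guide explicitly warns against using it as the sole stopping criterion.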
Literature: QBM¶
 Ventura et al., Quantum associative memory, 2000, arXiv:quant-ph/9807053, doi:10.1016/S0020-0255(99)00101-2 
…
 Adachi and Henderson, Application of Quantum Annealing to Training of Deep Neural Networks, 2015, arXiv:1510.06356  Lockheed Martin.
“We investigated an alternative approach that estimates model expectations of Restricted Boltzmann Machines using samples from a D-Wave quantum annealing machine. [alternative to Contrastive Divergence]”.
 Amin et al., Quantum Boltzmann Machine, 2016, pdf:10.1103/PhysRevX.8.021050, arXiv:1601.02036, doi:10.1103/PhysRevX.8.021050  D-Wave, Univ. Waterloo.
…
 Han et al., Unsupervised Generative Modeling Using Matrix Product States, 2018, arXiv:1709.01662, doi:10.1103/PhysRevX.8.031012 
“Our model enjoys efficient learning analogous to the density matrix renormalization group method. […] We apply our method to generative modeling of several standard datasets including the Bars and Stripes, random binary patterns and the MNIST handwritten digits to illustrate the abilities, features and drawbacks of our model over popular generative models such as Hopfield model, Boltzmann machines and generative adversarial networks.”
 Gao et al., A quantum machine learning algorithm based on generative models, 2018, arXiv:1711.02038, doi:10.1126/sciadv.aat9004  Tsinghua Univ., Univ. Michigan
…
 Crawford et al., Reinforcement learning using quantum Boltzmann machines, 2018, arXiv:1612.05695, doi:10.5555/3370185.3370188  1QBit, Univ. British Columbia.
“We investigate whether quantum annealers with select chip layouts can outperform classical computers in reinforcement learning tasks.”
 Verdon et al., A quantum algorithm to train neural networks using low-depth circuits, 2019, arXiv:1712.05304  Univ. Waterloo.
“Our algorithm, which we call the Quantum Approximate Boltzmann Machine (QABoM) algorithm, generates approximate samples of distributions for use in machine learning on a near-term circuit model device rather than a quantum annealer.”
 Zoufal et al., Variational quantum Boltzmann machines, 2021, pdf:10.1007/s42484-020-00033-7, arXiv:2006.06004, doi:10.1007/s42484-020-00033-7  IBM Quantum, ETH Zürich.
“Novel realization approach to quantum Boltzmann machines (QBMs). The preparation of the required Gibbs states, as well as the evaluation of the loss function’s analytic gradient, is based on variational quantum imaginary time evolution. […] We illustrate the application of this variational QBM approach to generative and discriminative learning tasks using numerical simulation”
 Zoufal, Generative Quantum Machine Learning, 2021, arXiv:2111.12738  ETH Zürich
PhD thesis
 Perot, Quantum Boltzmann Machines: Applications in Quantitative Finance, 2022, arXiv:2301.13295  RWTH Aachen, Jülich SC.
Master thesis
 Viszlai et al., Training Quantum Boltzmann Machines with Coresets, 2022, doi:10.1109/QCE53715.2022.00049  Univ. Chicago, ColdQuanta.
“We apply these ideas to Quantum Boltzmann Machines (QBM) where gradient-based steps which require Gibbs state sampling are the main computational bottleneck during training. By using a coreset in place of the full data set, we try to minimize the number of steps needed and accelerate the overall training time.”
 Boedeker et al., Optimal storage capacity of quantum Hopfield neural networks, 2022, arXiv:2210.07894
“Our method opens an avenue for a systematic characterization of the storage capacity of quantum associative memories.”
 Huijgen et al., Training Quantum Boltzmann Machines with the β-Variational Quantum Eigensolver, 2023, arXiv:2304.08631  Radboud Univ. (NL), Quantinuum.
“The training of the QBM consists of minimizing the relative entropy from the model to the target state. This requires QBM expectation values which are computationally intractable for large models in general. It is therefore important to develop heuristic training methods that work well in practice.”
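The relative-entropy loss mentioned in the last excerpt can be written out exactly for a toy model. The sketch below is an illustrative assumption, not the construction from any one of the papers above: it builds the Gibbs state ρ = e⁻ᴴ/Tr e⁻ᴴ of a two-qubit transverse-field Ising Hamiltonian and brute-force scans the parameters for the minimum of S(η‖ρ). Real QBM training replaces both the exact matrix exponentials and the scan with the variational or heuristic methods cited above, precisely because these quantities are intractable at scale.

```python
import numpy as np
from scipy.linalg import expm, logm

# Pauli matrices.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def gibbs_state(H):
    """rho = exp(-H) / Tr exp(-H), with the inverse temperature absorbed into H."""
    rho = expm(-H)
    return rho / np.trace(rho).real

def relative_entropy(eta, rho):
    """S(eta || rho) = Tr eta (log eta - log rho): the QBM training loss."""
    return np.trace(eta @ (logm(eta) - logm(rho))).real

# Two-qubit transverse-field Ising QBM: H = -w Z1 Z2 - g (X1 + X2).
def hamiltonian(w, g):
    return -w * kron(Z, Z) - g * (kron(X, I2) + kron(I2, X))

# Target state: the Gibbs state of some "true" parameters.
eta = gibbs_state(hamiltonian(1.0, 0.5))

# Crude grid scan in place of gradient descent over (w, g).
best = min((relative_entropy(eta, gibbs_state(hamiltonian(w, g))), w, g)
           for w in np.linspace(0.2, 1.4, 7)
           for g in np.linspace(0.1, 0.9, 5))
print(f"min S(eta||rho) = {best[0]:.4f} at w = {best[1]:.2f}, g = {best[2]:.2f}")
```

Since the grid happens to contain the true parameters, the scan recovers them with loss zero; the point of the heuristics in the papers above is to get there without ever forming the 2ⁿ×2ⁿ matrices.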
Literature: QCBM¶
 Liu and Wang, Differentiable Learning of Quantum Circuit Born Machine, 2018, arXiv:1804.04168, doi:10.1103/PhysRevA.98.062324  Univ. Beijing
…
 Benedetti et al., A generative modeling approach for benchmarking and training shallow quantum circuits, 2019, arXiv:1801.07686, doi:10.1038/s41534-019-0157-8  Univ. College London, Qubitera, Rigetti, IonQ
…
 Coyle et al., The Born supremacy: quantum advantage and training of an Ising Born machine, 2020, pdf:nature/s41534-020-00288-9, arXiv:1904.02214, doi:10.1038/s41534-020-00288-9  Univ. Edinburgh
…
 Riofrio et al., A performance characterization of quantum generative models, 2023, arXiv:2301.09363  QUTAC (BMW, Munich Re, BASF, SAP, Merck, Lufthansa)
[Presented at DLR-QCI Austauschforum 2023, Hamburg]
…
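The Born-machine idea behind these papers (sample bitstrings from |⟨x|U(θ)|0⟩|² and adjust θ to match a data distribution) fits in a short statevector sketch. Everything below is illustrative: a two-qubit RY/CNOT ansatz, a squared-error loss, and finite-difference gradients stand in for the MMD loss and parameter-shift gradients that Liu and Wang actually use.

```python
import numpy as np

# Single-qubit RY rotation and the two-qubit CNOT, as plain real matrices.
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def born_probs(theta):
    """p(x) = |<x|U(theta)|00>|^2 over the bitstrings x in {00, 01, 10, 11}."""
    psi = np.zeros(4)
    psi[0] = 1.0                                     # start in |00>
    psi = np.kron(ry(theta[0]), ry(theta[1])) @ psi  # RY layer
    psi = CNOT @ psi                                 # entangling gate
    psi = np.kron(ry(theta[2]), ry(theta[3])) @ psi  # RY layer
    return psi ** 2  # amplitudes are real here, so |.|^2 is just a square

# Toy target: an equal mixture of 00 and 11.
target = np.array([0.5, 0.0, 0.0, 0.5])

def loss(theta):
    return np.sum((born_probs(theta) - target) ** 2)

rng = np.random.default_rng(1)
theta0 = rng.uniform(0, np.pi, 4)
theta = theta0.copy()
for _ in range(1000):  # finite-difference gradient descent
    grad = np.array([(loss(theta + 0.01 * e) - loss(theta - 0.01 * e)) / 0.02
                     for e in np.eye(4)])
    theta -= 0.2 * grad
print("learned distribution:", np.round(born_probs(theta), 3))
```

On hardware one can only estimate the probabilities from measurement shots, which is why the differentiable-learning paper works with a kernel-based (MMD) loss that is estimable from samples rather than with the exact distribution used in this sketch.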
Code¶

“In this thesis we explore using the D-Wave Advantage 4.1 quantum annealer to sample from quantum Boltzmann distributions and train quantum Boltzmann machines (QBMs). […] Our findings indicate that QBMs trained using the Advantage 4.1 are much noisier than those trained using simulations and struggle to perform at the same level as classical RBMs. However, there is the potential for QBMs to outperform classical RBMs if future generation annealers can generate samples closer to the desired theoretical distributions.”

“Quantum Restricted Boltzmann Machines based on the paper arXiv:1712.05304 [Verdon et al., 2019]”
See also jugit.fz-juelich:qip/qbm

“In this tutorial, we will explore quantum GANs to generate handwritten digits of zero.”