SciPost Submission Page
Accelerating equilibrium spin-glass simulations using quantum annealers via generative deep learning
by Giuseppe Scriva, Emanuele Costa, Benjamin McNaughton and Sebastiano Pilati
This is not the latest submitted version.
This Submission thread is now published as
Authors (as registered SciPost users): Sebastiano Pilati · Giuseppe Scriva
Preprint Link: scipost_202212_00028v1 (pdf)
Date submitted: 2022-12-15 12:19
Submitted by: Scriva, Giuseppe
Submitted to: SciPost Physics
Adiabatic quantum computers, such as the quantum annealers commercialized by D-Wave Systems Inc., are routinely used to tackle combinatorial optimization problems. In this article, we show how to exploit them to accelerate equilibrium Markov chain Monte Carlo simulations of computationally challenging spin-glass models at low but finite temperatures. This is achieved by training generative neural networks on data produced by a D-Wave quantum annealer, and then using them to generate smart proposals for the Metropolis-Hastings algorithm. In particular, we explore hybrid schemes by combining single spin-flip and neural proposals, as well as D-Wave and classical Monte Carlo training data. The hybrid algorithm outperforms the single spin-flip Metropolis-Hastings algorithm. It is competitive with parallel tempering in terms of correlation times, with the significant benefit of a much shorter equilibration time.
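The hybrid scheme described in the abstract can be illustrated with a minimal Python sketch. Here a factorized Bernoulli model (`p_up`) is a hypothetical stand-in for the trained generative network, and the names `hybrid_sweep` and `neural_frac` are illustrative, not the authors' implementation; the essential point is that an independent "neural" proposal is accepted with the full Metropolis-Hastings ratio, which requires the proposal likelihood to be tractable.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(s, J):
    """Ising spin-glass energy E = -1/2 s^T J s (J symmetric, zero diagonal)."""
    return -0.5 * s @ J @ s

def log_q(s, p_up):
    """Log-likelihood of configuration s under the factorized generative
    model, a stand-in for the trained network's tractable likelihood."""
    probs = np.where(s > 0, p_up, 1.0 - p_up)
    return np.log(probs).sum()

def hybrid_sweep(s, J, beta, p_up, neural_frac=0.1):
    """One sweep mixing local single spin-flip moves and global neural moves."""
    n = len(s)
    for _ in range(n):
        if rng.random() < neural_frac:
            # Global move: independent sample from the generative model.
            # Acceptance ratio p(s')q(s) / [p(s)q(s')] with p ~ exp(-beta E).
            s_new = np.where(rng.random(n) < p_up, 1, -1)
            dlogp = beta * (energy(s, J) - energy(s_new, J))
            dlogq = log_q(s, p_up) - log_q(s_new, p_up)
            if np.log(rng.random()) < dlogp + dlogq:
                s = s_new
        else:
            # Local move: flip one spin; dE = 2 s_i h_i with local field h_i.
            i = rng.integers(n)
            dE = 2.0 * s[i] * (J[i] @ s)
            if np.log(rng.random()) < -beta * dE:
                s = s.copy()
                s[i] = -s[i]
    return s
```

Because the neural proposal is drawn independently of the current state, the correction factor q(s)/q(s') must appear in the acceptance probability; accepted global moves decorrelate the chain far faster than local flips, which is the source of the shorter correlation times reported in the paper.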
Submission & Refereeing History
Reports on this Submission
- Cite as: Anonymous, Report on arXiv:scipost_202212_00028v1, delivered 2023-04-04, doi: 10.21468/SciPost.Report.6999
Strengths:
1. The manuscript introduces a clear and neat application of quantum annealers, which is of great interest.
2. It promotes and justifies further research into the application of quantum annealers to produce samples for training machine-learning codes.

Weaknesses:
1. The numerical evidence is sufficient but not extensive.
In this paper the authors report on a study of how to accelerate Monte Carlo simulations with a generative neural network that provides smart proposals for the updates performed during the execution of the algorithm. The crucial point is that, to this end, one needs to sample a probability distribution of spin-glass models, which can be done efficiently and quickly at low temperatures using a D-Wave computer. For high temperatures the authors conclude it has to be done using single spin-flip and neural proposals. The data produced are used to train the generative neural network and optimize the algorithm. The result outperforms the single spin-flip Metropolis-Hastings algorithm and competes with parallel tempering.

The paper represents a clever and promising application of quantum annealers and I recommend publication. A few comments follow. The numerical evidence is sufficient in my opinion, but not extensive: some improvement is obtained in some cases, but there is a vast variety of parameters that could be tuned or explored, for example the effect of different activation functions, a wider range of lattice sizes, etc. Also, the comparison is made only with some algorithms, such as single spin-flip or parallel tempering, while there are more proposals in the recent literature. Nonetheless, in my opinion, this work gives a good enough indication that sampling with a quantum annealer is promising, and it justifies publication.
Requested changes:
1. I would ask the authors to include in the conclusions section a short discussion of other recent strategies and algorithms, beyond parallel tempering.
- Cite as: Anonymous, Report on arXiv:scipost_202212_00028v1, delivered 2023-03-08, doi: 10.21468/SciPost.Report.6868
This is a report on "Accelerating equilibrium spin-glass simulations using quantum annealers via generative deep learning" by Scriva and collaborators.
The Authors present a thorough demonstration of how configurations generated by a quantum annealer may be used to train neural networks, to provide an alternative to classical Monte Carlo sampling of equilibrium correlations. The paper is clearly written and convincingly substantiates a link between the different research areas of equilibrium statistical mechanics and quantum adiabatic machines. It fulfills the minimal criteria for publication in SciPost.
As a non-expert in machine learning and neural networks I appreciated the clarity of the manuscript, and also the clear explanation of the logic of training with experimental data. On the physics side of things, it was not clear how, or whether, the finite physical temperature of the annealer enters the analysis. The authors touched on this issue in the paragraph towards the end, where they discuss the interplay with the finite annealing time. Perhaps they can clarify the issue further if and when they revise the manuscript.