2024
Eric R. Anschuetz, Chi-Fang Chen, Bobak T. Kiani, Robbie King
Random spin systems at low temperatures are glassy and feature computational hardness in finding low-energy states. We study the random all-to-all interacting fermionic Sachdev–Ye–Kitaev (SYK) model and prove that, in contrast, (I) the low-energy states have polynomial circuit depth, yet (II) the annealed and quenched free energies agree to inverse-polynomially low temperatures, ruling out a glassy phase transition in this sense. These results are derived by showing that fermionic and spin systems significantly differ in their commutation index, which quantifies the non-commutativity of Hamiltonian terms. Our results suggest that low-temperature strongly interacting fermions, unlike spins, belong in a classically nontrivial yet quantumly easy phase.
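For reference, the quenched and annealed free energies compared in (II) are the standard disorder-averaged quantities (standard definitions, not specific to this paper):

```latex
% For a Hamiltonian H_J with random couplings J, at inverse temperature \beta:
F_{\mathrm{quenched}}(\beta) \;=\; -\frac{1}{\beta}\,\mathbb{E}_J\!\left[\log \operatorname{Tr} e^{-\beta H_J}\right],
\qquad
F_{\mathrm{annealed}}(\beta) \;=\; -\frac{1}{\beta}\,\log \mathbb{E}_J\!\left[\operatorname{Tr} e^{-\beta H_J}\right].
```

By Jensen's inequality the annealed free energy lower-bounds the quenched one; in glassy spin models the two separate below a critical temperature, so their agreement down to inverse-polynomially small temperatures is what rules out a transition of that kind here.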
2024
Stephen P. Jordan, Noah Shutty, Mary Wootters, Adam Zalcman, Alexander Schmidhuber, Robbie King, Sergei V. Isakov, Ryan Babbush
We introduce Decoded Quantum Interferometry (DQI), a quantum algorithm for reducing classical optimization problems to classical decoding problems by exploiting structure in the Fourier spectrum of the objective function. DQI reduces sparse max-XORSAT to decoding LDPC codes, which can be achieved using powerful classical algorithms such as Belief Propagation (BP). As an initial benchmark, we compare DQI using belief propagation decoding against classical optimization via simulated annealing. In this setting we present evidence that, for a certain family of max-XORSAT instances, DQI with BP decoding achieves a better approximation ratio on average than simulated annealing, although not better than specialized classical algorithms tailored to those instances. We also analyze a combinatorial optimization problem corresponding to finding polynomials that intersect the maximum number of points. There, DQI efficiently achieves a better approximation ratio than any polynomial-time classical algorithm known to us, thus realizing an apparent exponential quantum speedup. Finally, we show that the problem defined by Yamakawa and Zhandry in order to prove an exponential separation between quantum and classical query complexity is a special case of the optimization problem efficiently solved by DQI.
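As a rough illustration of the structure DQI exploits (our paraphrase, using the standard formulation of max-XORSAT rather than anything taken from the paper):

```latex
% max-XORSAT: given constraints a_i \cdot x \equiv b_i \pmod 2 over x \in \mathbb{F}_2^n,
% maximize the number of satisfied constraints.  In \pm 1 form the objective is
f(x) \;=\; \sum_{i=1}^{m} (-1)^{\,b_i + a_i \cdot x},
\qquad
\#\{\text{satisfied constraints}\} \;=\; \frac{m + f(x)}{2},
% so the Fourier support of f is contained in the rows \{a_i\} of the constraint matrix --
% the kind of spectral structure that links the optimization problem to a decoding problem.
```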
2024
Rolando D. Somma, Robbie King, Robin Kothari, Thomas O’Brien, Ryan Babbush
We present shadow Hamiltonian simulation, a framework for simulating quantum dynamics using a compressed quantum state that we call the “shadow state”. The amplitudes of this shadow state are proportional to the expectations of a set of operators of interest. The shadow state evolves according to its own Schrödinger equation, and under broad conditions can be simulated on a quantum computer. We analyze a number of applications of this framework to quantum simulation problems. This includes simulating the dynamics of exponentially large systems of free fermions, or exponentially large systems of free bosons, the latter example recovering a recent algorithm for simulating exponentially many classical harmonic oscillators. Shadow Hamiltonian simulation can be extended to simulate expectations of more complex operators such as two-time correlators or Green’s functions, and to study the evolution of operators themselves in the Heisenberg picture.
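As a small classical illustration of the free-fermion case (a sketch of ours, not code from the paper): for a quadratic Hamiltonian H = Σ_jk h_jk a_j† a_k, the vector of expectations ⟨a_j⟩ evolves under the n×n matrix h rather than the 2^n-dimensional H, which is the kind of compression the shadow state exploits.

```python
# Sketch (not from the paper): for free fermions, the expectations <a_j> form a
# "shadow state" obeying an n x n Schrodinger-like equation, even though the
# underlying Hilbert space has dimension 2^n.
import numpy as np
from scipy.linalg import expm
from functools import reduce

n = 4  # number of fermionic modes
rng = np.random.default_rng(0)

# Jordan-Wigner annihilation operators a_j = Z^{(j-1)} (x) sigma^- (x) I^{(n-j-1)}
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
sm = np.array([[0.0, 1.0], [0.0, 0.0]])  # sigma^-: maps |1> -> |0>

def kron_all(ops):
    return reduce(np.kron, ops)

a = [kron_all([Z] * j + [sm] + [I2] * (n - j - 1)) for j in range(n)]

# Random quadratic Hamiltonian H = sum_{jk} h_{jk} a_j^dag a_k with h Hermitian
h = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
h = (h + h.conj().T) / 2
H = sum(h[j, k] * a[j].conj().T @ a[k] for j in range(n) for k in range(n))

# Random initial state and its vector of expectations v_j = <psi| a_j |psi>
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)
v0 = np.array([psi.conj() @ (a[j] @ psi) for j in range(n)])

t = 0.7
# Full 2^n-dimensional evolution of the state
psi_t = expm(-1j * t * H) @ psi
v_full = np.array([psi_t.conj() @ (a[j] @ psi_t) for j in range(n)])

# "Shadow" evolution: the same expectations from an n x n equation, v(t) = e^{-iht} v(0)
v_shadow = expm(-1j * t * h) @ v0

print(np.allclose(v_full, v_shadow))  # True (up to numerical error)
```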
2024
Robbie King, David Gosset, Robin Kothari, Ryan Babbush
Given copies of a quantum state ρ, a shadow tomography protocol aims to learn all expectation values from a fixed set of observables, to within a given precision ϵ. We say that a shadow tomography protocol is triply efficient if it is sample- and time-efficient, and only employs measurements that entangle a constant number of copies of ρ at a time. The classical shadows protocol based on random single-copy measurements is triply efficient for the set of local Pauli observables. This and other protocols based on random single-copy Clifford measurements can be understood as arising from fractional colorings of a graph G that encodes the commutation structure of the set of observables. Here we describe a framework for two-copy shadow tomography that uses an initial round of Bell measurements to reduce to a fractional coloring problem in an induced subgraph of G with bounded clique number. This coloring problem can be addressed using techniques from graph theory known as chi-boundedness. Using this framework we give the first triply efficient shadow tomography scheme for the set of local fermionic observables, which arise in a broad class of interacting fermionic systems in physics and chemistry. We also give a triply efficient scheme for the set of all n-qubit Pauli observables. Our protocols for these tasks use two-copy measurements, which is necessary: sample-efficient schemes are provably impossible using only single-copy measurements. Finally, we give a shadow tomography protocol that compresses an n-qubit quantum state into a poly(n)-sized classical representation, from which one can extract the expected value of any of the 4^n Pauli observables in poly(n) time, up to a small constant error.
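As a toy illustration of the graph-theoretic viewpoint (a sketch of ours, not the paper's protocol): take the observables as vertices, join two observables when they do not commute, and color the resulting graph; each color class is a set of mutually commuting observables that can in principle be measured together. The sketch below builds this non-commutation graph for the weight-at-most-2 Pauli observables on a few qubits and greedily colors it.

```python
# Sketch (ours, not the paper's protocol): greedy coloring of the non-commutation
# graph of low-weight Pauli observables.  Each color class is a mutually commuting set.
from itertools import combinations, product

n = 4           # number of qubits
max_weight = 2  # Pauli observables acting on at most 2 qubits

def paulis_of_weight_at_most(n, w):
    obs = []
    for k in range(1, w + 1):
        for sites in combinations(range(n), k):
            for letters in product("XYZ", repeat=k):
                p = ["I"] * n
                for s, l in zip(sites, letters):
                    p[s] = l
                obs.append("".join(p))
    return obs

def commute(p, q):
    # Pauli strings commute iff they differ (both non-identity, unequal) on an even number of sites
    clashes = sum(1 for a, b in zip(p, q) if a != "I" and b != "I" and a != b)
    return clashes % 2 == 0

observables = paulis_of_weight_at_most(n, max_weight)

# Greedy coloring: assign each observable the smallest color unused by its non-commuting neighbors
color = {}
for p in observables:
    used = {color[q] for q in color if not commute(p, q)}
    c = 0
    while c in used:
        c += 1
    color[p] = c

num_colors = max(color.values()) + 1
print(f"{len(observables)} observables grouped into {num_colors} commuting color classes")
```

The paper's two-copy protocols go further: an initial round of Bell measurements reduces the task to coloring an induced subgraph with bounded clique number, whereas this sketch only illustrates the single-copy coloring picture.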
2024
Robbie King, Kianna Wan, Jarrod McClean
The ability of quantum computers to directly manipulate and analyze quantum states stored in quantum memory allows them to learn about aspects of our physical world that would otherwise be invisible given a modest number of measurements. Here we investigate a new learning resource which could be available to quantum computers in the future – measurements on the unknown state ρ accompanied by its complex conjugate, ρ ⊗ ρ*. For a certain shadow tomography task, we surprisingly find that measurements on copies of ρ ⊗ ρ* can be exponentially more powerful than measurements on ρ^⊗K, even for large K. This expands the class of provable exponential advantages using only a constant overhead quantum memory, or minimal quantum memory, and we provide a number of examples where the state ρ* is naturally available in both computational and physical applications. In addition, we precisely quantify the power of classical shadows on single copies under a generalized Clifford ensemble and give a class of quantities that can be efficiently learned. The learning task we study in both the single copy and quantum memory settings is physically natural and corresponds to real-space observables with a limit of bosonic modes, where it achieves an exponential improvement in detecting certain signals under a noisy background. We quantify a new and powerful resource in quantum learning, and we believe the advantage may find applications in improving quantum simulation, learning from quantum sensors, and uncovering new physical phenomena.
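One standard identity (our illustration, not the task studied in the paper) that hints at why the conjugate copy is a useful resource: projecting ρ ⊗ ρ* onto the maximally entangled state reads out the purity directly,

```latex
% With |\Phi^+\rangle = \tfrac{1}{\sqrt{d}}\sum_i |i\,i\rangle, a single Bell-basis outcome
% probability on \rho \otimes \rho^{*} equals the purity divided by d:
\langle \Phi^+ | \, \rho \otimes \rho^{*} \, | \Phi^+ \rangle
\;=\; \frac{1}{d} \sum_{i,j} |\rho_{ij}|^{2}
\;=\; \frac{\operatorname{Tr}(\rho^{2})}{d},
% whereas the same projection on \rho \otimes \rho instead gives \operatorname{Tr}(\rho\,\rho^{T})/d.
```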
2023
Amira Abbas, Robbie King, Hsin-Yuan Huang, William J. Huggins, Ramis Movassagh, Dar Gilboa, Jarrod R. McClean
The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation facilitates training through gradient computation at a total cost roughly proportional to running the function, rather than incurring an additional factor proportional to the number of parameters – which can now be in the trillions. Naively, one expects that quantum measurement collapse entirely rules out the reuse of quantum information as in backpropagation. But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. Here, we investigate whether parameterized quantum models can train as efficiently as classical neural networks. We show that achieving backpropagation scaling is impossible without access to multiple copies of a state. With this added ability, we introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources while reducing classical auxiliary computational costs to open problems in shadow tomography. These results highlight the nuance of reusing quantum information for practical purposes and clarify the unique difficulties in training large quantum models, which could alter the course of quantum machine learning.
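The scaling contrast at stake, in rough terms (a standard accounting, not taken from the paper): reverse-mode differentiation reuses intermediate values, while naive per-parameter gradient estimation, such as finite differences or the parameter-shift rule on a quantum circuit, pays an extra factor of the parameter count,

```latex
% Cost of one full gradient of a model with M parameters whose forward evaluation costs T:
\text{backpropagation:}\quad O(T),
\qquad
\text{per-parameter estimation (finite differences / parameter shift):}\quad O(M \cdot T).
```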
2023
Robbie King, Tamara Kohler
We study the complexity of a classic problem in computational topology, the homology problem: given a description of some space X and an integer k, decide if X contains a k-dimensional hole. The setting and statement of the homology problem are completely classical, yet we find that the complexity is characterized by quantum complexity classes. Our result can be seen as an aspect of a connection between homology and supersymmetric quantum mechanics [Wit82]. We consider clique complexes, motivated by the practical application of topological data analysis (TDA). The clique complex of a graph is the simplicial complex formed by declaring every (k+1)-clique in the graph to be a k-simplex. Our main result is that deciding whether the clique complex of a weighted graph has a hole or not, given a suitable promise, is QMA1-hard and contained in QMA. Our main innovation is a technique to lower bound the eigenvalues of the combinatorial Laplacian operator. For this, we invoke a tool from algebraic topology known as spectral sequences. In particular, we exploit a connection between spectral sequences and Hodge theory [For94]. Spectral sequences will play a role analogous to perturbation theory for combinatorial Laplacians. In addition, we develop the simplicial surgery technique used in prior work [CK22]. Our result provides some suggestion that the quantum TDA algorithm [LGZ16] cannot be dequantized. More broadly, we hope that our results will open up new possibilities for quantum advantage in topological data analysis.
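To make the objects concrete, here is a small classical sketch (ours, unrelated to the quantum algorithm or the hardness proof): build the clique complex of a graph and read off Betti numbers from ranks of boundary maps. The 4-cycle has a single 1-dimensional hole.

```python
# Sketch (ours): Betti numbers of the clique complex of a small graph via real-valued
# boundary matrices.  beta_k = dim ker(d_k) - rank(d_{k+1}); a 4-cycle has beta_1 = 1.
import numpy as np
from itertools import combinations

edges = {(0, 1), (1, 2), (2, 3), (0, 3)}          # the 4-cycle C4
vertices = sorted({v for e in edges for v in e})

def is_clique(vs):
    return all(tuple(sorted(p)) in edges for p in combinations(vs, 2))

def simplices(k):
    # k-simplices of the clique complex = (k+1)-cliques of the graph
    return [c for c in combinations(vertices, k + 1) if is_clique(c)]

def boundary_matrix(k):
    # d_k[v0..vk] = sum_i (-1)^i [v0 .. v_i-hat .. vk], expressed in the simplex bases
    rows, cols = simplices(k - 1), simplices(k)
    D = np.zeros((len(rows), len(cols)))
    index = {s: i for i, s in enumerate(rows)}
    for j, s in enumerate(cols):
        for i in range(len(s)):
            face = s[:i] + s[i + 1:]
            D[index[face], j] = (-1) ** i
    return D

def betti(k):
    n_k = len(simplices(k))
    rank_dk = np.linalg.matrix_rank(boundary_matrix(k)) if k > 0 else 0
    D_next = boundary_matrix(k + 1)
    rank_dk1 = np.linalg.matrix_rank(D_next) if D_next.size else 0
    return (n_k - rank_dk) - rank_dk1

print(betti(0), betti(1))  # 1 connected component, 1 one-dimensional hole
```

The hardness result, by contrast, concerns weighted clique complexes of exponentially large dimension under a spectral promise, where this kind of direct rank computation is infeasible.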
2022
Dominic W. Berry, Yuan Su, Casper Gyurik, Robbie King, Joao Basso, Alexander Del Toro Barba, Abhishek Rajput, Nathan Wiebe, Vedran Dunjko, and Ryan Babbush
Lloyd et al. [Nat. Commun. 7, 10138 (2016)] were the first to demonstrate the promise of quantum algorithms for computing Betti numbers, a way to characterize topological features of data sets. Here, we propose, analyze, and optimize an improved quantum algorithm for topological data analysis (TDA) with reduced scaling, including a method for preparing Dicke states based on inequality testing, a more efficient amplitude estimation algorithm using Kaiser windows, and an optimal implementation of eigenvalue projectors based on Chebyshev polynomials. We compile our approach to a fault-tolerant gate set and estimate constant factors in the Toffoli complexity. Our analysis reveals that superquadratic quantum speedups are only possible for this problem when targeting a multiplicative error approximation and the Betti number grows asymptotically. Further, we propose a dequantization of the quantum TDA algorithm showing that exponentially large dimension and Betti number are necessary, but not sufficient, conditions for superpolynomial advantage. We then introduce and analyze specific problem examples which have parameters in the regime where superpolynomial advantages may be achieved, and argue that quantum circuits with tens of billions of Toffoli gates can solve seemingly classically intractable instances.
2022
We give an approximation algorithm for Quantum Max-Cut which works by rounding an SDP relaxation to an entangled quantum state. The SDP is used to choose the parameters of a variational quantum circuit. The entangled state is then represented as the quantum circuit applied to a product state. It achieves an approximation ratio of 0.582 on triangle-free graphs. The previous best algorithms, by Anshu, Gosset, and Morenz and by Parekh and Thompson, achieved approximation ratios of 0.531 and 0.533 respectively. In addition, we study the EPR Hamiltonian, which we argue is a natural intermediate problem which isolates some key quantum features of local Hamiltonian problems. For the EPR Hamiltonian, we give an approximation algorithm with approximation ratio on all graphs.
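For readers unfamiliar with the problem, Quantum Max-Cut on a graph G = (V, E) is the standard maximization over n-qubit states of the antiferromagnetic Heisenberg Hamiltonian (a standard definition from the literature, not quoted from this paper; as we understand it, the EPR Hamiltonian replaces each singlet projector with the projector onto an EPR pair):

```latex
% Quantum Max-Cut: maximize \langle\psi| H_G |\psi\rangle over all n-qubit states, where
H_G \;=\; \sum_{(u,v)\in E} \tfrac{1}{4}\left(I - X_u X_v - Y_u Y_v - Z_u Z_v\right),
% i.e. each edge term is the projector onto the two-qubit singlet.  An algorithm achieves
% approximation ratio \alpha if it outputs |\psi\rangle with
% \langle\psi| H_G |\psi\rangle \;\ge\; \alpha \cdot \max_{\phi} \langle\phi| H_G |\phi\rangle.
```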