If the quantum computing era dawned 3 years ago, its rising sun may have ducked behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their quantum computer Sycamore performed in 200 seconds an abstruse calculation they said would tie up a supercomputer for 10,000 years. Now, scientists in China have done the computation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.
“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes a bit of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting to 300 feet from the summit is less exciting than getting to the summit.”
Still, the promise of quantum computing remains undimmed, Kuperberg and others say. And Sergio Boixo, principal scientist for Google Quantum AI, said in an email the Google team knew its edge might not hold for very long. “In our 2019 paper, we said that classical algorithms would improve,” he said. But, “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”
The “problem” Sycamore solved was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or—thanks to quantum mechanics—any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonating electrical circuits made of superconducting metal, can encode any number from 0 to 2⁵³ (roughly 9 quadrillion)—or even all of them at once.
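That "roughly 9 quadrillion" follows directly from the arithmetic of qubits: n qubits span 2ⁿ basis states. A quick check in plain Python (nothing here comes from the experiment itself):

```python
# Number of basis states that 53 qubits can represent simultaneously.
n_qubits = 53
n_states = 2 ** n_qubits
print(n_states)  # 9007199254740992 -- roughly 9 quadrillion
```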
Starting with all the qubits set to 0, Google researchers applied a random but fixed set of logical operations, or gates, to single qubits and pairs over 20 cycles, then read out the qubits. Crudely speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that reinforced some outputs and canceled others. So some should have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
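The procedure can be caricatured on a few qubits with a plain state-vector simulation. This is a toy sketch, not Google's circuit: the gate set below (random rotations plus controlled-Z entangling gates) is a stand-in for Sycamore's calibrated gates, and four qubits stand in for 53.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                           # toy circuit; Sycamore used 53 qubits
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                  # all qubits start in |0>

def apply_1q(state, gate, q, n):
    """Apply a 2x2 gate to qubit q of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, q, 0)
    psi = np.tensordot(gate, psi, axes=(1, 0))
    psi = np.moveaxis(psi, 0, q)
    return psi.reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a controlled-Z gate between qubits q1 and q2."""
    psi = state.copy()
    for idx in range(2 ** n):
        if (idx >> (n - 1 - q1)) & 1 and (idx >> (n - 1 - q2)) & 1:
            psi[idx] *= -1
    return psi

for cycle in range(20):          # 20 cycles, as in the experiment
    for q in range(n):           # random single-qubit rotations
        theta = rng.uniform(0, 2 * np.pi)
        gate = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]], dtype=complex)
        state = apply_1q(state, gate, q, n)
    for q in range(0, n - 1, 2): # entangling gates on pairs
        state = apply_cz(state, q, q + 1, n)

probs = np.abs(state) ** 2       # interference makes some outputs likelier
print(np.round(probs, 3))
```

Measuring such a circuit many times samples bit strings from `probs`; the unevenness of that distribution is the "spiky pattern" the article describes.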
The Google researchers argued that simulating those interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory, which has 9216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers with IBM, which developed Summit, quickly countered that if they exploited every bit of hard drive space available to the computer, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics at the Chinese Academy of Sciences, and colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.
Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer comprising 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor—a 2D or 4D grid of complex numbers. Running the simulation then reduced to, essentially, multiplying all the tensors. “The advantage of the tensor network method is we can use many GPUs to do the computations in parallel,” Zhang says.
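In miniature, the idea looks like this: each gate becomes a small grid of complex numbers, and simulating the circuit reduces to multiplying the tensors together along their shared indices. The two-qubit toy below contracts a Hadamard and a CNOT with NumPy's `einsum`; the team's actual 53-qubit, 20-cycle network was vastly larger and was contracted in parallel on GPUs.

```python
import numpy as np

zero = np.array([1.0, 0.0], dtype=complex)   # |0> state of one qubit

# A single-qubit gate is a 2D tensor (2x2 grid of complex numbers).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# A two-qubit gate is a 4D tensor; indices are (out1, out2, in1, in2).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex).reshape(2, 2, 2, 2)

psi1 = H @ zero                              # H applied to qubit 1

# Contract the whole network in one pass: sum over the shared
# indices c (qubit 1 into CNOT) and d (qubit 2 into CNOT).
amps = np.einsum('abcd,c,d->ab', CNOT, psi1, zero)

print(np.round(amps, 3))   # amplitudes of |00>, |01>, |10>, |11>
```

Here the contraction yields the Bell state (|00> + |11>)/√2. The payoff Zhang cites is that tensor contractions of this kind parallelize naturally across many GPUs.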
Zhang and colleagues also relied on a key insight: Sycamore’s computation was far from exact, so theirs didn’t need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of 0.2%—just enough to distinguish the fingerprintlike spikiness from the noise in the circuitry. So Zhang’s team traded accuracy for speed by cutting some lines in its network and eliminating the corresponding gates. Losing just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.
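The factor of 256 is the exponential bookkeeping of the cuts: each severed line in the network roughly halves the contraction cost, so eight cuts buy a 2⁸-fold speedup. This sketch shows only that cost scaling; the fidelity lost per cut depends on the particular gates removed.

```python
# Each cut line roughly halves the tensor-contraction cost,
# so the speedup grows as 2 to the number of cut lines.
lines_cut = 8
speedup = 2 ** lines_cut
print(speedup)  # 256, matching the reported 256x speedup
```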
The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that the Google experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take a few dozen seconds, Zhang says—10 billion times faster than the Google team estimated.
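The "10 billion times faster" figure is consistent with simple arithmetic, if "a few dozen seconds" is taken as roughly 30 seconds (an assumption for illustration; the article gives no exact number):

```python
# Rough arithmetic behind the ~10-billion-fold speedup claim.
seconds_per_year = 365.25 * 24 * 3600       # about 3.16e7 s
google_estimate = 10_000 * seconds_per_year # 10,000 years, in seconds
supercomputer_time = 30                     # "a few dozen seconds" (assumed)
print(google_estimate / supercomputer_time) # about 1e10, i.e. 10 billion
```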
The advance underscores the pitfalls of racing a quantum computer against a conventional one, researchers say. “There’s an urgent need for better quantum supremacy experiments,” Aaronson says. Zhang suggests a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”
Still, the Google demonstration was not just hype, researchers say. Sycamore required far fewer operations and less power than a supercomputer, Zhang notes. And if Sycamore had slightly higher fidelity, he says, his team’s simulation couldn’t have kept up. As Hangleiter puts it, “The Google experiment did what it was meant to do, start this race.”