Complexity, Fidelity, Application
Things are getting real for researchers in the UC Santa Barbara/Google group led by physicist John Martinis. They are making good on their intention to claim quantum supremacy in a tight global race to build the first quantum machine to outperform the world’s best classical supercomputers.
But what is quantum supremacy in a field where horizons are being widened on a regular basis, in which teams of the brightest quantum computing minds in the world routinely up the ante on the number and type of quantum bits (“qubits”) they can build, each with their own range of qualities?
“Let’s define that, because it’s kind of vague,” said Google researcher Charles Neill. Simply put, he continued, “we would like to perform an algorithm or computation that couldn’t be done otherwise. That’s what we actually mean.”
Neill is lead author of the group’s new paper, “A blueprint for demonstrating quantum supremacy with superconducting qubits,” now published in the journal Science.
Fortunately, nature offers up many such complex situations, in which the variables are so numerous and interdependent that classical computers can’t hold all the values and perform the operations. Think chemical reactions, fluid interactions, even quantum phase changes in solids and a host of other problems that have daunted researchers in the past. Something on the order of at least 49 qubits — roughly equivalent to a petabyte (one million gigabytes) of classical random access memory — could put a quantum computer on equal footing with the world’s supercomputers. Just recently, Neill’s Google/Martinis colleagues announced an effort toward quantum supremacy with a 72-qubit chip possessing a “Bristlecone” architecture that has yet to be put through its paces.
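To put that memory comparison in rough perspective, here is a back-of-the-envelope sketch (an illustration of the scaling, not a calculation from the paper) of the classical RAM needed just to store the state vector of an n-qubit system, assuming one complex amplitude per basis state:

```python
# Rough estimate of classical memory needed to hold an n-qubit state vector.
# Assumes one complex amplitude per basis state; the bytes-per-amplitude figure
# depends on numerical precision (8 bytes for a single-precision complex number).

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 8) -> int:
    """Memory in bytes for all 2**n_qubits complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (40, 45, 49):
    petabytes = state_vector_bytes(n) / 1e15
    print(f"{n} qubits: {2 ** n:,} amplitudes, about {petabytes:.2f} PB")
```

At 49 qubits the total lands in the petabyte range, which is the comparison made above; the precise figure depends on how many bytes are assumed per amplitude.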
But according to Neill, it’s more than the number of qubits on hand.
“You have to generate some sort of evolution in the system which leads you to use every state that has a name associated with it,” he said. The power of quantum computing lies in, among other things, the superpositioning of states. In classical computers, each bit can exist in one of two states — zero or one, off or on, true or false — but qubits can exist in a third state that is a superposition of both zero and one, raising exponentially the number of possible states a quantum system can explore.
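The exponential growth Neill alludes to is easy to see in a toy state-vector picture. The sketch below is a generic illustration, not the group’s own code: it composes qubits prepared in an equal superposition and shows the amplitude count doubling with every qubit added.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a normalized pair of complex amplitudes
# over |0> and |1>. Composing qubits is a tensor product, so an n-qubit register
# carries 2**n amplitudes at once.

plus = np.array([1.0, 1.0]) / np.sqrt(2.0)   # (|0> + |1>)/sqrt(2): an equal superposition

state = np.array([1.0])
for n_qubits in range(1, 6):
    state = np.kron(state, plus)             # add one more superposed qubit
    print(f"{n_qubits} qubit(s): {state.size} amplitudes")
```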
Additionally, say the researchers, fidelity is important, because massive processing power is not worth much if it’s not accurate. Decoherence is a major challenge for anyone building a quantum computer: perturb the system and the information changes; wait a few hundredths of a second too long and it changes again.
“People might build 50 qubit systems, but you have to ask how well it computed what you wanted it to compute,” Neill said. “That’s a critical question. It’s the hardest part of the field.” Experiments with their superconducting qubits have demonstrated an error rate of one percent per qubit in three- and nine-qubit systems, a rate they say can be reduced as they scale up through improvements in hardware, calibration, materials, architecture and machine learning.
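As a crude illustration of why that per-qubit error rate matters (a simplified independent-error model assumed here for illustration, not the paper’s actual error analysis), small errors compound quickly as the register grows:

```python
# Simplified, illustrative model: if each qubit contributes an independent error
# probability p per run, the chance the whole register stays error-free is (1 - p)**n.
# Real error budgets are more nuanced than this.

def register_fidelity(n_qubits: int, error_per_qubit: float) -> float:
    return (1.0 - error_per_qubit) ** n_qubits

for n in (3, 9, 49):
    print(f"{n} qubits at 1% error each: "
          f"{register_fidelity(n, 0.01):.1%} chance of an error-free run")
```

Under this toy model, a three-qubit run comes out clean about 97 percent of the time, a nine-qubit run about 91 percent, and a 49-qubit run only about 61 percent, which is why the group pairs scaling up with driving the error rate down.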
Building a qubit system complete with error correction components — the researchers estimate a range of 100,000 to a million qubits — is doable and part of the plan, though still years away. But that doesn’t mean their system isn’t already capable of doing some heavy lifting. Just recently it was deployed, with spectroscopy, on the issue of many-body localization in a quantum phase change — a quantum computer solving a quantum statistical mechanics problem. In that experiment, the nine-qubit system became a quantum simulator, using photons bouncing around in their array to map the evolution of electrons in a system of increasing, yet highly controlled, disorder.
“A good reason why our fidelity was so high is because we’re able to reach complex states in very little time,” Neill explained. The more quickly a system can explore all possible states, the better the prediction of how a system will evolve, he said.
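One way to read Neill’s point about speed (a rule-of-thumb model assumed here, not a result from the paper) is as a race against decoherence: if coherence decays roughly exponentially in time, then the faster a circuit reaches a complex state, the less fidelity it loses along the way.

```python
import math

# Rule-of-thumb sketch: coherence decays roughly as exp(-t / T_coherence), so
# reaching the target state sooner leaves more of it intact.
# The numbers below are hypothetical, chosen only to show the trend.

T_COHERENCE_US = 20.0   # hypothetical coherence time, in microseconds

for circuit_time_us in (0.5, 2.0, 10.0):
    remaining = math.exp(-circuit_time_us / T_COHERENCE_US)
    print(f"circuit time {circuit_time_us:4.1f} us: about {remaining:.0%} coherence remaining")
```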
If all goes smoothly, the world should be seeing a practicable UCSB/Google quantum computer soon. The researchers are eager to put it through its paces, gaining answers to questions that were once accessible only through theory, extrapolation and highly educated guessing — and opening up a whole new level of experiments and research.
“It’s definitely very exciting,” said Google researcher Pedram Roushan, who led the many-body quantum simulation work published in Science in 2017. They expect their early work to stay close to home, such as research in condensed matter physics and quantum statistical mechanics, but they plan to branch out to other areas, including chemistry and materials, as the technology becomes more refined and accessible.
“For instance, knowing whether or not a molecule would form a bond or react in some other way with another molecule for some new technology… there are some important problems that you can’t roughly estimate; they really depend on details and very strong computational power,” Roushan said, hinting that a few years down the line they may be able to provide wider access to this computing power. “So you can get an account, log in and explore the quantum world.”