Text by Philip Ball
Headlines in the popular media are constantly anticipating an imminent revolution in computing thanks to the advent of quantum computers, which in principle can calculate much faster than ordinary classical computers. Prototype quantum computers already exist: companies such as IBM and Google have developed devices that operate with just a few quantum bits, while the Canadian company D-Wave sells commercial machines.
But what expectations are realistic? Making quantum bits stable enough to carry out reliable computation in their precarious quantum states is still immensely challenging for current technologies. Equally difficult, although less publicized, is the challenge of devising algorithms that can perform useful quantum computing tasks. Scaling up quantum computers from a few quantum bits to the several tens or hundreds needed to significantly outperform classical devices is far from trivial, not least in terms of the handling of errors.
And there are still profound theoretical problems for the field. It is not yet clear what will be the best type of architecture for large-scale quantum computing. And there is still disagreement about precisely what it is that makes a quantum computer faster – an issue that bears on unsolved questions about the fundamentals of quantum theory.
This session will bring together leading researchers in the field, ranging from theorists concerned with the basic principles of quantum computing and algorithmic development to experts in the physical implementation of these ideas. In talks and panel debates they will explore what the real prospects are for quantum computers in the coming years, what are the hurdles yet to be overcome – and what opportunities exist for young researchers entering this exciting field.
Scott Aaronson is the David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin, USA. His research interests include quantum computing and theoretical computer science more broadly. He writes the influential blog Shtetl-Optimized.
Jay Gambetta is the Manager of the Theory of Quantum Computing and Information Section at IBM’s Thomas J. Watson Research Center, Yorktown Heights, New York, USA.
Seth Lloyd, a self-styled “quantum mechanic”, is the Nam Pyo Suh Professor at the Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA.
John Martinis is a Professor of Physics at the University of California, Santa Barbara, and a Research Scientist at the Google Quantum AI Laboratory, where he heads the quantum hardware team whose goal is to build a useful quantum computer.
Chris Monroe is a Distinguished University Professor and Zorn Professor of Physics at the University of Maryland, USA, and co-founder and Chief Scientist at IonQ, Inc. He is a leading researcher in the use of individual atoms for realizing quantum computers and quantum simulators. He has also pioneered modular architectures for scaling up atomic quantum computers using photonic networks.
Philip Ball is a science writer and author, and a former editor for physical sciences at Nature. His next book, to be published in 2018, is an examination of current views on the interpretation of quantum mechanics.
“Complexity-Theoretic Foundations of Quantum Supremacy Experiments”
In the near future, there will likely be special-purpose quantum computers with 50 or so high-quality qubits. In this talk, I will discuss general theoretical foundations for how to use such devices to demonstrate “quantum supremacy”: that is, a clear quantum speedup for some tasks, motivated by the goal of overturning the Extended Church-Turing Thesis (which says that all physical systems can be efficiently simulated by classical computers) as confidently as possible.
“Approximate quantum computing with near-term devices”
For many years researchers in quantum information science have made steady progress with proof-of-principle experiments demonstrating basic building blocks of future quantum computers, and theoretical tools have been developed for understanding how such devices might work. With the advent of modest-sized quantum computers such as the IBM Q experience, the field is in the midst of a realignment: quantum computing is becoming a technology. It is my view that the only viable path in the long term is universal fault-tolerant quantum computing. However, to determine whether this is possible and to find value in quantum computing before fault tolerance is available, I foresee an approaching horizon of ‘approximate quantum computing.’ In this talk I will give some examples of what I mean by approximate quantum computing and outline some of the challenges ahead.
“Quantum Computers in Society and in the Universe”
Quantum computers are approaching the point where they will be able to perform computations that classical computers cannot. Quantum information processing also represents a novel paradigm for understanding problems in physics, such as quantum gravity, that have previously resisted solution. This talk discusses the implications of these advances for human society and for our understanding of the universe.
“Quantum Hardware at Google: Progress Towards Exponentially Growing Computational Complexity”
The quantum hardware group at Google is building superconducting qubit devices for quantum annealing, quantum simulation and gate-model quantum computing. A large effort this year is focused on demonstrating quantum supremacy on a 49-qubit device. Here the output of a quantum computer can only be checked with a large classical supercomputer, which is limited by the memory needed to store the 2^49-dimensional state space. I will show experimental data towards this demonstration from a 9-qubit adjustable-coupler “gmon” device, which implements the basic sampling algorithm of supremacy for a computational (Hilbert) space of dimension about 500. Fidelities in the 90% range indicate that huge Hilbert space computations should be possible with 20-49-qubit devices, which are presently being designed, built and tested. We have also gone beyond checking whether our quantum computer is operating properly: a quantum-materials simulation shows that complex energy spectra can be accurately predicted on our quantum computer.
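The memory barrier behind these numbers is simple exponential arithmetic: a full n-qubit state vector holds 2^n complex amplitudes, so classical simulation cost doubles with every added qubit. A minimal sketch (the function name and the 16-bytes-per-amplitude assumption, corresponding to double-precision complex numbers, are illustrative choices, not from the talk):

```python
# Classical memory required to store a full n-qubit state vector:
# 2**n complex amplitudes, each assumed to occupy 16 bytes
# (double-precision real and imaginary parts).
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (9, 20, 49):
    print(f"{n:2d} qubits: {state_vector_bytes(n):,} bytes")
```

Under this assumption, the 9-qubit gmon device needs only a few kilobytes to simulate, while 49 qubits require 2^53 bytes, about 9 petabytes, which is why checking the output already strains the largest classical supercomputers.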
“Reconfigurable and Scalable Quantum Computing with Atoms”
Individual atoms are standards for quantum information science, acting as qubits that have unsurpassed levels of quantum coherence, can be replicated and scaled with atomic-clock accuracy, and allow near-perfect measurement. Quantum gate operations between atomic ions are mediated by control laser beams, allowing the qubit connectivity graph to be reconfigured and optimally adapted to a given algorithm or mode of computing. Existing work has shown >99.9% fidelity operations, fully-connected control with up to about 10 qubits, and quantum simulations with limited control on up to 200 qubits – all with the same atomic architecture. I will speculate on combining all of this into a single universal quantum computing device that can be co-designed with future applications.
Illustrations by Constanza Rojas-Molina