claimed: “No computer existing, or that will ever exist, can break this barrier” of solving
the equations describing many entangled particles.
But in fact, years before Laughlin and Pines wrote these words, the physicist Richard
Feynman had articulated a rebuttal [9]. As Feynman put it: “Nature isn’t classical, dammit,
and if you want to make a simulation of Nature you’d better make it quantum mechanical, and
by golly it’s a wonderful problem because it doesn’t look so easy.” Feynman had envisioned
using a quantum computer to solve the quantum physics problems that physicists and
chemists had failed to solve using digital computers. Laughlin and Pines knew well that
Feynman had made this proposal years earlier, but had dismissed his idea as impractical.
Now, some 35 years after Feynman’s proposal, we’re just beginning to reach the stage
where quantum computers can provide useful solutions to hard quantum problems.
3.2 Why quantum computing is hard
So why is it taking so long? What is it about quantum computing that’s so difficult? The
core of the problem stems from a fundamental feature of the quantum world — that we
cannot observe a quantum system without producing an uncontrollable disturbance in the
system. That means that if we want to use a quantum system to store and reliably process
information, then we need to keep that system nearly perfectly isolated from the outside
world. At the same time, though, we want the qubits to strongly interact with one another
so we can process the information; we also need to be able to control the system from the
outside, and eventually read out the qubits so we can find the result of our computation.
It is very challenging to build a quantum system that satisfies all of these desiderata. It
has taken many years of development in materials and control and fabrication to get where
we are now.
Eventually we expect to be able to protect quantum systems and scale up quantum
computers using the principle of quantum error correction [10]. The essential idea of
quantum error correction is that if we want to protect a quantum system from damage
then we should encode it in a very highly entangled state; like that 100-page book I
described earlier, this entangled state has the property that the environment, interacting
with parts of the system one at a time, is unable to glimpse the encoded information and
therefore can’t damage it. Furthermore, we’ve understood in principle how to process
quantum information which is encoded in a highly entangled state. Unfortunately, there
is a significant overhead cost for doing quantum error correction — writing the protected
quantum information into a highly entangled book requires many additional physical qubits
— so reliable quantum computers using quantum error correction are not likely to be
available very soon.
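To make the idea concrete, a minimal sketch (not from the talk, and far simpler than the codes a real machine would use) is the three-qubit bit-flip code: one logical qubit is encoded in an entangled state of three physical qubits, and measuring only parities between pairs of qubits reveals which qubit was flipped, while revealing nothing about the encoded amplitudes. All names below are my own illustrative choices.

```python
import numpy as np

def encode(alpha, beta):
    """Encode alpha|0> + beta|1> as alpha|000> + beta|111> (8-dim vector)."""
    state = np.zeros(8, dtype=complex)
    state[0b000] = alpha
    state[0b111] = beta
    return state

def bit_flip(state, qubit):
    """Apply X (a bit flip) to one of the three qubits (0 = leftmost)."""
    flipped = np.zeros_like(state)
    for basis in range(8):
        flipped[basis ^ (1 << (2 - qubit))] = state[basis]
    return flipped

def syndrome(state):
    """Parities Z0Z1 and Z1Z2. For a codeword hit by at most one X error,
    both nonzero basis states give the same parities, so the syndrome
    identifies the flipped qubit without touching alpha or beta."""
    basis = max(range(8), key=lambda b: abs(state[b]))
    b0, b1, b2 = (basis >> 2) & 1, (basis >> 1) & 1, basis & 1
    return (b0 ^ b1, b1 ^ b2)

def correct(state):
    """Use the syndrome to undo a single bit-flip error."""
    lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    qubit = lookup[syndrome(state)]
    return state if qubit is None else bit_flip(state, qubit)

alpha, beta = 0.6, 0.8
logical = encode(alpha, beta)
damaged = bit_flip(logical, 1)          # environment flips the middle qubit
recovered = correct(damaged)
assert np.allclose(recovered, logical)  # the encoded information survives
```

Note the overhead mentioned above is visible even here: three physical qubits (plus syndrome measurements) protect a single logical qubit against only one type of error; full quantum error correction, protecting against arbitrary errors, costs considerably more.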
4 The NISQ era unfolds
4.1 The 50-qubit barrier
Even with fault-tolerant quantum computing still a rather distant dream, we are now
entering a pivotal new era in quantum technology. For this talk, I needed a name to describe
this impending new era, so I made up a word: NISQ. This stands for Noisy Intermediate-
Scale Quantum. Here “intermediate scale” refers to the size of quantum computers which
will be available in the next few years, with a number of qubits ranging from 50 to a
Accepted in Quantum 2018-07-30, click title to verify 4