18 Dec Synopsis: Quantum Computers Approach Milestone for Boson Sampling
Experiments show that when 20 photons travel through a complex optical network, only a quantum computer can efficiently sample the range of possible outcomes.
A team of researchers has sent 20 indistinguishable photons through an interferometer and detected up to 14 of them emerging from the other side. The feat may seem unremarkable, but it marks a milestone in the field of quantum computation: approaching the point where a classical system cannot feasibly mimic a quantum system.
Solving the boson sampling problem means predicting the distribution of a set of input bosons—normally photons—after they have undergone some transformation procedure with multiple possible outcomes, or “modes.” The most efficient way to sample the range of possible distributions is to embody the calculation physically by experiment. In that case, the photons’ quantum behavior is included intrinsically as they negotiate the setup. Until recently, boson sampling experiments involved a handful of photons and fewer than 16 possible modes, offering at most a few tens of thousands of possible output configurations. Calculating the full range of outcomes in such a scenario is trivial even for simple classical computers, making it a poor test of the power of quantum-computational approaches.
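To see why sampling this distribution is classically hard, note that the probability of each collision-free output pattern is given by the permanent of a submatrix of the interferometer's unitary matrix, and the best known exact algorithms for the permanent scale exponentially with photon number. A minimal sketch in Python (a toy calculation using an assumed discrete-Fourier-transform interferometer, not the team's 60-mode device):

```python
import cmath
import itertools
import math

def permanent(a):
    # Ryser's formula: exact in O(2^n * n) time — the exponential cost
    # that makes classical boson sampling infeasible at large photon number.
    n = len(a)
    total = 0
    for subset in range(1, 1 << n):
        sign = -1 if bin(subset).count("1") % 2 else 1
        prod = 1
        for row in a:
            prod *= sum(row[j] for j in range(n) if subset >> j & 1)
        total += sign * prod
    return total * (-1) ** n

def dft_unitary(m):
    # Discrete-Fourier-transform matrix: a simple, exactly unitary
    # stand-in for an interferometer in this toy example.
    w = cmath.exp(2j * cmath.pi / m)
    return [[w ** (j * k) / math.sqrt(m) for k in range(m)] for j in range(m)]

m, n = 6, 3                        # 6 modes, 3 photons (tiny toy case)
U = dft_unitary(m)
probs = {}
for out in itertools.combinations(range(m), n):   # collision-free outputs
    # Submatrix: input rows 0..n-1 (photons enter the first n modes),
    # output columns given by the detected pattern.
    sub = [[U[i][k] for k in out] for i in range(n)]
    probs[out] = abs(permanent(sub)) ** 2
total = sum(probs.values())
print(f"collision-free probability mass: {total:.3f}")  # remainder: bunched outcomes
```

Even in this toy case, every output probability requires a permanent; at 14 photons each one costs on the order of 2¹⁴ terms, and the number of output patterns explodes combinatorially — which is why the physical experiment samples the distribution far faster than any classical enumeration.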
Now, Jian-Wei Pan and Chao-Yang Lu, from the University of Science and Technology of China, Hefei, and their collaborators have created an optical system that processes up to 20 photons. The photons are directed simultaneously through a 60-mode interferometer composed of hundreds of beam splitters and are measured using 60 photon-counting detectors. After unavoidable losses in the system, the researchers found that they could routinely detect up to 14 photons per run, yielding an output with roughly 3.7 × 10¹⁴ degrees of freedom. This enormous possibility space—ten orders of magnitude greater than that achieved previously—is sampled and validated by the team’s inherently quantum-computational setup in a matter of minutes; a classical supercomputer would take hours to verify the results.
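The size of that possibility space follows from elementary combinatorics: the number of ways 14 indistinguishable photons can occupy 60 output modes is the stars-and-bars count C(60 + 14 − 1, 14) ≈ 3.7 × 10¹⁴. A quick check in Python (the earlier-experiment figures below are representative assumptions, not values from the paper):

```python
import math

def output_configs(photons, modes):
    # Stars-and-bars count of ways indistinguishable photons can
    # occupy the output modes of an interferometer.
    return math.comb(modes + photons - 1, photons)

new = output_configs(14, 60)   # this experiment: 14 detected photons, 60 modes
old = output_configs(5, 16)    # assumed earlier scale: ~5 photons, <16 modes
print(f"{new:.2e} vs {old} -> {math.log10(new / old):.1f} orders of magnitude")
```

The ratio comes out at roughly ten orders of magnitude, matching the comparison above.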
This research is published in Physical Review Letters.
Marric Stephens is a freelance science writer based in Bristol, UK.