Light-based Quantum Computer Exceeds Fastest Classical Supercomputers

Daniel Garisto, Scientific American
December 3, 2020

The setup of lasers and mirrors effectively “solved” a problem far too complicated for even the largest traditional computer system.

Credit: Alamy

For the first time, a quantum computer made from photons—particles of light—has outperformed even the fastest classical supercomputers.

Physicists led by Chao-Yang Lu and Jian-Wei Pan of the University of Science and Technology of China (USTC) in Shanghai performed a technique called Gaussian boson sampling with their quantum computer, named Jiǔzhāng. The result, reported in the journal Science, was 76 detected photons—far above and beyond the previous record of five detected photons and the capabilities of classical supercomputers.

Unlike a traditional computer built from silicon processors, Jiǔzhāng is an elaborate tabletop setup of lasers, mirrors, prisms and photon detectors. It is not a universal computer that could one day send e-mails or store files, but it does demonstrate the potential of quantum computing.

Last year, Google captured headlines when its quantum computer Sycamore took roughly three minutes to do what would take a supercomputer three days (or 10,000 years, depending on your estimation method). In their paper, the USTC team estimates that it would take the Sunway TaihuLight, the fourth most powerful supercomputer in the world, a staggering 2.5 billion years to perform the same calculation as Jiǔzhāng.

This is only the second demonstration of quantum primacy, the point at which a quantum computer exponentially outpaces any classical one, doing what would otherwise be effectively impossible to compute. It is not just proof of principle; there are also some hints that Gaussian boson sampling could have practical applications, such as solving specialized problems in quantum chemistry and math. More broadly, the ability to control photons as qubits is a prerequisite for any large-scale quantum internet. (A qubit is a quantum bit, analogous to the bits used to represent information in classical computing.)

“It was not obvious that this was going to happen,” says Scott Aaronson, a theoretical computer scientist now at the University of Texas at Austin who along with then-student Alex Arkhipov first outlined the basics of boson sampling in 2011. Boson sampling experiments were, for many years, stuck at around three to five detected photons, which is “a hell of a long way” from quantum primacy, according to Aaronson. “Scaling it up is hard,” he says. “Hats off to them.”

Over the past few years, quantum computing has risen from obscurity to a multibillion-dollar enterprise recognized for its potential impact on national security, the global economy and the foundations of physics and computer science. In 2019, the U.S. National Quantum Initiative Act was signed into law to invest more than $1.2 billion in quantum technology over the next 10 years. The field has also garnered a fair amount of hype, with unrealistic timelines and bombastic claims about quantum computers making classical computers entirely obsolete.

This latest demonstration of quantum computing’s potential from the USTC group is critical because it differs dramatically from Google’s approach. Sycamore uses superconducting loops of metal to form qubits; in Jiǔzhāng, the photons themselves are the qubits. Independent corroboration that quantum computing principles can lead to primacy even on totally different hardware “gives us confidence that in the long term, eventually, useful quantum simulators and a fault-tolerant quantum computer will become feasible,” Lu says.

A Light Sampling

Why do quantum computers have enormous potential? Consider the famous double-slit experiment, in which a photon is fired at a barrier with two slits, A and B. The photon does not go through A, or through B. Instead, the double-slit experiment shows that the photon exists in a “superposition,” or combination of possibilities, of having gone through both A and B. In theory, exploiting quantum properties like superposition allows quantum computers to achieve exponential speedups over their classical counterparts when applied to certain specific problems.
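To make that idea concrete, here is a minimal Python sketch (an editorial illustration, not code from either experiment; the phase value is an arbitrary assumption): the probability of detecting the photon comes from adding the two paths’ complex amplitudes before squaring, so the paths interfere, unlike classical probabilities, which simply add.

```python
# Minimal illustration of superposition in the double-slit experiment.
# Detection rate is |amp_A + amp_B|^2: amplitudes add first, THEN get
# squared. Classically, the probabilities for each slit would just add.
import numpy as np

phase = np.pi / 3                              # arbitrary phase offset between paths
amp_A = (1 / np.sqrt(2)) * np.exp(1j * 0)      # amplitude for "went through slit A"
amp_B = (1 / np.sqrt(2)) * np.exp(1j * phase)  # amplitude for "went through slit B"

# Relative intensities at one spot on the screen (so values can exceed 1).
rate_quantum = abs(amp_A + amp_B) ** 2               # interference: 1 + cos(phase)
rate_classical = abs(amp_A) ** 2 + abs(amp_B) ** 2   # no interference: always 1.0

print(f"with interference:    {rate_quantum:.3f}")   # 1.500 for phase = pi/3
print(f"without interference: {rate_classical:.3f}") # 1.000
```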

Physicists in the early 2000s were interested in exploiting the quantum properties of photons to make a quantum computer, in part because photons can act as qubits at room temperature, so there is no need for the costly task of cooling one’s system to a few kelvins (about –455 degrees Fahrenheit) as with other quantum computing schemes. But it quickly became apparent that building a universal photonic quantum computer was infeasible: such a machine would require millions of lasers and other optical devices. As a result, quantum primacy with photons seemed out of reach.

Then, in 2011, Aaronson and Arkhipov introduced the concept of boson sampling, showing how it could be done with a limited quantum computer made from just a few lasers, mirrors, prisms and photon detectors. Suddenly, there was a path for photonic quantum computers to show that they could be faster than classical computers.

The setup for boson sampling is analogous to the toy called a bean machine, which is just a peg-studded board covered with a sheet of clear glass. Balls are dropped into the rows of pegs from the top. On their way down, they bounce off of the pegs and each other until they land in slots at the bottom. Simulating the distribution of balls in slots is relatively easy on a classical computer.
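As a quick illustration of that claim, here is a short Python sketch (my own, purely illustrative; ball-to-ball collisions are ignored for simplicity): each ball bounces left or right at every row of pegs, and the slot counts pile up into the familiar bell-shaped binomial distribution, which a classical computer handles with ease.

```python
# Simulate a bean machine (Galton board): each ball bounces left or right
# with equal probability at every row of pegs; its final slot is simply
# the number of rightward bounces. (Ball-to-ball collisions ignored.)
import random
from collections import Counter

def drop_balls(n_balls: int = 10_000, n_rows: int = 10) -> Counter:
    slots = Counter()
    for _ in range(n_balls):
        slots[sum(random.random() < 0.5 for _ in range(n_rows))] += 1
    return slots

# The counts approximate a binomial distribution -- cheap to simulate,
# and even cheaper to compute directly from n-choose-k.
for slot, count in sorted(drop_balls().items()):
    print(f"slot {slot:2d}: {count:5d} {'#' * (count // 100)}")
```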

Instead of balls, boson sampling uses photons, and it replaces pegs with mirrors and prisms. Photons from the lasers bounce off of mirrors and through prisms until they land in a “slot” to be detected. Unlike classical balls, photons have quantum properties that lead to an exponentially increasing number of possible distributions.

The problem boson sampling solves is essentially “What is the distribution of photons?” The boson-sampling device answers it simply by running: the photons it detects are samples from that distribution. A classical computer, by contrast, has to calculate the distribution by computing what’s called the “permanent” of a matrix. For an input of two photons, this is just a short calculation with a two-by-two array. But as the number of photonic inputs and detectors goes up, the size of the array grows, exponentially increasing the problem’s computational difficulty.
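To show why the classical side scales so badly, here is a minimal Python implementation of the permanent (an illustration, not the USTC team’s code) using Ryser’s inclusion-exclusion formula, one of the standard exact methods. It still has to sum over roughly 2^n subsets for an n-by-n matrix, so each added photon roughly doubles the classical work.

```python
# Permanent of an n x n matrix via Ryser's formula:
#   perm(A) = (-1)^n * sum over nonempty column subsets S of
#             (-1)^|S| * product over rows i of (sum of A[i][j] for j in S)
# The outer sum ranges over ~2^n subsets, which is why classically
# simulating boson sampling explodes as photons are added.
from itertools import combinations

def permanent(a: list[list[float]]) -> float:
    n = len(a)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# For two photons this is the "short calculation with a two-by-two array":
print(permanent([[1.0, 2.0],
                 [3.0, 4.0]]))  # 1*4 + 2*3 = 10.0
```

Even with faster variants of Ryser’s formula, the cost grows like n·2^n, which is why 76 detected photons puts the corresponding computation far beyond any classical supercomputer.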

Last year the USTC group demonstrated boson sampling with 14 detected photons—hard for a laptop to compute, but easy for a supercomputer. To scale up to quantum primacy, they used a slightly different protocol, Gaussian boson sampling.

According to Christine Silberhorn, a quantum optics expert at the University of Paderborn in Germany and one of the co-developers of Gaussian boson sampling, the technique was designed to avoid the unreliable single photons used in Aaronson and Arkhipov’s “vanilla” boson sampling.

“I really wanted to make it practical,” she says. “It’s a scheme which is specific to what you can do experimentally.”

Read the full story and more related stories on Scientific American
