While quantum computers can do interesting things without dedicated memory, memory would add a lot of flexibility in terms of the kinds of algorithms they could run and the ways quantum systems could interact with each other and the outside world. Building quantum memory is a huge challenge, because reading and writing it both have to be extremely efficient and accurate, and the memory has to do something very atypical for quantum systems: hold its state for a considerable amount of time.
If we solve the problems, however, quantum memory offers some rather unusual properties. The process of writing to quantum memory is very similar to the process of quantum teleportation, meaning that the memory can potentially be sent between different computing facilities. And since the storage device is a quantum object, the possibility exists that two qubits of memory in different locations could become entangled, essentially delocalizing the value of the qubit and spreading it between two facilities.
In a demonstration of that promise, Chinese researchers have entangled quantum memory in facilities more than 20 kilometers apart. Separately, they also performed the entanglement with photons that traveled through 50 kilometers of optical cable. But the process of sending and entangling has an unfortunate side effect: it takes so long that the memory usually loses its coherence in the meantime.
The basic outlines of the experiment are quite simple for a process that is somewhat mind-boggling. The qubits used here are small clouds of cold atoms (about a hundred million atoms for each). They are placed in a state where the atoms are indistinguishable from a quantum perspective and thus can be treated as a single quantum object. Since the quantum state is distributed across all the atoms at once, it provides slightly more stability than other forms of quantum memory. The state of the atomic cloud is read and written using photons, and the atoms are placed in an optical cavity that traps these photons. This gives the photons many opportunities to interact with the atom cloud, which increases the efficiency of the operations.
When the state of the memory is set by a writing photon, the atomic collective emits a second photon indicating success. The polarization of this photon carries information about the state of the atoms and thus provides a means of entangling the memory.
Unfortunately, that photon is at a wavelength that isn’t very useful because it tends to be lost during transmission. So the researchers sacrificed a little efficiency for a lot of usability. They used a device that shifts the wavelength of the photons from the near-infrared to the wavelengths used in standard communication fibers. About 30 percent of the photons were lost, but the remaining photons can be sent over existing fiber networks with high efficiency (provided the right hardware is installed where the fiber ends).
There are losses from filtering noise and getting photons into the fiber, but the whole process is more than 30 percent efficient from start to finish. In this case, the two ends were 11 km apart, at the University of Science and Technology of China and Hefei Software Park.
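Those losses compound multiplicatively, which is why the end-to-end figure matters. A minimal sketch of that arithmetic follows; only the roughly 70 percent survival of the wavelength conversion comes from the article, while the other stage names and values are illustrative assumptions, not figures from the paper.

```python
# How per-stage photon survival rates compound multiplicatively.
# Only the ~0.70 conversion figure comes from the article; the
# other stages and values are illustrative assumptions.
stages = {
    "wavelength conversion": 0.70,  # ~30% of photons lost (from the text)
    "noise filtering": 0.80,        # assumed value
    "fiber coupling": 0.60,         # assumed value
}

overall = 1.0
for name, efficiency in stages.items():
    overall *= efficiency

print(f"overall efficiency: {overall:.3f}")  # 0.70 * 0.80 * 0.60 = 0.336
```

With these assumed numbers, the product still lands above 30 percent, consistent with the start-to-finish efficiency quoted above.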
For the entanglement, the authors created two qubits of quantum memory, generated photons from both, and sent those photons over separate cables to the Software Park. There, the photons were passed through a device that made them impossible to distinguish, causing them to become entangled. Since each photon was, in turn, entangled with the quantum memory that produced it, the two qubits of memory became entangled as well. Although the two memories sat in the same lab, what mattered was the length of the fiber paths: the result was equivalent to entangling two pieces of memory 22 km apart.
That is a big step up from the previous record of 1.3 km.
To stretch things out a bit, the researchers then turned to a long coil of cable. Two photons were sent down the cable and then manipulated so that it was impossible to determine which path they took through it. This entangled them again, and with them the memories that had emitted the photons in the first place. The process required tracking the phase of the incoming photons, which is significantly more difficult, so the overall efficiency dropped.
For a 50 km fiber path, this resulted in rather low efficiencies, on the order of 10⁻⁴, which means the time needed to reach entanglement increased, in this case to more than half a second. And that’s a problem, because the typical lifetime of a qubit stored in this memory is 70 microseconds, far shorter than the entanglement process takes. So the approach definitely falls into the “not quite ready for production” category.
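To see why a success probability on the order of 10⁻⁴ is so punishing, here is a back-of-envelope sketch. Only the ~10⁻⁴ efficiency and the 70-microsecond lifetime come from the text; the per-attempt duration is an assumption chosen purely for illustration, roughly reproducing the half-second figure.

```python
# Back-of-envelope sketch: with per-attempt success probability p,
# the expected number of attempts before success is ~1/p, so the
# expected time is (1/p) * attempt_duration.
p_success = 1e-4          # per-attempt entanglement probability (from the text)
attempt_duration = 50e-6  # seconds per attempt; assumed for illustration

expected_attempts = 1 / p_success
expected_time = expected_attempts * attempt_duration

memory_lifetime = 70e-6   # seconds, from the text

print(f"expected attempts: {expected_attempts:.0f}")
print(f"expected time to entangle: {expected_time:.2f} s")
print(f"ratio to memory lifetime: {expected_time / memory_lifetime:.0f}x")
```

Under these assumptions, the memory would need to survive thousands of times longer than its measured lifetime before entanglement succeeds, which is the gap the article describes.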
Which is a shame, because the approach opens up a myriad of intriguing possibilities. One is that spreading a qubit across two facilities through this delocalization could make it possible to perform a single quantum computation on remote facilities, possibly with different hardware that has different strengths and weaknesses. And the researchers note that a technique called entanglement swapping could extend the distance between memory qubits even further, provided the qubits maintain their state. But if each of these steps involves some degree of error, that error will quickly pile up and render the whole thing useless.
None of this should undermine the achievement shown here, but it does show how far we still have to go. Each of the inefficiencies that emerges along the way represents a distinct engineering and/or physics challenge that must be addressed before this can be applied in the real world.
Nature, 2020. DOI: 10.1038/s41586-020-1976-7 (About DOIs).