Google has demonstrated that its approach to quantum error correction – seen as an important part of developing useful quantum computers – is scalable, giving researchers at the company confidence that practical devices will be ready in the coming years.

The building blocks of a quantum computer are qubits, akin to the transistors in a classical computer chip. But today’s qubits are susceptible to interference and errors that must be identified and corrected if we want to build quantum computers large enough to actually tackle real-world problems.

One popular approach to this is called surface code correction, in which many physical qubits work as one so-called logical qubit, essentially introducing redundancy. This is how much of the error correction in classical computers works, but in quantum computers there is an added complication: each qubit exists in a superposition of 0 and 1, and any attempt to measure it directly destroys the data.
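The classical redundancy the article alludes to can be shown with a simple sketch. This is not Google's surface code – which must avoid measuring the data qubits directly – but the basic idea it generalises: store one logical bit as several physical copies and outvote occasional flips on readout. The `encode`/`decode` helpers below are illustrative names, not from any real library.

```python
# Illustrative sketch of classical redundancy via majority vote,
# the idea that surface codes extend to qubits.

def encode(bit, copies=5):
    """Store one logical bit redundantly across several physical bits."""
    return [bit] * copies

def decode(physical_bits):
    """Recover the logical bit by majority vote."""
    return int(sum(physical_bits) > len(physical_bits) / 2)

stored = encode(1)
stored[2] = 0          # a single physical bit flips due to noise
print(decode(stored))  # majority vote still recovers 1
```

The quantum version is harder precisely because this kind of direct readout is forbidden; the surface code instead infers errors from measurements on dedicated neighbouring qubits.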

This means that adding more physical qubits to your logical qubit can actually be detrimental. “So far, when engineers tried to organise larger and larger ensembles of physical qubits into logical qubits to reach lower error rates, the opposite happened,” says Hartmut Neven at Google.

Google demonstrated this when it first announced a working error correction scheme in 2021, which resulted in a net increase in errors. Subsequent work at the Joint Quantum Institute in Maryland managed to reach a break-even point where logical qubits no longer worsened error rates, albeit as a technical demonstration rather than a practical one.

Now, Google has shown that logical qubits can be increased in size and that this scale brings a reduction in the overall error rate. If that trend can be continued, and quantum computers can be increased in scale, then they will be capable of computation that would be impossible on even the most powerful classical computers. Neven says there is now “palpable confidence” among the team that Google will create a commercially useful quantum computer.

The team achieved its logical qubit milestone using the third generation of Google’s Sycamore quantum processor, which has 53 qubits. Surface code logical qubits are typically built from a square grid of data qubits interleaved with a second, almost identical grid of measurement qubits that check their neighbours for errors. The company’s experiment saw a move from 3 by 3 grids, involving 17 physical qubits, to 5 by 5 grids using 49 qubits, meaning almost the entire processor was acting as a single logical qubit. This increase brought a reduction in error rate from 3.028 per cent to 2.914 per cent.
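The qubit counts quoted above can be checked back-of-envelope from the article's own description: a d-by-d grid of data qubits paired with a same-size grid of measurement qubits, minus one. The helper name below is hypothetical, purely for illustration.

```python
# Reproduce the article's physical-qubit counts for a d-by-d
# surface code logical qubit: d*d data qubits plus d*d - 1
# measurement qubits.

def physical_qubits(d):
    return d * d + (d * d - 1)

print(physical_qubits(3))  # 17 for the 3 by 3 logical qubit
print(physical_qubits(5))  # 49 for the 5 by 5 logical qubit
print(physical_qubits(6))  # 71 for the proposed 6 by 6 step
```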

Google’s team concedes that the improvement is small, but says that, in theory, the scaling-up process can be continued indefinitely, paving the way for a fault-tolerant quantum computer that can reliably carry out useful tasks. But moving to a 6 by 6 logical qubit – which would involve 71 physical qubits – is impossible with the company’s current generation of quantum processors and will require a big step forward in hardware.

Fernando Gonzalez-Zalba at the University of Cambridge says it would have been good to see a larger improvement in the error rate, but that the research is moving in the right direction.

“The individual components in the processor need to improve a little bit more in order to get an improvement in the logical error rate as technology scales,” he says. “[But] what we see in the series of publications that the team is producing is that they are substantially improving after every publication. I don’t think we are talking about years before we can see a scalable quantum error correction, I think they are pretty close.”
