
A lot of hoopla about Google's new quantum computer chip. To put it into perspective: in 2019, the largest quantum ICs had ~50 physical qubits. Google's new chip has 105. Experts estimate practical applications of quantum computing will only arrive when the scale reaches a million-plus qubits, maybe in 60-70 years' time? (If ever)

(That is to say that the applications we're reading about in the Google press releases would require 1M+ physical qubits)

The current state of the art in quantum computing is processors that can do amazingly fast computations addressing such tiny amounts of data as to make them practically useless - or should I say "use case-less"?

QC researchers estimate it will require about 1,000 physical qubits to model one logical qubit, because of the need for error correction.

Even if Willow significantly improves on that ratio, unless they've improved it by 10x, that means Willow implements less than one logical qubit *reliably*.
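The back-of-envelope arithmetic behind that claim can be sketched in a few lines. The 1,000:1 overhead is the commonly cited estimate from the thread, not a measured figure, and the "5x better ratio" scenario below is a hypothetical for illustration:

```python
# Assumed figures from the thread, not measurements:
PHYSICAL_QUBITS = 105          # Willow's physical qubit count
PHYSICAL_PER_LOGICAL = 1_000   # commonly cited error-correction overhead

# At the cited overhead, Willow implements a fraction of one logical qubit.
logical_qubits = PHYSICAL_QUBITS / PHYSICAL_PER_LOGICAL
print(logical_qubits)

# Even a hypothetical 5x improvement in the ratio (200:1) still yields
# less than one reliable logical qubit; only a ~10x improvement crosses 1.
improved = PHYSICAL_QUBITS / (PHYSICAL_PER_LOGICAL / 5)
print(improved)
```

Hence the comparison below: without at least a ~10x better physical-to-logical ratio, the chip's reliable capacity rounds down to zero logical qubits.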

Their press release is like Intel announcing a chip with less than one transistor by talking about 3D animation and digital signal processing.

@jasongorman my understanding was that they’ve developed a new technique for error correction that reduces that ratio of physical to logical qubits as the size of the quantum computer grows. They've also reduced the error rate on the individual qubits significantly. I think your estimates are quite far off and this is a real breakthrough.

wizzwizz4

@apsmith @jasongorman Do you have a reference for that? I'd be interested in reading about the mathematics, but I can't really bring myself to slog through the breathless press releases again.

@wizzwizz4 @apsmith @jasongorman
This is the paper. And yeah, as far as I know it's quite a nice accomplishment, even if the press releases are very cringe.
arxiv.org/abs/2408.13687

arXiv.org: "Quantum error correction below the surface code threshold"

Quantum error correction provides a path to reach practical quantum computing by combining multiple physical qubits into a logical qubit, where the logical error rate is suppressed exponentially as more qubits are added. However, this exponential suppression only occurs if the physical error rate is below a critical threshold. In this work, we present two surface code memories operating below this threshold: a distance-7 code and a distance-5 code integrated with a real-time decoder. The logical error rate of our larger quantum memory is suppressed by a factor of Λ = 2.14 ± 0.02 when increasing the code distance by two, culminating in a 101-qubit distance-7 code with 0.143% ± 0.003% error per cycle of error correction. This logical memory is also beyond break-even, exceeding its best physical qubit's lifetime by a factor of 2.4 ± 0.3. We maintain below-threshold performance when decoding in real time, achieving an average decoder latency of 63 μs at distance-5 up to a million cycles, with a cycle time of 1.1 μs. To probe the limits of our error-correction performance, we run repetition codes up to distance-29 and find that logical performance is limited by rare correlated error events occurring approximately once every hour, or 3 × 10⁹ cycles. Our results present device performance that, if scaled, could realize the operational requirements of large scale fault-tolerant quantum algorithms.
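The abstract's exponential-suppression claim can be made concrete with a short extrapolation. This is a sketch using the two numbers the paper reports (Λ = 2.14 per +2 of code distance, and 0.143% error per cycle at distance 7); the larger distances below are hypothetical extrapolations, not results from the paper:

```python
# Figures from the paper's abstract:
LAMBDA = 2.14     # logical error suppression factor per +2 code distance
ERR_D7 = 0.00143  # logical error per cycle at distance 7 (0.143%)

def error_at_distance(d, base_d=7, base_err=ERR_D7, lam=LAMBDA):
    """Projected logical error per cycle at code distance d, assuming the
    measured suppression factor Λ continues to hold at larger distances."""
    return base_err / lam ** ((d - base_d) / 2)

# Each +2 in distance divides the logical error rate by ~2.14.
for d in (7, 9, 11, 13):
    print(f"distance {d}: {error_at_distance(d):.5%} per cycle")
```

This is the sense in which being "below threshold" matters: as long as Λ > 1, adding qubits (increasing code distance) buys exponentially lower logical error rates rather than merely linear improvement.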