A lot of hoopla about Google's new quantum computer chip. To put it into perspective: in 2019, the largest quantum ICs had ~50 physical qubits. Google's new chip has 105. Experts estimate practical applications of quantum computing will need a scale of a million-plus qubits, maybe in 60-70 years' time? (If ever)
(That is, the applications we're reading about in the Google press releases would require 1M+ physical qubits)
The current state of the art in quantum computing is processors that can do amazingly fast computations addressing such tiny amounts of data as to make them practically useless - or should I say "use case-less"?
QC researchers estimate it will require about 1,000 physical qubits to model one logical qubit, because of the need for error correction.
Even if Willow significantly improves on that ratio, unless they've improved it by 10x or more, that still means Willow implements less than one logical qubit *reliably*.
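Back-of-the-envelope, for anyone who wants to check the arithmetic (the 1,000:1 overhead is the commonly cited error-correction estimate, not a figure from Google's announcement):

```python
# Rough arithmetic: how many logical qubits do 105 physical qubits buy?
# The 1,000:1 overhead is the commonly cited estimate, not Google's number.
physical_qubits = 105

for overhead in (1000, 100):  # baseline ratio, and a hypothetical 10x improvement
    print(f"{overhead}:1 overhead -> {physical_qubits / overhead:.3f} logical qubits")
# 1000:1 overhead -> 0.105 logical qubits
# 100:1 overhead -> 1.050 logical qubits
```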
Their press release is like Intel announcing a chip with less than one transistor by talking about 3D animation and digital signal processing.
@jasongorman my understanding was that they’ve developed a new technique for error correction that reduces the ratio of physical to logical qubits as the size of the quantum computer grows. They've also reduced the error rate on the individual qubits significantly. I think your estimates are quite far off and this is a real breakthrough.
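To make the scaling claim concrete (as I understand it): below the error threshold, making the code bigger suppresses the logical error rate by a roughly constant factor per step in code distance. A toy sketch with made-up illustrative numbers, not measurements from the paper:

```python
# Toy illustration of below-threshold surface-code scaling.
# A distance-d surface code uses roughly 2*d**2 - 1 physical qubits, and
# below threshold the logical error rate shrinks by a roughly constant
# factor each time d increases by 2. The numbers below are illustrative
# placeholders, not results from the Willow paper.

base_error = 1e-2   # hypothetical logical error rate at d = 3
suppression = 2.0   # hypothetical suppression factor per d -> d + 2 step

for d in (3, 5, 7, 9, 11):
    physical = 2 * d**2 - 1
    logical_error = base_error / suppression ** ((d - 3) / 2)
    print(f"d={d:2d}: ~{physical:3d} physical qubits, logical error ~{logical_error:.1e}")
```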
@apsmith @jasongorman Do you have a reference for that? I'd be interested in reading about the mathematics, but I can't really bring myself to slog through the breathless press releases again.
@wizzwizz4 @apsmith @jasongorman
This is the paper. And yeah, as far as I know it's quite a nice accomplishment, even if the press releases are very cringe.
https://arxiv.org/abs/2408.13687