
How plausible is it that someone could acquire this many qubits? How many doublings away are we?

EDIT: [1] suggests D-Wave could be at 5640 qubits by 2020, doubling approx. every two years; 12 more doublings = 24 years from now.

[1] https://en.wikipedia.org/wiki/D-Wave_Systems#D-Wave_2X_and_D...
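
For reference, here's the back-of-envelope behind that (a rough sketch; the 20M figure is the article's, the 5640 is the projected count from [1]):

    import math

    start = 5640          # projected D-Wave qubit count for 2020, per [1]
    target = 20_000_000   # noisy qubits the article says Shor's would need
    doubling_years = 2    # approximate historical D-Wave doubling period

    doublings = math.log2(target / start)              # ~11.8 doublings
    print(f"~{doublings * doubling_years:.0f} years")  # ~24 years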



We are very far away.

We are several theoretical and engineering breakthroughs away from a scalable quantum computer with enough error-corrected, logical qubits to do something useful with Shor's algorithm. Current physical error rates need to decrease by a factor of 10 - 100. The current state of the art in physical qubit counts needs to increase by a factor of at least 10,000.
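
To make that overhead concrete, here's a rough sketch of the usual surface-code estimate. The threshold, prefactor, and error budget below are illustrative assumptions on my part, not numbers from the report:

    import math

    p, p_th = 1e-3, 1e-2      # assumed physical error rate and code threshold
    target_logical = 1e-15    # assumed per-gate logical error budget for Shor

    # Logical error falls roughly as 0.1 * (p / p_th) ** ((d + 1) / 2);
    # find the smallest odd code distance d that meets the budget.
    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target_logical:
        d += 2

    physical_per_logical = 2 * d * d   # rough surface-code qubit count
    print(d, physical_per_logical)     # d = 27, ~1500 physical per logical

Multiply ~1500 physical qubits per logical qubit by the few thousand logical qubits Shor's algorithm needs for RSA-2048, and you land in the millions of physical qubits.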

If we assume the number of usable physical qubits doubles every year while the decoherence rate halves every year, it's plausible a quantum computer could be designed to break 2048-bit RSA in slightly under 20 years.[1] To be generous, we'll also assume the implementation and engineering work doesn't meaningfully increase that estimate once the design is finished.
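
The rough arithmetic behind that figure (the starting count and target below are my ballpark assumptions, not the report's exact numbers):

    import math

    qubits_now = 70        # ballpark for today's largest gate-model devices
    qubits_needed = 20e6   # rough physical-qubit requirement for RSA-2048

    years = math.log2(qubits_needed / qubits_now)  # doublings == years here
    print(f"~{years:.0f} years")                   # ~18 years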

This is an optimistic forecast, to put it mildly. There are credible arguments from skeptics[2][3] in the academic community that it's not actually possible to build a quantum computer capable of breaking RSA in practice. Likewise, everything I've mentioned is only in regard to the known unknowns we need to resolve. It's very probable there are a variety of unknown unknowns to contend with as well.

This is also all aside from the possibility (emphasized by the first report I cited) of a looming winter in quantum computing research. In order to actually reach a point where RSA can be broken, the field needs to make good on the checks it's been writing. This means actually achieving quantum supremacy and developing legitimately useful quantum computers - scalable or otherwise - for industry applications.

Finally, D-Wave's progress isn't relevant here. They're building a quantum annealer, not a general-purpose quantum computer. All the foregoing is based on the idea of building a general-purpose quantum computer to implement Shor's algorithm. Annealing methods aren't applicable.

_________________________

1. https://www.nap.edu/catalog/25196/quantum-computing-progress...

2. https://spectrum.ieee.org/computing/hardware/the-case-agains...

3. https://gilkalai.wordpress.com/2017/10/16/if-quantum-compute...


Hopefully we will see results way before we start breaking RSA. It looks like people are targeting applications in quantum chemistry [1]. Also, Dyakonov's criticisms are toothless; see the comments here [2].

[1] https://arxiv.org/abs/1801.00862

[2] https://scirate.com/arxiv/1903.10760


Yep, quantum chemistry looks a lot more promising in the near term. Thank you for sharing that critique of Dyakonov; I'll have to give it a read.


D-Wave is a highly specialized machine targeting narrow problems.

I thought general-purpose machines had much smaller qubit counts.

Interesting to see the doubling. I've been asking for a qubit-count "Moore's law" projection for a while, but people in the business keep pushing back saying it's a bad measure. Even bad measures back in the MIPS and GIGAFLOPS days made some sense, though.

Is it really nonsensical to measure how "fast" we're achieving stable qubits? Feels like it should be a thing.


If you assume a Moore's Law-like increase in physical qubit capacity each year, it will take slightly under 20 years. That also assumes the rate of decoherence decreases commensurately.

The reason you get pushback is that Moore's Law exists within a historical context in which it was actually plausible for exponential increases to occur each year, year after year. The field of quantum computing is so nascent that such a context is utterly alien to it. We simply don't have the economies of scale, the engineering capabilities, or even the theoretical groundwork required to achieve and sustain those kinds of improvements. The other reason is that transistors and qubits are not directly comparable, and you shouldn't try to infer the growth trajectory of one from that of the other.

So to answer your question directly - it's not nonsensical at all to measure and forecast the rate of physical qubit capacity increase. The report I cited in another comment does exactly that, which is how you can derive the 20-year estimate I gave. But we don't have any evidence those kinds of annual doublings will be achievable anytime soon, so most of it comes down to educated guessing.
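
One way to see why the assumed doubling period dominates the whole forecast (same ballpark figures as in my other comment, purely illustrative):

    import math

    qubits_now, qubits_needed = 70, 20e6
    doublings = math.log2(qubits_needed / qubits_now)   # ~18 doublings

    for period in (1, 2, 3):                            # years per doubling
        print(f"doubling every {period}y -> ~{doublings * period:.0f} years")
    # -> ~18, ~36, ~54 years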


Qubit counts have not doubled every 2 years historically; I see no reason they'll suddenly start.

Computers did because the process shrank. Most qubits are already one or a few subatomic particles. There's zero feature shrinkage possible at those scales.


It's not the qubits themselves that have to shrink, but the support machinery that measures/cools/traps/shields them. Basically, the qubits-per-area or qubits-per-volume metrics are the important ones. Also, dollars per qubit is the relevant factor at the end of the day. (And watts per qubit for operating the whole machine.)


Agreed that qubits per area or volume is important, but those metrics are not capable of undergoing a Moore's Law event. Entangled qubits cannot be packed orders of magnitude closer, since they're already at quantum limits.

Moore's Law did not shrink the support machinery much at all. It shrank the lowermost information-processing piece. That lowermost piece in this case cannot be shrunk in the same manner.

The fact that the bottom layer cannot shrink is what's causing Moore's Law to end as it is. There is no way around these quantum limits without a major change in physics, one that is unlikely to ever happen.

None of the pieces you list can scale the same way area did for Moore's Law to work for transistors.

Moore made his prediction in 1965. Transistors in 1971 were made on a 10,000 nm process; a modern process is ~10 nm. That 1,000-fold reduction in linear size (a million-fold in area, a billion-fold in volume) cannot within any reason happen to the items you list.
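
The arithmetic, spelled out:

    linear = 10_000 / 10                 # 1971 process vs. modern, in nm
    print(linear, linear**2, linear**3)  # 1000x linear, 1e6 area, 1e9 volume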

Cost and power per device dropped under Moore's Law only and precisely because feature size dropped, and that could only happen because the initial features were macro-scale, not quantum-scale. This cannot happen to qubits - they're already at the end of the quantum line.

We will likely someday get smaller quantum computers. They won't follow a Moore's Law from where they are now, however, for the same reason nothing other than transistor-based devices has followed a Moore's Law: the physics precludes it.


I'm not an expert on the D-Wave machine, but I think it's nowhere near meeting the 10^-3 physical gate error rate criterion specified by the paper. They don't even really divide the computation up into gates; it's an annealer.


The difficulty of scaling quantum computers is that maintaining coherence requires making the noise floor exponentially low, because every qubit has a relationship to every other.

So the quantum equivalent of Moore's law is adding a single qubit every 18 months, which roughly doubles its speed.
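
A concrete way to see the doubling, if you think of it in terms of classical simulation cost: an n-qubit state takes 2^n complex amplitudes to represent, so each added qubit doubles the storage (a sketch, assuming 16 bytes per complex amplitude):

    # Memory to hold a full n-qubit state vector classically.
    for n in (20, 30, 31):
        amplitudes = 2 ** n
        gib = amplitudes * 16 / 2**30   # complex128: 16 bytes per amplitude
        print(n, amplitudes, f"{gib:.3f} GiB")
    # n=30 -> 16 GiB; n=31 -> 32 GiB: one more qubit, double the memory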


It has yet to be proven that adding one qubit doubles its speed. Btw, there is no actual hardware implementation of a logic gate with qubits. I think quantum computing is a big load of bullshit taking a lot of funding that should be redirected at things like anti-aging research.


More than 500 qubits is enough for Shor's algorithm to destroy everything


That is for perfect mathematical qubits. Real qubits are always noisy, and you need at least 1500 real qubits to emulate one perfect qubit with reasonable certainty, which puts your number in the same ballpark as the 20M noisy qubits from the article.
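
The multiplication, taking both figures at face value:

    logical_needed = 500          # figure quoted above
    physical_per_logical = 1500   # noisy qubits per 'perfect' qubit, as above
    print(logical_needed * physical_per_logical)  # 750,000 noisy qubits
    # within a couple orders of magnitude of the article's 20M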


So they can use shore's algorithm?


*Shor's algorithm


Don't forget the error rate metric too.



