> I'm curious if EC can mitigate the sub-par decoherence times.
The main EC paper referenced in this blog post showed that the logical qubit lifetime using a distance-7 code (all 105 qubits) was double the lifetime of the physical qubits of the same machine.
I'm not sure how lifetime relates to decoherence time, but if that helps please let me know.
That's very useful, I missed that when I read through the article.
If the logical qubit can have double the lifetime of any physical qubit, that's massive. Recall IBM's chips, with T-times of ~400 microseconds. Doubling that would change the order of magnitude.
It still won't be enough to do much in the near term - like other commenters say, this seems to be a proof of concept - but the concept is very promising.
The first company to get there and make their systems easy to use could see a run-up in value similar to NVIDIA's after ChatGPT. IBM seems to be the strongest in the space overall, for now.
I'm sorry if this is nitpicky but your comment is hilarious to me - doubling something is doubling something, "changing the order of magnitude" would entail multiplication by 10.
Hahaha not at all, great catch. Sometimes my gray matter just totally craps out... like thinking of "changing order of magnitude" as "adding 1 extra digit".
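For anyone who wants the distinction spelled out: a quick back-of-the-envelope sketch, taking the ~400 microsecond figure from the comment above as an assumed physical T-time. Doubling gains about 0.3 orders of magnitude; a full order of magnitude needs a 10x improvement.

```python
import math

t_phys = 400e-6          # assumed physical T-time, ~400 microseconds (from the comment above)
t_logical = 2 * t_phys   # doubling via the logical qubit

# Orders of magnitude gained = log10 of the improvement ratio.
gain = math.log10(t_logical / t_phys)
print(f"{gain:.2f} orders of magnitude")        # 0.30 orders of magnitude

# A genuine order-of-magnitude change would need a 10x ratio:
print(math.log10(10 * t_phys / t_phys))         # 1.0
```

So doubling keeps you at the same order of magnitude (800 microseconds is still "hundreds of microseconds"); only a 10x jump "adds a digit".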
Reminds me of the time my research director pulled me aside for defining CPU as "core processing unit" instead of "central processing unit" in a paper!