given a million qubits ...
Also, last time I checked the record was 80 qubits, and with every doubling of the qubit count the complexity of the system, the impurities, and the noise all increase. So it's questionable whether there will ever be useful quantum computers.

"fault-tolerant quantum computing architecture based on noise-resilient, topologically protected Majorana-based qubits."
Roadmap to fault tolerant quantum computation using topological qubit arrays
https://arxiv.org/abs/2502.12252

I'm not proud of my ignorance, and I sure hope that eventually, if I get it, it'll be very useful for me. At least it worked like that for monads.
(Note: I have no idea how the braiding happens, or what it means, or... the rest of the fucking owl, but the part about local indistinguishability is an important piece of the puzzle, and why it helps against noise. I also have no idea what the g-factor is, or what s-wave/p-wave superconductors are, but... https://www.reddit.com/r/AskPhysics/comments/11opcy1/comment... ... phew.)
Quantum computing is genuinely hard. The hardware is an extremely specialized discipline. The software is at best a very unfamiliar kind of mathematics and has basically nothing to do with programming. Perhaps one day it will be a black box you can use to solve certain conventional programming problems quickly.
Microsoft's technology is pretty far behind in terms of capacity, but its scaling limitations are less significant, and the error-correction overhead is either smaller or eliminated entirely.
https://youtu.be/wSHmygPQukQ?t=723
Is this the scaling problem you are describing?
It's a bit similar to the invention of the fast Fourier transform (which was reinvented several times...): O(n log n) is so much better than O(n²) that many problems in science and technology use the FFT somewhere in their pipeline just because it's so powerful, even when they're unrelated to signal processing. For example, multiplication of very large numbers uses the FFT (?!).
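A minimal Python sketch of that last point: treat each number's digits as coefficients of a polynomial, multiply the polynomials via FFT convolution in O(n log n), then propagate carries. (This is only an illustration; real bignum libraries like GMP use more refined variants such as Schönhage-Strassen or number-theoretic transforms.)

```python
import cmath

def fft(a, invert=False):
    # Recursive Cooley-Tukey FFT; len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return a[:]
    sign = 1 if invert else -1
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def multiply(x, y):
    # Multiply two non-negative integers by convolving their base-10 digits.
    a = [int(d) for d in str(x)][::-1]  # little-endian digits
    b = [int(d) for d in str(y)][::-1]
    n = 1
    while n < len(a) + len(b):
        n *= 2
    fa = fft([complex(d) for d in a] + [0j] * (n - len(a)))
    fb = fft([complex(d) for d in b] + [0j] * (n - len(b)))
    # Pointwise product in frequency domain = convolution in digit domain.
    conv = fft([u * v for u, v in zip(fa, fb)], invert=True)
    digits = [round(c.real / n) for c in conv]  # undo FFT scaling
    carry, out_digits = 0, []
    for d in digits:
        carry += d
        out_digits.append(carry % 10)
        carry //= 10
    while carry:
        out_digits.append(carry % 10)
        carry //= 10
    return int("".join(map(str, reversed(out_digits))) or "0")

print(multiply(12, 34))  # → 408
```

Floating-point rounding limits how big this toy version can go; production implementations switch to exact modular arithmetic for that reason.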
Can you explain more or share some resources?
As soon as the first practical quantum computer becomes available, so much recorded TLS-encrypted data is gonna get turned into plaintext, probably destroying millions of people's lives. I hope everyone working in quantum research is aware of what their work is leading towards; they're not much better than arms manufacturers working on the next nuke.
I'm also curious. If you don't capture the key exchange but instead only a piece of ciphertext, is there a lower limit to the sample size required to attack the key? It feels like there must be.
but here’s what Perplexity says: “Exponential Error Reduction: Willow demonstrates a scalable quantum error correction method, achieving an exponential reduction in error rates as the number of qubits increases. This is crucial because qubits are prone to errors due to their sensitivity to environmental factors.”
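For intuition on what "exponential reduction" means here, a rough back-of-the-envelope sketch using the textbook surface-code scaling (this is an assumed approximation, not Willow's measured numbers): the logical error rate behaves roughly like p_logical ≈ A·(p_phys/p_th)^⌈d/2⌉, so when the physical error rate p_phys is below the threshold p_th, each step up in code distance d (which costs more physical qubits, roughly 2d² per logical qubit) multiplies the logical error rate by a constant factor below 1.

```python
def logical_error(p_phys, p_th=0.01, d=3, A=0.1):
    # Approximate surface-code suppression: below threshold, each increase
    # of the code distance d by 2 shrinks the logical error rate by a
    # constant factor (p_phys / p_th). A, p_th are illustrative constants.
    return A * (p_phys / p_th) ** ((d + 1) // 2)

for d in (3, 5, 7):
    print(d, logical_error(0.001, d=d))
# With p_phys = 0.001 and p_th = 0.01, each step d → d+2
# cuts the logical error rate by a factor of 10.
```

The point of the Willow result, as reported, is demonstrating that this below-threshold regime actually holds as the device grows, so adding qubits helps rather than hurts.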
It has progressed since: IBM's Condor (demonstrated in December 2023) has 1,121 qubits.