Here's what really happens. If you have a string of n qubits, when you measure them they end up at random in one of the 2^n possible configurations. However, if you first apply some operations to your n qubits using quantum gates, you can usefully bias their wave functions, such that certain configurations become much more likely to appear. (You can't apply too many of these operations, however, as that runs the risk of decoherence.) Hopefully, you can do this in such a way that the favored configurations are the answer to a problem you want to solve.
So then, if you have a quantum computer set up this way, you can run it a bunch of times, and if everything goes well, after enough iterations you will notice a bias towards certain configurations of the bit string. If you can do this often enough to reach statistical significance, you can be pretty confident you've found your answer.
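As a toy illustration of the "run it a bunch of times and look for the bias" idea, here's a purely classical simulation. The probabilities below are made up; a real quantum circuit would produce its own biased distribution, but the sampling-and-counting step looks the same:

```python
import random
from collections import Counter

# Toy model (purely classical): a hypothetical 3-qubit circuit whose
# measurement statistics are biased toward the "answer" configuration 0b101.
# The probabilities are illustrative, not from any real circuit.
n = 3
probs = {i: 0.05 for i in range(2 ** n)}   # flat background probability
probs[0b101] = 1 - 0.05 * (2 ** n - 1)     # the biased "answer" state

def run_once(rng):
    """Simulate one prepare-and-measure cycle: draw one of the 2^n outcomes."""
    r = rng.random()
    cumulative = 0.0
    for outcome, p in probs.items():
        cumulative += p
        if r < cumulative:
            return outcome
    return 2 ** n - 1

rng = random.Random(0)
counts = Counter(run_once(rng) for _ in range(1000))
best = counts.most_common(1)[0][0]
print(f"most frequent outcome: {best:0{n}b}")  # the bias shows up as 101
```

With 1000 runs the statistical bias toward the answer state is unmistakable, even though any single run can return any of the 8 configurations.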
https://www.youtube.com/watch?v=IrbJYsep45E
https://www.youtube.com/watch?v=wUwZZaI5u0c
EDIT: I rather like Isaac Arthur, but unfortunately his Quantum Computing episode is an example of exactly this kind of popular misconception. I've called him out on it in the comments.
https://www.youtube.com/watch?v=wgCuKTN8sX0
EDIT: I can't find my comment anymore, and I've also discovered that I'm now banned from the public Facebook group! Hmmm.
EDIT: It seems that Isaac did correct his video, kind of. He still seems to advocate the 2^n parallelism view, but then explains why that can't work around 18 minutes in.
This is an even more succinct way of saying it. It's the key point people need to know, the one that separates understanding the real potential of quantum computers from "quantum woo."
Let's be clear, even with this caveat, the potential of quantum computing is enormous.
No question. But to think clearly about it, one has to know this caveat. Likewise, to understand the potential of conventional computers, one needs to actually be able to think about computation within its realistic limits, not treat them as magic boxes.
The answer: we don’t know. We don’t even know if BQP is actually any bigger than P.
However, we’ve figured out how to do some things quickly on quantum computers that we haven’t ever figured out how to do quickly on classical computers, like certain mathematically useful operations on abelian groups.
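A concrete example of such an operation on abelian groups is order finding, the subroutine at the heart of Shor's algorithm: given a and N with gcd(a, N) = 1, find the smallest r > 0 with a^r ≡ 1 (mod N). Classically we only know slow methods; a quantum computer finds r in polynomial time, and the rest of factoring is easy classical post-processing. A brute-force classical sketch (illustrative, exponential in the bit length of N):

```python
from math import gcd

def order(a, N):
    """Brute-force classical order finding: smallest r with a^r = 1 (mod N).
    This loop is what a quantum computer replaces with a polynomial-time step."""
    assert gcd(a, N) == 1
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

# Classical post-processing step of Shor's algorithm:
# if r is even, gcd(a^(r/2) - 1, N) often yields a nontrivial factor of N.
N, a = 15, 7
r = order(a, N)                      # order of 7 mod 15 is 4
f = gcd(pow(a, r // 2) - 1, N)       # gcd(48, 15) = 3, a factor of 15
print(r, f)
```

The quantum speedup lives entirely in the `order` step; everything around it is ordinary classical arithmetic.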
One specific example is that you can do Fourier transforms in O(log(n)log(log(n))) rather than O(n log(n)), which is pretty cool. If your vector space is too big to even represent in classical memory, you might still be able to work with it on a quantum computer.
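Some rough arithmetic makes the gap vivid. Treating n as the vector length and ignoring constants (and, importantly, ignoring the cost of getting data in and out of the quantum state, which is often the real bottleneck), the gate counts compare like this:

```python
import math

# Back-of-envelope comparison, constants ignored: classical FFT on a
# length-N vector costs ~N log N operations, while an (approximate) quantum
# Fourier transform on the log2(N) qubits encoding that vector costs
# ~log(N) * log(log(N)) gates. Purely illustrative arithmetic.
for N in (2 ** 20, 2 ** 40, 2 ** 60):
    n_qubits = int(math.log2(N))
    fft_ops = N * n_qubits                        # ~N log N
    qft_gates = n_qubits * math.log2(n_qubits)    # ~log N * log log N
    print(f"N=2^{n_qubits}: FFT ~{fft_ops:.1e} ops, AQFT ~{qft_gates:.0f} gates")
```

At N = 2^60 the classical side needs ~10^19 operations on a vector it can't even store, while the quantum side needs a few hundred gates on 60 qubits; the catch is that you only get to measure, not read out, the transformed amplitudes.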
Why would there be anything you could compute on a quantum computer that you can't compute on a classical one, provided the classical one is powerful enough, or is given enough time for the algorithms quantum computers are merely more efficient at?
Is there speculation that a quantum computer could be used for hypercomputing?
Quantum computers don't have any space advantage, I thought?
https://blog.acolyer.org/2018/02/02/polynomial-time-algorith...
> that the effort required to obtain a low enough error level for any implementation of universal quantum circuits increases exponentially with the number of qubits, and thus, quantum computers are not possible.
That's what dynamical decoupling (DD) or pulse sequences are for: you can get arbitrarily high-quality quantum gates (typically, but not always, at the cost of longer gate times; removing the most significant first-order error typically increases the gate time by less than an order of magnitude, think 2x-4x) without increasing the number of physical qubits at all. People don't just rely on surface codes; anyone serious about implementing a quantum computer uses surface codes after DD has reduced the infidelity to the threshold they require. Which is why you don't need hundreds of physical qubits to have a single robust logical qubit.
For those who are not familiar, DD is one of the oldest tricks in the bag; it's nothing like a cutting-edge quantum error correction code. In fact, the oldest form of DD, spin echo, precedes any real discussion of quantum computers by decades.
DD is possible essentially because quantum operations don't commute so errors don't simply add up as they do with classical errors; this makes it possible to obtain a better gate by carefully combining noisy gates such that (significant) errors cancel.
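The spin-echo version of this is easy to see in a toy model: a qubit in superposition accumulates an unknown but static phase error delta * t, and inserting a pi (flip) pulse at the halfway point makes the second half of the evolution cancel the first. The numbers below are made up; real noise isn't perfectly static, which is why more elaborate DD sequences exist:

```python
import cmath
import random

# Toy spin-echo demo: average coherence |<exp(i*phi)>| over many runs with
# a random but *static* (per-run) detuning delta. With a pi pulse at T/2,
# the sign of phase accumulation flips, so the static error cancels exactly.
rng = random.Random(42)
T = 1.0          # total evolution time (arbitrary units)
runs = 1000

def phase_no_echo(delta):
    return delta * T                  # full phase error accumulates

def phase_with_echo(delta):
    first = delta * (T / 2)           # accumulate for the first half
    second = -delta * (T / 2)         # pi pulse inverts the accumulation
    return first + second             # static error refocuses to zero

deltas = [rng.gauss(0.0, 3.0) for _ in range(runs)]
coh_plain = abs(sum(cmath.exp(1j * phase_no_echo(d)) for d in deltas) / runs)
coh_echo = abs(sum(cmath.exp(1j * phase_with_echo(d)) for d in deltas) / runs)
print(f"coherence without echo: {coh_plain:.3f}, with echo: {coh_echo:.3f}")
```

Without the echo the random phases average the coherence toward zero; with it, coherence stays at 1 despite the noise, and no extra qubits were needed.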
I'm not sure if I could distinguish what you just said from a markov-chain-based paper generator.
I feel old.
I wouldn't expect you to be familiar with it unless you've actually worked in academia doing quantum information research. This isn't stuff we teach in class at all, even to PhD students; it's part of research.
It'll probably also read like it came "from a markov-chain-based paper generator" to you, but if you're interested in how actual noise behaves, you can read this for example: https://www.nature.com/articles/nphys1994 (arxiv link if you don't have access: https://arxiv.org/abs/1101.4707)
While it's about flux qubits, the 1/f behavior is almost universal among the currently promising candidates, and DD works in virtually all quantum computer implementations.
It’s along the same lines as, in a debate, asking each participant to sum up their opponent’s point of view in a way that they agree with. A fundamental tool for productive discourse.
Really enjoyed the interview.
I suspect that any speedup imparted by clever uses of entanglement will be counteracted by the fundamental need for error correction and averaging over several runs. I look forward to being proven either right or wrong, and I suspect I will be in my lifetime.
Perhaps you don't understand what people mean when they say scalable quantum computing? They mean that it scales with the error correction included. Needing to average over several runs is already factored into the complexity of quantum algorithms.
Also, whether you can make a scalable quantum computer isn't beside the point. It's the entire point Kalai's trying to make. If you can make a scalable quantum computer, we already know there are applications where it will outperform a classical computer on performance and cost.
It's important to remember that the motivation for quantum computing came from the realization that it's intractable to simulate quantum phenomena on classical computers. So any claim that classical computers can do just as well as quantum ones needs to butt up against this realization at some point (IMHO).
I'm looking forward to it either way.
On the other hand, if nobody can provide a mathematical proof that it's impossible, it could just be engineering. Then again, "just engineering" doesn't rule out that a useful quantum computer may require practically impossible configurations (i.e., having to sit at the center of a sphere of lead dozens of miles in radius cooled to near absolute zero is not strictly impossible, but isn't something we're going to build anytime soon), or that we simply won't be able to figure out the requisite configurations.
Alternatively, this guy will prove to be wrong, and someone will just build a quantum computer with a couple hundred qubits, at which point we may have enough data to draw the curve observationally rather than via math and the simplifications required to make it tractable, and maybe it'll be fine.
I agree that these are unusual and perhaps even strange lines of analysis.
What I mean by traditional computers catching up is that we will see innovations in traditional algorithms that either make quantum computers extremely pricey relative to traditional computers for a very long time, or that bypass the things quantum computers are good at without violating any of our current theories.
Currently, we have an enormous effort invested in our silicon architecture. We seem to be hitting the limits of silicon, and the industry seems to be making a large bet on quantum computers to eventually allow it to proceed. But if quantum algorithms take a long time to have useful implementations, then we have a local maximum where quantum computers aren't close enough to implement without investing a massive amount of money, while silicon just keeps getting cheaper. Even when Intel finally reaches the limits of traditional computing, that doesn't mean silicon will stop getting cheaper.
Intel, IBM, Microsoft, DWave, Google and other companies have small bets on quantum computer engineering strategies that may or may not pay off. There also seems to be a growing number of researchers publishing papers on the theory of quantum algorithms, and there are cross-collaborations with academia on quantum computation (the research institutions basically work with the quantum engineering departments in those companies).
If we find that quantum computers can only speed up a limited set of algorithms, then they will probably end up as accelerator cards attached to traditional computers. I expect an order of magnitude of inefficiency in translating problems onto, or doing work on, quantum computers, whether due to the interconnect or to the way we figure out how to operate them. So we will see these systems relegated to supercomputer workloads.
On the algorithms side, funding for quantum algorithms research will probably dry up in the way that string theory research funding has.
Hopefully we will see a large breakthrough that will make quantum computers feasible. From my reading of the research it seems like Topological Quantum Computers have the best chance.
Doesn't work for things we know are exponentially faster on quantum computers. Eventually, with enough (but not an absurd number of) qubits, a decent quantum computer can solve problems that wouldn't be solvable on a classical computer the size of the universe operating for billions of years.
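The "computer the size of the universe" line isn't hyperbole for exponential search spaces; some back-of-envelope arithmetic with made-up but generous numbers:

```python
# Illustrative arithmetic only: compare a 2^n search space against a wildly
# optimistic classical budget. All numbers are rough assumptions.
seconds_in_universe = 4.3e17     # ~13.8 billion years, in seconds
ops_per_second = 1e18            # a generous exascale machine
machines = 1e80                  # roughly one machine per atom in the universe
n = 400                          # qubits / problem bits
states = 2 ** n                  # ~2.6e120 configurations to search

total_classical_ops = seconds_in_universe * ops_per_second * machines  # ~4.3e115
print(states > total_classical_ops)  # prints True: brute force loses
```

Of course, this only rules out brute force; the interesting (and open) question is which problems truly have no clever classical shortcut.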
Improvements in algorithms cut both ways, too. And for some classes of problems, it's possible to prove (given P != NP, etc.) that any classical algorithm is much worse than the quantum one.
IIUC and IIRC (both somewhat doubtful, admittedly), Feynman argued - https://people.eecs.berkeley.edu/~christos/classics/Feynman.... - that only a quantum computer can perform a scalable simulation of quantum physics. Assuming that is correct, there is no way for "traditional computers" to catch up on all problems, at least.