Google plans to demonstrate quantum supremacy in the next couple of months: a quantum computer performing a task that cannot feasibly be simulated on a classical computer [1]. These near-term (5-10 year) quantum computers will likely be used for simulating quantum chemistry and solving optimization problems [2].
[0]: https://www.technologyreview.com/s/609451/ibm-raises-the-bar...
[1]: https://www.technologyreview.com/s/609035/google-reveals-blu...
[2]: https://www.nature.com/news/commercialize-quantum-technologi...
D-Wave has a 2000-qubit machine.
Google and IBM want their qubits to be of high quality (high coherence times, etc.). One of the big obstacles right now is scaling up the number of qubits while keeping their quality high.
[0]: https://en.wikipedia.org/wiki/Quantum_annealing
[1]: https://www.research.ibm.com/ibm-q/resources/quantum-volume....
Shameless plug, check out my book https://www.amazon.com/dp/0992001021/noBSLA for an in-depth view of the linear algebra background necessary for quantum computing.
If you know linear algebra well, then quantum mechanics and quantum computing are nothing fancy: just an area of applications (see Chapter 9 on QM). Here is an excerpt: https://minireference.com/static/excerpts/noBSguide2LA_previ...
Check out these quantum states as examples; they cannot be broken into a tensor product of two other states, i.e. they are not separable:
Ex.1: 1/sqrt(2)∣00⟩ + 1/sqrt(2)∣11⟩ ≠ ∣ψ1⟩⊗∣ψ2⟩
Ex.2: 1/sqrt(2)∣01⟩ − 1/sqrt(2)∣10⟩ ≠ ∣ψ1⟩⊗∣ψ2⟩
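A quick way to verify this numerically (a sketch in numpy, not from the original comment): a two-qubit state a|00⟩+b|01⟩+c|10⟩+d|11⟩ factors as (α|0⟩+β|1⟩)⊗(γ|0⟩+δ|1⟩) exactly when ad − bc = 0, since a=αγ, b=αδ, c=βγ, d=βδ gives ad − bc = αγβδ − αδβγ = 0. For both examples above the determinant is nonzero:

```python
import numpy as np

def is_separable(a, b, c, d, tol=1e-12):
    # The state a|00> + b|01> + c|10> + d|11> is separable iff
    # the 2x2 coefficient matrix [[a, b], [c, d]] has rank 1,
    # i.e. its determinant a*d - b*c vanishes.
    return abs(a * d - b * c) < tol

s = 1 / np.sqrt(2)
print(is_separable(s, 0, 0, s))      # Ex.1: False (entangled)
print(is_separable(0, s, -s, 0))     # Ex.2: False (entangled)
print(is_separable(0.6, 0, 0.8, 0))  # (0.6|0>+0.8|1>) tensor |0>: True
```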
So, all the previous two-qubit examples he showed yielded 4 states (00, 01, 10, 11). Is the author saying that some two-qubit systems can be achieved in which not all 4 possible discrete states can participate in the superposition? (i.e., in the system of Ex. 1, states 10 and 01 are not possible, and in Ex. 2, states 00 and 11 are not possible?)
The classification of states as separable vs entangled refers to the existence of a local description for the two qubits. Remember that |00⟩ is shorthand for |0⟩⊗|0⟩, meaning the state of the two-qubit system when qubit 1 is in state |0⟩ and qubit 2 is in state |0⟩.
Separable states can be written in the form (α|0⟩+β|1⟩)⊗(γ|0⟩+δ|1⟩) = ∣ψ1⟩⊗∣ψ2⟩. Note there is a clear local description ∣ψ1⟩ for the first qubit and a separate local description ∣ψ2⟩ for the second. If Alice prepares her qubit 1 in the state ∣ψ1⟩ and Bob prepares his qubit 2 in the state ∣ψ2⟩, then the combined description of their two qubits is what's shown above. The state of the combined system is describable as the tensor product of two separate local descriptions.
Entangled states, on the contrary, are states that cannot be described as the tensor product of two local descriptions. Specifically, there exist configurations a,b,c,d for a two-qubit quantum system such that
a|00⟩ +b|01⟩ +c|10⟩ + d|11⟩ ≠ (α|0⟩+β|1⟩) ⊗ (γ|0⟩+δ|1⟩)
no matter what choice of α,β,γ,δ you make. The phenomenon of entanglement is a quantum-only thing that can't really be understood via classical analogies, since classical systems can necessarily be described as the combination of two local descriptions. The examples given are two of the four Bell states, see https://en.wikipedia.org/wiki/Bell_state , but there are many more entangled states. Here is a concrete physics example: https://en.wikipedia.org/wiki/Singlet_state#Singlets_and_Ent... Interestingly, many of the quantum computing experiments performed involve the manipulation of entangled states, because they serve as proof that something quantum is going on...
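To make the tensor-product structure concrete (a numpy sketch under the notation above): a separable state is just the Kronecker product of the two local descriptions, and reshaping the four coefficients into a 2x2 matrix exposes the difference, since separable states have matrix rank 1 (Schmidt rank 1) while entangled states have rank 2:

```python
import numpy as np

# Local descriptions for Alice's and Bob's qubits (any normalized pairs work).
psi1 = np.array([0.6, 0.8])            # alpha|0> + beta|1>
psi2 = np.array([1, 1]) / np.sqrt(2)   # gamma|0> + delta|1>

# Combined separable state: coefficients of |00>, |01>, |10>, |11>.
separable = np.kron(psi1, psi2)

# A Bell state for comparison: (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Reshape to 2x2; the matrix rank (the Schmidt rank) distinguishes the two.
print(np.linalg.matrix_rank(separable.reshape(2, 2)))  # 1 (separable)
print(np.linalg.matrix_rank(bell.reshape(2, 2)))       # 2 (entangled)
```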
Keep up the good work!
They are similar to the old-school analog electric computers. They have some interesting properties, and I can actually imagine programming one, unlike DQ.
There's a fourth class, continuous analog with entanglement, which is superior to both DQ and continuous-variable quantum computers, but right now we should really be looking into the continuous-variable ones.
Three good references:
Book on quantum and classical computing: Aaronson's "Quantum Computing since Democritus" for gentle-for-newbies but rigorous discussion
Very old paper on classical computing (analog vs digital): Von Neumann's "Probabilistic logics and the synthesis of reliable organisms from unreliable components" (pretty advanced)
Newish (old for the field) paper on quantum computing: Calderbank and Shor's "Good Quantum Error-Correcting Codes Exist"
Edit and addition: I work at Yale's Quantum Institute and we are some of the biggest proponents of "continuous variable" quantum computing. We use the continuous variables to encode a discrete "qudit" (with a "d") representation for the information, for all the reasons mentioned above (noise and error correction).
[1] Jose Luis Rosales, Vicente Martin, "Quantum Simulation of the Factorization Problem", 9 Nov 2016. (https://arxiv.org/abs/1601.04896)
This is pretty cool, but I think the main point of the article is to show an amazing link between number theory and quantum mechanics, not to suggest a practical quantum algorithm. Although, for certain hardware implementations, what they are suggesting might be easier than Shor's algorithm, simply for technical reasons.
Either way, this is not "continuous variable" or "analog" in the sense discussed above. If you are interested in the distinction, you can also check out Adiabatic Quantum Computing, which also seems continuous at first sight but is equivalent to the typical model of quantum circuits (the trick being the existence of a "gap" between the ground state, which we are adiabatically evolving, and all the other states of the system).
I threw out a bunch of terminology without defining it, but I will have to leave googling those terms to you. I would be happy to attempt to answer questions, though.
Photonic systems have plenty of error correcting schemes.
There are things we can do to suppress errors and make a slightly bigger analog computer (classical or quantum), but there is no way to make an analog system scalable (i.e., not just lowering errors to some floor, but completely eliminating them).
The book cited in my previous comment explains the details in a fairly understandable fashion, but it takes up a whole chapter. The gist of it is, you need some minimal "logic distance" between your data "levels" in order to be able to distinguish them and correct deviations. This requires digitization.
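The simplest classical illustration of that "logic distance" idea is the three-bit repetition code (a toy sketch of my own, not the book's construction): the two code words 000 and 111 sit at Hamming distance 3, so any single flipped bit still lands closer to the original word and a majority vote corrects it.

```python
import random

def encode(bit):
    # Repetition code: code words 000 and 111 are at Hamming distance 3,
    # so any single-bit error can be corrected.
    return [bit] * 3

def decode(bits):
    # Majority vote snaps a noisy word back to the nearest code word.
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
errors = 0
trials = 10000
for _ in range(trials):
    bit = random.randint(0, 1)
    # Flip each physical bit independently with probability 0.05.
    noisy = [b ^ (random.random() < 0.05) for b in encode(bit)]
    errors += decode(noisy) != bit
print(errors / trials)  # roughly 3*0.05^2, well below the raw 0.05 error rate
```

The point is that correction only works because the representation is digital: deviations get snapped back to one of a few well-separated levels, which has no analog counterpart.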
If somebody finds a way to error-correct analog representations, they will have a way to solve NP-complete problems, for instance. The Nobel Prize will be the least of their recognitions.
If you were to work with an infinite-dimensional system, you'd still be employing finite truncations of infinite-dimensional operators, leading you back, more or less, to a finite-dimensional subspace.
Digitization—or rather, discretization—is important, and the reason computers have managed to be so successful. And programming a system like a universal gate-based quantum computer can be done now, with languages like Quil and libraries like pyQuil [0].
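To give a flavor of what such a gate-based program does, here is a plain-numpy sketch of the smallest interesting circuit (my own illustration, not a Quil or pyQuil snippet): a Hadamard on qubit 0 followed by a CNOT turns |00⟩ into a Bell state.

```python
import numpy as np

# One- and two-qubit gates as unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to the first qubit, then CNOT.
state = np.array([1, 0, 0, 0], dtype=float)
state = np.kron(H, I) @ state
state = CNOT @ state

print(state)  # [0.7071..., 0, 0, 0.7071...]: the Bell state (|00>+|11>)/sqrt(2)
```

In pyQuil the same circuit is expressed as gate instructions rather than explicit matrices, but the underlying linear algebra is exactly this.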
https://quantumexperience.ng.bluemix.net/qx/experience
They also include a brief tutorial on how to program for it too.
1) A QC is able to derive the private-keys for a wallet's public address, allowing for the theft of bitcoin
2) A QC is able to perform the proof-of-work algorithm to mine new blocks at an order-of-magnitude faster rate than currently possible.
Fortunately for 1), I think it currently takes 2^512 (?) operations to break the private/public-key algorithm, which is infeasible to brute-force on normal hardware; a QC brings it down to 2^128, but that's still on the order of infeasible, and in the event it ever does happen, the blockchain could be changed overnight to use a new keying algorithm. As for 2), it would push the blockchain difficulty up so high that people with QC machines would see the same ROI as today's industrial GPU and ASIC miners. Plus, given that QCs are horrendously expensive (think billions of USD for a 50-qubit general-purpose QC), it's questionable why you'd ever try to break Bitcoin when you'd already be a billionaire.
Where did you get that info? Quantum computers can break ECDSA in time polynomial in the key size (512).
> the blockchain could be changed overnight to use a new keying algorithm.
How?
No, as even if the current underlying crypto falls (for whatever reason), the ledger up to that point could very likely endure in a new form.
Outside of the blockchain concept itself, the oft-ignored indirect gift of bitcoin's longevity is the way it is providing us a new digital mechanism for the international distribution of wealth: as more people join, the ledger proceeds, and mined coins are exchanged more widely.
Full-disclosure: no horses in this race yet
Probably.