EDIT: My comment was kind of snarky and curt without much info; I need to be better about that. I did read Scott's post when it was published, and the consensus I got from it is that D-Wave's device achieves a constant-factor speedup, not a quantum one. I just think, with that being generally accepted by all parties now, it's pretty disingenuous to keep calling it a "quantum computer", when it's really a (faster) classical computer that uses quantum mechanical effects. I mean, the Intel chip in my laptop also uses QM effects because the feature size is so small, but nobody calls it a quantum computer. Maybe that's the media's fault though. Are D-Wave/Google themselves still saying "quantum computer"?
SECOND EDIT: Rereading Scott's post more carefully, it seems like Google and D-Wave are now calling it a "quantum annealing device" and are more forthcoming about the lack of quantum speedup. So unless they're talking out of both sides of their mouth and still saying "quantum computer" to the popular press to build hype, I guess everyone is a reasonable person after all and it's the media's fault as usual.
I am wondering whether they demonstrated that quantum entanglement is taking place between qubits, and whether they observed an asymptotic speedup.
Wonder if there is some hesitation to just say: yeah, this doesn't work. We spent all this time and money on it, but that's OK; we learned what doesn't work, which is still valuable.
I studied quantum computing a while back (maybe 7-8 years ago). There was a feeling it was going to be the next big thing. Grover's search and Shor's factoring were promising. But I think we ended up mostly with pictures of cats in the cloud and Javascript on the server ;-) which is ok too, I guess.
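For context on why Grover's search looked so promising: it finds one marked item among N using only about π/4·√N oracle queries, versus about N/2 expected queries classically. Here's a minimal classical simulation of the amplitude-amplification loop (a toy sketch; the function name and parameters are mine):

```python
import math

def grover_success_prob(n_items, marked, iterations):
    """Classically simulate Grover's algorithm on a length-N amplitude vector."""
    amp = [1 / math.sqrt(n_items)] * n_items  # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: flip the marked amplitude's sign
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]     # diffusion: inversion about the mean
    return amp[marked] ** 2                   # probability of measuring the marked item

N = 1024
k = round(math.pi / 4 * math.sqrt(N))  # ~25 queries, vs ~512 expected classically
print(k, grover_success_prob(N, marked=7, iterations=k))
```

After ~√N iterations the success probability is close to 1, which is the quadratic speedup; run it longer and the probability starts oscillating back down.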
The previous HN discussion clarifies that Google reports a constant (although large) speedup for a specific application.
I think basically if you grasp why Selby's algorithm is able to beat D-Wave, you'll understand why D-Wave isn't a useful quantum computer, whatever terminology you want to use to describe it.
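For anyone unfamiliar with the comparison: both D-Wave's annealer and its classical competitors are minimizing Ising-type energy functions. This is not Selby's algorithm itself (which, as I understand it, exactly optimizes over low-treewidth subgraphs of the hardware graph); it's just plain simulated annealing on a toy Ising instance, to sketch the kind of classical competitor involved. All names here are my own:

```python
import math
import random

def anneal_ising(J, h, steps=20000, t_hi=5.0, t_lo=0.05, seed=0):
    """Simulated annealing on E(s) = sum_{i<j} J[i][j]*s_i*s_j + sum_i h[i]*s_i,
    with J a full symmetric matrix and spins s_i in {-1, +1}."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice((-1, 1)) for _ in range(n)]

    def energy(spins):
        e = sum(h[i] * spins[i] for i in range(n))
        e += sum(J[i][j] * spins[i] * spins[j]
                 for i in range(n) for j in range(i + 1, n))
        return e

    e = energy(s)
    best, best_e = s[:], e
    for step in range(steps):
        t = t_hi * (t_lo / t_hi) ** (step / steps)  # geometric cooling schedule
        i = rng.randrange(n)
        # Energy change from flipping spin i
        de = -2 * s[i] * (h[i] + sum(J[i][j] * s[j] for j in range(n) if j != i))
        if de <= 0 or rng.random() < math.exp(-de / t):
            s[i] = -s[i]
            e += de
            if e < best_e:
                best, best_e = s[:], e
    return best, best_e

# Demo: a 6-spin ferromagnet (every pairwise coupling -1).
# The ground state is all spins aligned, with energy -C(6,2) = -15.
n = 6
J = [[0 if i == j else -1 for j in range(n)] for i in range(n)]
h = [0.0] * n
spins, e_min = anneal_ising(J, h)
print(spins, e_min)
```

The point is that heuristics like this (and smarter ones, especially on custom hardware) set the classical baseline any claimed quantum speedup has to beat.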
N.B. Even Aaronson admits D-Wave may have useful technology[2]; it's just that the present "dirty qubit" approach doesn't give you a machine useful for anything more than basic science research, which is not at all how D-Wave has hyped it.
[1] http://news.mit.edu/2015/3q-scott-aaronson-google-quantum-co...
[2] http://www.scottaaronson.com/blog/?p=2555#comment-967324
From the abstract.
That there are classical algorithms that might get a similar speedup, especially on custom hardware, validates my prior concern.
It's not naively clear where the line is drawn between "speedup that could be obtained through clever heuristic/nondeterministic/whatever classical algorithms" and "speedup that is necessarily the domain of quantum computation." I suspect the question is a deep (open?) one in computational complexity theory.
Just to include me in the development somehow would have meant so much to me; instead I feel stabbed in the back by someone who was a hero to me. It's my fault because I foolishly told him and I didn't publish my code first, but that's still how I feel.
Quantum everything is obviously the future of technology, but it is a long, long path.
I get more excited about fundamental quantum research and research on new methods of testing quantum states.
People claiming to have full working computers of any value, if they even do work, are selling smoke and mirrors like Elizabeth Holmes of Theranos. It's nearly criminal behavior, alleging things are simply what they aren't in order to get money.
There is an epidemic of hack science and engineering occurring to secure and suck in scarce funds, and wherever there is military-related big money the likelihood of fraud is even bigger.
Read the book about the hafnium bomb and you will understand why military investors are frequently so easy to scam: a two-pronged combination of greed and the desire to have unstoppable weapons.
Paradoxically enough, if more pre-investigation were done, they'd have more money to spend on real R&D. I guess economists would just call this a massive case of malinvestment.
There were a few discussions at the time (e.g. https://news.ycombinator.com/item?id=10698317) but perhaps the community is interested in more.