My first reaction is that it's way too optimistic: full quantum computers are really hard and slow to develop, and not actually that useful, since aside from famous applications like integer factoring (which, as far as I know, is purely destructive - of security) there aren't that many speedups available from known quantum algorithms: https://en.wikipedia.org/wiki/Category:Quantum_algorithms Factoring 15 isn't an impressive achievement.
Then I notice the caveat 'for discrete optimization', and that it's D-Wave stuff. I'm not actually sure what D-Wave is doing or what it'd be used for. I look at http://en.wikipedia.org/wiki/Quantum_annealing and it says it's an optimization process like simulated annealing (OK, I know what that is) which uses quantum tunneling for the annealing jumps (I'm not sure why that's better than regular RNG-guided jumps across fitness landscapes, but I'll roll with it). What's the speedup here? "This O(N^{1/2}) advantage in quantum search (compared to the classical effort growing linearly with \Delta or N, the problem size) is well established".
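For reference, classical simulated annealing is roughly the following (a minimal sketch on a toy 1-D fitness landscape; the function, cooling schedule, and all parameters here are illustrative choices, not anything specific to D-Wave):

```python
import math
import random

def simulated_annealing(f, x0, steps=20000, t0=2.0):
    """Minimize f by RNG-guided jumps across the fitness landscape,
    accepting uphill moves with probability exp(-delta/T) while the
    temperature T cools toward zero."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-9   # linear cooling schedule
        cand = x + random.gauss(0, 1)     # random jump to a neighbor
        fc = f(cand)
        # Always accept downhill moves; accept uphill moves with
        # probability that shrinks as the temperature drops.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Toy rugged landscape: a parabola plus ripples (local minima),
# with its global minimum near x = -0.3.
random.seed(0)
landscape = lambda x: x * x + 2 * math.sin(5 * x)
x, fx = simulated_annealing(landscape, x0=10.0)
print(x, fx)
```

Quantum annealing replaces the thermal "jump over the barrier" acceptance step with tunneling *through* barriers, which is the claimed source of the advantage.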
So it's a linear classical algorithm versus a sub-linear square-root one (faster-growing than O(log N), if that helps). This is... OK, but it's certainly no Shor's algorithm, which turns a super-polynomial factoring problem into a polynomial-time one! And then there's the question of how much the extra accuracy is actually worth. (If the difference between the answer returned by your best classical computer doing N ops and the quantum chip which can do N^2 equivalent ops is 0.00001%, this may or may not matter in practice.) This summary isn't consistent with the 'Scaling Benchmark - QC vs Intel', which is claiming advantages which increase faster than a square (300 qubits = 500x; 400 qubits = 1,000,000x; 512 qubits = 10,000,000,000x; but 300^2 = 90,000; 400^2 = 160,000; 512^2 = 262,144), so I'm obviously missing something or doing something wrong. Between these 3 large issues, I can't be confident of any predictions about the value of future D-Wave chips. Well, let's take that at face value.
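The arithmetic mismatch can be laid out directly; here qubit-count squared is my crude stand-in for the largest advantage a square-root speedup could plausibly deliver at that size (the claimed figures are from the benchmark slide):

```python
# Claimed D-Wave advantage vs. a naive quadratic (Grover-style) bound.
claimed = {300: 500, 400: 1_000_000, 512: 10_000_000_000}
for qubits, advantage in claimed.items():
    bound = qubits ** 2
    exceeds = "exceeds" if advantage > bound else "is under"
    print(f"{qubits} qubits: claimed {advantage:,}x {exceeds} "
          f"the quadratic bound of {bound:,}x")
```

The claimed advantage at 400 and 512 qubits outruns any quadratic bound, which is why the benchmark looks inconsistent with the O(N^{1/2}) story.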
Business-wise, they seem to be in a little trouble, dependent on massive external investment (eg. http://www.technologyreview.com/news/429429/the-cia-and-jeff-bezos-bet-on-quantum-computing/ ) with their current product costing $10m and the one published example of use being questionable http://blogs.nature.com/news/2012/08/d-wave-quantum-computer-solves-protein-folding-problem.html :
> The model consisted of mathematical representations of amino acids in a lattice, connected by different interaction strengths. The D-Wave computer found the lowest configurations of amino acids and interactions, which corresponds to the most economical folding of the proteins. It worked, but not particularly well. According to the researchers, 10,000 measurements using an 81-qubit version of the experiment gave the correct answer just 13 times. This was owing, in part, to the limitations of the machine itself, and in part to thermal noise that disrupted the computation. It’s also worth pointing out that conventional computers could already solve these particular protein folding problems.
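A quick back-of-the-envelope on that 13-in-10,000 figure: at that per-run success rate, thousands of repetitions are needed before you'd expect to see the right answer even once with high confidence (a sketch; the 99% confidence threshold is my arbitrary choice):

```python
import math

p = 13 / 10_000   # reported per-run success rate, ~0.13%
# Runs needed so that P(at least one success) >= 99%,
# i.e. 1 - (1 - p)^runs >= 0.99:
runs = math.ceil(math.log(1 - 0.99) / math.log(1 - p))
print(f"per-run success {p:.2%}; ~{runs:,} runs for 99% confidence")
```

And that assumes you can recognize the correct answer when you see it, which for hard optimization problems you generally can't.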
This suggests to me that they're already having serious scaling issues. Or http://www.technologyreview.com/news/424163/tapping-quantum-effects-for-software-that-learns/2/
> "You send in your problem and then get back a much more accurate result than you would on a conventional computer," says Rose. He says tests have shown software using the D-Wave system can learn things like how to recognize particular objects in photos up to 9 percent more accurately than a conventional alternative. Rose predicts that the gap will rapidly widen as programmers learn to optimize their code for the way D-Wave's technology behaves.
I'm actually reminded a lot of Symbolics and Thinking Machines, reading about D-Wave.
So, existing predictions:
1. Intel selling QC by 2025: http://predictionbook.com/predictions/4444 4%
2. >100 factored by 2020: http://predictionbook.com/predictions/3213 40% (may already have happened)
By 2030: http://predictionbook.com/predictions/3211 55%
By 2045: http://predictionbook.com/predictions/3212 75%
3. >100 qubit system by 2011: http://predictionbook.com/predictions/1502 1%, but I judged this as happening because D-Wave sold the Lockheed system in the nick of time. Embarrassing. I should have specified a fully-general QC chip.
So, new predictions. My big concerns are business viability and scaling:
1. D-Wave will be bankrupt, bought, or defunct in 5 years: http://predictionbook.com/predictions/8566 30%
* in 10 years: http://predictionbook.com/predictions/8567 50%
2. D-Wave will be commercially selling a >=1024-qubit quantum annealing chip by 2015: http://predictionbook.com/predictions/8568 40%
* by 2020: http://predictionbook.com/predictions/8569 60%
3. D-Wave will be commercially selling a >=2048-qubit quantum annealing chip by 2016: http://predictionbook.com/predictions/8570 25%
* by 2018: http://predictionbook.com/predictions/8571 35%
Quantum computers will be a world changer because of their ability to practically simulate quantum mechanics. This will affect everything from medicine to biology to materials, energy and nanotech.
But you are correct that in general they are not much better than classical. Your skepticism about D-Wave is well placed: they have shown no real evidence that they are actually taking advantage of setting up correlations and leveraging interference. They do have something, but it is most likely not a QC. If they had a real QC it would be a big deal. Oct 6, 2012
Agree with +Deen Abiola, the jury is still very much out on D-Wave. Oct 6, 2012
http://www.scottaaronson.com/blog/?p=1136
"For me, three crucial points to keep in mind are:
(1) D-Wave still hasn’t demonstrated 2-qubit entanglement, which I see as one of the non-negotiable “sanity checks” for scalable quantum computing. In other words: if you’re producing entanglement, then you might or might not be getting quantum speedups, but if you’re not producing entanglement, then our current understanding fails to explain how you could possibly be getting quantum speedups.
(2) Unfortunately, the fact that D-Wave’s machine solves some particular problem in some amount of time, and a specific classical computer running (say) simulated annealing took more time, is not (by itself) good evidence that D-Wave was achieving the speedup because of quantum effects. Keep in mind that D-Wave has now spent ~$100 million and ~10 years of effort on a highly-optimized, special-purpose computer for solving one specific optimization problem. So, as I like to put it, quantum effects could be playing the role of “the stone in a stone soup”: attracting interest, investment, talented people, etc. to build a device that performs quite well at its specialized task, but not ultimately because of quantum coherence in that device.
(3) The quantum algorithm on which D-Wave’s business model is based — namely, the quantum adiabatic algorithm — has the property that it “degrades gracefully” to classical simulated annealing when the decoherence rate goes up. This, fundamentally, is the thing that makes it difficult to know what role, if any, quantum coherence is playing in the performance of their device. If they were trying to use Shor’s algorithm to factor numbers, the situation would be much more clear-cut: a decoherent version of Shor’s algorithm just gives you random garbage. But a decoherent version of the adiabatic algorithm still gives you a pretty good (but now essentially ”classical”) algorithm, and that’s what makes it hard to understand what’s going on here.
As I’ve said before, I no longer feel like playing an adversarial role. I really, genuinely hope D-Wave succeeds. But the burden is on them to demonstrate that their device uses quantum effects to obtain a speedup, and they still haven’t met that burden. When and if the situation changes, I’ll be happy to say so. Until then, though, I seem to have the unenviable task of repeating the same observation over and over, for 6+ years, and confirming that, no, the latest sale, VC round, announcement of another “application” (which, once again, might or might not exploit quantum effects), etc., hasn’t changed the truth of that observation." Oct 9, 2012
Update: D-Wave is still around, but has only reached 1,000 qubits rather than 2,000+, and is still having trouble demonstrating any real speedups.