My visit to D-Wave: Beyond the roast-beef sandwich

Last week I was in Vancouver, to give talks at the University of British Columbia and at the American Association for the Advancement of Science annual meeting.  As part of that visit, on Friday afternoon, John Preskill, John Martinis, Michael Freedman and I accepted a gracious invitation to tour the headquarters of D-Wave Systems in Burnaby (a suburb of Vancouver).  We started out in a conference room, where they served us cookies and sodas.  Being the mature person that I am, the possibility of the cookies being poisoned at no point crossed my mind.

Then we started the tour of D-Wave’s labs.  We looked under a microscope at the superconducting chips; we saw the cooling systems used to get the chips down to 20 millikelvin.  In an experience that harked back to the mainframe era, we actually walked inside the giant black cubes that D-Wave was preparing for shipment.  (The machines are so large partly because of the need for cooling, and partly to let engineers go in and fix things.)  Afterwards, D-Wave CTO Geordie Rose gave a 2-hour presentation about their latest experimental results.  Then we all went out to dinner.  The D-Wave folks were extremely cordial to us and fielded all of our questions.

In spite of my announcement almost a year ago that I was retiring as Chief D-Wave Skeptic, I thought it would be fitting to give Shtetl-Optimized readers an update on what I learned from this visit.  I’ll start with three factual points before moving on to larger issues.

Point #1: D-Wave now has a 128-(qu)bit machine that can output approximate solutions to a particular NP-hard minimization problem—namely, the problem of minimizing the energy of 90-100 Ising spins with pairwise interactions along a certain fixed graph (the “input” to the machine being the tunable interaction strengths).  So I hereby retire my notorious comment from 2007, about the 16-bit machine that D-Wave used for its Sudoku demonstration being no more computationally-useful than a roast-beef sandwich.  D-Wave does have something today that’s more computationally-useful than a roast-beef sandwich; the question is “merely” whether it’s ever more useful than your laptop.  Geordie presented graphs that showed D-Wave’s quantum annealer solving its Ising spin problem “faster” than classical simulated annealing and tabu search (where “faster” means ignoring the time for cooling the annealer down, which seemed fair to me).  Unfortunately, the data didn’t go up to large input sizes, while the data that did go up to large input sizes only compared against complete classical algorithms rather than heuristic ones.  (Of course, all this is leaving aside the large blowups that would likely be incurred in practice, from reducing practical optimization problems to D-Wave’s fixed Ising spin problem.)  In summary, while the observed speedup is certainly interesting, it remains unclear exactly what to make of it, and especially, whether or not quantum coherence is playing a role.
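To make the comparison concrete, here is a minimal sketch of the classical simulated-annealing baseline for an Ising minimization problem of roughly this flavor. Everything below (the ring graph, the cooling schedule, the sizes) is illustrative only — not D-Wave's actual hardware graph or the benchmarked instances:

```python
import math
import random

# Toy classical baseline: simulated annealing for an Ising energy
#   E(s) = sum over edges (i,j) of J_ij * s_i * s_j,  with s_i in {-1, +1}.
# The "input" is the matrix of coupling strengths J, as in the post.

def ising_energy(spins, couplings):
    return sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

def simulated_annealing(n, couplings, steps=20000, t_hot=2.0, t_cold=0.01):
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, couplings)
    for step in range(steps):
        t = t_hot * (t_cold / t_hot) ** (step / steps)  # geometric cooling
        i = random.randrange(n)
        # Energy change from flipping spin i: only edges touching i matter.
        delta = sum(-2.0 * J * spins[a] * spins[b]
                    for (a, b), J in couplings.items() if i in (a, b))
        # Metropolis rule: always accept downhill, sometimes accept uphill.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            spins[i] = -spins[i]
            energy += delta
    return spins, energy

random.seed(0)
n = 16
# Random +/-1 couplings on a ring, purely for illustration.
couplings = {(i, (i + 1) % n): random.choice([-1.0, 1.0]) for i in range(n)}
best_spins, best_energy = simulated_annealing(n, couplings)
```

A quantum annealer attacks the same objective by a physical process rather than this Monte Carlo loop; the benchmarking question in the post is whether, at large sizes, the hardware beats heuristics of exactly this kind.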

Which brings me to Point #2.  It remains true, as I’ve reiterated here for years, that we have no direct evidence that quantum coherence is playing a role in the observed speedup, or indeed that entanglement between qubits is ever present in the system.  (Note that, if there’s no entanglement, then it becomes extremely implausible that quantum coherence could be playing a role in a speedup.  For while separable-mixed-state quantum computers are not yet known to be efficiently simulable classically, we certainly don’t have any examples where they give a speedup.)  Last year, as reported on this blog, D-Wave had a nice Nature paper that reported quantum tunneling behavior in an 8-qubit system.  However, when I asked D-Wave scientist Mohammad Amin, he said he didn’t think that experiment provided any evidence for entanglement between qubits.

The “obvious” way to demonstrate entanglement between qubits would be to show a Bell inequality violation.  (We know that this can be done in superconducting qubits, as the Schoelkopf group at Yale among others reported it a couple years ago.)  Meanwhile, the “obvious” way to demonstrate a role for quantum coherence in the apparent speedup would be gradually to “turn down” the system’s coherence (for example, by adding an interaction that constantly measured the qubits in the computational basis), and check that the annealer’s performance degraded to that of classical simulated annealing.  Unfortunately, the D-Wave folks told us that neither experiment seems feasible with their current setup, basically because they don’t have arbitrary local unitary transformations and measurements available.  They said they want to try to demonstrate 2-qubit entanglement, but in the meantime, are open to other ideas for how to demonstrate a quantum role in the apparent speedup with their existing setup.
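For readers who haven't seen it, the CHSH form of the Bell inequality makes the "smoking gun" quantitative: any local classical model satisfies |S| ≤ 2, while the singlet state's correlations E(a,b) = −cos(a−b) reach |S| = 2√2 ≈ 2.83 at suitable measurement angles. A toy calculation of that arithmetic (textbook formulas; nothing here is specific to D-Wave's hardware):

```python
import math

def E(a, b):
    """Singlet-state correlation for spin measurements at angles a and b."""
    return -math.cos(a - b)

def chsh(a1, a2, b1, b2):
    """CHSH combination S = E(a1,b1) - E(a1,b2) + E(a2,b1) + E(a2,b2)."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# Standard angle choices: |S| hits Tsirelson's bound 2*sqrt(2),
# strictly beating the classical limit of 2.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
```

An experiment that measured its qubits at these angles and found |S| > 2 would be exactly the kind of entanglement demonstration the post is asking for — which is why the lack of arbitrary local measurements in D-Wave's setup is the sticking point.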

Point #3: D-Wave was finally able to clarify a conceptual point that had been bugging me for years.  I—and apparently many others!—thought D-Wave was claiming that their qubits decohere almost immediately (so that, in particular, entanglement would almost certainly never be present during the computation), but that the lack of entanglement didn’t matter, for some complicated reason having to do with energy gaps.  I was far from alone in regarding such a claim as incredible: as mentioned earlier, there’s no evidence that a quantum computer without entanglement can solve any problem asymptotically faster than a classical computer.  However, that isn’t D-Wave’s claim.  What they think is that their system decoheres almost immediately in the energy eigenbasis, but that it doesn’t decohere in the computational basis—so that, in particular, there would be entanglement at intermediate stages.  If so, that would be perfectly fine from the standpoint of the adiabatic algorithm, which doesn’t need coherence in the energy eigenbasis anyway (after all, the whole point is that, throughout the computation, you want to stay as close to the system’s ground state as possible!).  I understand that, given their knowledge of decoherence mechanisms, some physicists are extremely skeptical that you could have rapid decoherence in the energy basis without getting decoherence in the computational basis also.  So certainly the burden is on D-Wave to demonstrate that they maintain coherence “where it counts.”  But at least I now understand what they’re claiming, and how it would be compatible (if true) with a quantum speedup.

Let me now move on to three broader questions raised by the above points.

The first is: rather than constantly adding more qubits and issuing more hard-to-evaluate announcements, while leaving the scientific characterization of its devices in a state of limbo, why doesn’t D-Wave just focus all its efforts on demonstrating entanglement, or otherwise getting stronger evidence for a quantum role in the apparent speedup?  When I put this question to Mohammad Amin, he said that, if D-Wave had followed my suggestion, it would have published some interesting research papers and then gone out of business—since the fundraising pressure is always for more qubits and more dramatic announcements, not for clearer understanding of its systems.  So, let me try to get a message out to the pointy-haired bosses of the world: a single qubit that you understand is better than a thousand qubits that you don’t.  There’s a reason why academic quantum computing groups focus on pushing down decoherence and demonstrating entanglement in 2, 3, or 4 qubits: because that way, at least you know that the qubits are qubits!  Once you’ve shown that the foundation is solid, then you try to scale up.  So, please support D-Wave if it wants to spend money to show Bell inequality violations, or other “smoking-gun” evidence that its qubits are working together coherently.  You’re welcome, D-Wave!

The second question is one that I’ve encountered many times on the blogosphere: who cares how D-Wave’s system works, and whether it does or doesn’t exploit quantum coherence, as long as it solves practical problems faster?  Sure, maybe what D-Wave is building is really a series of interesting, useful, but still basically “classical” annealing devices.  Maybe the word “quantum” is functioning here as the stone in a stone soup: attracting money, interest, and talented people to build something that, while neat, ultimately doesn’t much depend on quantum mechanics at all.  As long as D-Wave’s (literal!) black box solves the problem instances in such-and-such amount of time, why does it matter what’s inside?

To see the obtuseness of this question, consider a simple thought experiment: suppose D-Wave were marketing a classical, special-purpose, $10-million computer designed to perform simulated annealing, for 90-bit Ising spin glass problems with a certain fixed topology, somewhat better than an off-the-shelf computing cluster.  Would there be even 5% of the public interest that there is now?  I think D-Wave itself would be the first to admit the answer is no.  Indeed, Geordie Rose spoke explicitly in his presentation about the compelling nature of (as he put it) “the quantum computing story,” and how it was key to attracting investment.  People don’t care about this stuff because they want to find the ground states of Ising spin systems a bit faster; they care because they want to know whether or not the human race has finally achieved a new form of computing.  So characterizing the device matters, goddammit!  I pride myself on being willing to adjust my opinions on just about anything in response to new data (as I’ve certainly done in D-Wave’s case), but the insistence that black boxes must be opened and explanations provided is something I’ll carry to the grave.

Finally, given the skeptical-yet-positive tone of this post, some people will wonder whether I now regret my earlier, more unmitigated D-Wave skepticism.  The answer is no!  Asking questions is my job.  I’ll give D-Wave credit whenever it answers some of the questions—as it did on this visit!—and will shift my views accordingly.  But I’ll also neither stop asking nor apologize for asking, until the evidence for a quantum speedup becomes clear and indisputable (as it certainly hasn’t yet).  On the other hand, I do regret the snowballing nastiness that developed as a combined result of my and other skeptics’ statements, D-Wave’s and its supporters’ statements, and the adversarial nature of the blogosphere.  For the first time, I find myself really, genuinely hoping—with all my heart—that D-Wave will succeed in proving that it can do some (not necessarily universal) form of scalable quantum computation.  For, if nothing else, such a success would prove to the world that my $100,000 is safe, and decisively refute the QC skeptics who, right now, are getting even further under my skin than the uncritical D-Wave boosters ever did.

74 Responses to “My visit to D-Wave: Beyond the roast-beef sandwich”

  1. Hal Says:

    Cool, did they mention how to get a developer account?

  2. Steve Flammia Says:

    I’m not sure I agree with the opinion you state in your first “broader question”. I think there is value in both approaches to building a quantum computer: a bottom-up approach where we first build a quality qubit or two and then scale up, and a top-down approach where we get lots of questionable qubits together and see what we can do with them. In the latter case, we might not be able to build a scalable fault-tolerant quantum computer, but we might succeed in a medium scale analog quantum simulator that can still teach us something.

    Of course, eventually any device which claims to be a quantum computer needs to give a convincing demonstration that the machine is actually quantum. But I don’t see a need to limit ourselves to just the bottom-up approach at the moment.

  3. Douglas Knight Says:

    Since it is known that quantum annealing can avoid some obstructions that classical annealing gets stuck on, wouldn’t the natural demonstration be such an example? Probably all known examples are so specific that they give a classical algorithm, namely to look for the specific obstruction, so this would require trust and not be a black box demo (though a new example might work pretty well with a black box). And there are other issues, like finding an Ising example. But this seems like the obvious way to go.

  4. Dave Bacon Says:

    1. Dude, Scott, I’ve told you point 3 every single time this subject comes up! Damnit, I think I even said it at dinner last time you were in Seattle! Decoherence in an energy basis can be bad. It can also be perfectly fine, as far as I can tell, because the energy basis is NOT necessarily the computational basis. Coherence times are obviously not always relevant to quantum computational speedup. As an obvious example, consider the coherence times of a universal adiabatic quantum computer at the end of its computation, in the energy basis. In fact you want to decohere, and you want to be in the lowest energy eigenstate in this example (actually you don’t even need this last part for some models!). Decoherence can be your friend (indeed, in error correction decoherence in the right basis IS your friend: how else can you dump the entropy you are accumulating, without decoherence and relaxation!)

    2. I’d really like for you to define “quantum coherence.” You use this all the time, but I don’t know what it means. I guess I’m just dense. It certainly cannot correspond to a pure state in a superposition over some basis, as a) this is basis dependent (blech), and b) pure states are convenient fictions. What does it mean, exactly, for a state to have “lots of quantum coherence”? Operational definitions would be fine, mathematically precise even better. Computational definitions win the match 🙂

    3. I still maintain that it is a disservice to make claims about separable mixed states until someone can actually either prove something about this case or we have a substantial body of evidence that separable mixed states can be classically efficiently simulated. Look, I’m perfectly willing to agree that it is highly unlikely that there is an efficient classical algorithm for NP-complete problems. Not only is there a long record of people failing to find such things, it seems that this would imply some very strange things that people have also spent lots of time trying to figure out. But we have no such record for the case of classically simulating separable mixed states. For all we know there are classes of such systems which are not so hard as to be BQP-complete, but not so easy as to fall into BPP. Until people give me either theorems or lots of work showing certain LARGE classes of separable mixed states are efficiently simulatable, I’ll bring this up on this blog over and over again (go ahead, ban the ex-pontiff….and watch me unleash my legions of Catholic….oh wait, snap, all I got now is a lot of Java design pattern ideas to use in arguments. Damn You Google!)

    4. Finally, I very much disagree with your statement “a single qubit that you understand is better than a thousand qubits that you don’t.” If this were true, then, for example, superconductors would have been essentially useless until 1957, when BCS figured out how they work. High-temperature superconductors, iron superconductors? Not useful, because we don’t understand them. I think what you really mean to say is “a single well-characterized qubit is better than a thousand qubits that haven’t been well characterized.” But then you get into a little problem, because for certain characteristics, D-Wave has done a good job characterizing them. That characteristic is how well they perform a certain simulated annealing task. Have they totally characterized them? Not even close (because, of course, that task is intractable if you go by the best definition of how to do it, and if it isn’t intractable it’s not clear that you’re characterizing the portion of the system that allows it to offer a computational speedup (if you characterize a state as having a small matrix product state representation, haven’t you just shown that it’s classically efficiently representable?) ….except in the cases where we are talking about conjectured quantum speedups. The latter being what we are most interested in, which is, of course, open to considerable debate for the problem they are working on! So we could agree that the standard of showing quantum speedups for problems widely thought to offer a quantum-classical separation would be A-grade material. But if we are talking about a regime where we quite honestly only have our past CLASSICAL experience with hardness to go on? Well, I’d say it’s C/D-class material. But that’s just the ex-grading-prof in me.)

    These things said, for a variety of physical and computational reasons I won’t bet that D-Wave is able to beat classical scaling using their current system (shrinking energy gap + temperature = for hard instances you’ll likely populate states that aren’t global minima). I also, however, strongly believe that the adiabatic model is the right model to build a quantum computer on (a universal adiabatic quantum model). Before I left quantum computing I wrote some papers that I think support this, some published, some relegated to “not finished.” In large part, in my opinion (not shared by my coauthors!), quantum computing has gotten increasingly risk-averse in its approach to building a quantum computer. Of course this risk aversion is well earned, given the amazing work of ion trappers and of superconducting qubits coupled by microwave resonators (the latter is work that, if I had to wager, will some day win a Nobel prize). But I still think adiabatic models will win out in the end, for a long list of reasons that hopefully will come when I finally get off my ass and finish those papers (or find the time to do them that I don’t spend playing with my son Maxwell 🙂 )

    Finally, let me just say that your skepticism about D-Wave is awesome, and I don’t think you have to apologize for any of it nor turn off your skeptical meter when talking about D-Wave. Hashing things out is one of the great things about being alive, in my limited experience at being alive. The blogosphere is, of course, a great blow-horn of spewed-ness that amplifies butterfly wing beats into tornadoes without the aid of a positive Lyapunov exponent. But in this case you’re mostly dealing with very smart people on all sides of the equation who understand that your strongly passionate arguments are meant to uncover your own portion of truth. Or at least that’s my very naive take on it all.

  5. Roger Says:

    Your skepticism of D-Wave is similar to what I think of the whole subject of quantum computing. Show me some evidence for a quantum speedup to some computation. Show me a really convincing example of a single qubit. You blame D-Wave for not doing the impossible. They have smart people. If there were some way that they could be demonstrating true quantum computing, they would be doing it. At some point you have to ask yourself if maybe there is some good reason that all of these efforts at quantum computing come up short.

  6. Dave Bacon Says:

    Roger: “Show me a really convincing example of a single qubit.”
    ….but I’m guessing those won’t be convincing.

  7. Joshua Zelinsky Says:

    Show me a really convincing example of a single qubit.

    Roger, I’m not sure what you mean. All sorts of examples with entangled qubits have been successfully demonstrated. That’s quite old. So there must be something else that you mean by “really convincing”- so what would convince you?

  8. Ray Stantz Says:

    You don’t know what it’s like out there. I’ve worked in the private sector. They expect results.

  9. Aram Says:

    Scott, you shouldn’t respond to Roger. It’ll only encourage him.

  10. rrtucci Says:

    On subject of size, I think I will wait for the D-Wave 2 before I buy one. I hear it will come with a bathroom.

  11. Scott Says:

    Aram #9: Thanks, I needed that.

  12. Scott Says:

    Douglas Knight #3:

      Since it is known that quantum annealing can avoid some obstructions that classical annealing gets stuck on, wouldn’t the natural demonstration be such an example?

    Yes, and I told them that! Maybe they’ll try it.

  13. Scott Says:

    Dave Bacon #4:

    1. Dude, then, I apologize for being dense! In my defense, what I needed was a clear, unambiguous statement from D-Wave to the effect that: “we think there’s quantum entanglement in our system, we have reason to think the decoherence mechanisms don’t quickly destroy that entanglement, and we agree that such long-lasting entanglement is probably necessary for a quantum speedup.” And despite many opportunities, the first time I heard such a statement was at this meeting in Burnaby.

    I’m sure part of the problem was simply a physics/CS language barrier. The whole “decoherence doesn’t matter because of energy gaps” line threw me off completely. Of course decoherence matters; the issue is just decoherence in which basis!

    2. Methinks you’re being too coy when you claim you don’t know what quantum coherence means. I prefer to define it negatively: in quantum mechanics, coherence is that which makes it impossible to model the system’s behavior using classical probability theory alone. You can have more or less of it; the coherence becomes better as the best classical probabilistic model becomes worse. There are lots of ways to quantify the departure from a classical probabilistic description, but the “best” way obviously depends on the context. On the one hand, you shouldn’t describe something as quantum if it becomes “classical” after a trivial change of basis. On the other hand, if your definition says that coherence isn’t present in the double-slit experiment or the Bell inequality, then you know your definition is too restrictive!
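    One simple (and explicitly basis-dependent) quantifier in this spirit sums the magnitudes of the off-diagonal density-matrix entries in the chosen basis — the so-called l1-norm of coherence. A toy illustration (an editorial sketch of one possible quantifier, not a definition from the post):

```python
# Toy, basis-dependent coherence quantifier: the l1-norm of coherence,
# i.e. the sum of |rho_ij| over off-diagonal entries of a density matrix
# rho written in a fixed (here, computational) basis.

def l1_coherence(rho):
    n = len(rho)
    return sum(abs(rho[i][j]) for i in range(n) for j in range(n) if i != j)

# |+><+|: a pure superposition, maximally coherent in the computational basis.
plus = [[0.5, 0.5], [0.5, 0.5]]
# Its fully dephased version: same diagonal, describable by classical
# probability alone (in this basis).
dephased = [[0.5, 0.0], [0.0, 0.5]]
```

    The two states give identical statistics if you only ever measure in the computational basis; they differ (coherence 1 versus 0 here) the moment you rotate the basis, which is the operational cash value of the off-diagonal terms.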

    In the present context, we really care about a specific type of coherence: namely, that which precludes, or which might preclude, an efficient classical simulation. Personally, I think that this type of coherence almost certainly has to involve entanglement. Which brings me to…

    3. Yes, I know that people haven’t thought nearly as hard about efficient classical simulation of separable-mixed-state QC as they have about (say) P vs. NP. But for whatever it’s worth, I thought a good deal about the former question, as a grad student at Berkeley. I convinced myself that, if a speedup using separable mixed states is possible, it would have to be very unlike the quantum speedups we know, due to the sparseness of separable mixed states (outside a tiny ball around the identity) and the “miraculousness” of unitary transformations always keeping you inside that sparse set. No, I wouldn’t wager $100,000 on the classical simulability of such systems, but I would wager $50 on it. Certainly I think the burden of evidence is on anyone claiming a speedup with no entanglement.

    4. Yeah, I meant that a single well-characterized qubit is better than a thousand poorly-characterized ones. At some “deep” philosophical level, you could argue that no one understands any qubit, but that’s not what I had in mind. Thanks for the clarification!

  14. Scott Says:

    Ray Stantz #8:

      You don’t know what it’s like out there. I’ve worked in the private sector. They expect results.

    Look, I appreciate the pressure D-Wave is under, more so after this visit.

    But if you call D-Wave “the private sector,” I think you have to add that we’re not talking about something even ε-close to an ordinary capitalist market. D-Wave’s funding has come from VCs excited about “the quantum computing story.” For the foreseeable future, its customers will be universities, government agencies, government contractors like Lockheed Martin, and maybe some other rich corporations—all of them driven less by their burning need to solve Ising spin minimization problems, than by their desire to be (or be seen as being) at the forefront of a new technology. In other words: for now, novelty is D-Wave’s product. In important respects, I think that makes D-Wave more like an academic lab than like a normal company. It also completely changes the nature of the “results” that people (including customers and investors) ought to be asking for. If you choose to invest in the Wright Brothers, don’t demand a “useful product” from them before there’s even been a Kitty Hawk!

  15. Hal Says:

    I don’t know if my comments will come out of moderation, but I would also add that there is a corollary that quickly follows any discussion of quantum computers involving the management of system resources. Although D-Wave’s first computer came in at $10M, you can imagine that a production version could be had for $1-5M within a few years, and probably less than $1M within a decade. As the class of problems and programming solutions begins to be explored, we can find interesting parallels between the early development of quantum computing resource management and the management of early classical computers.

  16. John Sidles Says:

    Scott, is there a mathematically natural reason why “entanglement” figures so prominently in assessments of D-Wave’s technology? We all appreciate that there are historical reasons, and pedagogical reasons, but it is far less obvious (to me anyway) that there are mathematically natural reasons. To the contrary, ubiquitous phenomena like entanglement “sudden death” (per arXiv:1005.3837 for example) are suggestive that focusing upon entanglement as the sine qua non of quantum computation may be unproductive, in the specific sense of overlooking genuine advantages of D-Wave’s technology, as a computational technology that is both (plausibly) non-classical and (plausibly) non-entangled.

  17. Roger Says:

    All sorts of examples with entangled qubits have been successfully demonstrated.

    Yes, there are quantum experiments that can be interpreted as demonstrating entangled states. That has been known for decades. What hasn’t been shown is that the entanglement can be used for a non-classical computation that outdoes what can be done with an efficient classical simulation.

    Scott’s skepticism about quantum computers can be summarized: How can D-Wave be selling a quantum computer when no one has even shown that quantum computing is possible? D-Wave is over-hyping their results, just like everyone else in the field.

  18. John Sidles Says:

    Just to mention another mathematical point that (when overlooked) can absorb illumination regarding technologies like D-Wave’s: dynamical non-locality is as ubiquitous in classical dynamics as in quantum dynamics.

    For example, consider N classical spins interacting via a matrix of N² couplings. Depending upon the numerical entries in that matrix, the coarse-grained transport dynamics of the spin system can be (effectively) 1D, 2D, 3D, etc. Alternatively, for random coefficients there is no emergent spatial transport description at all. And it is easy to construct hybrid cases too, in which the coarse-grained transport equations exhibit Bell-type correlations.

    The point is that non-quantum dynamics in general, and in particular the emergence of spatial structure from non-quantum non-spatial microscopic dynamics, can be a topic more subtle than typical undergraduate textbooks lead students to conceive, and this in turn can lead us to under-appreciate the subtlety of D-Wave technology.

  19. Scott Says:

    John, I already addressed that question in the post and in my response to Dave Bacon. The crux of the issue is that we don’t know even a single example right now of an algorithm by which a quantum computer with no entanglement can get a speedup (even a conjectural speedup) over a classical computer. That’s not a “historical” or “pedagogical” statement; it’s just a statement about our current mathematical knowledge! Entanglement is just a quantum analogue of correlation; as such, it seems to get produced as a “natural byproduct” in all interesting quantum algorithms.

    Now, if you want to address this mathematical question head-on, more power to you: give serious theoretical evidence that QCs without entanglement can do something useful! (Or in the case relevant to D-Wave: construct an example of an energy landscape where the adiabatic algorithm outperforms simulated annealing, even though the AA’s quantum state always remains a separable mixed state.)

    In other words: sure, I’d be thrilled and excited for someone to breach the “quantum speedups require entanglement” barrier. What I object to is people acting nonchalant about it, as if there were no clear-cut theoretical barrier here to be breached.

  20. John Sidles Says:

    Scott, could you please clarify your reply as to whether by “speedup” you mean (a) exponential speedup, versus (b) polynomial speedup, versus (c) boring-old constant-factor speedup.

    After all, from a general engineering perspective — and certainly from D-Wave’s perspective — all three kinds of speedup can be transformational.

  21. Scott Says:

    Roger #17:

      Scott’s skepticism about quantum computers can be summarized: How can D-Wave be selling a quantum computer when no one has even shown that quantum computing is possible? D-Wave is over-hyping their results, just like everyone else in the field.

    Yes, I’ve strongly objected to D-Wave making claims about having a commercial product without first validating the underlying technology.

    But don’t you see the circularity in being skeptical of scalable quantum computing solely on the grounds that no one has demonstrated it yet? No technology—neither powered flight, nor nuclear weapons, nor scalable classical computing—was ever demonstrated before it was. If people are arguing about the feasibility of a technology at all, it can only be because it hasn’t been demonstrated yet! For such an argument to be more than a clash of hunches, it has to be grounded in the known laws of physics, and in what they imply or seem to imply. It’s not enough to make declarations like “I believe in quantum mechanics but not quantum computing,” without explaining where you think the separation between theory and practice happens, and why.

    (Compare: “I believe that the earth is round, but not that you could literally go around it and get back where you started.” “I believe in Gödel’s incompleteness theorem, but I still think the usual axioms of arithmetic are consistent and complete—it’s just that no one’s managed to show it yet.”)

    Finally: inspired by Aram’s advice, I will not reply to any more of your comments, until you demonstrate the ability to change at least one of your opinions as a result of understanding something that a disagreeing commenter explained to you.

  22. Mike Says:

    “All sorts of examples with entangled qubits have been successfully demonstrated.

    Yes, there are quantum experiments that can be interpreted as demonstrating entangled states. That has been known for decades. What hasn’t been shown is that the entanglement can be used for a non-classical computation that outdoes what can be done with an efficient classical simulation.”

    Here is but one example of a proof of principle experiment carried out in the real world that shows that “entanglement can be used for a non-classical computation that outdoes what can be done with an efficient classical simulation.”


    In fact, these types of actual proofs have recently gone somewhat beyond factoring the number 15 😉

    But I suspect you’ll once again move the target and claim that of course these types of physical proofs have been carried out, they’ve been known for years now, but what you were really talking about was scaling up these experiments to carry out more useful calculations.

  23. Scott Says:

    John #20: To clarify, we currently don’t know how to get any speedup using an unentangled QC: not an exponential speedup, not a polynomial speedup, not even a significant constant-factor speedup. (OK, maybe the factor-of-two speedup for PARITY, depending on how the input is provided. That’s about all I can think of.)

  24. John Sidles Says:

    Scott #23, the wonderfully accurate ground-state energy estimates achievable with matrix product states have shown us that notable improvements in algorithmic capabilities (e.g., arXiv:1201.3975) can be associated with low-order state-space rank.

    So why is it implausible that low-order quantum coherences in D-Wave’s devices might substantially speed-up annealing algorithms?

  25. Scott Says:

    John #24: Matrix product states generally are entangled! If you’re talking about a “little” entanglement, that’s a very different question from no entanglement, and then it might depend strongly on how the entanglement is quantified. (E.g., the one-clean-qubit model does seem to provide some speedups over classical.)

    Incidentally, the paper you linked to appears to be evidence against rather than for the point you were trying to make! It shows that various systems with limited entanglement can be efficiently simulated by classical computers.

  26. Roger Says:

    Factoring 15 seems like an accomplishment, but see this 2011 Nature article for an explanation of why it is not really quantum computing.

  27. John Sidles Says:

    Scott #25: To affirm, yes I am talking about polynomial-rank dynamical entanglement, and am making the point that there exists a large body of evidence — both from fundamental theory and from numerical experiments — that “little entanglements” are a valuable computational resource, no matter whether they are instantiated in (classical) tensor networks or in (non-classical?) D-Wave qubits. And it is evident too that we are similarly far from appreciating and exhausting the capabilities of present-day “little entanglements” as we are from realizing the Shor-type capabilities of future “full Hilbert entanglement.”

  28. aram Says:

    Scott #19:

    we don’t know even a single example right now of an algorithm by which a quantum computer with no entanglement can get a speedup (even a conjectural speedup) over a classical computer.

    What about DQC1? Based on that, I always would say that it matters whether the dynamics are entangling, not whether the state is.

  29. aram Says:

    Also, do you think you can turn on a ‘preview’ feature for comments? Otherwise you never know how your formatting is going to turn out.

  30. Mike Says:


    “Factoring 15 seems like an accomplishment, but see this 2011 Nature article for an explanation of why it is not really quantum computing.”

    The article you cite refers to a different attempt at factoring 15 — they’re doing it everywhere you know 😉 — the original article I cited (there are other experiments as well) did not involve room-temperature liquid-based nuclear magnetic resonance (NMR).

    As an aside, the article you cite actually argues that quantum computation may be possible using algorithms based on discord (noise), which if true, is evidence against the conjectures of some (which I’ve seen you latch onto for your own anti-science purposes) that scalable quantum computing is impossible because of such discord.

    But, I think I’ll follow the sage advice already proffered and not respond further to your comments. Not only do you never acknowledge any evidence provided by others, but you also apparently just throw out citations hoping no one reads them.

  31. Scott Says:

    aram 28 and 29:

    The states in the DQC1 model do have small amounts of entanglement. (For more info see this PhD thesis by Animesh Datta.) So DQC1 doesn’t conflict with my conjecture that if you never have any entanglement, then you’re simulable in BPP.

    Sorry, I used to have comment previewing, but the resources consumed by the script apparently contributed to this blog going down often! Bluehost sucks.

  32. Roger Says:

    Mike, the 2011 Nature article was written after the 2009 experiment you cite. Yes, there are many papers on this subject. There is even one on the Quantum Factorization of 143.

    I don’t know what you mean by my “own anti-science purposes”. It is not anti-science to be skeptical of unproven hypotheses. I write this on a computer that was made with quantum mechanics, so I accept computers and quantum mechanics. What I don’t accept are the outlandish and unproven claims of the quantum computing promoters.

  33. Roger Says:

    Oops, bad html. The link is Quantum Factorization of 143.

  34. Steven Says:

    “Once you’ve shown that the foundation is solid, then you try to scale up.”

    I don’t agree. There are two major problems if you are going with the gate model: getting good qubits with good interactions and scaling them up. The second problem could be harder than the first. Scientifically, it is less interesting, but at least from an engineering standpoint it is a major challenge.

    Why should we work on these problems serially? Work on them both at the same time. If D-Wave were an academic lab, I’d think their work was great, because presumably some of the techniques they are developing for scaling (managing controls, calibration, the problems go on and on) will be useful with better qubits, too.

  35. rrtucci Says:

    Lest anyone doubt that we Italians are an epic race:
    OPERA superluminal neutrinos due to loose wire

  36. Scott Says:

    Steven #34: Well, OK, if D-Wave had presented its work that way (“we’re not trying to solve problems faster; we’re just trying to understand the engineering challenges involved in putting large numbers of qubits together, using highly-decoherent qubits as a toy model case”), then I’m sure the whole conversation would’ve evolved differently. Obviously, both their claims and their goals are more ambitious than that.

  37. aram Says:

    By the way, I think part of D-Wave’s decoherence comes from the fact that they have a lot of qubits. Not for any fundamental reason, like Gil posits, but because many qubits = many wires connected from the fridge to the outside, and these wires carry thermal noise. This noise also means that they band-limit their wires, which is why their gate time is so much slower than the dephasing time.

  38. Scott Says:

    rrtucci #35: LOL! Well, I knew something of that kind was more likely than a massive revision to the causal structure of spacetime. On the bright side, you guys still have Galileo, Fermi, and some of the world’s best food. 🙂

  39. Einstein was right! | The Quantum Pontiff Says:

    […] Apparently the timing error was likely caused by a loose cable (h/t rrtucci). […]

  40. Neil Dickson Says:

    For a long time, I’ve wanted to take an actual roast beef sandwich and hook it up to some electronic controls, just in case it’s really great at solving NP-hard problems. Imagine if you took the d-dimensional Ham Sandwich Problem (possibly NP-hard in d?), mapped it onto a roast beef sandwich, and you then had a sandwich solving a sandwich. Delicioptimized! 🙂

  41. jonas Says:

    Neil Dickson: a gyroscope might be a better candidate than a roast beef sandwich (see “”).

  42. Daniel Says:

    Neil Dickson:

    The d-dimensional Ham Sandwich Problem is indeed known to be W[1]-hard parameterized by the dimension d:

  43. Neil Dickson Says:

    @Daniel: Thanks! 🙂 I found that too shortly after I posted, ’cause I realized it had been a while since I checked. I guess people are still actively working on those sorts of problem classifications after all! It makes me wonder whether there’s been any big progress in characterizing high dimensional nearest neighbour search lately.

  44. Bram Cohen Says:

    Of course, the tabu search being used as the classical comparison algorithm doesn’t have a strong theoretical basis; it’s just “we tried a bunch of variants and this one seems the best.” That seems especially untrustworthy to me personally, because it’s in part based on work I did in a summer internship out of high school, which consisted of trying random heuristics and mostly finding that they didn’t work.

    The upshot is that even if D-Wave manages to solve meaningfully large problems with a demonstrably lower base to the exponent than the classical algorithm (which they haven’t even come close to), it will still be entirely plausible that they’ve simply found a better classical algorithm than the completely woowoo one which seems to be the best so far.

    In conclusion, finding a violation of the Bell inequality matters!
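    [Editor’s note: for readers unfamiliar with the classical baseline discussed above, here is a minimal, illustrative Metropolis-style simulated annealer for the Ising minimization problem described in the post. The graph encoding, cooling schedule, and parameters are illustrative assumptions, not D-Wave’s setup or the actual benchmark code.]

    ```python
    import math
    import random

    def ising_energy(h, J, s):
        """Energy of spin configuration s (dict: node -> ±1) under fields h and couplings J."""
        e = sum(h[i] * s[i] for i in h)
        e += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
        return e

    def simulated_annealing(h, J, steps=5000, t0=2.0, t1=0.05, seed=0):
        """Single-spin-flip Metropolis annealing with a geometric cooling schedule."""
        rng = random.Random(seed)
        nodes = list(h)
        s = {i: rng.choice((-1, 1)) for i in nodes}
        cur = ising_energy(h, J, s)
        best, best_e = dict(s), cur
        for k in range(steps):
            t = t0 * (t1 / t0) ** (k / max(steps - 1, 1))  # temperature for this step
            i = rng.choice(nodes)
            s[i] *= -1                       # propose flipping one spin
            new = ising_energy(h, J, s)
            if new <= cur or rng.random() < math.exp((cur - new) / t):
                cur = new                    # accept the flip
                if cur < best_e:
                    best, best_e = dict(s), cur
            else:
                s[i] *= -1                   # reject: flip back
        return best, best_e
    ```

    On a toy 3-spin ferromagnetic chain with one pinned spin (h = {0: −1}, J = {(0,1): −1, (1,2): −1}), this finds the all-up ground state at energy −3; real benchmarks would recompute only the local energy change per flip rather than the full energy.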

  45. Justin Dove Says:

    @ #3 and #12,

    I’ve always been under the impression that quantum annealing is just a classical algorithm similar to simulated annealing but written to make use of tunneling type transitions as opposed to thermal fluctuations. I.e. that the “quantum” part is really just the similarity between the algorithm and how certain quantum systems behave. On the other hand, the “speed up” part I’ve always thought was because some people believed that quantum computers could in principle carry out QA much quicker than a classical computer due to the fact that it *is* a quantum system that inherently exhibits that sort of behavior.

    So, my perhaps uneducated mind would like to reply with: “wouldn’t such a ‘tough scenario’ just verify that they are performing QA as opposed to SA, but not that they are doing QA any faster than a classical computer, and thus not actually prove that they are doing anything non-classical?”. But based on #3 and #12, I assume either I’m mistaken in my understanding of QA, or people have already been convinced that D-Wave is doing something fast and just aren’t convinced that its QA that they’re doing.

    Which is it (if either)?

  46. Some Other Scott Says:

    Scott #14: “D-Wave’s funding has come from VCs excited about ‘the quantum computing story.’ For the foreseeable future, its customers will be universities, government agencies, government contractors like Lockheed Martin, and maybe some other rich corporations—all of them driven less by their burning need to solve Ising spin minimization problems, than by their desire to be (or be seen as being) at the forefront of a new technology.”

    I would propose another explanation. I think the VCs want to make money. They care about the quantum story because they think that will sell machines because other people are interested in the quantum story. That reduces the downside for the VCs. And if D-Wave either has a quantum computer or has a magical black box that isn’t quantum but can speed up some problem of actual interest to someone, the VCs either become very rich or make a reasonable return.

    I think companies like LM will buy one for $10M for the same reason that I might buy some really interesting gadget for $500. It’s not enough money to make a significant difference in how I do things and it’d be kind of cool to see the technology behind the gadget.

    For the VCs, I think the question has to be whether they can either convince potential customers that it does something valuable or drive down the price before running out of companies with an interest in technology and revenues significantly into the 10s of billions of dollars.

  47. Henning Dekant Says:

    In defense of pointy haired bosses:

    From a business perspective they have nothing to gain and everything to lose by looking into how “quantum” D-Wave really is.

    For marketing purposes they can already freely use the “quantum computing” term. For an average CIO this is just another buzzword signifying “shiny, cool and new”. It’s just to get a foot in the door. After that the only thing that matters is if the D-Wave device can deliver reasonable performance at an acceptable price point.

    CIOs, after all, are not paid for advancing science.

    Not that you don’t seem to be making an impact on D-Wave. I wondered on my blog why they didn’t make more hay of the Ramsey number research. You seem to be getting to them. Not sure that’s good for their business.

  48. Scott Aaronson Visits D-Wave – No Casualties Reported | wavewatching Says:

    […] of the controversial scuffle between Scott and the company.  So it’s quite newsworthy that according to Scott’s own accord the hatchet has been buried. No more comparing the D‑Wave One to a roast beef sandwich (hopefully […]

  49. Chris W. Says:

    I would propose another explanation. I think the VCs want to make money. They care about the quantum story because they think that will sell machines because other people are interested in the quantum story. That reduces the downside for the VCs. And if D-Wave either has a quantum computer or has a magical black box that isn’t quantum but can speed up some problem of actual interest to someone, the VCs either become very rich or make a reasonable return.

    Sounds a bit like buying real estate with a loan you can’t afford, figuring that some other sucker will buy it from you for even more money before the loan buries you, out of that sucker’s own desperate urge to get in on the boom. We know how that story ends…

    Then again, much of our economy runs off this kind of BS. How many times have high school dropouts with a wacky energy generation idea—that violates the Second Law no less—managed to secure investment to scale it up and commercialize it? You’d be surprised.

  50. John Sidles Says:

    Chris W, your scenario would make an entertaining scam movie, perhaps along the lines of Steve Martin and David Niven in Dirty Rotten Scoundrels.

    And yet, it seems to me that a superior thriller about quantum computing could be templated from Ron Howard’s Apollo 13 … because we don’t know (yet) how that film ends.

    And so, it would be a genuine thriller.

  51. frederic gillet Says:

    Hasn’t Shor’s algorithm been demonstrated already on systems with entanglement?

    How many qubits would be needed to show an actual speedup over classical factorization?

  52. John Sidles Says:

    Frederic Gillet #51 asks: “Hasn’t Shor’s algorithm been demonstrated already on systems with entanglement error-correction?”

    Frederic, if we regard error-correction as an essential element of Shor’s algorithm, then the rather sobering answer (AFAIK) is “No, not even for the smallest qubit systems.” And in this regard the error-correction methods used in early (classical) computer delay-line memories are of considerable interest! Indeed, John von Neumann conceived (US Patent #2815488) a computer architecture in which pulse-shaping error-correction occurs thermodynamically rather than electronically; an approach that resembles today’s ideas for quantum error-protection via topologically non-trivial Hamiltonians.

  53. Chris W. Says:

    More quantum computing hype from IBM, in Wired Magazine.

  54. Weekly Picks « — the Blog Says:

    […] Shtetl-Optimized visits D-Wave and finds more than a roast-beef sandwich. […]

  55. Shehab Says:


    In this six-part lecture, Suzanne Gildert shows the ‘quantumness’ of D-Wave’s architecture. She points out the following two things:

    1. Quantum tunneling
    2. Spin chain

    Aren’t these enough to convince us that their computation is non-classical?

    I think the next big thing to do for D-Wave is to come up with a polynomial algorithm which can reduce all NP-complete problems into their discrete optimization problem, right?

  56. Scott Says:

    Shehab #55: I discussed quantum tunneling in the post. As I said, the problem is that (as Mohammad Amin told me himself) tunneling could easily be present without any entanglement—it could just be a single-qubit effect. In which case, the whole computation would be easy to simulate classically, even if “quantum” at the 1-qubit level. I don’t understand what you mean by “spin chain”: is there a separate line of evidence from the quantum tunneling one? (I don’t like watching lectures on YouTube; something written would be nice.)

    By the very definition of NP, there are polynomial-time algorithms to reduce all other NP-complete problems to D-Wave’s Ising spin problem—nothing to do there, besides of course improving the reductions’ efficiency! The questions are what sort of speedup you get in practice when you do such a reduction, how the speedup compares to the blowup incurred by the reduction itself, and whether the speedup (if it survives) depends in any way on quantum effects.
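    [Editor’s note: to illustrate the kind of reduction Scott invokes here, below is a sketch of one textbook example, MAX-CUT mapped to Ising minimization. The encoding conventions (spin values, coupling signs, data structures) are illustrative assumptions, not D-Wave’s actual input format.]

    ```python
    def maxcut_to_ising(edges):
        """Map a MAX-CUT instance to Ising minimization: antiferromagnetic
        coupling J_uv = +1 on each edge, and no local fields. With spins in
        {-1, +1}, an edge contributes -1 to the energy if cut and +1 if not,
        so energy = |E| - 2*(cut size): Ising ground states are maximum cuts."""
        nodes = {v for edge in edges for v in edge}
        h = {v: 0.0 for v in nodes}           # no local fields needed
        J = {(u, v): 1.0 for (u, v) in edges}  # one coupling per edge
        return h, J

    def cut_size(edges, s):
        """Number of edges whose endpoints receive opposite spins."""
        return sum(1 for (u, v) in edges if s[u] != s[v])
    ```

    For a triangle, any partition cuts at most 2 of the 3 edges, and the corresponding spin assignment attains the minimum Ising energy |E| − 2·2 = −1; the “blowup” Scott mentions is that hardware with a fixed interaction graph additionally requires embedding the problem graph into that fixed graph.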

  57. D-Wave’s Lightswitch Game » I will do science to it! Says:

    […] While I certainly don’t like the method of keeping all of your cool research to yourself, I do think that D-Wave is doing cool research.  That said, I think their marketing department is probably stretching the truth if not outright lying about what they actually are selling.  As I was attempting to say, though, simulated annealing will get the job done in any event. I think most academics are just upset that D-Wave is building something even remotely quantum while most of the rest of us are spending hours adjusting delicate equipment in labs, trying to get a handful of qubits to do anything while they’re claiming hundreds.  If you’re interested in some of the controversy, you can spend a while reading Scott Aaronson’s very interesting blog. […]

  58. Shehab Says:

    Scott, now I get your point. It is not about the quantumness; it is about whether the speedup has anything to do with quantumness. I think your point is that the speedup they are getting for this specific kind of optimization problem may also be possible in a classical setup with enough resources. Right?

    Regarding the spin network, here is a D-Wave page.

  59. Fred Says:

    Reading Scott’s piece in IEEE Spectrum about his $100K prize:

    “The central problem is decoherence, meaning unwanted interactions between the computer and its external environment, which prematurely “measure” the computer and destroy its fragile quantum state.”

    I’ve been wondering – do we know any natural quantum system that exhibits long scale coherence?

    I just don’t have a good feel for how much effort it takes to prevent a simple system from going into decoherence – for example, how hard/easy is it in practice to reproduce the two slit experiment with electrons?

  60. John Sidles Says:

    Scott, the lively debate between Aram Harrow and Gil Kalai on Gödel’s Lost Letter and P=NP would become even livelier if Scott Aaronson were contributing more substantially. In particular, in recent weeks I’ve been looting the tombs of the algebraic geometry literature for concrete examples of sure/Shor separators and intend to post more on this topic. It would be terrific if the inventor of this concept would contribute … because irrespective of whether you and I agree or not on any particular topic, I always have a high regard for your imagination.

    Summary: Things are quiet here, so why not stir things up there?

  61. Chris W. Says:

    Slightly off-topic: See this post on Technology Review’s Physics arXiv Blog, to which I was pointed by a coworker:

    Quantum Biology and the Puzzle of Coherence

    Kauffman and co say a similar kind of critical transition occurs as quantum systems switch to a chaotic regime. Here the distinction between chaotic behaviour and ordinary quantum behaviour disappears. And in these conditions, quantum coherence suddenly changes from the fragile, blink-and-you-miss-it regime to a much more robust long-lived phenomenon.

    It is in this state, say Kauffman and co, that the observed processes of quantum biology must take place. They even demonstrate this by simulating the improved coherence of the light harvesting complexes involved in photosynthesis. “It is very likely that biological systems use this mechanism,” they say.

  62. Berg Says:

    Scott, suppose P = NP were to be proved tomorrow (any time between now and a year from now or so): how would this whole discussion evolve, and where would D-Wave stand? Specifically, say the issue was resolved with a constructive algorithm whose cost was at most O(n^3). Could we program a computer to find an “algorithm by which a quantum computer with no entanglement can get a speedup over a classical computer”? In general, how would a practical algorithm that proves P = NP affect the progress of QC? Outside of academia, would QC even really be desired at that point? Thanks, I’m writing a paper.

  63. John Sidles Says:

    I have posted to Gödel’s Lost Letter reasoned arguments to the effect that Scott’s recent IEEE Spectrum essay, when read carefully, contains within it the essential elements of a viable QIST Roadmap 3.0.

  64. Patrick Says:

    If they do the test to determine if the system is behaving as a quantum computer and the test fails, *then* they go out of business. It is a dysfunctional business model. As long as they don’t know for sure that it isn’t a quantum computer then they’re not committing fraud by claiming that it is one.

    They’re afraid to do the test not because of the resources, but because they’re worried about what the answer might be.

  65. Commercial Quantum Computer? « Tomi Engdahl’s ePanorama blog Says:

    […] Slashdot article D-Wave Announces Commercially Available Quantum Computer comments tell that this has the same central problem as before. D-Wave’s computers haven’t demonstrated that their commercial bits are entangled. There’s no way to really distinguish what they are doing from essentially classical simulated annealing. Recommended reading that is skeptical of D-Wave’s claims is much of what Scott Aaronson has written about them. See for example, although interestingly after he visited D-Wave’s labs in person his views changed slightly and became slightly more sympathetic to them […]

  66. Analog VLSI for Neural Networks – A Cautious Tale for Adiabatic Quantum Computing | Wavewatching Says:

    […] medical applications, but so far there is no obvious market niche for adiabatic quantum computing. Scott Aaronson argued that the "coolness" of quantum computing will sell machines.  While this label has some […]

  67. Order from Quantum Discord | Wavewatching Says:

    […] Although their erstwhile fiercest critic Scott Aaronson has made peace with them, he expressed that he still would like to see a measure for the degree of entanglement that they achieve on their […]

  68. The CIA invests in Canadian company’s quantum computer | Sync™ Blog Says:

    […] as a roast-beef sandwich.” He’s since visited D-Wave and, while now less sceptical, is still not entirely convinced. And in 2010, an article entitled D-Wave Does Not Quantum Compute in IEEE Spectrum claimed that […]

  69. Google & NASA Acquire D-Wave Quantum Computer | Future Leap Says:

    […] the “quantum-ness” of D-Wave’s machines has been called into question, it seems that the landscape of opinion is shifting more towards confidence in D-Wave’s […]

  70. Dan Kolis Says:

    Hello to all,

    It seems like a lexicon of terms might help (but likely not much). For instance, what constitutes a “classical digital computer”? I mean, I suspect the D-Wave core is 15K op-amps in liquid helium: an unusual analog computer.

    In the early days of digital computers, different flavors of digital computers seemed quite unique: BCD, von Neumann; now DSP micros with segregated program and data memory are another mild variant.

    I think really only a small handful of problems tickle NP-completeness, but many, like Navier-Stokes-style flow and protein folding, may benefit from custom analog computers. Since the design of the machine seems to permit a likely swappable central module that’s near 0 K, this is useful for cutting the cost of making some of these other custom computers for problems of significance to the human condition; anyway, if in fact you like humans and worry through their little concerns…

    I saw a video today of a magnetic show-and-tell of the Meissner effect at a science museum, titled “Revolutionary Quantum Magnetic Levitation”. No point in bothering to explain that this ‘revolution’ is a little post-dated (1933), and that quantum has nothing to do with it to any extent worth discussing.

    ‘Quantum’ captures the public’s imagination, and it’s the rock in the rock soup: it gets play and makes money to fund pretty fun research that has a painful cover story.

    Looking at the nomenclature of the source-code interface in Python, it seems to me exciting to see a s/w interface exactly like this without necessarily fixating on what is under the hood. I’d like to get the developer source-code credential, but I read it’s cut off, which is a little sad.

    But I am, for instance, only interested in protein folding, and a box like this might be a head start: maybe. And the observation is: programs, documentation, simulators, etc. are possibly a small contribution along these lines, a convenient fiction that the box is more magical, as some interim step (dunno).

    So I wish D-Wave well and would conclude with the observation that sometimes rock soup can produce very good soup, as long as the rock itself is not inspected too closely.

  71. Dan Kolis Says:

    Henning D says:
    In Comment #47 February 25th, 2012 at 11:47 am

    “For marketing purposes they can already freely use the ‘quantum computing’ term. For an average CIO this is just another buzzword signifying ‘shiny, cool and new’. It’s just to get a foot in the door. After that the only thing that matters is if the D-Wave device can deliver …”

    I suggest this is not ideal but tolerable, even possibly useful. Maybe.

    Sony is calling small LEDs “quantum dots”, so get used to it, I suppose.

  74. D-Wave Systems Quantum Computer | All Star Activist Says:

    […] ^ Jump up to:a b My visit to D-wave: Beyond the Roast Beef Sandwich 21 February 2012 […]