## Better late than never

No, I’m not talking about Osama, but about my reactions below to a *New Yorker* article about quantum computing—reactions whose writing was rudely interrupted by last night’s news. Of *all the possible times* in the past decade to get him, they had to pick one that would overshadow an important *Shtetl-Optimized* discussion about complexity theory, the Many-Worlds Interpretation, and the popularization of science? Well, I guess I’ll let it slide.

As already discussed on Peter Woit’s blog, this week’s *New Yorker* has a long piece about quantum computing by the novelist Rivka Galchen (unfortunately the article is behind a paywall). Most of the article is about the quantum computing pioneer David Deutsch: his genius, his eccentricity, his certainty that parallel universes exist, his insistence on rational explanations for everything, his disdain for “intellectual obfuscators” (of whom Niels Bohr is a favorite example), his indifference to most of the problems that occupy other quantum computing researchers, the messiness of his house, his reluctance to leave his house, and his love of the TV show *House*.

Having spent a wonderful, mind-expanding day with Deutsch in 2002—at his house in Oxford, of course—I can personally vouch for all of the above (except the part about *House*, which hadn’t yet debuted then). On the one hand, Deutsch is one of the most brilliant conversationalists I’ve ever encountered; on the other hand, I was astonished to find myself, as a second-year graduate student, explaining to the father of quantum computing what BQP was. So basically, David Deutsch is someone who merits a *New Yorker* profile if anyone does. And I was pleased to see Galchen skillfully leveraging Deutsch’s highly-profilable personality to expose a lay audience (well, OK, a chardonnay-sipping Manhattan socialite audience) to some of the great questions of science and philosophy.

However, reading this article also depressed me, as it dawned on me that the entire thing could have been written fifteen years ago, with only minor changes to the parts about experiment and zero change to the theoretical parts. I thought: “has there *really* been that little progress in quantum computing theory the past decade and a half—at least progress that a *New Yorker* reader would care about?” Even the sociological observations are dated: Galchen writes about interest in quantum computing as the “Oxford flu,” rather than the “Waterloo flu” or “Caltech flu” that it’s been since 2000 or so (the latter two capitals of the field aren’t even mentioned!). A good analogy would be an article about the Web, published today, that described the strange and exciting new world of Netscape, HotBot, and AltaVista.

A more serious issue is that the article falls victim to almost every misleading pop-science trope about quantum computing that some of us have been trying to correct for the past decade. For example:

With one millionth of the hardware of an ordinary laptop, a quantum computer could store as many bits of information as there are particles in the universe.

Noooooo! That’s only for an extremely strange definition of “store”…

Oxford’s eight-qubit quantum computer has significantly less computational power than an abacus, but fifty to a hundred qubits could make something as powerful as any laptop.

Noooooo! Fifty to a hundred qubits could *maybe* replace your laptop, *if* the only thing you wanted to use your laptop *for* was simulating a system of fifty to a hundred qubits…

In a 1985 paper, Deutsch pointed out that, because Turing was working with classical physics, his universal computer could imitate only a subset of possible computers. Turing’s theory needed to account for quantum mechanics if its logic was to hold. Deutsch proposed a universal quantum computer based on quantum physics, which would have calculating powers that Turing’s computer (even in theory) could not simulate.

There are at least three problems here. The first is conflating simulation with *efficient* simulation. At the risk of going hoarse, **a classical Turing machine can calculate absolutely everything that a quantum computer can calculate!** It might “merely” need exponentially more time. Second, no one has proved that a classical Turing machine really *does* need exponentially more time, i.e., that it can’t efficiently simulate a quantum computer. That remains a (deep, plausible, and widely-believed) *conjecture*, which will take enormous mathematical advances to resolve. And third, Deutsch’s landmark paper wasn’t among the ones to give *evidence* for that conjecture. The first such evidence only came later, with the work of Bernstein-Vazirani, Simon, and Shor.
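To make the “exponentially more time” point concrete, here’s a minimal Python sketch (my own illustration, not anything from the article) of how a classical computer simulates a quantum one: just store all 2^n amplitudes explicitly and update them gate by gate. The simulation is perfectly faithful; it’s the time and memory that blow up exponentially with the number of qubits.

```python
from math import sqrt

def hadamard(state, k):
    """Apply a Hadamard gate to qubit k of an n-qubit state vector.

    Classical simulation is straightforward; the catch is that the
    state vector has 2^n amplitudes, so every gate costs time
    (and memory) proportional to 2^n."""
    h = 1 / sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if i & (1 << k) == 0:        # handle each (i, j) pair once
            j = i | (1 << k)         # index with qubit k set to 1
            a, b = state[i], state[j]
            new[i] = h * (a + b)
            new[j] = h * (a - b)     # the minus sign at work
    return new

n = 10                               # 10 qubits -> 1024 amplitudes
state = [0.0] * (1 << n)
state[0] = 1.0                       # start in |00...0>
for k in range(n):
    state = hadamard(state, k)

print(len(state))                    # 1024: doubles with every added qubit
print(round(state[0], 6))            # 0.03125 = 2^(-n/2), uniform superposition
```

Nothing quantum is left unaccounted for here; the state after the loop is exactly the uniform superposition a real quantum computer would be in. The only problem is that adding a 40th or 50th qubit to this simulation would exhaust any classical machine on Earth.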

To be fair to Galchen, Deutsch himself has often been inaccurate on these points, even though he ought to (and does!) know better. Specifically, he conflates the original Church-Turing Thesis (which isn’t challenged in the slightest by quantum computing) with its modern, polynomial-time version (which is), and he neglects to mention the conjectural status of quantum computers’ speedup. Here are two examples out of many, from *The Fabric of Reality*:

“quantum computers can perform computations of which no (human) mathematician will ever, even in principle, be capable.”

“if the visible universe were the extent of physical reality, physical reality would not even remotely contain the resources required to factorize such a large number.”

Am I just harping on technicalities here? In my view, the issue goes deeper. All of the above oversights can be understood as symptoms of **complexophobia**: *the fear of acknowledging that one is actually making statements about computational complexity theory*. Again and again, I’ve seen science writers go through strange verbal contortions to avoid the question of *how anyone could know that a computation inherently requires a huge amount of time*—as if the reader must be prevented, at all costs, from seeing such a claim as anything other than obvious. It can be fascinating to watch, in the same way it’s fascinating to watch a politician discuss (say) Confederate History Month without mentioning slavery. How long can you poke and prod the P versus NP beast without rousting it?

On the other hand, complexity theory *does* show up in Galchen’s article, and in an extremely interesting context: that of explaining where Deutsch got the idea for quantum computing.

According to Deutsch, the insight for [his famous 1985 paper] came from a conversation in the early eighties with the physicist Charles Bennett, of I.B.M., about computational-complexity theory, at the time a sexy new field that investigated the difficulty of a computational task.

Is “at the time” meant to imply complexity theory is no longer sexy, or merely that it’s no longer new? Leaving that aside…

Mass, for instance, is a fundamental property, because it remains the same in any setting; weight is a relative property, because an object’s weight depends on the strength of gravity acting on it … If computational complexity was like mass—if it was a relative property—then complexity was quite profound; if not, then not.

“I was just sounding off,” Deutsch said. “I said they make too much of this”—meaning complexity theory—“because there’s no standard computer with respect to which you should be calculating the complexity of the task.” Just as an object’s weight depends on the force of gravity in which it’s measured, the degree of computational complexity depended on the computer on which it was measured. One could find out how complex a task was to perform on a particular computer, but that didn’t say how complex a task was *fundamentally*, in reference to the universe … Complexity theorists, Deutsch reasoned, were wasting their time.

The tale continues with Bennett pointing out that the universe *itself* could be taken to be the “fundamental computer,” which leads Deutsch to the shocking realization that the complexity theorists weren’t *complete* morons. Sure, they had a silly theory where all the answers depended on which computer you chose (which somehow none of them ever noticed), but luckily, it could be fixed by the simple addition of quantum mechanics!

Over the anguished howls of my classical complexity-theorist friends, I should point out that this story isn’t *completely* false. There’s no denying that merging quantum mechanics with theoretical computer science was a major advance in human knowledge, and that the people who first had the idea to merge the two were *not* computer scientists, but physicists like Deutsch and Feynman (the latter’s role is completely left out of Galchen’s story).

But complexity theory wasn’t so much a flawed early attempt at quantum computing as an *essential prerequisite* to it: the thing that made it possible to articulate how quantum computers might differ from classical computers in the first place. Indeed, it occurs to me that Deutsch and Bennett’s conversation provides the key to resolving a puzzle discussed in the article:

“Quantum computers should have been invented in the nineteen-thirties,” [Deutsch] observed near the end of our conversation. “The stuff that I did in the late nineteen-seventies and early nineteen-eighties didn’t use any innovation that hadn’t been known in the thirties.” That is straightforwardly true. Deutsch went on, “The question is why.”

I used to go around saying the same thing: “someone like John von Neumann could have *easily* invented quantum computing in the 1930s, had he just put the pieces together!” But I now suspect this view is a mistake, the result of projecting what’s obvious today onto a much earlier era. For there’s at least *one* essential ingredient for quantum computing that wouldn’t enter scientific consciousness until the 1970s or so: complexity theory, and particularly the distinction between polynomial and exponential time.

Over the years, I’ve developed what I call the **Minus-Sign Test**, a reliable way to rate popularizations of quantum mechanics. To pass the Minus-Sign Test, all a popularization needs to do is *mention the minus signs*: i.e., interference between positive and negative amplitudes, the defining feature of quantum mechanics, the thing that makes it *different* from classical probability theory, the reason why we *can’t* say Schrödinger’s cat is “really either dead or alive,” and we simply don’t know which one, the reason why the entangled particles *can’t* have just agreed in advance that one would spin up and the other would spin down. Another name for the Minus-Sign Test is the **High-School Student Test**, since it’s the thing that determines whether a bright high-school student, meeting quantum mechanics for the first time through the popularization, would come away thinking of superposition as

(a) one of the coolest discoveries about Nature ever made, or

(b) a synonym used by some famous authority figures for ignorance.

Despite the low bar set by the Minus-Sign Test, I’m afraid almost every popular article about quantum mechanics ever written has failed it, the present piece included.
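For readers wondering what the minus signs actually *do*, here’s a toy calculation (my own illustration, not from the article): a single “quantum coin flip” (a Hadamard gate) applied twice returns the qubit to |0⟩ with certainty, because the two paths to |1⟩ carry amplitudes +1/2 and −1/2 and cancel. The classical analogue, flipping a fair coin twice, just leaves the coin uniformly random: that cancellation is exactly what classical probability theory cannot do.

```python
from math import sqrt

h = 1 / sqrt(2)

def H(amplitudes):
    """One quantum 'coin flip' (Hadamard gate) on a single qubit."""
    a0, a1 = amplitudes
    return (h * (a0 + a1), h * (a0 - a1))   # note the minus sign

def fair_coin(probs):
    """The classical analogue: a randomizing map on probabilities
    (no negative entries allowed, so nothing can ever cancel)."""
    p0, p1 = probs
    return (0.5 * (p0 + p1), 0.5 * (p0 + p1))

# Quantum: two Hadamards in a row restore |0> with certainty,
# because the amplitudes for |1> (+1/2 and -1/2) interfere away.
amp = H(H((1.0, 0.0)))
print([round(abs(a) ** 2, 6) for a in amp])   # [1.0, 0.0]

# Classical: two random flips leave the coin uniformly random.
prob = fair_coin(fair_coin((1.0, 0.0)))
print(prob)                                    # (0.5, 0.5)
```

A popularization that conveys the first computation, in whatever words, passes the test; one that presents superposition as mere ignorance about which branch is “real” cannot explain it at all.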

Reading *Not Even Wrong*, I was surprised at first that the discussion centered around Deutsch’s argument that quantum computing proves the existence of Many Worlds. (More precisely, Deutsch’s position is that Many Worlds is an established fact *with or without* quantum computing, but that for those who are too dense or stubborn to see it, a working quantum computer will be useful for hitting them over the head.)

As others pointed out: yes, the state of the universe as described by quantum mechanics is a vastly, *exponentially* bigger thing than anything dreamt of in classical physics; and a scalable quantum computer would be dramatic evidence that this exponentiality is really “out there,” that it’s not just an artifact of our best current theory. These are not merely truths, but truths worth shouting from the rooftops.

However, there’s then the further question of whether it’s useful to talk about *one quantum-mechanical universe* as an exponential number of parallel semi-classical universes. After all, to whatever extent the branches of a superposition successfully contribute to a quantum computation, to that extent they’re not so much “parallel universes” as one giant, fault-tolerantly-encoded, self-interfering blob; and to whatever extent those branches *do* look like parallel universes, to that extent they’re now forever out of causal contact with each other—the branches other than our own figuring into our explanations for observable events only in the way that classical counterfactuals figure in.

Anyway, I thought: does anyone still *care* about these issues? Wasn’t every possible argument and counterargument explored to death years ago?

But this reaction just reveals my personal bias. Sometime in graduate school, I realized that I was less interested in winning philosophical debates than in discovering new phenomena for philosophers to debate about. Why brood over the true meaning of (say) Gödel’s Theorem or the Bell Inequality, when there are probably other such worldview-changing results still to be found, and those results might render the brooding irrelevant anyway? Because of this attitude, I confess to being less interested in whether Many-Worlds is *true* than in whether it’s scientifically fruitful. As Peter Shor once memorably put it on this blog: why not be a Many-Worlder on Monday, a Bohmian on Tuesday, and a Copenhagenist on Wednesday, if that’s what helps you prove new theorems?

Ironically, this attitude seems to me to mesh well with Deutsch’s own emphasis on *explanation* as the goal of science. Ask not whether the parallel universes are “really there,” or whether they should really be called “parallel universes”—ask what explanatory work they do for *you*! (That is, over and above the explanatory work that QM itself already does for you, assuming you accept it and know how to use it.)

So for me, the single strongest argument in favor of Many-Worlds is what I call the “Deutsch argument”:

*Many-Worlds is scientifically fruitful, because it led David Deutsch to think of quantum computing.*

This argument carries considerable force for me. On the other hand, if we accept it, then it seems we should also accept the following argument:

*Bohmian mechanics is scientifically fruitful, because it led John Bell to think of the Bell inequality.*

Furthermore, consider the following facts:

*David Deutsch is a brilliant, iconoclastic theoretical physicist, who thought deeply about quantum foundations at a time when it was unfashionable to do so. His extraordinary (and not wholly-unjustified!) self-confidence in his own powers of reasoning has led to his defending not one but many heterodox ideas.*

Is it possible that these facts provide a common explanation for Deutsch’s certainty about Many-Worlds *and* his pioneering role in quantum computing, without our needing to invoke the former to explain the latter?

Let me end with a few miscellaneous reactions to Galchen’s article.

Physics advances by accepting absurdities. Its history is one of unbelievable ideas proving to be true.

I’d prefer to say the history of physics is one of a vast number of unbelievable ideas proving to be *false*, and a few *specific* unbelievable ideas proving to be true—especially ideas having to do with the use of negative numbers where one might have thought only positive numbers made sense.

[Robert Schoelkopf and his group at Yale] have configured their computer to run what is known as a Grover’s algorithm, one that deals with a four-card-monte type of question: Which hidden card is the queen? It’s a sort of Shor’s algorithm for beginners, something that a small quantum computer can take on.

No, small quantum computers can and have taken on both Shor’s *and* Grover’s algorithms, solving tiny instances in each case. The *real* difference between Shor’s and Grover’s algorithms is one that complexophobia might prevent Galchen from mentioning: Shor gives you a (conjectured) exponential speedup for some highly-specific problems (factoring and discrete log), while Grover gives you “merely” a quadratic speedup, but for a much wider class of problems.
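To see what “merely quadratic” means in practice, here’s a small Python simulation sketch (mine, under the usual caveat that real implementations differ): for one marked item among N, classical search needs on the order of N queries, while Grover needs only about (π/4)·√N iterations. By symmetry all unmarked items share the same amplitude, so we can track just two numbers rather than a length-N state vector, and even N = 1,000,000 simulates instantly.

```python
from math import pi, sqrt

def grover_success_prob(N, iterations):
    """Success probability of Grover search for 1 marked item out of N.

    By symmetry, all N-1 unmarked items always share one amplitude,
    so two numbers suffice instead of a length-N state vector."""
    a = 1 / sqrt(N)    # amplitude of the marked item
    b = 1 / sqrt(N)    # amplitude of each unmarked item
    for _ in range(iterations):
        a = -a                                 # oracle: flip the marked sign
        mean = (a + (N - 1) * b) / N
        a, b = 2 * mean - a, 2 * mean - b      # inversion about the mean
    return a * a

N = 1_000_000
t = round(pi / 4 * sqrt(N))    # ~785 quantum queries...
print(t, grover_success_prob(N, t))
# ...versus ~500,000 expected classical queries, for near-certain success
```

That’s a real and provably optimal advantage for black-box search, but a 785-vs-500,000 improvement is a very different beast from the (conjectured) exponential advantage Shor’s algorithm gets by exploiting the special structure of factoring.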

“Look,” [Deutsch] went on, “I can’t stop you from writing an article about a weird English guy who thinks there are parallel universes. But I think that style of thinking is kind of a put-down to the reader. It’s almost like saying, If you’re not weird in these ways, you’ve got no hope as a creative thinker. That’s not true. The weirdness is only superficial.”

This was my favorite passage in the article.

Comment #1 May 3rd, 2011 at 3:16 am

I like that I’m a grad student in Comparative Philosophy (that is to say, I live in the land of woo-woo), and I could identify what was wrong with the blockquoted portions of the article before I read your commentary. 🙂 I guess that this blog is doing something useful.

Comment #2 May 3rd, 2011 at 3:43 am

Thanks, Carl! In the future, maybe I should just quote the troubling passages and leave the commentary on them to the commenters? 🙂

Comment #3 May 3rd, 2011 at 3:47 am

In Yuri Manin’s collection of essays “Mathematics as Metaphor”, he mentions somewhere that he was talking about the possibility of quantum computers around the time that Feynman first did. He might have written a paper in Russian about it back then. I can’t remember, but it’s probably not reading too much into things to say that he thinks he deserves some credit. You might want to look into that. It wouldn’t be surprising.

Comment #4 May 3rd, 2011 at 4:02 am

jb: Yes, the Nielsen & Chuang textbook (for example) talks about Manin, and several other people who had QC ideas around the same time as Deutsch and Feynman or even earlier.

I can think of few better illustrations for the old saying that what counts isn’t being the *first* person to originate an idea, but the last one! 🙂 And, as I’ve seen again and again in my own career, that usually requires not just (e.g.) mentioning the idea in a paper about something else, published in some venue that none of the relevant people read, but really *hammering the idea home* (or being lucky enough that other people do that for you, while giving you credit). It’s not enough to wander the land; you’ve got to settle and build a city on it.

Comment #5 May 3rd, 2011 at 4:06 am

The real question is: Why wasn’t the distinction between polynomial time and exponential time discovered before the 1970s?

Comment #6 May 3rd, 2011 at 4:20 am

Mitchell: Good question! The short answer is that it was, many times, but without a whole field of algorithm design (motivated by actually-existing computers), people didn’t grasp its importance.

(Gödel certainly understood the distinction in his 1956 letter to von Neumann, but he never published about it. In the 1960s, Edmonds, Hartmanis-Stearns, and Cobham *did* publish their seminal papers explaining, defending, and applying the distinction. But at least as I understand it, it wasn’t until the discovery of NP-completeness in the 1970s that the distinction started filtering into the wider scientific world.)

Comment #7 May 3rd, 2011 at 5:08 am

An excellent post Scott. Insightful, balanced and very informative — no hint at all of the grumpiness sometimes found elsewhere when these topics are discussed.

Comment #8 May 3rd, 2011 at 6:30 am

So….a Letter to the Editor? Best case scenario, Science will have been popularized in an intelligent manner. Worst case scenario, you might bore the New Yorker readership out of their Armanis.

Comment #9 May 3rd, 2011 at 6:49 am

So….a Letter to the Editor?

Eh, I already said what I wanted to here. And there’s too much to fit into a letter.

Comment #10 May 3rd, 2011 at 9:36 am

One of the things that bothers me about Deutsch’s way of thinking is the following. One of the superficial differences between classical and quantum mechanics is that the easiest, most accessible formalism of either one of them is different. Classical mechanics can be (but doesn’t have to be) formulated directly in terms of observable quantities obeying certain differential equations. In quantum mechanics you are forced into using redundant variables encoded (for example) in the wavefunction or density matrix. You then have a prescription of extracting from all this redundant information the very small subset of it corresponding to the physical information.

It seems to me that it is precisely this redundancy that is so celebrated by Deutsch as the force of quantum mechanics. But it is not – to compare meaningfully any aspect of quantum and classical mechanics (e.g. computational power) you’d have to use similar formalisms to distinguish language from content. I suspect then that you can formulate (clumsily and redundantly) purely classical mechanics in terms of redundant variables such as the wavefunction, and then you’d have yourself exponential speedup a la Deutsch.

In other words, I don’t see any aspect of actual QM as a dynamical theory that comes into that type of reasoning. Or, maybe I am missing something…

Comment #11 May 3rd, 2011 at 9:42 am

Could you post a link to further information on the “Minus-Sign Test” and/or popularizations that pass it? I’ve read quite a lot about quantum mechanics from a lay person level, yet I’m not sure I understand what you are talking about with this test, so I am afraid that perhaps it is a portion of QM that I do not understand.

Comment #12 May 3rd, 2011 at 9:45 am

That’s a great post, thanks Scott. It’s rather astounding to read that Deutsch has been conflating hypotheses and claiming things as facts that really are not.

Just by itself, the statement that quantum computers could solve problems that are beyond the scope of any classical computer has been so widely publicized that it deserves massive debunking.

Maybe this needs to be promoted as another subtitle below “Shtetl-Optimized”?

Comment #13 May 3rd, 2011 at 10:41 am

Ironically, this attitude seems to me to mesh well with Deutsch’s own emphasis on explanation as the goal of science. Ask not whether the parallel universes are “really there,” or whether they should really be called “parallel universes”—ask what explanatory work they do for you! (That is, over and above the explanatory work that QM itself already does for you, assuming you accept it and know how to use it.) So for me, the single strongest argument in favor of Many-Worlds is what I call the “Deutsch argument”: Many-Worlds is scientifically fruitful, because it led David Deutsch to think of quantum computing.

The point about whether an idea is fruitful seems distinct from Deutsch’s notion of explanations being important. Fruitfulness is generally what was emphasized by Lakatos.

Comment #14 May 3rd, 2011 at 11:40 am

Moshe #10:

I suspect then that you can formulate (clumsily and redundantly) purely classical mechanics in terms of redundant variables such as the wavefunction, and then you’d have yourself exponential speedup a la Deutsch.

That’s where you’re mistaken, unless I’m misunderstanding you. The (conjectured) exponential speedup of Shor’s algorithm is not some artifact of the way quantum states are represented—if QM is correct, then it’s a real, actual phenomenon. On the other hand, I agree that thinking about the wavefunction “realistically” (as an exponentially-large classical object) seems to be a mistake that countless popular writers make, which then leads them to believe that quantum computers can solve black-box search problems instantaneously, store exponentially-many classical bits, and do other things that they’re known not to be able to do.

Comment #15 May 3rd, 2011 at 11:51 am

Kevin C #11:

Could you post a link to further information on the “Minus-Sign Test” and/or popularizations that pass it? I’ve read quite a lot about quantum mechanics from a lay person level, yet I’m not sure I understand what you are talking about with this test, so I am afraid that perhaps it is a portion of QM that I do not understand.

I’m sorry to break the news that whatever popularizations you read failed you: the minus signs aren’t just a “portion” of QM, they’re the *entire thing* that distinguishes QM from classical probability theory! Here are some resources to get you started:

The wonderful book *From Eternity to Here* by Sean Carroll (which contains one of the only popular-level discussions of QM I know about to pass the Minus-Sign Test with flying colors)

My *Quantum Computing Since Democritus* Lecture 9 or Quantum Computing for High School Students

Pretty much anything by Feynman (e.g., *QED* or *The Character of Physical Law*)

“The Square Root of NOT” by Brian Hayes

Let me know if you read those and want more. 🙂

Comment #16 May 3rd, 2011 at 11:52 am

Scott, that is my point – the exponential speedup that actually does exist seems to me unrelated to the point of the formalism that motivates Deutsch. Rather, it uses facts about quantum mechanics that are inherently quantum, and not just its representation as an exponentially large amount of unobservable information. Such a representation can exist for any theory; in fact, it ought to exist for classical mechanics as well. What facts those are is a fascinating question, but I think the wavefunction having many branches is a red herring.

Comment #17 May 3rd, 2011 at 11:56 am

Moshe: The way I’d put it is

(1) the wavefunction has exponentially many branches—so far, no distinction at all between QM and classical probability theory, but then

(2) the amplitudes can be negative—and that, *combined* with (1), is what makes exponential quantum speedups possible.

Comment #18 May 3rd, 2011 at 12:01 pm

T. #12:

Just by itself, the statement that quantum computers could solve problems that are beyond the scope of any classical computer has been so widely publicized that it deserves massive debunking. Maybe this needs to be promoted as another subtitle below “Shtetl-Optimized”?

Done! 🙂 Thanks for the suggestion.

Comment #19 May 3rd, 2011 at 12:04 pm

“and can be simulated classically with exponential slowdown”

😉 Perfect. Exactly the point!

Comment #20 May 3rd, 2011 at 12:04 pm

Joshua #13:

The point about whether an idea is fruitful seems distinct from Deutsch’s notion of explanations being important. Fruitfulness is generally what was emphasized by Lakatos.

For me, whether an idea is fruitful is a *test* of whether it has explanatory power. I have a hard time thinking of any example of a satisfying explanation that doesn’t open up further avenues for research.

Comment #21 May 3rd, 2011 at 12:06 pm

Scott, we both seem to agree that (1) is not enough, which is my main objection to (at least vulgarizations of) Deutsch intuition of “parallel” computations. I guess this is what you mean by the minus sign test…

But, it is not clear to me that (2) is sufficient, or the essential point. The fact is that only specific problems, whose structure is used in specific ways, enjoy the speedup. So, any “generic” explanation will have to give a reason why this doesn’t work for generic problems.

Comment #22 May 3rd, 2011 at 12:23 pm

Or to cast it in a language you may sympathize with – I’d be suspicious of any explanation of facts about complexity of quantum computation that does not use actual complexity theory in an essential way.

Comment #23 May 3rd, 2011 at 12:44 pm

Moshe: I certainly sympathize! But I think it’s useful to distinguish

(1) explaining a computational model itself, and

(2) explaining how interesting or nontrivial *algorithms* work in that model, or the *impossibility results* that can be proved about the model.

I take it for granted that (2) is inherently complicated—so in particular, I don’t expect popular articles about quantum computing to be able to do it! I’d happily settle for popular articles doing (1) (which of course, most of them already fail at).

Comment #24 May 3rd, 2011 at 12:59 pm

Yeah, I agree, but my comment was not so much about the article as about the question being discussed in that article: what facts about quantum mechanics are the essence of the difference you get in computational power (a vague question if there ever was one). The explanation given there (and the modification in your comment, to a lesser degree) seems to me misleading because it applies universally: it suggests speedup is a generic feature following solely from using the quantum mechanical formalism. The real answer is probably more intricate; if you have a reference for an interested outsider with something more convincing, I’d be interested to read about it.

Comment #25 May 3rd, 2011 at 1:27 pm

Moshe,

I don’t think that a “speedup is a generic feature following solely from using the quantum mechanical formalism.”

I think the key word is “solely”. On a prior post:

https://scottaaronson-production.mystagingwebsite.com/?p=208

Scott said in part:

” . . . if we want a fast quantum factoring algorithm, we’re going to have to exploit some structure in the factoring problem: in other words, some mathematical property of factoring that it doesn’t share with just a generic problem of finding a needle in a haystack.”

Not sure if this is what your question was driving at.

Comment #26 May 3rd, 2011 at 1:31 pm

Yeah, I’m committing the ancient sin of preaching to the choir. Thanks for reminding me of that post, this is probably what I need.

Comment #27 May 3rd, 2011 at 1:40 pm

Moshe,

I’m really out of my depth here, but perhaps in addition to the potential advantages inherent in quantum mechanics, each type of problem (factoring, simulation, etc.) must have its own peculiar exploitable feature in order for the (plausibly conjectured) speed up to occur. Or, maybe not. Perhaps someone else has a pithy answer to this.

Comment #28 May 3rd, 2011 at 1:45 pm

“And third, Deutsch’s landmark paper wasn’t among the ones to give evidence for that conjecture. The first such evidence only came later, with the work of Bernstein-Vazirani, Simon, and Shor.”

Well I’m not sure if I agree with you there. Deutsch definitely doesn’t show anything that would pass muster as a computational complexity result, but he does show in that paper a small computational task where quantum and classical differ, and he explicitly asks whether this implies that computational complexity depends on whether you have a quantum or classical computer. I’d count this as “evidence.” (And also I think your list should include Deutsch-Jozsa!)

As an interesting side note, Turing apparently was very interested in quantum theory and even in its foundations. In fact he discovered what we now call the quantum Zeno effect. I always like to think that had Turing lived longer we would have found quantum computers earlier (or in other words, Turing would be working in quantum computing had he lived today 🙂 )

Comment #29 May 3rd, 2011 at 1:55 pm

“I always like to think that had Turing lived longer we would have found quantum computers earlier (or in other words, Turing would be working in quantum computing had he lived today.)”

Well, Deutsch of course would say that he did 😉

Comment #30 May 3rd, 2011 at 2:05 pm

This response to the New Yorker article was very, very good!

As a big fan of both the New Yorker and this blog I strongly encourage you to write in to them despite the lengthiness of your comments.

Comment #31 May 3rd, 2011 at 2:06 pm

Why do I have this feeling that there are a lot of closet mathematicians around here?

Comment #32 May 3rd, 2011 at 2:25 pm

I don’t think what you’re calling “scientifically fruitful” has anything to do with Deutsch’s (Popper’s) notion of “explanation”. If, to take an example from Deutsch’s new book “Beginning of Infinity”, you had a good idea while thinking about the seasons being caused by some private affair between Greek gods, you’d call it “scientifically fruitful”, but I don’t think anybody would call it a good explanation. In that sense I’m not even sure about what your point is in that part of the post.

Comment #33 May 3rd, 2011 at 3:35 pm

Chris Fuchs once told the story that he went around asking people who had made notable discoveries in quantum computation whether they had been inspired by Everett’s interpretation of quantum mechanics. One — I think it was Simon — wrote back, “Who is Everett and what is his ‘interpretation’?”

Comment #34 May 3rd, 2011 at 3:39 pm

Mark Probst: I agree that it’s by no means necessary that something that’s scientifically fruitful also have explanatory power. My claim is simply that empirically, the converse statement holds: something that’s not scientifically fruitful is exceedingly unlikely to have explanatory power. Do you have a good counterexample (besides MWI, of course 🙂 )?

Comment #35 May 3rd, 2011 at 3:52 pm

Dave Bacon #28: Sorry, I guess we differ here … if all I knew was Deutsch-Jozsa, I wouldn’t be convinced that anything of complexity-theoretic significance was going on. Whereas Bernstein-Vazirani would pique my interest. (Of course, in retrospect, complexity theorists “should have” leapt at Deutsch-Jozsa earlier — but Deutsch and Jozsa also “should have” pursued an asymptotic gap! Oh well, it only took a few years from there to Shor’s algorithm anyway.)

Comment #36 May 3rd, 2011 at 3:55 pm

OK, having said that, I had to go look. I think I was recalling it from PIRSA:07090068, where he quotes Jozsa as saying,

Shor said that he knew of Everett before starting work on his factoring algorithm, but that “the idea was really more to use periodicity, and inspired by Simon’s algorithm.” To which the punchline is Simon saying, “Who’s Everett, and what’s his interpretation?”

Comment #37 May 3rd, 2011 at 4:03 pm

Mike #27: Yes, that’s well-expressed. The combination of superposition and interference is what can lead to quantum speedups for certain problems. But figuring out which problems you can actually get a quantum speedup for, and how much speedup, and by which problem-specific techniques, is an entire field of study. To whatever extent you can find general principles underlying the answers to such questions, you’ll revolutionize the field!

Comment #38 May 3rd, 2011 at 4:06 pm

It’s nice to see that you haven’t lost your essayistic prowess, Scott.

There has been another CS-related piece in the New Yorker lately which I found much worse than Galchen’s. Adam Gopnik’s thoughts on AI were quite shallow and error-riddled. Well, I love the magazine nevertheless. (Although I don’t belong to the target audience of chardonnay-sipping Manhattan socialites)

Comment #39 May 3rd, 2011 at 4:06 pm

I like the amended blog tagline. (-:

Comment #40 May 3rd, 2011 at 4:31 pm

Scott,

“To whatever extent you can find general principles underlying the answers to such questions, you’ll revolutionize the field!”

Well, I’ll get to work on that right away 🙂

Comment #41 May 3rd, 2011 at 4:34 pm

Scott, do you have an agent? Because I bet you could write a killer pop science book on complexity theory and quantum computers. Get that puppy optioned for a PBS series, make you a couple million, go be as eccentric as you want. With a little work you could make Howard Hughes look as straight-laced as Mr. Cleaver.

Comment #42 May 3rd, 2011 at 4:37 pm

complexophobia – I thought that it’s fear of complex numbers. Which I suppose sometimes comes to the same thing as your definition.

Also, physics advances by TAMING absurdities. Which is evidently the opposite of what the New Yorker sometimes does.

Comment #43 May 3rd, 2011 at 4:47 pm

There is an interesting remark Scott made (to ‘sp’) a few years ago (in the post “Scientific American article is out”) which mentions the connection between the impossibility of (computationally superior) quantum computers and David Deutsch’s argument for the many-worlds interpretation.

“If quantum computing is impossible, I would say that quantum mechanics as physicists have taught and understood it for 80 years is wrong. Think about it this way: if quantum mechanics can’t be used to efficiently sample any probability distribution beyond what a classical computer can efficiently sample (forget for the moment about deciding all languages in BQP), then there must be an efficient classical algorithm to simulate the outputs of a quantum process. But that, in turn, means that there’s some complete description of interacting physical systems that’s vastly more efficient than writing down the entangled wavefunction (i.e., that requires polynomially many rather than exponentially many bits). And this, I think, would be the biggest change in physicists understanding of QM since the 1920′s: it wouldn’t be a local hidden-variable theory, but it would be something even more astounding, a “succinct” hidden-variable theory. I think such a theory would, for example, overturn David Deutsch’s famous argument for the many-worlds interpretation (“where is all the computation happening, if not in parallel universes?”). The answer would be: it’s happening in a single universe that stores the succinct representation of the wavefunction.

You’ll notice that in the above argument, I never once mentioned noise. That’s not an accident. For either the noise is enough to enable an efficient classical simulation of the quantum computer, in which case you’re in the situation above, or else it isn’t, in which case some sort of nontrivial quantum computing is possible by definition.”

Comment #44 May 3rd, 2011 at 7:28 pm

“I think such a theory would, for example, overturn David Deutsch’s famous argument for the many-worlds interpretation (“where is all the computation happening, if not in parallel universes?”). The answer would be: it’s happening in a single universe that stores the succinct representation of the wavefunction”

Scott,

I’d be interested in your response to this. Is there any hint of this — where does the evidence lie?

Comment #45 May 3rd, 2011 at 8:51 pm

Scott, I do disagree strongly with not including Deutsch-Jozsa in your list.

From the abstract to their 1992 paper:

“A class of problems is described which can be solved more efficiently by quantum computation than by any classical or stochastic method. The quantum computation solves the problem with certainty in exponentially less time than any classical deterministic computation.”

From the paper:

“In this paper we demonstrate the importance of quantum processes for issues in computational complexity. We describe a problem which can be solved more efficiently by a quantum computer than by any classical computer. The quantum computer solves the problem with certainty in exponentially less time than any classical deterministic computer, and in somewhat less time than the expected time of any classical stochastic computer.”

This still seems to me to be a big step in the “evidence” for quantum computers giving speedup. In particular showing QP not in ZPP relative to their oracle problem seems to me like a pretty good dose of evidence.

(Another paper that has somehow been lost to the sands of time that discusses some of these early complexity results is “The quantum challenge to structural complexity theory” by Berthiaume and Brassard.)
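The Deutsch-Jozsa algorithm being debated here can be simulated classically in a few lines. Below is a minimal statevector sketch (the function name and structure are illustrative, not from the 1992 paper): given an oracle promised to be constant or balanced, a single quantum query distinguishes the two cases with certainty, whereas a deterministic classical algorithm may need 2^(n-1)+1 queries.

```python
from math import sqrt

def deutsch_jozsa(f, n):
    """Classically simulate the n-qubit Deutsch-Jozsa circuit.

    f maps {0, ..., 2^n - 1} to {0, 1} and is promised to be either
    constant or balanced. One oracle query suffices to tell which.
    """
    N = 2 ** n
    # Hadamards on |0...0> produce the uniform superposition.
    state = [1 / sqrt(N)] * N
    # Single phase-oracle query: flip the sign wherever f(x) = 1.
    state = [(-1) ** f(x) * a for x, a in enumerate(state)]
    # After the final Hadamards, the amplitude of |0...0> is the
    # normalized sum of the phases: +/-1 if f is constant, 0 if balanced.
    amp0 = sum(state) / sqrt(N)
    return "constant" if abs(amp0) > 0.5 else "balanced"

print(deutsch_jozsa(lambda x: 1, n=4))      # → constant
print(deutsch_jozsa(lambda x: x & 1, n=4))  # → balanced
```

The quantum circuit uses one query regardless of n, while deterministically verifying the promise classically can require checking just over half the inputs — that query-complexity gap is the “evidence” at issue in this exchange.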

Comment #46 May 3rd, 2011 at 8:53 pm

Oh and a great quote from that paper (Berthiaume and Brassard) is “It must be pointed out that the class QP was introduced under this very name by Deutsch and Jozsa[9], showing that these physicists are far from ignorant of computational complexity theory. Thus we probably have much more to learn from them than they have from us.” 🙂

Comment #47 May 3rd, 2011 at 10:12 pm

“However, reading this article also depressed me, as it dawned on me that the entire thing could have been written fifteen years ago, with only minor changes to the parts about experiment and zero change to the theoretical parts.”

Yes, there were some minor inaccuracies, but this is what annoyed me the most. However, I don’t know what Deutsch has done in the last 15 years, either, and the main point of the article was to profile Deutsch not to explain the status of quantum computing.

Comment #48 May 4th, 2011 at 1:49 am

It looks like your character encodings got screwed up on

http://www.scottaaronson.com/writings/highschool.html .

None of the square roots come out right.

Comment #49 May 4th, 2011 at 4:19 am

I find it hard to believe the New Yorker article could be better than your blog post on it. Bravo.

Comment #50 May 4th, 2011 at 4:21 am

By the way, I have no interest in seeing your long form birth certificate, but it would be cool if you scanned and published your transcripts (high school through grad school) to your website.

Comment #51 May 4th, 2011 at 10:03 am

Had quantum computation been discovered earlier, would P vs NP still be the marquee problem in complexity theory, or would it be BPP vs BQP? It seems to me that P vs. NP is “only” interesting if they turn out to be equal, or resolving it leads to proof techniques that can be used to prove things about more “real” complexity classes, while BPP vs BQP will be interesting no matter what. You said that classical complexity was a prerequisite to quantum complexity, but now that we have quantum complexity, does the classical matter?

Comment #52 May 4th, 2011 at 10:16 am

Complexophobia: Most science journalism makes experts squirm uncomfortably. I doubt this is much more true for complexity theory than other topics. (I used to do cosmology …)

Evidence: One might say that Feynman presented evidence for exponential speedups in his 1982 paper. His Main Point is that there are quantum simulations that we don’t know how to do efficiently on a classical computer. Deutsch tried to restate the point in complexity theory terms, which was very important.

I remember hearing a talk by Seth Lloyd on quantum computing in the early 1990s. Michael Douglas, a theoretical physicist who knows a lot about computer science, was in the audience, too. Mike told me afterwards that the talk had disappointed him because it did not address Deutsch’s Big Idea that quantum computers can achieve exponential speedups. This stirred me to read Deutsch’s 1985 paper, and I was unimpressed. It was “obvious” to me that at best Deutsch was talking about an exponential “speedup” that occurs with exponentially small probability.

Later I read the Deutsch-Jozsa paper and I was unimpressed. It was “obvious” to me that the “speedup” resulted from insisting on deterministic rather than probabilistic algorithms.

But a few years later I changed my mind …

Even so, a lot of things are still “obvious” to me.

Comment #53 May 4th, 2011 at 12:53 pm

Vadim #51:

“Had quantum computation been discovered earlier, would P vs NP still be the marquee problem in complexity theory, or would it be BPP vs BQP?”

No, I think P vs. NP would still be extremely important (along with NP vs. BQP, which you don’t mention). NP is a “real” complexity class, in the sense that NP-complete problems are problems that we’d really like to solve. And I think there’s no real doubt that a proof of P≠NP would require (and lead to) all sorts of other important advances in mathematical knowledge, probably including new algorithms.

However, I think the most important point is that the barriers to proving P≠NP, BPP≠BQP, NP⊄BQP, and so on are all essentially the same: relativization, algebrization, natural proofs, and (less formally but most importantly) the huge variety of nontrivial algorithms. Therefore, you might as well focus on whichever of these problems is best suited to your present purpose (connecting to some other field, explaining to a popular audience, etc.), since any major advance on any of them would also constitute a major advance on the others.

Comment #54 May 4th, 2011 at 1:19 pm

John Preskill #52:

“Complexophobia: Most science journalism makes experts squirm uncomfortably. I doubt this is much more true for complexity theory than other topics.”

I respectfully disagree: it seems to me that the quality of science journalism varies enormously by topic. Daniel Dennett once wrote that no area of science has been better served by its writers than evolution. I would add that no area of science has been worse served by its writers than quantum mechanics! 🙂

To be fair, I think a lot of this has to do with how the scientists themselves write. Charles Darwin and Thomas Huxley both wrote beautifully. By contrast, I’d argue that Bohr and Heisenberg started a tradition of writing really obscurely about quantum mechanics that continued for almost a century, and that’s only started being corrected recently, with the rise of quantum information. (Incidentally, I think Deutsch is completely right on this point. And while I found plenty to disagree with in The Fabric of Reality, at least that book is clear enough that I can pinpoint exactly where the disagreements lie!)

“This stirred me to read Deutsch’s 1985 paper, and I was unimpressed. It was ‘obvious’ to me that at best Deutsch was talking about an exponential ‘speedup’ that occurs with exponentially small probability. Later I read the Deutsch-Jozsa paper and I was unimpressed. It was ‘obvious’ to me that the ‘speedup’ resulted from insisting on deterministic rather than probabilistic algorithms.”

Wait, are you arguing against me here? 🙂 I think these anecdotes (which I didn’t know, and which I thank you for sharing) make my point better than I did! The conclusions you drew were entirely reasonable ones to draw from the Deutsch and Deutsch-Jozsa papers—as I said, they’re also the conclusions that I would’ve drawn. A journalist’s spin might be: “ah, how foolish everyone else was, not to understand what Deutsch and Jozsa were driving at!” But science is kind of all about presenting the evidence that what you’re saying is true…

Comment #55 May 4th, 2011 at 2:17 pm

Scott: “are you arguing against me here?”

No, not really. Deutsch and Deutsch-Jozsa were prescient, and for that they deserve much credit. But I would say that as of 1992 the most persuasive evidence for profound quantum speedups came from Feynman’s argument.

Comment #56 May 4th, 2011 at 2:55 pm

“Deutsch and Deutsch-Jozsa were prescient, and for that they deserve much credit. But I would say that as of 1992 the most persuasive evidence for profound quantum speedups came from Feynman’s argument.”

Well-put; I’m in complete agreement there. (Except that I’d describe what Feynman gave not so much as an “argument” as a “pointing out of the lack of counterarguments”… 🙂 )

Comment #57 May 4th, 2011 at 3:31 pm

“Charles Darwin and Thomas Huxley both wrote beautifully. By contrast, I’d argue that Bohr and Heisenberg started a tradition of writing really obscurely about quantum mechanics that continued for almost a century, and that’s only started being corrected recently, with the rise of quantum information.”

You are just focusing on the best biology writers and the worst physics writers. Feynman (his Messenger lecture on quantum mechanics and his QED book) and Freeman Dyson, for example, wrote very clearly, in simple yet accurate terms, about quantum mechanics. I think the New Yorker article was pretty bad, riddled with cliches, but I blame the author Galchen for its shortcomings, not all of physicsdom. Scott, you gotta get over your love-hate relationship with us physicists. Me personally, I love all computational complexity theorists.

Comment #58 May 4th, 2011 at 3:57 pm

rr: I agree that Feynman and Dyson both wrote/write beautifully, but they weren’t the “founding fathers” of QM—Bohr and Heisenberg were. I also agree that Bohr and Heisenberg don’t bear direct responsibility for any shortcomings of a modern popular article. They were, however, responsible for starting the tradition of writing about QM in a way that flunks the Minus-Sign Test—a tradition that Feynman later tried his best to correct.

Glad to hear about your love for ALL computational complexity theorists! So I assume your days of posting snarky anticomplexity comments on Shtetl-Optimized are over?? 😉

Comment #59 May 4th, 2011 at 4:13 pm

As someone who’s not a scientist, it seems obvious to me (which is how I know that I’m almost certainly wrong): cutting-edge physics and computational complexity are doomed to be forever butchered in the popular press because they’re inherently less accessible. I’m not saying the study of evolution is easy, but people can develop some intuition about it (even if the intuition is more wrong than right). On the other hand, if one wants to even think about quantum Turing machines, one had better be prepared to spend time learning the background.

Physics and computer science are huge interests of mine and have been for a while, the life sciences not as much. But if I had to write an article about either evolution or quantum computation, I could much easier bullshit through the evolution one, saying not much of note, but also not betraying as much ignorance.

We’re all philosophers, but most of us ain’t quantum physicists.

Comment #60 May 4th, 2011 at 4:26 pm

Vadim,

“We’re all philosophers, but most of us ain’t quantum physicists.”

This unfortunately sums up my predicament. 🙁

Comment #61 May 4th, 2011 at 4:38 pm

Mike #44:

“The answer would be: it’s happening in a single universe that stores the succinct representation of the wavefunction.”

“Scott, I’d be interested in your response to this. Is there any hint of this — where does the evidence lie?”

Sorry, just realized I never responded to this!

There’s been essentially no change here—we still think that quantum computers can’t be efficiently simulated by classical computers. If anything, the evidence for that proposition has gotten stronger in recent years, with, e.g., the results on sampling problems due to Bremner, Jozsa, and Shepherd, and myself and Arkhipov.

Comment #62 May 4th, 2011 at 4:50 pm

Vadim and Mike, I think there’s a simple solution to the problem: if you’re writing a popular article about a topic that you don’t understand well, have experts (preferably not the ones you’re writing about) fact-check and “concept-check” the article. Personally, I’ve been happy to do that for every journalist who’s ever asked me to, and I know many others would be happy too. Fact-checking a popular article is also an excellent experience for grad students.

Comment #63 May 4th, 2011 at 6:57 pm

I don’t hear Bohmian mechanics discussed much anymore. Is there any work being done to advance it?

Comment #64 May 4th, 2011 at 7:05 pm

Kamal: Not many people work on Bohmian mechanics, but a few who do (who I know about) are Sheldon Goldstein and Roderich Tumulka at Rutgers, and Antony Valentini at Perimeter Institute. Search the arXiv for their papers if you’re interested.

Comment #65 May 4th, 2011 at 7:14 pm

Thanks Scott

Comment #66 May 5th, 2011 at 5:21 am

Scott,

“If anything, the evidence for that proposition has gotten stronger in recent years”

That’s what I thought — thanks.

Comment #67 May 5th, 2011 at 8:23 am

Re-reading my comment, I just want to make sure I wasn’t coming off as dissing actual philosophers; philosophy isn’t something I know much about and I’m sure there’s a lot of study involved in mastering it. I just meant that there are more armchair philosophers than armchair quantum physicists or complexity theorists.

Comment #68 May 5th, 2011 at 2:57 pm

To Scott’s comment 56: I think that the strongest argument for the difficulty of simulating quantum systems on classical computers is STILL the lack of counter-arguments. While Scott once commented to me that he regarded Shor’s algorithm as having provided strong evidence for the difficulty of quantum simulation given the belief in the difficulty of factoring, I (respectfully!) regard this statement as nonsense: many more smart people have put in many more hours trying to figure out how to simulate quantum systems than have put in hours trying to figure out how to factor. And up until relatively recently, they had much more motivation to do so (and even today, I’d say they still have more motivation to do so). So, if you want to base arguments for the difficulty of a problem on people’s inability thus far to solve it, we should just look at people’s inability to simulate quantum systems as being the best argument in that vein.

Other than arguments based on “we don’t know how to solve this problem”, there is also the argument that “solving this problem implies solving many other problems”; basically, arguments based on reductions. Such arguments are strongest when applied to the problem that you can reduce other problems to; i.e., they are much stronger when applied to simulating quantum systems than to factoring.

Comment #69 May 5th, 2011 at 3:30 pm

Matt, I respectfully think you’re talking nonsense! Firstly, before the quantum information era, my understanding is that most physicists did not stress the exponentiality of Hilbert space, nor did they regard quantum simulation as having some qualitatively different level of hardness than classical simulation. (If they had, then they would’ve invented quantum computing in the 50s or 60s, before Feynman and Deutsch!) Rather, their impulse was to throw DMRG, Wick rotation, or some other calculational tool at the quantum simulation problem and hope it worked (a hope, of course, that was very often borne out!). I don’t know if any physicists of the 50s or 60s explicitly considered the question of whether there’s any such ansatz that always works—but I bet they would’ve been surprised if you told them that, if there was one, you could also use it to factor integers.

To lay my cards on the table:

Before Bernstein-Vazirani, Simon, or Shor’s algorithms, I probably would’ve given 50% odds for BPP≠BQP.

After Shor’s algorithm, I give it 95% odds.

If someone told me tomorrow that factoring was in BPP, I’d scale back to 75% odds.

Comment #70 May 5th, 2011 at 4:13 pm

Scott, I think this is a question of where each of us came from. I come from physics, and place more trust in the physicist’s abilities, you come from CS and place more trust in the computer scientists. To make the case that people knew about the importance of exponential growth in Hilbert space well before the 70s:

Exact diagonalization dates back I think to the 50s. Early papers were, if I recall correctly, doing ED on maybe 11 sites (Bonner and Fisher on the Heisenberg chain). They clearly knew that the problem grew exponentially that way, as they simulated the biggest thing they could.

At this point I’m just going to point to all those methods you mention, plus much older methods like perturbation theory and density functional theory and say that smart people wouldn’t have put all that effort into developing those methods if they thought that there was a faster way. Now, I feel your reply above is essentially like saying (to pick a specific example): “The development of perturbation theory for quantum systems does not in itself imply any understanding that simulating quantum systems is harder than simulating classical systems, since people also developed perturbation theory for problems like turbulence which can be simulated on a classical computer with a cost that merely grows polynomially with system size rather than exponentially”.

And that last is correct as a statement in itself. But since people were trying brute force methods for quantum as early as the 50s, I feel that they were aware that the simulation difficulty grew exponentially. That is, people tried and abandoned the brute force approach much more readily in the quantum world than they did in the classical world. In fact, in the quantum world it took quite a long time for ED to return to favor….the Bonner and Fisher Heisenberg chain simulations were not repeated for a long time, even when computers did allow much larger simulations, leading to some confusion. So, I think people almost got _too_ scared of the exponential growth. On the classical side, too, of course, computer simulation of turbulence also fell out of favor for a time, but computer simulations of non-turbulent systems were, I think, common since WW II. So, I don’t think that there was as great a fear of classical simulation.

Also, in many cases the approximate methods used for quantum systems were still numerical, like density functional theory, while for a long time the goal of perturbation theory for classical system was to get better understanding by pen-and-paper calculations. So, I think there was a difference in motivation for developing alternate methods: on the quantum side, people were happy to develop alternate, approximate methods that were still numerical, while on the classical side, the alternate methods were mostly preferred because they were non-numerical (one can point to a possible exception in numerical series calculations for classical stat mech, again dating to the 50’s—this is amusing from a complexity standpoint as they were concerned in this case with problems that could be handled by Monte Carlo simulation, i.e., using randomness, and that is sort of intermediate between P and BQP).

So, I’m standing by: if someone like Michael Fisher had been told the meaning of BPP and BQP back in 1955, he’d have put near certainty on them being different, regardless of factoring. Anyway, it’s interesting to get your perspective…I feel like it’s a result of coming from different fields, so that can make me re-evaluate my own judgments also.

Comment #71 May 5th, 2011 at 4:16 pm

Btw—I will definitely agree that all those physicists would have been surprised by the fact that an ansatz that always works allows you to factor integers. That is incredibly surprising. But, my point is that they probably would have felt that they would have had more important problems to solve with an ansatz that always worked (especially since RSA wasn’t around yet)

Comment #72 May 5th, 2011 at 11:31 pm

Vadim Pokotilov:

You don’t have to study a lot of philosophy to have fun debating with philosophers. The field is very buzz-word oriented, and debating issues is kind of like poker:

I see your “Wittgenstein’s glass ship” and I raise you “deterministic skip lists” and “the trichotomy”; and so on. And if you are tired and just want to get them off the porch, pay for the pizza and give him a tip.

Comment #73 May 6th, 2011 at 1:59 am

It is interesting indeed to compare the evidence for exponential speedup proposed in Feynman’s paper and the evidence from Shor’s algorithm. The fact that certain computations in quantum mechanics which are supposed to describe some natural phenomena require exponentially many steps on a digital computer may have several alternative explanations.

One explanation is that the computation can be carried out by a clever polynomial-time algorithm. (Example: computing determinants from the definition is exponential, but it is computationally easy.) In hindsight, although remarkably improved computational methods for these physics computations were found over the years, it seems correct that the type of computations Feynman referred to are computationally hard for classical computers. (Probably there is a lot yet to prove there, even regarding making these computations with a quantum computer, and certainly regarding hardness.)

Another explanation is that for cases where the computations Feynman referred to are computationally intractable (on a classical computer), they fail to describe the natural quantity they are supposed to describe. This, of course, does not mean that QM is wrong, but just that a certain computation based on certain modeling via QM is wrong in some ranges of parameters. This still looks like a reasonable possibility. (It is reasonable even if quantum computers can be constructed, and even more so if it is impossible to build them.)

Shor’s algorithm is a different and more dramatic sort of evidence. The case for exponential speedup compared to classical computers is very strong. The connection to real-life physics is not as close.

Comment #74 May 6th, 2011 at 8:17 am

Hello: I am not a physicist, I am a molecular biologist who is interested in mechanisms controlling mutation, such as somatic hypermutation, where B-cells carry out programmed mutation of immunoglobulin genes in response to stimulation by antigen. Please excuse my ignorance on the subject of quantum mechanics, but I was wondering if there is any theoretical possibility that DNA bases could ever exist in a state of superposition?

After reading ‘The New Yorker’ article, including the description of a two-qubit quantum computer that can find the Queen among four “cards” in only one “try”, I imagined the notion of a B-cell using a similar strategy to “find” the “correct” mutation directing higher affinity of an encoded immunoglobulin to an antigen. Is this an absurd idea?

I have spent the past few days attempting to understand the “four-card-monte experiment” and confess that I can’t conceptualize it: I imagine a conventional computer as a person who turns over individual cards, one at a time, but a quantum computer as a person who “touches” all four cards simultaneously, and somehow, through this process, gleans the information, without having to turn the cards over, and decides correctly without inspection. Again, I apologize for my ignorance; perhaps my post will stand to illustrate flawed popular conceptions of quantum mechanics.

Comment #75 May 6th, 2011 at 11:21 am

Gil, why do you think factoring is hard classically?

Comment #76 May 6th, 2011 at 11:49 am

Greg: DNA certainly has quantum-mechanical aspects; I remember reading about how they were important for Watson and Crick when they worked out the DNA structure. But while I know next to nothing about biochemistry, I doubt that quantum coherence could be maintained in DNA long enough to carry out a “Grover search” even on a 4-element space. (Maybe others can correct me if I’m wrong.)

The main point to understand is that, to search a list of N items, Grover’s algorithm requires about (π/4)√N steps. That’s substantially less than N—which is amazing!—but it still does scale with N, so that (for example) if N=10^1000 you’d need ~10^500 steps. When N=4, it so happens that you can do the search in 1 step, but for larger N you need more steps.

Comment #77 May 6th, 2011 at 3:37 pm

Scott,

“The main point to understand is that, to search a list of N items, Grover’s algorithm requires about (π/4)√N steps. That’s substantially less than N—which is amazing!—but it still does scale with N, so that (for example) if N=10^1000 you’d need ~10^500 steps. When N=4, it so happens that you can do the search in 1 step, but for larger N you need more steps.”

I think I understand this, but how do the number of qubits factor in?

On the assumption that you could precisely (enough) manipulate, say, 1000 “genuine” qubits, what impact? How about 10,000?

I know that’s a pretty big assumption, and I don’t have any idea how to do it myself, but I’m taking the position that I’m allowed to just view it all as an “engineering” problem, since (I believe) the concept is proved, at least in “principle.”

Thanks.
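To make the quoted (π/4)√N behavior concrete, here is a minimal classical simulation of Grover’s algorithm acting on a list with one marked item (an illustrative sketch; the helper name is ours, and real amplitudes suffice since Grover’s iteration keeps the state real):

```python
from math import sqrt, pi, floor

def grover_success_prob(n_items, marked, iterations):
    """Probability of measuring the marked item after Grover iterations."""
    state = [1 / sqrt(n_items)] * n_items       # uniform superposition
    for _ in range(iterations):
        state[marked] = -state[marked]          # oracle: phase-flip the marked item
        mean = sum(state) / n_items
        state = [2 * mean - a for a in state]   # diffusion: reflect about the mean
    return state[marked] ** 2

# The N = 4 "four-card" case from the article: one iteration, certainty.
print(grover_success_prob(4, marked=2, iterations=1))  # → 1.0

# Larger N: about (pi/4)*sqrt(N) iterations are needed to get close to 1.
print(grover_success_prob(1024, 0, floor(pi / 4 * sqrt(1024))))  # close to 1
```

Running more iterations than the (π/4)√N optimum actually overshoots and the success probability falls again, which is one way to see that Grover search is genuinely a rotation in state space rather than “checking cards faster.”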

Comment #78 May 6th, 2011 at 4:02 pm

You’re probably right about Grover on DNA, but quantum superpositions have been observed in large organic molecules (up to 430 atoms):

“Fattening up Schroedinger’s cat”:

http://www.nature.com/news/2011/110405/full/news.2011.210.html

“Quantum interference of large organic molecules”:

http://www.nature.com/ncomms/journal/v2/n4/full/ncomms1263.html

(Note: viruses have also been suggested as possible:

http://arxiv.org/abs/0909.1469 )

Comment #79 May 6th, 2011 at 6:08 pm

Mike: With k qubits, you could use Grover’s algorithm to search a list of size roughly N=2^k, using ~2^(k/2) steps. So if k=1000 (say), then the number of steps to run a Grover search of the appropriate size is the limiting factor, not the number of qubits. On the other hand, with 1000 qubits you could run Shor’s algorithm to factor a decent-sized number, and that would take a reasonable number of steps also.

Comment #80 May 6th, 2011 at 7:51 pm

Thanks Scott. So it really does depend on the exploitable feature of the particular problem. And, I guess that means that engineering isn’t everything. However, if engineering breakthroughs do continue to occur (OK, another assumption, but I think a safe bet at least to some limit), does a Grover search just become, at some point, in some sense, just a brute force calculation?

Comment #81 May 6th, 2011 at 9:02 pm

Scott,

Of course, I meant a brute force “quantum” calculation.

Comment #82 May 6th, 2011 at 11:09 pm

Comment #77:

Usually it is assumed that classical computer technology will improve fast enough to overwhelm any square-root advantage of Grover-class algorithms.

By contrast, according to Wikipedia the largest RSA number factored (late 2009) has 232 decimal digits (768 bits): http://eprint.iacr.org/2010/006.pdf . This took 10^{20} operations in roughly two years (equivalent to using one thousand 2.2GHz single core processors).

Roughly approximating the number of operations to factor a number with d decimal digits as exp(c·d^{1/3}) (fixing the constant using the above reference’s estimate that a 309-decimal-digit number, i.e. 1024 bits, would take about 1000 times longer) gives 660 billion years to factor a number with 617 decimal digits (2048 bits, the largest of the http://en.wikipedia.org/wiki/RSA_numbers) using the same technology, and even with a million times as many processors (a billion 2.2GHz cores), still a few hundred thousand years.

A quantum computer could do that last one with on the order of a few 100k gates in principle, hence on the order of seconds even for microsecond gates. That’s the virtue of exponential speedup that Scott has emphasized.
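The back-of-envelope extrapolation above can be reproduced in a few lines. This is a hedged sketch of the commenter’s model (the constant, the rate, and all names are my own): calibrate exp(c·d^{1/3}) against the quoted RSA-768 data point, then extrapolate to 617 digits.

```python
import math

# Model from the comment above: operations to factor a d-digit number
# scale as ops(d) = A * exp(c * d**(1/3)).
# Calibration: 232 digits (RSA-768) took ~1e20 ops in ~2 years, and a
# 309-digit (1024-bit) number is estimated ~1000x harder.
c = math.log(1000) / (309 ** (1 / 3) - 232 ** (1 / 3))

def ops(d):
    return 1e20 * math.exp(c * (d ** (1 / 3) - 232 ** (1 / 3)))

rate_per_year = 1e20 / 2                 # same hardware as the RSA-768 effort
years_2048 = ops(617) / rate_per_year    # 617 digits = 2048 bits
print(f"~{years_2048:.0e} years")        # hundreds of billions of years
print(f"~{years_2048 / 1e6:.0e} years with a million times more cores")
```

Under these assumptions the extrapolation lands in the same ballpark as the comment’s figures (hundreds of billions of years; hundreds of thousands with a million-fold hardware boost).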

But re the relation between BQP and BPP: from Simon’s problem we know that the quantum computer has an exponential speedup over any classical algorithm. But Scott describes this only as “evidence” that the two complexity classes differ, because it is only relative to Simon’s oracle. For the uninitiated, what does this mean? Why doesn’t such an example show that BQP is bigger than BPP? For our intuition, what more would one need to demonstrate that BQP is bigger? What does it mean to a complexity theorist to say that quantum algorithms like Grover’s and Simon’s (and certainly Shor’s period finding, if not factoring), all provably faster than any classical algorithm, are not proof that the complexity class BQP is bigger than BPP? (I apologize if this has been multiply answered in other threads on this blog, and everyone else has already mastered the role of relativization in complexity theory.)

Comment #83 May 7th, 2011 at 10:02 am

> “For our intuition, what more would one need …”

OK, I found some useful intuition here

http://terrytao.wordpress.com/2009/08/01/pnp-relativisation-and-multiple-choice-exams/

regarding the Baker-Gill-Solovay no-go result for P vs. NP relativization, but would be curious if an expert (“are we ready, Mr. Scott?”) could summarize the analogous situation for BPP vs. BQP.

Comment #84 May 7th, 2011 at 1:20 pm

Slipper.Mystery: So your question is, why doesn’t the oracle separation already yield a proof of BPP≠BQP in the unrelativized world?

The problem is that, once you get rid of the oracle, there might well be clever classical algorithms for solving all the problems that quantum computers can solve (so that BPP would equal BQP). So for example, maybe every explicit function instantiating the Shor or Simon oracles has some additional structure that makes the problem easy to solve classically. We don’t think that’s the case, but no one can rule it out, and doing so is probably harder than proving P≠NP! (Though there’s no formal implication in either direction.)

By analogy, it’s easy to construct an oracle relative to which P≠BPP (i.e., randomized algorithms can do things in polynomial time that deterministic ones can’t). But almost all theoretical computer scientists believe that in the “real,” unrelativized world, P=BPP, since it seems possible to construct extremely good pseudorandom number generators, which are good enough to “derandomize” every BPP algorithm.

So, there are certainly cases (even provable cases, like IP=PSPACE) where two complexity classes are equal, despite the existence of oracles that separate them. If you like, the existence of a separating oracle is merely the “first test” that two classes have to pass, on the way to being unequal! Two complexity classes can be equal for “nonrelativizing reasons” (reasons that don’t hold with an oracle around), as we believe is the case for P and BPP, and know is the case for IP and PSPACE.

We think BPP and BQP will pass all the remaining, harder tests; and that there’s no clever “pseudoquantum generator” (or whatever 🙂 ) to dequantize every unrelativized quantum algorithm. But if you prove that, you also prove P≠PSPACE, and probably collect a Fields Medal.

Comment #85 May 7th, 2011 at 1:25 pm

Mike #80:

“So it really does depend on the exploitable feature of the particular problem.” YES!

“And, I guess that means that engineering isn’t everything.” YES!

“However, if engineering breakthroughs do continue to occur (OK, another assumption, but I think a safe bet at least to some limit), does a Grover search just become, at some point, in some sense, just a brute force calculation?”

I’m not sure I understand what you’re asking. Yes, today we usually do think about Grover’s algorithm as just “quantum brute-force search.” But of course, quantum brute-force is quadratically faster than classical brute-force!

Comment #86 May 7th, 2011 at 2:55 pm

Scott: “So for example, maybe every explicit function instantiating the Shor or Simon oracles has some additional structure that makes the problem easy to solve classically.”

Ok, tks for this. One last naive question for more concrete intuition: a Simon oracle could be to take a period ‘a’ of n random bits, XOR it with n-bit binary numbers to give 2^{n-1} pairs, and map those pairs via a random permutation to (n-1)-bit numbers, with the oracle then a look-up table for this (random) map from {0,1}^n -> {0,1}^{n-1}. What additional structure might this have to make it easy (i.e., O(n) rather than O(2^{n/2}) steps) to find ‘a’ classically, or is the function insufficiently explicit?

“and probably collect a Fields Medal.”

unless one is over 40
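The lookup-table construction sketched in the comment above can be written out concretely. This is a hedged illustration (function and variable names are my own): it builds the table, verifies Simon’s promise f(x) = f(x⊕a), and makes visible why this is only an oracle — the table has 2^n entries, not a polynomial-time description.

```python
import random

# Pick a secret period a, pair each x with x XOR a, and send each pair
# to a distinct (n-1)-bit label via a random bijection.
def simon_oracle_table(n, a, rng=random):
    assert 0 < a < 2 ** n
    reps, seen = [], set()
    for x in range(2 ** n):            # one representative per {x, x^a} pair
        if x not in seen:
            reps.append(x)
            seen.update({x, x ^ a})
    labels = list(range(2 ** (n - 1)))
    rng.shuffle(labels)                # random bijection from pairs to labels
    f = {}
    for x, lab in zip(reps, labels):
        f[x] = f[x ^ a] = lab          # both members of the pair share a label
    return f

f = simon_oracle_table(4, a=0b0110)
print(all(f[x] == f[x ^ 0b0110] for x in range(16)))  # Simon promise holds
```

Classically, finding ‘a’ from such a random table takes birthday-style ~2^{n/2} queries; the catch, per Scott’s reply, is that handing someone this exponential-size table is not a polynomial-time “instantiation.”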

Comment #87 May 7th, 2011 at 3:36 pm

@Scott: Off the topic, but you often mention protein folding as a “naturally occurring NP complete problem”…Are there any good review papers that mathematically express the problem and talk about the heuristics used to solve particular instances of it?

Comment #88 May 7th, 2011 at 4:19 pm

Slipper.Mystery #86: As it happens, we currently don’t know any way to instantiate the Simon oracle to get a hard problem! (And for the Shor oracle, of course we can instantiate with the modular exponentiation function, but not much else…)

The way you proposed doesn’t seem to work, since what’s the “random permutation”? How do you implement it in polynomial time?

To “instantiate” the oracle function f, you need to be able to give someone a polynomial-time algorithm for computing f (with no additional secrets), such that it’s still hard for them to determine the hidden period. And that’s what no one knows how to do in the case of the Simon oracle. (Sure, you can describe a polynomial-time computable function f such that f(x)=f(x+a) for all x, but all the obvious ways to do so will reveal a…)

Comment #89 May 7th, 2011 at 4:22 pm

Pseudonymous #87: I tried

http://www.google.com/search?q=protein+folding+np-complete

and got several papers mathematically expressing the problem (in particular, in the “hydrophobic-hydrophilic” or “HP” model, which are keywords you might want to use).

Comment #90 May 7th, 2011 at 5:39 pm

Matt #75, I did not say that I think factoring is hard classically. (If I had to guess I would guess that indeed it is for various reasons, but there are people who will make the opposite guess.)

Shor’s algorithm is a very dramatic example of exponential speedup between what we can do with classic computers and what we can do with quantum computers. It applies to a general complexity theoretic question (with major practical relevance) which was extensively studied before (not in the context of quantum mechanics).

The computations Feynman talked about in his paper are indeed very interesting. In hindsight, I wouldn’t be surprised if the general form of such computations is BQP-complete (or something like that). (But I am not sure Feynman was explicit enough for us to talk about the “general form of such computations.”) So they are perhaps even “harder” than factoring. But at the time the paper was written I don’t think there were clear insights that this is genuinely computationally difficult. Also (as for protein folding, which was just mentioned), the fact that a general problem of some type is computationally hard does not mean that the instances of the problem we encounter naturally are hard as well.

Now suppose you have a concrete quantum mechanical computation that is feasible for the Hydrogen atom, and with immense effort you can make a similar computation for the Helium atom. Now you want to make the same computation for a large complicated atom. You have a complicated formula but evaluating it requires huge, completely hopeless computations on a classic computer.

It is interesting which scenario is closer to the truth:

1) There are miraculous simplifications which will allow us to carry out precisely these complicated computations for large atoms on classical computers.

2) The computation is indeed infeasible classically, but we will be able to carry it out with quantum computers, and they will give results in great agreement with experiments.

3) The same as 2) but when we run the quantum computer the results which were so good for simple systems will not match what we see in nature.

Comment #91 May 7th, 2011 at 5:44 pm

Scott #88: “To “instantiate” the oracle function f, you need to be able to give someone a polynomial-time algorithm for computing f (with no additional secrets), such that it’s still hard for them to determine the hidden period.”

Ah, relativized illumination slowly dawns, with only two invocations of the complexity oracle (with respect to which many such NP questions are equivalent to P, but only if the black box function underlying this blog’s responses is arbitrarily trustworthy)

Comment #92 May 8th, 2011 at 5:26 am

I think it’s a bit misleading to say that almost no one works on de Broglie-Bohm.

Last year there was a conference in Europe dedicated to dBB alone, which had more people than any other interpretation-related conference ever had.

So it’s safe to say it’s one of the most active fields in interpretation (possibly more active than Everett).

However there is very little SCIFI selling factor over dBB, so it doesn’t really get much pop-sci coverage.

Scott: What is your current position? MWI?

I watched a blog talk by you 1-2 years ago I think, but you didn’t make it clear exactly what your “gut” feeling is.

Comment #93 May 8th, 2011 at 8:09 am

Kvantum: I know there are other people working on deBroglie-Bohm; I just mentioned the three who I’ve heard of (which is probably correlated with: ones in North America, ones who talk to quantum information / quantum foundations people not working on dBB).

I thought I made my “current position” clear in this post!

Sometime in graduate school, I realized that I was less interested in winning philosophical debates than in discovering new phenomena for philosophers to debate about. Why brood over the true meaning of (say) Gödel’s Theorem or the Bell Inequality, when there are probably other such worldview-changing results still to be found, and those results might render the brooding irrelevant anyway? Because of this attitude, I confess to being less interested in whether Many-Worlds is true than in whether it’s scientifically fruitful. As Peter Shor once memorably put it on this blog: why not be a Many-Worlder on Monday, a Bohmian on Tuesday, and a Copenhagenist on Wednesday, if that’s what helps you prove new theorems?

Comment #94 May 8th, 2011 at 12:10 pm

Gil, I’d also call the problem of quantum simulation a problem of major practical importance. In fact, except for RSA and other cryptography applications, none of which existed before the ’70s, I don’t know of any practical importance of factoring, but perhaps you can correct me on that. So, on the grounds of importance, quantum simulation wins hands down… as in, I’d guess 2-3 orders of magnitude more person-hours have been spent working on that problem, by smart people too, such as von Neumann, Dirac, Feynman (I’m not talking about quantum computing here, I’m talking about diagrams, which are a calculational tool), Anderson, White, Wilson, Schwinger, Mott, Bardeen, Schrieffer, Kohn, Luttinger, Pines, etc, etc…

Regarding Feynman’s paper (and Manin’s, and others), we can argue about the generality of the problems considered. My feeling is that, on the one hand, a physicist in the ’50s would have had to be a bit silly not to think that quantum simulation was hard, simply because people were doing quantum simulation then and were seeing the exponential growth in practice! On the other hand, people certainly had not formalized the notion of hardness. To my mind, this formalization of different complexity classes is one of the major contributions of computer science. And considering that even formalizing NP took until the ’70s (long after computer scientists had encountered many NP-hard problems), I won’t claim that physicists would have had some formal understanding of why it is hard. They just would have known it, in the same sense as a pre-Cook-Levin computer scientist would have had some feeling that SAT was a tricky problem (perhaps the physicist would have had a better sense, as they would have been concerned with practical implementation).

I don’t necessarily think even a fast classical algorithm for factoring should dent our belief that BQP and P differ, assuming that the fast algorithm for factoring worked in some way completely unrelated to Shor’s algorithm. After all, it would simply expose some extra structure in integers, but wouldn’t necessarily expose extra structure in quantum mechanics.

Comment #95 May 8th, 2011 at 12:18 pm

I’m going to slightly down-revise my 2-3 order of magnitude effort difference above…I was counting all work on quantum many-body as work towards more efficient quantum simulation (which it is! looking at things like emergent gauge theories in correlated electron systems is simply looking for hidden structure that allows a simpler understanding of those systems in terms of degrees of freedom that might be weakly coupled), but then to be fair I should count all work on number theory as work on factoring (which it also is, to the extent that it is concerned with finding extra structure in integers). Still, I’m going with more than 1 order of magnitude difference in effort.

Comment #96 May 8th, 2011 at 6:33 pm

Scott:

In your profession I can easily understand your thinking, whatever interpretation helps you do science better at the time is obviously the best model to hold in your head at that time.

However I find it hard to believe that you haven’t spent any sleepless nights, or at least hours over some wine glasses, thinking about this at all?

If you are 100% agnostic with no opinion whatsoever, that’s pretty rare, no?

Here is a list over attendees and more info about the dBB conference last year by the way:

http://www.vallico.net/tti/master.html?http://www.vallico.net/tti/deBB_10/conference.html

Comment #97 May 8th, 2011 at 7:34 pm

Rereading your “Quantum Computing Since Democritus Lecture 9”, I noticed a curiosity that perhaps went unnoticed by you: Fermat’s last theorem implies that if you use a norm other than 1 or 2, your qubits can’t have rational amplitudes.

So if you believe that God really loves pure qubits and rational numbers, you have a good reason to use the 2-norm for quantum mechanics!

I would go even further and propose a conjecture: the rationals are dense in the 2-norm unit sphere for vectors of any dimension. And they are not dense in the unit sphere of any other norm > 2.

Unfortunately, I think the second part of the conjecture is false.

Comment #98 May 8th, 2011 at 8:14 pm

Kvantum #96:

“However I find it hard to believe that you haven’t spent any sleepless nights, or atleast hours over some wineglasses thinking about this at all?”

I spent plenty of sleepless nights on the interpretation of QM as a graduate student at Berkeley—but then felt like I’d given enough of my life over to it, and was ready to move on. 🙂

I think Many-Worlds does a better job than its competitors (Bohmian mechanics, the “Copenhagen interpretation” (whatever that is)) at emphasizing the aspect of QM—the exponentiality of Hilbert space—that most deserves emphasizing, at least from a modern quantum computing standpoint. As Steven Weinberg once put it, Many-Worlds is like democracy: “terrible except for the alternatives.”

But my real hope is that we’ll learn something new someday that changes the entire terms of the debate.

Comment #99 May 8th, 2011 at 8:26 pm

“As Steven Weinberg once put it, Many-Worlds is like democracy: ‘terrible except for the alternatives.’

But my real hope is that we’ll learn something new someday that changes the entire terms of the debate.”

Scott,

What might that be? What are the clues you’re most intrigued by?

Comment #100 May 8th, 2011 at 10:34 pm

Maxwell daemon:

By “the rationals are dense in the 2-norm unit sphere for vectors of any dimension. And they are not dense in the unit sphere of any other norm > 2.”, do you refer to Q^n, the subset of R^n consisting of vectors x such that each component of x is rational?

If so, Q^n is very likely to be dense in the unit ball of R^n for any norm you can dream up. This is very easy if n is finite and the norm is a q-norm, for q in [1,inf].

Comment #101 May 8th, 2011 at 11:17 pm

Mike:

“What are the clues you’re most intrigued by?”

I don’t know of any—and if I did, I doubt I’d be able to discuss them in the space of a blog comment! 🙂 Basically, I think we’re in the intellectually-unsatisfying position of David Hume, who knew that the then-prevailing explanations for biology were bad, but didn’t have anything better to replace them with. (In that case, the gap between Dialogues Concerning Natural Religion and Darwin’s Origin of Species was 80 years.)

Comment #102 May 8th, 2011 at 11:30 pm

Not the unit ball, the unit sphere. I’m afraid I was unclear.

Fermat’s last theorem states that $x^p + y^p = z^p$ has no nontrivial solutions for p > 2 and positive integers x,y and z. Divide everything by $z^p$ and you have the statement that $a^p + b^p = 1$ has no nontrivial solutions for p > 2 and positive rationals a and b. In other words $\|(a,b)\|_p = 1$ only has rational solutions if p = 2 or p = 1.

What I’m trying to conjecture is some kind of generalisation of Fermat’s last theorem, to further drive the point that the 2-norm is God’s choice for quantum mechanics, not only for qubits, but qudits of any dimension.

The problem is that a naïve generalisation is false, and even a not-so-naïve one (Euler’s conjecture) is also false, so I think the religious argument only works for qubits.
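The first half of the conjecture is at least easy to illustrate in the n=2 case. This is a hedged sketch (my own function names): stereographic projection sends every rational t to a rational point on the 2-norm unit circle, exactly, in rational arithmetic, and as t ranges over the dense rationals these points fill the circle.

```python
from fractions import Fraction

# t -> ((1 - t^2)/(1 + t^2), 2t/(1 + t^2)) maps rationals to rational
# points on x^2 + y^2 = 1 (the classical Pythagorean-triple parametrization).
def rational_point_on_circle(t):
    d = 1 + t * t
    return (1 - t * t) / d, 2 * t / d

on_circle = True
for num in range(-6, 7):
    x, y = rational_point_on_circle(Fraction(num, 7))
    on_circle &= (x * x + y * y == 1)   # exact equality, no floating point
print(on_circle)  # prints True
```

By contrast, for p > 2 Fermat’s last theorem rules out nontrivial rational points on the p-norm unit circle, which is the observation in the comment above.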

Comment #103 May 8th, 2011 at 11:31 pm

Oh come on, Scott, no LaTeX here?

Comment #104 May 9th, 2011 at 3:02 am

Scott:

I actually disagree, because the Born rule is basically what QM is all about.

It’s the reason we take QM seriously, and “pure MWI” cannot derive it!

Bohm can.

Pure MWI needs to postulate something to get the Born rule, so why not postulate particles? What is more natural than particles?

I feel MWI’ers are somewhat dishonest in their debates when they claim that MWI doesn’t violate Occam’s Razor and that MWI is only “QM with the assumption that the wavefunction is real”, because on this view you won’t get probabilities.

So obviously this bare view is incorrect, and then you have to postulate something, and it’s suddenly not as “simple & attractive” anymore…

You say you hope that there will be some new discoveries; are there any hints that this might occur? I feel like most physicists and philosophers regard QM as “the final theory” and that we have to accept the weirdness.

Comment #105 May 9th, 2011 at 12:37 pm

Kvantum:

Every interpretation of QM is “somewhat dishonest” in the same sense (of talking about QM in a way that sweeps crucial points under the rug): for example, Bohmian mechanics shunts all the complexity of QM into an innocuously-named “guiding wave.”

For me, though, the more serious problem with Bohmian mechanics is that, if you want “determinism” (which was the main original selling point), then that only works for a very specific physical system: particles (with no spin) moving around in a continuous space. As soon as you want to incorporate finite-dimensional observables (such as spin), it’s easy to prove that the hidden-variable values at earlier times can no longer determine the values at later times.

More broadly, you then confront the question: why the particular evolution equation suggested by Bohm, rather than any number of other evolution equations I could also write down that equally well imply the Born rule and are equally compatible with experiment? Sure, in the specific case of particle positions and momenta, there’s one rule that looks kind of neat, but for other observables, I can write down all sorts of “guiding equations” with no clear way ever to decide between them.

Anyway, I don’t deny that hidden-variable theories lead to lots of interesting mathematical questions—in fact, I got interested enough in grad school to write a whole long paper about the computational complexity of hidden-variable theories, which maybe would interest you.

Comment #106 May 9th, 2011 at 1:45 pm

Since we have digressed into the topic of interpretations: there is ONE interpretation of quantum “mechanics” that I have found to be far and away more useful for actually believing quantum mechanics than any of the alternatives: that quantum mechanics, or more precisely quantum probability, is exactly probability theory except with non-commuting random variables. This interpretation has a proven track record of persuading research mathematicians, and of deflating certain distracting questions of quantum philosophy.

The so-called “many worlds” interpretation, if you interpret it as a description of path summation, is also sometimes useful or even very useful for understanding certain mathematical arguments in both classical and quantum probability. However, if it were fundamental, it would for instance be a baffling point that BQP presumably does not contain NP (or even PP), since after all NP is another complexity class that can be defined using “many worlds”. Many science journalists are baffled at exactly this point, which irked Scott enough to make a statement which is still there at the top: “Quantum computers are not known to be able to solve NP-complete problems in polynomial time.”

Comment #107 May 9th, 2011 at 2:41 pm

Greg,

I don’t think thoughtful MWI proponents disagree with Scott’s (modified 🙂 ) statement at the top.

During the recent excitement about the claimed proof that P is not equal to NP, Deutsch made a comment that touches on this point:

“P and NP are complexity classes for computations performed by the universal Turing machine . . . We already know that computers harnessing quantum interference have a very different complexity theory from Turing machines. It so happens that such computers cannot solve NP-complete problems efficiently (though when I last asked, this had not yet been proved with the rigor that mathematicians require) . . .”

I suspect that he simultaneously thinks that the MWI is “fundamental” (though I’m uncertain exactly what you meant by that), accepts that the discovery of new physical phenomena could change this conclusion (and, incidentally, the corresponding complexity analysis), and recognizes that quantum computers are not known to be able to solve NP-complete problems in polynomial time.

Although as often the case, there’s a good chance that my understanding is limited or just plain wrong.

Comment #108 May 9th, 2011 at 4:24 pm

Mike – Certainly thoughtful MWI proponents would agree with Scott’s statement. Scott’s statement is a question of mathematical results, not interpretation. Thoughtful proponents of any interpretation would be careful to obtain the same actual answers as everyone else.

So the question is not that, the question is how hard it is to be thoughtful if you espouse a particular interpretation.

Comment #109 May 9th, 2011 at 4:47 pm

Mike #107:

“It so happens that such computers cannot solve NP-complete problems efficiently (though when I last asked, this had not yet been proved with the rigor that mathematicians require)”

Can you point me to where Deutsch wrote this (unless it was a private discussion)? “When I last asked, this had not yet been proved with the rigor that mathematicians require” is a good candidate for the understatement of the century… 😉 (Note that we’re talking about something strictly harder to prove than P≠NP.)

Comment #110 May 9th, 2011 at 5:03 pm

Scott,

Well, the century is still young — so there may be time for additional understatements 😉

It was in the FoR discussion thread on Yahoo:

http://groups.yahoo.com/group/Fabric-of-Reality/message/18235

I’m sure you can find further grist for the mill scattered among his “informal” comments. 🙂

Greg,

“the question is how hard it is to be thoughtful if you espouse a particular interpretation”

I don’t know — but probably not NP complete 🙂

Comment #111 May 14th, 2011 at 4:52 pm

“Over the years, I’ve developed what I call the Minus-Sign Test”

Thanks for this brilliant nugget! A little light went off in my head when I absorbed this insightful little gem.

Comment #112 May 24th, 2011 at 9:54 am

Looks like Deutsch does know that he is weird… 🙂

I have some reservations about your emphasis on the usefulness of interpretations rather than their truth. It’s only when you believe deeply in certain things that you’ll mentally look for deeper possibilities. Shor’s suggestion seems superficial to me.

Comment #113 July 2nd, 2013 at 11:41 am

Hi @Mike, yes! I also like this point 🙂

“and can be simulated classically with exponential slowdown”