Schrödinger’s cash

Thursday, April 15th, 2010

There’s an article in this week’s New Scientist by Justin Mullins about unforgeable quantum money.  By the standards of “quantum mechanics journalism,” the article is actually really good; I’d encourage you to read it if you want to know what’s going on in this area.  In particular, Mullins correctly emphasizes that the point of studying quantum money is to understand quantum mechanics better, not to mint practical QCash anytime soon (to do the latter, you’d first have to solve the minor problem of the money decohering within microseconds…).

My main quibble is just that I think the article overstates my own role!  In my Complexity’09 paper, the main thing I showed is that secure quantum money that anyone can verify is possible, assuming the counterfeiters only have black-box access to the device for verifying the money.  I also showed that, to get quantum money that anyone can verify, you have to make computational assumptions.  (By contrast, Stephen Wiesner’s scheme from the 1960s, in which only the bank could verify the money, was information-theoretically secure.)  But in terms of coming up with actual candidate quantum money schemes (as well as breaking those schemes!), the other members of the “quantum money club”—Andy Lutomirski, Avinatan Hassidim, David Gosset, Ed Farhi, Peter Shor—have been more active than me.
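(For readers who like to see things concretely: Wiesner's scheme is simple enough to simulate classically, since every qubit is prepared in one of just four states. The sketch below is my own toy illustration, not from any paper: the bank mints a note by preparing each qubit as a random bit in a random basis and keeping a secret record; verification means measuring each qubit in the recorded basis; a counterfeiter who guesses bases gets caught with overwhelming probability.)

```python
import random

def mint(n=64):
    """Bank mints a note: for each qubit, a secret (basis, bit) pair.
    basis 0 = computational {|0>,|1>}, basis 1 = Hadamard {|+>,|->}."""
    secret = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(n)]
    note = list(secret)  # the physical note "carries" these qubit states
    return secret, note

def measure(state, basis):
    """Measure a qubit prepared as (prep_basis, bit) in `basis`.
    A matching basis returns the prepared bit; a mismatched basis gives
    a uniformly random outcome, and the qubit collapses accordingly."""
    prep_basis, bit = state
    if basis == prep_basis:
        return bit, (basis, bit)
    outcome = random.randint(0, 1)
    return outcome, (basis, outcome)

def bank_verifies(secret, note):
    """Bank measures each qubit in the basis it recorded at minting."""
    for i, (basis, bit) in enumerate(secret):
        outcome, note[i] = measure(note[i], basis)
        if outcome != bit:
            return False
    return True

def naive_counterfeit(note):
    """Counterfeiter measures each qubit in a random guessed basis and
    prepares a copy from the outcomes -- wrong basis half the time."""
    copy = []
    for i in range(len(note)):
        basis = random.randint(0, 1)
        outcome, note[i] = measure(note[i], basis)
        copy.append((basis, outcome))
    return copy
```

Per qubit, the naive counterfeiter's copy passes the bank's check with probability 3/4 (certainly if the basis guess was right, and half the time otherwise), so a 64-qubit forgery survives verification with probability (3/4)^64, about 10^-8. The point of the simulation is also its limitation: security here is information-theoretic precisely because only the bank, holding the secret basis record, can verify.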

Two other quibbles:

(1) Mullins writes: “Then last year, Aaronson proposed a new approach that does away with the banknote and concentrates instead on the stream of information that represents quantum cash.”  In Wiesner’s scheme, too, I think it was pretty clear that the “banknote with qubits stuck to it” was just a fun way to tell the story…

(2) The article does a good job of explaining the distinction between information-theoretic and computational security.  But it doesn’t stress that, with the latter, we can’t actually prove that any of the “hard problems” are hard, without also proving P≠NP!  (I’ll admit that the importance of this point is slightly hard to convey in a popular article, possibly because many people, or so I’m told, go about their lives without proving anything.)  The best we can do is show that, if you could solve this problem, then you could also solve this other problem that people have studied for a long time.  But in the case of quantum money, we don’t even know how to do that—which is what we meant when we wrote in our ICS paper that “it seems possible that public key quantum money intrinsically requires a new mathematical leap of faith.”

Considered as research topics in complexity theory, uncloneable quantum money, copy-protected quantum software, and so on are almost as wide-open today as public-key encryption was in the 1970s.  That is, we don’t have a compelling intuition as to whether these tasks are possible at all: all quantum mechanics does is open up the possibility of them, which wasn’t there in the classical world.  Unfortunately, in the case of quantum money, most of the ideas we’ve had for realizing the possibility have turned out to be insecure—often for non-obvious reasons.  Assuming quantum money is possible, we don’t know what the right protocols are, what types of math to base them on, or how to argue for their security.  So if you’re not impressed by the results we have, why don’t you try your hand at this quantum money business?  Maybe you’ll have better luck than we did.

(Addendum: I also have a PowerPoint presentation on quantum money, which ironically goes into more detail than my Complexity paper.)

The Future of Computer Science, and Why Every Other Major Sucks By Comparison

Monday, April 12th, 2010

Does this post finally herald my return to regular blogging after a months-long absence?

I don’t know.  For me, writing a Shtetl-Optimized entry always followed the same process: I’d get an idea and start typing, furiously rocking back and forth in my chair.  Then the voices in my head would pipe up: no, I can’t say that—what will everyone think?—judging from past experience, they’ll probably take offense—I can already see the commenters boiling me alive—maybe if I rephrased it, or, y’know, provided some context—but to explain the real context, I’d need a whole book—and who has the time for that?—better wait till after tenure—meantime, maybe I could blog about something light and uncontroversial instead—but then what’s the point?—we already have one GASARCH—well, I could always put off a decision till later—

Back in the blog’s heyday, I’d win these fights about 40% of the time and the voices would win about 60%.  (In other words: if you’ve ever taken offense at an entry of mine, rest assured that you haven’t even seen the half of my drafts folder.)  But now that I have an actual stake in this shabby world—students to advise and look after, a tenure case to build, conceivably even a family to start—the voices win more like 98% of the time.  And that’s why my blogging fell off.

Occasionally, though, something comes along so uncomplicatedly joyous that I feel no reservations about sharing it with the world.  Such was the case this weekend, when I was somehow called upon to represent MIT’s EECS Department in the annual “Professor Talent Show” at Campus Preview Weekend.  This is an event where six faculty members square off, taking eight minutes each to

(1) explain why their department is the coolest,
(2) crack jokes, and
(3) possibly demonstrate a musical or athletic talent.

Then, using electronic clickers, the several hundred prefrosh in attendance voted for which major carried the day.  Though I had absolutely no talent of any kind to demonstrate, and was up against a banjo player, a violinist, and a basketball-spinner, among other tough competitors, for some reason EECS won!  You can see my PowerPoint slides here:

The Future of Computer Science, and Why Every Other Major Sucks By Comparison

(You can read the jokes that go along with each slide in the slide notes at the bottom.)

Update (4/15): I hadn’t realized at all that there’s actually a video of me giving the talk!  (Click on “Part 2.”)