Der Quantencomputer

Those of you who read German (I don’t) might enjoy a joint interview of me and Seth Lloyd about quantum computing, which was conducted in Seth’s office by the journalist Christian Meier, and published in the Swiss newspaper Neue Zürcher Zeitung.  Even if you don’t read German, you can just feed the interview into Google Translate, like I did.  While the interview covers ground that will be forehead-bangingly familiar to regular readers of this blog, I’m happy with how it turned out; even the slightly-garbled Google Translate output is much better than most quantum computing articles in the English-language press.  (And while Christian hoped to provoke spirited debate between me and Seth by interviewing us together, we surprised ourselves by finding very little that we actually disagreed about.)  I noticed only one error, when I’m quoted talking about “the discovery of the transistor in the 1960s.”  I might have said something about the widespread commercialization of transistors (and integrated circuits) in the 1960s, but I know full well that the transistor was invented at Bell Labs in 1947.

64 Responses to “Der Quantencomputer”

  1. Evan Berkowitz Says:

    I found the translation’s “It took years before David and Richard Josza German first presented a quantum algorithm…” quite hard to parse, until I realized what was going on…

  2. Scott Says:

    Evan #1: I still don’t fully understand how that happened! I know that “Deutsch” became “German,” but why did it get dissociated from David and attached to Richard Jozsa instead?

  3. Shmi Nux Says:

    Just a guess: once Google Translate decided that Deutsch is a nationality and not a name, despite not being grammatically correct German, it must have removed the qualifier, leaving “David und Richard Josza”. It then translated the qualifier and decided to stick it back in, but kept the original word order, instead of translating it as “the Germans David and Richard Josza”.

    Also, strangely, Google Translate is confused when it encounters a period not followed by whitespace and assumes it is some technical jargon, like object.element, and leaves the fragment untranslated, making the exact opposite mistake.

  4. Michael P Says:

    Bell Labs in 1947? Why not Lilienfeld in 1925-ish?

  5. Itai Says:

    Google Translate can produce many funny translations which are totally messed up.
    In Hebrew–English it is easier to find funny ones, like:
    הפיל הכבד הולך לאכול
    which means “the heavy elephant is going to eat,”
    but which translates into
    “Elephant going to eat liver.”

    The main cause is that כבד has two meanings in Hebrew: “heavy” and “liver.”
    The second cause is that Google was confused between tense and verb in Hebrew.

  6. Alex Says:

    Just nitpicking: the transistor thing is unfortunate, but not outright wrong.

    It should translate as:

    “Moore’s law took off after the invention of the transistor in the ’60s”

    which is ambiguous in German and could be parsed as either

    “Moore’s law took off [after the invention of the transistor] in the ’60s” (correct)

    or

    “Moore’s law took off after [the invention of the transistor in the ’60s]” (wrong)

    On a slightly related note, I had a hard time parsing “Die Quantencomputer,” which translates to “The quantum computers” (plural). Was this intentional? Not knowing what was coming, I mentally discarded that parsing and chose

    “die, quantum computers!” (as in “all quantum computers must die!”) as my second guess.

    Quite fun actually.

  7. Scott Says:

    Alex #6:

      On a slightly related note I had a hard time parsing “Die Quantencomputer” which translates to “The quantum computers” (plural). Was this intentional?

    Not intentional, and fixed! Thanks. 🙂

  8. Shmi Nux Says:

    > If we were to discover that there is a deep reason why quantum computers are impossible, then that would ironically be the most exciting result. Because we would then have to rewrite the physics books in order to take account of the new understanding.

    Scott, how can a quantum computer used to simulate quantum systems be impossible? After all, it’s just another quantum system, only somewhat more controlled. What might be the distinction between the simulator and the system itself that makes a quantum simulation impossible?

  9. a Says:

    Did your work ever involve work Grothendieck

  10. Scott Says:

    Shmi #8: You should probably be asking the QC skeptics that question, rather than me. 😉 But I assume they would answer something like: “the quantum systems that can occur in Nature are all extremely special, in ways that probably allow us to simulate them efficiently using a classical computer. So yes, you can presumably build a computer that simulates those systems, but what you had built wouldn’t deserve the title ‘quantum computer.’ What we don’t think you can build is a machine that implements a completely arbitrary sequence of 1- and 2-qubit unitary transformations, in order (for example) to efficiently factor a 10,000-digit number using Shor’s algorithm. And of course, consistent with our prediction, such machines are nowhere found in Nature.”

  11. kodlu Says:

    This *must be* proof that Google Translate has obviously started running a quantum algorithm (which one, I wonder? Nudge, nudge, wink, wink :-)) and, due to the inherent randomness and parallelism in quantum theory, we must be in *the* one universe (out of 2^n parallel universes) where this happened.

  12. Scott Says:

    a #9: No, Grothendieck worked at a famously stratospheric level of abstraction and generality, and probably would’ve turned up his nose at the “grubbily concrete” problems that interest us in theoretical computer science. In a previous thread, I mentioned that I don’t even understand Grothendieck’s work well enough to admire him directly—I can only “eigenadmire” him, in the sense that so many of the mathematicians who I do know enough to admire, speak of Grothendieck only as the pious speak of God. Among the staggering number of things that he introduced to mathematics before he retired to become a misanthropic goat-herder, I understand Grothendieck’s constant in operator theory (which has been used in quantum information theory and theoretical computer science, though not by me), and at one point, I more-or-less understood the notion of a Grothendieck universe in set theory. And I also know the story of the “Grothendieck prime”, 57—the fact that 57 isn’t prime illustrating Grothendieck’s complete disregard for concrete examples, and his lack of need for them. I also know enough to know that what I do know doesn’t scratch the corners of the edges of the surface of what he did to become so revered.

  13. John Sidles Says:

    Scott [#10] summarizes QC skepticism as  “something like, the quantum systems that can occur in Nature are all extremely special, in ways that probably allow us to simulate them efficiently using a classical computer.”

    Here “the friend speaks my mind” as the Quakers say … that was well-expressed, Scott!

    Scott speaks of Grothendieck as having “become a misanthropic goat-herder.”

    This widely held yet exceedingly regrettable view originates largely in Grothendieck’s irascible focus upon seemingly bizarre questions like “what is a meter?”

    In this regard, an erratum (apparently unpublished?) by mathematician Juan Antonio Navarro Gonzalez, addressed to the Notices of the AMS and available on Gonzalez’ web-site, is commended to the attention of Shtetl Optimized readers:

    Since 1990, I haven’t heard from him [Grothendieck], except for some comments included in Allyn Jackson’s article Comme Appele du Neant [in two parts, Notices AMS, vol 51(9-10). 2004].

    Finally, let me say a few words about the disarmingly simple question “what is a metre?”.

    It sounds infantile, since the metre is a mere convention taught at school. But we should not underestimate a question raised by a man who has given so tremendous answers to the question “what is a space?”, which also seemed to be well known.

    The metre is an arbitrary length unit and, given the reluctance of Grothendieck to any artificial concept, it is plausible to assume that he is asking for the possibility of natural units.

    And this is not a ridiculous question.

    For the past year and more, a Gonzalez-style view of Grothendieck’s question “what is a meter?” has guided our group’s research in quantum simulation, along lines that are briefly summarized in Dick Lipton’s/Ken Regan’s weblog as the micro-essay Grothendieck encourages us to “Paint like Titian”

    One intended point is that “only the details are missing” — to complete the acerbic postcard-comment by Pauli — equally on both sides of the quantum computing/simulation debate!

    A great virtue of Grothendieck’s work (as it seems to me) is that his work concretely facilitates our 21st century’s pursuit of Averroes’ Search

    Averroes dejo la pluma. Se dijo (sin demasiada fe) que suele estar muy cerca lo que buscamos.

    Averroes put down his pen. He told himself (without excessive faith) that what we seek is often nearby.

    Summary  Grothendieck’s mathematical worldview encourages us to regard [Harrow-style] Hilbert dynamics as the natural resolution of [Kalai-style] varietal dynamics. And this resolution’s realization is not some far-future postulate, but an eminently practical worldview for which transformational engineering applications are near at hand.

    Conclusion  The much-debated question “Who’s right about the feasibility of quantum computing?” is among the least interesting members of the large set of questions that Grothendieck’s work encourages us to ask about quantum computing and quantum simulation.

  14. John Sidles Says:

    Darn! Forgot my resolution that for the remainder of 2014, my every comment on Shtetl Optimized would include an explicit falsifiable prediction.

    So here’s one (that arises out of Grothendieck-directed research):

    Postulate  The permanents of Lund-style scattershot boson-sampling distributions are log-normally distributed in the asymptotic limit \(n_{\text{photon}} \to \infty\).


    @article{Lund2014,
      author  = {Lund, A. P. and Laing, A. and Rahimi-Keshari, S. and Rudolph, T. and O'Brien, J. L. and Ralph, T. C.},
      journal = {Phys. Rev. Lett.},
      month   = {Sep},
      pages   = {100502},
      title   = {Boson Sampling from a Gaussian State},
      volume  = {113},
      year    = {2014}
    }

  15. a Says:

    I am a comp sci person too… since you got tenure, you could dabble in a bit of abstract nonsense like étale cohomology without peer pressure…. I think Mulmuley has a program that encompasses far-reaching conjectures generalizing the Weil conjectures…. you never know, you may find that useful in QC, and then all the AG flock will come searching for abstract problems here :)

  16. Me Says:

    I think the Google translation is pretty good, as one can understand everything. However, I tried my own translation in case something is unclear in the Google Translate version. As I didn’t want to post content from the Neue Zürcher Zeitung somewhere else, I posted it as a comment below the article.

  17. Thomas Says:

    from Google translate:

    “Peter Shor quantum computer makes the legs”

    …thank goodness, when all goes wrong, there is always Google Translate to cheer one up : )

  18. Saul Says:

    a#9 & Scott#10: Over a year ago Terry Tao Googleplused:

    “Grothendieck’s inequality asserts that a certain discrete optimisation problem (optimising a bilinear form over inputs that are +1 or -1) is equivalent up to constants to a continuous optimisation problem (optimising the same bilinear form over unit vectors in a Hilbert space). This is important for theoretical computer science, because the former problem essentially contains NP-hard problems such as MAX-CUT, whereas the latter can be rephrased as a semidefinite program (by writing the problem in terms of the Gram matrix of the unit vectors) and can be solved in polynomial time by algorithms such as the ellipsoid method. So there are certain NP-hard problems which one can solve “up to constants” in polynomial time: in some sense, the “ratio” between NP and P is bounded! Furthermore, in a certain technical sense, one cannot achieve a better approximation to such problems by any polynomial-time algorithm than through the Grothendieck inequality, at least if one assumes the Unique Games Conjecture (a result of Raghavendra and Steurer).”
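    To make the discrete side of this concrete, here is a brute-force sketch (an illustration only, with a made-up example matrix) of the ±1 bilinear optimisation Tao describes. The Grothendieck inequality guarantees that the unit-vector (semidefinite) relaxation of the same problem exceeds this discrete optimum by at most the constant K_G, known to be less than 1.783.

```python
from itertools import product

def discrete_bilinear_max(A):
    """Brute-force max of sum_{i,j} A[i][j]*x[i]*y[j] over x, y in {-1,+1}^n.

    Exponential in n -- exactly the hardness that the semidefinite
    (unit-vector) relaxation sidesteps, at the cost of a factor K_G.
    """
    n = len(A)
    best = float("-inf")
    for x in product((-1, 1), repeat=n):
        # For fixed x, the best y[j] is the sign of sum_i A[i][j]*x[i],
        # so the inner maximisation collapses to a sum of absolute values.
        value = sum(abs(sum(A[i][j] * x[i] for i in range(n)))
                    for j in range(n))
        best = max(best, value)
    return best

A = [[1, -2, 3],
     [0,  4, -1],
     [2, -3,  1]]
print(discrete_bilinear_max(A))  # 17, attained at x = (1, -1, 1)
```

    The semidefinite side would replace the ±1 entries by unit vectors and optimise the same form over their Gram matrix, solvable in polynomial time with an SDP solver.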

    more on this:

  19. Scott Says:

    Saul #18: Yeah, I know that story. That’s what I was referring to when I wrote: “…I understand Grothendieck’s constant in operator theory (which has been used in quantum information theory and theoretical computer science, though not by me)”

  20. Scott Says:

    a #15:

      since you got tenure, you could dabble on a few abstract non sense like etale cohomology without peer pressure…

    But I don’t work on concrete problems because of peer pressure—I do it because that’s what I like! You might say: the style of math that I’ve always enjoyed the most draws from the problem-solving tradition of Erdös, and also from the philosophical tradition of Gödel and the “practical” tradition of Turing, Shannon, and von Neumann, but not at all from the abstractifying tradition of Grothendieck or the Bourbaki school. I guess I’ll delve into the latter kind of math when, and only when, I see firsthand its indispensability for the former kinds. That hasn’t happened yet, but you’re absolutely right that Mulmuley’s GCT is a likely culprit for how it might happen in the future.

  21. Joshua Zelinsky Says:

    By the way, have you seen (copy of the paper )? I’m wondering: given this, is there some generalized version of BosonSampling with a small amount of interaction that makes sense, using something like this? More generally, does it make sense to have a computational model using photons that have some limited amount of interaction? Maybe this gives rise to a general quantum computer, but my guess is that a reasonable version of this would not.

  22. Scott Says:

    Joshua #21: Sorry, I haven’t read that paper and can’t comment on its details. But yes, a lot of physicists have been asking me recently about BosonSampling with a limited amount of interaction between the bosons. The obvious thing to say is that, if you have controllable interactions, then maybe you should just skip BosonSampling, and go straight for building a universal quantum computer! Part of the point of BosonSampling was to do something interesting even if your bosons are completely non-interacting. On the other hand, if you do want to do BosonSampling, it’s possible that interactions would ironically make life harder for you, by making your scattering amplitudes some messy and complicated expressions, rather than just matrix permanents. Unfortunately, I don’t have that much more to say right now—saying more would probably depend on knowing what the scattering amplitudes do look like in some particular system of interacting bosons.
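    For concreteness: the scattering amplitudes of non-interacting bosons are (up to normalization) permanents of submatrices of the interferometer’s unitary, and the fastest known general method for evaluating them is Ryser’s formula. A minimal sketch (an illustration, not tied to any particular experiment):

```python
def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i (sum_{j in S} A[i][j]),
    costing O(2^n * n^2) arithmetic operations instead of the
    naive n! sum over permutations.
    """
    n = len(A)
    total = 0
    for mask in range(1, 1 << n):  # nonempty column subsets as bitmasks
        cols = [j for j in range(n) if mask >> j & 1]
        prod = 1
        for i in range(n):
            prod *= sum(A[i][j] for j in cols)
        total += (-1) ** len(cols) * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```

    (The empty subset is safely skipped: for n ≥ 1 each row sum over zero columns vanishes, so it would contribute nothing.) With interactions present, the amplitudes would no longer be permanents at all, which is the complication mentioned above.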

  23. wolfgang Says:

    The NZZ is indeed one of the better German language newspapers.

    At one point you are quoted as saying:
    “Ein Quantencomputer hingegen wuerde solche Aufgaben im Schlaf erledigen …”
    which translates to
    “A quantum computer on the other hand would solve such tasks in its sleep …”

    Did you actually say this (or something similar)?
    I think the idea of sleeping quantum computers, solving problems while dreaming is somehow cute …

  24. Abel Says:

    before he retired to become a misanthropic goat-herder

    Heh, an unexpected place to find a lack of appreciation for the more philosophical/political concerns Grothendieck occupied himself with after the mathematical period, guess it might be the style in which he did so, or a lack of sense of humor in the reader :)…

  25. Jr Says:

    Impressive translation by Google (did Google translate pick up something when Scott visited?) and good article.

  26. Scott Says:

    wolfgang #23: Yes, I do like to talk about quantum simulation as something “quantum computers can do in their sleep,” in order to clearly differentiate it in laypeople’s minds from problems like factoring, for which “we had no right to expect a-priori” that a fast quantum algorithm would exist, and the fact that one does is a sort of happy accident.

  27. Stam Nicolis Says:

    There is a lot that’s known about which computations cannot be done more efficiently by a quantum computer than by a classical computer, and that’s work done at MIT, too, e.g.

  28. John Sidles Says:

    Abel remarks “Heh, [Shtetl Optimized] an unexpected place to find a lack of appreciation for the more philosophical/political concerns [that] Grothendieck occupied himself with after the mathematical period”

    Grothendieck’s later concerns substantially overlap with the themes of the film Interstellar (reviewed earlier this week here on Shtetl Optimized) … themes that include math, science, technology, integrity, community, narrative, art, and secrecy (among many).

    In particular, the film-maker Hervé Nisic’s web-site provides an illuminating collection of Grothendieck’s Survivre et Vivre essays from the early 1970s; this collection was compiled for Nisic’s documentary on Grothendieck, L’espace d’un homme (2013).

    To read Grothendieck’s essays in English, try pointing Google Translate to the on-line PDF document

    See in particular the essays on pages 3-4, Comment je suis devenu militant (How I became an activist) and pages 5-6, Science et société.

    Falsifiable prediction  This month’s Alan Turing thriller The Imitation Game — a film that I haven’t yet seen, and definitely intend to! — will join Interstellar, L’espace d’un homme, and Birdman in illuminating Grothendieck’s Survivre et Vivre STEAM-themes of math, science, technology, integrity, community, narrative, art, and secrecy.

    Remedial readings  A remedy for STEAM students who regard Grothendieck’s actions and sanity as dubious is Yannick Grannec’s The Goddess of Small Victories (2014). A remedy for STEAM students who experience Grothendieck’s Survivre et Vivre essays as depressing is Kathleen Ann Goonan’s story Girl in Wave: Wave in Girl, which is collected in Neal Stephenson’s Hieroglyph: Stories and Visions for a Better Future (2014).

  29. Scott Says:

    Stam #27: Where did I ever say in the interview that nothing was known about the limitations of QCs? Besides the particular paper that you linked to, there’s, err, most of my and my friends’ research over the last 15 years! On the other hand, how much speedup quantum computers can give in principle for NP-complete problems is something where we genuinely don’t know nearly as much as we’d like.

  30. Scott Says:

    Incidentally, for those who really want to know how and why Grothendieck left mathematics to become a misanthropic recluse, here’s the single best article I’ve found, by Winfried Scharlau. Very appropriately, it neither celebrates nor condemns Grothendieck’s choices, just tries the best it can to understand them.

  31. Stam Nicolis Says:

    Scott#29: “Herr Aaronson, wo kann aus Ihrer Sicht der Quantencomputer mit Sicherheit punkten?

    Aaronson: Mit Sicherheit werden Quantensimulatoren eine exponentielle Beschleunigung bringen.”

    Here you state, in answer to the question of where, from your point of view, a quantum computer can surely score (against a classical computer, I guess, is implied), that quantum simulators will surely realize an exponential speedup. The work of Farhi et al., for instance, shows that this isn’t the case for a certain class of problems. The classification seems considerably more subtle, as you know, naturally. Of course, I quoted the paper I knew as an example; I didn’t claim it exhausted the work on the subject at MIT!

  32. Scott Says:

    Stam #31: I appreciate your trying to educate me about this subject, really I do. 😉 But I don’t see anything wrong with what I said (other than my implicit assumption that PromiseBPP≠PromiseBQP, which is a very different issue than what you’re talking about). For quantum computers to give an exponential speedup for quantum simulation, it suffices that for every classical algorithm, there’s some family of input instances where the quantum simulator is exponentially faster than that algorithm. And for the speedup to be important in practice, it suffices that such instances are common in practice. The fact that there exist problems for which QCs don’t give you much speedup is something that Seth and I both stressed elsewhere in the interview (and, in portions of the interview that were cut, I discussed the provable lack of speedup within the quantum black-box model, of which the result about PARITY is one example). But even if we hadn’t stressed it, it has no direct relevance to the near-certainty that QCs do give an exponential speedup for quantum simulation.

  33. Apteris Says:

    I was very impressed to see the caption beneath the picture of Professors Lloyd and Aaronson say “Scott Aaronson (links) und Seth Lloyd, zwei Koryphäen des Quantencomputing.”, i.e. “Scott Aaronson (left) and Seth Lloyd, two coryphaei of quantum computing.” How often do you see “coryphaei” used in modern parlance? Far too seldom.

  34. Scott Says:

    Michael P #4:

      Bell Labs in 1947? Why not Lilienfeld in 1925-ish?

    Sorry for the delay in replying! That’s a very interesting story, which I remember reading about in Joel Shurkin’s biography of Shockley. My understanding is that Lilienfeld developed much of the theory of solid-state transistors in the 1920s, but never actually built one? And actually building them brought to light unexpected phenomena that the theory then needed to be revised to account for, e.g. involving the electrons congregating on the transistor’s outer walls? Even so, Lilienfeld’s prior work is what prevented Bell Labs from getting a patent on the transistor, which was probably extremely important for future innovation. (If any part of that is mistaken, let me know.)

  35. quax Says:

    Apteris #33: I don’t mean to spoil your fun, but as a native speaker I can assure you that “Koryphäen” is much more commonly used in German than “coryphaei” in English.

    And you don’t have to take my word for it: you can quickly verify for yourself that you get 148,000 results for “Koryphäen” versus only 2,370 results for “coryphaei”.

  36. Raoul Ohio Says:

    In view of the mathematics created by Grothendieck, it is worthwhile to give some thought to “did he just go crazy, or what?” I can take a guess, based mostly on the Scharlau article.

    It appears he might have thought along the lines of “how can you do math when civilization is not likely to last much longer”. A good question that many of us probably occasionally think about. The highly condensed RO view on this topic is that (1) it is hard to predict what will really matter in the long run, (2) it is not likely anyone can do anything about it anyway, (3) sooner or later, the radius of the sun will exceed 1 AU anyway, and (4) we can develop some good science in the meanwhile, which is more satisfying.

    Those who decide to save the world find it disconcerting that most people don’t care. Thus they tend to become shrill. From Scharlau: “But in closer interaction, a deep personality disorder is clearly visible. This is not the place to go further into this subject.” As I understand it, “personality disorder” is usually a euphemism for what in the US is called “an a$$hole”. A reasonable guess is that when no mathematicians lined up behind him, he thought “screw this, I can go to a commune where people understand me”. Having high energy and a high-powered brain, he continued to write reams of stuff. There does not appear to be any consensus about whether it was just yammering or he actually had something to say.

    BTW, the article in “The Forward” has a photo in “handsome wild dude” mode, rather than the “crazy old hippie” pictures you usually see.

  37. Scott Says:

    Incidentally, John Sidles #13:

      The metre is an arbitrary length unit and, given the reluctance of Grothendieck to any artificial concept, it is plausible to assume that he is asking for the possibility of natural units.

    In that case, would Grothendieck have accepted the Planck length as an answer to his question? Or would that have involved an illegitimate intrusion of actual facts about physics?

  38. Scott Says:

    Raoul #36: Thanks for the useful insights, with which I largely agree! But I would add one more: if you do decide to contribute to the saving of civilization (e.g., from environmental catastrophe), it seems to me that you’re much more likely to be effective at it if you remain a part of civilization, and more-or-less follow its norms, than if you become a misanthropic recluse in the Pyrenees. Admittedly, you still probably won’t be that effective—when was the last time the US Congress did anything differently because of a petition signed by Nobel Laureates and members of the National Academy of Sciences?—but it’s a question of ε versus ε2.

  39. John Sidles Says:

    Scott, your remarks [#37] are worthy of response, and numerous references exist that are written by Grothendieck’s physicist colleagues … alas comments that cite these references disappear into /dev/null.

  40. John Sidles Says:

    Citations to work by Grothendieck’s IHES physicist-colleague David Ruelle, relating to the question “What is a metre?” and its generalizations, now appear as a comment in Ken Regan’s and Dick Lipton’s outstanding encomium Alexander Grothendieck 1928–2014

    For some reason (that’s not evident to me), the Shtetl Optimized comment-filter rejects these Ruelle citations … or maybe it’s the Saharon Shelah citations that are problematic?

  41. Scott Says:

    John #40: I’d be more motivated to search for the comments of yours caught by my spam-filter (in addition to the thousands of comments that aren’t caught), if you showed more willingness to respond on point to what other people said. Do you understand that none of the David Ruelle quotes you offered actually bear at all on my question, let alone providing what you call a “concrete answer”? (I.e., would Grothendieck have accepted the Planck length as a natural unit of length, or wouldn’t he?) The quotes don’t even mention the Planck length, or anything related to it.

    The fact that my question might have reminded you of these quotes, doesn’t mean that the quotes address the question as far as anyone else can make out. If you can just grasp this one distinction, your comments will become orders of magnitude more comprehensible.

  42. John Sidles Says:

    Direct question  “Would Grothendieck have accepted the Planck length as a natural unit of length, or wouldn’t he?”

    Direct answer  “Probably not.”

    Followup question  Isn’t the answer “probably not” so minimally informative as to suggest that better questions could be posed? So why not pose them?


    Summary  Ruelle’s work — and especially his first-hand physics-centric perspective upon Grothendieck — poses rich mathematical questions that can be fruitfully considered … by young researchers especially.


    Falsifiable prediction  [ see Gödel’s Lost Letter ]

  43. Scott Says:

    OK then, here’s a next step: you could explain the reasons why Grothendieck (or anyone else) should or would reject the Planck length, as a deep answer to the question of what’s a natural, non-arbitrary length scale. That would be more informative, while still bearing on the question.

  44. John Sidles Says:

    Scott, your question [#43] is worthy of a response, but not today and not by me. The reason is, today is our thirty-fifth wedding anniversary!

    Falsifiable prediction  My spouse would be justifiably displeased were I to write-out a Grothendieck-oriented literature survey on mathematically natural descriptions of quantum measurement theory.

    Thank you sincerely, though, for the sustained generosity of Shtetl Optimized in hosting views that are sometimes not in sympathy with your own.

  45. Apteris Says:

    quax #35, indeed. No fun spoiled, I was simply glad to see an “advanced” word being used in what was already a high-quality article.

  46. Confused Says:

    Hi Professor Aaronson,

    I was wondering if you would be interested in commenting on:

    I’m not sure how to interpret the abstract. The author claims that memcomputing is a new paradigm that allows for solving NP complete problems in polynomial time in a manner similar to the human brain… But as far as I’m aware, and from what you’ve written previously on this site, there is no evidence that biology, including the human brain, is capable of this feat.

    Abstract—We introduce the notion of universal memcomputing machines (UMMs): a class of brain-inspired general-purpose computing machines based on systems with memory, whereby processing and storing of information occur on the same physical location. We analytically prove that the memory properties of UMMs endow them with universal computing power—they are Turing-complete—, intrinsic parallelism, functional polymorphism, and information overhead, namely their collective states can support exponential data compression directly in memory. We also demonstrate that a UMM has the same computational power as a non-deterministic Turing machine, namely it can solve NP-complete problems in polynomial time. However, by virtue of its information overhead, a UMM needs only an amount of memory cells (memprocessors) that grows polynomially with the problem size. As an example we provide the polynomial-time solution of the subset-sum problem and a simple hardware implementation of the same. Even though these results do not prove the statement NP=P within the Turing paradigm, the practical realization of these UMMs would represent a paradigm shift from present von Neumann architectures, bringing us closer to brain-like neural computation.
  47. Ben Standeven Says:

    Confused @46:
    Their machine is only brain-like in that the computing elements have memory; it’s not put together like a brain, and the computing elements don’t necessarily work like neurons.

    I didn’t read the article very thoroughly, but there are lots of different techniques for using analog computation to easily do things that are difficult or impossible for a digital computer – provided you ignore pesky details about precision, stability, dynamic range… This paper seems to be doing the same thing.

  48. Scott Says:

    Confused #46: Man oh man was that a cringe-inducing paper. To show you how I read such a paper, let me highlight a few crucial sentences and my reactions to them:

      In particular, we consider the NP-complete subset sum problem, and using two examples we show how a UMM can solve it in polynomial time using a number of memprocessors that either grows exponentially or, by exploiting the information overhead, uses a linear number of memprocessors.

    Solving an NP-complete problem in polynomial time using an exponential number of parallel processors is trivial: anyone can do that! So, the only remaining question is by what sleight-of-hand they’re going to claim that they can get the number of processors down to linear. I wonder if it will have something to do with arithmetic precision?

      In the present work, we employ an idealization of a DCRAM in the sense that we do not specify the actual electronic devices capable of reproducing exactly the machine we are going to present. Nevertheless, this does not exclude the possibility that such a machine can be realized in practice.

    Uh-oh! Trouble ahead.

      This implies that the function g(x) contains information on all sums of all possible sub-sets of G.

    Aha! Just as I feared, the exponential blowup in resources is being hidden, by shoehorning the parallel processing needed into numbers that take exponentially many bits just to write down. Oldest trick in the book; people observed similar things in the 60s and 70s.
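    To spell out the trick (a minimal sketch of my own, not the paper's actual construction): all subset sums can be packed into a single number in "polynomially many steps," with the exponential cost hiding in that number's length.

```python
def subset_sums_as_bigint(values):
    """Pack every achievable subset sum of `values` into one integer:
    bit s is set iff some subset of `values` sums to s."""
    acc = 1  # bit 0 set: the empty subset sums to 0
    for a in values:
        # One "step" per item -- but the integer being shifted can need
        # on the order of 2^(bits per value) bits just to write down.
        acc |= acc << a
    return acc

def has_subset_with_sum(values, target):
    return bool((subset_sums_as_bigint(values) >> target) & 1)

# With n values of b bits each, the accumulator grows to roughly
# n * 2^b bits -- so counting "steps" hides an exponential blowup.
```

    Only n loop iterations, yet no genuine speedup: the model is only efficient if arithmetic on exponentially long integers is counted as free.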

      However, we point out that p is not bounded by n and it depends on N, which scales exponentially with the number of bits used to represent fmax.

    Ah, so the authors do understand the problem. But why did they wait until page 11 to tell us this? Why didn’t they say so on page 1, so people could make an informed decision about whether to read further? Do they not realize that this kills the advertised idea that implementing “memprocessors” in hardware could give any real advantage (as opposed to an advantage in a counterfactual theoretical model, one where arithmetic on exponentially-large integers is free)?

    The whole point of my 2005 essay NP-complete Problems and Physical Reality was that, if people read it and internalized its message, I thought it could serve as a prophylactic against exactly this sort of paper. Unfortunately, it failed at that goal.

  49. Douglas Knight Says:

    When you say that the paper failed, which step do you think failed? Do people fail to read it, fail to understand it, or fail to internalize the message?

  50. Scott Says:

    Douglas #49: Good question! In this case, I assume (hope?) that the failure was at step 1. There have probably also been failures at steps 2 and 3, but I don’t remember any example.

  51. Fabio Says:

    Dear Scott, from your comments it is clear that there is a misunderstanding about the differences between our memcomputing paradigm and the Turing paradigm. I urge you to read carefully both our theoretical work arXiv:1405.0931 and the paper arXiv:1411.4798, where we actually built a machine that solves an NP-complete problem in polynomial time (actually in 1 step) using polynomial resources.

    If, after reading those papers, you still have issues with our statements, please write a comment or article in a refereed journal, where these scientific matters should be settled.
    Fabio Traversa

  52. Confused Says:

    Thanks very much for your response, Professor Aaronson. I was fairly certain that NP-complete problems hadn’t just succumbed to a brilliant new architecture (paradigm, memparadigm, universal memparadigm, etc.).

  53. John Sidles Says:

    Scott wonders  “Why Grothendieck (or anyone else) should or would reject the Planck length, as a deep answer to the question of what’s a natural, non-arbitrary length scale.”

    Scott, let’s begin by noticing that your question “What’s a natural, non-arbitrary length scale?” is a restriction of Grothendieck’s broader question “What is a metre?”. Such restrictions are problematic because, as your own essays here on Shtetl Optimized have noted:

    Scott notes  “The words out of [a STEM-researcher’s] mouth will literally be ignored, misheard, or even contorted to the opposite of what they mean, if that’s what it takes to preserve the listener’s misconception.”

    Summary  For STEAM students especially, it’s more educational (and more transgressive, and therefore a whole lot more fun!) to embrace Grothendieck’s question “What is a metre?” à nu than to embrace Bowdlerized restrictions of it.

    Koblitz\(\mathsf{{}^2}\)-Menezes analysis  A general method for answering questions like “What is a metre?” — a method that is mathematically compatible with the early Grothendieck’s Bourbakian traditions, and is socially compatible with the late Grothendieck’s Survivre et Vivre concerns — has been set forth by Ann Koblitz, Neal Koblitz and Alfred Menezes in their recent survey “Elliptic curve cryptography: the serpentine course of a paradigm shift” (J. Numb. Th., 2011).

    In overview, following the Koblitz\(\mathsf{{}^2}\)-Menezes roadmap generates a concrete answer to Grothendieck’s question “What is a metre?”, one that first specifies an ideal model of metre-measuring (per Koblitz\(\mathsf{{}^2}\)-Menezes section 2), then considers path-dependence, institutional interests, closure, and narrative inversion (per Koblitz\(\mathsf{{}^2}\)-Menezes section 13).

    STEAM-students are well-advised to study the Koblitz/Menezes survey, with a view toward lessons and principles that (as it seems to me and many) are applicable to many STEAM disciplines.

    Quantum computing contexts  The quantum computing roadmap that Scott Aaronson and Dave Bacon set forth in “Quantum computing and the ultimate limits of computation: the case for a national investment” (The MIT/UW Computing Community Consortium, 2008) surveys quantum contexts for constructing Koblitz\(\mathsf{{}^2}\)-Menezes roadmaps. Students of quantum computing are well-advised to study the Aaronson-Bacon roadmap in light of the Koblitz\(\mathsf{{}^2}\)-Menezes survey.

    After-action reports  When reviewing any roadmap, students are well-advised to resist facile analyses along the lines of “Man oh man was that a cringe-inducing roadmap” (per Scott’s comment #48). Ross Koningstein and David Fork provide a friendlier and more fruitful format in their on-line after-action report (as the US Marines call such reports) in IEEE Spectrum this week, titled “What It Would Really Take to Reverse Climate Change”:

    “Not every Google moon shot leaves Earth orbit. In 2011, the company decided that RE<C [Renewable Energy Cheaper than Coal] was not on track to meet its target and shut down the initiative. The two of us, who worked as engineers on the internal RE<C projects, were then forced to reexamine our assumptions.”

    Opinion  The quantum computing community (and its sponsors) would be well-advised to follow the example of Google/RE<C’s well-crafted after-action report.

    In a nutshell  The following ultra-compressed Koblitz\(\mathsf{{}^2}\)-Menezes-Aaronson-Bacon answer to Grothendieck’s STEAM-question à nu “What is a metre?” allots one sentence to each of the Koblitz\(\mathsf{{}^2}\)-Menezes survey elements, as illuminated by the Aaronson-Bacon quantum roadmap elements, with a view to long-range Google/RE<C/Survivre et Vivre objectives.

    OK, so what is a metre?  A metre is a class of dynamical processes that generate length measurements in accord with the mise en pratique of the (new-version) Système International d’Unités; the microscopic description of these processes is symplectomorphic; the macroscopic description is thermodynamical; the mathematical accommodations that ensure the mutual consistency of these two descriptions are a longstanding (yet still-evolving) central theme of statistical mechanics, transport theory, and quantum field theory.

    Ideal Model I: Security at center stage  Academic research in quantum dynamics responsibly focuses upon STEAM-disciplines that are unfettered by commercial, state, craft, and trade secrecy; these unfettered disciplines include (by tradition and international treaty) astronomical evolution (at cosmological, galactic, stellar, and planetary scales), gravity-wave observation (viewed as the dynamical/observational resolution of “What is a metre?”), low-temperature limits of condensed-matter physics (e.g., superconducting and topological phase transitions, color confinement, low-energy limits of string theory, hadron mass spectra), quantum metrology triangles (as even-today governed by the Convention du Mètre of 1875), and … new! … scattershot boson-sampling (and long may boson-sampling remain unfettered by security considerations!).

    Ideal Model II: Metres from art to science  Per Donald Knuth’s celebrated simulation-centric introduction to the Petkovsek-Wilf-Zeilberger book \(\mathsf{A=B}\), the transition from metre-art to metre-science is achieved via mises en pratique that are physically natural in the sense of the (new-version) Système International d’Unités, and that are mathematically natural in the sense of Grothendieck-Bourbaki … needless to say, the writing of these “naturalized” mises en pratique is a work in progress (by many research groups, including ours).

    Ideal Model III: Vigorous debate regarding metres  The wonderful Harrow/Kalai debate, which superficially concerned the feasibility of quantum computing, can productively be appreciated (as it seems to me) as a Grothendieck-Bourbaki framework for assessing the quantum-dynamical and thermo-dynamical feasibility of pressing against quantum and thermodynamical limits to sensing, simulation, separation, and computation … including both the unrestricted topics of Ideal Model I (above) and the proprietary topics of (for example) Google’s RE<C initiative.

    Ideal Model IV: Careful vetting of metres  It’s mighty tough to better the aggregate judgment of validating experiments, peer reviewed publications, market-disciplined investment, stable international treaties, and (most of all) the enthusiasm of young researchers.

    Ideal Model V: Survival of the fittest metre  This is not much different from (and not less contentious than) the “survival of the fittest metre-measuring design” that Harry M. Collins so vividly describes in Gravity’s Shadow: The Search for Gravitational Waves (2004) … and yes, considerations that are central to the challenge of efficient boson-sampling simulation already have played crucial roles in the ongoing STEAM-drama of gravity-wave detection.

    It remains only to cover the concluding four elements of Koblitz\(\mathsf{{}^2}\)-Menezes analysis: path-dependence, institutional interests, closure, and narrative inversion.

    Path-dependence  Plenty of paths lead to experimental demonstrations of quantum speedup (or alternatively, to an appreciation that such speedups are infeasible) … and all of these paths lead, too, to a world in which Grothendieck’s question “What is a metre?” and Google’s question “How can our computations be carbon-neutral?” are answered with Bourbakian naturality.

    Institutional interests  Koblitz\(\mathsf{{}^2}\)-Menezes roadmaps help us appreciate that questions like Grothendieck’s “What is a metre?” and Aaronson-Arkhipov’s “Can scattershot boson-sampling distributions be simulated efficiently?” have the virtue that they can be addressed openly and responsibly.

    Closure  The health of the entire STEAM enterprise requires that, in the long run, gravity-wave observatories operate as designed and the objectives of the RE<C initiative be achieved … and enquiring “What is a metre?” in the broadest natural sense is a terrific start toward achieving closure on these tough-yet-urgent objectives.

    Narrative inversion  The Harrow-Kalai debate (and the history of gravity-wave detection too) show us that every element of the above discussion is subject to narrative inversion; thus Mark Twain’s directive largely governs our appreciation of 20th century STEAM history:

    Mark Twain’s confession  Along through the story I have distributed a few anachronisms and unborn historical incidents and such things, so as to help the tale over the difficult places. This idea is not original with me; I got it out of Herodotus. Herodotus says, “Very few things happen at the right time, and the rest do not happen at all: the conscientious historian will correct these defects.”

    Needless to say, as Twain knew full well, Herodotus never said any such thing! Yet such is the power of narrative that Twain’s faux-quotation nowadays is commonly quoted as fact, even by serious scholars. And Twain’s principle broadly governs (for better or worse, both inescapably and fortunately) the lessons that we draw from 20th century STEAM history in constructing our 21st century STEAM roadmaps. Roadmaps that are going to work because, in the end, on our hot crowded Grothendieckian Survivre et Vivre planet, these roadmaps have to work.

  54. John Sidles Says:

    Fabio Traversa [comment #51], please let me say that Scott’s critique of your work [in comment #48] makes several common-sense points that are well-considered (as it seems to many, including me).

    For example, your preprint “Memcomputing NP-complete problems in polynomial time using polynomial resources” (arXiv:1411.4798 [cs.ET]) shows a rack of computing elements, whose modules are described in the Appended Material.

    As I am sure you are aware, nowadays the internal components of these once-analog modules increasingly are dynamically-configured field-programmable gate arrays (FPGAs).

    Couldn’t your entire apparatus, including the analog multipliers, be digitally simulated in (for example) National Instruments’ language “G”? Then burned to commercial off-the-shelf (COTS) high-bandwidth FPGAs?

    Couldn’t this be done — often with considerable engineering gains in speed and accuracy! — not just in principle, but in practice? In fact, our engineering group (among many) does analog-to-FPGA reductions routinely.

    Isn’t this an explicit PTIME reduction of memcomputers to Turing machines?

    Concrete question  Which module(s) of your memcomputer would require exponential resources to simulate in LabView and/or FPGA modules?


    Extended Church-Turing Thesis (engineering version)  All commercial electronic instruments — including both memcomputing devices and D-Wave devices — can be simulated in LabView with polynomial overhead.

    Scott, it’s a pleasure to support your views in this instance!

  55. Scott Says:

    Fabio #51: Thanks for the response, but I don’t see how it addresses the central point at issue. If the memcomputing model is able to manipulate exponentially-long integers in a single time step (as I understood your paper to say), then it’s going to require exponential resources to simulate the memcomputing model—not merely using a Turing machine, but using any device that we know how to implement within the physical universe.

    But there’s no need to debate this in the abstract! I’ll tell you what: build a memcomputing machine, then show that you can use it to factor 1000-digit RSA challenge numbers into primes (as you must be able to do, if you can solve NP-complete problems). As soon as you do this, I’ll readily admit that I was wrong.

  56. Fred Says:

    Scott #48

    It’s really too bad the author didn’t suggest building those UMM devices out of “memristors” /S

  57. John Sidles Says:

    History repeats  Scott deploys against memcomputing an argument previously deployed against quantum computing:

    Scott argues [#55] “But there’s no need to debate this in the abstract! I’ll tell you what: build a memcomputing [resp. quantum computing] machine, then show that you can use it to factor 1000-digit RSA challenge numbers into primes (as you must be able to do, if you can solve NP-complete problems [resp. scalably error-correct]). As soon as you do this, I’ll readily admit that I was wrong.”

    Let’s review some responses to this particular criticism of quantum computing, with a view toward discerning therein a viable response to Scott’s criticism of memcomputing.

    Tim Gowers’ question and Scott’s answer

    Tim Gowers asks [on MathOverflow in 2009] “Are there any interesting examples of random NP-complete problems?”

    Scott Aaronson answers [on MathOverflow in 2010]  “It’s been one of the great unsolved problems of theoretical computer science for the last 35 years! […] For [natural distributions], alas, we generally don’t have any formal hardness results.”

    Implication  Memcomputers might credibly claim practical efficacy in solving random instances of problem classes that are formally NP-complete, but whose random instances are almost always easy in practice … and this claim would not contravene any established results in complexity theory.
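    Concrete illustration  The gap between formal NP-completeness and typical-case hardness is easy to demonstrate (a generic sketch of my own, not tied to memcomputing): random subset-sum instances with modest values fall instantly to the textbook pseudo-polynomial dynamic program.

```python
import random

def subset_sum_dp(values, target):
    """Standard pseudo-polynomial dynamic program: track which sums
    up to `target` are reachable. Runs in O(n * target) time."""
    reachable = {0}
    for a in values:
        reachable |= {s + a for s in reachable if s + a <= target}
    return target in reachable

random.seed(0)
vals = [random.randint(1, 1000) for _ in range(50)]
# Pick a target that is a genuine subset sum; the DP finds it instantly,
# despite subset sum being NP-complete in the worst case.
yes_target = sum(random.sample(vals, 10))
found = subset_sum_dp(vals, yes_target)  # True by construction
```

    The exponential worst cases require value ranges far larger than such random instances typically exhibit.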

    Narratives by Gowers and Mazur  Tim Gowers’ recent weblog post “ICM2014 — Barak, Guralnick, Brown” extended his 2009 MathOverflow question as follows:

    Tim Gowers’ dream  “I’ve spent a lot of time in the last few years thinking about the well-known NP-complete problem where the input is a mathematical statement and the task (in the decision version) is to say whether there is a proof of that statement of length at most \(n\), in some appropriate formal system.”

    “The fact that this problem is NP-complete does not deter mathematicians from spending their lives solving instances of it.”

    “What explains this apparent success? I dream that there might be a very nice answer to this question, rather than just a hand-wavy one that says that the instances studied by mathematicians are far from general.”

    Recent thoughtful follow-on essays by Tim Gowers (“Vividness in mathematics and narrative”, 2012) and Barry Mazur (“Visions, dreams, and mathematics”, 2012) are commended to Shtetl Optimized readers (BibTeX appended).

    Pivotal narratives  By what strategy can the quantum computing and memcomputing communities pursue the Gowers/Mazur program? Scott Adams’ recent on-line essay “The Pivot” (2014, BibTeX appended) provides a general strategy:

    Scott Adams says  “The most fascinating phenomenon in the start-up world is called the pivot. That word has been used in every meeting I’ve attended. There’s more to it than you think.”

    “A pivot is when a start-up quickly changes from one product to another or from one business model to another. The valley is full of stories about companies that started with a lame idea and hit it big after a pivot.”

    “Another fascinating phenomenon in the valley is that every entrepreneur and investor seems genuinely interested in helping strangers succeed. I would go so far as to call it the defining feature of the start-up culture.”

    “The dominant worldview in Silicon Valley is that if you aren’t trying to make the world better, you’re in the wrong line of work. The net effect is that the start-up culture is shockingly generous. If you need something for your start-up, folks will happily help you find it. I would have predicted the opposite.”

    Pivotal opportunities  For reasons that Scott [Aaronson] articulated (as quoted at the beginning of this comment), reasons that apply forcibly to both quantum computing and memcomputing, it’s time for both disciplines to pivot from their original hardware-centric objectives.

    Falsifiable prediction  Quantum computing and memcomputing both will pivot toward algorithm deliverables that run on classical Turing machines, are implemented upon ordinary von Neumann hardware architectures, and are programmed in nonstandard languages (e.g. “G”).


    @incollection{Gowers:2012aa, Address =
    {Princeton}, Author = {Timothy Gowers},
    Booktitle = {Circles disturbed: the
    interplay of mathematics and narrative},
    Editor = {Doxiades, Apostolos K. and
    Mazur, Barry}, Pages = {211--231},
    Publisher = {Princeton University
    Press}, Title = {Vividness in
    Mathematics and Narrative}, Year = {2012}}

    @incollection{Mazur:2012aa, Address =
    {Princeton}, Author = {Barry Mazur},
    Booktitle = {Circles disturbed: the
    interplay of mathematics and narrative},
    Editor = {Doxiades, Apostolos K. and
    Mazur, Barry}, Pages = {183--210},
    Publisher = {Princeton University
    Press}, Title = {Visions, Dreams, and
    Mathematics}, Year = {2012}}

    @misc{Adams:2014aa, Author = {Scott
    Adams}, Howpublished = {The Scott Adams
    Weblog}, Month = {Jun 16,}, Title = {The
    Pivot}, Year = {2014}}

  58. Scott Says:

    John Sidles #57: Well, one obvious difference between the two cases is that, with quantum computing, the problem of decoherence was acknowledged by experts from the very beginning, and has by now been thought through in immense detail. We understand a lot both about why quantum fault-tolerance should work in principle, and about why realizing it experimentally is so hard (of course, productive research continues on both questions). With “memcomputing,” by contrast, there’s a clear-as-day reason why it shouldn’t work—namely, the need to store numbers with exponentially many bits—but incredibly, the issue is nowhere even discussed in the paper I was asked to comment on. So the burden of proof is different in the two cases, to put it mildly.

  59. John Sidles Says:

    Scott asserts “With quantum computing, the problem of decoherence was acknowledged by experts from the very beginning, and has by now been thought through in immense detail. We understand a lot both about why quantum fault-tolerance should work in principle, and about why realizing it experimentally is so hard.”

    The literature paints a picture of our evolving understanding of decoherent dynamics that is considerably more nuanced than that! The following chronological sequence of five articles is commended to Shtetl Optimized readers:

    Feynman (1982)  Simulating physics with computers  Feynman asks “Can a quantum system be probabilistically simulated by a classical universal computer?”

    Deutsch (1985)  Quantum theory, the Church-Turing principle and the universal quantum computer  Deutsch introduces the crucial notion of universality in quantum computing.

    Preskill (1998)  Quantum computing: pro and con  Preskill summarizes a decade’s hard-won understanding of fault-tolerant quantum dynamics, admirably balancing optimism against skepticism in regard to the feasibility of FTQC.

    Zangwill (2014)  The education of Walter Kohn and the creation of density functional theory  Meanwhile Walter Kohn leads an exodus that regards Hilbert space as the resolution of an underlying, lower-dimension, algebraically tractable state-space, in which Feynman’s quantum simulation objective can be achieved by classical means.

    Murcko (2014)  Accelerating drug discovery: the accurate prediction of potency  Quantum chemists assert that Feynman’s promised land of cheap, fast, accurate, reliable, large-scale quantum simulation has been reached in practice … and these simulations crucially exploit noise for purposes of high-accuracy free energy perturbation (FEP) estimates.

    Falsifiable prediction  It will be news to many Shtetl Optimized readers (as it was news to me) that Walter Kohn was a teen-age Kristallnacht refugee, who spent the war in a Canadian (!) internment camp.

    Kohn’s name therefore joins the remarkably long roster of imprisoned/interned algebraic geometers and dynamicists that includes also Galois, Weil, Grothendieck, Kähler, and Koblitz.

    Semi-serious question  Why does imprisonment seemingly foster an enduring passion (and an outstanding aptitude!) for algebraic geometry and dynamics?

    A pivotal challenge  How can quantum computing viably “pivot” (per Scott Adams’ remarks of #57) in view of ever-more-credible claims that Feynman’s 1982 quantum simulation objective has been achieved?

    Most usefully  Andrew Zangwill’s biography of Walter Kohn is heartily recommended to Shtetl Optimized readers, for its sympathetic in-depth portrait of a life well-lived in science and mathematics, spent grappling with tough quantum problems.


    @article{Feynman:1982hs, Author = {R. P. Feynman},
    Journal = {International Journal of Theoretical
    Physics}, Number = {6–7}, Pages = {467–488}, Title
    = {Simulating physics with computers}, Volume = 21,
    Year = 1982}

    @article{Deutsch:1985aa, Author = {{Deutsch}, D.},
    Journal = {Royal Society of London Proceedings
    Series A}, Month = jul, Pages = {97-117}, Title =
    {Quantum theory, the Church-Turing principle and the
    universal quantum computer}, Volume = 400, Year = 1985}

    @article{Preskill:1998aa, Author = {{Preskill}, J.},
    Journal = {Royal Society of London Proceedings
    Series A}, Month = jan, Pages = {469}, Title =
    {Quantum computing: pro and con}, Volume = 454,
    Year = 1998}

    @article{Zangwill:2014aa, Author = {Zangwill,
    Andrew}, Journal = {Archive for History of Exact
    Sciences}, Number = {6}, Pages = {775-848}, Title =
    {The education of {W}alter {K}ohn and the creation
    of density functional theory}, Volume = {68}, Year = {2014}}

    @inproceedings{Murcko:2014aa, Author = {Mark
    Murcko}, Booktitle = {Advances in Drug Discovery and
    Development}, Month = {24 September}, Note = {URL:
    Organization = {Chemical \&\ Engineering News
    (Virtual Symposium)}, Title = {Accelerating drug
    discovery: the accurate prediction of potency},
    Year = {2014}}

  60. Joshua Zelinsky Says:

    John Sidles #59,

    Your timeline is missing quite a bit between 1985 and 1998.

  61. Tobias Says:

    Someone posted the English translation; it’s a comment on the article.

  62. Stam Nicolis Says:

    The reason the Planck length can’t yet be considered “fundamental” is that current physical understanding doesn’t imply that it is a limiting length of any physical process. Within general relativity its relevance is that any mass greater than the Planck mass, confined to a size less than the Planck length, would suffer gravitational collapse to a black hole (if it were electrically neutral). However, at that length scale, by definition, quantum effects become comparable to gravitational effects, and it isn’t known what the correct description of gravitational effects is under those circumstances. It could be the case that, at such high curvatures, the putative black hole would evaporate through Hawking radiation; but Hawking radiation is only described when evaporation is a “slow” process, since it isn’t known how to sum over the requisite intermediate states.
    A similar example is the following: from the magnetic permeability and the electric permittivity of the vacuum it’s possible to define a quantity that has the dimensions of velocity and whose numerical value is that of the speed of light. However, within electrostatics and magnetostatics this velocity doesn’t have physical meaning. Only once time-dependent phenomena in electromagnetism have been consistently described does it turn out that this velocity has physical significance. Its significance as a limiting velocity in fact emerges only once special relativity is understood, since for any wave equation the velocity parameter is a limiting velocity; but it turns out that only electromagnetic waves can’t be described as the vibrations of more fundamental degrees of freedom.
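    Both dimensional-analysis facts are easy to check numerically (a quick sketch, assuming the standard CODATA values of the constants):

```python
import math

# Standard CODATA values of the constants discussed above
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
mu0  = 1.25663706212e-6  # vacuum permeability, N/A^2
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# The unique length and mass built from hbar, G, and c alone:
planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
planck_mass   = math.sqrt(hbar * c / G)      # ~2.2e-8 kg

# The velocity defined by the static constants of electromagnetism
# coincides numerically with the speed of light:
c_from_em = 1 / math.sqrt(mu0 * eps0)
```

    Whether these combinations mark true physical limits is, of course, exactly the open question; the arithmetic only shows they are the natural scales the constants allow.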

  63. Fred Says:

    Fred #56,

    haha, I was kidding about “memristors”, but then I just saw this:

    “A new memristor-based device could be used to build brainlike systems and base-10 computers”

  64. Serge Says:

    I also know the story of the “Grothendieck prime”, 57: the fact that 57 isn’t prime illustrates Grothendieck’s complete disregard for concrete examples, and his lack of need for them.

    In any case, 1957 was the year when Grothendieck’s mother passed away. It’s also in 1957 that he started to rebuild the foundations of algebraic geometry. Thus the number 57, if not prime in the arithmetic sense, was certainly prime to his heart. My opinion is that AG had become a mathematician – reinventing measure theory without even knowing about Lebesgue – to put up with the disappearance of his father. Later on, he might have turned into a leading algebraic geometer partly to avoid thinking about his mother’s death. However, by 1970 he went through what’s called today a professional burnout – first campaigning for ecology, then retiring alone in the region where he’d lived most of his youth. Which makes me think of the title of Marilyn Monroe’s last and uncompleted movie: “Something’s got to give…”