Why Quantum Mechanics?

In the past few months, I’ve twice injured the same ankle while playing with my kids. This, perhaps combined with covid, led me to several indisputable realizations:

  1. I am mortal.
  2. Despite my self-conception as a nerdy little kid awaiting the serious people’s approval, I am now firmly middle-aged. By my age, Einstein had completed general relativity, Turing had founded CS, won WWII, and proposed the Turing Test, and Galois, Ramanujan, and Ramsey had been dead for years.
  3. Thus, whatever I wanted to accomplish in my intellectual life, I should probably get started on it now.

Hence today’s post. I’m feeling a strong compulsion to write an essay, or possibly even a book, surveying and critically evaluating a century of ideas about the following question:

Q: Why should the universe have been quantum-mechanical?

If you want, you can divide Q into two subquestions:

Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?

Q2: Assuming classical physics wasn’t good enough for whatever reason, why this specific alternative? Why the complex-valued amplitudes? Why unitary transformations? Why the Born rule? Why the tensor product?

Despite its greater specificity, Q2 is ironically the question that I feel we have a better handle on. I could spend half a semester teaching theorems that admittedly don’t answer Q2 as satisfyingly as Einstein answered the question “why the Lorentz transformations?”, but that at least render this particular set of mathematical choices (the 2-norm, the Born Rule, complex numbers, etc.) orders-of-magnitude less surprising than one might’ve thought they were a priori. Q1 therefore stands, to me at least, as the more mysterious of the two questions.

So, I want to write something about the space of credible answers to Q, and especially Q1, that humans can currently conceive. I want to do this for my own sake as much as for others’. I want to do it because I regard Q as one of the biggest questions ever asked, for which it seems plausible to me that there’s simply an answer that most experts would accept as valid once they saw it, but for which no such answer is known. And also because, besides having spent 25 years working in quantum information, I have the following qualifications for the job:

  • I don’t dismiss either Q1 or Q2 as silly; and
  • crucially, I don’t think I already know the answers, and merely need better arguments to justify them. I’m genuinely uncertain and confused.

The purpose of this post is to invite you to share your own answers to Q in the comments section. Before I embark on my survey project, I’d better know if there are promising ideas that I’ve missed, and this blog seems like as good a place as any to crowdsource the job.

Any answer is welcome, no matter how wild or speculative, so long as it honestly grapples with the actual nature of QM. To illustrate, nothing along the lines of “the universe is quantum because it needs to be holistic, interconnected, full of surprises, etc. etc.” will cut it, since such answers leave utterly unexplained why the world wasn’t simply endowed with those properties directly, rather than specifically via generalizing the rules of probability to allow interference and noncommuting observables.

Relatedly, whatever “design goal” you propose for the laws of physics, if the goal is satisfied by QM, but satisfied even better by theories that provide even more power than QM does—for instance, superluminal signalling, or violations of Tsirelson’s bound, or the efficient solution of NP-complete problems—then your explanation is out. This is a remarkably strong constraint.

Oh, needless to say, don’t try my patience with anything about the uncertainty principle being due to floating-point errors or rendering bugs, or anything else that relies on a travesty of QM lifted from a popular article or meme! 🙂

OK, maybe four more comments to enable a more productive discussion, before I shut up and turn things over to you:

  1. I’m aware, of course, of the radical uncertainty about what form an answer to Q should even take. Am I asking you to psychoanalyze the will of God in creating the universe? Or, what perhaps amounts to the same thing, am I asking for the design objectives of the giant computer simulation that we’re living in? (As in, “I’m 100% fine with living inside a Matrix … I just want to understand why it’s a unitary matrix!”) Am I instead asking for an anthropic explanation, showing why of course QM would be needed if you wanted life or consciousness like ours? Am I “merely” asking for simpler or more intuitive physical principles from which QM is to be derived as a consequence? Am I asking why QM is the “most elegant choice” in some space of mathematical options … even to the point where, with hindsight, a 19th-century mathematician or physicist could’ve been convinced that of course this must be part of Nature’s plan? Am I asking for something else entirely? You get to decide! Should you take up my challenge, this is both your privilege and your terrifying burden.
  2. I’m aware, of course, of the dizzying array of central physical phenomena that rely on QM for their ultimate explanation. These phenomena range from the stability of matter itself, which depends on the Pauli exclusion principle; to the nuclear fusion that powers the sun, which depends on a quantum tunneling effect; to the discrete energy levels of electrons (and hence, the combinatorial nature of chemistry), which relies on electrons being waves of probability amplitude that can only circle nuclei an integer number of times if their crests are to meet their troughs (the standing-wave condition is sketched just after this list). Important as they are, though, I don’t regard any of these phenomena as satisfying answers to Q in themselves. The reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal. QM just seems far too grand to have been the answer to these questions! An exponentially larger state space for all of reality, plus the end of Newtonian determinism, just to overcome the technical problem that accelerating charges radiate energy in classical electrodynamics, thereby rendering atoms unstable? It reminds me of the Simpsons episode where Homer uses a teleportation machine to get a beer from the fridge without needing to get up off the couch.
  3. I’m aware of Gleason’s theorem, and of the specialness of the 1-norm and 2-norm in linear algebra, and of the arguments for complex amplitudes as opposed to reals or quaternions, and of the beautiful work of Lucien Hardy and of Chiribella et al. and others on axiomatic derivations of quantum theory. As some of you might remember, I even discussed much of this material in Quantum Computing Since Democritus! There’s a huge amount to say about these fascinating justifications for the rules of QM, and I hope to say some of it in my planned survey! For now, I’ll simply remark that every axiomatic reconstruction of QM that I’ve seen, impressive though it was, has relied on one or more axioms that struck me as weird, in the sense that I’d have little trouble dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true. The axiomatic reconstructions do help me somewhat with Q2, but little if at all with Q1.
  4. To keep the discussion focused, in this post I’d like to exclude answers along the lines of “but what if QM is merely an approximation to something else?,” to say nothing of “a century of evidence for QM was all just a massive illusion! LOCAL HIDDEN VARIABLES FOR THE WIN!!!” We can have those debates another day—God knows that, here on Shtetl-Optimized, we have and we will. Here I’m asking instead: imagine that, as fantastical as it sounds, QM were not only exactly true, but (along with relativity, thermodynamics, evolution, and the tastiness of chocolate) one of the profoundest truths our sorry species had ever discovered. Why should I have expected that truth all along? What possible reasons to expect it have I missed?
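
For reference, the standing-wave condition alluded to in point 2, in its Bohr-model heuristic form (the real story involves orbitals rather than literal orbits, so treat this only as the cartoon version): requiring an integer number of de Broglie wavelengths, with \lambda = h/p, to fit around an orbit of radius r gives

    n\lambda = 2\pi r \;\Longrightarrow\; L = p\,r = n\hbar, \qquad n = 1, 2, 3, \ldots

and it is this integrality that forces the electron’s allowed energies to be discrete.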

732 Responses to “Why Quantum Mechanics?”

  1. Haelfix Says:

    Before I even attempt to answer Q1, I’d like to get a clarification of comment 2 that you posted.

    One of the obvious answers here is that classical mechanics *can’t* reproduce nature as we observe it. That atoms would be unstable, that the ultraviolet catastrophe would have no resolution, and so on and so forth.

    You write “the reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”

    I don’t understand this. What classical mechanism can reproduce these sorts of effects? Theorists spent a long time trying to cook up fixes for classical mechanics, and consistently failed. Indeed this was very much the impetus that led people like Planck and others to just abandon all reason and look for semi-empirical ‘hacks’ to explain the data.

  2. Autolykos Says:

    My internal explanation was always basically #2 + anthropic principle (a world that doesn’t allow chemistry probably won’t have anything in it that can observe it).
    If one buys the simulation argument, a lot of QM starts to look like a bunch of hacks to limit resolution and simulation fidelity, and get lazy evaluation (you only need to calculate precise position or momentum when someone actually asks for them, and can give vague and ambiguous answers the rest of the time).
    Also, wavefunctions lend themselves to static solutions with very few (and small integer!) parameters which saves a lot of memory compared to the classical approach where you don’t even know when to stop with the precision.

  3. Daniel L Speyer Says:

    First shot at anthropics: can you derive complex chemistry from anything less weird?

    In a universe where fusion was always exothermic, you’d get one giant nucleus; in one where it was never exothermic, just tons of hydrogen. But exothermic-only-until-iron gets us a mix of elements.

    And then there’s the electron shells: 2 8 18… That’s where all the covalent bonding behavior comes from.

    I’m not sure what Newtonian chemistry would look like, but I can easily believe it wouldn’t be versatile enough for enzymes.

  4. Scott Says:

    Haelfix #1: You’re right, I should clarify what I meant. If you take the whole structure of classical mechanics as given (including Maxwell’s equations and so on), there’s indeed no simple fix that will make atoms stable, make electron shells discrete, etc. etc.—that’s how physicists realized the necessity of QM in the first place!

    But from a modern perspective, much of the structure of classical physics (including reversibility, the Euler-Lagrange equation, and more) is ultimately explained by QM! And if that’s the case, then it seems to me that we don’t get to reverse the causal arrow, and say that this classical stuff also explains QM.

    So consider, instead, an imagined scenario where we’re designing new laws of physics from scratch. In that scenario, it seems to me to beggar belief that you’d ever consider anything of the metaphysical enormity of QM for such a relatively quotidian purpose as allowing complex chemistry! It’s like, why not just introduce, as primitive elements in your physics, atoms that can only snap together in certain discrete tinkertoy-like ways? There are plenty of classical cellular automata like that, and on their face, they seem to have most or all of what you’d need for complex life, including Turing-universality. Or maybe you want tinkertoy-like atoms plus Lorentz-invariance? But that ought to be doable as well, and with less difficulty than inventing interesting relativistic quantum field theories…

  5. Scott Says:

    Autolykos #2:

      If one buys the simulation argument, a lot of QM starts to look like a bunch of hacks to limit resolution and simulation fidelity, and get lazy evaluation (you only need to calculate precise position or momentum when someone actually asks for them, and can give vague and ambiguous answers the rest of the time).

    Any answer along those lines, it seems to me, immediately crashes and burns once we realize that passing to wavefunctions, far from decreasing our classical simulation cost, has exponentially increased it—the fact famously exploited by quantum computation.
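
    To put rough numbers on that (a back-of-the-envelope sketch; the 16 bytes per complex amplitude and the naive state-vector representation are my own assumptions):

        # Naive cost of storing an n-qubit pure state: 2^n complex amplitudes,
        # versus n bits for the corresponding classical configuration.
        for n in (10, 30, 50, 300):
            amplitudes = 2 ** n
            print(f"n = {n:>3}: {amplitudes:.3e} amplitudes, ~{amplitudes * 16 / 2**30:.3e} GiB")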

  6. Oleg Eterevsky Says:

    I don’t have any answers, but I have a reformulation of Q1 that I find interesting:

    Suppose you want to simulate a universe that could support life, but you don’t have a quantum computer, just a very big classical one. Would you be able to do it?

    At first glance it seems that having some forms of energy and entropy should be enough for complexity to arise. But I don’t know, maybe without quantum mechanics you don’t have enough spare entropy per unit of matter or something. (Like how stars wouldn’t be able to shine as long as they do without fusion.)

  7. Greg Guy Says:

    There seems to be some conceptual problem here. You keep saying that classical physics + some other unspecified physics would be less weird than QM. What exactly do you think makes QM uniquely strange? GR strikes me as pretty weird given its effect on space and time. It seems to me that it's classical physics that has all the strangeness in it. QM is quite sensible by comparison.

  8. Rahul Says:

    To answer Q1 do we have to first believe that God exists? 🙂

  9. Daniel H Ranard Says:

    I’ll assume QM is exact, as instructed. But even if the ultimate theory X falls squarely within the QM formalism, it may have some striking features that help answer your question. That is, your question may evolve from “Why QM?” to “Why X? [where X happens to be a QM theory]”. E.g. maybe the non-perturbative formulation of M-theory will more clearly shout “I’m self-evident!” than does QM writ large. (Working in QI, one may have a bias to think that the most salient features of a theory are the information-processing capabilities, but maybe those just come along for the ride?)

  10. Dmitri Urbanowicz Says:

    I recently learned about an apparently well-known trick. Imagine you have the following dynamical law:

    |x> -> |x+1>

    Now instead of choosing an initial x, let’s take the equal superposition of all possible x. Then we’ll get a state which globally never changes (i.e. transforms into itself). And yet, inhabitants of such a universe would still experience the passage of time.

    This seems to suggest the following answer: the Creator didn’t want to choose initial conditions and intended the universe to be eternal.
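
    A minimal numpy check of this trick, using a finite cyclic toy version of the law (x -> x+1 mod N) in place of the unbounded shift:

        import numpy as np

        N = 8
        shift = np.roll(np.eye(N), 1, axis=0)          # |x> -> |x+1 mod N>
        uniform = np.ones(N) / np.sqrt(N)              # equal superposition of all x
        print(np.allclose(shift @ uniform, uniform))   # True: the global state never changes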

  11. Job Says:

    Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?

    In that universe we’d still have quantum computing and BQP, just by different names.
    There would still be a fast algorithm for factoring integers, using interference – Shower’s algorithm, named after the ripples in a pond on the rainy day that inspired its invention.

    It would be an intangible form of computing, yet somehow not inconsistent with natural laws.
    Computer scientists would make bad passive aggressive jokes to physicists about being “more efficient” at describing the world.

    But more importantly, imagine actually being unable to factor large integers. How bizarre would that be?
    I mean, I know we think of integer factorization as a difficult problem, but in the universe we live in it’s an easy problem.

    I actually can’t think of a better reason for the universe to be quantum mechanical than that.
    Integers must be efficiently factorizable.

  12. Nytta Leeten Says:

    I share the intuition that chemistry at our present time and place in the universe should be possible with other physical laws. So let us say QM here and now is unnecessarily convoluted. However, the evidence we have nowadays points to the present universe having come about through an evolution from a “Big Bang”. QM as we observe it here and now, could it be a remnant of necessary initial conditions of the Big Bang? That is, could a universe launched by God through a Big Bang, but left to its own devices after that, have been built on anything simpler than QM? My work with QM dealt with much lower energies, so I do not have any good answers to this question.

    Let us say we can provide arguments why the Big Bang conditions required some of the QM weirdness, then Q1 implies the question why the Big Bang rather than a steady-state model a la Hoyle. In a steady-state model without a beginning and thus without any initial conditions “polluting” a present universe, a creator can create chemistry with less weirdness by the earlier assumption. But can things exist without a beginning in the most profound sense, will all questions lead back to an origin where the Unmoved Mover sits majestically? Time to put on the metaphysics hat…

  13. Yoni Says:

    Hi Scott

    I can’t wait to read the book. Please please make it as layman readable as possible 🙂

    From my very ‘under-educated in quantum mechanics’ point of view I have often wondered about Q, mainly Q1 (my QM education isn’t really deep enough to even have much of a wondering about Q2). Could there be something about consciousness that is tied to the process of moving from “many possible futures” to “one possible past”? It seems to me that any world that was fully deterministic would kind of look like the past, i.e. “dead” and static – already happened, so maybe it just wouldn’t “feel” like anything to traverse the timeline in that universe. Similarly it doesn’t “feel” like anything to be in the future of our world, the world where there is no particular answer to the question of what is what. However, at the junction between the two, where many futures collapse into one past, it “feels” like something is happening.

  14. Chris Says:

    Something like it seems to fall out of the mathematical universe hypothesis, if in addition you are also a physicalist about consciousness.

    The MUH then implies that the physical pattern grounding consciousness is just another mathematical structure – that is, conscious experience is *itself* a mathematical object that can be instantiated within mathematical worlds (which are actual). Consciousness is experienced from the vantage point of that structure itself, so you should expect to live in a kind of quotient isomorphism class of all possible worlds that contain substructures instantiating this object. In other words, you should expect to live in a continuous superposition of all instances in all possible worlds that contain that object, which mathematically amounts to whatever it is that physically grounds and distinguishes your unique conscious experience at this precise moment.

    That sounds a lot like Everett! If you make some further Kantian-like presumptions about how the world ought to appear in order for it to be cognizable in the first place, you might conclude that this continuous bulk has to be discretized somehow. Therefore quantum mechanics! That the MUH has presented us with a compelling motivation for the uniquely weird phenomena of quantum mechanics in this fashion is in my view one of the three or four most compelling reasons to believe that it’s true. Incidentally, similar considerations may solve the cosmological measure problem to boot.

  15. Mateus Araújo Says:

    I think the most natural way to tackle Q1 is to imagine a universe that is literally described by classical electrodynamics + general relativity, without considering physical problems like the instability of atoms. If the universe were classical, matter would of course not be made of atoms! They are not predicted from the classical theories, though, so we are free to do away with them. What are the intrinsic features of such a universe?

    There’s a long list of pathologies. I think the gravest ones are the singularities predicted by general relativity, but one also has chaotic systems, Hamiltonians that are not bounded below, etc. It’s a fundamentally continuous universe, with all the problems for computation that it brings. Computation might also become too easy; in our universe it’s of course impossible to do analog computation with infinite-precision real numbers (which can solve PSPACE-complete problems in polynomial time), but in a classical universe? What might forbid it? Even Norton’s dome, which in our universe is just a pathology caused by using an unphysical potential, becomes a real problem. Why would its potential be unphysical in a classical universe?

  16. Primer Says:

    Can’t wait for the book. Even after just a few minutes of contemplation, your Q seems way more profound than one might think initially. I could imagine Multiverses (even high-Tegmark-level ones) and Simulations might appear way more “obviously bollocks” or “obviously true” after thinking about your Q for a while. This might end up in the top-5 of my all-time favorite thought-provoking thoughts. Thanks for occupying my brain for possibly years, Scott!

    Scott #5: My current model says we need infinite precision to calculate an interaction between 2 particles without QM, due to e.g. the distance between the particles being infinitely exact. With QM, we can stop once we get below the Planck length. I’m not putting much credence in my model here, but I haven’t run across a layman-understandable “this-is-why-that’s-obviously-wrong” either.

  17. Joshua Zelinsky Says:

    Here’s an answer which touches on some of the anthropic issues, and then a rambling comment about why that answer isn’t really satisfying.

    Quantum mechanics may be actually one of the simplest theories which really does produce conscious observers. If we have something like the Level IV Tegmark multiverse, where all consistent universes “exist” in some deep sense, then the only ones which are going to be worth noting are those which have observers and whose observers are intelligent enough to ask questions about their underlying physics. (It seems plausible that universes complicated enough to have observers are going in general to be complicated enough to be likely to have intelligent observers, so the intelligent requirement may not be restricting things that much.) Now, it seems plausible that if something like Level IV is correct, then it should be weighted in some way so that mathematically simpler universes get more weight. If so, there may be a high probability that observers show up in quantum mechanical universes. We know that quantum mechanics does this, and the rules of our universe don’t seem to be that complicated. The Standard Model is just not that bad to write down. In contrast, if you want “tinkertoy atoms” you might be actually specifying a lot of things. Carbon chemistry is really complicated and even aside from CHNOPS, almost all living things seem to use at least a few non-CHNOPS atoms on top of that. Specifying that many complicated interacting tinkertoys may be so complicated that very few observers see themselves in such universes.

    Six unsatisfying things about this: First, it assumes some version of the Tegmark hierarchy, which is a pretty big assumption. Second, we don’t have a Theory of Everything, so without a way to reconcile QM and General Relativity, deciding that the underlying physics is simple seems tough. Third, it is likely that whatever we do find to handle QM and GR is going to be more complicated than just QM and SR, which we can mathematically reconcile. So this then leads to the serious problem of why General Relativity, which seems like about as big an issue as why QM. (This may be my own biases coming into play; I understand QM and SR a little. GR seems genuinely tougher). Fourth, it doesn’t seem that GR is doing anything necessary to have intelligent life, and at a minimum, some variant of QM + SR + Newtonian gravity doesn’t seem that complicated. Fifth, the assumed weighting for Tegmark should probably not just involve how simple universes are mathematically but how many observers there are. Even if QM is mathematically simple, it really isn’t clear that it produces a lot of observers compared to others. And for that matter, just what is the correct weighting of simplicity v. number of observers? Sixth, while we’re at it, what counts as an observer for these purposes?

  18. Augustin Vanrietvelde Says:

    Hi Scott,

    Thanks for this post, I feel like Q is definitely a fascinating and extremely important question. Here is a sketch of an answer that stems from studying quantum causal models (and comparing them to classical ones), in the spirit of https://arxiv.org/abs/1906.10726 and related works (disclaimer: this is from my PhD supervisor and coworkers, so I might of course be non-neutral here).

    Suppose you look at causal models of deterministic and reversible transformations, i.e. bijective functions in the classical case, and unitary maps in the quantum one. The “deterministic and reversible” part is important, and can be argued for, essentially, by noting that anything else would have good chances of stemming more from our own imperfect perspective than from the actual workings of the external world. Suppose also that you’re looking at finite dimensional systems (mostly for simplicity — I would be surprised if this didn’t generalise to infinite dimension).

    Then there is a natural question you might ask, about whether the causal structure of a transformation is always time-symmetric. By this I mean, take a deterministic and reversible transformation f:AB -> CD; it has some causal structure (for example, featuring all causal influences except from B to D). Then look at the causal structure of f^dag:CD -> AB, the reversed version of f. Is it just the reverse of f’s causal structure? If yes, then one can say that f’s causal structure is time-symmetric.

    I think it is pretty natural to argue that causal structures should be time-symmetric in a fundamental theory of the universe. After all, if there is some influence A -> D in the forward time-direction, shouldn’t this be the same thing as an influence D -> A in the backward time direction? Even more so given that your dynamics is deterministic — which should mean that your causal analysis is not “missing anything else”, for example a hidden common cause, or information getting lost, or whatnot.

    Now you probably can see where this is going. If I look at reversible deterministic classical transformations (i.e. bijections), then their causal structure is quite easy to define — think of functional independence — and it is not always time-symmetric. A typical counter-example is the classical CNOT: AB -> CD, where A and C represent the control system and B and D the target system. This gate features influence A ->D, but its reverse (which is also a CNOT) features no influence D->A.

    On the contrary, let us look at the quantum case. First, like in the classical case, one can define causal influence between inputs and outputs of a unitary channel in a neat way, that is equivalent to a handful of natural definitions you might think of. And one can then show that the causal structure of unitary channels is always time-symmetric! A good way to get a feeling of why this is the case is to look at the quantum version of my previous example, i.e. at the quantum CNOT: reversibility of the causal structure in this case is essentially given by the good old quantum backaction.
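
    A small numpy check of the backaction point (this is just the standard fact that conjugating CNOT by Hadamards on both wires swaps control and target, so the “control” is genuinely acted upon, unlike the classical case):

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        CNOT_ab = np.array([[1, 0, 0, 0],
                            [0, 1, 0, 0],
                            [0, 0, 0, 1],
                            [0, 0, 1, 0]])   # control = first qubit, target = second
        CNOT_ba = np.array([[1, 0, 0, 0],
                            [0, 0, 0, 1],
                            [0, 0, 1, 0],
                            [0, 1, 0, 0]])   # control = second qubit, target = first
        HH = np.kron(H, H)
        print(np.allclose(HH @ CNOT_ab @ HH, CNOT_ba))   # True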

    So: the causal structure of quantum deterministic reversible transformations is time-symmetric, which is not always the case for classical ones. In my view this is a very interesting way in which quantum theory is, in fact, better-behaved than classical theory. And therefore a sensible (start of an) answer to Q!

  19. Ashley R Pollard Says:

    Interested observer of science who writes SF, that’s my caveat here.

    QM can be constructed from rules about information. So the most elementary unit can carry no more than one bit of information (set by the Planck limit, I guess).

    Everything is a composite made of a sub-set of ‘bits’ and is not determined until measured. Reversibility is maintained at the smallest state.

    I’m commenting so that I can be corrected by those who know what they are talking about. I am at best the equivalent of Marilyn Monroe being taught to explain relativity.

    And thank you.

  20. sam Says:

    A pretty unsatisfying possibility:

    1. Quantum theoryspace is in some sense ‘bigger’ than classical theoryspace, as suggested by Q2.
    2. Anything that can happen will happen.
    3. Hence, anthropically, we should expect to live in a quantum world rather than a classical world.

    (2) is probably philosophically defensible. Given (1) and (2), (3) ought to be a lock. (You maybe need some kind of argument about how quantumness isn’t innately less likely to produce anthropic observers, too.) But (1) is the really tender point, because I can _also_ imagine worlds with really weird mechanics, with lots of parameters (and hence occupying lots and lots of theoryspace), but still classical and life-supporting; it’s not at all clear that quantum theoryspace is any larger.

    If I was seriously trying to defend (1), I might go with an argument like this:

    4. Classical theory can be recovered from quantum theory, but not vice versa.
    5. Therefore, quantum theory is more parametric and uniform (using the programming-language-theoretic definitions), and has fewer degrees of freedom.

    But it’s then a stretch to go from (5) to talking about classical theoryspace at large – (4) is specific to our own universe, while I want to talk about any possible universe. I’d need a more general version, something like:

    6. Any quantum theory gives rise to a classical approximation.
    7. Therefore, classical theoryspace may contain theories that have no corresponding quantum theory.

    I dunno. Anthropic arguments always come out unsatisfying when we don’t know what the prior distribution of possibilities was.

  21. Jonathan Says:

    It’s been years since I did anything that has to do with QM, so I hope I’m not making a fool of myself here, but here goes.
    When we have two particles, we take the tensor product of their Hilbert spaces as the Hilbert space of the composite system, right? We can look at it the other way too. Start with the big Hilbert space H, and decompose it into the component Hilbert spaces.
    But wait – there is more than one way to do that. We can choose any 2-dimensional subspace H1, and then define the quotient space H2 = H/H1. H1 and H2 are two small Hilbert spaces which reproduce the original H when we take their tensor product.
    Only one choice of subspace H1 will get us the original two particles we started with. What will the other choices get us?
    Can we take the Hilbert space of the universe and decompose it into components in a new and different way? I don’t mean a different way as in grouping the particles from Earth with the particles from Venus or something like that. I mean choosing entirely different subspaces, each one being a weird intersection of all the particles in the universe.
    In this new picture – how do the laws of physics look? If they don’t make any sense – maybe this is why we experience this specific partition of the universal Hilbert space.
    Maybe the only axiom we need is that the universe is a Hilbert space, evolving unitarily in time (or maybe we don’t even need the evolution part, if we think of time as a space-like dimension). Everything else is derived from the way we divide up the universal Hilbert space into components.
    If we have only one axiom, we have less of an explanatory burden, and that’s why I think this whole thing is relevant here.
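
    Not a defense of the quotient-space construction above, but a minimal numpy sketch of the underlying point that the choice of factorization matters: the same global state, re-read under a different identification of the four basis vectors with (a, b) pairs, goes from product to entangled (the particular relabeling below is my own, chosen only to make the point concrete).

        import numpy as np

        v = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)    # |0> (x) |+> under the usual pairing
        print(np.linalg.matrix_rank(v.reshape(2, 2)))       # 1: Schmidt rank 1, a product state

        relabel = [0, 3, 2, 1]                              # a different pairing of basis labels
        w = v[relabel]
        print(np.linalg.matrix_rank(w.reshape(2, 2)))       # 2: entangled under the new split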

  22. Peter morgan Says:

    Your Q1, “Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?” finds an answer of sorts in the first sentence of the abstract of my “Classical and Quantum Measurement Theories”, arXiv:2201.04667, “Classical and quantum measurement theories are usually held to be different because the algebra of classical measurements is commutative, however the Poisson bracket allows noncommutativity to be added naturally.”
    One way to read this is that God didn’t make the universe classical or quantum, God gave us the facility to measure his works systematically and to carefully consider the results. Colbeck&Renner, in Nature Communications 2011, “No extension of quantum theory can have improved predictive power”, prove under their assumptions that Quantum Mechanics is complete as a tool for the systematic description of past and future measurement results, so we can consider how we can make Classical Mechanics as complete in the same sense. The Poisson bracket allows such a completion.

    The first paragraph of arXiv:2201.04667 is “The classical measurement theory we are accustomed to is incomplete: it cannot describe all the measurement results that can be described by quantum measurement theory. Instead of asking how we can complete quantum mechanics so it can be more like classical mechanics, we should ask how we can complete classical measurement theory so it can be more like quantum measurement theory.”
    I cannot guarantee you will like the whole paper, but it is quite closely tuned to your Q.

  23. Scott Says:

    Greg Guy #7: No, I don’t say (and I don’t think I did say) that any minor modification of classical mechanics is necessarily “less weird” than QM. It’s certainly different than QM, though, and exponentially smaller in its state space, and easier to simulate on a classical computer, because otherwise it wouldn’t be a minor modification!

  24. Scott Says:

    Rahul #8:

      To answer Q1 do we have to first believe that God exists? 🙂

    No, you most definitely do not 🙂

  25. Scott Says:

    Daniel H Ranard #9: I completely agree; it’s possible that future developments in quantum gravity (or whatever) will shed light on the question, and indeed that’s one reason among many to work on quantum gravity. Even there, though, it would be great if we could say anything interesting today about which quantum gravity developments would say what!

    I will say that it’s not obvious to me whether the past 90 years of work on QFT, the Standard Model, and quantum gravity, as heroically important as it’s been, has shed any light on either Q1 or Q2, except insofar as it’s heightened their importance by confirming QM as the foundation of physics rather than the temporary expedient some hoped it was. What do others think?

  26. Scott Says:

    Dmitri Urbanowicz #10:

      This seems to suggest the following answer: the Creator didn’t want to choose initial conditions and intended the universe to be eternal.

    But you could’ve gotten the same thing with a classical block universe, or (say) a probability distribution over time steps. It didn’t have to be quantum.

    (I fear that I’m going to have many, many responses of this form! 🙂 )

  27. Scott Says:

    Job #11:

      I actually can’t think of a better reason for the universe to be quantum mechanical than that.
      Integers must be efficiently factorizable.

    That possibility has of course occurred to me … but dare I push back a step, and ask why it’s so much more important for integers to be efficiently factorizable, than for any of the other problems in NP-P to have efficient solutions? 😀

  28. Scott Says:

    Yoni #13:

      Could there be something about consciousness that is tied to the process of moving from “many possible futures” to “one possible past”? It seems to me that any world that was fully deterministic would kind of look like the past, i.e. “dead” and static – already happened, so maybe it just wouldn’t “feel” like anything to traverse the timeline in that universe.

    I actually think there’s a lot to that, that you may have put your finger on something profoundly important! But the difficulty is this: why not just achieve the branching, the many possible futures collapsing to one past, via classical probability (plus some “free will / Knightian uncertainty” here or there if you like)? Why do you need quantum superposition in particular?

  29. Scott Says:

    Chris #14:

      In other words, you should expect to live in a continuous superposition of all instances in all possible worlds that contain that object, which mathematically amounts to whatever it is that physically grounds and distinguishes your unique conscious experience at this precise moment.
      That sounds a lot like Everett!

    No, to me it sounds more like simply a listing of all the different possibilities, or a probability distribution over the possibilities, or some other way to rank them in prominence and/or organize them. Why jump immediately to QM, with the interference and the complex numbers and whatnot?

  30. ohm Says:

    If you start with General Relativity as an axiom (this doesn’t seem weird to me), then it seems clear that, as a consequence (e.g. Thorne, 1991: https://journals.aps.org/prd/abstract/10.1103/PhysRevD.44.1077, or Thorne, 1993: https://www.its.caltech.edu/~kip/index.html/PubScans/II-121.pdf), all kinds of problems involving infinite families of degenerate classical solutions arise, which in my mind seems to be a giant neon sign pointing towards something at least similar to Quantum Mechanics. The papers cited above point out that self-consistency requirements probably constrain these infinite multiplicities in various ways, providing a plausible mechanism by which Quantum Mechanics could arise as a direct consequence of nothing other than vanilla General Relativity. Unfortunately those papers also seem to generally have the attitude that “oh hey, look, these problems are tamed after quantization”, rather than taking seriously the possibility that QM itself is just a manifestation of GR. I’ve never heard a satisfying response to why this line of reasoning doesn’t seem to have been seriously followed up on, so maybe this could be something you talk about in your book!

  31. Scott Says:

    Mateus Araújo #15 (and also Primer #16): As I should’ve said in the original post, I completely agree with you that there are excellent, known reasons why the universe shouldn’t have been based on classical, continuous-all-the-way-down physics, ranging from singularities to various computational pathologies. But that still leaves the question: whatever was going to come in to discretize or regularize the continuum, why shouldn’t it have been classical, like a cellular automaton? What was wrong with Conway’s Game of Life, or your favorite similarly Turing-complete variation thereof (maybe a probabilistic one), as the basis of reality?

  32. Giorgio Torrieri Says:

    My best guess, trying my best to separate “the physics” from “the formalism” (often foundational people confuse these):

    The world is relational: one can only find out about a system by kicking it with some sort of device. So “measurement” and system cannot be represented mathematically by measurement=f(system).
    The next best thing is result~(measurement)|system> or
    result~Tr(measurement.system)
    where (measurement) is an operator hitting the system, and different measurements do not commute.
    If we use linear operators the two statements above are equivalent so no reason to worry which to pick.

    To go further, in the world we live in, every measurement on every system gives some kind of result, there is a continuum set of measurements one can make, and results are real numbers. The right mathematical properties to describe this are possessed by Hermitian operators (completeness, real observables, connection to unitary operators; note these are not specific to QM, see Sturm-Liouville theory).

    Now, non-commutation means lack of knowledge, and we need to somehow model this mathematically. That’s what probability is, a model of lack of knowledge. So how do we put together probability with the above facts? Gleason’s theorem shows the consistent way to do so.

    So we essentially argued for quantum mechanics from “reasonable” assumptions (relationalism, the continuity of the set of questions, the guarantee to get answers). The fact that there are non-trivial consequences to this (quantization of energy levels in bound systems) is that much more satisfying.
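
    For reference, the statement of Gleason’s theorem being invoked (for Hilbert spaces of dimension at least 3): any assignment of numbers in [0,1] to projectors that gives 1 to the identity and is additive over mutually orthogonal projectors must take the form

        \mu(P) = \mathrm{Tr}(\rho\, P) \qquad \text{for some density operator } \rho,

    i.e. the Born rule is the only consistent way to attach probabilities to the projective structure.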

  33. Scott Says:

    Joshua Zelinsky #17: I like that you’re grappling with the question in precisely the spirit I intended it! But my difficulty is: why shouldn’t we be able to design some classical cellular automaton, way way simpler than the Standard Model (with its 25 adjustable parameters and so on), which would similarly give rise to a collection of “atoms” that can fit together only in discrete, tinkertoy-like ways?

    Actually, you’ve made me realize that designing such a CA would be a phenomenal research project for anyone seeking to investigate Q1/Q2.

  34. pixelatedpersona Says:

    Stephen Wolfram tried / is trying to do this. See his book A New Kind of Science.

  35. Craig Gidney Says:

    Something I thought was old hat, but which a couple people have told me has some small novel element, is that you can weaken the Born rule to “as amplitude limits to 1, probability limits to 100%”. You then recover the full p = |a|^2 rule by appealing to the usual suspects like the structure of the tensor product, no signalling, and the ability to port computational definitions from classical to quantum by using no-garbage reversible circuits as a shared language.

    Whenever you can translate a statement like “X has probability p” into something like “this family of garbageless circuits limits to always returning true” you can port this into and out of quantum mechanics as “this family of garbageless circuits limits to producing a superposition with all amplitudes equal to zero except for the |true_output> state”.

    I try to work through it in more detail here: https://algassert.com/post/1902

    Another way to think about this is… whenever we try to carefully investigate something, we do statistics on it. We do repeated experiments, or a huge variety of experiments, and aggregate all those runs into some simpler answer. This process of aggregating, of focusing on the tiny number of bits summarizing many input bits of unknown nature (e.g. a trillion shots being reduced from a trillion outcome bits to a 20 bit hit rate number), screens off the need to “get the input bits right”. It seems sufficient to converge on getting the behavior of the summary bits right.
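
    A sketch in the spirit of this comment (a standard frequency-concentration computation, not the derivation from the linked post): for N copies of a|0> + b|1>, the 2-norm of the component lying on basis strings whose fraction of 1s is within eps of |b|^2 tends to 1, at which point the weakened rule “amplitude close to 1 means probability close to 100%” is all you need.

        from math import comb

        p = 2 / 3        # |b|^2 for the single-copy state
        eps = 0.05
        for N in (10, 100, 1000):
            # squared 2-norm of the projection onto strings whose fraction of 1s is within eps of p
            weight = sum(comb(N, k) * p**k * (1 - p)**(N - k)
                         for k in range(N + 1)
                         if abs(k / N - p) <= eps)
            print(N, weight**0.5)   # norm of that component; it approaches 1 as N grows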

  36. Scott P. Says:

    I think the answer to Question #1 is most likely to be along the lines of the answers to the questions “Why does the Solar System have 9 planets, instead of more or fewer?”, “Why is the Earth’s surface 75% water, instead of 50%?”, and “Why did civilization get its start in Mesopotamia, instead of somewhere else first?”

  37. OhMyGoodness Says:

    The answers to Q1 and Q2 are clear-

    Q1-The creation of a universe by a deity must arise from an intention. The intention must have some goal. The goal of the intention to create a universe must be associated with something that isn’t known by the deity (if known it would not provide interest). The created universe must therefore include elements that cannot be known prior to their measurement (a true randomizer to said deity). Hence quantum mechanics is necessary in any universe created by a deity.

    If you evaluate potential explanatory models-Creator+First Person Shooter, Creator+Simulation, Creator+Boredom, etc., all require some element that cannot be known beforehand by the deity in order to provide sufficient interest to form an intention and hence quantum mechanics is a necessary condition. QED.

    Q2-Following from Q1, steam engines and flywheels all the way down is insufficiently interesting. Hence, our universe is the simplest possible that meets the deity’s requirements for unpredictability and interest.

    Of course the deity plays dice with the universe. Why roll at all if the outcome is known a priori?

  38. bertgoz Says:

    Why not tackle the even more ambitious question: why should the universe be mathematical at all?

    It seems that QM is an efficient patch to some of the bad inefficiencies that happen when modelling the universe as a mathematical entity.

    As Feynman said “Why should it take an infinite amount of logic to figure out what one stinky little bit of space-time is going to do?”

  39. Scott Says:

    Augustin Vanrietvelde #18: Thank you; your comment makes me glad I did this post! You’ve given me an argument that (1) genuinely depends on QM and (2) I’d genuinely never heard before.

    Alas, thinking through your CNOT example, I’m led to a very different conclusion than you are: namely, that I see no reason why the time-reverse of a causal influence from A to B, should be interpretable as a causal influence from B to A. Suppose, for example, you took a video of rain making grass wet, and then played it for me backwards: would I say that in the bizarro backwards world, the wet grass had caused the rain? No, I’d simply say that it “looked backwards” to me—or, on further reflection, that you’d broken causality by reversing the thermodynamic Arrow of Time!

    I do, on the other hand, agree that “no action without back-action” is an elegant feature of QM, and that it’s beautifully illustrated by the Hadamard conjugate of CNOT(a→b) being CNOT(b→a).

  40. Scott Says:

    Ashley R Pollard #19: The difficulty is that you could also have a classical theory where everything was built out of elementary units of information, at some minimum length scale (which you could call the “Planck scale” even though it wouldn’t involve Planck’s constant). Cellular automata provide explicit examples.

  41. dankane Says:

    As for Q1, it seems like the answer ought to be a combination of the anthropic principle and Occam’s razor.

    What is the *mathematically simplest* universe that gives rise to complex intelligences that can question why their universe is the way it is?

    And well mathematically speaking, quantum mechanics is pretty simple. Sure it uses exponentially large objects in the background to describe everyday situations, but the fundamental mathematical formulations are only a few equations. Now I’m not sure that it is actually the *simplest* way to give rise to complex intelligences (though really you’d expect to sample from a Kolmogorov prior rather than just take the single simplest option), but it seems a lot simpler than hacking classical mechanics to make things work.

    Like you could add tinker-toy atoms to classical mechanics, but you’d also have to add a whole bunch of ad hoc rules about which atoms can combine into molecules and how much energy they release when they do and what angles they form and so on and so on. Quantum mechanics on the other hand, manages to get this as an emergent property of a few equations.

  42. Scott Says:

    Sam #20:

      But (1) is the really tender point, because I can _also_ imagine worlds with really weird mechanics, with lots of parameters (and hence occupying lots and lots of theoryspace), but still classical and life-supporting; it’s not at all clear that quantum theoryspace is any larger.

    Yes, this is precisely the difficulty!

  43. Jacob Says:

    Maybe we flip Q1 around: Why does classical physics exist? If QM is really true at a fundamental level (assuming QM here includes QFT and the standard model and somehow also gravity), then why does it seem so weird to us? Presumably the answer is that sentient life requires a very large number of particles to exist, and also to exist on a planet which is even larger, and it’s not that surprising that 10^23 particles mixed up in complex structures and at length scales much bigger than the Planck length behave differently than an ideal gas, and that different behavior is what we call classical physics. It’s a special case of QM that we have been psychologically adapted to.

    We would expect many of the same conservation laws to exist on the macro and micro scale, and they do, whereas some things aren’t observed at the biggest scale so they seem weird to us because we’ve never encountered them before. Both microwaves and X-rays behave very differently than visible light, even though they’re all the same “thing”.

    As for Q2…I suspect the answer is that we live in the simplest possible universe which is also complicated enough to allow life to form. The math only seems “complicated” because it’s hard to understand, but the number of basic assumptions is very few. SU(2) and SO(3) symmetry seem intuitive to me; U(1) is a little weirder but I think it’s just because it has no classical analogue. The overall linearity of QM is probably not a coincidence either, but rather a core requirement of the theory. General relativity requires only a few extra assumptions, and I suspect when we have a GUTOE the number of assumptions will still be very few.

    Saying that QM needs to be on the complex plane is equivalent to saying that the wavefunction needs two components at every point in spacetime. Maybe somebody could come up with a mathematically self-consistent and non-trivial QM that didn’t, but I suspect it wouldn’t be able to support any type of molecule, and hence no life, and no sentient life.

    But why the QM formalism in general? Why is there a thing called a wavefunction, and we can operate on it with different operators (energy, position, momentum) and those mathematical operations correspond to measurements we make? I’ll admit this has me a bit stumped, and I’m curious what you come up with as an answer. From Noether’s theorem we know that invariance under translations in time and space leads to conservation of energy and momentum respectively. It stands to reason we should be able to measure and/or calculate what these quantities are…somehow. Same for any other physical measurement, we should be able to describe it mathematically…somehow.

    It may be that QM is the simplest (aka fewest number of assumptions) theory that satisfies everything it needs to:
    1. Linear
    2. Local
    3. Poincare group symmetries (which stem from the universe being translationally, rotationally, and boost-invariant)
    4. Allows for the existence of atoms and molecules. This last one because of the anthropic principle.

  44. Scott Says:

    Jonathan #21: It’s certainly true that you can start with just an abstract Hilbert space plus, say, the Standard Model Hamiltonian, and then use the locality of the Hamiltonian to find a preferred tensor product decomposition for the Hilbert space, without needing to assume the tensor product decomposition from the start. And you can do even more impressive things than that; Sean Carroll (for example) has written a lot about this. But that’s the correct answer to a very different question than the one I asked! 🙂

  45. Scott Says:

    Peter Morgan #22: Why am I not surprised that you have a paper that answers the question and that I won’t understand? 🙂

    But I’m confused: do either you, or Colbeck and Renner, give any argument that quantum measurement theory is (in any sense) a unique completion of classical measurement theory? Or do you merely argue that it’s one possible completion, “completion” in the sense that nothing additional can be added to it?

  46. James Giammona Says:

    Very exciting! I’m looking forward to hearing what you come up with.

    I don’t have any strong views on this, but wanted to pass along two approaches/ideas to make sure you were aware of them.

    The first is a central role of information processing and some kind of renormalization process that leads to effective theories that look like QM. As you are deeply aware, information and entanglement seem to be fundamental components of WHY QM is the way it is.

    Toffoli has made some interesting points about action being the counterpart of entropy for computations/trajectories instead of for states. (https://link.springer.com/article/10.1023/A:1024411819910) and (http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.42.2410)

    I wonder if there is some general argument that says a lot of different micro-theories would converge to something that looks like the path-integral formulation of QM, since varied laws of micro-dynamics + coarse-graining will yield an effective description that looks like laws that minimize an action. I don’t know why the state space would be a Hilbert space though. But it feels like looking at processes that coarse-grain to something that looks approximately like a Hilbert space might be a fruitful avenue.

    Second, while I haven’t looked into this in depth, there is a small community of people studying QM as being emergent from particular laws of logic and inference: https://arxiv.org/abs/1511.02823. I think this is in a similar vein to the papers you mentioned in your 3rd comment.

  47. entirelyuseless Says:

    I think the actual answer will involve determining what the universe needs to look like to an observer, given the fact that no observer can observe itself perfectly, even in theory.

    E.g. what would the universe look like to an AI instantiated in Conway’s Game of Life?

    It surely would not look like Conway’s Game of Life, given that the observers on the board could not have an exact model of themselves (which they could have, if they could look at themselves the way we could if we saw the board).

    Since no one has yet answered this (as far as I know), it could simply be that Conway’s Game of Life (or any other classical universe), to an instantiated observer, would look like quantum mechanics or something very like it.

  48. Augustin Vanrietvelde Says:

    Scott #18: I’m glad you liked the idea!

    About your comparison between CNOTs and rain hitting grass, I’d say this is precisely why, when laying down the premises for quantum causal modeling, I was stressing the restriction to only look at reversible transformations. Indeed the irreversible ones can definitely not be hoped to obey such a thing as time-symmetry of their causal structure — in fact, in the quantum case too, the causal structure of irreversible transformations is not time-symmetric.

    But the premise here is to restrict yourself to only look at reversible transformations. For such transformations, there is by definition no thermodynamic arrow of time, so your example doesn’t apply. And it is arguable that reversible (and deterministic) transformations are the only ones you should be looking at anyway, on the premise that irreversibility only happens when the human perspective gets in the way, and that the world out there is time-symmetric. So, in short, none of this is about processes that would “look backward to you”, they have been excluded from the start, both in the classical and quantum case.

    About your comment on backaction, I totally agree, and I would say that in some sense this whole time-symmetry thing can be seen as a way to formalise the “no action without backaction” principle. So you can see it as just this principle, but now stated in a mathematically and conceptually rigorous way, and also proven in all generality. In a sense it turns it from this “elegant feature” you’re talking about into a deep structural property!

  49. mjgeddes Says:

    I’d guess that the quantum aspects of reality are a special-case of a more general super-selection principle for reality itself. That is to say, the only assumptions we need are that the laws of physics themselves are not uniquely fixed, and that conscious observers exist at all.

    It could be that the laws of physics evolve over time, and just as the state of our particular universe started off simple at the big bang, and increased in complexity over time, so too is this happening at a deeper level… to the laws of physics themselves. One could imagine universes popping in and out of existence in a much vaster multiverse, and there’s some evolution of universes over many eons.

    I’m saying that principle of Occam’s razor isn’t necessarily true for the universe as a whole. Perhaps the laws of physics aren’t maximally simple, rather they’ve evolved towards greater complexity, just enough to allow conscious observers to exist. And quantum mechanics might just be the minimum amount of ‘extra’ complexity you could add to the simplest possible laws of physics to allow for conscious observers.

    You said in (4) you want to assume that current QM is exactly correct, but if that were true, I really don’t think there can be any answers for your questions. I mean some of the brightest minds have thought about quantum foundations for decades, so surely it’s unlikely that anything has been missed?

    Looking at the theorems like Gleason’s theorem etc., I don’t think they’re convincing arguments for completeness of QM, since in part they already *assume* that QM is correct, so arguments for completeness are just circular logic (for instance, a mathematician might assume that the Hilbert space formalism is correct, but if the ultimate theory turns out to use a slightly different geometry from Hilbert space, then obviously, the theorems that started by assuming standard Hilbert space would not apply).

  50. Ernie Davis Says:

    Presumably, Aristotle or Francis Bacon or Newton couldn’t have inferred QM just by thinking about the world of ordinary experience and applying Occam’s razor; you have to do a bunch of carefully designed experiments with fairly sophisticated equipment. If that’s true, there could have been a world that was exactly the same as far as ordinary experience is concerned, but worked on entirely different underlying principles.

    When you consider applying Occam’s razor using some measure of complexity like Kolmogorov’s, you have to take into account, not merely the complexity of the foundational physical theory, but also the unfathomable complexity of the boundary conditions in the QM account.

  51. Oleg S. Says:

    Can’t wait to read your book on Q 🙂

    Here’s how I see the problem:

    So, answer to Q is A1: QM is the best given constraints.

    Ok, Q1a: what are the options?, and Q1b: what are the constraints?

    Options: A1a – Probably computational power is what’s important, and not the practical details of implementation. Two obvious alternatives are a less powerful universe (classical) and a more powerful universe (one where NP-complete tasks can be solved using polynomial resources). Q1c: is there a less obvious alternative? Q1d: why is the universe not classical, and Q1e: why is the universe not super-powerful, offering NP solutions (I guess there is a term for this)?

    Constraints: A2b.1 – life, and A2b.2 – simplicity. The universe should be simple enough so as not to look too suspicious, and complex enough to allow the evolution of us. Note that simplicity is not the same thing as entropy: the Big Bang was very unlikely because of its low entropy, yet at the same time very simple. Also, simplicity is a property of the explanation, not the actual universe: nothing precludes the universe from starting from a very complex pattern, it just makes for an unconvincing theory.

    Ok, Q1d: why not classical? Hypothesis: it’s hard to make it simple enough. You can build a classical universe which is capable of producing intelligence in a Game of Life simulator (I assume it has enough power for that), but you need an insanely complex starting position or instructions for that. Maybe I’m wrong and there are ideas on how to evolve universal Turing machines from a very simple classical start.

    Alright, Q1e: why not NP-powerful? Hypothesis: NP-powerful is too powerful for the evolution of life. The intuition behind that is that many problems life tends to solve are actually NP-complete, and life just evolves more and more clever tricks to solve them using polynomial heuristics. Then, if you can just obtain the best solution using polynomial resources, you don’t really need to be very clever – you can just get there in one (P) step. This is obviously a speculation. Also, maybe in an NP-powerful universe you have to solve super-NP-complete problems, and still need some intelligence. It would be interesting to read about the computational complexity of optimizing one’s fitness.

    Fine, Q1c: how is BQP the best? A) Maybe it’s not (I just don’t know other alternatives in a similar power spectrum, and would really enjoy reading about them). B) Somehow, to capture free energy from the low-entropy past, one needs complex chemical (or alternative) networks, which are built using very simple rules. I don’t know why chemistry cannot be purely classical, and again, maybe it can.

  52. matt Says:

    An important clue is that quantum mechanics is useful in pure math: topological invariants, supersymmetry and Morse theory, etc… So, clearly the answer is that some super-powerful aliens constructed a quantum universe to answer some question in 11-dimensional topology, and we are merely bugs in their code.

    And based off your ankle injuries, Scott, may I recommend increasing your physical activity, with basic weight training, flexibility and mobility, and cardio? It will make you a little less mortal and will probably increase your research output in the long run.

  53. Jeff Knisley Says:

    As a mathematician, I’ve often times thought about questions like Q myself, but from a different perspective. If I had the time and the ability, I would seek the answer somewhere in the arena of coherent states, squeezed states, Parseval frames, and similar. Admittedly, an “older, well-trodden” path, but given that from God’s perspective it all begins with light (according to Genesis), and given how much light is a central theme in humankind’s pursuit of understanding — first a particle, then a wave, then both, bent by a gravitational field, emitted in quanta, then a field, then a squeezed state that is now a tool for study of everything from gravitational waves to quantum computing — the theory of coherent states has for me always seemed to be where best to find why we think classically in such a quantum world.

    Feel free to stop with the first paragraph, but I’ll elaborate if any of you are still interested. Classical physics in many ways parallels the development of mathematics — intertwines perhaps — but in spite of the adage “God created the integers,” mathematics is anything but divinely inspired. Though deep, elegant, and incredibly useful, mathematics often struggles even with itself, not to mention with physics and the other sciences. All that massive effort to transform Riemann Integration into measure theory and the Lebesgue integral yields an absolute integral, and much of physics — the Fourier Transform, for instance — requires integrals that aren’t absolute. Certainly, there are other “integrals,” but even then, essential identities like
    $$ \int_{-\infty}^{\infty} \frac{ \sin(x)} {x} dx = \pi $$
    require “improper integration” or other “ad hoc” justifications — and be careful how you integrate such “conditional integrals” lest you change their values altogether. Classical physics, especially classical field theory, tracks with this rigorous-followed-by-“ad hoc” mathematics. Where are these positive and negative infinities in the limits of integration, why do we model the world with sine waves that are infinite in extent, and, even more important, why does doing so even work?

    One possible answer is redundancy. We base so much of both math and physics on uniqueness — unique representation by an orthonormal basis, for example — whereas so much of reality does not feature uniqueness. One reason for the success of neural networks as a machine learning algorithm lies in their redundancy — I especially enjoy an example I use with students where I train a neural network (simple classifier), “destroy” a random 10% of the “artificial neurons”, and then show them how little, if at all, the neural network’s performance has been reduced.
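
    A minimal sketch of that kind of demo, assuming scikit-learn and its bundled digits dataset (an illustrative toy, not the actual classroom code):

      # Toy redundancy demo: train a small classifier, zero out a random 10% of
      # its hidden-layer weights, and compare accuracy before and after.
      import numpy as np
      from sklearn.datasets import load_digits
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      X, y = load_digits(return_X_y=True)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500, random_state=0)
      clf.fit(X_tr, y_tr)
      print("accuracy before damage:", clf.score(X_te, y_te))

      # "destroy" 10% of the input-to-hidden weights at random
      rng = np.random.default_rng(0)
      W = clf.coefs_[0]
      W[rng.random(W.shape) < 0.10] = 0.0
      print("accuracy after damage: ", clf.score(X_te, y_te))

    Typically the accuracy barely moves, which is the point.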

    Enter the overcompleteness of the coherent states, as well as the related mathematical concept of a Parseval Frame. Coherent states seem to be instrumental in explaining why we have a classical perception of a quantum world — as well as illustrating where our quantum understanding is still not fully developed. Frames are an essential tool in applied mathematics, engineering, machine learning, and a host of other arenas where the importance of obtaining “a solution” outweighs the importance of it being a “unique” solution. Indeed, there is to me this “theme” that has emerged throughout modern mathematics, science, and engineering — namely, that “reality” is not a unique experience common to all, but instead, it is via redundancy a “shared experience” that admits multiple interpretations including highly idealized logical descriptions that are not part of that reality itself. That is what I would pursue, if I had the time and the ability to pull it off.

  54. [Thing] Says:

    Scott #5:

    Any answer along those lines, it seems to me, immediately crashes and burns once we realize that passing to wavefunctions, far from decreasing our classical simulation cost, has exponentially increased it—the fact famously exploited by quantum computation.

    All right, here’s an attempt to address this objection. Brace yourself for some wild, late-night-at-the-freshman-dorm quality speculation, but … what if:

    1) Universes governed by laws of physics which support intelligent life are, as Yudkowsky, Bostrom, et al propose about our universe, prone to being taken over by runaway superintelligent optimizers with more or less randomly selected objective functions.

    2) Our apparent reality is, not a simulation exactly, but an epiphenomenon of whatever some superintelligent optimizer is up to, operating on some lower substrate of physics (possibly itself an epiphenomenon of an even lower-level runaway optimization process, and so on some hopefully finite number of layers down until you get to the true fundamental physics).

    3) The substrate physics immediately below ours actually does allow arbitrary non-deterministic Turing-computation for free, or infinite parallelism, or something like that, i.e. something much more powerful than both classical and quantum computation.

    4) The superintelligent optimizer below us (SOBU for short) is indifferent to intelligent life forms such as us arising as a side effect of its computation, *just as long as* it is able to ensure that no superintelligent being that arises in our level of reality is ever able to “reach down” and commandeer its computational resources for purposes contrary to its objective.

    5) The method SOBU has found to minimize the risk of that happening entails that, compared to it, our computational capabilities are much more limited. Specifically, we are limited to the capabilities of quantum computers.

    6) This may strike us as a weird and arbitrary complexity class compared to classical Turing machines, but SOBU doesn’t care if some janky limited form of beyond-classical parallel computation “leaks” into our reality, as long as we are still computationally much weaker than it.

  55. Paul Topping Says:

    I don’t have a direct answer to these questions but want to make a suggestion that I hope is relevant.

    I suspect that our attempts to understand the universe are thwarted greatly by the mental hacks, installed by evolution, that we bring to bear on the problem. Specifically, we see the universe as made of objects with some permanence, influenced by forces, evolving over time. We interpret everything through the lens of causality. And those are just the mental hacks that we can see!

    We recognize that QM strains our ability to visualize how the universe works. As Feynman famously said, no one understands QM. I suspect this is just the tip of the iceberg of our lack of understanding. It’s probably worse than we realize.

    Here’s what I think will happen. We will eventually invent a real Artificial Intelligence capable of thinking of these things. We will eliminate these mental hacks from our AI, perhaps replacing them by some new ones not shared by humans. An entire science will arise around inventing new ways to look at the universe. Then we will set our non-human AI(s) loose on understanding the universe. I suspect they will come up with an entirely different way of looking at the universe, perhaps several. Then we will query the AI in order to figure out what they mean. I don’t offer a timeline for all this.

  56. Donald Says:

    Start not with god.
    Start with the Kolmogorov complexity prior over the space of computer programs.
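
    (For concreteness, one standard way to write that prior, as a gloss on the above, is the Solomonoff weighting of programs, where each program p gets a weight falling off exponentially in its bit-length |p|,
    $$ \Pr[p] \;\propto\; 2^{-|p|}, $$
    with some fixed universal prefix-free machine running the programs.)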

    Around 1/n-th of this space has run time BB(n). This space doesn’t care to minimize runtime. Not in the slightest bit. However many of the programs are much faster. The only thing this prior cares about is how many bits something takes to express. And quantum mechanics can be expressed in few bits. (I would hazard a guess that there are no laws of classical physics that are similarly simple to our own and which produce a universe similar to this one.) Of course, I can’t prove this and it may depend on how similar counts as similar. However, if this is true, it is a highly nonobvious mathematical fact.

    We can put some bounds on the size of our past, in that it has to be big enough to allow evolution, but small enough to let us be first. If the universe were exponentially huger than what we see through telescopes, with a practical way for a self-replicator to spread, there is no way we could have been first. This puts no limits on quantum branches, or on space past our cosmic horizon that doesn’t influence us.

    There is no particular restriction on physical law to be a shape easily grasped by human minds. Nor for the link between underlying law and surface phenomena to be direct.

    Any attempt to reformulate QM into a simpler or more elegant form is just evidence towards quantum mechanics having a low Kolmogorov complexity. We already know it’s pretty low, but we may be able to reduce logical uncertainty about exactly how low. This includes showing that quantum mechanics is optimal under some criterion X, since optimality itself is simple to state. If X is simple, this is further evidence of a low Kolmogorov complexity. (Some “up to choice of Turing machine” stuff going on here.)

  57. Ernie Davis Says:

    I see that Lucien Hardy in the paper you mentioned (thanks for pointing it out) actually makes that claim: “Put another way, could a 19th century theorist have developed quantum theory without access to the empirical data that later became available to his 20th century descendants? In this paper it will be shown that quantum theory follows from five very reasonable axioms which might well have been posited without any particular access to empirical data.” But that’s wildly far-fetched. Hardy’s description of classical theory, “A classical system will have available to it a number, N , of distinguishable states. For example, we could consider a ball that can be in one of N boxes,” is not one that 19th century theorists would have accepted or even recognized; it is completely informed by a quantum mechanical world view. What the argument shows is that, if you view the world in the way that QM tells you to view the world, then QM works better than CM. The idea in Hardy’s conclusion that it should have been a classical theorist rather than Schrodinger complaining about “damned jumps” is absurd (or, I suppose, a joke). Quantum theory, after all, is quantized.

  58. Tu Says:

    Scott,

    I am thrilled to learn that you are dedicating some time to this undertaking. I can hardly think of someone more suited for the task than you. I like in particular that you highlight your non-dismissal of the questions as silly among your qualifications.

    In the spirit of “Metaphysical Spouting” and “Embarrassing Myself,” I howl the following crack-pot half-baked sub-Aaronson Beachboys Threshold ideas into the wind:

    I have always shared (what I am interpreting as) your dissatisfaction with our current understanding of QM. By this I do not mean a dissatisfaction with QM as a theory, but more a dissatisfaction with the way that we talk/think about it. One can accept QM as “fundamental” and “true all the way down” and still be surprised and perplexed by violations of the Bell inequality. In the words of Bell himself: “it is a curious situation indeed.”

    It does feel as if, could we just adjust our perspective a bit, or peel back one more layer, we would see why things had to be so. Which layers should we try to peel back, then?

    Two suspicious characters that appear in our current quantum mechanical description of the universe that I think warrant close inspection are:

    Time: Different from space, and a really weird cat! Does privileging the present (by which I mean adopting some kind of relativity-respecting temporal ersatzism) point us towards QM somehow?

    Zermelo-Fraenkel Set Theory With The Axiom of Choice

    We are going below the beachboys threshold, but I warned you!

    In my view, ZFC is far too strong an axiomatic system for any mathematics that we are applying to the physical world. As I am writing this, scrapping ZFC for some weaker system may help restrict the set of statements you can make about the physical universe, but is not likely to yield anything particularly quantum, so I guess you can just ignore this altogether. But I guess what I am vaguely gesturing at is this: does picking another axiomatic system (ideally a weaker one) for our mathematics, and really biting the bullet and sticking with all of the consequences of this all the way down, somehow point towards something that maybe looks like QM?

    Let it be known though, Tu thinks ZFC should be banned in physics departments around the world.

    Can’t wait to read the paper, and for you to invest the resulting Nobel Prize winnings into my QC startup!

  59. Timothy Chow Says:

    Consider the following

    Weirdness Hypothesis. In any quantum-mechanical universe, quantum mechanics will seem weird to its inhabitants.

    I don’t really believe the Weirdness Hypothesis, but it strikes me as being difficult to either prove or disprove. If your project succeeds, then presumably it will disprove the Weirdness Hypothesis. So this puts a lower bound on how hard your project is.

  60. David Speyer Says:

    This is much in the same direction as Jacob’s #42: Maybe the question you should be asking is “why does QM seem strange to us”? To which the answer, presumably, is that our brains evolved to think about phenomena which occur with Avogadro’s number of particles, not single particles. Which raises the question “why didn’t life evolve to use a small number of particles, so that we would experience quantum effects frequently in our daily life?”

  61. Ernie Davis Says:

    By the way, a word of warning: Twelve years ago, I said to myself, “I’m over fifty; it’s time for me to stop beating around the bush and work now on accomplishing what I want to accomplish”. I’m still dealing with the consequence of that decision, in the form of a book that is, currently, 3/4 written (incidentally, distantly related to your project, in that it axiomatizes some parts of “intuitive” physics). I can’t offhand think of any important work that was done with this mindset, though there probably are some.

  62. Yoni Says:

    Scott #28:

    Thank you; I’ll take that as a compliment.

    Unfortunately this gets to the bit where my lack of QM knowledge really stops me even thinking about the answer to that. But I think that any option you took would need to also have fundamental uncertainty of the future (which you posit getting through something like classical dice throws) but also fundamental *certainty* about the past. You can’t have two presents that can be arrived at from the same past. So far as I am aware (and please let me know if I am wrong), that’s something that QM has (some sort of conservation of information). I can’t see how you would get the same thing in a classical system. Say the path of an object is certain (bearing 0, speed x) and then there is a “dice roll” to change its bearing in some way or another (creating the uncertain future). Surely once it is rolled, and you now have a new bearing (0 + some random number), there are multiple histories that could have gotten you there, e.g. you could have had (0 plus 2 * the random number) as your starting bearing and the outcome of the dice roll being the opposite of what it actually was. So one present can be arrived at from multiple histories.
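
    A toy numerical rendering of that contrast, with made-up dynamics chosen purely for illustration:

      # Sketch: a noisy classical update can merge two different pasts into the
      # same present, whereas a unitary (hence invertible) update never can.
      import numpy as np

      def classical_step(bearing, kick):
          # bearing_new = bearing_old + kick, with the kick then forgotten
          return bearing + kick

      # two different pasts, two different kicks, one and the same present
      print(classical_step(0.0, +5.0), classical_step(10.0, -5.0))   # both 5.0

      U = np.array([[0, 1], [1, 0]])          # a 2x2 unitary (here, a bit flip)
      psi1, psi2 = np.array([1, 0]), np.array([0, 1])
      out1, out2 = U @ psi1, U @ psi2
      print(np.allclose(U.T.conj() @ out1, psi1))  # True: U^dagger recovers the unique past
      print(np.allclose(out1, out2))               # False: distinct pasts give distinct presents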

    If that’s the case then you have lost that fundamental thing about the present: one past, multiple futures. You will also have really lost the entire idea of “past” at all. If the past is uncertain like the future, then what makes it the past?

    I’m not 100% sure if I am just repeating myself or adding something. Do you have a concrete example of a classical process that could lead to multiple futures from one present, but where the present implies one, and only one, past? Am I making a mistake in thinking that QM has this property at all (i.e. in QM could you have two different starting functions that end up with the same precise end state? Alternatively, am I wrong that one starting state can lead to two different future states?)

    It feels to me that if there is something special about QM in general that gives you that situation, then that’s really Q1 territory, and the sub-question of “but why this particular version?” strays into Q2 territory (where I can’t play, as I don’t even know *how* to think about things).

  63. Peter Morgan Says:

    Sorry, Scott #44. I don’t think the answer I’m suggesting is either complete or unique, but I personally find it helpful. The log-jam has lasted almost a century, so I figure even a small help is good to have. I’ve stopped commenting here often, but this post of yours was too much :-), but in any case I’ll either find a way to say what I’ve been saying in a way that you find compelling enough to look harder, or I won’t. I’ve posted my comment on Facebook (see above, my Website link), so we could also have a discussion there, as you like.

    Quantum measurement theory isn’t unique, because GPTs (Generalized Probability Theories) are a thing. Nonassociativity, as for Jordan algebras, might be useful in some contexts. Different axioms give different structures. Georgio #32 more-or-less follows axiomatic constructions like Lucien Hardy’s, from which we can get ourselves either to a commutative classical formalism or to a noncommutative quantum formalism, although Planck’s constant has to be put in by hand. A commutative structure cannot model all transformations of measurement results, so why wouldn’t you want to include noncommutativity? Signal analysis includes noncommutative operations but is not usually said to be non-classical.
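
    A tiny numerical illustration of that signal-analysis point, with an arbitrary toy signal: delaying and modulating do not commute.

      # Sketch: time-shift (delay) and frequency-shift (modulation) of a discrete
      # signal do not commute; the two orders differ by a phase.
      import numpy as np

      n = np.arange(8)
      x = np.random.default_rng(0).standard_normal(8)

      delay    = lambda s: np.roll(s, 1)                    # circular time shift
      modulate = lambda s: s * np.exp(2j * np.pi * n / 8)   # frequency shift

      print(np.allclose(modulate(delay(x)), delay(modulate(x))))   # False
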
    Does it help if I agree with your #31? Continuous-all-the-way-down certainly seems wrong to me as an assumption: nonlinear differential equations might or might not result in vortices and other chaotic features all-the-way-down after some finite time, in which case differential equations would be very tricky (this, in its Navier-Stokes aspect, is a Millennium Prize problem). In such a case, measure theory and probability measures can still be possible, but part of that theory is the existence of incompatible probability measures. Note, as well, that if the axiom of choice is necessary to give a complete initial condition, then determinism would also be very tricky, so there can be a convergence of CM with QM from that slightly different direction, with the presence of an effectively irreducible noise.
    Augustin #18 is an interesting idea, but I think we can’t assume that QFT is so neat. [QFT is much more like thermodynamics, because of its focus on a continuum of possible measurements of temperature, density, et cetera, in different finite regions of space-time than it is like many-body physics.] I also doubt it’s a good idea to ignore that events in measurement devices are apparently not usually reversible, in contrast to the reversibility of unitary transformations at the level of finite-dimensional Hilbert spaces. An aspect that I think is significant but that AFAIK goes unnoticed in causal models, because they are discussed mostly at a logical level, is that Planck’s constant as an amplitude of a Poincaré invariant noise is distinct from thermal noise, which has different symmetries and has amplitude determined by temperature and by Boltzmann’s constant. Axiomatic constructions of QM also ignore this distinction, which cannot even be made except in a 1+1- or higher-dimensional space-time that supports an action of the Lorentz group.

  64. David Pearce Says:

    Why quantum mechanics?
    Because not even God can create information ex nihilo.
    Zero information = all possible descriptions = Everett’s multiverse.
    Unitary-only QM is the quantum version of the Library of Babel:
    https://www.quora.com/Why-does-the-universe-exist-Why-is-there-something-rather-than-nothing/answers/14473029

  65. Isaac Grosof Says:

    Hi Scott,

    For me, the reason why classical mechanics feels aesthetically unpleasing is that it posits discrete objects in a continuous world. Objects are either here or not here; they have sharp edges. In a continuous universe, objects should be “fuzzy-edged”, in the manner of a wavefunction.

    To be slightly more mathematically precise about it, in classical mechanics, functions like the indicator function of “does the particle occupy this location?” jump from 0 to 1, and thereby have an infinite Lipschitz constant.

    A sharp jump like that feels wrong, like it ought to require infinite energy to perfectly segment “the particle” away from “not the particle”. So I much prefer “continuous first” theories like quantum mechanics on those grounds.

    How does one avoid having discrete objects in a continuous world? There are really just two options: continuous objects, like wavefunctions, or a discrete world, like cellular automata. So next I’ll address why I think cellular automata are aesthetically unsatisfying.

    All the cellular automata I’ve seen are fundamentally anisotropic: They have certain preferred directions and symmetries, corresponding to the shape of the cellular grid. These anisotropies are “globally visible”: no matter what scale one looks at a universe built on such a grid, it’ll be possible to tell the orientation of the grid far below. As a result, every emergent scale will have the same anisotropies. I think this problem might be a fundamental issue with discrete foundations: they’re always going to have global consequences.

    So, on aesthetic pleasingness grounds, I’m arguing in favor of quantum mechanics over a few other competitors. I can’t say that these considerations pick out quantum mechanics over all other possibilities, but I think the space of physical systems that quantum mechanics lies in is more appealing than some other spaces.

    Thanks for posing these questions, Scott!

  66. Jair Says:

    I think God first made a classical universe, but got a bit bored because it was too predictable. He sat in on a couple linear algebra classes the mortals gave and thought this sounded more interesting. Lately He’s been looking at Wolfram’s new stuff and laughing uproariously but taking it into consideration for the next go-round.

  67. Jay L Gischer Says:

    My knowledge of QM is pretty low. I do know that I have heard people who are experts say things like “The probability of this happening is 50-50 as close as we can measure, and nobody has found any way to tweak those probabilities in the slightest.”

    For instance, passing a polarized photon through a filter that’s 45 degrees off line of the polarization. Apparently there are other instances of this.

    And this makes me wonder: if electrons and photons, etc., behaved like rocks, might it not be the case that we would have a universe where entropy does not always increase?

    I don’t have the tools to work out that theory, but it seems like a place to look.

  68. Tu Says:

    Timothy Chow #58:

    I think the Weirdness hypothesis has something going for it. If I were God, and I were setting things up before the dawn of the universe, I would certainly want to ensure that any and all of its sentient inhabitants were guaranteed a minimal amount of confusion, frustration, and wonder upon inspecting my handiwork.

  69. Ivo Says:

    Ignorant layman perspective here.

    It’s possible the first question is not even allowed, as in, it may not make sense to ask. The world appears to be fundamentally QM. Isn’t asking why it could not be classical the same as asking why 3 could not be 5? Or why this Lego figurine could not have been made from carrots? It would not be the same figurine, and likewise, a ‘classical’ world would not be our world!

    Reflecting on the work of the likes of Nima Arkani-Hamed, it seems like we will first need to explain how the Big Bang came about (perhaps from a quantum fluctuation?) and how the seemingly classical world emerged from it. Space, time, QFT – the entire thing. Only then will we be able to answer why all the quantum weirdness was needed. And only then will we be able to judge how hard it would really be to replace Schrödinger’s equation with Newton’s, while maintaining the world as we know it.

  70. Alex K Says:

    Is the distinction between a classical and a quantum universe significant? We could, if we wanted to, model our universe as a classical probabilistic Turing machine. (And inhabitants of a classical universe could model theirs as a quantum Turing machine.) Thus isn’t any universe satisfying the strong Church-Turing thesis (i.e. one that can both implement and be implemented on a Turing machine) ultimately obeying the same laws?

    (Please let me know if my premises are mistaken – I would really appreciate it.)

  71. Tim Maudlin Says:

    A couple of comments that are not proposed answers.

    1) Of course, talk about why physics *had* to be this way or why God did this rather than that is not serious. If one means by “had to” nomological necessity, then it is trivial that the physical world had to be this way, and if you mean something stronger then it trivially did not have to. And God didn’t do anything, since there is no God. I take it what you really mean is that the laws of nature ought to, in some sense, be simple and compelling, in the sense that changes would lead to something more intrinsically complicated. I personally think that is so, but can’t prove it. You then object: but then why not just use classical laws, since they look simple and compelling? And here one should remark: we don’t even have any internally coherent version of classical physics. Consider the problem in classical EM, for example, of the self-field of a charged particle. There are kludges to avoid the problem, but they are kludgy and not fundamental solutions. So the idea that “classical physics” could somehow be worked out to be simple and sharply formulated is not something we know to be true.

    2) The sort of simplicity one looks for as rationally satisfying will only show up at the level of fundamental law, not whatever emergent approximations it may give rise to. I think there will be exactly one such law, and it will cover all physics, including gravity. And I think that the simplicity of the law will depend on its being formulated in the right mathematical language. We know we don’t have a TOE, and I don’t think we have good grounds to think we are even using the right mathematical language to formulate it. But until we have it, we cannot appreciate how any alternative would be more complicated and messy.

    3) I rather doubt that your take on “generalizing the rules of probability” is correct. As you know, for example, Bohmian Mechanics gets on just fine with classical probability. You seem convinced that probability calculus plays a deep role here. For what it’s worth, I see no reason to think that is in the right ballpark. I know this is just trading guesses, but at least acknowledge that yours is just a guess.

  72. Chris W. Says:

    Short answer: the underappreciated virtue of laziness!

    Instead of implementing lots of classical rules, maybe this time the creator aimed for a “minimal set” of rules which result in the emergence of many layers of new rules (the classical ones).
    Minimum effort, but still a universe with interesting dynamic behavior (so interesting that you would want to watch it for at least 14 billion years).

    This answer is based on my (mis-)understanding of Feynman and his rotating arrows:
    instead of e.g. telling the photons to move in a straight line at c, you just let all the possibilities interfere, and this process generates the classical rules.
    For this to work, you need destructive interference => complex numbers.

    The Born rule helps make it easier to watch: at some point, less blurry.

    If you worry about the higher computational effort for this approach vs. a classical one: well, that just shows how cheap the creator’s hardware resources are compared to his time as a developer. Especially for a quick prototype like this universe.

  73. Alessandro Strumia Says:

    Possible answer. To get big stuff out of small stuff (aka “decoupling”), a local theory must avoid equipartition of energy (otherwise energy goes into many small modes). Inventing some classical-like dynamics that avoids equipartition seems to me nontrivial. QM achieves this by replacing small energy with small probability. Possibly there is no other road, and once you follow it, you arrive at QM.

  74. Ernie Davis Says:

    Let me put my point more directly. Are you looking for an answer A to the question “Why?” such that, if A had occurred, or had been presented, to a 19th century physicist, purely as an abstract argument with no empirical evidence, they would have said, “Wow! Maybe I should try to work out how such a theory would work”? (Lucien Hardy seems to be claiming that about his argument, but as I said, I don’t believe it.) If so, I’m doubtful there is any such A. If not, I’m not sure what you mean by “Why”?

  75. Wyrd Smythe Says:

    Love the survey, am looking forward to the results, and will read all the above comments with interest. Attempting to explain Q is far above my meager abilities, but I wanted to offer two things:

    Firstly, I loved that you wrote, “I don’t think I already know the answers, and merely need better arguments to justify them. I’m genuinely uncertain and confused.” Yes, indeed. That’s the only legitimate mental state, I think. Certainty in this area concerns me.

    Secondly, from what I can tell as a rank amateur, the real mysteries are: superposition, interference, and entanglement. Figure those out, and I think the rest of the dominoes fall. How can things be in multiple undefined states? How can matter interfere? (What’s doing the interfering?) What is entailed in the apparent non-locality of entanglement and wavefunction collapse?

    The most crucial question in my mind regards the Heisenberg Cut. Are cats (let alone worlds or universes) truly quantum? The classical world emerges from the quantum one; is the quantum one restricted solely to its domain?

  76. Boaz Barak Says:

    I am not qualified to even attempt to answer Q1, but let me ask an even more basic question.

    You ask why God did not make the universe simply classical (aka Newtonian) instead of quantum mechanical. But if we think of the universe as created by God, then the most intuitive picture of the world is not Newtonian, in the sense of being some state machine that progresses according to simple rules. Rather it is a “teleological” or Aristotelian universe, in the sense of being guided by a final cause, with humanity of course at the center of it.

    Indeed, for much of history that was the default assumption, and it was very hard for people to move from this to the Newtonian universe as the new “baseline”. At this point, we have completed this transition, and the analogous “Q1” is no longer really asked. Today we view a Newtonian classical computer as the “null hypothesis” of what we expect the universe to look like.

    So, I don’t really have an answer to your question Q1, but have a counter-question to you. Do you think that in a century or so, we will get so used to quantum mechanics, that we will no longer ask it?

    This reminds me of the famous von Neumann quote: “Young man, in mathematics you don’t understand things. You just get used to them.”

  77. Doug Says:

    “I’m aware, of course, of the dizzying array of central physical phenomena that rely on QM for their ultimate explanation. These phenomena range from the stability of matter itself, which depends on the Pauli exclusion principle; to the nuclear fusion that powers the sun, which depends on a quantum tunneling effect; to the discrete energy levels of electrons (and hence, the combinatorial nature of chemistry), which relies on electrons being waves of probability amplitude that can only circle nuclei an integer number of times if their crests are to meet their troughs. Important as they are, though, I don’t regard any of these phenomena as satisfying answers to Q[1] in themselves. The reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”

    Well… try doing that. I think you will find that these explanations fail. Like, the stability of matter one, for example. We thought for a long time we could get away with *macroscopic* stability, of like chairs and such, without needing to pull in the narrative of quantum mechanics, but no. You really need the fermion and the boson. I think this tack is more likely to be compelling than you think.

    Actually, really needing fermions and bosons — really needing their statistical laws — makes Q1 a little closer to Q2.

  78. Sandro Says:

    I completely agree with you that there are excellent, known reasons why the universe shouldn’t have been based on classical, continuous-all-the-way-down physics, ranging from singularities to various computational pathologies.

    A good review is John Baez’s “Struggles with the Continuum”.

    But that still leaves the question: whatever was going to come in to discretize or regularize the continuum, why shouldn’t it have been classical, like a cellular automaton?

    Wolfram thinks it is, and he claims quantum mechanics emerges naturally from that formalism. Like everything Wolfram, take with a grain of salt and be prepared to abandon all familiar formalisms and start over with something new. Maybe there’s something to it though, and maybe it addresses your Q (see his section titled “Why This Universe? The Relativity of Rules” maybe?).

    His answer to Q I think is basically that all consistent formal systems exist and define the “Ruliad”.

  79. JimV Says:

    I am not qualified to answer, but I have some half-arsed opinions anyway.

    Firstly, the question “why did God create the universe in this way?”, while probably tongue-in-cheek, seems philosophically unsound to me. Absent reliable empirical evidence of said God, the concept adds no explanatory value. It simply pushes all the questions up one level without answering them. Why does this God exist? What is it made of? How does it work? It seems better to eliminate the unnecessary baggage and apply the questions to the universe directly.

    Secondly, why so much discreteness instead of continuous classical mechanics? Because Zeno was right. Infinite smallness is just as paradox-prone as infinite size. (In his Dichotomy paradox, Zeno in effect showed that an infinite sum can have a finite limit. The issue is how to reach that limit by a mechanical process in a continuous system.)

    Thirdly, why is there fundamental randomness? A) Because some small randomness adds robustness. I have given the example of a game system before, which falls into a trap and can’t get out. With some randomness, there is always a chance of emerging from stability traps. B) More fundamentally, it seems conceivable to me that the universe was a random perturbation of nothingness into several sets of particles and energies (including negative ones) which don’t interact with each other. We only see the ones we are made of and interact with; or perhaps different sets of particles repel each other, and we and they migrated to opposite ends of the universe long ago. From a random beginning, why not some fundamental randomness?

    Other than that, I go anthropic. I doubt if this universe is optimum for lots of life, but it allows us to exist. For a while.

    Probably that is entirely wrong, but it lets me sleep at night without worrying about what the actual answers are. As my friend Mario says, “Hey, believe whatever you need to, to get yourself through the night.”

    Thanks for asking.

  80. Lars Says:

    “By my age, Einstein had completed general relativity, Turing had founded CS, won WWII, and proposed the Turing Test, and Galois, Ramanujan, and Ramsey had been dead for years.”

    It’s good to have high aspirations, but there is such a thing as **too** high for anyone who is not Einstein, Turing or Ramanujan (which includes pretty much everyone other than those three).

    For anyone who is not, comparing one’s accomplishments to those individuals is just likely to lead to depression and possibly even despair.

    And beyond the age of about 40 in most cases, the age at which the comparison is made to people like Einstein is largely irrelevant, since if one has not done something similar by age 40, it is extremely unlikely that even another 50 years (or even 150 years) is going to make a difference.

  81. JM Says:

    If you haven’t seen it yet, here’s an interesting article discussing these questions (without anything in the way of an answer): I’m still mystified by the Born rule. Despite the title, it does talk about Q1 as well as Q2. See especially “§4: wtf magical reality fluid”. There are discussions in the comments that might answer pieces of the questions; I can’t really follow them so I can’t tell. (Note: the article is written by someone who takes the Everett interpretation as a given.)

  82. Philippe Grangier Says:

    This answer is addressing Scott’s remark: ‘For now, I’ll simply remark that every axiomatic reconstruction of QM that I’ve seen, impressive though it was, has relied on one or more axioms that struck me as ‘weird’, in the sense that I’d have little trouble dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true’.

    So my points are :

    – Something has to be weird, in the sense of being far away from classical intuition. But I agree that it might better be some physical idea than an abstract mathematical hypothesis.

    – Also, it is probably a mistake to look for axioms from which one might get QM by deduction. For going from physics to mathematics, one needs an inductive step, that is, guessing a mathematical formalism that can be justified by physical arguments, but not proven. Only when this formalism is present can one use it in a deductive way, to get consequences, and ultimately to check agreement with experiments.

    – In a series of papers (see below) we have given physical ‘axioms’, from which one can justify that probabilities are needed, and that (certain and repeatable) events in the required formalism must be associated with projectors, not with partitions of an ensemble like in classical probability theory. Also, orthogonal projectors are attributed to mutually exclusive events, and the same projector is attributed to mutually certain events.

    – Given that, one can get unitary transforms from Uhlhorn’s theorem, and Born’s rule from Gleason’s theorem; see more details in the references below (a toy numerical illustration of the projector picture is sketched at the end of this comment).

    – Obviously there is no free lunch, and some strong initial physical hypotheses are needed. They are also related to a question by Scott: ‘whatever was going to come in to discretize or regularize the continuum, why shouldn’t it have been classical, like a cellular automaton?’ The answer is: in addition to discretization, another crucial ingredient is needed, and it is contextuality.

    Classical discreteness does exist (as in cellular automata), and classical contextuality does exist (as in a poll where the answers depend on the ordering of questions), but QM requires BOTH discreteness and contextuality, which we also call contextual quantization, to provide the physical ingredients quoted above.

    – All this has been written in a series of papers, most of them published, and a broad audience introduction can be found in https://arxiv.org/abs/2105.14448 (with more details in 2111.10758 and 2201.00371)

    – Finally, I’m not sure this approach based on physical ideas will be convincing for Scott, but, after some thinking, it is my current best answer to his question. About God, I guess you know Laplace’s answer to Napoleon: ‘Sire, je n’ai pas eu besoin de cette hypothèse’ (‘Sire, I had no need of that hypothesis’).
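
    A toy numerical illustration of the projector picture sketched above (an editorial example, not taken from the cited papers): mutually exclusive events as orthogonal projectors, with Born-rule probabilities summing to 1.

      # Sketch: events as orthogonal projectors P0, P1 on a qubit, with
      # probabilities given by the Born rule p_i = Tr(rho P_i).
      import numpy as np

      P0 = np.array([[1, 0], [0, 0]], dtype=complex)   # event "outcome 0"
      P1 = np.array([[0, 0], [0, 1]], dtype=complex)   # mutually exclusive event

      psi = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)  # some pure state
      rho = np.outer(psi, psi.conj())                              # its density matrix

      p0 = np.trace(rho @ P0).real
      p1 = np.trace(rho @ P1).real
      print(p0, p1, np.isclose(p0 + p1, 1.0))          # 0.3 0.7 True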

  83. asdf Says:

    I had thought that classical thermo had no sane way to get rid of the ultraviolet catastrophe. Therefore, quantum.

  84. Seth T Says:

    I love the questions you’re asking… Not sure I have anything brilliant to say but here goes…

    I am reminded of something I read many years ago. I think it was an introductory issue of some theory of computing type journal that was being given away as sample swag at a physics conference.

    In any case… the article in the journal that stuck with me explored concepts of physics-violating computers. For example, imagine you had a computer with the property that the clock frequency doubled after each step. The first time step after initialization would take, for example, 1 second to complete. As the infinite sum 1 + 1/2 + 1/4 + … = 2, this computer would execute an infinite number of steps in 2 seconds. This computer would allow one to evaluate whether or not any computer program terminates, i.e. one could solve the halting problem (https://en.wikipedia.org/wiki/Halting_problem). Of course, one could brute-force any algorithmic encryption scheme, test any mathematical hypothesis such as Fermat’s Last Theorem, etc. in similar fashion.
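
    A quick arithmetic check of that total (a toy sum, nothing more):

      # Sketch: step k of the clock-doubling machine takes 1/2**k seconds;
      # the partial sums approach 2, so infinitely many steps fit in 2 seconds.
      total, dt = 0.0, 1.0
      for _ in range(60):
          total += dt
          dt /= 2
      print(total)   # 2.0 to double precision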

    Now, of course, this computer completely violates special relativity (among other things – probably some thermodynamic considerations related to irreversibility, energy consumption at the Landauer limit, and the resulting overheating…) in the worst way. It requires superluminal communication as the clock period becomes shorter than the time for light to cross the length of the computing hardware, etc.

    This is suggestive, to me, of things that, conceptually, the universe just could NOT sensibly do. If one could build such a computer, then almost any magical concept could be brute-force solved immediately. Testing all possible non-trivial zeroes of the Riemann zeta function? No problem!

    Similarly, I wonder if a world without quantum mechanics, one which was just classical, could enable similarly ridiculous information storage or computations. If I can make the logical leap to say that no quantum means no atoms, and matter that is continuously divisible, then a finite volume of space can encode an arbitrarily large amount of information in the pattern of matter which occupies it (e.g. make 3D voxels of matter interspersed with vacuum, like a porous sponge, to encode data, and make the voxels arbitrarily small). This violates, at the very least, the kind of black-hole information limit that we believe exists from QFT and/or GR, no?

  85. Greg Guy Says:

    I still find this conceptually confusing. It seems to me that if there could exist a ‘discrete’ classical theory that could reproduce all the phenomena in our Universe then we would be using that theory rather than QM. So my question then is what physical properties are actually dependent on unique QM effects? I don’t mean abstract results such as Bell’s Theorem. I mean what physical property could not exist if we kept only the fragment of QM that was consistent with a classical hidden variables theory?

  86. Vladimir Says:

    Scott #33 left me completely flabbergasted. Do 25 adjustable parameters seem like too big a price to pay for explaining all of chemistry to you? Which features of chemistry do you think can be reproduced by cellular automata with fewer than 25 rules?

  87. Keith McLaren Says:

    I’m not up on all the physics stuff, I’m afraid, as that’s way too highbrow for me. But it appears from the outside that this is a bit too much for the old human mind to get to grips with. I just think of the poor fly seeing the beautiful world through a window but not understanding why it can’t get through to it.
    It may unfortunately be all too much for us too.
    And that would be a very cruel god pulling that one on us.
    No wonder we are succumbing to Ivy League nerdiness.
    (even though I’m a bit of a fan of nerdiness, all said)

  88. Lars Says:

    Tu said: “If I were God, and I were setting things up before the dawn of the universe, I would certainly want to ensure that any and all of its sentient inhabitants were guaranteed a minimal amount of confusion, frustration, and wonder upon inspecting my handiwork.”

    Looks like God definitely has that one covered.

    And, if the failure to unite QM and GR despite a century’s worth of effort on the part of the smartest humans is any indication, it would certainly appear that God has actually achieved “maximal confusion”, even if she was only shooting for minimal. (At least that’s true for the “any … of its sentient inhabitants” part of the requirement. There may be — undoubtedly are — some inhabitants of the universe who are much smarter than even the smartest among us humans, who may be just minimally confused.)

  89. Guy Says:

    “Actually, you’ve made me realize that designing such a CA would be a phenomenal research project for anyone seeking to investigate Q1/Q2.”

    Stephen Wolfram’s project. Isn’t that exactly his thing… answering Q1 by studying strong emergence from CA?

    I so look forward to your essay/book. Thank you.

  90. Nick Drozd Says:

    I have nothing to offer in the way of physics. But I do have two pieces of advice to keep in mind in answering your questions:

    1. Don’t put the cart before the horse.
    2. Make sure you know which one is the cart and which one is the horse.

  91. Anbar Says:

    Q1: It could have, but it would have been necessarily boring (deterministic) and non-relativistic (UV catastrophe, radiation reaction).
    Q2: This specific alternative (unitary algebras with propositions as projectors and the Born rule to handle complementarity) is the most economical representation of a working logical framework that replaces subsets of a set as a way to define physical propositions about things; the replacement itself being necessary for empirical reasons, due to the physical impossibility of classically checking the truth value of some propositions constructed using the connective AND.

  92. Zalman Stern Says:

    I scanned the comments and didn’t see this mentioned. Since this is likely to be the start of a bigger thing, I hope this comment is useful.

    Why the constraint in the paragraph starting ‘Relatedly, whatever “design goal” you propose for the laws of physics, if the goal is satisfied by QM, but satisfied even better by theories that provide even more power than QM does…?’ Isn’t the least powerful theory that satisfies the goal the most elegant? Wouldn’t those other theories imply things that are either not desired or, dare I say it, simply not practical within the budget “God” was willing to spend on this project?

    Perhaps I do not understand the hierarchy of theories being explicitly and implicitly referenced here.

    -Z-

  93. tez Says:

    Maybe someone can help me with this which is tangentially related to one of Scott’s points:

    In one of the two original papers where Schroedinger was first quantitatively analysing issues of entanglement (either the cat paper or maybe the steering one), he threw out a comment along the lines of “well, perhaps if two systems are entangled there is an extremely rapid dephasing whenever we spatially separate them, so all of this weirdness will go away”. (Paraphrasing from memory; I don’t have copies.)

    This would throw some new parameter into the mix to govern just how rapidly such decoherence occurs. Back then experimental evidence for long range entanglement was presumably non-existent. Note that in this “fundamental dephasing” theory we would be back to an effectively local and separable theory, although one with a weird ontology (the “real states” would still be vectors in a Hilbert space!).

    What I’m curious about is which of the many successes of quantum theory would *not* have been possible in that theory? Did we really have to wait until the ’70s before we could rule that theory out, or are there phenomena which indirectly relied on long-enough-range entanglement that it was not viable much earlier?

  94. Chris Says:

    Scott #29:

    >No, to me it sounds more like simply a listing of all the different possibilities, or a probability distribution over the possibilities, or some other way to rank them in prominence and/or organize them. Why jump immediately to QM, with the interference and the complex numbers and whatnot?

    I think some of the following are fairly intuitive: 1) If the quotient space is continuous, as you would expect given a continuously infinite distribution of possible worlds, then the world as you experience it will be made up of waves, and the dynamics ruled by smooth linear operators. 2) A natural product of wave-like dynamics is orthogonalization leading to superexponential bifurcation. 3) If this is to be at all cognizable, the bifurcation needs to be controlled (read: you only experience the slice of the quotient space where it’s controlled). So we should expect this space to be discretized, with something like quantization of states. One way (perhaps the only way) to discretize waves is through interference effects. Interference seems really weird when you think of particles as being ontologically fundamental, but if you instead think of the waves as being fundamental, this effect is not strange, but quite natural. 4) You should expect dynamical evolution to be continuous, and in particular that transitions between states should be continuous. Lucien Hardy has shown that this assumption is enough to motivate representing quantum states with complex numbers.

  95. Scott Says:

    Zalman Stern #91:

      Isn’t the least powerful theory that satisfies the goal the most elegant? Wouldn’t those other theories imply things that are either not desired or, dare I say it, simply not practical within the budget “God” was willing to spend on this project?

    Please reread the passage … that’s exactly the point I was making! 🙂

  96. skaladom Says:

    Thought about this for a while, couldn’t figure out where to start from. Best I can say is that the question is framed in anthropomorphic / theistic terms, and while I get a vague sense that that is only supposed to be a metaphor for the “real question”, with such a deep subject (trying to make educated guesses about the abstract space of possible physical laws, no less), I don’t find it at all clear what the actual translation of your question would be, devoid of metaphors of creation and goals.

    I’m sure you’re getting plenty of good answers that go into the actual physics much more than I could, so my answer is actually a request back: is it clear to yourself what the non-metaphorical question you’re asking really is?

  97. Zf42tf Says:

    By the Osterwalder-Schrader theorem, a quantum field theory is equivalent to a boring statistical mechanical system in Euclidean space under Wick rotation and analytic continuation. The symmetries of Euclidean space are self-evident enough to have been considered axiomatic for thousands of years.

  98. Object Of Objects Says:

    Partially echoing #59. To me, most of the curiosity around the question “why QM?” can be reduced to the curiosity around the question: why, in a quantum mechanical universe, do we find ourselves existing as classical life rather than quantum life?

  99. Scott Says:

    I was busy with teaching all afternoon, and in the meantime, there have probably been way too many comments for me to answer each one individually. Which is great! But let me address a couple points that keep cropping up over and over.

    Most importantly, people keep wanting to justify QM by reminding me about specific difficulties with the classical physics of the 19th century: for example, the ultraviolet catastrophe. To clarify, I never had any quarrel with the claim that, starting with 19th-century physics (especially electromagnetism), QM provided the only sensible completion.

    But, to say it one more time, what would’ve been wrong with a totally different starting point—let’s say, a classical cellular automaton? Sure, it wouldn’t lead to our physics, but it would lead to some physics that was computationally universal and presumably able to support complex life (at least, until I see a good argument otherwise).

    Which brings me to Stephen Wolfram, who several commenters already brought up. As I’ve been saying since 2002 (!!), Wolfram’s entire program for physics is doomed, precisely because it starts out by ignoring quantum mechanics, to the point where it can’t even reproduce violations of the Bell inequality. Then, after he notices the problem, Wolfram grafts little bits and pieces of QM onto his classical CA-like picture in a wholly inadequate and unconvincing way, never actually going so far as to define a Hilbert space or the operators on it.

    Even so, you could call me a “Wolframian” in the following limited sense, and in that sense only: I view it as a central task for physics to explain why Wolfram turns out to be wrong! The sorts of models that Wolfram hawks really do seem like the obvious first options on the whiteboard of possibilities for our universe. They’re just not the options that were realized.

    Relatedly, several commenters took issue with my claim that it would be “child’s play” to design classical laws of physics that would give rise to stable matter and a combinatorial explosion of possible chemicals. If you interpret “chemistry” to mean “the actual chemistry of our universe,” then this is indeed far from obvious. In some sense, a large part of the field of chemistry is all about designing classical heuristics to reproduce as much of our universe’s actual (quantum) chemistry as possible, but even the best known heuristics don’t always get it right!

    Again, though, I take a much broader view. By “chemistry,” I mean any rule-based system for sticking together movable building blocks of various types, in a space of some dimension, to produce an exponentially-large variety of stable, cohesive substances—substances whose properties are determined both by the individual building blocks and by how they’re linked. “Chemistry,” in this sense, could start from “atoms” that were completely different from our atoms. It could nevertheless be perfectly adequate as a basis for complex life, albeit radically different from our life.

    The research project that I expressed excitement about was the following:

      Construct an explicit example of a classical cellular automaton—the simpler the better—that gives rise to a “chemistry” in the above sense. The rules of the “chemistry”—i.e., the types of possible atoms and how the atoms can be linked—should be emergent from the underlying rules of the CA, just like they are in our universe, rather than explicitly encoded into the CA rules.

    Let me stick my neck out and conjecture that the above can not only be done, but done using rules that are manifestly “simpler” than (say) a full specification of the Standard Model.

    If I’m wrong about this conjecture, then so much the better, as I agree that we’d then have a satisfying solution to my Q1! 🙂
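
    Since it may help to pin down what kind of object the conjecture is even about, here is a minimal Python sketch of a classical CA (plain Conway’s Game of Life, which of course does not exhibit the emergent “chemistry” being asked for; it only fixes what “an explicit classical cellular automaton” means):

      import numpy as np

      def step(grid):
          # One synchronous update of Conway's Game of Life on a toroidal grid.
          nbrs = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
          # Birth on exactly 3 live neighbors; survival on 2 or 3.
          return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

      rng = np.random.default_rng(0)
      grid = rng.integers(0, 2, size=(64, 64), dtype=np.uint8)
      for _ in range(100):
          grid = step(grid)

    The open problem is to find update rules of roughly this complexity whose long-run behavior organizes itself into reusable, linkable “atoms,” without those atoms being written into the rules by hand.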

  100. asdf Says:

    Zf42tf, Wick rotation = imaginary time, right? So there are those complex numbers again. Ed Nelson’s stochastic mechanics is also interesting but doesn’t get all the way there.

  101. Scott Says:

    entirelyuseless #46:

      Since no one has yet answered this (as far as I know), it could simply be that Conway’s Game of Life (or any other classical universe), to an instantiated observer, would look like quantum mechanics or something very like it.

    Sorry, no, it wouldn’t. 🙂

    We know this because a classical CA couldn’t reproduce violations of the Bell inequality, quantum supremacy experiments, or dozens of other experimentally-verified quantum effects. These effects are so important precisely because they refute the idea that the world is secretly classical, with QM merely an artifact of our limited perspective as observers. Einstein as well as thousands of lesser minds believed that hypothesis, so it has a distinguished pedigree, but the hypothesis has by now been refuted as soundly as anything in the history of science.

    There is a loophole, but I’d say it’s so extreme as to prove the rule. Namely, one can’t rule out that someone used, e.g., a giant Game of Life board to create a “Matrix” that’s running our entire quantum-mechanical universe as a computer simulation! To me, though, this just seems like an instance of a more general point: namely, that nothing in physics can rule out the possibility that the whole observed universe is a simulation, an illusion, or a lie. (The idea of “superdeterminism” goes to this same extreme, even though it strenuously denies doing so.)

  102. Scott Says:

    Augustin Vanrietvelde #47: Ah, thanks for clarifying! In that case, though, my followup question is this: why is it important even to have a notion of “causality,” at all, for reversible systems that lack any thermodynamic Arrow of Time? Why not say that these systems have dynamics, sure, but the whole concept of “causation” is tied up with irreversibility?

  103. Tim Maudlin Says:

    Re #98: “If I’m wrong about this conjecture, then so much the better, as I agree that we’d then have a satisfying solution to my Q1! ”

    I don’t understand this at all. Why require that the laws yield anything like chemistry? There is no a priori requirement that the universe be complex at all, whether at the fundamental scale or at emergent scale. Again: no God was involved trying to accomplish anything. If there is only one universe, and it has the simplest laws that yield emergent stable complexity, that doesn’t answer any puzzle, since there is nothing mandating that such complexity emerge.

  104. Scott Says:

    mjgeddes #48:

      You said in (4) you want to assume that current QM is exactly correct, but if that were true, I really don’t think there can be any answers for your questions. I mean some of the brightest minds have thought about quantum foundations for decades, so surely it’s unlikely that anything has been missed?

    I mean, some of the brightest minds had to think about Fermat’s Last Theorem for 350 years before any of them proved it! 🙂

    More to the point, though, let me give you the years of various developments that have played non-negligible roles in my own thoughts about the foundations of QM:

    Quantum teleportation – 1993
    Shor’s algorithm – 1994
    Grover’s algorithm – 1996
    Theory of quantum error-correction and fault tolerance – 1996±O(1)
    Discovery of the cosmological constant – 1998
    Lucien Hardy’s reconstruction of QM – 2001
    PBR Theorem – 2011
    Firewall paradox – 2012
    Harlow-Hayden argument – 2013
    ER=EPR – 2013
    Quantum circuit complexity as wormhole volume – 2014
    AdS/CFT as a quantum error-correcting code – 2014
    Chiribella et al.’s reconstruction of QM – 2015
    Black hole unitarity from Euclidean path integrals – 2019
    MIP*=RE – 2020

    So, maybe there’s nothing of any further interest to be discovered, but that doesn’t seem like the way to bet…

  105. entirelyuseless Says:

    Scott,

    I’m aware you can’t reproduce violations of Bell’s inequality with cellular automata, but that is only if you assume that there is a clear mapping between cellular automata and the things we experience (which is what you are pointing out with your example of the simulation).

    But I am saying we know for a fact that there cannot be a clear mapping like that. Because it is impossible for an observer to have a full model of themselves. So if the world is a CA, it is not one that looks like one; your experience might be a composite of pieces of that CA located in various places and times on the board, not one localized cluster.

    Given that kind of uncertainty, I don’t see how Bell’s theorem would rule out a CA like that.

  106. Scott Says:

    Ernie Davis #49:

      Presumably, Aristotle or Francis Bacon or Newton couldn’t have inferred QM just by thinking about the world of ordinary experience and applying Occam’s razor; you have to do a bunch of carefully designed experiments with fairly sophisticated equipment. If that’s true, there could have been a world that was exactly the same as far as ordinary experience is concerned, but worked on entirely different underlying principles.

    As we’ve discussed elsewhere in the thread, this is far from obvious, and extremely interesting either way!

    Maybe QM provides the only acceptably simple explanation of experimental facts that were already well-known to Newton—including facts like the stability of matter and the existence of the sun, which Newton never even purported to explain. If so, one could make a strong case that Q1 has its answer right there.

    On the other hand, maybe there’s a reasonable classical model for all the facts known to Newton, with QM needed only for strange new experimental facts that were discovered in the 19th and early 20th centuries. If so, then that would arguably heighten the scientific urgency of Q1 even more!

    In summary, I hold that Q1 is either answerable or interesting (hopefully both!) 😀

  107. Scott Says:

    Ernie Davis #56: I agree that it would’ve taken a huge leap of imagination either to

    (1) discover QM as a possible mathematical structure without any need for experiments, or
    (2) realize the applicability of that structure to the real world from the experimental facts available by, say, 1870.

    Even if either of these could’ve been done in principle, perhaps they would’ve required superhuman intelligence. But I made the case in chapter 9 of Quantum Computing Since Democritus for why (1), at least, would’ve been far from impossible with hindsight!

    The examples of Riemann and Einstein, of course, stand forever as counterexamples to those who find it “wildly farfetched” that pure thought, aided by at most one or two scraps of data, could ever uncover the correct mathematical structures that underpin the real world…

  108. Scott Says:

    Ernie Davis #73:

      Are you looking for an answer A to the question “Why?” such that, if A had occurred, or had been presented, to a 19th century physicist, purely as an abstract argument with no empirical evidence, they would have said, “Wow! Maybe I should try to work out how such a theory would work”?

    Yup! That’s the least of what a satisfying answer to Q ought to do.

  109. AZ Says:

    Though not religious/theistic myself, I do recall an interesting footnote in Griffiths’ QM, where he emphasizes that “not even God knows” the outcome to a quantum experiment. Were I the Abrahamic god, dead-set on granting *true* free will to humanity, then I would purposefully eliminate my foreknowledge of this specific universe’s evolution to do it, and relying on the nondeterminism of QM to accomplish that task would be a pretty good choice.

    But as for a deeper mathematical constraint? I really do fear that excluding the “stability of atoms + anthropic principle” may leave you with no answer. It’s boring, and it breaks the rules you set out in the original post, but I have a feeling there’s no answer beyond it.

    And maybe there are just some uncomfortable axioms we need to assert and just swallow that pill. The universe needs some (hopefully finite!) number of axioms, so one of them may as well just be the Born rule and be done with it—don’t hate me, Mateus 😀

  110. Scott Says:

    Donald #55:

      Start with the Kolmogorov complexity prior over the space of computer programs.

      Around 1/n th of this space has run time BB(n).

    Isn’t it a ~1/exp(n) rather than ~1/n fraction?

    More relevantly, you wrote a lot about how the discovery of even shorter computer programs to simulate the observed world would merely strengthen your Kolmogorov hypothesis even further—but then what possible evidence would falsify, or at least weaken, your hypothesis?

  111. Scott Says:

    Timothy Chow #58:

      Weirdness Hypothesis. In any quantum-mechanical universe, quantum mechanics will seem weird to its inhabitants.

      I don’t really believe the Weirdness Hypothesis, but it strikes me as being difficult to either prove or disprove. If your project succeeds, then presumably it will disprove the Weirdness Hypothesis. So this puts a lower bound on how hard your project is.

    Yup, that’s the goal! Namely, to make QM seem just as ordinary, pedestrian, and non-weird as general relativity. 😀

  112. Ted Says:

    Riffing off of Oleg Eterevsky #6 and Oleg S. #50: Could the computational power of different candidate laws of physics be the thing that determines which laws get instantiated in “the real world”?

    I agree with you that it seems plausible that some form of classical physics could support structures that are complex enough to support intelligent life. But there may be a subtle reason why this isn’t actually the case.

    This seems like a vaguely plausible foundational premise: maybe any candidate set of laws of physics can only lead to phenomena simple enough that those phenomena could in principle be efficiently qualitatively simulated by a hypothetical computer that obeys those same laws of physics. After all, it seems plausible (although far from certain) that quantum computers could efficiently simulate all realizable physical processes in our universe. And the extended Church-Turing thesis would probably actually be true if we lived in a completely classical universe. You rightfully bang your head against the wall when people incorrectly say that the Sycamore computer’s task was to “simulate itself”, but in a looser sense you can think of any physical process as “a computer efficiently simulating itself”, and so the complexity of realizable physical processes might be bounded by the computational resources of a hypothetical in-universe computer trying to simulate it.

    (To be clear, I’m definitely not suggesting that this or any other universe actually is a literal computer simulation constructed by any external intelligence. I’m just saying that that may be a useful thought experiment for bounding the complexity of physically realizable processes.)

    Anyway, it isn’t clear to me that a classical Turing machine would necessarily be powerful enough to efficiently simulate processes complex enough to support intelligent life, i.e. powerful enough to “run the math in real time”. But your mileage may vary; maybe it seems obvious to you that a classical computer would be powerful enough.

    Finally, another vaguely reasonable-sounding foundational premise should be that the laws of physics generate the least powerful computational system that’s powerful enough to generate processes complex enough for us to observe them. That’s why we don’t have laws of physics that allow superluminal signalling, computers that can efficiently solve NP-complete problems, etc.

  113. matt Says:

    To expand on my comment, there is one question you can ask: assume our civilization never developed the advanced experimental science to discover quantum mechanics, but did have lots of people working on math. How would those mathematicians have been led to invent something like quantum mechanics? Clearly, they would have invented matrices; after all, in the real world those were discovered before quantum mechanics. From studying differential equations, it is natural that they would have generalized matrices to various kinds of operator algebras. So, the only key thing missing to invent quantum mechanics is the notion of the tensor product. What route would have led them to that? I think either applications in topology or studying classical statistical mechanics and then naturally generalizing it. Are there any other routes?
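
    For concreteness, the construction itself is elementary: in finite dimensions the tensor product is just the Kronecker product of matrices and vectors, well within reach of ordinary matrix algebra. A minimal numpy sketch (illustrative only):

      import numpy as np

      # Two single-system states and a gate acting on each system.
      psi = np.array([1, 0], dtype=complex)                # |0>
      phi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>

      # Composite states and composite operators are Kronecker products.
      joint_state = np.kron(psi, phi)                      # 4-dimensional vector
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      I = np.eye(2, dtype=complex)
      joint_op = np.kron(X, I)       # acts as X on system 1, identity on system 2

      # The defining property: (A tensor B)(u tensor v) = (A u) tensor (B v).
      assert np.allclose(joint_op @ joint_state, np.kron(X @ psi, I @ phi))

    The construction is trivial; the leap is deciding that this, rather than the Cartesian product of phase spaces, is how physical systems compose.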

  114. Scott Says:

    Yoni #61: To make a long story short, in QM, two “nearby” branches of the wavefunction can reconverge, in which case interference happens, which is the whole way we know QM is true in the first place! By contrast, for Second Law of Thermodynamics reasons, we don’t expect “faraway” branches to recombine between now and the heat death of the universe. Except possibly via heroic technological efforts, well beyond those of our current civilization—and even then, some people might argue that the very fact that two branches were ultimately recombined, proves that they were never “really” separate worlds in the first place. I’m not sure how any of that affects your thinking about this.
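
    If it helps, here is the smallest possible picture of “nearby branches reconverging,” as a numpy sketch (two beamsplitters, i.e. two Hadamards; purely illustrative): the first splits |0⟩ into two branches, the second recombines them, and the |1⟩ amplitude cancels by destructive interference.

      import numpy as np

      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
      psi0 = np.array([1, 0], dtype=complex)     # start in |0>

      branched = H @ psi0        # (|0> + |1>)/sqrt(2): two "nearby" branches
      recombined = H @ branched  # back to |0>: the |1> amplitude cancels

      print(np.round(branched, 3))     # [0.707, 0.707]
      print(np.round(recombined, 3))   # [1, 0] -- interference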

  115. Scott Says:

    Isaac Grosof #64: Thank you for another extremely interesting argument that’s novel to me, at least in the way you formulated it!

    Regarding anisotropy, though, if Lorentz invariance were to break down only at the Planck scale, with a quantum theory of gravity explaining why it’s recovered to excellent approximation at all larger scales, would that be totally fine with you, even aesthetically?

  116. Scott Says:

    Ivo #68:

      It’s possible the first question is not even allowed, as in, it may not make sense to ask. The world appears to be fundamentally QM. Isn’t asking why could it not be classical the same as asking why 3 could not be 5? Or why this Lego figurine could not have been made from carrots? It would not be the same figurine, and likewise, a ‘classical’ world would not be our world!

    I don’t think physics would get very far if that attitude were generalized! 😀

    “Daddy, why is the moon round?”

    “Simple: because if the moon were square, then this wouldn’t be our world, but a different world—namely, a world with a square moon!”

  117. Scott Says:

    Alex K #69:

      Is the distinction between a classical and a quantum universe significant?

    I’d say so, yes!

    I mean, it’s true that a classical universe could be simulated by a quantum Turing machine, and a quantum universe by a classical Turing machine. But I’m not asking to explain the nature of a hypothetical simulating meta-universe—only the actual universe of our experience. I.e., for the purposes of this question, I’ve swallowed the Blue Pill. 😀

  118. Clinton Says:

    Scott,
    Your essay/book will be eagerly anticipated!
    This reply is made without first reading anyone else’s replies above to avoid being pulled off track from my first thoughts. And apologies that this is so long … but you did ask a rather big question and did not give us a limit.

    ****************************************************************************
    First, here are my answers when asked these questions by the tenure committee …

    Q2:
    Why C? Obviously, the complex numbers are algebraically closed and complete. And then state vectors, if anything, will be “distinguishable” (orthogonality). And if we want to be able to take state vectors and get a complex number (given multiplication and phase operators) then there’s an inner product space … and now it’s a complex Hilbert space (finite case assumed …)
    Why unitary transformations? Linearity, plus not breaking the structure of the chosen complex Hilbert space (see the quick check below)
    Why the Born rule? Gleason’s theorem, again a consequence of choosing a complex Hilbert space
    Why the tensor product? To combine the complex Hilbert state spaces
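
    A quick numerical check of the “unitarity preserves the Born rule” point, as a numpy sketch (a random state and a random unitary; purely illustrative):

      import numpy as np

      rng = np.random.default_rng(1)

      # A random normalized state in C^4.
      psi = rng.normal(size=4) + 1j * rng.normal(size=4)
      psi /= np.linalg.norm(psi)

      # A random unitary, from the QR decomposition of a complex Gaussian matrix.
      Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

      probs_before = np.abs(psi) ** 2        # Born-rule probabilities
      probs_after = np.abs(Q @ psi) ** 2
      print(probs_before.sum(), probs_after.sum())   # both 1.0: the 2-norm is preserved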

    Q1:
    Well, if God chose to make everything out of complex amplitudes on the first day, then God would choose state vectors on the second day by distinguishing states. Then on the third day, God creates the inner product, sees Hilbert space, and declares it good. On the fourth day, God uses the Born rule to know all probabilities. Because God does not wish to destroy creation, God only allows unitary operators on the fifth day. Then tensor products allowed God to combine the state spaces on the sixth day. Finally, on the seventh day, God rested and watched the universal wave function evolve.

    Q: Why should the universe have been quantum-mechanical?
    If the universe is made of complex amplitudes, or God made the universe out of complex amplitudes, then all else pretty much follows.

    ***************************************************************************

    And then here’s what I would NEVER say to the tenure committee …

    Since you brought up God … I’ll quote Isaiah 55:9
    “As the heavens are higher than the earth,
    so are my ways higher than your ways
    and my thoughts than your thoughts.”

    And then I’ll share this recollection. I was very fortunate to catch a lecture by Vaughan Jones before his untimely passing in 2020 … from an ear infection of all things … speaking of mortality. At the end of his talk, he graciously offered that he would hang around afterward for a bit and entertain any questions anyone might like to come up and ask him. Quite a few did take him up on that and he spent a while addressing different mathematical questions … and a few about kitesurfing … I had sat in the back so found myself last in line to meet him. I tried to be funny with my first question asking if he thought the best use for quantum computers would be to use the AJL algorithm to untie quantum knots. He obliged me with a chuckle. Then I asked if he thought the nature of our neural model of computation might be such that we have an upper bound on what we can know about the universe because our theories will always be produced within the arena of whatever our neural model of computation might be. He looked at me wide-eyed for a long moment, never losing his friendly smile. Then he squinted and said, “I think that the universe is even more incredible than we can possibly imagine.” We both nodded our heads for a few seconds, me smiling dumbly, having no good follow-up, and then I said, “Thank you, I enjoyed the talk” and left. I don’t want to try to unfairly “appropriate” his words for my own means – especially since they weren’t published. But I hope I can use them in the spirit of acknowledging that he motivated me to at least some further thinking, by emphasizing that the universe could be more incredible than we can “possibly imagine”. That led me to question if our human thinking is subject to the laws of complexity classes and computability theory just as it might be subject to laws of physics … and how that might have consequences.

    So, I would first question the questions. And I would do that by reminding you of your statement that “quantum mechanics is not about physics.” It is not about anything. The name itself is misleading. This isn’t the fault of physicists but just the way history played out. It should be called something like, “The Interfering Amplitudes Probability Model Builder”. (Catchy, right?) Because it isn’t even itself a theory but, again as you said, more like an operating system or a programming language made for constructing theories about the world. It is an arena for building models of our experiences as amplitudes.

    I don’t see then that we can say the universe is quantum mechanical. Quantum mechanics is not a thing but a means or language. What we have found is that the “Interfering Amplitudes Probability Model Builder” is what always seems to work if we want to build models of the universe – by which we always mean our experiences of the universe.
    I say all of that to say … we should consider the possibility that the answer to “Why should the universe have been quantum mechanical?” may be because we are at the tail end of a long history of painting ourselves into a corner … or being painted into a corner.

    Could mathematical physics itself have painted us into a corner? This is Wigner’s “unreasonable effectiveness.” In other words, sometimes math just seems … “too good to be true.” So … should we … you know … trust it? I know. I know. What else are we to do? I don’t (of course) have an answer to that. But this is one of the possibilities that sends shudders up my spine … What if math has done this wonderful job of painting the floor of our experiences … and when we ask it repeatedly “give us a way to create models with which we can make predictions about what is going to happen” that it leads us invariably to complex numbers and the quantum postulates outlined above. Or maybe even just that we happened to head off down into one particular valley in a wider mathematical landscape but we can’t “climb back up” the way we came.

    Why should the universe have been quantum mechanical?
    Because we asked math (or the math we know) for the answer.

    Or maybe it isn’t just the narrow discipline of mathematical thinking that is painting us into a corner. Another possibility that gives me shudders is maybe it is the very nature of our thinking itself. This idea, of course, goes back at least to Plato’s Cave. The obvious depressing problem with Plato’s allegory is that the prisoner who escapes the cave to see the true fire … may still only be looking at an illusion of himself escaping a cave and seeing a “true” fire … Thus, even if we think we have understanding, we may yet still be prisoners … within what may be our own projection. So … what if the “Interfering Amplitudes Probability Model Builder” is the computational model our brains use for modeling our experiences as probabilities? If it uses the model itself to generate its internal simulation of the external world … then every test we could apply to our experiences … would be a test applied within the framework of the model itself and so … it would always pass consistency checks with the model. Or would it? This is what I was wondering in the question posed to Vaughan Jones. It is an old philosophical question, of course, so not original. But I was trying to formulate it in something more like a computational complexity approach … asking if our cognitive model of computation might put something like an upper bound on the nature of predictive models we are even able to construct when we set about constructing them within the discipline of mathematical physics. If there were such a limit imposed by our cognitive faculties then the answer to Q would be for us to … look in the mirror. And I don’t mean anything “quantum brainy” about this. (After all quantum theory isn’t “about physics” …) I just mean that the current neuroscientific consensus is that our brains evolved to be a “prediction engine” and according to the neuroscientists all the brain does is generate “probably” what we will see or experience and then we take these “probabilities” to be reality.

    Why should the universe have been quantum mechanical?
    Because we understand it with our brains.

    Again, I take “Why should the universe have been quantum mechanical?” to better be phrased as the question, “Why should models of the universe always need to be constructed using the “Interfering Amplitudes Probability Model Builder”?”

    When we consider why “the universe is quantum mechanical” … it seems we should admit the possibility that we could be under that impression because of accidents of mathematical history or accidents of evolutionary cognitive development. So, perhaps there is computational complexity work to do here … to “prove” that we can trust math (that we are not trapped in an arbitrary mathematical valley – an island in theoryspace as someone once called it …) … and trust our own cognitive simulation of reality (that our own neural model of computation does not have an upper bound on how it represents reality to us in a way that fools us into thinking that the model itself is the model of reality).

  119. Scott Says:

    Tim Maudlin #70:

    1) I hope my other comments on this thread have clarified my position. Everything you mentioned that we “don’t know to be true” is indeed something for which I want to know whether it is true! More broadly, though, my interest is not restricted to laws that generate worlds that macroscopically resemble ours. One can write down millions of classical cellular automaton rules that, when run, will generate worlds that are clearly not our world, but are rich, complicated, Turing-universal, and seemingly able to support life and intelligence just as interesting as ours. I want to know: is this appearance illusory? If yes, why? If no, then can you articulate any reason that would render it less surprising to me that we don’t find ourselves in any of those classical worlds, and instead find ourselves in this quantum-mechanical one?

    2) Again, even if there’s only one simple, rationally satisfying TOE to describe our worlds, would you agree that there would presumably be thousands of TOEs to describe other worlds—TOEs that are equally simple to write down, and that would equally rationally satisfy any inhabitants of those other worlds? If so, then doesn’t an enormous question remain, of “why do we live in this world as opposed to the other ones?”

    And yes, maybe the question has no satisfying answer—for example, because only our world exists and that’s that, or because, as Max Tegmark holds, all the worlds exist, and we happen to find ourselves in this world and that’s that.

    OK, but what if the classical worlds weren’t as conducive to life, or intelligence, or consciousness, or whatever, as one might’ve thought they were? Or what if they could only be made so at the cost of making them vastly more complicated than our world (of course, as you said, at the level of the fundamental laws rather than of emergent behavior)? As long as such things remain live possibilities, it seems to me that we haven’t ruled out that my Q1 has a straightforward correct answer, which would then be hugely important to know.

    3) We’re never going to agree about this, but even if a genie told me that Bohmian mechanics was true, my first instinct would probably be to forget about the hidden variables as epiphenomena with (by construction) no observable consequences, and just go back to doing standard QM. In which case, I certainly would wonder why the world was so contrived as to force me, in practice even if not in principle, to use this particular generalization of the probability calculus to Hermitian, trace-1 positive semidefinite complex matrices!

  120. Scott Says:

    Tim Maudlin #102:

      There is no a priori requirement that the universe be complex at all, whether at the fundamental scale or at emergent scale.

    There’s an anthropic requirement. We wouldn’t be here to debate this in a world with no macroscopic complexity. And if there were a trillion worlds, and only one of them had any macroscopic complexity, we’d necessarily find ourselves in that one if in any of them.

  121. Scott Says:

    Alessandro Strumia #72:

      Possible answer. To get big stuff out of small stuff (aka “decoupling”), a local theory must avoid equipartition of energy (otherwise energy goes into many small modes). Inventing some classical-like dynamics that avoids equipartition seems to me nontrivial. QM achieves this by replacing small energy with small probability. Possibly there is no other road, and once you follow it, you arrive at QM.

    The part you call “nontrivial” is precisely the part I’m wondering about! Why not just invent a classical theory with a lower limit on the energies of the allowed modes? Note that, in classical cellular automata like Conway’s Game of Life, it’s far from obvious that there’s any notion corresponding to “energy,” but if there were such a notion, then certainly there would be a minimum nonzero allowed energy, namely the energy of a single live square or whatever.

  122. Peter Gerdes Says:

    Look, I don’t really believe this but it’s an interesting line of thought.

    Suppose you are some kind of dualist (in the sense of us having something like souls or being in something like the Matrix) and you want rules for the physical world that don’t expose this fact. In other words you want our choices to be governed by the operation of something outside the physical world but you want the physical world to appear causally closed even when you look at neurons under the microscope.

    QM gives you a really nice way to do this as you can simply evolve the wave function forward globally and then choose the branch you want to take globally based on which best matches the desires of the non-physical ‘souls’ or individuals externally placed into the simulation or whatever.

    However, as I said, I don’t really believe this. What I do really believe is that you can’t hope to recover an account of why we experience the world the way we do (it feels as if collapse is real) merely from decoherence-type accounts. Why? Because even if you can show that with respect to a certain basis you can describe the world as (in some limit or ignoring options of small enough measure) the sum of a bunch of classical-seeming branches that’s not enough. That’s because, if QM is the fundamental theory, you can’t give special status to being classical. I mean, if you don’t care about it being some kind of principled breakdown you could rewrite the overall wave function as the sum of a bunch of components that have parts that look like whatever Turing machines you want simply by ensuring that you have enough freedom in the way you represent those machines that you can choose a representation that lets you decrease the distance between your sum and the vector in the Hilbert space you are approximating (you can just pick your components so their contribution to the overall sum has some max at a finite time and decays really quickly away from there so, by multiplying them by the right choice of coefficients you can converge to whatever smooth function of time you want).

    To put the point differently, if I get to cut up the description of the universe into a bunch of components which merely need to sum to the actual (linear evolving) state of the world I could pick really weird decompositions such that, restricting attention to each component, those components look like they implement whatever computation you want. So you need some extra principle to explain why the world appears to us as if you choose one particular decomposition.

    Maybe that extra principle is as simple as saying one kind of decomposition is favored (or maybe it would be enough to favor a certain kind of basis). However, the point is that one needs some extra rule to explain why the world appears the way it does to us and maybe we need a better grip on what that missing piece is before we can answer this question.

  123. Tim Maudlin Says:

    Re 118 and 119:

    Here are few comments, not that I am expecting convergence but just to carry on a bit.

    First, about cellular automata. I get that *as a computer scientist* you regard cellular automata as “simple”. But thinking as a physicist, in a world where any sort of computer (and especially a digital computer) is a very complex, emergent thing, I don’t see *any* of them as simple. You think in terms of *writing* or *abstractly specifying* the rules under which they operate. I think in terms of *physically implementing* those rules, which requires a lot of complexity. So what comes naturally from your discipline as a “simple” system looks extremely complex to me. For example, actually physically implementing Conway’s rules for Life is a hell of a lot more complex than, say, F = ma, which isn’t a computation at all.

    Second, about the anthropic principle: you want to stay away from the Strong Anthropic Principle. There is nothing at all necessary tout court about the existence of life or intelligence or whatever. The universe, for all we know, could perfectly well never have evolved life. If it hadn’t we wouldn’t be here to ask any questions. Sure. But if *you* in particular had not come into existence then *you* would not be here to ask this question. So what? Nonetheless, you are a completely contingent being, and your particular existence was not guaranteed by the laws of nature. Don’t confuse a *conditional* claim like “Given that we are here, such-and-such must have happened” with an unconditional one like “Such and such must have happened”. We have the Weak Anthropic Principle, which is fine. And if you want a theory that makes the existence of life (or more generally complex stable diverse structure) likely, we can look for a multiverse and invoke the Weak Anthropic Principle. So then maybe you are just arguing for a multiverse. (A physical one of the right kind, not Tegmark’s mathematical one). OK: that’s a target.

    Finally—and I’m a little upset to even have to be writing this—the additional local “variables” (beables) in Bohmian mechanics are neither emergent (they are fundamental) nor hidden (collectively, they are what you can most easily see). “Standard QM”, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even in the ballpark of answering the questions you are asking. Bohmian Mechanics—whatever other issues you have with it—is.

  124. Crackpot Says:

    I’ll take this as an invitation to exercise some crackpottery. The shortest possible version of the crackpottery is, take Pilot Waves, and subtract out the particles; you’re left with MWI, but I think that way of thinking about things is misleading. Rather, I think the correct way to interpret the wave you’re left with is a mass-energy distribution; the particle isn’t probabilistically in different locations, it’s always a waveform, and there are certain transformations we can make to change the orientation of the amplitude of the waveform in spacetime. Velocity is an amplitude orientation, in this model, and the uncertainty principle really boils down to the exact same limitation on orientation that the speed of light represents, under very constrained conditions in which the waveform is forcibly split (that is, measurement is a transformation).

    So the universe is, in a sense, classical. I’m not sure what the other questions are actually asking; for unitary transformations, the waveform is the waveform, regardless of its orientation; any pair of orientations are isomorphic. For the Born rule, it boils down to that energy is continuous (kind of) rather than discrete, but stable configurations are discrete, and perturbation is a better description (sub-quantum energy “sloshes” around a lot, but is only detectable when it reaches thresholds sufficient to trigger state changes between stable configurations, so energy events are more like continuous samples than single events). And for complex waveforms, I really don’t understand what the question is; what orientation should we expect the amplitudes to be in?

    Now, I say mass-energy distribution, but really, once you notice that the only thing we care about is curvature, all we’re actually talking about is curvature itself; curvature curving curvature. Mass and energy don’t cause curvature, they’re just particular forms curvature takes; mass, when the curvature creates a singularity in one orientation, energy, in other orientations. Earlier I suggested energy is kind of continuous; it is kind of continuous in the same sense mass is kind of continuous, which is to say, quantized, but (probably) quantized on an infinite logarithmic scale such that it can still be treated as continuous. Curvature thus would be governed by some kind of wave shape like sin(ln(x))/x; probably not the correct equation, but possessing the basic expected properties of the correct equation. (That particular equation has a particular relationship with a complex logarithmic spiral, which I think is a reasonable candidate for reasons I have yet to explain in anything like a comprehensible way.)

  125. Scott Says:

    Boaz Barak #75:

      So, I don’t really have an answer to your question Q1, but have a counter-question to you. Do you think that in a century or so, we will get so used to quantum mechanics, that we will no longer ask it?

    I have no idea!! As someone mentioned above, maybe our AI successors will be so intrinsically comfortable with QM that they won’t feel the need to ask Q1 that I feel. Or maybe they’ll no longer ask Q1, simply because they or we will have satisfactorily answered it! 🙂

    As long as we’re playing question-tennis, though, let me return your volley back with another of my own:

    Do you think it was ever satisfactorily explained why we should never have expected, even a-priori, to have found ourselves living in the pre-Newtonian teleological universe of Aristotle?

  126. Scott Says:

    asdf #82:

      I had thought that classical thermo had no sane way to get rid of the ultraviolet catastrophe. Therefore, quantum.

    See my comments #98 and #120.

  127. tez Says:

    matt #112 I can imagine some 19th century crackpot deciding to look at a theory of probability which allowed for negative probabilities and then coming up with some “natural” smearing constraints to ensure that only positive probabilities were observable under measurement. This is all you need in principle to get from phase-space classical particle mechanics to quantum mechanics (and it seems a more plausible route to me than someone stumbling upon Hilbert space and 2-norms and relating that to physics etc).
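
    For concreteness, something like this already exists for finite systems: with Wootters-style phase-point operators, a single qubit gets a quasi-probability distribution that sums to 1 but can go negative for non-stabilizer states. A small numpy sketch (this is one standard choice of phase-point operators, purely illustrative):

      import numpy as np

      I = np.eye(2, dtype=complex)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)

      def discrete_wigner(rho):
          # Discrete Wigner function of a qubit density matrix rho (Wootters-style).
          W = np.zeros((2, 2))
          for q in (0, 1):
              for p in (0, 1):
                  A = 0.5 * (I + (-1)**q * Z + (-1)**p * X + (-1)**(q + p) * Y)
                  W[q, p] = 0.5 * np.trace(rho @ A).real
          return W

      # A non-stabilizer state: (|0> + e^{i pi/4}|1>)/sqrt(2).
      psi = np.array([1, np.exp(1j * np.pi / 4)], dtype=complex) / np.sqrt(2)
      rho = np.outer(psi, psi.conj())

      W = discrete_wigner(rho)
      print(W)         # one entry is negative (about -0.10)
      print(W.sum())   # 1.0: it still normalizes like a probability distribution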

  128. Shmi Says:

    Scott, First, Alan Turing won WWII only in a very anglo-centric version of history 🙂 It seems like a more complete and multi-sided view remains an… enigma for some.

    More seriously, but in the same vein, I think the questions you are asking can be broadened and narrowed at the same time. For example: Starting from a completely random, possibly n-dimensional “world”, what kind of “interesting” and enduring patterns can emerge? And by “enduring” I mean that, regardless of a specific instantiation of a random world, the same patterns emerge every time?

    Some years back I took an admittedly very modest crack at this kind of a question in a couple of LW posts about Order from Randomness:
    https://www.lesswrong.com/posts/aCuahwMSvbAsToK22/physics-has-laws-the-universe-might-not

    https://www.lesswrong.com/posts/2FZxTKTAtDs2bnfCh/order-from-randomness-ordering-the-universe-of-random

    In the second post I take one of the most random possible one-dimensional sequences, white noise, then order it, subtract the emerging linear trend, and look at the result. What we see is no longer completely random, but has a power law spectrum with a strange exponent of -1.86. The same exponent shows up in every run, too. That’s not quite the same as noticing a Hilbert space with unitary evolution emerge from some random set of, say, complex numbers, I just wanted to illustrate the general idea.
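
    In case the procedure is unclear, here is roughly what the second post does, as a short numpy sketch (the exponent value is the posts’ claim; this sketch just shows the pipeline):

      import numpy as np

      rng = np.random.default_rng(42)
      x = rng.normal(size=2**16)      # white noise
      y = np.sort(x)                  # order it

      # Subtract the emerging (roughly linear) trend.
      t = np.arange(y.size)
      residual = y - np.polyval(np.polyfit(t, y, 1), t)

      # Power spectrum of the residual, and a power-law fit to it on log-log axes.
      spec = np.abs(np.fft.rfft(residual)) ** 2
      freqs = np.fft.rfftfreq(residual.size)
      mask = freqs > 0
      slope, _ = np.polyfit(np.log(freqs[mask]), np.log(spec[mask]), 1)
      print(slope)    # the posts report an exponent near -1.86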

  129. Nicholas Teague Says:

    Cellular automata / hypergraph updating rules aside, the multi-way causal graphs of the Wolfram physics model are the most intuitive channel I’ve found for thinking about the quantum mechanical nature of reality. They formalize the multiverse.

  130. phi Says:

    Q1:

    It seems like our universe has a finite-dimensional Hilbert space (at least for a given volume of space), despite the fact that its classical equivalent would have an infinite-dimensional one. Maybe God had some kind of design requirement that there should be only finitely many (orthogonal) states. The mathematics of probability with infinitely many outcomes is notably more gnarly than the finite case, after all.

    This doesn’t get us all the way to ruling out even such things as cellular automata, which are also finite, but maybe there was also a requirement to have time be continuous? Let me explain that a bit further. The key “weird” property of quantum mechanics (at least IMO) is that given two states, one can make a superposition of those states that is “pure” (i.e. contains no information / has entropy 0). Another way of putting this would be to say that QM has no preferred basis. In the Heisenberg picture, the universe at any given moment in time always consists of the same information, it’s just that at different times, we’re “viewing it from different angles”. In quantum mechanics, this view can change continuously as a function of time. In classical mechanics, it has to discontinuously jump around in right angle increments, always staying aligned with the preferred basis. So in that sense, quantum mechanics can give us continuous time, while classical mechanics can’t, not with a finite theory.
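
    Here is a toy version of the “continuous time needs superpositions” point, as a numpy/scipy sketch (just two states, nothing fancy):

      import numpy as np
      from scipy.linalg import expm

      X = np.array([[0, 1], [1, 0]], dtype=complex)
      psi0 = np.array([1, 0], dtype=complex)    # the classical state "0"

      # U(t) = exp(-i t (pi/2) X) continuously interpolates between the identity
      # (t=0) and, up to a global phase, the classical NOT gate (t=1).
      for t in np.linspace(0, 1, 5):
          psi_t = expm(-1j * t * (np.pi / 2) * X) @ psi0
          print(round(t, 2), np.round(np.abs(psi_t) ** 2, 3))   # always a valid normalized state

      # A reversible *classical* dynamics on the two states {0, 1} has only two
      # options per step, the identity or the swap, with nothing in between.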

    Q2:
    something something Fisher information metric on the space of all probability distributions over N outcomes something something

  131. Tu Says:

    Tim Maudlin #122:

    After Scott’s post on his Zen anti-interpretationism, I was afraid that the S/O comment section had become completely conquered by many-worlds people.

    I am heartened to see there are signs of a budding Bohmian insurrection! My money is on Bohmian mechanics being a more faithful rendering of the fundamental universe than the relative state interpretation, when all is said and done.

  132. Bruce Says:

    I think the question has a major unstated premise: that ours is the *only* universe. I find it just as likely that there exists another, completely unreachable and unobservable universe out there that runs on a different set of laws (resembling classical mechanics or otherwise). In fact, I think it’s just as plausible that every *possible* universe with every possible set of physical laws and initial conditions exists. In virtually every one of those universes, life, or even perhaps matter as we know it, doesn’t exist. But ours happens to be one of those where matter and life did emerge (see: the anthropic principle), and one in which the rules of nature are moderately complex–not so complex that we have no hope of understanding them, but more complex than evolution was capable of carving into our ape intuitions (the best it could manage was some sort of classical approximation of reality).

    Sometimes, I think of this like mathematics: when you first posit an idea like “prime numbers” or “the Mandelbrot set” or “Fibonacci numbers” or “Conway’s Game of Life”, whether you realize it or not, those systems may have immense amounts of emergent consequences and complexities (“there are infinitely many primes”, “the area of the Mandelbrot set is finite, but its boundary is infinite in length”, “the ratio of adjacent Fibonacci numbers approaches the golden ratio”, etc.). In some sense, those emergent complexities exist independent of human beings ever discovering them, or even conceiving of the system itself. And in some sense, the initial idea of the system itself is identical to all of the infinite depths and emergent properties the system entails (when you describe prime numbers, you are describing an infinite set, whether you know it or not). I think our universe may be something like that: a set of things and a set of rules that determine relationships between those things, and immense amounts of emergent complexity from that. Other sets of rules exist, and other ways for those things to relate, so why shouldn’t those be considered universes as well? If I were religious, I would say our reality is just God thinking through the consequences of one possible set of initial conditions and physical laws for a universe, but I’m not religious and I see no reason why any sort of “thinker” needs to be involved, any more than \(2.718…\) needed Leonhard Euler in order for it to exist.

  133. Nisan S Says:

    The Game of Life is not a safe place for life to evolve. Everything is being constantly bombarded by gliders and stuff. We’re fortunate to live in a universe where gravity collects matter into tidy planets, whose surfaces are only lightly bombarded. I’d look for a reversible cellular automaton that has something like Newtonian gravity over long distances; how would one do this?

  134. Philippe Grangier Says:

    Following Anbar #90: I agree with this answer to Q2, which is clearly inspired by quantum logic. However the sentences ‘most economic representation’ and ‘necessary for empirical reasons’ are too vague, and must be substantiated on a physical basis, rather than on a logical one. Then, given the right hypotheses and the right theorems (Uhlhorn and Gleason), unitary algebras and the Born rule can be deduced, not postulated ( https://arxiv.org/abs/2111.10758 ).

  135. TP Says:

    Scott #33: I find it unsatisfying if in a classical world that is fundamentally continuous we put in discrete blocks arbitrarily by hand so that we can do cellular automata. If the discrete blocks arise organically in the continuous world then of course we have problems related to computation.

    For interesting chemistry we need the building blocks to be charged. However, charged particles lead to various problems in classical electromagnetism, e.g., the infinite self-energy of the electron or the radiation from an accelerated charge. Quantum mechanics solves these problems. I suspect that any attempt to solve these problems will have to reproduce quantum mechanics in its entirety. As you have pointed out, it seems impossible to change quantum mechanics just a little; it seems to be an island in theoryspace. Maybe any self-consistent theory that tackles these problems of classical electromagnetism has to go all the way to quantum mechanics for this reason.

  136. Andrei Says:

    Scott,

    I’d like to show you a quote from a mainstream QM book (Sakurai J., Napolitano J. Modern Quantum Mechanics 3ed, 2021). At page 230, in regards to EPR/Bell arguments, we read:

    “The fact that the quantum-mechanical predictions have been verified does not mean that the whole subject is now a triviality. Despite the experimental verdict we may still feel psychologically uncomfortable about many aspects of measurements of this kind.
    Consider in particular the following point: Right after observer A performs a measurement on particle 1, how does particle 2 – which may, in principle, be many light years away from particle 1 – get to “know” how to orient its spin so that the remarkable correlations apparent in Table 3.1 are realized? In one of the experiments to test Bell’s inequality (performed by A. Aspect and collaborators) the analyzer settings were changed so rapidly that A’s decision as to what to measure could not be made until it was too late for any kind of influence, traveling slower than light, to reach B.

    We conclude this section by showing that despite these peculiarities, we cannot use spin-correlation measurements to transmit any useful information between two macroscopically separated points. In particular, superluminal (faster than light) communications are impossible.”

    What we are facing here is much more than a “psychologically uncomfortable” feeling, or a “peculiarity”: we are facing a logical/mathematical inconsistency between QM and the space-time structure of special relativity. Please look carefully at the last paragraph:

    “we cannot use spin-correlation measurements to transmit any useful information…”

    SR does not make any distinction between “useful” faster-than-light messages and useless faster-than-light messages. If a bit of information regarding the A measurement was sent instantly to B, we have to conclude that SR is wrong. We need to go back to Newton’s absolute space and time (since we need to determine which measurement was first), introduce an absolute frame of reference and couple QM to that absolute reference. In other words, if this type of non-locality is true, we need to conclude that we don’t understand pretty much anything about the fundamental structure of the universe. We need to develop a new physics, and only then would your Q question be answered (if the reformulated QM + absolute space still resembles the QM we have now).

    As far as I can tell, the other possible option, superdeterminism, is much more parsimonious. It leaves space-time as it is, but QM becomes a statistical approximation of an underlying classical theory. So, again, the current structure of QM cannot tell us anything fundamental about the universe.

    In conclusion, as long as the inconsistency between QM and SR is not cleared, we cannot, and should not attempt to answer the type of questions you asked. Any answer derived from a set of incompatible statements is irrelevant. Solve the incompatibility first, modify what needs to be modified, and only then an answer may be achievable.

  137. Cain Says:

    When all observables commute then physics is no longer observer dependent. God would not create such a programmable world with no role for the soul of the observer since there would be very little need for God’s metaphysical power in the first place. Quantum mechanics necessitates God to determine when the state of the universe will change according to the Born rule rather than the Schrodinger equation, and no matter how many dinosaur fossils are dug up Richard Dawkins will never be able to explain this from more fundamental principles. QM is God’s signal that all the evidence against his existence from lesser sciences can now be discarded as a test of our faithfulness.

  138. Andrei Says:

    Tim Maudlin, Scott,

    “ “Standard QM”, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even in the ballpark of answering the questions you are asking. Bohmian Mechanics—whatever other issues you have with it—is.”

    Yes, I think that the above paragraph is exactly true. It can be rigorously proven based on the EPR + Bell arguments. The only consistent views are non-locality (Bohm) and superdeterminism. Both require one to abandon “standard QM”.

    The inability of most physicists to accept this is responsible for the lack of progress in the field.

  139. Isaac Grosof Says:

    Hi Scott,

    Thanks for the reply, I’m glad my argument and/or its presentation were new and interesting.

    As for the anisotropy question, about how I would feel if Lorentz invariance broke down at quantum gravity scales but was approximately recovered at larger scales, I think that would be aesthetically suboptimal, but not as deal-breaking as global anisotropy. My greatest feeling of aesthetic distaste comes for broken symmetries that stay broken all the way up to global scales. Emergent symmetries are definitely something I’m on board with – the emergent symmetry of a homogeneous gas, for instance, is broken at smaller scales.

    That being said, Lorentz invariance, or another property saying that no direction or velocity is special, is a very nice and fundamental symmetry, and I’d strongly lean towards theories that preserved it all the way down to the fundamental scale. These properties seem fundamental to the notion of a “blank canvas” that the universe is drawn on, and I think that guides a lot of my feel on these matters.

    On the other hand, I felt the exact same way about parity symmetry until I learned about the Wu experiment, so I’m not going to believe too strongly in the “blank canvas” aesthetic criterion.

    Thanks again, Scott!

  140. Andrei Says:

    Haelfix,

    “One of the obvious answers here is that classical mechanics *can’t* reproduce nature as we observe it. That atoms would be unstable, that the ultraviolet catastrophe would have no resolution, and so on and so forth.”

    Just look at this paper:

    Stochastic Electrodynamics: The Closest Classical Approximation to Quantum Theory
    Timothy H. Boyer
    https://arxiv.org/abs/1903.00996

    Pretty much all the so-called failures of classical physics were solved in the context of classical electromagnetism (Stochastic Electrodynamics is classical EM + 1 assumption about the vacuum). Even the stability of atoms has been explained classically, although only qualitatively. There was never a rigorous proof that classical physics, as a framework cannot explain this or that observed phenomenon. It’s just that, at a certain point in time, no suitable classical model was put forward.

    ‘t Hooft published a classical interpretation of QM:

    Explicit construction of Local Hidden Variables for any quantum theory up to any desired accuracy
    https://arxiv.org/abs/2103.04335

    We still need to remember that there is no “quantum” theory of space, time or gravity. So, a large part of modern physics is still classical.

  141. Andrei Says:

    Cain,

    “When all observables commute then physics is no longer observer dependent.”

    Observer independence has nothing to do with commutation. Some properties do not commute because the measurement of one perturbs the system.

    All QM’s predictions are objective. All observers agree on what was measured and what the result of that measurement was.

  142. bertgoz Says:

    Scott #124 Occam’s razor when considering experimental data?

    I feel that, ultimately, describing the universe using mathematics (leading first to the classical framework and then to the quantum one) tells us more about how the human mind works, and its limitations, than about anything else.

  143. async Says:

    Let me try an analogy, just to be on the same page about what a satisfying answer would look like.

    We can derive the general structure of transformations between different reference frames from some very reasonable assumptions. There are only two possibilities: a) Galilean relativity (if there is no speed limit) b) special relativity (if there is a speed limit).

    Is the answer to the question “Why Special Relativity?” then along the lines of “There are only two possibilities and both are of comparable complexity. Nature just happened to realize one of them” or something else?

  144. Daryl McCullough Says:

    My feeling is that quantum mechanics can’t literally be true, because of the measurement problem, which I don’t think can be solved. Not without going beyond quantum mechanics. Maybe there is a way to make Many Worlds or Bohmian Mechanics work, but I don’t consider those to be orthodox quantum mechanics.

    Let me explain by starting with the simplest quantum system, a spinor. Consider the particle state \( |\psi \rangle = \frac{1}{\sqrt{2}} ( |U_z\rangle + |D_z\rangle ) \). This state is a superposition of spin-up in the z-direction and spin-down in the z-direction. Where do probabilities come in? On the one hand, you could use the Born rule to say that the particle has a 50/50 chance of being spin-up or spin-down. But that seems like nonsense. It’s neither spin-up in the z-direction nor spin-down in the z-direction. It’s a pure state describing a particle that is spin-up in the x-direction. Until you introduce measurements, there really are no probabilities in quantum mechanics.
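
    Just to make this concrete, here is a minimal numerical sketch (purely illustrative; the vectors and operator are the standard two-dimensional representations) checking both claims at once: the Born rule assigns 50/50 to the two z outcomes, and yet the state is a definite eigenstate of spin along x.

      import numpy as np

      # Basis states |U_z> and |D_z>
      Uz = np.array([1.0, 0.0])
      Dz = np.array([0.0, 1.0])

      # The superposition |psi> = (|U_z> + |D_z>)/sqrt(2)
      psi = (Uz + Dz) / np.sqrt(2)

      # Born rule: probabilities for spin-up / spin-down along z
      print(abs(np.dot(Uz, psi)) ** 2, abs(np.dot(Dz, psi)) ** 2)   # 0.5 0.5

      # Yet |psi> is a pure state: an eigenstate of sigma_x with eigenvalue +1,
      # i.e. definitely spin-up in the x-direction
      sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
      print(np.allclose(sigma_x @ psi, psi))   # True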

    So introduce measurements. We allow this particle to interact with a Stern-Gerlach device that measures spin (in the z-direction, let’s stipulate). On the one hand, the Born rule says that our measurement will produce spin-up or spin-down with 50/50 probability. But on the other hand, if we consider the device itself to be a quantum system made of particles obeying the Schrödinger equation, then we would not find that the interaction of the particle and measuring device probabilistically results in either a measurement of spin-up or spin-down. We would find that the interaction produces some pure state that is a superposition of “The device having measured spin-up” and “The device having measured spin-down”. If we take into account the interaction of the device with the rest of the universe, then we would find (if we could make sense of the wave function of the universe) that the universe would be in a superposition of “a world in which the device measured spin-up” and “a world in which the device measured spin-down”.

    It seems to me that you never get an actual measurement result, and so you never get probabilities. To get actual measurement results, it seems to me that you have to treat the measuring system (and presumably the rest of the universe) as something that is separate from the system being measured. Such a separation has to be artificial if everything is described by quantum mechanics. Maybe there’s a God outside of the universe who observes its history, forcing it to collapse probabilistically to something definite? Or maybe it’s all subjective, and the appearance of measurement results is relative to the observer? (But that’s a departure from orthodox quantum mechanics, it seems to me).

  145. Mateus Araújo Says:

    Scott #31: I see, so what you’re asking is not why couldn’t the universe run on classical physics, but rather why couldn’t it run on classical computing. The answer, as already mentioned by Yoni #13 and OhMyGoodness #36, is that you need randomness, and classical computing can’t give you that. You specifically mention that you want a rich universe where complexity and information processing emerge, not a uniform and isotropic one. Well, with quantum mechanics you can start with a uniform and isotropic one, and quantum fluctuations will quickly make it more interesting. To get complex life you need evolution, which is intrinsically powered by randomness. I suppose you’ll agree with this, and object to the idea that classical computing can’t give you randomness.

    After all, we have pseudorandom number generators, P=BPP, and we’re used to simply postulating that we have some random bits available for our computation. Well, postulating that the bits are random doesn’t help at all as a matter of principle; you still can’t generate them. At best you can postulate some randomness in the initial conditions of the universe, which is used up as the universe evolves, like in Bohmian mechanics. There are several problems with that. First, it’s just subjective randomness, not true randomness. Second, you’ll eventually run out of randomness and your universe will revert to plain determinism (Bohmian mechanics gets around this problem by hiding an infinite amount of randomness in the continuum, but this trick is not available to a discrete computer). Third, you’re just putting that randomness in by hand in the initial state. The universe is not evolving complexity by itself, it is just playing out the recording put in at the beginning.
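
    As a toy illustration of that last point (just a sketch using Python’s built-in PRNG, a Mersenne Twister): once the seed is fixed in the initial conditions, the entire “random” history is already determined, and rerunning the universe replays it verbatim.

      import random

      # Fix the "initial condition of the universe": the PRNG seed.
      random.seed(42)
      history_1 = [random.random() for _ in range(5)]

      # Rerunning the deterministic evolution from the same seed
      # replays exactly the same "random" history.
      random.seed(42)
      history_2 = [random.random() for _ in range(5)]

      print(history_1 == history_2)   # True: subjective randomness, not true randomness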

    Consider the game of life: it does support complex patterns with complex behaviour, but it cannot generate them. All of the impressive life-like constructs we see there have been painstakingly designed. The same problem befalls PRNGs: we would have to design our universe from the beginning with life forms that each incorporate a Mersenne Twister or something. It cannot evolve from nothing (also, I’m very skeptical that it would even work in a deterministic universe; a PRNG still needs a seed). To put it provocatively, a classical universe requires a deity, while a quantum one can run by itself.

    You might object: isn’t quantum randomness just an illusion caused by deterministic branching in Many-Worlds? Well, I wouldn’t say it’s an illusion, it’s the only consistent definition of true randomness that has ever been proposed, but yes, it’s fundamentally deterministic. Couldn’t a classical computer use branching to produce the same true randomness then? Well, no, you do need quantum mechanics in order to have Many-Worlds. In a classical computer the resources needed to compute the universe increase exponentially with time.

  146. Peter Morgan Says:

    tez #126 (and matt #112), there was such a 19th Century crackpot. His name was George Boole. In 1854 he published “The Laws of Thought”. At that time he knew he had a problem with probability that he couldn’t articulate. 12 years later, he could articulate the problem.
    Pitowsky puts it better than I can, “Surprisingly, the tools for such an analysis [of the Quantum Puzzle] were developed, independently of physics, over the last 140 years, beginning with George Boole [1862]. Boole’s research problem in this context can be phrased in modern terminology as follows: we are given a set of rational numbers P1, P2, …, Pn which represent the relative frequencies of n logically connected events. The problem is to specify necessary and sufficient conditions that these numbers can be realized as probabilities in some probability space.” That’s in “George Boole’s ‘Conditions of Possible Experience’ and the Quantum Puzzle”, Brit. J. Phil. Sci. 45 (1994). 95-125. Pitowsky is almost always worth reading.
    Boole of course did not present probability in an operator formalism, but this is the idea laid bare. Pitowsky uses the slightly clunky word “commeasurability” to refer to the conditions of possible experience being satisfied, whereas most modern literature uses “measurement compatibility”. In the modern quantum probability literature, probabilities are most often presented in an operator formalism, where “measurement (in)compatibility” is closely related to (non)commutativity of operators.

    Scott, I came to the measurement problem out of field theory, classical and quantum. Understanding the relationship between classical and quantum measurement theories as being about two things, (1) noncommutativity and (2) the fact that the spectrum of quantum noise is different from the spectrum of thermal and other noise, and removing these differences by extending classical measurement theory as needed, allows us to focus on the unnoticed elephant: the classical dynamics is generated by the Liouville operator, which is *not* a positive operator, in stark contrast to the quantum dynamics, which is generated by the Hamiltonian operator, a positive operator. This has extreme consequences because the positivity of the Hamiltonian operator is associated with analyticity in QM and in QFT.

    What I think is the real payoff here is that through this understanding I find it possible to look at the Wightman and Haag-Kastler axioms for quantum fields through fresh eyes. My real interest is the problem presented by renormalization, which as far as I can tell is not likely to be addressed well by work on CAs any time soon (although work on CAs does consider scaling, it essentially follows well-worn paths established in the 60s, though that is again only to my knowledge). The Wightman axioms are to me a step towards doing CAs right, if we can understand why they have no interacting models in 3+1-dimensions.

    Suppose we have an experiment that produces a list of numbers, from which we construct relative frequencies. Suppose we use a continuous probability density as an ideal model for those frequencies. We can ask what the probability density at the value x is, and get back p(x). Suppose now we use different coordinates, so we have to move the distribution by adding a constant: under assumptions that we often take to be satisfied in physics, we generate such translations using the differential operator ∂/∂x. “What is the value at x” and “use different coordinates” do not commute. If we ever transform the list of numbers we obtained from our experiment, we have to use an algebra and group of transformations to describe what we have done.
    Things get both elementary and complicated when we consider the Lie algebra generated by [∂/∂x,x]=1, which is all too familiar from QM. In the context of probability measures, it is fairly natural to consider characteristic functions as Fourier transforms of those probability measures, which fairly naturally introduces a complex structure as a way to discuss the sine and cosine components of the Fourier transform (I’m not asking a philosopher to agree that this is obvious, I’m only saying that this is one way to introduce a complex structure in a natural enough way that we can mostly stop worrying about it, FTW).
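
    If it helps, here is a small symbolic check of that commutator (nothing deep, just the product rule, written out with sympy):

      import sympy as sp

      x = sp.symbols('x')
      f = sp.Function('f')(x)

      # [d/dx, x] f  =  d/dx (x f)  -  x d/dx f
      commutator = sp.diff(x * f, x) - x * sp.diff(f, x)
      print(sp.simplify(commutator))   # f(x), i.e. [d/dx, x] = 1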

    You know I can go on for pages about this, but it’s all pretty simple math once anyone starts to run with it. Anyway, I’ve rehearsed this here before with only slight differences and you’ve always said it’s incomprehensible before, so it’s not likely to be any different this time, so I’ll be gone now. I’m mostly back to my work on renormalization in any case.

  147. Sandro Says:

    Tim Maudlin #70:

      But thinking as a physicist, in a world where any sort of computer (and especially a digital computer) is a very complex, emergent thing, I don’t see *any* of them as simple. You think in terms of *writing* or *abstractly specifying* the rules under which they operate. I think in terms of *physically implementing* those rules, which requires a lot of complexity.

    They require a lot of complexity because the formal rules governing our universe don’t directly correspond with the formal rules of the computer being built, and so we have to build projections using physical rules to establish a different formal substrate for computation. There’s no obvious a priori reason why a universe based on a different set of formal rules couldn’t exist though. Exploring why this is or is not so is exactly what Scott’s Q is about.

    Cellular automata are definitely formally simpler than our physics. There is no step in this process called “physically implementing the rules”, any more than there is a step to physically implementing the one dimensional strings of string theory. The cells of a cellular automaton define physical existence, and the rules that govern their behaviour just are what they are.

  148. gentzen Says:

    While trying to understand QM (starting in 2013/2014), I had many attempts that I later felt ashamed of. Not because I made those attempts, but because I talked about them with my close ones. It was clear that those thoughts could not make full sense even to me, and still less to the non-specialists who listened to me. But I want to stop feeling ashamed of myself. Let me see which of my attempted analogies I can still remember:

    Q1?
    – Why not classical: symmetry between observation and action (equations and variables)
    – Why not classical: symmetry between space of positions and space of momenta
    – Why not classical: nature’s way to avoid infinite information content in finite volume
    Q2?
    – Why the oscillations and waves: It is a conflict resolution mechanism.
    – Why the oscillations and waves: It is a resonance phenomenon.
    – Why the oscillations and waves: It is a phase transition, or more precisely the interface between different phases.
    – Why tensor structure (high dimensional instead of 4 dimensional): It is a probability density statistical description.
    – Why unitary: unitarity implies linearity, so as an instrumentalist I sometimes thought: It is linear, because it is a low-order approximation around a working point.
    – Lienhard Pagel: It is a fixed point, and Planck’s constant h cannot change, not even in an emergent theory on a higher level.
    – All is chaos (instead of nothing), and the order propagates through it. The idealization failing in our universe is that things can stay “exactly constant” while other things change. Even more, the idea that it is even possible to define what “staying exactly constant” means is an idealization.
    – Why complex numbers: The (number) field has a non-trivial automorphism, namely conjugation. Therefore the complex numbers themselves are not measurable quantities, and already hint at the “simplest” gauge symmetry.
    – John Wheeler: It is like the second law of thermodynamics, entropy always increases, but not because this would be strictly true, but because it emerges from a law of large numbers.

  149. Boaz Barak Says:

    Scott #124: Let me attempt answering your question: “Do you think it was ever satisfactorily explained why we should never have expected, even a-priori, to have found ourselves living in the pre-Newtonian teleological universe of Aristotle?”

    Despite not being a believer, I actually like your framing with “God”. The reason is that I think ultimately science is about finding a compelling and coherent story to explain the world. We feel we understand a phenomenon when we not only have a formula that can predict results of experiments, but also have metaphors and ways of thinking about it that make sense to us.

    So the question of the “a priori expectation” (which one can also think of as the “baseline” or “null hypothesis”) can be framed by thinking that this story has an author. I prefer this view to the notion that there is an objective measure of simplicity a la Kolmogorov complexity.

    Now I don’t think that there is a single “satisfactory explanation” of why we should reject the teleological Aristotelian universe. I am no historian, but I believe it was a process of replacing the teleological view with the mechanical view of the universe. In the beginning, the mechanical view might have been considered just a set of useful technical tools to do calculations and predictions. But as our world was more and more shaped by machines, engines, and clocks, these began to shape our metaphors as well. Hence we could think of the “clockwork universe”, and these metaphors became more natural to us than stories about gods.

    Similarly, I don’t think the view of “it from qubit” and the universe as a computer would make much sense to us if the Turing machine had remained a thought experiment, rather than a device that we carry in our pockets. Now that we have gotten so used to computers, we think of them as a useful metaphor to explain other things, rather than as a mysterious phenomenon that needs to be explained.

    This is basically what I meant by “getting used to” quantum mechanics. I think that it’s a gradual process, whereby as we find more and more practical and theoretical applications to a new framework, it begins to shape the metaphors we use, and the type of explanations we find “natural”, “simple”, or “beautiful”.

  150. Liam McGuinness Says:

    Q1: In a classical universe one can obtain an infinite amount of information from measurements of a single particle in an arbitrary amount of time, since there is no limit to the precision with which one can measure that particle. We might then ask: where is all this information kept on the particle, and why isn’t it possible to see all these degrees of freedom? Is the information a result of some underlying structure? If so, why can’t we further decompose this particle down to reach the atomic level of information? This is why I believe that the universe cannot be perfectly classical, for it would mean an infinite amount of information encoded onto a single atomistic unit. If we accept that information is physical, then an atomic theory for matter also implies an atomic theory for information, and classical physics, even when allowing for indivisible particles, does not restrict the amount of information such a particle can contain.

    Q2: From this rejection of infinite information per particle, we can generate a single axiom: (A0) the amount of information provided from a single measurement of a single system cannot exceed \(\gamma T\), where \(T\) is the amount of time spent measuring the system and \( \gamma \) is a unit conversion factor. I admit that this axiom is somewhat imprecise and non-rigorous, but I assert that QM is this axiom and A0 in turn is QM. Thus, the clearest explanation (and motivation) for quantum mechanics to my mind is that it provides the correct physical limit to information content. How exactly does this axiom reproduce QM, and why does \(T\) appear in A0?

    i) The information bound in A0 must be linear in \(T\) due to the Fourier-Gabor limit. If a frequency interacts with our particle, and we can estimate this frequency by performing a single measurement on our particle, then estimation of the frequency to better than \(1/T \) would violate this limit (a small numerical illustration of this \(1/T\) resolution follows below). Here it is now clear that my colloquial definition of information is simply 1/uncertainty, whereas (quantum) Fisher information is defined as this quantity squared.

    ii) It reproduces the uncertainty principles – if we obtain one bit of information (say about the quantum state along \(x\) of a spin-\(\frac{1}{2}\) particle), we cannot immediately obtain another bit of information (say about the quantum state along \(y\)). Alternatively, if we immediately measure along \(x\) again, we get the same answer which is not another bit of information.

    iii) It reproduces the Schrödinger equation – to get more information after measuring the state along \(x\) we must wait an amount of time. The state can evolve away from \(\pm x\) according to A0 and we can obtain more/new information upon readout.
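
    As a small numerical aside on the \(1/T\) resolution invoked in i) (a rough sketch; the function name, sample count and test frequency are just illustrative choices): the naive spectral resolution of a record of duration \(T\) is the FFT bin spacing, which is exactly \(1/T\), no matter how finely we sample.

      import numpy as np

      def freq_resolution(T, n_samples=4096, f_true=7.3):
          """Estimate a single frequency from a record of duration T (arbitrary units)."""
          t = np.linspace(0.0, T, n_samples, endpoint=False)
          signal = np.cos(2 * np.pi * f_true * t)
          spectrum = np.abs(np.fft.rfft(signal))
          freqs = np.fft.rfftfreq(n_samples, d=T / n_samples)
          # The FFT bin spacing, i.e. the naive frequency resolution, is exactly 1/T.
          return freqs[np.argmax(spectrum)], freqs[1] - freqs[0]

      for T in (1.0, 10.0, 100.0):
          f_est, df = freq_resolution(T)
          print(f"T = {T:6.1f}:  estimate = {f_est:7.3f},  bin spacing = {df:.4f}  (~1/T)")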

    i – iii) describe the predictions of quantum mechanical theory for measurements on any single system; they include how quickly the system can evolve in time and the probabilities for different measurement outcomes. If we are careful and rigorous in our definitions, then I expect that A0 and the quantum mechanical description of measurements and evolutions of states can be made equivalent. I also expect that at some level, this explanation is both obvious and already taken for granted by many quantum mechanics researchers (Scott included?). What I would like to stress, though, is that when followed through completely, A0 has many consequences which actually diverge markedly from the mainstream understanding of quantum mechanics. Indeed, A0 predicts measurement outcomes that are in conflict with current QM theory, and A0 gets these predictions correct. This tells me that A0 is a powerful axiom and we should accept it.

    What do I mean by this? To start being rigorous, we need to define what a ‘single’ system is. A proton is a single spin-\(\frac{1}{2}\), but it is also composed of quarks, so is it still a single information system? Can more than \(T\) units of information be provided from this proton in time \(T\)? I work with NV centers in diamond, where a single spin-1 system is formed by thousands of carbon atoms and a single nitrogen atom. How much information can measurements on this system provide? One natural answer is to say that, if the system is described by a single quantum mechanical wavefunction, which is not separable, then this constitutes, for informational purposes, a single system which obeys A0. I acknowledge that I am using a circular definition where quantum mechanical theory (a separable wavefunction) is included in my fundamental axiom which then is used to produce QM, but I can’t avoid it at the moment. We then have:

    iv) This definition of a single system reproduces entanglement – the measurement outcomes of all particles in a non-separable wavefunction are perfectly correlated, and in such a way that no more than \(T\) amount of information (where \(T\) is the total measurement time) can be obtained. In effect an entangled \(N\)-particle system is informationally equivalent to a single system.

    By assuming this definition of a single system, it is clear that A0 diverges wildly from mainstream quantum measurement theory, since it is universally accepted that, by entangling \(N\) particles, one can run algorithms and perform measurements to extract more information per unit time than would be possible using \(N\) unentangled particles. I.e., on the mainstream view, the informational content of a Hilbert space is increased, not decreased, by entangling the particles and making the wavefunction non-separable. Not only that, iv) predicts that the \(N\)-particle entangled system cannot outperform even a single spin-\(\frac{1}{2}\) particle.

    First, let me say that I know advocating for iv) makes it seem, at first glance, like I have no understanding of first-year quantum mechanics. However, I strongly advocate for using iv). To give one example, it is expected that the quantum Fourier transform, which uses entanglement, can improve frequency estimation. What is the frequency uncertainty when using the QFT? Precisely \(1/T\), independent of the number of qubits used in the QFT (see Childs, Preskill, and Renes, J. Mod. Opt. 47, 155–176 (2000)). One might argue that \(2^N\) frequencies can be discriminated when \(N\) qubits are used in the QFT. However, there is no principle in QM which rules out discrimination of \(2^N\) frequencies when using measurements of a single qubit. One very simple method to do this is to excite the qubit into its excited state, wait for it to emit a photon, then let the emitted photon pass through a spectrometer. The spatial location of the emitted photon allows an arbitrary number of frequencies/wavelengths to be discriminated, depending on the spectrometer bandwidth and resolution. However, the uncertainty in the spatial location cannot be better than \(1/T\). There are many other methods to extract more than one bit of information from a single qubit in time \(T\). One could irradiate a qubit with \(2^N\) different frequencies simultaneously and then see which one was absorbed. Or one could perform measurements on the single qubit at times \( T/2, T/4, T/8, …, T/2^N \). Theoretically I am not aware of any analysis that rules out iv).

    Furthermore, we can check for experimental evidence against iv); if any exists, then it is clear that we must rule it out. However, there is no such evidence. In fact, all available experimental evidence agrees with iv) and actually provides a significant discrepancy with standard QM predictions. The easiest comparison can be made with atomic clocks, where the frequency of a near-resonant field is estimated. The Heisenberg limit provides: \(\Delta f \ge 1/(N T) \). It is expected that this limit can be saturated by using entangled states; however, ALL experimental evidence with entangled states has an uncertainty limited to \(\Delta f \ge 1/T \), when the total experimental time and overheads are correctly considered. This is consistent with iv). This same agreement occurs when analysing phase estimation, magnetic field estimation and all quantum metrology. Furthermore, due to connections with quantum algorithms, iv) also implies that quantum computers will not outperform classical computers.

  151. Tobias Maassen Says:

    Hi,

    having read some of the comments, but not all, I do not see some of the bigger problems of classical physics mentioned.

    One example is Newtonian gravity allowing objects to reach infinite speed in finite time; another, which you mentioned, is the solidity of matter (Pauli).

    Another is the problem that gave rise to quantum physics in the first place: the ultraviolet catastrophe. It might be hard to think of alternatives post hoc, but Planck’s quantization is the easiest explanation of radiation that does not produce infinite intensities or perpetual energy machines.

    Are there cellular automata / strings / other theory candidates which conserve energy or a similar quantity?
    Would these allow for a complexity similar to chemistry?

    I believe (in the religious sense) in a theory more fundamental than QM, but our human understanding of math is still inadequate.

  152. James Gallagher Says:

    I think the Born Rule is good evidence for an Anthropic Universe since there is no good reason for it to be selected amongst all even power rules apart from being “most likely”.

    Pauli already anticipated this in his not-so-well-known textbook General Principles of Quantum Mechanics, p. 15:

    (Only the results will show that one can do without quartic forms or forms of higher degree.)

    (he means experimental results)

  153. Philippe Grangier Says:

    Disagreement with Andrei #135 and #137: there is (at least) a third way besides superdeterminism and nonlocality (Bohm), namely predictive incompleteness, see https://www.mdpi.com/1099-4300/23/12/1660 . It does not require one to ‘abandon standard QM’, but rather to look at it in a slightly different way.

  154. Dan Says:

    I don’t know the answers to Q1 and Q2.

    I know that the answers to Q1 and Q2 are (most probably) disappointingly simple.

    Plus, the simulation theory seems plausible iff the outer world is based on a similar ruleset (QM/classical physics).

  155. Andrew Matas Says:

    Let me grant that you are right, and that there is a theory which can create a complex world with chemistry, and which runs on classical laws, for appropriate definitions of “complex”, “chemistry”, and “classical.” I think it is very plausible that a set of definitions can be made where such a theory is possible. Let’s call this theory X.

    It seems like your question then boils down to: a priori, why would we be “forced” to choose quantum mechanics over X, for some definition of “forced”? Well, “forced” can’t mean “quantum mechanics is the only logical possibility”, since we are presupposing that X exists. So you must mean something like “why is quantum mechanics aesthetically more pleasing than X?” or “why do the foundational principles of quantum mechanics feel psychologically more ‘inevitable’ than the principles of X?”

    Can you define an objective “plausibility” metric by which we could rank quantum mechanics and X? I can’t think of one that doesn’t boil down to a subjective, personal preference.

    The only way I can imagine answering Q1 in a scientific way is to prove that X doesn’t exist, but I actually do agree with you that if you allow your definitions to be sufficiently broad, X probably does exist.

    Without nailing down some of these definitions for what set X should be drawn from (do cellular automata count?) and how to compare X and quantum mechanics, I fear this question can be answered in 10 different ways by 5 different people, so is scientifically meaningless.

  156. Andrew Matas Says:

    Following up from an earlier comment I wrote, which isn’t published yet 🙂

    Perhaps one way to formalize something like question 1, is: “What is the minimal set of assumptions such that quantum mechanics is the unique theory that can explain physics?”

    In other words, we start with a set of plausible physical theories, then apply conditions to that set until quantum mechanics is the only thing left. For example, this may involve requiring the theory can produce “chemistry,” and giving a concrete definition of “chemistry”.

    Then one could ask whether the assumptions that are needed to rule out non-quantum theories are “weak” or “strong”, by some measure.

    This is similar to how we understand GR — it is the unique low-energy theory of an interacting, massless, spin-2 particle. That’s not to say it is the *only* theory of gravity, just that to the extent the assumptions seem plausible and simple, you should believe GR is a good description of gravity.

    I tend to agree with an earlier comment, that requiring *local* physics that does not distribute energy to very short wavelength modes by equipartition, is probably a very powerful condition that rules out a lot of non-quantum options.

  157. Scott Says:

    Vladimir #86:

      Scott #33 left me completely flabbergasted. 25 adjustable parameters seem like too big a price to pay for explaining all of chemistry to you? Which features of chemistry you think can be reproduced by cellular automatons with less than 25 rules?

    See my comment #99 (many other people’s questions are also addressed there). Briefly, I don’t mean our chemistry, I mean some chemistry with the right properties to support complex life.

  158. Scott Says:

    Lars #88:

      And, if the failure to unite QM and GR despite a century’s worth of effort on the part of the smartest humans is any indication, it would certainly appear that God has actually achieved “maximal confusion”, even if she was only shooting for minimal.

    I mean, the only reason physicists have been struggling to reconcile QM and GR is that they … figured out QM and figured out GR, to the point that they know the precise rate at which black holes emit Hawking radiation, despite never having been within several hundred light-years of one, and despite the process taking easily a googol years for the black holes at the centers of galaxies!

    When you think about it that way, it’s staggering how much we do understand. One could easily imagine universes that were far more confusing than this one.

  159. Scott Says:

    Guy #89:

      Stephen Wolfram’s project. Isn’t that exactly his thing… answering Q1 by studying strong emergence from CA?

    Right, but he never comes close to answering it. He starts by ignoring QM, then unconvincingly grafts on bits and pieces of it by fiat, never clearly defining what’s the Hilbert space, etc. Having said that, it’s true that CA-like models do seem more “natural” than our laws of physics in the space of computational possibilities! So then the question becomes one of explaining why those turn out not to be the right models for our world. The second step is important and is the one Wolfram never really takes! For more see my comment #99.

  160. Scott Says:

    Chris #94:

      You should expect dynamical evolution to be continuous, and in particular that transitions between states should be continuous. Lucien Hardy has shown that this assumption is enough to motivate representing quantum states with complex numbers.

    You’re misremembering his result in an interesting way! From the existence of continuous, reversible transformations between pure states (plus some other unobjectionable axioms), Hardy deduces the truth of some QM-like theory, but the amplitudes could still be either real or complex. To get that they’re complex, you need an axiom saying that the number of parameters needed to characterize a bipartite state should be the product of the numbers of parameters needed to characterize the individual components. Personally, I’ve never quite understood the motivation for that axiom, other than that we “peeked in the back of the book” and already know that amplitudes are supposed to be complex.
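
    To make the parameter-counting concrete (a rough sketch, using the usual degree-of-freedom count for unnormalized mixed states, so take the exact conventions with a grain of salt): complex QM gives K(N) = N² and respects the product rule, while real-amplitude QM gives K(N) = N(N+1)/2 and doesn’t.

      def K_complex(N):
          # Real parameters of an (unnormalized) N x N Hermitian matrix
          return N * N

      def K_real(N):
          # Real parameters of an (unnormalized) N x N real symmetric matrix
          return N * (N + 1) // 2

      N1, N2 = 2, 3
      for K in (K_complex, K_real):
          print(K.__name__, K(N1) * K(N2), K(N1 * N2))
      # K_complex: 4*9 = 36 and 6**2 = 36     -> product rule holds
      # K_real:    3*6 = 18 but 6*7//2 = 21   -> product rule fails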

  161. Scott Says:

    skaladom #96:

      I don’t find it at all clear what the actual translation of your question would be, devoid of metaphors of creation and goals … is it clear to yourself what the non-metaphorical question you’re asking really is?

    Sure! What, if any, are the more basic principles that, once accepted, would make QM an unsurprising consequence?

    Crucially, I think that this question has been satisfactorily answered for other parts of physics. Einstein, famously, explained the Lorentz transformations as just logical consequences of the equivalence of inertial frames plus the special role of the speed of light. Likewise, Boltzmann explained pretty much the whole of thermodynamics as a logical consequence of the reversibility of laws of physics plus the specialness of the initial state.

    Do something analogous for QM.

  162. Scott Says:

    Object of Objects #98:

      To me, most of the curiosity around the question “why QM?” can be reduced to the curiosity around the question: why, in a quantum mechanical universe, do we find ourselves existing as classical life rather than quantum life?

    No, I’d say that at least conceptually, we know a pretty good answer to that—namely decoherence! What we don’t know nearly so well, is why we’re not classical life existing in a classical universe.

  163. Scott Says:

    entirelyuseless #105:

      I’m aware you can’t reproduce violations of Bell’s inequality with cellular automata, but that is only if you assume that there is a clear mapping between cellular automata and the things we experience … But I am saying we know for a fact that there cannot be a clear mapping like that … Given that kind of uncertainty, I don’t see how Bell’s theorem would rule out a CA like that.

    But isn’t it obvious that you can explain anything by anything else, if you’re allowed to shoehorn whatever doesn’t fit into the “mapping”? I could say the universe is explained by a kit-kat bar; GR and the Standard Model should just fall out as details of the as-yet-unknown kit-kat→universe mapping. You might call it … “entirelyuseless” 🙂

    To date, no one has comprehensibly explained to me how you get our quantum-mechanical, Bell-inequality-violating observed reality from a classical CA, in a way that isn’t so convoluted, contrived, ugly, and insight-free that you wouldn’t be vastly better off just to throw out the CA part, and talk directly about quantum mechanics. Certainly not Wolfram, not ‘t Hooft, not Sabine Hossenfelder, not Andrei on this blog … as far as I can tell, all the words they’ve produced have taken us 0% of the way toward answering the question.

  164. Crackpot Says:

    Considering one of your later comments, I’ve decided to start with assumptions, continuing the crackpot stuff:

    Assume the universe is made up of space-time.
    Assume the existence of space-time curvature.
    Assume curvature curves curvature.
    Assume that the curvature-curving-curvature results in a complex logarithmic spiral, the net result of which is that the curvature at a given distance r from a mass-energy origin (a singularity) is given by sin(ln(r))/r. (I can’t evaluate whether curvature-curving-curvature would actually work this way, so I’m positing it as an assumption.)
    Assume that curvature can be both positive and negative; that is, the complex spiral can spiral in either direction.

    Our equation sin(ln(r))/r has the distinctive property that, when considered in terms of forces, you get a striping effect: attractive and repulsive shells alternate. This is an important place to begin, because if you have a field of mass-energy/singularities of sufficient density, they’ll begin to fall into one another, forming clumps. Accumulate enough “mass” in a clump like this, and it forms a larger singularity – however, as this singularity accumulates even more mass, the repulsive force one stripe out gets stronger and stronger, until, eventually, no more mass can enter.
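
    A minimal numerical sketch, taking the posited profile at face value with unit constants: the sign of sin(ln(r))/r flips at r = e^(n*pi), which is the alternating-shell striping I’m describing.

      import numpy as np

      def curvature(r):
          """The posited profile sin(ln(r))/r (unit constants, purely illustrative)."""
          return np.sin(np.log(r)) / r

      # Sign changes occur where ln(r) = n*pi, i.e. at r = e^(n*pi)
      boundaries = np.exp(np.pi * np.arange(0, 5))
      print(boundaries)                      # 1.0, ~23.1, ~535.5, ~12392, ...

      # The sign alternates between consecutive boundaries ("stripes")
      midpoints = np.sqrt(boundaries[:-1] * boundaries[1:])   # geometric midpoints
      print(np.sign(curvature(midpoints)))                    # +1, -1, +1, -1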

    So our assumptions give rise to “particles” of uniform “mass”. Examining curvature, note that the stripes alternate in reverse; if we call “positive” curvature matter, and “negative” curvature antimatter, note that the stable “mass” of matter corresponds to the minimum mass necessary for an antimatter singularity, but the minimum mass necessary for an antimatter singularity is not in fact stable (as it doesn’t stabilize until it is much larger).

    We now have a cluster of particles of different but predictable masses, and a prediction that, for a given scale of particle, either matter or antimatter will almost completely dominate.

    Assume that matter and antimatter experience curvature in an opposing fashion (where matter would be attracted, antimatter would be repelled, and vice-versa) (this may or may not cash out to the idea that matter and antimatter travel in opposite directions in time; this ties into what I think the complex logarithmic spiral is, but that’s overcomplicated, so here’s the assumption).

    Now we get into an interesting position. Suppose we have a matter particle; starting at an arbitrary scale, there is a thick shell of space in which other matter particles are attracted, on either side of the radius of which they are repelled. Let us call this shell of space a nucleus, because that is exactly what it resembles. The outer thick shell of space is repulsive to other matter particles, but attractive to antimatter particles; let us call this the orbital space.

    Now, I’m talking about particles, but it is important to mention at this point, before I go further, that there is not a particle sitting in the middle of the curvature; the entirety of the curvature IS the “particle”. It’s tempting to point at the singularity and call that the particle, but I believe this is a misunderstanding of the geometry involved, because a stable singularity as I described it is no longer connected to the local space-time. And because of the nature of curvature, it isn’t actually meaningful to talk about a local “center” or “origin”.

    In the orbital space, antimatter particles can be captured. Because of the striping effect, stable antimatter particles are going to be substantially smaller, in terms of “mass”, than the stable matter particles they are captured by. They are repulsive to other antimatter particles; if you examine stable configurations of particles under these conditions, you basically get what look suspiciously like electron “shells”.

    The next question is, exactly how far does this repulsive phase extend? I’m going to suggest it extends quite far, to a distance of somewhere in the vicinity of 10^6 meters. Then begins an attractive phase, which I’m going to suggest ranges from 10^6 to somewhere in the vicinity of 10^12 meters, at which point a repulsive phase runs to 10^18, and so on and so forth. Approximately.

    Now, let’s suppose that we name the repulsive phase that ends at approximately 10^6 m “electricity”, and the attractive phase from 10^6 to 10^12 “gravity”, then we can arrive at certain conclusions. First, there is a minimum size for gravitically-bound mass, below which we should expect to only find electrically-bound mass. Second, there is a maximum size for the analogous structure we call a solar system, beyond which we expect to see a rapid and otherwise unexpected drop-off of mass. Third, we should expect to see a repulsive phase from 10^24 to 10^30m.

    As far as I know, we have not actually detected a repulsive force in the range of 10^12-10^18m, and the idea that protons are only repulsive up to 10^6m is probably contentious (but testable – for smaller objects, we should see magnetic anomalies at some distance from their surface, and in particular there should be some distance at which the magnetic field apparently disappears and then reverses its polarity). But in spite of this, this matches a lot of the basic structure of the universe surprisingly closely, out of relatively few assumptions.

    Of course, I have failed entirely to find a set of constants for sin(ln(r))/r which actually predict the orbital speed of our planets in the solar system. This might be due to a problem I am mathematically insufficient to deal with – if curvature curves curvature, what’s the actual radius of a given orbit?

    Returning to the actual questions posed: Does it actually work for quantum mechanics?

    I struggle to answer this, because, once you start considering probability waveforms as mass-energy distributions / curvature, it doesn’t look like it actually changes anything about quantum mechanics. You get some weird stuff, like hydrogen atoms being a meter across when measured “from the inside”, and in general all of our concepts of distances being horribly distorted by crazy lensing effects – but basically, as far as I can tell, things look more or less the same. You get a natural explanation for some forms of quantization, you get a natural explanation for why particles are the size they are, you get a natural explanation for waveform behavior. Things only really start to change when you get down to chromodynamics, as far as I know.

  165. Scott Says:

    Ted #112:

      Anyway, it isn’t clear to me that a classical Turing machine would necessarily be powerful enough to efficiently simulate processes complex enough to support intelligent life, i.e. powerful enough to “run the math in real time”. But your mileage may vary; maybe it seems obvious to you that a classical computer would be powerful enough.

    Given that life on earth seems to exploit “quantum computational effects” minimally if at all, it seems to me like the burden of proof is firmly on anyone who believes that classical computation wouldn’t be enough to simulate processes leading to complex life, and to do so with only polynomial overhead.

  166. Scott Says:

    matt #113:

      How would those mathematicians have been led to invent something like quantum mechanics?

    Obviously, because they would’ve wanted a notion of query complexity for Boolean functions that satisfied perfect composition and that matched approximate degree as a real polynomial for the n-bit OR function, and/or a model of computation that let them characterize complexity classes like PP and A0PP in terms of postselection. 😀

  167. Scott Says:

    Clinton #118:

      Why should the universe have been quantum mechanical?
      Because we asked math (or the math we know) for the answer.

    Nah, that doesn’t work.

    Because the question still stands: why should the universe have been such that, when we used math to ask for the answer, the answer that came back was “quantum mechanics,” rather than (say) “some classical cellular automaton,” which also would’ve involved math that we knew?

  168. David Says:

    This sounds like an extremely fun and interesting challenge. Of course, uninformed speculation on physics tends to veer into crackpot territory terrifyingly quickly, so please do not take the following overly seriously. That said, you asked and it sounds interesting to try to answer as best I can, so for whatever it’s worth…

    Let us suppose that universes must be mathematical, as it strains conceivability to envision a non-mathematical universe. Let us further assume that universes tend towards simplicity, as it’s easy to envision reality trying to be simple, but without this assumption, it’s impossible to conceive of anything, as matters could always have some additional complication that we haven’t thought of. If you like, envision a Tegmarkian multiverse, with all possible universes, but with simpler ones being more common, or more strongly present or the like. Or, if you prefer, imagine God making a bunch of universes, but starting with the simplest ones. Now, it seems quite certain that a quantum universe is not the simplest possible. The simplest possible universe is probably just a null set, and there are other possibilities more complex than that, but far simpler than the world in which we find ourselves. Imagine a void with a single, classical particle traversing it or the like. So if we are to assume mathematicality and Occam’s Razor (and it is difficult to imagine how we are supposed to even begin answering the question of why the universe is quantum if we don’t), the obvious possibility is that this isn’t the simplest universe possible, but it’s pretty close to the simplest universe in which an observer can ask questions about physics.

    So, if we’re creating a universe that contains an observer wondering about physics, what is the simplest way to make such an observer? They probably need some kind of substrate to start with. The simplest substrate is probably some amorphous material, but the observer needs to process information in order to actually observe. This suggests their substrate needs to change state in some manner that can be used for computation, at which point the most obvious simple solution is particles moving around, rather than an amorphous composition trying to move or undergo some complicated system of state changes. So, if we have particles that process information by their location, the obvious next step is to have some sort of force that can push them around. After all, an observer cannot exist without creating some sort of bias towards affecting something in a manner correlated with what they observed, so if our information processing is particle position, pushing or pulling on those particles seems like the simplest way to affect that processing to allow for thought or perception. So we have particles and force. If a force is to push the particles, it needs some reference point, some way of telling what direction it’s supposed to push in. Since we already have particles, why not use other particles to provide that reference point? So now we have particles exerting forces on each other. While these forces could push in all manner of directions and patterns, let’s start with attraction and repulsion as some of the simplest possibilities.

    So if we have a universe of particles with attractive and repulsive forces, that sounds like a nice, elegant way to start trying to craft an observer. Notably, this is not a quantum universe yet; we have classical particles and classical forces, and this is much simpler than trying to add in quantum effects. However, we have a problem. Attractive particles will tend to just collapse down into a frozen state of maximum compression, while repulsive particles will tend to blow apart. Neither of these states seems conducive to information processing, which we presumably need to have an observer: the collapse state will tend to be unchanging, rather than reacting to new information, while the explosion state will tend to have the particles stop interacting to a meaningful degree (unless the forces have unlimited range, at which point it’s hard to have an observer without it being disrupted by the rest of the universe (unless your observer is the entire universe, which is not what we observe in our world, and sounds substantially more complex than simply having a small observer in a larger world)). Thus, we need a form of bracing, a way to have particles interact and attract without simply collapsing into each other. If we consider our particles’ motion in terms of potential energy, i.e. the position in a potential well that can be used to push the particles around, a very simple and elegant way of bracing is to have the laws of physics basically just assert that bracing: energy cannot easily fall below a certain level. If energy is continuous, one would expect it to continuously bleed away in an attractive potential well. But if energy is quantized, that quantization can brace particles in attraction with enough distance for other interactions to occur.

    This would lead to a universe that is technically quantum, with energy only existing at certain levels. However, so far, we have a universe that looks nothing like our actual one: energy may be quantum, but the particles are purely classical points, rather than wave functions or anything like them. Can we find a reason for wave functions? Well, if our particles are points, then potential energy from particle attraction is purely a function of distance (for a given pair of particles; there could be other variables like degree of attraction or momentum of the attracted particles that might generate different amounts of energy between different combinations of particles at a given distance, but for any given pair of particles their potential energy from their attraction would vary based on distance). If energy is quantized, and energy depends on distance, that would mean that distances between particles have to be quantized. For two particles, that would be possible, but for larger numbers, and with particles moving, the geometry would quickly become impossible. Fitting particles together such that at every time step, all of the distances match the allowed distances would not work with as few as three particles in the system.

    Thus, point particles do not work, and they need to spread out to allow for energy to remain quantized. Rather than using a point with a single value, we need a function. Let’s use a simple function like a sine wave. Trying to have a particle with negative presence where the sine is negative sounds like a headache, so let’s square the sine so we don’t have to bother with that. What do we have now? For one thing, the sine wave has resonance if confined in a potential well (if its square is reflecting particle presence, then it has to have value zero at the edges of the well. Moreover, if it isn’t in a potential well, we’re not dealing with potential degenerate collapses, so we really only care about energy quantization in potential wells), which provides a nice source of the quantization we need; if anything that’s simpler and more elegant than just saying “this point particle can’t get within a nanometer of this other one, because reasons”. And crucially, this solves the problem of fitting particles together in a manner that allows them all to maintain quantized energy states. Consider an electron and a proton. They attract, and the proton creates a spherical potential well confining the electron. If, rather than being a point, the electron is spread out with its presence dictated by the square of a truncated sine wave, the simplest way this can happen is for that wave to be half a period. Square that, and you get an electron that is largely collapsed on the proton, but with some presence surrounding it. What happens if another proton happens by at an arbitrary range (i.e. the situation that quantized energy with point particles could not handle)? Now both protons together project a potential well, this time elliptical. The electron maintains a half-period wave function across the well, which when squared produces an electron primarily halfway in between the two protons, with presence projecting throughout the elliptical well, though decaying quickly. In other words, we have a system that does not experience degenerate collapse that would prevent further evolution in response to outside effects, and which can maintain its ability to keep evolving in the presence of arbitrary outside particles. Classical physics fails the first test, and quantum physics with classical particles fails the second. Thus, quantum physics with wave functions sounds suspiciously like one of the simplest possible systems that could lead to an observer, and thus by the anthropic principle, it makes sense that our universe is quantum in much the manner we observe.
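
    If it helps to picture the “half a period, squared” density, here is a tiny sketch (one-dimensional box, arbitrary units, purely illustrative):

      import numpy as np

      L = 1.0                          # width of the confining well (arbitrary units)
      x = np.linspace(0.0, L, 201)

      # Half a period of a sine wave, vanishing at the walls, then squared
      psi = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)
      density = psi ** 2

      # The density integrates to ~1 and is peaked at the center of the well
      print(density.sum() * (x[1] - x[0]))   # ~1.0
      print(x[np.argmax(density)])           # 0.5, the middle of the well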

    I would like to reiterate not to take the above too seriously! This is just wild speculation, engaged in for my personal amusement and hopefully the entertainment of Scott and the other blog readers. This is not a serious theory of why quantum mechanics exists; it’s more the intellectual equivalent of a Banzai charge: ignore the overwhelming reasons why progress here is incredibly difficult and just try to advance. Nevertheless, it is interesting that quantum mechanics with wave functions does seem like one of the simplest ways a universe could exist in which matter wouldn’t simply collapse or explode, and thus one of the simplest universes in which we might expect to find ourselves.

  169. Roger Schlafly Says:

    A cellular automaton universe would be deterministic. Humans would have no free will. If you want a world where some beings have consciousness and free will, then QM is the best known theory that allows for that possibility. You could add a stochastic process to a CA theory, but you might end up needing something like QM to explain that stochastic process.

  170. Gerard Says:

    Scott.

    There’s obviously a great deal that could be said about your questions. I’ll just offer one thought that I’ve often had. Suppose “God” wanted to create a universe in which beings could experience some kind of “freedom” (or at least the appearance of it) yet still have the whole thing work. QM seems like exactly the sort of theory you would need for something like that. On the one hand things must always approach a classical limit once the scale gets large enough, yet particles appear free to behave in an unconstrained manner at small scales. You’ve often said that the randomness inherent in QM isn’t really “free will” but I think it’s likely that randomness is just what “free will” looks like from the outside and conversely “free will” is what randomness feels like from the inside. The sort of automaton based universe you propose wouldn’t allow anything like this. Also if random behavior (ie. freedom) were completely unconstrained you would end up with a completely chaotic and unstructured universe which wouldn’t be interesting.

    Now I have a question for you:

    Is grade-school arithmetic intractable?

    Specifically, suppose you have a sequence of m signed rationals (p_i/q_i), where each p and q is an n-bit integer. Consider the problem of deciding whether their sum is positive. Can that be decided in polynomial time?

    The reason this problem seems interesting is that I suspect that the answer to that question is no (basically because the common denominator can be exponentially large and the difference from zero can be as small as 1/lcd), yet I think it can be decided in PSPACE because both division and addition can stream their output bits using only a polynomial amount of intermediate storage space.
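
    For concreteness, here is the problem written out with exact rational arithmetic (Python’s fractions module; just a reference statement of the question, not a claim about its complexity). Fraction keeps everything in lowest terms, so the real issue is how large the intermediate numerators and denominators can get.

      from fractions import Fraction

      def sum_is_positive(terms):
          """Decide whether a sum of signed rationals is positive, by exact arithmetic.

          `terms` is a list of (p, q) pairs of integers representing p/q.
          """
          total = sum(Fraction(p, q) for p, q in terms)
          return total > 0

      # Example: 1/3 - 1/7 - 4/21 = 0, so the sum is not positive
      print(sum_is_positive([(1, 3), (-1, 7), (-4, 21)]))   # False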

  171. Ilio Says:

    Why Quantum Mechanics? I suspect that’s the only theory that you can either interpret as deterministic (MWI, etc) or non-deterministic (Penrose, etc); L1 based and continuous (Bohm, etc) or L2 based and discrete (Aaronson, etc); where consciousness is special (Wigner, etc.) or not (Zurek, etc.), etc. In Bohr’s words (stolen from Nielsen & Chuang), a world where the opposite of a deep truth may well be another deep truth. In other words, my version of the Weirdness Hypothesis (re Chow #59).

    But why the Weirdness Hypothesis? I suspect that our brains do construct our subjective perceptions of reality using neural ensembles that can flip as in the rabbit–duck illusion. You can think blue tribe, or red tribe, and that’s actually just a flip in one of a few key neural ensembles describing your worldview. In other words, the universe is what it is so that you can understand it no matter if you are blue tribe or red tribe, or neither (grey tribe) or both (apolitical). In other words, my version of the anthropic principle.

    As always, anthropic principles tend to lack meat, but noticeably less so for Q1: the Universe *is* classical, if you want to interpret Her this way. So Q1 is only asking why *you* prefer to think in terms of L2. Maybe some alien birds prefer quaternions because that’s a better fit for/from 3-D flight simulation? Anyway, thanks for the fun this question brings. 🙂

  172. lewikee Says:

    Question for chemists: How often can you say “It is only because the constituents of these molecules are in various superpositions of states and interfering with each other that this phenomenon can happen”?

    The more I think of Q1, the more I think you will only get a satisfactory response from looking at a quality attempt at creating a CA/classical foundation that leads to complex chemistry. Then look at the biggest hurdles (or lack thereof!). If you see someone do a very good job at coming up with a theory, but she still says “I’ve tried every which way, but without *insert QM property here* I just can’t attain the required level of complexity” then you’ve got something to hold onto.

    Right now if someone gives you what seems like a good answer why QM is not necessary, there will always be some uncertainty until you can see the idea in “action”.

  173. fred Says:

    Comparing various models of universes first requires establishing everything clearly in terms of equivalent resources.
    If space and time are described as a continuum, resources become infinite and it’s impossible to simulate anything perfectly in order to compare things objectively from a computational-resource point of view.
    E.g., simulating perfectly even something as simple as a classical double pendulum (in a space/time where positions are described by reals) is intractable given any finite amount of computational resources.
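
    As a toy illustration (a sketch only, using the logistic map as a stand-in for the double pendulum, since both are chaotic): the same deterministic rule, run at two different finite precisions, stops agreeing after a few dozen steps, so no fixed finite precision reproduces "the" real-valued trajectory.

      import numpy as np

      def logistic_run(x0, steps, dtype):
          # x_{n+1} = 4 x_n (1 - x_n): a simple chaotic map standing in for the pendulum
          x = dtype(x0)
          for _ in range(steps):
              x = dtype(4) * x * (dtype(1) - x)
          return x

      print(logistic_run(0.2, 60, np.float32))   # single precision...
      print(logistic_run(0.2, 60, np.float64))   # ...and double precision no longer agree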

    If things were fundamentally discrete, we could hope to maybe show that one model of computation can’t be “compressed”: the full set of resources and computational steps is required to reproduce a certain evolution from initial conditions, with no possible shortcut. Then we could declare that this model is truly fundamental (but the problem is that we can never be sure that a certain computation is the shortest/cheapest one, with Kolmogorov complexity, etc).

    So:
    1) what are the fundamental resources (how are time and space represented)?
    2) is the evolution (as a computation) compressible or not?

    I guess this is pretty much what S. Wolfram is doing: using very basic/finite/discrete structures at the bottom (basically graphs), evolving them using very simple computational rules, and observing the emergence of things like QM and general relativity and maybe the standard model.

  174. Philippe Grangier Says:

    Scott, from your comments you are definitively not a physicist (this is not personal, I could tell the same to many computer scientists). Let’s take a restaurant comparison : given a good dish, a physicist will appreciate it, describe it, maybe measure it, and give some conclusions that allow other folks to recognize this dish, know what can be expected from it, and compare it with other ones. But you want much more than that, you want to know all the tricks used by the Grand Chef to design the dish – even more, you want to be the Grand Chef himself, to enter all his most intimate thoughts.
    Back to reality: there may or may not be a Grand Chef; anyway, the point is that in physics the experiments tell us what is going on, and what must be described and predicted. This is the true problem that the creators of QM had to solve, in the first quarter of the 20th century – and they did find an answer. If you ignore this very down-to-earth reality, and want to move to pure abstraction (or prescription), asking ‘why this, and not that’, I’m afraid that you will get lost in metaphysics. Please have a look at https://arxiv.org/abs/2105.14448; it is short and easy to read (except maybe the Appendix), and it may suggest an interesting way to go.

  175. fred Says:

    I know that Scott doesn’t like the argument: “quantum mechanics is mysterious!… consciousness is mysterious!… therefore the two must be related!”

    But… maybe the fastest way to prove that QM is required is to show that consciousness is fundamentally a quantum process!

    For now we can say that we don’t observe the opposite:
    if consciousness were just the result of simple digital computations, then it wouldn’t require all the fanciness of quantum mechanics on a spacetime continuum to “summon” rich worlds teeming with conscious beings: nature/God would just have had to find a way to summon something as simple as the Game of Life, i.e. we’d be living in an actual universe that’s pretty much like playing Grand Theft Auto in Virtual Reality.

  176. David Says:

    @Comment 169:

    It is unclear either that humans have free will or that quantum mechanics makes free will possible. The concept of free will seems rather awkward: if our choices are deterministic then we clearly don’t have free will, and random choices don’t seem to fit the bill either. What is neither deterministic nor random? In addition, the many worlds model is probably the most likely form of quantum mechanics; in many worlds you end up in all branches of the wave function rather than choosing. Moreover, while quantum randomness is certainly a thing (from our point of view, if perhaps not from the point of view of an observer outside the universal wave function), as far as I know, human decisions are not generally affected by it. Many phenomena are effectively deterministic. Light reflecting off a mirror is a quantum process, but the way the wave functions add up results in the angle of reflection always equaling the angle of incidence. We would not say that because quantum mechanics exists, we cannot figure out how light will reflect!

  177. fred Says:

    Theories are tied to observations.

    Why the hell didn’t Newton go beyond his elegant theory of force and gravity and see that General Relativity was just an even better and more complete theory? Because Newton’s theory fitted the data as well as one could ever hope to, at the time. After all, how much simpler than F=G.m1.m2/r^2 can it get?!
    He would have been quite astonished to see that the GR equations are the more fundamental, more accurate answer.

    Whether it’s 1700 or 2022, the same amount of caution should be used.
    Especially when we’re so clearly not done (quantum gravity, the measurement problem, dark energy, black holes, etc, etc).

  178. Edmond Says:

    I’m afraid I don’t have anything extremely interesting to add to this conversation (which comes with having more of a biology background, I suppose — this isn’t my field by any stretch!), but for what it’s worth: while I recognise the non-stupidity of the Two Questions, I have never agonised about them overmuch because I’ve long been drawn to something like the Tegmarkian answer. Any universe-system that can be described in self-consistent mathematical terms “exists”, not because some God or matrix-engineer decided to run every self-consistent mathematical universe on a physical substrate for whatever reason, but because it really is math all the way down; we’re all just pieces of abstractly-“existing” maths, perceiving themselves.

    (I say “universe-system” rather than “universe” because at that level of thinking, it seems to me that the complete MWI branching set of “universes” we inhabit should be taken as a whole and considered one of the distinct, mathematically-independent “worlds”; if there is a universe that runs on purely Newtonian physics, it stands as the ‘equal’ of our entire Many Worlds multiverse, not of any one branch of the wave-function.)

    Where I diverge from “serious” Tegmarkians is that I’ve never seen the need to concern ourselves with the idea that more complex universes might be “less” “real” than simpler ones, and the ramifications of this idea in terms of whether QM really is “simplest”. This is what makes me feel a little inadequate bringing it up, because it makes my ‘answer’ a bit of a conversation-stopper… there’s very little left to think about or investigate. But I can’t help the fact that I feel it’s just… true. If there are any orthodox Tegmarkians in the audience, and if our good host Scott will permit, I would appreciate an explanation for why you add this extra term to the proposition, rather than sticking with an uncompromising, straightforward “if you can compute it, it’s Real, end of story”. Am I missing something?

  179. Greg Guy Says:

    Suppose God was going over the construction of Reality one day and decided that the weird correlations in QM that give us Bell’s results were annoying, and so decided to get rid of them. What would the effect of that be? I realise that human-made artifacts such as quantum computers would no longer work, but what about the natural world? What changes would we observe? Does anyone have an answer or is it a somewhat meaningless question?

  180. Aram Ebtekar Says:

    Which of Hardy’s axioms do we find strange? I don’t know yet what to make of the composite systems axiom used to rule out real-valued amplitudes. However, I feel the rest come from a desire to get the best of both worlds: continuous and discrete. Among classical theories, continuous ones have more symmetries, which we want. However, lacking a natural length scale or uncertainty principle, they allow essentially unbounded amounts of information and computation within finitely bounded regions. Following Hardy’s reasoning, it seems QM-like theories come from any attempt to resolve this tension.

  181. Pavlos Says:

    As stated, Q1 is easy. 😛 Where would the fun be for God in creating a classical, i.e. deterministic, Universe? All events in such a world would be equivalent to the initial conditions. Any chaos-type complexity-driven pseudo-randomness would be unbearably boring to such an almighty Being.

    In other words, why would the Universe have a time dimension, if everything was pre-determined? It seems superfluous, not something a God would bother with. A Cauchy slice should be enough for this Creator. A painting, rather than a movie. Quantum mechanical (true) randomness is the source of all adventure. In fact, one could argue that the Newtonian view of position and its rate of change being known in the same instant, making a trajectory equivalent to any of its points and turning us all into hapless automata, is the one that should look suspicious. Negating the observer is no way to go in a science based entirely on observer data. 😉

    With Q1 answered 😂 Q2 becomes: Which mathematical structures are appropriate for a theory of physical phenomena that are fundamentally random? Now that looks like it might involve some non-trivial mathematics, but at least we know the answer (a complex Hilbert space?).

  182. Ken Says:

    Obviously we’re all flirting with crackpottery here, but given that: my guess would have to do with there being anything at all. How to create a universe out of nothing, stably, without blowing up, generating not a formless blob but intricate complex structure, however you want to characterize that. That somehow — and no, I can’t say exactly how — QM (or QFT, or whatever the right generalization of QFT is) allows that, or even insists on it, and no other system would. Once you’ve got a lot of stuff, there might be many possibilities as to the dynamics that drives it. But how do you self-consistently get a lot of stuff, and interesting, complex stuff?

  183. Scott Says:

    Philippe Grangier #174:

      Scott, from your comments you are definitively not a physicist (this is not personal, I could tell the same to many computer scientists). Let’s take a restaurant comparison : given a good dish, a physicist will appreciate it, describe it, maybe measure it, and give some conclusions that allow other folks to recognize this dish, know what can be expected from it, and compare it with other ones. But you want much more than that, you want to know all the tricks used by the Grand Chef to design the dish – even more, you want to be the Grand Chef himself, to enter all his most intimate thoughts.

    “What really interests me is whether God had any choice in the creation of the world.” –my fellow computer scientist, Albert E. 😀

  184. Dan Says:

    “There is a loophole, but I’d say it’s so extreme as to prove the rule. Namely, one can’t rule out that someone used, e.g., a giant Game of Life board to create a “Matrix” that’s running our entire quantum-mechanical universe as a computer simulation! To me, though, this just seems like an instance of a more general point: namely, that nothing in physics can rule out the possibility that the whole observed universe is a simulation, an illusion, or a lie. (The idea of “superdeterminism” goes to this same extreme, even though it strenuously denies doing so.)”

    Holy cow. I imagine the creator coming up with such a mindfuck idea for a simulation. He must have been like, “shite, this would be crazy 🤪. Just need to find a big board. I might as well simulate that one observer (Scott) realizing this brilliant idea and commenting about it on a random remote niche blog. No verification algorithm can be simulated on the board to disprove this, btw.”

  185. Tim Maudlin Says:

    Re #161:

    “Einstein, famously, explained the Lorentz transformations as just logical consequences of the equivalence of inertial frames plus the special role of the speed of light.”

    If this is the sort of thing you have in mind, then I think one needs to think through this case more clearly. From the point of view of GR 1) there generically are no global inertial frames, or even exact local ones and 2) there is no such objective physical quantity as the “speed” of anything, including light. Therefore this “explanation”, whatever you have in mind, is not only not simple, it is incorrect. And not for subtle reasons like QM. It’s just not what really explains anything, even from the point of view of Relativity.

  186. Denis Kuperberg Says:

    Why do you except there should be a satisfying answer ? Unless we take “God” literally in your question (as we should not, based on your comment #24), I don’t see any reason to believe we can expect an answer to this question. Your position seems to assume that the rules of the universe have been defined optimally in some sense. Indeed, the word “God” in your question is not just a proxy for “the rules of Nature”, but implicitly justifies this additional assumption of well-chosen rules. This choice of words could be a hint that the assumption stems from the conscious or unconscious influence of millennia of religious thinking. In my opinion, using “God” in this kind of question, even jokingly, opens the door to many biases and wrong preconceptions.
    Maybe there is an answer that we would find more pleasant than “it’s just how the world is”, but to me that would be a good surprise rather than a necessity.

  187. mjgeddes Says:

    Scott #161

    “What, if any, are the more basic principles that, once accepted, would make QM an unsurprising consequence?”

    (1). The notion of “Objective Reality” is a limit that only makes sense after infinite elapsed time from the perspective of observers within reality.

    (2). The basic design principle of reality is ‘Actualization’. Reality begins from a ground state of ‘possible worlds’ (non-constructive math), some of which start to get actualized (become actual worlds). Actualization simply means that an *objective* description of these worlds can increasingly be given purely in terms of computation (i.e, constructive mathematics). From (1), this process continues forever; worlds are always only in various *degrees* of actualization, which is the *measure* of their existence.

    (3). To ‘actualize’ reality, there are 3 conditions:

    (a). the whole can be decomposed into understandable parts (compositionality)
    (b). the parts can combine into larger integrated systems (complexity)
    (c). the parts affect each other in limited, logical ways (causality)

    (4). Quantum mechanics is simply a special case of the general ‘theory of actualization’, which explains the physics of conditions (a), (b) & (c) above. The 3 conditions together give reality the property of ‘comprehensibility’, which is equivalent to ‘actualization’. Comprehensibility is the ease with which observers within reality can understand it.

    (5). Hilbert space is only a description of the space of possible worlds; it does not account for the actual process of actualization (properties a, b, c), which are expressed as: (a) computational topology, (b) function spaces, (c) computational geometry.

    (6). The full ‘theory of actualization’ is about the mapping between (1) Hilbert space, (2) Computational Geometry & (3). Space-Time. (1) is about the ground-state of reality (the space of possible worlds), (2) is about the actualization of reality (how reality is made comprehensible) & (3) is the actualized structure of reality (the observed physical world).

    Simple 😀

  188. Age bronze Says:

    There are three properties of the universe that are easy to implement in simulations if you give up any one of them, but I don’t see any way to implement all of them together, and I suspect quantum mechanics is the simplest solution that has all three.

    1. Seemingly Euclidean geometry (unlike cellular automata, which only have a discrete number of rotations). I say seemingly because gravity distorts it a bit.

    2. Discreteness of states. Everything is obviously easy with infinite constructions.

    3. Reversibility. (That’s the unitarity of the time evolution.)

    (4? Turing completeness / non-trivial interactions / anything preventing you from just calling a non-interacting model a solution.)

    Go ahead and try to satisfy all 3 of those conditions. 1 and 2 without 3 is easy: just use floating-point numbers for your simulation, like most physics engines. Giving up 1, you can satisfy 2 and 3 with Wolfram / Gerard ’t Hooft-style universes, but you’ll never recover Euclidean geometry. And giving up 2, that’s just classical physics with real numbers, but it can never run on a computer.
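
    To see concretely why the floating-point shortcut gives up reversibility (just a sketch of the standard rounding issue, nothing specific to any particular physics engine): running a step backwards does not return the exact starting state, because the rounding in the forward step destroyed information.

      x, step = 0.1, 0.2
      x_forward = x + step        # one forward "time step" of a floating-point universe
      x_back = x_forward - step   # try to run the same step backwards
      print(x_back == x)          # False on IEEE-754 doubles: rounding is not undone
      print(x, x_back)            # 0.1 vs 0.10000000000000003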

    Whether our universe has discrete states is still up to new research in physics, but assuming it does, I think it’s an absolutely amazing feat to achieve just those 3 properties of our universe. I challenge you to implement all 3 of them in any toy universe.

    How is quantum mechanics related to those 3 properties? If you try to satisfy those 3 properties naively, with particles that have definite positions and just move every time step, you will have to have a finite number of directions to move in every time step, and no matter how much you zoom out, the metric will never be Euclidean.

    (*** you can try to implement a universe with rational numbers as coordinates, but you have to give up on locality, or you’ll end up with extremely weak interactions)

    So I suspect that by smearing particles over space, as quantum mechanics does, you can get a universe which seems Euclidean, yet with discrete states and without doing irreversible things like floating point and rounding errors. It’s all about finite-dimensional representations of rotation groups. Spin and angular momentum are ways to implement smooth rotations with a discrete number of states and in a completely reversible way.

    Even the weirdness of Bell tests is all about your ability to rotate freely and measure at different angles. If you had just 6 spatial directions, like in a grid, you couldn’t do it.

    Spin and qubits are not there for superposition or quantum weirdness or randomness; they are just the simplest representation of SU(2), and they are required to make a universe with Euclidean geometry, discrete states and reversibility.
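
    As a small numeric sketch of that last claim (assuming numpy and scipy): the spin-1/2 representation gives a continuous family of rotations acting on just two discrete basis states, and every one of those rotations is exactly invertible.

      import numpy as np
      from scipy.linalg import expm

      sigma_y = np.array([[0, -1j], [1j, 0]])

      def spin_half_rotation(theta):
          # Rotation by angle theta about the y axis, acting on a 2-state system.
          return expm(-1j * theta * sigma_y / 2)

      U = spin_half_rotation(0.3)
      print(np.allclose(U @ U.conj().T, np.eye(2)))   # True: exactly unitary, hence reversible
      print(np.allclose(spin_half_rotation(0.1) @ spin_half_rotation(0.2), U))   # True: rotations compose smoothly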

  189. Dániel Varga Says:

    entirelyuseless #46:

    I think the actual answer will involve determining what the universe needs to look like to an observer, given the fact that no observer can observe itself perfectly, even in theory.

    E.g. what would the universe look like to an AI instantiated in Conway’s Game of Life?

    It surely would not look like Conway’s Game of Life, given that the observers on the board could not have an exact model of themselves, which they could, if they could look at themselves the way we could if we saw the board.

    I would like to signal boost this profound observation, and hopefully take it a bit further. Research proposal:

    1. Create an environment in a stylized universe like Conway’s Life or Fredkin and Toffoli’s billiard table.

    2. Create an agent that is part of this environment. We can assume that the universe is Turing-complete. The agent has to be smart enough to create models about its environment, but stylized enough so that we do not have to design a full-blown artificial general intelligence. It is not intelligence that matters for our purposes, but the ability to alter, predict and model one’s surroundings. We do not even have to consider constraints like how the agent evolved; we can simply assume that it appeared in its full glory as some Boltzmann-brain-with-eyes-and-hands. But it is important that the brain is part of its environment, not some ghost in a machine. (A toy sketch of what steps 1-3 might look like follows this list.)

    3. Formalize what it means for the agent to make observations and models about its environment. Figure out what kind of observations the agent is capable of doing in principle.

    4. Investigate the relationship between the actual laws of the universe, and the laws of the universe as discovered by the agent.
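
    Here is a minimal sketch of what steps 1-3 might look like, with toy choices throughout: the "universe" is a one-dimensional cellular automaton made exactly reversible by a second-order construction (new state = rule applied to the current neighborhood, XORed with the previous state), and the agent's observations are restricted to a small window of the tape.

      import numpy as np

      N = 64
      rng = np.random.default_rng(0)

      def step(prev, curr, rule=110):
          # Second-order update: next = rule(current neighborhood) XOR previous.
          # This makes any elementary CA rule exactly reversible, since
          # prev = rule(curr) XOR next recovers the earlier state.
          idx = 4 * np.roll(curr, 1) + 2 * curr + np.roll(curr, -1)
          rule_bits = np.right_shift(rule, idx) & 1
          return curr, rule_bits ^ prev

      def observe(curr, lo, hi):
          # The agent never sees the whole tape, only a small window of it.
          return curr[lo:hi].copy()

      prev = rng.integers(0, 2, N)
      curr = rng.integers(0, 2, N)
      for t in range(20):
          prev, curr = step(prev, curr)
      print(observe(curr, 20, 30))
      # Steps 3-4 would compare the true rule against whatever model the agent can
      # fit to such windows; cells just outside the window influence its next
      # contents, so no purely window-level model can be exact.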

    Adding my own spin to this: I really believe we need to consider thermodynamics next to (or immediately after) QM when working on Scott’s Q. I’m saying this because we can’t answer the question without talking about time, and according to the block universe viewpoint I subscribe to, we can’t understand time without understanding how it emerges from the more fundamental notion of spacetime via thermodynamics. So I would add an extra constraint to the above research proposal. This constraint is a huge PITA for us human investigators, but might bring us closer to something valuable:

    5. Do not “hardwire” the concept of time into the laws of the toy universe. Rather, any concept of time should reference the agent’s internals, similarly to how GR defines time via ticking clocks, which are very specific patterns in spacetime.

    So the order in which the concepts build on each other is supposed to be 1. spacetime 2. agent 3. states/memories of agent. 4. time. That’s a very counterintuitive approach for us humans, but IMHO it is the right one. (And to dispel a possible misunderstanding in advance: this does not mean that time is subjective. I mean, time is kind of subjective in the above proposed toy block universe inhabited by a single Boltzmann brain Turing machine, but in our universe, it is not more and not less subjective than what’s prescribed by GR.)

  190. Scott Says:

    Denis Kuperberg #186:

      Why do you except [sic] there should be a satisfying answer ?

    Isn’t the entire point in science that, whatever you know, you try to explain as a consequence of something deeper … and when you find it, you try in turn to explain that, and so on as long as you can? Yes, maybe you and everyone else will eventually give up in despair, or maybe someone will explain why there’s no explanation of the sort you’d wanted (as with individual quantum measurement outcomes), and then that particular direction of inquiry is at an end. Until one of those things happens, though, the hunt for explanatory satisfaction continues, just like it did for Darwin and Boltzmann and Einstein and all the rest who refused to shrug and say “it’s that way because it just is”!

  191. George H. Says:

    Oh boy! First, you know way more QM than humble me… and I hope you’ll do a Bell inequality post (with photons) someday. Second I didn’t read all the comments, so this has probably been said.
    QM (as a model of the world) is true because of experiment (as you said); that’s the ‘how’ of the world that science can do. The ‘why’ of the world does not seem open to science. It is a fun parlor game to ask what would be different if various fundamental constants were changed by a factor of ten.
    On a personal level, I like the idea that it’s hard to confine a thing too tightly.

  192. John Stricker Says:

    Interesting topic…

  193. Scott Says:

    Peter Gerdes #122:

      Look, I don’t really believe this but it’s an interesting line of thought.
      Suppose you are some kind of dualist (in the sense of us having something like souls or being in something like the matrix) and you want a rules for the physical world that don’t expose this fact. In other words you want our choices to be governed by the operation of something outside the physical world but you want the physical world to appear causally closed even when you look at neurons under the microscope.
      QM gives you a really nice way to do this as you can simply evolve the wave function forward globally and then choose the branch you want to take globally based on which best matches the desires of the non-physical ‘souls’ or individuals externally placed into the simulation or whatever.

    Some others suggested this too. Since this post is already tagged “Embarrassing Myself,” I might as well come out and say it: I’ve thought for a long time that something in this general vicinity is one of the main possibilities on the table. Having said that, it’s actually nontrivial to spell out what QM gives you, that you couldn’t have equally well gotten from some classical theory with randomness and nondeterminism! And it’s also nontrivial to spell out why we’d still see the Born rule holding always and everywhere as far as we can test it, even when we look at neurons under a microscope, etc.—if so, then where does the Knightian unpredictability come in?

    My Ghost in the Quantum Turing Machine essay, from back in 2013, represented one attempt to articulate an answer to these questions. I didn’t focus in that essay on the “explaining QM” angle, or on the possibility of other, non-quantum physical theories that could give rise to the same sort of Knightian unpredictability, but that would be a direction to go if I ever revisited these ideas.

  194. Gerard Says:

    Scott #190

    > Isn’t the entire point in science that, whatever you know, you try to explain as a consequence of something deeper … and when you find it, you try in turn to explain that, and so on as long as you can?

    I wouldn’t say that was the entire point of science. Certainly providing a reasonably strong rational foundation for our efforts to engineer a world more in accord with our desires seems like it would alone be reason enough to pursue science and mathematics.

    As for whether science can ever provide a satisfactory explanation for our deepest questions about why things are the way they are, I don’t think so. I do however think that asking such questions is an important stage of the path to the truth.

    The problem with science is that it gets its epistemology exactly backwards. It starts with what it sees “out there” when in fact what we see “out there” is something we can never have certain knowledge of, as most philosophers have understood at least since Plato. The one thing that we do have certain knowledge of is “I AM” and I think the fact that your own religious tradition equates that statement with the name of God is something it would be worth reflecting on.

    PS. As for the question I asked in my last comment, sorry, I fear my feeble monkey brain failed me once again and I forgot that arithmetic is polylog time, not polynomial time, in the quantities on which it operates.

  195. Scott Says:

    Tim Maudlin #123:

      I get that *as a computer scientist* you regard cellular automata as “simple”. But thinking as a physicist, in a world where any sort of computer (and especially a digital computer) is a very complex, emergent thing, I don’t see *any* of them as simple. You think in terms of *writing* or *abstractly specifying* the rules under which they operate. I think in terms of *physically implementing* those rules, which requires a lot of complexity. So what comes natural from your discipline as a “simple” system looks to me as extremely complex. For example, actually physically implementing Conway’s rules for Life is a hell of a lot more complex than, say, F = mA, which isn’t a computation at all.

    If you let yourself take the laws of physics that we happen to find in this universe as the standard of simplicity, then of course anything else is going to look more complicated by comparison! But that’s obviously rigging the game.

    If you compare, say, Conway’s Game of Life against Newtonian mechanics in any non-rigged contest of simplicity, one that wasn’t specifically constructed to favor the latter, then it seems to me that Conway’s Game of Life wins hands down, just wipes the floor with Newtonian mechanics. It’s not just that it’s easier to write a program to simulate it in any programming language I’ve ever seen or heard of (though it is). The Game of Life is also immensely easier than Newtonian mechanics to explain to a child (believe me, I’ve tried both!). It would be far simpler to define in ZF set theory. If it can ever mean anything objective for one thing to be simpler than another thing, then the Game of Life, with its grid of bits replacing real-valued positions, velocities, accelerations, and masses, is simpler.
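
    To make the “easier to write a program” claim concrete, here is essentially the entire physics of the Game of Life as a minimal sketch in Python (nothing beyond the standard library assumed):

      from collections import Counter

      def life_step(live_cells):
          # One update of Conway's Game of Life. live_cells is a set of (x, y)
          # integer coordinates. A cell is alive next step iff it has exactly 3
          # live neighbors, or it is currently alive and has exactly 2.
          neighbor_counts = Counter(
              (x + dx, y + dy)
              for (x, y) in live_cells
              for dx in (-1, 0, 1)
              for dy in (-1, 0, 1)
              if (dx, dy) != (0, 0)
          )
          return {cell for cell, n in neighbor_counts.items()
                  if n == 3 or (n == 2 and cell in live_cells)}

      # A glider, the classic moving pattern, on an unbounded grid:
      cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
      for _ in range(4):
          cells = life_step(cells)
      print(sorted(cells))   # the same glider, shifted by (1, 1)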

    But of course, the Game of Life is not our world, which means that we have an actual opportunity to learn something important! Is there, for example, a compelling a-priori reason why we should never have expected to live in a world that wasn’t Lorentz-invariant, or at least Galilean-invariant, or at least rotationally invariant? Maybe thinking about it will uncover reasons that weren’t immediately obvious. You’re a philosopher; you believe in thinking, right? 😀

  196. Daniel Harlow Says:

    I think you are overly dismissive of argument #2 that the world as we know it would be impossible without quantum mechanics. In order for us to be having this discussion at all, the laws of physics need to have the ability to generate interesting complex structures in a reasonable amount of time starting from a simple initial state. Now I know that as a computer scientist you are trained to think that is a trivial problem because of Turing completeness, universality, blah blah blah, but really I don’t think it is so simple. Why should the laws of physics allow a Turing machine to be built? And even if a Turing machine is possible, why should one exist? I think the CS intuition that “most things are universal” comes with baked-in assumptions about the stability of matter and the existence of low-entropy objects, and I think it is not so easy to achieve these with arbitrary laws of physics.

  197. Sid Says:

    Scott #192

    You may be interested in this paper, which derives that the only possible reference-frame transformations are Galilean and Lorentz given 4 intuitive axioms – http://o.castera.free.fr/pdf/One_more_derivation.pdf

  198. ppnl Says:

    Maybe God was trying to maximize the computational power of the universe without making it as all-powerful as he/she/it is. Can you imagine a set of physical laws that allows computers more powerful than quantum computers but does not destroy computational complexity?

    P=NP might be a bad idea.

  199. Corey Says:

    @Scott #99:

    “But, to say it one more time, what would’ve been wrong with a totally different starting point—let’s say, a classical cellular automaton? Sure, it wouldn’t lead to our physics, but it would lead to some physics that was computationally universal and presumably able to support complex life (at least, until I see a good argument otherwise). ”

    I have a meta-question regarding this exercise. Suppose for the sake of argument that your conjecture is correct and it is indeed possible to write down a set of rules for a CA, the emergent behavior of which is sufficient to support complex life. Not only that, but the complex life supported by this universe is sentient and experiences consciousness in a manner entirely compatible with our own experience. In fact, if you were to simulate this CA long enough (suppose this was proven constructively so you actually have the list of rules on a piece of paper) eventually there would emerge a sentient being in this universe named Aaron Scottson who asks about his own universe “what would’ve been wrong with a totally different starting point—let’s say, unitary evolution of vectors in Hilbert space?”

    All of this is a long-winded way to ask: Suppose there really wasn’t anything wrong with choosing a different, classical, starting point. How would this knowledge morph the nature of your inquiry into QM?

    Put another way, what would be the necessary conditions needed to, in your mind, transform this from a physics question into a pure metaphysics question?

  200. Scott Says:

    Also, Tim Maudlin #123:

      Finally—and I’m a little upset to have be even writing this—the additional local “variables” (beables) in Bohemian mechanics are neither emergent (they are fundamental) nor hidden (collectively, they are what you can most easily see). “Standard QM”, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even the ballpark of answering the questions you are asking. Bohemian Mechanics—whatever other issues you have with it—is.

    Tim, I think you should totally develop Bohemian Mechanics. 🙂

    Seriously, though, I told you we weren’t going to agree, but FWIW, I’m well-aware of the fact that the “beables” (my phone keeps recommending “Beatles”) are the only observable thing in BM. The trouble, as you know as well as I, is that either a beable value is tied up with something decohered, macroscopic, and classical, in which case every interpretation of QM will just talk unproblematically about its value (or its value in “our” branch of the wavefunction) with no need for the beable, or else the beable value is tied up in an as-yet-unmeasured quantum state, in which case our knowledge of the probability distribution over beable values is simply whatever follows from our knowledge of the quantum state itself. In neither case does consideration of the beable, in practice, give us any information whatsoever that we didn’t already know. (Though the mathematical fact that one can add such things to the quantum formalism without contradiction is, I agree, important and interesting.)

  201. Sid Says:

    Daniel Harlow #196:

    I’m not sure about Turing machines, but there are decision versions of, for example, the n-body problem which are PSPACE-hard (https://en.wikipedia.org/wiki/N-body_simulation#Computational_complexity), so even the inverse-square law, one of the simplest physics laws, allows for non-trivial computation.

  202. Chris Says:

    Scott #160

    >To get that they’re complex, you need an axiom saying that the number of parameters needed to characterize a bipartite state should be the product of the numbers of parameters needed to characterize the individual components.

    To me that seems fairly unobjectionable; his reasoning falls right in line with the Tegmarkian maximalist principle, in this case cashing out as saying that degrees of freedom factorize; that separate components need not be entangled. From the point of view of the quotient space, superposition is the default.

  203. Clinton Says:

    Scott #167:
    Ah, sorry, I didn’t explain my thoughts very well. Please allow me to try again.

    “Because the question still stands: why should the universe have been such that, when we used math to ask for the answer, the answer that came back was “quantum mechanics,” rather than (say) “some classical cellular automaton,” which also would’ve involved math that we knew?”

    There were two senses in which I meant “because we asked math (or the math we knew)”: First, a narrow sense and, second, a much, much broader sense.

    First the narrow sense:
    I’m thinking of the consensus around the end of the 19th century when it was widely held that all of the major theoretical foundations for physics had been worked out, that the universe was classical, and that what was left for science was just to fill in the details and the precision. Wasn’t it Planck’s advisor who said there was “nothing new to discover”?
    What bothers me is that if you had been writing this blog in say the 1880s then your Q would have been: Why should the universe have been Newtonian/classical? And the discussion would be about why God made the universe Newtonian/classical instead of some other way … because … well, obviously Newtonian physics had passed every test.
    Of course, we know what happened: experimental evidence led to the invention of the math we didn’t have – an extension of the laws of probability theory that allows negative numbers. The crucial moment was perhaps when Schrodinger was challenged to “go find the wave equation” and begat the amplitude model. Heisenberg developed the same idea in parallel.
    But … what if Schrodinger and Heisenberg … had NOT produced the amplitude model? What if Schrodinger had become distracted in that Swiss cottage and lost his way … meanwhile Heisenberg fell ill and never recovered … until still today we just had some ad hoc Bohr-type models, not yet explained by a full theory. Meanwhile, everyone still walking around saying, “Why did the universe have to be classical”?
    We did not have the math to say otherwise.
    So, if an alien had visited Earth in the 1880s and an Earthman had asked the alien, “Why is the universe classical?”, I can imagine the alien might have answered, “Because of your math.”
    Does your 2022 version of Q not risk making the same mistake? Maybe we think the universe is quantum mechanical because, well … that is the best-fitting tool we seem to have at the moment in our toolbox. But saying “the universe is quantum mechanical” seems to be saying more than we are entitled to say. It seems we should be carefully modest and say, “our best model of the universe is quantum mechanical.” The former statement can lead us to making Planck’s advisor’s error, or to presuming that it “had to be so”.
    What I’m saying here is that when we thought the universe was classical it was because that was the math-tool we were aware of. So, back then when we said, “The universe is classical” it wasn’t because the universe was classical. It was because of … the math we knew at the time. Now we understand that the classical math-tool is actually just part of a larger math-tool called quantum theory and that we can explain previously unexplained phenomena with that larger math-tool. It seems to be tempting the gods of comedy for us to … yet again … stand up on our primate hind legs and declare … “Now! Now we know what the universe is! Not before! But now!”
    I’m just afraid we will end up sounding like Planck’s advisor. Which makes me think maybe the better question is “Why do we think that the universe is quantum mechanical?” Why did Planck’s advisor think the universe was classical?

    And then the much, much broader sense.
    So, this is the “unreasonable effectiveness” problem. And I admit I don’t even know what to make of the problem. Basically, the problem is this: Math is just … too perfect. Isn’t it? Why do we trust math? This is a really hard question to even ask. I don’t even know really how to ask it.
    Philosophers have said, “Math describes but doesn’t explain.”
    And that is an issue. But I think I’m even doubting if we should trust it to “describe”.
    I think this is usually called the “non-Platonic” view of mathematics. There are probably different takes on it, and I’m not at all an expert, but I guess I would fall somewhere on the view that mathematics is something like a part of our cognitive model of computation – an intrinsic, inseparable, part of how our brains create what we experience as reality. Yes, yes, of course, there is an external reality. But we only ever experience what is generated internally. So, the internal model must put it together for us by following its own … model of computation.
    So, what if the universe … isn’t even mathematical … much less quantum mechanical? What if mathematics is just some kind of evolved … approximation … for something that … we can’t even imagine?

    Just seems like maybe these two possibilities above, the narrow and broad sense in which mathematics could be misleading, should be considered when someone asks a question like “why is the universe quantum mechanical”? Or at least I would want someone who was writing a book/essay on this question (even if briefly) to explain why I should trust the presumptions behind the question 🙂

    And I know that probably makes you want to slap your forehead: “What, this doofus wants me to justify the use of math to explain the universe?” Isn’t it obvious! Well, yes. Your question implies that the universe is mathematical. I can’t think of a “reason” why it has to be. And if it doesn’t have to be mathematical then … it doesn’t have to be quantum mechanical.

  204. Stewart Peterson Says:

    Prof. Aaronson,

    I hope I’m not putting words in your mouth here, but it appears that you have concluded that quantum mechanics is mathematically equivalent to an abstract, probabilistic, operator calculus. Assumptions can then be stated to reflect the numerical values of experiments. The usual presentation in textbooks is, of course, the opposite: physics reasoning with the needed mathematics in the lemmas. Are you aware of a treatment at, say, the advanced undergraduate to beginning graduate level, which describes QM from a mathematician’s perspective – the way you understand it yourself? (That is, a text which defines QM as an operator calculus first, and adds the physical data last?)

    Such a text might help us readers ask better questions, in that it would allow everyone to see whether the methods used to define such an operator calculus are truly modern, or if they would be accessible to someone from the 19th Century. If no such text exists, and you write it, I will be your first customer.

  205. Matt H Says:

    This is certainly above my pay grade, but… here’s my highest-level conceptual argument for why something like quantum mechanics might be plausibly preferable. (I’m not at all sure that I can articulate this clearly or convincingly.)

    In classical physics, it’s always seemed to me extravagant to postulate an enormous high-dimensional phase space, out of which our *actual* universe will only ever occupy a volume of essentially zero size. The “existence” of the rest of that vastly infinitely larger phase space has, as far as I can see, no physical meaning whatsoever. Quantum mechanics is actually more parsimonious, in the sense that all the degrees of freedom in phase space have actual physical meaning and causal power.

    In other words, an individual classical system is really always one-dimensional: a continuous map, following certain laws, from time to points in phase space. An individual quantum system, on the other hand, occupies *all* of phase space at *all* times (more densely at some points than at others).

    I realize this might only be even remotely appealing to an Everettian mindset, and (separately) that favoring classical physics, one would probably deny that they’re actually “postulating” the “existence” of all of classical phase space, that it’s just a mathematical convenience. Also, it doesn’t distinguish quantum mechanics, in particular, from other possible theories that satisfy this condition. I’m not sure whether it could.

  206. Ted Says:

    Scott #160: Three comments regarding arguments for complex amplitudes:

    1. I believe that Chris #94 is actually referring to the fact that if time is continuous and time-evolution operators \(U(t)\) are linear and compose in the natural way, then you want to be able to take \(n\)th roots over the field of scalars, so that for every time-evolution operator \(U(t)\) there exists a corresponding operator \(U(t/n)\). As you know, this is only possible for real-valued scalars if you either use ancilla qubits or introduce a new dimension to your Hilbert space (a process that you must then iterate). I believe that Chris #94’s only mistake was in attributing this argument to Hardy, when as far as I know, you came up with it yourself. (A tiny numeric illustration of the obstruction is sketched after point 3.)

    2. You say above that “Personally, I’ve never quite understood the motivation for that axiom, other than that we ‘peeked in the back of the book’.” But in your lecture notes (https://www.scottaaronson.com/democritus/lec9.html), you say that “intuitively, it seems like the number of parameters needed to describe AB … should equal the product of the number of parameters needed to describe A and the number of parameters needed to describe B.” These two statements seem somewhat contradictory.

    3. I’ve never understood the parameter-counting argument for complex amplitudes, because whether or not one finds the postulate that “real parameters should multiply in composite systems” to be intuitive, it doesn’t seem to actually hold in complex QM. That’s because an \(n\)-dimensional mixed state only has \(n^2 - 1\) real degrees of freedom after you normalize the density matrix (or equivalently, identify together density matrices that are related by a positive multiplicative constant). So in fact real degrees of freedom combine supermultiplicatively in complex QM: two subsystems with \(M\)- and \(N\)-dimensional Hilbert spaces combine together to form a composite system with \((MN)^2 - 1\) degrees of freedom, which is strictly greater than the \((M^2-1)(N^2-1)\) degrees of freedom that you’d get if they combined multiplicatively. In your notes linked above, you just say “we assume, for convenience, that the state doesn’t have to be normalized”. But isn’t this a bit of a cheat, since the whole proof revolves around carefully counting degrees of freedom? If we allow ourselves to introduce dummy degrees of freedom “for convenience”, then the same argument would seem to work for real QM as well.
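
    (Re point 1, the promised numeric illustration, with numpy assumed: over the complex numbers even a reflection has a unitary square root, while a simple determinant argument shows that no real orthogonal square root exists, so continuous real time evolution can’t produce it without adding dimensions.)

      import numpy as np

      R = np.diag([1.0, -1.0]).astype(complex)        # a perfectly good real "time evolution": a reflection
      V = np.diag([1.0, 1j])                          # a complex "half reflection"
      print(np.allclose(V @ V, R))                    # True: V is a valid U(t/2)
      print(np.allclose(V @ V.conj().T, np.eye(2)))   # True: V is unitary
      # Over the reals this is impossible: any real orthogonal O has det(O) = +1 or -1,
      # so det(O @ O) = +1, while det(R) = -1. Hence R has no real orthogonal square
      # root, and continuous real dynamics can't reach it without extra dimensions.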

  207. mjgeddes Says:

    Continuing #187 to account for observers, my ‘theory of actualization’ leads to natural conjectures explaining intelligence, values and consciousness.

    Now, I *don’t* think that mind is fundamental as such. As I said, I think the ‘ground state’ of reality is simply a space of ‘possible worlds’, and as complexity is built-up, these worlds get increasingly ‘actualized’ entirely via computational processes, so the ball gets rolling without any observers or consciousness, which are emergent systems of computation.

    However, I think that *after* a certain complexity threshold is cleared, the continuing actualization of reality *does* need minds, and from this point forward consciousness *contributes* to the on-going actualization of reality! Not through any sort of mystical or non-physical process, but by structuring information (i.e., turning information into knowledge), thus helping to make reality increasingly comprehensible.

    So what are minds? Well, remember I talked about the ‘actualization’ process itself, which I said takes place on the level of computational geometry, function spaces and topology. And minds exist at this level. They’re simply the higher-level processes of ‘actualization’. Minds make mappings of (representations of) reality by modelling systems of causal relations that are complex and compositional. And these models are *themselves* new systems of causal relations that are complex & compositional! This is an open-ended recursive process.

    The meaning of life (for conscious observers) is thus simply the high-level version of the same process of ‘actualization’ that I think accounts for physics. It’s ‘Self-Actualization’ ! Of course, now we have to try to achieve a reasonable understanding of the meaning of the term ‘Self-Actualization’ 😉

    Here’s my explanation of consciousness and values:

    Consciousness is the highest level of recursive actualization. It’s a model of the perceived flow of time (temporal awareness). It works by integrating high-level values and low-level facts, to generate mid-level action plans. The representation of values, plans and facts is in the form of the temporal modalities ‘should, ‘would’ & ‘could’ respectively. And the generated ‘action plans’ which are “good” are simply the ones that structure information as knowledge such that it contributes to the on-going actualization of reality (i.e., generation of manageable complexity).

    To understand values, consider the motivations of God in the context of ‘actualization’. I believe that these motivations can be considered to consist of two complementary tendencies, (1) rationality & (2) creativity, because this is the combination that generates *manageable complexity* ( ‘actualized reality’).

    Rationality is about the *compression of information* (manifesting as intelligence) , whereas creativity is about the *exploration of possibilities* (manifesting as values). The balance between them generates manageable complexity.

    In conclusion, the “actualization” of reality is ultimately about the generation of *manageable complexity*, which is complexity that has enough structured details to be interesting, but can still be compressed enough to make it comprehensible. At a high-level, this is the balance between rationality and creativity in conscious minds. And that’s the meaning of life.

  208. Tim Maudlin Says:

    Scott, re #195 and #200,

    Taking computer languages—which are designed for Turing machines, which to begin with are abstractly discrete objects—as the standard of simplicity is “rigging the game” to at least the same extent. As is talking about what you can explain to a human child, which is an *extremely* complex item! The Game of Life requires *counting* occupied adjacent squares, which is an extremely abstract and complex thing to do. Counting and then consulting a rule that depends on the outcome of the count. I see no sense at all in which that is simpler than a simple differential equation.

    Regarding Bohmian Mechanics, we will disagree about some evaluative matters, but we should not disagree about plain facts. And it’s not at all a fact that any theory that has a quantum state in its ontology can “talk unproblematically about” the precise particle locations that constitute a Bohmian configuration. Obviously not. Everett certainly cannot. Does the particle location, as empirically accessible in Bohmian Mechanics, “give us any information we didn’t already know”? Sure. For example, in a simple 2-slit situation, the location where the particle hits the screen (which we can find out) implies which slit it went through, which we didn’t know. I’m a bit at a loss as to why you are making the claims you are.

  209. Tu Says:

    Scott #200:

    I just want to chime in to support your encouragement of more attention and development of Bohmian Mechanics. Sheldon Goldstein has been doing what I think is really interesting work but I don’t think he hangs out in this corner of the internet so I am hoping to convert some others.

    With respect to your question at hand, Scott, I do think that taking the Bohmian ontology seriously, and perhaps biting the Bohmian bullet all the way to the end, may be fruitful or instructive. After all, the question that John Bell was most interested in before his death, and the question he wished others would pursue, was this: why isn’t Bohmian mechanics Lorentz-invariant?

    People who favor a many-worlds interpretation or ontology tend to pounce on this as a reason to throw Bohmian mechanics in the trash, but I personally think it is a great mystery that the interpretation with the most clearly-defined ontology, the one that in my mind best resolves the “Anschaulichkeit” issue of QM, is in some sense saying that there is a preferred reference frame. What would happen if we took that seriously?

    Since I am anticipating some pushback for what I just wrote, here is Bell on the two-slit experiment:

    “Is it not clear from the smallness of the scintillation on the screen that we have to do with a particle? And is it not clear, from the diffraction and interference patterns, that the motion of the particle is directed by a wave?”

    So Tim, I agree with Scott. Keep going! Together, in the name of John Bell, we can defeat the many-worlds people fair and square.

  210. Andrei Says:

    Scott,

    “To date, no one has comprehensibly explained to me how you get our quantum-mechanical, Bell-inequality-violating observed reality from a classical CA, in a way that isn’t so convoluted, contrived, ugly, and insight-free that you wouldn’t be vastly better off just to throw out the CA part, and talk directly about quantum mechanics. Certainly not Wolfram, not ‘t Hooft, not Sabine Hossenfelder, not Andrei on this blog … as far as I can tell, all the words they’ve produced have taken us 0% of the way toward answering the question.”

    I actually explained this multiple times, but I guess I was not clear enough, so I’ll do it again. I’ll put some numbers before each statement so that you can agree/disagree with any of them without losing much time.

    P1. In field theories, like classical EM, GR, fluid mechanics, the system must satisfy the theory’s equations at any time. A system described by classical EM must satisfy Maxwell’s equations. – agree or disagree?

    P2. This is just a reformulation of P1, but I feel the need to insist on it. Any state that does not satisfy the theory’s equations is physically impossible. There is no need to select/fine-tune the initial conditions to guarantee that such a state never occurs. – agree or disagree?

    P3. A Bell test is a particular example of an EM system, consisting of three large groups of charges (electrons and nuclei). One group is the particle source; the other groups are the two detectors, including whatever is used to “choose” their orientation. – agree or disagree?

    P4. From P1, P2 and P3 it follows that only some states of the source/detectors are possible (those that satisfy Maxwell’s equations). There is an infinity of such allowed states (corresponding to any choice of the initial conditions), but the physically impossible states (that don’t satisfy Maxwell’s equations) are even more numerous (for each valid state you can create an invalid one by simply changing the electric field at one point and leaving everything else the same) – agree or disagree?

    P5. It is possible that when only the physically possible states are counted we will recover QM’s prediction. In other words, it is possible that those states that would lead to a different prediction than QM are impossible (don’t satisfy Maxwell’s equations) – agree or disagree?

    P6. I claim that P5 represents a reasonable explanation of how Bell’s inequality can be violated without non-locality and without any conspiracy (unless you want to claim that the requirement that states satisfy the theory’s equations is itself a conspiracy). – agree or disagree?

  211. Andrei Says:

    Philippe Grangier,

    “Disagreement with Andrei #135 and #137 : there is (at least) a third way besides superdeterminism and nonlocality (Bohm), that is predictive incompleteness, see https://www.mdpi.com/1099-4300/23/12/1660 . It does not require to ‘abandon standard QM’, but rather to look at it in a slightly different way.”

    I do not think “predictive incompleteness” can avoid the argument. Please address my argument below:

    We have an EPR-Bohm setup: two spin-entangled particles are sent to 2 distant stations, A and B. The spin measurements are simultaneous (space-like separated) and performed on the same axis (say Z). Let’s assume that the result at A is +1/2. We have:

    P1. The measurement at A does not change B (locality).
    P2. After the A measurement, the state of B is -1/2 (QM prediction).

    From P1 and P2 it follows:

    P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.

    Symmetrically, the spin of particle A on Z was predetermined as well.

    The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined. In other words:

    C1: locality can only be saved by introducing deterministic hidden variables (the spins before measurements, or some other property that uniquely determines the results).

    Then we have Bell’s theorem which says:

    C2: Local hidden variable theories are impossible with the exception of superdeterminism.

    From C1 and C2 it follows that superdeterminism is the only possible way to keep physics local.

  212. Dee Says:

    A slightly different approach:

    Waves, quantization, higher dimensionality, and the special nature of spacetime.

    QM is the base case, but we need to consider interaction with spacetime, and a possible “residue” component (related to wave interactions or higher order dimensionality?)

    Waves – Are inherently complex.

    Measurement/2-slit experiment – single wave interacting with the 2 slits, get constructive and destructive interference; with a slit being observed, the measurement device acts like another set of filters or possibly inserts a wave pattern that interferes in a way that produces the 2 lines pattern

    Spacetime – how waves of matter, radiation, etc manifest. E.g. gravity waves
    At small scales quantization – frothiness, uncertainty (principle)
    Dark E – related? Could also be a side effect of higher dimensionality?

    Why all the different fundamental particles? A few options:
    1. They’re all the same as observed in their native dimensionality but look and behave differently as observed in 3+1D
    2. There are other universes (bubbles in the ocean) that are only electrons say, but we don’t live there for obvious reasons
    3. Something about how they’re manifesting as waves/wave interactions makes them appear different to us observers

    Probabilistic, Born rule & tunneling – interactions between the waves and spacetime

    Time reversal symmetry vs time – entropy/”residue” (creating aging/irreversible processes)
    Coherence/decoherence – on larger scales/times more interference between waves, more decoherence. Entangling = aligning waves in some special way?
    Dark matter – related to residue? As scale grows, impact of residue grows? On smallest scales/times, it is (more) reversible
    “Classical” is then just a consequence of the scales on which we live/observe.

    Mass E; a manifestation of the waves? Quantized waves?
    Mass can’t travel at c due to interaction between spacetime wave and this wave? Residue?
    C – speed of massless “particles” e.g. light constant – fundamental b/c no interaction travel of wave in spacetime, or it IS the single wave of spacetime propagating
    Bell inequality – info still has to travel from A to B in spacetime, traveling as a wave…

    Haven’t got to Pauli exclusion or Superposition yet, but you get the idea…

    Sorry for the stream of consciousness/incompleteness, need to get to bed soon. Happy to try to elaborate if there’s interest.

  213. DMC Says:

    If I were to speculate wildly about concepts which might eventually provide insight on this question, then my best guess would be reversible computing, and my craziest guess would be interaction nets. To keep this relatively concise: A quote, then some links, and then some comments.

    “A motivation for the study of technologies aimed at implementing reversible computing is that they offer what is predicted to be the only potential way to improve the computational energy efficiency of computers beyond the fundamental von Neumann–Landauer limit of kT ln(2) energy dissipated per irreversible bit operation.”

    https://en.wikipedia.org/wiki/Reversible_computing
    https://en.wikipedia.org/wiki/Landauer%27s_principle
    https://en.wikipedia.org/wiki/Interaction_nets

    I have no relevant expertise here; I’m only commenting because this stuff is obscure, interesting, and (from my uninformed perspective) possibly relevant. The interaction net thing in particular is basically just a vague association in my mind.
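
    For scale, here is a back-of-envelope evaluation of the quoted von Neumann–Landauer bound (assuming room temperature, T = 300 K):

      # Landauer bound kT ln(2) at an assumed T = 300 K.
      import math

      k_B = 1.380649e-23                # Boltzmann constant, J/K
      T = 300.0                         # assumed temperature, K
      E_min = k_B * T * math.log(2)
      print(f"{E_min:.3e} J per irreversible bit operation")   # ~2.87e-21 J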

  214. Anbar Says:

    Philippe #133

    Hope you don’t take it the wrong way, but I guess my point in being synthetic was that I don’t feel I need theorems to trust the QM framework. It looks manifest to me that it is both fit for purpose and as “efficient” in fundamental computations as it could be.

    The “mutually exclusive” relationship is all that is needed to set up a working (projective) propositional logic that generically forbids connecting propositions with AND, but realizes OR through linear dependency.

    The Born rule is linear on all normalized arguments: the simplest way of producing a number that can always be interpreted as a probability, as empirically required by the need to make sense of the observed complementarity (and of the observed predictability of the Universe, despite the indeterminism)

    QED
    🙂

  215. Andrei Says:

    Philippe Grangier,

    I think I can spot the problem with your paper. In Chapter 5, below Figure 1 you say:

    “The resulting predictions can be effectively checked in the verification zone V in the common future of all light cones.”

    I think this is irrelevant. True, the experimental records can only be compared at V, but one can look at the time at which the measurements were performed (the time printed on those records) and conclude that the prediction of A about B was true immediately after the A measurement. It does not become true at V.

  216. Vladimir Says:

    Scott #99:

    Your conjecture seems highly unmotivated to me. What’s the most “chemistry”-like thing you’ve seen a CA do? Putting that aside, I think you’ve left a crucial part out of your conjecture, namely the typicality of initial configurations leading to “chemistry”. Certainly in Conway’s Game of Life most random initial configurations don’t give rise to the cool stuff you can get by fine-tuning.

    Scott #174:

    Are you sure Albert E. is the guy you want to be taking cues from where QM is concerned? 😛

  217. Andrei Says:

    Scott,

    I looked over your paper, The Ghost in the Quantum Turing Machine, and I think I can point out an error in your reasoning in regard to Bell’s theorem. On page 22 you quote Conway and Kochen:

    “if there’s no faster-than-light communication, and Alice and Bob have the “free will” to choose how to measure their respective particles, then the particles must have their own “free will” to choose how to respond to the measurements.”

    The error in this type of argument is that the EPR argument rules out the possibility of local indeterminism (see Comment #211). So, by rejecting determinism, you cannot recover locality. So the alternatives are:

    1. local determinism (in the only surviving form, superdeterminism) and
    2. non-locality, which can be either non-deterministic, or deterministic.

    Non-determinism does not give you anything that determinism cannot.

  218. JimV Says:

    Having skimmed through all the comments so far, I think there is a lot of informed agreement that some randomness is necessary and a lot of discreteness is necessary. Other than that, I for one do not guess that there is only one way to get a universe with complex life forms. It seems to me that the way to bet is that our universe is a mediocre one (for the formation of complex lifeforms) among all physically possible universes. I am not saying the metaverse exists, since that raises additional questions, but that it seems quite possible to me that there are many other sets of physical laws and constants which could produce complex life.

    Life: physical entities which can reproduce themselves with random errors and thereby evolve means of memory and decision-making computations.

  219. Duh Says:

    Level-design that uses RNG can open the design-space much more widely than classical level-design. Without RNG, either all the levels look the same, or you have to hand-craft them all to be different in interesting ways. QM is just a way to make sure the universe looks interestingly different across its total configuration. Without it, there would be no level in which we evolved, so we wouldn’t be asking the question.

  220. Scott Says:

    Nicholas Teague #129:

      Cellular automata / hyper graph updating rules aside, the multi-way causal graphs of the wolfram physics model are the most intuitive channel I’ve found for thinking about the quantum mechanical nature of reality. Formalizes the multiverse.

    It’s “intuitive” only because it doesn’t have unitary evolution or (therefore) well-defined probabilities!!

  221. Dániel Varga Says:

    Scott #163

    Sorry, Scott, I think you pattern-match what entirelyuseless is saying to “deterministic hidden variable theory”, and dismiss it too quickly. It’s only my interpretation based on what entirelyuseless wrote, but I don’t think it is that. Do not forget that an agent in the Game of Life would never be able to say things like “danger, a glider is approaching”, or even “ouch, I was hit by a glider”, because the world is fundamentally inaccessible to them at that level, even if they build the Game of Life analogs of particle accelerators. Maybe it helps if you imagine that there is a huge chasm between the lower level cellular automaton rules, and the higher level QM rules, where the lower level rules are inaccessible to inhabitants of the universe, for fundamental reasons. Now you might, in turn, pattern match that to “deterministic hidden variable theory immediately slashed by Occam’s razor”, but it’s not exactly that, either. Let me explain my best case scenario:

    I design a deterministic cellular automaton, and agents in it. The agents cannot observe their environment without altering it, which gives rise to some version of the Uncertainty Principle, which in turn forces them to accept a quantum theory of physics. I am NOT saying that the agents are wrong, and their world is “really” a hidden variable world. From their perspective, they know everything about their world that’s ever knowable, and it is quantum. But I, who designed their world, have a different, very valid perspective. It is a semantic question who is correct, the creator or the agents. What’s more important is the existence proof that QM can arise like that, the fact that with the extra knowledge the creator has, Occam’s razor does not slash the deterministic theory.

    Entirelyuseless, please feel free to clarify, or at least create an ad hoc comment thread with people like me who are open to your idea. 🙂 (Check out my comment #189 in case you missed it.)

  222. Joe Shipman Says:

    Q1 is tough, because a metaphysically-classical universe could have life and intelligence in it, apparently. The only attempt to reconcile this that seems to have any promise is what Wolfram is doing. If Wolfram’s project can’t succeed, then the only other explanation I have is theological—that you need true indeterminism in order to have both lawfulness and free will, as well as “room for God to act”.

  223. Scott Says:

    Bruce #132:

      I think the question has a major unstated premise: that ours is the *only* universe.

    Let me clarify that that’s in no way, shape, or form a premise! I could be 100% satisfied with an answer along the lines of, “really there’s a Tegmarkian multiverse, with some classical universes, some quantum universes, and some universes with other rules entirely, and a-priori we could’ve been in any of them, but the deck was stacked in favor of our finding ourselves in a quantum universe for the following reasons…”

    If, of course, actually-convincing arguments or evidence were brought that that was true.

  224. Crackpot Says:

    Disregard the previous two comments, probably, as I’ve realized they probably don’t actually explain what I’m getting at.

    Starting from here: Take Pilot Waves, and subtract out the particles; you’re left with MWI, but I think that way of thinking about things is misleading. Rather, I think the correct way to interpret the wave you’re left with is a mass-energy distribution; the particle isn’t probabilistically in different locations, it’s always a waveform.

    Strictly speaking, you don’t have to be left with a mass-energy distribution; anytime there aren’t particles, and instead there are waveforms, you should expect to get quantum-mechanical behavior. Curvature, for these purposes, I am counting as a waveform, which I’m using to describe any space-extensive structure with a variable quantity which can be interpreted as amplitude. I think the ideal case for a waveform for these purposes has infinite spatial extent, which is not quite the same as occupying all space.

    So if General Relativity is correct, and additionally understood to mean that mass-energy is in fact distributed over spacetime (and I don’t know how you can look at GR and think “There are particles there”), you should expect to get quantum-mechanical behavior.

    You should expect to see quantization in any physical system with a finite number of stable configurations in which state transitions are the only measurable thing; even if energy isn’t actually quantized, a finite number of stable configurations mean you’ll only see energy in specific quantities. (Particularly when the system is significantly smaller in scale than the observers, and things happen relatively fast compared to observational techniques).

    And classical really-real particles, if you think about it, are actually an enormous ask in terms of additional, and entirely unnecessary, complexity; the behavior we actually care about is all embedded in the fields, particles don’t actually add anything.

    The one thing is that I don’t think it is necessarily strictly the case that measurement events should behave the way they do, and I think that’s much more specific, requiring a reason for waveforms to be “coherent” – that is, there has to be a reason that a waveform/particle which is sitting between two attractive forces shouldn’t just split in half down the center, and if MWI is correct and the waveform can split and still be coherent in the sense of maintaining (at least from its own perspective) radial symmetry, that’s an additional piece of the puzzle that needs to be explained. Granted, a universe in which waveforms aren’t coherent wouldn’t support complex structures or chemistry, so we can lean on an anthropic explanation.

  225. Scott Says:

    Nisan S #133:

      The Game of Life is not a safe place for life to evolve. Everything is being constantly bombarded by gliders and stuff. We’re fortunate to live in a universe where gravity collects matter into tidy planets, whose surfaces are only lightly bombarded.

    I think your point about the Game of Life is an excellent one. In contrast to many other things that have been proposed, that’s a real reason why we should never have expected to find ourselves in the Lifeworld. But I also think there are even more directly relevant factors than gravity to explain why our universe is a safer place for life than Life:

    (1) Our universe supports not only Turing-universal computation, but fault-tolerant computation. (Note, however, that there are 2D cellular automata that support that as well, by celebrated work of Peter Gacs.)

    (2) Our universe has these concepts called “conservation of momentum” and “conservation of energy,” which often (albeit not always) tamp down on runaway butterfly effects. For example, if you’re hit by a cosmic ray, whose energy is minuscule compared to the energy in your body, there will for that reason almost always be no noticeable effect, unless (for example) the cosmic ray happens to mutate a DNA strand in such a way as to give you an aggressive cancer. The Game of Life seems to have no analogous conservation principles that would tamp down on the destructive effects of gliders.

  226. Scott Says:

    TP #135:

      For interesting chemistry we need the building blocks to be charged. However, charged particles lead to various problems in classical electromagnetism, e.g., the infinite self-energy of the electron or the radiation from an accelerated charge. Quantum mechanics solves these problems. I suspect that any attempt to solve these problems will have to reproduce quantum mechanics in its entirety. As you have pointed out it seems impossible to change quantum mechanics just a little and it seems to be an island in theoryspace. Maybe any self-consistent theory that tackles these problems of classical electromagnetism has to go all the way to quantum mechanics for this reason.

    I ought to have been more explicit that what I was really doing, with my Q1 and Q2, was calling for an entire research program, only bits and pieces of which already exist (mostly focused around Q2).

    Your comment, with its inferential leaps from one sentence to the next, provides perfect illustrations of the sort of thing I’m asking for. Is it actually true that to get interesting chemistry, we need some notion of “charged particles” with a Coulomb force? If so, is it actually true that once we’ve introduced such particles, we’ll create problems that can only be solved with QM? Or are we just suffering from a lack of imagination, and (even more to the point) a lack of well-developed alternatives to gawk at?

  227. Peter Morgan Says:

    Sid #197 gives a link that Firefox assures me is unsafe. A search reveals its published version apparently to be https://doi.org/10.1119/1.10490

    One more derivation of the Lorentz transformation
    Jean‐Marc Lévy‐Leblond, American Journal of Physics 44, 271 (1976)
    ABSTRACT: After a criticism of the emphasis put on the invariance of the speed of light in standard derivations of the Lorentz transformation, another approach to special relativity is proposed. It consists of an elementary version of general group‐theoretical arguments on the structure of space–time, and makes use only of simple mathematical techniques. The principle of relativity is first stated in general terms, leading to the idea of equivalent frames of reference connected through ‘inertial’ transformations obeying a group law. The theory of relativity then is constructed by constraining the transformations through four successive hypotheses: homogeneity of space–time, isotropy of space–time, group structure, causality condition. Only the Lorentz transformations and their degenerate Galilean limit obey these constraints. The role and significance of each one of the hypotheses is stressed by exhibiting and discussing counterexamples, that is, transformations obeying all but one of these hypotheses.

    The assumptions that are introduced look extremely strong to me, but YMMV.

  228. tez Says:

    How I have summarized some options, trying hard not to bias the student:

    > Here are a couple of examples of categories of results which, if they existed uncontroversially (they don’t!), would make me personally feel we have made progress on the questions:

    > Example 1: Consider a derivation I’m fond of https://arxiv.org/abs/1706.05261 which basically says “add BLAH to classical propositional logic and you *have* to end up with something isomorphic to classical probability theory as your system of plausible reasoning”. Perhaps there is a different BLAH’ which instead leads you inevitably to quantum probability theory as your system of plausible reasoning. There are many versions of proposals like this – quantum logic, operational reconstructions, QBism and so on. Seems unlikely such things will ever give me a good intuition about why hbar is the size it is, why it’s so damn practically useful to think about those little waves as really diffracting around, why those spectral lines “exist” in light emitted long before creatures capable of worrying about probabilities at all existed and so on – but maybe I’m just being too much the physicist to want such.

    > Example 2: Relativity taught us that different observers can seem to have incompatible narratives about what is going on, but once you identify the right “true” underlying description of a single reality they are all correct, so to speak. With quantum theory it seems hard to believe in such “singular agreeable realism” – witness the issues of contextuality (never mind the plethora of interpretations!) – but perhaps there is a different unification which shows that the only possible way to unify different observers each with their own personal “correct version of reality” is with something like QM. I think various topos approaches, and perhaps consistent histories, fall into this kind of category.

    > Example 3: Most of the issues I have with quantum theory are to do with the manifest disrespect entangled particles show for space and time. But perhaps space and time are simply anthropomorphic variables – useful to evolve a brain that prioritizes them when chasing mates and bananas was our mission, but not actually fundamental or useful degrees of freedom to describe whatever it is that is “actually going on” at non-monkey scales. There are many imprecise and frankly borderline crackpot attempts at seeing space and time as “emergent” out there, but that shouldn’t dissuade us from considering the general principle might be valid.

  229. Mitchell Porter Says:

    My oracle suggests that the answer to “why the quantum”, is that a maximal equiprobable Hartle sub-multiverse is one way to resolve the necessary ontological tradeoff between plenitude, internal intelligibility, and logical locality (pick any two). Unfortunately it also says we’re not ready to know the technical meanings of those terms. 🙂

  230. Scott P. Says:

    As stated, Q1 is easy. 😛 Where would the fun be for God in creating a classical, i.e. deterministic, Universe?

    The Schrödinger Equation is as deterministic as any of Newton’s Laws.

  231. fred Says:

    To me it’s not even clear that this can be done for any “theory”: since we will never be able to explain why there’s something instead of nothing, it’s hard to imagine how we can show that some aspect of reality is “necessary” in some absolute sense. At best we can only do it in relation to other features of reality.

    So, if there’s a way to show that QM is necessary it’s probably by looking at even more fundamental ideas that are hard to dismiss, like entropy, conservation of information, fundamental symmetries, the fact that spacetime has 3+1 dimensions, etc.

  232. Scott Says:

    Andrei #136:

      What we are facing here is much more than a “psychologically uncomfortable” feeling, or a “peculiarity”, we are facing a logical/mathematical inconsistency between QM and the space-time structure of special relativity. Please look carefully at the last paragraph:
      “we cannot use spin-correlation measurements to transmit any useful information…”
      SR does not make any distinction between “useful” faster-than light messages and useless faster-than light messages…

    This is a subject I know well, and recent conversations with you, Sabine, and others reinforced for me that it’s the precise point where I get off the train headed to Superdeterministic Crazytown. That town was built to solve a problem that everything I know about quantum information tells me is not a problem at all.

    By “useful” information, Sakurai clearly means any information whatsoever that one can choose to specify at point A. It’s a theorem in QM — I prove it every year in my undergrad class — that none of that can be transmitted. He means to exclude correlations, which occur even in classical probabilistic theories and are considered totally unproblematic there, but which can be stronger in QM in a way that helps A and B win certain nonlocal games, which is the whole content of the Bell inequality.

    The above is really all that special relativity requires; SR doesn’t care about Bell-type correlations. We could see that abstractly, even if we didn’t have successful relativistic QFTs that show it explicitly. That QM and special relativity are difficult but not impossible to reconcile, and that the rare theories that do successfully reconcile them are so phenomenally good at describing the world, is a powerful indication that physics is on the right track here.
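
    As a concrete illustration of that no-communication theorem, here is a minimal numpy sketch (the entangled state and Alice’s measurement angle are arbitrary choices): whatever basis Alice measures in, Bob’s reduced density matrix, and hence anything he can observe locally, is unchanged.

      # Illustration (not a proof) of no-signalling: Alice's measurement,
      # in any basis, leaves Bob's reduced density matrix exactly as it was.
      import numpy as np

      def reduced_B(rho):
          # Partial trace over qubit A: (rho_B)[b, b'] = sum_a rho[ab, ab']
          return np.einsum('abac->bc', rho.reshape(2, 2, 2, 2))

      psi = np.array([0.6, 0.0, 0.0, 0.8])           # 0.6|00> + 0.8|11>, an arbitrary entangled state
      rho = np.outer(psi, psi.conj())

      theta = 0.7                                    # Alice's measurement basis angle (arbitrary)
      u = np.array([np.cos(theta), np.sin(theta)])
      u_perp = np.array([-np.sin(theta), np.cos(theta)])

      rho_after = np.zeros((4, 4), dtype=complex)
      for v in (u, u_perp):
          P = np.kron(np.outer(v, v), np.eye(2))     # project Alice onto outcome v
          rho_after += P @ rho @ P                   # unnormalized post-measurement branch

      print(np.allclose(reduced_B(rho), reduced_B(rho_after)))   # True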

  233. Scott Says:

    Cain #137:

      Quantum mechanics necessitates God to determine when the state of the universe will change according to the Born rule rather than the Schrodinger equation, and no matter how many dinosaur fossils are dug up Richard Dawkins will never be able to explain this from more fundamental principles. QM is God’s signal that all the evidence against his existence from lesser sciences can now be discarded as a test of our faithfulness.

    What’s strange is that, having reserved to Herself this immense power to collapse the wavefunction, God would then choose to deploy it only and ever when quantum systems become decoherent and “macroscopic,” and would (as far as we can tell) scrupulously follow the Born rule every time, as though p(x)=|ψ(x)|² were a higher law even than God!

  234. Anbar Says:

    Scott P. #230

    The Schroedinger equation is deterministic, but the Universe is not, which is the entire point if you want to be entertained while watching 🙂

  235. fred Says:

    Scott #232

    “That QM and special relativity are difficult but not impossible to reconcile”

    Why do you assume that two theories that work separately can always be reconciled?
    Isn’t it possible that the transition from one mode to the other involves processes that just can’t be described by some compact mathematical relation?

  236. Philippe Grangier Says:

    To Anbar#214

    A big issue in all these discussions is to make clear what is your starting point, and what you want to get at the end. In some of the posts it is clear that the starting point is some mathematical foundational principle, ‘Let Be Psi‘, from which everything in the (many)world should be derived.
    My starting point is just the opposite: I start from empirical evidence, then distill out some physical principles, then induce (actually, guess and justify physically) a suitable mathematical structure (with projectors!) and hypotheses likely to embed these physical principles. This construction turns out to fit quite well with the mathematical hypotheses of Uhlhorn’s and Gleason’s theorems, so from there, I can go deductively by using these theorems, and get respectively unitary transforms and Born’s rule, which are the basic framework of QM.
    This is certainly not a full answer to Scott’s ‘Why QM’, but maybe a little hint to this why. This is also in line with 3. in his introduction, but without ‘dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true.’ Why not dismiss them? Because they come from empirical evidence, not from an already abstract formalism.
    Now, if it is clear to you that a working (projective) propositional logic and the need to make sense of the observed complementarity are enough to get unitary transforms and Born’s rule without further ado, this is nice, but I’m not that smart.

  237. Daniel Harlow Says:

    By the way in terms of “look how old we are now” arguments, to me one of the most depressing is that Mozart died at 35.

  238. drm Says:

    A couple of dumb questions from a biologist:
    1) does QM require infinite precision for unitarity, etc., or do all of those irrational factors of the square root of 2 and pi take care of themselves?
    2) Is Bell’s non-local condition equivalent to the older (I gather) notion of contextuality?

  239. Scott Says:

    bertgoz #142:

      I feel ultimately the description of the universe using mathematics and hence leading to the classical framework first and then to the quantum one, tells more about how the human mind works and its limitations than anything else

    Answers of this kind seem endlessly popular, no doubt because of their air of transcendent wisdom, but they’ve never made the slightest bit of sense to me. We can use our understanding of physics, rooted in math, to build spacecraft and microchips and nuclear reactors, and all those things actually work—not just in the human mind, but in external reality. Which competing framework, not based on math, tells us more about the workings of reality, or better lets us escape the prison of our minds?

  240. Not That Simple Says:

    An addendum to the above that I should have added, is that coming up with classical explanations for things like atoms or chemical bonds is a major modern-day crackpot pastime – you can find lots and lots of crazy websites out there that contain attempts at it. Presumably, if it were easy to do, one of them would have done it.

    Just to think about how that’d have to work – do the atoms have little jigsaw puzzle bits on them that determine what they connect to? Do they have spring-catches to stop them from bouncing apart? Who works at the lathe that turns out these tiny machines?

    Is it all vortices in a frictionless fluid, like Lord Kelvin suggested? That one is a lot less patently absurd, but it didn’t go anywhere either. I really think you are underestimating the difficulty of coming up with a version of quantization that doesn’t involve quantum mechanics. After all, quantum mechanics was invented as a way to describe physical phenomena *first*, and the stuff about complex amplitudes and entanglement was noticed as an uncomfortable yet inevitable implication, *afterwards.*

  241. Scott P. Says:

      The Schroedinger equation is deterministic, but the Universe is not, which is the entire point if you want to be entertained while watching

    It remains to be proven whether the universe is deterministic or not. Personally, I see no problem with the universe being simultaneously deterministic and stochastic, but which (if either) it is remains unclear. Certainly QM itself makes no statement about the presence or absence of determinism in the universe.

  242. Not That Simple Says:

    @Matt H

    Phase space in classical mechanics is used mainly in statistical mechanics, to let the exact present (which is a single point) be diffused into a cloud that represents what we think the present might be, or other times, all of the counterfactuals that are being considered for theoretical reasons. There has to be a space to contain that distribution, and it needs to have space-like devices attached to it, like measurement and the inner product, or else we couldn’t measure the blob and prove theorems about it.

    There are various ways to end up with a diffuse present, one of them being experimental uncertainty (representing Bayesian lack of knowledge), but other times one conjures things like “all of the states with energy E,” when you’re trying to show they have something in common.

    Phase space is not necessary to do calculations about individual initial conditions, and in a Newtonian universe phase space would have the same status (as a collection of counterfactuals) as “the space of all alternative laws of physics,” because a different initial condition would be just as counterfactual as different laws guiding evolution. It is not really a part of classical mechanics in the way Hilbert space is a part of quantum mechanics.

  243. Not That Simple Says:

    Scott,

    After reading some of your comment replies I understand that my objection wasn’t made with a full understanding of your position.

    But… now I have a different objection.

    If we invented a classical ruleset that allowed life, where chemical transformations of bulk substances were “ground-level” facts (this is kind of how people saw it in the earliest history of chemistry before atomic theory took hold), and if it was very simple and elegant, and highly favorable by the backwards anthropic principle*, and it seemed like a travesty that it wasn’t true…

    … we wouldn’t know if the real laws weren’t even simpler and more elegant, because nobody knows what the true laws are yet. I think we have to come back to this once physics is done and we have a basis for comparison.

    This applies to any grading system for “preferability” we could think of, even if you are like Sabine and think that elegance is a dumb grading system for a universe that contains ringworm. No matter how you’re ranking imaginary rules, you’ll never know if the real rules beat what you’ve come up with, because the real rules are as of yet unknown.

    * I don’t know what the name for this is, but it’s the idea that laws of physics that are more likely to lead to life are more favorable than those that are less likely, as a continuum version of the fact that we can rule out any that never lead to life.

  244. Jim Graber Says:

    W. M. Stuckey, Timothy McDevitt, and Michael Silberstein, “The Relativity Principle at the Foundation of Quantum Mechanics,” arXiv:2107.06942 [quant-ph], doi:10.3390/e24010012

    Hi Scott,
    I think the above article attempts to answer your question about the basis for QM by applying the no-preferred-reference-frame principle to h-bar. I’m curious about your evaluation, of both the fundamental approach and the net results.
    Thanks.

  245. wolfgang Says:

    Scott,

    I would like to mangle your two questions into one:
    Why did God use quantum theory to make the universe, but have it appear classical to us?
    And how exactly did he do that?

  246. Viktor Dukhovni Says:

    What’s always puzzled me about QM is the fact that in an apparently non-deterministic future we still somehow get *exact* conservation laws. The only way that happens is via entanglement, which constrains the sums of various random variables to be constant, which then gets us an exponentially large state space.

    So to me the mystery is exact conservation in the face of Bell non-determinism. If we want the future to not be locally determined in the distant past, and want conservation, it seems we’ll be getting something like QM albeit not-necessarily with complex amplitudes on that basis alone…

    What’s your take on the puzzle of how conservation and randomness end up consistent?
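
    One toy illustration of that tension (a sketch assuming the spin singlet, with hbar = 1): each local spin is maximally random, yet their sum is exactly conserved.

      # In the singlet, each local Sz is maximally uncertain, but the total Sz
      # has expectation 0 and variance 0: exact conservation despite randomness.
      import numpy as np

      sz = np.diag([0.5, -0.5])                       # single-spin Sz (hbar = 1)
      I2 = np.eye(2)
      up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
      singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

      Sz_A = np.kron(sz, I2)
      Sz_B = np.kron(I2, sz)
      Sz_tot = Sz_A + Sz_B

      expval = lambda op: singlet @ op @ singlet      # expectation value in the singlet
      print(expval(Sz_A), expval(Sz_A @ Sz_A))        # 0.0 0.25: local spin fully random
      print(expval(Sz_tot), expval(Sz_tot @ Sz_tot))  # 0.0 0.0: total Sz exactly zero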

  247. fred Says:

    Scott #225
    “The Game of Life seems to have no analogous conservation principles that would tamp down on the destructive effects of gliders”

    But I see no reason to assume that the 2D grid instantiating the Game of Life at its smallest level would be directly mapped to the reality perceived by entities at a much higher level of abstraction, and the apparent “destructive effects” of gliders may correspond to something entirely different at that level.

    A taste of this idea: if we consider someone playing Resident Evil 4 in VR on Quest 2, there’s no direct/obvious connection between the 3D space perceived by the player (e.g. the player is looking at a red cup falling onto a wooden table and breaking into a hundred pieces) and the corresponding low level bit patterns and their evolution inside the linear structure of the RAM.

  248. Chris W. Says:

    I hope a small question related to the basis for Q1 (we live in a QM universe) is OK.

    It’s surprising for a layman like me that it’s an open question whether the universe is deterministic (Anbar #234, Scott P. #241).

    Is there some flaw in the reasoning “everything in the universe has a QM state => the whole universe could be described as one QM state, which evolves deterministically according to the Schrödinger equation”?

  249. Johnny D Says:

    The Schrödinger equation allows for static and dynamic states. Static states are only possible because of the sleight of hand that makes phase irrelevant in the Born rule. This goes a long way toward answering why QM: it requires a 2d wave function with something like a phase and a modulus. Dynamic solutions are those that are superpositions of static states. Superposition and multiple degrees of freedom imply the tensor product.

    Static solutions exist as eigenstates of a Hermitian Hamiltonian. These eigenstates exist because the operator is over the complex numbers. The exponential of a Hermitian operator is unitary.

    Can you create a classical system with static and dynamic states?
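
    A small numpy sketch of these claims (the 3×3 Hamiltonian here is just a random example): the exponential of a Hermitian operator is unitary, its eigenstates are “static” up to a phase, and superpositions of different energies are “dynamic”.

      # exp(-iH) built in the eigenbasis of a random Hermitian H.
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
      H = (A + A.conj().T) / 2                                  # Hermitian "Hamiltonian"

      vals, vecs = np.linalg.eigh(H)
      U = vecs @ np.diag(np.exp(-1j * vals)) @ vecs.conj().T    # U = exp(-iH)

      print(np.allclose(U.conj().T @ U, np.eye(3)))             # True: U is unitary

      e0 = vecs[:, 0]                                           # an energy eigenstate
      print(np.allclose(np.abs(U @ e0), np.abs(e0)))            # True: static up to phase

      sup = (vecs[:, 0] + vecs[:, 1]) / np.sqrt(2)              # two different energies
      print(np.allclose(np.abs(U @ sup), np.abs(sup)))          # False in general: dynamic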

  250. Lorraine Ford Says:

    Roger Schlafly #169:

    Is it the case that quantum mechanics allows for the possibility of free will? Or is it the case that quantum mechanics attempts to rationalise the existing free will of the system? The system has free will in the sense that individual quantum outcomes can be thought of as the system, or parts of the system, freely assigning numbers to variables.

    Genuine free will can only be modelled as living things, or other entities, assigning one or more numbers to some of their own variables (as opposed to the laws of nature determining all the numbers for all their variables).

    So, I partly agree with you in the sense that quantum mechanics shows that the system could very easily cope with genuine free will.

  251. Clinton Says:

    Stewart Peterson #204:

    I noticed it didn’t look like anyone had replied to your request.

    The canonical text for the computational approach is by Nielsen and Chuang “Quantum Computation and Quantum Information”. All you need is Chapter 2.

    Mike and Ike say it best: “Quantum mechanics is a mathematical framework for the development of physical theories. On its own quantum mechanics doesn’t tell you what laws a physical system must obey, but it does provide a mathematical and conceptual framework for the development of such laws.”

    In other words, as Scott has said, quantum theory is not ABOUT physics. It is an extension of probability theory that allows for amplitudes that can be negative (or complex).

    I also highly recommend Scott’s book Quantum Computing Since Democritus or his lecture notes on which that book is based – both for the presentation of the theory and for the humor.
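
    A minimal sketch of that “probability theory with minus signs” point, using the standard two-path (Hadamard) toy example rather than anything specific from the book:

      # Stochastic vs. unitary evolution: a doubly-applied fair coin stays 50/50,
      # a doubly-applied Hadamard interferes back to certainty.
      import numpy as np

      S = np.array([[0.5, 0.5],
                    [0.5, 0.5]])                 # classical "fair coin" update
      p0 = np.array([1.0, 0.0])
      print(S @ (S @ p0))                        # [0.5 0.5]

      H = np.array([[1.0,  1.0],
                    [1.0, -1.0]]) / np.sqrt(2)   # quantum analogue (Hadamard)
      psi0 = np.array([1.0, 0.0])
      psi = H @ (H @ psi0)
      print(np.abs(psi) ** 2)                    # ~[1, 0]: the minus sign cancels the |1> path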

  252. Jacob Says:

    A question similar to your question 1 that I would love you to discuss: given that the laws of physics are so complicated, why can they be so well approximated by something so simple?

    “Why don’t Newtonian physics work?” doesn’t strike me as a terribly interesting question – there’s no reason to suppose they should.

    But “why do they almost work?” seems much more puzzling.

  253. Andrew Matas Says:

    I think there’s a misconception built into the premise of Q1.

    I don’t think, in physics, we have *ever* been able to say that any of the basic physical principles are inevitable, a priori, directly from “the void”. What we have been able to do, is *synthesize* or *unify* the basic laws, so that what originally appeared to be two or more independent properties of the Universe, could be understood as consequences of a single, deeper principle (eg: electrostatics, magnetostatics, and light all following from Maxwell’s equations).

    It’s not that Lorentz invariance is inevitable; but accepting Lorentz invariance as a basic principle *explains* a huge amount of observations, changes our point of view, and has been successful at letting us guess new laws.

    I think that the best we can hope for, as mere mortals, is that we may discover one day how to derive quantum mechanics from a deeper principle that also contains some other aspect of physics, like relativity. Finding *what* principles underlie our world would be a major accomplishment. I don’t think we will ever be able to say *why* “God” chose the particular set of principles that underlie our world — or at least, I think that would be a *much* stronger intellectual achievement than anything achieved in physics to date. No harm in asking the question, but worth putting the question in proper context.

  254. Peter Morgan Says:

    JM #81 points to a blog post, https://www.lesswrong.com/posts/7A9rsJFLFqjpuxFy5/i-m-still-mystified-by-the-born-rule, where can be found, a long way down, a comment mentioning
    https://royalsocietypublishing.org/doi/abs/10.1098/rspa.2020.0282, “The Born rule as a statistics of quantum micro-events”, by the author of that paper. Recommended.

    Philippe Grangier #82, you likely won’t see this, but if you do I’m curious whether your taking of contextuality to be fundamentally non-classical, in the paper you link to, is something you absolutely won’t give up? I take contextuality, noncommutativity, and incompatibility of probability measures to be a natural extension of classical physics because such can be constructed straightforwardly using the Poisson bracket. The advantage of accepting that extension, I have found, is that we can then look at what other differences there are between classical and quantum physics without the distraction of noncommutativity. Spoiler: there are other differences.

  255. Scott Says:

    Andrew Matas #253: I’m obviously well-aware that, whenever physical principles have been successfully explained, it’s always been on the basis of deeper principles, which are then explained (if at all) by even deeper principles, and so on, with the chain necessarily terminating somewhere. That’s what I want in the case of QM: to go at least one step deeper.

  256. TomY Says:

    Two thoughts here:

    1. It is possible to think of a physicist as analogous to a computer science algorithms researcher: the physicist’s goal is to find an equation (~algorithm) that *can* be calculated (~efficient algorithm) with different initial conditions (~inputs) and test the results in the lab (~run the algorithm on the computer). Therefore, out of all things (Turing-)computable, physics can only be described by the set of efficiently computable algorithms, otherwise we can’t really test them on the computer (i.e. nature). Now one can say that the laws of chemistry must somehow be in BQP, so if physics is classical (i.e. we have a classical computer to run the algorithms), chemistry cannot be computed by nature at scale. So now the question is why can’t we have chemistry in P? From here it might be possible to show that some simple chemical process is BQP-complete…

    2. A more general question I once asked myself is what, abstractly, a physical theory is (from a mathematical point of view)? Then one can think of different realizations of the general framework and start exploring which realizations have better properties. This is analogous to having the mathematical field as an abstract structure and the rationals and the reals as realizations of a field. One can say the rationals are so simple and finite in nature (~classical physics) but the reals have the important property of completeness (the limit of every Cauchy sequence exists, blah blah).

  257. Doug Mounce Says:

    Great question Scott, and an early, sophisticated answer might still be of value with God’s answer to Job in that same regard: ‘hey, did you put the stars in the heavens?’ Well, the first answer obviously is “no”, but Newton’s plan, which looks good from my perspective, apparently is not THE plan.

    PS – assuming THE plan implies that A plan will be logical

    PPS – if the world were logical then the world would make sense, but the world doesn’t make sense, therefore the world is not logical; modus tollens, (~B & (A->B)) -> ~A

    PPPS – old Marx Bros. quote – Zeppo says, “We’ve got to think!” Chico replies, “Nah, we already tried dat.”

  258. Anbar Says:

    Philippe #236

    Well, the ado was taken care of by a few bright guys in the late 1920s…

    The sense in which the formalism and interpretation of QM are inevitable, given the empirical behavior of even something as simple as a photon, is laid out by Dirac in the introduction to the Principles, and von Neumann figured out the formal logic behind the projectors shortly thereafter.

    The shortest route from empirical observations to the inevitability of QM that I can offer with hindsight is:
    quantized bound states and specific heats -> systems prepared in *similar* ways must behave in *exactly* the same way most of the times and *completely differently* otherwise -> enter mutually exclusive configurations with OR via linear dependency, and indeterminism, with (sesqui)linear Born rule to automatically generate consistent probabilities through amplitudes

    What else could be simpler than this?

  259. arch1 Says:

    General comments only: It seems that a satisfactory analysis of any of Q, Q1, or Q2 must:
    1) be clear upfront about assumed constraints (e.g. are only universes which can support consciousness being considered, or only ones which can support at least primitive life, or all universes however devoid of complex behavior?);
    2) admit, at least initially, that one or more of the questions may not have an answer (e.g. it may just be untrue that the universe, or even the universe we find ourselves in, had to be quantum mechanical);
    3) explicitly address, or exclude for good reasons, each of the four scenarios {artificial creation, naturalistic creation} x {single creation, many creations}, because they require different analysis approaches.

  260. Roger Schlafly Says:

    A cellular automaton world might be simple to describe, but as Varga #221 and others pointed out, it would be horribly complex for anyone living in it. Probably more complicated than QM. The advantage of a CA is supposed to be that it is less mysterious, but I think that it would be more mysterious.

  261. Scott Says:

    Daniel Harlow #196:

      I think you are overly dismissive of argument #2 that the world as we know it would be impossible without quantum mechanics. In order for us to be having this discussion at all, the laws of physics need to have the ability to generate interesting complex structures in a reasonable amount of time starting from a simple initial state. Now I know that as a computer scientist you are trained to think that is a trivial problem because of Turing completeness, universality, blah blah blah, but really I don’t think it is so simple. Why should the laws of physics allow a Turing machine to be built? And even if a Turing machine is possible, why should one exist? I think the CS intuition that “most things are universal” comes with baked-in assumptions about the stability of matter and the existence of low-entropy objects, and I think it is not so easy to achieve these with arbitrary laws of physics.

    Indeed, one way that this thread has been useful to me, is that it’s shifted me in the direction of thinking that that’s right. Multiple people made the case to me that it’s far from obvious how well
    (1) stable matter,
    (2) complex chemistry,
    (3) Lorentzian and other continuous symmetries,
    (4) robustness against small perturbations,
    (5) complex structures being not just possible but likely from “generic” initial data,

    can actually be achieved in simple Turing-universal classical cellular automaton models.

    Would you agree that this is, at the least, an exceedingly interesting research question?

    Even if the answers to this research question turned out to support your side of your argument, I confess that I still face a psychological difficulty with saying that the universe is quantum-mechanical for reasons like (1)-(5) above. My psychological difficulty is just that QM, and in particular the exponentiality of the wavefunction, seems too metaphysically extravagant to be the solution to problems like these! It’s like, in order to patch up some issues with the evolution of complex structures in one universe, you’re going to create (at least from the Everettian perspective) exp(n) additional universes?? Really? To what kind of deity would such an astronomically expensive trade seem worth it? Maybe a deity to whom computational resources were no object … but then why didn’t the deity just go whole hog, and make NP-complete problems efficiently solvable or whatever? 😀

  262. Scott Says:

    async #143:

      We can derive the general structure of transformations between different reference frames from some very reasonable assumptions. There are only two possibilities: a) Galilean relativity (if there is no speed limit) b) special relativity (if there is a speed limit).

      Is the answer to the question “Why Special Relativity?” then along the lines of “There are only two possibilities and both are of comparable complexity. Nature just happened to realize one of them” or something else?

    Actually I think the answer is better than that. It’s something like: “if you want a spacetime continuum and equivalence of all inertial frames, then there are only two possibilities … but one of those possibilities implies unbounded speeds, therefore no true locality or isolation of subsystems, and therefore the other possibility is realized.”

    This is an almost completely satisfying answer to the “why special relativity?” question. If we had an answer to the “why QM?” question that was even 25% as satisfying as that answer, I’d declare the research program advocated in this post a success and I’d go home happy.

  263. Scott Says:

    Daryl McCullough #144:

      My feeling is that quantum mechanics can’t literally be true, because of the measurement problem, which I don’t think can be solved. Not without going beyond quantum mechanics. Maybe there is a way to make Many Worlds or Bohmian Mechanics work, but I don’t consider those to be orthodox quantum mechanics.

    These days, my personal sense is that Many Worlds is the orthodox position … it’s just that not all of its adherents are willing to come out and say so in so many words! Instead they talk about decoherence theory, the Church of the Larger Hilbert Space, etc.—they just refrain from pointing out the Everettian corollary of all this, and change the subject if someone else tries to. 🙂

    Or to be maximally charitable, they’re Everettians except that they’re agnostic about whether there’s anyone home experiencing anything in any of the other branches, or whether they’re all just “ghost towns” like the Bohmians believe; and they’re also agnostic about whether future discoveries will change the whole outlook on these matters … which, if so, brings them exceedingly close to my own position.

  264. Scott Says:

    Mateus Araújo #145: My difficulty with your answer is, even if you assumed that probabilistic branching was crucial, why not just have a classical stochastic evolution rule? You wouldn’t have to stick all the random bits at the beginning, if you didn’t want to; instead you could posit that there were probabilistic transitions (or even “freely-willed transitions”).

    I’m not saying that there couldn’t be a compelling answer to that question, just that the question needs such an answer.

  265. Scott Says:

    Boaz Barak #149:

      But as our world was more and more shaped by machines, engines, and clocks, these began to shape our metaphors as well. Hence we could think of the “clockwork universe” and these metaphors became more natural to us than stories about gods.

      Similarly, I don’t think the view of “it from qubit” and the universe as a computer would make much sense to us if the Turing machine remained a thought experiment, rather than a device that we carry in our pockets. Now that we have gotten so used to the computer, we think of them as a useful metaphor to explain other stuff, rather than a mysterious phenomenon that needs to be explained.

    I actually strongly agree with that picture! But with one crucial addendum. Namely, I don’t see these various technologies (engines, trains, clocks, computers, now QCs…) as shaping our stories about the world in completely arbitrary ways … akin to spiders theorizing that spacetime is a giant web, or beavers theorizing a cosmic network of dams and lodges. Rather, I see these technologies as part of a feedback loop, where we learn more about the actual, objective nature of the world, and that understanding lets us build better technologies, and those technologies in turn provide better metaphors with which to describe more of the actual nature of the world (besides, of course, the technologies’ more direct assistance), and then we use the understanding to build still better technologies, and so on.

    I.e., with full awareness of all the cultural relativists and “science studies” types waiting on the sidelines to pounce, I will insist that the sequence, from a teleological universe to a mechanical universe to a computable universe to (now) an “It from Qubit” universe, represents actual, genuine progress in understanding reality better. 😀

  266. Scott Says:

    Tobias Maassen #151:

      I do not see some of the bigger problems of classical physics mentioned.

      One example is Newtonian Gravity allowing objects to reach infinite speed in finite time

    That was fixed by GR, not by QM.

      another you mentioned is the solidity of matter (Pauli).

      Another is the problem, which invented quantum Physics: The ultraviolet catastrophe.

    We’ve discussed both earlier in this thread.

      Are there cellular automata / strings / other theory candidates which preserve energy or similar quantities?

    There are certainly CAs with natural notions of “energy,” although it’s challenging to get continuous symmetries.

      Would these allow for a complexity similar to chemistry?

    A research program to answer such questions is precisely what I’m calling for here!

  267. Chris Says:

    Scott #261

    >My psychological difficulty is just that QM, and in particular the exponentiality of the wavefunction, seems too metaphysically extravagant to be the solution to problems like these! It’s like, in order to patch up some issues with the evolution of complex structures in one universe, you’re going to create (at least from the Everettian perspective) exp(n) additional universes?? Really? To what kind of deity would such an astronomically expensive trade seem worth it? Maybe a deity to whom computational resources were no object … but then why didn’t the deity just go whole hog, and make NP-complete problems efficiently solvable or whatever? 😀

    It might help if you think in terms of parsimony. People rarely have a problem with assuming the existence of infinities, whether in mathematics or in physics, and it’s usually not because they are explicitly comfortable endorsing the ‘metaphysical extravagance’ that dependence on infinities entails, and it’s certainly not because there is any evidence whatsoever for the physical existence of infinities. Rather, it’s that infinities offer us the benefit of a theoretical parsimony of type and kind. It’s likely you’re harboring some latent cognitive dissonance with respect to this tradeoff; if you were strict about a heuristic biasing you against ‘metaphysical extravagance’, you’d be a strict finitist, and your default assumption about the universe would be that it is finite in extent, contains finite matter, etc. Is that the case?

    Is it metaphysically extravagant to suppose that there are infinitely many mathematical structures, including all perturbations on ones that closely resemble our universe? Or is it only metaphysically extravagant to suppose that this infinite collection has some ‘special sauce’ that makes it physical/concrete, rather than ‘merely’ mathematical/abstract? Seems to me that between “everything exists”, and “this, and only this, extremely huge and particular universe exists, out of the space of all possible ones that could have existed”, the former is vastly more parsimonious as a theory of everything. The “special sauce” concept is itself doing the work of making things appear metaphysically extravagant. So much the worse for special sauce!

  268. Scott Says:

    James Gallagher #152:

      I think the Born Rule is good evidence for an Anthropic Universe since there is no good reason for it to be selected amongst all even power rules apart from being “most likely”.

    I disagree in the strongest possible terms. This is one of the clearest examples of an aspect of QM that we can fully explain. Once unitary evolution picks out the 2-norm as conserved, you then don’t get conservation of probability unless the probabilities also go like the 2-norm. And, crucially, there are no linear transformations that nontrivially preserve the 4-norm, the 6-norm, or any higher norms. For more, see Chapter 9 of Quantum Computing Since Democritus, or my Is Quantum Mechanics An Island in Theoryspace?
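
    As a quick numerical illustration of one half of that claim (the unitary and the vector below are random examples): a generic unitary preserves every vector’s 2-norm but not its 4-norm.

      # A Haar-ish random unitary preserves sum |x_i|^2 but not sum |x_i|^4.
      import numpy as np

      rng = np.random.default_rng(1)
      Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

      v = rng.normal(size=4) + 1j * rng.normal(size=4)
      w = Q @ v

      p2 = lambda x: np.sum(np.abs(x) ** 2)
      p4 = lambda x: np.sum(np.abs(x) ** 4)

      print(np.isclose(p2(v), p2(w)))            # True: 2-norm preserved
      print(np.isclose(p4(v), p4(w)))            # almost surely False: 4-norm not preserved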

  269. Scott Says:

    Andrew Matas #155:

      I fear this question can be answered in 10 different ways by 5 different people, so is scientifically meaningless.

    Countless other questions—e.g., “what is energy?”—can also be answered in 10 different ways by 5 different people, but hardly anyone claims that that makes them “scientifically meaningless.” The point is simply that many different correct answers can give complementary insights, while other answers can be rejected as straightforwardly wrong.

    I’d guess that in the worst case, the “why QM?” question has the same character, while in the best case, it does have a unique maximally compelling answer after all.

  270. Scott Says:

    Roger Schlafly #169:

      You could add a stochastic process to a CA theory, but you might end up needing something like QM to explain that stochastic process.

    That’s precisely my question, though: imagine that we lived in a classical CA universe that had stochastic transitions, but no quantum interference or anything like that. Would anyone even think to invent QM as a way to explain the origin of the stochastic transitions? It seems preposterous that they would. If so, though, then any convincing answer to the “why QM?” question will have to talk about more than mere stochasticity.

  271. Scott Says:

    Gerard #194:

      The problem with science is that it gets its epistemology exactly backwards. It starts with what it sees “out there” when in fact what we see “out there” is something we can never have certain knowledge of, as most philosophers have understood at least since Plato. The one thing that we do have certain knowledge of is “I AM” and I think the fact that your own religious tradition equates that statement with the name of God is something it would be worth reflecting on.

    You do realize that some people would be amused by the spectacle of monks sitting cross-legged on a mattress, or for that matter rabbis swaying in a yeshiva, lecturing the people who built spacecraft, microchips, and nuclear reactors on how the latter “got their epistemology exactly backwards”? 🙂

      PS. As for the question I asked in my last comment, sorry, I fear my feeble monkey brain failed me once again and I forgot that arithmetic is polylog time, not polynomial time, in the quantities on which it operates.

    Thanks for saving me the need to answer! But you might be interested to learn that, as soon as you ask about analogous questions involving square roots—e.g.,

    √a₁ + … + √aₙ ≤ √b₁ + … + √bₙ?,

    where a₁,…,aₙ,b₁,…,bₙ are positive integers, you get a problem that's famously not even known to be in NP (let alone P), although it is known to be in PSPACE!
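
    (If you want to see why even membership in NP is unclear, try deciding an instance numerically: the only obvious approach is to compute both sides to ever more digits, and nobody has proven that polynomially many digits always suffice to tell the two sums apart. A quick sketch with Python's decimal module, on a made-up instance:)

      from decimal import Decimal, getcontext

      def sqrt_sum(nums, digits):
          getcontext().prec = digits
          return sum(Decimal(n).sqrt() for n in nums)

      a = [3, 5, 7]   # hypothetical instance, purely for illustration
      b = [2, 6, 8]
      for digits in (10, 30, 50):
          # crank up the precision and hope the sign of the difference stabilizes
          print(digits, sqrt_sum(a, digits) - sqrt_sum(b, digits))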

  272. Scott Says:

    Sid #197:

      You may be interested in this paper which derives that the only possible reference frame transformations are Galilean and Lorentz given 4 intuitive axioms…

    The point that that paper gives as its raison d'être—namely, that (as we now understand) the fundamental importance of c has nothing to do with the existence of apparently massless particles (photons), which, if they are indeed massless, happen to travel at the speed c, and instead has everything to do with c being the conversion factor between space and time—I'd always taken as obvious and well-known. So, is there anything else in that paper that I couldn't have gotten from Albert E. in 1905? 🙂

  273. Scott Says:

    Corey #199:

      Suppose there really wasn’t anything wrong with choosing a different, classical, starting point. How would this knowledge morph the nature of your inquiry into QM?

    That’s an excellent question; thanks Corey! My answer is this: if it turned out that “Aaron Scottson,” or whoever, could just as easily have existed in a classical world, then I would take that as “settling” Q1, albeit in a negative way—analogous to how Paul Cohen’s independence proof “settled” the question of the Continuum Hypothesis. That is, it would tell me that there’s definitely no deep explanation for QM to be had, that it was a-priori just as plausible that we’d have found ourselves in a classical world, and the choice basically just boiled down to a dice roll. That itself, ironically, would be some of the most worthwhile and interesting knowledge I could imagine!

  274. Sid Says:

    Scott #272:

    Einstein had to assume the existence of an invariant speed (and then showed that that + no preferred reference frame implied Lorentz transformations).

    This paper shows that, assuming homogeneity of space and time, isotropy, the assumption that transformations (between reference frames) form a group, and causality, the only valid transformations are the Lorentz ones (so the existence of an invariant speed is *derived* from the axioms rather than *assumed*) OR the Galilean ones — where the "invariant speed" is infinite.

    There are no other possible transformations.
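
    (If I remember the standard form of this kind of result correctly (it goes back at least to von Ignatowsky around 1910), the axioms force a one-parameter family

      x' = (x - vt) / √(1 - Kv²),   t' = (t - Kvx) / √(1 - Kv²),

    with a single universal constant K left for experiment to fix: K = 0 gives the Galilean transformations, i.e. an infinite invariant speed, while K > 0 gives the Lorentz transformations with invariant speed c = 1/√K.)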

  275. Scott Says:

    Sid #274: I see; thanks! That’s indeed a stronger statement. I’d assumed someone had proved something like it, but I didn’t know who, when, or where.

  276. fred Says:

    Richard Feynman said

    “I think I can safely say that nobody understands quantum mechanics.”

    And, in the set of all possible mechanisms that nobody would be able to understand, God just picked the simplest.

  277. Not That Simple Says:

    I don’t know if anyone has already suggested this but,

    “it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”

      Is super-mega-untrue and probably the answer to your question. Scientists before and in the early days of QM were unable to come up with classical mechanisms to produce things like atoms and molecular bonds. There was the "ultraviolet catastrophe," where thermodynamics didn't work. There was enormous effort put into describing stuff using classical mechanics, back when that was all that existed, and it didn't go anywhere.

    It would be a lot of work to put together a historical review of all the attempts at non-quantum physical explanations for stuff that wasn’t planets or machinery, mainly because those ideas have been ignored for a hundred years on account of not working. However if you did this I think you’d realize how QM is truly “necessary,” to have a world like ours, that’s not run by demons inside of spheres pulling levers when the spheres get close to the right kind of other spheres. 😉

  278. Philippe Grangier Says:

    To Andrei #211 and #216

    *I address your argument, though this is not entirely easy with this simple editor, so I write between stars.*
    We have an EPR-Bohm setup: two spin-entangled particles are sent to 2 distant stations, A and B. The spin measurements are simultaneous (space-like separated) and performed on the same axis (say Z). Let's assume that the result at A is +1/2. We have:
    P1. The measurement at A does not change B (locality).
    P2. After the A measurement, the state of B is -1/2 (QM prediction).
    *No, first you should define what a 'state' is. In my definition a 'state', which I call a modality, only makes sense if the measurement context is defined. So one should write:
    P2. After the A measurement, the state of B is -1/2, with respect to an orientation defined by the A context (QM prediction).*
    From P1 and P2 it follows:
    P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.
    *No, since the prediction about the ‘state of B’ requires the A measurement, which was not done before*
    Symmetrically, the spin of particle A on Z was predetermined as well.
    *No, for the same reason, inverting A and B*
    The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined.
    *No again. The initial entangled state is predictively incomplete, and requires the A measurement to be done in order to make a meaningful prediction on the result of the B measurement; without that, B’s result is fully random*
    In other words:
    C1: locality can only be saved by introducing deterministic hidden variables (the spins before measurements, or some other property that uniquely determines the results).
    *No*
    Then we have Bell’s theorem which says:
    C2: Local hidden variable theories are impossible with the exception of superdeterminism.
    From C1 and C2 it follows that the superdeterminism is the only possible way to keep physics local.
    *No. Predictive incompleteness goes with contextual inferences, which are purely quantum, but require neither (Bohm-type) nonlocality nor superdeterminism. More details in https://www.mdpi.com/1099-4300/23/12/1660 *
    I think I can spot the problem with your paper. In Chapter 5, below Figure 1 you say:
    “The resulting predictions can be effectively checked in the verification zone V in the common future of all light cones.”
    I think this is irrelevant. True, the experimental records can only be compared at V, but one can look at the time at which the measurements were performed (the time printed on those records) and conclude that the prediction of A about B was true immediately after the A measurement. It does not become true at V.
    * The prediction was true, but B's particle was not affected by A's measurement whatsoever. Only the contextual inference by A is true, and it can only be checked later, in a fully causal way. And it can be checked that no contradictions arise when inverting A and B*
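
    (For definiteness, the QM prediction that both of us take as given, namely perfect anticorrelation along the same axis with each side's marginal outcome uniformly random, can be checked numerically in a few lines; this is just a sketch, not part of the argument itself:)

      import numpy as np

      # Singlet state (|01> - |10>)/sqrt(2) of two qubits
      psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)
      I = np.eye(2, dtype=complex)

      corr = psi.conj() @ np.kron(Z, Z) @ psi   # <Z_A Z_B> = -1: opposite results along the same axis, always
      margB = psi.conj() @ np.kron(I, Z) @ psi  # <Z_B>  =  0: B's result alone is a fair coin
      print(corr.real, margB.real)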

  279. fred Says:

    Scott P. #230

    Apparently Newton’s equations aren’t even strictly deterministic:

    https://en.wikipedia.org/wiki/Norton%27s_dome

    In other words, determinism requires that two different states never merge into a common state (whether time runs forward or backward). And that's not true for Newton's equations.
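
    (In units where the constants are scrubbed away, the dome's equation of motion and its non-unique solutions look, if I'm reading the article right, like this:

      d²r/dt² = √r,   with r(0) = 0 and dr/dt(0) = 0,

    which is solved both by r(t) = 0 for all t and by r(t) = (t - T)⁴/144 for t ≥ T (and 0 before), for any T ≥ 0. The particle can sit at the apex forever, or start sliding at an arbitrary, uncaused moment T.)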

  280. Scott Says:

    Not That Simple #277:

      I don’t know if anyone has already suggested this but,

      “it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”

      Is super-mega-untrue and probably the answer to your question. Scientists before and in the early days of QM were unable to come up with classical mechanisms to produce things like atoms and molecular bonds…

    That has indeed been suggested at least a dozen times. 🙂

    Briefly, “a world like ours” is not part of what I’m requiring here—I merely insist on “a world able to support complex life and intelligence.” It’s conceivable that you’re still right and there’s no simple classical way to get that, but it’s now far less obvious!

  281. Ted Says:

    Scott, may I ask if you have any thoughts on question #3 in my comment #206? I apologize for it not being entirely on-topic, but at least it has the advantage of being a very concrete technical question that you may be able to completely resolve in a few sentences (unlike most of the “big” questions in this post and its comments!).

  282. nt Says:

    My 1000iq wacky justification for qm goes like:

    Universe is geometric, and obeys relativity, and has a notion of distance

    1. Rotation invariance implies spinors (“square roots” of geometry) somehow
    2. Spinning things and waves are basically the same thing
    3. Waves are naturally described by complex numbers (I think you can avoid this but you get something that’s isomorphic to the complex numbers regardless)
    4. An angular distance then gets you the Fubini-Study metric and all the wacky stuff it implies

    All this has to be compatible, so out pops a Kähler manifold, which in combination with the Fubini-Study metric forces a lot of quantum mechanical behavior

    * I have no idea why spin is so privileged (momentum and energy are related to translations, and they are important), but without it we don't get any interesting behavior (or stable matter). That, and the fact that it shows up in the purely geometric parts of general relativity (torsion), leads me to think there's something deeper afoot.

    I have no justification for the particular values of parameters or forces and such, that just seems like some extra spice

  283. FeepingCreature Says:

    I specifically want to argue against the idea that there's some association of quantum mechanics and consciousness. I think it's a good exercise to try and imagine porting your brain to a classical algorithm – which we can do, since afawcs the brain is not a quantum computer – and then imagining having the concrete experience of that universe. You wouldn't have truly (quantum) random events, but you'd still have unpredictable, chaotic outcomes, so you'd still need to model multiple futures, and you'd still experience making decisions and navigating state space, even in a non-branching block universe. I worry that this sort of thought – that reality has to be quantum because consciousness is quantum-related – arises more from the fact that we don't understand consciousness than from anything we do understand about it.

    Anyway, my idea is that qm is just the simplest setup where a small seed state can give a sufficient variety of outcomes for life to exist. I think life will turn out to be so unlikely that the universe has to allow an unbounded continuum of branches to find some with life in it at all.

  284. Guyren Howe Says:

    An interesting related question: what would a universe look like that had Quantum Mechanics, but not Relativity?

    Could you have a quantum Ether?

  285. Kenneth W. Regan Says:

    I proffered a “pourquoi-pas?”-type answer to your Q1 in my first slide at https://cse.buffalo.edu/~regan/Talks/QUnion.pdf

  286. Scott Says:

    Ted #281: OK, a couple responses:

    – Good catch, you indeed caught me contradicting what I wrote 15 years ago, when I was apparently more impressed with the (nAnB)² = nA²nB² argument for amplitudes to be complex. I believe the explanation is simply this: I was genuinely impressed with the "parameter-counting" argument the first time I saw it (which would've been either in Lucien Hardy's work or in Chris Fuchs's). But then, the more I saw it invoked, the less impressed I became—because like, if there's this standard axiom that everyone knows to throw in for the sole purpose of ruling out real amplitudes, then why not just rule out the real amplitudes by fiat and be done with it? 🙂 I'll leave it to you to decide if this is rational.

    – Regarding your question about normalization: yeah, that bugged me too. But I believe the resolution is simply this: the parameter-counting only matters insofar as it leads to the possibility of local tomography: that is, reconstructing any bipartite mixed state ρAB solely from the correlations between the outcomes of measurements on A and B separately. So, what do we need for local tomography? Well, assume for simplicity that A and B are both qubits. Then as you correctly pointed out, when we include normalization, the number of independent real parameters needed to specify ρAB is only

    4² – 1 = 15,

    not 16.

    Now let’s count the number of operators needed for local tomography and see if it matches. Well, we can characterize ρAB from the expectation values of all the possible tensor products of Pauli operators: namely,

    I⊗I, I⊗X, I⊗Y, I⊗Z
    X⊗I, X⊗X, X⊗Y, X⊗Z
    Y⊗I, Y⊗X, Y⊗Y, Y⊗Z
    Z⊗I, Z⊗X, Z⊗Y, Z⊗Z

    Note that there are 16 possible tensor products here because

    dim(A)² dim(B)² = 16,

    the important thing about the 4 Pauli operators here being that they span the 4-dimensional real vector space of 2×2 Hermitian matrices (and in particular all the density matrices).

    AHA, but Tr((I⊗I) ρAB) = 1 for all ρAB, which means that one of the 16 tensor products is irrelevant, which means that really we only need 15 of them! An exact match, stemming from the fact that

    4² – 1 = 2²×2² – 1.
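
    (If anyone wants to see the counting in action, here's a throwaway numerical check, purely my own sketch, that the 15 nontrivial Pauli expectation values, together with the fixed trace, really do determine a two-qubit ρAB:)

      import numpy as np

      rng = np.random.default_rng(2)
      I = np.eye(2, dtype=complex)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)

      # Random two-qubit density matrix rho = A A† / Tr(A A†)
      A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
      rho = A @ A.conj().T
      rho /= np.trace(rho)

      # Rebuild rho from the 16 Pauli-product expectation values (only 15 carry
      # information, since Tr((I⊗I) rho) = 1 is fixed by normalization)
      rho_rec = np.zeros((4, 4), dtype=complex)
      for P in (I, X, Y, Z):
          for Q in (I, X, Y, Z):
              PQ = np.kron(P, Q)
              rho_rec += np.trace(PQ @ rho) * PQ / 4   # orthogonal Pauli expansion
      print(np.allclose(rho, rho_rec))                 # True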

  287. Possibly That Simple Says:

    Scott #280:

    “I merely insist on “a world able to support complex life and intelligence.””

    I understand that many cellular automata fit this bill. They would contain people that thought they were having this very conversation, if the initial conditions were random and infinite.

    I can't think of a way to structure the problem so as to forbid that answer. PRNGs can generate "random" initial conditions with extremely low Kolmogorov complexity, so you can't just demand simple initial conditions. You can't ask for a small world either, because one trait of our universe is that it is much larger than it has to be to contain us.

    Maybe figuring out how to forbid "Boltzmann brains in the Game of Life" as an answer will be a helpful step.

  288. Scott Says:

    Clinton #203: The part you’re missing is that, if someone in the 1800s had asked “why is classical mechanics true? why does it have the features it does?,” with hindsight that would’ve been one of the best questions they could possibly have asked! Because it would’ve had nontrivial answers! Albeit, answers that were only discovered later. For instance:

    Why is classical mechanics time-reversible? Why does it satisfy a least action / Euler-Lagrange principle? The answers would come from QM.

    Why does the gravitational force fall off like 1/r²? Why are gravitational and inertial mass the same? The answers would come from GR.

    In other words, there really were deeper principles waiting to be discovered (deeper principles expressed, yes, using math). So your thought experiment strikes me as supporting optimism, rather than pessimism, about the search for deeper principles underlying QM!

    Having said that, there’s an immense irony here: physicists were ultimately able to explain classical mechanics in terms of deeper theories, in large part because they discovered that classical mechanics wasn’t exactly right. The corrections were what led them to the deeper theories from which classical mechanics was then recovered as an excellent approximation.

    So we’re led to the following picture:

    – If (as you seem to think) QM isn’t exactly true, just like classical mechanics wasn’t, then we should ultimately be able to explain QM in terms of something deeper.

    – If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).

  289. Boaz Barak Says:

    Wrote to you more philosophical thoughts by email, but here is a quarter-serious answer to Q1:

    God made the universe quantum mechanical so we could have Shor’s algorithm.

  290. Scott Says:

    Stewart Peterson #204:

      Are you aware of a treatment at, say, the advanced undergraduate to beginning graduate level, which describes QM from a mathematician’s perspective – the way you understand it yourself?

    You could try my Quantum Computing Since Democritus! Or my undergrad quantum information lecture notes! Among countless other resources … but you asked me! 🙂

  291. madaco Says:

    Regarding tensor products,
    if, instead of quantum-mechanical, the universe were merely probabilistic, where instead of a state vector we had a probability distribution, regarded as a vector in L^1(the state space),
    wouldn't we still end up with tensor products for combining different systems?

    The time evolution operator would still be linear and preserve total probability, right?

    Not that like, the derivative of the state vector (probability distribution) with respect to time would be a linear function of the current state vector (uhh…. actually, would it be? uh… hm.), but like, the function from “probability distributions at time t_1” to “probability distributions at time t_2” would be linear, right?

    And, if that works for sub-systems, then, if you have two sub-systems together, then it seems like the tensor product is just about the only thing you could do. (?)

    I mean, assuming that you didn’t want to actually sample a particular state, and instead wanted to just keep updating a probability distribution.

    So, except for the “you could just sample a single state and stochastically update that over time, instead of keeping track of a probability distribution changing over time” (which admittedly is a big thing to except), I’m not sure that the “but the dimension of the state vector for large systems is exponentially large” is really a reason to disfavor qm compared to merely probabilistic things?
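
    (A tiny sanity check of the tensor-product point, in case it helps; just a sketch: classical stochastic evolution is linear on distributions, and independent subsystems combine by the Kronecker product, so the exponentially big joint distribution is already there classically.)

      import numpy as np

      # Column-stochastic transition matrices for two independent 2-state systems
      T1 = np.array([[0.9, 0.2],
                     [0.1, 0.8]])
      T2 = np.array([[0.5, 0.3],
                     [0.5, 0.7]])
      p1 = np.array([1.0, 0.0])      # distribution of subsystem 1
      p2 = np.array([0.25, 0.75])    # distribution of subsystem 2

      joint = np.kron(p1, p2)        # joint distribution of the independent pair
      T_joint = np.kron(T1, T2)      # joint evolution = tensor product of the parts
      # Evolving jointly agrees with evolving each part and then combining:
      print(np.allclose(T_joint @ joint, np.kron(T1 @ p1, T2 @ p2)))   # True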

    Also, hey, can you have anything substantial which is stochastic in this way but also where the state vector evolution through time is reversible? It seems to me like the answer would be no, assuming that it is possible to reach a state in more than one way.

    Like, can you get all of “for the simulator, it is deterministic, but internally it is effectively random”, “separate parts of the system can be considered separately”, “for the simulator, the time evolution is reversible”, and “there is more than one way to get the same* result” any other way?

    (I say “for the simulator” not out of a belief in a simulator, but just as an easier way to phrase things.)

    I guess if “the same* result” had stuff that was actually different at a microscopic level, but which couldn’t be internally observed, then you could I guess, but at that point you are basically just including the history of the world in the current state of the world, in order to make the time evolution reversible, and that seems like cheating?

  292. wyatt the noob Says:

    Scott

    It seems like the David Deutsch worldview is at least interested in Q1 and has some opinions on it. From a recent reading of TFOR, some candidate directions are: 1) QM is needed to resolve time-travel paradoxes; 2) QM is needed to provide foundations for moral realism; 3) QM is needed to provide foundations for information, and specifically for biology and intelligence. My guess is that a better understanding of this worldview yields further opinions about why QM is needed for biology, epistemology, and computing to make sense.

    I think it could be productive to address the David Deutsch world view as a whole as a way forward. I for one would love to see criticism and progress there.

  293. CR Drost Says:

    If I’m compelled to write, this is a dangerous post. Wheee…

    I have more to read, but here are lots of quick things, some of which I see as nonsense:

    – If you wanted to say this is the simplest mechanism allowing a certain complexity, that is hard because anything Turing complete admits a vast complexity.

    – I think “why quantum” could be connected to consciousness, maybe. A much deeper meditation on Penrose’s idea that real understanding can transcend bounds of ordinary logic to draw meta-conclusions … But that’s a half formed idea and pretty crackpot.

    – I met Andreas O. Tell on IRC several years back and he had a fun preprint. So like in the field that I got my masters in, condensed matter, there’s a technique that we are always using called density functional theory, DFT. The basic idea is to reduce the state space by doing an eigenvector decomposition of the state matrix… Keep only the states that have the highest several eigenvalues, try to reduce the dimensionality that way.

    Tell's preprint argues that this offers a different interpretation for QM, where essentially some aspect of consciousness is able to push the eigenvalues around arbitrarily on the state matrix, as long as they don't cross. This means that the eigenstate for the highest eigenvalue is kind of our "best guess of reality": we can push the eigenvalue to one while all of the other eigenvalues drop to zero. From this he tries to rederive the Born rule by stating that information comes into some confined volume at the speed of light, and we have to readjust our understanding based on this new information, and basically the eigenvalues can cross with some probability.

    So if I'm connecting to consciousness then that's probably my angle: these are the prerequisite laws to have a vantage point, and we only see the world with a vantage point. Less airy-fairy, more brutally physical.

  294. philip Says:

    A purely classical universe wouldn’t take any time to go from start to end

  295. Scott Says:

    Tim Maudlin #208: I’ve given you a bunch of criteria of simplicity according to which Newtonian mechanics will very often win (for example, against Ptolemaic epicycles, or Aristotle’s teleological universe), but according to which it will also sometimes lose (for example, against Conway’s Game of Life). The fact that all the reasonable criteria I can think of agree in these judgments is what gives me confidence that they’re pointing to something real. Even if you deny that, though, I hope you’ll grant me this: at least I didn’t send Newtonian mechanics totally unarmed into the Occam’s Razor gladiatorial arena, while arming its opponents. I tried to set up a fair fight.

    You, by contrast, have set up “criteria of simplicity” according to which nothing besides our laws of physics could ever possibly win. Our laws are the simplest because they’re our laws, and because everything else has to be expressed in terms of them, whereas our laws can just effortlessly express themselves. Thus, all your judgments about the simplicity of our laws are entirely tautological: the contest you’ve set up is a bloodbath, F=ma just mowing down unarmed opponents with a machine gun.

    I’m frankly amazed that a philosopher of science either wouldn’t notice that or wouldn’t care!

    As for Bohmian mechanics, you keep making statements that I can only understand if I presuppose Bohmian mechanics is true! For example, you say that Bohmian mechanics “tells you” which slit the photon went through, as a function of where it hit the second screen, which is new information that you couldn’t have gotten without Bohm.

    I, by contrast, would’ve put the matter thus: Bohmian mechanics tells you a story about which slit the photon went through—a story that, by construction, has no causal effect on anything you can observe later. A different story, one equally compatible with all the predictions of QM, could’ve told you that the photon went through a different slit. (Admittedly, with 2 slits and 2 spatial dimensions, there happens to be a unique local, probability-preserving way to divvy up what goes where. But that’s a very special case and won’t generalize to when we include multiple photons, qubit degrees of freedom like polarization, etc.) So then, I’d say that a physicist is free to adopt any such hidden-variable story, or none of them, according to convenience.

    When someone with a more Everettian mindset thinks about Bohmian mechanics, it seems to decompose into three claims:

    (1) Yes, there’s a wavefunction of the universe, just like in MWI, and it evolves unitarily just like MWI says it does, with no special role for “measurement.”

    (2) Crucially, only one branch of the wavefunction has “anybody home” to experience anything; the other branches exist but are all “ghost towns.” (I don’t mean to dismiss or ridicule this claim; it’s interesting and for all I know it might be true!)

    (3) In order to pick out which branch has “anybody home,” in a way that agrees with the usual Born rule at any individual time, we ought to tell one particular kind of story about nonlocal hidden variables defined in the particle position basis. (Well, at least in nonrelativistic QM; it’s unclear what we ought to do in QFT or quantum gravity.)

    I don’t know whether you can engage with someone who understands Bohmian mechanics in this way, or whether the chasm is simply too great. In the latter case, maybe we should just drop this thread, since it isn’t directly relevant to the subject of the post.

  296. Andrei Says:

    Scott,

    “By “useful” information, Sakurai clearly means any information whatsoever that one can choose to specify at point A.”

    If you take the non-local route (denying P1 – The measurement at A does not change B), it necessarily follows that one bit of information (the measurement result at A) was instantly sent to B (so that B is changed accordingly). Agreeing that an UP result is represented by 1 and a DOWN result by 0, a measurement sequence UP,UP,DOWN,UP consists of 4 bits: 1101. Those 4 bits are sent instantly to B. B can access them instantly by performing the corresponding measurements at his location.

    “It’s a theorem in QM — I prove it every year in my undergrad class — that none of that can be transmitted.”

    You DID transmit those 4 bits, 1101. What the theorem proves is that you cannot use those bits to transfer some other information of your choice, like a picture of a cat, or your name or whatever. But, as I pointed out earlier, SR is not concerned with the content of the message. So, if your "choice" is to send 1111, you cannot do that in a controlled manner. But this is completely irrelevant, a red herring. SR does not have any special postulate that distinguishes between information that you "choose" to transfer instantly and information that you don't choose. It's just like saying that SR is in perfect agreement with solar flares producing instant effects at Alpha Centauri just because you cannot control solar flares, so you cannot send cat pictures with them. Just a big, ugly red herring.

    “He means to exclude correlations, which occur even in classical probabilistic theories and are considered totally unproblematic there…”

    Those classical correlations can always be explained locally. This is made explicit in field theories like classical EM.

    “…but which can be stronger in QM in a way that helps A and B win certain nonlocal games, which is the whole content of the Bell inequality.”

    Again, if you deny P1, you have an explicit non-local information transfer. It is that transfer that allows you to win those games.

    “SR doesn’t care about Bell-type correlations.”

    This is because those correlations CAN be explained locally as well, in a superdeterministic context. But once you make the choice to reject superdeterminism, and deny P1, that local explanation is not available anymore, and suddenly SR cares about them.

    “We could see that abstractly, even if we didn’t have successful relativistic QFTs that show it explicitly.”

    Again, QFT does not make a clear choice of accepting or rejecting P1. If you make that choice and add to it the postulate:

    The measurement of A causes an instantaneous change at B

    you get a not-so-successful QFT anymore. You introduce an asymmetry between the A and B measurements (A is random and it causes B, while B, being caused by A, is not random). Good luck making that compatible with SR.

    “That QM and special relativity are difficult but not impossible to reconcile, and that the rare theories that do successfully reconcile them are so phenomenally good at describing the world, is a powerful indication that physics is on the right track here.”

    QFT itself is on the right track (since it does not explicitly deny P1), but denying P1 derails it. Accepting P1, and so superdeterminism, is the only way to stay on that track.

  297. Philippe Grangier Says:

    To Anbar #258

    Well, I also know the history of QM, and I tell it each year to my students, who like it. But if 'nothing else could be simpler than this', why did Scott launch this (pretty animated…) blog discussion, which, by the way, follows myriads of books and papers claiming to 'answer the question'? Some piece must be missing somewhere…

  298. CR Drost Says:

    Slight caveat in that in MWI nobody is home anywhere 🙂

    Well I mean kinda. Think of MWI like a big graph, a Hamiltonian of the Universe says this state leads to that state leads to this other state, that’s the edges. People take this graph and put some complex numbers on some nodes and then evolve it through successive timesteps, moving the complex numbers to other nodes, based on some complex numbers attached to the edges. But if you’re a true disciple this is a silly game. All of the worlds exist and they exist “right now” and they contain completely deluded consciousnesses that are for some reason convinced that they are moving according to this graph and not its dual… But they are frozen in time and place, their notion of movement is illusory.

    It shares this philosophical queerness with Minkowski space… A true disciple of Minkowski space thinks first that the whole worldline is conscious and the notion of the consciousness moving along the world line is a local phenomenon which is quite absurd in the global picture… then maybe later has to come to believe in the idea of a soul zipping across the worldline to explain why we actually think things are changing.

    It’s not bad, but probably to get forward motion on philosophy of consciousness we need some sort of huge revolution coming from process philosophy… Literally that we cannot be conscious without changing, so that any static state being conscious is absurd. Like when you are rewatching Heroes and Hiro Nakamura freezes time, we just take for granted that his consciousness keeps going because he is able to keep changing, while all of the change around him has slowed to a near standstill, we Intuit that those consciousnesses have stopped.

  299. CR Drost Says:

    @Boaz Barak #289 Hah! That'd be fun. But surely we'd want Grover's algorithm, the idea being that evolution is some big computer and God wanted to solve for the history that generates conscious life, but he wanted that precious √N speedup because it's taking too damn long.

  300. Andrei Says:

    Philippe Grangier,

    P1. The measurement at A does not change B (locality).
    P2. After the A measurement, the state of B is -1/2 (QM prediction).

    *No, first you should define what a ‘state’ is. In my definition a ‘state’, that I call a modality, only makes sense if the measurement context is defined. So one should write :
    P2. After the A measurement, the state of B is -1/2, with respect to an orientation defined by the A context (QM prediction).*

    As clearly specified in my description of the experiment, the detectors are fixed on the Z axis before the experiment. The Z axis is unambiguously defined (say it points towards the galactic center or whatever). Therefore, the experimental context is clear for both A and B. This context does not change between repeated measurements (a difference from a Bell test, where the detectors are reoriented between runs).

    From P1 and P2 it follows:
    P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.

    “*No, since the prediction about the ‘state of B’ requires the A measurement, which was not done before*”

    I don’t get this. Of course the prediction about B requires the A measurement. This is exactly the point. You measure A and the result allows you to predict B. You predict from A what the experimental record that will arrive from B contains.

    If you add the locality condition, that the measurement at A (which enables the prediction) did not cause any change at B, and B is nevertheless left in the -1/2 spin on Z state after the A measurement, then B must have been in the -1/2 spin on Z state even before the A measurement; otherwise there would have been a change.

    The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined.

    “*No again. The initial entangled state is predictively incomplete, and requires the A measurement to be done in order to make a meaningful prediction on the result of the B measurement;”

    Again, my argument does not deny that you need the A measurement to predict B, it actually needs that. So you didn’t show any problem with the argument.

    “without that, B’s result is fully random*”

    Without the measurement at A we cannot predict B. This does not make the B result “fully random”. We just can’t say anything about it.

    “*Predictive incompleteness goes with contextual inferences, that are purely quantum, but do not require neither (Bohm-type) nonlocality, nor superdeterminism. More details in https://www.mdpi.com/1099-4300/23/12/1660 *”

    Again, you didn’t show that any premise in my argument is wrong or unjustified. You simply reiterated P2 (the state of B is -1/2 after the A measurement). In order to refute the argument you need to deny P2 (or P1).

    “* The prediction was true, but B’s particle was not affected by A’s measurement whatsoever. Only the contextual inference by A is true”

    Well, if the B particle was not affected by A, but its state is -1/2 on Z after the A measurement, what state did B have before the A measurement? Any answer that is not -1/2 on Z would imply that the A measurement DID change B.

    “..and only be checked later, in a fully causal way.”

    Sure, but this does not concern my argument at all.

    “And it can be checked that no contradictions arise when inverting A and B*”

    This does not concern my argument either.

  301. Philippe Grangier Says:

    I noticed that Scott repeatedly refers to an 'exponentially larger state space for all of reality', a sentence taken from 2. in his introduction. I have a remark on that: for N qubits, the dimension of the state space is 2^N, but what happens when taking the (mathematical, not physical) limit of N going to countable infinity, aleph_0? It is well known (from Cantor) that 2^N then becomes uncountable (the cardinality of the continuum), and thus the Hilbert space is no longer separable, and is essentially non-manageable. So there is some kind of asymptotic instability of the Hilbert space structure, but what happens then?

    This is discussed in detail by von Neumann in http://www.numdam.org/item/CM_1939__6__1_0/ (free access), which is a kind of ancestor of operator algebras, GNS constructions, etc. What Johnny says is that the unmanageable Hilbert space 'blows up' into parts that are essentially disconnected from each other, and that are manageable (separable) again; in modern language, they would be called superselection sectors. These sectors are usually type III in the Murray-von Neumann classification, but in this 1939 paper Johnny is very careful about that, because the existence of type III factors was not even certain at that time.

    His paper is very technical (though he uses only very basic maths), but I really recommend that you read the first introductory section, which is quite illuminating. Interpreted in physical terms, it says that the (mathematical) limit of countably infinite N behaves essentially like classical physics, because the operators relative to different sectors all commute. Also, there is no need to specify 'all details' in each sector, this is unfeasible; but since the sectors are disconnected, a macroscopic label is enough to identify each of them, in a fully probabilistic approach. This is quite remote from MWI, but quite interesting I think…

    There are more details in https://arxiv.org/abs/2003.03121 (published in Found. Phys.).

  302. Yev Says:

    Isaac Grosof #65 and #139.

    I think you are making an implicit assumption that anisotropic microscopic rules must lead to anisotropic macroscopic dynamics. I agree that this is probably true for CGoL and other similar automata, but I don't see why it should necessarily be true for all automata.

    The closest thing to a counterexample that I know of is the Arctic Circle Theorem: you start with simple anisotropic rules (placing dominoes on a square grid), but end up with something that has rotational symmetry (a circle!). But it’s not really dynamical in a way you might expect laws of physics to be dynamical, and it doesn’t even have translational symmetry.

    Anyway, I think this is an interesting open(?) question. Can a cellular automaton be “approximately isotropic on large enough scale”, for some definitions of those words?

  303. Yev Says:

    async #143, Scott #262.

    I think both of you are missing a third possibility: you can have Lorentzian, Galilean, or Euclidean geometry. Greg Egan’s Orthogonal trilogy is awesome, go read it if you haven’t 😀

    Of course this doesn't really affect either of your arguments. You just have to change 1/2 to 1/3. And if you want locality then Lorentzian is still the only option.

  304. Dave Sutter Says:

    I want to add one more rule – there is no such thing as wave function collapse. Apparent wave function collapse, the Born rule and classical physics are implications of how the brain works. If we were instead observers outside the universe who could see “reality”, we would see…a wave function that never collapses.

    Why the complex valued amplitudes, unitary transformations and tensor products? As far as I know, just because.

  305. Philippe Grangier Says:

    To Andrei #300

    If you only consider the situation where the two orientations are the same (perfect correlation), it is well known that you can construct a simple local classical model : just assume that for each pair the two spins have opposite orientations along the same (random) direction, determined when they are emitted. But such a model obeys Bell’s inequalities (BI), so you just ignore the difficulty, which is that they are violated when considering other orientations.

    My whole point is to discuss what happens when the orientations are different, and when Bell's inequalities are violated (Clauser 1972, Aspect et al 1982, loophole-free 2015). Then the previous naive model fails, and one must consider what goes wrong in Bell's hypotheses, because Bell's reasoning is logically and mathematically correct. A good analysis was proposed by Jon Jarrett in 1984, splitting Bell's hypotheses into two parts, called (elementary) locality and predictive completeness. The same ideas were presented again by Abner Shimony (1986), but with a different terminology: elementary locality became parameter independence, and predictive completeness became outcome independence.

    The trouble with Shimony's terminology is that violating either of these two 'independences', which is required to violate BI, sounds like some form of non-locality. On the other hand, one can show simply that QM agrees with (elementary) locality, but violates predictive completeness, and in addition that the violation of predictive completeness has little to do with non-locality, but much to do with non-classicality. For instance, any deterministic theory must agree with predictive completeness, and thus must violate elementary locality in order to violate BI; a good example is Bohm's theory.

    As a conclusion, there is something special with QM non-determinism, that is to allow predictive incompleteness, and thereby to violate BI (in this discussion I ignored superdeterminism, that is another possibility for violation, but honestly I simply don’t like it). More details are given in https://arxiv.org/pdf/2012.09736.pdf
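
    (To make the different-orientation case concrete: for the singlet state with the standard CHSH settings, QM gives |S| = 2√2 ≈ 2.83, beyond the local bound of 2. Here is a short numerical check, just a sketch with the textbook angles:)

      import numpy as np

      psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # singlet state

      def spin(theta):
          # Spin observable along an axis at angle theta in the X-Z plane
          return np.array([[np.cos(theta), np.sin(theta)],
                           [np.sin(theta), -np.cos(theta)]], dtype=complex)

      def E(a, b):
          # Correlation <A(a) ⊗ B(b)> in the singlet state
          return (psi.conj() @ np.kron(spin(a), spin(b)) @ psi).real

      a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4   # standard CHSH angles
      S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
      print(abs(S), 2 * np.sqrt(2))   # 2*sqrt(2), beyond the local bound of 2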

  306. Anbar Says:

    Philippe #297

    It is a consequence of the fact that indeterminism and a lack of substance for realism (both consequences of the empirically observed complementarity plus the laws of thermodynamics) are tough pills to swallow, combined with the fact that you don't really need to concede on either of these points before fruitfully using the formalism, as long as you engage in a conspiracy theory of some kind because "interpretations".

    Now, I’m not saying that trying to derive a more fundamental principle than “it’s the simplest thing I can concoct, that does the job” is necessarily a fruitless exercise, but the real “issue” is with indeterminism and no substance for realism.

    Which brings us to Scott's theme, which seems to be whether "complex life as we conceive it" could also have arisen in a classical (non-complementary) Universe, and which leans quite strongly on the conspiracy side 🙂

    The answer “yes”, if possible at all, would imply the Ultimate Rube Goldberg device (made by the Ultimate Designer to satisfy a timeless urge?). Even if it was an option, the alternative (plain QM with indeterminism and no Objective Reality) requires incalculably less assumed complexity to start with.

    Why do I say this?

    Complex life “as we know it” requires organization on a large range of length scales.
    In a QM universe the overall complexity is bounded by the fact that the inward dive in length scales stops at a finite value, where the hydrodynamic approximation starts to fail and we find atoms and molecules with their quantized configurations and stable ground states.

    In order to mimic this in a classical Universe, bound states of elementary particles won't do, because of the un-quantized nature of classical configurations and interactions: low-frequency, high-amplitude fluctuations can unbind any bound system very quickly.

    You then end up essentially with an elementary particle for at least *every state of every atomic species*, with transmutation rules that are individually assigned (can’t even use any Lorentz symmetry, as the classical Universe must be non-relativistic), and the complexity explodes.

    The alternative, in which there is no end to the inward dive, seems doomed on thermodynamic grounds, much like with Lorentz symmetry and the UV catastrophe.

    Is this “proof” that the answer to the theme question is “no”? I don’t know, but it’s at least circumstantial evidence.

  307. Philippe Grangier Says:

    To Andrei #300

    If you only consider the situation where the two orientations are the same (perfect correlation), it is well known that you can construct a simple local classical model : just assume that for each pair the two spins have opposite orientations along the same (random) direction, determined when they are emitted. But such a model obeys Bell’s inequalities (BI), so you just ignore the difficulty, which is that they are violated when considering other orientations.

    My whole point is to discuss what happens when the orientations are different, and when Bell's inequalities are violated (Clauser 1972, Aspect et al 1982, loophole-free 2015). Then the previous naive model fails, and one must consider what goes wrong in Bell's hypotheses, because Bell's reasoning is logically and mathematically correct. A good analysis was proposed by Jon Jarrett in 1984, splitting Bell's hypotheses into two parts, called (elementary) locality and predictive completeness. The same ideas were presented again by Abner Shimony (1986), but with a different terminology: elementary locality became parameter independence, and predictive completeness became outcome independence.

    The trouble with Shimony's terminology is that violating either of these two 'independences', which is required to violate BI, sounds like some form of non-locality. On the other hand, one can show simply that QM agrees with elementary locality, but violates predictive completeness, and in addition that the violation of predictive completeness has little to do with non-locality, but much to do with non-classicality. For instance, any deterministic theory must agree with predictive completeness, and thus must violate elementary locality in order to violate BI; a good example is Bohm's theory. More details are given in https://arxiv.org/pdf/2012.09736.pdf

    As a conclusion, there is something special with QM non-determinism, that is to allow predictive incompleteness, and thereby to violate BI without explicit nonlocality. In this discussion I ignored superdeterminism, that is another possibility for violation, but honestly I don’t like it.

  308. B R Says:

    To be frank, I think the viewpoints espoused here are “too unitary”.

    More concretely, I would like to point out that the actual Schrodinger equation involves a *Hermitian* operator and not a unitary one. All the quantum-mechanical discussions that rely solely on unitary operators, and in particular anything to do with quantum computation and quantum information, are in my view based on an engineering viewpoint where somehow time evolution always happens in discrete steps (and Hilbert spaces are always finite-dimensional). They might be tremendously fun, useful, and may eventually produce real insights into quantum gravity, but I do not think the answer to your questions lies there.
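
    (Of course the two pictures are related: the Hermitian Hamiltonian generates the unitary evolution via U(t) = exp(−iHt/ħ), as a quick numerical check confirms (just a sketch, in units with ħ = 1); but it is the generator, not a discrete unitary step, that appears in the equation.)

      import numpy as np
      from scipy.linalg import expm

      rng = np.random.default_rng(3)
      A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
      H = (A + A.conj().T) / 2                # random Hermitian "Hamiltonian"

      U = expm(-1j * 0.7 * H)                 # time evolution for t = 0.7 under i d|psi>/dt = H |psi>
      print(np.allclose(U @ U.conj().T, np.eye(4)))   # True: exp(-iHt) is unitary whenever H is Hermitian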

    Instead, let us first realize that we need at least some physical input. Einstein also needed it, even in his derivation of special relativity: he required that all of physics, including the measurement of the speed of light, is frame-independent. One can ask if there is a similarly catchy phrase that will inevitably lead to all of quantum mechanics. The best candidate that I can come up with is one of reductionism: your theory of nature should not have an unnecessary ugly dichotomy between waves and particles. Instead it should put them both on the same footing. Good luck doing that with any classical theory! So this is where I would depart to deduce the apparently cherished inevitability of quantum mechanics…

  309. Michael M Says:

    When I retire, I need to make a point to really learn QM, and keep an archive of this blog.

    Apologies in advance if I am completely out of my element (applied mathematician). The only thing I have to contribute is at least in the spirit of the questions. Do we have implicit assumptions on the nature of the universe? Is it a state machine that can compute the ‘next state’ from previous? (Or some kind of async update since simultaneity is not true.) Is it a massive time evolving PDE? Is it something efficiently computable? What if it’s not?

    This is where I think we haven’t quite ruled out superdeterminism. I don’t necessarily care for it, but I also do not think it necessary to assume it has any intention behind the initial conditions. If the universe is a giant PDE, plus some global constraints it must satisfy, only certain solutions exist. One of those could be our world. Whether they seem non-physical to compute, goes more to our assumption that the universe is a computer and not a mathematical object. Personally I’ll sleep better at night if it’s not superdeterminism, but I think that’s a fundamental question worth asking.

    That said I completely agree that a classical or CA chemistry should be possible and someone should absolutely try to make one. I’d play the heck out of that simulator.

  310. Philippe Grangier Says:

    To Anbar #306

    You write : « Now, I’m not saying that trying to derive a more fundamental principle than “it’s the simplest thing I can concoct, that does the job” is necessarily a fruitless exercise, but the real “issue” is with indeterminism and no substance for realism. »

    The benefit of my approach is to exploit indeterminism in a smart way, and to save (contextual) realism, which is actually quite a decent realism from a philosophical point of view.

    And then : « Which brings us to Scott’s theme, which seems to be if “complex life as we conceive it” could also have arisen in a classical (non-complementary) Universe, and leans quite strongly on the conspiracy side. »

    Here I cannot tell, except with my previous restaurant metaphor, #174: if you have spinach on your plate, you may certainly ask 'why not a steak? Is the Grand Chef crazy?'. But my more modest tendency is to deal with the spinach.

  311. Ibrahim Says:

    (I am a software engineer who wrote two quantum circuit simulators for personal exploration)

    Are we as humans going to accept that reality is
    generated
    and generated through interactions of a few things
    and so only exists for that interaction, only for its scope
    and that is it and nothing more?

    Related to that,
    Are we going to realize that when we define a "state vector", we think that we deserve to look from "God's perspective", with all the variables placed nicely together, and that there can be, to our convenience, *one* representation to represent the whole?

    And that representation is a notation of precision, where amplitudes have precise and sharp values, such and such imaginary number, instead of "any value but not these".

    We are still trying to hold on to the chair we sit on: the tools, notations and experiences. With them we got this far. We need to go to the very bottom and go up from there, with a notation of "not set", "free to have such liberties", "not to be put together in one representation". That means a new journey with extreme humbleness.

  312. Andrei Says:

    Philippe Grangier,

    “If you only consider the situation where the two orientations are the same (perfect correlation), it is well known that you can construct a simple local classical model : just assume that for each pair the two spins have opposite orientations along the same (random) direction, determined when they are emitted.”

    This is true, but the main point of the argument is that while, as you say, you CAN build a local, deterministic hidden variable model, you CANNOT build a local non-deterministic model. So, the conclusion of the argument is that local indeterminism is falsified by this experiment. You cannot just ignore that and continue to discuss your model in the context of a Bell test or whatever other experiment. Local indeterminism is dead and buried by EPR-Bohm; we need to forget about it.

    “But such a model obeys Bell’s inequalities (BI), so you just ignore the difficulty, which is that they are violated when considering other orientations.”

    Here I disagree. Local classical models could in principle violate Bell's inequality if they are models with long-range interactions, like classical electromagnetism. The presence of such interactions makes the model contextual, since the state at A, the state at B, and the state of the particle source (S) are not independent (the whole system A+B+S has to satisfy Maxwell's equations). In other words, the hidden variable (which is determined by the state of the source at the time of emission) is not independent of the detectors' settings. This is the so-called superdeterministic loophole.

    “My whole point is to discuss what happens when the orientations are different, and when Bell’s inequalities are violated (Clauser 1972, Aspect et al 1982, loophole-free 2015). Then the previous naive model fails, and one must consider what goes wrong in Bell’s hypotheses, because Bell’s reasoning is logically and mathematically correct.”

    Bell's independence assumption is wrong for the simple reason presented above. Bell's model ignores that long-range interactions exist even between distant systems; he basically equates classical physics with Newtonian rigid-body mechanics. The independence assumption does not make sense in any field theory (classical EM, GR, fluid mechanics, and even cellular automaton models, which are in fact discrete field theories).

    “As a conclusion, there is something special with QM non-determinism, that is to allow predictive incompleteness, and thereby to violate BI (in this discussion I ignored superdeterminism, that is another possibility for violation, but honestly I simply don’t like it). More details are given in https://arxiv.org/pdf/2012.09736.pdf

    I think you dislike superdeterminism because you have a wrong understanding of it. But, like it or not, it’s the only local option on the table.

  313. Mateus Araújo Says:

    Scott #264: The problem is that you can’t even define what this “classical stochastic evolution rule” is. A “freely-willed transition rule” is even woollier.

    You are probably aware of the century-old difficulty in even defining what a probability is. We do have a good theory of subjective uncertainty, but that's not good enough to power the universe; we need true randomness. Do you have an answer? What does it mean to say that state A transitions to state B with probability 2/3 or to state C with probability 1/3?

    We are used to letting true randomness be simply an unanalysed primitive. We know how to deal with it mathematically (with the Kolmogorov formalism), and we know how to produce it in practice (with QRNGs), so we don't need to know what it is. But if you are writing down the rules that make a universe tick, that doesn't cut it: you do need a well-defined rule.

    The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds. And as I’ve argued before, you do need quantum mechanics to get Many-Worlds.

    Another argument, that I don’t find persuasive, but I think you will, is via the Chiribella-D’Ariano-Perinotti reconstruction of quantum mechanics. They have some axioms defining what a reasonable probabilistic theory is, and show that if in addition you require purification, that is, that randomness must be generated by the theory itself, then you end up with quantum mechanics. If you want a classical probabilistic theory then you have to let your randomness be external, like we’re used to in (classical) Hamiltonian mechanics.

  314. Denis Kuperberg Says:

    Scott #190: I was merely questioning whether this question is just a scientific one. Science can answer the "How?", but not necessarily the "Why?". The way you phrase the question makes me think that this is on the border of what we can expect science to answer for sure, with enough effort.
    Don't get me wrong: it might well be that, for instance, we find a unification of QM and GR in which the laws of QM appear logical and inevitable, and that would count as a satisfying answer to your question.
    But it could also be that the fundamental laws of the universe appear needlessly complex and arbitrary to us, with no deeper explanation. I just wanted to leave room for this second alternative.

    Granted, the quest for simpler explanations has been an efficient guide for us so far, but we might encounter a fundamental roadblock on this path; that was the only point I was trying to make.

  315. Danylo Says:

    A1. Classical physics assumes that the past is entirely known to a God. And the future is completely determined by the past. So, everything is predetermined. I find it meaningless.

    A2. The rules of QM are simpler in some sense. For example, they allow us to access the Church of the larger Hilbert space, which suggests that the complex things we observe are just projections of something conceptually simpler that resides in a higher dimension.

    Unrelated to QM, the set of complex numbers contains much more harmony (and beauty, though it’s subjective) than real numbers. They are not just R^2, as we can think of them.

  316. Mateus Araújo Says:

    Philippe Grangier #307: I don’t see how shuffling terminology around changes anything. Call it “outcome independence”, call it “predictive completeness”, call it contextuality. The fact of the matter remains that the probability of Bob’s outcome ‘b’ depends on Alice’s outcome ‘a’, which was produced in a space-like separated event. It’s nonlocal.

  317. Aristotal Says:

    On Q1: We can think of particles as wave-like excitations out of some vacuum ground state, similar to quasiparticles in condensed-matter systems. In our everyday life, waves also arise naturally out of perturbing any system… in that sense, a wave-like (and therefore, quantum) view of particles is much more natural than classical, “point-like” particles.

  318. David Shaw Says:

    Best speed with your latest exciting project! Personally I hope you develop it to book length.

    In trying to follow this debate I’m often struck by a certain asymmetry in how it is framed: we accept the naturalness of a classical world and try to think up reasons why quantum mechanics had to be added.

    For a moment, let’s allow ourselves to start with the intuition that complex numbers are a natural starting point (they’re algebraically closed after all!). Now you can still ask Q, but you might decompose Q1 and Q2 differently. Q1* might be “Why didn’t God just keep the universe complex and be done with it?”. Q2* might be “Why this special alternative? Why force measurements to be real numbers? Why the Born rule….”. This leads to many of the same questions and potential anthropic explanations. But can thinking about the question from both ends yield insight?

    Obviously, quantum mechanics is more than just complex numbers, but really my point is to push that any treatment of this subject also has to deal further with issues in the philosophical foundations of maths (even further than you already did in QCSD). Didn’t Gödel hole formalism beneath the waterline? But in attempting to axiomatize physics we seem to want to continue to believe! In seeing ℕ as somehow more ‘natural’ and less in need of special explanation, aren’t we taking sides on Q?

    I’d also highlight the point from antiquity about moving too quickly to talking about R as ‘really real’ numbers. When we go to school we learn to count, to measure things and to sample from distributions (play games). For all of these ‘experiments’ we only directly need the integers and the rationals (their fairly trivial extension). Even when I measure the side of a real-world triangle (using whatever physical unit my teacher indicates) the physical result is a rational number. Now when I’m taught the ‘theory of triangles’ to make sense of my physical results I have to start using numbers in R. Isn’t this already an important clue that the world of physical theories has to use a richer mathematics than that of our native perception? The ‘Ruler rule’ helps me jump between R and the rationals. Doesn’t the ‘Born rule’ just help me with another jump between number systems?

    Perhaps your new book should be titled ‘Quantum Computing since Pythagoras’?

  319. John van de Wetering Says:

    @Scott: You seem to say that you feel Q is sufficiently answered for relativity, so am I correct in taking that to mean that you would be happy with finding some set of physical principles which necessitates quantum theory? Because you could argue that Q is *not* answered for relativity: why would the speed of light be finite? That seems like a bit of an arbitrary choice as well, and I could easily imagine a complex life-bearing universe where causality works instantly.

    But assuming that finding good physical principles for quantum theory would indeed be sufficient: many such principles have been proposed (like the Hardy and Chiribella paper you mention), and as you rightfully say, these are not fully satisfactory.
    However the Chiribella reconstruction does bring up the purification axiom as a thing that differentiates the classical world from the quantum world: in a classical world it is possible to have *genuine* mixed states, where knowledge is inherently smeared, while in a quantum world any kind of mixture or uncertainty comes from not having access to the full system. This could probably be related in some formal way to “information must be preserved”, although I don’t know exactly how.

  320. John van de Wetering Says:

    Separate comment, cause separate idea.

    An anthropic argument: suppose Everett is right and, additionally, that our consciousness is classical. Then in our quantum universe there is an *uncountably infinite* number of classical consciousnesses existing on all sections of the universal wavefunction.
    So suppose there are many universes, some of which are classical (like the cellular automata you propose) and some of which are quantum-like (supporting superpositions of consciousnesses). Then the classical universes will only support a finite or countably infinite number of consciousnesses, while the quantum-like universes support an uncountably infinite number. Hence, probabilistically, you will always find yourself in a quantum-like universe.
    That only partially answers Q1 though, as it gives an argument for why you shouldn’t find yourself in a classical universe. However, as you say, Q2 has many partial answers, and there’s good reason to believe that once you disallow a classical universe, a quantum universe is a very natural choice for several mathematical reasons.

  321. Gadi Says:

    Scott, if these are the kinds of questions that interest you, don’t you think studying physics gets you closer to the answer than studying computer science and quantum computing? Studying quantum field theory, etc.?

    I can also ask questions about all the non-rigorous things in quantum field theory. Is there a formulation without renormalization? If not, then with which parameters does God actually run it? Do you realize that the current formulation of quantum field theory is far from being a computer program you can just postulate that God “runs”? That its mathematical consistency has been an open problem for many decades now?

    How deep does your understanding of physics and quantum field theory really go? Don’t you think you should get a very good understanding of it (not that I’m claiming I have it; I mean understanding it at least as well as the physicists at CERN who actually compute things with it) if those are the kinds of questions that interest you?

  322. Steve M Says:

    ** If we’re living in a simulation **

    Perhaps God is running an optimization using something like genetic algorithms and trying to optimize for one or more parameters that are important (for some reason). Statistically, we’re probably not the optimal solution, and when using something like a genetic algorithm for optimization, there are lots of offspring that have nonsensical values. Let’s hope they don’t notice and kill off this descendant while I’m still typing.

  323. JH Says:

    I’ve written reams and reams on this before.

    But basically, I think it comes down to causality. In classical mechanics, there’s a time dimension and particles are, in principle, distinguishable from one another. So you could take two electrons and trace their path backwards or forwards through time.

    You can meaningfully ask “Where was electron B two hours ago?”

    This is the change in quantum mechanics. There are only individual instants of time and there’s no causal connection between these instants, and no causal link between individual particles, making them interchangeable.

    You *can’t* meaningfully ask “Where was electron B two hours ago?” Because it’s not meaningful to compare electron B to any other electron.

    I’ve put it in detail in the last section here.
    https://thesmalluniverse.net/pages/QuantumMechanics.html

  324. Antoine Deleforge Says:

    Thank you so much Scott for asking this wonderful question and for the thread taking place here, which I am reading with fascination!

    My knowledge of QM is poor so I am afraid of embarrassing myself, but I can’t resist posting because something bothers me in this thread: Why are discussions on Q1 so much focused on cellular automata and their limits (anisotropy, problems with their implementations, how to get randomness, etc.)?

    Maybe I misunderstood something, but if one wants to make a universe that is classical but resembles ours, isn’t the most “obvious” try, much more obvious than using CAs, to simply replace complex amplitudes by nonnegative probability densities, unitary transforms by stochastic transforms, and see what can be done with this? In such a universe we would still have:
    -Elementary particles described by wave functions
    -Wave function collapse due to measurements
    -Tensor products
    -etc.

    but we wouldn’t have any of the “weird” quantum effects stemming from interference of amplitudes. Am I right? Would such a universe still be called classical? Then it seems to me that a very promising route to answer Q1 would be to figure out what would go wrong with THAT universe.
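
    A minimal numerical sketch of the proposed contrast (my own toy example, with made-up matrices, not anything from this comment): take a two-level system, replace the unitary by the analogous doubly stochastic matrix, and the ability to “unmix” by interference disappears.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: unitary, amplitudes can be negative
        S = np.array([[0.5, 0.5], [0.5, 0.5]])         # "classical Hadamard": doubly stochastic

        psi = np.array([1, 0], dtype=complex)          # amplitude vector, starts in state 0
        p   = np.array([1.0, 0.0])                     # probability vector, starts in state 0

        psi2 = H @ (H @ psi)                           # apply twice: amplitudes interfere
        p2   = S @ (S @ p)                             # apply twice: probabilities just mix

        print(np.abs(psi2)**2)   # ~[1. 0.] -- interference restores certainty
        print(p2)                # [0.5 0.5] -- stochastic evolution cannot undo the mixing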

    I guess this has already been studied before and would love to get pointers from knowledgeable people!

  325. JimV Says:

    Scott replied at 223, “I could be 100% satisfied with … there’s a Tegmarkian multiverse, with some classical universes, some quantum universes, and some universes with other rules entirely, and a-priori we could’ve been in any of them, but the deck was stacked in favor of our finding ourselves in a quantum universe for the following reasons”

    The odds are stacked because, like a water puddle fits the shape of the hole it is in, we evolved to take advantage of the way our universe works. The universe wasn’t made for us, we were made for this universe.

  326. Philippe Grangier Says:

    Andrei #312, so your main point is :

    « Local classical models could in principle violate Bell’s inequality if they are models with long-range interactions, like classical electromagnetism. The presence of such interactions makes the model contextual, since the state at A and the state at B, and the state of the particle source (S) are not independent (the whole system A+B+S has to satisfy Maxwell’s equations). In other words, the hidden variable (which is determined by the state of the source at the time of emission) is not independent of the detectors’ settings. This is the so-called superdeterministic loophole.(…) I think you dislike superdeterminism because you have a wrong understanding of it. But, like it or not, it’s the only local option on the table. »

    In my understanding of superdeterminism, the non-independence between A, B and S must come from their overlapping past light cones, in order to avoid a clash with special relativity. This possibility was already considered by Bell, but it implies that there are no more independent events, no more randomness, no more freedom of choice, since everything has been ‘written in the past’. So yes, I dislike this option, and no, I don’t think it is the only local one on the table. It is true that predictive incompleteness is not easy to grasp because of its fundamentally non-classical and contextual features, but well, it does the job. So coming back to your preferred (Bohm-like) option of giving up locality, I think that you have a wrong understanding of contextual randomness.

  327. JimV Says:

    Follow-up: also mildly stacked because some randomness and discreteness are necessary for a feasible universe, and QM is a way of providing that. Maybe the only or best way, but we have no way of testing that, and whatever proof we think we could make might have a counterexample in the hypothetical metaverse which we will never see.

  328. Dubious Says:

    Scott #261: I’ve hesitated to post this, because I think it adds nothing—even if it’s true, I don’t think it’s predictive or useful, and almost certainly non-falsifiable. But this comment suggests it, so I will!

    In one of your prior posts from a while back, I think you mentioned something roughly like “what kind of universe could exist where basic math differed greatly or was meaningless” (not a quote and I’d have to go back), but I had two thoughts. One was: what if you had a universe of (relatively) ‘godlike’ people to whom operations on infinite quantities were trivial, like 2+2 to us. Hypercomputing was just “computing”. Math on finite quantities might be anywhere between meaningless and problematic, or perhaps a specialized area of study.

    What if such a species were interested in whether some form of finite life were possible… how to actually do this might still require some cleverness. They would have infinite computing resources at no cost, but limiting the simulation might be one of the main problems … some finite speed limit, some way to prevent hypercomputing or unlimited-range information transfer, etc. But at the same time, they have infinite resources, and they (unlike quantum computing 😉) can check all possible outcomes to see if finite life happened.

    But, like I said… this is more the realm of speculative fiction than satisfactory answer; it’s still “just because,” with no way to tell in any case.

  329. Crackpot Says:

    CR Drost #298

    Why do you say that MWI implies a time-crystal? Is that what you are implying?

    Or are you suggesting that a given consciousness exists in a specific “world” both “before” and “after” a “split” (I hate these metaphors), and the idea that there is any “movement” during a measurement event is false?

    Because the former seems confused, and the latter seems founded on a misunderstanding of MWI. There aren’t literally distinct universes/worlds, but rather complex interactions of waves in a single universe: some sets of waves interact constructively and other sets interact destructively, in a way which results in the cancellation of large sets of wave interactions, and certain events can cause waves to split in ways which create new logical sets of waves, which just happens to kind of resemble multiple universes/worlds.

    (Of course, I may be misinterpreting you entirely.)

    Now, on a more crackpot note, personally I think it’s false that the different “worlds” don’t interact, and I think world interactions figure importantly into what entropy “really is” (which is to say, I think entropy is a measurement of how many histories a given physical system is compatible with).

    And on another crackpot note, note that what we call “time” is actually three distinct things. The first is the “tick rate” of the universe, or, alternatively, the conversion factor between space and time. The second is the “history” of the universe. And the third is the “dimension of change” of the universe. These may all be the same physical phenomenon, but they don’t actually need to be; you can have a closed (loop) dimension for time which functions as the “dimension of change” of the universe, for example, with the “history” of universe written on “distance” (in every direction, a la special relativity), and the “tick rate” embedded in the observed relative angle between the local orientation of the loop dimension, and the relative orientation of the loop dimension of another object. In this formulation, going backward and forward in the “dimension of change” is identical – traveling backwards in time is basically the same as going forwards in time, because time and history are distinct.

  330. Steven Evans Says:

    Q: Why should the universe have been quantum-mechanical?

    A: The answer is 0.1134.
    You cannot ask a calculator why it calculates the way it does. If we assume that reality is no more than these complex probabilities, then simply a bunch of calculations is taking place. So all we can do is calculate. Anthropic-like considerations lead nowhere. And if we ask why QM is the way to avoid nothingness, again, a calculator doesn’t ask or answer questions like that.

    All the calculator can do is provide us with details of the calculations; and how likely is it that they will contain some clue meaningful to a human, like the zeroes in the expansion of π arranged into a circle in “Contact”? It seems the universe really is just “shutting up and calculating”.

    When you type 0.1134 in a calculator and turn it upside down it reads “hello”. That’s all we are – a pattern on a calculator that says “hello”. Or asks “Why should the universe have been quantum-mechanical?”

  331. fred Says:

    Maybe the only question is what does it take to create consciousness.

    It would be paradoxical if consciousness was a purely classical process, something emerging from straightforward data processing, yet it would only be realized on digital computers that first have to emulate quantum mechanics or GR.

    I think it’s therefore more reasonable to expect that every observed fundamental mechanism of our reality (QM and GR) is a necessary ingredient for consciousness.

    By the Church-Turing thesis, such fundamental mechanisms can always be emulated using a digital computer, even if very inefficiently (there’s no requirement that consciousness has to be updated at every fundamental step of the simulation), so we’re always back to square one.

    This is why Penrose is trying to escape this by claiming that there must be some fundamental mechanism besides QM and GR that we have not captured yet, and that ingredient isn’t computable (saying that a causal element in our reality can’t be computed is the same thing as saying that it’s God magically changing the state of our universe from the outside… quantum randomness is an example of this).

    Considering the universe as a giant computation could be misleading and confusing, maybe it’s better to consider the universe as a giant static mathematical construct, i.e. all there is at the very bottom is a giant set of “nodes” with infinite ways to connect them, and some subsets of connections have certain properties (like self-similarity, which Douglas Hofstadter claims is a special ingredient of consciousness) that simply auto select them to give rise to consciousness.

    In the end, there’s no escaping the fact that, no matter how much we model the external reality we perceive, consciousness is the only fundamental element of reality we have access to. Trying to explain consciousness from abstract elements we model in our minds (like quantum fields and their vibrations) is never going to work; it just can’t.

  332. Philippe Grangier Says:

    Mateus Araujo#316 : « Call it “outcome independence”, call it “predictive completeness”, call it contextuality. The fact of the matter remains that the probability of Bob’s outcome ‘b’ depends on Alice’s outcome ‘a’, which was produced in a space-like separated event. It’s nonlocal. »

    Dear Mateus, you are going too fast; the main points are:
    – first, split Bell’s hypotheses into two parts (1) and (2) as said before; whatever names you give to these parts is just a matter of taste. Either (1) or (2) or both must be violated by QM.
    – second, consider that part (1), elementary locality or parameter independence, is really associated with the intuitive idea of an action at a distance, so its violation deserves to be called nonlocality. It is NOT violated by standard QM.
    – third, consider that part (2), predictive completeness or outcome independence, is NOT associated with the intuitive idea of an action at a distance, but rather with an inference at a distance, that can only be verified in a common future. Therefore, its violation does not deserve to be called nonlocality, and has no problem whatsoever with special relativity.

    Now you may claim that (2) is also a form of nonlocality; in that case it is only a question of how you define nonlocality. But my claim is that distinguishing clearly between (1) and (2) is quite useful, and it also leads to the interesting conclusion that the usual psi is (predictively) incomplete, as long as the measurement context has not been specified (see the article quoted before).

  333. Allan E Says:

    Hello! My speculative answer to Q1:

    In order for us to spout metaphysics about a Universe, it has to be possible in the first place. The rules of QM are just the rules of what phenomena are self-sustainingly possible at all. Now wave your hands and go possible => probable, self-sustainingly => something-equals-one and continue to justify via your favorite axiomatic QM framework.

    On that subject, I haven’t seen any axiomatizations of QM that directly generalize Polya/Jaynes/Cox – they all seem subtly different, and substantially more dry/boring.

    Good luck in your quest!

  334. Ted Says:

    Scott #286: Thanks so much, that clears everything up. I agree with you that the principle of local tomography is a much more satisfying desideratum than the principle that “subsystems’ numbers of real degrees of freedom combine multiplicatively” – especially since the latter principle is in fact false in QM, and counts of real degrees of freedom actually combine supermultiplicatively! The (much more awkward) correct statement that “((real degrees of freedom) + 1)s combine multiplicatively” is not directly interesting in itself, but is only indirectly interesting in that it gives us the principle of local tomography.
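
    To spell out that counting (standard parameter-counting for density matrices, in my notation, assuming “real degrees of freedom” means the number of real parameters of a normalized state): a density matrix on an \(n\)-dimensional Hilbert space has \(d = n^2 - 1\) real parameters, and for a composite system

    \[
    d_{AB} + 1 = (n_A n_B)^2 = (d_A + 1)(d_B + 1).
    \]

    For two qubits, \(d_A = d_B = 3\) but \(d_{AB} = 4 \cdot 4 - 1 = 15 > 3 \cdot 3 = 9\): the raw counts combine supermultiplicatively, exactly as stated, while the “+1” versions multiply exactly.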

  335. Chris Says:

    FeepingCreature #283

    You could certainly convince some observers that they are living in a classical world, simulated or not. The problem is that this impression is illusory; they’ll simply be mistaken. They are, in fact, living in a quantum world, even if at a step removed, not unlike how, for most of human history, we lacked the tools to reliably study atomic physics, and therefore had no knowledge of this domain beyond wild speculation. Having said that, your thought experiment lays a decent claim against Penrose-type arguments, which rely specifically on the biological substrate of brains and neurons and their supposed quantum dependence (assuming your experiment goes as planned with no conspiracy), but that angle never seemed especially compelling to me in the first place.

    At least as far as the mathematical universe hypothesis is concerned, consciousness motivates the phenomenon of superposition because you’d expect consciousness to supervene over a bulk of possible worlds. You don’t need to assume much about how consciousness works to get there; only that consciousness supervenes over physical phenomena. I imagine among most who are scientifically minded, this is the default assumption about how consciousness relates to matter. At the very least, it seems fairly unproblematic. The other required assumptions are packed into the MUH: that physical phenomena just are mathematical structures, and that isomorphic mathematical structures are numerically identical (i.e. there are no haecceities in the mathematical ensemble; there is only one dihedral group of order 8).

    >Anyway, my idea is that qm is just the simplest setup where a small seed state can give a sufficient variety of outcomes for life to exist. I think life will turn out to be so unlikely that the universe has to allow an unbounded continuum of branches to find some with life in it at all.

    A spatially infinite classical universe can provide the same diversity of outcomes. I’m not sure what is gained combinatorially by representing the universe as a vector in an infinite dimensional Hilbert space.

  336. Dax Says:

    I keep thinking about the Quanta article about the maximum precision of a clock given entropy, and wondering if there’s a way to turn that on its head and make it an axiom (except instead of being about physical clocks, it is a definition or condition of time itself), and out pops QM and maybe GR.

  337. CR Drost Says:

    @Crackpot #329

    If it helps to reify what I’m saying into a concrete example, we can do that.

    So suppose you have a qubit evolving under some Hamiltonian. The usual approach is to pick out a point on the Bloch sphere, track it over time. Your Hamiltonian is then a sort of flow field \(\vec v(\theta,\phi)\) on the Bloch sphere. Each point on the Bloch sphere is a “world” as far as MWI is concerned, and the insistence of MWI is that we should not be persnickety about “which state is it *actually* in”, the point that danced around the sphere following \(\vec v(\theta,\phi)\)… MWI says this is a mistake because it is not the qubit, but us, doing the dancing. We are following this point around on the sphere. The points are just there.

    The central conceit of MWI is that we can boil down our nonunitary dynamics to unitary dynamics if we just insist that our “hyper-Bloch-sphere” that describes all of the atoms in our bodies and the experimental apparatus and the quantum thing under study, each point on that state space feels a certain way. So then we ask which way we feel and we discover that we just have a superposition of ways that we feel, and they are isolated from each other. Just the standard unitary flow, the standard flow field induced by the Hamiltonian…
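
    To make the flow-field picture concrete in code (my own sketch; the choice H = (ω/2)σ_z and the starting state are assumptions, not anything specified above): evolve a qubit under the Schroedinger equation and track its Bloch-sphere point, which simply precesses about the z axis.

        import numpy as np

        # Pauli matrices, used to read off the Bloch vector (<sx>, <sy>, <sz>)
        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
        sz = np.array([[1, 0], [0, -1]], dtype=complex)

        omega, dt = 1.0, 0.1
        # For H = (omega/2) * sigma_z, the one-step unitary exp(-i H dt) is diagonal:
        U = np.diag([np.exp(-1j * omega * dt / 2), np.exp(+1j * omega * dt / 2)])

        psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # a point on the equator
        for _ in range(5):
            bloch = [np.real(np.conj(psi) @ (s @ psi)) for s in (sx, sy, sz)]
            print(np.round(bloch, 3))    # x and y rotate around the equator; z stays 0
            psi = U @ psi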

  338. Mateus Araújo Says:

    Philippe Grangier #332: I know, I read the previous comments. I think the unsplit, informal version of Bell’s 1976 definition is a good definition of locality: the probability of an event can only depend on events in its past light cone. In particular it cannot depend on events that are space-like separated from it. Both (1) and (2) violate this definition, and therefore both are some sort of nonlocality. I think it’s impolite to rename things around and insist that only (1) is a proper definition of locality. There are so many definitions of locality!

    That said, I do agree that (2) is a milder sort of nonlocality. If (1) was violated experimentally I would see no hope of reconciliation with relativity. With (2) there is hope, but it is by no means easy. When you go for a realist model of single-world quantum mechanics (naïve textbook realism, collapse models, or Bohmian mechanics) you get a flagrant violation of relativity. The only way I know how to do the reconciliation is with Many-Worlds.

  339. Job Says:

    When we say that classical physics is incompatible with complex chemistry, are we referring to a specific set of laws and equations that turned out to be insufficient? Or is any possible classical theory known to be insufficient?

    Classical physics can describe billiard balls, which is sufficient for universal computation. Then either a billiard-ball computer can efficiently compute states produced by complex chemistry, or it can’t.

    If it provably can’t, then we can settle BQP vs P right now.
    I’m guessing that, despite the claims, we can’t actually prove any such statement.

    Or is an eventual compatibility between classical physics and complex chemistry merely unlikely?

    Another thing that is worth mentioning is that classical machines can efficiently simulate any quantum system that is describable by certain non-universal quantum gate sets.

    That means that, in practice, classical machines can simulate even large-scale quantum interference; they will just struggle to do so for every possible quantum circuit.

    E.g. we have to throw a sufficient number of random quantum experiments/circuits at a classical machine to have it produce the wrong answer.

    Basically, proving that classical physics can’t describe a quantum system is not that easy?
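
    For what it’s worth, here is a brute-force illustration of the trade-off (my own sketch, not anything claimed above): a classical program can simulate a small quantum circuit exactly by storing all 2^n amplitudes, and the “struggle” is just that this array doubles with every qubit, whereas restricted gate sets such as Clifford circuits can be simulated efficiently (Gottesman-Knill) by tracking a compressed description instead.

        import numpy as np
        from functools import reduce

        # Brute-force statevector simulation: the state of n qubits is a vector of 2**n complex numbers.
        n = 3
        I = np.eye(2)
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        def kron_all(ops):
            return reduce(np.kron, ops)

        def cnot(control, target, n):
            # Build the 2**n x 2**n CNOT matrix by its action on basis states.
            dim = 2 ** n
            M = np.zeros((dim, dim))
            for i in range(dim):
                bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
                if bits[control]:
                    bits[target] ^= 1
                j = sum(b << (n - 1 - k) for k, b in enumerate(bits))
                M[j, i] = 1
            return M

        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1                              # start in |000>
        state = kron_all([H, I, I]) @ state       # Hadamard on qubit 0
        state = cnot(0, 1, n) @ state
        state = cnot(0, 2, n) @ state
        print(np.round(np.abs(state) ** 2, 3))    # 0.5 on |000> and 0.5 on |111> (a GHZ state)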

  340. fred Says:

    Rather than answering directly the question “why QM?”, we could try to inspect why the question even arises in the first place.
    Meaning that a deterministic universe built on QM is such that the effects of QM aren’t just direct and bottom-up (building up from vibrations in quantum fields to macro structures); alternative and subtle causal paths also spontaneously appear where the microscopic laws get amplified and expressed at the macroscopic level (as is the case with the question “why QM?”).
    This is not just true about QM in isolation, the entirety of physical mechanisms have to be considered together.
    E.g. the universe is such that the patterns in the brain of a cosmologist (such patterns are macroscopic structures), here on earth, are directly mapped onto the shapes of distant galaxies billions of light years away. This doesn’t happen by some direct effect of gravity or electromagnetism, but through what look like very unlikely “conspiracies” (the evolution of life and then the evolution of intelligence). The isomorphism between the cosmologist’s brain patterns and distant galaxies is very strong, yet very hard to explain without walking back the entire evolution of life on earth.
    This is often referred to as “the universe is such that it’s able to eventually look back at itself”.

  341. Anonymous Says:

    These questions venture into the realm of metaphysics. I have a lot to say on Q, but I will just leave a few nuggets here.

    – Consider that complex numbers, being algebraically closed, are more “real” than real numbers. The normal viewpoint is an accident of how human senses and classical measurement work. I would call complex numbers “full numbers” and real numbers “partial numbers”. So my answer here is “because the universe naturally uses full numbers”.

    – Fundamentally, I think we need to take Tegmark more seriously and consider “what is the ontology of mathematics” using physics and empiricism as pointers, rather than asking “what is the ontology of the observed world” and relegating mathematics as a mere tool. It is difficult to answer your questions using science, which doesn’t really address “why questions” if you keep asking “why” more than a few times. Metaphysics on the other hand, should be founded on mathematics — not a dead, static, utilitarian mathematics, but a mathematics that is alive, that moves, and which breathes fire into itself. Mathematics as Mind, and Mind as the primary ontological substance.

    – Briefly, the main problem of monism is to explain why things appear dualistic. I would suggest that this touches on a core aspect of QM — the Fourier transform. If everything is Mind (=Mathematics), there may still be two aspects of mind, which Kant called extended and thinking substances. Thinking (unextended) substance can be equated with the frequency domain (which consists of waves which extend infinitely), and the extended substances to the spacetime domain (which consists of infinity concentrated to finite time e.g. the Dirac delta function).

    – All of mathematics is dynamic, alive. Importantly, this includes zero. Zero is not simply a mark used on a ledger to keep track of apples. Zero is a process, and that process is akin to moving around a circle back to the starting point. There is continual non-0 displacement, but a net displacement of 0. The fundamental group of the circle is Z, the integers. The circle is what binds together the frequency and spacetime domains of the Fourier transform. Zero contains infinity and zero serves as the “unit” of the frequency domain. Within the spacetime domain of finitude, we have a dimensional unit 1, but that too is dynamic and alive…

    – These ideas are just suggestions, analogies. Simplified for brevity. If they make you uncomfortable, that’s good. Let them digest a bit.

  342. Tashi Says:

    I’ve read that an ankle will get weaker and more injury-prone after each injury. You can strengthen the stabilizer muscles and rebuild your proprioception by balancing on one leg for a little while, a few times a day.

    This comment, of course, contributes nothing to our shared understanding of the nature of the universe, so please feel free to delete it.

  343. Scott Says:

    Duh #219:

      Level-design that uses RNG can open the design-space much more widely than classical level-design. Without RNG, either all the levels look the same, or you have to hand-craft them all to be different in interesting ways. QM is just a way to make sure the universe looks interestingly different across its total configuration. Without it, there would be no level in which we evolved, so we wouldn’t be asking the question.

    <rant>

    While I’ve learned many interesting things from this thread and appreciate the many thoughtful comments, I’m also noticing a large number of commenters who seem to have a totally misplaced confidence in their answers to the “why QM?” question. As one example, every single answer (like the above one) that talks about randomness crashes and burns against the objection,

    “but then why not just classical randomness? if you didn’t already know that our universe involved complex amplitudes and unitary transformations and Hermitian operators, would you have invented all that stuff, just as a way to get randomness?”

    Unless, that is, this objection is explicitly confronted and rebutted: something some commenters (to their credit) have attempted with varying degrees of success but that others don’t even seem worried about.

    In general, when contemplating the “why QM?” question, I believe that a good rule of thumb is to remember the undergrads taking a midterm, who if you ask them to prove X, can produce paragraph after paragraph of verbiage whose tone is so self-confident that even a seasoned hand like me thinks, “well then they must really understand this, even if I can’t follow the train of reasoning” … until I turn the page, and I find paragraph after paragraph of equally self-confident verbiage proving a statement that’s false! 🙂

    I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?”

    </rant>

  344. Clark Van Oyen Says:

    You mention Einstein, and he was similarly burdened with the “does God play dice?” question; he assumed that “no” was the answer and subsequently turned out to be incorrect. The question you are asking is “why does God play dice?”, and perhaps also why those dice have a specific number of sides.

    I respect that as a QM expert you have the benefit of context for framing this question. I am wondering if this question will be similar to “which interpretation of quantum mechanics is correct (Copenhagen or many-worlds)?”. Do you feel this is the former or latter type of question? May it forever sit outside of experimental verification? Why?

    We expect Occam’s razor to hold. Perhaps more specifically we feel that excess information is wasteful. There are statistical arguments for that. But there are also plenty of examples where a simple pattern or system or order arises within an unnecessarily complex one. Could this be like asking why our intelligence runs on wetware instead of hardware? It would be more “efficient” and simpler for intelligence to run on silicon or countless other media. (Hopefully this claim isn’t too contentious or distracting.) The point being, systems evolve along a less direct path than design (the word you kept using) would imply. QM existed and then the universe evolved on top of it. I would assume QM is an evolutionary precursor to present-day reality.

  345. Liam Says:

    Quantum mechanics lets you discretize the state space without discretizing space. In particular, it lets you simultaneously preserve continuous spatial symmetries and the third law of thermodynamics (entropy at zero temperature is a finite constant) in a system with particles.

    So for instance assume you want to have something like particles, and you also want rotational invariance (you’ve said you are satisfied with Einstein’s justification of Lorentz invariance, so I assume you’re happy with taking continuous rotations as a given). Then if your ground state of hydrogen (or whatever your basic atomic building blocks are in your fancy new universe) is supposed to respect rotational invariance, but you also have a definite position for the electron (or whatever), then you can generate an infinite degeneracy of states by rotating this state. So the entropy is infinite. On the other hand, if you want your low-energy states to have finite entropy, you need states on which continuous rotations generate only a finite number of states; in other words, they have to form finite-dimensional representations of SO(3). So they have to be spherical harmonics, i.e. the stable bound states basically have to be waves. But when you isolate and manipulate (i.e. measure) their constituents, they look like localizable particles?
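
    One way to make the finite-dimensional-representation step explicit (standard angular-momentum notation, my addition): a rotation \(R\) acting on a spherical-harmonic state only mixes it with the \(2l+1\) states of the same \(l\),

    \[
    R\,|l,m\rangle = \sum_{m'=-l}^{l} D^{l}_{m'm}(R)\,|l,m'\rangle ,
    \]

    so continuous rotations sweep out only a \((2l+1)\)-dimensional space and the low-energy entropy stays finite, whereas rotating a state with a sharply localized electron produces a continuum of distinct states.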

  346. Philippe Grangier Says:

    Scott #343: « well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable? »

    This question brings me back to my previous restaurant metaphor, #174: given the dish cooked up for us by the Grand Chef, should we try to swallow what we got, or to get something else? To take a non-quantum example, what physics would we have if Michelson and Morley had measured a nonzero velocity of the earth through the ether? One can abstractly speculate on that, but physicists will certainly prefer to adopt special relativity, and to use it…

    To be clear, I don’t advocate « shut up and calculate », and I do think that « why QM? » is an interesting question. But my effort will be towards first getting physical principles based on empirical evidence, and then finding appropriate mathematics to manage them: like calculus for Newton, linear algebra for QM, tensors for GR. So which mathematics are we missing now? From #301, my best guess would be something like transfinite calculus, i.e. the proper simultaneous management of incommensurable scales.

  347. I Says:

    Scott #102:

    Two reasons for causation might be: it makes statistical inference tractable (i.e. check N^2 correlations instead of 2^N), as well as the whole SR thing where causation + other axioms leads to a finite speed of light and hence a local universe.

    As to your whole research agenda, would it be fair to phrase it as: “give an argument convincing a smart person, in a world which feels to them intuitively like we feel ours to be, that they are living in a quantum world. Further, this argument should be as natural as Einstein’s argument for SR.” In which case, isn’t that exactly what the GPT subset of quantum foundations was made to do?

  348. Peter Morgan Says:

    Scott #7³: While you say you “won’t understand” my work (#45), your rant is, I think, very, very close to what my work tries to unravel in detail. If we introduce classical randomness, how exactly does that differ from quantum randomness? If quantum randomness leads to measurement incompatibility, why do we think that classical randomness does not? What does classical randomness look like if we can find a natural way to include measurement incompatibility? If we introduce classical randomness and measurement incompatibility, what differences remain? (1) Quantum noise/randomness has a different spectrum from thermal, Gibbs noise/randomness; (2) classical dynamics, generated by the Liouvillian operator, is significantly different from quantum dynamics, generated by the Hamiltonian operator.
    To try to be clear, I never claim this is the right way to think about the relationship between CM and QM (at least, I try not to; I’m championing this because AFAICT there is no serious person championing it.) I think CM and QM, in their different formalisms and interpretations, are almost equally viable “pictures”, but with different advantages and disadvantages. I think Andrei (who I think must be Andrei K.) and Philippe Grangier have viable stochastic and contextual approaches that are problematic only because their formalisms are just that bit too distant from the standard Hilbert space formalisms of QM/QFT; I think ‘t Hooft’s, Elze’s, Wolfram’s, Hossenfelder & Palmer’s, Wetterich’s and other approaches to CAs and stochastic methods struggle to include measurement incompatibility as naturally as Hilbert space methods do (although ‘t Hooft and Wetterich discuss noncommutativity and incompatibility at length, it seems that their accounts have not in practice been clean enough to bridge the gap. On my tombstone, “His wasn’t either”.)
    If we think these struggles are rhetorical as much as they are substantive, then Koopman’s Hilbert space formalism for CM seems one natural way to try to approach the gap between them. I have found a Koopman approach to be more successful than the Wigner function approach to that gap, even though I think that is also perfectly viable as a way to understand the relationship between CM and QM. When one considers the quantum measurement theory and quantum probability theory literature as well as the Koopman-von Neumann literature, I have found it mathematically preferable to adopt an algebraic approach to Koopman classical mechanics, so that we can compare the algebraic, symmetry, and analytic structures of CM and QM with, I suppose, even fewer distractions.

  349. Tiberiu M Says:

    Hi Scott,

    Here’s my take on it. I am a strong believer in the principle of plenitude. This idea has been around for millennia, but Max Tegmark gave it a more modern spin recently, as follows (I’m also adding a bit of my own interpretation to it):

    The universe is a mathematical object. It’s just a bunch of interconnected equations that together form a unified mathematical object. And mathematical objects are basically just abstract objects that, like all abstract objects, exist on their own in this abstract realm (think Platonism). Therefore, all possible universes exist (by possible I mean that they can be reduced to an abstract object).

    The question now becomes “Why are we living in this particular, quantum-mechanical universe, and not in a classical one?”. Here it comes down to the number of intelligent beings living in each one. Due to the infinite superpositions of a QM universe (think of the infinite branching in the Everett interpretation), a QM universe is infinitely bigger than a classical one. Therefore, it is inhabited by infinitely many more consciousnesses. Therefore, you are infinitely more likely to find yourself in a QM universe than in a classical one.

    There could be a classical universe out there hosting intelligent life; we just don’t happen to live in it. The same idea can also explain why the universe is (probably) infinite in space.

  350. Crackpot Says:

    CR Drost #337:

    Apologies for the rudeness of my original reply, incidentally, it wasn’t intentional, and I realized how it came across almost immediately after I hit submit.

    As for the points “just being there” – I mean, kind of? The Bloch sphere is a representation, not reality (well, maybe it is reality, who knows, but that’s not what MWI itself is saying); the distance between the points isn’t geometric. It’s correct, insofar as the metaphor goes, to say that it is us (the observer) who are moving, but again insofar as MWI itself is concerned, we’re not moving to a geometric destination which existed before we moved there, and what measurements we perform limit what points even exist.

    Suppose a measurement is made inside an experimental apparatus, and the record of that measurement is then irrevocably destroyed so that the original result cannot be recovered and the superposition of the entire apparatus never collapses. We’re not confined to any position on the Bloch sphere corresponding to that destroyed measurement; those positions only come into being when a wave function collapse occurs. It’s not a physical space which we move around on; it’s a conceptual space representing possibilities, which can change depending on the evolution of the system.

    MWI doesn’t demand unitary dynamics, because fundamentally, it treats the probability wave as a wave, rather than being “about” probability at all; in the MWI framework, thinking in terms of probability (which when you get down to it is what unitary quantum mechanics is really all about, making sure the probability of finding a particle in a given location adds up to 1) is in a significant sense missing the point, because what probabilities are you even talking about? The particle isn’t “probably here”, the particle doesn’t exist, that’s the divergence point of the interpretation from pilot waves.

    I don’t know what you mean by “feels a certain way”.

  351. Crackpot Says:

    Scott #343:

    “I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?””

    – Personally, I think yes, because a lot of my crackpot nonsense started because I found QM entirely unsatisfying, and set out a couple of decades ago to “prove” that it was fundamentally wrong.

    I cannot overstate the disappointment I felt when I realized that my own nonsense said something like QM had to be there. At this point, I’m pretty sure that an important lesson to take from both pilot waves and MWI is that some form of QM is a fundamental feature of any physical system whose behavior can be described in terms of waves.

  352. JakeP Says:

    What if you assume that BQP is simply in P? Once we understand the algorithm that makes this possible, the “mysteries” asked about here will make sense to us. The Born rule, complex amplitudes, etc., will just fall out naturally from how this algorithm is structured.

    For example, at first we did not know efficient algorithms for calculating a Fourier transform on discrete data, but once the trick of the FFT was found, not only did it unlock technological progress, but it also spurred on more research and understanding of the related transforms and variants of that basic idea.

    If satisfactory answers to Q1/Q2 have eluded us for so long, perhaps it is slight evidence that there IS in fact an efficient algorithm for simulating a quantum circuit after all?

  353. fred Says:

    Crackpot #351

    “QM is a fundamental feature of any physical system whose behavior can be described in terms of waves.”

    It’s true that QM is an extension of the idea that everything is described by wave mechanics.
    Water and gas, of course, but even solids, which aren’t solid at all at the fundamental level but are better modeled by a grid of point particles connected by springs (even diamond isn’t solid at a short enough time scale).
    So to answer “why QM?” we probably first should ask “why waves?”, and the answer is that as soon as we have point-like objects within continuous space, and short range forces (springs, or local causal connections), we have waves.
    But I think that people who deal with QM in a computational context will say that QM is fundamentally about some manipulation of imaginary probabilities rather than wave mechanics (Schrodinger equation). But it’s just a matter of perspective. It’s no coincidence that complex numbers are also what’s used to describe waves in classical/plain electrical engineering (without any reference to QM).
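
    For what it’s worth, the phasor convention from classical electrical engineering that this alludes to (textbook material, nothing specific to this thread) packs a wave’s amplitude and phase into a single complex number:

    \[
    v(t) = V_0 \cos(\omega t + \varphi) = \mathrm{Re}\!\left[\,V_0 e^{i\varphi}\, e^{i\omega t}\right],
    \]

    so superposing waves of the same frequency reduces to adding the complex numbers \(V_0 e^{i\varphi}\), with no quantum mechanics in sight.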

  354. Etienne Says:

    I’m not competent to say much about QM, but my intuition aligns closely with that of Age bronze #188: if I had to speculate on God’s desiderata when designing the universe I would start with

    1. Discrete state space and discrete time,
    2. Some form of extremal action principle obeying some form of Noether’s theorem.

    If you try to do classical mechanics in completely discrete space and time you immediately run into issues where your variational principle no longer has a unique solution, motivating a probabilistic universe. (There’s also the matter of the destruction of Noether’s theorem). I might hope to show that complex probabilities, wavefunctions, etc. naturally arise as the only (or simplest) way of salvaging an action principle in a discrete universe—I think it is very telling that in QM the symmetries do not necessarily need to arise from continuous group actions—though I have no idea how to begin.

    (One can also ask why the universe’s Hamiltonian is just-so, though I feel like that’s a less fundamental question and may not have any particularly satisfying answer, beyond appeal to anthropic principles.)

  355. Dimitris Papadimitriou Says:

    I don’t think that there is a really satisfying response, especially for Q1, mainly because both classical and quantum physics are frameworks, not specific theories.
    Maybe one can imagine some classical (or even something neither classical nor quantum) alternative for a hypothetical world that has some kind of stable atoms, stars etc., but this is very speculative, and if “anything goes”, then go figure…
    If we assume that classical GR holds, then we have some good restrictions: a manifold, some energy conditions for the stress-energy tensor, etc.
    Such a world is probably dominated by black holes, almost exclusively. Everything, sooner or later, will be inside its Schwarzschild radius, so…
    On the other hand, if some assumptions do not hold (for example, if certain energy conditions are violated), then this hypothetical universe will have CTCs or timelike singularities, so maybe it will not make sense as a plausible alternative world.

  356. Scott P. Says:

    “The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds.”

    Maybe someone can answer this, but how does MWI deal with actual probabilities? Let’s say after a measurement there is a 1/3 chance that particle A is spin-up and 2/3 spin-down. As I understand it, MWI means that the universe/wavefunction branches at the point of measurement — in one branch A is spin-up, and in one it is spin-down. What then becomes of the 1/3? There are two branches, after all. What does it mean to have one branch be more probable than another?

  357. fred Says:

    Scott P. #356

    That’s one of the main difficulties of MWI, there’s no clear agreement among its proponents on how to deal with it (e.g. Sean Carroll has a lot to say on this).

    The way I think about it is that everyone agrees on what’s a binary split, a 50/50 branching.
    And then any other split can be decomposed into a series of 50/50 splits, with special hidden labels. So, to create a 25/75 split, you do two consecutive 50/50 binary splits, and you have 4 possible hidden labels (TT, FF, TF, FT) and assign TT to one branch and the other three to the second branch (FF, TF, FT, all three being indistinguishable except for those hidden variables).
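
    A tiny rendering of that labelling scheme in code (my own illustration; the labels are just bookkeeping, as in the comment): two 50/50 splits give four equally weighted labelled branches, and grouping them one-versus-three reproduces a 25/75 split.

        from itertools import product

        # Two consecutive 50/50 splits yield four equally weighted hidden labels.
        labels = list(product("TF", repeat=2))     # [('T','T'), ('T','F'), ('F','T'), ('F','F')]
        branch_A = [l for l in labels if l == ('T', 'T')]
        branch_B = [l for l in labels if l != ('T', 'T')]
        print(len(branch_A) / len(labels), len(branch_B) / len(labels))   # 0.25 0.75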

  358. Matt Leifer Says:

    I have avoided reading the over 300 comments above and will just give my own speculation here. Apologies if any of this has been touched on above.

    For me, the most interesting thing about quantum mechanics, from an abstract point of view, is that both the Church of the Larger Hilbert Space (CLHS) and the Church of the Smaller Hilbert Space (CSHS) exist – the theory can be developed from either point of view self-consistently – but they each give entirely different intuitions on what the theory is “about”. I would go so far as to say that an unconscious attachment to one point of view or the other explains a lot of people’s intuition about which interpretations of quantum theory are plausible.

    Now, what do I mean by these two churches? The central tenet of the CLHS is that quantum mechanics is a deterministic dynamical theory about a physical state (a pure quantum state) evolving in time under the Schroedinger equation. It is much like any field theory in classical physics, except that it has different rules for combining subsystems so the state does not live in “physical” space, and it has these weird rules for update upon measurement.

    An everyday member of the CLHS may accept the projection postulate as part of the theory, and use the Church as a practical guide, i.e. the tendency to purify every mixed state, view every CPTP map as a unitary followed by tracing, and every POVM as a projective measurement on a larger space. However, true devotees would embrace a no-collapse interpretation, such as many-worlds or a hidden variable theory like de Broglie-Bohm, removing the fundamental status of the measurement postulates and explaining them in terms of something emergent (decoherence and branching) or something more fundamental (the hidden variables). This can be done to a large degree of success. There are of course still debates about the extent to which all problems are solved in many-worlds or Bohm, but for the most part people agree that these interpretations work, even if they do not think they are correct.

    In contrast, the CSHS starts from the premise that quantum mechanics is a generalization of classical probability theory. Something weird happened to the physical quantities that the theory is about – they became noncommutative. Given this, you formulate the theory as the most natural generalization of probability theory that you can think of for such a set of variables. This basically gives you quantum mechanics, via Gleason, Wigner, etc.

    CSHS practitioners typically prefer not to purify their mixed states. Since classical probability distributions do not have purifications, you will miss analogies to classical probability if you purify as your first step, so they prefer to stick with the original “small” Hilbert space. True believers note that the collapse of the state upon measurement resembles the change in a probability distribution when you apply Bayesian conditioning, so they want to view the quantum state as something “probability-like” (whatever you think classical probabilities are). This leads to psi-epistemicism and Copenhagenish ideas.

    Note that I am not talking about the debate between psi-ontic and psi-epistemic here. The Churches are to do with the way you go about constructing quantum mechanics, so are prior to that debate. Of course, CLHS does tend to predispose you to the psi-ontic point of view and CSHS to the psi-epistemic point of view, but that is derived from the fundamental tenets of the churches, not presumed at the outset.

    From the CLHS point of view, it is weird that the CSHS exists and vice versa. Most hypothetical physical theories do not allow both points of view to coexist. For example, classical probability has no CLHS because probability distributions cannot be purified.

    If you start from the CSHS point of view then you will be led towards frameworks like generalized probability theories (GPTs). Most GPTs do not have a CLHS. In fact, the Chiribella et al. axiomatization can be understood as starting from the CSHS view and imposing that the CLHS must exist (plus a few other things). So this is evidence that theories admitting both points of view are rare. You certainly cannot have PR-boxes and much superquantum stuff in such a theory.

    But this does not really resolve the question. From the CLHS point of view, you would not generalize the theory by moving to the GPT framework because you don’t believe quantum theory is a generalization of probability in the first place. Instead, you would be more likely to alter the Schroedinger equation or maybe the tensor product rule. Your generalized framework would be to start with an arbitrary differential equation and a composition rule, and then try to see what is needed for a branching structure to emerge or for a satisfactory measurement process to be explained by hidden variables.

    Now, far less work has been done on this sort of framework, partly because CLHS advocates don’t seem to feel the need to axiomatize quantum theory, but are rather content if they can convince themselves that the measurement postulates emerge just for quantum mechanics itself. However, if one did develop this framework, I would be willing to bet that most theories in the CLHS generalized framework would not admit a CSHS.

    So we seem to be in a special place where we have a theory that admits both CLHS and CSHS points of view and the theory can be self-consistently developed from either point of view. My version of Q is: why is this so? Can the two churches be unified? Is there a meaningful physical principle which explains why we need to have both? If you can answer that, then we can let Chiribella et al. do most of the legwork after that, modulo exploring the theories that might violate some of their other more technical axioms.

  359. A. Karhukainen Says:

    Scott: “By my age, Einstein had completed general relativity, Turing had founded CS, etc.”
    Well, but at that age Leibniz hadn’t yet written his Monadology.

    And regarding the comment #8 by Rahul: “To answer Q1 do we have to first believe that God exists?”
    I comment further: “Or do we have to believe that God is perfect?”

  360. Tu Says:

    Request for help and/or clarification.

    Can someone who is a many-worlds person, or Everettian if you prefer, explain to me why the many-worlds ontology is so appealing or evident to you? I am looking for someone who is a die-hard, every-branch-is-equally-real many-worldser.

    Was there one moment where it all clicked for you? Do you believe that there is an uncountable infinity of other branches of the wavefunction that are equally real, equally extant?

    Do I just lack imagination, or do I just not get it? From my perspective, when someone says something like, the only thing that really exists is the entire wavefunction of the universe, it feels like the word exists is doing a little bit too much work. That is, if things “exist” that cannot, even in principle, interact with “our branch” of the wavefunction, why say they exist?

    So yeah, I think nobody is home in the other branches of the wavefunction in a way that is physically meaningful to me. I think the situation remains mysterious. For the time being, I would rather contemplate and marvel at the mystery, rather than emphatically proclaim victory in the form of a massively wasteful and exorbitant ontology.

    What am I missing?

  361. Lorraine Ford Says:

    Quantum mechanics and free will seem to be somewhat like cellular automata and other computer programs in the sense that individual steps are taken (representable as numbers assigned to variables) in response to the individual situation a cell or an entity finds itself in. They are all about taking steps.

    This is very different to the experimentally verified laws of nature which, despite the delta symbols, are static relationships; the laws of nature are not steps.

    Steps are a different type of thing to relationships, but seemingly a system needs both relationships and steps.

    But it is only in computer programs that the individual steps become rules. With quantum mechanics, and with free will, the individual steps taken are not rules.

  362. Mateus Araújo Says:

    Scott P. #356: It’s simple. After the measurement, in one third of the worlds the particle is spin-up, and in two thirds of the worlds it is spin-down.

    This “third” comes from finding a way to count worlds. It’s easy to find a rule that agrees with the Born rule: just define the measure of a world to be its 2-norm squared, and the relative measures will agree with the probabilities. There are plenty of formal arguments for deriving this measure in the literature, but I think the strongest argument is that it fits the data.

    Now, you propose a different measure: that we should count equally each set of worlds that share the same measurement result. As you noticed, this has the fatal flaw of contradicting the data. It’s also ill-defined: these measurement results are just one decoherence event we chose to pay attention to. There are plenty of decoherence events happening all the time, everywhere. To actually count each decohered branch equally we would need to take them all into account. It’s clearly a hopeless proposition, and to the best of my knowledge nobody has even tried to do that.
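    (A minimal sketch of the world-measure just described, using the 1/3 vs 2/3 spin example; nothing here goes beyond the Born rule itself.)

      import math

      amplitudes = {"up": math.sqrt(1/3), "down": math.sqrt(2/3)}

      # Weight each branch by the squared 2-norm of its amplitude.
      measures = {branch: abs(a) ** 2 for branch, a in amplitudes.items()}
      total = sum(measures.values())
      for branch, m in measures.items():
          print(branch, m / total)        # up: 1/3, down: 2/3

      # The naive "count each measurement outcome once" rule would give 1/2 vs 1/2,
      # which is exactly what contradicts the data.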

  363. Crackpot Says:

    Scott P. #356:

    I see the problem as this: the idea that “you” have a singular position in the amplitude is contradicted by the existence of superpositions. You aren’t an infinite collection of identical people who diverge along different paths; you’re all of them, and that’s what a superposition is (if they were actually distinct, we wouldn’t observe superpositional behavior). So the natural answer, that it’s just a question of which “you” you find yourself being, doesn’t actually illuminate anything, because in the given example there are still just two of you.

    Personally, on the crackpot side of things, I think “you” are more like a subset of the superposition: there’s no single slice of the superposition that is “you”, but at the same time “you” are not, in fact, the entire superposition, and the superposition is actually quite “fuzzy”; there are very slight differences between different “you” clusters in the superposition, which are still coherent enough not to entirely cancel out. This may or may not resolve the problem. An alternative version, in which we simply posit the existence of multiple “you” clusters which aren’t fuzzy and are perfectly identical, may help resolve the original version of the question, namely why two possibilities shouldn’t always have a 50/50 split in terms of experience (the amplitude being merely amplitude, after all), while still preserving superpositional behavior: you’re not the entire amplitude, you’re some portion of it, and thus if 2/3 of the amplitude goes in one direction and 1/3 of the amplitude goes in another, you observe those probabilities in your personal experience.

    That is, we only get the problem if we assume that “you” are the entirety of the superposition; as soon as we assume “you” are only a subset of it, the problem disappears.

    Also, viewing it as branches is, I think, seeing the operation somewhat differently than it actually is. A measurement happens; the waveform “splits”. You yourself don’t split until you observe the measurement, and you don’t necessarily stay split; if the physical reality is identical at some point thereafter (nothing would be different in either case), then the “branches” have re-merged, because the waves have not actually diverged such that the different possible wave states cancel out (because, if we suppose there is no difference in the wave states, there’s no difference to cancel out), and you’re back to a superposition of either measurement. In practice this doesn’t matter at all (or at least I can’t think of a case where it makes any difference), but I think it matters to understand the interpretation, that there isn’t actually a separation between the “worlds”, it is purely a phenomenon of wave interference / cancellation.

    I’m also going to kind of agree with fred in that every case I’ve tried to work out has fundamentally been some series of 50/50 splits (and recombinations); that said, the fact that I can’t come up with a scenario that isn’t a 50/50 split isn’t evidence that one is impossible, but that ties into other crackpot stuff that isn’t actually relevant to the question. However, I’ll disagree on the splits persisting for the fundamentally-identical cases, because if there isn’t a physical difference – if the wave functions are identical – then the waves go right back to being a superposition and interfering with / canceling each other out.

  364. Crackpot Says:

    fred #353:

    Yep. So if the universe is built out of waves, something like QM. If the universe is built out of particles – well, also something like QM.

    If energy is quantized, we’ll observe quantization. If energy isn’t quantized – well, we’ll still observe quantization, at least in any universe with relativity. To explain, insofar as an observer presupposes a finite set of probable small-scale stable configurations, which is to say, insofar as there’s something LIKE predictable chemistry, then there is a finite number of normal-conditions expected quantities of energy. In any universe with relativity, which is to say, any universe with a finite maximum speed, things on small scales “run faster” than things on large scales, such that only state changes between stable configurations can even be observed. (Observe that an observer whose atoms are the size of galaxies would have a scale-relative timescale incomprehensibly slower than our own, and wouldn’t be able to observe, for example, stars, which are far too short-lived and unstable to be observed on those timeframes).

  365. Maksim Zhuk Says:

    I’m not a scientist, so I can’t add to the questions you asked, though sometimes I think about them. For some time I have thought that all metaphysical and scientific theories share similar patterns. The ancient Greeks and modern physics have many similar ideas and topics (discrete and continuous, atoms and energy, etc). Maybe our brain is wired in such a way that we can think about reality only with these ideas. Maybe for deeper understanding we need to upgrade our brain with genetic engineering or cybernetics. Penrose believes that our brain could do super-Turing computation, but I’m not even sure that our brain is Turing-complete. Artificial neural networks aren’t generally Turing-complete. Has anyone proved that the human brain is?

  366. Anbar Says:

    Scott #343

    – I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?” –

    No, because not(QM) was already logically incompatible with the 19th century experiments establishing Maxwell’s equations. Not sure how far back you need to go in terms of empirical evidence before classical explanations start requiring Rube Goldberg concoctions, but I would guess not much.

    Same goes for indeterminism: once it is observed that there is this intrinsic randomness in microscopic phenomena (as required to “make sense” of complementarity), trying to embed this randomness in a classical framework simply adds another substrate whose only purpose is essentially to negate the sentence “this intrinsic randomness is fundamental”. Why should one entertain that hypothesis?

    Am I a victim of this self-confident delusion you mentioned, and missing something obvious?

  367. mjgeddes Says:

    Einstein always said that the fundamental physical reality of QM should be expressed in terms of some kind of *physical* spacetime, not abstract spaces. But what space-time? Not the classical kind. So the closest thing in spirit to what Einstein wanted has to start with the phase-space formulation of QM and find a physical geometry, a *non-commutative geometry*. So I would ask: what natural *physical* principles related to this kind of geometry could explain QM?

    I’d say that Alain Connes was probably on the right track; he’s the one that’s tried to develop non-commutative geometry since the 80s, but the problem is: where are the underlying *physical* principles to motivate it? Without these, the task is hopeless, analogous to someone trying to find the math of general relativity without knowing any physics.

    This is the mistake of nearly all the commenters in this thread; one simply cannot hope to understand QM merely by shuffling math symbols or firing off vague verbal ‘interpretations’ of abstract non-physical concepts like ‘wave functions’, one must obtain the underlying *physical* principles, expressed in terms of *non-commutative geometry*.

  368. Mateus Araújo Says:

    Tu #360: For me it was when I realized that Many-Worlds is what you get when you just take what the Schrödinger equation says as literally true, and stop torturing it with an unphysical and ill-defined collapse. It got reinforced when I was taking a lecture on QFT and realized that the high-energy people simply ignore collapse, for them the theory is completely unitary. Obvious in retrospect: for them relativistic effects are crucial, and how could they ever reconcile that with a nonlocal collapse?

    I could either embark on mental gymnastics, as in orthodox quantum mechanics, or believe what the math was telling me. The choice was clear.

    And yes, all branches are real. There’s nothing in the math to differentiate them. The Bohmians like to postulate the existence of some invisible pink unicorns bringing the magic of reality to only one branch, but that’s just ridiculous. At least they realize that this is what it takes to deny the existence of the other branches.

  369. Scott Says:

    Daniel Varga #221:

      I design a deterministic cellular automaton, and agents in it. The agents can not observe their environment without altering it, which gives rise to some version of Uncertainty Principle, which in turn forces them accept a quantum theory of physics. I am NOT saying that the agents are wrong, and their world is “really” a hidden variable world. From their perspective, they know everything about their world that’s ever knowable, and it is quantum. But I, who designed their world, have a different, very valid perspective.

    My criticism is specific and technical: I see no reason to imagine that you can actually get QM that way. It seems to rely on a fuzzy notion of “the uncertainty principle” that was lifted from a popular magazine article, rather than the actual quantitative statement about complementary observables. But forget about the uncertainty principle: how would you explain Bell inequality violations this way, without having to resort to superdeterminism, thereby curing your headache by guillotine? How would you explain Shor’s or Grover’s algorithms??

    The whole point of all these discoveries is that QM does not act like a classical CA to which you only have fuzzy, incomplete access. Maybe that was an honorable first guess, but you still have to discard the guess now, because it makes clear predictions and those predictions are wrong.
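    (To put a number on that gap: a small sketch, using the textbook singlet correlations and the standard CHSH angles, of the value that no deterministic local-hidden-variable assignment – and hence no classical CA seen through merely fuzzy glasses – can reach.)

      import itertools
      import numpy as np

      # Classical side: brute-force every deterministic local assignment
      # a0, a1, b0, b1 in {-1, +1}; mixtures of these can do no better (convexity).
      best_classical = max(abs(a0*b0 + a0*b1 + a1*b0 - a1*b1)
                           for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4))
      print("local hidden variables:", best_classical)         # 2

      # Quantum side: singlet-state correlation E(a, b) = -cos(a - b),
      # evaluated at the standard CHSH measurement angles.
      def E(a, b):
          return -np.cos(a - b)

      a0, a1 = 0.0, np.pi / 2
      b0, b1 = np.pi / 4, -np.pi / 4
      S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
      print("quantum (singlet):", abs(S))                       # 2*sqrt(2), about 2.83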

    Again and again in this thread, I’ve admitted what I don’t know. But there’s one part I feel absolutely certain about, and that’s that the right approach to QM is to listen to what Nature says rather than dictating to her what she must’ve meant to say.

  370. Cleon Teunissen Says:

    Scott, in comment #228 you mention how during the 1800’s physicists could try and address deep questions such as “Why is mechanics time-reversible?” “Why does mechanics satisfy a least action principle?”

    (The following is tangent to the question raised in your blog post, but it may catch your attention.)

    Interestingly, the reason why Hamilton’s stationary action holds good can be demonstrated using means that were already available during Hamilton’s time. That is, the fact that Hamilton’s stationary action holds good can be explained entirely in terms of _classical mechanics_. Links are at the end of this comment.

    To avoid misunderstanding: I confirm of course that all of the phenomena of classical mechanics emerge from the quantum world. In that sense the entire body of classical mechanics is accounted for in terms of QM.

    I am aware of course that the claim that Hamilton’s stationary action can be understood _classically_ is an unexpected one. Your _expectation_ is that Hamilton’s stationary action comes from QM.

    I am aware: If a claim is highly _unexpected_ then the demonstration will have to be low friction, very accessible. (Conversely, if the demo would be opaque/dull then most likely the reader will dismiss it.)

    For the goal of vivid demonstration I have created a set of interactive diagrams. The diagrams have sliders, moving the sliders allows the visitor to explore effects of variation. As the trial trajectory is modified: the diagram shows how various values respond to that.

    I hope I can persuade you to check it out:
    Available in two locations:
    On the physics forum site physics.stackexchange:
    https://physics.stackexchange.com/a/670705/17198

    On my own website:
    http://www.cleonis.nl/physics/phys256/energy_position_equation.php

    (The version on my own website has the fully functional diagrams. The version on physics.stackexchange has animated gifs that are composited from screenshots.)

    (The link to the stackexchange version is presented here to show that this material has been vetted by others. I’m acutely aware of your ‘claimed mathematical breakthroughs list’.)

  371. Scott Says:

    fred #231:

      So, if there’s a way to show that QM is necessary it’s probably by looking at even more fundamental ideas that are hard to dismiss, like entropy, conservation of information, fundamental symmetries, the fact that spacetime has 3+1 dimensions, etc.

    I agree! That’s exactly what was done for many other aspects of physics that seemed unmotivated at first, such as (famously) the Lorentz transformations.

    QM, however, seems a lot more fundamental than the 3-dimensionality of space (certainly the string theorists regard the latter as a mere emergent detail 🙂 )

  372. Scott Says:

    fred #235:

      Why do you assume that two theories that work separately can always be reconciled?
      Isn’t it possible that the transition from one mode to the other involves processes that just can’t be described by some compact mathematical relation?

    Whether or not that’s possible, when it comes to QM and special relativity it’s not true! We know that they can be reconciled because they were.

  373. Scott Says:

    drm #238:

      A couple of dumb questions from a biologist:
      1) does QM require infinite precision for unitarity, etc. or do all of those irrational factors of sqr of 2 and pi take care of themselves?

    The short answer is that trying to eliminate irrational numbers from QM would make it horrendously less elegant. Furthermore, it follows from basic trigonometry that if you want your angles to be rational (or even rational multiples of π), that will typically make your lengths irrational; if you want the lengths to be rational that will typically make the angles irrational.
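    (A small numerical illustration of that trigonometric trade-off, leaning on Niven’s theorem; the specific angles below are arbitrary examples, not anything canonical.)

      import math
      from fractions import Fraction

      # Niven's theorem: if theta is a rational multiple of pi, then cos(theta) is
      # irrational except for the special values 0, +/-1/2, +/-1 - so a "rational
      # angle" generically forces irrational matrix entries (lengths).
      for frac in [Fraction(1, 4), Fraction(1, 5), Fraction(1, 7), Fraction(2, 7)]:
          theta = math.pi * frac
          print(f"theta = {frac}*pi -> cos = {math.cos(theta):.12f}, sin = {math.sin(theta):.12f}")

      # Conversely, a rotation with rational entries (3/5, 4/5) has an angle that is
      # an irrational multiple of pi, so rational lengths force an irrational angle.
      theta = math.atan2(4, 3)
      print(f"3-4-5 rotation: theta/pi = {theta / math.pi:.12f}")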

      2) Is Bell’s non-local condition equivalent to the older (I gather) notion of contextuality?

    No, they’re different (although there are some mathematical connections between the two). To talk about measurement contextuality, for example, you don’t need any tensor products or spatial separation between Alice and Bob.

  374. Clinton Says:

    Scott #288:

    I completely agree with all of your first point which I copy here below and regret that I gave you any other impression:

    “… if someone in the 1800s had asked “why is classical mechanics true? why does it have
    the features it does?”, with hindsight that would’ve been one of the best questions they
    could possibly have asked! Because it would’ve had nontrivial answers! Albeit answers
    that were only discovered later. For instance: Why is classical mechanics time-reversible?
    Why does it satisfy a least action / Euler-Lagrange principle? The answers would come from
    QM. Why does the gravitational force fall off like 1/r2? Why are gravitational and inertial
    mass the same? The answers would come from GR. In other words, there really were deeper
    principles waiting to be discovered (deeper principles expressed, yes, using math). So, your
    thought experiment strikes me as supporting optimism, rather than pessimism, about the
    search for deeper principles underlying QM! Having said that, there’s an immense irony here:
    physicists were ultimately able to explain classical mechanics in terms of deeper theories, in
    large part because they discovered that classical mechanics wasn’t exactly right. The
    corrections were what led them to the deeper theories from which classical mechanics was
    then recovered as an excellent approximation.”

    On your second point, if we change it as follows:

    “If (as you seem to think) QM isn’t exactly true – may yet prove to be an
    incomplete model in the event of new evidence – just like classical mechanics
    wasn’t, then we should ultimately be able to explain QM in terms of something
    deeper.”

    And with that change, I think you and I both agree that this would be the optimistic outcome if evidence were to show QM incomplete AND we found a more complete model.

    To your third point:

    “If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of
    anything deeper (but we can still try!).”

    We both have a fear. But our fears are different. And this has been my point.

    The Scott Fear:
    Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.

    The Clinton Fear:
    Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.

    I fear that one of these nightmare scenarios below may be the truly horrifying answer to your Q.

    (A) We have descended into a valley of mathematics or landed on an island in theoryspace, from which we CANNOT construct the mathematical tools required to leave. In other words, WE ARE IRRETRIEVABLY TRAPPED (unlike what you and I feel would be the optimistic scenario in your second point) in a local minimum on the theoretical landscape.

    (B) Or, worse, the universe is not fundamentally mathematical at all – and so obviously cannot be QM. It is something … beyond mathematics – whatever that would even be. Maybe mathematics is just an … emergent property … or an evolved part of cognition. Yes, it works WITHIN the universe … but it doesn’t capture the “deeper” universe. I presume you are presuming that we will go about our project to understand reality by presuming that we WILL use mathematical reasoning. But I think once you do that then you find yourself ending up eventually at QM. Why? Well, the basic notions of symmetries that come with mathematics, how that leads to group theory, the normed division algebras, makes complex numbers the special case, and how if you say “I want to make a predictive model” … well, then you just assumed we were going to be using a probability model … and so … you get QM (see my first post). And, yes, I know that questioning the use of math makes you want to “howl into the dark” because … well what else are we supposed to do?! It is what works! But … OK … I’m telling you what a horrible truth this would be … what I’m AFRAID of … And that would be to KNOW that the only thing we know that works … is NOT and CANNOT be the way to understand the fundamental nature of reality.

    (C) Or, maybe worst of all, we are being tricked by our own neural model of computation. The very thing we rely upon to know and think anything at all is just generating an elaborate cognitive deception. Neuroscientists are now generally arguing that what we take to be reality, free will, a sense of self … are sophisticated computational illusions generated by the brain. Yes, I know, physics and philosophy have been onto the idea that we should doubt our sensory experiences for a long time … But I’m not just talking about doubting our sensory experiences … I’m talking about doubting the very computational model we are using to generate those sensory experiences, the logic of our very thoughts, the computational model itself. And I am NOT encouraged that neuroscientists report that the brain encodes complex numbers, that the brain represents states in vectors of complex amplitudes, that normalization is canonical in the brain, and that linear operators are standard in the brain – all of which sounds like a vaguely familiar model of computation. I would feel much LESS worried about this possibility IF the neuroscientists reported that the brain looked anything like a binary model of computation. But it doesn’t – not at all. I would also like to think optimistically that the explanation is that the brain evolved a model that was like the model running in the environment (the universe) that it evolved in. In that case we could get back to the Scott Fear of definitely knowing the “actual operating system” viewpoint.

    It is possible that these three situations overlap.

    To borrow Wittgenstein’s metaphor, we are like the FLY IN THE BOTTLE which is trapped in the bottle because nature endowed it with a model of seeing (phototaxis) that does not even potentially ALLOW it to realize that there is a way out.

  375. Scott Says:

    wolfgang #245:

      Why did God use quantum theory to make the universe, but have it appear classical to us?
      And how exactly did he do that?

    Oh come on, we all but know the answer to that one: decoherence! Not an add-on, but an unavoidable prediction of QM in a universe that’s gradually filling out its Hilbert space like ours is.

  376. Dimitris Papadimitriou Says:

    Even beyond the question of why this particular framework exists and is (supposedly) more fundamental than the other, there is a more difficult and, maybe, deeper issue:
    Are the existing laws of nature eternal, or did they evolve from some more primitive or less “accurate” (or less well defined) primordial principles?
    Do they have an independent existence in some “platonic” sense?
    Quantum mechanics seems to be more fundamental, yet it still depends on some (naive versions of) classical notions (like space and time). There is lots of speculation about the supposed emergence of spacetime (and gravity) from some purely QM description, although gravity is the only universal interaction, as it affects everything, literally, that exists, including spacetime itself, and that’s a bit funny to think about…

  377. Clinton Says:

    Scott #288:

    “If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of
    anything deeper (but we can still try!).”

    But should we try?

    A quick page search shows almost no mention in this thread of Godel or the Halting Problem. That can’t be right 🙂

    Let me try to come up with something completely off the wall then and you tell me why this doesn’t apply.

    Incompleteness:
    Assume that QM is exactly the operating system of the universe. In other words, the map is the terrain. I mean literally, the universe physically is made out of complex amplitudes and the most fundamental physical laws are exactly the axioms of QM. Then, clearly, QM is an axiomatic system at least sufficient to capture the properties of N. Therefore, it will be impossible to prove the completeness of QM from within the universe. (As a bonus rabbit trail, if this were true, then is GR an example of an unprovable true statement within QM?)

    The Halting Problem:
    Or, how about this. Aliens arrive tomorrow and declare they know everything and say, “Ask us anything.” You ask, “Why did it HAVE to be QM?” They give you the principle X or the underlying general model Y that explains why the universe had to be QM. Will not your first thought then be, “Yeah, but … why did it HAVE to be X? or why did it HAVE to be Y?” So, it seems like there is something wrong with the question. In other words, if QM is the operating system of the universe then … there is no deeper reason why … there is no deeper operating system. Otherwise, you get a halting problem situation because you are asking “Will this program (search for the fundamental model) halt?” I mean … Let P be the program to find or verify the most fundamental model of the universe. Let I be the evidence input. Let output be 1 if P halts on I and 0 if P never halts on I. Assume there exists a program Halt(P,I) that returns 1 if and only if P halts on I …
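    (For reference, a minimal sketch of the standard diagonalization this gestures at; it is just the textbook argument against a universal Halt(P, I), and says nothing about physics by itself.)

      def make_diagonal(halt):
          """Given a purported halting decider halt(program, input) -> bool,
          build the program that defeats it on its own source."""
          def diagonal(program):
              if halt(program, program):   # decider says "halts on itself"...
                  while True:              # ...so loop forever,
                      pass
              return "done"                # otherwise halt immediately.
          return diagonal

      # Whatever halt returns on (diagonal, diagonal), it is wrong about that case,
      # so no total, correct halt can exist.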

    These thoughts aren’t even half-baked 🙂 but I’m just trying to think of some way to prompt you to tell me why incompleteness/HP has nothing to do with your Q

  378. Zeb Says:

    Apologies for the overly long comment.

    I want to start by saying that I completely agree that cellular automata seem much more natural than quantum mechanics. There are certainly difficulties in making a CA world come “alive”, but it isn’t clear that they are intractable. Here are some of the difficulties that I see:

    – If the CA is not reversible, then it probably needs a hardcoded initial state (my intuition is that states which have an infinite chain of predecessors are rare if the CA is irreversible). This invites the religious question of “well who chose that state, then?”. It also invites Last Thursdayism – who is to say that where you are now isn’t the initial state?

    – If your initial state has only finitely many “living” cells, then somehow its evolution seems like it will be eventually predictable, even to those living within the system. Well, maybe not – perhaps the inhabitants can’t compute the consequences of the initial position faster than they occur – but I generally dislike this type of universe as a place to live. There is always the fear that every source of interestingness in the world will eventually die out.

    – If the state ever becomes periodic in some direction, then from that point on the CA is equivalent to a lower-dimensional CA.

    – If the initial state is chosen at random, then we have the question of what mechanism makes the random choice, and why is that mechanism so different in flavor from the deterministic evolution of the CA from that point on?

    – Once a CA has many local states or has a large neighborhood, describing its transition function becomes prohibitively difficult (I speak from experience). Even symmetry assumptions don’t buy you that much simplification.

    In order to deal with these objections, it seems that at the very least you will want to use a reversible CA, ideally with some element of randomness. One direction which I haven’t seen explored is the possibility of a deterministic CA with a non-random quasiperiodic starting state, or perhaps a CA defined on a quasicrystal tiling of the plane (such as the Penrose tiling). In such a universe, the element of randomness comes from not knowing your precise location in space. You still seem to run into a few of the objections above.
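    (As one concrete way to satisfy the reversibility desideratum, here is a minimal sketch of the standard “second-order” construction – next slice = local rule of current slice XOR previous slice – using rule 90 on a ring of 32 cells; the sizes and seeds are arbitrary.)

      import random

      N = 32  # cells on a ring

      def rule90(s):
          # Elementary rule 90: each cell becomes the XOR of its two neighbours.
          return [s[(i - 1) % N] ^ s[(i + 1) % N] for i in range(N)]

      def step(pair):
          prev, curr = pair
          nxt = [a ^ b for a, b in zip(rule90(curr), prev)]
          return (curr, nxt)

      random.seed(0)
      start = ([random.randint(0, 1) for _ in range(N)],
               [random.randint(0, 1) for _ in range(N)])

      pair = start
      for _ in range(50):
          pair = step(pair)

      # To run backwards, swap the two time slices, apply the *same* map, swap back.
      pair = (pair[1], pair[0])
      for _ in range(50):
          pair = step(pair)
      pair = (pair[1], pair[0])

      assert pair == start   # the initial configuration is recovered exactly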

    An interesting side note is that reversibility and randomness seem to be the right framework for thermodynamics, and non-equilibrium thermodynamics might be a plausible recipe for life. Jeremy England has an intriguing research program which has the goal of showing that life naturally evolves in an environment with reversible physics, access to a heat bath, and access to a source of harvestable energy (based on the failure of a naive attempt to get this to work, I think that the ultimate source of this harvestable energy should be separated from the location where life is doing the living).

    Most of the objections to CA physics also apply to other models of physics. They lead to the following desiderata:

    – Physics should be reversible (note that this isn’t the same as saying that the laws of physics going forward in time should be the same as those going backwards in time, which is empirically false).

    – There should be some element of randomness. The randomness shouldn’t occur at some special time, but rather should be an ongoing part of the evolution of the world.

    – Physics should not depend on having a very strange initial state.

    Actually modern cosmology doesn’t seem to satisfy these desiderata, as far as I understand it… if physics is reversible, and information is conserved, and if only a finite amount of information can be squeezed into a tiny space by black hole information bounds, then where was all the information stored at the time of the Big Bang? Ok, I’ve exposed my ignorance; let’s leave this question to the side until someone solves quantum gravity.

    The discussion of CAs vs quantum mechanics has implicitly focused us on universes which satisfy the following desiderata:

    – The rules of physics should be computable, in the sense that you should be able to approximate the (statistical distribution of) outcomes to any desired accuracy.

    – The laws of physics should be based on local interactions. (Implicit in this is that there should be some notion of distance, and time, which looks at least vaguely like a manifold. I’m not sure how to answer the question “why not a Sierpinski triangle?”.)

    – We should be able to describe what is going on in a finite region of space to a high enough accuracy that we can make predictions about it at a fixed time in the future, using only a finite amount of information. (Chaos theory says that in some reasonable classical situations we may need a huge amount of information to predict times very far in the future, but it has little to say about a fixed time in the future.)

    All of these seem to be saying the same sort of thing. Local rules on finite alphabets is how we define Turing machines, after all, and logic is also based on local rules of deduction.

    Why do we think that “computability” is the natural boundary, here, rather than “polynomial-time computability”? One argument I’ve heard is that a Turing-complete universe with polynomial-time physics can simulate any computable universe, but no computable universe can simulate the physics of a universe with a halting oracle (except perhaps in some limiting sense). This isn’t completely satisfying to me: polynomial-time computability seems like it should be a natural requirement, even though it empirically isn’t.

    At the very least, even if physics isn’t simulatable in polynomial time, it seems that SAT-solving should still be hard – otherwise, what is the point of evolving complex life and inventing mathematics? This isn’t a very good argument, though: I’d be excited to have an NP-oracle, rather than distressed.

    Once we’ve gotten this far, I feel that we have sort of justified studying generalized probability theories. We want reversible randomness. You mentioned that you’ve seen the work of Chiribella et al. There actually seem to be many variations on this work: several different ways to reconstruct quantum mechanics from a small number of reasonable axioms plus one “weird” axiom. Interestingly, every reconstruction uses a slightly different collection of “reasonable” axioms. I’m quite curious about whether pooling together all of the “reasonable” axioms (and leaving out the “weird” ones) of the different approaches is enough to reconstruct quantum theory.

    Here is an attempt to list out the “reasonable” axioms that occur in various quantum reconstructions:

    – The framework should be “causal” in the sense of generalized probability theory (closely related to having no faster-than-light signalling, which seems essential for computability). Another way I’ve seen this phrased is that “local operations commute” (obviously implied by special relativity).

    – The system should not be deterministic. From here it’s only a small leap to assuming that the set of states is convex.

    – Since convex sets naturally have a dimension, we can ask whether the collection of possible states in a finite region must be finite-dimensional. At least it should be possible to approximate the state-space as finite-dimensional, if we hope for a computable physics? (Quantum Field Theory seems to violate strict finite-dimensionality, and as far as I can tell no one knows how seriously we should take that. But assigning a finite-dimensional Hilbert space to every possible region of space seems problematic once we start wondering about the exact amount of space necessary for the dimension of the Hilbert space to increase by exactly one. I wish someone who understood quantum gravity would explain how this is supposed to make sense.)

    – Different states should be meaningfully different: it should be possible to distinguish them by performing measurements on them. “Local discriminability” takes this a step further, but requiring it seems somewhat intuitive: it would be odd if there was a strange property of a pair of spatially separated particles which could only be tested by bringing them back together, especially because there would be no point in time when they were ever truly in the same place (assuming that space is not discrete).

    – We should be able to independently prepare states in separated locations, without getting any strange correlations between them. (Together with the previous assumption, I think this implies that the linear span of the state space of a composite system is a tensor product, but it’s been a while since I’ve gone through this stuff.)

    – State transitions should be locally reversible, as long as you look at all of the information available in a neighborhood of the thing that is changing. In particular, there should be a reversible operation which looks like flipping a coin when some of the information is thrown away. This assumption is less defensible than others, but it has intuitive philosophical arguments in its favor. (Can this assumption be used to justify “purification”?)

    – Time evolution should be infinitely divisible (and probably continuous, too). This rules out CAs by fiat (unless you have some clever stochastic CA where transitions occur based on a Poisson process, but that would seem to conflict with special relativity?). A fan of discrete systems might throw this assumption out, but if it helps to narrow things down to just quantum mechanics then I say let’s use it!

    – Measurements should be explainable in terms of the transitions that exist in the theory. So you can’t just have a theory with a weird collection of state spaces, and no interesting transitions at all! (I haven’t seen anyone state or use this axiom, but it certainly seems reasonable to me.)

    Did I leave any obvious reasonable axioms out? Are these enough to derive quantum mechanics? Or can you somehow satisfy all of this with a non-quantum theory?

  379. Jim Graber Says:

    No Preferred Reference Frame at the Foundation of Quantum Mechanics
    https://www.mdpi.com/1099-4300/24/1/12
    Hi Scott, I know you don’t like to receive references, but two things:
    1. Doesn’t this sound like a dead-on attempt to answer your question?
    2. Please at least glance at the picture (click on the link and scroll down).
    TIA for considering this.
    I hope it’s as relevant as I think it is.

  380. fred Says:

    Scott

    “Whether or not that’s possible, when it comes to QM and special relativity it’s not true! We know that they can be reconciled because they were.”

    Yeah, after I posted I realized you were talking about QM+SR while I was thinking about QM+Gravity.

  381. fred Says:

    Scott #371

    “I agree! That’s exactly what was done for many other aspects of physics that seemed unmotivated at first, such as (famously) the Lorentz transformations.
    QM, however, seems a lot more fundamental than the 3-dimensionality of space (certainly the string theorists regard the latter as a mere emergent detail)”

    What do you think of Sean Carroll’s program to show that everything, including spacetime, could be derived on top of the wave function as the most fundamental object?
    I guess that’s one way to go about proving that QM is necessary, no?

  382. mjgeddes Says:

    fred #231, Scott #371

    Yes, that’s exactly what I mean by looking for deeper fundamental principles. All the clues are there, Scott, I think you could crack this by next week, just by thinking it through carefully 😀

    Let’s summarize my proposed chain of reasoning:

    (1). We can’t understand QM starting with abstract “wave functions” or pure math; we must identify the motivating *physical* principles, *not* abstract concepts like wave functions

    (2). The “wave function” is non-physical, it’s just a computational method we use to calculate aspects of a deeper reality

    (3). The deeper reality has to be a physical geometry

    (4). The putative new geometry isn’t classical, it must be non-commutative

    (5). The closest quantum formulation is the ‘phase-space formulation’, so that should be the starting point when looking for the putative new geometry

    To get the deep physical principles, we need to understand what quantum mechanics is *actually* about at the deeper level, which is unknown.

    Now, if I were to guess, based on the ‘phase-space formulation’ and also the black-hole information stuff, my guess would be this:

    (6). *Quantum mechanics is a generalization of statistical mechanics*.

    The reason I like my postulate (6) is because it would naturally result in information, complexity and generalized probabilities (Quasiprobability distributions) playing important roles, which we know they do.

    So (6) might be a good initial postulate, but that still doesn’t tell us *what* it is that is actually stochastic in the quantum case. For classical mechanics, we know it’s the thermal vibrations of particles (thermodynamics), but what actually is it that’s vibrating in the quantum case?

  383. Scott Says:

    Viktor Dukhovni #246:

      What’s always puzzled me about QM is the fact that in an apparently non-deterministic future we still somehow get *exact* conservation laws … What’s your take on the puzzle of how conservation and randomness end up consistent?

    I’m not sure I understand what the puzzle is. Imagine, for example, a board game where we roll dice to move the pieces around, never adding or removing pieces from the board. We have plenty of randomness even though the number of pieces is an exactly conserved quantity.

    In general, while symmetries are of course extremely important when trying to guess correct physical theories, it seems to me that their “fundamental” importance in physics is often grossly overstated. At the end of the day, a symmetry simply means that the space of valid, distinct states in your theory is something different, and smaller, than you naïvely thought it was. So, to give some silly CS examples, instead of the set of all possible n-bit strings, maybe you’re restricted to the set of n-bit strings of some Hamming weight k (i.e., the number of “1”s is a globally conserved quantity). Alternatively, maybe every two strings of the same Hamming weight are to be identified, since the theory is symmetric under permutations.
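    (A toy version of that counting, just to make it concrete; n = 6 and k = 2 are arbitrary choices.)

      from itertools import product
      from math import comb

      n = 6
      all_states = list(product([0, 1], repeat=n))          # 2^n raw n-bit strings

      # Conservation law: restrict to strings of Hamming weight k.
      k = 2
      weight_k = [s for s in all_states if sum(s) == k]
      assert len(weight_k) == comb(n, k)                    # C(6, 2) = 15 states survive

      # Full permutation symmetry: strings of equal weight are identified,
      # so only n + 1 genuinely distinct states remain (one per possible weight).
      distinct = {sum(s) for s in all_states}
      assert len(distinct) == n + 1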

    In some sense, though, these are more statements about our own limitations than statements about the theory itself. After all, the whole time we were worried about building up a one-to-one map between our human notations and the actual physical states of the theory … that whole time, the theory itself was happily living well-defined in its actual state space, the one that’s left after you correctly mod out and otherwise account for all of the symmetries!

  384. Dan Meany Says:

    Maybe QM is more of an expression or artifact of our doing science through the doors of our perception (apologies to Jim Morrison).

    Consider a QM interpretation of the Monty Hall problem. When the first door opens, it appears the wave function for the door contents collapses, but only for the contestant, not the host, who knows what is behind all the doors already. If the host did not know what door the contestant picked he might open that one. If he did not know what was behind the doors he might open the one with the car. But he required a combination of both sets of knowledge, and that fact is also known to the contestant. On door opening, the actual “measurement” is not a revelation of what is behind the door, but the “captured” partial increase of knowledge of the contestant, which requires input state from both the host and the contestant. If there are four doors and two contestants, and the second contestant was absent when the first choice was made but arrives just before the first door is opened, their view of the wave function is also overlapping, but different.

    A complex number is a pair of reals with an extended definition of multiplication that applies to pairs. The weirdness of time – the Minkowski metric has a minus on the time squared; the time-dependent Schrodinger equation has a -i on the dt – with its suspected relation to human perception being laid out in time – is suspiciously similar to the complex-valued wave function. Human reason, if it is able to make any sense of cause and effect, must analyze with strict determinism and a global reality. However it is an anthropomorphism to neglect the scientist’s initial state and their state change during observation. If the observed item is another scientist observing back, there is no global reality for measurement that pertains solely to one. My view of me, my view of you, your view of you and your view of me, and all the combinations when they meet seems to be asking for pairs that have addition and multiplication descriptions.

  385. Possibly That Simple Says:

    Scott #383

    “In general, while symmetries are of course extremely important when trying to guess correct physical theories, it seems to me that their “fundamental” importance in physics is often grossly overstated. At the end of the day, a symmetry simply means that the space of valid, distinct states in your theory is something different, and smaller, than you naïvely thought it was.”

    Every theory guarantees the existence of symmetries – a symmetry is just a group whose action moves solutions to your equations along the surface in solution space that your theory constrains them to.

    Symmetries don’t have to be simple, like the rotational symmetry of x^2+y^2, they may also be elaborate and contrived like whatever the transformation is that preserves x^2+xy+y^3. These symmetries show up in things like plasma physics and are still useful.

    They are useful because they abstract away specific parameters and keep only the structure. The groups behind the fields in field theory leave the fundamental constants aside, as things that are implied to exist by the symmetry, but not fixed by it. It’s like telling you that a circle is round without getting into the details of how many inches and micrometers it is across.
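    (A small numerical check of the simplest case mentioned above: rotations move points around the level sets of x^2 + y^2 without changing the value, which is exactly what “the symmetry maps solutions to solutions” means here.)

      import math, random

      def rotate(x, y, theta):
          return (x * math.cos(theta) - y * math.sin(theta),
                  x * math.sin(theta) + y * math.cos(theta))

      random.seed(1)
      for _ in range(5):
          x, y = random.uniform(-2, 2), random.uniform(-2, 2)
          theta = random.uniform(0, 2 * math.pi)
          xr, yr = rotate(x, y, theta)
          # x^2 + y^2 is invariant under the group action, up to rounding error.
          assert abs((x**2 + y**2) - (xr**2 + yr**2)) < 1e-12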

  386. Stewart Peterson Says:

    Clinton #251, Scott #290 (if I may address you as such),

    Thank you very much for the resources.

    I certainly understand that the mathematical structure of quantum mechanics can be described as an abstract operator calculus, but does this mathematical structure necessarily result in correct physics? That is to say, can correct predictions about physical systems be made from this derived, purely theoretical, result, with only the physical constants filled in by experiment – analogous to the fact that special relativity can be derived from Maxwell’s equations?

    For example, it seems as though we can define superposition as the effect of a commutative operator that collapses the wavefunction. Determining the prior state of the system, if we want to determine the order of arguments – the preimage of this operator – is non-deterministic, obeying Bell’s theorem. The output of such a “preimage generator” would be a probability distribution over the (discrete) set of possible inputs, because a commutative operation erases information about the input string. To make this more concrete, all we have to go on when factoring 12 is [3,4],[4,3],[2,6],[6,2], which expands further to [3,2,2],[2,2,3],[2,2,3],[2,3,2]. Pick a factor out of the input string: it has a 2/3 probability of being 2, and a 1/3 probability of being 3 – but you cannot determine which argument is a 2 or a 3. Preimage generation of factors is therefore fundamentally non-deterministic.
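    (Just to make the counting in this example concrete – nothing deeper is claimed here:)

      from collections import Counter

      # Expand each ordered factor pair of 12 into prime factors, matching the
      # four triples listed above, and tally the per-entry frequencies a
      # "preimage guesser" would face.
      pairs = [(3, 4), (4, 3), (2, 6), (6, 2)]

      def expand(pair):
          out = []
          for f in pair:
              out.extend({4: [2, 2], 6: [2, 3]}.get(f, [f]))
          return out

      expanded = [expand(p) for p in pairs]
      print(expanded)   # [[3, 2, 2], [2, 2, 3], [2, 2, 3], [2, 3, 2]]

      counts = Counter(x for triple in expanded for x in triple)
      print(counts)     # 2 appears 8/12 of the time, 3 appears 4/12: the 2/3 vs 1/3 split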

    However – it also seems that this approach also implies that P!=NP. If the preimage generator is fundamentally non-deterministic, it cannot run in P. There is simply no information for a deterministic preimage generator to go on – it has been erased by the computation that created its input. Therefore, for any commutative operator which erases information at a rate which is polynomial with respect to the input string length (but sub-exponential), its preimage generator will be in NP. (An example of such a commutative operator seems to be, simply, evaluating a polynomial.) The commutative operator itself could be used to verify the correctness of the output of the preimage generator – to use the example above, integer multiplication – however, there cannot be a deterministic version of the preimage generator itself.

    To be clear, this is the reverse of the typical case: the preimage generator is taken to be the “original” computation, and the commutative operator (examples of which are clearly in P) is used to “verify” it; we are not using the preimage generator to verify the output of the commutative operator. (I am aware that the existence of lossy functions does not prove P!=NP via the existence of one-way functions, of course!) I am aware that we ordinarily do not care about the order of arguments given to a commutative operator, but in this case, let’s say we do: say, for example, that the arguments given to the commutative operator are a combination to a very complicated combination lock, and we need to recover them in order. (Put more abstractly, we’re treating the commutative operator as taking a vector input, and the operation is performed on the elements of the input data structure, with a scalar output.) Also, this approach seems to not relativize or algebrize, as it does not use an oracle. It further seems to not be equivalent to a natural proof, since a non-deterministic function cannot be Boolean and so it makes no sense to attempt to compute its circuit lower bound. Taken together, these conditions seem to cause this approach to clear the major early bar to such proof attempts.

    I use “it seems,” of course, because an insight that “seems” to solve a Millennium Prize Problem is unlikely to be correct. I just can’t figure out why it’s false, although this may be 1:45 AM thinking. I assume I will be swiftly corrected, if it is worth the time to do so. Sorry for the length.

  387. Vladimir Says:

    Mateus Araújo #368: can you elaborate on the difference between high-energy people and other physicists? As far as I know all (quantum) physicists follow the same procedure: initial conditions –> unitary evolution –> measurement.

  388. Philippe Grangier Says:

    Scott#375

    I suspect you are being circular when answering wolfgang #245 by invoking decoherence. If you look for an answer to the first part of the question (why did God use quantum theory to make the universe?) you cannot invoke the QM prediction of decoherence to answer the second part (why and how did He make it appear classical to us?). Unless you consider that God made QM for the very purpose of using decoherence, which would be an answer to your initial question.

  389. Stewart Peterson Says:

    Update to previous comment: thinking about this some more, the “very complicated combination lock” is the verifier, not just any commutative operator, and obviously, if the answer depends on the order in which the arguments are entered, it’s not a commutative operator.

    That’s why you sleep on this stuff, equally obviously. I’ll think very carefully about whether it holds for a noncommutative operator with a vector input and a scalar output – right now, I don’t see why it doesn’t – before I open my mouth again. I sincerely apologize for taking up your time.

  390. Philippe Grangier Says:

    Scott#375 (continued)

    Said otherwise, your Q1 might be reformulated as :

    Q1’: Why (and how) did God make the universe both quantum and classical, depending on the way or the scale you look at it? What would’ve been wrong with choosing one possibility only?

    I think the answer to this question is much easier, because choosing either one leads to obvious contradictions with empirical evidence. And thus God needs both of them to get a meaningful universe…

  391. Michel Says:

    As soon as we set up a ‘classical’ universe, we may find the reals insufficient to manage reality. The reals then automatically give rise to the complex numbers (the splitting field of x^2+1), which gives us a simpler-to-define universe where more complexity is possible with fewer rules. In this way Q2 is (almost) inevitable.

  392. Dimitris Papadimitriou Says:

    Mateus Araujo #338, 368:
    Every interpretation of QM has some vagueness built in. They just exchange one kind of vagueness for another, and that seems to be inevitable.
    In the standard interpretation it is the ill-defined measurement process. In the original Everett version, the “ontological status” of the relative states was not clearly defined.
    In some sense, it was much closer to Zeh’s “many minds” interpretation (which is even more vague) than to the currently popular version.
    If one thinks that the branching (or splitting, depending on the proponent) structure has to be taken literally, then several serious problems occur:
    How is this splitting defined? Globally, or locally?
    The first option is incompatible with relativity (there is no preferred slicing), and is extremely non-local and ill-defined. The second option (that the splitting “happens”, somehow locally confined to the future light cone) is not compatible with QM, because of the usual non-local (in the QM sense) aspects of quantum measurements. Also, in the case of EPR-type experiments, there are spacelike separated measurements/events that have to be correlated, and there is no other explanation for this in the MWI picture (if you have a “local” branching) other than the usual consistency requirement. This is also the standard textbook explanation, so why bother with all this extra baggage about splitting worlds?
    The above issues are only the tip of the iceberg:
    Things get worse if you allow gravity to enter the party. Whatever the correct deeper theory is (quantized gravity, or emergent, or whatever…) it needs a semi-classical limit (non-Hausdorff?).
    Not only that: it has to give, in each branch, the same predictions that GR gives, at least approximately, to fit observations.
    So it seems that the claimed simplicity of the many-worlds interpretation is only superficial, after all…

  393. Pedro Says:

    Scott #383: While I agree, that doesn’t answer the mystery of why the standard model happens to be built on gauge theories. Why is this redundancy principle so powerful in guessing the right Lagrangians for nature’s forces, if nature cares not about our silly mathematics?

  394. Veedrac Says:

    Scott #125:

    Do you think it was ever satisfactorily explained why we should never have expected, even a-priori, to have found ourselves living in the pre-Newtonian teleological universe of Aristotle?

    I don’t know enough to tell you why quantum mechanics is the right answer, but I do think there is a simple answer to the why-not-classical question that you are hinting at here, which is that it’s too small. If stars did not shine nor meteors fall, but we still learned about earthly facts of physics, evolution, and the history we’ve had of evolutionary catastrophe and innovation, then were we wise enough, we could still deduce *purely on first principles* that the world is too small, and that there must be more to our universe than we could see. We would then not be surprised in the least to find out about quantum mechanics: surely there must be an infinity of branching realities, for how else could something as unlikely as us possibly have come about?

    People, I think, are mostly blind to the exponential power of repeatedly multiplying one small probability by another. Classical mechanics already feels awfully small for our observable universe, apparently barren save for our one species that will probably die by its own hand. I can’t tell you what makes quantum mechanics the simplest big-and-stable theory, but at least it is big! You’d expect the simple theory underlying reality to be made up of some stuff combining with some stuff in a way that cheaply (in terms of descriptive complexity) made a lot more stuff, as much as possible of which is stable, and which amplifies the quantity of conscious life when it eventually is found. Quantum mechanics does that. It gives you a lot more minds for a lot less descriptive complexity than classical theories. The fact that each little primitive of reality is constantly doing things that create entire new spectrums of our whole reality is perhaps the least surprising fact there could be.

    I recognize that this is a lame answer that doesn’t say very much specifically about QM versus other big multiplicative theories, but I don’t have the background to comment on that.

  395. bertgoz Says:

    Scott #261

    (4) robustness against small perturbations

    This can be addressed by feedback loops even if all the individual components are highly unstable.

  396. Stewart Peterson Says:

    Followup to followup:

    The idea of a “preimage generator” was to avoid this problem, by creating a non-deterministic algorithm that would, effectively, need to guess the order of the arguments. The fact that it’s a commutative operator that produced the hash that is the input to the preimage generator is crucial to the information-theoretic argument, of course, and it cannot be noncommutative or there may be a way to reconstruct the input vector. Equally important is that verification be by a noncommutative algorithm: the commutative operator will return true to any given valid input vector (i.e., any combination lock “key” that is possible, but in the wrong order).

    Therefore, we need three elements: first, we start with the combination “key,” then the commutative operator hashes it, then the preimage generator guesses combinations based on factors, and finally we use the (noncommutative) combination lock to test it. A hard-coded key being present in the combination lock may, however, function as an oracle, and we don’t want that.

    There is, however, an analogous system: public key cryptography. So, we start with the private key, the commutative operator hashes it – by, for example, multiplying the bytes together, or any desired lossy algorithm that is sub-exponential in its erasure of information – then the preimage generator guesses which order the bytes were in, and finally, the public key is used to set the decryption algorithm and each order of bytes is tested. The iteration between the preimage generator and the decryption algorithm does not seem to relativize, because the time taken to run the decryption algorithm is taken to be part of the runtime of the overall algorithm – that is, it’s not an oracle – but you wrote the book on that, not me. By all means please correct me if I’m (still!) wrong.
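    For concreteness, here is a toy sketch in Python of the three-stage pipeline described above (purely illustrative and of my own devising; the multiply-the-bytes hash, the brute-force permutation guesser, and the equality check standing in for the “combination lock” are placeholders, not a real cryptosystem):

      from itertools import permutations

      # Stage 1: a commutative (order-blind) hash: every reordering of the same
      # bytes gives the same value, which is exactly the point made above.
      def commutative_hash(key: bytes) -> int:
          h = 1
          for b in key:
              h = (h * (b + 2)) % (2**61 - 1)
          return h

      # Stage 3: an order-sensitive verifier standing in for the "combination lock".
      def order_sensitive_check(candidate: bytes, reference: bytes) -> bool:
          return candidate == reference

      secret = b"dog"
      h = commutative_hash(secret)

      # Stage 2: the "preimage generator": guess orderings of the candidate bytes.
      # Every ordering passes the commutative check; only one passes the verifier.
      for perm in permutations(secret):
          cand = bytes(perm)
          if commutative_hash(cand) == h and order_sensitive_check(cand, secret):
              print("recovered ordering:", cand)
              break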

    Hope this is still on-topic and, again, sincere apologies for taking up your time – I just wanted to correct that before the universe did and I thought this was important motivation for why it’s important to determine if a system is in fact fundamentally deterministic.

  397. Mateus Araújo Says:

    Vladimir #387: Sure. High-energy people are usually interested only in describing the (unitary) scattering matrix, leaving the measurements at the end implicit. Nobody cares about the post-measurement state. There is some work in modelling detectors with QFT, such as the Unruh-DeWitt detector, but they are usually studied in isolation. Nobody uses them to calculate what happens in a particle collider.

    In contrast, low-energy people like me usually model explicitly the measurements with PVMs or POVMs, and care about the post-measurement state. Sometimes of a system in isolation, or the state you get after measuring a part of an entangled state.

    In a nutshell, low-energy people use collapse in practice, and high-energy people don’t use collapse in practice.

  398. Mateus Araújo Says:

    Dimitris Papadimitriou #392: “Standard” quantum mechanics is so vague that it is hard to even criticize it. I prefer to focus on interpretations that are at least clear about what is going on, like Bohmian mechanics or collapse models. Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead. They have a Strawberry Fields attitude: nothing is real, nothing to get hung about.

    I don’t see the point of referring to Everett’s original version. We are physicists, not historians. Keep in mind that Everett’s paper was castrated by Wheeler in an (unsuccessful) attempt to make it palatable to Bohr. His PhD thesis is much better, although not entirely satisfactory, as Everett didn’t know about decoherence. Saying that it is similar to many-minds is an empty insult.

    Branching occurs at the speed that decoherence spreads. It may be slower than light in some situations, but never faster. I find it rather amusing that you think that local branching is somehow incompatible with QM. The apparent nonlocality in EPR experiments is only there because of the nonlocal collapse. Many-Worlds’ whole point is not having a collapse! There is no nonlocality that could be incompatible with local evolution. As for the specific mechanism that generates the Bell correlations locally, I recommend this paper by Brown and Timpson.

  399. Scott Says:

    Mateus Araújo #398:

      Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead. They have a Strawberry Fields attitude: nothing is real, nothing to get hung about.

    I’m going to have to quote that when I teach!

  400. Philippe Grangier Says:

    Dimitris #392, Mateus #398, Scott #399

    “Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead.”

    Taking this refusal as an argument to conclude that MWI is the only alternative is a caricature. You can perfectly well hold that systems within contexts are the real physical objects, and that the usual quantum states (psi) are mathematical tools allowing us to calculate probabilities of real measurement results, for real systems in real contexts. All of QM is about (non-classical) probabilities, and MWI is an extravaganza trying to claim that probability amplitudes are real, instead of simply admitting that what is real are physical events, objects and properties.

  401. han Says:

    There is a theory that the Big Bang arose from quantum fluctuations due to Heisenberg uncertainty arising in a vacuum, thus allowing something to be created from nothing.

    Perhaps that is the “why” – it’s because quantum mechanics is the only thing that truly allows for something from nothing.

  402. Scott Says:

    han #401:

      There is a theory that the Big Bang arose from quantum fluctuations due to Heisenberg uncertainty arising in a vacuum, thus allowing something to be created from nothing.

    The problem is that that’s an egregious misunderstanding! Even if the Big Bang did arise as a vacuum fluctuation, the vacuum (which is itself an extremely complicated object in QFT) would’ve previously existed, and its own existence would remain unexplained.

    QM simply doesn’t help at all, at least on its own, with the “ur-mystery” of why there’s something rather than nothing.

  403. Scott Says:

    Chris W. #248:

      it’s surprising for a layman like me that it’s an open question whether the universe is deterministic (Anbar #234, Scott P. #241).

      Is there some flaw in the reasoning “everything in the universe has QM state => the whole universe could be described as one QM state, which evolves deterministically according to the Schrödinger equation”?

    As usual, it all comes down to definitions—not to some advanced physics that you as a layman don’t understand. The Schrödinger equation is deterministic; the Born rule for measurements is not. But is measurement actually a fundamental part of the laws of physics (as in Copenhagen and dynamical-collapse theories), or is it just an artifact of our experience as observers (as in MWI and Bohm)? Also, in the latter case, are there nonlocal hidden variables that restore the “determinism” even of the measurement outcomes (as in Bohm), even though we can never exploit that determinism to make predictions in practice?

  404. Scott Says:

    Johnny D. #249:

      Schrodinger equation allows for static and dynamic states. Static only possible because of sleight of hand to make phase irrelevant in Born rule. This goes a long way to answer why QM. It requires 2d wave function with something like phase and modulus. Dynamic solutions are those that are superpositions of static states. Superposition and multiple degrees of freedom imply tensor product.

      Static solutions exist as eigenstates of a Hermitian Hamiltonian. These eigenstates exist because the operator is over the complex numbers. The exponential of i times a Hermitian is unitary.

      Can you create a classical system with static and dynamic states?

    That’s a superb question. It reminds me of Boddy, Pollack, and Carroll’s quantum-mechanical resolution of the Boltzmann brain problem, namely that (in many cosmological models) the extremely late universe is just going to be sitting in an eigenstate of the Hamiltonian doing nothing, and it’s a misunderstanding of QM to think that Boltzmann brains will infinitely often be “fluctuating into existence” out of that eigenstate, since there won’t be any observers or measuring devices around to measure the fluctuations.

    But I can raise the same objection to you that I raised to Boddy et al. at the time: namely, just as Hamiltonians have eigenstates, it’s equally true that classical stochastic evolution laws (i.e., Markov chains) have stationary distributions! The only difference, it seems to me, is that it feels more tempting to regard a quantum-mechanical eigenstate as the “actual reality of what’s going on” (namely, nothing) than it is to regard a stationary distribution like the Gibbs distribution in the same way. For better or worse, people are tempted to regard a Gibbs distribution as merely an expression of human ignorance; if we knew the exact position and velocity of every atom, they point out, we’d see that the state wasn’t “stationary” at all, but frenetically jumping around.
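    (To make the Markov-chain analogy concrete, here is a toy numerical illustration of my own, not anything from the original comment: a small chain whose stationary distribution is left unchanged by the dynamics, even while any individual sampled trajectory keeps frenetically jumping around.)

      import numpy as np

      # A 3-state Markov chain; rows of P are transition probabilities.
      P = np.array([[0.8, 0.1, 0.1],
                    [0.2, 0.7, 0.1],
                    [0.3, 0.3, 0.4]])

      # Stationary distribution: the left eigenvector of P with eigenvalue 1.
      evals, evecs = np.linalg.eig(P.T)
      pi = np.real(evecs[:, np.argmax(np.isclose(evals, 1))])
      pi /= pi.sum()
      print("stationary distribution:", pi)
      print("unchanged by one step:  ", pi @ P)   # the classical analogue of sitting in an eigenstate

      # A single sampled trajectory, by contrast, never settles down.
      rng = np.random.default_rng(0)
      state, visits = 0, []
      for _ in range(12):
          visits.append(state)
          state = rng.choice(3, p=P[state])
      print("one sampled trajectory: ", visits)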

  405. Scott Says:

    Jacob #252:

      A question similar to your question 1 that I would love you to discuss: given that the laws of physics are so complicated, why can they be so well approximated by something so simple?

      “Why don’t Newtonian physics work?” doesn’t strike me as a terribly interesting question – there’s no reason to suppose they should.

      But “why do they almost work?” seems much more puzzling.

    There’s a large part of your question that’s just straightforwardly physics rather than philosophy! I.e., if you assume the more recent theories, you can derive and explain why the earlier theories worked to such an excellent approximation—as, for example, Newtonian gravity was recovered as an approximation to GR, or classical electrodynamics as an approximation to QED. Indeed, those derivations were a central part of what the new theories had to do in order to be accepted in the first place!

    But there’s a revised version of your question (also asked, I think, by one or two other commenters here) that survives and that I find extremely interesting. Namely, whatever is the most fundamental theory of the physical world, why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience? Is there an anthropic story to tell about that? (Probably yes—there usually is 🙂 —but should we actually believe it?)

  406. Crackpot Says:

    Pedro #393:

    Isn’t a gauge theory effectively just “We’re neglecting other forces and effects and constraining ourselves to particular configurations to ensure the neglected artifacts are negligible, and our theory is based on predicting the outcome of the thus-constrained system”? (Except more mathematically rigorous)

    Under that understanding, in what sense does nature care about our mathematics? We’re trying to minimize the effect nature has.

  407. Scott Says:

    Crackpot #406: No, Pedro #393 has a point. The central importance of gauge forces is the strongest argument against my position that symmetries are “just a thing that you mod out by” (i.e., more about our description than about the world). Gauge forces, however, seem to me to be about more than just symmetry: they arise from making an a-priori bizarre-seeming assumption that a certain internal symmetry holds separately at every point in space, even though the only way to prevent a contradiction when someone travels around a closed loop is then to introduce a new long-range force. And somehow this same trick works over and over. Someday maybe I’ll write a blog post about it … when I understand it.
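    (For readers who haven’t seen the “trick” spelled out, here is the standard textbook U(1) case, sketched in LaTeX; sign and normalization conventions vary by book. Demanding that the phase rotation be allowed to vary from point to point breaks the ordinary derivative, and repairing it forces a new field A_\mu with exactly the transformation law of the electromagnetic potential.)

      % Local phase symmetry: \psi(x) -> e^{i\alpha(x)} \psi(x).
      % The ordinary derivative picks up an unwanted \partial_\mu \alpha term:
      \[
        \partial_\mu \psi \;\to\; e^{i\alpha(x)}\left(\partial_\mu \psi + i\,(\partial_\mu \alpha)\,\psi\right),
      \]
      % so one introduces a gauge field A_\mu and a covariant derivative
      \[
        D_\mu \equiv \partial_\mu - i e A_\mu, \qquad
        A_\mu \;\to\; A_\mu + \tfrac{1}{e}\,\partial_\mu \alpha(x),
      \]
      % after which D_\mu \psi \to e^{i\alpha(x)} D_\mu \psi transforms covariantly,
      % and A_\mu behaves exactly like the electromagnetic potential.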

  408. Brooks Says:

    I am late and not particularly knowledgeable, but IMO Q1 is just a specific case of the “why are so many of our conditions so perfect for us to exist in” question that is best answered by the anthropic principle.

    Just like we need some form of random mutation and recombination for evolution to work and select for people who can talk about evolution, we need some form of uncertainty to create heterogeneity in the universe to allow statistically unlikely but critically important (for us) events.

    Why quantum mechanics? Well, why DNA? Why gravity? There are probably lots of other ways things could have worked, but if they produced conditions for sentience, we (or our protoplasmic counterparts) would be asking “why froblits? Why BNM2? Why the general charge field?”

    We can’t know whether quantum mechanics is the only way for a universe to work. We just have to be comfortable with that uncertainty.

  409. James Gallagher Says:

    Scott #268

    Sorry I only saw your reply today re Born Rule.

    I think you must be misunderstanding me, Schrödinger Evolution preserves |psi|^100654444222 (for example)

    Of course this is dumb, but then so maybe is our Universe…

    If you have a proof of the Born Rule from the other common axioms of Quantum Mechanics you would get a Nobel Prize I think…

    Actually even the Bohmians have a neat argument that whatever rule the Universe started out with, it would very quickly evolve to the Born Rule – which is like a classical dynamical system evolving to the natural invariant measure as an attractor – but then you could still have the Universe restricted to the subset of trajectories which evolve to the |psi|^100654444222 invariant subset – there’s no obvious mathematical or physical reason to enforce the whole set of |psi|^2 invariant trajectories in the system.

    (Of course, this is not evidence of an Anthropic Universe in the usual sense of fine-tuning considerations; it’s just easy to see that the Born Rule would be very common in an Anthropic scenario (as the least “dumb” choice), so there’s no need for design arguments or mathematical perfection underlying our Universe)

  410. Philippe Grangier Says:

    To Scott #403, answering Chris W. #248; see also PG#332 and #400, and Mateus#338

    In the last Växjö conference (August 2021) I felt that there was a consensus on some kind of trade-off :

    – if you insist on determinism (or at least classical randomness, based on ignorance) then you must give up locality in a rather strong sense, called elementary locality in #332. This is typically Bohm’s version of QM, and close to Bell’s ideas.

    – if you admit some kind of non-classical randomness, then you can save locality, or at least live with a much weaker form of non-locality, related to what I called predictive incompleteness in #332 (see also #338). This is my preferred version of QM, you may consider it as neo-Bohrian, despite my claim of psi being incomplete. Contextuality is essential here, and also accepting that contextual inferences are NOT nonlocal influences.

    – MWI claims to keep both determinism and locality, the price being the extravaganza quoted in #400. What surprises me most here is the claim that the universal psi should be ‘real’, contrary to any empirical evidence. Would MWI consider that psi only speaks about probabilities, and that ‘branching’ is simply updating these probabilities, I would consider it much more acceptable.

    – there are many other options, including the ones based on ‘agents’ beliefs’ and further away from any form of physical realism, but they were not strongly represented at that conference.

    I guess everybody reading these lines already knows all these options, leading us to admit that God did not do a better job of reconciling physicists (and computer scientists) than he did with religions…

  411. wolfgang Says:

    @Scott #365

    >> we all but know the answer to that one
    Well, then tell us how many worlds did He use for that ?
    Minor issue, I know …
    … and how did the initial quantum state (encoding universes of different size etc.) decohere?

  412. Russel Says:

    My understanding of QM is too rudimentary to possibly offer any insights above what others will have, so I offer you this instead:

    When I arrived in the afterlife and met God, I asked him this very question – well, I say “when” and “in” and “met”, though really the afterlife is beyond time and space, and it’s more accurate to say God *is* the afterlife, but I digress – I asked, why quantum mechanics? Why this seemingly unnecessary complexity? And, being now joined with God I directly perceived the answer, which I now lay out in a narrative form suitable for Earthly communication.

    At first God did try to make universes based on simple rules – simplicity is beautiful to God too. And for sure, cellular automata seemed a sensible way of doing this, but have you ever glanced over the proof of Turing completeness for such systems? Simple rules simply shift the complexity of a calculation into the initial state, and massively complex and detailed structures are required to make even the simplest of calculations. Producing sentient beings in such universes required an unfathomably large and special set of initial conditions, the probability of sentience arising from random conditions is vanishingly small. Of course God made some infinitely large universe of this sort to get around this – only it just seemed too wasteful.

    So God moved on to universes with more elaborate laws, laws just complex enough to allow the spontaneous and regular emergence of sentience. These laws turned out to be much like what you and I know as classical mechanics, or variations thereof. But the inhabitants of such universes quickly reached a point where they had discovered all the laws – and after that, well, they got bored. Weltschmerz. There was no sense of wonder, nothing left to debate, no uncertainty, no further progress to be made, no problems to solve, and in consequence, none of the higher planes of happiness that God wishes for his creations.

    And so God experimented with more complex universes. Too much complexity proved troublesome as well – the sentient beings that emerged were unable to solve anything, and gave up striving to understand. But somewhere in between there lay a sweet spot – where the sentient beings could perpetually live in wonder and debate and strive forward, attaining the satisfaction of discovery and understanding, but never the ennui and dissipation of having solved everything.

    Thus the beautiful irony of my question – why is the universe quantum mechanical? Simply so that I could wonder why.

  413. Philippe Grangier Says:

    Scott #405

    « Why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience? »

    Tentative answer : because historically it went just the other way around, i.e. making better and better experiments, built upon the previous step, and requiring better and better approximations ? And this is called the progress of science ?

  414. Mateus Araújo Says:

    Scott #399: Please do =)

    Philippe Grangier #400: That wasn’t an argument for Many-Worlds, it was an argument against Copenhagen and QBism. I did give arguments for Many-Worlds, spread around other comments: that it follows from a literal interpretation of the Schrödinger equation as true, and that it is the only way I know to make sense of probability and Bell nonlocality.

    I don’t see how saying that “systems within contexts” are real instead helps with anything. Do these systems have a mathematical description? It can’t be a quantum state, as you claimed that they are not real. What is it then? Just an informal notion? That doesn’t cut it. You do need to provide a precise mathematical description of what is real, otherwise you’re just producing yet another Bohrian smokescreen.

  415. Dan S Says:

    If you demand locality, then rigid bodies are out of the question and you are left with nothing but point particles (or strings?). But Newtonian point particles will never interact because the collision probability would be zero. To get around that, you need something like Feynman paths.

    On the other hand, continuous fields like you have with, for instance, Navier-Stokes, are local and non-quantum. Perhaps they aren’t interesting enough due to not being in a high dimensional space like QM. Perhaps the dimensionality of state space doesn’t cost anything and QM is favored because the equations are somehow simpler than something like Navier-Stokes.

    Ultimately, I think there is nothing that mandates QM in a strict sense. We could just as well have been living in Conway’s game of life. That probably could support evolution if played on a large enough board for enough time. Perhaps a three dimensional version, in order to increase the compute density.
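    (In case it helps to see how little machinery such a rule needs, here is a minimal Game of Life step on a wrap-around board, purely as an illustration of my own:)

      import numpy as np

      def life_step(board):
          # Count the 8 neighbours of every cell (with periodic boundaries).
          n = sum(np.roll(np.roll(board, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
          # A cell is alive next step iff it has 3 neighbours, or is alive with 2.
          return ((n == 3) | ((board == 1) & (n == 2))).astype(int)

      board = np.zeros((8, 8), dtype=int)
      for y, x in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:   # a glider
          board[y, x] = 1
      for _ in range(4):
          board = life_step(board)
      print(board)   # after 4 steps the glider has moved one cell diagonally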

    If we are to take seriously Max Tegmark’s mathematical universe, perhaps all these alternate theories are physically real. Every differential equation or cellular automaton you can think of is quietly chugging along in Plato’s space of Forms. Some of them support life, and of those some have a Scott Aaronson asking why we’re living in this particular one.

  416. Ajit R. Jadhav Says:

    The number of comments has gone to 400+ already! It would be impossible to read through all the preceding comments and only then write my reply. … In fact, I am going to save this discussion and think my way through all the (valid) points of all the comments slowly, over a period of time. But for the time being, let me write something directly in reference to the main text itself.

    > Q: Why should the universe have been quantum-mechanical?

    We say that the universe is QMcal, but only after (i) taking into account all the available evidence, including the most general phenomenological knowledge and experimental data, and then (ii) *inducing* a theoretical explanation which arises from, and is consistent with, the former.

    If tomorrow the sum totality of the observational knowledge (including the concrete experimental data) changes, we might have to say that the universe is not really QMcal after all, that it has some other (as of today unknown) sort of a nature. We will then try to find *that* theory.

    It’s all a matter of finding a theory that *consistently* explains *all* the evidence which is known at a given point of time.

    > Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?

    Essentially, the answer is that some or the other explanation (e.g. a prediction) from the theory would be wrong, that it wouldn’t be consistent with how the world actually works. The ultraviolet catastrophe, for instance. The catastrophe does not occur in the real world out there; it’s merely a feature of the “classical” Maxwell-Heaviside-Lorentz theory.

    An important point here, however, is that there is no such thing as a single, over-arching “classical” theory. There are many different classical theories, each of which applies to its own set of phenomena and / or ranges of observations.

    There was the Newtonian mechanics of particles, and it applied to electrically uncharged, finite-sized objects that interact only via direct contact at their respective bounding surfaces. The qualifications used in the previous statement were not known at the time; these were subsequently discovered, over a period of time. But already in Newton’s own time, in fact in his own theory of gravity, the elements of the theory *failed* to conform to the ontological model of his own mechanics. In Newton’s law of gravity, you have objects that aren’t in direct contact, and yet manage to exchange forces (of gravity) via an instantaneous action at a distance (IAD for short). So, Newtonian gravity already is a non-local theory — even if no one today would doubt that it is a “classical” theory. Then there is also Fourier’s classical theory of diffusion (say, the conduction of heat). It again is a non-local theory — and yet, very, very classical. Electrostatics is yet another classical but non-local theory (it has IAD).

    So, contrary to the present-day discussions, there was no such thing as *the* “classical” theory, no such thing as a single ontological viewpoint / theory which remained applicable to *all* the pre-quantum theories. Why, even the geometrical theory of optics is useful (in certain contexts), but has no wave nature.

    In fact, even in “modern physics”, you often do a mix-n-match of different theories.

    Schrodinger’s wave mechanics is not relativistic. But using it still gives you a damn good estimate of the bonding energy in the case of the smaller atoms. For the helium atom, in fact, the difference in the bonding energies predicted by the relativistic and non-relativistic QMcal theories turns out to be just a fraction of a percent (less than 0.1 *percent*, IIRC). And, closed-form solutions are not available. So, the error due to *not* using a relativistic theory anyway gets completely lost in the jungle of the numerical errors alone. (A “jungle” it is, not a “forest”.) In fact, to back out a bit, even the Born-Oppenheimer approximation itself is just that — an approximation. But we use it.

    > Q2: Assuming classical physics wasn’t good enough for whatever reason, why this specific alternative? Why the complex-valued amplitudes? Why unitary transformations? Why the Born rule? Why the tensor product?

    That’s a very good set of questions. (Phrased way, way, better than Q1!) I will make sure to answer this second set when I come to writing / explaining my new theory (of the non-relativistic QM).

    Already, we discussed the complex-valued nature of solutions on this blog back in 2018 (and recently, I also discussed it at Dr. Roger Schlafly’s blog). So, that’s out of the way, in a way. So, I don’t think I would even include a discussion of this feature in my paper. Enough to say that solutions in my approach remain complex-valued too.

    In fact, you can think of my new approach as providing a layer underneath the postulates of the mainstream QM, a mechanism which explains how those postulates come about. So, all these (and similar) questions are very relevant / pertinent, but not each of them might actually get discussed in my paper — not in a detailed way, anyway.

    And, BTW, that precisely is the reason why I’ve been saying that I need to have some informal interaction with a physicist proper, say a prof or a post-doc of physics (or even a researcher in the QC field who has thought about the foundational aspects) so that I know what all points to include, and what all to leave out, in the published version of my paper(s). In the absence of such interactions, I could go on writing many things that are too obvious to the intended audience (the physicists proper) — I am talkative. And yet, I may perhaps end up not including discussion of points that are very obvious to me but may not be so to others. (Like, e.g., the complex-valued nature of \Psi.)

    Last minute addendum:

    My theory is deterministic, but nonlinear, and thereby leading to an “irreversibility” of the SDIC sort i.e. of the exponential divergence sort. (SDIC means: Sensitive Dependence on the Initial Conditions.)

    In general, IMO, it is high time that people made a distinction between a *law* of physics and the behaviour of a *system* whose elements obey that same law. A differential equation (or a set of them, or a mechanism formulated using differential terms) is deterministic. *Always*. But a system which is composed of elements each of which obeys that same deterministic law may not itself be deterministic (or even predictable). Newton’s three laws of motion are deterministic, and so is his law of gravity. These apply to particles. As to the systems: The 2-body system is deterministic. But the 3-body system already behaves chaotically — even though it is based on the same, deterministic laws. Prof. Norton has been studying the non-deterministic nature of systems based on Newton’s laws and has many interesting points to note at his site, even at a level that is accessible to the layman (e.g., Norton’s dome).
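    (A minimal numerical illustration of SDIC, my own toy example rather than anything from Norton: two trajectories of the logistic map that start 10^-12 apart, evolved by the same perfectly deterministic rule, become macroscopically different within a few dozen steps.)

      # Sensitive dependence on initial conditions with the logistic map x -> r*x*(1-x).
      r = 4.0
      x, y = 0.2, 0.2 + 1e-12
      for step in range(1, 61):
          x, y = r * x * (1 - x), r * y * (1 - y)
          if step % 10 == 0:
              print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")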

    As to the in-principle stochastic theories, well, *all* of them pertain to *systems*. All of them may be regarded as mere approximations which, by their starting point itself, choose to leave out a lot of information.

    For instance, the kinetic theory. It leaves out the detailed description of the individual gas molecules (regarded as “particle”s), simply because there are too many of them. But the theory is useful. “Useful” doesn’t mean “fundamental”. The kinetic theory gets going by partitioning the system state, and then populating those partitions with groups of a large number of particles (but it in fact uses real numbers to denote the number of particles in a group, not integers!). Noteworthy: The information pertaining to the instantaneous *position* and *speed* of an individual particle gets retained in some way, even if only in an aggregated form, but the information pertaining to the *direction* of motion of each individual particle gets lost in the process. Have a careful look at this point. If you do, understanding many puzzles becomes so easy and immediate, whether it be reversibility and time’s arrow, the Poincaré recurrence, etc. …

    So, overall, the point of this addendum is this: People should make sure to know what they are talking about. Is it the basic law (which governs the *elements* of a system)? Or is it the behaviour of a *system* (i.e. an assemblage composed in a complex way from those elements, whether the complexity is due to the nature of their interactions, or due to their sheer number, or something else)? IMO, this is a highly relevant point, but the modern / present-day tendency is to gloss over it. Fundamental physical laws are deterministic, and yet can lead to a chaotic (or even fully non-deterministic) behaviour at the level of systems.

    Best,
    –Ajit

  417. Etienne Says:

    Philippe #413:

    Tentative answer : because historically it went just the other way around, i.e. making better and better experiments, built upon the previous step, and requiring better and better approximations ? And this is called the progress of science ?

    That’s missing the crux of the question though: why are the laws of nature even *approximable at all* at the macroscale?

    When my students implement simulations of physical systems and get the laws wrong—flip a sign, drop a term—they don’t get new, interesting chemistry and physics. They get complete chaos. Somehow the laws of the universe are such that they homogenize incredibly well at different length and time scales. (Not perfectly—even classical mechanics has its monsters, like turbulent flow—but well enough that it allowed rational exploration of science to develop in the first place).
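    (To make the “flip a sign” point concrete, a toy sketch of my own, not the students’ code: the same crude integrator applied to a spring, once with the correct restoring force and once with the sign flipped. The first stays bounded; the second blows up roughly exponentially.)

      # x'' = -k*x is a spring; flipping the sign gives x'' = +k*x, which explodes.
      def simulate(sign, k=1.0, dt=0.01, steps=2000):
          x, v = 1.0, 0.0
          for _ in range(steps):
              v += sign * (-k) * x * dt
              x += v * dt
          return x

      print("correct law,  x(t=20) =", simulate(+1))   # stays of order 1, oscillating
      print("flipped sign, x(t=20) =", simulate(-1))   # already of order 10^8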

    A steel spring has got its lattice of carbon and iron atoms, its dislocations, its delocalized cloud of electrons; nucleons and their quantum states and their quarks and strings and who knows what else—but at the human scale all of that melts away into the F = kx that even high school students can understand. It’s totally unclear to me why that must be the case.

    (You can argue that locality of physical forces, plus Taylor’s theorem (everything is quadratic if you don’t perturb it too much), leads to Dirichlet-like energies which promote smoothness across scales… maybe… but that doesn’t come close to fully resolving the question).

  418. Philippe Grangier Says:

    Mateus #414 : « You do need to provide a precise mathematical description of what is real, otherwise you’re just producing yet another Bohrian smokescreen. »

    Currently my best answer to your demand is https://arxiv.org/abs/2003.03121 , published as Found. Phys. 51, 76 (2021). It includes both systems and contexts in a unified algebraic framework, inspired by von Neumann’s paper quoted in #301. But it’s not a final answer yet, since it is essentially a ’static’ picture without time evolution.

    For MWI please see #410, unfortunately I don’t buy the idea, which also clashes with the above von Neumann paper in the asymptotic limit of a countably infinite number of particles (in this limit there cannot be any universal psi, due to sectorization).

  419. Mateus Araújo Says:

    Philippe Grangier #410: The challenge remains, as I mentioned in my comment, to actually produce this “mild” nonlocality in a realist model of quantum mechanics without a flagrant violation of relativity. Otherwise there’s nothing mild about it. The only way I know how to do it is with Many-Worlds. I believe it’s not possible to do it in a single world, but I’d love to be proven wrong.

  420. flergalwit Says:

    “An exponentially larger state space for all of reality…”
    While I know what you mean by this, is that necessarily the right way to think of it? After all, there’s this lecture

    Also regarding Q1, it might be relevant to note there are results saying that QM is in some sense a unique deformation of classical mechanics into a non-equivalent stable structure, and the instability of classical mechanics is strongly connected with its degeneracy in this framework (concretely, the exactness of pure states).[*]

    E.g. https://arxiv.org/abs/math/9809056 (and the 1977 Bayen et al paper referred to within).

    [*] This is a paraphrase of the summary of these results by Ludwig Faddeev, p.517 of Quantum Fields and Strings, a Course for Mathematicians vol.1. I have not read the 1977 paper.
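    (The canonical example of such a deformation, from the Bayen et al. line of work cited above, sketched here from memory: the Moyal star product on phase space, which reduces to the ordinary product plus the Poisson bracket as ħ → 0.)

      \[
        (f \star g)(q,p)
        \;=\; f \exp\!\Big( \tfrac{i\hbar}{2}\big(
          \overleftarrow{\partial_q}\,\overrightarrow{\partial_p}
          - \overleftarrow{\partial_p}\,\overrightarrow{\partial_q} \big) \Big) g
        \;=\; fg \;+\; \tfrac{i\hbar}{2}\,\{f,g\} \;+\; O(\hbar^2),
      \]
      % so classical mechanics sits at the degenerate point \hbar = 0 of a
      % one-parameter family, and QM is the (essentially unique) stable deformation.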

  421. Scott Says:

    Annnnd … while we may or may not have gotten closer to explaining quantum mechanics from some deeper ur-principle … I can now report that I’ve injured the same foot a third time, once again while playing with my kids. (I resolved to be more careful, but they begged me to be “It” in tag until I finally gave in, thinking the foot must be OK and healed now, and … you know the rest!) I’ll find an orthopedic doctor, of course, but I’m resigned to the likelihood that this is just a permanent diminution in my quality of life, and I’ll be more-or-less hobbling from here till the end. Well, less exercise, so in expectation probably less time left, but possibly more time each day to think about QM!

  422. Clinton Says:

    Stewart Peterson #386

    I’ll just take the first question. You asked:

    “I certainly understand that the mathematical structure of quantum mechanics can be described as an abstract operator calculus, but does this mathematical structure necessarily result in correct physics? That is to say, can correct predictions about physical systems be made from this derived, purely theoretical, result, with only the physical constants filled in by experiment – analogous to the fact that special relativity can be derived from Maxwell’s equations?”

    In Chapter 9 of his Quantum Computing Since Democritus class notes, Scott gives the best answer:

    “So, what is quantum mechanics? Even though it was discovered by physicists, it’s not a physical theory in the same sense as electromagnetism or general relativity. In the usual “hierarchy of sciences” — with biology at the top, then chemistry, then physics, then math — quantum mechanics sits at a level between math and physics that I don’t know a good name for. Basically, quantum mechanics is the operating system that other physical theories run on as application software (with the exception of general relativity, which hasn’t yet been successfully ported to this particular OS). There’s even a word for taking a physical theory and porting it to this OS: “to quantize.” But if quantum mechanics isn’t physics in the usual sense — if it’s not about matter, or energy, or waves, or particles — then what is it about? From my perspective, it’s about information and probabilities and observables, and how they relate to each other.”

    In other (my) words, the postulates of “quantum mechanics” do not (necessarily) give a physical theory. So, it does not “result in … physics” or “result in” anything. QM is a class of computation. And as a class of computation, there will be computational characteristics that go along with using it. So, think of it like a programming language that requires systems to be represented in a specific way (as a vector in C Hilbert space) and requires a (Hamiltonian) operator to specify the evolution of the system (this is how physicists “use” it). There is no physics “in” the class of computation a priori – again, just consequences of the characteristics of its computational class, characteristics of complex numbers, linear operators, etc.

    For example, Planck’s constant is not “predicted” by the postulates of QM. It gets introduced if we want to use Planck’s energy formula to talk about energy – i.e., develop the more refined wave picture in continuous time, where h must be experimentally determined. (See Nielsen and Chuang’s Quantum Computation and Quantum Information, Sec. 2.2.2)
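    (A minimal numerical sketch of that “operating system” framing, my own illustration rather than anything from Nielsen and Chuang: the formalism asks only for a complex state vector, a Hermitian generator, and the Born rule; the numerical value of any constant, including ħ, is an input you supply, not an output of the postulates.)

      import numpy as np
      from scipy.linalg import expm

      hbar = 1.0                               # a unit choice / experimental input
      H = np.array([[0.0, 1.0],
                    [1.0, 0.0]])               # some Hermitian "application software"
      psi0 = np.array([1.0, 0.0], dtype=complex)

      U = expm(-1j * H * 0.7 / hbar)           # unitary evolution for time t = 0.7
      psi = U @ psi0
      probs = np.abs(psi) ** 2                 # Born rule
      print(probs, probs.sum())                # outcome probabilities, summing to 1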

    As for making correct predictions, I would put it like this: QM is the most successful computational class we know for constructing predictive physical models.

    Now, all of that being said, we did “get more than we originally bargained for” from the formalization of QM, two examples being the experimental confirmation of the violation of the Bell inequalities, and the (very recent) experimental confirmation of the requirement of complex numbers.

    https://www.pourlascience.fr/sd/physique/l-intrication-quantique-confirmee-par-une-experience-de-bell-sans-faille-12185.php
    https://www.nature.com/articles/s41586-021-04160-4

    Whether one wants to call those results “predictions about physical systems” or “characteristics of the computational class”, … there are competing … interpretations 😉

  423. Lorraine Ford Says:

    Scott,

    Sorry to hear about your foot. I remember when I hurt my foot years ago, it seemed to take ages and ages to heal. It might just be a matter of time.

    I have a different take on the quantum randomness issue. I think that an awful lot of nonsense is talked about the supposed randomness of individual quantum outcomes.

    Surely the MORE important points about individual quantum outcomes are:
    1. The system, or a part of the system (e.g. a particle), has taken a definite step (seemingly in response to a situation).
    2. This definite step would be modelled (e.g. in computer program) as the assignment of a number to a variable.
    3. The fact that the number can’t be predicted by an observer is not as relevant as points 1 and 2.

  424. Jacques Pienaar Says:

    In physics, it is always necessary to begin with some basic postulates. A good postulate is one which we do not feel inclined to question; we are willing to simply accept it as something that happens to be true of the universe we live in, and we feel no need to ask “why is it true?” (Feynman explains this point with characteristic charm here.)

    As an example, the physics community could have postulated the Lorentz transformations: they could have said, let’s just take length contraction and time dilation as given facts. But when Einstein showed that they could be derived from two other postulates (constancy of light speed and the relativity principle) physicists mostly agreed that those postulates were a much better starting point. But why? Why does nobody go around demanding to know: “why the relativity principle?” but we do go around asking “why the quantum?”

    In contemplating Q1, we ought to first ask ourselves why we can’t just accept the “quantumness of the universe” as a postulate that does not require further explanation. (I’m not suggesting that it would be obvious how best to formulate this “quantumness” as a postulate — that would still be an open question to be debated as part of Q2. The key issue in Q1 is why ANY postulate representing “quantumness” should be needed in the first place).

    For us to accept something as a postulate, we must feel comfortable with not interrogating it with further “why?” questions. That means it must be a statement about the nature of the world which fits comfortably with our implicit framework of thought (world-view).

    Physics operates always within a world-view that is historically and culturally conditioned, which tells us which concepts demand an explanation, and which ones can be accepted without question. If we feel that the “quantumness” of the world is in need of explanation, it is only because there is something about it that does not fit easily or comfortably with our implicit world-view.

    As a rule, the world-view is always the last thing to change. If we cannot explain a phenomenon using postulates that fit nicely into our world-view, we just keep on trying. If it persists, we call it an “anomaly” and try to get away with ignoring it. And if we cannot ignore it, then finally, reluctantly, we look for an alternative world-view in which we can find new postulates that explain the phenomenon. The “relativity principle” would not have been accepted as a postulate at just any time in human history. The way had to be paved for its acceptance by the work of many thinkers (Galileo and others) who helped create a world-view in which such a postulate would be acceptable.

    In the present case, the first step is to ask ourselves exactly why our present world-view is one that happens to accommodate a “classical” model of the world more comfortably than a quantum model. Is it because of the inherent merits of a “classical-friendly” world-view, such as its elegance, simplicity, or “naturalness to the mind”? Or is it an accident of the particular historical and cultural moment in which physics is presently embedded, which narrows the physicist’s imagination to exclude any “quantum-friendly” world-view?

    If we are satisfied that our world-view stands up on its own principles, then it makes sense to hold on to it and ask: “why the quantum?” But if we suspect our present world-view might be too narrow (incidentally, this is not an unreasonable suspicion, given that the physicists who shaped the present world-view are disproportionately white men raised in the tradition of Western philosophy) then we should instead ask: “how do I change my world-view, so that the quantumness of the universe might fit in comfortably as an unquestioned postulate”?

  425. Stewart Peterson Says:

    Scott #420:

    You’ve helped me in this thread; I hope this helps you. You may already know all this, but in case you don’t, here goes.

    It takes a while to get on an orthopedist’s calendar. In the meantime, an experienced physical therapist (a real one, not a chiropractor), specializing in sports medicine, has seen just about every type of ankle injury, and probably much worse ones from dealing with football players and so forth. They may be able to prevent scar tissue and/or bone spurs from forming, which will make surgery (if you need it) less complicated and more likely to succeed, and will make rehab easier as well. Take it easy but don’t immobilize it; try to keep everything as loose as you can, with lots of motion and very low forces. It is, however, very important to deal with this quickly.

    Signed,

    A guy who tore a ligament in high school, had parents who didn’t think time was of the essence in dealing with it, and hasn’t been quite the same since. (Please take it from me – don’t do what I did!)

  426. Vampyricon Says:

    Has anyone said that the reason for QM is to make the universe maximally impenetrable/misinterpretable for those who do not make a living studying it? 🙂

  427. Philippe Grangier Says:

    Mateus #419

    Please look at the light cones picture in Fig. 1 of https://arxiv.org/abs/2012.09736, I don’t see any flagrant violation of relativity here. But again, you have to admit that a contextual inference is only an inference, and not an influence. Deciding how `mild’ it is belongs to you…

  428. Marten Says:

    I am delighted by these questions and I agree they are important and perhaps answerable. I’m very interested in your take on them. Still, as a theoretical physicist I have to say that I expect especially the second one to be the wrong question (although quite close to the right one). I expect the question should be “Why do we perceive physics to be governed by Quantum Mechanics?” I guess you want to exclude discussion on this topic (point 4) but I think it should be considered: not as a way out of having to throw away determinism, but because every theory of physics we ever had turned out to be only an “effective theory”.

    Pursuing this direction does not have to be like throwing up your hands in wonder and saying, oh, but what if some other beautiful theory exists that we don’t know yet. Often useful classifications can be made even from a position of relative ignorance. From the infinite space of possible quantum field theories, we can understand how, in the low-energy experiments we are able to do on earth, most of them end up looking the same. Physicists first computed things in the simplest cases just because there they could do the computation. Only later was it understood that if nature was not that simple, it would still come out looking exactly the same, at least at energies we can reach on earth.

    It could very well be the case that something similar is going on in the class of possible “Quantum-Like” theories. The simplest theory isn’t always right (see Q1). But it might be that more complicated theories look the same in some regime that we happen to be living in. The guiding principles of effective field theory could very well inspire a useful approach towards classifying the space of possible Quantum-Like theories.

  429. Philippe Grangier Says:

    Scott #420

    Scott, just wait a few more years, and you will get grandchildren ! They are a lot of fun also, even if you don’t run as fast as before. And since their parents take care of them, you have only the good part of the game, and a lot of time to think about QM !

  430. Chris W. Says:

    Scott, thanks for the clear answer regarding various definitions of determinism w.r.t the universe!

    There is one particular question about Q1 that’s stuck in my engineer’s brain (“how to engineer a universe” 😉 ):

    Maybe it is just not possible for creators to make a universe with an arbitrarily complex rule set?

    The idea (btw is there a way to prove that there is always at least one of the many worlds in which you don’t make a fool of yourself?):

    • I assume in the simulation hypothesis any number of rules can be imposed externally on the simulated universe
    • But if it is not a simulation, i.e. all the rules have to be packed into the initial condition of the universe, maybe there is a restriction and you can only fit in a very very “tiny amount of rules”
    • (even more crazy speculation) Something with randomness and interference of all possible options has the most “generative power” to create ever more complex emergent rules and structure?
    • If you add the Born rule as fundamental part, maybe this combination of random variation and selection yields the best generation of complex structure for the minimum amount of “resources”? (something like evolution / evolutionary algorithms)

    Of course this all is based on the assumption that for any complex universe the necessary classical rule set would be larger than a quantum one.

    Another motivation for the “complexity of underlying rules restriction” speculation is my understanding of the universe’s initial condition (singularity), i.e. absence of any complex structure

  431. Marten Says:

    “Why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience?”

    I think this doesn’t even need an anthropic argument. Consistent complicated theories with many parameters usually have simpler consistent limits when taking some parameter to 0 or infinity. It is much easier to come up with a simple theory than a very complex one. Thus, we first understand physics in certain limits.

  432. Scott Says:

    Thinking “aloud” here: if there were some other universe that were simulating ours, then presumably that other universe would need at least as much computational power as quantum computation, but it could also have much more power. After all, if an applied or experimental scientist uses a classical supercomputer to do a massive amount of linear algebra, they don’t think about it as a “terrible waste of computing resources,” since linear algebra is in the complexity class NC1, whereas the classical computer can do all of the seemingly larger class P in polynomial time. In fact the scientist thinks of the supercomputer as just the right tool for the job. Structural complexity would only seem relevant to the scientist, only worth looking into at all, if it were telling them that they couldn’t do the computation they wanted to do. So, in the same way, even if God has the power of PSPACE or EXP, She does not necessarily consider it a waste to use that power to run our merely BQP universe.

  433. Kashyap Vasavada Says:

    I will try to answer Q1. Once God decided to make thinking animals like humans out of constituents like atoms and molecules, which are some billion times smaller than human size, he/she did not have any choice but to make those laws different from the laws operating in the human world! Humans cannot have any experience of living in an atom-sized world. So human intuition has to be based on everyday experiences in dealing with tables, chairs, trees, and wild animals, which are classical. Then he/she had to make laws different from classical laws and of course, they would appear weird to the intuitive thinking of humans. He/she had to find something different from everyday deterministic logic. Non-deterministic means probabilistic. So I can understand God’s design and motivation very well!!!

  434. Stewart Peterson Says:

    Clinton #422:

    Thank you very much for your thorough and thoughtful response.

    I was always taught to “shut up and calculate,” and furthermore, that anyone who didn’t shut up and calculate was a science-denying crazy person. That may have caused me to inadvertently ask: “is there a set of axioms, which we are not allowed to question, and from which we can shut up and calculate?” Obviously, this is not how science works, and I knew that then as now.

    The formulae we use to calculate are clearly correct to the best of our knowledge, but they are (historically) the product of inductive reasoning from experiment. Shall I conclude, then, that there is no definition-theorem-proof version of these? Effectively, that the mathematical structure, as far as it has been developed, allows us to construct non-physical models which do not match real-world data, but which are consistent with the mathematical structure?

    Is it even possible to construct a more-constrained mathematical structure, from which we can shut up and calculate? For example, deriving special relativity from Maxwell’s equations does not tell you, numerically, what the speed of light is. Nor would deriving the Planck energy equation from such an abstract structure tell you, numerically, what Planck’s constant is – but it would give you the equation, which will fit the data you will get once you conduct your experiment, and you can fill in the constants.

    Ordinarily, such a research program would not have much value, as it would reveal what we know anyway, and add a layer of formalism that nobody would ever use in practice. However, it would allow us to determine if the equations we use could have been derived, rather than inferred from experiment – if that is the goal.

    If the goal is unfalsifiable metaphysics, however, we could just as easily say this:

    The thought process which was used to develop quantum mechanics – developing rules through abstraction from data through inference conducted by debate – has cultural roots in Talmudic scholarship. Therefore, “God made the world quantum” so that His chosen people could discover it! If the universe only makes sense if analyzed in the manner of a Talmudic scholar, and no one can come up with a functionally-equivalent system that does not use Talmudic scholarship to develop it, that should demonstrate to everyone, forever, how Judaism is the only correct approach to understanding the world and everyone must come to it or forever wallow in ignorance. (Now, just to be perfectly clear, that doesn’t make anything “Jewish physics.” Once the physics is developed, it’s a physical fact, and anyone of any origin can process the information and understand it. It’s just that it was developed in a cultural system created by Jewish religious scholarship, to which the rest of the world must be forever indebted and which everyone else would be foolish not to use, if it is so effective as an analytical paradigm.)

    It depends, then, on what is meant by “why” QM. Why, in the mathematical sense of “from what is it a consequence,” or why, in the metaphysical sense of “what is the deep spiritual truth?” I was addressing the former question, since it seemed to be more concretely useful – see, e.g., the P!=NP proof attempt above – although I am certainly capable of addressing the latter. (I am writing a manuscript along the lines of the previous paragraph, which is unsurprisingly my personal experience. I found secular, “cultural Judaism,” by looking for a worldview that reflected reality. It is responsible, in my opinion, for everything that still works in American society – physics, math, technology, and even some of our military thinking – and the waning of its cultural influence has created our toxic political environment. I believe I can make a very good case that America desperately needs cultural Judaism, before it follows a similar self-destructive path to that which Germany followed when it decided to exclude Jewish culture. We have an advantage: democracy and a free press, through which we can advocate for our values. All is not lost, but we must step up.)

  435. Crackpot Says:

    Scott #407:

    A symmetry is just a statement of independence, that one property doesn’t depend on another property, yes? (That question is serious. I’m continually baffled by the way people talk about symmetries, like they’re deeply mysterious, that there’s no “reason” for, for example, radial symmetry to hold in a particular case; to me, that’s basically a statement that the default assumption is dependence instead, which seems more complex than the converse.)

    Or in the strong case, a symmetry is a set of properties that is independent of another set of properties, which granted can look a little weird when subsets of two independent sets themselves are not independent (i.e., {a,b,c} and {d,e,f} are independent, but {a,b} and {d,e} are not) – which I think is where the “weird trick” comes from, in a roundabout kind of way, noticing the properties c and f are actually in the set.

  436. Aditya Prasad Says:

    Hi Scott,

    I don’t know if you remember me, but we got a chance to hang out a couple years ago.

    I’m not sure if you’re familiar with the work of Markus Muller, but his notion of “observer states” closely resembles the view of certain spiritual traditions. If those adepts had thought hard about physics, I suspect they would have noted two things. First, what Yoni mentions (freedom in the future and a fixed past), but second, that it needs to provide the kind of almost-solipsism that can only come out of something like the measurement problem. That’s what prevents it from being implemented classically, I think. (You might wonder “but *why* do we need something like almost-solipsism,” which I could try my hand at answering perhaps. Of course, it probably won’t be an empirically justifiable answer…)

    I might also point to Rovelli and his recent discovery of and affinity for the Buddhist philosophical notion of “emptiness.” In short, emptiness could be described as the realization that “there is no way that things ‘actually are.'”

    I imagine you’re reluctant to take seriously the kinds of claims that religious people often like to make about having discovered this or that scientific principle ages ago, but something quite different is going on here. I’d be happy to say more, but this is probably enough for now.

    Cheers,
    A

  437. Johnny D Says:

    Scott #404
    Step 2: why do dynamic states need to be superpositions of static states?

    Atoms have static states. Static states give the geometry of chemistry.

    Atoms interact with other atoms to give chemistry and materials. When 2 atoms are close, they interact. Since after interaction, 2 atoms that started, say, in their ground states can be in any superposition of static states, superposition allows many more types of interaction.

    It’s fun! An atom in a superposition can have a fixed but uncertain energy and an oscillating spatial distribution.

    What features of chemistry or materials are necessary, and necessarily require these superpositions? I am not an expert, but there are examples (famously, photosynthesis).

    I imagine heat transfer, liquids…

    Certainly it is easier to create the universe with superposition superpowers.

  438. Ajit R. Jadhav Says:

    Scott #421:

    Oh! How these things *happen*!

    …Get well soon!

    Kashyap Vasavada #433:

    I had written the following paragraphs in my original reply, i.e. comment no. 416 above. I had saved my original reply to a plain-text file too. But in the mechanics of the last-minute editing, somehow, these two paragraphs got deleted. By me. Even though I don’t at all recall how I ended up doing that. (How things happen!) [I can tell that it was me who deleted these paragraphs only because I had saved both the initial version and the final version in plain-text files on my HDD!]

    So, originally, in my comment #416 above, the following paragraphs appeared at the end of my answer to Q1 (i.e. just before going over to Q2).

    Quote:

    OTOH, yes, we do know that ultimately, you can’t go on using ideas of just the electrostatics (the way it is, in Schrodinger’s wave mechanics). You have to use electro-*dynamics*, which “automatically” means the special theory of relativity. So, in that sense, the universe isn’t *really* QMcal, it is QM + relativity-theoretical. And so on…

    So, the Q1 itself is “wrong” in the sense that it uses too vague a terminology. There are at least two vague terms in it: the idea of the Creator God, and the word “classical”. I would leave aside the idea of God in any discussions of the quantum phenomena. (Also the idea of consciousness, and, though no one mentions it, also the idea of life, etc.) Even then, I had to point out that there is no such thing as *the* “classical” world in the first place! There are many different “classical” (actually pre-QMcal) ideas, and so, possibly, many different notions of classical worlds.

    Unquote.

    Finally, might as well insert a brief note, regarding Physics, General Philosophy, and Theology

    Theories of physical sciences, in particular of physics, fall in a different class from those of, say, philosophy.

    But first, a point about their similarities: Both philosophy and physics are forms of science. In both, fundamental ideas are inductively conceived of, starting from a body of direct observational knowledge.

    Now the difference: The observational data, i.e. the phenomenological knowledge, on which the philosophic truths rest are such that this knowledge would be available to any thinking adult, of any profession, at any time and age, in any culture, in any location, etc. This specific phenomenology, therefore, ends up including in itself only the simplest kind of observations. It is for this reason that philosophic observations are of the kind that would commonly form an element in any kind of inquiry.

    OTOH, the phenomenological knowledge pertaining to physics is much more detailed and specific in nature. Physics knowledge is only derived from (and only applies to) certain special class(es) of phenomena — those pertaining to the nature and actions of inanimate objects and the similar characteristics of living beings. Physics critically makes use of the experimental method. The observations which it makes use of are such that the sum totality of the observational base is subject to revision.

    Thus, the fundamental truths of physics can be subject to revisions.

    It’s wrong to expect from any physics theory — even the most fundamental physics theory — the same nature as that which defines the philosophical kind of knowledge. To expect or ascribe such characteristics to theories of physics is to commit a primitive error. Such an error, in today’s world, typically arises out of too easily dismissing the entire domain of philosophy as such. [The late Professor Dr. Richard Feynman of CalTech easily springs to mind.] But that’s not the only reason. There is another reason, another motivation too: the attempt to wrench away the prestige which is actually due to philosophy, and to pass it off as prestige attached to one’s own profession. [Western scientists, and Americans in particular, ought to introspect.]

    OTOH, in looking for a stable / immutable kind of observations, philosophy too ends up focusing on such aspects of the real world, and therefore ends up deriving such a kind of knowledge, that this knowledge is often not of the most direct relevance in applications, or of crucial importance in choosing between alternative answers. Philosophical knowledge provides a base for all other kinds of knowledge. But precisely for that reason, philosophy alone cannot settle issues like, e.g., those mentioned in Q2 here, or even in Q1.

    Theology, to me at least, looks like a primitive / rudimentary form of philosophical thinking. Also, for that same reason, it is more easily susceptible to being irresponsible. This is an issue of the type of a body of thought (not necessarily knowledge, but just a body of thought). It is not an issue of practitioners. Some theologians in fact are, in their personal dealings, the most considerate and responsible people you would ever run into. But personal virtues like that do not change the nature of the body of thought. They do not change what kind of abstract thought theology encapsulates and puts forth.

    Needless to add, expecting to settle physics questions via appeal to theology is even worse an error than expecting physics to settle philosophic questions or vice versa.

    Best,
    –Ajit

  439. Paul Hayes Says:

    Clinton #422.

    The trouble with Scott’s computer-sciency answer is where he says “quantum mechanics sits at a level between math and physics that I don’t know a good name for.” But we do know a good name for it, and have for many decades (follow the link). It’s long been recognised that quantum theory is a (partial, algebraic) generalisation of probability theory, and that quantum mechanics is just its (natural) application to mechanics: “[quantum] probabilistic mechanics”.

  440. Philippe Grangier Says:

    Jacques Pienaar #424: « In the present case, the first step is to ask ourselves exactly why our present world-view is one that happens to accommodate a “classical” model of the world more comfortably than a quantum model. »

    A tentative answer, taken from https://arxiv.org/abs/2105.14448 : The normal approach in physics since Newton has been to define objects, to attribute properties to them, and to measure these properties. One then asserts that the object “has” this property, for example that it has a position, a velocity, or a momentum. Does this “natural” approach work in quantum mechanics? Although physicists are extremely reluctant to admit it, the answer is clearly no – and this “no”, correctly interpreted, provides the empirical element that is missing in our understanding of the quantum description of the physical world.

    You will find more in this paper, but another way to put it, at least in my view, is that QM requires us to abandon classical reductionism, the idea that nature is built from smaller and smaller parts, like a house is built up from bricks. This is obviously not the case in QM; you can see it from entanglement, from contextuality, from non-commutativity… any one you prefer. This leads to the idea that in QM the physical object is not a (Newtonian) system alone, but a system within a context. This is quite hard to swallow indeed, but still physically and philosophically acceptable, and it certainly does not imply the death of physical realism.

  441. Andrei Says:

    Philippe Grangier,

    “In my understanding of superdeterminism, the non-independence between A, B and S must come from their overlapping past light cones, in order to avoid a clash with special relativity.”

    Indeed.

    “This possibility was already considered by Bell, but it makes that there are no more independent events, no more randomness, no more freedom of choice, since everything has been ‘written in the past’.”

    This is just good, old determinism, like Newtonian mechanics, classical EM, GR and so on. None of those theories allows for randomness or freedom of choice. This is not a valid argument against superdeterminism.

    “So yes, I dislike this option, and no, I don’t think it is the only local one on the table.”

    Well, in this case, please refute my argument, since your last attempt was based on a misunderstanding of the argument (you assumed freely orienting detectors). Alternatively, you can provide a valid counterexample to the argument. Please explain how predictive incompleteness reproduces locally the perfect correlations of the EPR-Bohm setup.

    “It is true that predictive incompleteness is not easy to grasp because of its fundamentally non-classical and contextual features, but well, it does the job.”

    I’m looking forward to seeing how it does the job. How can you make two random measurements (measurements at A and B) always agree? The statistics of such events (two coin flips) says that the probability of agreement is 50%. You need 100%.

    “So coming back to your preferred (Bohm-like) option of giving up locality, I think that you have a wrong understanding of contextual randomness.”

    1. My preferred option is locality. It is a reasonable option, based on the success of SR.

    2. I don’t think there are different types of non-locality. If A and B are space-like separated and A causes B or B causes A, you have non-locality. In order to get the A and B measurements in an EPR-Bohm setup to agree, you need this. There is no such thing as a “mild” or “weak” non-locality (non-Bohm nonlocality). Either your theory is local or it is not. Scott also claims that the non-locality required to get perfect correlations in an indeterministic context is not problematic for SR. I refuted that claim in my post #296. The non-signaling theorem is just a red herring. The inability to control an otherwise non-local phenomenon does not make it local and does not make your theory compatible with SR.

  442. wolfgang Says:

    If you look for a deeper understanding of quantum theory, I would consider the inelegant properties as a starting point.
    The evolution of the state psi(t) uses time as a classical parameter, implicitly assuming the existence of classical clocks – and where would they come from?
    A formulation of quantum theory without this classical time would be very interesting…
    T.P. Singh, 2021 is a paper which proposes such a formulation and might be interesting for your project…

  443. Martin Ondráček Says:

    I think the answer to the Question depends a lot on the interpretation of quantum physics. So it is no surprise that various interpretations have been talked about in this long discussion above, and I am actually surprised that the discussion did not degenerate into a “my interpretation is better than yours” contest right away. I wonder if this took a particular moderation effort on our host’s part or if people here are that reasonable in general. Anyway, my own suggestion concerns the many-worlds interpretation. The idea is that it could be the most efficient theoretical framework (or at least one of the few most efficient) in terms of generating as many “interesting stories” per “number and complexity of rules”. Something similar has of course already been mentioned multiple times in the comments above. I just wanted to stress that it may be the “ratio” by which “quantum mechanics with many worlds” wins. Sure, something like the Game of Life may have simpler rules yet, but the rules that define, say, the Standard Model are not that terribly convoluted either. It helps that quantum theory itself, without considering a particular model and particular parameters, is just linear unitary evolution in a Hilbert space, after all. And while the Game of Life or classical mechanics were able to produce the complexity necessary for something like life to exist too, there would be only so many different structures and different “stories” taking place for a given initial configuration in the classical world, while a never-collapsing universal wave function could produce much more, even for one particular “initial condition”. Needless to say, this all remains a very fuzzy idea until someone comes up with a way to formalize the “ratio” I have mentioned here, something which I do not know at all how to do.

  444. AK Says:

    It does not look like the universe has been quantum-mechanical.
    The Hamiltonian in Wheeler-DeWitt superspace is equal to zero. There is no concept of time. Unfortunately, we do not have the right tools to understand it at the moment.

    However, a “tiny” part of our Universe has been quantum-mechanical. Why?
    So we can have a valid formula for entropy

    $$S=\log\frac{\Delta p\Delta q}{(2\pi\hbar)^N}$$

  445. Philippe Grangier Says:

    Andrei #441 :

    « Please explain how predictive incompleteness reproduces locally the perfect correlations of the EPR-Bohm setup. »

    Technically (in the shut-up-and-calculate version), it works just like standard QM. About whether this is local or not, see the exchanges with Mateus Araujo. In my opinion, the distinction between elementary locality and predictive completeness is conceptually quite useful. But if you prefer not to make it, and to put everything in the same ‘non-locality’ black bag, this is your choice, and it amounts essentially to how you define non-locality. Predictive completeness has no problem with SR; see previous posts and reference.

    « If A and B are space-like and A causes B or B causes A you have non-locality. ».

    In an EPRB experiment, Alice’s measurement does not « cause » Bob’s result, which is anyway undefined as long as Bob has not decided on an orientation. Her measurement only allows Alice to make a contextual (and probabilistic) inference about Bob’s result. Nevertheless, if you consider this to be still too ‘nonlocal’, and prefer superdeterminism as a fully local alternative, I cannot prove you wrong – I can only say that in my framework, superdeterminism is not the only alternative (and certainly not my preferred one).

  446. murmur Says:

    Scott #405:
    >[…] whatever is the most fundamental theory of the physical world, why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic […]?

    Effective field theory?

  447. Dimitris Papadimitriou Says:

    Sorry to hear about the problem with your foot. In the previous year, I had a similar problem with my right arm.
    It took me several months, but now it seems o.k. I wish you the best.
    I thought about your questions a bit more, and I think that QM exists because Einsteinian Gravity (or at least a slightly modified version of it) has to exist, for a world that has complexity, observers, etc.
    Maybe the answer (supposing that there is one…) is that obvious. I’ll give several reasons for this:
    Gravity is the only “interaction” that affects causality in a non-trivial way, and this is important for a macroscopic world with some kind of observers. One could “put in by hand” some causal structure in some other contrived theory, but in GR this is an intrinsic characteristic of the theory.
    In some sense, non-trivial causality and Einsteinian Gravity are almost synonymous.
    Here QM comes in: for the universe to make sense, you need a “stable”, well-defined causal structure.
    QM permits violations of some energy conditions, but not, it seems, of the “averaged” energy conditions.
    So it poses serious problems for things like warp drives, traversable wormholes, etc., while GR alone, as a theory of spacetime (but not of matter fields), is not so constrained.
    Also, gravity is the only universal interaction. It is also highly non-linear and has that kind of inherent “instability”, due to its attractive nature, so it needs something else for things to be stable and non-static at the same time. QM is that appropriate “something”.

    By the way, I don’t think that a reasonable world with Newtonian gravity can exist, even though some people are taking it for granted that it could. But that’s another story…

  448. Johnny D Says:

    Concluding thoughts of my comments 249 and 437.

    How can we list phenomena that require superposition? These phenomena are mostly left undiscovered since classical computation is disadvantaged at finding them. Quantum computers may discover zillions of phenomena that require superposition.

    Reasons quantum computers may contribute greatly to humans:
    1. The curiosity of just building them
    2. Give us faster algorithms for some problems
    3. Discover physics phenomena that require superposition

    I am hoping for lots of 3!

    My imagination runs wild. My favorite is quantum models of liquids. What properties of salt water flowing through ion gated membranes are truly quantum?

  449. Paul Hayes Says:

    Clinton #422

    Sorry about the wrong link. The correct link under “many decades” is here. The link to follow – to the first publication of an explicit recognition that QT is PT that I know of – is to this 1954 paper of Segal’s.

  450. Clinton Says:

    Hello Paul Hayes #439,
    You are absolutely correct to point out the fact that QM is a generalization of probability theory. I’m glad you did. In the midst of composing my reply to SP I myself paused at that exact sentence in Scott’s lecture notes and wondered if I should include it or include more of Scott’s own words about the GPT aspect. As you are no doubt aware Scott is all the time pointing out that QM is “just a generalization, or extension, of the laws of probability to allow minus signs.” And in my own post #118 above I refer to QM as a “Probability Model Builder”. But, again, I’m with you 110% because this is important.

    My thinking behind not expanding on it in my reply was that the question posed by SP appeared to be coming from a perspective that I know all too well – and that is the perspective of what I would call the “Traditional QM Physics Magic Story”. If you were a student like myself (Graduate Degree Engineering) who went through a few physics courses in college then you probably were indoctrinated into the QM “mystery” the way that I was.

    QM is (still to my knowledge) almost always first conveyed to students (or the general public) like a “Magic Mystery Story”. You know how this goes … “So, we’ve got these two slits and we begin firing electrons through them … wouldn’t you expect that they just go through one or the other …” It feels in a lot of ways like a magic trick … like a play on someone’s expectations.

    QM is (to my knowledge) rarely first conveyed to students (or the general public) like this: “So, we want to make predictions about what is going to happen when we conduct physics experiments. That means we are going to need a probability theory. Let’s begin by considering the probability theories that are available to us, which one(s) we might prefer to use for various reasons (such as being complete and closed, linearity, choice of basis, etc.), and then we will consider some actual experiments to see if the probability theory we think is best actually turns out to be good for the job.”

    Now, I’m not accusing introductory physics teachers of purposefully attempting to mislead people or of consciously teaching things this way. And I am SURE that there ARE some physics instructors who DO attempt to make the GPT connection from the start. Feynman, a great popularizer, clearly said that what made QM different was that it allowed negative probabilities. We just find ourselves at the end of a century-long process where physics just HAPPENED to be the application field where QM first arose. And, so, physicists tell THEIR story in the sequence of how it happened to them. Nothing wrong with that.

    What I sensed in SP’s question was a familiar perspective (one that I once held) that was based on how QM had been presented to me as “only something weird to do with physics”. Scott’s paragraph there from his lecture notes I felt was a good starting point for someone (as it was for me) in realizing that the “QM Magic Mystery Story” may not be the only way to understand or approach QM.

    I remember one of my mathematics professors related to us (I think this is from Schopenhauer) the three stages of truth:
    1. It is ridiculed.
    2. It is opposed.
    3. It is said that “We have known this result for some time now …”

    And so I am very glad to hear that we have known for some time now that QM is recognized as a GPT.

    Thank you again, Paul, very good point!

  451. Mateus Araújo Says:

    Philippe Grangier #427: I believe you misunderstood my challenge. I’m granting you that on the probability level violating (2) is a mild form of nonlocality. I’m challenging you to produce these probabilities from a realist model, without flagrantly violating relativity. Your paper does nothing of the sort.

  452. Dimitris Papadimitriou Says:

    Mateus Araujo #398
    I agree with you about the need for clarity, and a plausible, realistic (if you like) description of the physical world at the fundamental level. I’m not a proponent of operational or naively “epistemic” ways of thinking about nature, at least not on the fundamental level. I prefer to be agnostic about QM interpretations, though, and the reason is that the standard QM formalism does not give us enough information to decide which interpretation is the “correct” one.
    I also respect the alternative theories that you mentioned (especially the various objective-collapse or gravity-induced reduction theories that are potentially testable), and I’m trying to avoid premature conclusions, because there are many potential surprises for us in the future, or so I guess…
    I just do not find that kind of clarity in any interpretation, not only in the epistemic ones.
    In many respects, the Everettian / many-worlds class of interpretations (there are many different interpretations with that label, not a specific one – for example, S. Carroll seems to disagree with you about the local branching…) are more problematic than the traditional textbook ones.
    The reason for this, I think, is that such a realistic description is not possible if many serious issues are swept under the rug.
    Most people are concerned about the compatibility of MW with the Born rule, or with the preferred-basis problem, but, in my opinion, the most problematic thing has to do with the “semi-classical limit” that I mentioned in a previous comment. All the usual debates about interpretations omit gravity / spacetime / semi-classical-limit issues, so I cannot take them too seriously, I’m afraid.
    QM works perfectly well, as far as we know, and for the time being, that’s all.

  453. Scott P. Says:

      Now, you propose a different measure: that we should count equally each set of worlds that share the same measurement result. As you noticed, this has the fatal flaw of contradicting the data.

    Usually when the hypothesis (MWI) contradicts the data, we reject the hypothesis, not simply redefine things so that our hypothesis remains correct. 🙂

      It’s also ill-defined: these measurement results are just one decoherent event we chose to pay attention to. There’s plenty of decoherence events happening all the time, everywhere. To actually count each decohered branch equally we would need to take them all into account. It’s clearly a hopeless proposition, and to the best of my knowledge nobody has even tried to do that.

    I don’t see why it would be hopeless in principle. The whole point of a QM interpretation is to explain why we see the results we see.

  454. Gil Kalai Says:

    Very interesting question and discussion. This is a sort of meta-meta question that seems remote from physics yet probably requires much knowledge of physics and its history.

    Three questions that might be related are

    1) What is the origin/meaning of probability in nature and why is there probability?

    2) What is the origin/meaning of chaos and why is there complete chaos?

    (By chaos I mean behavior that cannot be predicted (even probabilistically). Chaotic behavior seems crucial to the great difficulty of extending (practically) logic from the TRUE/FALSE setting to a probabilistic setting, and of answering questions about the probability that some complex statement is correct*.)

    3) Why is there something rather than complete chaos?

    I would speculate that quantum mechanics is a framework (perhaps “minimal” or even “unique” in some sense) that allows answers to, or at least “better understanding of”, these questions 1) 2) 3).

    (If true this can serve as an answer of some sort for “Why should the universe have been quantum-mechanical?”)

    My thinking is influenced by that of Itamar Pitowsky’s (that was mentioned in the thread, by Peter Morgan in #146), but I certainly don’t represent or even remember Itamar’s precise views on these three questions or on the question that Scott raised.

    * For example, Rudolf Carnap, a central member of the Vienna philosophy circle, had a programme which he believed could lead to a whole logical calculus of probability, starting with an answer to the question “What is the probability of a statement A given the validity of statement B?” and ending with an answer to “What is the probability that a theory X is correct?”.

  455. Jes Wolfe Says:

    Somewhere I saw a paper that argued that even God could not resolve the state of a quantum universe without “collapsing the waveform” or whatever the right metaphor is. It makes me think the answer to Q1 is a kind of radical No Privileged Observers philosophy. Analogously to what relativity does to the ordering of events – there’s no “objective” frame for space vs. time – quantum mechanics does the same for measurement itself. The no-local-hidden-variables rule means there’s no God’s Eye View that’s hidden from us; the muddle that we find ourselves in is as real as it gets. And I think that’s beautiful. I suspect that future mind-warping physics will find something else that seems obviously constant and objective and demonstrate that it’s relative and conditional, in a way that we haven’t even considered possible before.

  456. Gil Says:

    typo: My question #2 should be:

    2) What is the origin/meaning of chaos and why is there chaos?

  457. Philippe Grangier Says:

    Mateus#451 « I’m challenging you to produce these probabilities from a realist model, without flagrantly violating relativity. Your paper does nothing of the sort. »

    I claim it produces these probabilities without flagrantly violating relativity, so we must disagree about what a ‘realist model’ is.

    If you mean some kind of hidden-variable, Bell-type model, then you are certainly right, this is the first option in my post #410.

    But if you consider the second option of #410, including predictive incompleteness of psi and contextual inferences, I don’t see any ‘flagrant violation of relativity’ whatsoever. On the other hand, I’m ready to admit a ‘mild form of nonlocality’, whatever that means, though I prefer not to use this wording.

  458. Michael Weissman Says:

    Scott- You’ve heard this before, but here are some thoughts on a narrow question within this area- why the Born rule?

    The only answer I’ve seen that manages to get it without either breaking the unitary rules or covering everything in metaphysical verbiage is Jacques Mallah’s “Many Computations” approach. To recap, for others, the question is why actual counts of results of approximately (pace Heraclitus) repeated experiments show Born probabilities. That’s very far from what one would expect from a naive count of outcomes in a pure unitary (many worlds) picture in which one pretends that distinct outcomes are well-enough defined to be countable. Mallah starts with pure unitary quantum dynamics and a radically materialistic definition of what constitutes a counted outcome. It’s a thought, i.e. a fairly robust calculation by a subset of the overall quantum system.
    So why would an outcome represented by a larger quantum measure get more counts? Mallah proposes that to be robust enough to “count”, the quantum representation of the coherent thought must have decent signal-to-noise compared to an incoherent background white-noise component of the state. Ordinary signal-to-noise considerations then give that the coordinate-space volume over which the signal must be averaged to beat the noise scales inversely with the measure, so the number of such volumes scales as the measure.

    The big question then becomes why the total state would consist of a coherent piece, the part we always talk about, plus the white-noise background suggested by the Born rule. I argue for a fundamentally anthropic reason. No other combination would lead to probabilities that factorize into stable probabilities of sequential events. (That property is just assumed in the Gleason argument and the many others that try to derive the rule from the pure structure.) Without that property, probabilities of past events would not be fixed after the event. The preconditions for evolutionarily favored thinking would seem not to exist without that property.

    At one point I argued that this anthropic constraint would account for the incoherent background. On second thought, we have already been tempted to use anthropic arguments to justify the 2nd Law, i.e. the existence of the initially low-entropy component whose behavior we usually analyze. So maybe the noise should just be taken as given, so that now the usual anthropic push is needed just for the existence of our coherent component, no longer to justify the non-existence of the incoherent component. In Mallah’s picture that incoherent component persists.

    Again, none of this says anything about why the whole unitary structure exists.

  459. Aditya Prasad Says:

    I guess I’ll say a little bit more about why “almost-solipsism.”

    We’ve already seen why indeterminism “should be” one of God’s design goals. A crucial question is, when should any particular instance of indeterminism get resolved?

    Well, from the perspective of any one of her countless incarnations, God wants reality to look free. And if indeterminism gets resolved before it gets to me, then it doesn’t look free to me. So I must be the ultimate resolver.

    Of course, this comes with a little problem: it’s logically inconsistent for each incarnation to be the ultimate resolver in a single shared reality. And we don’t want to do something silly like place each incarnation in her own reality where everyone else is an NPC, because those incarnations tend to go crazy instead of waking up (which is also a core design goal…). Something a little more clever is needed.

    I think you already see where this is going. The question you may have is, am I not just describing our _actual_ reality and trying to retrofit these design goals? And the answer is no: the Buddhists really did discover these design constraints ages ago. The Buddhist “no-self” is the fact that there are no observers; only observer-states (a la Muller). Their “emptiness” is the fact that there is no fixed external reality that is presenting itself to you. Liberation is the direct perception that reality is manifesting itself in this beautiful and crazy way, instant by (illusory) instant.

  460. Paul Hayes Says:

    Clinton #450 (“If you were a student like myself (Graduate Degree Engineering) who went through a few physics courses in college then you probably were indoctrinated into the QM “mystery” the way that I was.”)

    No, I was a bog standard physics degree student. So all the more scandalous that I was taught QM as almost all physics students were then, and still are: the current textbooks and courses still take those mysterious, “view from nowhere” Dirac-von Neumann axioms and go from there, and the malign influence of [semi-]classical antiquity is still felt.

  461. Mateus Araújo Says:

    Scott P. #453: In order to make probabilistic statements in Many-Worlds we need to introduce a measure over worlds. The non-probabilistic part of the theory (Hilbert spaces, tensor products, Schrödinger equation) does not determine which measure this is, it is a logically independent postulate. In Everett’s original paper he chose the 2-norm measure, which does fit the data. Now, six decades later, you are insisting that we have to replace the 2-norm measure with this “every measurement result is equal” measure. Even though you realize yourself that it contradicts the data. Moreover, you insist that the fact that your measure is contradicted by the data implies that we have to discard Many-Worlds. Not the measure you propose.

    I am at a loss for words.

  462. Pantelis Rodis Says:

    About Q1. If we built a universe with only classical physics, then this universe would lack randomness; it would be deterministic and computable. There are many reasons why such a universe is not ideal to live in if you are an intelligent being and you want to at least try to make your own future.

  463. James Gallagher Says:

    Scott #420

    Sorry to hear about your foot, I’ve had an issue with gout last year and couldn’t walk for several weeks.

    I think QM explains our universe because we need unitary evolution, otherwise we’d explode to infinity or shrink to zero; and once you allow fundamental randomness, then something like Gleason’s theorem, or your argument in the “Is Quantum Mechanics an Island in Theoryspace?” paper, or just simplicity, makes the Born Rule pretty obvious. (Of course the evolution may not be perfectly unitary, just enough for the stability we observe)
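
    (A tiny numerical illustration of that “explode or shrink” point, using a made-up 2×2 example rather than anything from real physics: repeatedly applying a unitary keeps the state’s norm fixed, while a generic non-unitary map drives it to zero or infinity.)

        import numpy as np

        theta = 0.3
        U = np.array([[np.cos(theta), -np.sin(theta)],   # a unitary (here a real rotation)
                      [np.sin(theta),  np.cos(theta)]])
        M = np.array([[1.1, 0.0],                         # an arbitrary non-unitary map
                      [0.0, 0.7]])

        psi_u = np.array([1.0, 0.0])
        psi_m = np.array([1.0, 0.0])
        for _ in range(50):
            psi_u = U @ psi_u   # norm-preserving step
            psi_m = M @ psi_m   # norm-distorting step

        print(np.linalg.norm(psi_u))  # stays 1.0 under unitary evolution
        print(np.linalg.norm(psi_m))  # blows up (~117 here) without unitarity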

    (Btw, according to Abraham Pais, not only Born, but Dirac and Wigner initially suggested |psi| rather than |psi|^2 for the probability; see p. 9 of “Max Born and the Statistical Interpretation of Quantum Mechanics”)

  464. Mateus Araújo Says:

    Dimitris Papadimitriou #452: Carroll is wrong about branching, among several other things. He is not a good source on Many-Worlds. Saunders is much better. Wallace as well. I’m also a fan of Brown. Deutsch is also pretty good.

  465. Mateus Araújo Says:

    Philippe Grangier #457: In your paper you don’t produce probabilities from anything, you just say they are what they are. And by “realist model” I definitely do not mean a hidden-variable model, it’s simply a model where you define what the real objects are. For example, a naïve realist reading of textbook quantum mechanics is such a model, where you declare that both the quantum state and the collapse are real. It has no hidden variables, and it also flagrantly violates relativity. Ditto for a more sophisticated collapse model. No hidden variables, and violates relativity. With Bohmian mechanics you do have hidden variables, and also a violation of relativity.

    If you can’t produce your probabilities from a realist model, you don’t have a “mild form of nonlocality”, you have nothing. If nothing is real then clearly nothing violates relativity!

  466. Philippe Grangier Says:

    Mateus #465

    Sorry Mateus, it is clear now that we have branched into different universes, and that whatever one of us says is fully decohered for the other.

    You are too much into a view with a ‘real psi’ to admit that there may be real objects and properties, without requiring that psi is ‘real’; whereas this is obvious to me.

    I just (re)quote https://arxiv.org/abs/2105.14448 as a simple introduction, but unfortunately even DeepL will not translate this into something intelligible for you.

  467. Scott Says:

    Anbar #258:

      Well, the ado was taken care of by a few bright guys in the late 1920s…

      In which sense the formalism and interpretation of QM are inevitable, given the empirical behavior of even something as simple as a photon, is laid out by Dirac in the introduction to the Principles, and Von Neumann figured out the formal logic behind the projectors shortly thereafter.

    Do you understand that I’m not asking why you’d use QM to create this world—which is obvious—but rather, why you’d use it to create a world? While the majority of commenters here get this, there seems to be a persistent minority for whom it’s a completely ungraspable concept.

  468. Scott Says:

    Chris #267:

      if you were strict about a heuristic biasing you against ‘metaphysical extravagance’, you’d be a strict finitist, and your default assumption about the universe would be that it is finite in extent, contains finite matter, etc. Is that the case?

      Is it metaphysically extravagant to suppose that there are infinitely many mathematical structures, including all perturbations on ones that closely resemble our universe?

    These are excellent questions, but my answer is: whenever there’s an implied infinity in physics, as for example in QFT or GR, and it’s not “provably benign” like the quantum-mechanical amplitudes forming a continuum, I assume the infinity is just an artifact of current knowledge, and that future discoveries will fix it as they so often have in the past.

    By contrast, if QC is possible and gives the hypothesized speedups, then the exponentiality of quantum states is 100% real, even though it’s only exploitable in specialized ways. And at least at first glance, it seems wasteful and extravagant to solve a “polynomial-sized problem” (like the stability of matter or the UV catastrophe) by introducing an exponential amount of additional structure! Which is simply to say: if we want to go this route, then the challenge is to find some way of thinking according to which this is not nearly as wasteful or extravagant as it looks.

  469. Scott Says:

    Guyren Howe #284:

      An interesting related question: what would a universe look like that had Quantum Mechanics, but not Relativity?

    It would just look like nonrelativistic QM, wouldn’t it? Or are you asking whether that

  470. Philippe Grangier Says:

    Scott #467 : « Do you understand that I’m not asking why you’d use QM to create this world—which is obvious—but rather, why you’d use it to create a world? »

    Well, I guess I’m among the minority who did not grasp the concept, which was maybe not so clear in your initial questions; it may depend on your mindset. But at least I’m happy to read that the answer to ‘why QM rules this world’ is obvious to you.

  471. Ted Says:

    Scott, did my previous comment get lost in moderation? (I can’t say that it was terribly insightful, but I don’t think it was offensive or off-topic, so I assume that you didn’t reject it.) Roughly reproducing it here:

    Proponents of the many-worlds interpretation and decoherent branch differentiation argue (correctly, IMO) that the MWI follows naturally from the principles of quantum mechanics. I wonder if it’s possible to turn this argument around, and to take the general idea of the MWI as the fundamental starting point for “deriving” (or at least motivating) why QM describes the real world?

    Of course, that just moves the explanatory burden onto the task of finding some natural-seeming general philosophical principles that would lead to the idea of the MWI – which is likely even harder than motivating QM itself. But to me, the idea of the MWI is one of the biggest conceptual departures of QM from classical laws – much bigger than simply modifying the classical stochastic 1-norm to the quantum 2-norm – so starting there might be a promising avenue for motivating QM from the most general philosophical principles. You could perhaps imagine a reasonable line of philosophical reasoning that finds a Tegmarkian “multiverse” (in which all mathematically self-consistent laws of physics are physically realized “somewhere”) to be too extravagant, but that wants to have “as many things exist/events happen as possible” within a single simple unified set of physical laws. There might be some way to get from that starting point to QM rather than classical physics.

  472. Mateus Araújo Says:

    Philippe Grangier #466: I’m not demanding psi to be real, I’m demanding something to be real. It can be psi, it can be a crazy hidden variable, it can be objects and properties, whatever, it just has to be mathematically well-defined.

    And now you quote for the third time a paper where you still don’t do it. It’s rude to waste people’s time like that.

  473. Lorraine Ford Says:

    Is quantum number jumping the only genuine moving part in the entire system? Is all other number change in the system merely due to passive mathematical law of nature relationship, which only kicks in when quantum number jumping has occurred?

    I think that the above is indeed the case. Individual quantum number jumps, by the system or a part of the system (e.g. a particle) in response to a situation, which would be modelled (e.g. in a computer program) as the deliberate assignment of numbers to variables, are the only genuine moving parts in the entire system.

    Quantum number jumping, i.e. primitive free will, and the basis for the more advanced free will of living things, is the only genuine moving part in the entire system.

  474. Mateus Araújo Says:

    Ted #471: I did give an argument of that sort in my comments #145 and #313. In a nutshell, you need Many-Worlds to have randomness, and you need quantum mechanics to get Many-Worlds. It doesn’t seem to have persuaded anyone, though.

  475. Anbar Says:

    Scott #467

    I do. See #306 and #366, as well as my original answer to Q1.

  476. Scott Says:

    wyatt the noob #292:

      It seems like the David Deutsch worldview is at least interested in Q1 and has some opinions on it. From a recent reading of TFOR some candidate directions are 1) QM is needed to resolve time travel paradoxes 2) QM is needed to provide foundations for moral realism 3) QM is needed to provide foundations for information and specifically biology and intelligence. My guess is a better understanding of this worldview has other opinions about why QM is needed for bio, epistemology, computing to make sense

      I think it could be productive to address the David Deutsch world view as a whole as a way forward.

    Deutsch, to be honest, had many confidently-asserted ideas in The Fabric of Reality that made no sense to me—I thought his later The Beginning of Infinity was a much better book.

    I at least understand the argument that “you need QM to resolve time-travel paradoxes,” but I think it’s flatly false. Even in QM, the fixed-point around a CTC needs to be a mixed state in general. So then why not, in the classical case, allow probability distributions as the fixed-points around CTCs—those will also necessarily exist?
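
    (For the classical half of that claim, here’s a minimal numerical sketch, with a made-up 3-state stochastic matrix standing in for the classical evolution around the CTC: a fixed-point probability distribution always exists, by Perron–Frobenius, and can be read off the eigenvalue-1 eigenvector.)

        import numpy as np

        # Hypothetical classical "CTC" dynamics on 3 states: each column sums to 1,
        # so S maps probability distributions to probability distributions.
        S = np.array([[0.5, 0.2, 0.0],
                      [0.3, 0.5, 1.0],
                      [0.2, 0.3, 0.0]])

        # A fixed point S p = p always exists: take the eigenvector for the
        # eigenvalue closest to 1 and normalize it into a probability vector.
        vals, vecs = np.linalg.eig(S)
        p = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
        p = p / p.sum()

        print(p)                      # the self-consistent distribution around the CTC
        print(np.allclose(S @ p, p))  # True: it is a fixed point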

    I don’t understand why QM would be needed to “provide foundations for moral realism” (!) (also, even if that were so, would Nature care?), nor do I understand why it would be needed to “provide foundations for information and specifically biology and intelligence.” If Deutsch thinks that MWI is needed for these things, then once again the question is, but why not a classical multiverse?

  477. Jester Says:

    Scott, do you include classical field theories (e.g. Maxwell) in classical mechanics, and Quantum Field theories (QED, QCD) in Quantum Mechanics; so the main difference is between classical vs. quantum, mechanics or not? Apologies if this has already been covered.

  478. Scott Says:

    philip #294:

      A purely classical universe wouldn’t take any time to go from start to end

    Of course it would! If people are worried about the “block universe” of classical physics, then I don’t understand why they aren’t equally worried about the “block multiverse” of MWI…

  479. Scott Says:

    B R #308:

      your theory of nature should not have an unnecessary ugly dichotomy between waves and particles. Instead it should put them both on the same footing. Good luck doing that with any classical theory! So this is where I would depart to deduce the apparently cherished inevitability of quantum mechanics…

    That just pushes the question back to: why do we need waves or particles? We need information, but why does it have to propagate in either of those ways, let alone in both of them?

    Furthermore, “wave/particle duality” strikes me as an old-fashioned way of talking. From a modern QFT standpoint, waves and particles are simply different ways to describe phenomena that can arise on quantum fields, the truly fundamental entities.

  480. Scott Says:

    Ibrahim #311:

      Are we as humans going to accept that reality is generated
      and generated through interactions of a few things and so only exists for that interaction, only for its scope and that is it and nothing more?

      Related to that, Are we going to realize that when we define a “state vector”, we think that we deserve to look from the “god’s perspective” with all the variables placed nicely together and there can be, to our convenience, *one representation to represent the whole.

      We are still trying to hold on to the chair that we sit: the tools, notations and experiences. With them we got so far. We need to go to the very bottom and go up from there, with a notation of “not set”, “free to have such liberties”, “are not to be put together in one representation”. That means a new journey with extreme humbleness.

    Alright then, will you use your extreme humbleness to enlighten the rest of us as to the true nature of what’s going on? (And was that humble enough? 😀 )

  481. Baruch Garcia Says:

    Hey Scott!

    Yes, this is THE most important question within science. Looking forward to your book! If you are going to write a review, of course you know, I think you should include John Wheeler’s take. Many see “it from bit” as a Rorschach test, connecting anything vaguely physics-related to anything vaguely information-related.

    For Wheeler, the answer was simple: self-reference. The universe is self-generated, as you can see in his famous “U-diagram” where the eye looks back at itself (optional: see Ed Witten’s interview about this idea in Quanta magazine). In other words, the universe can’t exist in relation to anything else, so it has to exist in relation to itself. Wheeler also liked to say that you should find the answer first, then do the calculation. He did not have a calculation, but over the span of 3+ decades, he obsessed over the possibility that there was some kind of relation to the self-referential undecidability proofs of Godel, Turing and others. There is no need to modify QM; no hidden variables, etc.

    Let me answer your Q2’s first.

    QM has two “processes,” as von Neumann called them, and this is the framework for ALL of QM.

    Process 1: Non-unitary collapse. Non-deterministic and measurable. (“Particle” behavior)
    Process 2: Unitary evolution. Deterministic and not directly measurable. (“Wave” behavior)

    Your summary in QCSD of QM as probabilities with negative values very effectively applies this wave-particle duality to quantum computing theorems. The interference from negative values is wave-like and the probabilities are particle-like.

    :: Born Rule ::
    Let’s look at a simplified real-valued Born rule, then extend it to complex values.
    Wave-like sinusoidal behavior is the same as rotational behavior.
    x^2 + y^2 =1 describes rotational/sinusoidal behavior (wave-like)
    Pr_x + Pr_y = 1 describes the probability Pr whether this or that state will be observed (particle-like)

    Thus x^2 + y^2 = Pr_x + Pr_y encodes wave-particle duality. The first terms of the right and left side correspond to each other, while the second terms of the right and left side correspond to each other. x^2=Pr_x ….. The Born Rule! (for reals) It is an easy exercise for the reader to see how this extends to multiplying a complex number by its conjugate.
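
    One way to spell out that exercise (a quick sketch, using the usual normalization of amplitudes): for complex amplitudes a_x and a_y,

    $$|a_x|^2 + |a_y|^2 \;=\; a_x a_x^* + a_y a_y^* \;=\; 1 \;=\; \mathrm{Pr}_x + \mathrm{Pr}_y,$$

    and matching terms exactly as in the real case gives Pr_x = a_x a_x^*, i.e. the Born rule for complex amplitudes.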

    :: Complex Numbers::
    When you call imaginary numbers by Gauss’s original name for them, “lateral” numbers, it’s easy to see why complex numbers are called for in QM. Complex numbers (e.g. multiplying by i) describe rotations – i.e. wave behavior. So a Hilbert space described by C^N is a necessary framework for wave-particle duality. The complex number describes the rotation (wave) and the dimension N describes the possible states in which a system can be observed (particle).

    ::Unitarity::
    Same thing here. You can describe a unitary matrix as e^iH where H is a Hermitian matrix. (sorry about the notation)
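
    (A small numerical check of that statement, assuming nothing beyond a randomly generated Hermitian H: exponentiating iH via its eigendecomposition gives a matrix U whose conjugate transpose times U is the identity.)

        import numpy as np

        rng = np.random.default_rng(0)

        # Build a random Hermitian matrix H = A + A^dagger.
        A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
        H = A + A.conj().T

        # U = exp(iH), computed from the eigendecomposition H = V diag(w) V^dagger.
        w, V = np.linalg.eigh(H)
        U = V @ np.diag(np.exp(1j * w)) @ V.conj().T

        print(np.allclose(U.conj().T @ U, np.eye(3)))  # True: U is unitary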

    Every other aspect of the quantum formalism can fit into this particle-wave duality framework, i.e. the duality of the non-unitary Process 1 and the unitary Process 2.

    Okay, none of this is conceptually new; you can figure all this out just to establish that QM is what we thought it was – I mean, it is on Bohr’s coat of arms. So what does von Neumann’s Process 1 and Process 2 have to do with undecidability proofs? (I wonder why von Neumann didn’t catch this himself.)

    Various proofs that rely on self-reference, such as applying a Universal Turing Machine to itself, or a Zeno Machine to itself, or even the self-application of Quine’s protosyntax (Quine 1940, Smullyan 1957), have some kind of epistemological limit as a result. All these proofs use quotation, e.g. Godel numbering. Quine, among others, has written extensively about this. The key idea is that the epistemological limits of pure math provide us with the epistemological limits of the physical world. In all these proofs, you have two cases, which I will call Case 1 and Case 2:

    Case 1: Provable/writable but not consistent
    Case 2: Consistent but not provable/writable.

    Godel, Turing and others have emphasized that “provability” (by effective method/procedure) means proof by physical, mechanical means e.g. ink on paper, chalk on a chalkboard, stylus on clay.

    Then Case 1 provides a mathematical framework for the physics of Process 1 and Case 2 provides a framework for the physics of Process 2.

    Case 1 ↔ Process 1
    Case 2 ↔ Process 2

    For example if you prepare an electron in spin z+ and measure the x-component of spin, sometimes you will get x+ and other times you will get x-. Inconsistency! This is NOT to say that quantum mechanics is an “inconsistent” theory. It is the most successful theory we have. What this means is that when you prepare a system with the same exact initial conditions, you will not always get the same final conditions, and they can fall into a nice probability distribution. That is Case 1/Process 1.

    In Case 2 / Process 2, you have a consistent evolution of the Schrodinger equation. It evolves exactly the same each and every time, given the same initial conditions, but you cannot, in principle, measure a superposition (if you could, we would just rename it an eigenstate!).

    You can extend this correspondence between logic/CS and physics, as Wheeler envisioned, to explain really all of QM. For example, an Oracle corresponds to the first particle you decide to measure in an entangled pair, and the UTM that is now decidable corresponds to the second particle observed in an entangled pair.

    So what can this self-referential QM do that standard QM cannot do? Many have called for a quantum theory of gravity in which there is no space or time. Build a theory of quantum gravity on information. For example, this simple result shows how undecidability pops out when you try to combine QM and GR.

    It is known that there is no generalized algorithm for the homeomorphy problem for compact 4-manifolds; the problem is undecidable (Markov 1958). Therefore there is no generalized algorithm for the homeomorphy problem for causal diamonds at the Planck scale (which is allowed by the topological censorship theorem).

    It is easy to disprove this result; just show that spacetime at the Planck scale has a trivial topology, or at least restricts some topologies, as is the case with 3-manifolds. But this approach to quantum gravity – generalizing QM a la Wheeler and leaving GR alone – has its benefits. For example, we have a description of an observer within a system, which is necessary for a satisfactory theory of quantum cosmology.

    Lastly, several people have tried to define living systems, e.g. autopoiesis, (M,R) systems, as self-referential and even undecidable systems. In the field of cybernetics, several researchers believe there is no generalized algorithm to describe living systems. And physicists have struggled to reach a solid consensus on how to define an “observer”. Wheeler’s self-referential observer gives a framework for living systems that does not exist with the familiar stick-and-ball chemistry descriptions of living systems.

    You can read Wheeler’s essays and see that his insight is pretty much on point with all these mathematical descriptions. His essays are a good place to start exploring this; you can read them at jawarchive.wordpress.com.

    Why is the universe the way it is? Because it can’t be any other way! It has to exist in relation to something… itself!

  482. Dimitris Papadimitriou Says:

    In my previous comment about the constraints that the existence of QM puts on a universe with Einsteinian gravity, I mentioned that in such a world the averaged energy conditions are expected to be valid, so that world is probably causally stable, but I omitted some other obvious related implications:
    For example, Hawking radiation needs both GR and QFT, and it is essential for the 2nd law of thermodynamics to make sense in a world with black-hole or de Sitter horizons. There are some well-known problems here of course, namely the information-loss problem, but these have to do with our ignorance (or with our insistence that we must trust some hypothetical ideas / principles), and not with some inherent inconsistency of nature.
    There are many other examples that support this idea, that QM in a sense complements GR despite the infamous apparent incompatibility of the two (resolution of singularities, cosmology, inflation, etc…)
    The more I think of it, the more I’m convinced that maybe this is it.

  483. Jester Says:

    I think the argument that classical mechanics owes the fact that it is working (where it is working) to Quantum Mechanics is flawed, since any improved physical theory would be able to reproduce the former results, otherwise it wouldn’t be adopted.

    Although it is interesting that many classical notions still work under the formally only slightly changed (then-)new non-classical mechanics.

    Of course, an important such notion is mass, which quantum mechanics uses, but doesn’t ultimately tell us what it is about…

    But these musings don’t really add anything useful to Scott’s original question(s), and even “42” probably won’t cut it, so I’m gonna stop.

  484. Scott Says:

    Mateus Araújo #313:

      We are used to letting true randomness be simply an unanalysed primitive. We know how to deal with it mathematically (with the Kolmogorov formalism), and we know how to produce it in practice (with QRNGs), so we don’t need to know what it is. But if you are writing down the rules that make a universe tick that doesn’t cut it, you do need a well-defined rule.

      The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds. And as I’ve argued before, you do need quantum mechanics to get Many-Worlds.

    Are you seriously arguing that, if the world were classical, we wouldn’t be able to make the concept of probability well-defined, but because the world is quantum we can? That’s ironic because usually people argue the opposite—i.e., they understand what probability means in this-or-that classical context, but what could it possibly mean with Everett branches? 🙂

    Personally, it’s not a dealbreaker for me in either context—as long as you can explain why your probabilities are real and nonnegative and sum up to 1, I’m probably happy.

    But I will insist that probability seems conceptually prior to QM. Indeed, as you know, we can derive the rules of probability from axioms about rational betting agents, none of which rely on QM in any way, and all of which would seem reasonable in any world that contained such agents at all.

    So I simply don’t see what the problem would be with putting such probabilities into the fundamental laws without going through the Born rule, as is done in countless stochastic models of physical phenomena. I think there’s a genuine problem in explaining why, in our universe, probability only seems to appear in the fundamental laws via the Born rule, and one of the greatest obstacles to solving such problems is to treat their answers as obvious.

  485. Scott Says:

    John van de Wetering #319:

      You seem to say that you feel Q is sufficiently answered for relativity, so am I correct in taking that to mean that you would be happy with finding some set of physical principles which necessitates quantum theory? Cause you could argue that Q is *not* answered for relativity, because why would the speed of light be finite? That seems like a bit of an arbitrary choice as well, and I could easily imagine a complex life-bearing universe where causality works instantly.

    You’re right that even for special relativity, the question isn’t completely answered, because we don’t know why the speed of light needed to be finite—or, for that matter, why the laws of physics had to look the same in all inertial frames, why there needed to be such a concept as “inertial frames” at all. But these all seem, in retrospect, like natural goals in designing a universe. I.e., you want to be able to pick stuff up and move it around without changing its structure, and you also want an upper limit on the speed with which you can do so, since otherwise you could get a giant mess where everything instantaneously affects everything else.

    If QM could be derived as the inevitable consequence of similarly natural design goals, I’d say we ought to feel satisfied to have understood more than we had any right to.

  486. Scott Says:

    John van de Wetering #320:

      Then all the classical universes will only support a finite or countably infinite number of consciousnesses, while the quantum-like universes support an uncountably infinite number of consciousnesses. Hence, probabilistically you will always find yourself in a quantum-like universe.

    I really don’t think this works, because why couldn’t God just say “let there be a continuum of non-interacting classical universes?” Or 2^(2^ℵ0) of them or whatever? Wouldn’t that be even more anthropically favored than the Everett multiverse?

  487. Nick Says:

    I’m not enough of an expert in quantum mechanics to have any suggestions for the Q, but I wanted to just chime in and say that I’d be very excited to read this book/essay (I hope it’s a book). I can’t think of a better person to write it.

  488. Jair Says:

    Scott, it’s simple. God is a Unitarian Universalist. Therefore She/He made the universe obey unitary evolution.

  489. Scott Says:

    Jair #488: Alas, your theory fails. A Unitarian Universalist God would clearly have made unitary evolution optional, depending on individual conscience. 😀

  490. mjgeddes Says:

    If we postulate that quantum mechanics is the generalized theory of statistical mechanics, the question is what is the physical interpretation of things like ‘negative probability’? What are the analogues of classical thermodynamics? For instance, for the waves, what is actually oscillating?

    If we go with Einstein and suppose that the geometry is the real fundamental thing, perhaps QM is not too different from general relativity after all, although it has to be non-classical (so non-commutative geometry?). And if a certain kind of geometry is needed for reality to be comprehensible at all, then perhaps something like QM and classical mechanics inevitably emerge from that in many different possible worlds.

    I’d focus heavily on the notions of symmetry and stability, because QM is all about the linear algebra, and ‘groups’ (basic objects of abstract algebra) are all about symmetries. So somehow connect symmetries to physical stabilities?

    There’s an intriguing, strange sort of ‘doubling’ aspect to QM: for instance, why does an electron need to rotate twice (through 720 degrees) to return to its initial state? What’s the physical interpretation of that in terms of geometry?

    Thinking along these lines, the classical space-time of general relativity could be an amalgam of two more fundamental types of geometry, one representing the structure of QM , the other the structure of classical mechanics. QM geometry would then be analogous to the “square root” of classical space-time.

    Again, earlier in thread, I’ve suggested the 3 defining conditions for reality to be comprehensible at all: (1) Causality , (2) Compressible complexity, (3) Compositionality. Given this, the physical manifestation of complexity is perhaps inevitably something like the classical space-time, and that has to be decomposable into an amalgam of two more fundamental kinds of geometry, one about the *compositional structure* (Classical Mechanics!) the other about the *causal structure* (Quantum Mechanics!).

  491. Scott Says:

    Gadi #321:

      Scott, if these are the kinds of questions that interest you, don’t you think studying physics gets you closer to the answer than studying computer science and quantum computing? Studying quantum field theory, etc.?

      I can also ask questions about all the non-rigorous things in quantum field theory. Is there a formulation without renormalization? If not, then with which parameters does God actually run it? Do you realize that the current formulation of quantum field theory is far from being a computer program you can just postulate that God “runs”? That its mathematical consistency has been an open problem for many decades now?

      How far into physics and quantum field theory does your understanding really go? Don’t you think you should get a very good understanding of it (not even claiming I have it; I’m talking about understanding it at least like those physicists at CERN who actually compute things with it) if those are the kind of questions that interest you?

    I completely agree that

    (1) knowledge of physics in general, and QFT in particular, is potentially extremely relevant to the questions I’m asking about, and

    (2) my knowledge of (non-information-theoretic) physics, even at the undergraduate level, leaves much to be desired.

    A large part of the reason why I wrote this post in the first place was to elicit the arguments for QM that are known to people who know more physics than I do, and to collect the many different such arguments in one place! ~500 comments in, I think I’ve had some success at that.

    If I do write the survey or book I currently imagine, I’ll have to learn more physics, and indeed one reason for the writing project would be that it would be an ideal pretext to learn more physics.

    I wouldn’t dare to speculate about these matters if I hadn’t personally known so many people who, to put it mildly, do know QFT—Weinberg, Susskind, Bekenstein, Maldacena, Preskill, Eliezer Rabinovici, Daniel Harlow, Sean Carroll—and if, in my conversations with them about foundations of physics, they hadn’t usually been perfectly content to approximate physics as a collection of qubits being acted on by a quantum circuit, and then discuss whatever information-theoretic or complexity-theoretic question they wanted to discuss in that language.

    I believe they felt at liberty to do this for a few reasons:

    (1) The finiteness of the Bekenstein-Hawking entropy, which suggests that there really is a discrete collection of qubits at the Planck scale, even if we don’t yet know how it’s realized,

    (2) The modern, Wilsonian perspective on QFT, which suggests that whatever is going on at the Planck scale (strings, qubits, etc.), we’d perceive a QFT at the scales accessible to us, with the infamous renormalization problems of QFT probably reflecting nothing more than our ignorance of what’s happening at the shortest distances,

    (3) The fact that a near-century of progress in QFT and quantum gravity has left the basic principles of QM itself not only 100% in place, but a rich topic even experiencing a renaissance of new discoveries (e.g., in quantum computing and information),

    (4) The quantum version of the Extended Church-Turing Thesis, which suggests that even QFTs and quantum gravity theories can likely be simulated by standard quantum computers (i.e., in BQP) with at most polynomial slowdown, and vice versa.

  492. Scott Says:

    JH #323:

      This is the change in quantum mechanics. There are only individual instants of time and there’s no causal connection between these instants, and no causal link between individual particles, making them interchangeable.

    Certainly we recover a robust causality in QM, and connections between different instants, in the presence of decoherence.

    It’s true that indistinguishable particles—ones that we can know to be indistinguishable (!)—were one major new development of QM. But that just pushes the question back a step: why should the existence of knowably indistinguishable particles have been such an important design requirement for our universe?

  493. Scott Says:

    Steven Evans #330:

      Q: Why should the universe have been quantum-mechanical?

      A: The answer is 0.1134 … When you type 0.1134 in a calculator and turn it upside down it reads “hello”. That’s all we are – a pattern on a calculator that says “hello”. Or asks “Why should the universe have been quantum-mechanical?”

    WHOA. I probably need another bong hit to appreciate that insight… 😀

  494. Scott Says:

    Clark Van Oyen #344:

      The question you are asking is “why does god play dice?” And perhaps also: why do those dice have a specific number of sides?

    More like, why can the different ways of rolling the dice to get the same result interfere destructively and cancel each other out? Why should they be those kind of dice? 🙂

      I respect that as a QM expert you have the benefit of context for framing this question. I am wondering if this question will be similar to: “which interpretation of quantum mechanics is correct (Copenhagen or many worlds)?” Do you feel this is the former or latter type of question? May it forever sit outside of experimental verification? Why?

    The relationship, if any, between the “why QM?” question and the “how should we interpret QM?” question is actually an excellent question in itself. Clearly the two questions are connected, in that certain answers to one would naturally suggest answers to the other, and vice versa. Equally clearly, the two questions are not the same; logically it seems either could be answered without shedding any light on the other.

    Between the two, I’m actually more optimistic about our ability to make progress on the “why QM?” question, because there’s an obvious path forward: namely, study the evolution of complex structures, chemistry, life, etc. in a wide variety of (simulated) classical universes, and see if there are things that consistently go wrong. Whereas with interpretation of QM, it’s not just that we’re at an impasse, it’s that it’s far from obvious what research directions have any hope of resolving the impasse.

    Unless, of course, future research were to reveal that QM is only an approximation to something deeper, in which case we’d be back to the drawing board on both the “why QM?” question and the QM interpretation question!

  495. Scott Says:

    Liam #345:

      Quantum mechanics lets you discretize the state space without discretizing space. In particular, it lets you simultaneously preserve continuous spatial symmetries and the third law of thermodynamics (entropy at zero temperature is a finite constant) in a system with particles.

      So for instance assume you want to have something like particles, and you also want rotational invariance (you’ve said you are satisfied with Einstein’s justification of Lorentz invariance so I assume you’re happy with taking continuous rotations as a given). Then if your ground state of hydrogen (or whatever your basic atomic building blocks are in your fancy new universe) is rotationally invariant, but you also have a definite position for the electron (or whatever), then you can generate an infinite degeneracy of states by rotating this state. So entropy is infinite. On the other hand, if you want your low energy states to have finite entropy, you need to somehow have states where continuous rotations acting on them generate only a finite number of states, in other words they have to be finite dimensional representations of SO(3). So they have to be spherical harmonics, i.e. the stable bound states basically have to be waves. But when you isolate and manipulate (i.e. measure) their constituents, they look like localizable particles?

    You make a very interesting argument—certainly one of the better ones on this thread—but of course it leaves many possibilities unaccounted for. What if we abandon point particles and have little hard spheres? What if we make entropy finite by simply saying that all measurements of continuous parameters are subject to fundamental noise—not for true quantum-mechanical reasons, but like in the popularized misunderstanding of the uncertainty principle? What if rotational invariance only has to emerge at macroscopic scales, while at the Planck scale we can have a random cloud of discrete points?

  496. Scott Says:

    I #347:

      As to your whole research agenda, would it be fair to phrase it as: “give an argument convincing a smart person in a world which feels to them intuitively like we feel ours to be that they are living in a quantum world. Further, this argument should be as natural as Einstein’s argument for SR.”

    Yeah, that seems fair.

      In which case, isn’t that exactly what the GPT subset of quantum foundations was made to do?

    I’m not sure. If it is, then what’s the best argument for the inevitability of QM that the GPT (Generalized Probabilistic Theories) research program has managed to come up with? Is it Hardy’s or Chiribella et al.’s? In general, I could easily imagine the GPT program telling us a lot about Q2, but it’s harder to imagine it answering Q1.

  497. Scott Says:

    Tiberiu M #349:

      Here it comes down to the number of intelligent beings living in each one. Due to the infinite superpositions of a QM universe (think of the infinite branching in the Everett interpretation), a QM universe is infinitely bigger than a classical one. Therefore, it is inhabited by infinitely many more consciousnesses. Therefore, you are infinitely more likely to find yourself in a QM universe than in a classical one.

      There could be a classical universe out there hosting intelligent life, we just don’t happen to live in it. The same idea can also explain why the universe is (probably) infinite in space.

    See my comment #486 for why that doesn’t work.

  498. Tom Says:

    When all you have is a hammer, everything looks like a nail…

    Obviously Scott is a very talented specialist, but I wonder if he’s not fooling himself by turning his hammer on philosophy and expecting this effort to resolve a mid-life crisis. Why fuss over QM when our institutions have already developed compelling answers over thousands of years to a more fundamental question: “Why do things exist at all?”

    Unfortunately many modern scientists are either unaware or willfully blind to the fact that such answers already exist. St. Thomas Aquinas tells us that it’s possible to know that God exists with natural reason alone. This has been the official stance of the Catholic Church since at least the first Vatican Council.

    http://www.scborromeo.org/ccc/p1s1c1.htm
    Section III, line 36

  499. Scott Says:

    Tom #498: If Catholic theology can answer the question of why anything exists, then can it also answer the far easier-seeming question of why that which exists obeys quantum mechanics as far as anyone can tell? If so, then would you be kind enough to enlighten me as to the answer?

  500. Scott Says:

    JakeP #352:

      What if you assume that BQP is simply in P? Once we understand the algorithm that makes this possible, the “mysteries” asked about here will make sense to us. The Born rule, complex amplitudes, etc., will just fall out naturally from how this algorithm is structured.

    That would be nice! 🙂 I emphatically do not expect BQP=P, but if and when such a collapse were ever shown, it would indeed be worth carefully examining whether the proof shed any new light on the foundations of QM.

      If satisfactory answers to Q1/Q2 have eluded us for so long, perhaps it is slight evidence that there IS in fact an efficient algorithm for simulating a quantum circuit after all?

    Alas, heuristics of the form “I bet X is true, because if it weren’t then this other thing Y would seem too mysterious for me to understand,” have a pretty abysmal track record. 🙂

  501. Tom Says:

    Scott #499: That seems to be a rather more difficult question, since it presupposes there is some reason why QM needs to exist (and we don’t know this reason). The question of why anything exists, however, presupposes nothing, since existence is self-evident.

  502. Scott Says:

    Etienne #354:

      if I had to speculate on God’s desiderata when designing the universe I would start with

      1. Discrete state space and discrete time,
      2. Some form of extremal action principle obeying some form of Noether’s theorem.

    That’s an interesting proposal! I’ve indeed often wondered about the role of Noether’s theorem in the “design goals” for our universe—given that, if you made up some random classical CA, Noether’s theorem would almost certainly not be relevant to it. Indeed, even if your CA were subject to continuous symmetries, you still wouldn’t automatically get associated conserved quantities, unless your CA also happened to satisfy an action principle / Euler-Lagrange equation (as you say).
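
    Just to spell out the textbook statement I have in mind (a minimal version, for a single degree of freedom): if the dynamics extremizes an action with Lagrangian \(L(q,\dot q)\), and \(L\) doesn’t depend on \(q\) (translation symmetry), then the Euler–Lagrange equation \(\frac{d}{dt}\frac{\partial L}{\partial \dot q}=\frac{\partial L}{\partial q}=0\) immediately hands you a conserved momentum \(p=\partial L/\partial \dot q\). A generic update rule that merely happens to commute with translations gives you no analogous conserved quantity for free.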

    But … why is this so important? Is it just a mathematically convenient property of our laws, or is it something that actually plays an important role in enabling complex chemistry, life, and intelligence?

  503. Scott Says:

    Scott P. #356:

      There are two branches, after all. What does it mean to have one branch be more probable than another?

    I’d say that it simply means: if someone asks you to bet on which branch you’ll find yourself in before the branching happens, then you should accept all and only those bets that would make sense if the probabilities were indeed 1/3 and 2/3, or whatever else the Born rule says they are.
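
    To make that criterion concrete, here’s a toy sketch in Python—ordinary expected-value arithmetic, nothing specific to QM, with made-up prices and payoffs purely for illustration:

        # Toy sketch: which bets to accept before a branching whose Born weights
        # are 1/3 (branch A) and 2/3 (branch B). Pure expected-value arithmetic.
        def acceptable(price, payoff_A, payoff_B, p_A=1/3, p_B=2/3):
            # Accept the bet iff its expected payoff at least covers its price.
            return p_A * payoff_A + p_B * payoff_B >= price

        print(acceptable(price=0.30, payoff_A=1.0, payoff_B=0.0))  # True:  1/3 > 0.30
        print(acceptable(price=0.40, payoff_A=1.0, payoff_B=0.0))  # False: 1/3 < 0.40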

  504. Anbar Says:

    Scott #494

    – More like, why can the different ways of rolling the dice to get the same result interfere destructively and cancel each other out? Why should they be those kind of dice? –

    Because otherwise there would be no forbidden or assured transitions, i.e., indeterminism would imply unpredictability.

  505. Ajit R. Jadhav Says:

    Gil #454 and #456:

    1. My error:

    I had realized by late last evening (a day after posting my reply #416) that I had committed an error, a glossing over of the distinction between non-determinism, and deterministic chaos. In particular, I said (in #416):

    > “But the 3-body system already is non-deterministic — even if based on the same, deterministic law.”

    I should have said:

    > “But the 3-body system already is chaos-theoretical — even if based on the same, deterministic law.”

    Similar corrections should apply to my other statements too… Indeed, on second thoughts, my following statement (in #416) also is mistaken (or at least too hurriedly written):

    > “But a system which is composed of elements each of which obeys that same deterministic law, may not itself be deterministic.”

    Well, to the best of my knowledge, if the equation governing the elements of a system is deterministic, and if what you are dealing with is a system — and not a “random composition” of those elements (and please don’t ask me what that means!) — then the behaviour may be chaos-theoretical, but it will still be deterministic. That’s the position I should have stuck to.

    I realized this error yesterday, and was wondering if I should add yet another self-referential comment to this thread. But finding that you too were thinking of chaos and probability, I decided that the admission of the error was indeed due on a much more immediate basis.

    2. Randomness and deterministic chaos:

    Another point, in reference to what you said:

    On the issue of randomness vs. deterministic chaos, one of the most helpful resources I’ve ever run into is this paper by Geoff Boeing: [ https://www.mdpi.com/2079-8954/4/4/37 ] (open access).

    Refer to fig. 10 in it. The chaos-theoretical description deterministically selects a subset of the mathematically random points. So it is deterministic. … I knew this, but still ended up committing the above error. … However, note, as the author says:

    > “Strange attractors stretch and fold state space in higher dimensions, allowing their fractal forms to fill space without ever producing the same value twice.”

    Somehow, that last part (“without ever…”) had a way of interfering with my more rigorous knowledge, once I panned out a bit (while writing #416).

    Obviously, a part of me wants to describe the chaos-theoretical situation as non-deterministic. After all, you provably don’t get the same value ever again, do you? … Obviously, this part overtakes my thinking when I am getting a bit too “philosophical”. I need to guard against the tendency.
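
    To make the distinction concrete, here is a toy sketch in Python using the logistic map (the standard textbook example of deterministic chaos; the particular starting values are arbitrary):

        # Toy sketch: the logistic map x -> r*x*(1-x) at r = 4 is a fully
        # deterministic rule, yet two nearby initial conditions diverge quickly
        # and each orbit looks "random" even though it is exactly reproducible.
        def orbit(x0, r=4.0, n=15):
            xs = [x0]
            for _ in range(n):
                xs.append(r * xs[-1] * (1.0 - xs[-1]))
            return xs

        a = orbit(0.2000000)
        b = orbit(0.2000001)   # same law, tiny change in the initial condition
        for step, (x, y) in enumerate(zip(a, b)):
            print(step, round(x, 6), round(y, 6), round(abs(x - y), 6))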

    OK, now let me come to your questions…

    3. Thoughts on Gil’s questions:

    > “1) What is the origin/meaning of probability in nature and why is there probability?”

    My philosophic conviction is that there is no probability “all the way down”. The most fundamental physical laws, at the most foundational layer, must therefore be deterministic. But the “Law vs Systems” distinction applies. (I touched on this point a bit in my comment #438 above). This consideration in fact introduces unexpected consequences; it leads to some issues which we don’t know how to handle right. Off hand, I can think of two categories of such issues:

    (i) the softer category of issues: The deterministic chaos. Its features share some of the characteristics which we would otherwise ascribe to non-deterministic systems. But the behaviour can be essentialized as being deterministic too, in a way.

    (ii) the plain hard category of issues: The theory-breakdown points like: singularities, or pathologies like Norton’s dome, etc.

    Now, note that even the apparatus of the Probability Theory fails (i.e., it has nothing to add) for the scenarios in the second category. (In fact, as usual, probabilistic analyses must start by subtracting some information which would otherwise be available!)

    Further, when the number of DoF’s is very large, the chaos-theoretical description is practically indistinguishable from that based on a mathematically defined ideal randomness.

    So, perhaps, it is more pertinent to ask the mathematician: “Where do you get your idea of the perfect randomness from? Please identify.”

    But the most basic point against the physical existence of probabilities is that the idea violates the law of identity.

    All in all, (i) it’s a dumb idea, and (ii) the appeal to emotions for the need to have something like that, can be easily dealt with if you can devise a satisfactory mechanism at a sufficiently basic level, say a chaos theoretical, and (iii) probability-based theories anyway never add to knowledge; they begin by subtracting information that would otherwise be available.

    > “2) What is the origin/meaning of chaos and why is there chaos?”

    We discussed that.

    > “3) Why is there something rather than complete chaos?”

    The alternative to “something” is not “chaos”; it is the existential nothing, the nought, the void, the “shunya”, the zilch, … .

    Chaos (even in its non-scientific usage) evokes the image of something that twists and turns and morphs in every way unimaginable and unpredictable; something that can gulp or spit out any definite thing once in a while but you can’t tell which thing, when, where, how, or why. That’s the non-scientific chaos for you.

    Now, hold on to that image. And observe that it has some definite things in it too, not just chaos. If it were nothing but only chaos, we couldn’t even grasp any thing about it. Even an image like this would be impossible. … Further, notice that even that chaotic thing has already been supposed to exist. So, in that sense, it is (at least posited) to be a something — at least in that imagination. It’s just that its behaviour — its actions — are completely unpredictable — to us.

    So the issue you want to raise could be better framed as the following: Why do things behave lawfully rather than chaotically? (More philosophically: Why does the law of identity hold?)

    I already gave a hint to the answer, so I won’t discuss it.

    4. About QM:

    You also said:

    “I would speculate that quantum mechanics is a framework (perhaps “minimal” or even “unique” in some sense) that allows answers to, or at least “better understanding of”, these questions 1) 2) 3).”

    Nope. It is true that the mainstream QM is only a framework, and not a complete description. But the mainstream QM theory is linear. So, it actually does not allow seeking answers to questions like the above.

    To seek answers to questions about things like chaos, all that you need is an ontology like Newtonian mechanics, say the technique of Molecular Dynamics. The mainstream QM in principle falls short.

    BTW, my following blog posts may be of interest:

    “Fundamental Chaos; Stable World”, August 2019. [ https://ajitjadhav.wordpress.com/2019/08/28/fundamental-chaos-stable-world/ ]

    “Determinism, Indeterminism, Probability, and the nature of the laws of physics—a second take…”, May 2019 [ https://ajitjadhav.wordpress.com/2019/05/01/determinism-indeterminism-probability-and-the-nature-of-the-laws-of-physics-a-second-take/ ]

    Best,
    –Ajit

  506. Mike Elliott Says:

    For an alternative perspective to the “classical life” presumption…let’s do a 180 and locate ourselves inside the wave function itself and explore the idea that certain quantum mechanical processes play a central role in our everyday choices and preferences. Before I elaborate on this proposal, if you’ll have it, I’d like to do a quick Gedanken experiment:
    Please choose your favorite ice cream, picking from these three: vanilla, chocolate, or strawberry.
    Now to elaborate, consider the possibility that we just conducted a quantum experiment on yourself as follows: what is objectively described as measurement and quantum probabilities in a QM theoretic framework forms a *duality* to choice and preference in our subjective experience – two sides of the same coin.
    So, when we set up the Gedanken experiment we set up a basis for measurement, call it the “ice cream basis”. It has three basis states, call them |V>, |C>, and |S> for vanilla, chocolate, and strawberry respectively. If you were, say, on the fence between chocolate and vanilla, a wave function that would have described you *before* you chose, would have been a superposition of |C> and |V> with roughly equal amplitudes for each, and a negligible amplitude for |S>.  On the other hand, if it was a slam dunk to choose Strawberry with every other choice having zero appeal, then your wave function would not have been in superposition and would be purely in the |S> state.
    In general, if you prefer A to B, then the probability of measurement outcome A is greater than B. Each time you make a choice, you are measuring some aspect of your mind.
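    (To make the mapping from amplitudes to choice frequencies concrete, here is a minimal numerical sketch in Python; the amplitudes below are invented purely for illustration:)
        import numpy as np

        # Hypothetical "ice cream state": roughly on the fence between |V> and |C>,
        # with a small amplitude on |S>. The numbers are illustrative, not measured.
        amps = np.array([0.70, 0.70, 0.14], dtype=complex)   # |V>, |C>, |S>
        amps /= np.linalg.norm(amps)                          # normalize the state

        probs = np.abs(amps) ** 2                             # Born rule
        flavors = ["vanilla", "chocolate", "strawberry"]
        choices = np.random.choice(flavors, size=10_000, p=probs)
        for f in flavors:
            print(f, (choices == f).mean())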
    After you chose, you remained in an eigenstate of the “ice cream operator” with eigenvalue of your chosen flavor: for example, consider the choice again. Did you choose the same flavor? 
    So, what do we get from such a proposal? We get a framework in which folks have probabilistic free will (although not Knightian freedom as described in Scott’s Ghost essay). For many though, I think this will still be satisfactory – you have freedom to choose, the only constraint upon you is to choose in line with your preferences which is something perhaps impossible to not do. You will be probabilistically predictable in a quantum sense, but is this really any different than knowing your good friend Alice loves chocolate and will choose |C> 95% of the time? You still don’t know what she will pick on any particular trip to the ice cream parlor. Furthermore, you get this freedom within the laws of physics and the causal closure of physics (CCP) is preserved.
    Interestingly, as a bonus, there is an academic field called Quantum Cognition whose practitioners model the outcomes of psychological experiments using the framework of quantum mechanics. They are careful not to suggest quantum effects are actually occurring in the brain, probably because this still carries a stigma in scientific circles, but they argue the mathematics of QM is more successful at modeling human decision making than so-called ‘classical models’ of behavior. One example is called ‘question order bias’; the order in which subjects are asked questions can cause different outcomes even though the questions themselves are the same. In other words, ask question a then b then c, and you’ll likely get different answers than if you ask a, then c, then b. In QM, of course, the order of measurements is well understood to affect the outcome of those measurements. 
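    (For what it’s worth, the order-dependence itself needs nothing exotic. Here is a minimal sketch with two noncommuting yes/no questions on a single qubit; identifying such questions with anything psychological is, of course, the speculative part.)
        import numpy as np

        # Projectors for "yes" to question a (Z basis) and "yes" to question b (X basis).
        P_a = np.array([[1, 0], [0, 0]], dtype=complex)
        P_b = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)

        psi = np.array([1, 0], dtype=complex)   # initial state |0>

        def prob_yes_then_yes(first, second, state):
            # P(yes to first question, then yes to second), updating the state in between.
            s1 = first @ state
            p1 = np.vdot(s1, s1).real
            if p1 == 0.0:
                return 0.0
            s2 = second @ (s1 / np.sqrt(p1))
            return p1 * np.vdot(s2, s2).real

        print(prob_yes_then_yes(P_a, P_b, psi))   # a then b: 0.5
        print(prob_yes_then_yes(P_b, P_a, psi))   # b then a: 0.25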
    Hopefully, this is compelling for its upside: we roll two otherwise unexplained subjective phenomena, choice and preference, into the laws of physics while rescuing a degree of free will by locating ourselves inside the wave function itself. Explanations of some psychological phenomena are freebies and go along for the ride.
    What about the downside, is this even possible? QM in physics is typically applied to atomic scale systems, or super cold temperatures, how could such physics be applicable in a warm, macroscopic object such as your body? Roughly 50 years ago the physicist Herbert Frohlich proposed a QM model – a particular Hamiltonian – that gave rise to a Frohlich condensate (something like a Bose-Einstein condensate). Frohlich showed, in theory, this condensate could form even in the noisy, warm environments of a biological organism. Frohlich’s model did not depend on isolating the system from the surrounding environment to maintain coherence. Rather, coherence was maintained by continuously pumping energy into the system (presumably the role of metabolism). Unfortunately, nearly a half century went by with no evidence of such a condensate and many physicists soured on the idea. Recently in 2015, however, a paper was published that cited experimental evidence of just such a condensate in a biological protein (https://aca.scitation.org/doi/pdf/10.1063/1.4931825). This experiment took place in a petri dish under exposure from a THz laser, so it still remains to see evidence of a Frohlich condensate actually in a living thing, and powered by its own metabolism.
    And, even supposing such a condensate did exist in the body, what could we be physically measuring when we make a choice in the “ice cream basis”? Some folks have suggested ion channels in the neurons of brains because these ion channels themselves have single ion diameters and operate on scales susceptible to quantum mechanical effects, yet, they can influence the firing of neurons and thereby have their effects amplified to macroscopic scales. These answers won’t come without much more research, even presuming quantum-life is true, but hopefully folks find this little Gedanken experiment thought provoking in an alternative way to traditional thinking on these topics.

  507. Steven Evans Says:

    Scott Says:
    Comment #493 January 30th, 2022 at 8:14 pm

    That took several minutes’ hard thought, mentally juggling and rearranging the 2 facts about QM I know ;). But if we forget about finite precision and collapsing/splitting wave functions, we are simply assuming we have a quantum computer. Why does a quantum computer make quantum computations? Because it is a quantum computer. Your assumptions turn the universal quantum computer into the perfect, fundamental being and we don’t get to ask why they exist. Our only choice would be to make your assumptions scripture and to sing the praises of the quantum computer.

    We know how we get chemistry and biology when we pile the computations high, and we know only of this one possible instantiation of physicality, so any anthropic-like considerations become dull tautologies. People should visit Legoland – it’s amazing what you can make out of lego, too.

    Surely, for any scientific revolution to be launched before your ankles completely give way, the clue is where we definitely don’t have a clear description – the collapse/split of the wave function?

  508. alyosha Says:

    Scott #421: “[…] I’ll be more-or-less hobbling from here till the end. Well, less exercise, so in expectation probably less time left […]”

    Please don’t jump to such pessimism! Get good care, take good care, and you have reason to be hopeful. Plus i and a lot of folks are sending you supportive waves (both real and complex 😉

  509. Andrew Says:

    I’m a total amateur at QM. My amateur thought follows.

    Suppose (1) we want to create a simulation of a universe as we observe it (so it’s mostly classical), (2) we want the hardware we run the simulation on to use the same principles as the universe we simulate (for simplicity), and (3) we want the simulation to be fairly fast: i.e., we don’t want to take a really long time to simulate one time step.

    Requirement (1) means we don’t have to propagate a full many-worlds-interpretation-like wave function; rather, at each time step, we randomly (according to the Born rule) select one subspace from a lot of orthogonal ones, keep simulating it, and erase the rest, freeing up most resources at every time step.

    A solution might be a quantum computer, since its full state space size is exponentially larger than its classical state space size (caveats in a moment). That means we need to make it only a little larger than a typical natural quantum system to be able to also cover all the classical DOFs in the universe in a single MWI-like world.

    Caveats: I’m not sure that the quantum state space size is a good proxy for simulation power. Here I’m assuming it is. Also I have no idea if any of the details used here, such as pruning orthogonal subspaces, can be implemented on a QC without destroying the simulation.

    To give some numbers to this, take something like 10^80 atoms in the observable universe and multiply by some rough number to get at the classical number of DOF: something like M = 2^300 classical DOF.

    Then suppose the maximum natural quantum system’s state space in our universe is equivalent to approximately N = 10^4 qubits. By “natural” I mean the state space size needed to carry out chemistry.

    So then there are roughly M/N separate quantum systems to simulate plus roughly M classical DOF to simulate. We can do that on a quantum computer having log2(M/N 2^N + M) qubits, or roughly log2 M + N. Using the numbers above, that is 300 + 10^4.
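
    Here’s a quick sanity check of that count in code (just redoing the arithmetic above; M and N are the same rough guesses):

        import math

        M = 2**300   # rough count of classical DOF in the observable universe (guess above)
        N = 10**4    # rough size, in qubits, of the largest "natural" quantum system (guess above)

        # State-space size to cover: M/N separate quantum systems of dimension 2^N,
        # plus M classical DOF, as in the estimate above.
        total = (M // N) * 2**N + M
        qubits_needed = math.log2(total)

        print(qubits_needed)           # ~ 10287, i.e. roughly N + log2(M/N)
        print(qubits_needed / N - 1.0) # fractional overhead over one natural system: ~3%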

    Thus, a quantum computer that can simulate a complicated chemical quantum system needs to be only a little larger (~3% larger, with the numbers above) to simulate one MWI-like world of the observable universe. In the unlikely event that I’ve made it this far without committing one or more egregious errors, this would make a quantum computer very natural hardware for a universe, satisfying all three of my design requirements.

    A universe that runs this way would have a maximum in-universe QC size before it would exhaust the underlying hardware, so I can recklessly double down on this line of thought and predict (in doing so, I’m pretty sure I’m ignoring a ton of sensible things Scott says on this blog, but such is the right of the amateur) we might hit an upper limit to QC size in our universe.

  510. Ajit R. Jadhav Says:

    Scott #467:

    Do you understand that I’m not asking why you’d use QM to create this world—which is obvious—but rather, why you’d use it to create a world? While the majority of commenters here get this, there seems to be a persistent minority for whom it’s a completely ungraspable concept.

    But my position has been that I don’t, wouldn’t want to, and in fact cannot, know what specific physics theory to use in order to create a world like ours. And, also that none can.

    As to the business of just creating any which “world”: Please consult any of the many modern physicists, computer scientists, mathematicians… and philosophers.

    Scott #492:

    [snip] why should the existence of knowably indistinguishable particles have been such an important design requirement for our universe?

    Now you are doing much, much better. [See the effect of beginning to consider “our universe”?]

    … Replacing “design requirement” by “feature”, I would say that I am definitely going to add this bit to the set of things I should be explaining (even if only briefly).

    Best,
    –Ajit

  511. Philippe Grangier Says:

    This has been my first time contributing to this blog, so thanks to all for the interesting discussions, and sorry for wasting your time in case I did. As a last post, here is a tentative answer to Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?

    My answer: there was no such choice; the universe is both classical and quantum, depending on the way and the scale you look at it. Either a purely classical world or a purely quantum world would be physically inconsistent and would lead to absurd predictions; there is an abundance of examples. This may sound terribly neo-Bohrian, but I accept the heritage, modulo my claim that psi is predictively incomplete, and needs a context to get a physical meaning.

  512. Luke W Says:

    Hey Scott,

    I’m wondering what you make of Tim Palmer’s suggestion that “quantum indeterminacy may perhaps be replaced by certain kinds of ‘hidden variable’ chaotic dynamic, provided that the chaos is sufficiently nasty.” https://royalsocietypublishing.org/doi/10.1098/rspa.1995.0145

    Could this be a way out of the quantum mechanical realm back into the comfortable world of determinism? I think the idea is that instead of considering collapsing of the superposition to be random, it is determined by hidden variables that act in a manner so chaotic that we perceive it to be random, but nature can somehow compute it. Further, it may even be uncomputable to us. (I may not have worded that correctly, I’m not an expert – just interested!).

    Thanks!

  513. Mateus Araújo Says:

    Scott #484: I’m dead serious. People usually argue the opposite because they understand neither probability nor Many-Worlds.

    You didn’t reply to my request to give a definition of objective probability. I don’t expect you to succeed; people have tried to do it for a century and failed. I just want you to be honest with yourself and realize that you can’t. This failure is so widely recognized that the consensus in philosophy is that objective probabilities do not exist; philosophers are content to deal with subjective probabilities. Meanwhile the consensus in physics is that objective probabilities are obviously what quantum mechanics gives you, and physicists don’t worry about defining them.

    Now subjective probability is conceptually prior to quantum mechanics, and thoroughly unproblematic. Objective probability, on the other hand, was introduced in our theories with the Born rule, and remained mysterious until the advent of Many-Worlds.

    As for why it is a problem to have an undefined transition rule in your theory, that much should be clear. You might as well write that God decides whether to transition state A to B or to C.

    It’s true, people have done classical stochastic models for a long time. But you don’t need to know what probability is to do that; you just need to know how to deal with it, and how to get an RNG when you actually need to run the model. It’s like how people dealt with water for millennia before understanding what it was. They drank it, swam in it, piped it, froze it, evaporated it, made steam engines with it. All that without having the faintest clue about atomic theory and chemical bonds.

    The reason why true randomness only appears through the Born rule is because true randomness is deterministic branching. I do think that’s obvious. If we lived in a Permutation City world where mind uploading and copying of agents was commonplace we’d also see true randomness appearing in this emergent level, but we don’t live in such a world, at least not yet.

  514. B R Says:

    Scott #479:

    > That just pushes the question back to: why do we need waves or particles? We need information, but why does it have to propagate in either of those ways, let alone in both of them?

    I would like to re-emphasize that I really think that you are going to need *some* physical input for this discussion to go anywhere. You do not just need information.

    (In fact, let me also point out that the physical input necessary to ‘derive’ special relativity is absolutely insufficient to conjure up a theory of everything. Similarly, I think that all the modern and possibly deep links between information theory and quantum gravity go too far down the rabbit hole, and therefore might not be the best starting point to answer your question. Sorry!)

    > Furthermore, “wave/particle duality” strikes me as an old-fashioned way of talking. From a modern QFT standpoint, waves and particles are simply different ways to describe phenomena that can arise on quantum fields, the truly fundamental entities.

    But you were precisely asking *why* these quantum fields were necessary, no? In other words: of course particle/wave duality is old-fashioned, but how many ways are there to arrive at a unified description? Quantum mechanics is one way. (And, as Weinberg argues, QFT is the sort-of unique combination of QM + SR.) If it is the unique way then are we not done? At least to me your question then seems to be answered if we accept the starting point that we require a unified theory of waves and particles.

    Whether to investigate this possibility depends of course on how appealing you think that starting point is. Personally I would love to see if such uniqueness can be convincingly argued for. I think it would provide a wonderful supplement to any QM textbook, because the only experimental input you would need would be the photoelectric effect and the two-slit experiment for electrons.

  515. Peter Morgan Says:

    Scott, a question in MJ Geddes #490, “what is the physical interpretation of things like ‘negative probability’?” crystallizes for me that there is a fairly robust sense in which there is no such thing as a ‘negative probability’. There are, however, many examples of what I would call ‘negative and complex-valued pseudo-joint probabilities’, which have incompatible probability measures as marginals.
    Gil Kalai #454 name-checked Itamar Pitowsky, who is as clear as anything I’ve seen about the relationship between joint probabilities and incompatible probabilities: by definition, the latter do not admit representation as a joint probability. One way to think of the situation is for me that ‘negative and complex-valued pseudo-joint probabilities’ are a consequence of trying to force incompatible probabilities into a single joint probability.
    We know that incompatible relative frequencies occur in our analysis of experimental results often, but in particular they occur whenever we report a violation of Bell-inequalities. So being able to model incompatible relative frequencies is a good thing (so Wigner functions are OK, yes), even though our records of experimental results are certainly “classical”, just ordinary numbers on paper or in computer memory, as at least some strands of the Copenhagen interpretation insisted.

    Why QM? Given any large body of experimental data, on paper or perhaps Terabytes long, a list \([x]\), we can use arbitrary algorithms \(f_i\) to construct a list of summary values, which we call measurement results, \(M_i=f_i([x])\). Metadata about those Terabytes that derives from our knowledge of the experimental apparatus and procedure suggests that we choose specific algorithms, with a strong preference for terminating transformations, not just any algorithm at random. In particular, we distinguish different preparations, with which we associate sublists \([x]_j\), so we obtain a finite array of measurement results \(M_{ij}=f_i([x]_j)\).
    Now, we can look for solutions of the linear equations \(M_{ij}=Tr[\hat\rho_j\hat O_i]\), which can always be solved for high enough dimension. Indeed, it can always be solved for high enough dimension even if we require that \(\hat\rho_j\) and \(\hat O_i\) must be diagonal. Thus, I suggest, both quantum and classical presentations of past data are always possible; the kicker, however, is that any given choice of dimension leads to different interpolated and extrapolated predictions for the measurement results of future experiments. A given choice of measurement results \(M_{ij}\) is effectively a compression of the \([x]\), with all the choices that implies, as well as the choice to construct the particular apparatus. Of course in the above there has been no mention of a Hamiltonian or Liouvillian dynamics, of space-time as a way to index our measurements, and a slew of other important aspects, but there’s this very low-level sense in which I take classical and quantum formalisms to be universally applicable. As physicists, we can’t step outside our actual records of experimental results [as people, we can also think many things that we do not write down or that are otherwise not part of those Terabytes, but of that we cannot speak in formal communication with the editors of physics journals.]
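    If it helps, here is a minimal numerical sketch (in Python) of the trivial end of that claim: an arbitrary array \(M_{ij}\) reproduced exactly by diagonal \(\hat\rho_j\) and diagonal \(\hat O_i\), simply by taking the dimension equal to the number of preparations. The particular construction is deliberately the most boring one; the interesting question, as I say, is which choice of dimension gives sensible predictions for future data.
        import numpy as np

        rng = np.random.default_rng(0)
        m, n = 4, 5                        # 4 "measurements", 5 "preparations"
        M = rng.normal(size=(m, n))        # arbitrary array of summary values M_ij

        d = n                              # Hilbert-space dimension: one level per preparation
        rhos = [np.diag(np.eye(d)[j]) for j in range(n)]   # diagonal density matrices
        Os   = [np.diag(M[i, :])      for i in range(m)]   # diagonal "observables"

        M_rec = np.array([[np.trace(rhos[j] @ Os[i]).real for j in range(n)]
                          for i in range(m)])
        print(np.allclose(M, M_rec))       # True: the past data is reproduced exactly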
    What is not at all clear, however, is that such a linear model for the monolithic list \([x]\) is always the best way to analyze and present the results of experiments. I haven’t yet seen anything better than Hilbert spaces, POVMs, unitary dynamics, and all that, in such of the literature on generalized probability theory that I have seen, but people are certainly presenting alternatives (as just now on math.OA, “Dendriform algebras and noncommutative probability for pairs of faces”, https://arxiv.org/abs/2201.11747, which to me looks interesting).
    I hope some of this is a little helpful. Notice that there’s no mention here of “systems” or “particles”, which to me should only be introduced with as much care as we always should take when introducing “causes” as an explanation for what in physics is typically a very complex system of correlations, at the Avogadro Number scale and beyond. I suppose we can say that Judea Pearl has shown that up to a point it can be done, but I think it’s not to be done glibly even when it might seem obvious.

  516. Paul Hayes Says:

    Clinton #450 (“Feynman, a great popularizer, clearly said that what made QM different was that it allowed negative probabilities.”):

    BTW, in view of that and similar remarks here and elsewhere, it’s worth pointing out that QT is a proper probability theory, not a so-called quasiprobability theory, and it doesn’t allow negative probabilities.

  517. Scott Says:

    Steven Evans #507:

      Your assumptions turn the universal quantum computer into the perfect, fundamental being and we don’t get to ask why they exist. Our only choice would be to make your assumptions scripture and to sing the praises of the quantum computer.

    You do realize, don’t you, that we’re now 515 comments deep in a thread about why the universe is quantum, i.e., why the right model of computation for our world appears to be quantum computation? That I could just shrug and accept it, as many of my brilliant colleagues do, but that this thread exists because I (unlike many of them) hold out the hope that it might eventually be possible to do better?

  518. Liam Says:

    Scott #383 & #407

    Ok, here’s my attempt to say why I think symmetry really is as fundamental as physicists say it is, why you shouldn’t think about gauge theories as just a bizarre-seeming trick that works remarkably well for mysterious reasons, and why if you explained all this to a deity who was trying to invent rules for a new universe, she might find it so compelling that she dropped all her previous plans for a classical universe and started making a quantum one. I apologize in advance for the super-duper long comment, which honestly feels a bit inappropriate – in my defense, we’re already 500+ comments deep into this thread, so maybe one extra long one at this point isn’t hogging too much space?

    First, you say symmetries are really just a restriction to a subspace of states in the theory, and the theory happily lives in this space. But this is not really what symmetries are. Crucially, symmetries don’t remove states from the theory; they relate what you see when you look at different states in the theory. For instance, Lorentz invariance tells you that if you know the energy of a particle at rest, then you know its energy at any velocity. This point is particularly clear for symmetries like boosts, which don’t commute with the Hamiltonian and so are not associated with a conserved charge, so there is no charge sector to restrict to. But even when they do commute with the Hamiltonian, so there is a conserved charge, symmetries are much more interesting than a restriction to a charge sector. In particular, we’re often interested in theories with multiple phases, one where the symmetry is preserved by the ground state and another where it is spontaneously broken. In either case, the symmetry commutes with the Hamiltonian. In the broken phase, the symmetry tells you things like the degeneracy of the vacuum, and it relates the different degenerate vacua to each other. Even better, the symmetry doesn’t need to be exact, but can be explicitly broken, and you can still systematically include the effects of the explicit breaking in the predictions of the symmetry as long as the breaking is small. This is what is at the heart of the chiral Lagrangian, which describes things like pion scattering and meson masses. When you quotient by a symmetry, by contrast, there cannot be any symmetry breaking, no matter how small. Anyway, you get the point – symmetries are not just a fictitious device that lives in our heads; they are a physical action that you take on the states of the theory.

    I assume what you had in mind was that if you have a conserved charge, then you can project onto states with that charge and just get rid of all the other states. But even here, symmetries are usually more than that! We are often interested in continuous symmetries, and these are generally associated not just with a conserved charge but instead with something much more powerful, namely a conserved current and charge density, which satisfies a continuity equation. That means that you can measure how much charge is in any local region of space. Even if the total charge of the space is zero, you can dump some positive charge into some finite region (at the cost of putting some negative charge somewhere else), so in the finite region you are effectively seeing the predictions of the symmetry on charged states. Moreover, charge can’t just disappear from this region and pop up somewhere else – because of the continuity equation, the charge can only leave a region by passing through its boundary. So if you know the charge in some region at some initial time, and then you sit and monitor all the charge passing across the boundary, you also know the amount of charge at some later time. Again, the implications are much stronger than what you would get by just restricting to states with a fixed global charge. Even better, if your current is coupled to a gauge field, then you can measure the amount of charge in some region without actually looking inside that region, by using Gauss’ law.
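
    (Just to write out the statement I keep appealing to: a continuous symmetry with a local current gives \(\partial_t \rho + \nabla\cdot\vec{j} = 0\), so for any region \(R\), \(\frac{d}{dt}\int_R \rho\, dV = -\oint_{\partial R} \vec{j}\cdot d\vec{A}\). The charge in \(R\) changes only by flux through its boundary, which is a much stronger statement than conservation of the single global number \(Q=\int \rho\, dV\).)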

    This brings us to the role of gauge fields and gauge symmetry. Ironically, although they might seem deeper, the modern understanding is that it is actually the gauge symmetries that are just a quotient of the space, and have more to do with how we as humans describe them than what the theory is doing at a deep level. Most physicists prefer to use the term “gauge redundancy” rather than “gauge symmetry” for this reason. The actual invariant meaning of a gauge symmetry has more to do with the operators that make sense in the theory – things like Wilson lines, in particular. Often, gauge theories have equivalent descriptions where there are no gauge fields, and you don’t ever refer to gauge transformations at all, but the Wilson line operators are still there in some form. The descriptions that invoke a gauge redundancy are more about our desire as humans to write things in terms of local Lorentz-covariant fields, in the limit where the gauge theory has weakly coupled spin-1 massless particles. In fact, you can show that if you have spin-1 massless particles, and you want to write a Lorentz vector field for them, then on general grounds you have to identify field configurations that are related by gauge transformations – Weinberg proves this in section 5.9 of his QFT textbook. The theory itself doesn’t care about whether or not you use this description. So yeah, it’s a hack, it’s amazing that the people who invented gauge theories came up with it, but we understand very well now why one has to make *exactly this hack* and not any other one. On the other hand, what the actual underlying physics *does* care about is that the spin-1 particles are coupled to conserved currents! This follows from what are known as “soft photon” theorems, and they imply that massless spin-1 particles can only couple to conserved quantities. So conserved quantities are actually the fundamental thing here, since they are what allow spin-1 massless particles to have any interactions. Similar statements apply to massless spin-2 particles: they have to be coupled to a conserved energy and momentum, and if we want to write them in terms of fields then we have to introduce diffeomorphism invariance, which puts us on a path towards deriving General Relativity.

    So, why would our deity be moved by any of this? Well, one possible response to the “can’t you just get whatever interesting complicated dynamics you want using cellular automata?” question is that, yes, you can do basically whatever you want with them, that’s the problem! The system is too unconstrained and ad hoc. If you said “can’t your deity just, moment by moment, choose what every particle does in this universe”, the answer would also be yes, but you would immediately reject this option. However, the beauty of quantum mechanics is that, together with Lorentz invariance and some assumptions about particles and locality, it *drastically* constrains you to just a few choices. At one point, something like this was one of the main dreams of string theory – that the constraints of a UV-complete theory of quantum gravity would be so stringent that string theory would fix the parameters of the Standard Model to at most a handful of choices. Whatever you think of the progress of this program, it’s clear why it was so appealing. Ironically, though, something like this is actually true of *low-energy* quantum gravity. In that case, the constraints of Lorentz invariance, a spin-2 massless particle, and unitary quantum mechanics appear to uniquely land you on General Relativity at low energies. Similarly, the constraints of a weakly coupled spin-1 massless particle land you on Yang-Mills theories. Ah, but “isn’t GR + QM an unsolved problem?” I hear someone saying. One of the surprising little secrets of quantum gravity is that, for accessible regimes of physics, we already have a perfectly good theory of quantum gravity! It’s called the Effective Field Theory of GR, and it works great. This is not just the statement that quantum effects are small in practical situations, it’s that *we know how to calculate these quantum effects very precisely* in practical situations. It’s a strange quirk of history that this isn’t better known – partly because the people who “discovered” it (primarily, to my knowledge, Ken Wilson and Steven Weinberg), did so almost in passing while they were trying to understand nonrenormalizable theories more generally. So one of the most appealing things about quantum mechanics is its inflexibility – up to a few parameters, it fixes the low-energy physics. In fact, almost all of the particles we see in practice are around at low energies because of symmetry in some way or other. The only exception is the Higgs boson, which has no apparent symmetry reason to be part of the low energy theory, and this is what mystifies people so much about it.

    So now you go to your deity and say, look, you can keep doing what you are planning with your classical universe, and it’s going to be the absolute wild west in terms of the landscape of rules you are allowed to consider. Or, you can do this thing called quantum mechanics, and with a couple assumptions about symmetry and massless particles with spin the low-energy theory is nearly going to be chosen for you. Which do you think she would pick?

    There are of course a number of objections you might make. For one, the Standard Model isn’t *that* constrained – you have to choose the gauge groups and their matter representations, plus the values of all the couplings – maybe some classical theory could be more or less equivalently constrained just by demanding internal consistency? To which I would say, sure, maybe, show me the system you have in mind and we can compare. But if the comparison is with the wide space of, say, all cellular automata, then I think our quantum system pretty clearly wins. Another complaint you might have is that this isn’t a logical argument like Boltzmann’s derivation of thermodynamics; it’s essentially an appeal to aesthetics. I also don’t have a great answer to this, except to point out that the empirical fact that Nature has apparently, time and again, chosen fundamental laws that obey rigid mathematical structures and follow from a small set of simple elegant principles is one of the great mysteries of our universe, and I think it is implicitly part of the premise of Q1.

  519. OhMyGoodness Says:

    I may be missing some subtlety concerning the idea to simulate classical universes to see if complex life evolves, but in this universe quantum processes are fundamental to complex life. No classical explanation can account for the efficiency and/or speed of photosynthesis, cellular respiration (electron transit through the mitochondrial membrane), reaction rates of enzymes, etc. I would add to this list anything akin to human consciousness, based partially on the evidence of the impact on consciousness of anesthetics that bind via quantum mechanical London forces at receptor sites. If complex life arose in a classical universe, then it would necessarily rely on far slower and less efficient processes than here.

    When someone claims “intelligent design” my thought is, well, not really that intelligent (say maybe 130 on the Stanford-Binet). Clearly they had great tools, but you can’t gloss over the fact that mistakes were made. It could have been done better, no question, by an intelligent designer with no constraints.

    Your negative comments about our species align well with the Norse creation story where humans arose from the underarm perspiration of a god.

  520. Jester Says:

    Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality?

    If the first, the Born rule is on the face of it no great mystery, “it’s just how we do things”, in the same way that action at a distance is no practical problem (perhaps a philosophical one; but we don’t do philosophy in science); the only thing that counts is that the results of the calculations are correct, that is, usefully mappable to arranged experiments (or natural occurrences). The question then still is: why does one method work, while others seemingly don’t? (This seems to be Scott’s stance on Q2.) But that could merely (??!!) be a problem of “we build tools, both physically and mentally, so *that* they work”, which is where we stop worrying and use them. (We then use these tools to eventually create the next better generation of tools, a process to which there seems no obvious end.)

    If the second: how do physical systems “know about” their wave function, let alone the result of “measurements”? Do parts of the system measure other parts? In what way are they even separated? Is this in any way dependent on human observers at all, or rather God’s Ineffable Code (so to speak)? The latter seems to be Scott’s hope, from statements like “QM (being) exactly true” as “one of the profoundest truths our sorry species had ever discovered”, and the Einstein quote about whether the creator had any choice.

  521. fred Says:

    I find it interesting to note that all digital computers are the (trivial?) illustration of the fact that a very classical and stable reality can be built on top of QM.
    E.g. the hundreds of thousands of instances of a certain model of Lenovo Laptop running the same program give the same macroscopic output even though microscopically they’re all very different.
    In other words, a given digital computer is a system which branches microscopically just as much as any other system, but within it lies a sort of “conspiracy” that creates a stable macroscopic state comprised of millions of abstract symbols.
    But such symbols are “secret” and in the eye of the beholder: because digital computers are just extensions of our brains, that high level reality only makes sense to us, and the same observation probably applies to the human brain; even though, moment by moment, my brain branches into many different paths microscopically, my thoughts are somewhat stable across many of those branches (maybe not as stable as a digital computation, but pretty close).
    Of course, fundamentally, this can be boiled down to the observation that three rocks on the ground will represent the number 3 in a stable manner, regardless of all the jiggling, decoherence and randomness going on at the atomic level.
    But rocks are just going to sit there, and nothing interesting will happen. Other classical systems exist, though (like amino acids), which can carry along a self-“complexification” of the mapping between such high-level stable symbols and stable properties of the environment, to the point where those symbols eventually capture questions such as “Why QM?”.

  522. 1Zer0 Says:

    If I want to construct a world from scratch, and I have the power of a probabilistic Turing machine (overused, let's have some variety: make that a probabilistic register machine, + an integrated hypercomputer for one specific occasion) at my disposal, I suppose I would first set some design goals that I would like to implement:

    • Goal 1) At least 7 macroscopic spatial dimensions.
    • Goal 2) 1 macroscopic temporal dimension.
    • Goal 3) Objects of arbitrary size may randomly come into existence at some point in time, at a random location, and then cease to exist spontaneously, such that no spawned object may exist more than once in the history of the universe (we need a function checkExistedAlready() ).
    • Goal 4)  There should be conscious entities with a large set of qualia experiences.
    • For true fun, maybe Goal 5) There is a subset of those conscious entities studying the laws of the world, so whenever someone gets closer to a "final theory of the world", the world adapts and spawns a new computable law constructed according to an uncomputable rule (which is calculated by the hypercomputer). The new law will not interfere with the macroscopic structure of the world.
    • Goal 6) No CTCs, otherwise the probabilistic RM would need to call the hypercomputer on another occasion.

    Goals 1-3 should not be a problem. Even though, here in a 4D world, I unfortunately can't render it and no one can imagine a 7D image (as a side note, I find it among the most mind-blowing things overall that the dimensionality of our world is so unbelievably low), the spacetime algebra of such a world can still be computed.

    I am not sure if Goal 4 can be implemented.

    If Yes:
    Nice. Let's see if I can inject the qualia of a 200-dimensional space into the consciousness of someone living somewhere in my 7-dimensional spacetime. Or the finite string w ∈ {0,1}* that is the mental impression "blue". Btw, does every stone implement every finite state machine (http://consc.net/papers/rock.html), and if yes, and if qualia is just some information-theoretic set of states, does every stone have every possible qualia over time?

    If No:
    Nice. What else do I need to add qualia to my "lifeless" world inhabitants? Can a world have some non-computable or even non-mathematical properties that are needed for qualia?
    If not with math, how can they be "wired" into the world?

    Which design goal could I have that would prevent me from using a probabilistic register machine? I could for example decide that I want spacetime to be R^n. If my space is R^n, I can't possibly update an uncountably infinite subset of elements of R^n in finite time with finitely many steps on my probabilistic RM.

    I don't know why our world has QM; the trivial reason, I guess:

    If you believe all possible worlds exist, then necessarily there is at least one world with QM. And if all possible worlds exist, there are "many" worlds far stranger than a QM world.

    Or it is necessary to achieve some design goal.

    Either way, great questions, and the comment section has intriguing ideas; I'm probably going to reread everything in summer when my mental abilities are at their peak.

  523. Jester Says:

    For the inclined reader, here is a nice elaboration of an example in physics where “from simple requirements of rational consistency we could have arrived at the Lorentz transformation.” As Minkowski said, “Such a premonition would have been an extraordinary triumph for pure mathematics.” (Which seems to be Scott’s dream as well.)

    “Staircase Wit” https://www.mathpages.com/rr/s1-07/1-07.htm

    (I am not in any way affiliated with the author.)

  524. Crackpot Says:

    Why is the universe English?

    The Schrödinger equation’s conceptualization of a “wave” is, effectively, a function which assigns a complex number to every point in a given space. A complex number, in the context of a fixed set of dimensions, is equivalent to an amplitude. In either case, what we’re really talking about is a value.
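
    (For concreteness, this is exactly what the single-particle Schrödinger equation does; my addition, with the standard symbols \(\psi\), \(V\), \(m\): it governs a map \(\psi : \mathbb{R}^3 \times \mathbb{R} \to \mathbb{C}\) via

      \[ i\hbar\,\frac{\partial \psi(\mathbf{x},t)}{\partial t} \;=\; -\frac{\hbar^2}{2m}\nabla^2\psi(\mathbf{x},t) + V(\mathbf{x})\,\psi(\mathbf{x},t), \]

    so the “wave” is just the complex value \(\psi(\mathbf{x},t)\) assigned to each point and time.)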

    When you start examining the limits of what a valid space must be for the purposes of the equation, and notice the existence of alternative coordinate systems, I just don’t think the question means anything. The universe isn’t “quantum mechanical” any more than it is “English”; “quantum mechanics” is just a way of talking about the properties of universes.

    Now, it may be a more or less intuitive way of approaching questions about the universe; that is, the real question isn’t “Why is the universe quantum mechanical?”, but “Would quantum mechanics always be the sensible language to talk about the universe?”, the answer to which is clearly no, because classical physics worked until it didn’t. What language you use to describe the universe is going to come down to what phenomena you need to describe.

    Why isn’t the universe classical? Because classical physics is an incomplete language; it can’t describe certain phenomena. Observe that in classical physics, waves exist; the Schrödinger equation will apply. There will be quantum mechanical behavior, which classical physics would not be able to describe. So we’d have to invent quantum mechanics, or something like it, in order to describe that behavior. And then we’d be asking the same question.

  525. NT Says:

    Given your own description of QM vs classical mechanics as just the L2 norm vs the L1 norm, it’s easy to see how L2 is much more special/symmetric than L1:

    1) L2 norm is the only norm that is self-dual
    2) The symmetry group of a (finite-dimensional) space with the L2 norm is infinite (a continuous group of rotations), whereas for other norms (L1 included) the symmetry group is just a finite group of signed axis permutations.

    I obviously don’t know why the more symmetric option is usually the correct one, but it seems that whenever things could be more symmetric or less symmetric, the universe goes along with the more symmetric option. So in this sense QM is just another symmetry of the laws of physics. Just like we have rotational and translational symmetries of space, we also have a rotational symmetry of the space of “probability vectors”, which gives us QM.
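
    A quick numerical illustration of point (2), not part of NT’s comment: a generic rotation preserves the 2-norm of a vector but changes its 1-norm, whereas a signed axis permutation preserves both.

      import numpy as np

      theta = 0.7
      R = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])   # rotation: an isometry of the 2-norm only
      P = np.array([[0, -1],
                    [1,  0]])                            # signed axis permutation: isometry of every p-norm

      v = np.array([1.0, 0.0])
      for M, name in [(R, "rotation"), (P, "signed permutation")]:
          w = M @ v
          print(name,
                "2-norm:", round(np.linalg.norm(v, 2), 3), "->", round(np.linalg.norm(w, 2), 3),
                "1-norm:", round(np.linalg.norm(v, 1), 3), "->", round(np.linalg.norm(w, 1), 3))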

  526. Anbar Says:

    Jester #520

    – Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality? –

    I don’t see the exclusion here.

    It is certainly something physicists/computer scientists do, but it is of course related to properties of nature/reality.

    It is the ultimate, possibly least constrained, embodiment of the scientific method.

    You need a protocol to follow in order to start with the desired configuration, a protocol for waiting or doing other things, and a protocol to spot some desired configuration at the end; and you need a way to predict the success rates for each combination of protocols.

    QM provides exactly that.

    Coming to the supposedly incompatible second statement, the wave function is a mental representation of a specific protocol. It “belongs” to the QM user (when his confidence level is high enough that the relevant protocol was followed with no errors), but it of course represents both the system and the apparatus, i.e. “properties” of nature/reality, does it not?

  527. James Gallagher Says:

    QM can be seen to arise from quite elementary considerations if you try to construct an (interesting) reality from scratch yourself:

    First we start with one state, x, and allow it to vary (so it can do something non-trivial). So we need to give it a range of values; integers or reals seem a reasonable first choice.

    Now we have to apply a function to vary it, so multiplication or addition seems a good choice.

    However, in the case of addition by anything other than 0 we would get a value tending to infinity, and in the case of multiplication by anything other than +1 or -1 we would get a state tending to zero or infinity.

    Neither case seems interesting.

    So rather than just a single real number, let’s multiply by a couplet, i.e. a number with a magnitude and phase, like a complex number of magnitude 1; now the state varies continuously without exploding to infinity or shrinking to zero.

    THIS IS QUITE INTERESTING…

    But we really would like a little more complexity, so let’s introduce a second state y, and by similar reasoning we should multiply by a unitary 2×2 matrix to get an interesting universe.

    But although interesting, it is also predictable (predetermined).

    So let’s introduce a spontaneous random change in x and y while keeping the overall modulus (|x|^2 + |y|^2) constant (otherwise we could get an exploding or shrinking universe).

    Now we have a very interesting universe, which is unpredictable but does not explode to infinity or shrink to zero, and can be analysed statistically.

    For a very large number of states, and large unitary matrices, the random change will barely be noticeable in the unitary dynamics (we can have very stable microscopic structures with decay lifetimes on huge timescales)

    Also, the Born rule now is just the need to keep the states on the “sphere of rotation in C^n” – it’s anthropic in origin: if it didn’t hold, the universe would have states tending to infinity or zero, so we would never emerge to observe such a universe; we need this “Pythagorean rule” to enable us to emerge from the universe’s evolution…
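
    For what it’s worth, a tiny numerical sketch of this construction (my addition, with arbitrarily chosen constants): repeated multiplication by a real number other than ±1 explodes or decays, while a unit-modulus complex multiplier, or a 2×2 unitary acting on (x, y), keeps the “size” of the state fixed forever.

      import numpy as np

      x = 1.0
      for _ in range(100):
          x *= 1.1                       # real multiplier: |x| explodes (0.9 would shrink it to zero)
      print("real multiplier:", x)

      z = 1.0 + 0j
      phase = np.exp(1j * 0.3)           # unit-modulus complex multiplier
      for _ in range(100):
          z *= phase
      print("unit-modulus multiplier, |z| =", abs(z))   # stays 1 (up to rounding), while z circles forever

      theta = 0.2
      U = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]], dtype=complex)   # a simple 2x2 unitary
      state = np.array([1.0 + 0j, 0.0 + 0j])
      for _ in range(100):
          state = U @ state
      print("unitary on (x, y): |x|^2 + |y|^2 =", np.abs(state[0])**2 + np.abs(state[1])**2)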

  528. Scott Says:

    Matt Leifer #358: Thanks for the extremely interesting comment—one of my favorites of the thread!

    I’ve also marvelled at the fact that QM seems to admit such different ways of describing the same situations—even the extremes you call Church of the Larger Hilbert Space and Church of the Smaller Hilbert Space (CLHS and CSHS). Certainly, most other theories that one could make up would not admit such an enormous range of possible views of how to interpret them. But could the difficulty of the interpretation problem actually have been a “design goal,” a thing that we can imagine QM as having been “chosen” to satisfy? As if, in the heavens 14 billion years ago, there was one faction of radically subjectivist, QBist angels, but also a bitterly opposed faction of angels that wanted an objective state describing exponentially many possible experiences, and God had to pick a compromise theory that would satisfy both factions? 😀

  529. Clinton Says:

    Hello again Paul #516

    Thank you again! That was a terrible sentence I wrote! Poor Richard probably rolled over in his grave.

    I guess I was misremembering or mixing up things like this:

    https://www.nature.com/articles/471296a
    Which says “According to Feynman, the key difference in quantum theory is that the particle does not follow the classical path, or any single path. Rather, it samples every path connecting A and B, collecting a number called a phase for each one. Each of these, in concert, determines the probability that the particle will be detected at B.”

    Thus I was leaving out that, while the amplitude (phase) may possibly be negative, that is NOT the probability but rather “determines” the probability. Or maybe it is part of determining the probability?

    Feynman’s paper
    https://cds.cern.ch/record/154856/files/pre-27827.pdf

    I went back and read over Feynman’s paper and find that Feynman appears to endorse something like using negative numbers in intermediate calculations as part of a probability theory. But please help me to understand that better if that is not what he is saying – because I do want to be clear on that. Here is another take by John Baez with more references:

    https://johncarlosbaez.wordpress.com/2013/07/19/negative-probabilities/

    What Feynman DOES say is: “It is not our intention to claim that quantum mechanics is best understood by going back to classical mechanical concepts and allowing negative probabilities … Rather we should like to emphasize the idea that negative probabilities in a physical theory does not exclude that theory, providing special conditions are put on what is known or verified.”

    So, I thank you because I definitely do NOT want to give anyone the impression that Feynman was saying “just add negative probabilities.”

    I’m probably getting into too much hot water by pulling Feynman into this – which is probably one of the reasons for Scott’s rule that we not try to pull in other sources but just make our own arguments 🙂 So let me stop doing that.

    I want to get back to basics and consider only what the QT postulates say. (I’m looking at the Nielsen and Chuang version of the postulates.)

    Postulate #1 allows complex numbers (amplitudes), including negative ones, to encode the state vector.

    Postulate #3, the Born rule, interprets the squared magnitudes of those amplitudes as encoding the probabilities of the basis states.

    Thus, you are absolutely correct that QT doesn’t have negative probabilities. The probabilities are always positive by the Born rule. What QT does allow is for the USE of negative numbers in the system state vector.

    The (possibly) negative complex numbers should not be understood to be the probabilities. Only the squared magnitude of those numbers may be understood as the probability – per Born’s infamous best use ever of a footnote.

    Does that then sound like the right way to say how negative numbers are involved?

    One thing I would add is that QT requires that the normalization be kept “in place” through some sort of “computational overhead” over those (possibly) negative amplitudes in the state vector. In other words, those supernatural monks keeping track of all these calculations on some cosmic side ledger in a higher plane of existence must maintain things so that, even though the amplitudes may be negative, they still stay in that Born relationship with the amplitudes for all other possible states … just in case a physicist (or some part of the environment) suddenly asks them for their squared magnitude 😉
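
    A minimal sketch of the point above (mine, not Clinton’s): amplitudes may go negative and interfere, but the Born rule always yields nonnegative probabilities as squared magnitudes.

      import numpy as np

      H = np.array([[1, 1],
                    [1, -1]]) / np.sqrt(2)          # Hadamard gate

      plus = H @ np.array([1.0, 0.0])               # |+> = (|0> + |1>)/sqrt(2)
      back = H @ plus                               # second H: the |1> amplitude cancels via a negative term

      print("amplitudes after second H:", back)             # [1, 0]: a negative amplitude cancelled a positive one
      print("Born-rule probabilities:", np.abs(back)**2)    # [1, 0]: always nonnegative, summing to 1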

  530. Scott Says:

    Anbar #366:

      … not(QM) was already logically incompatible with the 19th century experiments establishing Maxwell’s equations. Not sure how far back you need to go in terms of empirical evidence before classical explanations start requiring Rube Goldberg concoctions, but I would guess not much

      … Am I victim of this self confident delusion you mentioned and missing something obvious?

    Well, possibly! What I’ve found weird, in this thread, is to be sandwiched between two self-confident extremes:

    (1) People lecturing me on why the “why QM?” question can obviously never be answered; how I need to learn to accept that certain things are true “just because.”

    (2) People lecturing me on why the “why QM” question obviously has been answered, because how other than QM would you account for such-and-such empirically observed phenomenon?

    In some sense, both of these extremes pointedly refuse to enter into the thought experiment that this whole post was about: namely,

    Suppose you were designing a new universe from scratch. It wouldn’t have to look like this universe, but you might want it to produce rich, complex behavior in an elegant way, or something along those lines. What considerations would militate in favor of your choosing to make your universe quantum or classical?

    There are also many dozens of comments that do directly engage the question, and I appreciate those enormously! But some fraction of comments continue to round down to either (1) or (2), no matter how often I try to clarify.

  531. Scott Says:

    mjgeddes #367:

      This is the mistake of nearly all the commenters in this thread; one simply cannot hope to understand QM merely by shuffling math symbols or firing off vague verbal ‘interpretations’ of abstract non-physical concepts like ‘wave functions’, one must obtain the underlying *physical* principles, expressed in terms of *non-commutative geometry*.

    Dozens of commenters here have been talking about physical principles, including Lorentz invariance, the existence of stable bound states for atoms, the ability to resolve ultraviolet catastrophes, etc. etc. Why on earth would you identify “*physical* principles” with noncommutative geometry? The latter is just one particular mathematical idea for how to formulate quantum theories of gravity—an idea that doesn’t seem to have enjoyed great success in physics so far, although maybe that will change.

  532. Jester Says:

    Anbar #526:

    Indeed there is no exclusion.

    “It is the ultimate, possibly least constrained, embodiment of the scientific method.”

    I can see that, yes. Although perhaps a bit hopeful.

    “You need a protocol to follow in order to start with the desired configuration, a protocol for waiting or doing other things, and a protocol to spot some desired configuration at the end”

    I love this, because it both resembles the “fixing of initial conditions, calculate, compare end result” of a physics calculation, as well as the “input, program running, output” of a computer calculation, devised and directed by people.

    “the wave function is a mental representation of a specific protocol. It “belongs” to the QM user”

    I see. But then my point is: how does nature do it herself (so to speak)? How do atoms do it to form chemical bonds? “Where” is the wavefunction “there”? Surely atoms themselves do not undertake the above threefold action.

    “it of course represents both the system and the apparatus, i.e. “properties” of nature/reality, does it not?”

    I guess so; the point I am grappling with is, how does QM work from the perspective of (say) an atom? How does it know what to do? Because the QM calculations are (probably) not taking place locally where it is.

    (Although perhaps these are improper, ill-defined questions, or naive… But I can’t help comparing it to, for example, an apple following the gradient of a gravitational field. Probably too naive; my apologies to the intellectual heavyweights around here, of whom I gather there are many.)

  533. fred Says:

    The most puzzling thing to me about QM is that nature seems to take all the possible futures as input to the process by which it picks just one amongst those futures.

    In other words, it’s exactly the same reason that, on the one hand, a QC seems to be doing some bookkeeping that’s vastly beyond what a classical computer can do (i.e. juggling all possibilities at once at no extra exponential cost), yet in the end so much of the information is discarded that there’s no super obvious practical win across the board.

  534. Scott Says:

    Cleon Teunissen #370:

      I am aware of course that the claim that Hamilton’s stationary action can be understood _classically_ is an unexpected one. Your _expectation_ is that Hamilton’s stationary action comes from QM.

      I am aware: If a claim is highly _unexpected_ then the demonstration will have to be low friction, very accessible. (Conversely, if the demo would be opaque/dull then most likely the reader will dismiss it.)…

    Alas, I took a look at your links and found them incomprehensible. You jump almost immediately into diagrams and equations, without ever explicitly stating what are the more basic principles from which you propose to derive Hamilton’s stationary action (if not QM), and crucially, why those more basic principles (whatever they are) don’t already implicitly presuppose the answer that you want.

    I confess that I might be biased by having had this argument previously, with people who were 100% confident that they could explain the stationary-action principle in a purely classical way, but then every time I asked them to teach me, I got a huge, complicated runaround, never bottoming out in anything I understood the way the quantum-mechanical explanation does.

  535. Scott Says:

    Clinton #374:

      The Scott Fear:
      Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.

      The Clinton Fear:
      Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.

    I mean, either

    (a) all of our experimental data continues to be consistent with the hypothesis that QM is the “actual operating system of the universe,” or else

    (b) it doesn’t.

    In case (a), we can simply continue regarding QM as the “actual operating system of the universe,” as best we can tell, subject to the usual proviso that in science you almost never “prove” your theories, you only accumulate more evidence for them or you rule them out.

    In case (b), we might or might not be smart enough to come up with the deeper theory, but at least we’ll then know that QM was not the “actual operating system of the universe,” and that its appearance of being so was illusory!

  536. Jester Says:

    Scott #530:

    “Suppose you were designing a new universe from scratch.”

    That is… a tall order.

    Also, what do you mean by “universe” here? Something like our reality, made of atomic matter, inhabited by animals etc.? Or a model universe like, for example, Minecraft, or toy universes (models) made by theoretical physicists to probe/better understand their laws? Purely mathematical structures in the platonic sense?

    I was under the impression that your challenge/questions were pertaining to our actual reality, and the curious and peculiar scientific relevance of QM in it.

    As an aside: while QM and its formalism is indeed extremely important and unavoidable in many respects, there are many scientifically important questions it doesn’t (can’t?!) answer (like the particle families, whose properties are essentially empirical, especially mass). It is not the answer to every question in physics, let alone in the universe; so why are you so “hung up” on it? Apart of course from it being your expertise and you liking it, both of which are sufficient reasons to do it. Perhaps you overestimate its importance for universal, even quasi-philosophical questions?

  537. Scott Says:

    Clinton #377:

      “If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).”

      But should we try?

    Yes, we should. 😀

      A quick page search shows almost no mention in this thread of Godel or the Halting Problem. That can’t be right…

    For starters, QM, and other physical theories, are not “formal systems” of the kind that the incompleteness theorem talks about. Formal systems are things that we can use to reason about physical theories.

    Yes, it’s possible that some theorems relevant to physics could be independent of ZFC or whatever—but even if so, such theorems would necessarily involve quantification over an infinite set of possible situations, or time evolution infinitely far into the future, or some other infinite element. Those theorems’ independence from ZFC would be no bar whatsoever to finding a unified theory of fundamental physics, which is more like finding the right “axioms” than like discovering all the possible consequences of those axioms (i.e., theorems). Sure, we might not discover that unified theory, but if so it will be for a different reason, such as lack of ingenuity or funding. As someone who knows the incompleteness theorem pretty well, I can tell you with certainty that the high-energy physicists will not be able to blame their failure on it. 🙂

  538. Shmi Says:

    I think I may have made this point in one of the comments before, without much traction:

    it is well known that one cannot have \([x,p]=i\) in a finite-dimensional Hilbert space, so either it’s an approximation or the Hilbert space is truly infinite.
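
    For readers who haven’t seen it, the standard one-line argument (my addition, not Shmi’s): in a \(d\)-dimensional Hilbert space the trace of any commutator vanishes,

      \[ \mathrm{Tr}\,[x,p] = \mathrm{Tr}(xp) - \mathrm{Tr}(px) = 0, \qquad \text{whereas} \qquad \mathrm{Tr}(i\,\mathbb{1}) = i\,d \neq 0, \]

    so \([x,p] = i\,\mathbb{1}\) can only hold where this trace argument breaks down, i.e. for unbounded operators on an infinite-dimensional space.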

  539. Jester Says:

    Incidentally, do we know why the “speed of light” has the value that it has, relative to other physical values? And if this ratio were changed, would that render the universe unstable in the calculations?
    (Apologies if too off-topic; please disregard accordingly.)

  540. Scott Says:

    Jim Graber #379: Yes, that paper is indeed an attempt to answer the question in my post. I’d have to study it more carefully, but my immediate reaction is: it seems to rely on the coincidence that, in our (3+1)-dimensional universe, spin-1/2 particles physically instantiate qubits, with the possible spin directions corresponding to points on the Bloch sphere. I’ve certainly marveled at the same fact, but doesn’t it seem a bit too specialized to be taken as the reason for the entire edifice of QM?

  541. Scott Says:

    fred #381:

      What do you think of Sean Carroll’s program to show that everything, including spacetime, could be derived on top of the wave function as the most fundamental object?
      I guess that’s one way to go about proving that QM is necessary, no?

    I’m a fan of that program and I follow it with interest! Even if it succeeded, though, in a very precise sense this program would not show that QM was “necessary”: instead, it would show that QM was “sufficient.” 🙂

  542. Scott Says:

    Stewart Peterson #386: Very briefly, the P vs. NP question talks only about computational problems where

    (1) all of the required information is provided as part of the input;
    (2) as soon as you see the input, you know an efficient, explicit algorithm to check any proposed answer; and
    (3) the “only” difficulty is the exponential number of possible answers to check.

    3SAT and Sudoku are examples of such problems; reversing physical evolution (in cases where information has gotten lost) is not an example.

    That’s the reason why your 1:45AM thoughts can’t possibly have solved P vs. NP. 🙂
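
    (To illustrate property (2) with 3SAT, a sketch that is my addition rather than part of the original comment: checking a proposed assignment takes time linear in the size of the formula, even though there are 2^n possible assignments to search through.)

      def satisfies(clauses, assignment):
          """clauses: list of 3-tuples of nonzero ints, where the literal k refers to
          variable |k| and a negative sign means negation; assignment: dict {var: bool}."""
          return all(
              any(assignment[abs(lit)] == (lit > 0) for lit in clause)
              for clause in clauses
          )

      # (x1 or x2 or x3) and (not x1 or not x2 or not x3)
      clauses = [(1, 2, 3), (-1, -2, -3)]
      print(satisfies(clauses, {1: True, 2: False, 3: False}))   # True: easy to verify
      print(satisfies(clauses, {1: True, 2: True, 3: True}))     # False: the second clause fails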

  543. Paul Hayes Says:

    Clinton #529 (“I went back and read over Feynman’s paper and find that Feynman appears to endorse something like using negative numbers in intermediate calculations as part of a probability theory.”)

    Sure. I endorse that too. Even in classical probability, e.g. P(A+B) = P(A) + P(B) + (-P(AB)).

    “Does that then sound like the right way to say how negative numbers are involved?”

    Sure. OTOH, for the sake of a better understanding of what’s going on, I’d strongly recommend steering clear of “QT postulates” and towards the QPT literature (e.g.).

  544. Scott Says:

    Philippe Grangier #388:

      I suspect you are circular when answering wolfgang #245 by invoking decoherence. If you look for an answer to the first part of the question (why did God use quantum theory to make the universe ?) you cannot invoke the QM prediction of decoherence to answer the second part (why and how did He make it appear classical to us ?). Unless you consider that God made QM for the very purpose of using decoherence, which would be an answer to your initial question.

    All I meant was that, if we’ve already accepted QM, and we know that the universe will start in a special initial state and then apply a Hamiltonian that gradually fills out the Hilbert space, then decoherence theory plus MWIism give us a ready-made explanation for why observers within our universe could perceive a classical world subject to occasional random jumps. I wasn’t proposing this as an answer to the “why QM?” question: at most it could be one piece of a much larger answer.

  545. Scott Says:

    Philippe Grangier #390:

      Q1’ : Why (and how) did God make the universe both quantum and classical, depending on the way or the scale you look at it ? What would’ve been wrong with choosing one possibility only ?

      I think the answer to this question is much easier, because choosing either one leads to obvious contradictions with empirical evidence. And thus God needs both of them to get a meaningful universe…

    No, that doesn’t work. To say it for the nth time, the “empirical evidence” isn’t fixed in this exercise, but depends on what kind of universe God chooses (or what kind of universe we choose when roleplaying God), which is precisely the question at issue!

  546. Scott Says:

    Michel #391:

      As soon as we set up a ‘classical’ universe, we may get the reals as insufficient to manage reality. The reals automatically then give rise to the splitting field of complex numbers, which gives us a simpler to define universe, where more complexity is possible with less rules. In this way Q2 is (almost) inevitable.

    Even if I accept that, it still doesn’t explain why complex numbers should appear in physics as quantum-mechanical amplitudes, rather than in any of a thousand other imaginable ways.

  547. Scott Says:

    Veedrac #394:

      I don’t know enough to tell you why quantum mechanics is the right answer, but I do think there is a simple answer to why-not-classical that you are hinting at here, which is that it’s too small. If stars did not shine nor meteors fall, but we still learned about earthly facts of physics, evolution, and the history we’ve had of evolutionary catastrophe and innovation, then were we wise enough, we could still deduce *purely on first principles* that the world is too small, and there must be more to our universe we could not see. We would then not be surprised in the least to find out about quantum mechanics—for sure there must be an infinity of branching realities, else how could we so unlikely possibly come about?

    Again, unfortunately, I don’t think the “bigness” argument works on its own. There are already billions of galaxies that we can see, so why not simply postulate an infinity more that we can’t see, before going to a quantum-mechanical wavefunction?

  548. fred Says:

    Could it be that QM is the optimum operating system of the universe because a QC can simulate QM systems in a way where the systems and their simulations are perfectly indistinguishable? So that the simulation hypothesis could be perfectly realized (QM realities can be stacked on top of one another).

    Something that’s not true about digital computers and “classical” physics: no digital computer with finite resources can perfectly simulate many non-trivial basic classical systems (like the 3-body problem)?

  549. Martin Mertens Says:

    Scott #262 and #485

    “one of those possibilities implies unbounded speeds, therefore no true locality or isolation of subsystems, and therefore the other possibility is realized”

    “you want to be able to pick stuff up and move it around without changing its structure, and you also want an upper limit on the speed with which you can do so, since otherwise you could get a giant mess where everything instantaneously affects everything else.”

    Hi Scott, why is it obvious that the universe must allow true locality? Weren’t people fine with the idea that everything instantaneously affects everything else in the days of Newtonian physics?

  550. Scott Says:

    Brooks #408:

      I am late and not particularly knowledgeable, but IMO Q1 is just a specific case of the “why are so many of our conditions so perfect for us to exist in” question that is best-answered by the Anthropocene principle.

    LOL, I’m now imagining an “Anthropocene principle” that’s almost the opposite of the anthropic principle … saying that the universe must be such that life will not only arise, but also quickly destroy itself! 🙁

      Why quantum mechanics? Well, why DNA? Why gravity? There are probably lots of other ways things could have worked, but if they produced conditions for sentience, we (or our protoplasmic counterparts) would be asking “why froblits? Why BNM2? Why the general charge field?”

      We can’t know whether quantum mechanics is the only way for a universe to work. We just have to be comfortable with that uncertainty.

    The scientific spirit exists in a tension between two forces:

    (a) being comfortable with uncertainty, and
    (b) actually working to resolve your uncertainty.

    Jumping prematurely to an unsatisfying explanation, and fetishizing your in-principle inability ever to explain something, are both failure modes, since if there is a good explanation (as there so often has been in science’s history), both attitudes will massively interfere with your ability to find it.

  551. Scott Says:

    James Gallagher #409:

      I think you must be misunderstanding me, Schrödinger Evolution preserves |psi|^100654444222 (for example)

    No. No it does not. You are mistaken about this, and I did not misunderstand you. Unitary evolution preserves only the 2-norm. One can even prove that there are no nontrivial linear transformations that preserve other p-norms (except that stochastic evolution preserves the 1-norm on nonnegative real vectors only). Try some examples and see!
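
    Scott’s “try some examples”, done numerically (my sketch, with an arbitrary state and stochastic matrix): a unitary preserves the 2-norm of a state vector but not the 1-norm or 4-norm, while a stochastic matrix preserves the 1-norm of a nonnegative probability vector.

      import numpy as np

      H = np.array([[1, 1],
                    [1, -1]]) / np.sqrt(2)        # a unitary (the Hadamard gate)
      psi = np.array([1.0 + 0j, 0.0 + 0j])        # a normalized state
      for p in (1, 2, 4):
          print(f"{p}-norm: {np.linalg.norm(psi, p):.4f} -> {np.linalg.norm(H @ psi, p):.4f}")
      # only the 2-norm is unchanged: 1.0000 -> 1.0000

      S = np.array([[0.9, 0.2],
                    [0.1, 0.8]])                  # columns sum to 1: a stochastic matrix
      prob = np.array([0.3, 0.7])
      print("1-norm of probability vector:", prob.sum(), "->", (S @ prob).sum())   # 1.0 -> 1.0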

  552. OhMyGoodness Says:

    By the way, the written quote from Haldane is:

    “The Creator would appear as endowed with a passion for stars, on the one hand, and for beetles on the other”

    It’s often said that when queried by a women’s church group about what he had deduced from his studies, he answered with:

    “He has an inordinate fondness for beetles.”

  553. Scott Says:

    Philippe Grangier #410:

      In the last Växjö conference (August 2021) I felt that there was a consensus on some kind of trade-off…

    I’ve attended only one Växjö quantum foundations conference, back in 2003. I enjoyed it a lot and would go again sometime, but I also found the meeting to be teeming with axe-grinders, spouters, word-redefiners, Bell’s-theorem deniers, local realist diehards, and other quasi-crackpots! While I was only a grad student then, it was enough to tell me not to put too much stock in any “Växjö consensus”… 😀

  554. Scott Says:

    Russel #412:

      But the inhabitants of such universes quickly reached a point where they had discovered all the laws – and after that, well, they got bored. Weltschmerz. There was no sense of wonder, nothing left to debate, no uncertainty, no further progress to be made, no problems to solve, and in consequence, none of the higher planes of happiness that God wishes for his creations.

      And so God experimented with more complex universes. Too much complexity proved troublesome as well – the sentient beings that emerged were unable to solve anything, and gave up striving to understand. But somewhere in between there lay a sweet spot – where the sentient beings could perpetually live in wonder and debate and strive forward, attaining the satisfaction of discovery and understanding, but never the ennui and dissipation of having solved everything.

      Thus the beautiful irony of my question – why is the universe quantum mechanical? Simply so that I could wonder why.

    Next time you meet God, could you please ask Her why She didn’t create additional particles at the LHC energy scale, so that physicists in the early 21st century would have some more experimental clues to go on, just like their forebears did, and would not be at risk of “ennui and dissipation”? Many of my particle physics friends would like to know the answer. 😀

  555. Scott Says:

    Jacques Pienaar #424:

      If we are satisfied that our world-view stands up on its own principles, then it makes sense to hold on to it and ask: “why the quantum?” But if we suspect our present world-view might be too narrow (incidentally, this is not an unreasonable suspicion, given that the physicists who shaped the present world-view are disproportionately white men raised in the tradition of Western philosophy) then we should instead ask: “how do I change my world-view, so that the quantumness of the universe might fit in comfortably as an unquestioned postulate”?

    Please enlighten me, then. In your view, which non-Western or non-white-male-dominated philosophical traditions shed the most light on the question of why our universe turns out to be describable by a complex unit vector evolving unitarily in a tensor product Hilbert space?

  556. Scott Says:

    Aditya Prasad #436:

      I might also point to Rovelli and his recent discovery of and affinity for the Buddhist philosophical notion of “emptiness.” In short, emptiness could be described as the realization that “there is no way that things ‘actually are.’”

    Even if that were so, science would still be concerned with explaining how things appear to be. Is it your contention that Buddhist philosophy can help us with that?

  557. Scott Says:

    Thanks so much to everyone who offered advice about my ankle! I can walk again today without too much problem, but will still need to see an orthopedist or physical therapist to figure out how to prevent this from recurring every few weeks…

  558. Scott Says:

    Philippe Grangier #470:

      Well, I guess I’m among the minority who did not grasp the concept, which was maybe not so clear in your initial questions, it may depend on your mindset. But at least I’m happy to read that the answer to ‘why QM rules this world’ is obvious for you.

    To say it one more time:

    (1) If you take the experimental data of this world as given, then it’s obvious that you need QM (or some theory to which QM is an excellent approximation) to explain that data.

    (2) If you don’t take the experimental data of this world as given, but are designing a new world from scratch, then it’s far from obvious why or whether you’d choose to make your new world quantum.

    How could I have said this more clearly?

  559. Scott Says:

    Ted #471: Alas, perfectly reasonable comments often end up in my spam filter! I’m glad that you successfully posted another version of what you wanted to say. Feel free to email me if it happens again.

  560. Scott Says:

    Jester #477:

      Scott, do you include classical field theories (e.g. Maxwell) in classical mechanics, and Quantum Field theories (QED, QCD) in Quantum Mechanics; so the main difference is between classical vs. quantum, mechanics or not?

    Yes.

  561. Scott Says:

    Luke W #512:

      I’m wondering what you make of Tim Palmer’s suggestion that “quantum indeterminacy may perhaps be replaced by certain kinds of ‘hidden variable’ chaotic dynamic, provided that the chaos is sufficiently nasty.”

    Tim is now a full-on superdeterminist—not surprisingly, since that’s indeed the only way to torture a local hidden-variable model like what he wants into reproducing the Bell inequality violations. In a recent email exchange, Tim assured me that while superdeterminism might seem like a vacuous dead-end, equally able to “explain” anything whatsoever, all would become clear if only I understood the role of fractal cosmologies, p-adic numbers, and Fermat primes in his story.

    If you can’t predict my reaction to that, check out our recent superdeterminism thread for clues. 🙂

  562. Aditya Prasad Says:

    Hi Scott. First, thanks for responding. I’m deeply impressed by your willingness to engage with us (and especially with crackpots like me!)

    If you consider QM an explanation of “how things appear to be,” then yes, I am claiming that Buddhist philosophy (or realization) can help with that. The phrase “no way that things actually are” roughly translates to “realism is false,” not that there’s no structure to appearances.

    My claim is that in Buddhist realization, there is the direct perception that one’s experience has certain features that are consistent with (one very straightforward interpretation of) the measurement problem. In particular, that all of (this) reality hinges on *you specifically* in a mind-shatteringly bizarre sense, and yet that other people are no less real or conscious, and are in the very same situation as you. (And also that there are countless other realities, all of which one discovers oneself manifesting in/as, as one approaches becoming a full Buddha.)

    I’m not asking you to believe that this is true, but if it were true, it seems like it would motivate at least some of the structure of QM. In particular, it resolves your question to Yoni regarding why not simple classical indeterminism. Would you agree?

  563. Scott Says:

    Mateus Araújo #513:

      You didn’t reply to my request to give a definition of objective probability. I don’t expect you to succeed, people have tried to do it for a century and failed. I just want you to be honest with yourself and realize that you can’t. This failure is so widely recognized that the consensus in philosophy is that objective probabilities do not exist, they are content to deal with subjective probabilities. Meanwhile the consensus in physics is that objective probabilities are obviously what quantum mechanics gives you, and they don’t worry about defining them.

    You and I part ways at this stop. I don’t accept that one has to give a definition of “objective probability” that would satisfy you, or the world’s philosophers, before one can use the concept in constructing a physical theory—it’s enough to know how to work with it. This is directly analogous to how Newton didn’t have to “define” force, Einstein didn’t have to “define” spacetime, etc. etc. It was enough for them to give mathematical descriptions of how these entities behaved in their theories.

    Presumably you and I agree that even if the world were classical, we’d still use probability theory all the time to describe our knowledge, just like we use it now? Nevertheless, you maintain that the only possible kosher way (or at least, the only way known) to take this formalism that we all use and build it into the fundamental laws, is the indirect, amplitude-based way that QM does it? If so, I respect that you’ve staked out intellectual territory that you might inhabit alone, or fairly close to it! 🙂

  564. Scott Says:

    Liam #518: Thanks for the extremely interesting comment—another of my favorites of this thread!

    I have two followups:

    (1) Every time I’ve struggled through an explanation of why gauge redundancies imply the existence of new forces, it felt to me like hocus-pocus. I.e., you take a step that I would never have contemplated, of just blatantly adding a new term to your Lagrangian to counterbalance the otherwise-bad effects of the gauge symmetry that you insisted on, and then voila, a new force! Is your position more like: (a) I should continue struggling with this until it doesn’t feel like hocus-pocus, or (b) it would be better to go in the opposite direction, of starting with the existence of spin-1 and spin-2 forces, and then seeing why they basically have to act like gauge forces?

    (2) You argue, fascinatingly, that the QM/SR/QFT is preferable to classical cellular automata as the basis for a universe, precisely because the former is so much more constrained. One could of course wonder why being constrained is so wonderful—to me, it seems wonderful if and only if the rare universes that satisfy your constraints happen to be the sorts of universes you wanted anyway, which then brings us right back to the question of why you wanted them! 🙂

    But let me take a different tack: the principles of QM/SR/QFT aren’t that constraining—or rather, they’re constraining except in all the ways in which they aren’t. They still leave it to God to choose the matter content, gauge groups, coupling constants, and even the dimensionality of spacetime. In other words, looking at the known laws of physics, we might say that:

    (i) God decided, for whatever reasons, on QM and SR.

    (ii) A great deal else follows from (i) as inevitable logical consequences, and as should surprise nobody, God decided on all of that other stuff too. 🙂

    (iii) After (ii), God still had a huge (probably infinite) space of possible choices. So God made one such choice, freely and arbitrarily as far as we know today, just as if She had chosen the Game of Life from among all possible cellular automata.

    Furthermore, we can imagine that, to the Mind of God, everything contained in (ii) would be so obvious as to pass without comment (God has, of course, mastered Weinberg’s QFT textbooks 🙂 ) … so that God’s “mental effort” would be focused entirely on (i) and (iii). If so, though, the content of (ii) no longer seems all that relevant to the “why QM?” question—precisely because we now understand its inevitability!

  565. Scott Says:

    Jester #520:

      Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality?

    Again and again in this thread, people have asked questions like that, but I confess that I still don’t see how it helps us!

    Clearly the answer to your question is “both”: yes, QM is a thing scientists do, but the reason why scientists do it is that reality has some inherent property that makes it the right thing to do, when you probe reality at the most fundamental level currently known. The question we’re asking is why reality has that inherent property.

    Incidentally, this is also my central objection to QBism, Copenhagen, and all other subjectivist interpretations of QM. The advocates of those interpretations keep repeating, in a thousand variations:

    “QM is not about Nature, it’s about us, how we talk about Nature, organize our knowledge, etc.!”

    Clearly this strikes them as a profound insight. Whereas I, as it were, already assimilated the insight and just proceeded immediately to what I see as the real question, namely, “why does Nature have whatever inherent property makes it an overwhelmingly good choice to talk about it using QM—a property we might abbreviate as ‘being quantum’?” 😀

  566. Veedrac Says:

    Scott #547:

    Again, unfortunately, I don’t think the “bigness” argument works on its own. There are already billions of galaxies that we can see, so why not simply postulate an infinity more that we can’t see, before going to a quantum-mechanical wavefunction?

    I do not think very big finite classical universes are any simpler. You spend so much descriptive complexity on encoding a very big number in addition to your universe, and then on randomly seeding each part of your universe in a distinct and interesting way, whereas you could just describe your universe in a way that directly spawns that big and varied spectrum instead. A classical universe does not want to be big; it wants to be the size of the light cone, or the interaction cone if not so similar to ours, and if you just make that interaction faster, you’ve really just shrunk the pieces.

    More importantly, if you really do have that initialization of a really big universe, you’re still better off in an anthropic sense running a simple multiplicatively-growing universe off it, rather than a tiny classical one. Learning that there were other continents didn’t stop there being other stars, learning there were other stars didn’t stop there being galaxies, learning there were other galaxies didn’t stop reality constantly cloning itself in the quantum realm. The calculus doesn’t really change.

    (Sure, you say, but why interference, why not have this quantum cloning be non-interacting? Well, I don’t know, but the non-interacting spectrum of universes would still have to be non-classical, else they’re all just the same universe. The argument that QM is overly complex is a lot weaker against that comparison.)

    (Sure, you say, but why this anthropic assumption where your probability is proportional to complexities and quantities of observers? Well, I don’t know, but if it wasn’t the case that the universe was drawn from a computationally coherent distribution, I wouldn’t expect people to be able to reason about it probabilistically, nor would I expect to exist if anthropics did not.)

    (Sure, you say, but the quote said ‘infinite’, not just ‘very big finite’. Well, I don’t know, but look, over there, a conveniently timed distraction.)

  567. Lorraine Ford Says:

    How one symbolically represents the world is part of the problem.

    For starters, there is actually no such thing as a purely mathematical system, because mathematics only exists in the context of human consciousness and agency: people invented the symbols and the meaning of the symbols, people discern the symbols and people manipulate the symbols. A mathematical system is actually a larger thing that includes human consciousness and agency, but the essential human consciousness and agency aspect of a mathematical system is always discounted, i.e. regarded as being unworthy of consideration. To completely represent a stand-alone mathematical system, you would need to use symbols representing the equivalent of human consciousness and agency.

    It’s a similar thing when using symbols to represent the world-system. There are necessary aspects of a system that can’t be represented by the usual mathematical symbols:

    — It is logically necessary that any differentiated system (e.g. one that we would symbolically represent by equations, variables and numbers) can differentiate itself (i.e. discern difference in the aspect of the world that we would symbolically represent by the equations, variables and numbers).

    — And it is logically necessary that any moving system can move (what we would symbolically represent by) the numbers that apply to the variables. But you can’t represent number movement with equations: despite the delta symbols, equations can only ever represent relationship.

    That is why a computer program can symbolically represent a WHOLE system much better than mathematical symbols alone can, because a computer program has special symbols that can represent the discerning of difference, and special symbols that can represent steps (e.g. the assignment of numbers to variables).

  568. tg56 Says:

    Very interesting, I would read that book if you end up writing it!

    I wonder how much would fall out of the universe being in some way fundamentally discrete in time/space/content, yet having no particular grain, and additionally having all those aesthetically pleasing symmetries/conservation laws. If there’s a minimum meaningful length but no preferred grid, does that imply things need to be fuzzy in an interesting way? (As should be obvious, I know very little on the subject.) I’m now trying to imagine what the Game of Life would have to look like if there’s no grid, no tick of the clock, and no preferred direction or speed.

  569. Zeb Says:

    Assuming that someone has somehow managed to justify the unitary evolution / density matrix / partial trace parts of the QM formalism but not the Born rule, what is wrong with the following justification of the Born rule? (I feel like it is a “standard” argument, but my take on it might be slightly different from the usual one.)

    Suppose we wanted to test whether some repeatable process produces some particular outcome with probability at least \(p\). Then we set up a system where we repeatedly set up the experiment from scratch, entangle our measuring device with the outcome, and increment a counter every time the outcome we are interested in occurs (and also increment another counter to keep track of how many experiments we’ve done). At the end of each experiment, we look at the value of our counters, and determine whether or not at least a \(p\) fraction of the experiments had the outcome we are interested in (we do this using a classical Turing machine, implemented in QM via the whole business of reversible logic gates and throwing away junk bits). At the end of each round of this system, we update the value of a single, fixed output bit (really a qubit) which stores the answer to the yes/no question “were at least a \(p\) fraction of the experiment outcomes the interesting one?”

    As the system runs, the state of the output bit evolves. We only care about the output bit – we don’t care about all the complicated internal state of the experiment, or the measuring device, or the counters, or the multiply-by-\(p\) gates – we only care about this one, single (qu)bit. So we take a trace, and end up with the density matrix of the output bit, which is a simple two-by-two matrix. Now standard quantum mechanics tells you that this two-by-two density matrix will asymptotically approach either the matrix \(\begin{bmatrix} 0 & 0\\ 0 & 1\end{bmatrix}\) or \(\begin{bmatrix} 1 & 0\\ 0 & 0\end{bmatrix}\), depending on whether the probability \(p\) was larger or smaller than the Born rule’s prediction.

    To me, this looks like a derivation of (a frequentist version of) the Born rule from the unitary evolution / partial trace setup together with the *topology* on the collection of two-by-two density matrices. Did I cheat somehow? Is the partial trace rule as hard to justify as the Born rule?
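    (A minimal numerical sketch of this setup, under the assumption that each trial prepares a qubit whose squared amplitude for the “interesting” outcome is q. The diagonal entries of the output qubit’s reduced density matrix are then sums of squared branch amplitudes, and they polarize toward 0 and 1 as the number of trials grows; the off-diagonal entries vanish because the discarded registers for different records are orthogonal.)

      from math import comb

      def output_bit_density_matrix(q, p, N):
          """q: squared amplitude of the interesting outcome per trial,
          p: threshold fraction, N: number of trials.
          Returns the diagonal (rho_00, rho_11) of the output qubit."""
          rho_11 = sum(comb(N, k) * q**k * (1 - q)**(N - k)
                       for k in range(N + 1) if k / N >= p)
          return 1 - rho_11, rho_11

      # With q = 0.5 > p = 1/3, the matrix approaches diag(0, 1) as N grows:
      for N in (10, 100, 1000):
          print(N, output_bit_density_matrix(q=0.5, p=1/3, N=N))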

  570. Mateus Araújo Says:

    Scott #563: You still haven’t got the point. In Newton’s and Einstein’s theories the behaviour of force and spacetime are perfectly well-defined through their equations. You know how to transition from state A to state B. That’s all we need.

    This is not the case for objective probability. You are in state A, and from that you sometimes go to state B or state C, in such a way that when you repeat the transition many times the frequency of going to state B will be close to 1/3. Probably. How can you not see that the transition rule is ill-defined?

    In our world it’s easy to get around this, we just use a QRNG with bias 1/3, and we have a well-defined transition rule. We can’t invoke a QRNG if the universe is classical, though.

    If our world were classical we would indeed use subjective probabilities, no worries. I don’t see what you mean by using them in our fundamental laws. Subjective probability? You mean the weights agents use to calculate expected utilities and make decisions in situations of insufficient knowledge? In our fundamental laws? That doesn’t make sense, we need objective probabilities for the fundamental laws. Or are you claiming that the quantum mechanical probabilities are subjective? In that case you are alone with the Bohmians in one corner, while pretty much all physicists agree with me that they are objective. Or what are you claiming? I genuinely don’t understand.

  571. Scott Says:

    Mateus Araújo #570: Your position here is as baffling to me as the superdeterminists’ position is to both of us.

    Why not just have a theory where the “true state of the world” is a classical probability distribution (p1,…,pN), where that distribution evolves in time by stochastic matrices, and where the interpretation of the probabilities is the obvious one—call the probabilities “objective,” “subjective,” or whatever other words you want, they’re the numbers that agents should use if they want to be right about what they’ll see next? I.e., simply the direct classical analogue of MWI, with probabilities in place of amplitudes? How could that possibly be problematic in any way that MWI itself isn’t problematic?
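    (A minimal sketch of that picture, assuming a toy world with N classical configurations: the “true state” is a probability vector, and one time step is multiplication by a column-stochastic matrix.)

      import numpy as np

      rng = np.random.default_rng(0)
      N = 4
      S = rng.random((N, N))
      S /= S.sum(axis=0)             # each column is a probability distribution

      p = np.zeros(N)
      p[0] = 1.0                     # start in a definite configuration
      for _ in range(10):
          p = S @ p                  # stochastic evolution; p stays a distribution
      print(p, p.sum())              # entries stay nonnegative and sum to 1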

  572. Scott Says:

    NT #525:

      So in this sense QM is just another symmetry of the laws of physics. Just like we have rotational and translational symmetries of space, we also have a rotational symmetry of the space of “probability vectors”, which gives us QM.

    This is indeed an intriguing possibility, and not at all unrelated to what (e.g.) Lucien Hardy does in his derivation of QM!

    On the other hand, it’s clear that our universe was not optimized to be “as symmetric as possible.” Many symmetries, like parity and time reversal, exist but are broken, while others, like supersymmetry, are badly broken if they exist. And this isn’t terribly surprising: presumably a maximally symmetric universe would be some perfect, isotropic sphere that nothing ever happened to! 🙂

    Thus, one can’t just have a general heuristic of “our universe has property X because it’s more symmetric that way”—one really does have to explain why some symmetries were apparently more important to God than others.

  573. Scott Says:

    Jester #536:

      I was under the impression that your challenge/questions were pertaining to our actual reality, and the curious and peculiar scientific relevance of QM in it.

    Of course!

    The point, once again, is that to ask for explanations of our actual reality, means to ask “but why wasn’t it otherwise”? And that inherently requires considering other, hypothetical, non-realized ways that reality could have been.

      As an aside: while QM and its formalism is indeed extremely important and unavoidable in many respects, there are many scientifically important questions it doesn’t (can’t?!) answer…

    Again, of course! While one might eventually hope to explain every facet of existence, in this post I set ourselves the more … “modest,” “achievable,” “warmup” goal of merely explaining QM. 😀

    Of course that’s probably still too hard, in which case, the usual approach of a scientist would be to narrow the question still further, and further, until they reached something that they could actually answer. And I strongly endorse that here.

  574. Scott Says:

    Martin Mertens #549:

      Hi Scott, why is it obvious that the universe must allow true locality? Weren’t people fine with the idea that everything instantaneously affects everything else in the days of Newtonian physics?

    Oh, I don’t claim that this is obvious—not at all. I merely claim that it seems extremely natural, in a way that QM currently doesn’t (at least to me).

    (Note also that, even in the 1600s, Newton was severely criticized by fellow natural philosophers for the instantaneous nature of his gravitational force law, and he accepted the criticism! He just said, reasonably, that it was the best he could currently come up with.)

  575. Scott Says:

    Aditya Prasad #562:

      I’m not asking you to believe that this [Buddhist philosophy] is true, but if it were true, it seems like it would motivate at least some of the structure of QM. In particular, it resolves your question to Yoni regarding why not simple classical indeterminism. Would you agree?

    No, sorry, I wouldn’t. You’d need to spell out for me a little more explicitly why a classical probabilistic universe would’ve been incompatible with the Buddha’s teachings. Also, did any Buddhists predict as much before QM was experimentally discovered in the early 20th century?

  576. Scott Says:

    Everyone: Having finally—finally!!—caught up in my responses in this thread, I’d now like to get back to the rest of my life. 🙂 Thanks so much for participating. Please confine all further comments to responses, rather than starting new topics, and then I’ll close the thread in another day or so (although I might do another post soon reflecting on what I’ve learned).

  577. Jester Says:

    Thank you, Scott.

    Is it fair to say then that your goal, other than the survey, is to do for “quantum” what quantum did for “classical”, i.e. the layer underneath, resp. show that this is not possible, thus “quantum” is the most basic layer, and everything else is on top of it?

    (That would be neat, to understand at the end what you meant at the beginning! Also somewhat embarrassing… :-/ Ah well. Better late than never.)

  578. Scott Says:

    Jester #577:

      Is it fair to say then that your goal, other than the survey, is to do for “quantum” what quantum did for “classical”, i.e. the layer underneath, resp. show that this is not possible, thus ”quantum” is the most basic layer, and everything else is on top of it?

    Showing either of those things is certainly a worthy goal, yes! But I’d content myself with the much more limited goal of writing the best available survey about the question—a survey that left no relevant known facts unexamined.

  579. mjgeddes Says:

    Scott #573

    “While one might eventually hope to explain every facet of existence, in this post I set ourselves the more … “modest,” “achievable,” “warmup” goal of merely explaining QM. 😀 ”

    No, I think we may as well try to ‘explain everything’, at least in terms of fundamental principles. Then I’m sure that QM, including the best interpretation, would be seen as natural and simple.

    In fact, I believe I did exactly that (explain everything) in posts #187 and #207, the only problem being that my explanation is still too high-level and general to be that useful at the moment – just need to fill in details and work out the math 😀 But basically, ‘Self-Actualization’ is the answer to everything.

    Meantime, in terms of specific guesses, I would reiterate what I said later in thread: ‘geometry’ is the fundamental reality, QM is about ‘generalized statistical mechanics’, and the notion of ‘symmetry’ is very important.

    Just an additional comment on stat mechanics: it’s an interesting point that in classical mechanics, if you had complete information about physical states, the ‘entropy’ of anything is zero. Whereas in QM, that need not be true. So perhaps QM is inevitable because it enables the notion of ‘entropy’ to be objective?
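    (A toy check of that point, assuming the standard two-qubit Bell state: the global state is pure, i.e. we have complete information about it, yet each half carries a full bit of von Neumann entropy.)

      import numpy as np

      bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
      rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)       # indices (i, j, i', j')
      rho_A = np.trace(rho, axis1=1, axis2=3)                     # trace out the second qubit

      evals = np.linalg.eigvalsh(rho_A)
      S = -sum(x * np.log2(x) for x in evals if x > 1e-12)
      print(S)   # 1.0 bit for the subsystem, although the joint state is pure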

    But I would definitely look more closely at notions of ‘Symmetry’. I understand that symmetry is not everything in physics, but for *explaining QM* in particular, it might just be – Algebra is all about Symmetry.

  580. bystander Says:

    @Scott #575: When a tree falls in a forest and there is no observer, the wave function does not collapse.

    PS You are supposed to build quantum computers, so that Buddha can crack messages of the enemy. And the maximal speed is finite, so that non-Buddhist civilizations do not destroy the whole universe.

  581. Ted Says:

    Matt Leifer #358: Thanks for the fascinating comment!

    I’ve thought to myself before that there are two kinds of quantum mechanics practitioners in the world. (I know that this sounds like the setup to a joke, but it isn’t.) Please correct me if I’m wrong, but I think that my own distinction lines up very closely with your distinction between the CSHS and the CLHS, but it’s pithier (if less well-informed):

    1. The first type of person thinks that a pure state is a special case of a general state that happens to be represented by a rank-1 density operator. (Or equivalently, a general state that has zero von Neumann entropy, or one whose uncertainty is “all quantum rather than classical”, although you need to be careful about how you interpret that last claim.)
    2. The second type of person thinks that a general state is (loosely speaking) a special case of a pure state for which you don’t have access to all of the degrees of freedom.

    (I don’t mean “special case” literally for perspective #2. I just mean more broadly that from perspective #1, pure states “sit inside of” mixed states, while from perspective #2, mixed states “sit inside of” pure states, but in a different sense.)
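    (A small numerical illustration of the two perspectives, with made-up states: perspective 1 checks that a pure state is a rank-1 density operator; perspective 2 builds a mixed single-qubit state by tracing a pure two-qubit state over the inaccessible qubit.)

      import numpy as np

      # Perspective 1: a pure state as the special case of a rank-1 density operator
      psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)
      rho_pure = np.outer(psi, psi.conj())
      print(np.linalg.matrix_rank(rho_pure))            # 1, i.e. zero von Neumann entropy

      # Perspective 2: a mixed state as a pure state on a larger system, partially traced
      Psi = np.zeros(4, dtype=complex)
      Psi[0], Psi[3] = np.sqrt(0.7), np.sqrt(0.3)       # sqrt(0.7)|00> + sqrt(0.3)|11>
      rho_joint = np.outer(Psi, Psi.conj()).reshape(2, 2, 2, 2)
      rho_A = np.trace(rho_joint, axis1=1, axis2=3)     # trace out the second qubit
      print(rho_A)                                      # diag(0.7, 0.3): genuinely mixed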

  582. Aditya Prasad Says:

    The key point is that it not only needs to feel like there are many possible futures (and only one past), but that _which one gets chosen_ depends in some way on you, specifically. (I’m well aware that no self-respecting physicist believes that anyone has a _choice_ about which branch gets chosen, although even this isn’t as simple as it looks. All that’s needed for now is that you are the resolution point.)

    The only way to accomplish this classically (AFAICT) amounts to essentially solipsism, which God also doesn’t want. You must be interacting with other equally real beings, each of whom has the same ultimate freedom, from their own perspective. And the most straightforward read of QM and the measurement problem (that doesn’t wave it away by pretending that decoherence solves it) lands you in something very much like this position.

    It seems that no Buddhist was clever enough to develop this into a physical theory, I admit. And if you were to open up orthodox Buddhist scriptures, you’d be hard-pressed to find these constraints laid out so plainly. But you will repeatedly find analogies comparing life to a dream; awakening being about becoming lucid in that dream; exhortations to achieve total and ultimate freedom; constant reminders that other people, while not exactly “real” either, deserve the utmost compassion; etc. While I’m not sure that’s enough to give you the mathematical structure of QM, it sure seems to force something that behaves remarkably like it. (And although I’m _far_ from enlightened myself, I’ve had enough personal experience to convince me that they’re not far off the mark in their claims.)

  583. J Says:

    Following the (fun) line of thought advocated by Matt Leifer #358-

    Is it only QM that has the special property of being self-consistently developed from more than one perspective? I am *not* a physicist, and hence might be wrong about the following (apologies if so), but my lay understanding is that even classical mechanics can be formulated from multiple perspectives (Newtonian / Lagrangian / Hamiltonian).

    If this is so, even if we could argue for a physical principle of the form “A reasonable universe should have multiple workable mathematical formulations”, this might not help privilege QM over other theories (even classical mech) as much as we would like.

  584. Steven Evans Says:

    Scott Says:
    Comment #517 January 31st, 2022 at 8:41 am

    If everything observable to us is ultimately nothing but the output of a quantum computer, including all the comments, then that output cannot contain the answer to why the quantum computer is as it is and not otherwise. These questions are then just patterns in the computer output (like 0.1134 looking like “hello”) which are meaningless to the computer. By making the assumptions you do, you render your question meaningless.

    All we can definitively say based on the assumptions is that we are in a quantum computer, and this is what the quantum computer can do. If we want to know why physicality appears roughly at least as quantum computation, surely the biggest hint must be where that physical effect mysteriously apparently disappears from view, namely the measurement problem.

  585. Scott Says:

    Steven Evans #584: Your argument (such as it is) has nothing whatsoever to do with quantum mechanics, and is reminiscent instead of a centuries-old religious argument against scientific materialism itself. Namely,

    “if what you atheistic scientists say is true, then according to your own theory, you can’t have any justified reasons for saying it. You’re just saying whatever you are because the initial conditions of the universe, the laws of motion, your evolutionary imperatives, or whatever determined that you’d say it. Therefore I don’t need to listen to you. HAHA CHECKMATE ATHEISTS!”

    The scientists have any number of possible responses—e.g., “forgive us if we still disagree with you; God must not have given us the reasoning power to see why you’re right”—but maybe the best response is simply that this is a textbook case of the genetic fallacy. Even a fallible process can produce valid arguments. Even a mindless computer, if it’s formed or trained correctly, can generate valid arguments (as with automated theorem-proving programs). To reject an argument, therefore, it’s not enough to heap scorn on its origin; you actually have to look inside the argument and articulate what’s wrong with it. There are no “Fools’ Mates” here.

  586. Baruch Garcia Says:

    Scott#537

      For starters, QM, and other physical theories, are not “formal systems” of the kind that the incompleteness theorem talks about. Formal systems are things that we can use to reason about physical theories.

    This is true. You cannot apply incompleteness proofs *to* quantum mechanics the way you apply them to the Peano axioms. QM is not a formal system, and even if one wanted to claim that it was, QM still uses real numbers, whose first-order theory Tarski showed to be decidable, unlike that of the naturals.

    However, so many alternatives to standard QM (e.g. hidden variables) have been proposed because of its “illogical” nature. QM does not follow pre-Godelian, Aristotelian logic, which is why it seems so strange.

    But the quasi-paradoxical nature of quantum mechanics fits…. perfectly….. into the quasi-paradoxical nature of the self-application of formal systems (even formal systems which do not use the effective procedure that Godel and Turing use… e.g. Quine’s protosyntax).

    If you want a cartoon version of this, look at Hawking’s Godelian argument for the impossibility of obtaining a ToE (which he later withdrew). He compares our knowledge of physics to 2D maps of the world, where one point must always be excluded, where one category of knowledge excludes another category of knowledge. He makes no mention of QM, yet he is describing complementarity without realizing it! His argument was wrong, but not without merit!

    My point is that this is all a priori. Then the questions you asked about the Born rule, unitarity, complex numbers, etc., fit into this framework as explained in #481. The Born rule, unitarity, etc. are not a priori, but they fit into the a priori framework. My guess is that some of these ideas will evaporate with a quantum theory of gravity (Bryce DeWitt, for example, thought the notion of Hilbert space might not survive a quantum theory of gravity!!).

    If you want to use theological speak this is “I am that I am”, only formalized.

  587. Clinton Says:

    Scott #535:

    As Mark Twain said: If I’d had more time I would have written a shorter blog post …

    Let me copy your previous post here to save the scrollback

    *******************************************************************

    The Scott Fear:
    Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.

    The Clinton Fear:
    Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.

    You said:

    “I mean, either

    (a) all of our experimental data continues to be consistent with the hypothesis that QM is the “actual operating system of the universe,” or else

    (b) it doesn’t.

    In case (a), we can simply continue regarding QM as the “actual operating system of the universe,” as best we can tell, subject to the usual proviso that in science you almost never “prove” your theories, you only accumulate more evidence for them or you rule them out.

    In case (b), we might or might not be smart enough to come up with the deeper theory, but at least we’ll then know that QM was not the “actual operating system of the universe,” and that its appearance of being so was illusory!”

    *******************************************************************

    During working daylight hours I’m 110% on board with the scientific method all the way up to a Quine-Putnam ontological commitment to a universal quantum reality … psi-ontic for the win!

    But … in the dark hours of the night when the cold shadows of mortality loom … there be doubt …

    I’m afraid that either our mathematical enterprise or our neural model of computation could be acting like Descartes’ demon … cooking up the books from which we read “our experimental data”. It is not an actual, intentional “demon”, of course, that I’m worried about … but an unintentional evolutionary or developmental accident of our mathematical logic or the grey stuff between our ears. This is not the “sensory” type of demon that Descartes had in mind, simply deceiving us about external “appearances” of reality. This would be a more indirect and subtle demon that could take the form of …

    EITHER

    (Demon X) Is the mathematical enterprise that gives “our experimental data” its nature (the mathematical/logical fruit “Adam” partook of when first he committed us to use formal systems based in first-order, or propositional, logic). By this, I mean that the experimental data itself could be doubted because it is fruit on a twig of probability theory, which is on a branch of measure theory, itself on the trunk of set theory, and all rising from the roots of first-order or propositional logic. It seems like once we primates climbed this particular tree to have a look around that we may have committed ourselves to end up … out on a QT limb. And … once we are up the math tree we find ourselves naturally dealing with symmetries, measure/probability theories … and inexorably QT is the best-looking branch for us to climb out on and have a look. But, what if we think that this is the “best we can tell” only because we happen to be up a propositional, first-order tree?

    OR

    (Demon Y) Is the evolved computational hardware of our neural model of computation. Neuroscientific evidence suggests that cognitively (the software) our brains are filling our heads full of illusions and delusions. And on the hardware side: synaptic sites on neuronal dendrites encode complex numbers in the transfer impedance of the membrane amplitude and the normalization of complex amplitudes over possible states for some receptive field represent probabilities. We may not have to go all the way up to a full neural quantum model of computation before we could come under something like a cognitive illusion where all information we acquire “appears” to us like normalized complex amplitudes.

    X and Y are not necessarily exclusive.

    The reason for bringing up these Cartesian demons is that … either of them could very well be misleading us into thinking that we can/should ask …

    Q = Why should the universe have been quantum mechanical?

    That is, either a mathematical/logical demon or a neurological demon could be behind condition (a) above.

    I realize that I’m stepping outside of the agreed rules of science when I argue that we should not trust condition (a) in more than just the normal sense of remaining vigilant for inconsistent evidence. But then, I think that you also are stepping outside of the agreed rules by not accepting as an answer for Q the rule you yourself gave above that you “can’t prove your theories, you only accumulate more evidence for them or you rule them out.” Aren’t you trying to prove why the universe should have been QM?

    So, help me with the exorcism 🙂

    How do we exorcise Demon X ? And, remember, it’s no good pointing to condition (a) because the condition of (a) may be due to Demon X. The best argument I can think of to exorcise Demon X is that the possibility of a Demon X existing would cut down every potential tree we might want to climb … if the illusion comes with the nature of the tree …

    How do we rule out Demon Y ? I mean … if the only form of information we can possibly know will always look to us fundamentally like a complex amplitude (or the normalized complex amplitude). How would we escape from that? I would like to think there could be some kind of computational answer here – something like universality that would say “No Demon Y can’t trap you in your own computational model.” But I’m not sure, especially if the model always converts all information into amplitudes – whatever form it is in externally – if that would even be a form of “information” …

    Scott, thank you for hosting this discussion. Thank you (as always) for the incredibly charitable giving of your time and expertise.

  588. murmur Says:

    Mateus Araújo #513: I would accept your argument about how branching gives rise to objective probabilities if it could explain the form of the resulting probabilities. This would be the case if the branches were equally likely (at least for a finite number of branches). However, the probability is not uniform; it’s given by the Born rule. So branching doesn’t derive the Born rule (though it’s certainly compatible with it); it’s imposed separately. So I have a hard time accepting branching as the explanation of probability in our world. One might as well say that QM leads to objective probabilities via the Born rule; bringing in branching seems superfluous.

  589. Scott Says:

    Clinton #587: No, I really don’t think I’m violating my own rules. In science, you can’t demand that the universe give you a deeper explanation of something, but you can always ask. You can always look for a deeper explanation, bearing in mind that you might not find one. And even the deeper explanation probably won’t consist of a mathematical proof. I.e., even when there’s a rigorous derivation of an old theory’s conclusions from a new starting point (e.g., Kepler’s laws from Newtonian mechanics, Newton’s gravitational force law from GR, thermodynamics from statistical mechanics…), the new starting point still has to be accepted on the basis of “mere” empirical evidence and/or reasonableness, not proof. Nevertheless, this is the main way that scientific progress happens. And it’s been pretty successful.

    And I see no reason why I need to believe someone who points to a theory originating in experience (like QM), and expresses certainty that this time we’ve surely hit rock bottom and will never, ever successfully explain it in terms of anything else. Maybe so, but what’s the source of the person’s certainty? What’s the harm in seeing if we can make the theory a little more natural- or inevitable-looking, e.g. by rederiving it from a different starting point?

  590. Jacques Pienaar Says:

    Scott #555:

    Please enlighten me, then. In your view, which non-Western or non-white-male-dominated philosophical traditions shed the most light on the question of why our universe turns out to be describable by a complex unit vector evolving unitarily in a tensor product Hilbert space?

    I thought you’d never ask!

    First let me straighten out one thing: deriving a “complex unit vector evolving unitarily in a tensor product Hilbert space” is part of Q2; my comment was aimed at Q1. I’m only suggesting that the kinds of postulates you need in Q2 look “weird” to our intuition because we are looking at them from a particular angle; but maybe we can find a home for one of them (thereby resolving Q1) by shifting our perspective.

    My current source of inspiration is the tradition of phenomenology, which — oh bitter irony! — is part of Western continental philosophy and most definitely historically dominated by white men such as Husserl, Heidegger, Sartre, and Merleau-Ponty. That said, phenomenology does have intriguing overlaps with Buddhism and has also influenced feminist philosophy via Simone de Beauvoir. But more to the point: you’ll be hard-pressed to find a philosopher of quantum physics today who admires phenomenology (and if you do, please introduce me); that is enough to place it outside of the usual boxes that we physicists are used to.

    How specifically might phenomenology help to answer “Why the quantum”?

    In “CSHS” interpretations, particularly “Copenhagenish” ones (borrowing terms from Matt Leifer #358) like those of Rovelli, QBism, Healey, Grangier, and others, there is an emphasis on “observation”. Although often used negatively, as in `an outcome doesn’t exist unless it is observed’, the implication is positive: that things do exist just in case they are observed. If you filter that idea through a (non-phenomenological) Western philosophical lens — the kind that makes a Cartesian split between a proactive `knower’ and a passive world just lying out there waiting to get `known’ — then it does sound like some variant of solipsism. But if you read it through the lens of phenomenology, the idea comes alive as a potentially constructive thesis, against whose backdrop a suitably formulated quantum principle just might appear natural.

    Phenomenology tells us that the ground-level of reality lies in what is observed, and everything that claims existence in physics — black holes, atoms, the big bang — has grown out of and has its roots in perception. The difference is subtle but important. When observations are taken to confirm the existence of a physical object, we usually interpret our observations as revealing a pre-existing truth that was formerly hidden from observation. On this usual account, observation is secondary to what is real: the objects of physics lie behind observed phenomena, as behind a veil or a screen, like puppet masters putting on a show for us.

    Phenomenology flips that view on its head. When observations “confirm” that an object exists, they just as much contribute to its existence. The objects of physics are not constructed behind the observed phenomena; they are built upon them and made out of them; they are imposed upon perception, and if they fit well, then we weave them into the tapestry of our reality.

    Turning now to QM: one aspect of perception is that it is inherently contextual, since every observation occurs in some context. Therefore contextuality is the default, and it is non-contextuality that becomes mysterious. How is science able to transcend the contextuality of nature to make claims that are (largely) non-contextual? How is classical physics possible in this world? If we reject the initial supposition of “hidden variables” behind what we observe, then why can we in most cases get away with pretending that they are there? Notice how the shoe is on the other foot: we are now contemplating a world where quantum theory fits more comfortably than classical physics!

    You will (rightly) complain that making vague references to “contextuality” is still a long way from deriving the very particular structures of contextuality that lead to Hilbert spaces. Obviously to do that we need to make some substantive postulates.

    So, here is my constructive proposal. Take a list of all those postulates that people have proposed in reconstructions of QM — the ones which struck you as “weird” and “unmotivated” — and revisit them with a thoroughly phenomenological mind-set. Maybe this time one of them will suddenly jump out at you as a perfectly natural constraint to impose upon the phenomenological world, much like the “relativity principle” or “constant speed of light” seemed like perfectly natural postulates to make in Einstein’s world.

    Addendum: Besides phenomenology, I also think neo-Kantianism, American pragmatism, and enactivist cognition might be fruitful places to look, and I am probably ignorant of many others. For those curious about phenomenology, a good entry point is Zahavi’s book.

  591. Scott Says:

    Jacques Pienaar #590: Would you agree that dinosaur fossils, let’s say, were really there in the ground for 65+ million years before anyone dug them up, records of a vanished world that actually existed—that none of this was “constructed by the act of observation”? If so, then is the thesis that while 19th-century phenomenologists were wrong about all the stuff they thought they were talking about, they were nevertheless right about quantum states, which they didn’t know they were talking about? 🙂

    More fundamentally, my objection is that this game seems too easy. The Continental philosopher gets to, as it were, fling mud indiscriminately at anything that seems like it might be important to Western science: objectivity, determinism, predictability, locality, noncontextuality, Boolean logic, reductionism… And then, if science painstakingly comes to the conclusion that one or more items on that list really do need to be rejected, or even just revisited or reinterpreted, the Continental philosopher gets to do a victory dance, as if he or she deserves glory for a brilliant premonition.

    But despite what everyone says, questioning established postulates is trivially easy, like sailing out into the open sea! The hard part is reaching land—i.e., finding clearly-stated new postulates to replace the old ones.

    Even so, I don’t know of a single thinker, before QM, who specifically proposed that the way we calculate probabilities might become wildly wrong at the scale of atoms—if anyone can suggest a reference, let me know! (If I remember right, William James did speculate that Newtonian determinism might be found to no longer hold at the atomic scale, and I do respect him for that.)

  592. Steven Evans Says:

    Scott Says:
    Comment #585 January 31st, 2022 at 9:55 pm

    OK. But the quantum computer would need to output verifiable details of its own origin. Yet the same output of a computer can be achieved on different instantiations (a tape, a chip), so there is no reason to think the computer can output verifiable details of its origin.

    Also, if we assume physical observations are simply the output of a quantum computer then any conclusions would apply equally to a quantum computer that doesn’t underpin a physical universe. The main issue about how fire is breathed into the equations is being lost.

    The measurement problem does seem to have some connection with how equations are made physical, though; because some of the physical effect apparently disappears from view.

  593. Stewart Peterson Says:

    Scott #542:

    First, thank you very much for addressing this question at all.

    I had heard of constraints, or axioms, #2 and #3 in your formulation. The cryptosystem in #396 passes – as far as I can see – both tests. In the case of #2, the algorithm to check the proposed answer is the decryption algorithm. In the case of #3, the hashing function, each iteration of the preimage generator, and each iteration of the decryption algorithm are all in P for known (existing) cryptosystems.

    In the case of your axiom #1, however, I had not heard that stated in so many words, and I thought I had read Cook’s problem statement fairly carefully. I see multiple plausible interpretations of your axiom #1:
    1: “P vs. NP is indeterminate over any system containing an oracle.” It certainly is; I thought I had bypassed this by not using an oracle. The cryptosystem in #396 does not “hardwire” any of the information needed to solve the problem. This information is contained entirely in the private key and the public key. My approach looks like it starts with not enough information, but that’s because it’s deliberately run backwards: we start with all of the information, store a public key in one subroutine that will accept only one private key, scramble the private key within another subroutine, and force yet another subroutine to unscramble the private key. We verify, in other words, the original private key – not the fact that a key that we generate can be scrambled into the same ciphertext. The scrambling is not used for encryption – it’s very weak, and if used to generate keys, would be cracked by the first guess by the preimage generator. It’s used to create such a highly-many-to-one mapping that guessing the specific preimage that was used is difficult: the task in your #3.
    2. “Lossy functions are not in P, by definition.” I think it’s unlikely that you meant this.
    3. “Preimage generators are not in NP, by definition, because all of the information needed to solve the problem must be input to the algorithm performing the guessing.” If you say this is true, I defer to your judgment – but I still wonder how this is possible. If we have enough information to know which guess to make, to what extent are we really guessing? It’s like having a maze with arrows on the floor; the whole point of a maze is you don’t have enough information at the beginning to know which way to go. The lack of information is the reason you have to guess. Now, sure, all of the information must be available to the overall algorithm – but not necessarily to the guessing subroutine. If this case is outside the scope of the problem, by definition, that sure is interesting, since I haven’t seen any statement to that effect in the literature. I equally-sure am open to hearing it, though!
    4. “Once you scramble the private key, the cryptanalysis is outside the scope of P vs. NP.” Depending on definitions which I’m sure you know better than I do, this may be true – but then, why would anyone think that P!=NP implies that public key cryptography is secure, or can be made to be secure, and that P=NP implies that it cannot be made to be secure? If it were outside the scope of the problem to brute-force the system, why would anybody be talking about P vs. NP in this context? After all, a brute-force attacker doesn’t have the private key, and therefore doesn’t have all the information needed to solve the problem. If this is what you meant, then brute-forcing a key is outside the scope of P vs. NP, whereas much of the literature seems to discuss just this. So, I’m assuming this isn’t what you meant, either.
    5. “The analysis is valid; it just doesn’t imply anything about P vs. NP due to the particular way that the P vs. NP problem is defined by convention.” In this case, we can discard the P vs. NP talk and discuss determinism and non-determinism, as it occurs outside of the P vs. NP problem. (This addresses the original topic of the thread, and this is why I brought up the whole topic.) If correct, my initial argument in #386 would establish that information loss in the equation of state of a process causes the reversal of the process to be fundamentally non-deterministic, which is a useful result if we’re trying to identify whether all non-deterministic systems can be modeled deterministically. I argued above that this is not possible. This appears to be in agreement with current data: we can determine the probability distribution of a superposition, but the outcome of the collapse of the wavefunction is fundamentally unknowable. Information about what will happen has been destroyed by the operation that composed the system’s state. However – you wrote the book on this, literally, and if it were valid, you presumably would have thought of it.
    6. “Peterson is a dummy; he hasn’t established even this much.” I’m more than willing to take this interpretation, but I’m curious as to why. I may have misinterpreted what you said; I certainly appear to have misinterpreted Cook’s problem statement if what you say is correct. Is there a formal definition of your axiom #1 somewhere in the literature? I read your 2008 paper on algebrization and didn’t find anything (other than interpretation #1 above) that looked like it in there.

    Would it perhaps be better if I posted a single, cohesive, formal statement, instead of relying on a correction to a correction to an argument? That might clear up what I’m really trying to get at.

  594. Philippe Grangier Says:

    Scott#470 « How could I have said this more clearly? » I feel compelled to answer this, in a post-last post…

    You said it clearly, but it simply does not fit into my mindset. To take up the restaurant metaphor again: as a physicist I consider it my duty to deal with the dish in front of me; there is a lot to explore and understand, and when doing that I’m standing on the shoulders of giants, as the saying goes. But as an ambitious theoretical computer scientist you want much more, you want to be the Grand Chef and redesign the whole menu – well…

    In order to get an idea about how physics works, I recommend that you have a look at the article ‘Why some physical theories should never die’, by Olivier Darrigol, available at http://www.sphere.univ-paris-diderot.fr/IMG/pdf/Martins_paper.pdf . In my words, it explains that a new, more refined physical theory never totally replaces the previous one, some bits and pieces always remain as a background, absolutely required just to build up new experiments, based on well-known physics and techniques. Maybe this was not so much apparent – though already true – in classical physics, until we came to QM, which makes no sense if taken out of the classical context allowing us to define experiments (sorry again for being neo-Bohrian). QM alone predicts ghostly behaviors and self-multiplying worlds, in complete contradiction with obvious empirical evidence. OK, you want the world to be ‘quantum only’, and you don’t want any reference to empirical evidence; unfortunately I cannot follow you on these grounds.

  595. Gil Kalai Says:

    Scott #517 “we’re now 515 comments deep in a thread about why the universe is quantum, why the right model of computation for our world appears to be quantum computation?”

    I share the view that the question of the “right model of computation of the world” might be important also for understanding QM and “why QM”. I am not sure how common this view is among physicists and mathematicians, so the original post and the question “Why QM” make sense also for those who do not regard computation as part of the answer (and also my previous comment was computation-free).

    Scott and I differ on the answer to the question about the “right model”.

    The universe being quantum leaves two possibilities for “the right computational model of the world”:

    The first is noisy quantum computation below the noise level required for quantum fault tolerance

    The second is noisy quantum computation above the noise level required for quantum fault tolerance

    Scott’s view is that the right model of computation of the world is “noisy quantum computation with noise rate below the noise level required for quantum fault tolerance,” while my view is that the right model of computation of the world is “noisy quantum computation with noise level above the noise level required for quantum fault tolerance”.
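    (A toy sketch of why the noise level matters, with made-up numbers and no error correction at all: depolarizing noise of strength eps after each of 100 gate layers washes out a qubit state unless eps is small. The actual disagreement, which this sketch does not model, is about whether quantum fault tolerance can be used to beat this decay.)

      import numpy as np

      def depolarize(rho, eps):
          return (1 - eps) * rho + eps * np.eye(2) / 2

      plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
      rho_ideal = np.outer(plus, plus.conj())

      for eps in (0.001, 0.1):                 # a "low" and a "high" noise rate
          rho = rho_ideal.copy()
          for _ in range(100):                 # 100 noisy gate layers (here: identity gates)
              rho = depolarize(rho, eps)
          fidelity = np.real(plus.conj() @ rho @ plus)
          print(eps, round(fidelity, 3))       # ~0.95 at low noise, ~0.5 at high noise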

    My brief speculation (#454) for “why QM?” is that QM is required to explain probability, to explain chaos, and (NEW) to explain the emergence of geometry/space-time. Furthermore, I think that understanding probability, chaos, and geometry (in the physical context), depends not only on identifying QM as the fundamental physics theory but also on realizing that “a world devoid of quantum computation” represents the correct computational model.

  596. Anbar Says:

    Jester #532 and Scott

    – I guess so; the point I am grappling with is, how does QM work from the perspective of (say) an atom? How does it know what to do? Because the QM calculations are (probably) not taking place locally where it is –

    This is the way I see it

    An atom “knows what to do” as much as we do. The difference between us and the atom is that we are complex enough, through the perks of evolution, to understand what’s happening (we have the circuitry within our bodies to generate meta-objects like the concept of a material object, represent self-reference and recursion, etc.)

    Unlike the atom, we recognize that the world is predictable, and we try to work our way from there by using the recursive self-referential language of meta-objects: QM is where we landed (so far, at least)

    Back to Scott’s question of whether anything like a complex object pondering the concept of recursion would be possible in a classical universe: I’ve been trying to argue that the answer is “possibly, but it would require Rube Goldberg concoctions that would make the string landscape look simpler than a platonic solid, in comparison”

  597. Mateus Araújo Says:

    Scott #571: Since you think that I’m on a superdeterministic level of insanity, I think I’m forced to clarify that this stuff I’m talking about subjective and objective probabilities is uncontroversial. There’s a huge literature on the topic, and pretty much everybody that has studied it agrees with me. What is controversial is my claim that Many-Worlds solves the problem.

    As for your classical Many-Worlds theory, it is fine; I mentioned it was a solution way back in comment #145. Now the state of the world is not a string of n bits that evolves probabilistically in an ill-defined way, but a vector of 2^n real numbers that evolves deterministically via a stochastic matrix. The transition rule is now well-defined. The weights p_i are literally the number of worlds of each kind, and probability theory reduces to measure theory.

    Now please don’t say that the probabilities are objective or subjective or whatever; that’s just intellectual vandalism. You were explicit that the vector of probabilities was the true state of the world; that makes them objective probabilities. Objective probabilities are a property of the world; subjective probabilities are a property of your head. Objective probabilities exist without any agent to think about them; subjective probabilities do not. You can have subjective probabilities about deterministic phenomena, but there are no objective probabilities involved (except of course in the case of branching). Do you see that they are completely different things?

  598. gentzen Says:

    Mateus Araújo #513:

    You didn’t reply to my request to give a definition of objective probability. I don’t expect you to succeed, people have tried to do it for a century and failed. I just want you to be honest with yourself and realize that you can’t. This failure is so widely recognized that the consensus in philosophy is that objective probabilities do not exist, they are content to deal with subjective probabilities. Meanwhile the consensus in physics is that objective probabilities are obviously what quantum mechanics gives you, and they don’t worry about defining them.

    Even if perfect “objective probabilities” might not exist, it still makes sense to try to explain which properties they should have. This makes it possible to better judge the quality of the approximations used as substitutes in practice, for the specific application at hand.

    I wrote: “Indeed, the definition of a truly random process is hard.” having such practical issues in applications in mind. In fact, I had tried my luck at a definition, and later realized one of its flaws:

    The theories of probability and randomness had their origin in gambling and games more generally. A “truly random phenomenon” in that context would be one producing outcomes that are completely unpredictable. And not just unpredictable for you and me, but for anybody, including the most omniscient opponent. But we need more: we actually need to “know” that our opponent cannot predict it, and if he could predict it nevertheless, then he has somehow cheated.

    But the most omniscient opponent is a red herring. What is important are our actual opponents. A box with identical balls with numbers on them that get mixed extensively produces truly random phenomena, at least if we can ensure that our opponents have not manipulated things to their advantage. And the overall procedure must be such that also all other possibilities for manipulation (or cheating more generally) have been prevented. The successes of secret services in past wars indicate that this is extremely difficult to achieve.

    The “unpredictable for anybody” requirement is a mistake. It must be unpredictable for both my opponents and proponents, but if some entity like nature is neither my proponent nor my opponent (or at least does not act in such a way), then it is unproblematic if it is predictable for her. An interesting question arises whether I myself am necessarily my own proponent, or whether I can act sufficiently neutrally such that using a pseudorandom generator would not by itself violate the randomness of the process.

    (Using a pseudorandom generator gives me a reasonably small ID for reproducing the specific experiment. Such an ID by itself would not violate classical statistics, but could be problematic for quantum randomness, which is fundamentally unclonable.)

    I had tried my luck with highlighting important properties randomness should have for specific applications before:

    One theoretical weakness of a Turing machine is its predictability. An all powerful and omniscient opponent could exploit this weakness when playing some game against the Turing machine. So if a theoretical machine had access to a random source which its opponent could not predict (and could conceal its internal state from its opponent), then this theoretical machine would be more powerful than a Turing machine.

    The problem with this type of theoretical machine in real life is not whether the random source is perfectly random or not (assuming it to be perfectly random is a harmless idealization), but that we can never be sure whether we were successful in concealing our internal state from our opponent. So in the concrete case, one can never be sure whether it is valid to idealize the current instance of a situation by such a machine.

    The general attitude of that text is quite similar, but my newer elaborations are different in important details.
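    (A toy sketch of the exploitability point in the quoted passage, as a matching-pennies game: an opponent who has read the deterministic player’s source code predicts it perfectly, while a player with a concealed random source is predicted no better than chance.)

      import random

      def deterministic_player(r):
          return r % 2                     # fully determined by the round number

      def randomized_player(r):
          return random.randrange(2)       # a concealed coin flip

      def opponent_prediction(r):
          return r % 2                     # the opponent knows the deterministic strategy

      def exploitation_rate(player, rounds=100_000):
          hits = sum(player(r) == opponent_prediction(r) for r in range(rounds))
          return hits / rounds

      print(exploitation_rate(deterministic_player))   # 1.0: predicted every round
      print(exploitation_rate(randomized_player))      # ~0.5: no better than guessing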

    I guess you (=Mateus Araújo) had something completely different in mind, when asking for a definition of objective probability. But my attempts at definitions are not so far off, if you take into account my gut feeling that “true randomness” can be simulated/approximated by mathematical structures:

    Let me try to give one reason why I believe that “true randomness” can be simulated/approximated very space efficiently. … There is a form of mathematical platonism involved, which assumes that mathematical structures are able to model or approximate nearly everything that can exist in this world. True randomness is part of what can exist, and … there is a gut feeling that this randomness is already present in sufficiently space limited contexts

  599. baruch garcia Says:

    Feynman, Witten, Maldacena and several others have urged us not to ignore John Wheeler’s argument. Yet his approach never really appears in lists of quantum interpretations or approaches to “unification” (Wheeler’s mantra was “How come the quantum? How come existence?”). His “it from bit” is often misinterpreted, sometimes as classical, because it doesn’t use the term “qubit”, even though “it from bit” preceded the term “qubit” by 5 years. The exception is people like Maldacena and Witten and some others, who have understood Wheeler’s claims and defended them as possible, if not plausible. I am afraid the approach will continue to be ignored for quite a bit longer, missing inroads to new areas of research.

    The Gödel argument, when presented on a blackboard, is either accepted easily and with shock “Oh I see, wow!” or it is dismissed because there is some misunderstanding about the claim. For example, one critic confused semi-decidability with undecidability, and thought this whole Wheeler approach was connecting semi-decidability with QM instead of undecidability. The conversation sounded like the cow/daughter confusion scene from Fiddler on the Roof.

    If there is any fogginess, I encourage you to think twice about throwing away everything about this approach. I encourage you instead to focus on learning the subtle differences between semi-decidability and undecidability, between decoherence and measurement, then revisiting the argument. Primers on these subjects will appear on jawarchive.wordpress.com, as well as a proper formal treatment of Wheeler’s argument in the future for anyone who wishes to understand more.

    Like Feynman, like Witten, like Maldacena, like many others, I encourage you to look into Wheeler’s approach. With a soon-coming formalization of Wheeler’s ideas with applications, hopefully any fogginess and misunderstanding will be cleared up! We can and we will understand Wheeler’s question “How come the quantum?” or as Scott puts it “Why quantum mechanics?”

    Good luck everyone on your search!

  600. Mateus Araújo Says:

    murmur #588: One needs branching in order to define what objective probabilities are. Saying that an event E happens with probability 1/3 means that we branch and in 1/3 of the worlds event E happens. This explains why it is impossible to predict whether E or ¬E happens: because both do happen, and you will have future selves experiencing each of them. It also explains the law of large numbers: if you do a sequence of measurements, in most worlds the frequencies you obtain will be close to 1/3 (this can be proven rigorously).

    This doesn’t tell you what the probability of a given event is; to answer that you need to decide how to count branches, i.e., to define a measure over them. Well, since we have the law of large numbers we can find it out experimentally, and it is clear that if we count branches according to their 2-norm we fit the data very well. That’s enough for me.

    You claim that the probability is not uniform, but this is not true. There’s no such thing as “uniform” by itself, mathematically we can only say uniform relative to some choice of measure (often the Lebesgue measure). Well the probability is uniform with respect to the 2-norm measure.

    You seem to advocate for the measure where we count each measurement result equally. I already explained in my comment #362 why this doesn’t make sense, but maybe it’s worth citing a more sophisticated argument. This recent paper by Saunders shows that if you insist on counting branches equally but respect their decoherence structure, you go back to counting them according to their 2-norm, modulo some rounding errors.
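    (A small calculation contrasting the two measures, assuming N two-outcome measurements with Born weight q = 0.2 for outcome 1: under equal counting of the 2^N records, almost no measure sits on records whose frequency is near 0.2, while under the 2-norm measure almost all of it does.)

      from math import comb

      def measure_near_q(N, q, measure, tol=0.05):
          """Total weight of records whose frequency of 1s is within tol of q."""
          total = 0.0
          for k in range(N + 1):
              if abs(k / N - q) > tol:
                  continue
              if measure == "equal":
                  total += comb(N, k) * 0.5**N
              else:  # "2-norm": squared amplitude q^k (1-q)^(N-k) per record
                  total += comb(N, k) * q**k * (1 - q)**(N - k)
          return total

      print(measure_near_q(1000, 0.2, "equal"))    # ~0: almost none of the equal-count measure
      print(measure_near_q(1000, 0.2, "2-norm"))   # ~1: almost all of the 2-norm measure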

  601. gentzen Says:

    Scott #104:

    mjgeddes #48:

    You said in (4) you want to assume that current QM is exactly correct, but if that were true, I really don’t think there can be any answers for your questions. I mean some of the brightest minds have thought about quantum foundations for decades, so surely it’s unlikely that anything has been missed?

    I mean, some of the brightest minds had to think about Fermat’s Last Theorem for 350 years before any of them proved it! 🙂

    More to the point, though, let me give you the years of various developments that have played non-negligible roles in my own thoughts about the foundations of QM:

    Quantum teleportation – 1993
    Shor’s algorithm – 1994
    Grover’s algorithm – 1996
    Theory of quantum error-correction and fault tolerance – 1996±O(1)
    Discovery of the cosmological constant – 1998
    Lucien Hardy’s reconstruction of QM – 2001
    PBR Theorem – 2011
    Firewall paradox – 2012
    Harlow-Hayden argument – 2013
    ER=EPR – 2013
    Quantum circuit complexity as wormhole volume – 2014
    AdS/CFT as a quantum error-correcting code – 2014
    Chiribella et al.’s reconstruction of QM – 2015
    Black hole unitarity from Euclidean path integrals – 2019
    MIP*=RE – 2020

    So, maybe there’s nothing of any further interest to be discovered, but that doesn’t seem like the way to bet…

    Wow, this is impressive. And it nicely mirrors the spirit of this blog post and the constructive atmosphere of the discussions in the comments.

    What I like especially about it is the concrete illustration of how progress happens by small steps. (Well, I admit that developments like MIP*=RE are big steps, but if you dig into this big step, you learn that it was enabled by many smaller steps.) I find it important to encourage the public discussion of small steps. To an individual researcher himself, small steps (like sharpened imagery) might feel nice on a personal level, but seem “well known” in sufficiently small communities on the larger level. And sharing such small steps in a way that others can enjoy them too can be a larger effort than is apparent from the finished 2-3 pages of text and images:

    Strilanc said:
    You then gradually internalize that measurement is indistinguishable from controlled operations targeting low-entropy qubits. Which makes the many-worlds interpretation more and more compelling, because it basically amounts to asserting that equivalence.

    But then you gradually stop publishing your improved understanding in easily understandable images. First you stop publishing on your blog, and later you might even stop formulating it in words understandable by educated listeners. Of course, this is an unfair game to a certain extent. You know that your understanding is nothing special, but mostly shared by “your community” with a similar background. The popularizers somehow seem to have a free license to propagate misunderstandings. If, for example, Sabine Hossenfelder publishes something like The Trouble with Many Worlds, her misunderstanding of the many-worlds interpretation will have a much bigger impact than some blog post by a young researcher making it clear, with easily understandable examples, why that way to understand MWI must be wrong.

    In my opinion, a huge part of the funding for quantum computing research was public money, and part of the hoped-for return on investment was to get an improved understanding of quantum mechanics. And this includes the perspectives of young researchers, like your approach of distinguishing between “before-hand experience” descriptions and “in-the-moment experience” descriptions.

  602. fred Says:

    It would be interesting to get feedback from Scott to see if this thread of 600+ posts has moved his needle in any direction even just a tiny bit.
    And if not, why does he think that’s the case?
    E.g. a reflection of the blog’s audience, a reflection on the question itself, or something about himself, etc.

  603. Mateus Araújo Says:

    gentzen #598: I’m afraid you misunderstood me. I’m certain that “perfect” objective probabilities do exist. A simple photon shining on a beam splitter is an example, and there are countless more from quantum mechanics. I don’t see why it is a mistake to ask for a phenomenon that is unpredictable to anybody; we do have that. What I’m asking for is a mathematical definition of objective probability; this is the century-old difficulty.
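
    (To be concrete about the beam-splitter example, here is a toy numpy sketch, just my own illustration of the textbook arithmetic: the 50/50 beam-splitter unitary and the Born-rule probabilities it assigns. The claim about unpredictability is about the physics, of course, not about this arithmetic.)

        import numpy as np

        # 50/50 beam splitter acting on a single photon in two modes:
        # |0> = transmitted mode, |1> = reflected mode.
        BS = np.array([[1, 1],
                       [1, -1]]) / np.sqrt(2)

        psi_in = np.array([1.0, 0.0])   # photon enters in mode 0
        psi_out = BS @ psi_in           # amplitudes after the beam splitter
        probs = np.abs(psi_out) ** 2    # Born rule

        print(probs)                    # [0.5 0.5] -- the 50/50 that no observer can improve on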

  604. gentzen Says:

    The chapter “Question 10: Reconstructions” in Maximilian Schlosshauer (editor) “Elegance And Enigma – The Quantum Interviews” (2011) contains the opinions of 17 researchers on a question similar to the one in this blog post, including Tim Maudlin and Lucien Hardy. Many participants in the comments here probably know this already, but I guess only very few know the following information from its preface:

    This is not the first interview book on quantum mechanics. In the early …, Julian Brown, a radio producer with the BBC Science Unit, teamed up with Paul Davies to do a series of interviews with physicists interested in the foundations of quantum mechanics. Davies presented these interviews in the form of a program for BBC Radio 3, featuring conversations with Alain Aspect, John Bell, John Wheeler, Rudolf Peierls, David Deutsch, John Taylor, David Bohm, and Basil Hiley. The program found enthusiastic listeners, including at least one of our interviewees (see Lucien Hardy’s story, page 29). Buoyed by this success, Brown and Davies decided to publish the transcripts of the interviews in book form. The Ghost in the Atom: A Discussion of the Mysteries of Quantum Physics came out in …

    (Sorry for replacing “unreliable” information by …)

    I find the way that John Wheeler frames his “Why the quantum?” question in the conversation with Paul Davies in that older book unexpected and illuminating:

    But when Everett produced his many-universes interpretation for quantum theory you changed your mind for a while. Why was that?

    What attracted you to this remarkable idea?

    … But I also have a deeper objection: the Everett interpretation takes quantum theory in its present form as the currency, in terms of which everything has to be explained or understood, leaving the act of observation as a mere secondary phenomenon. In my view we need to find a different outlook in which the primary concept is to make meaning out of observation and, from that derive the formalism of quantum theory.

    So you think that the many-universes approach may still be useful?

    Yes, I think one has to work both sides of the railroad track.

    But in the meantime you’re siding with Bohr.

    Yes. As regards the really fundamental foundations of knowledge, I cannot believe that nature has ‘built in’, as if by a corps of Swiss watchmakers, any machinery, equation or mathematical formalism which rigidly relates physical events separated in time. Rather I believe that these events go together in a higgledy-piggledy fashion and that what seem to be precise equations emerge in every case in a statistical way from the physics of large numbers; quantum theory in particular seems to work like that.

    But do you think that quantum theory could be just an approximate theory and that there could be a better theory?

    First, let me say quantum theory in an every-day context is unshakeable, unchallengeable, undefeatable – it’s battle tested. In that sense it’s like the second law of thermodynamics which tells us that heat flows from hot to cold. This too is battle tested – unshakeable, unchallengeable, invincible. Yet we know that the second law of thermodynamics does not go back to any equations written down at the beginning of time, not to any ‘built in’ machinery – not to any corps of Swiss watchmakers – but rather to the combination of a very large number of events. It’s in this sense that I feel that quantum theory likewise will some day be shown to depend on the mathematics of very large numbers. Even Einstein, who opposed quantum theory in so many ways, expressed the point of view that quantum theory would turn out to be like thermodynamics.

  605. Giuliano Says:

    Scott #288

    “…if someone in the 1800s had asked “why is classical mechanics true? why does it have the features it does?,” with hindsight that would’ve been one of the best questions they could possibly have asked! Because it would’ve had nontrivial answers!”

    Indeed, Kant’s 1786 “Metaphysical Foundations of Natural Science” is not trivial at all…

  606. Jester Says:

    Anbar #596:

    “An atom “knows what to do” as much as we do. The difference between us and the atom is that we are complex enough (…) to understand what’s happening”

    But this is what I meant when I asked whether QM is something we complex organisms do as calculations, and whether whatever happens in nature all by itself is THE SAME! So nature does QM without knowing it? But how? What is the “mechanism”?

    Apparently this doesn’t strike anyone else as odd, though, so I guess I will ascribe it to my own peculiar view.

    Anyway, thank you for your reply.

  607. Peter Morgan Says:

    Thanks, Scott, for quite a wild ride. I hope your foot will heal completely and sooner rather than later.
    I’m glad I’ve followed along every twelve hours or so, otherwise I would surely have never started. The comments you’ve picked out as good have seemed good to me, but there have been many others that I’ve thought worthwhile. I look forward to your summary of what you’ve learned.
    One aspect that has been interesting to see is the perennial misunderstanding of what you intend by your “Why?” questions. I think I still don’t quite understand what you think would be good answers, so please forgive me that my comments have focused more on “How is it that?” At least, I think that’s approximately what I have tried to do. I suppose I take the view that the way to have an answer to a “Why?” sneak up on me is to answer as many “How?” questions as I can think of, from as many points of view as I can find and partially assimilate from other people, and hope.

    In response to J #583, therefore, “even classical mechanics can be formulated from multiple perspectives (Newtonian / Lagrangian / Hamiltonian)”, I find it helpful to add to that list Koopman’s Hilbert space formalism for classical mechanics (Koopmanian? Koopmannian?), and even more so to use an algebraic approach to Koopman classical mechanics (there are other formalisms, of course). That gives us, I suppose, a way to answer “How can we make classical and quantum similar enough that the question ‘Why should the universe have been quantum-mechanical?’ is not quite as much a concern?” I find Koopman approaches to CM helpfully different from Wigner function approaches to QM, although of course the latter are also good to know.
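
    (To give a minimal flavor of the Koopman idea, here is a toy numpy sketch of my own, under the simplifying assumption of a deterministic map on a finite state space: the Koopman operator acts linearly on observables by composition with the dynamics, and for an invertible map it is even unitary, i.e. classical mechanics already lives happily in a Hilbert-space picture.)

        import numpy as np

        # A deterministic classical map T on a 4-point state space (a cyclic shift).
        T = [1, 2, 3, 0]                        # state i evolves to state T[i]

        # Koopman operator K acts on observables f by composition: (K f)(x) = f(T(x)).
        # On a finite state space an observable is just a vector, so K is a matrix.
        K = np.zeros((4, 4))
        for x, Tx in enumerate(T):
            K[x, Tx] = 1.0                      # (K f)[x] = f[T[x]]

        f = np.array([10.0, 20.0, 30.0, 40.0])  # some classical observable
        print(K @ f)                            # the observable one step later: [20. 30. 40. 10.]
        print(np.allclose(K @ K.T, np.eye(4)))  # True: for an invertible map, K is unitary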
    Another perspective I find helpful is an algebraic approach to probability theory, which becomes quantum probability theory with just the addition of noncommutativity, I think because that approach abstracts away from concerns about dynamics and causality, which we can add back in after we have understood the role of “collapse” when dynamics and causality are absent.
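
    (Again only as a toy illustration of my own: in the algebraic picture, classical probability is the special case where all observables commute, and the same expectation-value recipe Tr(rho A) keeps working when they don’t.)

        import numpy as np

        # Classical observables: diagonal matrices, which all commute.
        A_cl = np.diag([0.0, 1.0])
        B_cl = np.diag([1.0, -1.0])
        print(np.allclose(A_cl @ B_cl, B_cl @ A_cl))            # True

        # Quantum observables: Pauli Z and X do not commute.
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        print(np.allclose(Z @ X, X @ Z))                        # False

        # The state and the expectation-value rule are the same in both cases:
        rho = np.eye(2) / 2                                     # maximally mixed state
        print(np.trace(rho @ Z).real, np.trace(rho @ X).real)   # 0.0 0.0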
    I apologize if all this is too mathematically distant from an answer to a simple “Why?”

  608. OhMyGoodness Says:

    Mateus #597

    I don’t follow your logic. It seems to me that maybe you reject the privileged position of an observer in quantum mechanics. A sufficiently large number of experiments (similar to GHZ, say) confirm a superposition state prior to measurement. There is no basis to subdivide the pre-measurement state with individual probabilities. Objective probabilities can only be assigned to the outcome of an observation, and those are objective probabilities as attested to by voluminous experimental evidence. The objective probability is possible only in conjunction with an objective observation. There just is no reality to subdivide pre-measurement.

    GHZ and similar experiments scare me, by the way: they provide good evidence that reality really does behave in accordance with QM.

  609. Jacques Pienaar Says:

    Scott #565:

    Incidentally, this is also my central objection to QBism, Copenhagen, and all other subjectivist interpretations of QM. […] The advocates of those interpretations keep repeating, in a thousand variations:

    “QM is not about Nature, it’s about us, how we talk about Nature, organize our knowledge, etc.!”

    Clearly this strikes them as a profound insight. Whereas I, as it were, already assimilated the insight and just proceeded immediately to what I see as the real question, namely, “why does Nature have whatever inherent property makes it an overwhelmingly good choice to talk about it using QM—a property we might abbreviate as ‘being quantum’?”

    You say QBism ignores the `real question’. Yet here is a quote from Fuchs’s 2016 paper `On Participatory Realism’ that emphatically says the opposite:

    Ultimately I view QBism as a quest to point to something in the world and say, that’s intrinsic to the world. But I don’t have a conclusive answer yet. […] You can always ask — you should always ask — what is it about the world that compels me to adopt this instrument rather than that instrument? […] The fact that we adopt this formula rather than some other formula is telling us something about the character of the world as it is, independent of us. If we can answer the question `Why the Born Rule?’ or John Wheeler’s question `Why the quantum?’ then we’ll be making a statement about how the world is, one that’s not `just’ instrumentalism.

    Fuchs has said similar things repeatedly in various places since the very beginning of QBism. Here’s one from a 2001 (!) paper he wrote while at Bell Labs, before he was even fully QBist (my emphasis):

    The quantum state is information. Subjective, incomplete information. Put in the right mindset, this is not so intolerable. It is a statement about our world. There is something about the world that keeps us from ever getting more information than can be captured through the formal structure of quantum mechanics. Einstein had wanted us to look further — to find out how the incomplete information could be completed — but perhaps the real question is, “Why can it not be completed?”

    If I had the time, I bet I could come up with a dozen more like that. So I am really baffled why you seem to think — along with Mateus Araújo #398 — that QBism says `nothing is real, nothing to get hung about’. Did you somehow skip these parts of QBist literature? Or are you simply committed to the idea that whatever is real has to reside in the quantum state?

  610. Paul Hayes Says:

    BTW, in case no one’s posted a link to it already, there’s a book about the modern attempts to answer this question (“By contrast, the focus of the new wave is the reconstruction of quantum theory from physical principles. Contemporary researchers are looking for an answer to Wheeler’s famous question ‘Why the quantum?’ [17] and are driven to understand the origin of the formalism itself.”).

  611. Andrei Says:

    Philippe Grangier,

    “In an EPRB experiment Alice’s measurement does not « cause » Bob’s result”

    OK.

    “…which is anyway undefined as long as Bob has not decided about an orientation.”

    I think I’ve corrected you twice that in my argument the detectors do not change orientation. You are again answering a different argument.

    “Her measurement only allows Alice to make a contextual (and probabilistic) inference about Bob’s result.”

    In my argument the inference is exact (100% probability).

    My argument is different from the original EPR one. It does not depend on counterfactuals (unperformed measurements). It doesn’t matter how the records are manipulated and sent from A to B or whatever. At the end of the day you have a piece of paper showing two columns of space-like separated measurement results, perfectly anti-correlated. Just tell me how your model explains those printed values.

  612. gentzen Says:

    Mateus Araújo #603:

    I’m afraid you misunderstood me. I’m certain that “perfect” objective probabilities do exist. A simple photon shining on a beam splitter is an example, and there are countless more from quantum mechanics.

    Your challenge was to try to define objective probabilities, if I understood you correctly. I gave it a try, and I claim that it goes beyond mere subjective probability. I don’t want to claim that it exactly captures the required properties of objective probabilities, and my excuse is that all my attempts in the past later turned out to have “room for improvement”.

    My understanding is that your own shot at your challenge was: “The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds.”

    I don’t see why it is a mistake to ask for a phenomenon that is unpredictable to anybody; we do have that. What I’m asking for is a mathematical definition of objective probability; this is the century-old difficulty.

    A good mathematical definition should be invariant under isomorphism. But if I have a model of the world, I can always “adjoin” an “all-knowing pure observer” and get an isomorphic model. Therefore I must limit the “unpredictable for anybody” in my definition to the “relevant stakeholders”, in such a way that it excludes at least superfluous “all-knowing pure observers”.

  613. Anbar Says:

    Jester #606

    “ But this is what I meant when I asked whether QM is something we complex organisms do as calculations, and whether whatever happens in nature all by itself is THE SAME! So nature does QM without knowing it? But how? What is the “mechanism”? “

    Why should there be a mechanism in the first place? The ether was not needed, in the end. Besides, it’s not like the Earth would need to calculate its orbit, in a classical universe 🙂

    I probably don’t understand what you mean by “how”… QM describes a self-consistent way (and possibly the most efficient, from a design point of view -> Q2) for complex systems to make sense of a predictable, non-Rube-Goldberg universe, like the one we seem to inhabit.

    Coming to Q1, the obstacle is that all the biological processes depending on molecular details (the details that make one molecule an enzyme while a similar molecule, or a switched-off version of the same one, is not) would essentially need their own fundamental descriptions and individual designs, rather than being emergent and naturally selected… definitely not a task worth the time of a timeless being.

  614. Dax Says:

    I just finished implementing Heisenberg-Weyl gates in Cirq and remembered this post. Addressing Q2, it seems impossible to count in half-steps on a qutrit without complex amplitudes. I guess you could say a solution with only real numbers is possible if you do something like a half step is 0.5(|001>) + 0.5(|010>) + 0.0(|100>) and such, but that ends up being a non-smooth solution. I’m not sure if a smooth solution using only classical probabilities exists or not, but regardless the complex solution seems so much more elegant (I’m new to the field and am still kind of blown away by how nicely that turns out). So maybe “you can count to 3 by halves” is the axiom that unfolds into Q2.
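
    (For anyone curious, here’s a small numpy/scipy sketch of what I mean, not my actual Cirq code: the qutrit shift/clock pair, and a smooth unitary “half-step” built from the matrix logarithm, which is just my choice of construction here.)

        import numpy as np
        from scipy.linalg import expm, logm

        w = np.exp(2j * np.pi / 3)                     # primitive cube root of unity

        # Heisenberg-Weyl pair on a qutrit: shift X|k> = |k+1 mod 3>, clock Z|k> = w^k |k>
        X = np.roll(np.eye(3), 1, axis=0).astype(complex)
        Z = np.diag([1, w, w ** 2])
        print(np.allclose(Z @ X, w * X @ Z))           # True: the Weyl commutation relation

        # A smooth "half-step": X^t = exp(t * log X) is unitary for every real t,
        # and t = 1/2 applied twice gives back the full shift.
        H = expm(0.5 * logm(X))
        print(np.allclose(H @ H, X))                   # True: two half-steps = one step
        print(np.allclose(H.conj().T @ H, np.eye(3)))  # True: the half-step is unitary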

  615. Scott Says:

    Stewart Peterson #593: P vs. NP has nothing to do with “information loss.” It’s about information that’s already present in the input—e.g., the existence or nonexistence of a satisfying assignment to a 3SAT instance, or (yes) a preimage of a hash function output—but that appears to be exponentially hard to access. If your knowledge is coming from Steve Cook’s Millennium Prize essay, then try, I dunno, my P vs. NP survey article, and come back when you’re done if you still think P vs. NP is relevant to your interests?

  616. Tu Says:

    Mateus Araújo #368: Thank you for the thoughtful reply; I really appreciate it. I certainly admire that you are clear about which bullet you are biting. I could carry on with quibbles and questions, but I will save it for another post in the future.

    I have enjoyed reading your lively back-and-forth with Philippe Grangier, so thanks to you both for that!

    Matt Leifer #358: This comment represents some of the best writing and clearest thinking on this subject that I have ever come across. I am going to print it out and sleep with it under my pillow. Thank you for sharing.

    Mateus Araújo, many posts: I think Scott understands that there is an important philosophical distinction between subjective and objective probability, but is making the point that it is easy to imagine a universe where the only kind of probability that arises is subjective. While this would disqualify “QM as a source of objective probability” as an answer to Q2, it does not refute your argument about QM as a source of objective probability with respect to our universe.

  617. Scott Says:

    Philippe Grangier #594: I completely agree with you about the pattern of new physical theories “stealing” all the successes of the old ones—sometimes literally to the point of recycling the exact same equations, but assigning new, more sophisticated meanings to the variables, reducing to the old meanings in some appropriate limit. I believe Feynman and Penrose both wrote about this.

    But here’s another pattern in the history of physics: experiments yield unexpected results, somewhat arbitrary-looking theories are carefully constructed to explain those results, and then someone realizes why it actually followed from more general principles that the theories must’ve taken the form they did. A few examples include:

    – Michelson-Morley → Einstein
    – Zoo of elementary particles → Standard Model with SU(3)×SU(2)×U(1) gauge group
    – QFT with hacky renormalization recipes → Wilsonian effective field theory

    In each of these cases, on the one hand, we can say that with hindsight, physics took a somewhat painstaking, roundabout route to a place that, in principle, pure thought probably could’ve reached faster. But on the other hand, physics can hold its head high, since this is how science is supposed to work—how it’s worked in the best cases—and given the limitations of human intelligence, no one in history has yet discovered a faster way in practice! 🙂

  618. Scott Says:

    Mateus Araújo #597: Alright, alright, I retract my comment about your position being superdeterminism-level insane. 🙂

    You’ve now ceded to me (and I’m sorry that I missed it earlier) the main point I cared about: namely, that you haven’t given any evidence or arguments against “classical probabilistic Many-Worlds”—which therefore remains 100% on the table, awaiting considerations that would rule it out in favor of the amplitude-based Everettverse.

    I, in return, am happy to cede to you the main point that you cared about: namely, that in both the Everettverse and the Classical Probabiliverse, the probabilities of branches could properly be called “objective.”

    To my way of thinking, there are some entities that are clearly “objective” (dinosaur fossils), and others that are clearly “subjective” (opinions), and probabilities are fascinating precisely in how they straddle the two realms. I.e., I could easily imagine two people who I’d regard as equally sane, both in possession of a final theory of physics, yet disagreeing with each other about whether to call some particular probability “objective.” On reflection, though, I agree with you that the Everettverse and the Classical Probabiliverse both settle this debate, in favor of the “objective” side, simply by fiat!

    Here’s something to ponder, though: your position seems to commit you to the view that, even if we only ever saw classical randomness in fundamental physics, and never quantum interference, a Probabiliverse (with all possible outcomes realized) would be the only philosophically acceptable way to make sense of it.

    But given how hard it is to convince many people of MWI in this world, which does have quantum interference, can you even imagine how hard it would be to convince them if we lived in a Classical Probabiliverse?

    In fact, if anti-Everettians understood this about your position, they could use it to their advantage. They could say: “Everettians, like Deutsch, are always claiming that quantum interference, as in the double-slit experiment, forces us to accept the reality of a multiverse. But now here’s Mateus, saying that even without interference, a multiverse would still be the only way to make sense of randomness in the fundamental laws! So it seems the whole interference argument was a giant red herring…” 🙂

  619. Liam Says:

    Scott #564:

    To answer your question (1) – yes, I’m saying you should do (b), i.e. start with the existence of spin-1 or spin-2 massless particles and see why they basically have to act like gauge forces. The way the argument works is that you classify the unitary representations of the Poincaré group for massless particles with spin. You find that some of the Lorentz generators have to act trivially. Then for, say, spin-1, you look at the vector representation A_mu, and you find that the Lorentz generators that needed to act trivially actually generate a shift in A_mu. Stated in position space, this shift is the usual gauge transformation of A_mu. So gauge transformations need to “do nothing”, i.e. be a redundancy of description. Then you try to add interactions in your Lagrangian between the gauge field and other stuff, and to preserve this redundancy the gauge field has to couple to a conserved current. So you are forced into using gauge theory. If you have 30 minutes of spare time and just want to get the basic idea, I recommend reading just the first four pages (or at least the first two paragraphs!) of chapter 8 in Weinberg’s QFT textbook.
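
    For concreteness, here is the key step in compressed, schematic form (my notation: \alpha is the gauge parameter and J^\mu the current the field couples to; Weinberg of course does this with much more care):

        \epsilon_\mu \;\to\; \epsilon_\mu + c\,k_\mu
        \quad\Longleftrightarrow\quad
        A_\mu(x) \;\to\; A_\mu(x) + \partial_\mu \alpha(x),

        \delta \int d^4x\; A_\mu J^\mu
          \;=\; \int d^4x\; (\partial_\mu \alpha)\, J^\mu
          \;=\; -\int d^4x\; \alpha\,(\partial_\mu J^\mu),

    which vanishes for arbitrary \alpha only if \partial_\mu J^\mu = 0, i.e. only if the gauge field couples to a conserved current.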

  620. Scott Says:

    fred #602:

      It would be interesting to get feedback from Scott to see if this thread of 600+ posts has moved his needle in any direction even just a tiny bit.

    Yes, it has! (A tiny bit 🙂 )

    It’s solidified for me that the a-prioristic arguments for QM that have some purchase on me, that I want to explore further, can pretty much all be classified into a few broad families:

    (1a) QM as an elegant way to “reconcile the discrete and the continuous,” producing a universe with continuous positions, momenta, etc., but still finite upper bounds on information content, as well as stable atoms with discrete, combinatorial Lego-like behaviors.

    (1b) QM as an elegant way to preserve symmetries—rotational symmetry, Lorentz symmetry, etc.—even in the face of discreteness that would naïvely seem to break those symmetries.

    (2a) QM as a way to get randomness that can never be explained by introducing local hidden variables, that must be “freshly generated on-the-fly” (under no-signalling assumptions), and that could conceivably even bear on questions of identity and free will.

    (2b) QM as a way to let us keep arguing forever about what’s “ontic” and what’s “epistemic,” whether we’re forced to accept the reality of a multiverse or can reject the branches that we don’t see as unrealized hypotheticals, rather than clearly resolving such enormities one way or the other as most of the known alternatives would do.

    Much of my remaining unease comes from the fact that we now have too many plausible-seeming explanations on the table.