Why Quantum Mechanics?
In the past few months, I’ve twice injured the same ankle while playing with my kids. This, perhaps combined with covid, led me to several indisputable realizations:
- I am mortal.
- Despite my self-conception as a nerdy little kid awaiting the serious people’s approval, I am now firmly middle-aged. By my age, Einstein had completed general relativity, Turing had founded CS, won WWII, and proposed the Turing Test, and Galois, Ramanujan, and Ramsey had been dead for years.
- Thus, whatever I wanted to accomplish in my intellectual life, I should probably get started on it now.
Hence today’s post. I’m feeling a strong compulsion to write an essay, or possibly even a book, surveying and critically evaluating a century of ideas about the following question:
Q: Why should the universe have been quantum-mechanical?
If you want, you can divide Q into two subquestions:
Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?
Q2: Assuming classical physics wasn’t good enough for whatever reason, why this specific alternative? Why the complex-valued amplitudes? Why unitary transformations? Why the Born rule? Why the tensor product?
Despite its greater specificity, Q2 is ironically the question that I feel we have a better handle on. I could spend half a semester teaching theorems that admittedly don’t answer Q2, as satisfyingly as Einstein answered the question “why the Lorentz transformations?,” but that at least render this particular set of mathematical choices (the 2-norm, the Born Rule, complex numbers, etc.) orders-of-magnitude less surprising than one might’ve thought they were a priori. Q1 therefore stands, to me at least, as the more mysterious of the two questions.
So, I want to write something about the space of credible answers to Q, and especially Q1, that humans can currently conceive. I want to do this for my own sake as much as for others’. I want to do it because I regard Q as one of the biggest questions ever asked, for which it seems plausible to me that there’s simply an answer that most experts would accept as valid once they saw it, but for which no such answer is known. And also because, besides having spent 25 years working in quantum information, I have the following qualifications for the job:
- I don’t dismiss either Q1 or Q2 as silly; and
- crucially, I don’t think I already know the answers, and merely need better arguments to justify them. I’m genuinely uncertain and confused.
The purpose of this post is to invite you to share your own answers to Q in the comments section. Before I embark on my survey project, I’d better know if there are promising ideas that I’ve missed, and this blog seems like as good a place as any to crowdsource the job.
Any answer is welcome, no matter how wild or speculative, so long as it honestly grapples with the actual nature of QM. To illustrate, nothing along the lines of “the universe is quantum because it needs to be holistic, interconnected, full of surprises, etc. etc.” will cut it, since such answers leave utterly unexplained why the world wasn’t simply endowed with those properties directly, rather than specifically via generalizing the rules of probability to allow interference and noncommuting observables.
Relatedly, whatever “design goal” you propose for the laws of physics, if the goal is satisfied by QM, but satisfied even better by theories that provide even more power than QM does—for instance, superluminal signalling, or violations of Tsirelson’s bound, or the efficient solution of NP-complete problems—then your explanation is out. This is a remarkably strong constraint.
Oh, needless to say, don’t try my patience with anything about the uncertainty principle being due to floating-point errors or rendering bugs, or anything else that relies on a travesty of QM lifted from a popular article or meme! 🙂
OK, maybe four more comments to enable a more productive discussion, before I shut up and turn things over to you:
- I’m aware, of course, of the radical uncertainty about what form an answer to Q should even take. Am I asking you to psychoanalyze the will of God in creating the universe? Or, what perhaps amounts to the same thing, am I asking for the design objectives of the giant computer simulation that we’re living in? (As in, “I’m 100% fine with living inside a Matrix … I just want to understand why it’s a unitary matrix!”) Am I instead asking for an anthropic explanation, showing why of course QM would be needed if you wanted life or consciousness like ours? Am I “merely” asking for simpler or more intuitive physical principles from which QM is to be derived as a consequence? Am I asking why QM is the “most elegant choice” in some space of mathematical options … even to the point where, with hindsight, a 19th-century mathematician or physicist could’ve been convinced that of course this must be part of Nature’s plan? Am I asking for something else entirely? You get to decide! Should you take up my challenge, this is both your privilege and your terrifying burden.
- I’m aware, of course, of the dizzying array of central physical phenomena that rely on QM for their ultimate explanation. These phenomena range from the stability of matter itself, which depends on the Pauli exclusion principle; to the nuclear fusion that powers the sun, which depends on a quantum tunneling effect; to the discrete energy levels of electrons (and hence, the combinatorial nature of chemistry), which relies on electrons being waves of probability amplitude that can only circle nuclei an integer number of times if their crests are to meet their troughs. Important as they are, though, I don’t regard any of these phenomena as satisfying answers to Q in themselves. The reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal. QM just seems far too grand to have been the answer to these questions! An exponentially larger state space for all of reality, plus the end of Newtonian determinism, just to overcome the technical problem that accelerating charges radiate energy in classical electrodynamics, thereby rendering atoms unstable? It reminds me of the Simpsons episode where Homer uses a teleportation machine to get a beer from the fridge without needing to get up off the couch.
- I’m aware of Gleason’s theorem, and of the specialness of the 1-norm and 2-norm in linear algebra, and of the arguments for complex amplitudes as opposed to reals or quaternions, and of the beautiful work of Lucien Hardy and of Chiribella et al. and others on axiomatic derivations of quantum theory. As some of you might remember, I even discussed much of this material in Quantum Computing Since Democritus! There’s a huge amount to say about these fascinating justifications for the rules of QM, and I hope to say some of it in my planned survey! For now, I’ll simply remark that every axiomatic reconstruction of QM that I’ve seen, impressive though it was, has relied on one or more axioms that struck me as weird, in the sense that I’d have little trouble dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true. The axiomatic reconstructions do help me somewhat with Q2, but little if at all with Q1.
- To keep the discussion focused, in this post I’d like to exclude answers along the lines of “but what if QM is merely an approximation to something else?,” to say nothing of “a century of evidence for QM was all just a massive illusion! LOCAL HIDDEN VARIABLES FOR THE WIN!!!” We can have those debates another day—God knows that, here on Shtetl-Optimized, we have and we will. Here I’m asking instead: imagine that, as fantastical as it sounds, QM were not only exactly true, but (along with relativity, thermodynamics, evolution, and the tastiness of chocolate) one of the profoundest truths our sorry species had ever discovered. Why should I have expected that truth all along? What possible reasons to expect it have I missed?
Comment #1 January 25th, 2022 at 1:21 am
Before I even attempt to answer Q1, I’d like to get a clarification of comment 2 that you posted.
One of the obvious answers here is that classical mechanics *can’t* reproduce nature as we observe it. That atoms would be unstable, that the ultraviolet catastrophe would have no resolution, and so on and so forth.
You write “the reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”
I don’t understand this. What classical mechanism can reproduce these sorts of effects? Theorists spent a long time trying to cook up fixes for classical mechanics, and consistently failed. Indeed, this was very much the impetus that led people like Planck and others to just abandon all reason and look for semi-empirical ‘hacks’ to explain the data.
Comment #2 January 25th, 2022 at 1:27 am
My internal explanation was always basically #2 + anthropic principle (a world that doesn’t allow chemistry probably won’t have anything in it that can observe it).
If one buys the simulation argument, a lot of QM starts to look like a bunch of hacks to limit resolution and simulation fidelity, and get lazy evaluation (you only need to calculate precise position or momentum when someone actually asks for them, and can give vague and ambiguous answers the rest of the time).
Also, wavefunctions lend themselves to static solutions with very few (and small integer!) parameters which saves a lot of memory compared to the classical approach where you don’t even know when to stop with the precision.
Comment #3 January 25th, 2022 at 1:35 am
First shot at anthropics: can you derive complex chemistry from anything less weird?
In a universe where fusion was always exothermic, you’d get one giant nucleus, not tons of hydrogen. But exothermic-only-until-iron gets us a mix of elements.
And then there are the electron shells: 2, 8, 18… That’s where all the covalent bonding behavior comes from.
I’m not sure what newtonian chemistry would look like, but I can easily believe it wouldn’t be versatile enough for enzymes.
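The shell capacities quoted here come from a simple quantum counting: shell n holds n² orbitals (summing the 2l+1 orientations over l = 0, …, n−1), and the Pauli exclusion principle allows two spin states per orbital, for 2n² electrons in total. A minimal sketch of that counting:

```python
# Capacity of electron shell n: n^2 spatial orbitals (sum of 2l+1 over
# l = 0..n-1), times 2 spin states per orbital (Pauli exclusion).
def shell_capacity(n: int) -> int:
    orbitals = sum(2 * l + 1 for l in range(n))  # equals n^2
    return 2 * orbitals

print([shell_capacity(n) for n in range(1, 5)])  # [2, 8, 18, 32]
```

Chemistry’s versatility then follows from how partially filled outer shells share electrons.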
Comment #4 January 25th, 2022 at 2:01 am
Haelfix #1: You’re right, I should clarify what I meant. If you take the whole structure of classical mechanics as given (including Maxwell’s equations and so on), there’s indeed no simple fix that will make atoms stable, make electron shells discrete, etc. etc.—that’s how physicists realized the necessity of QM in the first place!
But from a modern perspective, much of the structure of classical physics (including reversibility, the Euler-Lagrange equation, and more) is ultimately explained by QM! And if that’s the case, then it seems to me that we don’t get to reverse the causal arrow, and say that this classical stuff also explains QM.
So consider, instead, an imagined scenario where we’re designing new laws of physics from scratch. In that scenario, it seems to me to beggar belief that you’d ever consider anything of the metaphysical enormity of QM for such a relatively quotidian purpose as allowing complex chemistry! It’s like, why not just introduce, as primitive elements in your physics, atoms that can only snap together in certain discrete tinkertoy-like ways? There are plenty of classical cellular automata like that, and on their face, they seem to have most or all of what you’d need for complex life, including Turing-universality. Or maybe you want tinkertoy-like atoms plus Lorentz-invariance? But that ought to be doable as well, and with less difficulty than inventing interesting relativistic quantum field theories…
Comment #5 January 25th, 2022 at 2:05 am
Autolykos #2:
If one buys the simulation argument, a lot of QM starts to look like a bunch of hacks to limit resolution and simulation fidelity, and get lazy evaluation (you only need to calculate precise position or momentum when someone actually asks for them, and can give vague and ambiguous answers the rest of the time).
Any answer along those lines, it seems to me, immediately crashes and burns once we realize that passing to wavefunctions, far from decreasing our classical simulation cost, has exponentially increased it—the fact famously exploited by quantum computation.
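To make the exponential gap concrete: describing n classical two-level systems takes n bits, while the corresponding quantum state vector takes 2^n complex amplitudes. A toy sketch:

```python
# A classical configuration of n two-level systems: n bits.
# The corresponding quantum state vector: 2^n complex amplitudes.
def classical_bits(n: int) -> int:
    return n

def quantum_amplitudes(n: int) -> int:
    return 2 ** n

for n in (10, 50, 300):
    print(n, classical_bits(n), quantum_amplitudes(n))

# Already at n = 300, the amplitude count (~2e90) exceeds the number of
# atoms in the observable universe (~1e80) -- hardly a memory-saving hack.
assert quantum_amplitudes(300) > 10 ** 80
```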
Comment #6 January 25th, 2022 at 2:18 am
I don’t have any answers, but I have a reformulation of Q1 that I find interesting:
Suppose you want to simulate a universe that could support life, but you don’t have a quantum computer, just a very big classical one. Would you be able to do it?
At first glance it seems that having some forms of energy and entropy should be enough for complexity to arise. But I don’t know; maybe without quantum mechanics you don’t have enough spare entropy per unit of matter or something. (Like how stars wouldn’t be able to shine as long as they do without fusion.)
Comment #7 January 25th, 2022 at 2:43 am
There seems to be some conceptual problem here. You keep saying that classical physics + some other unspecified physics would be less weird than QM. What exactly do you think makes QM uniquely strange? GR strikes me as pretty weird given its effect on space and time. It seems to me that it's classical physics that has all the strangeness in it. QM is quite sensible by comparison.
Comment #8 January 25th, 2022 at 2:58 am
To answer Q1 do we have to first believe that God exists? 🙂
Comment #9 January 25th, 2022 at 3:02 am
I’ll assume QM is exact, as instructed. But even if the ultimate theory X falls squarely within the QM formalism, it may have some striking features that help answer your question. That is, your question may evolve from “Why QM?” to “Why X? [where X happens to be a QM theory]”. E.g. maybe the non-perturbative formulation of M-theory will more clearly shout “I’m self-evident!” than does QM writ large. (Working in QI, one may have a bias to think that the most salient features of a theory are the information-processing capabilities, but maybe those just come along for the ride?)
Comment #10 January 25th, 2022 at 3:28 am
I recently learned about an apparently well-known trick. Imagine you have the following dynamical law:
|x> -> |x+1>
Now instead of choosing an initial x, let’s take the equal superposition of all possible x. Then we’ll get a state which globally never changes (i.e., transforms into itself). And yet, inhabitants of such a universe would still experience the passage of time.
This seems to suggest the following answer: the Creator didn’t want to choose initial conditions and intended the universe to be eternal.
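A sketch of this trick in a toy cyclic universe (x is taken mod N so that the equal superposition is normalizable; on the infinite line it would not be):

```python
import numpy as np

N = 8
# Dynamical law |x> -> |x+1 mod N> as a permutation (hence unitary) matrix.
U = np.zeros((N, N))
for x in range(N):
    U[(x + 1) % N, x] = 1.0

# Equal superposition of all positions x.
psi = np.ones(N) / np.sqrt(N)

# Globally the state never changes...
assert np.allclose(U @ psi, psi)

# ...even though a localized "inhabitant" is carried along by the dynamics.
loc = np.zeros(N)
loc[0] = 1.0
assert np.allclose(U @ loc, np.eye(N)[1])
```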
Comment #11 January 25th, 2022 at 3:28 am
In that universe we’d still have quantum computing and BQP, just by different names.
There would still be a fast algorithm for factoring integers, using interference – Shower’s algorithm, named after the ripples in a pond on the rainy day that inspired its invention.
It would be an intangible form of computing, yet somehow not inconsistent with natural laws.
Computer scientists would make bad passive-aggressive jokes to physicists about being “more efficient” at describing the world.
But more importantly, imagine actually being unable to factor large integers. How bizarre would that be?
I mean, I know we think of integer factorization as a difficult problem, but in the universe we live in it’s an easy problem.
I actually can’t think of a better reason for the universe to be quantum mechanical than that.
Integers must be efficiently factorizable.
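For anyone wondering in what sense factoring is “easy” in our quantum universe: Shor’s algorithm reduces factoring N to finding the period r of a^x mod N, after which a classical gcd computation finishes the job. A sketch of that reduction, with brute-force search standing in for the quantum period-finding step:

```python
from math import gcd

# Factoring via period-finding: the reduction Shor's algorithm exploits.
# Here the period of a^x mod N is found by brute force; a quantum
# computer finds it efficiently using interference.
def factor_via_period(N: int, a: int):
    assert gcd(a, N) == 1
    r, x = 1, a % N
    while x != 1:          # brute-force the period r of a mod N
        x = (x * a) % N
        r += 1
    if r % 2:
        return None        # odd period: try a different a
    y = pow(a, r // 2, N)  # a^(r/2) mod N
    f = gcd(y - 1, N)
    return f if 1 < f < N else None

print(factor_via_period(15, 7))  # -> 3 (7 has period 4 mod 15)
```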
Comment #12 January 25th, 2022 at 3:31 am
I share the intuition that chemistry at our present time and place in the universe should be possible with other physical laws. So let us say QM here and now is unnecessarily convoluted. However, the evidence we have nowadays points to the present universe having come about through an evolution from a “Big Bang”. Could QM as we observe it here and now be a remnant of necessary initial conditions of the Big Bang? That is, could a universe launched by God through a Big Bang, but left to its own devices after that, have been built on anything simpler than QM? My work with QM dealt with much lower energies, so I do not have any good answers to this question.
Let us say we can provide arguments for why the Big Bang conditions required some of the QM weirdness; then Q1 implies the question of why the Big Bang rather than a steady-state model à la Hoyle. In a steady-state model without a beginning, and thus without any initial conditions “polluting” a present universe, a creator can create chemistry with less weirdness, by the earlier assumption. But can things exist without a beginning in the most profound sense, or will all questions lead back to an origin where the Unmoved Mover sits majestically? Time to put on the metaphysics hat…
Comment #13 January 25th, 2022 at 3:50 am
Hi Scott
I can’t wait to read the book. Please please make it as layman readable as possible 🙂
From my very ‘under-educated in quantum mechanics’ point of view I have often wondered about Q, mainly Q1 (my QM education isn’t really deep enough to even have much of a wondering about Q2). Could there be something about consciousness that is tied to the process of moving from “many possible futures” to “one possible past”? It seems to me that any world that was fully deterministic would kind of look like the past, i.e. “dead” and static – already happened, so maybe it just wouldn’t “feel” like anything to traverse the timeline in that universe. Similarly, it doesn’t “feel” like anything to be in the future of our world, the world where there is no particular answer to the question of what is what. However, at the junction between the two, where many futures collapse into one past, it “feels” like something is happening.
Comment #14 January 25th, 2022 at 4:20 am
Something like QM seems to fall out of the mathematical universe hypothesis, if in addition you are also a physicalist about consciousness.
The MUH then implies that the physical patterns grounding consciousness are just another mathematical structure – that is, conscious experience is *itself* a mathematical object that can be instantiated within mathematical worlds (which are actual). Consciousness is experienced from the vantage point of that structure itself, so you should expect to live in a kind of quotient isomorphism class of all possible worlds that contain substructures instantiating this object. In other words, you should expect to live in a continuous superposition of all instances in all possible worlds that contain that object, which mathematically amounts to whatever it is that physically grounds and distinguishes your unique conscious experience at this precise moment.
That sounds a lot like Everett! If you make some further Kantian-like presumptions about how the world ought to appear in order for it to be cognizable in the first place, you might conclude that this continuous bulk has to be discretized somehow. Therefore quantum mechanics! That the MUH has presented us with a compelling motivation for the uniquely weird phenomena of quantum mechanics in this fashion is in my view one of the three or four most compelling reasons to believe that it’s true. Incidentally, similar considerations may solve the cosmological measure problem to boot.
Comment #15 January 25th, 2022 at 4:56 am
I think the best way to tackle Q1 is to imagine a universe that is literally described by classical electrodynamics + general relativity, without considering physical problems like the instability of atoms. If the universe were classical, matter would of course not be made of atoms! They are not predicted by the classical theories, though, so we are free to do away with them. What are the intrinsic features of such a universe?
There’s a long list of pathologies. I think the gravest are the singularities predicted by general relativity, but one also has chaotic systems, Hamiltonians that are not bounded below, etc. It’s a fundamentally continuous universe, with all the problems for computation that it brings. Computation might also become too easy; in our universe it’s of course impossible to do analog computation with infinite-precision real numbers (which could solve PSPACE-complete problems in polynomial time), but in a classical universe? What might forbid it? Even Norton’s dome, which in our universe is just a pathology caused by using an unphysical potential, becomes a real problem. Why would its potential be unphysical in a classical universe?
Comment #16 January 25th, 2022 at 5:22 am
Can’t wait for the book. Even after just a few minutes of contemplation, your Q seems way more profound than one might think initially. I could imagine Multiverses (even high-Tegmark-level ones) and Simulations might appear way more “obviously bollocks” or “obviously true” after thinking about your Q for a while. This might end up in the top-5 of my alltime favorite thought-provoking thoughts. Thanks for occupying my brain for possibly years, Scott!
Scott #5: My current model says we need infinite precision to calculate an interaction between 2 particles without QM, due to e.g. the distance between the particles being infinitely exact. With QM, we can stop once we get below the Planck length. I’m not putting much credence in my model here, but I haven’t run across a layman-understandable “this-is-why-that’s-obviously-wrong” either.
Comment #17 January 25th, 2022 at 6:08 am
Here’s an answer which touches on some of the anthropic issues, and then a rambling comment about why that answer isn’t really satisfying.
Quantum mechanics may actually be one of the simplest theories which really does produce conscious observers. If we have something like the Level IV Tegmark multiverse, where all consistent universes “exist” in some deep sense, then the only ones which are going to be worth noting are those which have observers, and whose observers are intelligent enough to ask questions about their underlying physics. (It seems plausible that universes complicated enough to have observers are in general going to be complicated enough to be likely to have intelligent observers, so the intelligence requirement may not be restricting things that much.) Now, it seems plausible that if something like Level IV is correct, then it should be weighted in some way so that mathematically simpler universes get more weight. If so, there may be a high probability that observers show up in quantum-mechanical universes. We know that quantum mechanics does this, and the rules of our universe don’t seem to be that complicated. The Standard Model is just not that bad to write down. In contrast, if you want “tinkertoy atoms” you might actually be specifying a lot of things. Carbon chemistry is really complicated, and even aside from CHNOPS, almost all living things seem to use at least a few non-CHNOPS atoms on top of that. Specifying that many complicated interacting tinkertoys may be so complicated that very few observers see themselves in such universes.
Six unsatisfying things about this: First, it assumes some version of the Tegmark hierarchy, which is a pretty big assumption. Second, we don’t have a Theory of Everything, so without a way to reconcile QM and General Relativity, deciding that the underlying physics is simple seems tough. Third, it is likely that whatever we do find to handle QM and GR is going to be more complicated than just QM and SR, which we can mathematically reconcile. So this then leads to the serious problem of why General Relativity, which seems like about as big an issue as why QM. (This may be my own biases coming into play; I understand QM and SR a little. GR seems genuinely tougher.) Fourth, it doesn’t seem that GR is doing anything necessary to have intelligent life, and at a minimum, some variant of QM + SR + Newtonian gravity doesn’t seem that complicated. Fifth, the assumed weighting for Tegmark should probably not just involve how simple universes are mathematically but how many observers there are. Even if QM is mathematically simple, it really isn’t clear that it produces a lot of observers compared to others. And for that matter, just what is the correct weighting of simplicity vs. number of observers? Sixth, while we’re at it, what counts as an observer for these purposes?
Comment #18 January 25th, 2022 at 6:28 am
Hi Scott,
Thanks for this post; I feel like Q is definitely a fascinating and extremely important question. Here is a sketch of an answer that stems from studying quantum causal models (and comparing them to classical ones), in the spirit of https://arxiv.org/abs/1906.10726 and related works (disclaimer: this is from my PhD supervisor and coworkers, so I might of course be non-neutral here).
Suppose you look at causal models of deterministic and reversible transformations, i.e. bijective functions in the classical case, and unitary maps in the quantum one. The “deterministic and reversible” part is important, and can be argued for, essentially, by noting that anything else would have good chances of stemming more from our own imperfect perspective than from the actual workings of the external world. Suppose also that you’re looking at finite dimensional systems (mostly for simplicity — I would be surprised if this didn’t generalise to infinite dimension).
Then there is a natural question you might ask, about whether the causal structure of a transformation is always time-symmetric. By this I mean, take a deterministic and reversible transformation f:AB -> CD; it has some causal structure (for example, featuring all causal influences except from B to D). Then look at the causal structure of f^dag:CD -> AB, the reversed version of f. Is it just the reverse of f’s causal structure? If yes, then one can say that f’s causal structure is time-symmetric.
I think it is pretty natural to argue that causal structures should be time-symmetric in a fundamental theory of the universe. After all, if there is some influence A -> D in the forward time-direction, shouldn’t this be the same thing as an influence D -> A in the backward time direction? Even more so given that your dynamics is deterministic — which should mean that your causal analysis is not “missing anything else”, for example a hidden common cause, or information getting lost, or whatnot.
Now you probably can see where this is going. If I look at reversible deterministic classical transformations (i.e. bijections), then their causal structure is quite easy to define — think of functional independence — and it is not always time-symmetric. A typical counter-example is the classical CNOT: AB -> CD, where A and C represent the control system and B and D the target system. This gate features influence A -> D, but its reverse (which is also a CNOT) features no influence D -> A.
By contrast, let us look at the quantum case. First, as in the classical case, one can define causal influence between inputs and outputs of a unitary channel in a neat way that is equivalent to a handful of natural definitions you might think of. And one can then show that the causal structure of unitary channels is always time-symmetric! A good way to get a feeling for why this is the case is to look at the quantum version of my previous example, i.e. the quantum CNOT: reversibility of the causal structure in this case is essentially given by the good old quantum backaction.
So: the causal structure of quantum deterministic reversible transformations is time-symmetric, which is not always the case for classical ones. In my view this is a very interesting way in which quantum theory is, in fact, better-behaved than classical theory. And therefore a sensible (start of an) answer to Q!
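The CNOT example can be checked numerically; here “influence from an input to an output” is operationalized crudely as “changing that input can change that output”, which is only a stand-in for the definitions in the paper:

```python
import numpy as np

# Classical CNOT on bits: (a, b) -> (a, a XOR b).
def cnot(a, b):
    return a, a ^ b

# The target output D depends on the control input A...
assert cnot(0, 0)[1] != cnot(1, 0)[1]
# ...but the control output C never depends on the target input B.
assert all(cnot(a, 0)[0] == cnot(a, 1)[0] for a in (0, 1))
# CNOT is its own inverse, so in the reversed direction there is likewise
# no influence from the (former) target back onto the control:
# the classical causal structure is time-asymmetric.

# Quantum CNOT: backaction lets the target influence the control.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

# Control |+>, target |->: the control comes out as |-> (phase kickback),
# so in the X basis the target really does act back on the control.
out = CNOT @ np.kron(plus, minus)
assert np.allclose(out, np.kron(minus, minus))
```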
Comment #19 January 25th, 2022 at 7:28 am
Interested observer of science who writes SF; that’s my caveat here.
QM can be constructed from rules about information. So the most elementary unit can carry no more than one bit of information (set by the Planck limit, I guess).
Everything is a composite made of sub-sets of ‘bits’ and is not determined until measured. Reversibility is maintained at the smallest state.
I’m commenting so that I can be corrected by those who know what they are talking about. I am at best the equivalent of Marilyn Monroe being taught to explain relativity.
And thank you.
Comment #20 January 25th, 2022 at 7:34 am
A pretty unsatisfying possibility:
1. Quantum theoryspace is in some sense ‘bigger’ than classical theoryspace, as suggested by Q2.
2. Anything that can happen will happen.
3. Hence, anthropically, we should expect to live in a quantum world rather than a classical world.
(2) is probably philosophically defensible. Given (1) and (2), (3) ought to be a lock. (You maybe need some kind of argument about how quantumness isn’t innately less likely to produce anthropic observers, too.) But (1) is the really tender point, because I can _also_ imagine worlds with really weird mechanics, with lots of parameters (and hence occupying lots and lots of theoryspace), but still classical and life-supporting; it’s not at all clear that quantum theoryspace is any larger.
If I were seriously trying to defend (1), I might go with an argument like this:
4. Classical theory can be recovered from quantum theory, but not vice versa.
5. Therefore, quantum theory is more parametric and uniform (using the programming-language-theoretic definitions), and has fewer degrees of freedom.
But it’s then a stretch to go from (5) to talking about classical theoryspace at large – (4) is specific to our own universe, while I want to talk about any possible universe. I’d need a more general version, something like:
6. Any quantum theory gives rise to a classical approximation.
7. Therefore, classical theoryspace may contain theories that have no corresponding quantum theory.
I dunno. Anthropic arguments always come out unsatisfying when we don’t know what the prior distribution of possibilities was.
Comment #21 January 25th, 2022 at 7:41 am
It’s been years since I did anything that has to do with QM, so I hope I’m not making a fool of myself here, but here goes.
When we have two particles, we take the tensor product of their Hilbert spaces as the Hilbert space of the composite system, right? We can look at it the other way too. Start with the big Hilbert space H, and decompose it into the component Hilbert spaces.
But wait – there is more than one way to do that. We can choose any 2-dimensional subspace H1, and then define the quotient space H2 = H/H1. H1 and H2 are two small Hilbert spaces which reproduce the original H when we take their tensor product.
Only one choice of subspace H1 will get us the original two particles we started with. What will the other choices get us?
Can we take the Hilbert space of the universe and decompose it into components in a new and different way? I don’t mean a different way as in grouping the particles from Earth with the particles from Venus or something like that. I mean choosing entirely different subspaces, each one being a weird intersection of all the particles in the universe.
In this new picture – how do the laws of physics look? If they don’t make any sense – maybe this is why we experience this specific partition of the universal Hilbert space.
Maybe the only axiom we need is that the universe is a Hilbert space, evolving unitarily in time (or maybe we don’t even need the evolution part, if we think of time as a space-like dimension). Everything else is derived from the way we divide up the universal Hilbert space into components.
If we have only one axiom, we have less of an explanatory burden, and that’s why I think this whole thing is relevant here.
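One concrete version of the “more than one way” point (strictly, a factorization is a choice of unitary identification H ≅ H1 ⊗ H2 rather than a quotient, so treat this as a sketch): a state that is maximally entangled relative to one factorization of C^4 is a plain product state relative to another.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): maximally entangled with respect to
# the standard factorization C^4 = C^2 (x) C^2.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def schmidt_rank(psi):
    # Schmidt rank w.r.t. the standard 2x2 factorization:
    # rank 1 = product state, rank 2 = entangled.
    return np.linalg.matrix_rank(psi.reshape(2, 2), tol=1e-10)

assert schmidt_rank(bell) == 2  # entangled in the standard split

# Any unitary V on C^4 defines a different factorization: a state counts
# as "product" in the V-factorization if V-dagger psi is product in the
# standard one. Build a V whose first column is the Bell state:
rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
M[:, 0] = bell
V, _ = np.linalg.qr(M)

# In the twisted factorization, the Bell state is just |00> (up to phase).
assert schmidt_rank(V.conj().T @ bell) == 1
```

So whether the universal state “contains entanglement” depends on how the universal Hilbert space is carved into components, which is exactly the comment’s point.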
Comment #22 January 25th, 2022 at 7:49 am
Your Q1, “Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?” finds an answer of sorts in the first sentence of the abstract of my “Classical and Quantum Measurement Theories”, arXiv:2201.04667, “Classical and quantum measurement theories are usually held to be different because the algebra of classical measurements is commutative, however the Poisson bracket allows noncommutativity to be added naturally.”
One way to read this is that God didn’t make the universe classical or quantum, God gave us the facility to measure his works systematically and to carefully consider the results. Colbeck & Renner, in Nature Communications 2011, “No extension of quantum theory can have improved predictive power”, prove under their assumptions that Quantum Mechanics is complete as a tool for the systematic description of past and future measurement results, so we can consider how we can make Classical Mechanics as complete in the same sense. The Poisson bracket allows such a completion.
The first paragraph of arXiv:2201.04667 is “The classical measurement theory we are accustomed to is incomplete: it cannot describe all the measurement results that can be described by quantum measurement theory. Instead of asking how we can complete quantum mechanics so it can be more like classical mechanics, we should ask how we can complete classical measurement theory so it can be more like quantum measurement theory.”
I cannot guarantee you will like the whole paper, but it is quite closely tuned to your Q.
Comment #23 January 25th, 2022 at 7:53 am
Greg Guy #7: No, I don’t say (and I don’t think I did say) that any minor modification of classical mechanics is necessarily “less weird” than QM. It’s certainly different than QM, though, and exponentially smaller in its state space, and easier to simulate on a classical computer, because otherwise it wouldn’t be a minor modification!
Comment #24 January 25th, 2022 at 7:54 am
Rahul #8:
To answer Q1 do we have to first believe that God exists? 🙂
No, you most definitely do not 🙂
Comment #25 January 25th, 2022 at 8:19 am
Daniel H Ranard #9: I completely agree; it’s possible that future developments in quantum gravity (or whatever) will shed light on the question, and indeed that’s one reason among many to work on quantum gravity. Even there, though, it would be great if we could say anything interesting today about which quantum gravity developments would say what!
I will say that it’s not obvious to me whether the past 90 years of work on QFT, the Standard Model, and quantum gravity, as heroically important as it’s been, has shed any light on either Q1 or Q2, except insofar as it’s heightened their importance by confirming QM as the foundation of physics rather than the temporary expedient some hoped it was. What do others think?
Comment #26 January 25th, 2022 at 8:43 am
Dmitri Urbanowicz #10:
This seems to suggest the following answer: the Creator didn’t want to choose initial conditions and intended the universe to be eternal.
But you could’ve gotten the same thing with a classical block universe, or (say) a probability distribution over time steps. It didn’t have to be quantum.
(I fear that I’m going to have many, many responses of this form! 🙂 )
Comment #27 January 25th, 2022 at 8:47 am
Job #11:
I actually can’t think of a better reason for the universe to be quantum mechanical than that.
Integers must be efficiently factorizable.
That possibility has of course occurred to me … but dare I push back a step, and ask why it’s so much more important for integers to be efficiently factorizable, than for any of the other problems in NP-P to have efficient solutions? 😀
Comment #28 January 25th, 2022 at 8:57 am
Yoni #13:
Could there be something about consciousness that is tied to the process of moving from “many possible futures” to “one possible past”? It seems to me that any world that was fully deterministic would kind of look like the past, i.e. “dead” and static – already happened, so maybe it just wouldn’t “feel” like anything to traverse the timeline in that universe.
I actually think there’s a lot to that, that you may have put your finger on something profoundly important! But the difficulty is this: why not just achieve the branching, the many possible futures collapsing to one past, via classical probability (plus some “free will / Knightian uncertainty” here or there if you like)? Why do you need quantum superposition in particular?
Comment #29 January 25th, 2022 at 9:05 am
Chris #14:
In other words, you should expect to live in a continuous superposition of all instances in all possible worlds that contain that object, which mathematically amounts to whatever it is that physically grounds and distinguishes your unique conscious experience at this precise moment.
That sounds a lot like Everett!
No, to me it sounds more like simply a listing of all the different possibilities, or a probability distribution over the possibilities, or some other way to rank them in prominence and/or organize them. Why jump immediately to QM, with the interference and the complex numbers and whatnot?
Comment #30 January 25th, 2022 at 9:10 am
If you start with General Relativity as an axiom (this doesn’t seem weird to me), then it seems clear that, as a consequence (e.g. Thorne, 1991: https://journals.aps.org/prd/abstract/10.1103/PhysRevD.44.1077, or Thorne, 1993: https://www.its.caltech.edu/~kip/index.html/PubScans/II-121.pdf), all kinds of problems involving infinite families of degenerate classical solutions arise, which in my mind seems to be a giant neon sign pointing towards something at least similar to Quantum Mechanics. The papers cited above point out that self-consistency requirements probably constrain these infinite multiplicities in various ways, providing a plausible mechanism by which Quantum Mechanics could arise as a direct consequence of nothing other than vanilla General Relativity. Unfortunately those papers also seem to take the attitude that “oh hey, look, these problems are tamed after quantization,” rather than taking seriously the possibility that QM itself is just a manifestation of GR. I’ve never heard a satisfying answer to why this line of reasoning doesn’t seem to have been seriously followed up on, so maybe this could be something you talk about in your book!
Comment #31 January 25th, 2022 at 9:12 am
Mateus Araújo #15 (and also Primer #16): As I should’ve said in the original post, I completely agree with you that there are excellent, known reasons why the universe shouldn’t have been based on classical, continuous-all-the-way-down physics, ranging from singularities to various computational pathologies. But that still leaves the question: whatever was going to come in to discretize or regularize the continuum, why shouldn’t it have been classical, like a cellular automaton? What was wrong with Conway’s Game of Life, or your favorite similarly Turing-complete variation thereof (maybe a probabilistic one), as the basis of reality?
Comment #32 January 25th, 2022 at 9:12 am
My best guess, trying my best to separate “the physics” from “the formalism” (often foundational people confuse these):
The world is relational: one can only find out about a system by kicking it with some sort of device. So a measurement result cannot be represented mathematically as result = f(system).
The next best thing is result ~ (measurement)|system>, or
result ~ Tr(measurement . system),
where (measurement) is an operator hitting the system, and different measurements do not commute.
If we use linear operators, the two statements above are equivalent, so there is no reason to worry about which to pick.
To go further: in the world we live in, every measurement on every system gives some kind of result, there is a continuum of measurements one can make, and results are real numbers. The right mathematical properties to describe this are possessed by Hermitian operators (completeness, real eigenvalues, the connection to unitary operators; note these are not specific to QM, see Sturm–Liouville theory).
Now, non-commutation means lack of knowledge, and we need to somehow model this mathematically. That’s what probability is: a model of lack of knowledge. So how do we put probability together with the above facts? Gleason’s theorem shows the consistent way to do so.
So we have essentially argued for quantum mechanics from “reasonable” assumptions (relationalism, the continuity of the set of questions, the guarantee of getting answers). The fact that this has non-trivial consequences (the quantization of energy levels in bound systems) is that much more satisfying.
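A minimal numpy sketch of those ingredients (the state here is an arbitrary choice of mine, purely for illustration): Hermitian observables with real eigenvalues that fail to commute, and outcome statistics Tr(P·ρ) that behave like probabilities, as Gleason's theorem says a consistent assignment must.

```python
import numpy as np

# Two Hermitian observables (Pauli X and Z): real eigenvalues, no common eigenbasis
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
print(np.allclose(X @ Z, Z @ X))  # False: the measurements do not commute

# An arbitrary pure state, written as a density matrix
psi = np.array([1, 1j]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Outcome statistics: probability of each result is Tr(P rho),
# with P the projector onto the corresponding eigenvector
for name, obs in [("X", X), ("Z", Z)]:
    vals, vecs = np.linalg.eigh(obs)
    probs = [np.trace(np.outer(v, v.conj()) @ rho).real for v in vecs.T]
    print(name, np.round(probs, 3))  # nonnegative and summing to 1
```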
Comment #33 January 25th, 2022 at 9:21 am
Joshua Zelinsky #17: I like that you’re grappling with the question in precisely the spirit I intended it! But my difficulty is: why shouldn’t we be able to design some classical cellular automaton, way way simpler than the Standard Model (with its 25 adjustable parameters and so on), which would similarly give rise to a collection of “atoms” that can fit together only in discrete, tinkertoy-like ways?
Actually, you’ve made me realize that designing such a CA would be a phenomenal research project for anyone seeking to investigate Q1/Q2.
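As a warm-up, the standard Game of Life already gives a glimpse of discrete "atoms" that hold together under a dead-simple classical rule. Here's a minimal numpy sketch (assuming the usual birth-on-3, survive-on-2-or-3 rules on a toroidal grid), in which a glider coheres and translates:

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a toroidal grid."""
    # Count the 8 neighbors of every cell via wrap-around rolls
    nbrs = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(int)

# A glider: a discrete "atom" that holds together and moves under the rules
glider = np.zeros((6, 6), dtype=int)
glider[[0, 1, 2, 2, 2], [1, 2, 0, 1, 2]] = 1

g = glider
for _ in range(4):
    g = life_step(g)
print(g)  # the same glider, shifted one cell down and one cell right
```

Of course, the hard part of the research project is getting such atoms to combine only in discrete, tinkertoy-like ways, not merely to persist.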
Comment #34 January 26th, 2022 at 9:57 am
Stephen Wolfram tried / is trying to do this. See his book A New Kind of Science.
Comment #35 January 25th, 2022 at 9:25 am
Something I thought was old hat, but which a couple people have told me has some small novel element, is that you can weaken the Born rule to “as amplitude limits to 1, probability limits to 100%”. You then recover the full p = |a|^2 rule by appealing to the usual suspects like the structure of the tensor product, no signalling, and the ability to port computational definitions from classical to quantum by using no-garbage reversible circuits as a shared language.
Whenever you can translate a statement like “X has probability p” into something like “this family of garbageless circuits limits to always returning true” you can port this into and out of quantum mechanics as “this family of garbageless circuits limits to producing a superposition with all amplitudes equal to zero except for the |true_output> state”.
I try to work through it in more detail here: https://algassert.com/post/1902
Another way to think about this is… whenever we try to carefully investigate something, we do statistics on it. We do repeated experiments, or a huge variety of experiments, and aggregate all those runs into some simpler answer. This process of aggregating, of focusing on the tiny number of bits summarizing many input bits of unknown nature (e.g. a trillion shots being reduced from a trillion outcome bits to a 20 bit hit rate number), screens off the need to “get the input bits right”. It seems sufficient to converge on getting the behavior of the summary bits right.
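A trivial numerical sketch of that aggregation step (the amplitude here is an arbitrary number I chose): a million raw outcome bits get reduced to a single hit-rate summary, and that summary converges on |a|^2.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.6 + 0.0j           # hypothetical amplitude for the |true> outcome
p = abs(a) ** 2          # Born rule: p = |a|^2 = 0.36

# A million raw outcome bits, aggregated into one summary number
shots = rng.random(1_000_000) < p
print(round(float(shots.mean()), 2))  # ~0.36: the hit rate converges on |a|^2
```

The individual bits are unknowable noise; only the summary statistic needs to be gotten right, which is the sense in which the weak limiting statement suffices.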
Comment #36 January 25th, 2022 at 9:26 am
I think the answer to Question #1 is most likely to be along the lines of the answers to the questions “Why does the Solar System have 9 planets, instead of more or fewer?”, “Why is the Earth’s surface 75% water, instead of 50%?”, and “Why did civilization get its start in Mesopotamia, instead of somewhere else first?”
Comment #37 January 25th, 2022 at 9:27 am
The answers to Q1 and Q2 are clear-
Q1-The creation of a universe by a deity must arise from an intention. The intention must have some goal. The goal of the intention to create a universe must be associated with something that isn’t known by the deity (if known, it would not provide interest). The created universe must therefore include elements that cannot be known prior to their measurement (a true randomizer to said deity). Hence quantum mechanics is necessary in any universe created by a deity.
If you evaluate potential explanatory models-Creator+First Person Shooter, Creator+Simulation, Creator+Boredom, etc., all require some element that cannot be known beforehand by the deity in order to provide sufficient interest to form an intention and hence quantum mechanics is a necessary condition. QED.
Q2-Following from Q1, steam engines and flywheels all the way down is insufficiently interesting. Hence, our universe is the simplest possible that meets the deity’s requirements for unpredictability and interest.
Of course the deity plays dice with the universe. Why roll at all if the outcome is known a priori?
Comment #38 January 25th, 2022 at 9:30 am
Why not tackle the even more ambitious question: why should the universe be mathematical at all?
It seems that QM is an efficient patch for some of the inefficiencies that arise when modelling the universe as a mathematical entity.
As Feynman said “Why should it take an infinite amount of logic to figure out what one stinky little bit of space-time is going to do?”
Comment #39 January 25th, 2022 at 9:32 am
Augustin Vanrietvelde #18: Thank you; your comment makes me glad I did this post! You’ve given me an argument that (1) genuinely depends on QM and (2) I’d genuinely never heard before.
Alas, thinking through your CNOT example, I’m led to a very different conclusion than you are: namely, that I see no reason why the time-reverse of a causal influence from A to B, should be interpretable as a causal influence from B to A. Suppose, for example, you took a video of rain making grass wet, and then played it for me backwards: would I say that in the bizarro backwards world, the wet grass had caused the rain? No, I’d simply say that it “looked backwards” to me—or, on further reflection, that you’d broken causality by reversing the thermodynamic Arrow of Time!
I do, on the other hand, agree that “no action without back-action” is an elegant feature of QM, and that it’s beautifully illustrated by the Hadamard conjugate of CNOT(a→b) being CNOT(b→a).
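For anyone who wants to verify that identity directly, here's a short numpy check (standard basis ordering |00⟩, |01⟩, |10⟩, |11⟩, with qubit a first):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
HH = np.kron(H, H)  # Hadamard on both wires

# CNOT with qubit a (first) as control, and with qubit b (second) as control
CNOT_ab = np.array([[1, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
CNOT_ba = np.array([[1, 0, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])

# Conjugating by Hadamards on both wires reverses the direction of the control
print(np.allclose(HH @ CNOT_ab @ HH, CNOT_ba))  # True
```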
Comment #40 January 25th, 2022 at 9:36 am
Ashley R Pollard #19: The difficulty is that you could also have a classical theory where everything was built out of elementary units of information, at some minimum length scale (which you could call the “Planck scale” even though it wouldn’t involve Planck’s constant). Cellular automata provide explicit examples.
Comment #41 January 25th, 2022 at 9:39 am
As for Q1, it seems like the answer ought to be a combination of the anthropic principle and Occam’s razor.
What is the *mathematically simplest* universe that gives rise to complex intelligences that can question why their universe is the way it is?
And, well, mathematically speaking, quantum mechanics is pretty simple. Sure, it uses exponentially large objects in the background to describe everyday situations, but the fundamental mathematical formulations are only a few equations. Now I’m not sure that it is actually the *simplest* way to give rise to complex intelligences (though really you’d expect to sample from a Kolmogorov prior rather than just take the single simplest option), but it seems a lot simpler than hacking classical mechanics to make things work.
Like you could add tinker-toy atoms to classical mechanics, but you’d also have to add a whole bunch of ad hoc rules about which atoms can combine into molecules and how much energy they release when they do and what angles they form and so on and so on. Quantum mechanics on the other hand, manages to get this as an emergent property of a few equations.
Comment #42 January 25th, 2022 at 9:39 am
Sam #20:
But (1) is the really tender point, because I can _also_ imagine worlds with really weird mechanics, with lots of parameters (and hence occupying lots and lots of theoryspace), but still classical and life-supporting; it’s not at all clear that quantum theoryspace is any larger.
Yes, this is precisely the difficulty!
Comment #43 January 25th, 2022 at 9:43 am
Maybe we flip Q1 around: Why does classical physics exist? If QM is really true at a fundamental level (assuming QM here includes QFT and the Standard Model and somehow also gravity), then why does it seem so weird to us? Presumably the answer is that sentient life requires a very large number of particles to exist, and to exist on a planet which is even larger, and it’s not that surprising that 10^23 particles, mixed up in complex structures and at length scales much bigger than the Planck length, behave differently than an ideal gas; that different behavior is what we call classical physics. It’s a special case of QM that we have been psychologically adapted to.
We would expect many of the same conservation laws to exist on the macro and micro scale, and they do, whereas some things aren’t observed at the biggest scale so they seem weird to us because we’ve never encountered them before. Both microwaves and X-rays behave very differently than visible light, even though they’re all the same “thing”.
As for Q2…I suspect the answer is that we live in the simplest possible universe which is also complicated enough to allow life to form. The math only seems “complicated” because it’s hard to understand, but the basic assumptions are very few. SU(2) and SO(3) symmetry seem intuitive to me; U(1) is a little weirder, but I think that’s just because it has no classical analogue. The overall linearity of QM is probably not a coincidence either, but rather a core requirement of the theory. General relativity requires only a few extra assumptions, and I suspect when we have a GUTOE the number of assumptions will still be very few.
Saying QM needs to be on the complex plane is equivalent to saying the wavefunction needs two components at every point in spacetime. Maybe somebody could come up with a mathematically self-consistent and non-trivial QM that didn’t, but I suspect it wouldn’t be able to support any type of molecule, and hence no life, and no sentient life.
But why the QM formalism in general? Why is there a thing called a wavefunction, and why can we operate on it with different operators (energy, position, momentum) whose mathematical operations correspond to measurements we make? I’ll admit this has me a bit stumped, and I’m curious what you come up with as an answer. From Noether’s theorem we know that translations in time and space lead to conservation of energy and momentum respectively. It stands to reason we should be able to measure and/or calculate what these quantities are…somehow. Same for any other physical measurement: we should be able to describe it mathematically…somehow.
It may be that QM is the simplest (aka fewest number of assumptions) theory that satisfies everything it needs to:
1. Linear
2. Local
3. Poincaré group symmetries (which stem from the universe being translationally, rotationally, and boost-invariant)
4. Allows for the existence of atoms and molecules. This last one because of the anthropic principle.
Comment #44 January 25th, 2022 at 9:44 am
Jonathan #21: It’s certainly true that you can start with just an abstract Hilbert space plus, say, the Standard Model Hamiltonian, and then use the locality of the Hamiltonian to find a preferred tensor product decomposition for the Hilbert space, without needing to assume the tensor product decomposition from the start. And you can do even more impressive things than that; Sean Carroll (for example) has written a lot about this. But that’s the correct answer to a very different question than the one I asked! 🙂
Comment #45 January 25th, 2022 at 9:53 am
Peter Morgan #22: Why am I not surprised that you have a paper that answers the question and that I won’t understand? 🙂
But I’m confused: do either you, or Colbeck and Renner, give any argument that quantum measurement theory is (in any sense) a unique completion of classical measurement theory? Or do you merely argue that it’s one possible completion, “completion” in the sense that nothing additional can be added to it?
Comment #46 January 25th, 2022 at 9:53 am
Very exciting! I’m looking forward to hearing what you come up with.
I don’t have any strong views on this, but wanted to pass along two approaches/ideas to make sure you were aware of them.
The first is a central role of information processing and some kind of renormalization process that leads to effective theories that look like QM. As you are deeply aware, information and entanglement seem to be fundamental components of WHY QM is the way it is.
Toffoli has made some interesting points about action being the counterpart of entropy for computations/trajectories instead of for states. (https://link.springer.com/article/10.1023/A:1024411819910) and (http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.42.2410)
I wonder if there is some general argument that says a lot of different micro-theories would converge to something that looks like the path-integral formulation of QM, since varied laws of micro-dynamics + coarse-graining will yield an effective description that looks like laws that minimize an action. I don’t know why the state space would be a Hilbert space, though. But it feels like looking at processes that coarse-grain to something that looks approximately like a Hilbert space might be a fruitful avenue.
Second, while I haven’t looked into this in depth, there is a small community of people studying QM as being emergent from particular laws of logic and inference: https://arxiv.org/abs/1511.02823. I think this is in a similar vein to the papers you mentioned in your 3rd comment.
Comment #47 January 25th, 2022 at 10:01 am
I think the actual answer will involve determining what the universe needs to look like to an observer, given the fact that no observer can observe itself perfectly, even in theory.
E.g. what would the universe look like to an AI instantiated in Conway’s Game of Life?
It surely would not look like Conway’s Game of Life, given that observers on the board could not have an exact model of themselves, which they could have if they could look at themselves the way we can by looking at the board.
Since no one has yet answered this (as far as I know), it could simply be that Conway’s Game of Life (or any other classical universe), to an instantiated observer, would look like quantum mechanics or something very like it.
Comment #48 January 25th, 2022 at 10:03 am
Scott #18: I’m glad you liked the idea!
About your comparison between CNOTs and rain hitting grass, I’d say this is precisely why, when laying down the premises for quantum causal modeling, I was stressing the restriction to looking only at reversible transformations. The irreversible ones certainly cannot be expected to obey such a thing as time-symmetry of their causal structure; in fact, in the quantum case too, the causal structure of irreversible transformations is not time-symmetric.
But the premise here is to restrict yourself to only look at reversible transformations. For such transformations, there is by definition no thermodynamic arrow of time, so your example doesn’t apply. And it is arguable that reversible (and deterministic) transformations are the only ones you should be looking at anyway, on the premise that irreversibility only happens when the human perspective gets in the way, and that the world out there is time-symmetric. So, in short, none of this is about processes that would “look backward to you”, they have been excluded from the start, both in the classical and quantum case.
About your comment on backaction, I totally agree, and I would say that in some sense this whole time-symmetry thing can be seen as a way to formalise the “no action without backaction” principle. So you can see it as just this principle, but now stated in a mathematically and conceptually rigorous way, and also proven in all generality. In a sense it turns it from this “elegant feature” you’re talking about into a deep structural property!
Comment #49 January 25th, 2022 at 10:08 am
I’d guess that the quantum aspects of reality are a special-case of a more general super-selection principle for reality itself. That is to say, the only assumptions we need are that the laws of physics themselves are not uniquely fixed, and that conscious observers exist at all.
It could be that the laws of physics evolve over time, and just as the state of our particular universe started off simple at the big bang and increased in complexity over time, so too is this happening at a deeper level…. to the laws of physics themselves. One could imagine universes popping in and out of existence in a much vaster multiverse, with some evolution of universes over many eons.
I’m saying that principle of Occam’s razor isn’t necessarily true for the universe as a whole. Perhaps the laws of physics aren’t maximally simple, rather they’ve evolved towards greater complexity, just enough to allow conscious observers to exist. And quantum mechanics might just be the minimum amount of ‘extra’ complexity you could add to the simplest possible laws of physics to allow for conscious observers.
You said in (4) you want to assume that current QM is exactly correct, but if that were true, I really don’t think there can be any answers for your questions. I mean some of the brightest minds have thought about quantum foundations for decades, so surely it’s unlikely that anything has been missed?
Looking at theorems like Gleason’s theorem etc., I don’t think they’re convincing arguments for the completeness of QM, since in part they already *assume* that QM is correct, so arguments for completeness are just circular logic (for instance, a mathematician might assume that the Hilbert space formalism is correct, but if the ultimate theory turns out to use a slightly different geometry from Hilbert space, then obviously the theorems that started by assuming a standard Hilbert space would not apply).
Comment #50 January 25th, 2022 at 10:26 am
Presumably, Aristotle or Francis Bacon or Newton couldn’t have inferred QM just by thinking about the world of ordinary experience and applying Occam’s razor; you have to do a bunch of carefully designed experiments with fairly sophisticated equipment. If that’s true, there could have been a world that was exactly the same as far as ordinary experience is concerned, but worked on entirely different underlying principles.
When you consider applying Occam’s razor using some measure of complexity like Kolmogorov’s, you have to take into account, not merely the complexity of the foundational physical theory, but also the unfathomable complexity of the boundary conditions in the QM account.
Comment #51 January 25th, 2022 at 10:44 am
Can’t wait to read your book on Q 🙂
Here’s how I see the problem:
So, the answer to Q is A1: QM is the best option given the constraints.
OK, Q1a: what are the options? And Q1b: what are the constraints?
Options: A1a – Probably computational power is what’s important, and not the practical details of implementation. Two obvious alternatives are a less powerful universe (classical) and a more powerful universe (one where NP-complete tasks can be solved using polynomial resources). Q1c: is there a less obvious alternative? Q1d: why is the universe not classical? And Q1e: why is the universe not super-powerful enough to offer NP solutions (I guess there is a term for this)?
Constraints: A2b.1 – life, and A2b.2 – simplicity. The universe should be simple enough so as not to look too suspicious, and complex enough to allow the evolution of us. Note that simplicity is not the same thing as entropy: the Big Bang was very unlikely because of its low entropy, yet at the same time very simple. Also, simplicity is a property of the explanation, not of the actual universe: nothing precludes the universe from starting from a very complex pattern; it just makes for an unconvincing theory.
OK, Q1d: why not classical? Hypothesis: it’s hard to make it simple enough. You can build a classical universe which is capable of producing intelligence in a Game of Life simulator (I assume it has enough power for that), but you need an insanely complex starting position or set of instructions for that. Maybe I’m wrong and there are ideas on how to evolve universal Turing machines from a very simple classical start.
Alright, Q1e: why not NP-powerful? Hypothesis: NP-powerful is too powerful for the evolution of life. The intuition behind that is that many problems life tends to solve are actually NP-complete, and life just evolves more and more clever tricks to solve them using polynomial heuristics. Then, if you can just obtain the best solution using polynomial resources, you don’t really need to be very clever – you can just get there in one (P) step. This is obviously a speculation. Also, maybe in an NP-powerful universe you have to solve super-NP-complete problems, and still need some intelligence. It would be interesting to read about the computational complexity of optimizing one’s fitness.
Fine, Q1c: why is BQP the best? A) Maybe it’s not (I just don’t know of other alternatives in a similar power spectrum, and would really enjoy reading about them). B) Somehow, to capture free energy from the low-entropy past, one needs complex chemical (or alternative) networks, which are built using very simple rules. I don’t know why chemistry cannot be purely classical, and again, maybe it can be.
Comment #52 January 25th, 2022 at 10:45 am
An important clue is that quantum mechanics is useful in pure math: topological invariants, supersymmetry and Morse theory, etc… So, clearly the answer is that some super-powerful aliens constructed a quantum universe to answer some question in 11-dimensional topology, and we are merely bugs in their code.
And based on your ankle injuries, Scott, may I recommend increasing your physical activity, with basic weight training, flexibility and mobility work, and cardio? It will make you a little less mortal and will probably increase your research output in the long run.
Comment #53 January 25th, 2022 at 10:57 am
As a mathematician, I’ve often thought about questions like Q myself, but from a different perspective. If I had the time and the ability, I would seek the answer somewhere in the arena of coherent states, squeezed states, Parseval frames, and similar. Admittedly, an “older, well-trodden” path, but given that from God’s perspective it all begins with light (according to Genesis), and given how much light is a central theme in humankind’s pursuit of understanding — first a particle, then a wave, then both, bent by a gravitational field, emitted in quanta, then a field, then a squeezed state that is now a tool for study of everything from gravitational waves to quantum computing — the theory of coherent states has for me always seemed to be where best to find why we think classically in such a quantum world.
Feel free to stop with the first paragraph, but I’ll elaborate if any of you are still interested. Classical physics in many ways parallels the development of mathematics — intertwines perhaps — but in spite of the adage “God created the integers,” mathematics is anything but divinely inspired. Though deep, elegant, and incredibly useful, mathematics often struggles even with itself, not to mention with physics and the other sciences. All that massive effort to transform Riemann integration into measure theory and the Lebesgue integral yields an absolute integral, and much of physics — the Fourier Transform, for instance — requires integrals that aren’t absolute. Certainly, there are other “integrals,” but even then, essential identities like
$$ \int_{-\infty}^{\infty} \frac{ \sin(x)} {x} dx = \pi $$
require “improper integration” or other “ad hoc” justifications — and be careful how you integrate such “conditional integrals” lest you change their values altogether. Classical physics, especially classical field theory, tracks with this pattern of rigorous followed by “ad hoc” mathematics. Where are these positive and negative infinities in the limits of integration, why do we model the world with sine waves that are infinite in extent, and even more important, why does doing so even work?
One possible answer is redundancy. We base so much of both math and physics on uniqueness — unique representation by an orthonormal basis, for example — whereas so much of reality does not feature uniqueness. One reason for the success of neural networks as a machine learning algorithm is in their redundancy — I especially enjoy an example I use with students where I train a neural network (simple classifier), “destroy” a random 10% of the “artificial neurons”, and then show them how little, if at all, the neural network’s performance has been reduced.
Enter the overcompleteness of the coherent states, as well as the related mathematical concept of a Parseval Frame. Coherent states seem to be instrumental in explaining why we have a classical perception of a quantum world — as well as illustrating where our quantum understanding is still not fully developed. Frames are an essential tool in applied mathematics, engineering, machine learning, and a host of other arenas where the importance of obtaining “a solution” outweighs the importance of it being a “unique” solution. Indeed, there is to me this “theme” that has emerged throughout modern mathematics, science, and engineering — namely, that “reality” is not a unique experience common to all, but instead, it is via redundancy a “shared experience” that admits multiple interpretations including highly idealized logical descriptions that are not part of that reality itself. That is what I would pursue, if I had the time and the ability to pull it off.
Comment #54 January 25th, 2022 at 10:59 am
Scott #5:
All right, here’s an attempt to address this objection. Brace yourself for some wild, late-night-at-the-freshman-dorm quality speculation, but … what if:
1) Universes governed by laws of physics which support intelligent life are, as Yudkowsky, Bostrom, et al propose about our universe, prone to being taken over by runaway superintelligent optimizers with more or less randomly selected objective functions.
2) Our apparent reality is, not a simulation exactly, but an epiphenomenon of whatever some superintelligent optimizer is up to, operating on some lower substrate of physics (possibly itself an epiphenomenon of an even lower-level runaway optimization process, and so on some hopefully finite number of layers down until you get to the true fundamental physics).
3) The substrate physics immediately below ours actually does allow arbitrary non-deterministic Turing-computation for free, or infinite parallelism, or something like that, i.e. something much more powerful than both classical and quantum computation.
4) The superintelligent optimizer below us (SOBU for short) is indifferent to intelligent life forms such as us arising as a side effect of its computation, *just as long as* it is able to ensure that no superintelligent being that arises in our level of reality is ever able to “reach down” and commandeer its computational resources for purposes contrary to its objective.
5) The method SOBU has found to minimize the risk of that happening entails that, compared to it, our computational capabilities are much more limited. Specifically, we are limited to the capabilities of quantum computers.
6) This may strike us as a weird and arbitrary complexity class compared to classical Turing machines, but SOBU doesn’t care if some janky limited form of beyond-classical parallel computation “leaks” into our reality, as long as we are still computationally much weaker than it.
Comment #55 January 25th, 2022 at 11:06 am
I don’t have a direct answer to these questions but want to make a suggestion that I hope is relevant.
I suspect that our attempts to understand the universe are thwarted greatly by the mental hacks, installed by evolution, that we bring to bear on the problem. Specifically, we see the universe as made of objects with some permanence, influenced by forces, evolving over time. We interpret everything through the lens of causality. And those are just the mental hacks that we can see!
We recognize that QM strains our ability to visualize how the universe works. As Feynman famously said, no one understands QM. I suspect this is just the tip of the iceberg of our lack of understanding. It’s probably worse than we realize.
Here’s what I think will happen. We will eventually invent a real Artificial Intelligence capable of thinking of these things. We will eliminate these mental hacks from our AI, perhaps replacing them by some new ones not shared by humans. An entire science will arise around inventing new ways to look at the universe. Then we will set our non-human AI(s) loose on understanding the universe. I suspect they will come up with an entirely different way of looking at the universe, perhaps several. Then we will query the AI in order to figure out what they mean. I don’t offer a timeline for all this.
Comment #56 January 25th, 2022 at 11:11 am
Start not with god.
Start with the Kolmogorov complexity prior over the space of computer programs.
Around 1/n-th of this space has run time BB(n). This space doesn’t care to minimize runtime. Not in the slightest bit. However, many of the programs are much faster. The only thing this prior cares about is how many bits something takes to express. And quantum mechanics can be expressed in few bits. (I would hazard a guess that there are no laws of classical physics that are similarly simple to our own and which produce a universe similar to this one.) Of course, I can’t prove this, and it may depend on how similar counts as similar. However, if this is true, it is a highly nonobvious mathematical fact.
We can put some bounds on the size of our past, in that it has to be big enough to allow evolution, but small enough to let us be first. If the universe were exponentially huger than what we see through telescopes, with a practical way for a self-replicator to spread, there is no way we could have been first. This puts no limits on quantum branches, or on space past our cosmic horizon that doesn’t influence us.
There is no particular restriction on physical law to be a shape easily grasped by human minds. Nor for the link between underlying law and surface phenomena to be direct.
Any attempt to reformulate QM into a simpler or more elegant form is just evidence toward quantum mechanics having a low Kolmogorov complexity. We already know it’s pretty low, but we may be able to reduce logical uncertainty about exactly how low. This includes showing that quantum mechanics is optimal under some criterion X: if X is simple, this is further evidence of a low Kolmogorov complexity. (There is some “up to choice of Turing machine” stuff going on here.)
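Kolmogorov complexity itself is uncomputable, but compressed length gives a crude, computable upper bound on “how many bits something takes to express.” A toy illustration of the prior's preference (zlib as the compressor is my choice, purely for concreteness):

```python
import random
import zlib

n = 10_000
# A string generated by a tiny rule vs. a seeded pseudo-random string.
simple = bytes(i % 7 for i in range(n))       # output of a very short program
random_ish = random.Random(0).randbytes(n)    # incompressible-looking bytes

len_simple = len(zlib.compress(simple, 9))
len_random = len(zlib.compress(random_ish, 9))
print(len_simple, len_random)
```

The rule-generated string compresses to a handful of bytes while the pseudo-random one barely compresses at all — the sense in which a universal prior weights “few bits to express” over everything else, runtime included.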
Comment #57 January 25th, 2022 at 11:33 am
I see that Lucien Hardy in the paper you mentioned (thanks for pointing it out) actually makes that claim: “Put another way, could a 19th century theorist have developed quantum theory without access to the empirical data that later became available to his 20th century descendants? In this paper it will be shown that quantum theory follows from five very reasonable axioms which might well have been posited without any particular access to empirical data.” But that’s wildly far-fetched. Hardy’s description of classical theory, “A classical system will have available to it a number, N , of distinguishable states. For example, we could consider a ball that can be in one of N boxes,” is not one that 19th century theorists would have accepted or even recognized; it is completely informed by a quantum mechanical world view. What the argument shows is that, if you view the world in the way that QM tells you to view the world, then QM works better than CM. The idea in Hardy’s conclusion that it should have been a classical theorist rather than Schrodinger complaining about “damned jumps” is absurd (or, I suppose, a joke). Quantum theory, after all, is quantized.
Comment #58 January 25th, 2022 at 11:35 am
Scott,
I am thrilled to learn that you are dedicating some time to this undertaking. I can hardly think of someone more suited for the task than you. I like in particular that you highlight your non-dismissal of the questions as silly among your qualifications.
In the spirit of “Metaphysical Spouting” and “Embarrassing Myself,” I howl the following crack-pot half-baked sub-Aaronson Beachboys Threshold ideas into the wind:
I have always shared (what I am interpreting as) your dissatisfaction with our current understanding of QM. By this I do not mean a dissatisfaction with QM as a theory, but more a dissatisfaction with the way that we talk/think about it. One can accept QM as “fundamental” and “true all the way down” and still be surprised and perplexed by violations of the Bell inequality. In the words of Bell himself: “it is a curious situation indeed.”
It does feel as if, could we just adjust our perspective a bit, or peel back one more layer, we would see why things had to be so. Which layers should we try to peel back, then?
Two suspicious characters that appear in our current quantum mechanical description of the universe that I think warrant close inspection are:
Time: Different from space, and a really weird cat! Does privileging the present (by which I mean adopting some kind of relativity-respecting temporal ersatzism) point us towards QM somehow?
Zermelo-Fraenkel Set Theory With The Axiom of Choice
We are going below the beachboys threshold, but I warned you!
In my view, ZFC is far too strong an axiomatic system for any mathematics that we are applying to the physical world. As I write this, I realize that scrapping ZFC for some weaker system may help restrict the set of statements you can make about the physical universe, but it is not likely to yield anything particularly quantum, so I guess you can just ignore this altogether. But what I am vaguely gesturing at is this: does picking another axiomatic system (ideally a weaker one) for our mathematics, and really biting the bullet and sticking with all of the consequences of this all the way down, somehow point towards something that maybe looks like QM?
Let it be known, though: Tu thinks ZFC should be banned in physics departments around the world.
Can’t wait to read the paper, and for you to invest the resulting Nobel Prize winnings into my QC startup!
Comment #59 January 25th, 2022 at 11:46 am
Consider the following
Weirdness Hypothesis. In any quantum-mechanical universe, quantum mechanics will seem weird to its inhabitants.
I don’t really believe the Weirdness Hypothesis, but it strikes me as being difficult to either prove or disprove. If your project succeeds, then presumably it will disprove the Weirdness Hypothesis. So this puts a lower bound on how hard your project is.
Comment #60 January 25th, 2022 at 11:57 am
This is much in the same direction as Jacob’s #42: Maybe the question you should be asking is “why does QM seem strange to us”? To which the answer, presumably, is that our brains evolved to think about phenomena which occur with Avogadro’s number of particles, not single particles. Which raises the question “why didn’t life evolve to use a small number of particles, so that we would experience quantum effects frequently in our daily life?”
Comment #61 January 25th, 2022 at 11:58 am
By the way, a word of warning: Twelve years ago, I said to myself, “I’m over fifty; it’s time for me to stop beating around the bush and work now on accomplishing what I want to accomplish”. I’m still dealing with the consequence of that decision, in the form of a book that is, currently, 3/4 written (incidentally, distantly related to your project, in that it axiomatizes some parts of “intuitive” physics). I can’t offhand think of any important work that was done with this mindset, though there probably are some.
Comment #62 January 25th, 2022 at 12:25 pm
Scott #28:
Thank you; I’ll take that as a compliment.
Unfortunately this gets to the bit where my lack of QM knowledge really stops me from even thinking about the answer to that. But I think that any option you took would also need to have fundamental uncertainty about the future (which you posit getting through something like classical dice throws) but fundamental *certainty* about the past. You can’t have two presents that can be arrived at from the same past. So far as I am aware (and please let me know if I am wrong), that’s something that QM has (some sort of conservation of information). I can’t see how you would get the same thing in a classical system. Say the path of an object is certain (bearing 0, speed x) and then there is a “dice roll” to change its bearing in some way or another (creating the uncertain future). Surely once it is rolled, and you now have a new bearing (0 + some random number), there are multiple histories that could have gotten you there; e.g. you could have had (0 + 2 × the random number) as your starting bearing, with the outcome of the dice roll being the opposite of what it actually was. So one present can be arrived at from multiple histories.
If that’s the case then you have lost that fundamental thing about the present: one past, multiple futures. You will also have really lost the entire idea of “past” at all. If the past is uncertain like the future, then what makes it the past?
I’m not 100% sure if I am just repeating myself or adding something. Do you have a concrete example of a classical process that could lead to multiple futures from one present, where the present implies one, and only one, past? Am I making a mistake in thinking that QM has this property at all? (I.e., in QM could you have two different starting functions that end up with the same precise end state? Alternatively, am I wrong that one starting state can lead to two different future states?)
It feels to me that if there is something special about QM in general that gives you that situation, then that’s really Q1 territory, and the sub-question of “but why this particular version?” strays into Q2 territory (where I can’t play, as I don’t even know *how* to think about things).
Comment #63 January 25th, 2022 at 12:49 pm
Sorry, Scott #44. I don’t think the answer I’m suggesting is either complete or unique, but I personally find it helpful. The log-jam has been almost century-long, so I figure even a small help is good to have. I’ve stopped commenting here often, but this post of yours was too much :-). In any case, I’ll either find a way to say what I’ve been saying in a way that you find compelling enough to look harder, or I won’t. I’ve posted my comment on Facebook (see above, as my Website link), so we could also have a discussion there, as you like.
Quantum measurement theory isn’t unique, because GPTs (Generalized Probability Theories) are a thing. Nonassociativity, as for Jordan algebras, might be useful in some contexts. Different axioms give different structures. Georgio #32 more-or-less follows axiomatic constructions like Lucien Hardy’s, from which we can get ourselves either to a commutative classical formalism or to a noncommutative quantum formalism, although Planck’s constant has to be put in by hand. A commutative structure cannot model all transformations of measurement results, so why wouldn’t you want to include noncommutativity? Signal analysis includes noncommutative operations but is not usually said to be non-classical.
Does it help if I agree with your #31? Continuous-all-the-way-down certainly seems wrong to me as an assumption: nonlinear differential equations might or might not result in vortices and other chaotic features all-the-way-down after some finite time, in which case differential equations would be very tricky (this, in its Navier-Stokes aspect, is a Millennium Prize problem). In such a case, measure theory and probability measures can still be possible, but part of that theory is the existence of incompatible probability measures. Note, as well, that if the axiom of choice is necessary to give a complete initial condition, then determinism would also be very tricky, so there can be a convergence of CM with QM from that slightly different direction, with the presence of an effectively irreducible noise.
Augustin #18 is an interesting idea, but I think we can’t assume that QFT is so neat. [QFT, with its focus on a continuum of possible measurements of temperature, density, et cetera, in different finite regions of space-time, is much more like thermodynamics than it is like many-body physics.] I also doubt it’s a good idea to ignore that events in measurement devices are apparently not usually reversible, in contrast to the reversibility of unitary transformations at the level of finite-dimensional Hilbert spaces. An aspect that I think is significant but that AFAIK goes unnoticed in causal models, because they are discussed mostly at a logical level, is that Planck’s constant as an amplitude of a Poincaré invariant noise is distinct from thermal noise, which has different symmetries and has amplitude determined by temperature and by Boltzmann’s constant. Axiomatic constructions of QM also ignore this distinction, which cannot even be made except in a 1+1- or higher-dimensional space-time that supports an action of the Lorentz group.
Comment #64 January 25th, 2022 at 12:53 pm
Why quantum mechanics?
Because not even God can create information ex nihilo.
Zero information = all possible descriptions = Everett’s multiverse.
Unitary-only QM is the quantum version of the Library of Babel:
https://www.quora.com/Why-does-the-universe-exist-Why-is-there-something-rather-than-nothing/answers/14473029
Comment #65 January 25th, 2022 at 1:01 pm
Hi Scott,
For me, the reason why classical mechanics feels aesthetically unpleasing is that it posits discrete objects in a continuous world. Objects are either here or not here; they have sharp edges. In a continuous universe, objects should be “fuzzy-edged”, in the manner of a wavefunction.
To be slightly more mathematically precise about it: in classical mechanics, functions like the indicator function of “does the particle occupy this location?” jump from 0 to 1, and thereby have an infinite Lipschitz constant.
A sharp jump like that feels wrong, like it ought to require infinite energy to perfectly segment “the particle” away from “not the particle”. So I much prefer “continuous first” theories like quantum mechanics on those grounds.
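The infinite-Lipschitz-constant point can be seen numerically: as the grid is refined, the indicator function's steepest finite-difference slope blows up like 1/dx, while a smooth “fuzzy-edged” profile's converges to a finite value. A small sketch (the specific functions and grid are my own illustrative choices):

```python
import numpy as np

def max_slope(f, dx):
    # Steepest finite-difference slope of f on a uniform grid of spacing dx.
    x = np.arange(-5.0, 5.0, dx)
    return np.max(np.abs(np.diff(f(x)))) / dx

indicator = lambda x: (np.abs(x) < 1.0).astype(float)   # "the particle is here"
gaussian  = lambda x: np.exp(-x ** 2)                   # a smooth, fuzzy-edged packet

for dx in (1e-1, 1e-2, 1e-3):
    print(dx, max_slope(indicator, dx), max_slope(gaussian, dx))
```

The indicator's steepest slope is 1/dx at every resolution (it diverges as dx shrinks), while the Gaussian's settles at a finite constant, which is the sense in which the sharp edge “costs infinity.”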
How does one avoid having discrete objects in a continuous world? There are really just two options: continuous objects, like wavefunctions, or a discrete world, like cellular automata. So next I’ll address why I think cellular automata are aesthetically unsatisfying.
All the cellular automata I’ve seen are fundamentally anisotropic: They have certain preferred directions and symmetries, corresponding to the shape of the cellular grid. These anisotropies are “globally visible”: no matter what scale one looks at a universe built on such a grid, it’ll be possible to tell the orientation of the grid far below. As a result, every emergent scale will have the same anisotropies. I think this problem might be a fundamental issue with discrete foundations: they’re always going to have global consequences.
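The grid-anisotropy point can be made concrete with the simplest possible growth automaton (my own toy example, not one the commenter names): grow from a single seed using the 4-neighbor rule, and the emergent region is a diamond whose orientation betrays the underlying lattice no matter how long it grows.

```python
import numpy as np

# Grow from a single seed: a cell turns on if any of its 4 grid neighbors is on.
N, T = 41, 15
grid = np.zeros((N, N), dtype=bool)
grid[N // 2, N // 2] = True
for _ in range(T):
    nbr = (np.roll(grid, 1, 0) | np.roll(grid, -1, 0) |
           np.roll(grid, 1, 1) | np.roll(grid, -1, 1))
    grid |= nbr

# After T steps the region is exactly an L1 ball (a diamond): the square
# grid's preferred axes are visible in the emergent shape at every scale.
i, j = np.indices((N, N))
dist = np.abs(i - N // 2) + np.abs(j - N // 2)
is_diamond = np.array_equal(grid, dist <= T)
print("region is an L1 ball (diamond):", is_diamond)
```

A rotation-invariant growth law would give a disc; the rule's diamond instead points along the lattice axes forever, which is the "globally visible anisotropy" being described.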
So, on aesthetic pleasingness grounds, I’m arguing in favor of quantum mechanics over a few other competitors. I can’t say that these considerations pick out quantum mechanics over all other possibilities, but I think the space of physical systems that quantum mechanics lies in is more appealing than some other spaces.
Thanks for posing these questions, Scott!
Comment #66 January 25th, 2022 at 1:02 pm
I think God first made a classical universe, but got a bit bored because it was too predictable. He sat in on a couple linear algebra classes the mortals gave and thought this sounded more interesting. Lately He’s been looking at Wolfram’s new stuff and laughing uproariously but taking it into consideration for the next go-round.
Comment #67 January 25th, 2022 at 1:16 pm
My knowledge of QM is pretty low. I do know that I have heard people who are experts say things like “The probability of this happening is 50-50 as close as we can measure, and nobody has found any way to tweak those probabilities in the slightest.”
For instance, passing a polarized photon through a filter that’s 45 degrees off line of the polarization. Apparently there are other instances of this.
And this makes me wonder: if electrons and photons, etc., behaved like rocks, might it not be the case that we would have a universe where entropy does not always increase?
I don’t have the tools to work out that theory, but it seems like a place to look.
Comment #68 January 25th, 2022 at 1:18 pm
Timothy Chow #58:
I think the Weirdness hypothesis has something going for it. If I were God, and I were setting things up before the dawn of the universe, I would certainly want to ensure that any and all of its sentient inhabitants were guaranteed a minimal amount of confusion, frustration, and wonder upon inspecting my handiwork.
Comment #69 January 25th, 2022 at 1:22 pm
Ignorant layman perspective here.
It’s possible the first question is not even allowed, as in, it may not make sense to ask. The world appears to be fundamentally QM. Isn’t asking why could it not be classical the same as asking why 3 could not be 5? Or why this Lego figurine could not have been made from carrots? It would not be the same figurine, and likewise, a ‘classical’ world would not be our world!
Reflecting on the work of the likes of Nima Arkani-Hamed, it seems like we will first need to explain how the Big Bang came about (perhaps from a quantum fluctuation?) and how the seemingly classical world emerged from it. Space, time, QFT — the entire thing. Only then will we be able to answer why all the quantum weirdness was needed. And only then will we be able to judge how hard it would really be to replace Schrödinger’s equation with Newton’s, while maintaining the world as we know it.
Comment #70 January 25th, 2022 at 1:52 pm
Is the distinction between a classical and a quantum universe significant? We could, if we wanted to, model our universe as a classical probabilistic Turing machine. (And inhabitants of a classical universe could model theirs as a quantum Turing machine.) Thus isn’t any universe with the strong Church-Turing thesis (i.e. one that can both implement and be implemented on a Turing machine) ultimately obeying the same laws?
(Please let me know if my premises are mistaken – I would really appreciate it.)
Comment #71 January 25th, 2022 at 1:56 pm
A couple of comments that are not proposed answers.
1) Of course, talk about why physics *had* to be this way, or why God did this rather than that, is not serious. If one means by “had to” nomological necessity, then it is trivial that the physical world had to be this way, and if you mean something stronger then it trivially did not have to. And God didn’t do anything, since there is no God. I take it what you really mean is that the laws of nature ought to, in some sense, be simple and compelling, in the sense that changes would lead to something more intrinsically complicated. I personally think that is so, but can’t prove it. You then object: but then why not just use classical laws, since they look simple and compelling? And here one should remark: we don’t even have any internally coherent version of classical physics. Consider, for example, the problem in classical EM of the self-field of a charged particle. There are kludges to avoid the problem, but they are kludgy and not fundamental solutions. So the idea that “classical physics” could somehow be worked out to be simple and sharply formulated is not something we know to be true.
2) The sort of simplicity one looks for as rationally satisfying will only show up at the level of fundamental law, not whatever emergent approximations it may give rise to. I think there will be exactly one such law, and it will cover all physics, including gravity. And I think that the simplicity of the law will depend on its being formulated in the right mathematical language. We know we don’t have a TOE, and I don’t think we have good grounds to think we are even using the right mathematical language to formulate it. But until we have it, we cannot appreciate how any alternative would be more complicated and messy.
3) I rather doubt that your take on “generalizing the rules of probability” is correct. As you know, for example, Bohmian Mechanics gets on just fine with classical probability. You seem convinced that probability calculus plays a deep role here. For what it’s worth, I see no reason to think that is in the right ballpark. I know this is just trading guesses, but at least acknowledge that yours is just a guess.
Comment #72 January 25th, 2022 at 1:59 pm
Short answer: the underappreciated virtue of laziness!
Instead of implementing lots of classical rules, maybe this time the creator aimed for a “minimal set” of rules which result in the emergence of many layers of new rules (the classical ones).
Minimum effort but still a universe with interesting dynamic behavior (so interesting that you would want to watch it for at least 14 billion years).
This answer is based on my (mis-)understanding of Feynman and his rotating arrows:
instead of, e.g., telling the photons to move in a straight line at c, you just let all the possibilities interfere, and this process generates the classical rules.
For this to work, you need destructive interference => complex numbers.
The Born rule helps make it easier to watch: at some point, things become less blurry.
If you worry about the higher computational effort for this approach vs. a classical one: well, that just shows how cheap the creator’s hardware resources are compared to his time as a developer. Especially for a quick prototype like this universe.
Comment #73 January 25th, 2022 at 2:07 pm
Possible answer. To get big stuff out of small stuff (aka “decoupling”), a local theory must avoid equipartition of energy (otherwise energy goes into many small modes). Inventing some classical-like dynamics that avoids equipartition seems to me nontrivial. QM achieves this by replacing small energy with small probability. Possibly there is no other road, and once you follow it, you arrive at QM.
Comment #74 January 25th, 2022 at 2:07 pm
Let me put my point more directly. Are you looking for an answer A to the question “Why?” such that, if A had occurred, or had been presented, to a 19th century physicist, purely as an abstract argument with no empirical evidence, they would have said, “Wow! Maybe I should try to work out how such a theory would work”? (Lucien Hardy seems to be claiming that about his argument, but as I said, I don’t believe it.) If so, I’m doubtful there is any such A. If not, I’m not sure what you mean by “Why”?
Comment #75 January 25th, 2022 at 2:09 pm
Love the survey, am looking forward to the results, and will read all the above comments with interest. Attempting to explain Q is far above my meager abilities, but I wanted to offer two things:
Firstly, I loved that you wrote, “I don’t think I already know the answers, and merely need better arguments to justify them. I’m genuinely uncertain and confused.” Yes, indeed. That’s the only legitimate mental state, I think. Certainty in this area concerns me.
Secondly, from what I can tell as a rank amateur, the real mysteries are: superposition, interference, and entanglement. Figure those out, and I think the rest of the dominoes fall. How can things be in multiple undefined states? How can matter interfere? (What’s doing the interfering?) What is entailed in the apparent non-locality of entanglement and wavefunction collapse?
The most crucial question in my mind concerns the Heisenberg Cut. Are cats (let alone worlds or universes) truly quantum? The classical world emerges from the quantum one; is the quantum one restricted solely to its own domain?
Comment #76 January 25th, 2022 at 2:18 pm
I am not qualified to even attempt to answer Q1, but let me ask an even more basic question.
You ask why God did not make the universe simply classical (a.k.a. Newtonian) instead of quantum-mechanical. But if we think of the universe as created by God, then the most intuitive picture of the world is not Newtonian, in the sense of being some state machine that progresses according to simple rules. Rather it is a “teleological” or Aristotelian universe, in the sense of being guided by a final cause, with humanity being of course at the center of it.
Indeed, for much of history that was the default assumption, and it was very hard for people to move from this to the Newtonian universe as the new “baseline”. At this point, we have completed this transition, and the analogous “Q1” is no longer really asked. Today we view a Newtonian classical computer as the “null hypothesis” of what we expect the universe to look like.
So, I don’t really have an answer to your question Q1, but have a counter-question to you. Do you think that in a century or so, we will get so used to quantum mechanics, that we will no longer ask it?
This reminds me of the famous von Neumann quote: “Young man, in mathematics you don’t understand things. You just get used to them.”
Comment #77 January 25th, 2022 at 2:20 pm
“I’m aware, of course, of the dizzying array of central physical phenomena that rely on QM for their ultimate explanation. These phenomena range from the stability of matter itself, which depends on the Pauli exclusion principle; to the nuclear fusion that powers the sun, which depends on a quantum tunneling effect; to the discrete energy levels of electrons (and hence, the combinatorial nature of chemistry), which relies on electrons being waves of probability amplitude that can only circle nuclei an integer number of times if their crests are to meet their troughs. Important as they are, though, I don’t regard any of these phenomena as satisfying answers to Q[1] in themselves. The reason is simply that, in each case, it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”
Well… try doing that. I think you will find that these explanations fail. Like, the stability of matter one, for example. We thought for a long time we could get away with *macroscopic* stability, of like chairs and such, without needing to pull in the narrative of quantum mechanics, but no. You really need the fermion and the boson. I think this tack is more likely to be compelling than you think.
Actually, really needing fermions and bosons — really needing their statistical laws — makes Q1 a little closer to Q2.
Comment #78 January 25th, 2022 at 2:47 pm
A good review is John Baez’s “Struggles with the Continuum”.
Wolfram, for his part, thinks the universe is discrete, and he claims quantum mechanics emerges naturally from his formalism. Like everything Wolfram, take it with a grain of salt, and be prepared to abandon all familiar formalisms and start over with something new. Maybe there’s something to it, though, and maybe it addresses your Q (see his section titled “Why This Universe? The Relativity of Rules”, maybe?).
His answer to Q I think is basically that all consistent formal systems exist and define the “Ruliad”.
Comment #79 January 25th, 2022 at 2:52 pm
I am not qualified to answer, but I have some half-arsed opinions anyway.
Firstly, the question “why did God create the universe in this way”, while probably tongue-in-cheek, seems philosophically unsound to me. Absent reliable empirical evidence of said God, the concept adds no explanatory value. It simply pushes all the questions up one level without answering them. Why does this God exist? What is it made of? How does it work? It seems better to eliminate the unnecessary baggage and apply the questions to the universe directly.
Secondly, why so much discreteness instead of continuous classical mechanics? Because Zeno was right. Infinite smallness is just as paradox-prone as infinite size. (In his Dichotomy paradox, Zeno showed that an infinite sum can have a finite limit. The issue is how to reach that limit by a mechanical process in a continuous system.)
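The infinite-sum-with-finite-limit point mentioned above can be checked with exact arithmetic — a toy verification of the geometric series, nothing more:

```python
from fractions import Fraction

# Partial sums of 1/2 + 1/4 + 1/8 + ... : finite at every step,
# with limit exactly 1, yet never equal to 1 after finitely many terms.
partial = Fraction(0)
for n in range(1, 31):
    partial += Fraction(1, 2 ** n)
print(partial)   # 1073741823/1073741824
```

Exact rationals make the point cleanly: every partial sum falls short of 1 by exactly 1/2^n, and that gap shrinks toward zero without ever closing in finitely many mechanical steps.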
Thirdly, why is there fundamental randomness? A) Because some small randomness adds robustness. I have given the example of a game system before, which falls into a trap and can’t get out. With some randomness, there is always a chance of emerging from stability traps. B) More fundamentally, it seems conceivable to me that the universe was a random perturbation of nothingness into several sets of particles and energies (including negative ones) which don’t interact with each other. We only see the ones we are made of and interact with; or perhaps different sets of particles repel each other, and we and they migrated to opposite ends of the universe long ago. From a random beginning, why not some fundamental randomness?
Other than that, I go anthropic. I doubt if this universe is optimum for lots of life, but it allows us to exist. For a while.
Probably that is entirely wrong, but it lets me sleep at night without worrying about what the actual answers are. As my friend Mario says, “Hey, believe whatever you need to, to get yourself through the night.”
Thanks for asking.
Comment #80 January 25th, 2022 at 2:55 pm
“By my age, Einstein had completed general relativity, Turing had founded CS, won WWII, and proposed the Turing Test, and Galois, Ramanujan, and Ramsey had been dead for years.”
It’s good to have high aspirations, but there is such a thing as **too** high for anyone who is not Einstein, Turing or Ramanujan (which includes pretty much everyone except those three).
For anyone who is not, comparing one’s accomplishments to those individuals is just likely to lead to depression and possibly even despair.
And beyond the age of about 40, in most cases the age at which the comparison is made to people like Einstein is largely irrelevant, since if one has not done something similar by age 40, it is extremely unlikely that even another 50 years (or even 150) is going to make a difference.
Comment #81 January 25th, 2022 at 2:56 pm
If you haven’t seen it yet, here’s an interesting article discussing these questions (without anything in the way of an answer): “I’m still mystified by the Born rule”. Despite the title, it does talk about Q1 as well as Q2. See especially “§4: wtf magical reality fluid”. There are discussions in the comments that might answer pieces of the questions; I can’t really follow them, so I can’t tell. (Note: the article is written by someone who takes the Everett interpretation as a given.)
Comment #82 January 25th, 2022 at 3:05 pm
This answer is addressing Scott’s remark: ‘For now, I’ll simply remark that every axiomatic reconstruction of QM that I’ve seen, impressive though it was, has relied on one or more axioms that struck me as ‘weird’, in the sense that I’d have little trouble dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true’.
So my points are:
– Something has to be weird, in the sense of being far from classical intuition. But I agree that it might better be some physical idea than an abstract mathematical hypothesis.
– Also, it is probably a mistake to look for axioms from which one might get QM by deduction. Going from physics to mathematics requires an inductive step: guessing a mathematical formalism that can be justified by physical arguments, but not proven. Only once this formalism is present can one use it deductively, to derive consequences and ultimately to check agreement with experiments.
– In a series of papers (see below) we have given physical ‘axioms’ from which one can justify that probabilities are needed, and that (certain and repeatable) events in the required formalism must be associated with projectors, not with partitions of an ensemble as in classical probability theory. Moreover, orthogonal projectors are attributed to mutually exclusive events, and the same projector is attributed to mutually certain events.
– Given that, one can obtain unitary transforms from Uhlhorn’s theorem and Born’s rule from Gleason’s theorem; see the references below for details.
– Obviously there is no free lunch, and some strong initial physical hypotheses are needed. They also relate to a question of Scott’s: ‘whatever was going to come in to discretize or regularize the continuum, why shouldn’t it have been classical, like a cellular automaton?’ The answer: in addition to discretization, another crucial ingredient is needed, namely contextuality.
Classical discreteness exists (as in cellular automata), and classical contextuality exists (as in a poll where the answers depend on the ordering of the questions), but QM requires BOTH discreteness and contextuality, which we also call contextual quantization, to provide the physical ingredients quoted above.
– All this has been written up in a series of papers, most of them published; a broad-audience introduction can be found at https://arxiv.org/abs/2105.14448 (with more details in 2111.10758 and 2201.00371).
– Finally, I’m not sure this approach based on physical ideas will convince Scott but, after some thinking, it is my current best answer to his question. About God, I assume you know Laplace’s answer to Napoleon: “Sire, je n’ai pas eu besoin de cette hypothèse” (“Sire, I had no need of that hypothesis”).
Comment #83 January 25th, 2022 at 3:12 pm
I had thought that classical thermo had no sane way to get rid of the ultraviolet catastrophe. Therefore, quantum.
Comment #84 January 25th, 2022 at 3:17 pm
I love the questions you’re asking… Not sure I have anything brilliant to say but here goes…
I am reminded of something I read many years ago. I think it was an introductory issue of some theory of computing type journal that was being given away as sample swag at a physics conference.
In any case… the article in the journal that stuck with me explored concepts of physics-violating computers. For example, imagine you had a computer with the property that the clock frequency doubled after each step. The first time step after initialization would take, say, 1 second to complete. Since the infinite sum 1 + 1/2 + 1/4 + … = 2, this computer would execute an infinite number of steps in 2 seconds. It would let one evaluate whether or not any computer program terminates, i.e., solve the halting problem (https://en.wikipedia.org/wiki/Halting_problem). Of course, one could also brute-force any algorithmic encryption scheme, test any mathematical hypothesis such as Fermat’s Last Theorem, etc. in similar fashion.
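The arithmetic behind that accelerating machine is just the geometric series; a quick numerical check (of the series only — no physical computer can realize such a machine):

```python
# Each step of the hypothetical machine takes half as long as the previous one.
# The total time for all the steps converges: 1 + 1/2 + 1/4 + ... = 2.
total_time = sum(1 / 2**k for k in range(60))  # partial sum, already ~2 to machine precision
print(total_time)
```

So infinitely many steps fit into a finite 2 seconds — which is exactly where the physical trouble begins.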
Now, of course, this computer completely violates special relativity (among other things: probably also thermodynamic considerations related to irreversibility, energy consumption at the Landauer limit, and the resulting overheating) in the worst way. It requires superluminal communication once the clock period becomes shorter than the time for light to cross the length of the computing hardware, etc.
This is suggestive, to me, of things that the universe conceptually just could NOT sensibly do. If one could build such a computer, then almost any magical concept could be brute-forced immediately. Testing all possible non-trivial zeroes of the Riemann zeta function? No problem!
Similarly, I wonder whether a purely classical world without quantum mechanics could enable similarly ridiculous information storage or computation. If I can make the logical leap that no quantum means no atoms, and hence matter that is continuously divisible, then a finite volume of space can encode an arbitrarily large amount of information in the pattern of matter occupying it (e.g., make 3D voxels of matter interspersed with vacuum, like a porous sponge, to encode data, and make the voxels arbitrarily small). This violates, at the very least, the kind of black-hole information limit that we believe exists from QFT and/or GR, no?
Comment #85 January 25th, 2022 at 3:25 pm
I still find this conceptually confusing. It seems to me that if there could exist a ‘discrete’ classical theory that could reproduce all the phenomena in our Universe then we would be using that theory rather than QM. So my question then is what physical properties are actually dependent on unique QM effects? I don’t mean abstract results such as Bell’s Theorem. I mean what physical property could not exist if we kept only the fragment of QM that was consistent with a classical hidden variables theory?
Comment #86 January 25th, 2022 at 3:30 pm
Scott #33 left me completely flabbergasted. Do 25 adjustable parameters seem like too big a price to pay for explaining all of chemistry? Which features of chemistry do you think can be reproduced by cellular automata with fewer than 25 rules?
Comment #87 January 25th, 2022 at 3:34 pm
I’m not up on all the physics stuff, I’m afraid, as that’s way too highbrow for me. But it appears from the outside that this is a bit too much for the old human mind to get to grips with. I just think of the poor fly seeing the beautiful world through a window but not understanding why it can’t get through to it.
It may unfortunately all be too much for us too.
And that would be a very cruel god pulling that one on us.
No wonder we are succumbing to Ivy League nerdiness.
(even though I’m a bit of a fan of nerdiness, all said)
Comment #88 January 25th, 2022 at 3:41 pm
Tu said: “If I were God, and I were setting things up before the dawn of the universe, I would certainly want to ensure that any and all of its sentient inhabitants were guaranteed a minimal amount of confusion, frustration, and wonder upon inspecting my handiwork.”
Looks like God definitely has that one covered.
And, if the failure to unite QM and GR despite a century’s worth of effort by the smartest humans is any indication, it would certainly appear that God has actually achieved “maximal confusion”, even if she was only shooting for minimal. (At least that’s true for the “any… of its sentient inhabitants” part of the requirement. There may be, and undoubtedly are, some inhabitants of the universe who are much smarter than even the smartest among us humans, who may be only minimally confused.)
Comment #89 January 25th, 2022 at 4:16 pm
“Actually, you’ve made me realize that designing such a CA would be a phenomenal research project for anyone seeking to investigate Q1/Q2.”
Stephen Wolfram’s project. Isn’t that exactly his thing… answering Q1 by studying strong emergence from CA?
I so look forward to your essay/book. Thank you.
Comment #90 January 25th, 2022 at 4:52 pm
I have nothing to offer in the way of physics. But I do have two pieces of advice to keep in mind in answering your questions:
Comment #91 January 25th, 2022 at 4:54 pm
Q1: It could have, but it would have been necessarily boring (deterministic) and non-relativistic (UV catastrophe, radiation reaction).
Q2: This specific alternative (unitary algebras with propositions as projectors, and the Born rule to handle complementarity) is the most economical representation of a working logical framework that replaces subsets of a set as the way to define physical propositions about things. The replacement itself is necessary for empirical reasons, owing to the physical impossibility of checking the truth value of some propositions classically constructed using the connective AND.
Comment #92 January 25th, 2022 at 6:00 pm
I scanned the comments and didn’t see this mentioned. Since this is likely to be the start of a bigger thing, I hope this comment is useful.
Why the constraint in the paragraph starting ‘Relatedly, whatever “design goal” you propose for the laws of physics, if the goal is satisfied by QM, but satisfied even better by theories that provide even more power than QM does…?’ Isn’t the least powerful theory that satisfies the goal the most elegant? Wouldn’t those other theories imply things that are either not desired or, dare I say it, simply not practical within the budget “God” was willing to spend on this project?
Perhaps I do not understand the hierarchy of theories being explicitly and implicitly referenced here.
-Z-
Comment #93 January 25th, 2022 at 6:03 pm
Maybe someone can help me with this which is tangentially related to one of Scott’s points:
In one of the two original papers where Schrödinger first quantitatively analysed issues of entanglement (either the cat paper or maybe the steering one), he threw out a comment along the lines of “well, perhaps if two systems are entangled there is an extremely rapid dephasing whenever we spatially separate them, so all of this weirdness will go away”. (Paraphrasing from memory; I don’t have copies.)
This would throw some new parameter into the mix to govern just how rapidly such decoherence occurs. Back then experimental evidence for long range entanglement was presumably non-existent. Note that in this “fundamental dephasing” theory we would be back to an effectively local and separable theory, although one with a weird ontology (the “real states” would still be vectors in a Hilbert space!).
What I’m curious about is which of the many successes of quantum theory would *not* have been possible in that theory? Did we really have to wait until the 70’s before we could rule that theory out, or are there phenomena which indirectly relied on long enough range entanglement that it was not viable much earlier?
Comment #94 January 25th, 2022 at 6:03 pm
Scott #29:
>No, to me it sounds more like simply a listing of all the different possibilities, or a probability distribution over the possibilities, or some other way to rank them in prominence and/or organize them. Why jump immediately to QM, with the interference and the complex numbers and whatnot?
I think some of the following are fairly intuitive: 1) If the quotient space is continuous, as you would expect given a continuously infinite distribution of possible worlds, then the world as you experience it will be made up of waves, with the dynamics ruled by smooth linear operators. 2) A natural product of wave-like dynamics is orthogonalization leading to superexponential bifurcation. 3) If this is to be at all cognizable, the bifurcation needs to be controlled (read: you only experience the slice of the quotient space where it’s controlled). So we should expect this space to be discretized, with something like quantization of states. One way (perhaps the only way) to discretize waves is through interference effects. Interference seems really weird when you think of particles as ontologically fundamental, but if you instead think of the waves as fundamental, this effect is not strange but quite natural. 4) You should expect dynamical evolution to be continuous, and in particular that transitions between states should be continuous. Lucien Hardy has shown that this assumption is enough to motivate representing quantum states with complex numbers.
Comment #95 January 25th, 2022 at 6:10 pm
Zalman Stern #91:
Isn’t the least powerful theory that satisfies the goal the most elegant? Wouldn’t those other theories imply things that are either not desired or, dare I say it, simply not practical within the budget “God” was willing to spend on this project?
Please reread the passage … that’s exactly the point I was making! 🙂
Comment #96 January 25th, 2022 at 6:13 pm
Thought about this for a while, couldn’t figure out where to start from. Best I can say is that the question is framed in anthropomorphic / theistic terms, and while I get a vague sense that that is only supposed to be a metaphor for the “real question”, with such a deep subject (trying to make educated guesses about the abstract space of possible physical laws, no less), I don’t find it at all clear what the actual translation of your question would be, devoid of metaphors of creation and goals.
I’m sure you’re getting plenty of good answers that go into the the actual physics much more than I could, so my answer is actually a request back: is it clear to yourself what the non-metaphorical question you’re asking really is?
Comment #97 January 25th, 2022 at 6:32 pm
By the Osterwalder-Schrader theorem, a quantum field theory is equivalent to a boring statistical mechanical system in Euclidean space under Wick rotation and analytic continuation. The symmetries of Euclidean space are self-evident enough to have been considered axiomatic for thousands of years.
Comment #98 January 25th, 2022 at 6:34 pm
Partially echoing #59. To me, most of the curiosity around the question “why QM?” can be reduced to the curiosity around the question: why, in a quantum mechanical universe, do we find ourselves existing as classical life rather than quantum life?
Comment #99 January 25th, 2022 at 7:00 pm
I was busy with teaching all afternoon, and in the meantime, there have probably been way too many comments for me to answer each one individually. Which is great! But let me address a couple points that keep cropping up over and over.
Most importantly, people keep wanting to justify QM by reminding me about specific difficulties with the classical physics of the 19th century: for example, the ultraviolet catastrophe. To clarify, I never had any quarrel with the claim that, starting with 19th-century physics (especially electromagnetism), QM provided the only sensible completion.
But, to say it one more time, what would’ve been wrong with a totally different starting point—let’s say, a classical cellular automaton? Sure, it wouldn’t lead to our physics, but it would lead to some physics that was computationally universal and presumably able to support complex life (at least, until I see a good argument otherwise).
Which brings me to Stephen Wolfram, who several commenters already brought up. As I’ve been saying since 2002 (!!), Wolfram’s entire program for physics is doomed, precisely because it starts out by ignoring quantum mechanics, to the point where it can’t even reproduce violations of the Bell inequality. Then, after he notices the problem, Wolfram grafts little bits and pieces of QM onto his classical CA-like picture in a wholly inadequate and unconvincing way, never actually going so far as to define a Hilbert space or the operators on it.
Even so, you could call me a “Wolframian” in the following limited sense, and in that sense only: I view it as a central task for physics to explain why Wolfram turns out to be wrong! The sorts of models that Wolfram hawks really do seem like the obvious first options on the whiteboard of possibilities for our universe. They’re just not the options that were realized.
Relatedly, several commenters took issue with my claim that it would be “child’s play” to design classical laws of physics that would give rise to stable matter and a combinatorial explosion of possible chemicals. If you interpret “chemistry” to mean “the actual chemistry of our universe,” then this is indeed far from obvious. In some sense, a large part of the field of chemistry is all about designing classical heuristics to reproduce as much of our universe’s actual (quantum) chemistry as possible, but even the best known heuristics don’t always get it right!
Again, though, I take a much broader view. By “chemistry,” I mean any rule-based system for sticking together movable building blocks of various types, in a space of some dimension, to produce an exponentially-large variety of stable, cohesive substances—substances whose properties are determined both by the individual building blocks and by how they’re linked. “Chemistry,” in this sense, could start from “atoms” that were completely different from our atoms. It could nevertheless be perfectly adequate as a basis for complex life, albeit radically different from our life.
The research project that I expressed excitement about was the following:
Construct an explicit example of a classical cellular automaton—the simpler the better—that gives rise to a “chemistry” in the above sense. The rules of the “chemistry”—i.e., the types of possible atoms and how the atoms can be linked—should be emergent from the underlying rules of the CA, just like they are in our universe, rather than explicitly encoded into the CA rules.
Let me stick my neck out and conjecture that the above can not only be done, but done using rules that are manifestly “simpler” than (say) a full specification of the Standard Model.
If I’m wrong about this conjecture, then so much the better, as I agree that we’d then have a satisfying solution to my Q1! 🙂
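As a baseline for what “an explicit classical CA” means here, consider Conway’s Game of Life — computationally universal, yet with no known emergent “chemistry” in the above sense. A minimal sketch (a toy illustration of the kind of object the conjecture is about, not the conjectured CA itself):

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a toroidal (wraparound) grid."""
    # Count the eight neighbors of every cell via wraparound shifts.
    neighbors = sum(
        np.roll(np.roll(grid, di, axis=0), dj, axis=1)
        for di in (-1, 0, 1) for dj in (-1, 0, 1)
        if (di, dj) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbors,
    # or if it is alive now and has exactly 2.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# A "blinker", the simplest oscillator: a vertical bar of 3 cells
# becomes horizontal after one step and returns after two.
grid = np.zeros((5, 5), dtype=int)
grid[1:4, 2] = 1
assert np.array_equal(life_step(life_step(grid)), grid)
```

The research project above asks for a CA in this family whose rules give rise, emergently, to stable composable “atoms” — something Life’s gliders and still lifes arguably gesture at but do not deliver.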
Comment #100 January 25th, 2022 at 7:04 pm
Zf42tf, Wick rotation = imaginary time, right? So there are those complex numbers again. Ed Nelson’s stochastic mechanics are also interesting but don’t get all the way there.
Comment #101 January 25th, 2022 at 7:19 pm
entirelyuseless #46:
Since no one has yet answered this (as far as I know), it could simply be that Conway’s Game of Life (or any other classical universe), to an instantiated observer, would look like quantum mechanics or something very like it.
Sorry, no, it wouldn’t. 🙂
We know this because a classical CA couldn’t reproduce violations of the Bell inequality, quantum supremacy experiments, or dozens of other experimentally verified quantum effects. These effects are so important precisely because they refute the idea that the world is secretly classical, with QM merely an artifact of our limited perspective as observers. Einstein, as well as thousands of lesser minds, believed that hypothesis, so it has a distinguished pedigree, but it has by now been refuted as soundly as anything in the history of science.
There is a loophole, but I’d say it’s so extreme as to prove the rule. Namely, one can’t rule out that someone used, e.g., a giant Game of Life board to create a “Matrix” that’s running our entire quantum-mechanical universe as a computer simulation! To me, though, this just seems like an instance of a more general point: namely, that nothing in physics can rule out the possibility that the whole observed universe is a simulation, an illusion, or a lie. (The idea of “superdeterminism” goes to this same extreme, even though it strenuously denies doing so.)
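For readers who want to see the Bell/CHSH gap concretely: for a singlet pair, the textbook quantum correlation between spin measurements along directions a and b is E(a,b) = −cos(a−b), and the standard angle choices push the CHSH combination to 2√2 in magnitude, beating the classical (local hidden variable) bound of 2. A numerical illustration of those standard formulas, nothing more:

```python
import math

def E(a, b):
    """Quantum singlet-state correlation for measurement angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices: Alice uses a, a2; Bob uses b, b2.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# The CHSH combination; any local hidden variable theory must keep |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
assert abs(S) > 2  # quantum value: |S| = 2*sqrt(2)
```

Any classical CA, like any other local hidden variable model, is stuck at |S| ≤ 2.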
Comment #102 January 25th, 2022 at 7:25 pm
Augustin Vanrietvelde #47: Ah, thanks for clarifying! In that case, though, my followup question is this: why is it important even to have a notion of “causality,” at all, for reversible systems that lack any thermodynamic Arrow of Time? Why not say that these systems have dynamics, sure, but the whole concept of “causation” is tied up with irreversibility?
Comment #103 January 25th, 2022 at 7:32 pm
Re #98: “If I’m wrong about this conjecture, then so much the better, as I agree that we’d then have a satisfying solution to my Q1! ”
I don’t understand this at all. Why require that the laws yield anything like chemistry? There is no a priori requirement that the universe be complex at all, whether at the fundamental scale or at emergent scale. Again: no God was involved trying to accomplish anything. If there is only one universe, and it has the simplest laws that yield emergent stable complexity, that doesn’t answer any puzzle, since there is nothing mandating that such complexity emerge.
Comment #104 January 25th, 2022 at 7:39 pm
mjgeddes #48:
You said in (4) you want to assume that current QM is exactly correct, but if that were true, I really don’t think there can be any answers for your questions. I mean some of the brightest minds have thought about quantum foundations for decades, so surely it’s unlikely that anything has been missed?
I mean, some of the brightest minds had to think about Fermat’s Last Theorem for 350 years before any of them proved it! 🙂
More to the point, though, let me give you the years of various developments that have played non-negligible roles in my own thoughts about the foundations of QM:
Quantum teleportation – 1993
Shor’s algorithm – 1994
Grover’s algorithm – 1996
Theory of quantum error-correction and fault tolerance – 1996±O(1)
Discovery of the cosmological constant – 1998
Lucien Hardy’s reconstruction of QM – 2001
PBR Theorem – 2011
Firewall paradox – 2012
Harlow-Hayden argument – 2013
ER=EPR – 2013
Quantum circuit complexity as wormhole volume – 2014
AdS/CFT as a quantum error-correcting code – 2014
Chiribella et al.’s reconstruction of QM – 2015
Black hole unitarity from Euclidean path integrals – 2019
MIP*=RE – 2020
So, maybe there’s nothing of any further interest to be discovered, but that doesn’t seem like the way to bet…
Comment #105 January 25th, 2022 at 7:40 pm
Scott,
I’m aware you can’t reproduce violations of Bell’s inequality with cellular automata, but that is only if you assume that there is a clear mapping between the cellular automaton and the things we experience (which is what you are pointing out with your example of the simulation).
But I am saying we know for a fact that there cannot be a clear mapping like that. Because it is impossible for an observer to have a full model of themselves. So if the world is a CA, it is not one that looks like one; your experience might be a composite of pieces of that CA located in various places and times on the board, not one localized cluster.
Given that kind of uncertainty, I don’t see how Bell’s theorem would rule out a CA like that.
Comment #106 January 25th, 2022 at 7:51 pm
Ernie Davis #49:
Presumably, Aristotle or Francis Bacon or Newton couldn’t have inferred QM just by thinking about the world of ordinary experience and applying Occam’s razor; you have to do a bunch of carefully designed experiments with fairly sophisticated equipment. If that’s true, there could have been a world that was exactly the same as far as ordinary experience is concerned, but worked on entirely different underlying principles.
As we’ve discussed elsewhere in the thread, this is far from obvious, and extremely interesting either way!
Maybe QM provides the only acceptably simple explanation of experimental facts that were already well-known to Newton—including facts like the stability of matter and the existence of the sun, which Newton never even purported to explain. If so, one could make a strong case that Q1 has its answer right there.
On the other hand, maybe there’s a reasonable classical model for all the facts known to Newton, with QM needed only for strange new experimental facts that were discovered in the 19th and early 20th centuries. If so, then that would arguably heighten the scientific urgency of Q1 even more!
In summary, I hold that Q1 is either answerable or interesting (hopefully both!) 😀
Comment #107 January 25th, 2022 at 8:03 pm
Ernie Davis #56: I agree that it would’ve taken a huge leap of imagination either to
(1) discover QM as a possible mathematical structure without any need for experiments, or
(2) realize the applicability of that structure to the real world from the experimental facts available by, say, 1870.
Even if either of these could’ve been done in principle, perhaps they would’ve required superhuman intelligence. But I made the case in chapter 9 of Quantum Computing Since Democritus for why (1), at least, would’ve been far from impossible with hindsight!
The examples of Riemann and Einstein, of course, stand forever as counterexamples to those who find it “wildly farfetched” that pure thought, aided by at most one or two scraps of data, could ever uncover the correct mathematical structures that underpin the real world…
Comment #108 January 25th, 2022 at 8:05 pm
Ernie Davis #73:
Are you looking for an answer A to the question “Why?” such that, if A had occurred, or had been presented, to a 19th century physicist, purely as an abstract argument with no empirical evidence, they would have said, “Wow! Maybe I should try to work out how such a theory would work”?
Yup! That’s the least of what a satisfying answer to Q ought to do.
Comment #109 January 25th, 2022 at 8:11 pm
Though not religious/theistic myself, I do recall an interesting footnote in Griffiths’s QM, where he emphasizes that “not even God knows” the outcome of a quantum experiment. Were I the Abrahamic god, dead-set on granting *true* free will to humanity, then I would purposefully eliminate my foreknowledge of this specific universe’s evolution to do it, and relying on the nondeterminism of QM to accomplish that task would be a pretty good choice.
But as for a deeper mathematical constraint? I really do fear that excluding the “stability of atoms + anthropic principle” may leave you with no answer. It’s boring, and it breaks the rules you set out in the original post, but I have a feeling there’s no answer beyond it.
And maybe there are just some uncomfortable axioms we need to assert and just swallow that pill. The universe needs some (hopefully finite!) number of axioms, so one of them may as well just be the Born rule and be done with it—don’t hate me, Mateus 😀
Comment #110 January 25th, 2022 at 8:14 pm
Donald #55:
Start with the Komolgorov complexity prior over the space of computer programs.
Around 1/n th of this space has run time BB(n).
Isn’t it a ~1/exp(n) rather than ~1/n fraction?
More relevantly, you wrote a lot about how the discovery of even shorter computer programs to simulate the observed world would merely strengthen your Kolmogorov hypothesis even further—but then what possible evidence would falsify, or at least weaken, your hypothesis?
Comment #111 January 25th, 2022 at 8:16 pm
Timothy Chow #58:
Weirdness Hypothesis. In any quantum-mechanical universe, quantum mechanics will seem weird to its inhabitants.
I don’t really believe the Weirdness Hypothesis, but it strikes me as being difficult to either prove or disprove. If your project succeeds, then presumably it will disprove the Weirdness Hypothesis. So this puts a lower bound on how hard your project is.
Yup, that’s the goal! Namely, to make QM seem just as ordinary, pedestrian, and non-weird as general relativity. 😀
Comment #112 January 25th, 2022 at 8:20 pm
Riffing off of Oleg Eterevsky #6 and Oleg S. #50: Could the computational power of different candidate laws of physics be the thing that determines which laws get instantiated in “the real world”?
I agree with you that it seems plausible that some form of classical physics could support structures that are complex enough to support intelligent life. But there may be a subtle reason why this isn’t actually the case.
This seems like a vaguely plausible foundational premise: maybe any candidate set of laws of physics can only lead to phenomena simple enough that those phenomena could in principle be efficiently qualitatively simulated by a hypothetical computer that obeys those same laws of physics. After all, it seems plausible (although far from certain) that quantum computers could efficiently simulate all realizable physical processes in our universe. And the extended Church-Turing thesis would probably actually be true if we lived in a completely classical universe. You rightfully bang your head against the wall when people incorrectly say that the Sycamore computer’s task was to “simulate itself”, but in a looser sense you can think of any physical process as “a computer efficiently simulating itself”, and so the complexity of realizable physical processes might be bounded by the computational resources of a hypothetical in-universe computer trying to simulate it.
(To be clear, I’m definitely not suggesting that this or any other universe actually is a literal computer simulation constructed by any external intelligence. I’m just saying that that may be a useful thought experiment for bounding the complexity of physically realizable processes.)
Anyway, it isn’t clear to me that a classical Turing machine would necessarily be powerful enough to efficiently simulate processes complex enough to support intelligent life, i.e. powerful enough to “run the math in real time”. But your mileage may vary; maybe it seems obvious to you that a classical computer would be powerful enough.
Finally, another vaguely reasonable-sounding foundational premise is that the laws of physics generate the least powerful computational system that’s still powerful enough to produce processes complex enough for us to observe them. That’s why we don’t have laws of physics that allow superluminal signalling, computers that can efficiently solve NP-complete problems, etc.
Comment #113 January 25th, 2022 at 8:22 pm
To expand on my comment, there is one question you can ask: assume our civilization never developed the advanced experimental science needed to discover quantum mechanics, but did have lots of people working on math. How would those mathematicians have been led to invent something like quantum mechanics? Clearly, they would have invented matrices; after all, in the real world, those were discovered before quantum mechanics. From studying differential equations, it is natural that they would have generalized matrices to various kinds of operator algebras. So, the only key thing missing to invent quantum mechanics is the notion of tensor product. What route would have led them to that? I think either applications in topology, or studying classical statistical mechanics and then naturally generalizing it. Are there any other routes?
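For concreteness, the tensor product in question is, at the level of matrices, just the Kronecker product, which composes two subsystems into one larger one. A standard-textbook sketch using numpy:

```python
import numpy as np

# Single-qubit operators: identity and the Pauli-X (bit flip).
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])

# Tensor (Kronecker) product: "flip the first qubit, leave the second alone".
# Two 2x2 operators combine into one 4x4 operator on the joint system.
XI = np.kron(X, I)
print(XI.shape)  # (4, 4)

# States compose the same way; dimensions multiply rather than add,
# which is where the exponential size of quantum state spaces comes from.
zero = np.array([1, 0])
two_qubit_state = np.kron(zero, zero)  # the |00> state, a 4-dimensional vector
```

The mathematical ingredient is elementary; the question above is what, absent experiments, would have pushed mathematicians to treat it as the composition rule for physical systems.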
Comment #114 January 25th, 2022 at 8:22 pm
Yoni #61: To make a long story short, in QM, two “nearby” branches of the wavefunction can reconverge, in which case interference happens, which is the whole way we know QM is true in the first place! By contrast, for Second Law of Thermodynamics reasons, we don’t expect “faraway” branches to recombine between now and the heat death of the universe. Except possibly via heroic technological efforts, well beyond those of our current civilization—and even then, some people might argue that the very fact that two branches were ultimately recombined, proves that they were never “really” separate worlds in the first place. I’m not sure how any of that affects your thinking about this.
Comment #115 January 25th, 2022 at 8:26 pm
Isaac Grosof #64: Thank you for another extremely interesting argument that’s novel to me, at least in the way you formulated it!
Regarding anisotropy, though, if Lorentz invariance were to break down only at the Planck scale, with a quantum theory of gravity explaining why it’s recovered to excellent approximation at all larger scales, would that be totally fine with you, even aesthetically?
Comment #116 January 25th, 2022 at 8:30 pm
Ivo #68:
It’s possible the first question is not even allowed, as in, it may not make sense to ask. The world appears to be fundamentally QM. Isn’t asking why it could not be classical the same as asking why 3 could not be 5? Or why this Lego figurine could not have been made from carrots? It would not be the same figurine, and likewise, a ‘classical’ world would not be our world!
I don’t think physics would get very far if that attitude were generalized! 😀
“Daddy, why is the moon round?”
“Simple: because if the moon were square, then this wouldn’t be our world, but a different world—namely, a world with a square moon!”
Comment #117 January 25th, 2022 at 8:33 pm
Alex K #69:
Is the distinction between a classical and a quantum universe significant?
I’d say so, yes!
I mean, it’s true that a classical universe could be simulated by a quantum Turing machine, and a quantum universe by a classical Turing machine. But I’m not asking to explain the nature of a hypothetical simulating meta-universe—only the actual universe of our experience. I.e., for the purposes of this question, I’ve swallowed the Blue Pill. 😀
Comment #118 January 25th, 2022 at 8:54 pm
Scott,
Your essay/book will be eagerly anticipated!
This reply is made without first reading anyone else’s replies above to avoid being pulled off track from my first thoughts. And apologies that this is so long … but you did ask a rather big question and did not give us a limit.
****************************************************************************
First, here are my answers when asked these questions by the tenure committee …
Q2:
Why C? Obviously, the complex numbers are algebraically closed and complete. And then state vectors, if anything, should be “distinguishable” (orthogonality). And if we want to be able to take pairs of state vectors and get a complex number (given multiplication and phase operators), then there’s an inner product space … and now it’s a complex Hilbert space (finite-dimensional case assumed …)
Why unitary transformations? Linearity, and not breaking the complex Hilbert space structure chosen above.
Why the Born rule? Gleason’s theorem, again a consequence of choosing a complex Hilbert space.
Why the tensor product? To combine the complex Hilbert state spaces.
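To make the last two answers concrete, here is a toy numpy sketch (my own illustration, not part of the postulates themselves): the tensor product combines two state spaces, the Born rule reads probabilities off squared amplitudes, and a unitary preserves the normalization.

```python
import numpy as np

# Two single-qubit states, i.e. unit vectors in the complex Hilbert space C^2.
psi = np.array([1, 1j]) / np.sqrt(2)
phi = np.array([np.sqrt(0.3), np.sqrt(0.7)])

# Tensor product combines the state spaces: C^2 (x) C^2 = C^4.
joint = np.kron(psi, phi)

# Born rule: outcome probabilities are squared amplitudes, summing to 1.
probs = np.abs(joint) ** 2
print(round(probs.sum(), 10))                  # -> 1.0

# A unitary acting on the first qubit preserves the norm, hence normalization.
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
U4 = np.kron(U, np.eye(2))
print(round(np.linalg.norm(U4 @ joint), 10))   # -> 1.0
```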
Q1:
Well, if God chose to make everything out of complex amplitudes on the first day, then God would choose state vectors on the second day by distinguishing states. Then on the third day, God creates the inner product, sees Hilbert space, and declares it good. On the fourth day, God uses the Born rule to know all probabilities. Because God does not wish to destroy creation, God only allows unitary operators on the fifth day. Then tensor products allowed God to combine the state spaces on the sixth day. Finally, on the seventh day, God rested and watched the universal wave function evolve.
Q: Why should the universe have been quantum-mechanical?
If the universe is made of complex amplitudes, or God made the universe out of complex amplitudes, then all else pretty much follows.
***************************************************************************
And then here’s what I would NEVER say to the tenure committee …
Since you brought up God … I’ll quote Isaiah 55:9
“As the heavens are higher than the earth,
so are my ways higher than your ways
and my thoughts than your thoughts.”
And then I’ll share this recollection. I was very fortunate to catch a lecture by Vaughan Jones before his untimely passing in 2020 … from an ear infection of all things … speaking of mortality. At the end of his talk, he graciously offered to hang around afterward for a bit and entertain any questions anyone might like to come up and ask him. Quite a few did take him up on that, and he spent a while addressing different mathematical questions … and a few about kitesurfing … I had sat in the back, so I found myself last in line to meet him. I tried to be funny with my first question, asking if he thought the best use for quantum computers will be to use the AJL algorithm to untie quantum knots. He obliged me with a chuckle. Then I asked if he thought the nature of our neural model of computation might be such that we have an upper bound on what we can know about the universe, because our theories will always be produced within the arena of whatever our neural model of computation might be. He looked at me wide-eyed for a long moment, never losing his friendly smile. Then he squinted and said, “I think that the universe is even more incredible than we can possibly imagine.” We both nodded our heads for a few seconds, me smiling dumbly, having no good follow-up, and then I said, “Thank you, I enjoyed the talk” and left. I don’t want to try to unfairly “appropriate” his words for my own means – especially since they weren’t published. But I hope I can use them in the spirit of acknowledging how he motivated me to some further thinking, in the way he emphasized that the universe could be more incredible than we can “possibly imagine”. That led me to question whether our human thinking is subject to the laws of complexity classes and computability theory just as it might be subject to laws of physics … and how that might have consequences.
So, I would first question the questions. And I would do that by reminding you of your statement that “quantum mechanics is not about physics.” It is not about anything. The name itself is misleading. This isn’t the fault of physicists but just the way history played out. It should be called something like, “The Interfering Amplitudes Probability Model Builder”. (Catchy right?) Because it isn’t even itself a theory but, again as you said, more like an operating system or a programming language made for constructing theories about the world. It is an arena for building models of our experiences as amplitudes.
I don’t see then that we can say the universe is quantum mechanical. Quantum mechanics is not a thing but a means or language. What we have found is that the “Interfering Amplitudes Probability Model Builder” is what always seems to work if we want to build models of the universe – by which we always mean our experiences of the universe.
I say all of that to say … we should consider the possibility that the answer to “Why should the universe have been quantum mechanical?” may be because we are at the tail end of a long history of painting ourselves into a corner … or being painted into a corner.
Could mathematical physics itself have painted us into a corner? This is Wigner’s “unreasonable effectiveness.” In other words, sometimes math just seems … “too good to be true.” So … should we … you know … trust it? I know. I know. What else are we to do? I don’t (of course) have an answer to that. But this is one of the possibilities that sends shudders up my spine … What if math has done this wonderful job of painting the floor of our experiences … and when we ask it repeatedly to “give us a way to create models with which we can make predictions about what is going to happen,” it leads us invariably to complex numbers and the quantum postulates outlined above? Or maybe we just happened to head off down into one particular valley in a wider mathematical landscape, and we can’t “climb back up” the way we came.
Why should the universe have been quantum mechanical?
Because we asked math (or the math we know) for the answer.
Or maybe it isn’t just the narrow discipline of mathematical thinking that is painting us into a corner. Another possibility that gives me shudders is maybe it is the very nature of our thinking itself. This idea, of course, goes back at least to Plato’s Cave. The obvious depressing problem with Plato’s allegory is that the prisoner who escapes the cave to see the true fire … may still only be looking at an illusion of himself escaping a cave and seeing a “true” fire … Thus, even if we think we have understanding, we may yet still be prisoners … within what may be our own projection. So … what if the “Interfering Amplitudes Probability Model Builder” is the computational model our brains use for modeling our experiences as probabilities? If it uses the model itself to generate its internal simulation of the external world … then every test we could apply to our experiences … would be a test applied within the framework of the model itself and so … it would always pass consistency checks with the model. Or would it? This is what I was wondering in the question posed to Vaughan Jones. It is an old philosophical question, of course, so not original. But I was trying to formulate it in something more like a computational complexity approach … asking if our cognitive model of computation might put something like an upper bound on the nature of predictive models we are even able to construct when we set about constructing them within the discipline of mathematical physics. If there were such a limit imposed by our cognitive faculties then the answer to Q would be for us to … look in the mirror. And I don’t mean anything “quantum brainy” about this. (After all quantum theory isn’t “about physics” …) I just mean that the current neuroscientific consensus is that our brains evolved to be a “prediction engine” and according to the neuroscientists all the brain does is generate “probably” what we will see or experience and then we take these “probabilities” to be reality.
Why should the universe have been quantum mechanical?
Because we understand it with our brains.
Again, I take “Why should the universe have been quantum mechanical?” to better be phrased as the question, “Why should models of the universe always need to be constructed using the “Interfering Amplitudes Probability Model Builder”?”
When we consider why “the universe is quantum mechanical” … it seems we should admit the possibility that we could be under that impression because of accidents of mathematical history or accidents of evolutionary cognitive development. So, perhaps there is computational complexity work to do here … to “prove” that we can trust math (that we are not trapped in an arbitrary mathematical valley – an island in theoryspace as someone once called it …) … and trust our own cognitive simulation of reality (that our own neural model of computation does not have an upper bound on how it represents reality to us in a way that fools us into thinking that the model itself is the model of reality).
Comment #119 January 25th, 2022 at 9:00 pm
Tim Maudlin #70:
1) I hope my other comments on this thread have clarified my position. Everything you mentioned that we “don’t know to be true” is indeed something for which I want to know whether it is true! More broadly, though, my interest is not restricted to laws that generate worlds that macroscopically resemble ours. One can write down millions of classical cellular automaton rules that, when run, will generate worlds that are clearly not our world, but are rich, complicated, Turing-universal, and seemingly able to support life and intelligence just as interesting as ours. I want to know: is this appearance illusory? If yes, why? If no, then can you articulate any reason that would render it less surprising to me that we don’t find ourselves in any of those classical worlds, and instead find ourselves in this quantum-mechanical one?
2) Again, even if there’s only one simple, rationally satisfying TOE to describe our worlds, would you agree that there would presumably be thousands of TOEs to describe other worlds—TOEs that are equally simple to write down, and that would equally rationally satisfy any inhabitants of those other worlds? If so, then doesn’t an enormous question remain, of “why do we live in this world as opposed to the other ones?”
And yes, maybe the question has no satisfying answer—for example, because only our world exists and that’s that, or because, as Max Tegmark holds, all the worlds exist, and we happen to find ourselves in this world and that’s that.
OK, but what if the classical worlds weren’t as conducive to life, or intelligence, or consciousness, or whatever, as one might’ve thought they were? Or what if they could only be made so at the cost of making them vastly more complicated than our world (of course, as you said, at the level of the fundamental laws rather than of emergent behavior)? As long as such things remain live possibilities, it seems to me that we haven’t ruled out that my Q1 has a straightforward correct answer, which would then be hugely important to know.
3) We’re never going to agree about this, but even if a genie told me that Bohmian mechanics was true, my first instinct would probably be to forget about the hidden variables as epiphenomena with (by construction) no observable consequences, and just go back to doing standard QM. In which case, I certainly would wonder why the world was so contrived as to force me, in practice even if not in principle, to use this particular generalization of the probability calculus to Hermitian, trace-1 positive semidefinite complex matrices!
Comment #120 January 25th, 2022 at 9:04 pm
Tim Maudlin #102:
There is no a priori requirement that the universe be complex at all, whether at the fundamental scale or at emergent scale.
There’s an anthropic requirement. We wouldn’t be here to debate this in a world with no macroscopic complexity. And if there were a trillion worlds, and only one of them had any macroscopic complexity, we’d necessarily find ourselves in that one if in any of them.
Comment #121 January 25th, 2022 at 9:26 pm
Alessandro Strumia #72:
Possible answer. To get big stuff out of small stuff (aka “decoupling”), a local theory must avoid equipartition of energy (otherwise energy goes into many small modes). Inventing some classical-like dynamics that avoids equipartition seems to me nontrivial. QM achieves this by replacing small energy with small probability. Possibly there is no other road, and once you follow it, you arrive at QM.
The part you call “nontrivial” is precisely the part I’m wondering about! Why not just invent a classical theory with a lower limit on the energies of the allowed modes? Note that, in classical cellular automata like Conway’s Game of Life, it’s far from obvious that there’s any notion corresponding to “energy,” but if there were such a notion, then certainly there would be a minimum nonzero allowed energy, namely the energy of a single live square or whatever.
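To make the Life example concrete, here is a minimal numpy sketch (my illustration, with live-cell count standing in for “energy” purely for the sake of argument): the update rule fits in a few lines, and the smallest nonzero “energy” any pattern can carry is one live cell.

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a toroidal grid."""
    # Count the 8 neighbors of every cell via periodic shifts.
    nbrs = sum(np.roll(np.roll(grid, i, axis=0), j, axis=1)
               for i in (-1, 0, 1) for j in (-1, 0, 1)
               if (i, j) != (0, 0))
    # A cell lives next step iff it has 3 neighbors, or 2 and is already alive.
    return ((nbrs == 3) | ((nbrs == 2) & (grid == 1))).astype(int)

# Take live-cell count as a stand-in for "energy": a blinker oscillates while
# keeping it fixed at 3, and the minimum nonzero value any configuration can
# have is 1 (a lone live cell, which then immediately dies).
grid = np.zeros((8, 8), dtype=int)
grid[3, 2:5] = 1                 # horizontal blinker
nxt = life_step(grid)            # becomes a vertical blinker
print(grid.sum(), nxt.sum())     # -> 3 3
```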
Comment #122 January 25th, 2022 at 9:29 pm
Look, I don’t really believe this but it’s an interesting line of thought.
Suppose you are some kind of dualist (in the sense of us having something like souls, or being in something like the Matrix) and you want rules for the physical world that don’t expose this fact. In other words, you want our choices to be governed by the operation of something outside the physical world, but you want the physical world to appear causally closed even when you look at neurons under the microscope.
QM gives you a really nice way to do this as you can simply evolve the wave function forward globally and then choose the branch you want to take globally based on which best matches the desires of the non-physical ‘souls’ or individuals externally placed into the simulation or whatever.
However, as I said, I don’t really believe this. What I do really believe is that you can’t hope to recover an account of why we experience the world the way we do (it feels as if collapse is real) merely from decoherence-type accounts. Why? Because even if you can show that, with respect to a certain basis, you can describe the world as (in some limit, or ignoring parts of small enough measure) the sum of a bunch of classical-seeming branches, that’s not enough. That’s because, if QM is the fundamental theory, you can’t give special status to being classical. I mean, if you don’t care about there being some kind of principled breakdown, you could rewrite the overall wave function as the sum of a bunch of components that have parts that look like whatever Turing machines you want, simply by ensuring that you have enough freedom in the way you represent those machines that you can choose a representation that decreases the distance between your sum and the vector in the Hilbert space you are approximating (you can just pick your components so that their contribution to the overall sum has some maximum at a finite time and decays really quickly away from there, so that by multiplying them by the right choice of coefficients you can converge to whatever smooth function of time you want).
To put the point differently, if I get to cut up the description of the universe into a bunch of components which merely need to sum to the actual (linearly evolving) state of the world, I could pick really weird decompositions such that, restricting attention to each component, those components look like they implement whatever computation you want. So you need some extra principle to explain why the world appears to us as if you chose one particular decomposition.
Maybe that extra principle is as simple as saying one kind of decomposition is favored (or maybe it would be enough to favor a certain kind of basis). However, the point is that one needs some extra rule to explain why the world appears the way it does to us, and maybe we need a better grip on what that missing piece is before we can answer this question.
Comment #123 January 25th, 2022 at 9:43 pm
Re 118 and 119:
Here are few comments, not that I am expecting convergence but just to carry on a bit.
First, about cellular automata. I get that *as a computer scientist* you regard cellular automata as “simple”. But thinking as a physicist, in a world where any sort of computer (and especially a digital computer) is a very complex, emergent thing, I don’t see *any* of them as simple. You think in terms of *writing* or *abstractly specifying* the rules under which they operate. I think in terms of *physically implementing* those rules, which requires a lot of complexity. So what comes naturally from your discipline as a “simple” system looks to me extremely complex. For example, actually physically implementing Conway’s rules for Life is a hell of a lot more complex than, say, F = ma, which isn’t a computation at all.
Second, about the anthropic principle: you want to stay away from the Strong Anthropic Principle. There is nothing at all necessary tout court about the existence of life or intelligence or whatever. The universe, as far as everything we know, could perfectly well never have evolved life. If it hadn’t, we wouldn’t be here to ask any questions. Sure. But if *you* in particular had not come into existence then *you* would not be here to ask this question. So what? Nonetheless, you are a completely contingent being, and your particular existence was not guaranteed by the laws of nature. Don’t confuse a *conditional* claim like “Given that we are here, such-and-such must have happened” with an unconditional one like “Such and such must have happened”. We have the Weak Anthropic Principle, which is fine. And if you want a theory that makes the existence of life (or more generally complex stable diverse structure) likely, we can look for a multiverse and invoke the Weak Anthropic Principle. So then maybe you are just arguing for a multiverse. (A physical one of the right kind, not Tegmark’s mathematical one.) OK: that’s a target.
Finally—and I’m a little upset to even be writing this—the additional local “variables” (beables) in Bohmian mechanics are neither emergent (they are fundamental) nor hidden (collectively, they are what you can most easily see). “Standard QM”, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even in the ballpark of answering the questions you are asking. Bohmian mechanics—whatever other issues you have with it—is.
Comment #124 January 25th, 2022 at 9:51 pm
I’ll take this as an invitation to exercise some crackpottery. The shortest possible version of the crackpottery is, take Pilot Waves, and subtract out the particles; you’re left with MWI, but I think that way of thinking about things is misleading. Rather, I think the correct way to interpret the wave you’re left with is a mass-energy distribution; the particle isn’t probabilistically in different locations, it’s always a waveform, and there are certain transformations we can make to change the orientation of the amplitude of the waveform in spacetime. Velocity is an amplitude orientation, in this model, and the uncertainty principle really boils down to the exact same limitation on orientation that the speed of light represents, under very constrained conditions in which the waveform is forcibly split (that is, measurement is a transformation).
So the universe is, in a sense, classical. I’m not sure what the other questions are actually asking; for unitary transformations, the waveform is the waveform, regardless of its orientation; any pair of orientations are isomorphic. For the Born rule, it boils down to the fact that energy is continuous (kind of) rather than discrete, but stable configurations are discrete, and perturbation is a better description (sub-quantum energy “sloshes” around a lot, but is only detectable when it reaches thresholds sufficient to trigger state changes between stable configurations, so energy events are more like continuous samples than single events). And for complex waveforms, I really don’t understand what the question is; what orientation should we expect the amplitudes to be in?
Now, I say mass-energy distribution, but really, once you notice that the only thing we care about is curvature, all we’re actually talking about is curvature itself; curvature curving curvature. Mass and energy don’t cause curvature, they’re just particular forms curvature takes; mass, when the curvature creates a singularity in one orientation, energy, in other orientations. Earlier I suggested energy is kind of continuous; it is kind of continuous in the same sense mass is kind of continuous, which is to say, quantized, but (probably) quantized on an infinite logarithmic scale such that it can still be treated as continuous. Curvature thus would be governed by some kind of wave shape like sin(ln(x))/x; probably not the correct equation, but possessing the basic expected properties of the correct equation. (That particular equation has a particular relationship with a complex logarithmic spiral, which I think is a reasonable candidate for reasons I have yet to explain in anything like a comprehensible way.)
Comment #125 January 25th, 2022 at 9:52 pm
Boaz Barak #75:
So, I don’t really have an answer to your question Q1, but have a counter-question to you. Do you think that in a century or so, we will get so used to quantum mechanics, that we will no longer ask it?
I have no idea!! As someone mentioned above, maybe our AI successors will be so intrinsically comfortable with QM that they won’t feel the need to ask Q1 that I feel. Or maybe they’ll no longer ask Q1, simply because they or we will have satisfactorily answered it! 🙂
As long as we’re playing question-tennis, though, let me return your volley back with another of my own:
Do you think it was ever satisfactorily explained why we should never have expected, even a priori, to have found ourselves living in the pre-Newtonian teleological universe of Aristotle?
Comment #126 January 25th, 2022 at 9:55 pm
asdf #82:
I had thought that classical thermo had no sane way to get rid of the ultraviolet catastrophe. Therefore, quantum.
See my comments #98 and #120.
Comment #127 January 25th, 2022 at 10:21 pm
matt #112: I can imagine some 19th-century crackpot deciding to look at a theory of probability that allowed for negative probabilities, and then coming up with some “natural” smearing constraints to ensure that only positive probabilities were observable under measurement. This is all you need in principle to get from phase-space classical particle mechanics to quantum mechanics (and it seems a more plausible route to me than someone stumbling upon Hilbert space and 2-norms and relating that to physics, etc.)
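As a toy illustration of the kind of object that crackpot might have played with (in the spirit of Feynman’s negative-probability musings, not any specific smearing constraint): a normalized table with a negative entry whose observable marginals are nonetheless honest probability distributions.

```python
import numpy as np

# A normalized 2x2 "quasi-probability" table with one negative entry.
q = np.array([[ 0.6, -0.1],
              [ 0.3,  0.2]])

print(np.isclose(q.sum(), 1.0))   # -> True: normalized like a distribution
print((q < 0).any())              # -> True: a negative "probability" lurks

# Yet every observable marginal is an honest probability distribution:
print(q.sum(axis=1))              # row marginal, approximately [0.5 0.5]
print(q.sum(axis=0))              # column marginal, approximately [0.9 0.1]
```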
Comment #128 January 25th, 2022 at 10:34 pm
Scott: First, Alan Turing won WWII only in a very Anglo-centric version of history 🙂 It seems like a more complete and multi-sided view remains an… enigma for some.
More seriously, but in the same vein, I think the questions you are asking can be broadened and narrowed at the same time. For example: Starting from a completely random, possibly n-dimensional “world”, what kind of “interesting” and enduring patterns can emerge? And by “enduring” I mean that, regardless of a specific instantiation of a random world, the same patterns emerge every time?
Some years back I took an admittedly very modest crack at this kind of a question in a couple of LW posts about Order from Randomness:
https://www.lesswrong.com/posts/aCuahwMSvbAsToK22/physics-has-laws-the-universe-might-not
https://www.lesswrong.com/posts/2FZxTKTAtDs2bnfCh/order-from-randomness-ordering-the-universe-of-random
In the second post I take one of the most random possible one-dimensional sequences, white noise, then order it, subtract the emerging linear trend, and look at the result. What we see is no longer completely random, but has a power-law spectrum with a strange exponent of -1.86. The same exponent shows up in every run, too. That’s not quite the same as noticing a Hilbert space with unitary evolution emerge from some random set of, say, complex numbers; I just wanted to illustrate the general idea.
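For anyone who wants to try the procedure, here is a minimal numpy sketch of it (my own reconstruction; the fitted exponent depends on sample size and fitting range, so I make no claim about recovering the exact -1.86 here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2 ** 14

x = rng.standard_normal(n)      # white noise
y = np.sort(x)                  # "order the universe": sort the samples

# Subtract the best-fit linear trend from the sorted sequence.
t = np.arange(n)
a, b = np.polyfit(t, y, 1)
resid = y - (a * t + b)

# Power spectrum of the residual, and a power-law fit on log-log axes.
spec = np.abs(np.fft.rfft(resid)) ** 2
f = np.arange(1, len(spec))     # skip the zero frequency
slope, _ = np.polyfit(np.log(f), np.log(spec[1:]), 1)
print(slope)                    # a negative power-law exponent
```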
Comment #129 January 25th, 2022 at 11:05 pm
Cellular automata / hypergraph updating rules aside, the multiway causal graphs of the Wolfram physics model are the most intuitive channel I’ve found for thinking about the quantum-mechanical nature of reality. It formalizes the multiverse.
Comment #130 January 25th, 2022 at 11:34 pm
Q1:
It seems like our universe has a finite-dimensional Hilbert space (at least for a given volume of space), despite the fact that its classical equivalent would have an infinite-dimensional one. Maybe God had some kind of design requirement that there should be only finitely many (orthogonal) states. The mathematics of probability with infinitely many outcomes is notably more gnarly than the finite case, after all.
This doesn’t get us all the way to ruling out even such things as cellular automata, which are also finite, but maybe there was also a requirement to have time be continuous? Let me explain that a bit further. The key “weird” property of quantum mechanics (at least IMO) is that given two states, one can make a superposition of those states that is “pure” (i.e. contains no information / has entropy 0). Another way of putting this would be to say that QM has no preferred basis. In the Heisenberg picture, the universe at any given moment in time always consists of the same information, it’s just that at different times, we’re “viewing it from different angles”. In quantum mechanics, this view can change continuously as a function of time. In classical mechanics, it has to discontinuously jump around in right angle increments, always staying aligned with the preferred basis. So in that sense, quantum mechanics can give us continuous time, while classical mechanics can’t, not with a finite theory.
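To illustrate that “pure superposition” claim with a toy numpy calculation (my illustration, not part of the argument above): an equal superposition of two basis states has von Neumann entropy zero, while its classical counterpart, a 50/50 mixture of the same two states, carries a full bit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                   # drop numerical zeros
    return float(-(evals * np.log2(evals)).sum() + 0.0)  # +0.0 avoids -0.0

# An equal superposition of |0> and |1> is itself a *pure* state.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)

# The classical analogue: a 50/50 probabilistic mixture of |0> and |1>.
rho_mixed = np.diag([0.5, 0.5])

print(von_neumann_entropy(rho_pure))    # -> 0.0 (no information at all)
print(von_neumann_entropy(rho_mixed))   # -> 1.0 (one full bit of entropy)
```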
Q2:
something something Fisher information metric on the space of all probability distributions over N outcomes something something
Comment #131 January 25th, 2022 at 11:35 pm
Tim Maudlin #122:
After Scott’s post on his Zen anti-interpretationism, I was afraid that the S/O comment section had become completely conquered by many-worlds people.
I am heartened to see there are signs of a budding Bohmian insurrection! My money is on Bohmian mechanics being a more faithful rendering of the fundamental universe than the relative state interpretation, when all is said and done.
Comment #132 January 26th, 2022 at 12:40 am
I think the question has a major unstated premise: that ours is the *only* universe. I find it just as likely that there exists another, completely unreachable and unobservable universe out there that runs on a different set of laws (resembling classical mechanics or otherwise). In fact, I think it’s just as plausible that every *possible* universe with every possible set of physical laws and initial conditions exists. In virtually every one of those universes, life, or even perhaps matter as we know it, doesn’t exist. But ours happens to be one of those where matter and life did emerge (see: the anthropic principle), and one in which the rules of nature are moderately complex–not so complex that we have no hope of understanding them, but more complex than evolution was capable of carving into our ape intuitions (the best it could manage was some sort of classical approximation of reality).
Sometimes, I think of this like mathematics: when you first posit an idea like “prime numbers” or “the Mandelbrot set” or “Fibonacci numbers” or “Conway’s Game of Life”, whether you realize it or not, those systems may have immense amounts of emergent consequences and complexities (“there are infinitely many primes”, “the area of the Mandelbrot set is finite, but its boundary is infinite in length”, “the ratio of adjacent Fibonacci numbers approaches the golden ratio”, etc.). In some sense, those emergent complexities exist independent of human beings ever discovering them, or even conceiving of the system itself. And in some sense, the initial idea of the system itself is identical to all of the infinite depths and emergent properties the system entails (when you describe prime numbers, you are describing an infinite set, whether you know it or not). I think our universe may be something like that: a set of things and a set of rules that determine relationships between those things, and immense amounts of emergent complexity from that. Other sets of rules exist, and other ways for those things to relate, so why shouldn’t those be considered universes as well? If I were religious, I would say our reality is just God thinking through the consequences of one possible set of initial conditions and physical laws for a universe, but I’m not religious and I see no reason why any sort of “thinker” needs to be involved, any more than \(2.718…\) needed Leonhard Euler in order for it to exist.
Comment #133 January 26th, 2022 at 12:47 am
The Game of Life is not a safe place for life to evolve. Everything is being constantly bombarded by gliders and stuff. We’re fortunate to live in a universe where gravity collects matter into tidy planets, whose surfaces are only lightly bombarded. I’d look for a reversible cellular automaton that has something like Newtonian gravity over long distances; how would one do this?
Comment #134 January 26th, 2022 at 1:40 am
Following Anbar #90: I agree with this answer to Q2, which is clearly inspired by quantum logic. However, the phrases ‘most economic representation’ and ‘necessary for empirical reasons’ are too vague, and must be substantiated on a physical basis, rather than on a logical one. Then, given the good hypotheses and the good theorems (Uhlhorn and Gleason), unitary algebras and the Born rule can be deduced, not postulated ( https://arxiv.org/abs/2111.10758 ).
Comment #135 January 26th, 2022 at 1:57 am
Scott #33: I find it unsatisfying if in a classical world that is fundamentally continuous we put in discrete blocks arbitrarily by hand so that we can do cellular automata. If the discrete blocks arise organically in the continuous world then of course we have problems related to computation.
For interesting chemistry we need the building blocks to be charged. However, charged particles lead to various problems in classical electromagnetism, e.g., the infinite self-energy of the electron or the radiation from an accelerated charge. Quantum mechanics solves these problems. I suspect that any attempt to solve these problems will have to reproduce quantum mechanics in its entirety. As you have pointed out, it seems impossible to change quantum mechanics just a little; it seems to be an island in theoryspace. Maybe any self-consistent theory that tackles these problems of classical electromagnetism has to go all the way to quantum mechanics for this reason.
Comment #136 January 26th, 2022 at 3:03 am
Scott,
I’d like to show you a quote from a mainstream QM book (Sakurai J., Napolitano J. Modern Quantum Mechanics 3ed, 2021). At page 230, in regards to EPR/Bell arguments, we read:
“The fact that the quantum-mechanical predictions have been verified does not mean that the whole subject is now a triviality. Despite the experimental verdict we may still feel psychologically uncomfortable about many aspects of measurements of this kind.
Consider in particular the following point: Right after observer A performs a measurement on particle 1, how does particle 2 – which may, in principle, be many light years away from particle 1 – get to “know” how to orient its spin so that the remarkable correlations apparent in Table 3.1 are realized? In one of the experiments to test Bell’s inequality (performed by A. Aspect and collaborators) the analyzer settings were changed so rapidly that A’s decision as to what to measure could not be made until it was too late for any kind of influence, traveling slower than light, to reach B.
We conclude this section by showing that despite these peculiarities, we cannot use spin-correlation measurements to transmit any useful information between two macroscopically separated points. In particular, superluminal (faster than light) communications are impossible.”
What we are facing here is much more than a “psychologically uncomfortable” feeling or a “peculiarity”: we are facing a logical/mathematical inconsistency between QM and the space-time structure of special relativity. Please look carefully at the last paragraph:
“we cannot use spin-correlation measurements to transmit any useful information…”
SR does not make any distinction between “useful” faster-than-light messages and useless faster-than-light messages. If a bit of information regarding the A measurement was sent instantly to B, we have to conclude that SR is wrong. We would need to go back to Newton’s absolute space and time (since we would need to determine which measurement came first), introduce an absolute frame of reference, and couple QM to that absolute reference. In other words, if this type of non-locality is true, we need to conclude that we don’t understand pretty much anything about the fundamental structure of the universe. We would need to develop a new physics, and only then would your Q question be answerable (if the reformulated QM + absolute space still resembles the QM we have now).
As far as I can tell, the other possible option, superdeterminism, is much more parsimonious. It leaves space-time as it is, but QM becomes a statistical approximation of an underlying classical theory. So, again, the current structure of QM cannot tell us anything fundamental about the universe.
In conclusion, as long as the inconsistency between QM and SR is not cleared up, we cannot, and should not, attempt to answer the type of questions you asked. Any answer derived from a set of incompatible statements is irrelevant. Solve the incompatibility first, modify what needs to be modified, and only then may an answer be achievable.
Comment #137 January 26th, 2022 at 3:08 am
When all observables commute then physics is no longer observer dependent. God would not create such a programmable world with no role for the soul of the observer since there would be very little need for God’s metaphysical power in the first place. Quantum mechanics necessitates God to determine when the state of the universe will change according to the Born rule rather than the Schrodinger equation, and no matter how many dinosaur fossils are dug up Richard Dawkins will never be able to explain this from more fundamental principles. QM is God’s signal that all the evidence against his existence from lesser sciences can now be discarded as a test of our faithfulness.
Comment #138 January 26th, 2022 at 3:20 am
Tim Maudlin, Scott,
“‘Standard QM’, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even in the ballpark of answering the questions you are asking. Bohmian Mechanics—whatever other issues you have with it—is.”
Yes, I think that the above paragraph is exactly true. It can be rigorously proven based on the EPR + Bell arguments. The only consistent views are non-locality (Bohm) and superdeterminism. Both require one to abandon “standard QM”.
The inability of most physicists to accept this is responsible for the lack of progress in the field.
Comment #139 January 26th, 2022 at 3:32 am
Hi Scott,
Thanks for the reply, I’m glad my argument and/or its presentation were new and interesting.
As for the anisotropy question, about how I would feel if Lorentz invariance broke down at quantum gravity scales but was approximately recovered at larger scales, I think that would be aesthetically suboptimal, but not as deal-breaking as global anisotropy. My greatest feeling of aesthetic distaste comes from broken symmetries that stay broken all the way up to global scales. Emergent symmetries are definitely something I’m on board with – the emergent symmetry of a homogeneous gas, for instance, is broken at smaller scales.
That being said, Lorentz invariance, or another property saying that no direction or velocity is special, is a very nice and fundamental symmetry, and I’d strongly lean towards theories that preserved it all the way down to the fundamental scale. These properties seem fundamental to the notion of a “blank canvas” that the universe is drawn on, and I think that guides a lot of my feel on these matters.
On the other hand, I felt the exact same way about parity symmetry until I learned about the Wu experiment, so I’m not going to believe too strongly in the “blank canvas” aesthetic criterion.
Thanks again, Scott!
Comment #140 January 26th, 2022 at 3:47 am
Haelfix,
“One of the obvious answers here is that classical mechanics *can’t* reproduce nature as we observe it. That atoms would be unstable, that the ultraviolet catastrophe would have no resolution, and so on and so forth.”
Just look at this paper:
Stochastic Electrodynamics: The Closest Classical Approximation to Quantum Theory
Timothy H. Boyer
https://arxiv.org/abs/1903.00996
Pretty much all the so-called failures of classical physics were solved in the context of classical electromagnetism (Stochastic Electrodynamics is classical EM + 1 assumption about the vacuum). Even the stability of atoms has been explained classically, although only qualitatively. There was never a rigorous proof that classical physics, as a framework, cannot explain this or that observed phenomenon. It’s just that, at a certain point in time, no suitable classical model had been put forward.
‘t Hooft published a classical interpretation of QM:
Explicit construction of Local Hidden Variables for any quantum theory up to any desired accuracy
https://arxiv.org/abs/2103.04335
We still need to remember that there is no “quantum” theory of space, time or gravity. So, a large part of modern physics is still classical.
Comment #141 January 26th, 2022 at 4:17 am
Cain,
“When all observables commute then physics is no longer observer dependent.”
Observer independence has nothing to do with commutation. Some observables do not commute because the measurement of one perturbs the system.
All QM’s predictions are objective. All observers agree on what was measured and what the result of that measurement was.
Comment #142 January 26th, 2022 at 5:55 am
Scott #124 Occam’s razor when considering experimental data?
I feel that, ultimately, the description of the universe using mathematics, leading first to the classical framework and then to the quantum one, says more about how the human mind works, and its limitations, than about anything else.
Comment #143 January 26th, 2022 at 6:39 am
Let me try an analogy, just so we’re on the same page about what a satisfying answer would look like.
We can derive the general structure of transformations between different reference frames from some very reasonable assumptions. There are only two possibilities: a) Galilean relativity (if there is no speed limit) b) special relativity (if there is a speed limit).
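For concreteness, a sketch of the standard derivation (assuming linearity, isotropy, and the group property of boosts): boosts along \(x\) with velocity \(v\) must take the one-parameter form

\[ x' = \frac{x - vt}{\sqrt{1 - Kv^2}}, \qquad t' = \frac{t - Kvx}{\sqrt{1 - Kv^2}}, \]

for some universal constant \(K \ge 0\). Taking \(K = 0\) gives the Galilean transformations (no speed limit), while \(K = 1/c^2\) gives the Lorentz transformations (invariant speed \(c\)).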
Is the answer to the question “Why Special Relativity?” then along the lines of “There are only two possibilities and both are of comparable complexity. Nature just happened to realize one of them” or something else?
Comment #144 January 26th, 2022 at 7:10 am
My feeling is that quantum mechanics can’t literally be true, because of the measurement problem, which I don’t think can be solved. Not without going beyond quantum mechanics. Maybe there is a way to make Many Worlds or Bohmian Mechanics work, but I don’t consider those to be orthodox quantum mechanics.
Let me explain by starting with the simplest quantum system, a spinor. Consider the particle state \( |\psi \rangle = \frac{1}{\sqrt{2}} ( |U_z\rangle + |D_z\rangle ) \). This state is a superposition of spin-up in the z-direction and spin-down in the z-direction. Where do probabilities come in? On the one hand, you could use the Born rule to say that the particle has a 50/50 chance of being spin-up or spin-down. But that seems like nonsense. It’s neither spin-up in the z-direction nor spin-down in the z-direction. It’s a pure state describing a particle that is spin-up in the x-direction. Until you introduce measurements, there really are no probabilities in quantum mechanics.
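The claim that this state is definitely “spin-up in the x-direction” is a two-line linear-algebra check. A minimal sketch in plain Python, using nothing beyond the standard Pauli-matrix formalism:

```python
# Check: (|Up_z> + |Down_z>)/sqrt(2) is the +1 eigenstate of sigma_x,
# so it has a definite value of spin along x; probabilities only appear
# relative to a chosen measurement basis.
from math import sqrt

psi = [1 / sqrt(2), 1 / sqrt(2)]   # amplitudes in the z basis
sigma_x = [[0, 1], [1, 0]]

sigma_x_psi = [sum(sigma_x[i][j] * psi[j] for j in range(2))
               for i in range(2)]
print(sigma_x_psi == psi)          # True: eigenvalue +1

# Born-rule probabilities depend on which observable is measured:
p_up_z = abs(psi[0]) ** 2                        # z measurement: 1/2
p_up_x = abs((psi[0] + psi[1]) / sqrt(2)) ** 2   # x measurement: 1
print(round(p_up_z, 3), round(p_up_x, 3))        # 0.5 1.0
```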
So introduce measurements. We allow this particle to interact with a Stern-Gerlach device that measures spin (in the z-direction, let’s stipulate). On the one hand, the Born rule says that our measurement will produce spin-up or spin-down with 50/50 probability. But on the other hand, if we consider the device itself to be a quantum system made of particles obeying the Schrodinger equation, then we would not find that the interaction of the particle and measuring device probabilistically results in either a measurement of spin-up or spin-down. We would find that the interaction produces some pure state that is a superposition of “the device having measured spin-up” and “the device having measured spin-down”. If we take into account the interaction of the device with the rest of the universe, then we would find (if we could make sense of the wave function of the universe) that the universe would be in a superposition of “a world in which the device measured spin-up” and “a world in which the device measured spin-down”.
It seems to me that you never get an actual measurement result, and so you never get probabilities. To get actual measurement results, it seems to me that you have to treat the measuring system (and presumably the rest of the universe) as something that is separate from the system being measured. Such a separation has to be artificial if everything is described by quantum mechanics. Maybe there’s a God outside of the universe who observes its history, forcing it to collapse probabilistically to something definite? Or maybe it’s all subjective, and the appearance of measurement results is relative to the observer? (But that’s a departure from orthodox quantum mechanics, it seems to me).
Comment #145 January 26th, 2022 at 7:11 am
Scott #31: I see, so what you’re asking is not why couldn’t the universe run on classical physics, but rather why couldn’t it run on classical computing. The answer, as already mentioned by Yoni #13 and OhMyGoodness #36, is that you need randomness, and classical computing can’t give you that. You specifically mention that you want a rich universe where complexity and information processing emerges, not a uniform and isotropic one. Well, with quantum mechanics you can start with a uniform and isotropic one, and quantum fluctuations will quickly make it more interesting. To get complex life you need evolution, which is intrinsically powered by randomness. I suppose you’ll agree with this, and object to the idea that classical computing can’t give you randomness.
After all, we have pseudorandom number generators, P=BPP, and we’re used to simply postulating that we have some random bits available for our computation. Well, postulating that the bits are random doesn’t help at all as a matter of principle; you still can’t generate them. At best you can postulate some randomness in the initial conditions of the universe, which is used up as the universe evolves, like in Bohmian mechanics. There are several problems with that. First, it’s just subjective randomness, not true randomness. Second, you’ll eventually run out of randomness and your universe will revert to plain determinism (Bohmian mechanics gets around this problem by hiding an infinite amount of randomness in the continuum, but this trick is not available to a discrete computer). Third, you’re just putting that randomness in by hand in the initial state. The universe is not evolving complexity by itself; it is just playing out a recording put in at the beginning.
Consider the Game of Life: it does support complex patterns with complex behaviour, but it cannot generate them. All of the impressive life-like constructs we see there have been painstakingly designed. The same problem befalls PRNGs: we would have to design our universe from the beginning with life forms that each incorporate a Mersenne Twister or something. It cannot evolve from nothing (also, I’m very skeptical that it would even work in a deterministic universe; a PRNG still needs a seed). To put it provocatively, a classical universe requires a deity, while a quantum one can run by itself.
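The PRNG point is easy to make concrete: all of the output “entropy” is already in the seed. A minimal sketch using Python’s built-in Mersenne Twister:

```python
# A PRNG is a deterministic function of its seed: two generators with
# the same seed produce identical "random" sequences forever.
import random

r1 = random.Random(42)
r2 = random.Random(42)
seq1 = [r1.random() for _ in range(5)]
seq2 = [r2.random() for _ in range(5)]
print(seq1 == seq2)   # True: the randomness was all put in by hand
```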
You might object: isn’t quantum randomness just an illusion caused by deterministic branching in Many-Worlds? Well, I wouldn’t say it’s an illusion, it’s the only consistent definition of true randomness that has ever been proposed, but yes, it’s fundamentally deterministic. Couldn’t a classical computer use branching to produce the same true randomness then? Well, no, you do need quantum mechanics in order to have Many-Worlds. In a classical computer the resources needed to compute the universe increase exponentially with time.
Comment #146 January 26th, 2022 at 7:15 am
tez #126 (and matt #112), there was such a 19th Century crackpot. His name was George Boole. In 1854 he published “The Laws of Thought”. At that time he knew he had a problem with probability that he couldn’t articulate. 12 years later, he could articulate the problem.
Pitowsky puts it better than I can, “Surprisingly, the tools for such an analysis [of the Quantum Puzzle] were developed, independently of physics, over the last 140 years, beginning with George Boole [1862]. Boole’s research problem in this context can be phrased in modern terminology as follows: we are given a set of rational numbers P1, P2, …, Pn which represent the relative frequencies of n logically connected events. The problem is to specify necessary and sufficient conditions that these numbers can be realized as probabilities in some probability space.” That’s in “George Boole’s ‘Conditions of Possible Experience’ and the Quantum Puzzle”, Brit. J. Phil. Sci. 45 (1994). 95-125. Pitowsky is almost always worth reading.
Boole of course did not present probability in an operator formalism, but this is the idea laid bare. Pitowsky uses the slightly clunky word “commeasurability” to refer to the conditions of possible experience being satisfied, whereas most modern literature uses “measurement compatibility”. In the modern quantum probability literature, probabilities are most often presented in an operator formalism, where “measurement (in)compatibility” is closely related to (non)commutativity of operators.
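As a toy version of Boole’s problem (my own illustration, not Pitowsky’s notation): for three jointly distributed \(\pm 1\)-valued events, the pairwise disagreement frequencies must satisfy a triangle-like inequality, which one can verify by brute force over the extreme points:

```python
# Boole's "conditions of possible experience" in miniature: any joint
# distribution over A, B, C in {-1, +1} satisfies
#     P(A != C) <= P(A != B) + P(B != C).
# Checking all 8 deterministic assignments suffices, since every joint
# distribution is a mixture of them and the inequality is linear.
from itertools import product

ok = all((a != c) <= (a != b) + (b != c)
         for a, b, c in product([-1, 1], repeat=3))
print(ok)  # True; so frequencies violating this bound (as quantum spin
           # correlations can) admit no joint probability distribution
```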
Scott, I came to the measurement problem out of field theory, classical and quantum. I understand the relationship between classical and quantum measurement theories as being about two things: (1) noncommutativity, and (2) the spectrum of quantum noise being different from the spectrum of thermal and other noise. Removing those differences by extending classical measurement theory as needed allows us to focus on the unnoticed elephant: the classical dynamics is generated by the Liouville operator, which is *not* a positive operator, in stark contrast to the quantum dynamics, which is generated by the Hamiltonian operator, a positive operator. This has extreme consequences, because the positivity of the Hamiltonian operator is associated with analyticity in QM and in QFT.
What I think is the real payoff here is that through this understanding I find it possible to look at the Wightman and Haag-Kastler axioms for quantum fields through fresh eyes. My real interest is the problem presented by renormalization, which as far as I can tell is not likely to be addressed well by work on CAs any time soon (although work on CAs does consider scaling, it essentially follows well-worn paths established in the 60s, though that is again only to my knowledge). The Wightman axioms are to me a step towards doing CAs right, if we can understand why they have no interacting models in 3+1-dimensions.
Suppose we have an experiment that produces a list of numbers, from which we construct relative frequencies. Suppose we use a continuous probability density as an ideal model for those frequencies. We can ask what the probability density at the value x is, and get back p(x). Suppose now we use different coordinates, so we have to move the distribution by adding a constant: under assumptions that we often take to be satisfied in physics, we generate such translations using the differential operator ∂/∂x. “What is the value at x” and “use different coordinates” do not commute. If we ever transform the list of numbers we obtained from our experiment, we have to use an algebra and group of transformations to describe what we have done.
Things get both elementary and complicated when we consider the Lie algebra generated by [∂/∂x,x]=1, which is all too familiar from QM. In the context of probability measures, it is fairly natural to consider characteristic functions as Fourier transforms of those probability measures, which fairly naturally introduces a complex structure as a way to discuss the sine and cosine components of the Fourier transform (I’m not asking a philosopher to agree that this is obvious, I’m only saying that this is one way to introduce a complex structure in a natural enough way that we can mostly stop worrying about it, FTW).
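The commutator \([\partial/\partial x, x] = 1\) can even be checked numerically with finite differences. A small sketch (the test function and step size are arbitrary choices of mine):

```python
# Numerical check that [d/dx, x] = 1: apply (D X - X D) to a test
# function f and compare with f itself, where D is a central finite
# difference and X is multiplication by x.
from math import exp, sin

h = 1e-5

def D(f):
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

def X(f):
    return lambda x: x * f(x)

f = lambda x: exp(-x * x) * sin(3 * x)
commutator = lambda x: D(X(f))(x) - X(D(f))(x)

# D(X f) ~ f + x f', X(D f) ~ x f', so the difference ~ f.
print(abs(commutator(0.7) - f(0.7)) < 1e-6)  # True
```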
You know I can go on for pages about this, but it’s all pretty simple math once anyone starts to run with it. Anyway, I’ve rehearsed this here before with only slight differences and you’ve always said it’s incomprehensible before, so it’s not likely to be any different this time, so I’ll be gone now. I’m mostly back to my work on renormalization in any case.
Comment #147 January 26th, 2022 at 7:17 am
Tim Maudlin #70:
They require a lot of complexity because the formal rules governing our universe don’t directly correspond with the formal rules of the computer being built, and so we have to build projections using physical rules to establish a different formal substrate for computation. There’s no obvious a priori reason why a universe based on a different set of formal rules couldn’t exist though. Exploring why this is or is not so is exactly what Scott’s Q is about.
Cellular automata are definitely formally simpler than our physics. There is no step in this process called “physically implementing the rules”, any more than there is a step to physically implementing the one dimensional strings of string theory. The cells of a cellular automaton define physical existence, and the rules that govern their behaviour just are what they are.
Comment #148 January 26th, 2022 at 7:53 am
While trying to understand QM (starting in 2013/2014), I made many attempts that I later felt ashamed of. Not because I made those attempts, but because I talked about them with my close ones. It was clear that those thoughts could not fully make sense even to me, and even less so to the non-specialists who listened to me. But I want to stop feeling ashamed of myself. Let me see which of my attempted analogies I can still remember:
Q1?
– Why not classical: symmetry between observation and action (equations and variables)
– Why not classical: symmetry between space of positions and space of momenta
– Why not classical: nature’s way to avoid infinite information content in finite volume
Q2?
– Why the oscillations and waves: It is a conflict resolution mechanism.
– Why the oscillations and waves: It is a resonance phenomenon.
– Why the oscillations and waves: It is a phase transition, or more precisely the interface between different phases.
– Why tensor structure (high dimensional instead of 4 dimensional): It is a probability density statistical description.
– Why unitary: unitarity implies linearity, so as an instrumentalist I sometimes thought: It is linear because it is a low-order approximation around a working point.
– Lienhard Pagel: It is a fixed point, and Planck’s constant h cannot change, not even in an emergent theory on a higher level.
– All is chaos (instead of nothing), and the order propagates through it. The idealization failing in our universe is that things can stay “exactly constant” while other things change. Even more, the idea that it is even possible to define what “staying exactly constant” means is an idealization.
– Why complex numbers: The (number) field has a non-trivial automorphism, namely conjugation. Therefore the complex numbers themselves are not measurable quantities, and already hint at the “simplest” gauge symmetry.
– John Wheeler: It is like the second law of thermodynamics, entropy always increases, but not because this would be strictly true, but because it emerges from a law of large numbers.
Comment #149 January 26th, 2022 at 8:20 am
Scott #124: Let me attempt answering your question: “Do you think it was ever satisfactorily explained why we should never have expected, even a-priori, to have found ourselves living in the pre-Newtonian teleological universe of Aristotle?”
Despite not being a believer, I actually like your framing with “God”. The reason is that I think ultimately science is about finding a compelling and coherent story to explain the world. We feel we understand a phenomenon when we not only have a formula that can predict results of experiments, but also have metaphors and ways of thinking about it that make sense to us.
So the question of the “a priori expectation” (which one can also think of as the “baseline” or “null hypothesis”) can be framed by thinking that this story has an author. I prefer this view to the notion that there is an objective measure of simplicity a la Kolmogorov complexity.
Now I don’t think that there is a single “satisfactory explanation” of why we should reject the teleological Aristotelian universe. I am no historian, but I believe it was a gradual process of replacing the teleological view with the mechanical view of the universe. In the beginning, the mechanical view might have been considered just a set of useful technical tools for doing calculations and predictions. But as our world was more and more shaped by machines, engines, and clocks, these began to shape our metaphors as well. Hence we could think of the “clockwork universe”, and these metaphors became more natural to us than stories about gods.
Similarly, I don’t think the view of “it from qubit” and the universe as a computer would make much sense to us if the Turing machine had remained a thought experiment, rather than a device that we carry in our pockets. Now that we have gotten so used to computers, we think of them as a useful metaphor to explain other stuff, rather than as a mysterious phenomenon that needs to be explained.
This is basically what I meant by “getting used to” quantum mechanics. I think that it’s a gradual process, whereby as we find more and more practical and theoretical applications to a new framework, it begins to shape the metaphors we use, and the type of explanations we find “natural”, “simple”, or “beautiful”.
Comment #150 January 26th, 2022 at 8:33 am
Q1: In a classical universe one can obtain an infinite amount of information from measurements of a single particle in an arbitrary amount of time, since there is no limit to the precision with which one can measure that particle. We might then ask: where is all this information kept on the particle, and why isn’t it possible to see all these degrees of freedom? Is the information a result of some underlying structure? If so, why can’t we further decompose this particle to reach the atomic level of information? This is why I believe that the universe cannot be perfectly classical, for it would mean an infinite amount of information encoded onto a single atomistic unit. If we accept that information is physical, then an atomic theory for matter also implies an atomic theory for information, and classical physics, even when allowing for indivisible particles, does not restrict the amount of information such a particle can contain.
Q2: From this rejection of infinite information per particle, we can generate a single axiom: (A0) the amount of information provided by a single measurement of a single system cannot exceed \(\gamma T\), where \(T\) is the amount of time spent measuring the system and \( \gamma \) is a unit conversion factor. I admit that this axiom is somewhat imprecise and non-rigorous, but I assert that A0 is QM and QM, in turn, is A0. Thus, the clearest explanation (and motivation) for quantum mechanics, to my mind, is that it provides the correct physical limit on information content. How exactly does this axiom reproduce QM, and why does \(T\) appear in A0?
i) The information bound in A0 must be linear in \(T\) due to the Fourier-Gabor limit. If a frequency interacts with our particle, and we can estimate this frequency by performing a single measurement on our particle, then estimating the frequency to better than \(1/T\) would violate this limit. Here it becomes clear that my colloquial definition of information is simply 1/uncertainty, whereas (quantum) Fisher information is defined as this quantity squared.
ii) It reproduces the uncertainty principles – if we obtain one bit of information (say about the quantum state along \(x\) of a spin-\(\frac{1}{2}\) particle), we cannot immediately obtain another bit of information (say about the quantum state along \(y\)). Alternatively, if we immediately measure along \(x\) again, we get the same answer which is not another bit of information.
iii) It reproduces the Schrödinger equation – to get more information after measuring the state along \(x\) we must wait an amount of time. The state can evolve away from \(\pm x\) according to A0 and we can obtain more/new information upon readout.
i – iii) describe the predictions of quantum mechanical theory for measurements on any single system: how quickly the system can evolve in time, and the probabilities for different measurement outcomes. If we are careful and rigorous in our definitions, then I expect that A0 and the quantum mechanical description of measurements and evolutions of states can be made equivalent. I also expect that at some level, this explanation is both obvious and already taken for granted by many quantum mechanics researchers (Scott included?). What I would like to stress, though, is that when followed through completely, A0 has many consequences which diverge markedly from the mainstream understanding of quantum mechanics. Indeed, A0 predicts measurement outcomes that are in conflict with current QM theory, and A0 gets these predictions correct. This tells me that A0 is a powerful axiom and we should accept it.
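The \(1/T\) resolution claim in i) can be illustrated numerically: over a record of duration \(T\), two tones separated by much less than \(1/T\) have near-unit normalized overlap (and so are nearly indistinguishable), while tones separated by \(1/T\) are nearly orthogonal. A rough sketch (sampling parameters are my own arbitrary choices):

```python
# Fourier/Gabor limit in miniature: frequency resolution of a
# duration-T record is about 1/T.
from math import cos, pi

def overlap(f1, f2, T, n=4000):
    # normalized overlap of two cosine tones sampled over [0, T)
    dt = T / n
    s = sum(cos(2 * pi * f1 * k * dt) * cos(2 * pi * f2 * k * dt)
            for k in range(n)) * dt
    return s / (T / 2)

T = 1.0
close = overlap(10.0, 10.0 + 0.05 / T, T)   # separation << 1/T
far = overlap(10.0, 10.0 + 1.0 / T, T)      # separation = 1/T
print(close > 0.95, abs(far) < 0.05)        # True True
```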
What do I mean by this? To start being rigorous we need to define what a ‘single’ system is. A proton is a single spin-\(\frac{1}{2}\) particle, but it is also composed of quarks, so is it still a single information system? Can more than \(T\) units of information be provided by this proton in time \(T\)? I work with NV centers in diamond, where a single spin-1 system is formed by thousands of carbon atoms and a single nitrogen atom. How much information can measurements on this system provide? One natural answer is to say that, if the system is described by a single quantum mechanical wavefunction, which is not separable, then this constitutes, for informational purposes, a single system which obeys A0. I acknowledge that I am using a circular definition where quantum mechanical theory (a separable wavefunction) is included in my fundamental axiom which then is used to produce QM, but I can’t avoid it at the moment. We then have:
iv) This definition of a single system reproduces entanglement – the measurement outcomes of all particles in an non-separable wavefunction are perfectly correlated and in such a way that no more than \(T\) amount of information (where \(T\) is the total measurement time) can be obtained. In effect an entangled \(N\) particle system is informationally equivalent to a single system.
By assuming this definition of a single system, it is clear that A0 diverges wildly from mainstream quantum measurement theory, since it is universally accepted that, by entangling \(N\) particles, one can run algorithms and perform measurements to extract more information per unit time than would be possible using \(N\) unentangled particles. That is, on the mainstream view, entangling the particles and making the wavefunction non-separable increases, rather than decreases, the informational content of the Hilbert space. Not only that, iv) predicts that the \(N\) particle entangled system cannot outperform even a single spin-\(\frac{1}{2}\) particle.
First, let me say that I know advocating for iv) makes it seem, at first glance, like I have no understanding of first year quantum mechanics. However, I strongly advocate for using iv). To give one example, it is expected that the quantum fourier transform which uses entanglement, can improve frequency estimation. What is the frequency uncertainty when using the QFT? Precisely \(1/T\), independent of the number of qubits used in the QFT (see Childs, Preskill, and Renes. J. Mod. Opt. 47, 155–176 (2000)). One might argue that \(2^N\) frequencies can be discriminated when \(N\) qubits are used in the QFT. However, there is no principle in QM which rules out discrimination of \(2^N\) frequencies when using measurements of a single qubit. One very simple method to do this is to excite the qubit into its excited state, wait for it to emit a photon, then let the emitted photon pass through a spectrometer. The spatial location of the emitted photon allows an arbitrary number of frequencies/wavelengths to be discriminated, depending on the spectrometer bandwidth and resolution. However the uncertainty in the spatial location cannot be better than \(1/T\). There are many other methods to extract more than one bit of information from a single qubit in time \(T\). One could irradiate a qubit with \(2^N\) different frequencies simultaneously and then see which one was absorbed. Or one could perform measurements on the single qubit at times \( T/2, T/4, T/8, …, T/2^N \). Theoretically I am not aware of any analysis that rules out iv).
Furthermore, we can check for experimental evidence against iv); if any exists, then it is clear that we must rule it out. However, there is no such evidence. In fact, all available experimental evidence agrees with iv) and actually provides a significant discrepancy with standard QM predictions. The easiest comparison can be made with atomic clocks, where the frequency of a near-resonant field is estimated. The Heisenberg limit provides: \(\Delta f \ge 1/(N T) \). It is expected that this limit can be saturated by using entangled states, however ALL experimental evidence with entangled states has an uncertainty limited to \(\Delta f \ge 1/T \), when the total experimental time and overheads are correctly considered. This is consistent with iv). This same agreement occurs when analysing phase estimation, magnetic field estimation and all quantum metrology. Furthermore, due to connections with quantum algorithms, iv) also implies that quantum computers will not outperform classical computers.
Comment #151 January 26th, 2022 at 8:46 am
Hi,
having read some of the comments, but not all, I do not see some of the bigger problems of classical physics mentioned.
One example is Newtonian gravity allowing objects to reach infinite speed in finite time; another, which you mentioned, is the solidity of matter (Pauli).
Another is the problem that gave birth to quantum physics: the ultraviolet catastrophe. It might be hard to think of alternatives post hoc, but Planck's is the easiest explanation of radiation that does not produce infinite intensities or perpetual energy machines.
Are there cellular-automaton, string, or other candidate theories which preserve energy or similar quantities?
Would these allow for a complexity similar to chemistry?
I believe (in the religious sense) in a theory more fundamental than QM, but our human understanding of Math is still inadequate.
Comment #152 January 26th, 2022 at 9:09 am
I think the Born Rule is good evidence for an Anthropic Universe since there is no good reason for it to be selected amongst all even power rules apart from being “most likely”.
Pauli already anticipated this in his not so well-known textbook on General Principles of Quantum Mechanics, P 15:
(he means experimental results)
Comment #153 January 26th, 2022 at 9:34 am
Disagreement with Andrei #135 and #137: there is (at least) a third way besides superdeterminism and nonlocality (Bohm), that is predictive incompleteness, see https://www.mdpi.com/1099-4300/23/12/1660 . It does not require abandoning standard QM, but rather looking at it in a slightly different way.
Comment #154 January 26th, 2022 at 9:47 am
I don’t know the answers to Q1 and Q2.
I know that the answers to Q1 and Q2 are (most probably) disappointingly simple.
Plus, the simulation theory seems plausible iff the outer world is based on a similar ruleset (QM/classical physics)
Comment #155 January 26th, 2022 at 10:00 am
Let me grant that you are right, and that there is a theory which can create a complex world with chemistry, and which runs on classical laws, for appropriate definitions of "complex", "chemistry", and "classical." I think it is very plausible that a set of definitions can be made where such a theory is possible. Let's call this theory X.
It seems like your question then boils down to: a priori, why would we be "forced" to choose quantum mechanics over X, for some definition of "forced"? Well, "forced" can't mean "quantum mechanics is the only logical possibility", since we are presupposing that X exists. So you must mean something like "why is quantum mechanics aesthetically more pleasing than X?" or "why do the foundational principles of quantum mechanics feel psychologically more 'inevitable' than the principles of X?"
Can you define an objective “plausibility” metric by which we could rank quantum mechanics and X? I can’t think of one that doesn’t boil down to a subjective, personal preference.
The only way I can imagine answering Q1 in a scientific way is to prove X doesn't exist, but I actually do agree with you that if you allow your definitions to be sufficiently broad, X probably does exist.
Without nailing down some of these definitions for what set X should be drawn from (do cellular automata count?) and how to compare X and quantum mechanics, I fear this question can be answered in 10 different ways by 5 different people, so is scientifically meaningless.
Comment #156 January 26th, 2022 at 10:24 am
Following up from an earlier comment I wrote, which isn’t published yet 🙂
Perhaps one way to formalize something like question 1, is: “What is the minimal set of assumptions such that quantum mechanics is the unique theory that can explain physics?”
In other words, we start with a set of plausible physical theories, then apply conditions to that set until quantum mechanics is the only thing left. For example, this may involve requiring the theory can produce “chemistry,” and giving a concrete definition of “chemistry”.
Then one could ask whether the assumptions that are needed to rule out non-quantum theories are "weak" or "strong", by some measure.
This is similar to how we understand GR — it is the unique low-energy theory of an interacting, massless, spin-2 particle. That’s not to say it is the *only* theory of gravity, just that to the extent the assumptions seem plausible and simple, you should believe GR is a good description of gravity.
I tend to agree with an earlier comment, that requiring *local* physics that does not distribute energy to very short wavelength modes by equipartition, is probably a very powerful condition that rules out a lot of non-quantum options.
Comment #157 January 26th, 2022 at 11:46 am
Vladimir #86:
Scott #33 left me completely flabbergasted. 25 adjustable parameters seem like too big a price to pay for explaining all of chemistry to you? Which features of chemistry do you think can be reproduced by cellular automata with fewer than 25 rules?
See my comment #99 (many other people’s questions are also addressed there). Briefly, I don’t mean our chemistry, I mean some chemistry with the right properties to support complex life.
Comment #158 January 26th, 2022 at 11:50 am
Lars #88:
And, if the failure to unite QM and GR despite a century’s worth of effort on the part of the smartest humans is any indication, it would certainly appear that God has actually achieved “maximal confusion”, even if she was only shooting for minimal.
I mean, the only reason physicists have been struggling to reconcile QM and GR is that they … figured out QM and figured out GR, to the point that they know the precise rate at which black holes emit Hawking radiation, despite never having been within several hundred light-years of one, and despite the process taking easily a googol years for the black holes at the centers of galaxies!
When you think about it that way, it’s staggering how much we do understand. One could easily imagine universes that were far more confusing than this one.
Comment #159 January 26th, 2022 at 12:15 pm
Guy #89:
Stephen Wolfram’s project. Isn’t that exactly his thing… answering Q1 by studying strong emergence from CA?
Right, but he never comes close to answering it. He starts by ignoring QM, then unconvincingly grafts on bits and pieces of it by fiat, never clearly defining what’s the Hilbert space, etc. Having said that, it’s true that CA-like models do seem more “natural” than our laws of physics in the space of computational possibilities! So then the question becomes one of explaining why those turn out not to be the right models for our world. The second step is important and is the one Wolfram never really takes! For more see my comment #99.
Comment #160 January 26th, 2022 at 12:25 pm
Chris #94:
You should expect dynamical evolution to be continuous, and in particular that transitions between states should be continuous. Lucien Hardy has shown that this assumption is enough to motivate representing quantum states with complex numbers.
You’re misremembering his result in an interesting way! From the existence of continuous, reversible transformations between pure states (plus some other unobjectionable axioms), Hardy deduces the truth of some QM-like theory, but the amplitudes could still be either real or complex. To get that they’re complex, you need an axiom saying that the number of parameters needed to characterize a bipartite state should be the product of the numbers of parameters needed to characterize the individual components. Personally, I’ve never quite understood the motivation for that axiom, other than that we “peeked in the back of the book” and already know that amplitudes are supposed to be complex.
Comment #161 January 26th, 2022 at 12:32 pm
skaladom #96:
I don’t find it at all clear what the actual translation of your question would be, devoid of metaphors of creation and goals … is it clear to yourself what the non-metaphorical question you’re asking really is?
Sure! What, if any, are the more basic principles that, once accepted, would make QM an unsurprising consequence?
Crucially, I think that this question has been satisfactorily answered for other parts of physics. Einstein, famously, explained the Lorentz transformations as just logical consequences of the equivalence of inertial frames plus the special role of the speed of light. Likewise, Boltzmann explained pretty much the whole of thermodynamics as a logical consequence of the reversibility of laws of physics plus the specialness of the initial state.
Do something analogous for QM.
Comment #162 January 26th, 2022 at 12:38 pm
Object of Objects #98:
To me, most of the curiosity around the question “why QM?” can be reduced to the curiosity around the question: why, in a quantum mechanical universe, do we find ourselves existing as classical life rather than quantum life?
No, I’d say that at least conceptually, we know a pretty good answer to that—namely decoherence! What we don’t know nearly so well, is why we’re not classical life existing in a classical universe.
Comment #163 January 26th, 2022 at 12:51 pm
entirelyuseless #105:
I’m aware you can’t reproduce violations of Bell’s inequality with cellular automata, but that is only if you assume that there is a clear mapping between cellular automata and the things we experience … But I am saying we know for a fact that there cannot be a clear mapping like that … Given that kind of uncertainty, I don’t see how Bell’s theorem would rule out a CA like that.
But isn’t it obvious that you can explain anything by anything else, if you’re allowed to shoehorn whatever doesn’t fit into the “mapping”? I could say the universe is explained by a kit-kat bar; GR and the Standard Model should just fall out as details of the as-yet-unknown kit-kat→universe mapping. You might call it … “entirelyuseless” 🙂
To date, no one has comprehensibly explained to me how you get our quantum-mechanical, Bell-inequality-violating observed reality from a classical CA, in a way that isn’t so convoluted, contrived, ugly, and insight-free that you wouldn’t be vastly better off just to throw out the CA part, and talk directly about quantum mechanics. Certainly not Wolfram, not ‘t Hooft, not Sabine Hossenfelder, not Andrei on this blog … as far as I can tell, all the words they’ve produced have taken us 0% of the way toward answering the question.
Comment #164 January 26th, 2022 at 1:06 pm
Considering one of your later comments, I’ve decided to start with assumptions, continuing the crackpot stuff:
Assume the universe is made up of space-time.
Assume the existence of space-time curvature.
Assume curvature curves curvature.
Assume that the curvature-curving-curvature results in a complex logarithmic spiral, the net result of which is that the curvature at a given distance r from a mass-energy origin (a singularity) is given by sin(ln(r))/r. (I can't evaluate whether curvature-curving-curvature would actually work this way, so I'm positing it as an assumption.)
Assume that curvature can be both positive and negative; that is, the complex spiral can spiral in either direction.
Our equation sin(ln(r))/r has the distinctive property that, when considered in terms of forces, you have a striping effect; in terms of forces, attractive and repulsive forces alternate. This is an important place to begin, because if you have a field of mass-energy/singularities of sufficient density, they’ll begin to fall into one another, forming clumps. Accumulate enough “mass” in a clump like this, and it forms a larger singularity – however, as this singularity accumulates even more mass, the repulsive force one stripe out gets stronger and stronger, until, eventually, no more mass can enter.
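For what it's worth, the alternation described above can be read off directly from the posited profile: sin(ln r)/r changes sign exactly at r = e^(kπ), so "stripes" of opposite sign (attractive vs. repulsive, in the comment's reading) follow one another at geometrically spaced radii. A small sketch of that claim:

```python
import math

def curvature(r: float) -> float:
    """The profile posited above: sin(ln(r)) / r."""
    return math.sin(math.log(r)) / r

# Zeros of sin(ln r) sit at r = e^(k*pi); between consecutive zeros the
# sign alternates, giving the attractive/repulsive "striping".
for k in range(4):
    mid = math.exp((k + 0.5) * math.pi)  # middle of stripe k in log-radius
    sign = '+' if curvature(mid) > 0 else '-'
    print(f"stripe {k}: r in [e^{k}pi, e^{k + 1}pi], sign {sign}")
```

Note the stripe boundaries grow by a factor of e^π ≈ 23 each time, not by the factors of 10^6 posited later in the comment; matching those scales would require extra constants in the profile.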
So our assumptions give rise to "particles" of uniform "mass". Examining curvature, note that the stripes alternate in reverse; if we call "positive" curvature matter, and "negative" curvature antimatter, the stable "mass" of matter corresponds to the minimum mass necessary for an antimatter singularity, but that minimum antimatter mass is not in fact stable (as it doesn't stabilize until it is much larger).
We now have a cluster of particles of different but predictable masses, and a prediction that, for a given scale of particle, either matter or antimatter will almost completely dominate.
Assume that matter and antimatter experience curvature in an opposing fashion (where matter would be attracted, antimatter would be repelled, and vice-versa) (this may or may not cash out to the idea that matter and antimatter travel in opposite directions in time; this ties into what I think the complex logarithmic spiral is, but that’s overcomplicated, so here’s the assumption).
Now we get into an interesting position. Suppose we have a matter particle; starting at an arbitrary scale, there is a thick shell of space in which other matter particles are attracted, on either side of the radius of which they are repelled. Let us call this shell of space a nucleus, because that is exactly what it resembles. The outer thick shell of space is repulsive to other matter particles, but attractive to antimatter particles; let us call this the orbital space.
Now, I’m talking about particles, but it is important to mention at this point, before I go further, that there is not a particle sitting in the middle of the curvature; the entirety of the curvature IS the “particle”. It’s tempting to point at the singularity and call that the particle, but I believe this is a misunderstanding of the geometry involved, because a stable singularity as I described it is no longer connected to the local space-time. And because of the nature of curvature, it isn’t actually meaningful to talk about a local “center” or “origin”.
In the orbital space, antimatter particles can be captured. Because of the striping effect, stable antimatter particles are going to be substantially smaller, in terms of “mass”, than the stable matter particles they are captured by. They are repulsive to other antimatter particles; if you examine stable configurations of particles under these conditions, you basically get what look suspiciously like electron “shells”.
The next question is, exactly how far does this repulsive phase extend? I’m going to suggest it extends quite far, to a distance of somewhere in the vicinity of 10^6 meters. Then begins an attractive phase, which I’m going to suggest ranges from 10^6 to somewhere in the vicinity of 10^12 meters, at which point a repulsive phase runs to 10^18, and so on and so forth. Approximately.
Now, let's suppose that we name the repulsive phase that ends at approximately 10^6 m "electricity", and the attractive phase from 10^6 to 10^12 m "gravity"; then we can arrive at certain conclusions. First, there is a minimum size for gravitationally bound mass, below which we should expect to find only electrically bound mass. Second, there is a maximum size for the analogous structure we call a solar system, beyond which we expect to see a rapid and otherwise unexpected drop-off of mass. Third, we should expect to see a repulsive phase from 10^24 to 10^30 m.
As far as I know, we have not actually detected a repulsive force in the range of 10^12-10^18m, and the idea that protons are only repulsive up to 10^6m is probably contentious (but testable – for smaller objects, we should see magnetic anomalies at some distance from their surface, and in particular there should be some distance at which the magnetic field apparently disappears and then reverses its polarity). But in spite of this, this matches a lot of the basic structure of the universe surprisingly closely, out of relatively few assumptions.
Of course, I have failed entirely to find a set of constants for sin(ln(r))/r which actually predict the orbital speed of our planets in the solar system. This might be due to a problem I am mathematically insufficient to deal with – if curvature curves curvature, what’s the actual radius of a given orbit?
Returning to the actual questions posed: Does it actually work for quantum mechanics?
I struggle to answer this, because, once you start considering probability waveforms as mass-energy distributions / curvature, it doesn’t look like it actually changes anything about quantum mechanics. You get some weird stuff, like hydrogen atoms being a meter across when measured “from the inside”, and in general all of our concepts of distances being horribly distorted by crazy lensing effects – but basically, as far as I can tell, things look more or less the same. You get a natural explanation for some forms of quantization, you get a natural explanation for why particles are the size they are, you get a natural explanation for waveform behavior. Things only really start to change when you get down to chromodynamics, as far as I know.
Comment #165 January 26th, 2022 at 1:12 pm
Ted #112:
Anyway, it isn’t clear to me that a classical Turing machine would necessarily be powerful enough to efficiently simulate processes complex enough to support intelligent life, i.e. powerful enough to “run the math in real time”. But your mileage may vary; maybe it seems obvious to you that a classical computer would be powerful enough.
Given that life on earth seems to exploit “quantum computational effects” minimally if at all, it seems to me like the burden of proof is firmly on anyone who believes that classical computation wouldn’t be enough to simulate processes leading to complex life, and to do so with only polynomial overhead.
Comment #166 January 26th, 2022 at 1:17 pm
matt #113:
How would those mathematicians have been led to invent something like quantum mechanics?
Obviously, because they would’ve wanted a notion of query complexity for Boolean functions that satisfied perfect composition and that matched approximate degree as a real polynomial for the n-bit OR function, and/or a model of computation that let them characterize complexity classes like PP and A0PP in terms of postselection. 😀
Comment #167 January 26th, 2022 at 1:20 pm
Clinton #118:
Why should the universe have been quantum mechanical?
Because we asked math (or the math we know) for the answer.
Nah, that doesn’t work.
Because the question still stands: why should the universe have been such that, when we used math to ask for the answer, the answer that came back was “quantum mechanics,” rather than (say) “some classical cellular automaton,” which also would’ve involved math that we knew?
Comment #168 January 26th, 2022 at 1:41 pm
This sounds like an extremely fun and interesting challenge. Of course, uninformed speculation on physics tends to veer into crackpot territory terrifyingly quickly, so please do not take the following overly seriously. That said, you asked and it sounds interesting to try to answer as best I can, so for whatever it’s worth…
Let us suppose that universes must be mathematical, as it strains conceivability to envision a non-mathematical universe. Let us further assume that universes tend towards simplicity, as it’s easy to envision reality trying to be simple, but without this assumption, it’s impossible to conceive of anything, as matters could always have some additional complication that we haven’t thought of. If you like, envision a Tegmarkian multiverse, with all possible universes, but with simpler ones being more common, or more strongly present or the like. Or, if you prefer, imagine God making a bunch of universes, but starting with the simplest ones. Now, it seems quite certain that a quantum universe is not the simplest possible. The simplest possible universe is probably just a null set, and there are other possibilities more complex than that, but far simpler than the world in which we find ourselves. Imagine a void with a single, classical particle traversing it or the like. So if we are to assume mathematicality and Occam’s Razor (and it is difficult to imagine how we are supposed to even begin answering the question of why the universe is quantum if we don’t), the obvious possibility is that this isn’t the simplest universe possible, but it’s pretty close to the simplest universe in which an observer can ask questions about physics.
So, if we’re creating a universe that contains an observer wondering about physics, what is the simplest way to make such an observer? They probably need some kind of substrate to start with. The simplest substrate is probably some amorphous material, but the observer needs to process information in order to actually observe. This suggests their substrate needs to change state in some manner that can be used for computation, at which point the most obvious simple solution is particles moving around, rather than an amorphous composition trying to move or undergo some complicated system of state changes. So, if we have particles that process information by their location, the obvious next step is to have some sort of force that can push them around. After all, an observer cannot exist without creating some sort of bias towards affecting something in a manner correlated with what they observed, so if our information processing is particle position, pushing or pulling on those particles seems like the simplest way to affect that processing to allow for thought or perception. So we have particles and force. If a force is to push the particles, it needs some reference point, some way of telling what direction it’s supposed to push in. Since we already have particles, why not use other particles to provide that reference point? So now we have particles exerting forces on each other. While these forces could push in all manner of directions and patterns, let’s start with attraction and repulsion as some of the simplest possibilities.
So if we have a universe of particles with attractive and repulsive forces, that sounds like a nice, elegant way to start trying to craft an observer. Notably, this is not a quantum universe yet; we have classical particles and classical forces, and this is much simpler than trying to add in quantum effects. However, we have a problem. Attractive particles will tend to just collapse down into a frozen state of maximum compression, while repulsive particles will tend to blow apart. Neither of these states seems conducive to information processing, which we presumably need to have an observer: the collapse state will tend to be unchanging, rather than reacting to new information, while the explosion state will tend to have the particles stop interacting to a meaningful degree (unless the forces have unlimited range, at which point it’s hard to have an observer without it being disrupted by the rest of the universe (unless your observer is the entire universe, which is not what we observe in our world, and sounds substantially more complex than simply having a small observer in a larger world)). Thus, we need a form of bracing, a way to have particles interact and attract without simply collapsing into each other. If we consider our particles’ motion in terms of potential energy, i.e. the position in a potential well that can be used to push the particles around, a very simple and elegant way of bracing is to have the laws of physics basically just assert that bracing: energy cannot easily fall below a certain level. If energy is continuous, one would expect it to continuously bleed away in an attractive potential well. But if energy is quantized, that quantization can brace particles in attraction with enough distance for other interactions to occur.
This would lead to a universe that is technically quantum, with energy only existing at certain levels. However, so far, we have a universe that looks nothing like our actual one: energy may be quantum, but the particles are purely classical points, rather than wave functions or anything like them. Can we find a reason for wave functions? Well, if our particles are points, then potential energy from particle attraction is purely a function of distance (for a given pair of particles; there could be other variables like degree of attraction or momentum of the attracted particles that might generate different amounts of energy between different combinations of particles at a given distance, but for any given pair of particles their potential energy from their attraction would vary with distance). If energy is quantized, and energy depends on distance, that would mean that distances between particles have to be quantized. For two particles, that would be possible, but for larger numbers, and with particles moving, the geometry would quickly become impossible. Fitting particles together such that at every time step, all of the distances match the allowed distances would not work with as few as three particles in the system.
Thus, point particles do not work, and they need to spread out to allow for energy to remain quantized. Rather than using a point with a single value, we need a function. Let’s use a simple function like a sine wave. Trying to have a particle with negative presence where the sine is negative sounds like a headache, so let’s square the sine so we don’t have to bother with that. What do we have now? For one thing, the sine wave has resonance if confined in a potential well (if its square is reflecting particle presence, then it has to have value zero at the edges of the well. Moreover, if it isn’t in a potential well, we’re not dealing with potential degenerate collapses, so we really only care about energy quantization in potential wells), which provides a nice source of the quantization we need; if anything that’s simpler and more elegant than just saying “this point particle can’t get within a nanometer of this other one, because reasons”. And crucially, this solves the problem of fitting particles together in a manner that allows them all to maintain quantized energy states. Consider an electron and a proton. They attract, and the proton creates a spherical potential well confining the electron. If, rather than being a point, the electron is spread out with its presence dictated by the square of a truncated sine wave, the simplest way this can happen is for that wave to be half a period. Square that, and you get an electron that is largely collapsed on the proton, but with some presence surrounding it. What happens if another proton happens by at an arbitrary range (i.e. the situation that quantized energy with point particles could not handle)? Now both protons together project a potential well, this time elliptical. The electron maintains a half-period wave function across the well, which when squared produces an electron primarily halfway in between the two protons, with presence projecting throughout the elliptical well, though decaying quickly. 
In other words, we have a system that does not experience degenerate collapse that would prevent further evolution in response to outside effects, and which can maintain its ability to keep evolving in the presence of arbitrary outside particles. Classical physics fails the first test, and quantum physics with classical particles fails the second. Thus, quantum physics with wave functions sounds suspiciously like one of the simplest possible systems that could lead to an observer, and thus by the anthropic principle, it makes sense that our universe is quantum in much the manner we observe.
I would like to reiterate not to take the above too seriously! This is just wild speculation, engaged in for my personal amusement and hopefully the entertainment of Scott and the other blog readers. This is not a serious theory of why quantum mechanics exists; it’s more the intellectual equivalent of a Banzai charge: ignore the overwhelming reasons why progress here is incredibly difficult and just try to advance. Nevertheless, it is interesting that quantum mechanics with wave functions does seem like one of the simplest ways a universe could exist in which matter wouldn’t simply collapse or explode, and thus one of the simplest universes in which we might expect to find ourselves.
Comment #169 January 26th, 2022 at 1:57 pm
A cellular-automaton universe would be deterministic. Humans would have no free will. If you want a world where some beings have consciousness and free will, then QM is the best known theory that allows for that possibility. You could add a stochastic process to a CA theory, but you might end up needing something like QM to explain that stochastic process.
Comment #170 January 26th, 2022 at 2:09 pm
Scott.
There's obviously a great deal that could be said about your questions. I'll just offer one thought that I've often had. Suppose "God" wanted to create a universe in which beings could experience some kind of "freedom" (or at least the appearance of it), yet still have the whole thing work. QM seems like exactly the sort of theory you would need for something like that. On the one hand, things must always approach a classical limit once the scale gets large enough, yet particles appear free to behave in an unconstrained manner at small scales. You've often said that the randomness inherent in QM isn't really "free will", but I think it's likely that randomness is just what "free will" looks like from the outside, and conversely, "free will" is what randomness feels like from the inside. The sort of automaton-based universe you propose wouldn't allow anything like this. Also, if random behavior (i.e., freedom) were completely unconstrained, you would end up with a completely chaotic and unstructured universe, which wouldn't be interesting.
Now I have a question for you:
Is grade-school arithmetic intractable?
Specifically, suppose you have a sequence of m signed rationals p_i/q_i, where each p_i and q_i is an n-bit integer. Consider the decision problem of deciding whether their sum is positive. Can that be decided in polynomial time?
The reason this problem seems interesting is that I suspect that the answer to that question is no (basically because the common denominator can be exponentially large and the difference from zero can be as small as 1/lcd), yet I think it can be decided in PSPACE because both division and addition can stream their output bits using only a polynomial amount of intermediate storage space.
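For concreteness, here is the exact-arithmetic route in Python (names are mine), using `fractions.Fraction`, which keeps the running sum over a common denominator. Note that the common denominator divides the product of the q_i, so its bit-length is at most m·n; the interesting part is exactly the case where massive cancellation leaves a sum tiny compared to that denominator:

```python
from fractions import Fraction

def sum_sign(pairs):
    """Return 1, 0, or -1 for the sign of sum(p/q) over (p, q) pairs,
    computed exactly with rational arithmetic (no rounding error)."""
    total = sum(Fraction(p, q) for p, q in pairs)
    return (total > 0) - (total < 0)

# A sum whose first three terms cancel exactly, so the sign hinges on a
# term a million times smaller: 1/3 + 1/5 - 8/15 + 1/10**6.
print(sum_sign([(1, 3), (1, 5), (-8, 15), (1, 10**6)]))  # 1
```

Any fixed-precision floating-point approach can be defeated by making the residual term small enough, which is why the exact (but space-hungry) route is the natural baseline here.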
Comment #171 January 26th, 2022 at 2:16 pm
Why Quantum Mechanics? I suspect that's the only theory that you can interpret either as deterministic (MWI, etc.) or nondeterministic (Penrose, etc.); as L1-based and continuous (Bohm, etc.) or L2-based and discrete (Aaronson, etc.); as one where consciousness is special (Wigner, etc.) or not (Zurek, etc.); and so on. In Bohr's words (stolen from le Nielsen&Chuang), a world where the opposite of a deep truth may well be another deep truth. In other words, my version of the Weirdness Hypothesis (re Chow #59).
But why the Weirdness Hypothesis? I suspect that our brains do construct our subjective perceptions of reality using neural ensembles that can flip as in the rabbit–duck illusion. You can think blue tribe, or red tribe, and that’s actually just a flip in one of a few key neural ensembles describing your worldview. In other words, the universe is what it is so that you can understand it no matter if you are blue tribe or red tribe, or neither (grey tribe) or both (apolitical). In other words, my version of the anthropic principle.
As always, anthropic principles tend to lack meat, but noticeably less so for Q1: the Universe *is* classical, if you want to interpret Her this way. So Q1 is only asking why *you* prefer to think in terms of L2. Maybe some alien birds prefer quaternions because that's a better fit for/from 3-D flight simulation? Anyway, thanks for the fun this question brings. 🙂
Comment #172 January 26th, 2022 at 2:23 pm
Question for chemists: How often can you say “It is only because the constituents of these molecules are in various superpositions of states and interfering with each other that this phenomenon can happen”?
The more I think of Q1, the more I think you will only get a satisfactory response from looking at a quality attempt at creating a CA/classical foundation that leads to complex chemistry. Then look at the biggest hurdles (or lack thereof!). If you see someone do a very good job at coming up with a theory, but she still says “I’ve tried every which way, but without *insert QM property here* I just can’t attain the required level of complexity” then you’ve got something to hold onto.
Right now if someone gives you what seems like a good answer why QM is not necessary, there will always be some uncertainty until you can see the idea in “action”.
Comment #173 January 26th, 2022 at 2:28 pm
Comparing various models of universes first requires establishing everything clearly in terms of equivalent resources.
If space and time are described as a continuum, resources become infinite and it’s impossible to simulate anything perfectly in order to compare models objectively from a computational-resource point of view.
E.g. simulating perfectly even something as simple as a classical double pendulum (in a space/time where positions are described by reals) is impossible given any finite amount of computational resources.
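Concretely, a toy sketch (the chaotic logistic map stands in for the double pendulum here, since only the exponential sensitivity to initial conditions matters): two trajectories whose initial conditions differ by one part in 10^12 end up macroscopically far apart after a few dozen steps, so no finite-precision simulation can track the “true” real-valued evolution indefinitely.

```python
# Toy stand-in for the double pendulum: the logistic map at r = 4 is
# chaotic, so nearby real-valued initial conditions diverge exponentially.
def iterate(x, steps, r=4.0):
    history = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        history.append(x)
    return history

a = iterate(0.3, 60)            # one initial condition
b = iterate(0.3 + 1e-12, 60)    # perturbed by one part in 10^12
max_sep = max(abs(p, ) if False else abs(p - q) for p, q in zip(a, b))
# the two finite-precision trajectories become macroscopically different,
# even though they started indistinguishably close
```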
If things were fundamentally discrete, we could hope to maybe show that one model of computation can’t be “compressed”: the full set of resources and computational steps is required to reproduce a certain evolution from initial conditions, with no possible shortcut. Then we could declare that this model is truly fundamental (though the problem is that we can never be sure a given computation is the shortest/cheapest one; Kolmogorov complexity is uncomputable, etc.).
So:
1) What are the fundamental resources (how are time and space represented)?
2) Is the evolution (as a computation) compressible or not?
I guess this is pretty much what S. Wolfram is doing: using very basic, finite, discrete structures at the bottom (basically graphs), evolving them using very simple computational rules, and observing the emergence of things like QM, general relativity, and maybe the Standard Model.
Comment #174 January 26th, 2022 at 2:42 pm
Scott, from your comments you are definitely not a physicist (this is not personal, I could say the same to many computer scientists). Let’s take a restaurant comparison: given a good dish, a physicist will appreciate it, describe it, maybe measure it, and draw some conclusions that allow other folks to recognize this dish, know what can be expected from it, and compare it with other ones. But you want much more than that: you want to know all the tricks used by the Grand Chef to design the dish – even more, you want to be the Grand Chef himself, to enter all his most intimate thoughts.
Back to reality, there may or may not be a Grand Chef; anyway, the point is that in physics the experiments tell us what is going on, and what must be described and predicted. This is the true problem that the creators of QM had to solve, in the first quarter of the 20th century – and they did find an answer. If you ignore this very down-to-earth reality, and want to move to pure abstraction (or prescription), asking ‘why this, and not that’, I’m afraid that you will get lost in metaphysics. Please have a look at https://arxiv.org/abs/2105.14448; it is short and easy to read (except maybe the Appendix), and it may suggest an interesting way to go.
Comment #175 January 26th, 2022 at 2:45 pm
I know that Scott doesn’t like the argument: “quantum mechanics is mysterious!… consciousness is mysterious!… therefore the two must be related!”
But… maybe the fastest way to prove that QM is required is to show that consciousness is fundamentally a quantum process!
For now we can say that we don’t observe the opposite:
if consciousness were just the result of simple digital computations, then it wouldn’t require all the fanciness of quantum mechanics on a spacetime continuum to “summon” rich worlds teeming with conscious beings: nature/God would just have had to find ways to summon something as simple as the Game of Life, i.e. we’d be living in an actual universe that’s pretty much like playing Grand Theft Auto in Virtual Reality.
Comment #176 January 26th, 2022 at 3:05 pm
@Comment 169:
It is unclear either that humans have free will or that quantum mechanics makes free will possible. The concept of free will seems rather awkward: if our choices are deterministic then we clearly don’t have free will, and random choices don’t seem to fit the bill either. What is neither deterministic nor random? In addition, the many-worlds model is probably the most likely form of quantum mechanics; in many worlds you end up in all branches of the wave function rather than choosing. Moreover, while quantum randomness is certainly a thing (from our point of view, if perhaps not from the point of view of an observer outside the universal wave function), as far as I know, human decisions are not generally affected by it. Many phenomena are effectively deterministic. Light reflecting off a mirror is a quantum process, but the way the wave functions add up results in the angle of reflection always equaling the angle of incidence. We would not say that because quantum mechanics exists, we cannot figure out how light will reflect!
Comment #177 January 26th, 2022 at 3:12 pm
Theories are tied to observations.
Why the hell didn’t Newton go beyond his elegant theory of force and gravity and see that General Relativity was an even better and more complete theory? Because Newton’s theory fit the data as well as one could ever hope to at the time. After all, how much simpler than F = Gm1m2/r^2 can it get?!
He would have been quite astonished to see that the GR equations are the more fundamental, more accurate answer.
Whether it’s 1700 or 2022, the same amount of caution should be used.
Especially when we’re so clearly not done (quantum gravity, the measurement problem, dark energy, black holes, etc, etc).
Comment #178 January 26th, 2022 at 3:13 pm
I’m afraid I don’t have anything extremely interesting to add to this conversation (which comes with having more of a biology background, I suppose — this isn’t my field by any stretch!), but for what it’s worth: while I recognise the non-stupidity of the Two Questions, I have never agonised about them overmuch because I’ve long been drawn to something like the Tegmarkian answer. Any universe-system that can be described in self-consistent mathematical terms “exists”, not because some God or matrix-engineer decided to run every self-consistent mathematical universe on a physical substrate for whatever reason, but because it really is math all the way down; we’re all just pieces of abstractly-“existing” maths, perceiving themselves.
(I say “universe-system” rather than “universe” because at that level of thinking, it seems to me that the complete MWI branching set of “universes” we inhabit should be taken as a whole and considered one of the distinct, mathematically-independent “worlds”; if there is a universe that runs on purely Newtonian physics, it stands as the ‘equal’ of our entire Many Worlds multiverse, not of any one branch of the wave-function.)
Where I diverge from “serious” Tegmarkians is that I’ve never seen the need to concern ourselves with the idea that more complex universes might be “less” “real” than simpler ones, and the ramifications of this idea in terms of whether QM really is “simplest”. This is what makes me feel a little inadequate bringing it up, because it makes my ‘answer’ a bit of a conversation-stopper… there’s very little left to think about or investigate. But I can’t help the fact that I feel it’s just… true. If there are any orthodox Tegmarkians in the audience, and if our good host Scott will permit, I would appreciate an explanation of why you add this extra term to the proposition, rather than sticking with an uncompromising, straightforward “if you can compute it, it’s Real, end of story”. Am I missing something?
Comment #179 January 26th, 2022 at 3:17 pm
Suppose God was going over the construction of Reality one day and decided that the weird correlations in QM that give us Bell’s results were annoying, and so decided to get rid of them. What would the effect of that be? I realise that human-made artifacts such as quantum computers would no longer work, but what about the natural world? What changes would we observe? Does anyone have an answer or is it a somewhat meaningless question?
Comment #180 January 26th, 2022 at 3:21 pm
Which of Hardy’s axioms do we find strange? I don’t know yet what to make of the composite systems axiom used to rule out real-valued amplitudes. However, I feel the rest come from a desire to get the best of both worlds: continuous and discrete. Among classical theories, continuous ones have more symmetries, which we want. However, lacking a natural length scale or uncertainty principle, they allow essentially unbounded amounts of information and computation within finitely bounded regions. Following Hardy’s reasoning, it seems QM-like theories come from any attempt to resolve this tension.
Comment #181 January 26th, 2022 at 3:42 pm
As stated, Q1 is easy. 😛 Where would the fun be for God in creating a classical, i.e. deterministic, Universe? All events in such a world would be equivalent to the initial conditions. Any chaos-type, complexity-driven pseudo-randomness would be unbearably boring to such an almighty Being.
In other words, why would the Universe have a time dimension, if everything was pre-determined? It seems superfluous, not something a God would bother with. A Cauchy slice should be enough for this Creator. A painting, rather than a movie. Quantum mechanical (true) randomness is the source of all adventure. In fact, one could argue that the Newtonian view of position and its rate of change being known in the same instant, making a trajectory equivalent to any of its points and turning us all into hapless automata, is the one that should look suspicious. Negating the observer is no way to go in a science based entirely on observer data. 😉
With Q1 answered 😂 Q2 becomes: Which mathematical structures are appropriate for a theory of physical phenomena that are fundamentally random? Now that looks like it might involve some non-trivial mathematics, but at least we know the answer (a complex Hilbert space?).
Comment #182 January 26th, 2022 at 4:42 pm
Obviously we’re all flirting with crackpottery here, but given that: my guess would have to do with there being anything at all. How to create a universe out of nothing, stably, without blowing up, generating not a formless blob but intricate complex structure, however you want to characterize that. That somehow — and no, I can’t say exactly how — QM (or QFT, or whatever the right generalization of QFT is) allows that, or even insists on it, and no other system would. Once you’ve got a lot of stuff, there might be many possibilities as to the dynamics that drives it. But how do you self-consistently get a lot of stuff, and interesting, complex stuff?
Comment #183 January 26th, 2022 at 5:00 pm
Philippe Grangier #174:
Scott, from your comments you are definitely not a physicist (this is not personal, I could say the same to many computer scientists). Let’s take a restaurant comparison: given a good dish, a physicist will appreciate it, describe it, maybe measure it, and draw some conclusions that allow other folks to recognize this dish, know what can be expected from it, and compare it with other ones. But you want much more than that: you want to know all the tricks used by the Grand Chef to design the dish – even more, you want to be the Grand Chef himself, to enter all his most intimate thoughts.
“What really interests me is whether God had any choice in the creation of the world.” –my fellow computer scientist, Albert E. 😀
Comment #184 January 26th, 2022 at 5:01 pm
“There is a loophole, but I’d say it’s so extreme as to prove the rule. Namely, one can’t rule out that someone used, e.g., a giant Game of Life board to create a “Matrix” that’s running our entire quantum-mechanical universe as a computer simulation! To me, though, this just seems like an instance of a more general point: namely, that nothing in physics can rule out the possibility that the whole observed universe is a simulation, an illusion, or a lie. (The idea of “superdeterminism” goes to this same extreme, even though it strenuously denies doing so.)”
Holy cow. I imagine the creator coming up with such a mindfuck idea for a simulation. He must have been like, “shite, this would be crazy 🤪. Just need to find a big board. I might as well simulate that one observer (Scott) realizes this brilliant idea and comments about it on a random remote niche blog. No verification algorithm can be simulated on the board to disprove this, btw.”
Comment #185 January 26th, 2022 at 5:26 pm
Re #161:
“Einstein, famously, explained the Lorentz transformations as just logical consequences of the equivalence of inertial frames plus the special role of the speed of light.”
If this is the sort of thing you have in mind, then I think one needs to think through this case more clearly. From the point of view of GR 1) there generically are no global inertial frames, or even exact local ones and 2) there is no such objective physical quantity as the “speed” of anything, including light. Therefore this “explanation”, whatever you have in mind, is not only not simple, it is incorrect. And not for subtle reasons like QM. It’s just not what really explains anything, even from the point of view of Relativity.
Comment #186 January 26th, 2022 at 5:44 pm
Why do you except there should be a satisfying answer ? Unless we take “God” literally in your question (as we should not, based on your comment #24), I don’t see any reason to believe we can expect an answer to it. Your position seems to assume that the rules of the universe have been defined optimally in some sense. Indeed, the word “God” in your question is not just a proxy for “the rules of Nature”, but implicitly justifies this additional assumption of well-chosen rules. This choice of words could be a hint that the assumption stems from the conscious or unconscious influence of millennia of religious thinking. In my opinion, using “God” in these kinds of questions, even jokingly, opens the door to many biases and wrong preconceptions.
Maybe there is an answer that we would find more pleasant than “it’s just how the world is”, but to me that would be a good surprise rather than a necessity.
Comment #187 January 26th, 2022 at 6:18 pm
Scott #161
“What, if any, are the more basic principles that, once accepted, would make QM an unsurprising consequence?”
(1). The notion of “Objective Reality” is a limit that only makes sense after infinite elapsed time from the perspective of observers within reality.
(2). The basic design principle of reality is ‘Actualization’. Reality begins from a ground state of ‘possible worlds’ (non-constructive math), some of which start to get actualized (become actual worlds). Actualization simply means that an *objective* description of these worlds can increasingly be given purely in terms of computation (i.e., constructive mathematics). From (1), this process continues forever; worlds are always only in various *degrees* of actualization, which is the *measure* of their existence.
(3). To ‘actualize’ reality, there are 3 conditions:
(a). the whole can be decomposed into understandable parts (compositionality)
(b). the parts can combine into larger integrated systems (complexity)
(c). the parts affect each other in limited, logical ways (causality)
(4). Quantum mechanics is simply a special case of the general ‘theory of actualization’, which explains the physics of conditions (a), (b) & (c) above. The 3 conditions together give reality the property of ‘comprehensibility’, which is equivalent to ‘actualization’. Comprehensibility is the ease with which observers within reality can understand it.
(5). Hilbert space is only a description of the space of possible worlds; it does not account for the actual process of actualization (properties a, b, c), which are expressed as: (a) computational topology, (b) function spaces, (c) computational geometry.
(6). The full ‘theory of actualization’ is about the mapping between (1) Hilbert space, (2) computational geometry & (3) space-time. (1) is about the ground state of reality (the space of possible worlds), (2) is about the actualization of reality (how reality is made comprehensible) & (3) is the actualized structure of reality (the observed physical world).
Simple 😀
Comment #188 January 26th, 2022 at 6:24 pm
There are three properties of the universe that are easy to implement in simulations if you give up one of them, but I don’t see any way to implement all of them together, and I suspect quantum mechanics is the most simple solution that has these properties.
1. Seemingly Euclidean geometry (unlike cellular automata, which allow only a discrete set of rotations). I say seemingly because gravity distorts it a bit.
2. Discreteness of states. Everything is obviously easy with infinite constructions.
3. Reversibility. (That’s the unitarity of time evolution.)
(4? Turing completeness / non-trivial interactions / anything preventing you from just calling a non-interacting model a solution.)
Go ahead and try to satisfy all 3 of those conditions. 1, 2 without 3 is easy: just use floating-point numbers for your simulation like most physics engines. Giving up 1, you can satisfy 2, 3 with Wolfram / Gerard ’t Hooft style universes, but you’ll never recover Euclidean geometry. And giving up 2, that’s just classical physics with real numbers, but it can never run on a computer.
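A two-line sketch of why the floating-point route gives up property 3: an update step followed by its exact algebraic inverse need not return the original state, because rounding discards information.

```python
# Rounding makes float-based updates irreversible: applying a step and
# then its exact algebraic inverse need not recover the original state.
x0 = 0.1
x1 = x0 + 0.2   # "forward" update
x2 = x1 - 0.2   # exact algebraic inverse of the update
# x2 is 0.10000000000000003, not 0.1: a bit of state was destroyed,
# so the simulated dynamics cannot be run backwards exactly
```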
Whether our universe has discrete states is still an open question in physics, but assuming it does, I think it’s an absolutely amazing feat to achieve just those 3 properties of our universe. I challenge you to implement all 3 of them in any toy universe.
How is quantum mechanics related to those 3 properties? If you try to satisfy them naively, with particles that have definite positions and just move every time step, you will have to have a finite number of directions to move in every time step, and no matter how much you zoom out, the metric will never be Euclidean.
(You can try to implement a universe with rational numbers as coordinates, but then you have to give up on locality, or you’ll end up with extremely weak interactions.)
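A quick sketch of the zooming-out point, assuming the simplest 4-direction grid: the shortest-path (taxicab) metric stays direction-dependent at every scale, so coarse-graining alone never recovers rotational invariance.

```python
import math

def taxicab(dx, dy):
    # shortest path length on a grid that allows only the 4 axis moves
    return abs(dx) + abs(dy)

# Ratio of grid distance to Euclidean distance along the diagonal:
# it is sqrt(2) ~ 1.414 at every scale, while along an axis it is 1.
# The anisotropy survives arbitrary zooming out.
ratios = [taxicab(n, n) / math.hypot(n, n) for n in (10, 1_000, 100_000)]
```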
So I suspect that by smearing particles over space, as quantum mechanics does, you can get a universe that seems Euclidean, yet with discrete states and without doing irreversible things like floating-point rounding. It’s all about finite-dimensional representations of rotation groups. Spin and angular momentum are all ways to implement smooth rotations with a discrete number of states and in a completely reversible way.
Even the weirdness of Bell tests is all about your ability to rotate freely and measure at different angles. If you had just 6 spatial directions, as on a grid, you couldn’t do it.
Spin and qubits are not there for superposition or quantum weirdness or randomness; they are just the simplest representation of SU(2), and they are required to make a universe with Euclidean geometry, discrete states, and reversibility.
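A minimal sketch of that last claim (a pure-Python toy; the function names are my own): a single-qubit rotation R(θ) exists for every real angle θ and is exactly undone by R(−θ), i.e. a continuously parameterized, reversible rotation acting on a system with just two basis states.

```python
import math

def ry(theta):
    """Rotation of a 2-basis-state (qubit-like) system by angle theta."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def matmul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 0.7234                     # any real angle: the group is continuous
m = matmul(ry(theta), ry(-theta))  # rotate, then rotate back
# m is the identity (up to float rounding): the evolution is reversible
```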
Comment #189 January 26th, 2022 at 6:25 pm
entirelyuseless #46:
I would like to signal boost this profound observation, and hopefully take it a bit further. Research proposal:
1. Create an environment in a stylized universe like Conway’s Life or Fredkin and Toffoli’s billiard-ball model.
2. Create an agent that is part of this environment. We can assume that the universe is Turing-complete. The agent has to be smart enough to create models about its environment, but stylized enough so that we do not have to design a full-blown artificial general intelligence. It is not intelligence that matters for our purposes, but the ability to alter, predict and model one’s surroundings. We do not even have to consider constraints like how the agent evolved; we can simply assume that it appeared in its full glory as some Boltzmann-brain-with-eyes-and-hands. But it is important that the brain is part of its environment, not some ghost in a machine.
3. Formalize what it means for the agent to make observations and models about its environment. Figure out what kind of observations the agent is capable of doing in principle.
4. Investigate the relationship between the actual laws of the universe, and the laws of the universe as discovered by the agent.
Adding my own spin to this: I really believe we need to consider thermodynamics next to (or immediately after) QM when working on Scott’s Q. I’m saying this because we can’t answer the question without talking about time, and according to the block universe viewpoint I subscribe to, we can’t understand time without understanding how it emerges from the more fundamental notion of spacetime via thermodynamics. So I would add an extra constraint to the above research proposal. This constraint is a huge PITA for us human investigators, but might bring us closer to something valuable:
5. Do not “hardwire” the concept of time into the laws of the toy universe. Rather, any concept of time should reference the agent’s internals, similarly to how GR defines time via ticking clocks, which are very specific patterns in spacetime.
So the order in which the concepts build on each other is supposed to be 1. spacetime 2. agent 3. states/memories of agent. 4. time. That’s a very counterintuitive approach for us humans, but IMHO it is the right one. (And to dispel a possible misunderstanding in advance: this does not mean that time is subjective. I mean, time is kind of subjective in the above proposed toy block universe inhabited by a single Boltzmann brain Turing machine, but in our universe, it is not more and not less subjective than what’s prescribed by GR.)
Comment #190 January 26th, 2022 at 7:10 pm
Denis Kuperberg #186:
Why do you except [sic] there should be a satisfying answer ?
Isn’t the entire point in science that, whatever you know, you try to explain as a consequence of something deeper … and when you find it, you try in turn to explain that, and so on as long as you can? Yes, maybe you and everyone else will eventually give up in despair, or maybe someone will explain why there’s no explanation of the sort you’d wanted (as with individual quantum measurement outcomes), and then that particular direction of inquiry is at an end. Until one of those things happens, though, the hunt for explanatory satisfaction continues, just like it did for Darwin and Boltzmann and Einstein and all the rest who refused to shrug and say “it’s that way because it just is”!
Comment #191 January 26th, 2022 at 7:20 pm
Oh boy! First, you know way more QM than humble me… and I hope you’ll do a Bell inequality post (with photons) someday. Second I didn’t read all the comments, so this has probably been said.
QM (as a model of the world) is true because of experiment (as you said), that’s the ‘how’ of the world that science can do. The ‘why’ of the world does not seem open to science. It is a fun parlor game to ask what would be different if various fundamental constants were changed by a factor of ten.
On a personal level, I like the idea that it’s hard to confine a thing too tightly.
Comment #192 January 26th, 2022 at 7:27 pm
Interesting topic…
Comment #193 January 26th, 2022 at 7:31 pm
Peter Gerdes #122:
Look, I don’t really believe this but it’s an interesting line of thought.
Suppose you are some kind of dualist (in the sense of us having something like souls, or being in something like the Matrix) and you want rules for the physical world that don’t expose this fact. In other words, you want our choices to be governed by the operation of something outside the physical world, but you want the physical world to appear causally closed even when you look at neurons under the microscope.
QM gives you a really nice way to do this as you can simply evolve the wave function forward globally and then choose the branch you want to take globally based on which best matches the desires of the non-physical ‘souls’ or individuals externally placed into the simulation or whatever.
Some others suggested this too. Since this post is already tagged “Embarrassing Myself,” I might as well come out and say it: I’ve thought for a long time that something in this general vicinity is one of the main possibilities on the table. Having said that, it’s actually nontrivial to spell out what QM gives you, that you couldn’t have equally well gotten from some classical theory with randomness and nondeterminism! And it’s also nontrivial to spell out why we’d still see the Born rule holding always and everywhere as far as we can test it, even when we look at neurons under a microscope, etc.—if so, then where does the Knightian unpredictability come in?
My Ghost in the Quantum Turing Machine essay, from back in 2013, represented one attempt to articulate an answer to these questions. I didn’t focus in that essay on the “explaining QM” angle, or on the possibility of other, non-quantum physical theories that could give rise to the same sort of Knightian unpredictability, but that would be a direction to go if I ever revisited these ideas.
Comment #194 January 26th, 2022 at 7:54 pm
Scott #190
> Isn’t the entire point in science that, whatever you know, you try to explain as a consequence of something deeper … and when you find it, you try in turn to explain that, and so on as long as you can?
I wouldn’t say that was the entire point of science. Certainly providing a reasonably strong rational foundation for our efforts to engineer a world more in accord with our desires seems like it would alone be reason enough to pursue science and mathematics.
As for whether science can ever provide a satisfactory explanation for our deepest questions about why things are the way they are, I don’t think so. I do however think that asking such questions is an important stage of the path to the truth.
The problem with science is that it gets its epistemology exactly backwards. It starts with what it sees “out there” when in fact what we see “out there” is something we can never have certain knowledge of, as most philosophers have understood at least since Plato. The one thing that we do have certain knowledge of is “I AM” and I think the fact that your own religious tradition equates that statement with the name of God is something it would be worth reflecting on.
PS. As for the question I asked in my last comment, sorry, I fear my feeble monkey brain failed me once again and I forgot that arithmetic is polylog time, not polynomial time, in the quantities on which it operates.
Comment #195 January 26th, 2022 at 7:57 pm
Tim Maudlin #123:
I get that *as a computer scientist* you regard cellular automata as “simple”. But thinking as a physicist, in a world where any sort of computer (and especially a digital computer) is a very complex, emergent thing, I don’t see *any* of them as simple. You think in terms of *writing* or *abstractly specifying* the rules under which they operate. I think in terms of *physically implementing* those rules, which requires a lot of complexity. So what comes naturally from your discipline as a “simple” system looks to me extremely complex. For example, actually physically implementing Conway’s rules for Life is a hell of a lot more complex than, say, F = ma, which isn’t a computation at all.
If you let yourself take the laws of physics that we happen to find in this universe as the standard of simplicity, then of course anything else is going to look more complicated by comparison! But that’s obviously rigging the game.
If you compare, say, Conway’s Game of Life against Newtonian mechanics in any non-rigged contest of simplicity, one that wasn’t specifically constructed to favor the latter, then it seems to me that Conway’s Game of Life wins hands down, just wipes the floor with Newtonian mechanics. It’s not just that it’s easier to write a program to simulate it in any programming language I’ve ever seen or heard of (though it is). The Game of Life is also immensely easier than Newtonian mechanics to explain to a child (believe me, I’ve tried both!). It would be far simpler to define in ZF set theory. If it can ever mean anything objective for one thing to be simpler than another thing, then the Game of Life, with its grid of bits replacing real-valued positions, velocities, accelerations, and masses, is simpler.
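To make the first point concrete, here’s a minimal sketch of the entire Game of Life rule in about a dozen lines of Python (the coordinate convention and the glider test pattern are just illustrative choices):

```python
from itertools import product

def step(live):
    """One Game of Life generation; `live` is the set of live (x, y) cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                n = (x + dx, y + dy)
                counts[n] = counts.get(n, 0) + 1
    # a cell is born with exactly 3 live neighbors; survives with 2 or 3
    return {c for c, k in counts.items() if k == 3 or (k == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
# after 4 generations the glider has shifted one cell diagonally
```

Compare that to what it takes to specify real-valued positions, velocities, and forces, and numerically integrate them.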
But of course, the Game of Life is not our world, which means that we have an actual opportunity to learn something important! Is there, for example, a compelling a-priori reason why we should never have expected to live in a world that wasn’t Lorentz-invariant, or at least Galilean-invariant, or at least rotationally invariant? Maybe thinking about it will uncover reasons that weren’t immediately obvious. You’re a philosopher; you believe in thinking, right? 😀
Comment #196 January 26th, 2022 at 8:07 pm
I think you are overly dismissive of argument #2 that the world as we know it would be impossible without quantum mechanics. In order for us to be having this discussion at all, the laws of physics need to have the ability to generate interesting complex structures in a reasonable amount of time starting from a simple initial state. Now I know that as a computer scientist you are trained to think that is a trivial problem because of Turing completeness, universality, blah blah blah, but really I don’t think it is so simple. Why should the laws of physics allow a Turing machine to be built? And even if a Turing machine is possible, why should one exist? I think the CS intuition that “most things are universal” comes with baked-in assumptions about the stability of matter and the existence of low-entropy objects, and I think it is not so easy to achieve these with arbitrary laws of physics.
Comment #197 January 26th, 2022 at 8:19 pm
Scott #192
You may be interested in this paper, which derives that the only possible reference-frame transformations are Galilean and Lorentz, given 4 intuitive axioms: http://o.castera.free.fr/pdf/One_more_derivation.pdf
Comment #198 January 26th, 2022 at 8:33 pm
Maybe God was trying to maximize the computational power of the universe without making it as all-powerful as he/she/it is. Can you imagine a set of physical laws that allows computers more powerful than quantum computers but doesn’t destroy computational complexity?
P=NP might be a bad idea.
Comment #199 January 26th, 2022 at 8:41 pm
@Scott #99:
“But, to say it one more time, what would’ve been wrong with a totally different starting point—let’s say, a classical cellular automaton? Sure, it wouldn’t lead to our physics, but it would lead to some physics that was computationally universal and presumably able to support complex life (at least, until I see a good argument otherwise). ”
I have a meta-question regarding this exercise. Suppose for the sake of argument that your conjecture is correct and it is indeed possible to write down a set of rules for a CA, the emergent behavior of which is sufficient to support complex life. Not only that, but the complex life supported by this universe is sentient and experiences consciousness in a manner entirely compatible with our own experience. In fact, if you were to simulate this CA long enough (suppose this was proven constructively so you actually have the list of rules on a piece of paper) eventually there would emerge a sentient being in this universe named Aaron Scottson who asks about his own universe “what would’ve been wrong with a totally different starting point—let’s say, unitary evolution of vectors in Hilbert space?”
All of this is a long-winded way to ask: Suppose there really wasn’t anything wrong with choosing a different, classical, starting point. How would this knowledge morph the nature of your inquiry into QM?
Put another way, what would be the necessary conditions needed to, in your mind, transform this from a physics question into a pure metaphysics question?
Comment #200 January 26th, 2022 at 9:23 pm
Also, Tim Maudlin #123:
Finally—and I’m a little upset to have to be even writing this—the additional local “variables” (beables) in Bohemian mechanics are neither emergent (they are fundamental) nor hidden (collectively, they are what you can most easily see). “Standard QM”, if you mean by that what you find in standard textbooks, just is not a well-defined physical theory with well-defined ontology or laws. So it isn’t even in the ballpark of answering the questions you are asking. Bohemian Mechanics—whatever other issues you have with it—is.
Tim, I think you should totally develop Bohemian Mechanics. 🙂
Seriously, though, I told you we weren’t going to agree, but FWIW, I’m well-aware of the fact that the “beables” (my phone keeps recommending “Beatles”) are the only observable thing in BM. The trouble, as you know as well as I, is that either a beable value is tied up with something decohered, macroscopic, and classical, in which case every interpretation of QM will just talk unproblematically about its value (or its value in “our” branch of the wavefunction) with no need for the beable, or else the beable value is tied up in an as-yet-unmeasured quantum state, in which case our knowledge of the probability distribution over beable values is simply whatever follows from our knowledge of the quantum state itself. In neither case does consideration of the beable, in practice, give us any information whatsoever that we didn’t already know. (Though the mathematical fact that one can add such things to the quantum formalism without contradiction is, I agree, important and interesting.)
Comment #201 January 26th, 2022 at 9:56 pm
Daniel Harlow #196:
I’m not sure about Turing machines, but there are decision versions of, e.g., the n-body problem that are PSPACE-hard (https://en.wikipedia.org/wiki/N-body_simulation#Computational_complexity), so even the inverse-square law, one of the simplest laws in physics, allows for non-trivial computation.
Comment #202 January 26th, 2022 at 10:06 pm
Scott #160
>To get that they’re complex, you need an axiom saying that the number of parameters needed to characterize a bipartite state should be the product of the numbers of parameters needed to characterize the individual components.
To me that seems fairly unobjectionable: the reasoning falls right in line with the Tegmarkian maximalist principle, in this case cashing out as the claim that degrees of freedom factorize, i.e., that separate components need not be entangled. From the point of view of the quotient space, superposition is the default.
Comment #203 January 26th, 2022 at 10:40 pm
Scott #167:
Ah, sorry, I didn’t explain my thoughts very well. Please allow me to try again.
“Because the question still stands: why should the universe have been such that, when we used math to ask for the answer, the answer that came back was “quantum mechanics,” rather than (say) “some classical cellular automaton,” which also would’ve involved math that we knew?”
There were two senses in which I meant “because we asked math (or the math we knew)”: First, a narrow sense and, second, a much, much broader sense.
First the narrow sense:
I’m thinking of the consensus around the end of the 19th century when it was widely held that all of the major theoretical foundations for physics had been worked out, that the universe was classical, and that what was left for science was just to fill in the details and the precision. Wasn’t it Planck’s advisor who said there was “nothing new to discover”?
What bothers me is that if you had been writing this blog in say the 1880s then your Q would have been: Why should the universe have been Newtonian/classical? And the discussion would be about why God made the universe Newtonian/classical instead of some other way … because … well, obviously Newtonian physics had passed every test.
Of course, we know what happened: experimental evidence led to the invention of the math we didn’t have – an extension of the laws of probability theory that allows negative numbers. The crucial moment, perhaps, came when Schrodinger was challenged to “go find the wave equation” and begat the amplitude model; Heisenberg developed the same idea in parallel.
But … what if Schrodinger and Heisenberg … had NOT produced the amplitude model? What if Schrodinger had become distracted in that Swiss cottage and lost his way … meanwhile Heisenberg fell ill and never recovered … until still today we just had some ad hoc Bohr-type models, not yet explained by a full theory. Meanwhile, everyone still walking around saying, “Why did the universe have to be classical”?
We did not have the math to say otherwise.
So, if an alien were to have visited Earth in the 1880s and an Earthman were to have asked the alien, “Why is the universe classical?”, I can imagine the alien might have answered, “Because of your math.”
Does your 2022 version of Q not risk making the same mistake? Maybe we think the universe is quantum mechanical because, well … that is the best-fitting tool we have in our toolbox at the moment. But to say “the universe is quantum mechanical” seems to claim more than we are entitled to claim. It seems we should be carefully modest and say, “our best model of the universe is quantum mechanical.” The former statement can lead us into Planck’s advisor’s error, or into presuming that it “had to be so”.
What I’m saying here is that when we thought the universe was classical it was because that was the math-tool we were aware of. So, back then when we said, “The universe is classical” it wasn’t because the universe was classical. It was because of … the math we knew at the time. Now we understand that the classical math-tool is actually just part of a larger math-tool called quantum theory and that we can explain previously unexplained phenomena with that larger math-tool. It seems to be tempting the gods of comedy for us to … yet again … stand up on our primate hind legs and declare … “Now! Now we know what the universe is! Not before! But now!”
I’m just afraid we will end up sounding like Planck’s advisor. Which makes me think maybe the better question is “Why do we think that the universe is quantum mechanical?” Why did Planck’s advisor think the universe was classical?
And then the much, much broader sense.
So, this is the “unreasonable effectiveness” problem. And I admit I don’t even know what to make of the problem. Basically, the problem is this: Math is just … too perfect. Isn’t it? Why do we trust math? This is a really hard question to even ask. I don’t even know really how to ask it.
Philosophers have said, “Math describes but doesn’t explain.”
And that is an issue. But I think I’m even doubting if we should trust it to “describe”.
I think this is usually called the “non-Platonic” view of mathematics. There are probably different takes on it, and I’m not at all an expert, but I guess I would fall somewhere on the view that mathematics is something like a part of our cognitive model of computation – an intrinsic, inseparable, part of how our brains create what we experience as reality. Yes, yes, of course, there is an external reality. But we only ever experience what is generated internally. So, the internal model must put it together for us by following its own … model of computation.
So, what if the universe … isn’t even mathematical … much less quantum mechanical? What if mathematics is just some kind of evolved … approximation … for something that … we can’t even imagine?
Just seems like maybe these two possibilities above, the narrow and broad sense in which mathematics could be misleading, should be considered when someone asks a question like “why is the universe quantum mechanical”? Or at least I would want someone who was writing a book/essay on this question (even if briefly) to explain why I should trust the presumptions behind the question 🙂
And I know that probably makes you want to slap your forehead: “What, this doofus wants me to justify the use of math to explain the universe?” Isn’t it obvious! Well, yes. Your question implies that the universe is mathematical. I can’t think of a “reason” why it has to be. And if it doesn’t have to be mathematical then … it doesn’t have to be quantum mechanical.
Comment #204 January 26th, 2022 at 10:46 pm
Prof. Aaronson,
I hope I’m not putting words in your mouth here, but it appears that you have concluded that quantum mechanics is mathematically equivalent to an abstract, probabilistic, operator calculus. Assumptions can then be stated to reflect the numerical values of experiments. The usual presentation in textbooks is, of course, the opposite: physics reasoning with the needed mathematics in the lemmas. Are you aware of a treatment at, say, the advanced undergraduate to beginning graduate level, which describes QM from a mathematician’s perspective – the way you understand it yourself? (That is, a text which defines QM as an operator calculus first, and adds the physical data last?)
Such a text might help us readers ask better questions, in that it would allow everyone to see whether the methods used to define such an operator calculus are truly modern, or if they would be accessible to someone from the 19th Century. If no such text exists, and you write it, I will be your first customer.
Comment #205 January 26th, 2022 at 11:04 pm
This is certainly above my pay grade, but… here’s my highest-level conceptual argument for why something like quantum mechanics might be plausibly preferable. (I’m not at all sure that I can articulate this clearly or convincingly.)
In classical physics, it’s always seemed to me extravagant to postulate an enormous high-dimensional phase space, out of which our *actual* universe will only ever occupy a volume of essentially zero size. The “existence” of the rest of that vastly larger phase space has, as far as I can see, no physical meaning whatsoever. Quantum mechanics is actually more parsimonious, in the sense that all the degrees of freedom in phase space have actual physical meaning and causal power.
In other words, an individual classical system is really always one-dimensional: a continuous map, following certain laws, from time to points in phase space. An individual quantum system, on the other hand, occupies *all* of phase space at *all* times (more densely at some points than at others).
I realize this might only be even remotely appealing to an Everettian mindset, and (separately) that someone favoring classical physics would probably deny that they’re actually “postulating” the “existence” of all of classical phase space, saying that it’s just a mathematical convenience. Also, it doesn’t distinguish quantum mechanics, in particular, from other possible theories that satisfy this condition. I’m not sure whether it could.
Comment #206 January 26th, 2022 at 11:08 pm
Scott #160: Three comments regarding arguments for complex amplitudes:
1. I believe that Chris #94 is actually referring to the fact that if time is continuous and time-evolution operators \(U(t)\) are linear and compose in the natural way, then you want to be able to take \(n\)th roots over the field of scalars, so that for every time-evolution operator \(U(t)\) there exists a corresponding operator \(U(t/n)\). As you know, this is only possible for real-valued scalars if you either use ancilla qubits or introduce a new dimension to your Hilbert space (a process that you must then iterate). I believe that Chris #94’s only mistake was in attributing this argument to Hardy, when as far as I know, you came up with it yourself.
2. You say above that “Personally, I’ve never quite understood the motivation for that axiom, other than that we ‘peeked in the back of the book’.” But in your lecture notes (https://www.scottaaronson.com/democritus/lec9.html), you say that “intuitively, it seems like the number of parameters needed to describe AB … should equal the product of the number of parameters needed to describe A and the number of parameters needed to describe B.” These two statements seem somewhat contradictory.
3. I’ve never understood the parameter-counting argument for complex amplitudes, because whether or not one finds the postulate that “real parameters should multiply in composite systems” to be intuitive, it doesn’t seem to actually hold in complex QM. That’s because an \(n\)-dimensional mixed state only has \(n^2 – 1\) real degrees of freedom after you normalize the density matrix (or equivalently, identify together density matrices that are related by a positive multiplicative constant). So in fact real degrees of freedom combine supermultiplicatively in complex QM: two subsystems with \(M\)- and \(N\)-dimensional Hilbert spaces combine together to form a composite system with \((MN)^2 – 1\) degrees of freedom, which is strictly greater than the \((M^2-1)(N^2-1)\) degrees of freedom that you’d get if they combined multiplicatively. In your notes linked above, you just say “we assume, for convenience, that the state doesn’t have to be normalized”. But isn’t this a bit of a cheat, since the whole proof revolves around carefully counting degrees of freedom? If we allow ourselves to introduce dummy degrees of freedom “for convenience”, then the same argument would seem to work for real QM as well.
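The arithmetic in point 3 is easy to check mechanically. A minimal sketch (my own illustration, not anything from the thread) counting the real degrees of freedom of a normalized \(n\)-dimensional density matrix, \(n^2-1\), and verifying that they combine supermultiplicatively:

```python
# Degrees-of-freedom counting for composite systems in complex QM.
# A normalized n-dimensional mixed state (density matrix) has
# n^2 - 1 real parameters.

def dof(n: int) -> int:
    """Real parameters of a normalized n-dimensional density matrix."""
    return n * n - 1

for M in range(2, 5):
    for N in range(2, 5):
        joint = dof(M * N)          # (MN)^2 - 1
        product = dof(M) * dof(N)   # (M^2 - 1)(N^2 - 1)
        # Supermultiplicative: the joint system always has strictly
        # more parameters than the product of the parts; the gap is
        # M^2 + N^2 - 2.
        assert joint > product
        print(M, N, joint, product)
```

For two qubits (M = N = 2) this gives 15 joint parameters versus 3 × 3 = 9 from the parts, exactly the mismatch the comment describes.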
Comment #207 January 27th, 2022 at 12:11 am
Continuing #187 to account for observers, my ‘theory of actualization’ leads to natural conjectures explaining intelligence, values and consciousness.
Now, I *don’t* think that mind is fundamental as such. As I said, I think the ‘ground state’ of reality is simply a space of ‘possible worlds’, and as complexity is built-up, these worlds get increasingly ‘actualized’ entirely via computational processes, so the ball gets rolling without any observers or consciousness, which are emergent systems of computation.
However, I think that *after* a certain complexity threshold is cleared, the continuing actualization of reality *does* need minds, and from this point forward consciousness *contributes* to the on-going actualization of reality! Not through any sort of mystical or non-physical process, but by structuring information (i.e., turning information into knowledge), thus helping to make reality increasingly comprehensible.
So what are minds? Well, remember I talked about the ‘actualization’ process itself, which I said takes place on the level of computational geometry, function spaces and topology. And minds exist at this level. They’re simply the higher-level processes of ‘actualization’. Minds make mappings of (representations of) reality by modelling systems of causal relations that are complex and compositional. And these models are *themselves* new systems of causal relations that are complex & compositional! This is an open-ended recursive process.
The meaning of life (for conscious observers) is thus simply the high-level version of the same process of ‘actualization’ that I think accounts for physics. It’s ‘Self-Actualization’ ! Of course, now we have to try to achieve a reasonable understanding of the meaning of the term ‘Self-Actualization’ 😉
Here’s my explanation of consciousness and values:
Consciousness is the highest level of recursive actualization. It’s a model of the perceived flow of time (temporal awareness). It works by integrating high-level values and low-level facts, to generate mid-level action plans. The representation of values, plans and facts is in the form of the temporal modalities ‘should’, ‘would’ & ‘could’ respectively. And the generated ‘action plans’ which are “good” are simply the ones that structure information as knowledge such that it contributes to the on-going actualization of reality (i.e., generation of manageable complexity).
To understand values, consider the motivations of God in the context of ‘actualization’. I believe that these motivations can be considered to consist of two complementary tendencies, (1) rationality & (2) creativity, because this is the combination that generates *manageable complexity* ( ‘actualized reality’).
Rationality is about the *compression of information* (manifesting as intelligence), whereas creativity is about the *exploration of possibilities* (manifesting as values). The balance between them generates manageable complexity.
In conclusion, the “actualization” of reality is ultimately about the generation of *manageable complexity*, which is complexity that has enough structured details to be interesting, but can still be compressed enough to make it comprehensible. At a high-level, this is the balance between rationality and creativity in conscious minds. And that’s the meaning of life.
Comment #208 January 27th, 2022 at 12:20 am
Scott, re #195 and #200,
Taking computer languages—which are designed for Turing machines, which to begin with are abstractly discrete objects—is “rigging the game” to at least the same extent. As is talking about what you can explain to a human child, which is an *extremely* complex item! The Game of Life requires *counting* occupied adjacent squares, which is an extremely abstract and complex thing to do: counting, and then consulting a rule that depends on the outcome of the count. I see no sense at all in which that is simpler than a simple differential equation.
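For concreteness, the counting-and-rule procedure being debated can be written out. A minimal sketch of the Life update rule (the set-of-live-cells representation is my own choice, purely as an illustration of what “count, then consult a rule” amounts to):

```python
# One step of Conway's Game of Life: tally live neighbors of every
# relevant cell, then apply the birth/survival rule to the tally.
from collections import Counter

def step(live):
    # Count, for each cell, how many of its 8 neighbors are live.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 1), (1, 1), (2, 1)}   # a period-2 oscillator
```

Whether this is “simpler” than a differential equation is exactly the point in dispute; the sketch just makes the rule itself explicit.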
Regarding Bohmian Mechanics, we will disagree about some evaluative matters, but we should not disagree about plain facts. And it’s not at all a fact that any theory that has a quantum state in its ontology can “talk unproblematically about” the precise particle locations that constitute a Bohmian configuration. Obviously not. Everett certainly cannot. Does the particle location, as empirically accessible in Bohmian Mechanics, “give us any information we didn’t already know”? Sure. For example, in a simple 2-slit situation, the location where the particle hits the screen (which we can find out) implies which slit it went through, which we didn’t know. I’m a bit at a loss as to why you are making the claims you are.
Comment #209 January 27th, 2022 at 12:21 am
Scott #200:
I just want to chime in to support your encouragement of more attention and development of Bohmian Mechanics. Sheldon Goldstein has been doing what I think is really interesting work but I don’t think he hangs out in this corner of the internet so I am hoping to convert some others.
With respect to your question at hand, Scott, I do think that taking the Bohmian ontology seriously, and perhaps biting the Bohmian bullet all-the-way-to-the end may be fruitful or instructive. After all, the question that John Bell was most interested in before his death, and the question he wished others would pursue was this: why isn’t Bohmian mechanics Lorentz-invariant?
People who favor a many-worlds interpretation or ontology tend to pounce on this as a reason to throw Bohmian mechanics in the trash, but I personally think it is a great mystery that the interpretation with the most clearly-defined ontology, the one that in my mind best resolves the “Anschaulichkeit” issue of QM, is in some sense saying that there is a preferred reference frame. What would happen if we took that seriously?
Since I am anticipating some pushback for what I just wrote, here is Bell on the two-slit experiment:
“Is it not clear from the smallness of the scintillation on the screen that we have to do with a particle? And is it not clear, from the diffraction and interference patterns, that the motion of the particle is directed by a wave?”
So Tim, I agree with Scott. Keep going! Together, in the name of John Bell, we can defeat the many-worlds people fair and square.
Comment #210 January 27th, 2022 at 1:12 am
Scott,
“To date, no one has comprehensibly explained to me how you get our quantum-mechanical, Bell-inequality-violating observed reality from a classical CA, in a way that isn’t so convoluted, contrived, ugly, and insight-free that you wouldn’t be vastly better off just to throw out the CA part, and talk directly about quantum mechanics. Certainly not Wolfram, not ‘t Hooft, not Sabine Hossenfelder, not Andrei on this blog … as far as I can tell, all the words they’ve produced have taken us 0% of the way toward answering the question.”
I actually explained this multiple times, but I guess I was not clear enough, so I’ll do it again. I’ll put some numbers before each statement so that you can agree/disagree with any of them without losing much time.
P1. In field theories, like classical EM, GR, fluid mechanics, the system must satisfy the theory’s equations at any time. A system described by classical EM must satisfy Maxwell’s equations. – agree or disagree?
P2. This is just a reformulation of P1, but I feel the need to insist on it. Any state that does not satisfy the theory’s equations is physically impossible. There is no need to select/fine-tune the initial conditions to guarantee that such a state never occurs. – agree or disagree?
P3. A Bell test is a particular example of an EM system, consisting of three large groups of charges (electrons and nuclei). One group is the particle source; the other groups are the two detectors, including whatever is used to “choose” their orientation. – agree or disagree?
P4. From P1, P2 and P3 it follows that only some states of the source/detectors are possible (those that satisfy Maxwell’s equations). There is an infinity of such allowed states (corresponding to any choice of the initial conditions), but the physically impossible states (those that don’t satisfy Maxwell’s equations) are even more numerous: for each valid state you can create an invalid one by simply changing the electric field at one point and leaving everything else the same. – agree or disagree?
P5. It is possible that when only the physically possible states are counted we will recover QM’s prediction. In other words, it is possible that those states that would lead to a different prediction than QM are impossible (don’t satisfy Maxwell’s equations) – agree or disagree?
P6. I claim that P5 represents a reasonable explanation of how Bell’s inequality can be violated without non-locality and without any conspiracy (unless you want to claim that the requirement that a state satisfy the theory’s equations is itself a conspiracy). – agree or disagree?
Comment #211 January 27th, 2022 at 1:30 am
Philippe Grangier,
“Disagreement with Andrei #135 and #137 : there is (at least) a third way besides superdeterminism and nonlocality (Bohm), that is predictive incompleteness, see https://www.mdpi.com/1099-4300/23/12/1660 . It does not require to ‘abandon standard QM’, but rather to look at it in a slightly different way.”
I do not think “predictive incompleteness” can avoid the argument. Please address my argument below:
We have an EPR-Bohm setup: two spin-entangled particles are sent to two distant stations, A and B. The spin measurements are simultaneous (space-like separated) and performed on the same axis (say Z). Let’s assume that the result at A is +1/2. We have:
P1. The measurement at A does not change B (locality).
P2. After the A measurement, the state of B is -1/2 (QM prediction).
From P1 and P2 it follows:
P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.
Symmetrically, the spin of particle A on Z was predetermined as well.
The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined. In other words:
C1: locality can only be saved by introducing deterministic hidden variables (the spins before measurements, or some other property that uniquely determines the results).
Then we have Bell’s theorem which says:
C2: Local hidden variable theories are impossible with the exception of superdeterminism.
From C1 and C2 it follows that superdeterminism is the only possible way to keep physics local.
Comment #212 January 27th, 2022 at 1:39 am
A slightly different approach:
Waves, quantization, higher dimensionality, and the special nature of spacetime.
QM is the base case, but need to consider interaction with spacetime, and a possible “residue” component (related to wave interactions or higher order dimensionality?)
Waves – Are inherently complex.
Measurement/2-slit experiment – single wave interacting with the 2 slits, get constructive and destructive interference; with a slit being observed, the measurement device acts like another set of filters or possibly inserts a wave pattern that interferes in a way that produces the 2 lines pattern
Spacetime – how waves of matter, radiation, etc manifest. E.g. gravity waves
At small scales quantization – frothiness, uncertainty (principle)
Dark E – related? Could also be a side effect of higher dimensionality?
Why all the different fundamental particles? A few options:
1. They’re all the same as observed in their native dimensionality but look and behave differently as observed in 3+1D
2. There are other universes (bubbles in the ocean) that are only electrons say, but we don’t live there for obvious reasons
3. Something about how they’re manifesting as waves/wave interactions makes them appear different to us observers
Probabilistic, Born rule & tunneling – interactions between the waves and spacetime
Time reversal symmetry vs time – entropy/”residue” (creating aging/irreversible processes)
Coherence/decoherence – on larger scales/times more interference between waves, more decoherence. Entangling = aligning waves in some special way?
Dark matter – related to residue? As scale grows, impact of residue grows? On smallest scales/times, it is (more) reversible
“Classical” is then just a consequence of the scales on which we live/observe.
Mass E; a manifestation of the waves? Quantized waves?
Mass can’t travel at c due to interaction between spacetime wave and this wave? Residue?
C – speed of massless “particles” e.g. light constant – fundamental b/c no interaction travel of wave in spacetime, or it IS the single wave of spacetime propagating
Bell inequality – info still has to travel from A to B in spacetime, traveling as a wave…
Haven’t got to Pauli exclusion or Superposition yet, but you get the idea…
Sorry for the stream of consciousness/incompleteness, need to get to bed soon. Happy to try to elaborate if there’s interest.
Comment #213 January 27th, 2022 at 1:55 am
If I were to speculate wildly about concepts which might eventually provide insight on this question, then my best guess would be reversible computing, and my craziest guess would be interaction nets. To keep this relatively concise: A quote, then some links, and then some comments.
“A motivation for the study of technologies aimed at implementing reversible computing is that they offer what is predicted to be the only potential way to improve the computational energy efficiency of computers beyond the fundamental von Neumann–Landauer limit of kT ln(2) energy dissipated per irreversible bit operation.”
https://en.wikipedia.org/wiki/Reversible_computing
https://en.wikipedia.org/wiki/Landauer%27s_principle
https://en.wikipedia.org/wiki/Interaction_nets
I have no relevant expertise here; I’m only commenting because this stuff is obscure, interesting, and (from my uninformed perspective) possibly relevant. The interaction net thing in particular is basically just a vague association in my mind.
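The von Neumann–Landauer bound quoted above is easy to put a number on. A quick back-of-the-envelope evaluation (the room-temperature value T = 300 K is my own choice for illustration):

```python
# Landauer limit: kT ln(2) joules dissipated per irreversible bit
# operation, evaluated at room temperature.
import math

K_BOLTZMANN = 1.380649e-23   # J/K (exact, 2019 SI redefinition)
T_ROOM = 300.0               # K, assumed for illustration

landauer_limit = K_BOLTZMANN * T_ROOM * math.log(2)
print(f"{landauer_limit:.3e} J per bit erased")   # about 2.9e-21 J
```

Tiny in absolute terms, but it is a floor per irreversible operation, which is why reversible computing is the proposed way around it.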
Comment #214 January 27th, 2022 at 1:59 am
Philippe #133
Hope you don’t take it wrong, but I guess my point in being synthetic was that I don’t feel I need theorems to trust the QM framework. It looks manifest to me that it is both fit for purpose and as “efficient” in fundamental computations as it could be.
The “mutually exclusive” relationship is all that is needed to setup a working (projective) propositional logic that generically forbids connecting propositions with AND, but realizes OR through linear dependency.
The Born rule is linear on all normalized arguments: the simplest way of producing a number that can always be interpreted as a probability, as empirically required by the need to make sense of the observed complementarity (and of the observed predictability of the Universe, despite the indeterminism).
QED
🙂
Comment #215 January 27th, 2022 at 2:06 am
Philippe Grangier,
I think I can spot the problem with your paper. In Chapter 5, below Figure 1 you say:
“The resulting predictions can be effectively checked in the verification zone V in the common future of all light cones.”
I think this is irrelevant. True, the experimental records can only be compared at V, but one can look at the time at which the measurements were performed (the time printed on those records) and conclude that the prediction of A about B was true immediately after the A measurement. It does not become true at V.
Comment #216 January 27th, 2022 at 2:14 am
Scott #99:
Your conjecture seems highly unmotivated to me. What’s the most “chemistry”-like thing you’ve seen CA do? Putting that aside, I think you’ve left a crucial part out of your conjecture, namely the typicality of initial configurations leading to “chemistry”. Certainly in Conway’s Game of Life most random initial configurations don’t give rise to the cool stuff you can get by fine-tuning.
Scott #174:
Are you sure Albert E. is the guy you want to be taking cues from where QM is concerned? 😛
Comment #217 January 27th, 2022 at 6:33 am
Scott,
I looked at your paper, The Ghost in the Quantum Turing Machine, and I think I can point out an error in your reasoning in regard to Bell’s theorem. On page 22 you quote Conway and Kochen:
“if there’s no faster-than-light communication, and Alice and Bob have the “free will” to choose how to measure their respective particles, then the particles must have their own “free will” to choose how to respond to the measurements.”
The error in this type of argument is that the EPR argument rules out local indeterminism (see Comment #211). So, by rejecting determinism, you cannot recover locality. The alternatives are:
1. local determinism (in the only surviving form, superdeterminism) and
2. non-locality, which can be either non-deterministic, or deterministic.
Non-determinism does not give you anything that determinism cannot.
Comment #218 January 27th, 2022 at 6:35 am
Having skimmed through all the comments so far, I think there is a lot of informed agreement that some randomness is necessary and a lot of discreteness is necessary. Other than that, I for one do not guess that there is only one way to get a universe with complex life forms. It seems to me that the way to bet is that our universe is a mediocre one (for the formation of complex lifeforms) among all physically possible universes. I am not saying the multiverse exists, since that raises additional questions, but it seems quite possible to me that there are many other sets of physical laws and constants which could produce complex life.
Life: physical entities which can reproduce themselves with random errors and thereby evolve means of memory and decision-making computations.
Comment #219 January 27th, 2022 at 6:47 am
Level-design that uses RNG can open the design-space much more widely than classical level-design. Without RNG, either all the levels look the same, or you have to hand-craft them all to be different in interesting ways. QM is just a way to make sure the universe looks interestingly different across its total configuration. Without it, there would be no level in which we evolved, so we wouldn’t be asking the question.
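The level-design analogy can be made concrete with a toy generator: one short rule plus a seed yields an endless variety of distinct levels, where a seedless rule would yield one. Everything here (grid size, wall density, the `make_level` name) is my own invented illustration:

```python
# Seeded procedural generation: same seed -> same level, different
# seeds -> (almost always) different levels.
import random

def make_level(seed, width=8, height=4, wall_prob=0.3):
    rng = random.Random(seed)   # deterministic per seed
    return ["".join("#" if rng.random() < wall_prob else "."
                    for _ in range(width))
            for _ in range(height)]

levels = {tuple(make_level(seed)) for seed in range(100)}
print(f"{len(levels)} distinct levels from 100 seeds")
```

One generator, arbitrarily many interestingly different levels, which is the commenter’s picture of what randomness buys the universe.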
Comment #220 January 27th, 2022 at 8:28 am
Nicholas Teague #129:
Cellular automata / hyper graph updating rules aside, the multi-way causal graphs of the wolfram physics model are the most intuitive channel I’ve found for thinking about the quantum mechanical nature of reality. Formalizes the multiverse.
It’s “intuitive” only because it doesn’t have unitary evolution or (therefore) well-defined probabilities!!
Comment #221 January 27th, 2022 at 8:50 am
Scott #163
Sorry, Scott, I think you pattern-match what entirelyuseless is saying to “deterministic hidden variable theory”, and dismiss it too quickly. It’s only my interpretation based on what entirelyuseless wrote, but I don’t think it is that. Do not forget that agents in the Game of Life would never be able to say things like “danger, a glider is approaching”, or even “ouch, I was hit by a glider”, because the world is fundamentally inaccessible to them at that level, even if they build the Game of Life analogs of particle accelerators. Maybe it helps if you imagine that there is a huge chasm between the lower-level cellular automaton rules and the higher-level QM rules, where the lower-level rules are inaccessible to inhabitants of the universe, for fundamental reasons. Now you might, in turn, pattern-match that to “deterministic hidden variable theory, immediately slashed by Occam’s razor”, but it’s not exactly that, either. Let me explain my best-case scenario:
I design a deterministic cellular automaton, and agents in it. The agents cannot observe their environment without altering it, which gives rise to some version of the Uncertainty Principle, which in turn forces them to accept a quantum theory of physics. I am NOT saying that the agents are wrong and their world is “really” a hidden-variable world. From their perspective, they know everything about their world that’s ever knowable, and it is quantum. But I, who designed their world, have a different, equally valid perspective. It is a semantic question who is correct, the creator or the agents. What’s more important is the existence proof that QM can arise like that: the fact that, with the extra knowledge the creator has, Occam’s razor does not slash the deterministic theory.
Entirelyuseless, please feel free to clarify, or at least create an ad hoc comment thread with people like me who are open to your idea. 🙂 (Check out my comment #189 in case you missed it.)
Comment #222 January 27th, 2022 at 8:57 am
Q1 is tough, because a metaphysically-classical universe could have life and intelligence in it, apparently. The only attempt to reconcile this that seems to have any promise is what Wolfram is doing. If Wolfram’s project can’t succeed, then the only other explanation I have is theological—that you need true indeterminism in order to have both lawfulness and free will, as well as “room for God to act”.
Comment #223 January 27th, 2022 at 9:02 am
Bruce #132:
I think the question has a major unstated premise: that ours is the *only* universe.
Let me clarify that that’s in no way, shape, or form a premise! I could be 100% satisfied with an answer along the lines of, “really there’s a Tegmarkian multiverse, with some classical universes, some quantum universes, and some universes with other rules entirely, and a-priori we could’ve been in any of them, but the deck was stacked in favor of our finding ourselves in a quantum universe for the following reasons…”
If, of course, actually-convincing arguments or evidence were brought that that was true.
Comment #224 January 27th, 2022 at 9:05 am
Disregard the previous two comments, probably, as I’ve realized they probably don’t actually explain what I’m getting at.
Starting from here: Take Pilot Waves, and subtract out the particles; you’re left with MWI, but I think that way of thinking about things is misleading. Rather, I think the correct way to interpret the wave you’re left with is a mass-energy distribution; the particle isn’t probabilistically in different locations, it’s always a waveform.
Strictly speaking, you don’t have to be left with a mass-energy distribution; anytime there aren’t particles, and instead there are waveforms, you should expect to get quantum-mechanical behavior. Curvature, for these purposes, I am counting as a waveform, a term I’m using for any spatially extensive structure with a variable quantity that can be interpreted as amplitude. I think the ideal case of a waveform for these purposes has infinite spatial extent, which is not quite the same as occupying all space.
So if General Relativity is correct, and additionally understood to mean that mass-energy is in fact distributed over spacetime (and I don’t know how you can look at GR and think “There are particles there”), you should expect to get quantum-mechanical behavior.
You should expect to see quantization in any physical system with a finite number of stable configurations in which state transitions are the only measurable thing; even if energy isn’t actually quantized, a finite number of stable configurations mean you’ll only see energy in specific quantities. (Particularly when the system is significantly smaller in scale than the observers, and things happen relatively fast compared to observational techniques).
And classical really-real particles, if you think about it, are actually an enormous ask in terms of additional, and entirely unnecessary, complexity; the behavior we actually care about is all embedded in the fields, particles don’t actually add anything.
The one thing is that I don’t think it is necessarily the case that measurement events should behave the way they do; that seems much more specific, requiring a reason for waveforms to be “coherent”. That is, there has to be a reason that a waveform/particle sitting between two attractive forces shouldn’t just split in half down the center. And if MWI is correct and the waveform can split while still being coherent, in the sense of maintaining (at least from its own perspective) radial symmetry, that’s an additional piece of the puzzle that needs to be explained. Granted, a universe in which waveforms aren’t coherent wouldn’t support complex structures or chemistry, so we can lean on an anthropic explanation.
Comment #225 January 27th, 2022 at 9:17 am
Nisan S #133:
The Game of Life is not a safe place for life to evolve. Everything is being constantly bombarded by gliders and stuff. We’re fortunate to live in a universe where gravity collects matter into tidy planets, whose surfaces are only lightly bombarded.
I think your point about the Game of Life is an excellent one. In contrast to many other things that have been proposed, that’s a real reason why we should never have expected to find ourselves in the Lifeworld. But I also think there are even more directly relevant factors than gravity to explain why our universe is a safer place for life than Life:
(1) Our universe supports not only Turing-universal computation, but fault-tolerant computation. (Note, however, that there are 2D cellular automata that support that as well, by celebrated work of Peter Gacs.)
(2) Our universe has these concepts called “conservation of momentum” and “conservation of energy,” which often (albeit not always) tamp down on runaway butterfly effects. For example, if you’re hit by a cosmic ray, whose energy is minuscule compared to the energy in your body, there will for that reason almost always be no noticeable effect, unless (for example) the cosmic ray happens to mutate a DNA strand in such a way as to give you an aggressive cancer. The Game of Life seems to have no analogous conservation principles that would tamp down on the destructive effects of gliders.
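To make the glider point concrete, here is a minimal Game of Life sketch (my illustration, not anything from the original discussion): a lone glider re-forms itself one cell diagonally every four generations and travels forever through empty space, with nothing analogous to energy conservation ever attenuating its reach.

```python
from collections import Counter

def step(cells):
    """One Game of Life generation on a sparse set of live (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next step if it has 3 neighbors, or 2 and was already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

# The standard five-cell glider.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
after = glider
for _ in range(4):
    after = step(after)
# After 4 generations the glider has translated by (1, 1), unchanged in shape.
assert after == {(x + 1, y + 1) for (x, y) in glider}
```

The glider’s persistence is exactly what makes the Lifeworld dangerous: a disturbance launched anywhere propagates undiminished until it hits something.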
Comment #226 January 27th, 2022 at 9:28 am
TP #135:
For interesting chemistry we need the building blocks to be charged. However, charged particles lead to various problems in classical electromagnetism, e.g., the infinite self-energy of the electron or the radiation from an accelerated charge. Quantum mechanics solves these problems. I suspect that any attempt to solve these problems will have to reproduce quantum mechanics in its entirety. As you have pointed out, it seems impossible to change quantum mechanics just a little; it seems to be an island in theoryspace. Maybe any self-consistent theory that tackles these problems of classical electromagnetism has to go all the way to quantum mechanics for this reason.
I ought to have been more explicit that what I was really doing, with my Q1 and Q2, was calling for an entire research program, only bits and pieces of which already exist (mostly focused around Q2).
Your comment, with its inferential leaps from one sentence to the next, provides perfect illustrations of the sort of thing I’m asking for. Is it actually true that to get interesting chemistry, we need some notion of “charged particles” with a Coulomb force? If so, is it actually true that once we’ve introduced such particles, we’ll create problems that can only be solved with QM? Or are we just suffering from a lack of imagination, and (even more to the point) a lack of well-developed alternatives to gawk at?
Comment #227 January 27th, 2022 at 9:29 am
Sid #197 gives a link that Firefox assures me is unsafe. A search suggests the published version is https://doi.org/10.1119/1.10490
One more derivation of the Lorentz transformation
Jean‐Marc Lévy‐Leblond, American Journal of Physics 44, 271 (1976)
ABSTRACT: After a criticism of the emphasis put on the invariance of the speed of light in standard derivations of the Lorentz transformation, another approach to special relativity is proposed. It consists of an elementary version of general group-theoretical arguments on the structure of space-time, and makes use only of simple mathematical techniques. The principle of relativity is first stated in general terms, leading to the idea of equivalent frames of reference connected through “inertial” transformations obeying a group law. The theory of relativity then is constructed by constraining the transformations through four successive hypotheses: homogeneity of space-time, isotropy of space-time, group structure, causality condition. Only the Lorentz transformations and their degenerate Galilean limit obey these constraints. The role and significance of each one of the hypotheses is stressed by exhibiting and discussing counterexamples, that is, transformations obeying all but one of these hypotheses.
The assumptions that are introduced look extremely strong to me, but YMMV.
Comment #228 January 27th, 2022 at 10:48 am
How I have summarized some options, trying hard not to bias the student:
> Here are a couple of examples of categories of results which, if they existed uncontroversially (they don’t!), would make me personally feel we have made progress on the questions:
> Example 1: Consider a derivation I’m fond of https://arxiv.org/abs/1706.05261 which basically says “add BLAH to classical propositional logic and you *have* to end up with something isomorphic to classical probability theory as your system of plausible reasoning”. Perhaps there is a different BLAH’ which instead leads you inevitably to quantum probability theory as your system of plausible reasoning. There are many versions of proposals like this – quantum logic, operational reconstructions, QBism and so on. Seems unlikely such things will ever give me a good intuition about why hbar is the size it is, why its so damn practically useful to think about those little waves as really diffracting around, why those spectral lines “exist” in light emitted long before creatures capable of worrying about probabilities at all existed and so on – but maybe I’m just being too much the physicist to want such.
> Example 2: Relativity taught us that different observers can seem to have incompatible narratives about what is going on, but once you identify the right “true” underlying description of a single reality they are all correct, so to speak. With quantum theory it seems hard to believe in such “singular agreeable realism” – witness the issues of contextuality (never mind the plethora of interpretations!) – but perhaps there is a different unification which shows that the only possible way to unify different observers each with their own personal “correct version of reality” is with something like QM. I think various topos approaches, and perhaps consistent histories, fall into this kind of category.
> Example 3: Most of the issues I have with quantum theory are to do with the manifest disrespect entangled particles show for space and time. But perhaps space and time are simply anthropomorphic variables – useful to evolve a brain that prioritizes them when chasing mates and bananas was our mission, but not actually fundamental or useful degrees of freedom to describe whatever it is that is “actually going on” at non-monkey scales. There are many imprecise and frankly borderline crackpot attempts at seeing space and time as “emergent” out there, but that shouldn’t dissuade us from considering the general principle might be valid.
Comment #229 January 27th, 2022 at 10:48 am
My oracle suggests that the answer to “why the quantum”, is that a maximal equiprobable Hartle sub-multiverse is one way to resolve the necessary ontological tradeoff between plenitude, internal intelligibility, and logical locality (pick any two). Unfortunately it also says we’re not ready to know the technical meanings of those terms. 🙂
Comment #230 January 27th, 2022 at 11:07 am
As stated, Q1 is easy. 😛 Where would the fun be for God in creating a classical, i.e. deterministic, Universe?
The Schrödinger Equation is as deterministic as any of Newton’s Laws.
Comment #231 January 27th, 2022 at 11:10 am
To me it’s not even clear that this can be done for any “theory”: since we will never be able to explain why there’s something instead of nothing, it’s hard to imagine how we can show that some aspect of reality is “necessary” in some absolute sense. At best we can only do it in relation to other features of reality.
So, if there’s a way to show that QM is necessary it’s probably by looking at even more fundamental ideas that are hard to dismiss, like entropy, conservation of information, fundamental symmetries, the fact that spacetime has 3+1 dimensions, etc.
Comment #232 January 27th, 2022 at 11:30 am
Andrei #136:
What we are facing here is much more than a “psychologically uncomfortable” feeling, or a “peculiarity”, we are facing a logical/mathematical inconsistency between QM and the space-time structure of special relativity. Please look carefully at the last paragraph:
“we cannot use spin-correlation measurements to transmit any useful information…”
SR does not make any distinction between “useful” faster-than-light messages and useless faster-than-light messages…
This is a subject I know well, and recent conversations with you, Sabine, and others reinforced for me that it’s the precise point where I get off the train headed to Superdeterministic Crazytown. That town was built to solve a problem that everything I know about quantum information tells me is not a problem at all.
By “useful” information, Sakurai clearly means any information whatsoever that one can choose to specify at point A. It’s a theorem in QM — I prove it every year in my undergrad class — that none of that can be transmitted. He means to exclude correlations, which occur even in classical probabilistic theories and are considered totally unproblematic there, but which can be stronger in QM in a way that helps A and B win certain nonlocal games, which is the whole content of the Bell inequality.
The above is really all that special relativity requires; SR doesn’t care about Bell-type correlations. We could see that abstractly, even if we didn’t have successful relativistic QFTs that show it explicitly. That QM and special relativity are difficult but not impossible to reconcile, and that the rare theories that do successfully reconcile them are so phenomenally good at describing the world, is a powerful indication that physics is on the right track here.
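The no-communication theorem invoked above can be checked in a few lines. The following is an illustrative sketch of my own (not Sakurai’s argument or the classroom proof): for a Bell pair, Bob’s reduced density matrix, and hence every statistic he can locally observe, is identical no matter which basis Alice measures in.

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())           # joint density matrix

def bob_marginal_after_alice(theta):
    """Alice measures in the basis rotated by theta; return Bob's averaged state."""
    c, s = np.cos(theta), np.sin(theta)
    rho_bob = np.zeros((2, 2), dtype=complex)
    for v in (np.array([c, s]), np.array([-s, c])):  # Alice's two outcomes
        P = np.kron(np.outer(v, v), np.eye(2))       # projector acting on Alice's qubit
        post = P @ rho @ P                           # unnormalized post-measurement state
        # Partial trace over Alice's qubit: indices (a, b, a', b') -> sum over a = a'.
        rho_bob += np.einsum('ijik->jk', post.reshape(2, 2, 2, 2))
    return rho_bob

# Bob sees the maximally mixed state I/2 regardless of Alice's choice of basis:
for theta in (0.0, 0.3, 1.1):
    assert np.allclose(bob_marginal_after_alice(theta), np.eye(2) / 2)
```

The correlations are there (Alice’s and Bob’s outcomes agree in the shared basis), but nothing Alice chooses to do shifts Bob’s marginal, which is why no information is transmitted.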
Comment #233 January 27th, 2022 at 11:38 am
Cain #137:
Quantum mechanics necessitates God to determine when the state of the universe will change according to the Born rule rather than the Schrodinger equation, and no matter how many dinosaur fossils are dug up Richard Dawkins will never be able to explain this from more fundamental principles. QM is God’s signal that all the evidence against his existence from lesser sciences can now be discarded as a test of our faithfulness.
What’s strange is that, having reserved to Herself this immense power to collapse the wavefunction, God would then choose to deploy it only and ever when quantum systems become decoherent and “macroscopic,” and would (as far as we can tell) scrupulously follow the Born rule every time, as though p(x)=|ψ(x)|² were a higher law even than God!
Comment #234 January 27th, 2022 at 11:41 am
Scott P. #230
The Schroedinger equation is deterministic, but the Universe is not, which is the entire point if you want to be entertained while watching 🙂
Comment #235 January 27th, 2022 at 11:42 am
Scott #232
“That QM and special relativity are difficult but not impossible to reconcile”
Why do you assume that two theories that work separately can always be reconciled?
Isn’t it possible that the transition from one mode to the other involves processes that just can’t be described by some compact mathematical relation?
Comment #236 January 27th, 2022 at 12:07 pm
To Anbar#214
A big issue in all these discussions is to make clear what is your starting point, and what you want to get at the end. In some of the posts it is clear that the starting point is some mathematical foundational principle, ‘Let Be Psi‘, from which everything in the (many)world should be derived.
My starting point is just the opposite: I start from empirical evidence, then distill out some physical principles, then induce (actually, guess and justify physically) a suitable mathematical structure (with projectors!) and hypotheses likely to embed these physical principles. This construction turns out to fit quite well with the mathematical hypotheses of Uhlhorn’s and Gleason’s theorems, so from there I can proceed deductively by using these theorems, and get respectively unitary transforms and Born’s rule, which are the basic framework of QM.
This is certainly not a full answer to Scott’s ‘Why QM’, but maybe a little hint to this why. This is also in line with 3. in his introduction, but without ‘dismissing the axioms as totally implausible and unmotivated if I hadn’t already known (from QM, of course) that they were true.’ Why not dismiss them? Because they come from empirical evidence, not from an already abstract formalism.
Now, if it is clear to you that a working (projective) propositional logic and the need to make sense of the observed complementarity are enough to get unitary transforms and Born’s rule without further ado, this is nice, but I’m not that smart.
Comment #237 January 27th, 2022 at 12:15 pm
By the way in terms of “look how old we are now” arguments, to me one of the most depressing is that Mozart died at 35.
Comment #238 January 27th, 2022 at 12:24 pm
A couple of dumb questions from a biologist:
1) Does QM require infinite precision for unitarity, etc., or do all of those irrational factors of √2 and π take care of themselves?
2) Is Bell’s non-local condition equivalent to the older (I gather) notion of contextuality?
Comment #239 January 27th, 2022 at 12:40 pm
bertgoz #142:
I feel ultimately the description of the universe using mathematics, leading first to the classical framework and then to the quantum one, tells more about how the human mind works and its limitations than anything else
Answers of this kind seem endlessly popular, no doubt because of their air of transcendent wisdom, but they’ve never made the slightest bit of sense to me. We can use our understanding of physics, rooted in math, to build spacecraft and microchips and nuclear reactors, and all those things actually work—not just in the human mind, but in external reality. Which competing framework, not based on math, tells us more about the workings of reality, or better lets us escape the prison of our minds?
Comment #240 January 27th, 2022 at 1:01 pm
An addendum to the above that I should have added, is that coming up with classical explanations for things like atoms or chemical bonds is a major modern-day crackpot pastime – you can find lots and lots of crazy websites out there that contain attempts at it. Presumably, if it were easy to do, one of them would have done it.
Just to think about how that’d have to work – do the atoms have little jigsaw puzzle bits on them that determine what they connect to? Do they have spring-catches to stop them from bouncing apart? Who works at the lathe that turns out these tiny machines?
Is it all vortices in a frictionless fluid, like Lord Kelvin suggested? That one is a lot less patently absurd, but it didn’t go anywhere either. I really think you are underestimating the difficulty of coming up with a version of quantization that doesn’t involve quantum mechanics. After all, quantum mechanics was invented as a way to describe physical phenomena *first*, and the stuff about complex amplitudes and entanglement was noticed as an uncomfortable yet inevitable implication, *afterwards.*
Comment #241 January 27th, 2022 at 1:23 pm
The Schroedinger equation is deterministic, but the Universe is not, which is the entire point if you want to be entertained while watching
It remains to be proven whether the universe is deterministic or not. Personally, I see no problem with the universe being simultaneously deterministic and stochastic, but which (if either) it is remains unclear. Certainly QM itself makes no statement about the presence or absence of determinism in the universe.
Comment #242 January 27th, 2022 at 1:46 pm
@Matt H
Phase space in classical mechanics is used mainly in statistical mechanics, to let the exact present (which is a single point) be diffused into a cloud that represents what we think the present might be, or other times, all of the counterfactuals that are being considered for theoretical reasons. There has to be a space to contain that distribution, and it needs to have space-like devices attached to it, like measurement and the inner product, or else we couldn’t measure the blob and prove theorems about it.
There are various ways to end up with a diffuse present, one of them being experimental uncertainty (representing Bayesian lack of knowledge), but other times one conjures things like “all of the states with energy E,” when you’re trying to show they have something in common.
Phase space is not necessary to do calculations about individual initial conditions, and in a Newtonian universe phase space would have the same status (as a collection of counterfactuals) as “the space of all alternative laws of physics,” because a different initial condition would be just as counterfactual as different laws guiding evolution. It is not really a part of classical mechanics in the way Hilbert space is a part of quantum mechanics.
Comment #243 January 27th, 2022 at 2:03 pm
Scott,
After reading some of your comment replies I understand that my objection wasn’t made with a full understanding of your position.
But… now I have a different objection.
If we invented a classical ruleset that allowed life, where chemical transformations of bulk substances were “ground-level” facts (this is kind of how people saw it in the earliest history of chemistry before atomic theory took hold), and if it was very simple and elegant, and highly favorable by the backwards anthropic principle*, and it seemed like a travesty that it wasn’t true…
… we wouldn’t know if the real laws weren’t even simpler and more elegant, because nobody knows what the true laws are yet. I think we have to come back to this once physics is done and we have a basis for comparison.
This applies to any grading system for “preferability” we could think of, even if you are like Sabine and think that elegance is a dumb grading system for a universe that contains ringworm. No matter how you’re ranking imaginary rules, you’ll never know if the real rules beat what you’ve come up with, because the real rules are as of yet unknown.
* I don’t know what the name for this is, but it’s the idea that laws of physics that are more likely to lead to life are more favorable than those that are less likely, as a continuum version of the fact that we can rule out any that never lead to life.
Comment #244 January 27th, 2022 at 2:10 pm
arXiv:2107.06942 [quant-ph], doi:10.3390/e24010012
“The Relativity Principle at the Foundation of Quantum Mechanics”
W. M. Stuckey, Timothy McDevitt, Michael Silberstein
Hi Scott,
I think the above article attempts to answer your question about the basis for QM by applying the no-preferred-reference-frame principle to h-bar. Curious about your evaluation, both of the fundamental approach and of the net results.
Thanks.
Comment #245 January 27th, 2022 at 2:15 pm
Scott,
i would like to mangle your two questions into one:
Why did God use quantum theory to make the universe, but have it appear classical to us?
And how exactly did he do that?
Comment #246 January 27th, 2022 at 2:53 pm
What’s always puzzled me about QM is the fact that in an apparently non-deterministic future we still somehow get *exact* conservation laws. The only way that happens is via entanglement, which constrains the sums of various random variables to be constant, which then gets us an exponentially large state space.
So to me the mystery is exact conservation in the face of Bell non-determinism. If we want the future to not be locally determined in the distant past, and want conservation, it seems we’ll be getting something like QM albeit not-necessarily with complex amplitudes on that basis alone…
What’s your take on the puzzle of how conservation and randomness end up consistent?
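One way to see how randomness and exact conservation coexist is in a toy two-qubit simulation (a sketch added here for illustration, not part of the original comment): sampling a singlet-like state in the Z basis, each local outcome is a fair coin, yet the entanglement forces the conserved total to come out exactly zero in every single run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Singlet-like state (|01> - |10>)/sqrt(2) over the basis |00>, |01>, |10>, |11>.
amps = np.array([0, 1, -1, 0]) / np.sqrt(2)
probs = np.abs(amps) ** 2            # Born rule probabilities

outcomes = rng.choice(4, size=10_000, p=probs)
za = 1 - 2 * (outcomes // 2)         # Z outcome (+1 or -1) for qubit A
zb = 1 - 2 * (outcomes % 2)          # Z outcome for qubit B

assert np.all(za + zb == 0)          # total Z exactly conserved in every sample
assert abs(za.mean()) < 0.05         # yet each side alone looks like a fair coin
```

The exponential state space the comment mentions comes in when many such constraints have to hold jointly: the entangled joint distribution cannot be factored into independent local coins.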
Comment #247 January 27th, 2022 at 3:03 pm
Scott #225
“The Game of Life seems to have no analogous conservation principles that would tamp down on the destructive effects of gliders”
But I see no reason to assume that the 2D grid instantiating the Game of Life at its smallest level would be directly mapped to the reality perceived by entities at a much higher level of abstraction, and the apparent “destructive effects” of gliders may correspond to something entirely different at that level.
A taste of this idea: if we consider someone playing Resident Evil 4 in VR on Quest 2, there’s no direct/obvious connection between the 3D space perceived by the player (e.g. the player is looking at a red cup falling onto a wooden table and breaking into a hundred pieces) and the corresponding low level bit patterns and their evolution inside the linear structure of the RAM.
Comment #248 January 27th, 2022 at 3:13 pm
I hope a small question related to the basis for Q1 (we live in a QM universe) is ok
it’s surprising for a layman like me that it’s an open question whether the universe is deterministic (Anbar #234, Scott P. #241).
Is there some flaw in the reasoning “everything in the universe has QM state => the whole universe could be described as one QM state, which evolves deterministically according to the Schrödinger equation”?
Comment #249 January 27th, 2022 at 3:56 pm
The Schrödinger equation allows for static and dynamic states. Static states are only possible because of a sleight of hand that makes phase irrelevant in the Born rule. This goes a long way toward answering “why QM”: it requires a 2D wave function with something like phase and modulus. Dynamic solutions are superpositions of static states. Superposition plus multiple degrees of freedom imply the tensor product.
Static solutions exist as eigenstates of a Hermitian Hamiltonian. These eigenstates exist because the operator is over the complex numbers. The exponential of a Hermitian operator is unitary.
Can you create a classic system with static and dynamic states?
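The eigenstate and unitarity claims above can be verified numerically; here is a minimal sketch I’m adding for illustration (a random 4-dimensional “Hamiltonian,” not any particular physical system):

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a random Hermitian "Hamiltonian" H.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

# U = exp(-iH), computed via the spectral decomposition of H.
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals)) @ evecs.conj().T

# The exponential of a Hermitian operator is unitary:
assert np.allclose(U @ U.conj().T, np.eye(4))

# A static state: an eigenstate of H only picks up a global phase under U,
# which the Born rule cannot see.
psi = evecs[:, 0]
assert np.allclose(np.abs(U @ psi) ** 2, np.abs(psi) ** 2)
```

A generic superposition of eigenstates, by contrast, has components rotating at different phase rates, so its Born-rule probabilities genuinely change in time: that is the static/dynamic split the comment describes.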
Comment #250 January 27th, 2022 at 4:14 pm
Roger Schlafly #169:
Is it the case that quantum mechanics allows for the possibility of free will? Or is it the case that quantum mechanics attempts to rationalise the existing free will of the system? The system has free will in the sense that individual quantum outcomes can be thought of as the system, or parts of the system, freely assigning numbers to variables.
Genuine free will can only be modelled as living things, or other entities, assigning one or more numbers to some of their own variables (as opposed to the laws of nature determining all the numbers for all their variables).
So, I partly agree with you in the sense that quantum mechanics shows that the system could very easily cope with genuine free will.
Comment #251 January 27th, 2022 at 4:56 pm
Stewart Peterson #204:
I noticed it didn’t look like anyone had replied to your request.
The canonical text for the computational approach is by Nielsen and Chuang “Quantum Computation and Quantum Information”. All you need is Chapter 2.
Mike and Ike say it best: “Quantum mechanics is a mathematical framework for the development of physical theories. On its own quantum mechanics doesn’t tell you what laws a physical system must obey, but it does provide a mathematical and conceptual framework for the development of such laws.”
In other words, as Scott has said, quantum theory is not ABOUT physics. It is an extension of probability theory that allows for negative probabilities.
I also highly recommend Scott’s book Quantum Computing Since Democritus or his lecture notes on which that book is based – both for the presentation of the theory and for the humor.
Comment #252 January 27th, 2022 at 5:06 pm
A question similar to your question 1 that I would love you to discuss: given that the laws of physics are so complicated, why can they be so well approximated by something so simple?
“Why don’t Newtonian physics work?” doesn’t strike me as a terribly interesting question – there’s no reason to suppose they should.
But “why do they almost work?” seems much more puzzling.
Comment #253 January 27th, 2022 at 5:17 pm
I think there’s a misconception built into the premise of Q1.
I don’t think, in physics, we have *ever* been able to say that any of the basic physical principles are inevitable, a priori, directly from “the void”. What we have been able to do, is *synthesize* or *unify* the basic laws, so that what originally appeared to be two or more independent properties of the Universe, could be understood as consequences of a single, deeper principle (eg: electrostatics, magnetostatics, and light all following from Maxwell’s equations).
It’s not that Lorentz invariance is inevitable; but accepting Lorentz invariance as a basic principle *explains* a huge amount of observations, changes our point of view, and has been successful at letting us guess new laws.
I think that the best we can hope for, as mere mortals, is that we may discover one day how to derive quantum mechanics from a deeper principle, that also contains some other aspect of physics, like relativity. Finding *what* principles underly our world would be a major accomplishment. I don’t think we will ever be able to say *why* “God” chose the particular set of principles that underly our world — or at least, I think that would be a *much* stronger intellectual achievement than anything achieved in physics to date. No harm in asking the question, but worth putting the question in proper context.
Comment #254 January 27th, 2022 at 5:22 pm
JM #81 points to a blog post, https://www.lesswrong.com/posts/7A9rsJFLFqjpuxFy5/i-m-still-mystified-by-the-born-rule, where can be found, a long way down, a comment mentioning
https://royalsocietypublishing.org/doi/abs/10.1098/rspa.2020.0282, “The Born rule as a statistics of quantum micro-events”, by the author of that paper. Recommended.
Philippe Grangier #82, you likely won’t see this, but if you do I’m curious whether your taking of contextuality to be fundamentally non-classical, in the paper you link to, is something you absolutely won’t give up? I take contextuality, noncommutativity, and incompatibility of probability measures to be a natural extension of classical physics because such can be constructed straightforwardly using the Poisson bracket. The advantage of accepting that extension, I have found, is that we can then look at what other differences there are between classical and quantum physics without the distraction of noncommutativity. Spoiler: there are other differences.
Comment #255 January 27th, 2022 at 5:25 pm
Andrew Matas #253: I’m obviously well-aware that, whenever physical principles have been successfully explained, it’s always been on the basis of deeper principles, which are then explained (if at all) by even deeper principles, and so on, with the chain necessarily terminating somewhere. That’s what I want in the case of QM: to go at least one step deeper.
Comment #256 January 27th, 2022 at 5:33 pm
Two thoughts here:
1. It is possible to think of a physicist as analogous to an algorithms researcher in computer science: the physicist’s goal is to find an equation (~algorithm) that *can* be calculated (~efficient algorithm) with different initial conditions (~inputs), and to test the results in the lab (~run the algorithm on the computer). Therefore, out of all things (Turing-)computable, physics can only be described by the set of efficiently computable algorithms; otherwise we can’t really test them on the computer (i.e., nature). Now one can say that the laws of chemistry must somehow be in BQP, so if physics is classical (i.e., we have a classical computer to run the algorithms), chemistry cannot be computed by nature at scale. So now the question is: why can’t we have chemistry in P? From here it might be possible to show that some simple chemical process is BQP-complete…
2. A more general question I once asked myself is: what, abstractly, is a physical theory (from a mathematical point of view)? Then one can think of different realizations of the general framework and start exploring which realizations have better properties. This is analogous to having the mathematical field as an abstract structure, with the rationals and the reals as realizations of a field. One can say the rationals are simple and finite in nature (~classical physics), but the reals have the important property of completeness (the limit of every Cauchy sequence exists, blah blah).
Comment #257 January 27th, 2022 at 5:52 pm
Great question Scott, and an early, sophisticated answer might still be of value, along with God’s answer to Job in that same regard: “Hey, did you put the stars in the heavens?” Well, the first answer obviously is “no”, but Newton’s plan, which looks good from my perspective, apparently is not THE plan.
PS – assuming THE plan implies that A plan will be logical
PSS – if the world were logical then the world would make sense, but the world doesn’t make sense; therefore the world is not logical. Modus tollens: (~B & (A->B)) -> ~A
PSSS – old Marx’ Bros quote – Zeppo says, “We’ve got to think!” Chico replies, “Nah, we already tried dat.”
Comment #258 January 27th, 2022 at 5:53 pm
Philippe #236
Well, the ado was taken care of by a few bright guys in the late 1920s…
In which sense the formalism and interpretation of QM are inevitable, given the empirical behavior of even something as simple as a photon, is laid out by Dirac in the introduction to the Principles, and Von Neumann figured out the formal logic behind the projectors shortly thereafter.
The shortest route from empirical observations to the inevitability of QM that I can offer with hindsight is:
quantized bound states and specific heats -> systems prepared in *similar* ways must behave in *exactly* the same way most of the time and *completely differently* otherwise -> enter mutually exclusive configurations with OR via linear dependency, and indeterminism, with the (sesqui)linear Born rule automatically generating consistent probabilities through amplitudes
What else could be simpler than this?
Comment #259 January 27th, 2022 at 6:07 pm
General comments only: It seems that a satisfactory analysis of any of Q, Q1, or Q2 must:
1) be clear upfront about assumed constraints (e.g. are only universes which can support consciousness being considered, or only ones which can support at least primitive life, or all universes however devoid of complex behavior?);
2) admit, at least initially, that one or more of the questions may not have an answer (e.g. it may just be untrue that the universe, or even the universe we find ourselves in, had to be quantum mechanical);
3) explicitly address, or exclude for good reasons, each of the four scenarios {artificial creation, naturalistic creation} x {single creation, many creations}, because they require different analysis approaches.
Comment #260 January 27th, 2022 at 6:08 pm
A cellular-automaton world might be simple to describe, but as Varga #221 and others pointed out, it would be horribly complex for anyone living in it. Probably more complicated than QM. The advantage of a CA is supposed to be that it is less mysterious, but I think that it would be more mysterious.
Comment #261 January 27th, 2022 at 6:53 pm
Daniel Harlow #196:
I think you are overly dismissive of argument #2 that the world as we know it would be impossible without quantum mechanics. In order for us to be having this discussion at all, the laws of physics need to have the ability to generate interesting complex structures in a reasonable amount of time starting from a simple initial state. Now I know that as a computer scientist you are trained to think that is a trivial problem because of Turing completeness, universality, blah blah blah, but really I don’t think it is so simple. Why should the laws of physics allow a Turing machine to be built? And even if a Turing machine is possible, why should one exist? I think the CS intuition that “most things are universal” comes with baked-in assumptions about the stability of matter and the existence of low-entropy objects, and I think it is not so easy to achieve these with arbitrary laws of physics.
Indeed, one way that this thread has been useful to me, is that it’s shifted me in the direction of thinking that that’s right. Multiple people made the case to me that it’s far from obvious how well
(1) stable matter,
(2) complex chemistry,
(3) Lorentzian and other continuous symmetries,
(4) robustness against small perturbations,
(5) complex structures being not just possible but likely from “generic” initial data,
…
can actually be achieved in simple Turing-universal classical cellular automaton models.
Would you agree that this is, at the least, an exceedingly interesting research question?
Even if the answers to this research question turned out to support your side of your argument, I confess that I still face a psychological difficulty with saying that the universe is quantum-mechanical for reasons like (1)-(5) above. My psychological difficulty is just that QM, and in particular the exponentiality of the wavefunction, seems too metaphysically extravagant to be the solution to problems like these! It’s like, in order to patch up some issues with the evolution of complex structures in one universe, you’re going to create (at least from the Everettian perspective) exp(n) additional universes?? Really? To what kind of deity would such an astronomically expensive trade seem worth it? Maybe a deity to whom computational resources were no object … but then why didn’t the deity just go whole hog, and make NP-complete problems efficiently solvable or whatever? 😀
Comment #262 January 27th, 2022 at 7:05 pm
async #143:
We can derive the general structure of transformations between different reference frames from some very reasonable assumptions. There are only two possibilities: a) Galilean relativity (if there is no speed limit) b) special relativity (if there is a speed limit).
Is the answer to the question “Why Special Relativity?” then along the lines of “There are only two possibilities and both are of comparable complexity. Nature just happened to realize one of them” or something else?
Actually I think the answer is better than that. It’s something like: “if you want a spacetime continuum and equivalence of all inertial frames, then there are only two possibilities … but one of those possibilities implies unbounded speeds, therefore no true locality or isolation of subsystems, and therefore the other possibility is realized.”
This is an almost completely satisfying answer to the “why special relativity?” question. If we had an answer to the “why QM?” question that was even 25% as satisfying as that answer, I’d declare the research program advocated in this post a success and I’d go home happy.
Comment #263 January 27th, 2022 at 7:12 pm
Daryl McCullough #144:
My feeling is that quantum mechanics can’t literally be true, because of the measurement problem, which I don’t think can be solved. Not without going beyond quantum mechanics. Maybe there is a way to make Many Worlds or Bohmian Mechanics work, but I don’t consider those to be orthodox quantum mechanics.
These days, my personal sense is that Many Worlds is the orthodox position … it’s just that not all of its adherents are willing to come out and say so in so many words! Instead they talk about decoherence theory, the Church of the Larger Hilbert Space, etc.—they just refrain from pointing out the Everettian corollary of all this, and change the subject if someone else tries to. 🙂
Or to be maximally charitable, they’re Everettians except that they’re agnostic about whether there’s anyone home experiencing anything in any of the other branches, or whether they’re all just “ghost towns” like the Bohmians believe; and they’re also agnostic about whether future discoveries will change the whole outlook on these matters … which, if so, brings them exceedingly close to my own position.
Comment #264 January 27th, 2022 at 7:16 pm
Mateus Araújo #145: My difficulty with your answer is, even if you assumed that probabilistic branching was crucial, why not just have a classical stochastic evolution rule? You wouldn’t have to stick all the random bits at the beginning, if you didn’t want to; instead you could posit that there were probabilistic transitions (or even “freely-willed transitions”).
I’m not saying that there couldn’t be a compelling answer to that question, just that the question needs such an answer.
Comment #265 January 27th, 2022 at 7:27 pm
Boaz Barak #149:
But as our world was more and more shaped by machines, engines, and clocks, these began to shape our metaphors as well. Hence we could think of the “clockwork universe” and these metaphors became more natural to us than stories about gods.
Similarly, I don’t think the view of “it from qubit” and the universe as a computer would make much sense to us if the Turing machine had remained a thought experiment, rather than a device that we carry in our pockets. Now that we have gotten so used to computers, we think of them as a useful metaphor to explain other stuff, rather than a mysterious phenomenon that needs to be explained.
I actually strongly agree with that picture! But with one crucial addendum. Namely, I don’t see these various technologies (engines, trains, clocks, computers, now QCs…) as shaping our stories about the world in completely arbitrary ways … akin to spiders theorizing that spacetime is a giant web, or beavers theorizing a cosmic network of dams and lodges. Rather, I see these technologies as part of a feedback loop, where we learn more about the actual, objective nature of the world, and that understanding lets us build better technologies, and those technologies in turn provide better metaphors with which to describe more of the actual nature of the world (besides, of course, the technologies’ more direct assistance), and then we use the understanding to build still better technologies, and so on.
I.e., with full awareness of all the cultural relativists and “science studies” types waiting on the sidelines to pounce, I will insist that the sequence, from a teleological universe to a mechanical universe to a computable universe to (now) an “It from Qubit” universe, represents actual, genuine progress in understanding reality better. 😀
Comment #266 January 27th, 2022 at 7:34 pm
Tobias Maassen #151:
I do not see some of the bigger problems of classical physics mentioned.
One example is Newtonian Gravity allowing objects to reach infinite speed in finite time
That was fixed by GR, not by QM.
another you mentioned is the solidity of matter(Pauli).
Another is the problem, which invented quantum Physics: The ultraviolet catastrophe.
We’ve discussed both earlier in this thread.
Are there (cellular automata | strings | other theory candidates) which preserve energy or similar quantities?
There are certainly CAs with natural notions of “energy,” although it’s challenging to get continuous symmetries.
Would these allow for a complexity similar to chemistry?
A research program to answer such questions is precisely what I’m calling for here!
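As a toy illustration of the kind of CA with a natural conserved quantity (my own minimal sketch, not any specific model from the literature): a 1D block cellular automaton whose update rule merely permutes the cells within each block automatically conserves the number of 1s, which can play the role of an “energy”:

```python
import numpy as np

def step(cells, phase):
    """One step of a 1D block (Margolus-style) CA on a ring: on even steps,
    partition into blocks [0,1], [2,3], ...; on odd steps, shift the partition
    by one. Any rule that only permutes each block conserves the number of 1s."""
    c = np.roll(cells, -phase)
    blocks = c.reshape(-1, 2)[:, ::-1]     # swap the two cells in each block
    return np.roll(blocks.reshape(-1), phase)

rng = np.random.default_rng(2)
cells = (rng.random(32) < 0.4).astype(int)
energy0 = int(cells.sum())                 # the conserved "energy"
for t in range(100):
    cells = step(cells, t % 2)
print(int(cells.sum()) == energy0)         # True: conserved at every step
```

Getting a *continuous* symmetry (and hence a Noether-style conservation law) out of such a discrete model is exactly the hard part Scott alludes to; the block-permutation trick only buys discrete conservation.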
Comment #267 January 27th, 2022 at 7:35 pm
Scott #261
>My psychological difficulty is just that QM, and in particular the exponentiality of the wavefunction, seems too metaphysically extravagant to be the solution to problems like these! It’s like, in order to patch up some issues with the evolution of complex structures in one universe, you’re going to create (at least from the Everettian perspective) exp(n) additional universes?? Really? To what kind of deity would such an astronomically expensive trade seem worth it? Maybe a deity to whom computational resources were no object … but then why didn’t the deity just go whole hog, and make NP-complete problems efficiently solvable or whatever? 😀
It might help if you think in terms of parsimony. People rarely have a problem with assuming the existence of infinities, whether in mathematics or in physics, and it’s usually not because they are explicitly comfortable endorsing the ‘metaphysical extravagance’ that dependence on infinities entails, and it’s certainly not because there is any evidence whatsoever for the physical existence of infinities. Rather, it’s that infinities offer us the benefit of a theoretical parsimony of type and kind. It’s likely you’re harboring some latent cognitive dissonance with respect to this tradeoff: if you were strict about a heuristic biasing you against ‘metaphysical extravagance’, you’d be a strict finitist, and your default assumption about the universe would be that it is finite in extent, contains finite matter, etc. Is that the case?
Is it metaphysically extravagant to suppose that there are infinitely many mathematical structures, including all perturbations on ones that closely resemble our universe? Or is it only metaphysically extravagant to suppose that this infinite collection has some ‘special sauce’ that makes it physical/concrete, rather than ‘merely’ mathematical/abstract? Seems to me that between “everything exists”, and “this, and only this, extremely huge and particular universe exists, out of the space of all possible ones that could have existed”, the former is vastly more parsimonious as a theory of everything. The “special sauce” concept is itself doing the work of making things appear metaphysically extravagant. So much the worse for special sauce!
Comment #268 January 27th, 2022 at 7:39 pm
James Gallagher #152:
I think the Born Rule is good evidence for an Anthropic Universe since there is no good reason for it to be selected amongst all even power rules apart from being “most likely”.
I disagree in the strongest possible terms. This is one of the clearest examples of an aspect of QM that we can fully explain. Once unitary evolution picks out the 2-norm as conserved, you then don’t get conservation of probability unless the probabilities also go like the 2-norm. And, crucially, there are no linear transformations that nontrivially preserve the 4-norm, the 6-norm, or any higher norms. For more, see Chapter 9 of Quantum Computing Since Democritus, or my Is Quantum Mechanics An Island in Theoryspace?
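The norm argument is easy to check numerically. A minimal sketch (my own illustration, not from either reference): a random unitary preserves the 2-norm of an amplitude vector essentially exactly, but generically fails to preserve the 4-norm, so probabilities proportional to |amplitude|⁴ would not be conserved:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 3x3 unitary, via QR decomposition of a complex Gaussian matrix.
z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u, _ = np.linalg.qr(z)

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)               # normalized amplitude vector
phi = u @ psi                            # unitary evolution

p2 = lambda v: float(np.sum(np.abs(v) ** 2))   # sum of |amplitude|^2
p4 = lambda v: float(np.sum(np.abs(v) ** 4))   # sum of |amplitude|^4

print(abs(p2(phi) - p2(psi)) < 1e-12)    # True: 2-norm conserved
print(abs(p4(phi) - p4(psi)) < 1e-12)    # False: 4-norm generically not
```

The theorem Scott cites is of course stronger: the *only* linear maps preserving the 4-norm (or any higher even norm) for all inputs are trivial, i.e. permutations composed with phases.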
Comment #269 January 27th, 2022 at 7:48 pm
Andrew Matas #155:
I fear this question can be answered in 10 different ways by 5 different people, so is scientifically meaningless.
Countless other questions—e.g., “what is energy?”—can also be answered in 10 different ways by 5 different people, but hardly anyone claims that that makes them “scientifically meaningless.” The point is simply that many different correct answers can give complementary insights, while other answers can be rejected as straightforwardly wrong.
I’d guess that in the worst case, the “why QM?” question has the same character, while in the best case, it does have a unique maximally compelling answer after all.
Comment #270 January 27th, 2022 at 7:53 pm
Roger Schlafly #169:
You could add a stochastic process to a CA theory, but you might end up needing something like QM to explain that stochastic process.
That’s precisely my question, though: imagine that we lived in a classical CA universe that had stochastic transitions, but no quantum interference or anything like that. Would anyone even think to invent QM as a way to explain the origin of the stochastic transitions? It seems preposterous that they would. If so, though, then any convincing answer to the “why QM?” question will have to talk about more than mere stochasticity.
Comment #271 January 27th, 2022 at 8:02 pm
Gerard #194:
The problem with science is that it gets its epistemology exactly backwards. It starts with what it sees “out there” when in fact what we see “out there” is something we can never have certain knowledge of, as most philosophers have understood at least since Plato. The one thing that we do have certain knowledge of is “I AM” and I think the fact that your own religious tradition equates that statement with the name of God is something it would be worth reflecting on.
You do realize that some people would be amused by the spectacle of monks sitting cross-legged on a mattress, or for that matter rabbis swaying in a yeshiva, lecturing the people who built spacecraft, microchips, and nuclear reactors on how the latter “got their epistemology exactly backwards”? 🙂
PS. As for the question I asked in my last comment, sorry, I fear my feeble monkey brain failed me once again and I forgot that arithmetic is polylog time, not polynomial time, in the quantities on which it operates.
Thanks for saving me the need to answer! But you might be interested to learn that, as soon as you ask about analogous questions involving square roots—e.g.,
√a₁ + … + √aₙ ≤ √b₁ + … + √bₙ?,
where a₁,…,aₙ,b₁,…,bₙ are positive integers, you get a problem that’s famously not even known to be in NP (let alone P), although it is known to be in PSPACE!
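The brute-force approach is simply to compute both sides to high precision (the instance below is made up purely for illustration):

```python
from decimal import Decimal, getcontext

def sqrt_sum(nums, digits=50):
    """Sum of square roots of positive integers, at `digits` of precision."""
    getcontext().prec = digits
    return sum(Decimal(n).sqrt() for n in nums)

# A toy instance: is sqrt(1) + sqrt(8) <= sqrt(2) + sqrt(7)?
a = [1, 8]   # ~3.8284...
b = [2, 7]   # ~4.0600...
print(sqrt_sum(a) <= sqrt_sum(b))   # True
```

The catch, and the reason the problem isn’t known to be in NP, is that the two sides can agree on enormously many leading digits, and no polynomial bound has been proven on how many digits of precision suffice in the worst case.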
Comment #272 January 27th, 2022 at 8:11 pm
Sid #197:
You may be interested in this paper which derives that the only possible reference-frame transformations are Galilean and Lorentz, given 4 intuitive axioms…
The point that that paper gives as its raison d’être—namely, that (as we now understand) the fundamental importance of c has nothing to do with the existence of apparently massless particles (photons), which, if they are massless, happen to travel at the speed c, and instead has everything to do with c being the conversion factor between space and time—I’d always taken as obvious and well-known. So, is there anything else in that paper that I couldn’t have gotten from Albert E. in 1905? 🙂
Comment #273 January 27th, 2022 at 8:19 pm
Corey #199:
Suppose there really wasn’t anything wrong with choosing a different, classical, starting point. How would this knowledge morph the nature of your inquiry into QM?
That’s an excellent question; thanks Corey! My answer is this: if it turned out that “Aaron Scottson,” or whoever, could just as easily have existed in a classical world, then I would take that as “settling” Q1, albeit in a negative way—analogous to how Paul Cohen’s independence proof “settled” the question of the Continuum Hypothesis. That is, it would tell me that there’s definitely no deep explanation for QM to be had, that it was a-priori just as plausible that we’d have found ourselves in a classical world, and the choice basically just boiled down to a dice roll. That itself, ironically, would be some of the most worthwhile and interesting knowledge I could imagine!
Comment #274 January 27th, 2022 at 9:17 pm
Scott #272:
Einstein had to assume the existence of an invariant speed (and then showed that that + no preferred reference frame implied Lorentz transformations).
This paper shows that, assuming homogeneity of space and time, isotropy, the assumption that transformations (between reference frames) form a group, and causality, the only valid transformations are the Lorentz ones (so the existence of an invariant speed is *derived* from the axioms rather than *assumed*) OR the Galilean ones — where the “invariant speed” is infinite.
There are no other possible transformations.
Comment #275 January 27th, 2022 at 9:21 pm
Sid #274: I see; thanks! That’s indeed a stronger statement. I’d assumed someone had proved something like it, but I didn’t know who, when, or where.
Comment #276 January 27th, 2022 at 9:30 pm
Richard Feynman said
“I think I can safely say that nobody understands quantum mechanics.”
And, in the set of all possible mechanisms that nobody would be able to understand, God just picked the simplest.
Comment #277 January 27th, 2022 at 9:31 pm
I don’t know if anyone has already suggested this but,
“it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”
Is super-mega-untrue and probably the answer to your question. Scientists before and in the early days of QM were unable to come up with classical mechanisms to produce things like atoms and molecular bonds. There was the “ultraviolet catastrophe,” where thermodynamics didn’t work. There was enormous effort put in to describing stuff using classical mechanics, back when that was all that existed, and it didn’t go anywhere.
It would be a lot of work to put together a historical review of all the attempts at non-quantum physical explanations for stuff that wasn’t planets or machinery, mainly because those ideas have been ignored for a hundred years on account of not working. However, if you did this, I think you’d realize how QM is truly “necessary” to have a world like ours, that’s not run by demons inside of spheres pulling levers when the spheres get close to the right kind of other spheres. 😉
Comment #278 January 27th, 2022 at 9:32 pm
To Andrei #211 and #216
*I address your argument, though this is not fully easy with this simple editor, so I write between stars.*
We have an EPR-Bohm setup: two spin-entangled particles are sent to two distant stations, A and B. The spin measurements are simultaneous (space-like separated) and performed along the same axis (say Z). Let’s assume that the result at A is +1/2. We have:
P1. The measurement at A does not change B (locality). P2. After the A measurement, the state of B is -1/2 (QM prediction).
*No, first you should define what a ‘state’ is. In my definition a ‘state’, which I call a modality, only makes sense if the measurement context is defined. So one should write:
P2. After the A measurement, the state of B is -1/2, with respect to an orientation defined by the A context (QM prediction).*
From P1 and P2 it follows:
P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.
*No, since the prediction about the ‘state of B’ requires the A measurement, which was not done before*
Symmetrically, the spin of particle A on Z was predetermined as well.
*No, for the same reason, inverting A and B*
The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined.
*No again. The initial entangled state is predictively incomplete, and requires the A measurement to be done in order to make a meaningful prediction on the result of the B measurement; without that, B’s result is fully random*
In other words:
C1: locality can only be saved by introducing deterministic hidden variables (the spins before measurements, or some other property that uniquely determines the results).
*No*
Then we have Bell’s theorem which says:
C2: Local hidden variable theories are impossible with the exception of superdeterminism.
From C1 and C2 it follows that the superdeterminism is the only possible way to keep physics local.
*No. Predictive incompleteness goes with contextual inferences, which are purely quantum, but require neither (Bohm-type) nonlocality nor superdeterminism. More details in https://www.mdpi.com/1099-4300/23/12/1660 *
I think I can spot the problem with your paper. In Chapter 5, below Figure 1 you say:
“The resulting predictions can be effectively checked in the verification zone V in the common future of all light cones.”
I think this is irrelevant. True, the experimental records can only be compared at V, but one can look at the time at which the measurements were performed (the time printed on those records) and conclude that the prediction of A about B was true immediately after the A measurement. It does not become true at V.
* The prediction was true, but B’s particle was not affected by A’s measurement whatsoever. Only the contextual inference by A is true, and it can only be checked later, in a fully causal way. And it can be checked that no contradictions arise when inverting A and B*
Comment #279 January 27th, 2022 at 9:33 pm
Scott P. #230
Apparently Newton’s equations aren’t even strictly deterministic:
https://en.wikipedia.org/wiki/Norton%27s_dome
In other words, determinism requires that two different states never merge into a common state (so that time can run either forward or backward), and that’s not true for Newton’s equations.
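The non-uniqueness behind Norton’s dome can be checked directly. In suitable units the dome gives the equation of motion r″ = √r, and both r(t) = 0 and r(t) = t⁴/144 satisfy it with the identical initial data r(0) = r′(0) = 0 (a quick numerical verification of the algebra, nothing more):

```python
import numpy as np

# Norton's dome (suitable units): equation of motion r'' = sqrt(r).
t = np.linspace(0.0, 2.0, 201)

r_rest = np.zeros_like(t)       # solution 1: particle sits at the apex forever
r_slide = t ** 4 / 144.0        # solution 2: spontaneously starts sliding at t = 0

# Both have r(0) = r'(0) = 0; check that the nontrivial one solves the ODE:
lhs = 12.0 * t ** 2 / 144.0     # exact second derivative of t^4/144
rhs = np.sqrt(r_slide)          # sqrt(t^4/144) = t^2/12
print(np.max(np.abs(lhs - rhs)) < 1e-12)   # True: two solutions, same initial data
```

So the initial-value problem has (at least) two solutions, which is the advertised failure of determinism; the culprit is that √r is not Lipschitz at r = 0, so the usual uniqueness theorem for ODEs does not apply.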
Comment #280 January 27th, 2022 at 9:37 pm
Not That Simple #277:
I don’t know if anyone has already suggested this but,
“it would seem like child’s-play to contrive some classical mechanism to produce the same effect, were that the goal.”
Is super-mega-untrue and probably the answer to your question. Scientists before and in the early days of QM were unable to come up with classical mechanisms to produce things like atoms and molecular bonds…
That has indeed been suggested at least a dozen times. 🙂
Briefly, “a world like ours” is not part of what I’m requiring here—I merely insist on “a world able to support complex life and intelligence.” It’s conceivable that you’re still right and there’s no simple classical way to get that, but it’s now far less obvious!
Comment #281 January 27th, 2022 at 10:48 pm
Scott, may I ask if you have any thoughts on question #3 in my comment #206? I apologize for it not being entirely on-topic, but at least it has the advantage of being a very concrete technical question that you may be able to completely resolve in a few sentences (unlike most of the “big” questions in this post and its comments!).
Comment #282 January 27th, 2022 at 11:28 pm
My 1000iq wacky justification for qm goes like:
Universe is geometric, and obeys relativity, and has a notion of distance
1. Rotation invariance implies spinors (“square roots” of geometry) somehow
2. Spinning things and waves are basically the same thing
3. Waves are naturally described by complex numbers (I think you can avoid this but you get something that’s isomorphic to the complex numbers regardless)
4. An angular distance then gets you the Fubini–Study metric and all the wacky stuff it implies
All this has to be compatible, so out pops a Kähler manifold, which in combination with the Fubini–Study metric forces a lot of quantum-mechanical behavior
* I have no idea why spin is so privileged, but momentum and energy are related to translations, and they are important. Without spin we don’t get any interesting behavior (or stable matter). That, and the fact that it shows up in the purely geometric parts of general relativity (torsion), leads me to think there’s something deeper afoot.
I have no justification for the particular values of parameters or forces and such, that just seems like some extra spice
Comment #283 January 27th, 2022 at 11:30 pm
I specifically want to argue against the idea that there’s some association of quantum mechanics and consciousness. I think it’s a good exercise to try to imagine porting your brain to a classical algorithm – which we can do, since as far as we can see the brain is not a quantum computer – and then imagining having the concrete experience of that universe. You wouldn’t have truly (quantum) random events, but you’d still have unpredictable, chaotic outcomes, so you’d still need to model multiple futures, and you’d still experience making decisions and navigating state space, even in a non-branching block universe. I worry that this sort of thought – that reality has to be quantum because consciousness is quantum-related – arises more from the fact that we don’t understand consciousness than from anything we do understand about it.
Anyway, my idea is that qm is just the simplest setup where a small seed state can give a sufficient variety of outcomes for life to exist. I think life will turn out to be so unlikely that the universe has to allow an unbounded continuum of branches to find some with life in it at all.
Comment #284 January 27th, 2022 at 11:34 pm
An interesting related question: what would a universe look like that had Quantum Mechanics, but not Relativity?
Could you have a quantum Ether?
Comment #285 January 27th, 2022 at 11:43 pm
I proffered a “pourquoi-pas?”-type answer to your Q1 in my first slide at https://cse.buffalo.edu/~regan/Talks/QUnion.pdf
Comment #286 January 27th, 2022 at 11:48 pm
Ted #281: OK, a couple responses:
– Good catch, you indeed caught me contradicting what I wrote 15 years ago, when I was apparently more impressed with the (nAnB)² = nA²nB² argument for amplitudes to be complex. I believe the explanation is simply this: I was genuinely impressed with the “parameter-counting” argument the first time I saw it (which would’ve been either in Lucien Hardy’s work or in Chris Fuchs’s). But then, the more I saw it invoked, the less impressed I became—because like, if there’s this standard axiom that everyone knows to throw in for the sole purpose of ruling out real amplitudes, then why not just rule out the real amplitudes by fiat and be done with it? 🙂 I’ll leave it to you to decide if this is rational.
– Regarding your question about normalization: yeah, that bugged me too. But I believe the resolution is simply this: the parameter-counting only matters insofar as it leads to the possibility of local tomography: that is, reconstructing any bipartite mixed state ρAB solely from the correlations between the outcomes of measurements on A and B separately. So, what do we need for local tomography? Well, assume for simplicity that A and B are both qubits. Then as you correctly pointed out, when we include normalization, the number of independent real parameters needed to specify ρAB is only
4² – 1 = 15,
not 16.
Now let’s count the number of operators needed for local tomography and see if it matches. Well, we can characterize ρAB from the expectation values of all the possible tensor products of Pauli operators: namely,
I⊗I, I⊗X, I⊗Y, I⊗Z
X⊗I, X⊗X, X⊗Y, X⊗Z
Y⊗I, Y⊗X, Y⊗Y, Y⊗Z
Z⊗I, Z⊗X, Z⊗Y, Z⊗Z
Note that there are 16 possible tensor products here because
dim(A)² dim(B)² = 16,
the important thing about the 4 Pauli operators here being that they span the 4-dimensional space of 2×2 Hermitian matrices.
AHA, but Tr((I⊗I) ρAB) = 1 for all ρAB, which means that one of the 16 tensor products is irrelevant, which means that really we only need 15 of them! An exact match, stemming from the fact that
4² – 1 = 2²×2² – 1.
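This counting is easy to verify numerically. A minimal sketch (my own illustration): reconstruct a random two-qubit density matrix from the 16 Pauli-pair expectation values, of which ⟨I⊗I⟩ = 1 always, leaving exactly 15 informative numbers:

```python
import numpy as np
from itertools import product

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I2, X, Y, Z]

# A random 2-qubit density matrix: rho = A A† / tr(A A†) is positive, trace 1.
rng = np.random.default_rng(1)
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = a @ a.conj().T
rho /= np.trace(rho)

# Local tomography data: the expectation value <P⊗Q> for all 16 Pauli pairs.
exps = {(i, j): np.trace(rho @ np.kron(paulis[i], paulis[j])).real
        for i, j in product(range(4), repeat=2)}

# <I⊗I> = tr(rho) = 1 always, so only 15 of the 16 numbers carry information,
# matching the 4^2 - 1 = 15 real parameters of a 2-qubit density matrix.
print(abs(exps[(0, 0)] - 1) < 1e-12)    # True

# Reconstruction: rho = (1/4) * sum over all pairs of <P⊗Q> P⊗Q.
recon = sum(exps[(i, j)] * np.kron(paulis[i], paulis[j])
            for i, j in product(range(4), repeat=2)) / 4
print(np.allclose(recon, rho))          # True
```

The factor 1/4 comes from the Pauli tensor products being orthogonal under the Hilbert-Schmidt inner product with squared norm 4.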
Comment #287 January 28th, 2022 at 12:04 am
Scott #280:
“I merely insist on “a world able to support complex life and intelligence.””
I understand that many cellular automata fit this bill. They would contain people that thought they were having this very conversation, if the initial conditions were random and infinite.
I can’t think of a way to structure the problem as to forbid that answer. PRNGs can generate “random” initial conditions with extremely low Kolmogorov complexities, so you can’t ask for simple initial conditions. You can’t ask for small world either, because one trait of our universe is that it is much larger than it has to be to contain us.
Maybe figuring out how to forbid “Boltzmann brains in the Game of Life” as an answer will be a helpful step.
Comment #288 January 28th, 2022 at 12:12 am
Clinton #203: The part you’re missing is that, if someone in the 1800s had asked “why is classical mechanics true? why does it have the features it does?,” with hindsight that would’ve been one of the best questions they could possibly have asked! Because it would’ve had nontrivial answers! Albeit, answers that were only discovered later. For instance:
Why is classical mechanics time-reversible? Why does it satisfy a least action / Euler-Lagrange principle? The answers would come from QM.
Why does the gravitational force fall off like 1/r²? Why are gravitational and inertial mass the same? The answers would come from GR.
In other words, there really were deeper principles waiting to be discovered (deeper principles expressed, yes, using math). So your thought experiment strikes me as supporting optimism, rather than pessimism, about the search for deeper principles underlying QM!
Having said that, there’s an immense irony here: physicists were ultimately able to explain classical mechanics in terms of deeper theories, in large part because they discovered that classical mechanics wasn’t exactly right. The corrections were what led them to the deeper theories from which classical mechanics was then recovered as an excellent approximation.
So we’re led to the following picture:
– If (as you seem to think) QM isn’t exactly true, just like classical mechanics wasn’t, then we should ultimately be able to explain QM in terms of something deeper.
– If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).
Comment #289 January 28th, 2022 at 12:15 am
Wrote to you more philosophical thoughts by email, but here is a quarter-serious answer to Q1:
God made the universe quantum mechanical so we could have Shor’s algorithm.
Comment #290 January 28th, 2022 at 12:16 am
Stewart Peterson #204:
Are you aware of a treatment at, say, the advanced undergraduate to beginning graduate level, which describes QM from a mathematician’s perspective – the way you understand it yourself?
You could try my Quantum Computing Since Democritus! Or my undergrad quantum information lecture notes! Among countless other resources … but you asked me! 🙂
Comment #291 January 28th, 2022 at 12:21 am
Regarding tensor products,
if, instead of quantum-mechanical, the universe were merely probabilistic, so that instead of a state vector we had a probability distribution, regarded as a vector in L^1(the state space),
wouldn’t we still end up with tensor products for combining different systems?
The time evolution operator would still be linear and preserve total probability, right?
It’s not that the derivative of the state vector (probability distribution) with respect to time would be a linear function of the current state vector (actually, would it be? hm), but that the map from “probability distributions at time t_1” to “probability distributions at time t_2” would be linear, right?
And, if that works for sub-systems, then, if you have two sub-systems together, then it seems like the tensor product is just about the only thing you could do. (?)
I mean, assuming that you didn’t want to actually sample a particular state, and instead wanted to just keep updating a probability distribution.
So, except for the “you could just sample a single state and stochastically update it over time, instead of keeping track of a probability distribution changing over time” option (which admittedly is a big thing to except), I’m not sure that “the dimension of the state vector for large systems is exponentially large” is really a reason to disfavor QM compared to merely probabilistic theories.
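For what it’s worth, the linearity and tensor-product structure of classical probabilistic evolution suggested above can be checked numerically (the stochastic matrices and distributions below are made-up examples):

```python
import numpy as np

# A classical "state vector" is a probability distribution; evolution by a
# column-stochastic matrix T is linear and preserves total probability, and
# independent subsystems combine via the tensor (Kronecker) product.
T_a = np.array([[0.9, 0.2],
                [0.1, 0.8]])          # column-stochastic: columns sum to 1
T_b = np.array([[0.5, 0.3],
                [0.5, 0.7]])

p_a = np.array([0.6, 0.4])            # distribution over subsystem A's states
p_b = np.array([0.1, 0.9])

# Joint distribution of two independent subsystems = tensor product:
p_joint = np.kron(p_a, p_b)

# Evolving the joint system with kron(T_a, T_b) agrees with evolving each
# subsystem separately and then combining -- linearity plus tensor structure,
# just as in QM, but with nonnegative reals instead of complex amplitudes.
evolved_joint = np.kron(T_a, T_b) @ p_joint
evolved_parts = np.kron(T_a @ p_a, T_b @ p_b)
assert np.allclose(evolved_joint, evolved_parts)
assert np.isclose(evolved_joint.sum(), 1.0)   # total probability preserved
```

Note the reversibility worry in the comment also shows up here: a generic stochastic matrix, unlike a unitary, has no stochastic inverse.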
Also, hey, can you have anything substantial which is stochastic in this way but also where the state vector evolution through time is reversible? It seems to me like the answer would be no, assuming that it is possible to reach a state in more than one way.
Like, can you get all of “for the simulator, it is deterministic, but internally it is effectively random”, “separate parts of the system can be considered separately”, “for the simulator, the time evolution is reversible”, and “there is more than one way to get the same* result” any other way?
(I say “for the simulator” not out of a belief in a simulator, but just as an easier way to phrase things.)
I guess if “the same* result” had stuff that was actually different at a microscopic level, but which couldn’t be internally observed, then you could; but at that point you are basically just including the history of the world in the current state of the world in order to make the time evolution reversible, and that seems like cheating?
Comment #292 January 28th, 2022 at 12:24 am
Scott
It seems like the David Deutsch worldview is at least interested in Q1 and has some opinions on it. From a recent reading of TFOR (The Fabric of Reality), some candidate directions are: (1) QM is needed to resolve time travel paradoxes; (2) QM is needed to provide foundations for moral realism; (3) QM is needed to provide foundations for information, and specifically biology and intelligence. My guess is that a better understanding of this worldview yields other opinions about why QM is needed for biology, epistemology, and computing to make sense.
I think it could be productive to address the David Deutsch world view as a whole as a way forward. I for one would love to see criticism and progress there.
Comment #293 January 28th, 2022 at 12:34 am
If I’m compelled to write, this is a dangerous post. Wheee…
I have more to read, but here are lots of quick things that may be nonsense:
– If you wanted to say this is the simplest mechanism allowing a certain complexity, that is hard because anything Turing complete admits a vast complexity.
– I think “why quantum” could be connected to consciousness, maybe. A much deeper meditation on Penrose’s idea that real understanding can transcend the bounds of ordinary logic to draw meta-conclusions… But that’s a half-formed idea, and pretty crackpot.
– I met Andreas O. Tell on IRC several years back and he had a fun preprint. In the field where I got my master’s, condensed matter, there’s a technique we’re always using called density functional theory (DFT). The basic idea is to reduce the state space by doing an eigenvector decomposition of the state matrix: keep only the states that have the highest several eigenvalues, and try to reduce the dimensionality that way.
Tell’s preprint argues that this offers a different interpretation of QM, where essentially some aspect of consciousness is able to push the eigenvalues around arbitrarily on the state matrix, as long as they don’t cross. This means that the eigenstate for the highest eigenvalue is kind of our “best guess of reality”: we can push that eigenvalue to one while all the other eigenvalues drop to zero. From this he tries to rederive the Born rule by stating that information comes into some confined volume at the speed of light, we have to readjust our understanding based on this new information, and basically the eigenvalues can cross with some probability.
So if I’m connecting to consciousness then that’s probably my angle, these are the prerequisite laws to have a vantage point and we only see the world with a vantage point. Less airy fairy, more brutal physical.
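For readers unfamiliar with the truncation step described above, here is a minimal sketch of keeping only the dominant eigenvectors of a density matrix (the 4×4 matrix is a made-up example, not real DFT output):

```python
import numpy as np

# Diagonalize a density matrix and keep only the eigenvectors with the
# largest eigenvalues -- the dimensionality-reduction idea the comment
# describes.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
rho = A @ A.T
rho /= np.trace(rho)                   # valid density matrix: PSD, trace 1

evals, evecs = np.linalg.eigh(rho)     # eigenvalues in ascending order
k = 2                                  # keep the k dominant states
top = evecs[:, -k:]
rho_trunc = top @ np.diag(evals[-k:]) @ top.T
rho_trunc /= np.trace(rho_trunc)       # renormalize after truncation

# rho_trunc is the best rank-k approximation of rho in this spectral sense.
assert np.isclose(np.trace(rho_trunc), 1.0)
```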
Comment #294 January 28th, 2022 at 12:36 am
A purely classical universe wouldn’t take any time to go from start to end
Comment #295 January 28th, 2022 at 12:50 am
Tim Maudlin #208: I’ve given you a bunch of criteria of simplicity according to which Newtonian mechanics will very often win (for example, against Ptolemaic epicycles, or Aristotle’s teleological universe), but according to which it will also sometimes lose (for example, against Conway’s Game of Life). The fact that all the reasonable criteria I can think of agree in these judgments is what gives me confidence that they’re pointing to something real. Even if you deny that, though, I hope you’ll grant me this: at least I didn’t send Newtonian mechanics totally unarmed into the Occam’s Razor gladiatorial arena, while arming its opponents. I tried to set up a fair fight.
You, by contrast, have set up “criteria of simplicity” according to which nothing besides our laws of physics could ever possibly win. Our laws are the simplest because they’re our laws, and because everything else has to be expressed in terms of them, whereas our laws can just effortlessly express themselves. Thus, all your judgments about the simplicity of our laws are entirely tautological: the contest you’ve set up is a bloodbath, F=ma just mowing down unarmed opponents with a machine gun.
I’m frankly amazed that a philosopher of science either wouldn’t notice that or wouldn’t care!
As for Bohmian mechanics, you keep making statements that I can only understand if I presuppose Bohmian mechanics is true! For example, you say that Bohmian mechanics “tells you” which slit the photon went through, as a function of where it hit the second screen, which is new information that you couldn’t have gotten without Bohm.
I, by contrast, would’ve put the matter thus: Bohmian mechanics tells you a story about which slit the photon went through—a story that, by construction, has no causal effect on anything you can observe later. A different story, one equally compatible with all the predictions of QM, could’ve told you that the photon went through a different slit. (Admittedly, with 2 slits and 2 spatial dimensions, there happens to be a unique local, probability-preserving way to divvy up what goes where. But that’s a very special case and won’t generalize to when we include multiple photons, qubit degrees of freedom like polarization, etc.) So then, I’d say that a physicist is free to adopt any such hidden-variable story, or none of them, according to convenience.
When someone with a more Everettian mindset thinks about Bohmian mechanics, it seems to decompose into three claims:
(1) Yes, there’s a wavefunction of the universe, just like in MWI, and it evolves unitarily just like MWI says it does, with no special role for “measurement.”
(2) Crucially, only one branch of the wavefunction has “anybody home” to experience anything; the other branches exist but are all “ghost towns.” (I don’t mean to dismiss or ridicule this claim; it’s interesting and for all I know it might be true!)
(3) In order to pick out which branch has “anybody home,” in a way that agrees with the usual Born rule at any individual time, we ought to tell one particular kind of story about nonlocal hidden variables defined in the particle position basis. (Well, at least in nonrelativistic QM; it’s unclear what we ought to do in QFT or quantum gravity.)
I don’t know whether you can engage with someone who understands Bohmian mechanics in this way, or whether the chasm is simply too great. In the latter case, maybe we should just drop this thread, since it isn’t directly relevant to the subject of the post.
Comment #296 January 28th, 2022 at 12:57 am
Scott,
“By “useful” information, Sakurai clearly means any information whatsoever that one can choose to specify at point A.”
If you take the non-local route (denying P1: the measurement at A does not change B), it necessarily follows that one bit of information (the measurement result at A) was instantly sent to B (so that B is changed accordingly). Agreeing that an UP result is represented by 1 and a DOWN result by 0, a measurement sequence UP, UP, DOWN, UP consists of 4 bits: 1101. Those 4 bits are sent instantly to B. B can access them instantly by performing the corresponding measurements at his location.
“It’s a theorem in QM — I prove it every year in my undergrad class — that none of that can be transmitted.”
You DID transmit those 4 bits, 1101. What the theorem proves is that you cannot use those bits to transfer some other information of your choice, like a picture of a cat, or your name, or whatever. But, as I pointed out earlier, SR is not concerned with the content of the message. So, if your “choice” is to send 1111, you cannot do that in a controlled manner. But this is completely irrelevant, a red herring. SR does not have any special postulate that distinguishes between information that you “choose” to transfer instantly and information that you don’t choose. It’s just like saying that SR is in perfect agreement with solar flares producing instant effects at Alpha Centauri just because you cannot control solar flares, so you cannot send cat pictures with them. Just a big, ugly red herring.
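For reference, the statistics both sides of this dispute agree QM predicts can be illustrated with a Monte Carlo sketch (the angles and sample size are arbitrary choices): whatever angle A measures at, B’s local record, taken by itself, is an unbiased coin, which is the content of the no-communication theorem.

```python
import numpy as np

# Monte Carlo sketch of singlet-pair statistics: Alice's result is a fair
# coin, and for analyzer angles differing by delta, QM gives
# P(same outcome) = sin^2(delta/2).
rng = np.random.default_rng(1)

def sample_pair(theta_a, theta_b, n):
    """Sample n singlet outcome pairs at the given analyzer angles."""
    a = rng.choice([1, -1], size=n)                # Alice's result: fair coin
    delta = theta_a - theta_b
    same = rng.random(n) < np.sin(delta / 2) ** 2  # QM: P(same) = sin^2(d/2)
    b = np.where(same, a, -a)
    return a, b

n = 200_000
_, b1 = sample_pair(0.0, 0.3, n)        # Alice measures at angle 0
_, b2 = sample_pair(1.2, 0.3, n)        # Alice measures at angle 1.2
# Bob's marginal is ~50/50 in both cases: from his local data alone he
# cannot tell which angle, or which outcomes, Alice had.
assert abs(b1.mean()) < 0.02 and abs(b2.mean()) < 0.02
```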
“He means to exclude correlations, which occur even in classical probabilistic theories and are considered totally unproblematic there…”
Those classical correlations can always be explained locally. This is made explicit in field theories like classical EM.
“…but which can be stronger in QM in a way that helps A and B win certain nonlocal games, which is the whole content of the Bell inequality.”
Again, if you deny P1, you have an explicit non-local information transfer. It is that transfer that allows you to win those games.
“SR doesn’t care about Bell-type correlations.”
This is because those correlations CAN be explained locally as well, in a superdeterministic context. But once you make the choice to reject superdeterminism, and deny P1, that local explanation is not available anymore, and suddenly SR cares about them.
“We could see that abstractly, even if we didn’t have successful relativistic QFTs that show it explicitly.”
Again, QFT does not make a clear choice of accepting or rejecting P1. If you make that choice and add to it the postulate:
The measurement of A causes an instantaneous change at B
you get a not-so-successful QFT anymore. You introduce an asymmetry between the A and B measurements (A is random and it causes B, while B, being caused by A, is not random). Good luck making that compatible with SR.
“That QM and special relativity are difficult but not impossible to reconcile, and that the rare theories that do successfully reconcile them are so phenomenally good at describing the world, is a powerful indication that physics is on the right track here.”
QFT itself is on the right track (since it does not explicitly deny P1), but denying P1 derails it. Accepting P1, and so superdeterminism, is the only way to stay on that track.
Comment #297 January 28th, 2022 at 1:01 am
To Anbar #258
Well, I also know the history of QM, and I tell it each year to my students, who like it. But if ‘nothing else could be simpler than this’, why did Scott launch this (pretty animated…) blog discussion, which by the way follows myriads of books and papers claiming to ‘answer the question’? Some piece must be missing somewhere… ?
Comment #298 January 28th, 2022 at 1:16 am
Slight caveat in that in MWI nobody is home anywhere 🙂
Well I mean kinda. Think of MWI like a big graph, a Hamiltonian of the Universe says this state leads to that state leads to this other state, that’s the edges. People take this graph and put some complex numbers on some nodes and then evolve it through successive timesteps, moving the complex numbers to other nodes, based on some complex numbers attached to the edges. But if you’re a true disciple this is a silly game. All of the worlds exist and they exist “right now” and they contain completely deluded consciousnesses that are for some reason convinced that they are moving according to this graph and not its dual… But they are frozen in time and place, their notion of movement is illusory.
It shares this philosophical queerness with Minkowski space… A true disciple of Minkowski space thinks first that the whole worldline is conscious and the notion of the consciousness moving along the world line is a local phenomenon which is quite absurd in the global picture… then maybe later has to come to believe in the idea of a soul zipping across the worldline to explain why we actually think things are changing.
It’s not bad, but probably to get forward motion on philosophy of consciousness we need some sort of huge revolution coming from process philosophy… Literally that we cannot be conscious without changing, so that any static state being conscious is absurd. Like when you are rewatching Heroes and Hiro Nakamura freezes time, we just take for granted that his consciousness keeps going because he is able to keep changing, while all of the change around him has slowed to a near standstill, we Intuit that those consciousnesses have stopped.
Comment #299 January 28th, 2022 at 1:23 am
@Boaz Barak #289 Hah! That’d be fun. But surely we’d want Grover’s algorithm, the idea being that evolution is some big computer and God wanted to solve for the history that generates conscious life, but He wanted that precious √N speedup because it was taking too damn long.
Comment #300 January 28th, 2022 at 1:31 am
Philippe Grangier,
P1. The measurement at A does not change B (locality). P2. After the A measurement, the state of B is -1/2 (QM prediction).
*No, first you should define what a ‘state’ is. In my definition a ‘state’, that I call a modality, only makes sense if the measurement context is defined. So one should write :
P2. After the A measurement, the state of B is -1/2, with respect to an orientation defined by the A context (QM prediction).*
As clearly specified in my description of the experiment, the detectors are fixed on the Z axis before the experiment. The Z axis is unambiguously defined (say it points towards the galactic center, or whatever). Therefore, the experimental context is clear for both A and B. This context does not change between repeated measurements (a difference from a Bell test, where the detectors are reoriented between runs).
From P1 and P2 it follows:
P3: B was in a -1/2 spin state even before the A measurement. The spin of particle B on Z is predetermined.
“*No, since the prediction about the ‘state of B’ requires the A measurement, which was not done before*”
I don’t get this. Of course the prediction about B requires the A measurement. This is exactly the point. You measure A and the result allows you to predict B. You predict from A what the experimental record that will arrive from B contains.
If you add the locality condition, that the measurement at A (which enables the prediction) did not cause any change at B, then, since B is left by the A measurement in the -1/2 spin-on-Z state, B must have been in the -1/2 spin-on-Z state even before the A measurement; otherwise you have a change.
The only assumptions here are 1. locality and 2. QM gives correct predictions. From these it logically follows that the measurement results are predetermined.
“*No again. The initial entangled state is predictively incomplete, and requires the A measurement to be done in order to make a meaningful prediction on the result of the B measurement;”
Again, my argument does not deny that you need the A measurement to predict B, it actually needs that. So you didn’t show any problem with the argument.
“without that, B’s result is fully random*”
Without the measurement at A we cannot predict B. This does not make the B result “fully random”. We just can’t say anything about it.
“*Predictive incompleteness goes with contextual inferences, that are purely quantum, but do not require neither (Bohm-type) nonlocality, nor superdeterminism. More details in https://www.mdpi.com/1099-4300/23/12/1660 *”
Again, you didn’t show that any premise in my argument is wrong or unjustified. You simply reiterated P2 (the state of B is -1/2 after the A measurement). In order to refute the argument you need to deny P2 (or P1).
“* The prediction was true, but B’s particle was not affected by A’s measurement whatsoever. Only the contextual inference by A is true”
Well, if the B particle was not affected by A, but its state is -1/2 on Z after the A measurement, what state did B have before the A measurement? Any answer that is not -1/2 on Z would imply that the A measurement DID change B.
“..and only be checked later, in a fully causal way.”
Sure, but this does not concern my argument at all.
“And it can be checked that no contradictions arise when inverting A and B*”
This does not concern my argument either.
Comment #301 January 28th, 2022 at 1:37 am
I noticed that Scott repeatedly refers to an ‘exponentially larger state space for all of reality’, a sentence taken from 2. in his introduction. I have a remark on that: for N qubits, the dimension of the state space is 2^N, but what happens when taking the (mathematical, not physical) limit of N going to countable infinity, aleph_0? It is well known (from Cantor) that 2^{aleph_0} is uncountable (the cardinality of the continuum), and thus the Hilbert space is no longer separable, and is essentially non-manageable. So there is some kind of asymptotic instability of the Hilbert space structure; but what happens then?
This is discussed in detail by von Neumann in http://www.numdam.org/item/CM_1939__6__1_0/ (free access), which is a kind of ancestor of operator algebras, the GNS construction, etc. What Johnny says is that the unmanageable Hilbert space ‘blows up’ into parts that are essentially disconnected from each other, and that are manageable (separable) again; in modern language, they would be called superselection sectors. These sectors are usually type III in the Murray–von Neumann classification, but in this 1939 paper Johnny is very careful about that, because the existence of type III factors was not even certain at that time.
His paper is very technical (though he uses only very basic maths), but I really recommend that you read the first introductory section, which is quite illuminating. Interpreted in physical terms, it says that the (mathematical) limit of countably infinite N behaves essentially like classical physics, because the operators relative to different sectors all commute. Also, there is no need to specify ‘all details’ in each sector, which is unfeasible; since the sectors are disconnected, a macroscopic label is enough to identify each of them, in a fully probabilistic approach. This is quite remote from MWI, but quite interesting, I think…
There are more details in https://arxiv.org/abs/2003.03121 (published in Found. Phys.).
Comment #302 January 28th, 2022 at 1:41 am
Isaac Grosof #65 and #139.
I think you are making an implicit assumption that anisotropic microscopic rules must lead to anisotropic macroscopic dynamics. I agree that this is probably true for CGoL and other similar automata, but I don’t see why it should necessarily be true for all automata.
The closest thing to a counterexample that I know of is the Arctic Circle Theorem: you start with simple anisotropic rules (placing dominoes on a square grid), but end up with something that has rotational symmetry (a circle!). But it’s not really dynamical in a way you might expect laws of physics to be dynamical, and it doesn’t even have translational symmetry.
Anyway, I think this is an interesting open(?) question. Can a cellular automaton be “approximately isotropic on large enough scale”, for some definitions of those words?
Comment #303 January 28th, 2022 at 2:03 am
async #143, Scott #262.
I think both of you are missing a third possibility: you can have Lorentzian, Galilean, or Euclidean geometry. Greg Egan’s Orthogonal trilogy is awesome, go read it if you haven’t 😀
Of course this doesn’t really affect either of your arguments. You just have to change 1/2 to 1/3. And if you want locality, then Lorentzian is still the only option.
Comment #304 January 28th, 2022 at 2:15 am
I want to add one more rule – there is no such thing as wave function collapse. Apparent wave function collapse, the Born rule and classical physics are implications of how the brain works. If we were instead observers outside the universe who could see “reality”, we would see…a wave function that never collapses.
Why the complex valued amplitudes, unitary transformations and tensor products? As far as I know, just because.
Comment #305 January 28th, 2022 at 2:37 am
To Andrei #300
If you only consider the situation where the two orientations are the same (perfect correlation), it is well known that you can construct a simple local classical model : just assume that for each pair the two spins have opposite orientations along the same (random) direction, determined when they are emitted. But such a model obeys Bell’s inequalities (BI), so you just ignore the difficulty, which is that they are violated when considering other orientations.
My whole point is to discuss what happens when the orientations are different, and when Bell’s inequalities are violated (Clauser 1972, Aspect et al 1982, loophole-free 2015). Then the previous naive model fails, and one must consider what goes wrong in Bell’s hypotheses, because Bell’s reasoning is logically and mathematically correct. A good analysis was proposed by Jon Jarrett in 1984, splitting Bell’s hypotheses in two parts, called (elementary) locality and predictive completeness. The same ideas were presented again by Abner Shimony (1986), but with a different terminology : elementary locality became parameter independence, and predictive completeness became outcome independence.
The trouble with Shimony’s terminology is that violating either of these two ‘independences’, which is required to violate BI, sounds like some form of non-locality. On the other hand, one can show simply that QM agrees with (elementary) locality but violates predictive completeness, and in addition that the violation of predictive completeness has little to do with non-locality, but much to do with non-classicality. For instance, any deterministic theory must agree with predictive completeness, and thus must violate elementary locality in order to violate BI; a good example is Bohm’s theory.
As a conclusion, there is something special about QM non-determinism, which is to allow predictive incompleteness, and thereby to violate BI (in this discussion I ignored superdeterminism, which is another possibility for violation, but honestly I simply don’t like it). More details are given in https://arxiv.org/pdf/2012.09736.pdf
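The naive model described in the first paragraph, and its failure at other orientations, can be illustrated numerically. The sign-of-projection model below is one concrete instance of such a local model (a hedged sketch; the analyzer angles are the standard CHSH choices): it reproduces perfect anti-correlation at equal angles yet stays within the CHSH bound of 2, while QM’s E(a,b) = -cos(a-b) reaches 2√2.

```python
import numpy as np

# Naive local model: each pair carries one shared random hidden axis lam;
# each side outputs the sign of its (opposite) spin's projection on its
# own analyzer.
rng = np.random.default_rng(2)

def E_local(a, b, n=400_000):
    lam = rng.uniform(0, 2 * np.pi, n)            # shared hidden axis
    A = np.sign(np.cos(a - lam))
    B = -np.sign(np.cos(b - lam))                 # opposite spin
    return (A * B).mean()

def chsh(E):
    a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

# Perfect anti-correlation at equal angles, just like QM:
assert np.isclose(E_local(0.5, 0.5), -1.0)

S_local = chsh(E_local)                           # ~2: Bell bound respected
S_qm = chsh(lambda a, b: -np.cos(a - b))          # 2*sqrt(2): Tsirelson value
assert S_local <= 2.05
assert abs(S_qm - 2 * np.sqrt(2)) < 1e-9
```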
Comment #306 January 28th, 2022 at 2:48 am
Philippe #297
It is a consequence of the fact that indeterminism and a lack of substance for realism (both consequences of the empirically observed complementarity plus the laws of thermodynamics) are tough pills to swallow, combined with the fact that you don’t really need to concede either of these points before fruitfully using the formalism, as long as you engage in a conspiracy theory of some kind because “interpretations”.
Now, I’m not saying that trying to derive a more fundamental principle than “it’s the simplest thing I can concoct, that does the job” is necessarily a fruitless exercise, but the real “issue” is with indeterminism and no substance for realism.
Which brings us to Scott’s theme, which seems to be whether “complex life as we conceive it” could also have arisen in a classical (non-complementary) Universe, and leans quite strongly on the conspiracy side 🙂
The answer “yes”, if possible at all, would imply the Ultimate Rube Goldberg device (made by the Ultimate Designer to satisfy a timeless urge?). Even if it was an option, the alternative (plain QM with indeterminism and no Objective Reality) requires incalculably less assumed complexity to start with.
Why do I say this?
Complex life “as we know it” requires organization on a large range of length scales.
In a QM universe the overall complexity is bounded by the fact that the inward dive in length scales stops at a finite value, where the hydrodynamic approximation starts to fail and we find atoms and molecules with their quantized configurations and stable ground states.
In order to mimic this in a classical Universe, bound states of elementary particles won’t do, because of the un-quantized nature of classical configurations and interactions: low-frequency, high-amplitude fluctuations can unbind any bound system very quickly.
You then end up essentially with an elementary particle for at least *every state of every atomic species*, with transmutation rules that are individually assigned (can’t even use any Lorentz symmetry, as the classical Universe must be non-relativistic), and the complexity explodes.
The alternative, in which there is no end to the inward dive, seems doomed on thermodynamic grounds, much like with Lorentz symmetry and the UV catastrophe.
Is this “proof” that the answer to the theme question is “no”? I don’t know, but it’s at least circumstantial evidence.
Comment #308 January 28th, 2022 at 4:17 am
To be frank, I think the viewpoints espoused here are “too unitary”.
More concretely, I would like to point out that the actual Schrodinger equation involves a *Hermitian* operator, not a unitary one. All the quantum-mechanical discussions that rely solely on unitary operators, and in particular anything to do with quantum computation and quantum information, are in my view based on an engineering viewpoint where somehow time evolution always happens in discrete steps (and Hilbert spaces are always finite-dimensional). They might be tremendously fun, useful, and may eventually produce real insights into quantum gravity, but I do not think the answer to your questions lies there.
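One small caveat to the discrete-steps point: in the continuous picture, the Hermitian operator in the Schrodinger equation generates a unitary map at every real time t, via U(t) = exp(-iHt). A minimal numerical check of this (H below is an arbitrary 2×2 Hermitian matrix chosen for illustration):

```python
import numpy as np

# Hermitian generator of the Schrodinger equation d|psi>/dt = -iH|psi>
H = np.array([[1.0, 0.5 - 0.25j],
              [0.5 + 0.25j, -1.0]])
assert np.allclose(H, H.conj().T)      # Hermitian

# exp(-iHt) via the spectral decomposition H = V diag(w) V^dagger
w, V = np.linalg.eigh(H)
def U(t):
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

# The resulting evolution is unitary at EVERY real t, not only at
# discrete steps, and composes as a one-parameter group.
for t in (0.1, 1.0, 7.3):
    Ut = U(t)
    assert np.allclose(Ut @ Ut.conj().T, np.eye(2))
assert np.allclose(U(0.4) @ U(0.6), U(1.0))
```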
Instead, let us first realize that we need at least some physical input. Einstein also needed it, even in his derivation of special relativity: he required that all of physics, including the measurement of the speed of light, is frame-independent. One can ask if there is a similarly catchy phrase that will inevitably lead to all of quantum mechanics. The best candidate that I can come up with is one of reductionism: your theory of nature should not have an unnecessary, ugly dichotomy between waves and particles; instead it should put them both on the same footing. Good luck doing that with any classical theory! So this is where I would depart to deduce the apparently cherished inevitability of quantum mechanics…
Comment #309 January 28th, 2022 at 4:26 am
When I retire, I need to make a point to really learn QM, and keep an archive of this blog.
Apologies in advance if I am completely out of my element (applied mathematician). The only thing I have to contribute is at least in the spirit of the questions. Do we have implicit assumptions on the nature of the universe? Is it a state machine that can compute the ‘next state’ from previous? (Or some kind of async update since simultaneity is not true.) Is it a massive time evolving PDE? Is it something efficiently computable? What if it’s not?
This is where I think we haven’t quite ruled out superdeterminism. I don’t necessarily care for it, but I also do not think it necessary to assume any intention behind the initial conditions. If the universe is a giant PDE, plus some global constraints it must satisfy, only certain solutions exist. One of those could be our world. Whether they seem non-physical to compute goes more to our assumption that the universe is a computer and not a mathematical object. Personally I’ll sleep better at night if it’s not superdeterminism, but I think that’s a fundamental question worth asking.
That said I completely agree that a classical or CA chemistry should be possible and someone should absolutely try to make one. I’d play the heck out of that simulator.
Comment #310 January 28th, 2022 at 4:52 am
To Anbar #306
You write : « Now, I’m not saying that trying to derive a more fundamental principle than “it’s the simplest thing I can concoct, that does the job” is necessarily a fruitless exercise, but the real “issue” is with indeterminism and no substance for realism. »
The benefit of my approach is to exploit indeterminism in a smart way, and to save (contextual) realism, which is actually quite a decent realism from a philosophical point of view.
And then : « Which brings us to Scott’s theme, which seems to be if “complex life as we conceive it” could also have arisen in a classical (non-complementary) Universe, and leans quite strongly on the conspiracy side. »
Here I cannot tell, except with my previous restaurant metaphor, #174: if you have spinach on your plate, you may certainly ask 'why not a steak? Is the Grand Chef crazy?'. But my more modest tendency is to deal with the spinach.
Comment #311 January 28th, 2022 at 5:01 am
(I am a software engineer who wrote two quantum circuit simulators for personal exploration)
Are we as humans going to accept that reality is
generated
and generated through interactions of a few things
and so only exists for that interaction, only for its scope
and that is it and nothing more?
Related to that,
Are we going to realize that when we define a "state vector", we think we deserve to look from "God's perspective", with all the variables placed nicely together, and that there can be, for our convenience, *one* representation to represent the whole?
And that representation is a notation of precision, where amplitudes have precise and sharp values, such-and-such an imaginary number, instead of "any value but not these".
We are still trying to hold on to the chair that we sit on: the tools, notations and experiences. With them we got this far. We need to go to the very bottom and build up from there, with a notation of "not set", "free to have such liberties", "not to be put together in one representation". That means a new journey, with extreme humbleness.
Comment #312 January 28th, 2022 at 5:02 am
Philippe Grangier,
“If you only consider the situation where the two orientations are the same (perfect correlation), it is well known that you can construct a simple local classical model : just assume that for each pair the two spins have opposite orientations along the same (random) direction, determined when they are emitted.”
This is true, but the main point of the argument is that, while, as you say, you CAN build a local, deterministic hidden-variable model, you CANNOT build a local non-deterministic one. So the conclusion of the argument is that local indeterminism is falsified by this experiment. You cannot just ignore that and continue to discuss your model in the context of a Bell test or whatever other experiment. Local indeterminism is dead and buried by EPR-Bohm; we need to forget about it.
“But such a model obeys Bell’s inequalities (BI), so you just ignore the difficulty, which is that they are violated when considering other orientations.”
Here I disagree. Local classical models could in principle violate Bell’s inequality if they are models with long-range interactions, like classical electromagnetism. The presence of such interactions makes the model contextual, since the state at A and the state at B, and the state of the particle source (S) are not independent (the whole system A+B+S has to satisfy Maxwell’s equations). In other words, the hidden variable (which is determined by the state of the source at the time of emission) is not independent of the detectors’ settings. This is the so-called superdeterministic loophole.
“My whole point is to discuss what happens when the orientations are different, and when Bell’s inequalities are violated (Clauser 1972, Aspect et al 1982, loophole-free 2015). Then the previous naive model fails, and one must consider what goes wrong in Bell’s hypotheses, because Bell’s reasoning is logically and mathematically correct.”
Bell’s independence assumption is wrong for the simple reason presented above. Bell’s model ignores that long-range interactions exist even between distant systems, he basically equates classical physics with Newtonian rigid-body mechanics. The independence assumption does not make sense in any field theory (classical EM, GR, fluid mechanics and even cellular automaton models which are in fact discrete field theories).
“As a conclusion, there is something special with QM non-determinism, that is to allow predictive incompleteness, and thereby to violate BI (in this discussion I ignored superdeterminism, that is another possibility for violation, but honestly I simply don’t like it). More details are given in https://arxiv.org/pdf/2012.09736.pdf”
I think you dislike superdeterminism because you have a wrong understanding of it. But, like it or not, it’s the only local option on the table.
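Both halves of the exchange above are easy to check numerically. A minimal sketch (the sign-based outcome rule is one conventional choice of local hidden-variable model, not anything specified in the thread): each pair carries a shared random axis and the two sides output opposite signs along their own settings. The model reproduces perfect anticorrelation at equal settings, yet its CHSH value never exceeds the Bell bound of 2, while the singlet prediction reaches 2√2.

```python
import numpy as np

rng = np.random.default_rng(0)

def E_classical(a, b, n=200_000):
    # Each pair carries a shared random axis lam; Alice outputs
    # sign(cos(a - lam)), Bob the opposite sign along his own axis.
    lam = rng.uniform(0, 2 * np.pi, n)
    A = np.sign(np.cos(a - lam))
    B = -np.sign(np.cos(b - lam))
    return np.mean(A * B)

def E_quantum(a, b):
    # Singlet-state prediction for measurements along angles a and b.
    return -np.cos(a - b)

def chsh(E):
    # CHSH combination at the standard angle choices.
    a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

print(E_classical(0.0, 0.0))   # -1.0: perfect anticorrelation at equal settings
print(chsh(E_classical))       # ~2: the local model saturates, but never exceeds, the Bell bound
print(chsh(E_quantum))         # ~2.83: the singlet reaches 2*sqrt(2)
```

The classical model's correlation is linear in the angle difference, -1 + 2θ/π, which is exactly what pins it at CHSH = 2 for these settings.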
Comment #313 January 28th, 2022 at 5:09 am
Scott #264: The problem is that you can’t even define what this “classical stochastic evolution rule” is. A “freely-willed transition rule” is even woollier.
You are probably aware of the century-old difficulty in even defining what a probability is. We do have a good theory of subjective uncertainty, but that’s not good enough to power the universe, we need true randomness. Do you have an answer? What does it mean to say that state A transitions to state B with probability 2/3 or to state C with probability 1/3?
We are used to letting true randomness be simply an unanalysed primitive. We know how to deal with it mathematically (with the Kolmogorov formalism), and we know how to produce it in practice (with QRNGs), so we don’t need to know what it is. But if you are writing down the rules that make a universe tick that doesn’t cut it, you do need a well-defined rule.
The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds. And as I’ve argued before, you do need quantum mechanics to get Many-Worlds.
Another argument, that I don’t find persuasive, but I think you will, is via the Chiribella-D’Ariano-Perinotti reconstruction of quantum mechanics. They have some axioms defining what a reasonable probabilistic theory is, and show that if in addition you require purification, that is, that randomness must be generated by the theory itself, then you end up with quantum mechanics. If you want a classical probabilistic theory then you have to let your randomness be external, like we’re used to in (classical) Hamiltonian mechanics.
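The "true randomness as deterministic branching" move can be sketched concretely: instead of ever drawing a sample, evolve the full weighted set of branches as a single deterministic object. A toy illustration (the A/B/C transition rule is the hypothetical example from the comment above, completed with arbitrary rules for B and C):

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical rule: A goes to B with probability 2/3 and to C with
# probability 1/3; B is absorbing; C returns to A (my arbitrary choices).
RULE = {"A": [("B", Fraction(2, 3)), ("C", Fraction(1, 3))],
        "B": [("B", Fraction(1))],
        "C": [("A", Fraction(1))]}

def step(branches):
    # Deterministically split every branch; no random number is ever drawn.
    out = defaultdict(Fraction)
    for state, weight in branches.items():
        for nxt, p in RULE[state]:
            out[nxt] += weight * p
    return dict(out)

branches = {"A": Fraction(1)}
for _ in range(3):
    branches = step(branches)

print(branches)                # exact branch weights after 3 steps: B carries 8/9, C carries 1/9
print(sum(branches.values()))  # 1: the measure over branches is conserved
```

Nothing in this program is random; "probability 2/3" has been cashed out as the exact weight of a deterministic branch, which is the Many-Worlds reading of the rule.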
Comment #314 January 28th, 2022 at 5:15 am
Scott #190: I was merely questioning whether this question was just a scientific one. Science can answer the "How?", but not necessarily the "Why?". The way you phrase the question makes me think that it is on the border of what we can expect science to answer for sure, even with enough effort.
Don’t get me wrong: it might well be that, for instance, we find a unification of QM and GR in which the laws of QM appear logical and inevitable, and that would count as a satisfying answer to your question.
But it could also be that the fundamental laws of the universe appear needlessly complex and arbitrary to us, with no deeper explanation. I just wanted to leave room for this second alternative.
Granted, the quest for simpler explanations has been an effective guide for us so far, but we might encounter a fundamental roadblock on this path; that was the only point I was trying to make.
Comment #315 January 28th, 2022 at 5:15 am
A1. Classical physics assumes that the past is entirely known to a God. And the future is completely determined by the past. So, everything is predetermined. I find it meaningless.
A2. The rules of QM are simpler in some sense. For example, they allow us to access the Church of the larger Hilbert space. Which explains that complex things that we observe are just projections of something conceptually simpler, but which resides in higher dimension.
Unrelated to QM, the set of complex numbers contains much more harmony (and beauty, though it’s subjective) than real numbers. They are not just R^2, as we can think of them.
Comment #316 January 28th, 2022 at 5:28 am
Philippe Grangier #307: I don’t see how shuffling terminology around changes anything. Call it “outcome independence”, call it “predictive completeness”, call it contextuality. The fact of the matter remains that the probability of Bob’s outcome ‘b’ depends on Alice’s outcome ‘a’, which was produced in a space-like separated event. It’s nonlocal.
Comment #317 January 28th, 2022 at 5:56 am
On Q1: We can think of particles as wave-like excitations out of some vacuum ground state, similar to quasiparticles in condensed-matter systems. In our everyday life, waves also arise naturally out of perturbing any system… in that sense, a wave-like (and therefore, quantum) view of particles is much more natural than classical, “point-like” particles.
Comment #318 January 28th, 2022 at 6:02 am
Best speed with your latest exciting project! Personally I hope you develop it to book length.
In trying to follow this debate I’m often struck by a certain asymmetry in how it is framed: we accept the naturalness of a classical world and try to think up reasons why quantum mechanics had to be added.
For a moment, let’s allow ourselves to start with the intuition that complex numbers are a natural starting point (they’re algebraically closed, after all!). Now you can still ask Q, but you might decompose Q1 and Q2 differently. Q1* might be “Why didn’t God just keep the universe complex and be done with it?”. Q2* might be “Why this special alternative? Why force measurements to be real numbers? Why the Born rule…?”. This leads to many of the same questions and potential anthropic explanations. But might thinking about the question from both ends yield insight?
Obviously, quantum mechanics is more than just complex numbers, but really my point is that any treatment of this subject also has to deal further with issues in the philosophical foundations of maths (even further than you already did in QCSD). Didn’t Gödel hole formalism below the waterline? Yet in attempting to axiomatize physics we seem to want to keep believing! In seeing ℕ as somehow more ‘natural’ and less in need of special explanation, aren’t we already taking sides on Q?
I’d also highlight the point from antiquity about moving too quickly to talking about R as ‘really real’ numbers. When we go to school we learn to count, to measure things and to sample from distributions (play games). For all of these ‘experiments’ we only directly need the integers and the rationals (their fairly trivial extension). Even when I measure the side of a real-world triangle (using whatever physical unit my teacher indicates), the physical result is a rational number. Only when I’m taught the ‘theory of triangles’ to make sense of my physical results do I have to start using numbers in R. Isn’t this already an important clue that the world of physical theories has to use a richer mathematics than that of our native perception? The ‘Ruler rule’ helps me jump between R and the rationals. Doesn’t the ‘Born rule’ just help me with another jump between number systems?
Perhaps your new book should be titled ‘Quantum Computing since Pythagoras’?
Comment #319 January 28th, 2022 at 6:03 am
@Scott: You seem to say that you feel Q is sufficiently answered for relativity, so am I correct in taking that to mean that you would be happy with finding some set of physical principles which necessitates quantum theory? Cause you could argue that Q is *not* answered for relativity, because why would the speed of light be finite? That seems like a bit of an arbitrary choice as well, and I could easily imagine a complex life-bearing universe where causality works instantly.
But assuming that finding good physical principles for quantum theory would indeed be sufficient: many such principles have been proposed (like the Hardy and Chiribella paper you mention), and as you rightfully say, these are not fully satisfactory.
However the Chiribella reconstruction does bring up the purification axiom as a thing that differentiates the classical world from the quantum world: in a classical world it is possible to have *genuine* mixed states, where knowledge is inherently smeared, while in a quantum world any kind of mixture or uncertainty comes from not having access to the full system. This could probably be related in some formal way to “information must be preserved”, although I don’t know exactly how.
Comment #320 January 28th, 2022 at 6:07 am
Separate comment, cause separate idea.
An anthropic argument: suppose Everett is right, and additionally that our consciousness is classical. Then in our quantum universe there is an *uncountably infinite* number of classical consciousnesses existing on all sections of the universal wavefunction.
Now suppose there are many universes, some of which are classical (like the cellular automata you propose), and some of which are quantum-like (supporting superpositions of consciousnesses). Then the classical universes will only support a finite or countably infinite number of consciousnesses, while the quantum-like universes support an uncountably infinite number. Hence, probabilistically, you will always find yourself in a quantum-like universe.
That only partially answers Q1 though, as it gives an argument for why you shouldn’t find yourself in a classical universe. However, as you say, Q2 has many partial answers and there’s good reasons to believe that once you disallow a classical universe, a quantum universe is a very natural choice for several mathematical reasons.
Comment #321 January 28th, 2022 at 7:14 am
Scott, if these are the kinds of questions that interest you, don’t you think studying physics gets you closer to the answer than studying computer science and quantum computing? Studying quantum field theory, etc.?
I can also ask questions about all the non rigorous things in quantum field theory. Is there a formulation without renormalization? If not, then with which parameters does God actually run it? Do you realize that the current formulation of quantum field theory is far from being a computer program you can just postulate that God “runs”? That mathematical consistency of it has been an open problem for many decades now?
How much of physics and quantum field theory do you really understand? Don’t you think you should get a very good understanding of it (I’m not even claiming I have it; I mean understanding it at least like those physicists at CERN who actually compute things with it) if those are the kinds of questions that interest you?
Comment #322 January 28th, 2022 at 7:45 am
** If we’re living in a simulation **
Perhaps God is running an optimization using something like a genetic algorithm, trying to optimize for one or more parameters that are important (for some reason). Statistically, we’re probably not the optimal solution, and with a genetic algorithm there are lots of offspring that have nonsensical values. Let’s hope they don’t notice and kill off this descendant while I’m still typing.
Comment #323 January 28th, 2022 at 8:19 am
I’ve written reams and reams on this before
But basically, I think it comes down to causality. In classical mechanics, there’s a time dimension and particles are, in principle, distinguishable from one another. So you could take two electrons and trace their path backwards or forwards through time.
You can meaningfully ask “Where was electron B two hours ago?”
This is the change in quantum mechanics. There are only individual instants of time and there’s no causal connection between these instants, and no causal link between individual particles, making them interchangeable.
You *can’t* meaningfully ask “Where was electron B two hours ago?” Because it’s not meaningful to compare electron B to any other electron.
I’ve put it in detail in the last section here.
https://thesmalluniverse.net/pages/QuantumMechanics.html
Comment #324 January 28th, 2022 at 8:29 am
Thank you so much Scott for asking this wonderful question and for the thread taking place here, which I am reading with fascination!
My knowledge of QM is poor so I am afraid of embarrassing myself, but I can’t resist posting because something bothers me in this thread: why are discussions on Q1 so focused on cellular automata and their limits (anisotropy, problems with their implementations, how to get randomness, etc.)?
Maybe I misunderstood something, but if one wants to make a universe that is classical but resembles ours, isn’t the most “obvious” try, much more obvious than using CAs, to simply replace complex amplitudes by nonnegative probability densities, unitary transforms by stochastic transforms, and see what can be done with this? In such a universe we would still have:
-Elementary particles described by wave functions
-Wave function collapse due to measurements
-Tensor products
-etc.
but we wouldn’t have any of the “weird” quantum effects stemming from interferences of amplitudes. Am I right? Would such a universe still be called classical? Then it seems to me that a very promising route to answer Q1 would be to figure out what would go wrong with THAT universe.
I guess this has already been studied before and would love to get pointers from knowledgeable people!
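That proposed universe (probabilities in place of amplitudes, stochastic matrices in place of unitaries) is easy to poke at numerically, and one thing that goes wrong immediately is interference. A minimal sketch (the Hadamard/"fair coin" pairing is my choice of example for the comparison):

```python
import numpy as np

# Quantum world: Hadamard acting on a qubit's amplitude vector.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
amp = np.array([1.0, 0.0])     # start in |0>
amp = H @ (H @ amp)            # two "beam splitters" in a row
print(np.abs(amp) ** 2)        # [1, 0]: interference restores |0> with certainty

# Stochastic analogue: the closest doubly stochastic "coin-flip" matrix.
S = np.array([[0.5, 0.5], [0.5, 0.5]])
prob = np.array([1.0, 0.0])
prob = S @ (S @ prob)
print(prob)                    # [0.5, 0.5]: randomness only compounds, never cancels
```

In the amplitude picture the two paths to state 1 carry opposite signs and cancel; in the probability picture all entries are nonnegative, so nothing can ever cancel. Whether a universe without that cancellation still supports chemistry is exactly the question the comment raises.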
Comment #325 January 28th, 2022 at 8:36 am
Scott replied at 223, “I could be 100% satisfied with … there’s a Tegmarkian multiverse, with some classical universes, some quantum universes, and some universes with other rules entirely, and a-priori we could’ve been in any of them, but the deck was stacked in favor of our finding ourselves in a quantum universe for the following reasons”
The odds are stacked because, like a water puddle fits the shape of the hole it is in, we evolved to take advantage of the way our universe works. The universe wasn’t made for us, we were made for this universe.
Comment #326 January 28th, 2022 at 8:38 am
Andrei #312, so your main point is :
« Local classical models could in principle violate Bell’s inequality if they are models with long-range interactions, like classical electromagnetism. The presence of such interactions makes the model contextual, since the state at A and the state at B, and the state of the particle source (S) are not independent (the whole system A+B+S has to satisfy Maxwell’s equations). In other words, the hidden variable (which is determined by the state of the source at the time of emission) is not independent of the detectors’ settings. This is the so-called superdeterministic loophole.(…) I think you dislike superdeterminism because you have a wrong understanding of it. But, like it or not, it’s the only local option on the table. »
In my understanding of superdeterminism, the non-independence between A, B and S must come from their overlapping past light cones, in order to avoid a clash with special relativity. This possibility was already considered by Bell, but it implies that there are no longer any independent events, no randomness, no freedom of choice, since everything has been ‘written in the past’. So yes, I dislike this option, and no, I don’t think it is the only local one on the table. It is true that predictive incompleteness is not easy to grasp because of its fundamentally non-classical and contextual features, but it does the job. So, coming back to your preferred (Bohm-like) option of giving up locality, I think that you have a wrong understanding of contextual randomness.
Comment #327 January 28th, 2022 at 8:48 am
Follow-up: also mildly stacked because some randomness and discreteness are necessary for a feasible universe, and QM is a way of providing that. Maybe the only or best way, but we have no way of testing that, and whatever proof we think we could make might have a counterexample in the hypothetical metaverse which we will never see.
Comment #328 January 28th, 2022 at 8:50 am
Scott #261: I’ve hesitated to post this, because I think it adds nothing—even if it’s true, I don’t think it’s predictive or useful, and almost certainly non-falsifiable. But this comment suggests it, so I will!
In one of your prior posts from a while back, I think you mentioned something roughly like “what kind of universe could exist where basic math differed greatly or was meaningless” (not a quote, and I’d have to go back), but I had two thoughts. One was: what if you had a universe of (relatively) ‘godlike’ people to whom operations on infinite quantities were trivial, like 2+2 is to us? Hypercomputing was just “computing”. Math on finite quantities might be anywhere between meaningless and problematic, or perhaps a specialized area of study.
What if such a species were interested in whether some form of finite life were possible… how to actually do this might still require some cleverness. They would have infinite computing resources at no cost, but limiting the simulation might be one of the main problems … some finite speed limit, some way to prevent hypercomputing or unlimited-range information transfer, etc. But at the same time, they have infinite resources, and they (unlike quantum computing 😉) can check all possible outcomes to see if finite life happened.
But, like I said… this is more the realm of speculative fiction than satisfactory answer; it’s still “just because,” with no way to tell in any case.
Comment #329 January 28th, 2022 at 8:51 am
CR Drost #298
Why do you say that MWI implies a time-crystal? Is that what you are implying?
Or are you suggesting that a given consciousness exists in a specific “world” both “before” and “after” a “split” (I hate these metaphors), and the idea that there is any “movement” during a measurement event is false?
Because the former seems confused, and the latter seems founded on a misunderstanding of MWI. There aren’t literally distinct universes/worlds; rather, there are complex interactions of waves in a single universe, in which some sets of waves interact constructively and others destructively, in a way that cancels large sets of wave interactions. Certain events can cause waves to split in ways that create new logical sets of waves, and this just happens to resemble multiple universes/worlds.
(Of course, I may be misinterpreting you entirely.)
Now, on a more crackpot note, personally I think it’s false that the different “worlds” don’t interact, and I think world interactions figure importantly into what entropy “really is” (which is to say, I think entropy is a measurement of how many histories a given physical system is compatible with).
And on another crackpot note, note that what we call “time” is actually three distinct things. The first is the “tick rate” of the universe, or, alternatively, the conversion factor between space and time. The second is the “history” of the universe. And the third is the “dimension of change” of the universe. These may all be the same physical phenomenon, but they don’t actually need to be; you can have a closed (loop) dimension for time which functions as the “dimension of change” of the universe, for example, with the “history” of universe written on “distance” (in every direction, a la special relativity), and the “tick rate” embedded in the observed relative angle between the local orientation of the loop dimension, and the relative orientation of the loop dimension of another object. In this formulation, going backward and forward in the “dimension of change” is identical – traveling backwards in time is basically the same as going forwards in time, because time and history are distinct.
Comment #330 January 28th, 2022 at 8:57 am
Q: Why should the universe have been quantum-mechanical?
A: The answer is 0.1134.
You cannot ask a calculator why it calculates the way it does. If we assume that reality is no more than these complex probabilities, then simply a bunch of calculations is taking place. So all we can do is calculate. Anthropic-like considerations lead nowhere. And if we ask whether QM is the way to avoid nothingness, again, a calculator doesn’t ask or answer questions like that.
All the calculator can do is provide us with the details of the calculations, and how likely is it that they contain some clue meaningful to a human, like the zeroes in the expansion of π arranged into a circle in “Contact”? It seems the universe really is just “shutting up and calculating”.
When you type 0.1134 in a calculator and turn it upside down it reads “hello”. That’s all we are – a pattern on a calculator that says “hello”. Or asks “Why should the universe have been quantum-mechanical?”
Comment #331 January 28th, 2022 at 9:04 am
Maybe the only question is what does it take to create consciousness.
It would be paradoxical if consciousness were a purely classical process, something emerging from straightforward data processing, yet could only be realized on digital computers that first have to emulate quantum mechanics or GR.
I think it’s therefore more reasonable to expect that every observed fundamental mechanism of our reality (QM and GR) is a necessary ingredient for consciousness.
By the Church-Turing thesis, such fundamental mechanisms can always be emulated on a digital computer, even if very inefficiently (there’s no requirement that consciousness be updated at every fundamental step of the simulation), so we’re always back to square one.
This is why Penrose is trying to escape this by claiming that there must be some fundamental mechanism besides QM and GR that we have not captured yet, and that ingredient isn’t computable (saying that a causal element in our reality can’t be computed is the same thing as saying that it’s God magically changing the state of our universe from the outside… quantum randomness is an example of this).
Considering the universe as a giant computation could be misleading and confusing, maybe it’s better to consider the universe as a giant static mathematical construct, i.e. all there is at the very bottom is a giant set of “nodes” with infinite ways to connect them, and some subsets of connections have certain properties (like self-similarity, which Douglas Hofstadter claims is a special ingredient of consciousness) that simply auto select them to give rise to consciousness.
In the end, there’s no escaping the fact that, no matter how much we model the external reality we perceive, consciousness is the only fundamental element of reality we have access to. Trying to explain consciousness from abstract elements we model in our minds (like quantum fields and their vibrations) is never going to work; it just can’t.
Comment #332 January 28th, 2022 at 9:07 am
Mateus Araujo#316 : « Call it “outcome independence”, call it “predictive completeness”, call it contextuality. The fact of the matter remains that the probability of Bob’s outcome ‘b’ depends on Alice’s outcome ‘a’, which was produced in a space-like separated event. It’s nonlocal. »
Dear Mateus, you are going too fast; the main points are:
– first, split Bell’s hypotheses into two parts (1) and (2), as said before; whatever names you give to these parts is just a matter of taste. Either (1) or (2) or both must be violated by QM.
– second, consider that part (1), elementary locality or parameter independence, is really associated with the intuitive idea of an action at a distance, so its violation deserves to be called nonlocality. It is NOT violated by standard QM.
– third, consider that part (2), predictive completeness or outcome independence, is NOT associated with the intuitive idea of an action at a distance, but rather with an inference at a distance, which can only be verified in a common future. Therefore, its violation does not deserve to be called nonlocality, and has no problem whatsoever with special relativity.
Now you may claim that (2) is also a form of nonlocality, in this case it is only a question of how you define nonlocality. But my claim is that distinguishing clearly between (1) and (2) is quite useful, and also leads to the interesting conclusion that the usual psi is (predictively) incomplete, as long as the measurement context has not been specified (see article quoted before).
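The (1)/(2) distinction debated here can be illustrated with the singlet state's standard joint probabilities, P(a,b) = (1 − ab·cos(α−β))/4: Bob's unconditioned marginal never depends on Alice's setting (no signaling, part (1) holds), while his distribution conditioned on Alice's outcome does move (part (2) is violated). A small sketch using the textbook formula:

```python
import numpy as np

def joint(alpha, beta):
    """Singlet-state joint outcome probabilities P(a, b) for spin
    measurements along angles alpha (Alice) and beta (Bob)."""
    return {(a, b): (1 - a * b * np.cos(alpha - beta)) / 4
            for a in (+1, -1) for b in (+1, -1)}

beta = 0.0
for alpha in (0.0, np.pi / 3, np.pi / 2):   # Alice varies her setting
    P = joint(alpha, beta)
    marg_b = P[(+1, +1)] + P[(-1, +1)]                   # P(b = +1), unconditioned
    cond_b = P[(+1, +1)] / (P[(+1, +1)] + P[(+1, -1)])   # P(b = +1 | a = +1)
    print(f"{marg_b:.3f}  {cond_b:.3f}")
# The marginal stays 0.500 for every alpha (parameter independence holds);
# the conditional moves with alpha and with a (outcome dependence).
```

Whether that conditional dependence deserves the word "nonlocality" is exactly the terminological question being argued above; the arithmetic itself is not in dispute.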
Comment #333 January 28th, 2022 at 9:08 am
Hello! My speculative answer to Q1:
In order for us to spout metaphysics about a Universe, it has to be possible in the first place. The rules of QM are just the rules of what phenomena are self-sustainingly possible at-all. Now wave your hands and go possible => probable, self-sustainingly => something-equals-one and continue to justify via your favorite axiomatic QM framework.
On that subject, I haven’t seen any axiomatizations of QM that directly generalize Polya/Jaynes/Cox – they all seem subtly different, and substantially more dry/boring.
Good luck in your quest!
Comment #334 January 28th, 2022 at 9:37 am
Scott #286: Thanks so much, that clears everything up. I agree with you that the principle of local tomography is a much more satisfying desideratum than the principle that “subsystems’ numbers of real degrees of freedom combine multiplicatively” – especially since the latter principle is in fact false in QM, and counts of real degrees of freedom actually combine supermultiplicatively! The (much more awkward) correct statement that “((real degrees of freedom) + 1)s combine multiplicatively” is not directly interesting in itself, but is only indirectly interesting in that it gives us the principle of local tomography.
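The counting in this comment is quick to verify: a d-dimensional quantum system has d² − 1 real degrees of freedom in its density matrix, these combine supermultiplicatively, and it is (dof + 1), i.e. d², that combines multiplicatively. A sketch:

```python
def dof(d):
    # Real parameters of a d-dimensional density matrix (Hermitian, trace 1).
    return d * d - 1

dA, dB = 2, 3                 # say, a qubit and a qutrit
naive = dof(dA) * dof(dB)     # 3 * 8 = 24: the naive multiplicative count
true_joint = dof(dA * dB)     # 36 - 1 = 35: the actual joint count, strictly larger
print(naive, true_joint)

# (dof + 1) is what combines multiplicatively:
assert (dof(dA) + 1) * (dof(dB) + 1) == dof(dA * dB) + 1
```

The gap 35 − 24 = 11 is carried by correlations that no pair of local states accounts for, which is why local tomography, not the raw parameter count, is the natural axiom.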
Comment #335 January 28th, 2022 at 9:40 am
FeepingCreature #283
You could certainly convince some observers that they are living in a classical world, simulated or not. The problem is that this impression is illusory; they’ll simply be mistaken. They are, in fact, living in a quantum world, even if at a step removed, not unlike how, for most of human history, we lacked the tools to reliably study atomic physics, and therefore had no knowledge of this domain beyond wild speculation. Having said that, your thought experiment lays a decent claim against Penrose-type arguments, which rely specifically on the biological substrate of brains and neurons and their supposed quantum dependence (assuming your experiment goes as planned with no conspiracy), but that angle never seemed especially compelling to me in the first place. At least as far as the mathematical universe hypothesis is concerned, consciousness motivates the phenomenon of superposition because you’d expect consciousness to supervene over a bulk of possible worlds. You don’t need to assume much about how consciousness works to get there; only that consciousness supervenes over physical phenomena. I imagine among most who are scientifically minded, this is the default assumption about how consciousness relates to matter. At the very least, it seems fairly unproblematic. The other required assumptions are packed into the MUH: that physical phenomena just are mathematical structures, and that isomorphic mathematical structures are numerically identical (i.e., there are no haecceities in the mathematical ensemble; there is only one dihedral group of order 8).
>Anyway, my idea is that qm is just the simplest setup where a small seed state can give a sufficient variety of outcomes for life to exist. I think life will turn out to be so unlikely that the universe has to allow an unbounded continuum of branches to find some with life in it at all.
A spatially infinite classical universe can provide the same diversity of outcomes. I’m not sure what is gained combinatorially by representing the universe as a vector in an infinite-dimensional Hilbert space.
Comment #336 January 28th, 2022 at 10:18 am
I keep thinking about the Quanta article about the maximum precision of a clock given entropy, and wondering if there’s a way to turn that on its head and make it an axiom (except instead of being about physical clocks, it would be a definition or condition of time itself), and out pops QM and maybe GR.
Comment #337 January 28th, 2022 at 10:28 am
@Crackpot #329
If it helps to reify what I’m saying into a concrete example, we can do that.
So suppose you have a qubit evolving under some Hamiltonian. The usual approach is to pick out a point on the Bloch sphere and track it over time. Your Hamiltonian is then a sort of flow field \(\vec v(\theta,\phi)\) on the Bloch sphere. Each point on the Bloch sphere is a “world” as far as MWI is concerned, and the insistence of MWI is that we should not be persnickety about which state the qubit is *actually* in, i.e. which point danced around the sphere following \(\vec v(\theta,\phi)\). MWI says this is a mistake because it is not the qubit, but us, doing the dancing. We are following this point around on the sphere. The points are just there.
The central conceit of MWI is that we can boil down our nonunitary dynamics to unitary dynamics if we just insist that, on the “hyper-Bloch-sphere” describing all of the atoms in our bodies and the experimental apparatus and the quantum thing under study, each point on that state space feels a certain way. So then we ask which way we feel, and we discover that we just have a superposition of ways that we feel, and they are isolated from each other. Just the standard unitary flow, the standard flow field induced by the Hamiltonian…
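The flow-field picture above can be sketched numerically. This is my own toy example, not something from the comment: I assume the simple Hamiltonian \(H = \tfrac{1}{2}\sigma_z\) and the initial state \(|+\rangle\), whose Bloch vector then precesses around the equator under the induced unitary flow.

```python
import numpy as np

# Pauli matrices
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_vector(psi):
    """Map a normalized qubit state to its point on the Bloch sphere."""
    rho = np.outer(psi, psi.conj())
    return np.array([np.trace(rho @ S).real for S in (SX, SY, SZ)])

def evolve(psi, H, t):
    """Unitary evolution |psi> -> exp(-i H t)|psi> (hbar = 1), via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ (np.exp(-1j * w * t) * (V.conj().T @ psi))

H = 0.5 * SZ                                         # toy Hamiltonian: precession about z
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+>, Bloch vector (1, 0, 0)

# The "flow field" carries the point around the equator: r(t) = (cos t, sin t, 0)
for t in np.linspace(0.0, np.pi, 5):
    print(t, bloch_vector(evolve(plus, H, t)))
```

The dynamics never leave the sphere: the flow just moves the point along the equator, which is the sense in which each point is “just there” while the trajectory passes through it.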
Comment #338 January 28th, 2022 at 10:36 am
Philippe Grangier #332: I know, I read the previous comments. I think the unsplit, informal version of Bell’s 1976 definition is a good definition of locality: the probability of an event can only depend on events in its past light cone. In particular it cannot depend on events that are space-like separated from it. Both (1) and (2) violate this definition, and therefore both are some sort of nonlocality. I think it’s impolite to rename things around and insist that only (1) is a proper definition of locality. There are so many definitions of locality!
That said, I do agree that (2) is a milder sort of nonlocality. If (1) was violated experimentally I would see no hope of reconciliation with relativity. With (2) there is hope, but it is by no means easy. When you go for a realist model of single-world quantum mechanics (naïve textbook realism, collapse models, or Bohmian mechanics) you get a flagrant violation of relativity. The only way I know how to do the reconciliation is with Many-Worlds.
Comment #339 January 28th, 2022 at 10:38 am
When we say that classical physics is incompatible with complex chemistry, are we referring to a specific set of laws and equations that turned out to be insufficient? Or is any possible classical theory known to be insufficient?
Classical physics can describe billiard balls, which is sufficient for universal computation. Then either a billiard-ball computer can efficiently compute states produced by complex chemistry, or it can’t.
If it provably can’t, then we can settle BQP vs P right now.
I’m guessing that, despite the claims, we can’t actually prove any such statement.
An eventual compatibility between classical physics and complex chemistry is merely unlikely?
Another thing worth mentioning is that classical machines can efficiently simulate any quantum system describable by certain non-universal quantum gate sets.
That means that, in practice, classical machines can simulate even large-scale quantum interference; they will just struggle to do so for every possible quantum circuit.
E.g. we may have to throw a large number of random quantum experiments/circuits at a classical machine before it produces a wrong answer.
Basically, proving that classical physics can’t describe a quantum system is not that easy?
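A concrete instance of the non-universal-gate-set point, in a toy example of my own choosing: the n-qubit GHZ circuit (a Hadamard followed by a chain of CNOTs) consists entirely of Clifford gates, so by the Gottesman-Knill theorem its measurement statistics can be sampled classically in time linear in n, even though the naive statevector has 2^n amplitudes.

```python
import numpy as np

def ghz_statevector(n):
    """Brute-force 2^n-amplitude simulation of the GHZ circuit:
    H on qubit 0 (the least significant bit), then CNOT(0 -> k) for k = 1..n-1."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    # Hadamard on qubit 0: |b> -> (|0> + (-1)^b |1>) / sqrt(2)
    out = np.zeros_like(psi)
    for i, amp in enumerate(psi):
        out[i & ~1] += amp / np.sqrt(2)
        out[i | 1] += (-1) ** (i & 1) * amp / np.sqrt(2)
    psi = out
    # CNOT chain: if qubit 0 is 1, flip qubit k
    for k in range(1, n):
        out = np.zeros_like(psi)
        for i, amp in enumerate(psi):
            out[i ^ (1 << k) if (i & 1) else i] += amp
        psi = out
    return psi

def ghz_sample_classical(n, shots, rng):
    """O(shots) sampler for the same circuit's measurement outcomes:
    each shot is all-zeros or all-ones with probability 1/2."""
    coins = rng.integers(0, 2, size=shots)
    return [int(c) * ((1 << n) - 1) for c in coins]

psi = ghz_statevector(3)                          # only |000> and |111> survive
samples = ghz_sample_classical(3, 100, np.random.default_rng(0))
```

The two routes agree on the measurement statistics; what the classical shortcut cannot do is scale to circuits outside such restricted gate sets.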
Comment #340 January 28th, 2022 at 10:56 am
Rather than answering directly the question “why QM?”, we could try to inspect why the question even arises in the first place.
Meaning that a deterministic universe built on QM is such that the effects of QM aren’t just direct and bottom-up (building from vibrations in quantum fields up to macro structures); alternative and subtle causal paths also spontaneously appear, through which the microscopic laws get amplified and expressed at the macroscopic level (as with the question “why QM?”).
This is not just true about QM in isolation, the entirety of physical mechanisms have to be considered together.
E.g. the universe is such that the patterns in the brain of a cosmologist (such patterns are macroscopic structures), here on Earth, are directly mapped onto the shapes of distant galaxies billions of light-years away. This doesn’t happen by some direct effect of gravity or electromagnetism, but through what look like very unlikely “conspiracies” (the evolution of life and then the evolution of intelligence). The isomorphism between the cosmologist’s brain patterns and distant galaxies is very strong, yet very hard to explain without walking back through the entire evolution of life on Earth.
This is often referred to as “the universe is such that it’s able to eventually look back at itself”.
Comment #341 January 28th, 2022 at 10:59 am
These questions venture into the realm of metaphysics. I have a lot to say on Q, but I will just leave a few nuggets here.
– Consider that complex numbers, being algebraically closed, are more “real” than real numbers. The normal viewpoint is an accident of how human senses and classical measurement work. I would call complex numbers “full numbers” and real numbers “partial numbers”. So my answer here is “because the universe naturally uses full numbers”.
– Fundamentally, I think we need to take Tegmark more seriously and consider “what is the ontology of mathematics” using physics and empiricism as pointers, rather than asking “what is the ontology of the observed world” and relegating mathematics as a mere tool. It is difficult to answer your questions using science, which doesn’t really address “why questions” if you keep asking “why” more than a few times. Metaphysics on the other hand, should be founded on mathematics — not a dead, static, utilitarian mathematics, but a mathematics that is alive, that moves, and which breathes fire into itself. Mathematics as Mind, and Mind as the primary ontological substance.
– Briefly, the main problem of monism is to explain why things appear dualistic. I would suggest that this touches on a core aspect of QM — the Fourier transform. If everything is Mind (=Mathematics), there may still be two aspects of mind, which Kant called extended and thinking substances. Thinking (unextended) substance can be equated with the frequency domain (which consists of waves which extend infinitely), and the extended substances to the spacetime domain (which consists of infinity concentrated to finite time e.g. the Dirac delta function).
– All of mathematics is dynamic, alive. Importantly, this includes zero. Zero is not simply a mark used on a ledger to keep track of apples. Zero is a process, and that process is akin to moving around a circle back to the starting point. There is continual non-0 displacement, but a net displacement of 0. The fundamental group of the circle is Z, the integers. The circle is what binds together the frequency and spacetime domains of the Fourier transform. Zero contains infinity and zero serves as the “unit” of the frequency domain. Within the spacetime domain of finitude, we have a dimensional unit 1, but that too is dynamic and alive…
– These ideas are just suggestions, analogies. Simplified for brevity. If they make you uncomfortable, that’s good. Let them digest a bit.
Comment #342 January 28th, 2022 at 11:05 am
I’ve read that an ankle will get weaker and more injury-prone after each injury. You can strengthen the stabilizer muscles and rebuild your proprioception by balancing on one leg for a little while, a few times a day.
This comment, of course, contributes nothing to our shared understanding of the nature of the universe, so please feel free to delete it.
Comment #343 January 28th, 2022 at 11:08 am
Duh #219:
Level-design that uses RNG can open the design-space much more widely than classical level-design. Without RNG, either all the levels look the same, or you have to hand-craft them all to be different in interesting ways. QM is just a way to make sure the universe looks interestingly different across its total configuration. Without it, there would be no level in which we evolved, so we wouldn’t be asking the question.
<rant>
While I’ve learned many interesting things from this thread and appreciate the many thoughtful comments, I’m also noticing a large number of commenters who seem to have a totally misplaced confidence in their answers to the “why QM?” question. As one example, every single answer (like the above one) that talks about randomness crashes and burns against the objection,
“but then why not just classical randomness? if you didn’t already know that our universe involved complex amplitudes and unitary transformations and Hermitian operators, would you have invented all that stuff, just as a way to get randomness?”
Unless, that is, this objection is explicitly confronted and rebutted: something some commenters (to their credit) have attempted with varying degrees of success but that others don’t even seem worried about.
In general, when contemplating the “why QM?” question, I believe that a good rule of thumb is to remember the undergrads taking a midterm, who if you ask them to prove X, can produce paragraph after paragraph of verbiage whose tone is so self-confident that even a seasoned hand like me thinks, “well then they must really understand this, even if I can’t follow the train of reasoning” … until I turn the page, and I find paragraph after paragraph of equally self-confident verbiage proving a statement that’s false! 🙂
I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?”
</rant>
Comment #344 January 28th, 2022 at 11:09 am
You mention Einstein, and he was similarly burdened with the “does God play dice?” question; he assumed the answer was “no” and turned out to be incorrect. The question you are asking is “why does God play dice?” And perhaps also: why do those dice have a specific number of sides?
I respect that as a QM expert you have the benefit of context for framing this question. I am wondering if this question will be similar to “which interpretation of quantum mechanics is correct (Copenhagen or many-worlds)?” Do you feel this is the former or latter type of question? May it forever sit outside of experimental verification? Why?
We expect Occam’s razor to hold. Perhaps more specifically we feel that excess information is wasteful. There are statistical arguments for that. But there are also plenty of examples where a simple pattern or system or order arises within an unnecessarily complex one. Could this be like asking, why does our intelligence run on wetware instead of hardware? It would be more “efficient” and simpler for intelligence to run on silicon or countless other media. (Hopefully this claim isn’t too contentious or distracting). The point being, systems evolve in a less direct path than design (the word you kept using). QM existed and then the universe evolved on top of it. I would assume QM is an evolutionary precursor to present day reality.
Comment #345 January 28th, 2022 at 11:47 am
Quantum mechanics lets you discretize the state space without discretizing space. In particular, it lets you simultaneously preserve continuous spatial symmetries and the third law of thermodynamics (entropy at zero temperature is a finite constant) in a system with particles.
So, for instance, assume you want to have something like particles, and you also want rotational invariance (you’ve said you are satisfied with Einstein’s justification of Lorentz invariance, so I assume you’re happy taking continuous rotations as a given). Then if the ground state of hydrogen (or whatever your basic atomic building blocks are in your fancy new universe) is rotationally invariant, but you also have a definite position for the electron (or whatever), then you can generate an infinite degeneracy of states by rotating this state. So entropy is infinite. On the other hand, if you want your low-energy states to have finite entropy, you need states on which continuous rotations generate only finitely many distinct states; in other words, they have to live in finite-dimensional representations of SO(3). So they have to be spherical harmonics, i.e. the stable bound states basically have to be waves. But when you isolate and manipulate (i.e. measure) their constituents, they look like localizable particles?
Comment #346 January 28th, 2022 at 11:52 am
Scott #337 : « well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable? »
This question brings me back to my previous restaurant metaphor, #174 : given the dish cooked up for us by the Grand Chef, should we try to swallow what we got, or to get something else ? To take a non-quantum example, what physics would we have if Michelson and Morley had measured a nonzero velocity of the Earth through the ether ? One can abstractly speculate on that, but physicists will certainly prefer to adopt special relativity, and to use it…
To be clear, I don’t advocate « shut up and calculate », and I do think that « why QM ? » is an interesting question. But my effort will be towards first getting physical principles based on empirical evidence, and then finding appropriate mathematics to manage them : like calculus for Newton, linear algebra for QM, tensors for GR. So which mathematics are we missing now ? From #301, my best guess would be something like transfinite calculus, i.e. the proper simultaneous management of incommensurable scales.
Comment #347 January 28th, 2022 at 12:37 pm
Scott #102:
Two reasons for causation might be: it makes statistical inference tractable (i.e. check N^2 correlations instead of 2^N), as well as the whole SR thing where causation + other axioms leads to a finite speed of light and hence a local universe.
As to your whole research agenda, would it be fair to phrase it as: “give an argument convincing a smart person, in a world which feels to them intuitively like we feel ours to be, that they are living in a quantum world. Further, this argument should be as natural as Einstein’s argument for SR.” In which case, isn’t that exactly what the GPT subset of quantum foundations was made to do?
Comment #348 January 28th, 2022 at 12:49 pm
Scott #7³: While you say you “won’t understand” my work (#45), your rant is, I think, very, very close to what my work tries to unravel in detail. If we introduce classical randomness, how exactly does that differ from quantum randomness? If quantum randomness leads to measurement incompatibility, why do we think that classical randomness does not? What does classical randomness look like if we can find a natural way to include measurement incompatibility? If we introduce classical randomness and measurement incompatibility, what differences remain? (1) Quantum noise/randomness has a different spectrum from thermal, Gibbs noise/randomness; (2) classical dynamics, generated by the Liouvillian operator, is significantly different from quantum dynamics, generated by the Hamiltonian operator.
To try to be clear, I never claim this is the right way to think about the relationship between CM and QM (at least, I try not to; I’m championing this because AFAICT there is no serious person championing it.) I think CM and QM, in their different formalisms and interpretations, are almost equally viable “pictures”, but with different advantages and disadvantages. I think Andrei (who I think must be Andrei K.) and Philippe Grangier have viable stochastic and contextual approaches that are problematic only because their formalisms are just that bit too distant from the standard Hilbert space formalisms of QM/QFT; I think ‘t Hooft’s, Elze’s, Wolfram’s, Hossenfelder & Palmer’s, Wetterich’s and other approaches to CAs and stochastic methods struggle to include measurement incompatibility as naturally as Hilbert space methods do (although ‘t Hooft and Wetterich discuss noncommutativity and incompatibility at length, it seems that their accounts have not in practice been clean enough to bridge the gap. On my tombstone, “His wasn’t either”.)
If we think these struggles are rhetorical as much as they are substantive, then Koopman’s Hilbert space formalism for CM seems one natural way to try to approach the gap between them. I have found a Koopman approach to be more successful than the Wigner function approach to that gap, even though I think that is also perfectly viable as a way to understand the relationship between CM and QM. When one considers the quantum measurement theory and quantum probability theory literature as well as the Koopman-von Neumann literature, I have found it mathematically preferable to adopt an algebraic approach to Koopman classical mechanics, so that we can compare the algebraic, symmetry, and analytic structures of CM and QM with, I suppose, even fewer distractions.
Comment #349 January 28th, 2022 at 12:59 pm
Hi Scott,
Here’s my take on it. I am a strong believer in the principle of plenitude. This idea has been around for millennia, but Max Tegmark gave it a more modern spin recently, as follows (I’m also adding a bit of my own interpretation to it):
The universe is a mathematical object. It’s just a bunch of interconnected equations that together form a unified mathematical object. And mathematical objects are abstract objects that, like all abstract objects, exist on their own in this abstract realm (think Platonism). Therefore, all possible universes exist (by possible I mean that they can be reduced to an abstract object).
The question now becomes “Why are we living in this particular, quantum mechanical universe, and not in a classical one?”. Here it comes down to the number of intelligent beings living in each one. Due to the infinite superpositions of a QM universe (think the infinite branching in the Everett interpretation), a QM universe is infinitely bigger than a classical one. Therefore, it is inhabited by infinitely many more consciousnesses. Therefore, you are infinitely more likely to find yourself in a QM universe than in a classical one.
There could be a classical universe out there hosting intelligent life, we just don’t happen to live in it. The same idea can also explain why the universe is (probably) infinite in space.
Comment #350 January 28th, 2022 at 1:03 pm
CR Drost #337:
Apologies for the rudeness of my original reply, incidentally, it wasn’t intentional, and I realized how it came across almost immediately after I hit submit.
As for the points “just being there” – I mean, kind of? The Bloch sphere is a representation, not reality (well, maybe it is reality, who knows, but that’s not what MWI itself is saying); the distance between the points isn’t geometric. It’s correct, insofar as the metaphor goes, to say that it is us (the observer) who are moving, but again insofar as MWI itself is concerned, we’re not moving to a geometric destination which existed before we moved there, and what measurements we perform limit what points even exist.
Suppose a measurement is made inside an experimental apparatus in which the result is then irrevocably destroyed, such that the original measurement cannot be recovered and the superposition of the entire apparatus never collapses. We’re not confined to any position on the Bloch sphere corresponding to that destroyed measurement; those positions only come into being when a wave-function collapse occurs. It’s not a physical space which we move around on; it’s a conceptual space representing possibilities, which can change depending on the evolution of the system.
MWI doesn’t demand unitary dynamics, because fundamentally it treats the probability wave as a wave, rather than being “about” probability at all. In the MWI framework, thinking in terms of probability (which, when you get down to it, is what unitary quantum mechanics is really all about: making sure the probability of finding a particle in a given location adds up to 1) is in a significant sense missing the point, because what probabilities are you even talking about? The particle isn’t “probably here”; the particle doesn’t exist. That’s the divergence point of the interpretation from pilot waves.
I don’t know what you mean by “feels a certain way”.
Comment #351 January 28th, 2022 at 1:16 pm
Scott #343:
“I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?””
– Personally, I think yes, because a lot of my crackpot nonsense started because I found QM entirely unsatisfying, and set out a couple of decades ago to “prove” that it was fundamentally wrong.
I cannot overstate my disappointment when I realized my own nonsense said something like QM had to be there. At this point, I’m pretty sure that an important lesson to take from both pilot waves and MWI is that some form of QM is a fundamental feature of any physical system whose behavior can be described in terms of waves.
Comment #352 January 28th, 2022 at 1:42 pm
What if you assume that BQP is simply in P? Once we understand the algorithm that makes this possible, the “mysteries” asked about here will make sense to us. The Born rule, complex amplitudes, etc., will just fall out naturally from how this algorithm is structured.
For example, at first we did not know efficient algorithms for calculating a Fourier transform on discrete data, but once the trick of the FFT was found, not only did it unlock technological progress, but it also spurred on more research and understanding of the related transforms and variants of that basic idea.
If satisfactory answers to Q1/Q2 have eluded us for so long, perhaps it is slight evidence that there IS in fact an efficient algorithm for simulating a quantum circuit after all?
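The FFT analogy can be made concrete in a few lines (my sketch, illustrating only the analogy): the naive quadratic-time DFT and NumPy’s FFT compute exactly the same transform; the discovery of the fast algorithm changed only the cost, not the answer.

```python
import numpy as np

def naive_dft(x):
    """Direct O(n^2) evaluation of the DFT: X[k] = sum_j x[j] exp(-2*pi*i*j*k/n)."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    j = np.arange(n)
    W = np.exp(-2j * np.pi * np.outer(j, j) / n)   # n x n matrix of twiddle factors
    return W @ x

x = np.random.default_rng(1).standard_normal(256)
# Identical output, very different cost: O(n^2) above vs O(n log n) for np.fft.fft
assert np.allclose(naive_dft(x), np.fft.fft(x))
```

The commenter’s hope is that a hypothetical fast classical simulation of quantum circuits would stand to the Born rule and complex amplitudes roughly as the FFT stands to this direct sum.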
Comment #353 January 28th, 2022 at 2:06 pm
Crackpot #351
“QM is a fundamental feature of any physical system whose behavior can be described in terms of waves.”
It’s true that QM is an extension of the idea that everything is described by wave mechanics.
Water, gas, of course, but even solids, which aren’t solid at all at the fundamental level, but better modeled by a grid of point particles connected by springs (even diamond isn’t solid at a short enough time scale).
So to answer “why QM?” we probably first should ask “why waves?”, and the answer is that as soon as we have point-like objects within continuous space, and short range forces (springs, or local causal connections), we have waves.
But I think that people who deal with QM in a computational context will say that QM is fundamentally about manipulating complex “probabilities” (amplitudes) rather than wave mechanics (the Schrödinger equation). But it’s just a matter of perspective. It’s no coincidence that complex numbers are also what’s used to describe waves in classical/plain electrical engineering (without any reference to QM).
Comment #354 January 28th, 2022 at 2:24 pm
I’m not competent to say much about QM, but my intuition aligns closely with that of Age bronze #188: if I had to speculate on God’s desiderata when designing the universe I would start with
1. Discrete state space and discrete time,
2. Some form of extremal action principle obeying some form of Noether’s theorem.
If you try to do classical mechanics in completely discrete space and time you immediately run into issues where your variational principle no longer has a unique solution, motivating a probabilistic universe. (There’s also the matter of the destruction of Noether’s theorem). I might hope to show that complex probabilities, wavefunctions, etc. naturally arise as the only (or simplest) way of salvaging an action principle in a discrete universe—I think it is very telling that in QM the symmetries do not necessarily need to arise from continuous group actions—though I have no idea how to begin.
(One can also ask why the universe’s Hamiltonian is just-so, though I feel like that’s a less fundamental question and may not have any particularly satisfying answer, beyond appeal to anthropic principles.)
Comment #355 January 28th, 2022 at 2:38 pm
I don’t think there is a really satisfying answer, especially for Q1, mainly because both classical and quantum physics are frameworks, not specific theories.
Maybe one can imagine some classical (or even some non-classical, non-quantum) alternative for a hypothetical world that has some kind of stable atoms, stars, etc., but this is very speculative, and if “anything goes”, then go figure…
If we assume that classical GR holds, then we have some good restrictions: A manifold, some energy conditions for the stress energy tensor, etc.
Such a world is probably dominated by black holes, almost exclusively. Everything, sooner or later, will be inside its Schwarzschild radius, so…
On the other hand, if some assumptions do not hold (for example, if certain energy conditions are violated), then this hypothetical universe will have CTCs or timelike singularities, so maybe it will not make sense as a plausible alternative world.
Comment #356 January 28th, 2022 at 2:50 pm
The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds.
Maybe someone can answer this, but how does MWI deal with actual probabilities? Let’s say after a measurement there is a 1/3 chance that particle A is spin-up and 2/3 spin-down. As I understand it, MWI means that the universe/wavefunction branches at the point of measurement — in one branch A is spin-up, and in the other it is spin-down. What then becomes of the 1/3? There are two branches, after all. What does it mean to have one branch be more probable than another?
Comment #357 January 28th, 2022 at 3:15 pm
Scott P. #356
That’s one of the main difficulties of MWI, there’s no clear agreement among its proponents on how to deal with it (e.g. Sean Carroll has a lot to say on this).
The way I think about it is that everyone agrees on what’s a binary split, a 50/50 branching.
And then any other split can be decomposed into a series of 50/50 splits, with special hidden labels. So, to create a 25/75 split, you do two consecutive 50/50 binary splits, giving 4 possible hidden labels (TT, FF, TF, FT); assign TT to one branch and the other three to the second branch (FF, TF, and FT all being indistinguishable except for those hidden variables).
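The 25/75 construction above can be checked mechanically (a toy sketch; the branch names “A”/“B” and the label encoding are my own):

```python
import itertools
from fractions import Fraction

def branch_weights(assignment):
    """Given a map from equally likely hidden labels to branch names,
    return the total weight landing on each branch."""
    total = len(assignment)
    weights = {}
    for label, branch in assignment.items():
        weights[branch] = weights.get(branch, Fraction(0)) + Fraction(1, total)
    return weights

# Two consecutive 50/50 splits yield four equally likely hidden labels.
labels = list(itertools.product("TF", repeat=2))  # TT, TF, FT, FF
# Assign TT to one branch and the other three labels to the other.
assignment = {lab: ("A" if lab == ("T", "T") else "B") for lab in labels}
print(branch_weights(assignment))  # branch A gets 1/4, branch B gets 3/4
```

Iterating the same trick with more coin flips approximates any dyadic split, which is the sense in which the 50/50 branching is taken as the primitive.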
Comment #358 January 28th, 2022 at 3:20 pm
I have avoided reading the over 300 comments above and will just give my own speculation here. Apologies if any of this has been touched on above.
For me, the most interesting thing about quantum mechanics, from an abstract point of view, is that both the Church of the Larger Hilbert Space (CLHS) and the Church of the Smaller Hilbert Space (CSHS) exist – the theory can be developed from either point of view self-consistently – but they each give entirely different intuitions on what the theory is “about”. I would go so far as to say that an unconscious attachment to either one point of view or the other explains a lot of people’s intuitions about which interpretations of quantum theory are plausible.
Now, what do I mean by these two churches? The central tenet of the CLHS is that quantum mechanics is a deterministic dynamical theory about a physical state (a pure quantum state) evolving in time under the Schroedinger equation. It is much like any field theory in classical physics, except that it has different rules for combining subsystems so the state does not live in “physical” space, and it has these weird rules for update upon measurement.
An everyday member of the CLHS may accept the projection postulate as part of the theory, and use the Church as a practical guide, i.e. the tendency to purify every mixed state, view every CPTP map as a unitary followed by tracing, and every POVM as a projective measurement on a larger space. However, true devotees would embrace a no-collapse interpretation, such as many-worlds or a hidden variable theory like de Broglie-Bohm, removing the fundamental status of the measurement postulates and explaining them in terms of something emergent (decoherence and branching) or something more fundamental (the hidden variables). This can be done to a large degree of success. There are of course still debates about the extent to which all problems are solved in many-worlds or Bohm, but for the most part people agree that these interpretations work, even if they do not think they are correct.
In contrast, the CSHS starts from the premise that quantum mechanics is a generalization of classical probability theory. Something weird happened to the physical quantities that the theory is about – they became noncommutative. Given this, you formulate the theory as the most natural generalization of probability theory that you can think of for such a set of variables. This basically gives you quantum mechanics, via Gleason, Wigner, etc.
CSHS practitioners typically prefer not to purify their mixed states. Since classical probability distributions do not have purifications, you will miss analogies to classical probability if you purify as your first step, so they prefer to stick with the original “small” Hilbert space. True believers note that the collapse of the state upon measurement resembles the change in a probability distribution when you apply Bayesian conditioning, so they want to view the quantum state as something “probability-like” (whatever you think classical probabilities are). This leads to psi-epistemicism and Copenhagenish ideas.
Note that I am not talking about the debate between psi-ontic and psi-epistemic here. The Churches are to do with the way you go about constructing quantum mechanics, so are prior to that debate. Of course, CLHS does tend to predispose you to the psi-ontic point of view and CSHS to the psi-epistemic point of view, but that is derived from the fundamental tenets of the churches, not presumed at the outset.
From the CLHS point of view, it is weird that the CSHS exists and vice versa. Most hypothetical physical theories do not allow both points of view to coexist. For example, classical probability has no CLHS because probability distributions cannot be purified.
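The purification move that separates the two churches can be made concrete (a minimal sketch; the example density matrix is my own choice): any d-dimensional density matrix is the reduced state of a pure state on a d×d system, built from its eigendecomposition — a construction that classical probability theory has no analogue of within itself.

```python
import numpy as np

def purify(rho):
    """Return a pure state |psi> on C^d (x) C^d whose partial trace is rho."""
    p, U = np.linalg.eigh(rho)           # rho = sum_i p_i |u_i><u_i|
    d = len(p)
    psi = np.zeros(d * d, dtype=complex)
    for i in range(d):
        psi += np.sqrt(max(p[i], 0.0)) * np.kron(U[:, i], U[:, i])
    return psi

def partial_trace_second(psi, d):
    """Trace out the second d-dimensional factor of a pure state on C^d (x) C^d."""
    M = psi.reshape(d, d)                # psi = sum_{jk} M[j,k] |j>|k>
    return M @ M.conj().T

rho = np.diag([0.25, 0.75]).astype(complex)   # a "classical-looking" mixed qubit
psi = purify(rho)                             # here (1/2)|00> + (sqrt(3)/2)|11>
assert np.allclose(partial_trace_second(psi, 2), rho)
```

The CLHS reflex is to run `purify` first and treat the mixedness as entanglement with an environment; the CSHS reflex is to leave `rho` alone, precisely because the diagonal distribution (0.25, 0.75), read as classical probabilities, admits no such purification within probability theory.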
If you start from the CSHS point of view then you will be led towards frameworks like generalized probability theories (GPTs). Most GPTs do not have a CLHS. In fact, the Chiribella et al. axiomatization can be understood as starting from the CSHS view and imposing that the CLHS must exist (plus a few other things). So this is evidence that theories admitting both points of view are rare. You certainly cannot have PR-boxes and much superquantum stuff in such a theory.
But this does not really resolve the question. From the CLHS point of view, you would not generalize the theory by moving to the GPT framework because you don’t believe quantum theory is a generalization of probability in the first place. Instead, you would be more likely to alter the Schroedinger equation or maybe the tensor product rule. Your generalized framework would be to start with an arbitrary differential equation and a composition rule, and then try to see what is needed for a branching structure to emerge or for a satisfactory measurement process to be explained by hidden variables.
Now, far less work has been done on this sort of framework, partly because CLHS advocates don’t seem to feel the need to axiomatize quantum theory, but are rather content if they can convince themselves that the measurement postulates emerge just for quantum mechanics itself. However, if one did develop this framework, I would be willing to bet that most theories in the CLHS generalized framework would not admit a CSHS.
So we seem to be in a special place where we have a theory that admits both CLHS and CSHS points of view and the theory can be self-consistently developed from either point of view. My version of Q is: why is this so? Can the two churches be unified? Is there a meaningful physical principle which explains why we need to have both? If you can answer that, then we can let Chiribella et al. do most of the legwork after that, modulo exploring the theories that might violate some of their other more technical axioms.
Comment #359 January 28th, 2022 at 3:27 pm
Scott: “By my age, Einstein had completed general relativity, Turing had founded CS, etc.”
Well, but at that age Leibniz hadn’t yet written his Monadology.
And regarding the comment #8 by Rahul: “To answer Q1 do we have to first believe that God exists?”
I comment further: “Or do we have to believe that God is perfect?”
Comment #360 January 28th, 2022 at 3:36 pm
Request for help and/or clarification.
Can someone who is a many-worlds person, or Everettian if you prefer, explain to me why the many-worlds ontology is so appealing or self-evident to you? I am looking for someone who is a die-hard, every-branch-is-equally-real many-worlder.
Was there one moment where it all clicked for you? Do you believe that there is an uncountable infinity of other branches of the wavefunction that are equally real, equally extant?
Do I just lack imagination, or do I just not get it? From my perspective, when someone says something like, the only thing that really exists is the entire wavefunction of the universe, it feels like the word exists is doing a little bit too much work. That is, if things “exist” that cannot, even in principle, interact with “our branch” of the wavefunction, why say they exist?
So yeah, I think nobody is home in the other branches of the wavefunction in a way that is physically meaningful to me. I think the situation remains mysterious. For the time being, I would rather contemplate and marvel at the mystery, rather than emphatically proclaim victory in the form of a massively wasteful and exorbitant ontology.
What am I missing?
Comment #361 January 28th, 2022 at 3:59 pm
Quantum mechanics and free will seem to be somewhat like cellular automata and other computer programs in the sense that individual steps are taken (representable as numbers assigned to variables) in response to the individual situation a cell or an entity finds itself in. They are all about taking steps.
This is very different to the experimentally verified laws of nature which, despite the delta symbols, are static relationships; the laws of nature are not steps.
Steps are a different type of thing to relationships, but seemingly a system needs both relationships and steps.
But it is only in computer programs that the individual steps become rules. With quantum mechanics, and with free will, the individual steps taken are not rules.
Comment #362 January 28th, 2022 at 4:08 pm
Scott P. #356: It’s simple. After the measurement, in one third of the worlds the particle is spin-up and in two thirds of the worlds it is spin-down.
This “third” comes from finding a way to count worlds. It’s easy to find a rule that agrees with the Born rule: just define the measure of a world to be its 2-norm squared, and the relative measures will agree with the probabilities. There are plenty of formal arguments for deriving this measure in the literature, but I think the strongest argument is that it fits the data.
Now, you propose a different measure: that we should count equally each set of worlds that share the same measurement result. As you noticed, this has the fatal flaw of contradicting the data. It’s also ill-defined: these measurement results are just one decoherent event we chose to pay attention to. There’s plenty of decoherence events happening all the time, everywhere. To actually count each decohered branch equally we would need to take them all into account. It’s clearly a hopeless proposition, and to the best of my knowledge nobody has even tried to do that.
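To make the arithmetic concrete, here is a minimal sketch (plain Python; the amplitudes are the assumed ones implied by the 1/3-vs-2/3 example above) contrasting the 2-norm-squared measure with naive equal branch counting:

```python
import math

# Assumed amplitudes for the two branches in the example above:
# spin-up with probability 1/3, spin-down with probability 2/3.
amplitudes = [math.sqrt(1 / 3), math.sqrt(2 / 3)]

# Proposed measure of each world: its 2-norm squared.
measure = [abs(a) ** 2 for a in amplitudes]
# -> approximately [0.333, 0.667], matching the Born-rule probabilities

# Naive branch counting assigns every outcome the same weight:
naive = [1 / len(amplitudes)] * len(amplitudes)
# -> [0.5, 0.5], which contradicts the observed frequencies
```

The point of the snippet is only that the 2-norm-squared rule is a well-defined measure on branches that fits the data, while equal counting is not.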
Comment #363 January 28th, 2022 at 4:12 pm
Scott P. #356:
I see the problem as the idea that “you” have a singular position on the amplitude being contradicted by the existence of superpositions; you aren’t an infinite collection of identical people who diverge along different paths, you’re all of them; that’s what a superposition is (if they were actually distinct, we wouldn’t observe superpositional behavior). So the natural answer, that it’s just a question of which “you” you find yourself being, doesn’t actually illuminate anything, because in the given example there are still just two of you.
Personally, on the crackpot side of things, I think “you” are more like a subset of the superposition; there’s no single slice of the superposition that is “you”, but at the same time “you” are not, in fact, the entire superposition, and the superposition is actually quite “fuzzy”; there are very slight differences between different “you” clusters in the superposition, which are still coherent enough not to entirely cancel out. This may or may not resolve the problem. But an alternative version, in which we simply posit the existence of multiple “you” clusters which aren’t fuzzy and are perfectly identical, may help resolve the original version of the question: why shouldn’t two possibilities always have a 50/50 split in terms of experience, the amplitude being merely amplitude after all, while still preserving superpositional behavior? You’re not the entire amplitude, you’re some portion of it, and thus if 2/3 of the amplitude goes in one direction and 1/3 goes in another, you observe those probabilities in your personal experience.
That is, we only get the problem if we assume that “you” are the entirety of the superposition; as soon as we assume “you” are only a subset of it, the problem disappears.
Also, viewing it as branches is, I think, seeing the operation somewhat differently than it actually is. A measurement happens; the waveform “splits”. You yourself don’t split until you observe the measurement, and you don’t necessarily stay split; if the physical reality is identical at some point thereafter (nothing would be different in either case), then the “branches” have re-merged, because the waves have not actually diverged such that the different possible wave states cancel out (because, if we suppose there is no difference in the wave states, there’s no difference to cancel out), and you’re back to a superposition of either measurement. In practice this doesn’t matter at all (or at least I can’t think of a case where it makes any difference), but I think it matters to understand the interpretation, that there isn’t actually a separation between the “worlds”, it is purely a phenomenon of wave interference / cancellation.
I’m also going to kind of agree with fred in that every case I’ve tried to work out has fundamentally been some series of 50/50 splits (and recombinations), except the fact that I can’t come up with a scenario that isn’t a 50/50 split isn’t evidence that one is impossible, but that ties into other crackpot stuff that isn’t actually relevant to the question. However, I’ll disagree on the splits persisting for the fundamentally-identical-cases, because if there isn’t a physical difference – if the wave functions are identical – then the waves go right back to a superposition and interfering/canceling each other out.
Comment #364 January 28th, 2022 at 4:32 pm
fred #353:
Yep. So if the universe is built out of waves, something like QM. If the universe is built out of particles – well, also something like QM.
If energy is quantized, we’ll observe quantization. If energy isn’t quantized – well, we’ll still observe quantization, at least in any universe with relativity. To explain, insofar as an observer presupposes a finite set of probable small-scale stable configurations, which is to say, insofar as there’s something LIKE predictable chemistry, then there is a finite number of normal-conditions expected quantities of energy. In any universe with relativity, which is to say, any universe with a finite maximum speed, things on small scales “run faster” than things on large scales, such that only state changes between stable configurations can even be observed. (Observe that an observer whose atoms are the size of galaxies would have a scale-relative timescale incomprehensibly slower than our own, and wouldn’t be able to observe, for example, stars, which are far too short-lived and unstable to be observed on those timeframes).
Comment #365 January 28th, 2022 at 4:49 pm
I’m not a scientist, so I can’t add to the questions you asked, though sometimes I think about them. For some time I have thought that all metaphysical and scientific theories share similar patterns. The ancient Greeks and modern physics have many similar ideas and topics (discrete and continuous, atoms and energy, etc.). Maybe our brains are wired so that we can only think about reality with these ideas. Maybe for deeper understanding we would need to upgrade our brains with genetic engineering or cybernetics. Penrose believes that our brain can do super-Turing computation, but I’m not even sure that our brain is Turing-complete. Artificial neural networks aren’t generally Turing-complete. Has anyone proved that the human brain is?
Comment #366 January 28th, 2022 at 5:16 pm
Scott #343
– I.e., whenever you think you’ve found a completely convincing reason why QM is basically inevitable, you always, always need to stop and ask yourself: “well, suppose certain experiments in the early 20th century had turned out a different way. Would my brain have been able to generate equally convincing reasons why not(QM) was basically inevitable?” –
No, because not(QM) was already logically incompatible with the 19th century experiments establishing Maxwell’s equations. Not sure how far back you need to go in terms of empirical evidence before classical explanations start requiring Rube Goldberg concoctions, but I would guess not much.
Same goes for indeterminism: once observed that there is this intrinsic randomness in microscopic phenomena (as required to “make sense” of complementarity), trying to embed this randomness in a classical framework simply adds another substrate whose only purpose is essentially to negate the sentence “this intrinsic randomness is fundamental”. Why should one entertain that hypothesis?
Am I a victim of this self-confident delusion you mentioned, and missing something obvious?
Comment #367 January 28th, 2022 at 5:29 pm
Einstein always said that the fundamental physical reality of QM should be expressed in terms of some kind of *physical* spacetime, not abstract spaces. But what space-time? Not the classical kind. So the closest thing in spirit to what Einstein wanted has to start with the phase-space formulation of QM and find a physical geometry, a *non-commutative geometry*. So I would ask: what natural *physical* principles related to this kind of geometry could explain QM?
I’d say that Alain Connes was probably on the right track; he’s the one who has tried to develop non-commutative geometry since the 80s. But the problem is: where are the underlying *physical* principles to motivate it? Without these, the task is hopeless, analogous to someone trying to find the math of general relativity without knowing any physics.
This is the mistake of nearly all the commenters in this thread; one simply cannot hope to understand QM merely by shuffling math symbols or firing off vague verbal ‘interpretations’ of abstract non-physical concepts like ‘wave functions’, one must obtain the underlying *physical* principles, expressed in terms of *non-commutative geometry*.
Comment #368 January 28th, 2022 at 5:46 pm
Tu #360: For me it was when I realized that Many-Worlds is what you get when you just take what the Schrödinger equation says as literally true, and stop torturing it with an unphysical and ill-defined collapse. It got reinforced when I took a course on QFT and realized that the high-energy people simply ignore collapse; for them the theory is completely unitary. Obvious in retrospect: for them relativistic effects are crucial, and how could they ever reconcile that with a nonlocal collapse?
I could either embark on mental gymnastics, as in orthodox quantum mechanics, or believe what the math was telling me. The choice was clear.
And yes, all branches are real. There’s nothing in the math to differentiate them. The Bohmians like to postulate the existence of some invisible pink unicorns bringing the magic of reality to only one branch, but that’s just ridiculous. At least they realize that this is what it takes to deny the existence of the other branches.
Comment #369 January 28th, 2022 at 6:11 pm
Daniel Varga #221:
I design a deterministic cellular automaton, and agents in it. The agents cannot observe their environment without altering it, which gives rise to some version of the Uncertainty Principle, which in turn forces them to accept a quantum theory of physics. I am NOT saying that the agents are wrong, and their world is “really” a hidden variable world. From their perspective, they know everything about their world that’s ever knowable, and it is quantum. But I, who designed their world, have a different, very valid perspective.
My criticism is specific and technical: I see no reason to imagine that you can actually get QM that way. It seems to rely on a fuzzy notion of “the uncertainty principle” that was lifted from a popular magazine article, rather than the actual quantitative statement about complementary observables. But forget about the uncertainty principle: how would you explain Bell inequality violations this way, without having to resort to superdeterminism, thereby curing your headache by guillotine? How would you explain Shor’s or Grover’s algorithms??
The whole point of all these discoveries is that QM does not act like a classical CA to which you only have fuzzy, incomplete access. Maybe that was an honorable first guess, but you still have to discard the guess now, because it makes clear predictions and those predictions are wrong.
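For reference, the quantitative gap behind the Bell point: any local classical model obeys the CHSH bound S ≤ 2, while QM reaches 2√2. A quick check in Python, assuming the standard singlet correlation E(a, b) = −cos(a − b) and one conventional choice of optimal angles:

```python
import math
import itertools

# Classical side: deterministic local strategies assign outcomes in {-1, +1}.
# Over all such strategies, S = A0*B0 + A0*B1 + A1*B0 - A1*B1 never exceeds 2.
classical_max = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)

# Quantum side: singlet correlations at one set of optimal angles.
E = lambda a, b: -math.cos(a - b)
a0, a1 = 0.0, math.pi / 2
b0, b1 = 5 * math.pi / 4, 3 * math.pi / 4
S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)

print(classical_max)  # 2
print(round(S, 3))    # 2.828, i.e. 2*sqrt(2), the Tsirelson bound
```

No classical CA with merely incomplete access can reproduce that 2√2 without something like superdeterminism, which is the point being made above.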
Again and again in this thread, I’ve admitted what I don’t know. But there’s one part I feel absolutely certain about, and that’s that the right approach to QM is to listen to what Nature says rather than dictating to her what she must’ve meant to say.
Comment #370 January 28th, 2022 at 6:19 pm
Scott, in comment #228 you mention how during the 1800’s physicists could try and address deep questions such as “Why is mechanics time-reversible?” “Why does mechanics satisfy a least action principle?”
(The following is tangent to the question raised in your blog post, but it may catch your attention.)
Interestingly, the reason why Hamilton’s stationary action holds good can be demonstrated using means that were already available during Hamilton’s time. That is, the fact that Hamilton’s stationary action holds good can be explained entirely in terms of _classical mechanics_. Links are at the end of this comment.
To avoid misunderstanding: I confirm of course that all of the phenomena of classical mechanics emerge from the quantum world. In that sense the entire body of classical mechanics is accounted for in terms of QM.
I am aware of course that the claim that Hamilton’s stationary action can be understood _classically_ is an unexpected one. Your _expectation_ is that Hamilton’s stationary action comes from QM.
I am aware that if a claim is highly _unexpected_, the demonstration has to be low-friction and very accessible. (Conversely, if the demonstration is opaque or dull, most readers will likely dismiss it.)
For the goal of vivid demonstration I have created a set of interactive diagrams. The diagrams have sliders, moving the sliders allows the visitor to explore effects of variation. As the trial trajectory is modified: the diagram shows how various values respond to that.
I hope I can persuade you to check it out:
Available in two locations:
On the physics forum site physics.stackexchange:
https://physics.stackexchange.com/a/670705/17198
On my own website:
http://www.cleonis.nl/physics/phys256/energy_position_equation.php
(The version on my own website has the fully functional diagrams. The version on physics.stackexchange has animated gifs that are composited from screenshots.)
(The link to the stackexchange version is presented here to show that this material has been vetted by others. I’m acutely aware of your ‘claimed mathematical breakthroughs list’.)
Comment #371 January 28th, 2022 at 6:24 pm
fred #231:
So, if there’s a way to show that QM is necessary it’s probably by looking at even more fundamental ideas that are hard to dismiss, like entropy, conservation of information, fundamental symmetries, the fact that spacetime has 3+1 dimensions, etc.
I agree! That’s exactly what was done for many other aspects of physics that seemed unmotivated at first, such as (famously) the Lorentz transformations.
QM, however, seems a lot more fundamental than the 3-dimensionality of space (certainly the string theorists regard the latter as a mere emergent detail 🙂 )
Comment #372 January 28th, 2022 at 6:34 pm
fred #235:
Why do you assume that two theories that work separately can always be reconciled?
Isn’t it possible that the transition from one mode to the other involves processes that just can’t be described by some compact mathematical relation?
Whether or not that’s possible, when it comes to QM and special relativity it’s not true! We know that they can be reconciled because they were.
Comment #373 January 28th, 2022 at 6:42 pm
drm #238:
A couple of dumb questions from a biologist:
1) does QM require infinite precision for unitarity, etc. or do all of those irrational factors of sqr of 2 and pi take care of themselves?
The short answer is that trying to eliminate irrational numbers from QM would make it horrendously less elegant. Furthermore, it follows from basic trigonometry that if you want your angles to be rational (or even rational multiples of π), that will typically make your lengths irrational; if you want the lengths to be rational that will typically make the angles irrational.
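The precise fact behind that trigonometric claim is Niven's theorem: among rational multiples of π, only a handful of angles have rational cosines.

```latex
% Niven's theorem:
\theta \in \pi\mathbb{Q} \ \text{ and }\ \cos\theta \in \mathbb{Q}
\quad\Longrightarrow\quad
\cos\theta \in \left\{\, 0,\ \pm\tfrac{1}{2},\ \pm 1 \,\right\}
```

So outside those exceptional cases, a rational angle forces irrational lengths, and vice versa.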
2) Is Bell’s non-local condition equivalent to the older (I gather) notion of contextuality?
No, they’re different (although there are some mathematical connections between the two). To talk about measurement contextuality, for example, you don’t need any tensor products or spatial separation between Alice and Bob.
Comment #374 January 28th, 2022 at 6:51 pm
Scott #288:
I completely agree with all of your first point which I copy here below and regret that I gave you any other impression:
“… if someone in the 1800s had asked “why is classical mechanics true? why does it have the features it does?”, with hindsight that would’ve been one of the best questions they could possibly have asked! Because it would’ve had nontrivial answers! Albeit answers that were only discovered later. For instance: Why is classical mechanics time-reversible? Why does it satisfy a least action / Euler-Lagrange principle? The answers would come from QM. Why does the gravitational force fall off like 1/r^2? Why are gravitational and inertial mass the same? The answers would come from GR. In other words, there really were deeper principles waiting to be discovered (deeper principles expressed, yes, using math). So, your thought experiment strikes me as supporting optimism, rather than pessimism, about the search for deeper principles underlying QM! Having said that, there’s an immense irony here: physicists were ultimately able to explain classical mechanics in terms of deeper theories, in large part because they discovered that classical mechanics wasn’t exactly right. The corrections were what led them to the deeper theories from which classical mechanics was then recovered as an excellent approximation.”
On your second point, if we change it as follows:
“If (as you seem to think) QM ~~isn’t exactly true~~ may yet prove to be an incomplete model in the event of new evidence, just like classical mechanics wasn’t, then we should ultimately be able to explain QM in terms of something deeper.”
And with that change, I think you and I both agree that this would be the optimistic outcome if evidence were to show QM incomplete AND we found a more complete model.
To your third point:
“If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).”
We both have a fear. But our fears are different. And this has been my point.
The Scott Fear:
Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.
The Clinton Fear:
Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.
I fear that one of these nightmare scenarios below may be the truly horrifying answer to your Q.
(A) We have descended into a valley of mathematics or landed on an island in theoryspace, from which we CANNOT construct the mathematical tools required to leave. In other words, WE ARE IRRETRIEVABLY TRAPPED (unlike what you and I feel would be the optimistic scenario in your second point) in a local minimum on the theoretical landscape.
(B) Or, worse, the universe is not fundamentally mathematical at all – and so obviously cannot be QM. It is something … beyond mathematics – whatever that would even be. Maybe mathematics is just an … emergent property … or an evolved part of cognition. Yes, it works WITHIN the universe … but it doesn’t capture the “deeper” universe. I presume you are presuming that we will go about our project to understand reality by presuming that we WILL use mathematical reasoning. But I think once you do that then you find yourself ending up eventually at QM. Why? Well, the basic notions of symmetries that come with mathematics, how that leads to group theory, the normed division algebras, makes complex numbers the special case, and how if you say “I want to make a predictive model” … well, then you just assumed we were going to be using a probability model … and so … you get QM (see my first post). And, yes, I know that questioning the use of math makes you want to “howl into the dark” because … well what else are we supposed to do?! It is what works! But … OK … I’m telling you what a horrible truth this would be … what I’m AFRAID of … And that would be to KNOW that the only thing we know that works … is NOT and CANNOT be the way to understand the fundamental nature of reality.
(C) Or, maybe worst of all, we are being tricked by our own neural model of computation. The very thing we rely upon to know and think anything at all is just generating an elaborate cognitive deception. Neuroscientists are now generally arguing that what we take to be reality, free will, a sense of self … are sophisticated computational illusions generated by the brain. Yes, I know, physics and philosophy have been onto the idea that we should doubt our sensory experiences for a long time … But I’m not just talking about doubting our sensory experiences … I’m talking about doubting the very computational model we are using to generate those sensory experiences, the logic of our very thoughts, the computational model itself. And I am NOT encouraged that neuroscientists report the brain encodes complex numbers, the brain represents states in vectors of complex amplitudes, that normalization is canonical in the brain, and that linear operators are standard in the brain – all of which sounds like a vaguely familiar model of computation. I would feel much LESS worried about this possibility IF the neuroscientists reported that the brain looked anything like a binary model of computation. But it doesn’t – not at all. I would also like to think optimistically that the explanation is that the brain evolved a model that was like the model running in the environment (the universe) that it evolved in. In that case we could get back to the Scott Fear of definitely knowing the “actual operating system” viewpoint.
It is possible that these three situations overlap.
To borrow Wittgenstein’s metaphor, we are like the FLY IN THE BOTTLE which is trapped in the bottle because nature endowed it with a model of seeing (phototaxis) that does not even potentially ALLOW it to realize that there is a way out.
Comment #375 January 28th, 2022 at 6:54 pm
wolfgang #245:
Why did God use quantum theory to make the universe, but have it appear classical to us?
And how exactly did he do that?
Oh come on, we all but know the answer to that one: decoherence! Not an add-on, but an unavoidable prediction of QM in a universe that’s gradually filling out its Hilbert space like ours is.
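A minimal toy version of that answer, sketched in plain Python: couple a system qubit to a single environment qubit via a CNOT (the environment “records” the system), and the system’s reduced density matrix loses its off-diagonal coherences, i.e. it looks classical.

```python
import math

s = 1 / math.sqrt(2)

# System starts in (|0> + |1>)/sqrt(2), environment in |0>.
# After a CNOT (system controls environment) the joint state is
# (|00> + |11>)/sqrt(2); amplitudes indexed by (system, environment):
psi = {(0, 0): s, (1, 1): s}

# Reduced density matrix of the system: trace out the environment.
# rho[i][j] = sum over env e of psi[(i, e)] * psi[(j, e)]  (amplitudes are real here)
rho = [[sum(psi.get((i, e), 0.0) * psi.get((j, e), 0.0) for e in (0, 1))
        for j in (0, 1)] for i in (0, 1)]

# rho ≈ [[0.5, 0.0], [0.0, 0.5]]: the off-diagonal terms that carried
# the superposition's coherence have vanished.
```

This is of course the two-qubit cartoon of decoherence, not the full story of einselection in a macroscopic environment, but it shows the mechanism is already there in unitary QM.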
Comment #376 January 28th, 2022 at 6:56 pm
Even beyond the question why this particular framework exists and is (supposedly) more fundamental than the other, there is a more difficult and, maybe, deeper issue:
Are the existing laws of nature eternal, or did they evolve from some more primitive or less “accurate” (or less well defined) primordial principles?
Do they have an independent existence in some “platonic” sense?
Quantum mechanics seems to be more fundamental, yet it still depends on some (naive versions of) classical notions (like space and time). There is lots of speculation about the supposed emergence of spacetime (and gravity) from some purely QM description, although gravity is the only universal interaction, as it affects everything, literally, that exists, including spacetime itself, and that’s a bit funny to think about…
Comment #377 January 28th, 2022 at 7:35 pm
Scott #288:
“If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).”
But should we try?
A quick page search shows almost no mention in this thread of Gödel or the Halting Problem. That can’t be right 🙂
Let me try to come up with something completely off the wall then and you tell me why this doesn’t apply.
Incompleteness:
Assume that QM is exactly the operating system of the universe. In other words, the map is the territory. I mean literally: the universe physically is made out of complex amplitudes and the most fundamental physical laws are exactly the axioms of QM. Then, clearly, QM is an axiomatic system at least sufficient to capture the properties of N. Therefore, it will be impossible to prove the completeness of QM from within the universe. (As a bonus rabbit trail, if this were true, then is GR an example of an unprovable true statement within QM?)
The Halting Problem:
Or, how about this. Aliens arrive tomorrow and declare they know everything and say, “Ask us anything.” You ask, “Why did it HAVE to be QM?” They give you the principle X or the underlying general model Y that explains why the universe had to be QM. Will not your first thought then be, “Yeah, but … why did it HAVE to be X? or why did it HAVE to be Y?” So, it seems like there is something wrong with the question. In other words, if QM is the operating system of the universe then … there is no deeper reason why … there is no deeper operating system. Otherwise, you get a halting problem situation because you are asking “Will this program (search for the fundamental model) halt?” I mean … Let P be the program to find or verify the most fundamental model of the universe. Let I be the evidence input. Let output be 1 if P halts on I and 0 if P never halts on I. Assume there exists a program Halt(P,I) that returns 1 if and only if P halts on I …
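For what it’s worth, the diagonalization lurking in that sketch can be made concrete: any candidate Halt(P, I), no matter how it is implemented, fails on an adversary program built from it. A toy illustration in Python (the trivial decider below is, of course, just a hypothetical stand-in):

```python
def halts(program, inp):
    """A candidate halting decider. Any concrete implementation will do;
    this one claims that no program ever halts."""
    return False

def adversary():
    # Do the opposite of whatever halts() predicts about adversary itself.
    if halts(adversary, adversary):
        while True:      # predicted to halt, so loop forever
            pass
    return "halted"      # predicted to loop, so halt immediately

# The decider says adversary never halts, yet adversary() plainly returns:
print(adversary())  # halted
```

Swap in any other candidate for `halts` and the adversary defeats it symmetrically, which is the sense in which "the search for the fundamental model" question above resists a general decision procedure.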
These thoughts aren’t even half-baked 🙂 but I’m just trying to think of some way to prompt you to tell me why incompleteness/HP has nothing to do with your Q
Comment #378 January 28th, 2022 at 7:44 pm
Apologies for the overly long comment.
I want to start by saying that I completely agree that cellular automata seem much more natural than quantum mechanics. There are certainly difficulties in making a CA world come “alive”, but it isn’t clear that they are intractable. Here are some of the difficulties that I see:
– If the CA is not reversible, then it probably needs a hardcoded initial state (my intuition is that states which have an infinite chain of predecessors are rare if the CA is irreversible). This invites the religious question of “well who chose that state, then?”. It also invites Last Thursdayism – who is to say that where you are now isn’t the initial state?
– If your initial state has only finitely many “living” cells, then somehow its evolution seems like it will be eventually predictable, even to those living within the system. Well, maybe not – perhaps the inhabitants can’t compute the consequences of the initial position faster than they occur – but I generally dislike this type of universe as a place to live. There is always the fear that every source of interestingness in the world will eventually die out.
– If the state ever becomes periodic in some direction, then from that point on the CA is equivalent to a lower-dimensional CA.
– If the initial state is chosen at random, then we have the question of what mechanism makes the random choice, and why is that mechanism so different in flavor from the deterministic evolution of the CA from that point on?
– Once a CA has many local states or has a large neighborhood, describing its transition function becomes prohibitively difficult (I speak from experience). Even symmetry assumptions don’t buy you that much simplification.
In order to deal with these objections, it seems that at the very least you will want to use a reversible CA, ideally with some element of randomness. One direction which I haven’t seen explored is the possibility of a deterministic CA with a non-random quasiperiodic starting state, or perhaps a CA defined on a quasicrystal tiling of the plane (such as the Penrose tiling). In such a universe, the element of randomness comes from not knowing your precise location in space. You still seem to run into a few of the objections above.
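On the reversibility desideratum, one standard construction is worth noting here: Fredkin’s second-order trick, where the next row is any function of the current neighborhood XORed with the previous row. That is invertible no matter how “lossy” the rule function itself is. A small sketch in Python:

```python
import random

def step(prev, curr, rule):
    """Second-order CA: next[i] = rule(neighborhood of curr at i) XOR prev[i].
    Reversible, since prev[i] = rule(...) XOR next[i]."""
    n = len(curr)
    return [rule(curr[(i - 1) % n], curr[i], curr[(i + 1) % n]) ^ prev[i]
            for i in range(n)]

# Any rule works, even one irreversible on its own, e.g. majority vote:
rule = lambda a, b, c: 1 if a + b + c >= 2 else 0

random.seed(0)
a = [random.randint(0, 1) for _ in range(32)]  # state at t = 0
b = [random.randint(0, 1) for _ in range(32)]  # state at t = 1

# Run forward 100 steps...
hist = [a, b]
for _ in range(100):
    hist.append(step(hist[-2], hist[-1], rule))

# ...then run backward with the same step function, arguments swapped.
x, y = hist[-1], hist[-2]
for _ in range(100):
    x, y = y, step(x, y, rule)

print((y, x) == (a, b))  # True: the initial state is exactly recovered
```

The cost, as the objections above anticipate, is that you now need a hardcoded pair of initial rows rather than one, and the randomness has to come from somewhere outside the update rule.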
An interesting side note is that reversibility and randomness seem to be the right framework for thermodynamics, and non-equilibrium thermodynamics might be a plausible recipe for life. Jeremy England has an intriguing research program which has the goal of showing that life naturally evolves in an environment with reversible physics, access to a heat bath, and access to a source of harvestable energy (based on the failure of a naive attempt to get this to work, I think that the ultimate source of this harvestable energy should be separated from the location where life is doing the living).
—
Most of the objections to CA physics also apply to other models of physics. They lead to the following desiderata:
– Physics should be reversible (note that this isn’t the same as saying that the laws of physics going forward in time should be the same as those going backwards in time, which is empirically false).
– There should be some element of randomness. The randomness shouldn’t occur at some special time, but rather should be an ongoing part of the evolution of the world.
– Physics should not depend on having a very strange initial state.
Actually modern cosmology doesn’t seem to satisfy these desiderata, as far as I understand it… if physics is reversible, and information is conserved, and if only a finite amount of information can be squeezed into a tiny space by black hole information bounds, then where was all the information stored at the time of the Big Bang? Ok, I’ve exposed my ignorance; let’s leave this question to the side until someone solves quantum gravity.
—
The discussion of CAs vs quantum mechanics has implicitly focused us on universes which satisfy the following desiderata:
– The rules of physics should be computable, in the sense that you should be able to approximate the (statistical distribution of) outcomes to any desired accuracy.
– The laws of physics should be based on local interactions. (Implicit in this is that there should be some notion of distance, and time, which looks at least vaguely like a manifold. I’m not sure how to answer the question “why not a Sierpinski triangle?”.)
– We should be able to describe what is going on in a finite region of space to a high enough accuracy that we can make predictions about it at a fixed time in the future, using only a finite amount of information. (Chaos theory says that in some reasonable classical situations we may need a huge amount of information to predict times very far in the future, but it has little to say about a fixed time in the future.)
All of these seem to be saying the same sort of thing. Local rules on finite alphabets is how we define Turing machines, after all, and logic is also based on local rules of deduction.
Why do we think that “computability” is the natural boundary, here, rather than “polynomial-time computability”? One argument I’ve heard is that a Turing-complete universe with polynomial-time physics can simulate any computable universe, but no computable universe can simulate the physics of a universe with a halting oracle (except perhaps in some limiting sense). This isn’t completely satisfying to me: polynomial-time computability seems like it should be a natural requirement, even though it empirically isn’t.
At the very least, even if physics isn’t simulatable in polynomial time, it seems that SAT-solving should still be hard – otherwise, what is the point of evolving complex life and inventing mathematics? This isn’t a very good argument, though: I’d be excited to have an NP-oracle, rather than distressed.
—
Once we’ve gotten this far, I feel that we have sort of justified studying generalized probability theories. We want reversible randomness. You mentioned that you’ve seen the work of Chiribella et al. There actually seem to be many variations on this work: several different ways to reconstruct quantum mechanics from a small number of reasonable axioms plus one “weird” axiom. Interestingly, every reconstruction uses a slightly different collection of “reasonable” axioms. I’m quite curious about whether pooling together all of the “reasonable” axioms (and leaving out the “weird” ones) of the different approaches is enough to reconstruct quantum theory.
Here is an attempt to list out the “reasonable” axioms that occur in various quantum reconstructions:
– The framework should be “causal” in the sense of generalized probability theory (closely related to having no faster-than-light signalling, which seems essential for computability). Another way I’ve seen this phrased is that “local operations commute” (obviously implied by special relativity).
– The system should not be deterministic. From here it’s only a small leap to assuming that the set of states is convex.
– Since convex sets naturally have a dimension, we can ask whether the collection of possible states in a finite region must be finite-dimensional. At least it should be possible to approximate the state-space as finite-dimensional, if we hope for a computable physics? (Quantum Field Theory seems to violate strict finite-dimensionality, and as far as I can tell no one knows how seriously we should take that. But assigning a finite-dimensional Hilbert space to every possible region of space seems problematic once we start wondering about the exact amount of space necessary for the dimension of the Hilbert space to increase by exactly one. I wish someone who understood quantum gravity would explain how this is supposed to make sense.)
– Different states should be meaningfully different: it should be possible to distinguish them by performing measurements on them. “Local discriminability” takes this a step further, but requiring it seems somewhat intuitive: it would be odd if there was a strange property of a pair of spatially separated particles which could only be tested by bringing them back together, especially because there would be no point in time when they were ever truly in the same place (assuming that space is not discrete).
– We should be able to independently prepare states in separated locations, without getting any strange correlations between them. (Together with the previous assumption, I think this implies that the linear span of the state space of a composite system is a tensor product, but it’s been a while since I’ve gone through this stuff.)
– State transitions should be locally reversible, as long as you look at all of the information available in a neighborhood of the thing that is changing. In particular, there should be a reversible operation which looks like flipping a coin when some of the information is thrown away. This assumption is less defensible than others, but it has intuitive philosophical arguments in its favor. (Can this assumption be used to justify “purification”?)
– Time evolution should be infinitely divisible (and probably continuous, too). This rules out CAs by fiat (unless you have some clever stochastic CA where transitions occur based on a Poisson process, but that would seem to conflict with special relativity?). A fan of discrete systems might throw this assumption out, but if it helps to narrow things down to just quantum mechanics then I say let’s use it!
– Measurements should be explainable in terms of the transitions that exist in the theory. So you can’t just have a theory with a weird collection of state spaces, and no interesting transitions at all! (I haven’t seen anyone state or use this axiom, but it certainly seems reasonable to me.)
Did I leave any obvious reasonable axioms out? Are these enough to derive quantum mechanics? Or can you somehow satisfy all of this with a non-quantum theory?
Comment #379 January 28th, 2022 at 8:26 pm
No Preferred Reference Frame at the Foundation of Quantum Mechanics
https://www.mdpi.com/1099-4300/24/1/12
Hi Scott, I know you don’t like to receive references, but two things:
1. Doesn’t this sound like a dead-on attempt to answer your question?
2. Please at least glance at the picture (click on the link and scroll down)
TIA for considering this
I hope it's as relevant as I think it is.
Comment #380 January 28th, 2022 at 8:56 pm
Scott
“Whether or not that’s possible, when it comes to QM and special relativity it’s not true! We know that they can be reconciled because they were.”
Yea, after I posted I realized you were talking about QM+SR and I was thinking about QM+Gravity.
Comment #381 January 28th, 2022 at 9:04 pm
Scott #371
“I agree! That’s exactly what was done for many other aspects of physics that seemed unmotivated at first, such as (famously) the Lorentz transformations.
QM, however, seems a lot more fundamental than the 3-dimensionality of space (certainly the string theorists regard the latter as a mere emergent detail)”
What do you think of Sean Carroll’s program to show that everything, including spacetime, could be derived on top of the wave function as the most fundamental object?
I guess that’s one way to go about proving that QM is necessary, no?
Comment #382 January 28th, 2022 at 9:24 pm
fred #231, Scott #371
Yes, that’s exactly what I mean by looking for deeper fundamental principles. All the clues are there, Scott, I think you could crack this by next week, just by thinking it through carefully 😀
Let's summarize my proposed chain of reasoning:
(1). We can't understand QM starting with abstract "wave functions" or pure math; we must identify the motivating *physical* principles, *not* abstract concepts like wave functions
(2). The “wave function” is non-physical, it’s just a computational method we use to calculate aspects of a deeper reality
(3). The deeper reality has to be a physical geometry
(4). The putative new geometry isn’t classical, it must be non-commutative
(5). The closest quantum formulation is the ‘phase-space formulation’, so that should be the starting point when looking for the putative new geometry
To get the deep physical principles, we need to understand what quantum mechanics is *actually* about at the deeper level, which is unknown.
Now, if I were to guess, based on the ‘phase-space formulation’ and also the black-hole information stuff, my guess would be this:
(6). *Quantum mechanics is a generalization of statistical mechanics*.
The reason I like my postulate (6) is because it would naturally result in information, complexity and generalized probabilities (Quasiprobability distributions) playing important roles, which we know they do.
So (6) might be a good initial postulate, but that still doesn’t tell us *what* it is that is actually stochastic in the quantum case. For classical mechanics, we know it’s the thermal vibrations of particles (thermodynamics), but what actually is it that’s vibrating in the quantum case?
Comment #383 January 28th, 2022 at 10:15 pm
Viktor Dukhovni #246:
What’s always puzzled me about QM is the fact that in an apparently non-deterministic future we still somehow get *exact* conservation laws … What’s your take on the puzzle of how conservation and randomness end up consistent?
I'm not sure I understand what the puzzle is. Imagine, for example, a board game where we roll dice to move the pieces around, never adding or removing pieces from the board. We have plenty of randomness even though the number of pieces is an exactly conserved quantity.
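That analogy is concrete enough to simulate; here's a minimal sketch (the board size, number of pieces, and step count are all arbitrary choices of mine):

```python
import random

# Pieces move by dice rolls on a ring of squares: every trajectory is
# random, yet the number of pieces is exactly conserved at every step.
random.seed(0)
board_size, n_pieces, n_steps = 10, 4, 1000
positions = [random.randrange(board_size) for _ in range(n_pieces)]

for _ in range(n_steps):
    i = random.randrange(n_pieces)          # pick a piece at random
    roll = random.randint(1, 6)             # roll a die
    positions[i] = (positions[i] + roll) % board_size

print(len(positions))  # prints 4: the piece count survives 1000 random moves
```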
In general, while symmetries are of course extremely important when trying to guess correct physical theories, it seems to me that their “fundamental” importance in physics is often grossly overstated. At the end of the day, a symmetry simply means that the space of valid, distinct states in your theory is something different, and smaller, than you naïvely thought it was. So, to give some silly CS examples, instead of the set of all possible n-bit strings, maybe you’re restricted to the set of n-bit strings of some Hamming weight k (i.e., the number of “1”s is a globally conserved quantity). Alternatively, maybe every two strings of the same Hamming weight are to be identified, since the theory is symmetric under permutations.
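The counting in that toy example is easy to check directly; a quick sketch with n = 4 and k = 2 (my choice, just for concreteness):

```python
from itertools import product

# A conservation law shrinks the state space, and modding out a
# permutation symmetry shrinks it further.
n, k = 4, 2
all_states = [''.join(bits) for bits in product('01', repeat=n)]

# Conservation law: only strings of Hamming weight exactly k are valid.
conserved = [s for s in all_states if s.count('1') == k]

# Permutation symmetry: strings of the same weight are identified, so
# each weight class collapses to a single state.
classes = {s.count('1') for s in all_states}

print(len(all_states))  # 16: all 4-bit strings
print(len(conserved))   # 6:  C(4,2) strings of weight 2
print(len(classes))     # 5:  weight classes 0..4 after modding out
```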
In some sense, though, these are more statements about our own limitations than statements about the theory itself. After all, the whole time we were worried about building up a one-to-one map between our human notations and the actual physical states of the theory … that whole time, the theory itself was happily living well-defined in its actual state space, the one that’s left after you correctly mod out and otherwise account for all of the symmetries!
Comment #384 January 28th, 2022 at 10:46 pm
Maybe QM is more of an expression or artifact of our doing science through the doors of our perception (apologies to Jim Morrison).
Consider a QM interpretation of the Monty Hall problem. When the first door opens, it appears the wave function for the door contents collapses, but only for the contestant, not the host, who knows what is behind all the doors already. If the host did not know what door the contestant picked, he might open that one. If he did not know what was in the rooms, he might open the one with the car. But he required a combination of both sets of knowledge, and that fact is also known to the contestant. On door opening, the actual “measurement” is not a revelation of what is behind the door, but the “captured” partial increase of knowledge of the contestant, which requires input state from both the host and the contestant. If there are four doors and two contestants, and the second contestant was absent when the first choice was made but arrives just before the first door is opened, their view of the wave function is also overlapping, but different.
A complex number is a pair of reals with an extended definition of multiplication that applies to pairs. The weirdness of time – the Minkowski metric has a minus on the time squared; the time-dependent Schrodinger equation has a -i on the dt – with its suspected relation to human perception being laid out in time – is suspiciously similar to the complex-valued wave function. Human reason, if it is able to make any sense of cause and effect, must analyze with strict determinism and a global reality. However it is an anthropomorphism to neglect the scientist’s initial state and their state change during observation. If the observed item is another scientist observing back, there is no global reality for measurement that pertains solely to one. My view of me, my view of you, your view of you and your view of me, and all the combinations when they meet seems to be asking for pairs that have addition and multiplication descriptions.
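The "pair of reals" picture can be made completely explicit; here's a minimal sketch of complex multiplication defined purely on pairs (the function name cmul is mine):

```python
# The extended definition of multiplication on pairs of reals:
# (a, b) * (c, d) = (a*c - b*d, a*d + b*c).
def cmul(p, q):
    a, b = p
    c, d = q
    return (a * c - b * d, a * d + b * c)

# The pair (0, 1) plays the role of i: squaring it gives (-1, 0),
# i.e. i*i = -1, with no "imaginary" primitives anywhere in sight.
print(cmul((0, 1), (0, 1)))  # prints (-1, 0)
```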
Comment #385 January 29th, 2022 at 12:00 am
Scott #383
“In general, while symmetries are of course extremely important when trying to guess correct physical theories, it seems to me that their “fundamental” importance in physics is often grossly overstated. At the end of the day, a symmetry simply means that the space of valid, distinct states in your theory is something different, and smaller, than you naïvely thought it was.”
Every theory guarantees the existence of symmetries – a symmetry is just a group whose action moves solutions to your equations along the surface in solution space that your theory constrains them to.
Symmetries don’t have to be simple, like the rotational symmetry of x^2+y^2, they may also be elaborate and contrived like whatever the transformation is that preserves x^2+xy+y^3. These symmetries show up in things like plasma physics and are still useful.
They are useful because they abstract away specific parameters and keep only the structure. The groups behind the fields in field theory leave the fundamental constants aside, as things that are implied to exist by the symmetry, but not fixed by it. It’s like telling you that a circle is round without getting into the details of how many inches and micrometers it is across.
Comment #386 January 29th, 2022 at 1:47 am
Clinton #251, Scott #290 (if I may address you as such),
Thank you very much for the resources.
I certainly understand that the mathematical structure of quantum mechanics can be described as an abstract operator calculus, but does this mathematical structure necessarily result in correct physics? That is to say, can correct predictions about physical systems be made from this derived, purely theoretical, result, with only the physical constants filled in by experiment – analogous to the fact that special relativity can be derived from Maxwell’s equations?
For example, it seems as though we can define superposition as the effect of a commutative operator that collapses the wavefunction. Determining the prior state of the system, if we want to determine the order of arguments – the preimage of this operator – is non-deterministic, obeying Bell’s theorem. The output of such a “preimage generator” would be a probability distribution over the (discrete) set of possible inputs, because a commutative operation erases information about the input string. To make this more concrete, all we have to go on when factoring 12 is [3,4],[4,3],[2,6],[6,2], which expands further to [3,2,2],[2,3,2],[2,2,3]. Pick a factor out of the input string: it has a 2/3 probability of being 2, and a 1/3 probability of being 3 – but you cannot determine which argument is a 2 or a 3. Preimage generation of factors is therefore fundamentally non-deterministic.
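The arithmetic in the factoring-12 example is easy to verify by brute force; a quick sketch (variable names are mine):

```python
from itertools import permutations
from collections import Counter

# Enumerate the distinct orderings of the prime factors of 12 = 2*2*3,
# and compute the marginal probability of each value at a uniformly
# chosen position of the input string.
factors = (2, 2, 3)
orderings = sorted(set(permutations(factors)))
print(orderings)  # the 3 distinct orderings of (2, 2, 3)

# By symmetry, the marginal at any position is just the multiset
# frequency of each factor.
probs = {f: c / len(factors) for f, c in Counter(factors).items()}
print(probs)  # 2 with probability 2/3, 3 with probability 1/3
```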
However – it also seems that this approach implies that P!=NP. If the preimage generator is fundamentally non-deterministic, it cannot run in P. There is simply no information for a deterministic preimage generator to go on – it has been erased by the computation that created its input. Therefore, for any commutative operator which erases information at a rate which is polynomial with respect to the input string length (but sub-exponential), its preimage generator will be in NP. (An example of such a commutative operator seems to be, simply, evaluating a polynomial.) The commutative operator itself could be used to verify the correctness of the output of the preimage generator – to use the example above, integer multiplication – however, there cannot be a deterministic version of the preimage generator itself.
To be clear, this is the reverse of the typical case: the preimage generator is taken to be the “original” computation, and the commutative operator (examples of which are clearly in P) is used to “verify” it; we are not using the preimage generator to verify the output of the commutative operator. (I am aware that the existence of lossy functions does not prove P!=NP via the existence of one-way functions, of course!) I am aware that we ordinarily do not care about the order of arguments given to a commutative operator, but in this case, let’s say we do: say, for example, that the arguments given to the commutative operator are a combination to a very complicated combination lock, and we need to recover them in order. (Put more abstractly, we’re treating the commutative operator as taking a vector input, and the operation is performed on the elements of the input data structure, with a scalar output.) Also, this approach seems to not relativize or algebrize, as it does not use an oracle. It further seems to not be equivalent to a natural proof, since a non-deterministic function cannot be Boolean and so it makes no sense to attempt to compute its circuit lower bound. Taken together, these conditions seem to cause this approach to clear the major early bar to such proof attempts.
I use “it seems,” of course, because an insight that “seems” to solve a Millennium Prize Problem is unlikely to be correct. I just can’t figure out why it’s false, although this may be 1:45 AM thinking. I assume I will be swiftly corrected, if it is worth the time to do so. Sorry for the length.
Comment #387 January 29th, 2022 at 1:58 am
Mateus Araújo #368: can you elaborate on the difference between high-energy people and other physicists? As far as I know, all (quantum) physicists follow the same procedure: initial conditions → unitary evolution → measurement.
Comment #388 January 29th, 2022 at 2:31 am
Scott#375
I suspect you are being circular when answering wolfgang #245 by invoking decoherence. If you look for an answer to the first part of the question (why did God use quantum theory to make the universe?) you cannot invoke the QM prediction of decoherence to answer the second part (why and how did He make it appear classical to us?). Unless you consider that God made QM for the very purpose of using decoherence, which would be an answer to your initial question.
Comment #389 January 29th, 2022 at 2:45 am
Update to previous comment: thinking about this some more, the “very complicated combination lock” is the verifier, not just any commutative operator, and obviously, if the answer depends on the order in which the arguments are entered, it’s not a commutative operator.
That’s why you sleep on this stuff, equally obviously. I’ll think very carefully about whether it holds for a noncommutative operator with a vector input and a scalar output – right now, I don’t see why it doesn’t – before I open my mouth again. I sincerely apologize for taking up your time.
Comment #390 January 29th, 2022 at 2:54 am
Scott#375 (continued)
Said otherwise, your Q1 might be reformulated as:
Q1’: Why (and how) did God make the universe both quantum and classical, depending on the way or the scale you look at it? What would’ve been wrong with choosing one possibility only?
I think the answer to this question is much easier, because choosing either one leads to obvious contradictions with empirical evidence. And thus God needs both of them to get a meaningful universe…
Comment #391 January 29th, 2022 at 5:43 am
As soon as we set up a ‘classical’ universe, we may find the reals insufficient to manage reality. The reals then automatically give rise to the complex numbers as a splitting field, which gives us a simpler-to-define universe, where more complexity is possible with fewer rules. In this way Q2 is (almost) inevitable.
Comment #392 January 29th, 2022 at 5:44 am
Mateus Araujo #338, 368:
Every interpretation of QM has some vagueness built in. They just exchange one kind of vagueness for another, and that seems to be inevitable.
In the standard interpretation it is the ill-defined measurement process. In the original Everett version, the “ontological status” of the relative states was not clearly defined.
In some sense, it was much closer to Zeh’s “many minds” interpretation (which is even more vague) than to the currently popular version.
If one thinks that the branching (or splitting, depending on the proponent) structure has to be taken literally, then several serious problems occur:
How is this splitting defined? Globally, or locally?
The first option is incompatible with relativity (there is no preferred slicing), and is extremely non-local and ill-defined. The second option (that the splitting “happens” somehow locally, confined to the future light cone) is not compatible with QM, because of the usual non-local (in the QM sense) aspects of quantum measurements. Also, in the case of EPR-type experiments, there are spacelike-separated measurements/events that have to be correlated, and there is no other explanation for this in the MWI picture (if you have a “local” branching) other than the usual consistency requirement. This is also the standard textbook explanation, so why bother with all this extra baggage about splitting worlds?
The above issues are only the tip of the iceberg:
Things get worse if you allow gravity to enter the party. Whatever the correct deeper theory is (quantized gravity, or emergent, or whatever…) it needs a semi-classical limit (non-Hausdorff?).
Not only that: it has to give, in each branch, the same predictions that GR gives, at least approximately, to fit observations.
So it seems that the claimed simplicity of the many-worlds interpretation is only superficial, after all…
Comment #393 January 29th, 2022 at 7:12 am
Scott #383: While I agree, that doesn’t answer the mystery of why the standard model happens to be built on gauge theories. Why is this redundancy principle so powerful in guessing the right Lagrangians for nature’s forces, if nature cares not about our silly mathematics?
Comment #394 January 29th, 2022 at 7:16 am
Scott #125:
I don’t know enough to tell you why quantum mechanics is the right answer, but I do think there is a simple answer to why-not-classical that you are hinting at here, which is that it’s too small. If stars did not shine nor meteors fall, but we still learned about earthly facts of physics, evolution, and the history we’ve had of evolutionary catastrophe and innovation, then were we wise enough, we could still deduce *purely from first principles* that the world is too small, and that there must be more to our universe than we could see. We would then not be surprised in the least to find out about quantum mechanics—for sure there must be an infinity of branching realities, else how could something as improbable as us possibly have come about?
People, I think, are mostly blind to the exponential power of repeatedly multiplying one small probability by another. Classical mechanics already feels awfully small for our observable universe, apparently barren save for our one species that will probably die by its own hand. I can’t tell you what makes quantum mechanics the simplest big-and-stable theory, but at least it is big! You’d expect the simple theory underlying reality to be made up of some stuff combining with some stuff in a way that cheaply (in terms of descriptive complexity) made a lot more stuff, as much as possible of which is stable, and which amplifies the quantity of conscious life when it eventually is found. Quantum mechanics does that. It gives you a lot more minds for a lot less descriptive complexity than classical theories. The fact that each little primitive of reality is constantly doing things that create entire new spectrums of our whole reality is perhaps the least surprising fact there could be.
I recognize that this is a lame answer that doesn’t say very much specifically about QM versus other big multiplicative theories, but I don’t have the background to comment on that.
Comment #395 January 29th, 2022 at 7:16 am
Scott #261
(4) robustness against small perturbations
This can be addressed by feedback loops even if all the individual components are highly unstable.
Comment #396 January 29th, 2022 at 7:39 am
Followup to followup:
The idea of a “preimage generator” was to avoid this problem, by creating a non-deterministic algorithm that would, effectively, need to guess the order of the arguments. The fact that it’s a commutative operator that produced the hash that is the input to the preimage generator is crucial to the information-theoretic argument, of course, and it cannot be noncommutative or there may be a way to reconstruct the input vector. Equally important is that verification be by a noncommutative algorithm: the commutative operator will return true to any given valid input vector (i.e., any combination lock “key” that is possible, but in the wrong order).
Therefore, we need four elements: first, we start with the combination “key,” then the commutative operator hashes it, then the preimage generator guesses combinations based on factors, and finally we use the (noncommutative) combination lock to test it. A hard-coded key being present in the combination lock may, however, function as an oracle, and we don’t want that.
There is, however, an analogous system: public key cryptography. So, we start with the private key, the commutative operator hashes it – by, for example, multiplying the bytes together, or any desired lossy algorithm that is sub-exponential in its erasure of information – then the preimage generator guesses which order the bytes were in, and finally, the public key is used to set the decryption algorithm and each order of bytes is tested. The iteration between the preimage generator and the decryption algorithm does not seem to relativize, because the time taken to run the decryption algorithm is taken to be part of the runtime of the overall algorithm – that is, it’s not an oracle – but you wrote the book on that, not me. By all means please correct me if I’m (still!) wrong.
Hope this is still on-topic and, again, sincere apologies for taking up your time. I just wanted to correct that before the universe did, and I thought it was important motivation for why it matters to determine whether a system is in fact fundamentally deterministic.
Comment #397 January 29th, 2022 at 8:37 am
Vladimir #387: Sure. High-energy people are usually interested only in describing the (unitary) scattering matrix, leaving the measurements at the end implicit. Nobody cares about the post-measurement state. There is some work in modelling detectors with QFT, such as the Unruh-DeWitt detector, but they are usually studied in isolation. Nobody uses them to calculate what happens in a particle collider.
In contrast, low-energy people like me usually model the measurements explicitly with PVMs or POVMs, and care about the post-measurement state: sometimes of a system in isolation, sometimes the state you get after measuring part of an entangled state.
In a nutshell, low-energy people use collapse in practice, and high-energy people don’t use collapse in practice.
Comment #398 January 29th, 2022 at 9:00 am
Dimitris Papadimitriou #392: “Standard” quantum mechanics is so vague that it is hard to even criticize it. I prefer to focus on interpretations that are at least clear about what is going on, like Bohmian mechanics or collapse models. Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead. They have a Strawberry Fields attitude: nothing is real, nothing to get hung about.
I don’t see the point of referring to Everett’s original version. We are physicists, not historians. Keep in mind that Everett’s paper was castrated by Wheeler in an (unsuccessful) attempt to make it palatable to Bohr. His PhD thesis is much better, although not entirely satisfactory, as Everett didn’t know about decoherence. Saying that it is similar to many-minds is an empty insult.
Branching occurs at the speed that decoherence spreads. It may be slower than light in some situations, but never faster. I find it rather amusing that you think that local branching is somehow incompatible with QM. The apparent nonlocality in EPR experiments is only there because of the nonlocal collapse. Many-Worlds’ whole point is not having a collapse! There is no nonlocality that could be incompatible with local evolution. As for the specific mechanism that generates the Bell correlations locally, I recommend this paper by Brown and Timpson.
Comment #399 January 29th, 2022 at 9:09 am
Mateus Araújo #398:
Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead. They have a Strawberry Fields attitude: nothing is real, nothing to get hung about.
I’m going to have to quote that when I teach!
Comment #400 January 29th, 2022 at 9:42 am
Dimitris #392, Mateus #398, Scott #399
“Copenhagen and QBism deny that quantum states are real and refuse to say what is real instead.”
Taking this refusal as an argument that MWI is the only alternative is a caricature. You can perfectly well hold that systems within contexts are the real physical objects, and that the usual quantum states (psi) are mathematical tools allowing us to calculate probabilities of real measurement results, for real systems in real contexts. All QM is about (non-classical) probabilities, and MWI is an extravaganza trying to claim that probability amplitudes are real, instead of simply admitting that what is real are physical events, objects and properties.
Comment #401 January 29th, 2022 at 9:49 am
There is a theory that the Big Bang arose from quantum fluctuations due to Heisenberg uncertainty arising in a vacuum, thus allowing something to be created from nothing.
Perhaps that is the “why” – it’s because quantum mechanics is the only thing that truly allows for something from nothing.
Comment #402 January 29th, 2022 at 10:08 am
han #401:
There is a theory that the Big Bang arose from quantum fluctuations due to Heisenberg uncertainty arising in a vacuum, thus allowing something to be created from nothing.
The problem is that that’s an egregious misunderstanding! Even if the Big Bang did arise as a vacuum fluctuation, the vacuum (which is itself an extremely complicated object in QFT) would’ve previously existed, and its own existence would remain unexplained.
QM simply doesn’t help at all, at least on its own, with the “ur-mystery” of why there’s something rather than nothing.
Comment #403 January 29th, 2022 at 10:22 am
Chris W. #248:
it’s surprising for a layman like me that it’s an open question whether the universe is deterministic (Anbar #234, Scott P. #241).
Is there some flaw in the reasoning “everything in the universe has QM state => the whole universe could be described as one QM state, which evolves deterministically according to the Schrödinger equation”?
As usual, it all comes down to definitions—not to some advanced physics that you as a layman don’t understand. The Schrödinger equation is deterministic; the Born rule for measurements is not. But is measurement actually a fundamental part of the laws of physics (as in Copenhagen and dynamical-collapse theories), or is it just an artifact of our experience as observers (as in MWI and Bohm)? Also, in the latter case, are there nonlocal hidden variables that restore the “determinism” even of the measurement outcomes (as in Bohm), even though we can never exploit that determinism to make predictions in practice?
Comment #404 January 29th, 2022 at 10:31 am
Johnny D. #249:
The Schrödinger equation allows for both static and dynamic states. Static states are only possible because of a sleight of hand that makes phase irrelevant in the Born rule. This goes a long way toward answering “why QM”: it requires a 2d wave function with something like a phase and a modulus. Dynamic solutions are those that are superpositions of static states. Superposition and multiple degrees of freedom imply the tensor product.
Static solutions exist as eigenstates of a Hermitian Hamiltonian. These eigenstates exist because the operator is over the complex numbers. The exponential of i times a Hermitian operator is unitary.
Can you create a classic system with static and dynamic states?
That’s a superb question. It reminds me of Boddy, Pollack, and Carroll’s quantum-mechanical resolution of the Boltzmann brain problem, namely that (in many cosmological models) the extremely late universe is just going to be sitting in an eigenstate of the Hamiltonian doing nothing, and it’s a misunderstanding of QM to think that Boltzmann brains will infinitely often be “fluctuating into existence” out of that eigenstate, since there won’t be any observers or measuring devices around to measure the fluctuations.
But I can raise the same objection to you that I raised to Boddy et al. at the time: namely, just as Hamiltonians have eigenstates, it’s equally true that classical stochastic evolution laws (i.e., Markov chains) have stationary distributions! The only difference, it seems to me, is that it feels more tempting to regard a quantum-mechanical eigenstate as the “actual reality of what’s going on” (namely, nothing) than it is to regard a stationary distribution like the Gibbs distribution in the same way. For better or worse, people are tempted to regard a Gibbs distribution as merely an expression of human ignorance; if we knew the exact position and velocity of every atom, they point out, we’d see that the state wasn’t “stationary” at all, but frenetically jumping around.
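The classical analogy invoked here is easy to exhibit: a Markov chain's stationary distribution plays the same role as a Hamiltonian eigenstate. A minimal sketch, using a made-up 3-state chain purely for illustration:

```python
import numpy as np

# A made-up 3-state Markov chain; columns sum to 1 (column-stochastic),
# so P @ p maps probability distributions to probability distributions.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# The stationary distribution is the eigenvector with eigenvalue 1,
# normalized to sum to 1 -- the classical counterpart of "the late universe
# sitting in an eigenstate of the Hamiltonian, doing nothing".
vals, vecs = np.linalg.eig(P)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

assert np.allclose(P @ pi, pi)   # applying the evolution law changes nothing
```

Whether one reads the fixed point `pi` as "reality, in which nothing happens" or as "mere ignorance over microstates that keep frenetically jumping" is precisely the asymmetry discussed above.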
Comment #405 January 29th, 2022 at 10:42 am
Jacob #252:
A question similar to your question 1 that I would love you to discuss: given that the laws of physics are so complicated, why can they be so well approximated by something so simple?
“Why doesn’t Newtonian physics work?” doesn’t strike me as a terribly interesting question – there’s no reason to suppose it should.
But “why do they almost work?” seems much more puzzling.
There’s a large part of your question that’s just straightforwardly physics rather than philosophy! I.e., if you assume the more recent theories, you can derive and explain why the earlier theories worked to such an excellent approximation—as, for example, Newtonian gravity was recovered as an approximation to GR, or classical electrodynamics as an approximation to QED. Indeed, those derivations were a central part of what the new theories had to do in order to be accepted in the first place!
But there’s a revised version of your question (also asked, I think, by one or two other commenters here) that survives and that I find extremely interesting. Namely, whatever is the most fundamental theory of the physical world, why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience? Is there an anthropic story to tell about that? (Probably yes—there usually is 🙂 —but should we actually believe it?)
Comment #406 January 29th, 2022 at 11:01 am
Pedro #393:
Isn’t a gauge theory effectively just “We’re neglecting other forces and effects and constraining ourselves to particular configurations to ensure the neglected artifacts are negligible, and our theory is based on predicting the outcome of the thus-constrained system”? (Except more mathematically rigorous.)
Under that understanding, in what sense does nature care about our mathematics? We’re trying to minimize the effect nature has.
Comment #407 January 29th, 2022 at 11:42 am
Crackpot #406: No, Pedro #393 has a point. The central importance of gauge forces is the strongest argument against my position that symmetries are “just a thing that you mod out by” (i.e., more about our description than about the world). Gauge forces, however, seem to me to be about more than just symmetry: they arise from making an a-priori bizarre-seeming assumption that a certain internal symmetry holds separately at every point in space, even though the only way to prevent a contradiction when someone travels around a closed loop is then to introduce a new long-range force. And somehow this same trick works over and over. Someday maybe I’ll write a blog post about it … when I understand it.
Comment #408 January 29th, 2022 at 11:54 am
I am late and not particularly knowledgeable, but IMO Q1 is just a specific case of the “why are so many of our conditions so perfect for us to exist in” question that is best answered by the anthropic principle.
Just like we need some form of random mutation and recombination for evolution to work and select for people who can talk about evolution, we need some form of uncertainty to create heterogeneity in the universe to allow statistically unlikely but critically important (for us) events.
Why quantum mechanics? Well, why DNA? Why gravity? There are probably lots of other ways things could have worked, but if they produced conditions for sentience, we (or our protoplasmic counterparts) would be asking “why froblits? Why BNM2? Why the general charge field?”
We can’t know whether quantum mechanics is the only way for a universe to work. We just have to be comfortable with that uncertainty.
Comment #409 January 29th, 2022 at 11:59 am
Scott #268
Sorry I only saw your reply today re Born Rule.
I think you must be misunderstanding me: Schrödinger evolution preserves |psi|^100654444222 (for example).
Of course this is dumb, but then so, maybe, is our Universe…
If you had a proof of the Born Rule from the other common axioms of Quantum Mechanics, you would get a Nobel Prize, I think…
Actually, even the Bohmians have a neat argument that whatever rule the Universe started out with, it would very quickly evolve to the Born Rule, much as a classical dynamical system evolves to its natural invariant measure as an attractor. But then you could still have the Universe restricted to the subset of trajectories that evolve to the |psi|^100654444222 invariant subset; there’s no obvious mathematical or physical reason to enforce the whole set of |psi|^2 invariant trajectories in the system.
(Of course, this is not evidence of an Anthropic Universe in the usual sense of fine-tuning considerations; it is just easy to see that the Born Rule would be very common in an Anthropic scenario (as the least “dumb” choice), so there is no need for design arguments or mathematical perfection underlying our Universe.)
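For readers wondering what singles out the exponent 2 at all, here is one standard observation, sketched purely as an illustration (it is not a proof of the Born rule): unitary evolution preserves the sum of |amplitude|^2 over components, but generically not the analogous sum for any other power, as even the humble Hadamard gate shows.

```python
import numpy as np

# The 2x2 Hadamard gate, about the simplest nontrivial unitary.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

psi = np.array([1.0, 0.0])       # a basis state
phi = H @ psi                    # (1/sqrt(2), 1/sqrt(2))

pnorm_sum = lambda v, p: np.sum(np.abs(v) ** p)

# The sum of |amplitude|^2 is preserved (both equal 1)...
assert np.isclose(pnorm_sum(phi, 2), pnorm_sum(psi, 2))
# ...but the sum of |amplitude|^4 is not: it drops from 1 to 1/2.
assert np.isclose(pnorm_sum(psi, 4), 1.0)
assert np.isclose(pnorm_sum(phi, 4), 0.5)
```

So a |psi_i|^p rule with p other than 2 would fail to assign total probability 1 after a generic unitary step, unless the dynamics were restricted as the comment above suggests.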
Comment #410 January 29th, 2022 at 12:03 pm
To Scott #403, answering Chris W. #248; see also PG#332 and #400, and Mateus#338
In the last Växjö conference (August 2021) I felt that there was a consensus on some kind of trade-off:
– if you insist on determinism (or at least classical randomness, based on ignorance) then you must give up locality in a rather strong sense, called elementary locality in #332. This is typically Bohm’s version of QM, and close to Bell’s ideas.
– if you admit some kind of non-classical randomness, then you can save locality, or at least live with a much weaker form of non-locality, related to what I called predictive incompleteness in #332 (see also #338). This is my preferred version of QM, you may consider it as neo-Bohrian, despite my claim of psi being incomplete. Contextuality is essential here, and also accepting that contextual inferences are NOT nonlocal influences.
– MWI claims to keep both determinism and locality, the price being the extravaganza quoted in #400. What surprises me most here is the claim that the universal psi should be ‘real’, contrary to any empirical evidence. If MWI held that psi only speaks about probabilities, and that ‘branching’ is simply updating these probabilities, I would find it much more acceptable.
– there are many other options, including the ones based on ‘agents’ beliefs’ and further away from any form of physical realism, but they were not strongly represented at that conference.
I guess everybody reading these lines already knows all these options, leading us to admit that God did not do a better job of reconciling physicists (and computer scientists) than he did with religions…
Comment #411 January 29th, 2022 at 12:05 pm
@Scott #365
>> we all but know the answer to that one
Well, then tell us: how many worlds did He use for that?
Minor issue, I know …
… and how did the initial quantum state (encoding universes of different size etc.) decohere?
Comment #412 January 29th, 2022 at 12:23 pm
My understanding of QM is too rudimentary to possibly offer any insights above what others will have, so I offer you this instead:
When I arrived in the afterlife and met God, I asked him this very question – well, I say “when” and “in” and “met”, though really the afterlife is beyond time and space, and it’s more accurate to say God *is* the afterlife, but I digress – I asked, why quantum mechanics? Why this seemingly unnecessary complexity? And, being now joined with God I directly perceived the answer, which I now lay out in a narrative form suitable for Earthly communication.
At first God did try to make universes based on simple rules – simplicity is beautiful to God too. And for sure, cellular automata seemed a sensible way of doing this, but have you ever glanced over the proof of Turing completeness for such systems? Simple rules simply shift the complexity of a calculation into the initial state, and massively complex and detailed structures are required to make even the simplest of calculations. Producing sentient beings in such universes required an unfathomably large and special set of initial conditions, the probability of sentience arising from random conditions is vanishingly small. Of course God made some infinitely large universe of this sort to get around this – only it just seemed too wasteful.
So God moved on to universes with more elaborate laws, laws just complex enough to allow the spontaneous and regular emergence of sentience. These laws turned out to be much like what you and I know as classical mechanics, or variations thereof. But the inhabitants of such universes quickly reached a point where they had discovered all the laws – and after that, well, they got bored. Weltschmerz. There was no sense of wonder, nothing left to debate, no uncertainty, no further progress to be made, no problems to solve, and in consequence, none of the higher planes of happiness that God wishes for his creations.
And so God experimented with more complex universes. Too much complexity proved troublesome as well – the sentient beings that emerged were unable to solve anything, and gave up striving to understand. But somewhere in between there lay a sweet spot – where the sentient beings could perpetually live in wonder and debate and strive forward, attaining the satisfaction of discovery and understanding, but never the ennui and dissipation of having solved everything.
Thus the beautiful irony of my question – why is the universe quantum mechanical? Simply so that I could wonder why.
Comment #413 January 29th, 2022 at 12:25 pm
Scott #405
« Why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience? »
Tentative answer: because historically it went just the other way around, i.e. making better and better experiments, built upon the previous step, and requiring better and better approximations? And this is called the progress of science?
Comment #414 January 29th, 2022 at 12:26 pm
Scott #399: Please do =)
Philippe Grangier #400: That wasn’t an argument for Many-Worlds, it was an argument against Copenhagen and QBism. I did give arguments for Many-Worlds, spread around other comments: that it follows from a literal interpretation of the Schrödinger equation as true, and that it is the only way I know to make sense of probability and Bell nonlocality.
I don’t see how saying that “systems within contexts” are real instead helps with anything. Do these systems have a mathematical description? It can’t be a quantum state, as you claimed that they are not real. What is it then? Just an informal notion? That doesn’t cut it. You do need to provide a precise mathematical description of what is real, otherwise you’re just producing yet another Bohrian smokescreen.
Comment #415 January 29th, 2022 at 12:50 pm
If you demand locality, then rigid bodies are out of the question and you are left with nothing but point particles (or strings?). But Newtonian point particles will never interact because the collision probability would be zero. To get around that, you need something like Feynman paths.
On the other hand, continuous fields like you have with, for instance, Navier-Stokes, are local and non-quantum. Perhaps they aren’t interesting enough due to not being in a high dimensional space like QM. Perhaps the dimensionality of state space doesn’t cost anything and QM is favored because the equations are somehow simpler than something like Navier-Stokes.
Ultimately, I think there is nothing that mandates QM in a strict sense. We could just as well have been living in Conway’s game of life. That probably could support evolution if played on a large enough board for enough time. Perhaps a three dimensional version, in order to increase the compute density.
If we are to take seriously Max Tegmark’s mathematical universe, perhaps all these alternate theories are physically real. Every differential equation or cellular automaton you can think of is quietly chugging along in Plato’s space of Forms. Some of them support life, and of those some have a Scott Aaronson asking why we’re living in this particular one.
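For concreteness, the Game-of-Life universe invoked above really is just a few lines of update rule; here is a minimal toroidal implementation (in Python, purely as an illustration of how little "law" such a universe needs):

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a toroidal (wrapping) grid."""
    # Count the 8 neighbors of every cell via wrapped shifts.
    n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    # Birth on exactly 3 neighbors; survival of a live cell on 2 or 3.
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

# A "blinker", the simplest oscillator: it returns to itself after 2 steps.
g = np.zeros((5, 5), dtype=np.uint8)
g[2, 1:4] = 1
assert np.array_equal(life_step(life_step(g)), g)
```

The contrast with the comment's point is striking: the entire "physics" fits in one function, and all the complexity lives in the initial conditions.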
Comment #416 January 29th, 2022 at 1:03 pm
The number of comments has gone to 400+ already! It would be impossible to read through all the preceding comments and only then write my reply. … In fact, I am going to save this discussion and think my way through all the (valid) points of all the comments slowly, over a period of time. But for the time being, let me write something directly in reference to the main text itself.
—
> Q: Why should the universe have been quantum-mechanical?
We say that the universe is QMcal, but only after (i) taking into account all the available evidence, including the most general phenomenological knowledge and experimental data, and then (ii) *inducing* a theoretical explanation which arises from, and is consistent with, the former.
If tomorrow the sum totality of the observational knowledge (including the concrete experimental data) changes, we might have to say that the universe is not really QMcal after all, that it has some other (as of today unknown) sort of a nature. We will then try to find *that* theory.
It’s all a matter of finding a theory that *consistently* explains *all* the evidence which is known at a given point of time.
—
> Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?
Essentially, the answer is that some explanation or other (e.g. a prediction) from the theory would be wrong, that it wouldn’t be consistent with how the world actually works. The ultraviolet catastrophe, for instance. The catastrophe does not occur in the real world out there; it’s merely a feature of the “classical” Maxwell-Heaviside-Lorentz theory.
An important point here, however, is that there is no such thing as a single, over-arching “classical” theory. There are many different classical theories, each of which applies to its own set of phenomena and / or ranges of observations.
There was the Newtonian mechanics of particles, which applied to electrically uncharged, finite-sized objects that interact only via direct contact at their respective bounding surfaces. The qualifications used in the previous statement were not known at the time; they were discovered subsequently, over a period of time. But already in Newton’s own time, in fact in his own theory of gravity, the elements of the theory *failed* to conform to the ontological model of his own mechanics. In Newton’s law of gravity, you have objects that aren’t in direct contact, and yet manage to exchange forces (of gravity) via an instantaneous action at a distance (IAD for short). So Newtonian gravity already is a non-local theory, even if no one today would doubt that it is a “classical” theory. Then there also is Fourier’s classical theory of diffusion (say, the conduction of heat). It again is a non-local theory, and yet very, very classical. Electrostatics is yet another classical but non-local theory (it has IAD).
So, contrary to the present-day discussions, there was no such thing as *the* “classical” theory, no single ontological viewpoint / theory which remained applicable to *all* the pre-quantum theories. Why, even the geometrical theory of optics is useful (in certain contexts), but has no wave nature.
In fact, even in “modern physics”, you often do a mix-n-match of different theories.
Schrödinger’s wave mechanics is not relativistic. But using it still gives you a damn good estimate of the bonding energy in the case of the smaller atoms. For the helium atom, in fact, the difference in the bonding energies predicted by the relativistic and non-relativistic QMcal theories turns out to be just a fraction of a percent (less than 0.1 *percent*, IIRC). And closed-form solutions are not available, so the error due to *not* using a relativistic theory anyway gets completely lost in the jungle of the numerical errors alone. (A “jungle” it is, not a “forest”.) In fact, to back out a bit, even the Born-Oppenheimer approximation itself is just that — an approximation. But we use it.
—
> Q2: Assuming classical physics wasn’t good enough for whatever reason, why this specific alternative? Why the complex-valued amplitudes? Why unitary transformations? Why the Born rule? Why the tensor product?
That’s a very good set of questions. (Phrased way, way, better than Q1!) I will make sure to answer this second set when I come to writing / explaining my new theory (of the non-relativistic QM).
Already, we discussed the complex-valued nature of solutions on this blog back in 2018 (and recently, I also discussed it at Dr. Roger Schlafly’s blog). So, that’s out of the way, in a way. So, I don’t think I would even include a discussion of this feature in my paper. Enough to say that solutions in my approach remain complex-valued too.
In fact, you can think of my new approach as providing a layer underneath the postulates of the mainstream QM, a mechanism which explains how those postulates come about. So, all these (and similar) questions are very relevant / pertinent, but not each of them might actually get discussed in my paper — not in a detailed way, anyway.
And, BTW, that precisely is the reason why I’ve been saying that I need to have some informal interaction with a physicist proper, say a prof or a post-doc of physics (or even a researcher in the QC field who has thought about the foundational aspects) so that I know what all points to include, and what all to leave out, in the published version of my paper(s). In the absence of such interactions, I could go on writing many things that are too obvious to the intended audience (the physicists proper) — I am talkative. And yet, I may perhaps end up not including discussion of points that are very obvious to me but may not be so to others. (Like, e.g., the complex-valued nature of \Psi.)
—
Last minute addendum:
My theory is deterministic but nonlinear, thereby leading to an “irreversibility” of the SDIC sort, i.e., of the exponential-divergence sort. (SDIC means: Sensitive Dependence on the Initial Conditions.)
In general, IMO, it is high time that people made a distinction between a *law* of physics and the behaviour of a *system* whose elements obey that same law. A differential equation (or a set of them, or a mechanism formulated using differential terms) is deterministic. *Always*. But a system which is composed of elements each of which obeys that same deterministic law may not itself behave predictably. Newton’s three laws of motion are deterministic, and so is his law of gravity. These apply to particles. As to the systems: the 2-body system is exactly solvable, but the 3-body system already is chaotic, even though it is based on the same deterministic laws. And Newtonian systems can even be genuinely non-deterministic: Prof. Norton has been studying the non-deterministic nature of systems based on Newton’s laws and has many interesting points to note at his site, even at a level that is accessible to the layman (e.g., Norton’s dome).
As to the in-principle stochastic theories, well, *all* of them pertain to *systems*. All of them may be regarded as mere approximations which, by their starting point itself, choose to leave out a lot of information.
For instance, the kinetic theory. It leaves out the detailed description of the individual gas molecules (regarded as “particles”), simply because there are too many of them. But the theory is useful. “Useful” doesn’t mean “fundamental”. The kinetic theory gets going by partitioning the system state, and then populating those partitions with groups of a large number of particles (and it in fact uses real numbers, not integers, to denote the number of particles in a group!). Noteworthy: the information pertaining to the instantaneous *position* and *speed* of an individual particle gets retained in some way, even if only in an aggregated form, but the information pertaining to the *direction* of motion of each individual particle gets lost in the process. Have a careful look at this point. If you do, understanding many puzzles becomes easy and immediate, whether it be reversibility and time’s arrow, the Poincaré recurrence, etc.
So, overall, the point of this addendum is this: People should make sure to know what they are talking about. Is it the basic law (which governs the *elements* of a system)? Or is it the behaviour of a *system* (i.e. an assemblage composed in a complex way from those elements, whether the complexity is due to the nature of their interactions, or due to their sheer number, or something else)? IMO, this is a highly relevant point, but the modern / present-day tendency is to gloss over it. Fundamental physical laws are deterministic, and yet can lead to a chaotic (or even fully non-deterministic) behaviour at the level of systems.
Best,
–Ajit
Comment #417 January 29th, 2022 at 1:08 pm
Philippe #413:
That’s missing the crux of the question though: why are the laws of nature even *approximable at all* at the macroscale?
When my students implement simulations of physical systems and get the laws wrong—flip a sign, drop a term—they don’t get new, interesting chemistry and physics. They get complete chaos. Somehow the laws of the universe are such that they homogenize incredibly well at different length and time scales. (Not perfectly—even classical mechanics has its monsters, like turbulent flow—but well enough that it allowed rational exploration of science to develop in the first place).
A steel spring has got its lattice of carbon and iron atoms, its dislocations, its delocalized cloud of electrons; nucleons and their quantum states and their quarks and strings and who knows what else—but at the human scale all of that melts away into the F = kx that even high school students can understand. It’s totally unclear to me why that must be the case.
(You can argue that locality of physical forces, plus Taylor’s theorem (everything is quadratic if you don’t perturb it too much), leads to Dirichlet-like energies which promote smoothness across scales… maybe… but that doesn’t come close to fully resolving the question).
Comment #418 January 29th, 2022 at 1:11 pm
Mateus #414 : « You do need to provide a precise mathematical description of what is real, otherwise you’re just producing yet another Bohrian smokescreen. »
Currently my best answer to your demand is https://arxiv.org/abs/2003.03121 , published as Found. Phys. 51, 76 (2021). It includes both systems and contexts in a unified algebraic framework, inspired by von Neumann’s paper quoted in #301. But it’s not a final answer yet, since it is essentially a ’static’ picture without time evolution.
For MWI please see #410, unfortunately I don’t buy the idea, which also clashes with the above von Neumann paper in the asymptotic limit of a countably infinite number of particles (in this limit there cannot be any universal psi, due to sectorization).
Comment #419 January 29th, 2022 at 1:34 pm
Philippe Grangier #410: The challenge remains, as I mentioned in my comment, to actually produce this “mild” nonlocality in a realist model of quantum mechanics without a flagrant violation of relativity. Otherwise there’s nothing mild about it. The only way I know how to do it is with Many-Worlds. I believe it’s not possible to do it in a single world, but I’d love to be proven wrong.
Comment #420 January 29th, 2022 at 2:03 pm
“An exponentially larger state space for all of reality…”
While I know what you mean by this, is that necessarily the right way to think of it? After all, there’s this lecture…
Also regarding Q1, it might be relevant to note there are results saying that QM is in some sense a unique deformation of classical mechanics into a non-equivalent stable structure, and the instability of classical mechanics is strongly connected with its degeneracy in this framework (concretely, the exactness of pure states).[*]
E.g. https://arxiv.org/abs/math/9809056 (and the 1977 Bayen et al paper referred to within).
[*] This is a paraphrase of the summary of these results by Ludwig Faddeev, p.517 of Quantum Fields and Strings, a Course for Mathematicians vol.1. I have not read the 1977 paper.
Comment #421 January 29th, 2022 at 2:12 pm
Annnnd … while we may or may not have gotten closer to explaining quantum mechanics from some deeper ur-principle … I can now report that I’ve injured the same foot a third time, once again while playing with my kids. (I resolved to be more careful, but they begged me to be “It” in tag until I finally gave in, thinking the foot must be OK and healed now, and … you know the rest!) I’ll find an orthopedic doctor, of course, but I’m resigned to the likelihood that this is just a permanent diminution in my quality of life, and I’ll be more-or-less hobbling from here till the end. Well, less exercise, so in expectation probably less time left, but possibly more time each day to think about QM!
Comment #422 January 29th, 2022 at 2:35 pm
Stewart Peterson #386
I’ll just take the first question. You asked:
“I certainly understand that the mathematical structure of quantum mechanics can be described as an abstract operator calculus, but does this mathematical structure necessarily result in correct physics? That is to say, can correct predictions about physical systems be made from this derived, purely theoretical, result, with only the physical constants filled in by experiment – analogous to the fact that special relativity can be derived from Maxwell’s equations?”
In Chapter 9 of his Quantum Computing Since Democritus class notes, Scott gives the best answer:
“So, what is quantum mechanics? Even though it was discovered by physicists, it’s not a physical theory in the same sense as electromagnetism or general relativity. In the usual “hierarchy of sciences” — with biology at the top, then chemistry, then physics, then math — quantum mechanics sits at a level between math and physics that I don’t know a good name for. Basically, quantum mechanics is the operating system that other physical theories run on as application software (with the exception of general relativity, which hasn’t yet been successfully ported to this particular OS). There’s even a word for taking a physical theory and porting it to this OS: “to quantize.” But if quantum mechanics isn’t physics in the usual sense — if it’s not about matter, or energy, or waves, or particles — then what is it about? From my perspective, it’s about information and probabilities and observables, and how they relate to each other.”
In other (my) words, the postulates of “quantum mechanics” do not (necessarily) give a physical theory. So, it does not “result in … physics” or “result in” anything. QM is a class of computation. And as a class of computation, there will be computational characteristics that go along with using it. So, think of it like a programming language that requires systems to be represented in a specific way (as a vector in C Hilbert space) and requires a (Hamiltonian) operator to specify the evolution of the system (this is how physicists “use” it). There is no physics “in” the class of computation a priori – again, just consequences of the characteristics of its computational class, characteristics of complex numbers, linear operators, etc.
For example, Planck’s constant is not “predicted” by the postulates of QM. It gets introduced if we want to use Planck’s energy formula to talk about energy – ie develop the more refined wave picture in continuous time where h must be experimentally determined. (See Nielsen and Chuang’s Quantum Computation and Quantum Information, Sec. 2.2.2)
As for making correct predictions, I would put it like this: QM is the most successful computational class we know for constructing predictive physical models.
Now, all of that being said, we did “get more than we originally bargained for” from the formalization of QM, two examples being the experimental confirmation of the violation of the Bell inequalities and the (very recent) experimental confirmation of the requirement of complex numbers.
https://www.pourlascience.fr/sd/physique/l-intrication-quantique-confirmee-par-une-experience-de-bell-sans-faille-12185.php
https://www.nature.com/articles/s41586-021-04160-4
Whether one wants to call those results “predictions about physical systems” or “characteristics of the computational class”, … there are competing … interpretations 😉
Comment #423 January 29th, 2022 at 2:55 pm
Scott,
Sorry to hear about your foot. I remember when I hurt my foot years ago, it seemed to take ages and ages to heal. It might just be a matter of time.
I have a different take on the quantum randomness issue. I think that an awful lot of nonsense is talked about the supposed randomness of individual quantum outcomes.
Surely the MORE important points about individual quantum outcomes are:
1. The system, or a part of the system (e.g. a particle), has taken a definite step (seemingly in response to a situation).
2. This definite step would be modelled (e.g. in computer program) as the assignment of a number to a variable.
3. The fact that the number can’t be predicted by an observer is not as relevant as points 1 and 2.
Comment #424 January 29th, 2022 at 3:30 pm
In physics, it is always necessary to begin with some basic postulates. A good postulate is one which we do not feel inclined to question; we are willing to simply accept it as something that happens to be true of the universe we live in, and we feel no need to ask “why is it true?” (Feynman explains this point with characteristic charm here.)
As an example, the physics community could have postulated the Lorentz transformations: they could have said, let’s just take length contraction and time dilation as given facts. But when Einstein showed that they could be derived from two other postulates (constancy of light speed and the relativity principle) physicists mostly agreed that those postulates were a much better starting point. But why? Why does nobody go around demanding to know: “why the relativity principle?” but we do go around asking “why the quantum?”
In contemplating Q1, we ought to first ask ourselves why we can’t just accept the “quantumness of the universe” as a postulate that does not require further explanation. (I’m not suggesting that it would be obvious how best to formulate this “quantumness” as a postulate — that would still be an open question to be debated as part of Q2. The key issue in Q1 is why ANY postulate representing “quantumness” should be needed in the first place).
For us to accept something as a postulate, we must feel comfortable with not interrogating it with further “why?” questions. That means it must be a statement about the nature of the world which fits comfortably with our implicit framework of thought (world-view).
Physics operates always within a world-view that is historically and culturally conditioned, which tells us which concepts demand an explanation, and which ones can be accepted without question. If we feel that the “quantumness” of the world is in need of explanation, it is only because there is something about it that does not fit easily or comfortably with our implicit world-view.
As a rule, the world-view is always the last thing to change. If we cannot explain a phenomenon using postulates that fit nicely into our world-view, we just keep on trying. If it persists, we call it an “anomaly” and try to get away with ignoring it. And if we cannot ignore it, then finally, reluctantly, we look for an alternative world-view in which we can find new postulates that explain the phenomenon. The “relativity principle” would not have been accepted as a postulate at just any time in human history. The way had to be paved for its acceptance by the work of many thinkers (Galileo and others) who helped create a world-view in which such a postulate would be acceptable.
In the present case, the first step is to ask ourselves exactly why our present world-view is one that happens to accommodate a “classical” model of the world more comfortably than a quantum model. Is it because of the inherent merits of a “classical-friendly” world-view, such as its elegance, simplicity, or “naturalness to the mind”? Or is it an accident of the particular historical and cultural moment in which physics is presently embedded, which narrows the physicist’s imagination to exclude any “quantum-friendly” world-view?
If we are satisfied that our world-view stands up on its own principles, then it makes sense to hold on to it and ask: “why the quantum?” But if we suspect our present world-view might be too narrow (incidentally, this is not an unreasonable suspicion, given that the physicists who shaped the present world-view are disproportionately white men raised in the tradition of Western philosophy) then we should instead ask: “how do I change my world-view, so that the quantumness of the universe might fit in comfortably as an unquestioned postulate”?
Comment #425 January 29th, 2022 at 3:54 pm
Scott #420:
You’ve helped me in this thread; I hope this helps you. You may already know all this, but in case you don’t, here goes.
It takes a while to get on an orthopedist’s calendar. In the meantime, an experienced physical therapist (a real one, not a chiropractor), specializing in sports medicine, has seen just about every type of ankle injury, and probably much worse ones from dealing with football players and so forth. They may be able to prevent scar tissue and/or bone spurs from forming, which will make surgery (if you need it) less complicated and more likely to succeed, and will make rehab easier as well. Take it easy but don’t immobilize it; try to keep everything as loose as you can, with lots of motion and very low forces. It is, however, very important to deal with this quickly.
Signed,
A guy who tore a ligament in high school, had parents who didn’t think time was of the essence in dealing with it, and hasn’t been quite the same since. (Please take it from me – don’t do what I did!)
Comment #426 January 29th, 2022 at 4:20 pm
Has anyone said that the reason for QM is to make the universe maximally impenetrable/misinterpretable for those who do not make a living studying it? 🙂
Comment #427 January 29th, 2022 at 4:28 pm
Mateus #419
Please look at the light-cones picture in Fig. 1 of https://arxiv.org/abs/2012.09736; I don’t see any flagrant violation of relativity there. But again, you have to admit that a contextual inference is only an inference, and not an influence. Deciding how ‘mild’ it is is up to you…
Comment #428 January 29th, 2022 at 4:46 pm
I am delighted by these questions and I agree they are important and perhaps answerable. I’m very interested in your take on them. Still, as a theoretical physicist I have to say that I expect especially the second one to be the wrong question (although quite close to the right one). I expect the question should be “Why do we perceive physics to be governed by quantum mechanics?” I guess you want to exclude discussion on this topic (point 4), but I think it should be considered: not as a way out of having to throw away determinism, but because every theory of physics we have ever had turned out to be only an “effective theory”. Pursuing this direction does not have to mean throwing up your hands in wonder and asking what if some beautiful other theory exists that we don’t know yet. Often useful classifications can be made even from a position of relative ignorance. From the infinite space of possible quantum field theories, we can understand how, in the low-energy experiments we are able to do on Earth, most of them end up looking the same. Physicists first computed things in the simplest cases just because there they could do the computation. Only later was it understood that if nature was not that simple, it would still come out looking exactly the same, at least at energies we can reach on Earth. It could very well be that something similar is going on in the class of possible “quantum-like” theories. The simplest theory isn’t always right (see Q1). But it might be that more complicated theories look the same in some regime that we happen to be living in. The guiding principles of effective field theory could very well inspire a useful approach toward classifying the space of possible quantum-like theories.
Comment #429 January 29th, 2022 at 4:47 pm
Scott #420
Scott, just wait a few more years, and you will get grandchildren ! They are a lot of fun also, even if you don’t run as fast as before. And since their parents take care of them, you have only the good part of the game, and a lot of time to think about QM !
Comment #430 January 29th, 2022 at 5:08 pm
Scott, thanks for the clear answer regarding various definitions of determinism w.r.t. the universe!
There is one particular question about Q1 that’s stuck in my engineer’s brain (“how to engineer a universe” 😉 ):
Maybe it is just not possible for creators to make a universe with an arbitrarily complex rule set?
The idea (by the way, is there a way to prove that there is always at least one of the many worlds in which you don’t make a fool of yourself?):
Of course this all is based on the assumption that for any complex universe the necessary classical rule set would be larger than a quantum one.
Another motivation for the “complexity of underlying rules restriction” speculation is my understanding of the universe’s initial condition (singularity), i.e. the absence of any complex structure.
Comment #431 January 29th, 2022 at 5:36 pm
“Why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic, and that come closer and closer to the world of everyday human experience?”
I think this doesn’t even need an anthropic argument. Consistent complicated theories with many parameters usually have simpler consistent limits when some parameter is taken to 0 or infinity. It is much easier to come up with a simple theory than with a very complex one. Thus, we first understand physics in certain limits.
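A textbook instance of such a limit (my example, not the commenter’s): expanding the relativistic energy in powers of $v/c$ gives

$$E = \frac{mc^2}{\sqrt{1 - v^2/c^2}} = mc^2 + \tfrac{1}{2}mv^2 + \tfrac{3}{8}\,m\,\frac{v^4}{c^2} + \cdots,$$

so as $v/c \to 0$ only the constant rest energy and the Newtonian kinetic term survive: the simpler theory sits inside the more complicated one as a limit, exactly as described above.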
Comment #432 January 29th, 2022 at 6:52 pm
Thinking “aloud” here: if there were some other universe simulating ours, then presumably that other universe would need at least as much computational power as quantum computation, but it could also have much more power. After all, if an applied or experimental scientist uses a classical supercomputer to do a massive amount of linear algebra, they don’t think of it as a “terrible waste of computing resources,” even though linear algebra sits in the small complexity class NC, whereas the classical computer can solve all of the seemingly larger class P in polynomial time. In fact, the scientist thinks of the supercomputer as just the right tool for the job. Structural complexity would seem relevant to the scientist, worth looking into at all, only if it told them that they couldn’t do the computation they wanted to do. So, in the same way, even if God has the power of PSPACE or EXP, She does not necessarily consider it a waste to use that power to run our merely BQP universe.
Comment #433 January 29th, 2022 at 9:11 pm
I will try to answer Q1. Once God decided to make thinking animals like humans out of constituents like atoms and molecules, some billion times smaller than human size, he/she had no choice but to make their laws different from the laws operating in the human world! Humans cannot have any experience of living in an atom-sized world. So human intuition has to be based on everyday experiences in dealing with tables, chairs, trees, and wild animals, which are classical. So he/she had to make laws different from classical laws, and of course they would appear weird to intuitive human thinking. He/she had to find something different from everyday deterministic logic. Non-deterministic means probabilistic. So I can understand God’s design and motivation very well!!!
Comment #434 January 29th, 2022 at 10:35 pm
Clinton #422:
Thank you very much for your thorough and thoughtful response.
I was always taught to “shut up and calculate,” and furthermore, that anyone who didn’t shut up and calculate was a science-denying crazy person. That may have caused me to inadvertently ask: “is there a set of axioms, which we are not allowed to question, and from which we can shut up and calculate?” Obviously, this is not how science works, and I knew that then as now.
The formulae we use to calculate are clearly correct to the best of our knowledge, but they are (historically) the product of inductive reasoning from experiment. Shall I conclude, then, that there is no definition-theorem-proof version of these? Effectively, that the mathematical structure, as far as it has been developed, allows us to construct non-physical models which do not match real-world data, but which are nonetheless internally consistent?
Is it even possible to construct a more-constrained mathematical structure, from which we can shut up and calculate? For example, deriving special relativity from Maxwell’s equations does not tell you, numerically, what the speed of light is. Nor would deriving the Planck energy equation from such an abstract structure tell you, numerically, what Planck’s constant is – but it would give you the equation, which will fit the data you will get once you conduct your experiment, and you can fill in the constants.
Ordinarily, such a research program would not have much value, as it would reveal what we know anyway, and add a layer of formalism that nobody would ever use in practice. However, it would allow us to determine if the equations we use could have been derived, rather than inferred from experiment – if that is the goal.
If the goal is unfalsifiable metaphysics, however, we could just as easily say this:
The thought process which was used to develop quantum mechanics – developing rules through abstraction from data through inference conducted by debate – has cultural roots in Talmudic scholarship. Therefore, “God made the world quantum” so that His chosen people could discover it! If the universe only makes sense if analyzed in the manner of a Talmudic scholar, and no one can come up with a functionally-equivalent system that does not use Talmudic scholarship to develop it, that should demonstrate to everyone, forever, how Judaism is the only correct approach to understanding the world and everyone must come to it or forever wallow in ignorance. (Now, just to be perfectly clear, that doesn’t make anything “Jewish physics.” Once the physics is developed, it’s a physical fact, and anyone of any origin can process the information and understand it. It’s just that it was developed in a cultural system created by Jewish religious scholarship, to which the rest of the world must be forever indebted and which everyone else would be foolish not to use, if it is so effective as an analytical paradigm.)
It depends, then, on what is meant by “why” QM. Why, in the mathematical sense of “from what is it a consequence,” or why, in the metaphysical sense of “what is the deep spiritual truth?” I was addressing the former question, since it seemed to be more concretely useful – see, e.g., the P!=NP proof attempt above – although I am certainly capable of addressing the latter. (I am writing a manuscript along the lines of the previous paragraph, which is unsurprisingly my personal experience. I found secular, “cultural Judaism,” by looking for a worldview that reflected reality. It is responsible, in my opinion, for everything that still works in American society – physics, math, technology, and even some of our military thinking – and the waning of its cultural influence has created our toxic political environment. I believe I can make a very good case that America desperately needs cultural Judaism, before it follows a similar self-destructive path to that which Germany followed when it decided to exclude Jewish culture. We have an advantage: democracy and a free press, through which we can advocate for our values. All is not lost, but we must step up.)
Comment #435 January 29th, 2022 at 10:37 pm
Scott #407:
A symmetry is just a statement of independence, that one property doesn’t depend on another property, yes? (That question is serious. I’m continually baffled by the way people talk about symmetries, like they’re deeply mysterious, that there’s no “reason” for, for example, radial symmetry to hold in a particular case; to me, that’s basically a statement that the default assumption is dependence instead, which seems more complex than the converse.)
Or in the strong case, a symmetry is a set of properties that is independent of another set of properties, which granted can look a little weird when subsets of two independent sets are not themselves independent (i.e., {a,b,c} and {d,e,f} are independent, but {a,b} and {d,e} are not) – which I think is where the “weird trick” comes from, in a roundabout kind of way: noticing that the properties c and f are actually in the set.
Comment #436 January 29th, 2022 at 10:57 pm
Hi Scott,
I don’t know if you remember me, but we got a chance to hang out a couple years ago.
I’m not sure if you’re familiar with the work of Markus Muller, but his notion of “observer states” closely resembles the view of certain spiritual traditions. If those adepts had thought hard about physics, I suspect they would have noted two things. First, what Yoni mentions (freedom in the future and a fixed past), but second, that it needs to provide the kind of almost-solipsism that can only come out of something like the measurement problem. That’s what prevents it from being implemented classically, I think. (You might wonder “but *why* do we need something like almost-solipsism,” which I could try my hand at answering perhaps. Of course, it probably won’t be an empirically justifiable answer…)
I might also point to Rovelli and his recent discovery of and affinity for the Buddhist philosophical notion of “emptiness.” In short, emptiness could be described as the realization that “there is no way that things ‘actually are.'”
I imagine you’re reluctant to take seriously the kinds of claims that religious people often like to make about having discovered this or that scientific principle ages ago, but something quite different is going on here. I’d be happy to say more, but this is probably enough for now.
Cheers,
A
Comment #437 January 29th, 2022 at 11:01 pm
Scott #404
Step 2: why do dynamic states need to be superpositions of static states?
Atoms have static states. Static states give geometry of chemistry.
Atoms interact with other atoms to give chemistry and materials. When two atoms are close, they interact. Since, after interacting, two atoms that started (say) in their ground states can be in any superposition of static states, superposition allows many more types of interaction.
It’s fun! An atom in a superposition can have a fixed but uncertain energy, yet an oscillating spatial distribution.
What chemistry or material features are necessary and necessarily require these superpositions? I am not an expert, but there are examples (famously photosynthesis).
I imagine heat transfer, liquids…
Certainly it is easier to create the universe with superposition superpowers.
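That “fixed but uncertain energy but an oscillating spatial distribution” claim is easy to check numerically. A minimal sketch (my toy model, not the commenter’s: a particle in a 1D box with ħ = m = L = 1, prepared in an equal superposition of its two lowest eigenstates):

```python
import numpy as np

# Toy model: particle in a box, equal superposition of levels n=1 and n=2.
# The energy probabilities (1/2, 1/2) never change, but the position
# distribution |psi(x,t)|^2 sloshes at angular frequency E2 - E1.
L = 1.0
x = np.linspace(0.0, L, 2001)
dx = x[1] - x[0]

def phi(n):
    """Box eigenfunction (hbar = m = 1)."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

def E(n):
    """Box eigenenergy (hbar = m = 1)."""
    return (n * np.pi / L) ** 2 / 2.0

def mean_position(t):
    psi = (phi(1) * np.exp(-1j * E(1) * t)
           + phi(2) * np.exp(-1j * E(2) * t)) / np.sqrt(2.0)
    return float(np.sum(x * np.abs(psi) ** 2) * dx)

half_period = np.pi / (E(2) - E(1))
print(mean_position(0.0))          # <x> starts on one side of the box center
print(mean_position(half_period))  # half a period later it is on the other side
```

The energy distribution is strictly time-independent here, while ⟨x⟩ swings between the two halves of the box: a fixed but uncertain energy with an oscillating spatial distribution, exactly as the comment says.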
Comment #438 January 29th, 2022 at 11:38 pm
Scott #421:
Oh! How these things *happen*!
…Get well soon!
—
Kashyap Vasavada #433:
I had written the following paragraphs in my original reply, i.e., comment #416 above. I had saved my original reply to a plain text file too. But in the mechanics of the last-minute editing, these two paragraphs somehow got deleted. By me. Even though I don’t at all recall how I ended up doing that. (How things happen!) [I can tell that it was me who deleted these paragraphs only because I had saved both the initial version and the final version in plain-text files on my HDD!]
So, originally, in my comment #416 above, the following paragraphs appeared at the end of my answer to Q1 (i.e. just before going over to Q2).
Quote:
OTOH, yes, we do know that ultimately, you can’t go on using ideas of just the electrostatics (the way it is, in Schrodinger’s wave mechanics). You have to use electro-*dynamics*, which “automatically” means the special theory of relativity. So, in that sense, the universe isn’t *really* QMcal, it is QM + relativity-theoretical. And so on…
So, Q1 itself is “wrong” in the sense that it uses too vague a terminology. There are at least two vague terms in it: the idea of the Creator God, and the word “classical”. I would leave aside the idea of God in any discussion of the quantum phenomena. (Also the idea of consciousness, and, though no one mentions it, also the idea of life, etc.) Even then, I had to point out that there is no such thing as *the* “classical” world in the first place! There are many different “classical” (actually pre-QMcal) ideas, and so, possibly, many different notions of classical worlds.
Unquote.
—
Finally, I might as well insert a brief note regarding Physics, General Philosophy, and Theology.
Theories of physical sciences, in particular of physics, fall in a different class from those of, say, philosophy.
But first, a point about their similarities: both philosophy and physics are forms of science. In both, fundamental ideas are inductively conceived of, starting from a body of direct observational knowledge.
Now the difference: the observational data, i.e. the phenomenological knowledge, on which philosophic truths rest is such that it would be available to any thinking adult, of any profession, at any time and age, in any culture, in any location, etc. This phenomenology therefore ends up including only the simplest kind of observations. It is for this reason that philosophic observations are of the kind that would commonly form an element in any kind of inquiry.
OTOH, the phenomenological knowledge pertaining to physics is much more detailed and specific in nature. Physics knowledge is only derived from (and only applies to) certain special class(es) of phenomena — those pertaining to the nature and actions of inanimate objects and the similar characteristics of living beings. Physics critically makes use of the experimental method. The observations which it makes use of are such that the sum totality of the observational base is subject to revision.
Thus, the fundamental truths of physics can be subject to revisions.
It’s wrong to expect from any physics theory — even the most fundamental physics theory — the same nature as that which defines the philosophical kind of knowledge. To expect or ascribe such characteristics to theories of physics is to commit a primitive error. Such an error, in today’s world, typically arises out of too easily dismissing the entire domain of philosophy as such. [The late Professor Richard Feynman of Caltech easily springs to mind.] But that’s not the only reason; there is another motivation too: the attempt to wrench away the prestige that is actually due to philosophy, and to pass it off as prestige attached to one’s own profession. [Western scientists, and Americans in particular, ought to introspect.]
OTOH, in looking for a stable / immutable kind of observations, philosophy too ends up focusing on such aspects of the real world, and therefore ends up deriving a kind of knowledge, that is often not of the most direct relevance in applications, or of crucial importance in choosing between alternative answers. Philosophical knowledge provides a base for all other kinds of knowledge. But precisely for that reason, philosophy alone cannot settle issues like, e.g., those mentioned in Q2 here, or even in Q1.
Theology, to me at least, looks like a primitive / rudimentary form of philosophical thinking, and, for that same reason, more easily susceptible to being irresponsible. This is an issue of the type of a body of thoughts (not necessarily knowledge, just a body of thoughts). It is not an issue of practitioners. Some theologians are in fact, in their personal dealings, among the most considerate and responsible people you would ever run into. But personal virtues like that do not change the nature of the body of thought. They do not change the kind of abstract thought that theology encapsulates and puts forth.
Needless to add, expecting to settle physics questions via appeal to theology is even worse an error than expecting physics to settle philosophic questions or vice versa.
Best,
–Ajit
Comment #439 January 30th, 2022 at 12:17 am
Clinton #422.
The trouble with Scott’s computer-sciency answer is where he says “quantum mechanics sits at a level between math and physics that I don’t know a good name for.” But we do know a good name for it, and have for many decades (follow the link). It’s long been recognised that quantum theory is a (partial, algebraic) generalisation of probability theory, and that quantum mechanics is just its (natural) application to mechanics; “[quantum] probabilistic mechanics”.
Comment #440 January 30th, 2022 at 1:56 am
Jacques Pienaar #424: « In the present case, the first step is to ask ourselves exactly why our present world-view is one that happens to accommodate a “classical” model of the world more comfortably than a quantum model. »
A tentative answer, taken from https://arxiv.org/abs/2105.14448 : The normal approach in physics since Newton has been to define objects, to attribute properties to them, and to measure these properties. One then asserts that the object “has” this property, for example that it has a position, a velocity, or a momentum. Does this “natural” approach work in quantum mechanics? Although physicists are extremely reluctant to admit it, the answer is clearly no – and this “no”, correctly interpreted, provides the empirical element that is missing in our understanding of the quantum description of the physical world.
You will find more in this paper, but another way to tell it, at least in my view, is that QM requires us to abandon classical reductionism, the idea that nature is built from smaller and smaller parts, like a house is built up from bricks. This is obviously not the case in QM; you can see it from entanglement, from contextuality, from non-commutativity… whichever one you prefer. This leads to the idea that in QM the physical object is not a (Newtonian) system alone, but a system within a context. This is quite hard to swallow indeed, but still physically and philosophically acceptable, and it certainly does not imply the death of physical realism.
Comment #441 January 30th, 2022 at 2:44 am
Philippe Grangier,
“In my understanding of superdeterminism, the non-independence between A, B and S must come from their overlapping past light cones, in order to avoid a clash with special relativity.”
Indeed.
“This possibility was already considered by Bell, but it means that there are no more independent events, no more randomness, no more freedom of choice, since everything has been ‘written in the past’.”
This is just good old determinism, like Newtonian mechanics, classical EM, GR and so on. None of those theories allows for randomness or freedom of choice. This is not a valid argument against superdeterminism.
“So yes, I dislike this option, and no, I don’t think it is the only local one on the table.”
Well, in this case, please refute my argument, since your last attempt was based on a misunderstanding of it (you assumed freely orienting detectors). Alternatively, you can provide a valid counterexample to the argument. Please explain how predictive incompleteness reproduces locally the perfect correlations of the EPR-Bohm setup.
“It is true that predictive incompleteness is not easy to grasp because of its fundamentally non-classical and contextual features, but well, it does the job.”
I’m looking forward to seeing how it does the job. How can you make two random measurements (the measurements at A and B) always agree? The statistics of such events (two coin flips) says that the probability of agreement is 50%. You need 100%.
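That 50%-agreement arithmetic can be sketched in a few lines (a toy simulation of my own, under the assumption of two independent unbiased local outcomes; it illustrates only the counting, not Bell’s theorem):

```python
import random

random.seed(12345)
trials = 200_000

# Two measurements modeled as independent fair coin flips: with no shared
# information or common cause, they agree only about half the time.
agreements = sum(random.randrange(2) == random.randrange(2)
                 for _ in range(trials))
print(agreements / trials)  # ~0.5, far from the 100% the singlet correlations require
```

A pre-shared common bit would of course restore 100% agreement at equal settings; the live question in this exchange is what besides such deterministic common causes could do that job.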
“So coming back to your preferred (Bohm-like) option of giving up locality, I think that you have a wrong understanding of contextual randomness.”
1. My preferred option is locality. It is a reasonable option, based on the success of SR.
2. I don’t think there are multiple types of non-locality. If A and B are space-like separated and A causes B or B causes A, you have non-locality. In order to get the A and B measurements in an EPR-Bohm setup to agree, you need this. There is no such thing as a “mild” or “weak” non-locality (non-Bohm non-locality). Either your theory is local or it is not. Scott also claims that the non-locality required to get perfect correlations in an indeterministic context is not problematic for SR. I refuted that claim in my post #296. The no-signaling theorem is just a red herring. The inability to control an otherwise non-local phenomenon does not make it local, and does not make your theory compatible with SR.
Comment #442 January 30th, 2022 at 3:19 am
If you look for a deeper understanding of quantum theory , I would consider the inelegant properties as a starting point.
The evolution of the state psi(t) uses time as a classical parameter, implicitly assuming the existence of classical clocks – and where would those come from?
A formulation of quantum theory without this classical time would be very interesting…
T.P. Singh (2021) is a paper which proposes such a formulation and might be interesting for your project…
Comment #443 January 30th, 2022 at 4:34 am
I think the answer to the Question depends a lot on the interpretation of quantum physics. So it is no surprise that various interpretations have been talked about in this long discussion, and I am actually surprised that it did not degenerate into a “my interpretation is better than yours” contest right away. I wonder if this took a particular moderation effort on our host’s part or if people here are that reasonable in general. Anyway, my own suggestion concerns the many-worlds interpretation. The idea is that it could be the most efficient theoretical framework (or at least one of the few most efficient) in terms of generating as many “interesting stories” as possible per “number and complexity of rules”. Something similar has of course already been mentioned multiple times in the comments above; I just wanted to stress that it may be this “ratio” by which quantum mechanics with many worlds wins. Sure, something like the Game of Life may have simpler rules yet, but the rules that define, say, the Standard Model are not that terribly convoluted either. It helps that quantum theory itself, without considering a particular model and particular parameters, is after all just a linear unitary evolution on a Hilbert space. And while the Game of Life or classical mechanics might be able to produce the complexity necessary for something like life to exist too, there would be only so many different structures and different “stories” taking place for a given initial configuration in the classical world, while a never-collapsing universal wave function could produce many more, even for one particular “initial condition”. Needless to say, this all remains a very fuzzy idea until someone comes up with a way to formalize the “ratio” I have mentioned here, something which I do not at all know how to do.
Comment #444 January 30th, 2022 at 6:31 am
It does not look like the universe has been quantum-mechanical.
The Hamiltonian in Wheeler-DeWitt superspace is equal to zero. There is no concept of time. Unfortunately, we do not have the right tools to understand it at the moment.
However, a “tiny” part of our Universe has been quantum-mechanical. Why?
So we can have a valid formula for entropy:
$$S=\log\frac{\Delta p\Delta q}{(2\pi\hbar)^N}$$
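The state counting behind that formula can be checked in a toy case (my example with ħ = m = ω = 1, not the commenter’s): the number of quantum states in a classical phase-space region of volume $V$ is roughly $V/(2\pi\hbar)^N$.

```python
import math

hbar = 1.0
E_max = 50.0

# 1D harmonic oscillator (hbar = m = omega = 1): levels E_n = n + 1/2.
# Exact count of levels with E_n <= E_max:
quantum_count = sum(1 for n in range(10_000) if n + 0.5 <= E_max)

# Classical phase-space region p^2/2 + q^2/2 <= E_max is an ellipse of
# area 2*pi*E_max; dividing by (2*pi*hbar)^1 gives the semiclassical count:
semiclassical_count = (2.0 * math.pi * E_max) / (2.0 * math.pi * hbar)

print(quantum_count, semiclassical_count)  # 50 50.0
```

The exact level count and the phase-space estimate agree, which is exactly why the logarithm of phase-space volume in units of $(2\pi\hbar)^N$ is a sensible entropy.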
Comment #445 January 30th, 2022 at 6:49 am
Andrei #441 :
« Please explain how predictive incompleteness reproduces locally the perfect correlations of the EPR-Bohm setup. »
Technically (in the shut-up-and-calculate version), it works just like standard QM. About whether this is local or not, see the exchanges with Mateus Araujo. In my opinion, the distinction between elementary locality and predictive completeness is conceptually quite useful. But if you prefer not to make it, and to put everything in the same ‘non-locality’ black bag, that is your choice, and it amounts essentially to how you define non-locality. Predictive completeness has no problem with SR; see previous posts and the reference.
« If A and B are space-like and A causes B or B causes A you have non-locality. ».
In an EPRB experiment Alice’s measurement does not « cause » Bob’s result, which is anyway undefined as long as Bob has not decided on an orientation. Her measurement only allows Alice to make a contextual (and probabilistic) inference about Bob’s result. Nevertheless, if you consider this to be still too ‘nonlocal’, and prefer superdeterminism as a fully local alternative, I cannot prove you wrong – I can only say that in my framework, superdeterminism is not the only alternative (and certainly not my preferred one).
Comment #446 January 30th, 2022 at 7:15 am
Scott #405:
>[…] whatever is the most fundamental theory of the physical world, why should it have given rise to this onion-like structure of cruder and cruder approximations (e.g., GR, QFT, nonrelativistic QM, classical mechanics) that each have their own internal logic […]?
Effective field theory?
Comment #447 January 30th, 2022 at 7:23 am
Sorry to hear about the problem with your foot. In the previous year, I had a similar problem with my right arm.
It took me several months, but now it seems o.k. I wish you the best.
I thought about your questions a bit more, and I think that QM exists because Einsteinian gravity (or at least a slightly modified version of it) has to exist, for a world that has complexity, observers, etc.
Maybe the answer (supposing that there is one…) is just that obvious. I’ll give several reasons for this:
Gravity is the only “interaction” that affects causality in a nontrivial way, and this is important for a macroscopic world with some kind of observers. One could “put in by hand” some causal structure in some other contrived theory, but in GR this is an intrinsic characteristic of the theory.
In some sense, nontrivial causality and Einsteinian gravity are almost synonymous.
Here comes QM: for the universe to make sense, you need a “stable”, well-defined causal structure.
QM permits some violations of some energy conditions, but not, it seems, of the “averaged energy conditions”.
So it poses several serious problems for things like warp drives, traversable wormholes, etc., while GR alone, as a theory of spacetime (but not of matter fields), is not so constrained.
Also, gravity is the only universal interaction. It is also highly non-linear and has a kind of inherent “instability”, due to its attractive nature, so it needs something else for things to be stable and non-static at the same time. QM is that appropriate “something”.
By the way, I don’t think that a reasonable world with Newtonian gravity can exist, even though some people take it for granted that it could. But that’s another story…
Comment #448 January 30th, 2022 at 7:37 am
Concluding thoughts of my comments 249 and 437.
How can we list phenomena that require superposition? These phenomena remain mostly undiscovered, since classical computation is at a disadvantage in finding them. Quantum computers may discover zillions of phenomena that require superposition.
Reasons quantum computers may contribute greatly to humans:
1. The curiosity of just building them
2. Give us faster algorithms for some problems
3. Discover physics phenomena that require superposition
I am hoping for lots of 3!
My imagination runs wild. My favorite is quantum models of liquids. What properties of salt water flowing through ion gated membranes are truly quantum?
Comment #449 January 30th, 2022 at 8:32 am
Clinton #422
Sorry about the wrong link. The correct link under “many decades” is here. The link to follow – to the first publication of an explicit recognition that QT is PT that I know of – is to this 1954 paper of Segal’s.
Comment #450 January 30th, 2022 at 8:39 am
Hello Paul Hayes #439,
You are absolutely correct to point out the fact that QM is a generalization of probability theory. I’m glad you did. In the midst of composing my reply to SP, I myself paused at that exact sentence in Scott’s lecture notes and wondered whether I should include it, or include more of Scott’s own words about the GPT aspect. As you are no doubt aware, Scott points out all the time that QM is “just a generalization, or extension, of the laws of probability to allow minus signs.” And in my own post #118 above I refer to QM as a “Probability Model Builder”. But, again, I’m with you 110% because this is important.
My thinking behind not expanding on it in my reply was that the question posed by SP appeared to be coming from a perspective that I know all too well – and that is the perspective of what I would call the “Traditional QM Physics Magic Story”. If you were a student like myself (Graduate Degree Engineering) who went through a few physics courses in college then you probably were indoctrinated into the QM “mystery” the way that I was.
QM is (still to my knowledge) almost always first conveyed to students (or the general public) like a “Magic Mystery Story”. You know how this goes … “So, we’ve got these two slits and we begin firing electrons through them … wouldn’t you expect that they just go through one or the other …” It feels in a lot of ways like a magic trick … like a play on someone’s expectations.
QM is (to my knowledge) rarely first conveyed to students (or the general public) like this: “So, we want to make predictions about what is going to happen when we conduct physics experiments. That means we are going to need a probability theory. Let’s begin by considering the probability theories that are available to us, which one(s) we might prefer to use for various reasons (such as being complete and closed, linearity, choice of basis, etc.), and then we will consider some actual experiments to see if the probability theory we think is best actually turns out to be good for the job.”
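The contrast between the two stories can even be run numerically. Here is a minimal sketch (my own toy example, not from the comment): a classical stochastic “coin flip” versus the quantum Hadamard “coin flip”, where one amplitude carries a minus sign:

```python
import numpy as np

# Classical coin flip: a stochastic matrix acting on a 1-norm probability vector.
F = np.array([[0.5, 0.5],
              [0.5, 0.5]])
p0 = np.array([1.0, 0.0])          # start definitely in state 0
print(F @ F @ p0)                  # after two flips: still maximally uncertain

# Quantum "coin flip": the Hadamard, whose minus sign enables interference.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
psi0 = np.array([1.0, 0.0])
psi = H @ H @ psi0                 # two Hadamards in a row
print(np.abs(psi)**2)              # [1, 0]: the paths to state 1 cancel exactly
```

Two classical flips leave you maximally uncertain, but two quantum “flips” return you deterministically to the starting state, because the minus sign lets the two paths into state 1 cancel. That interference is exactly the “minus signs” extension of probability theory under discussion.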
Now, I’m not accusing introductory physics teachers of purposefully attempting to mislead people or of consciously teaching things this way. And I am SURE that there ARE some physics instructors who DO attempt to make the GPT connection from the start. Feynman, a great popularizer, clearly said that what made QM different was that it allowed negative probabilities. We just find ourselves at the end of a century-long process where physics just HAPPENED to be the application field where QM first arose. And, so, physicists tell THEIR story in the sequence of how it happened to them. Nothing wrong with that.
What I sensed in SP’s question was a familiar perspective (one that I once held) that was based on how QM had been presented to me as “only something weird to do with physics”. Scott’s paragraph there from his lecture notes I felt was a good starting point for someone (as it was for me) in realizing that the “QM Magic Mystery Story” may not be the only way to understand or approach QM.
I remember one of my mathematics professors related to us (I think this is from Schopenhauer) the three stages of truth:
1. It is ridiculed.
2. It is opposed.
3. It is said that “We have known this result for some time now …”
And so I am very glad to hear that we have known for some time now that QM is recognized as a GPT.
Thank you again, Paul, very good point!
Comment #451 January 30th, 2022 at 8:48 am
Philippe Grangier #427: I believe you misunderstood my challenge. I’m granting you that on the probability level violating (2) is a mild form of nonlocality. I’m challenging you to produce these probabilities from a realist model, without flagrantly violating relativity. Your paper does nothing of the sort.
Comment #452 January 30th, 2022 at 9:04 am
Mateus Araujo #398
I agree with you about the need for clarity, and a plausible, realistic (if you like) description of the physical world at the fundamental level. I’m not a proponent of operational or naively “epistemic” ways of thinking about nature, at least not at the fundamental level. I prefer to remain agnostic about QM interpretations, though, and the reason is that the standard QM formalism does not give us enough information to decide which interpretation is the “correct” one.
I also respect the alternative theories that you mentioned (especially the various objective-collapse or gravity-induced reduction theories that are potentially testable), and I’m trying to avoid premature conclusions, because there are many potential surprises for us in the future, or so I guess…
I just do not find that kind of clarity in any interpretation, not only in the epistemic ones.
In many respects, the Everettian/many-worlds class of interpretations (there are many different interpretations under that label, not a specific one; for example, S. Carroll seems to disagree with you about local branching…) is more problematic than the traditional textbook ones.
The reason for this, I think, is that such a realistic description is not possible if many serious issues are swept under the rug.
Most people are concerned about the compatibility of MW with the Born rule, or with the preferred-basis problem, but, in my opinion, the most problematic thing has to do with the “semi-classical limit” that I mentioned in a previous comment. All the usual debates about interpretations omit gravity/spacetime/semi-classical-limit issues, so I cannot take them too seriously, I’m afraid.
QM works perfectly well, as far as we know, and for the time being, that’s all.
Comment #453 January 30th, 2022 at 9:22 am
Now, you propose a different measure: that we should count equally each set of worlds that share the same measurement result. As you noticed, this has the fatal flaw of contradicting the data.
Usually when the hypothesis (MWI) contradicts the data, we reject the hypothesis, not simply redefine things so that our hypothesis remains correct. 🙂
It’s also ill-defined: these measurement results are just one decoherent event we chose to pay attention to. There’s plenty of decoherence events happening all the time, everywhere. To actually count each decohered branch equally we would need to take them all into account. It’s clearly a hopeless proposition, and to the best of my knowledge nobody has even tried to do that.
I don’t see why it would be hopeless in principle. The whole point of a QM interpretation is to explain why we see the results we see.
Comment #454 January 30th, 2022 at 10:45 am
Very interesting question and discussion. This is the sort of meta-meta question that seems remote from physics, yet probably requires much knowledge of physics and its history.
Three questions that might be related are
1) What is the origin/meaning of probability in nature and why is there probability?
2) What is the origin/meaning of chaos and why is there complete chaos?
(By chaos I mean a behavior that cannot be predicted (even probabilistically). Chaotic behavior seems crucial to the great difficulty to extend (practically) logic from TRUE/FALSE setting to a probabilistic setting, and to answer questions about the probability that some complex statement is correct*.)
3) Why is there something rather than complete chaos?
I would speculate that quantum mechanics is a framework (perhaps “minimal” or even “unique” in some sense) that allows answers to, or at least “better understanding of”, these questions 1) 2) 3).
(If true this can serve as an answer of some sort for “Why should the universe have been quantum-mechanical?”)
My thinking is influenced by that of Itamar Pitowsky’s (that was mentioned in the thread, by Peter Morgan in #146), but I certainly don’t represent or even remember Itamar’s precise views on these three questions or on the question that Scott raised.
* For example, Rudolf Carnap, a central member of the Vienna philosophy circle, had a programme which he believed could lead to a whole logical calculus of probability starting with answer to the question: “What is the probability of a statement A given the validity of statement B?” and ending with an answer to “What is the probability that a theory X is correct?”.
Comment #455 January 30th, 2022 at 10:49 am
Somewhere I saw a paper that argued that even God could not resolve the state of a quantum universe without “collapsing the waveform” or whatever the right metaphor is. It makes me think the answer to Q1 is a kind of radical No Privileged Observers philosophy. Analogous to what relativity does to the ordering of events – there’s no “objective” frame for space vs time – quantum mechanics does the same for measurement itself. The no-local-hidden-variables rule means there’s no God’s Eye View that’s hidden from us; the muddle that we find ourselves in is as real as it gets. And I think that’s beautiful. I suspect that future mind-warping physics will find something else that seems obviously constant and objective and demonstrate that it’s relative and conditional, in a way that we haven’t even considered possible before.
Comment #456 January 30th, 2022 at 10:54 am
typo: My question #2 should be:
2) What is the origin/meaning of chaos and why is there chaos?
Comment #457 January 30th, 2022 at 11:05 am
Mateus#451 « I’m challenging you to produce these probabilities from a realist model, without flagrantly violating relativity. Your paper does nothing of the sort. »
I claim it produces these probabilities without flagrantly violating relativity, so we must disagree about what a ‘realist model’ is.
If you mean some kind of hidden-variable, Bell-type model, then you are certainly right, this is the first option in my post #410.
But if you consider the second option of #410, including predictive incompleteness of psi and contextual inferences, I don’t see any ‘flagrant violation of relativity’ whatsoever. On the other hand, I’m ready to admit a ‘mild form of nonlocality’, whatever that means, though I prefer not to use this wording.
Comment #458 January 30th, 2022 at 11:12 am
Scott- You’ve heard this before, but here are some thoughts on a narrow question within this area- why the Born rule?
The only answer I’ve seen that manages to get it without either breaking the unitary rules or covering everything in metaphysical verbiage is Jacques Mallah’s “Many Computations” approach. To recap, for others: the question is why actual counts of results of approximately (pace Heraclitus) repeated experiments show Born probabilities. That’s very far from what one would expect from a naive count of outcomes in a pure unitary (many-worlds) picture in which one pretends that distinct outcomes are well-enough defined to be countable. Mallah starts with pure unitary quantum dynamics and a radically materialistic definition of what constitutes a counted outcome. It’s a thought, i.e. a fairly robust calculation by a subset of the overall quantum system.
So why would an outcome represented by a larger quantum measure get more counts? Mallah proposes that to be robust enough to “count”, the quantum representation of the coherent thought must have decent signal-to-noise compared to an incoherent background white-noise component of the state. Ordinary signal-to-noise considerations then give that the coordinate-space volume over which the signal must be averaged to beat the noise scales inversely with the measure, so the number of such volumes scales as the measure.
The big question then becomes why would the total state consist of a coherent piece, the part we always talk about, plus the white-noise background, suggested by the Born rule. I argue for a fundamentally anthropic reason. No other combination would lead to probabilities that factorize into stable probabilities of sequential events. (That property is just assumed in the Gleason argument and the many others that try to derive the rule from the pure structure.) Without that property, probabilities of past events would not be fixed after the event. The preconditions for evolutionarily favored thinking would seem not to exist without that property.
At one point I argued that this anthropic constraint would account for the incoherent background. On second thought, we have already been tempted to use anthropic arguments to justify the 2nd Law, i.e. the existence of the initially low-entropy component whose behavior we usually analyze. So maybe the noise should just be taken as given, so that now the usual anthropic push is needed only for the existence of our coherent component, no longer to justify the non-existence of the incoherent component. In Mallah’s picture that incoherent component persists.
Again, none of this says anything about why the whole unitary structure exists.
Comment #459 January 30th, 2022 at 11:54 am
I guess I’ll say a little bit more about why “almost-solipsism.”
We’ve already seen why indeterminism “should be” one of God’s design goals. A crucial question is, when should any particular instance of indeterminism get resolved?
Well, from the perspective of any one of her countless incarnations, God wants reality to look free. And if indeterminism gets resolved before it gets to me, then it doesn’t look free to me. So I must be the ultimate resolver.
Of course, this comes with a little problem: it’s logically inconsistent for each incarnation to be the ultimate resolver in a single shared reality. And we don’t want to do something silly like place each incarnation in her own reality where everyone else is an NPC, because those incarnations tend to go crazy instead of waking up (which is also a core design goal…). Something a little more clever is needed.
I think you already see where this is going. The question you may have is, amn’t I just describing our _actual_ reality and trying to retrofit these design goals? And the answer is no: the Buddhists really did discover these design constraints ages ago. The Buddhist “no-self” is the fact that there are no observers; only observer-states (a la Muller). Their “emptiness” is the fact that there is no fixed external reality that is presenting itself to you. Liberation is the direct perception that reality is manifesting itself in this beautiful and crazy way, instant by (illusory) instant.
Comment #460 January 30th, 2022 at 12:11 pm
Clinton #450 (“If you were a student like myself (Graduate Degree Engineering) who went through a few physics courses in college then you probably were indoctrinated into the QM “mystery” the way that I was.”)
No, I was a bog standard physics degree student. So all the more scandalous that I was taught QM as almost all physics students were then, and still are: the current textbooks and courses still take those mysterious, “view from nowhere” Dirac-von Neumann axioms and go from there, and the malign influence of [semi-]classical antiquity is still felt.
Comment #461 January 30th, 2022 at 12:17 pm
Scott P. #453: In order to make probabilistic statements in Many-Worlds we need to introduce a measure over worlds. The non-probabilistic part of the theory (Hilbert spaces, tensor products, Schrödinger equation) does not determine which measure this is, it is a logically independent postulate. In Everett’s original paper he chose the 2-norm measure, which does fit the data. Now, six decades later, you are insisting that we have to replace the 2-norm measure with this “every measurement result is equal” measure. Even though you realize yourself that it contradicts the data. Moreover, you insist that the fact that your measure is contradicted by the data implies that we have to discard Many-Worlds. Not the measure you propose.
I am at a loss for words.
Comment #462 January 30th, 2022 at 1:03 pm
About Q1: if we built a universe with only classical physics, then this universe would lack randomness; it would be deterministic and computable. There are many reasons why such a universe is not an ideal one to live in if you are an intelligent being and you want, or at least want to try, to make your own future.
Comment #463 January 30th, 2022 at 1:19 pm
Scott #420
Sorry to hear about your foot, I’ve had an issue with gout last year and couldn’t walk for several weeks.
I think QM explains our universe because we need unitary evolution (otherwise we’d explode to infinity or shrink to zero), and once you allow fundamental randomness, then something like Gleason’s theorem, or your argument in the “Is Quantum Mechanics an Island in Theoryspace?” paper, or just simplicity makes the Born Rule pretty obvious. (Of course the evolution may not be perfectly unitary, just enough for the stability we observe.)
(Btw, according to Abraham Pais, not only Born, but Dirac and Wigner initially suggested |psi| rather than |psi|^2 for the probability; see p. 9 of “Max Born and the Statistical Interpretation of Quantum Mechanics”.)
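One standard way to see why |psi|^2 rather than |psi| pairs naturally with unitary evolution (sketched here as a quick numeric check of my own, not part of the comment): unitaries preserve the 2-norm of the amplitude vector, so squared amplitudes stay normalized, while the 1-norm is not conserved:

```python
import numpy as np

# Unitaries preserve the 2-norm of amplitudes, not the 1-norm.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # the Hadamard, a simple unitary
psi = H @ np.array([1.0, 0.0])         # amplitudes [1/sqrt(2), 1/sqrt(2)]

print(np.sum(np.abs(psi)))             # 1-norm: sqrt(2), not conserved
print(np.sum(np.abs(psi)**2))          # 2-norm squared: exactly 1
```

Under the Hadamard the 1-norm jumps to sqrt(2), so a “|psi| rule” would not keep total probability equal to 1, whereas the |psi|^2 rule does, for every unitary.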
Comment #464 January 30th, 2022 at 1:36 pm
Dimitris Papadimitriou #452: Carroll is wrong about branching, among several other things. He is not a good source on Many-Worlds. Saunders is much better. Wallace as well. I’m also a fan of Brown. Deutsch is also pretty good.
Comment #465 January 30th, 2022 at 1:37 pm
Philippe Grangier #457: In your paper you don’t produce probabilities from anything, you just say they are what they are. And by “realist model” I definitely do not mean a hidden-variable model, it’s simply a model where you define what the real objects are. For example, a naïve realist reading of textbook quantum mechanics is such a model, where you declare that both the quantum state and the collapse are real. It has no hidden variables, and it also flagrantly violates relativity. Ditto for a more sophisticated collapse model. No hidden variables, and violates relativity. With Bohmian mechanics you do have hidden variables, and also a violation of relativity.
If you can’t produce your probabilities from a realist model, you don’t have a “mild form of nonlocality”, you have nothing. If nothing is real then clearly nothing violates relativity!
Comment #466 January 30th, 2022 at 2:12 pm
Mateus #465
Sorry Mateus, it is clear now that we have branched into different universes, and that whatever one of us says is fully decohered for the other.
You are too much into a view with a ‘real psi’ to admit that there may be real objects and properties, without requiring that psi is ‘real’; whereas this is obvious to me.
I just (re)quote https://arxiv.org/abs/2105.14448 as a simple introduction, but unfortunately even DeepL will not translate this into something intelligible for you.
Comment #467 January 30th, 2022 at 2:29 pm
Anbar #258:
Well, the ado was taken care of by a few bright guys in the late 1920s…
In which sense the formalism and interpretation of QM are inevitable, given the empirical behavior of even something as simple as a photon, is laid out by Dirac in the introduction to the Principles, and Von Neumann figured out the formal logic behind the projectors shortly thereafter.
Do you understand that I’m not asking why you’d use QM to create this world—which is obvious—but rather, why you’d use it to create a world? While the majority of commenters here get this, there seems to be a persistent minority for whom it’s a completely ungraspable concept.
Comment #468 January 30th, 2022 at 2:39 pm
Chris #267:
if you were strict about a heuristic biasing you against ‘metaphysical extravagance’, you’d be a strict finitist, and your default assumption about the universe would be that it is finite in extent, contains finite matter, etc. Is that the case?
Is it metaphysically extravagant to suppose that there are infinitely many mathematical structures, including all perturbations on ones that closely resemble our universe?
These are excellent questions, but my answer is: whenever there’s an implied infinity in physics, as for example in QFT or GR, and it’s not “provably benign” like the quantum-mechanical amplitudes forming a continuum, I assume the infinity is just an artifact of current knowledge, and that future discoveries will fix it as they so often have in the past.
By contrast, if QC is possible and gives the hypothesized speedups, then the exponentiality of quantum states is 100% real, even though it’s only exploitable in specialized ways. And at least at first glance, it seems wasteful and extravagant to solve a “polynomial-sized problem” (like the stability of matter or the UV catastrophe) by introducing an exponential amount of additional structure! Which is simply to say: if we want to go this route, then the challenge is to find some way of thinking according to which this is not nearly as wasteful or extravagant as it looks.
Comment #469 January 30th, 2022 at 2:43 pm
Guyren Howe #284:
An interesting related question: what would a universe look like that had Quantum Mechanics, but not Relativity?
It would just look like nonrelativistic QM, wouldn’t it? Or are you asking whether that
Comment #470 January 30th, 2022 at 2:52 pm
Scott #467 : « Do you understand that I’m not asking why you’d use QM to create this world—which is obvious—but rather, why you’d use it to create a world? »
Well, I guess I’m among the minority who did not grasp the concept, which was maybe not so clear in your initial questions; it may depend on one’s mindset. But at least I’m happy to read that the answer to ‘why QM rules this world’ is obvious to you.
Comment #471 January 30th, 2022 at 2:54 pm
Scott, did my previous comment get lost in moderation? (I can’t say that it was terribly insightful, but I don’t think it was offensive or off-topic, so I assume that you didn’t reject it.) Roughly reproducing it here:
Proponents of the many-worlds interpretation and decoherent branch differentiation argue (correctly, IMO) that the MWI follows naturally from the principles of quantum mechanics. I wonder if it’s possible to turn this argument around, and to take the general idea of the MWI as the fundamental starting point for “deriving” (or at least motivating) why QM describes the real world?
Of course, that just moves the explanatory burden onto the task of finding some natural-seeming general philosophical principles that would lead to the idea of the MWI – which is likely even harder than motivating QM itself. But to me, the idea of the MWI is one of the biggest conceptual departures of QM from classical laws – much bigger than simply modifying the classical stochastic 1-norm to the quantum 2-norm – so starting there might be a promising avenue for motivating QM from the most general philosophical principles. You could perhaps imagine a reasonable line of philosophical reasoning that finds a Tegmarkian “multiverse” (in which all mathematically self-consistent laws of physics are physically realized “somewhere”) to be too extravagant, but to want to have “as many things exist/events happen as possible” within a single simple unified set of physical laws. There might be some way to get from that starting point to QM rather than classical physics.
Comment #472 January 30th, 2022 at 3:00 pm
Philippe Grangier #466: I’m not demanding psi to be real, I’m demanding something to be real. It can be psi, it can be a crazy hidden variable, it can be objects and properties, whatever, it just has to be mathematically well-defined.
And now you quote for the third time a paper where you still don’t do it. It’s rude to waste people’s time like that.
Comment #473 January 30th, 2022 at 3:01 pm
Is quantum number jumping the only genuine moving part in the entire system? Is all other number change in the system merely due to passive mathematical law of nature relationship, which only kicks in when quantum number jumping has occurred?
I think that the above is indeed the case. Individual quantum number jumps, by the system or a part of the system (e.g. a particle) in response to a situation, which would be modelled (e.g. in a computer program) as the deliberate assignment of numbers to variables, are the only genuine moving parts in the entire system.
Quantum number jumping, i.e. primitive free will, and the basis for the more advanced free will of living things, is the only genuine moving part in the entire system.
Comment #474 January 30th, 2022 at 3:47 pm
Ted #470: I did give an argument of that sort in my comments #145 and #313. In a nutshell, you need Many-Worlds to have randomness, and you need quantum mechanics to get Many-Worlds. It doesn’t seem to have persuaded anyone, though.
Comment #475 January 30th, 2022 at 3:55 pm
Scott #467
I do. See #306 and #366, as well as original answer to Q1
Comment #476 January 30th, 2022 at 3:55 pm
wyatt the noob #292:
It seems like the David Deutsch worldview is at least interested in Q1 and has some opinions on it. From a recent reading of TFOR some candidate directions are 1) QM is needed to resolve time travel paradoxes 2) QM is needed to provide foundations for moral realism 3) QM is needed to provide foundations for information and specifically biology and intelligence. My guess is a better understanding of this worldview has other opinions about why QM is needed for bio, epistemology, computing to make sense
I think it could be productive to address the David Deutsch world view as a whole as a way forward.
Deutsch, to be honest, had many confidently-asserted ideas in The Fabric of Reality that made no sense to me—I thought his later The Beginning of Infinity was a much better book.
I at least understand the argument that “you need QM to resolve time-travel paradoxes,” but I think it’s flatly false. Even in QM, the fixed-point around a CTC needs to be a mixed state in general. So then why not, in the classical case, allow probability distributions as the fixed-points around CTCs—those will also necessarily exist?
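The claim that classical probabilistic fixed points necessarily exist can be checked directly: every stochastic matrix has a stationary distribution (a consequence of the Perron–Frobenius theorem). A minimal sketch, with a stochastic matrix of my own choosing standing in for the classical dynamics around the CTC:

```python
import numpy as np

# A column-stochastic map (each column sums to 1): classical dynamics around the loop.
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# The fixed point is the eigenvector with eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(S)
k = np.argmin(np.abs(vals - 1.0))      # locate the eigenvalue-1 eigenvector
p = np.real(vecs[:, k])
p = p / p.sum()                        # rescale into a probability distribution

print(p)                               # stationary distribution [2/3, 1/3]
print(S @ p)                           # applying the map leaves it unchanged
```

For this particular matrix the fixed point is [2/3, 1/3], and S @ p = p confirms it is self-consistent around the loop, with no quantum mechanics required.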
I don’t understand why QM would be needed to “provide foundations for moral realism” (!) (also, even if that were so, would Nature care?), nor do I understand why it would be needed to “provide foundations for information and specifically biology and intelligence.” If Deutsch thinks that MWI is needed for these things, then once again the question is, but why not a classical multiverse?
Comment #477 January 30th, 2022 at 3:55 pm
Scott, do you include classical field theories (e.g. Maxwell) in classical mechanics, and Quantum Field theories (QED, QCD) in Quantum Mechanics; so the main difference is between classical vs. quantum, mechanics or not? Apologies if this has already been covered.
Comment #478 January 30th, 2022 at 3:57 pm
philip #294:
A purely classical universe wouldn’t take any time to go from start to end
Of course it would! If people are worried about the “block universe” of classical physics, then I don’t understand why they aren’t equally worried about the “block multiverse” of MWI…
Comment #479 January 30th, 2022 at 4:05 pm
B R #308:
your theory of nature should not have an unnecessary ugly dichotomy between waves and particles. Instead it should put them both on the same footing. Good luck doing that with any classical theory! So this is where I would depart to deduce the apparently cherished inevitability of quantum mechanics…
That just pushes the question back to: why do we need waves or particles? We need information, but why does it have to propagate in either of those ways, let alone in both of them?
Furthermore, “wave/particle duality” strikes me as an old-fashioned way of talking. From a modern QFT standpoint, waves and particles are simply different ways to describe phenomena that can arise on quantum fields, the truly fundamental entities.
Comment #480 January 30th, 2022 at 4:10 pm
Ibrahim #311:
Are we as humans going to accept that reality is generated
and generated through interactions of a few things and so only exists for that interaction, only for its scope and that is it and nothing more?
Related to that, Are we going to realize that when we define a “state vector”, we think that we deserve to look from the “god’s perspective” with all the variables placed nicely together and there can be, to our convenience, *one representation to represent the whole.
…
We are still trying to hold on to the chair that we sit: the tools, notations and experiences. With them we got so far. We need to go to the very bottom and go up from there, with a notation of “not set”, “free to have such liberties”, “are not to be put together in one representation”. That means a new journey with extreme humbleness.
Alright then, will you use your extreme humbleness to enlighten the rest of us as to the true nature of what’s going on? (And was that humble enough? 😀 )
Comment #481 January 30th, 2022 at 4:20 pm
Hey Scott!
Yes, this is THE most important question within science. Looking forward to your book! If you are going to write a review, of course you know, I think you should include John Wheeler’s take. Many see “it from bit” as a Rorschach test, connecting anything vaguely physics-related to anything vaguely information-related.
For Wheeler, the answer was simple: self-reference. The universe is self-generated, as you can see in his famous “U-diagram” where the eye looks back at itself (optional: see Ed Witten’s interview about this idea in Quanta Magazine). In other words, the universe can’t exist in relation to anything else, so it has to exist in relation to itself. Wheeler also liked to say that you should find the answer first and then do the calculation. He did not have a calculation, but over the span of 3+ decades he obsessed over the possibility that there was some kind of relation to the self-referential undecidability proofs of Gödel, Turing and others. There is no need to modify QM; no hidden variables, etc.
Let me answer your Q2’s first.
QM has two “processes,” as von Neumann called them, and this is the framework for ALL of QM.
Process 1: Non-unitary collapse. Non-deterministic and measurable. (“Particle” behavior)
Process 2: Unitary evolution. Deterministic and not directly measurable. (“Wave” behavior)
Your summary in QCSD of QM as probabilities with negative values very effectively applies this wave-particle duality to quantum computing theorems. The interference from negative values is wave-like and the probabilities are particle-like.
:: Born Rule ::
Let’s look at a simplified real-valued Born rule, then extend it to complex values.
Wave-like sinusoidal behavior is the same as rotational behavior.
x^2 + y^2 =1 describes rotational/sinusoidal behavior (wave-like)
Pr_x + Pr_y = 1 describes the probability Pr whether this or that state will be observed (particle-like)
Thus x^2 + y^2 = Pr_x + Pr_y encodes wave-particle duality. The first terms of the right and left side correspond to each other, while the second terms of the right and left side correspond to each other. x^2=Pr_x ….. The Born Rule! (for reals) It is an easy exercise for the reader to see how this extends to multiplying a complex number by its conjugate.
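The commenter’s real-valued identity extends to complex amplitudes exactly as claimed: the probability is the amplitude times its complex conjugate. A tiny numerical sketch (my illustration, not from the comment; the angles theta and phi are arbitrary):

```python
import cmath
import math

# A toy one-qubit state: two amplitudes a, b with |a|^2 + |b|^2 = 1.
# a is genuinely complex (a phase phi), b is real.
theta, phi = 0.7, 1.3
a = math.cos(theta) * cmath.exp(1j * phi)
b = math.sin(theta)

# Born rule: probability = amplitude times its complex conjugate.
pr_a = (a * a.conjugate()).real
pr_b = (b * b.conjugate()).real

# The "wave-like" normalization x^2 + y^2 = 1 becomes the
# "particle-like" Pr_a + Pr_b = 1, phase and all.
assert abs(pr_a + pr_b - 1.0) < 1e-12
```

Note that the phase `phi` drops out of `pr_a` entirely, which is exactly why complex amplitudes can interfere while probabilities stay real and nonnegative.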
:: Complex Numbers::
When you call imaginary numbers by Gauss’s original name for them, “lateral” numbers, it’s easy to see why complex numbers are called for in QM. Complex numbers (e.g. multiplying by i) describe rotations – i.e. wave behavior. So a Hilbert space described by C^N is a necessary framework for wave-particle duality. The complex number describes the rotation (wave) and the dimension N describes the possible states in which a system can be observed (particle).
::Unitarity::
Same thing here. You can describe a unitary matrix as e^iH where H is a Hermitian matrix. (sorry about the notation)
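The claim that U = e^{iH} is unitary for any Hermitian H can be checked numerically. A self-contained sketch using only the standard library (the 2×2 matrix H below is an arbitrary example I made up):

```python
# 2x2 complex-matrix helpers, as plain lists of lists.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    # Conjugate transpose.
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def expm(A, terms=40):
    # Matrix exponential via its Taylor series (fine for small matrices).
    result = [[1 + 0j, 0j], [0j, 1 + 0j]]  # identity
    term = [[1 + 0j, 0j], [0j, 1 + 0j]]
    for n in range(1, terms):
        term = matmul(term, A)
        term = [[term[i][j] / n for j in range(2)] for i in range(2)]
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

# An arbitrary Hermitian matrix: H equals its own conjugate transpose.
H = [[0.5 + 0j, 0.3 - 0.2j],
     [0.3 + 0.2j, -0.1 + 0j]]
U = expm([[1j * H[i][j] for j in range(2)] for i in range(2)])

# Unitarity: U times its conjugate transpose is the identity.
UUd = matmul(U, dagger(U))
assert all(abs(UUd[i][j] - (1 if i == j else 0)) < 1e-9
           for i in range(2) for j in range(2))
```

The deterministic, norm-preserving rotation generated by H is the commenter’s “Process 2” in miniature.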
Every other aspect of the quantum formalism can fit into this particle-wave duality framework, i.e. the duality of the non-unitary Process 1 and the unitary Process 2.
Okay, none of this is conceptually new, you can figure all this out just to establish that QM is what we thought it was, I mean it is on Bohr’s coat of arms. So what does von Neumann’s Process 1 and Process 2 have to do with undecidability proofs? (I wonder why von Neumann didn’t catch this himself.)
Various proofs that rely on self-reference, such as applying a Universal Turing Machine to itself, or a Zeno Machine to itself, or even the self-application of Quine’s protosyntax (Quine 1940, Smullyan 1957), have some kind of epistemological limit as a result. All these proofs use quotation, e.g. Godel numbering. Quine, among others, has written extensively about this. The key idea is that the epistemological limits of pure math provide us with the epistemological limits of the physical world. In all these proofs, you have two cases, which I will call Case 1 and Case 2:
Case 1: Provable/writable but not consistent
Case 2: Consistent but not provable/writable.
Godel, Turing and others have emphasized that “provability” (by effective method/procedure) means proof by physical, mechanical means e.g. ink on paper, chalk on a chalkboard, stylus on clay.
Then Case 1 provides a mathematical framework for the physics of Process 1 and Case 2 provides a framework for the physics of Process 2.
Case 1 ↔ Process 1
Case 2 ↔ Process 2
For example if you prepare an electron in spin z+ and measure the x-component of spin, sometimes you will get x+ and other times you will get x-. Inconsistency! This is NOT to say that quantum mechanics is an “inconsistent” theory. It is the most successful theory we have. What this means is that when you prepare a system with the same exact initial conditions, you will not always get the same final conditions, and they can fall into a nice probability distribution. That is Case 1/Process 1.
In Case 2 / Process2, you have a consistent evolution of the Schrodinger equation. It evolves exactly the same each and every time, given the same initial conditions, but you cannot, in principle, measure a superposition (if you could, we would just rename it an eigenstate!).
You can extend this correspondence between logic/CS and physics, as Wheeler envisioned, to explain really all of QM. For example, an Oracle corresponds to the first particle you decide to measure in an entangled pair, and the UTM that is now decidable, corresponds to the second particle observed in an entangled pair.
So what can this self-referential QM do that standard QM cannot do? Many have called for a quantum theory of gravity in which there is no space or time. Build a theory of quantum gravity on information. For example, this simple result shows how undecidability pops out when you try to combine QM and GR.
It is known that there is no generalized algorithm for the homeomorphy problem for compact 4-manifolds; the problem is undecidable (Markov 1958). Therefore there is no generalized algorithm for the homeomorphy problem for causal diamonds at the Planck scale (which is allowed by the topological censorship theorem).
It is easy to disprove this result; just show that spacetime at the Planck scale has a trivial topology, or at least restricts some topologies, as is the case with 3-manifolds. But this approach to quantum gravity – generalizing QM a la Wheeler and leaving GR alone – has its benefits. For example, we have a description of an observer within a system, which is necessary for a satisfactory theory of quantum cosmology.
Lastly, several people have tried to define living systems, e.g. Autopoiesis, (M,R) system, as self-referential and even undecidable systems. In the field of cybernetics, several researchers believe there is no generalized algorithm to describe living systems. And physicists have struggled to define an “observer” with a solid consensus. Wheeler’s self-referential observer gives a framework for living systems that does not exist with the familiar stick-and-ball chemistry descriptions of living systems.
You can read Wheeler’s essays and see that his insight is pretty much on point with all these mathematical descriptions. His essays are a good place to start to explore this which you can read on jawarchive.wordpress.com.
Why was the universe the way it is? Because it can’t be any other way! It has to exist in relation to something… itself!
Comment #482 January 30th, 2022 at 4:20 pm
In my previous comment about the constraints that the existence of QM puts on a universe with Einsteinian gravity, I mentioned that in such a world the averaged energy conditions are expected to be valid, so that world is probably causally stable, but I omitted some other obvious related implications:
For example, Hawking radiation needs both GR and QFT, and it is essential for the 2nd law of thermodynamics to make sense in a world with black hole or deSitter horizons. There are some well known problems of course here, namely the information loss problem, but these have to do with our ignorance ( or with our insistence that we must trust some hypothetical ideas / principles), and not with some inherent inconsistency of nature.
There are many other examples that support this idea, that QM complements in a sense GR despite the infamous apparent incompatibility of the two (resolution of singularities, cosmology, inflation,etc…)
The more I think of it, the more I’m convinced that maybe this is it.
Comment #483 January 30th, 2022 at 4:20 pm
I think the argument that classical mechanics owes the fact that it works (where it works) to quantum mechanics is flawed, since any improved physical theory must be able to reproduce the former results; otherwise it wouldn’t be adopted.
Although it is interesting that many classical notions still work under the formally only slightly changed (then-)new non-classical mechanics.
Of course, an important such notion is mass, which quantum mechanics uses, but doesn’t ultimately tell us what it is about…
But these musings don’t really add anything useful to Scott’s original question(s), and even “42” probably won’t cut it, so I’m gonna stop.
Comment #484 January 30th, 2022 at 4:21 pm
Mateus Araújo #313:
We are used to letting true randomness be simply an unanalysed primitive. We know how to deal with it mathematically (with the Kolmogorov formalism), and we know how to produce it in practice (with QRNGs), so we don’t need to know what it is. But if you are writing down the rules that make a universe tick, that doesn’t cut it; you do need a well-defined rule.
The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds. And as I’ve argued before, you do need quantum mechanics to get Many-Worlds.
Are you seriously arguing that, if the world were classical, we wouldn’t be able to make the concept of probability well-defined, but because the world is quantum we can? That’s ironic because usually people argue the opposite—i.e., they understand what probability means in this-or-that classical context, but what could it possibly mean with Everett branches? 🙂
Personally, it’s not a dealbreaker for me in either context—as long as you can explain why your probabilities are real and nonnegative and sum up to 1 I’m probably happy.
But I will insist that probability seems conceptually prior to QM. Indeed, as you know, we can derive the rules of probability from axioms about rational betting agents, none of which rely on QM in any way, and all of which would seem reasonable in any world that contained such agents at all.
So I simply don’t see what the problem would be with putting such probabilities into the fundamental laws without going through the Born rule, as is done in countless stochastic models of physical phenomena. I think there’s a genuine problem in explaining why, in our universe, probability only seems to appear in the fundamental laws via the Born rule, and one of the greatest obstacles to solving such problems is to treat their answers as obvious.
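The betting-agent derivation of probability alluded to here is the classic Dutch-book argument, which can be sketched in a few lines (a toy illustration, not anything from the thread; the prices are made up):

```python
# Dutch-book sketch: an agent prices a $1 ticket on each of three
# mutually exclusive, exhaustive outcomes. If the prices don't sum
# to 1, a bookie can lock in a sure profit against the agent.
prices = {"A": 0.5, "B": 0.4, "C": 0.3}  # sums to 1.2 > 1

def bookie_profit(prices, actual_outcome):
    # The bookie sells the agent one ticket per outcome: collects
    # all the prices up front, then pays out $1 on whichever single
    # outcome occurs. Note the profit doesn't depend on the actual
    # outcome -- that's precisely the "sure loss" for the agent.
    return sum(prices.values()) - 1.0

# The bookie's profit is the same no matter which outcome happens:
profits = [bookie_profit(prices, o) for o in prices]
assert all(abs(p - 0.2) < 1e-12 for p in profits)  # guaranteed $0.20
```

A symmetric argument (the bookie buys the tickets instead) punishes prices that sum to less than 1, forcing coherent credences to sum to exactly 1, with no quantum input anywhere.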
Comment #485 January 30th, 2022 at 4:38 pm
John van de Wetering #319:
You seem to say that you feel Q is sufficiently answered for relativity, so am I correct in taking that to mean that you would be happy with finding some set of physical principles which necessitates quantum theory? Cause you could argue that Q is *not* answered for relativity, because why would the speed of light be finite? That seems like a bit of an arbitrary choice as well, and I could easily imagine a complex life-bearing universe where causality works instantly.
You’re right that even for special relativity, the question isn’t completely answered, because we don’t know why the speed of light needed to be finite—or, for that matter, why the laws of physics had to look the same in all inertial frames, why there needed to be such a concept as “inertial frames” at all. But these all seem, in retrospect, like natural goals in designing a universe. I.e., you want to be able to pick stuff up and move it around without changing its structure, and you also want an upper limit on the speed with which you can do so, since otherwise you could get a giant mess where everything instantaneously affects everything else.
If QM could be derived as the inevitable consequence of similarly natural design goals, I’d say we ought to feel satisfied to have understood more than we had any right to.
Comment #486 January 30th, 2022 at 5:05 pm
John van de Wetering #320:
Then all the classical universes will only support a finite or countably infinite number of consciousnesses, while the quantum-like universes support an uncountably infinite number of consciousnesses. Hence, probabilistically, you will always find yourself in a quantum-like universe.
I really don’t think this works, because why couldn’t God just say “let there be a continuum of non-interacting classical universes?” Or 2^(2^ℵ0) of them or whatever? Wouldn’t that be even more anthropically favored than the Everett multiverse?
Comment #487 January 30th, 2022 at 5:06 pm
I’m not enough of an expert in quantum mechanics to have any suggestions for the Q, but I wanted to just chime in and say that I’d be very excited to read this book/essay (I hope it’s a book). I can’t think of a better person to write it.
Comment #488 January 30th, 2022 at 5:56 pm
Scott, it’s simple. God is a Unitarian Universalist. Therefore She/He made the universe obey unitary evolution.
Comment #489 January 30th, 2022 at 6:03 pm
Jair #488: Alas, your theory fails. A Unitarian Universalist God would clearly have made unitary evolution optional, depending on individual conscience. 😀
Comment #490 January 30th, 2022 at 6:33 pm
If we postulate that quantum mechanics is the generalized theory of statistical mechanics, the question is what is the physical interpretation of things like ‘negative probability’? What are the analogues of classical thermodynamics? For instance, for the waves, what is actually oscillating?
If we go with Einstein and suppose that the geometry is the real fundamental thing, perhaps QM is not too different from general relativity after all, although it has to be non-classical (so non-commutative geometry?). And if a certain kind of geometry is needed for reality to be comprehensible at all, then perhaps something like QM and classical mechanics inevitably emerge from that in many different possible worlds.
I’d focus heavily on the notions of symmetry and stability, because QM is all about the linear algebra, and ‘groups’ (basic objects of abstract algebra) are all about symmetries. So somehow connect symmetries to physical stabilities?
There’s an intriguing strange sort of ‘doubling’ aspect to QM: for instance, why does an electron need to rotate twice (through 720 degrees) to return to its initial state? What’s the physical interpretation of that in terms of geometry?
Thinking along these lines, the classical space-time of general relativity could be an amalgam of two more fundamental types of geometry, one representing the structure of QM , the other the structure of classical mechanics. QM geometry would then be analogous to the “square root” of classical space-time.
Again, earlier in thread, I’ve suggested the 3 defining conditions for reality to be comprehensible at all: (1) Causality , (2) Compressible complexity, (3) Compositionality. Given this, the physical manifestation of complexity is perhaps inevitably something like the classical space-time, and that has to be decomposable into an amalgam of two more fundamental kinds of geometry, one about the *compositional structure* (Classical Mechanics!) the other about the *causal structure* (Quantum Mechanics!).
Comment #491 January 30th, 2022 at 8:07 pm
Gadi #321:
Scott, if these are the kinds of questions that interest you, don’t you think studying physics gets you closer to the answer than studying computer science and quantum computing? Studying quantum field theory, etc.?
I can also ask questions about all the non-rigorous things in quantum field theory. Is there a formulation without renormalization? If not, then with which parameters does God actually run it? Do you realize that the current formulation of quantum field theory is far from being a computer program you can just postulate that God “runs”? That its mathematical consistency has been an open problem for many decades now?
How far into physics and quantum field theory do you really understand? Don’t you think you should get a very good understanding of it (not even claiming I have it- I’m talking about understanding it like at least those physicists in CERN that actually compute things with it) if those are the kind of questions that interest you?
I completely agree that
(1) knowledge of physics in general, and QFT in particular, is potentially extremely relevant to the questions I’m asking about, and
(2) my knowledge of (non-information-theoretic) physics, even at the undergraduate level, leaves much to be desired.
A large part of the reason why I wrote this post in the first place was to elicit the arguments for QM that are known to people who know more physics than I do, and to collect the many different such arguments in one place! ~500 comments in, I think I’ve had some success at that.
If I do write the survey or book I currently imagine, I’ll have to learn more physics, and indeed one reason for the writing project would be that it would be an ideal pretext to learn more physics.
I wouldn’t dare to speculate about these matters if I hadn’t personally known so many people who, to put it mildly, do know QFT—Weinberg, Susskind, Bekenstein, Maldacena, Preskill, Eliezer Rabinovici, Daniel Harlow, Sean Carroll—and if, in my conversations with them about foundations of physics, they hadn’t usually been perfectly content to approximate physics as a collection of qubits being acted on by a quantum circuit, and then discuss whatever information-theoretic or complexity-theoretic question they wanted to discuss in that language.
I believe they felt at liberty to do this for a few reasons:
(1) The finiteness of the Bekenstein-Hawking entropy, which suggests that there really is a discrete collection of qubits at the Planck scale, even if we don’t yet know how it’s realized,
(2) The modern, Wilsonian perspective on QFT, which suggests that whatever is going on at the Planck scale (strings, qubits, etc.), we’d perceive a QFT at the scales accessible to us, with the infamous renormalization problems of QFT probably reflecting nothing more than our ignorance of what’s happening at the shortest distances,
(3) The fact that a near-century of progress in QFT and quantum gravity has left the basic principles of QM itself not only 100% in place, but even a rich topic experiencing a renaissance of new discoveries (e.g., in quantum computing and information),
(4) The quantum version of the Extended Church-Turing Thesis, which suggests that even QFTs and quantum gravity theories can likely be simulated by standard quantum computers (i.e., in BQP) with at most polynomial slowdown, and vice versa.
Comment #492 January 30th, 2022 at 8:11 pm
JH #323:
This is the change in quantum mechanics. There are only individual instants of time and there’s no causal connection between these instants, and no causal link between individual particles, making them interchangeable.
Certainly we recover a robust causality in QM, and connections between different instants, in the presence of decoherence.
It’s true that indistinguishable particles—ones that we can know to be indistinguishable (!)—were one major new development of QM. But that just pushes the question back a step: why should the existence of knowably indistinguishable particles have been such an important design requirement for our universe?
Comment #493 January 30th, 2022 at 8:14 pm
Steven Evans #330:
Q: Why should the universe have been quantum-mechanical?
A: The answer is 0.1134 … When you type 0.1134 in a calculator and turn it upside down it reads “hello”. That’s all we are – a pattern on a calculator that says “hello”. Or asks “Why should the universe have been quantum-mechanical?”
WHOA. I probably need another bong hit to appreciate that insight… 😀
Comment #494 January 30th, 2022 at 8:30 pm
Clark Van Oyen #344:
The question you are asking is “why does God play dice?” And perhaps: why do those dice have a specific number of sides?
More like, why can the different ways of rolling the dice to get the same result interfere destructively and cancel each other out? Why should they be those kind of dice? 🙂
I respect that as a QM expert you have the benefit of context for framing this question. I am wondering if this question will be similar to: “which interpretation of quantum mechanics is correct (Copenhagen or many-worlds)?” Do you feel this is the former or latter type of question? May it forever sit outside of experimental verification? Why?
The relationship, if any, between the “why QM?” question and the “how should we interpret QM?” question is actually an excellent question in itself. Clearly the two questions are connected, in that certain answers to one would naturally suggest answers to the other, and vice versa. Equally clearly, the two questions are not the same; logically it seems either could be answered without shedding any light on the other.
Between the two, I’m actually more optimistic about our ability to make progress on the “why QM?” question, because there’s an obvious path forward: namely, study the evolution of complex structures, chemistry, life, etc. in a wide variety of (simulated) classical universes, and see if there are things that consistently go wrong. Whereas with interpretation of QM, it’s not just that we’re at an impasse, it’s that it’s far from obvious what research directions have any hope of resolving the impasse.
Unless, of course, future research were to reveal that QM is only an approximation to something deeper, in which case we’d be back to the drawing board on both the “why QM?” question and the QM interpretation question!
Comment #495 January 30th, 2022 at 8:36 pm
Liam #345:
Quantum mechanics lets you discretize the state space without discretizing space. In particular, it lets you simultaneously preserve continuous spatial symmetries and the third law of thermodynamics (entropy at zero temperature is a finite constant) in a system with particles.
So for instance assume you want to have something like particles, and you also want rotational invariance (you’ve said you are satisfied with Einstein’s justification of Lorentz invariance so I assume you’re happy with taking continuous rotations as a given). Then if your ground state of hydrogen (or whatever your basic atomic building blocks are in your fancy new universe) is rotationally invariant, but you also have a definite position for the electron (or whatever), then you can generate an infinite degeneracy of states by rotating this state. So entropy is infinite. On the other hand, if you want your low energy states to have finite entropy, you need to somehow have states where continuous rotations acting on them generate only a finite number of states, in other words they have to be finite dimensional representations of SO(3). So they have to be spherical harmonics, i.e. the stable bound states basically have to be waves. But when you isolate and manipulate (i.e. measure) their constituents, they look like localizable particles?
You make a very interesting argument—certainly one of the better ones on this thread—but of course it leaves many possibilities unaccounted for. What if we abandon point particles and have little hard spheres? What if we make entropy finite by simply saying that all measurements of continuous parameters are subject to fundamental noise—not for true quantum-mechanical reasons, but like in the popularized misunderstanding of the uncertainty principle? What if rotational invariance only has to emerge at macroscopic scales, while at the Planck scale we can have a random cloud of discrete points?
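For concreteness, Liam’s degeneracy count can be written out in two lines (a paraphrase of his argument; the symbols x_0 and ℓ are introduced here purely for illustration):

```latex
% Classical: a rotationally invariant bound state with a definite
% electron position x0 generates a whole sphere of distinct states.
\{\,R\,x_0 : R \in SO(3)\,\} \cong S^2
  \;\Longrightarrow\; \Omega = \infty
  \;\Longrightarrow\; S = k \ln \Omega = \infty .
% Quantum: low-energy states fall into finite-dimensional irreps of
% SO(3) (spherical harmonics), so the degeneracy stays finite.
\dim V_\ell = 2\ell + 1
  \;\Longrightarrow\; S \le k \ln (2\ell + 1) < \infty .
```

The force of the argument is that only the second option reconciles exact continuous rotational symmetry with the third law of thermodynamics.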
Comment #496 January 30th, 2022 at 8:40 pm
I #347:
As to your whole research agenda, would it be fair to phrase it as: “give an argument convincing a smart person—in a world which feels to them intuitively like ours feels to us—that they are living in a quantum world. Further, this argument should be as natural as Einstein’s argument for SR.”
Yeah, that seems fair.
In which case, isn’t that exactly what the GPT subset of quantum foundations was made to do?
I’m not sure. If it is, then what’s the best argument for the inevitability of QM that the GPT (Generalized Probabilistic Theories) research program has managed to come up with? Is it Hardy’s or Chiribella et al.’s? In general, I could easily imagine the GPT program telling us a lot about Q2, but it’s harder to imagine it answering Q1.
Comment #497 January 30th, 2022 at 8:50 pm
Tiberiu M #349:
Here it comes down to the number of intelligent beings living in each one. Due to the infinite superpositions of a QM universe (think of the infinite branching in the Everett interpretation), a QM universe is infinitely bigger than a classical one. Therefore, it is inhabited by infinitely many more consciousnesses. Therefore, you are infinitely more likely to find yourself in a QM universe than in a classical one.
There could be a classical universe out there hosting intelligent life, we just don’t happen to live in it. The same idea can also explain why the universe is (probably) infinite in space.
See my comment #486 for why that doesn’t work.
Comment #498 January 30th, 2022 at 9:13 pm
When all you have is a hammer, everything looks like a nail…
Obviously Scott is a very talented specialist, but I wonder if he’s not fooling himself by turning his hammer on philosophy and expecting this effort to resolve a mid-life crisis. Why fuss over QM when our institutions have already developed compelling answers over thousands of years to a more fundamental question: “Why do things exist at all?”
Unfortunately many modern scientists are either unaware or willfully blind to the fact that such answers already exist. St. Thomas Aquinas tells us that it’s possible to know that God exists with natural reason alone. This has been the official stance of the Catholic Church since at least the first Vatican Council.
http://www.scborromeo.org/ccc/p1s1c1.htm
Section III, line 36
Comment #499 January 30th, 2022 at 9:22 pm
Tom #498: If Catholic theology can answer the question of why anything exists, then can it also answer the far easier-seeming question of why that which exists obeys quantum mechanics as far as anyone can tell? If so, then would you be kind enough to enlighten me as to the answer?
Comment #500 January 30th, 2022 at 9:27 pm
JakeP #352:
What if you assume that BQP is simply in P? Once we understand the algorithm that makes this possible, the “mysteries” asked about here will make sense to us. The Born rule, complex amplitudes, etc., will just fall out naturally from how this algorithm is structured.
That would be nice! 🙂 I emphatically do not expect BQP=P, but if and when such a collapse were ever shown, it would indeed be worth carefully examining whether the proof shed any new light on the foundations of QM.
If satisfactory answers to Q1/Q2 have eluded us for so long, perhaps it is slight evidence that there IS in fact an efficient algorithm for simulating a quantum circuit after all?
Alas, heuristics of the form “I bet X is true, because if it weren’t then this other thing Y would seem too mysterious for me to understand,” have a pretty abysmal track record. 🙂
Comment #501 January 30th, 2022 at 9:30 pm
Scott #499: That seems to be a rather more difficult question since it presupposes there is some reason why QM needs to exist. (And we don’t know this reason) The question of why anything exists however presupposes nothing since existence is self-evident.
Comment #502 January 30th, 2022 at 9:33 pm
Etienne #354:
if I had to speculate on God’s desiderata when designing the universe I would start with
1. Discrete state space and discrete time,
2. Some form of extreme action principle obeying some form of Noether’s theorem.
That’s an interesting proposal! I’ve indeed often wondered about the role of Noether’s theorem in the “design goals” for our universe—given that, if you made up some random classical CA, Noether’s theorem would almost certainly not be relevant to it. Indeed, even if your CA were subject to continuous symmetries, you still wouldn’t automatically get associated conserved quantities, unless your CA also happened to satisfy an action principle / Euler-Lagrange equation (as you say).
But … why is this so important? Is it just a mathematically convenient property of our laws, or is it something that actually plays an important role in enabling complex chemistry, life, and intelligence?
Comment #503 January 30th, 2022 at 9:40 pm
Scott P. #356:
There are two branches, after all. What does it mean to have one branch be more probable than another?
I’d say that it simply means: if someone asks you to bet on which branch you’ll find yourself in before the branching happens, then you should accept all and only those bets that would make sense if the probabilities were indeed 1/3 and 2/3, or whatever else the Born rule says they are.
Comment #504 January 31st, 2022 at 12:08 am
Scott #494
– More like, why can the different ways of rolling the dice to get the same result interfere destructively and cancel each other out? Why should they be those kind of dice? –
Because otherwise there would be no forbidden or assured transitions, i.e. indeterminism would imply unpredictability
Comment #505 January 31st, 2022 at 12:11 am
Gil #454 and #456:
1. My error:
I had realized by late last evening (a day after posting my reply #416) that I had committed an error, a glossing over of the distinction between non-determinism, and deterministic chaos. In particular, I said (in #416):
> “But the 3-body system already is non-deterministic — even if based on the same, deterministic law.”
I should have said:
> “But the 3-body system already is chaos-theoretical — even if based on the same, deterministic law.”
Similar corrections should apply to my other statements too… Indeed, on second thoughts, my following statement (in #416) also is mistaken (or at least too hurriedly written):
> “But a system which is composed of elements each of which obeys that same deterministic law, may not itself be deterministic.”
Well, to the best of my knowledge, if the equation governing the elements of a system is deterministic, and if what you are dealing with is a system — and not a “random composition” of those elements (and please don’t ask me what that means!) — then the behaviour may be chaos-theoretical, but it will still be deterministic. That’s the position I should have stuck to.
I realized this error yesterday, and was wondering if I should add yet another self-referential comment to this thread. But finding that you too were thinking of chaos and probability, I decided that the admission of the error was indeed due on a much more immediate basis.
—
2. Randomness and deterministic chaos:
Another point, in reference to what you said:
On the issue of randomness vs. deterministic chaos, one of the most helpful resources I’ve ever run into is this paper by Geoff Boeing: [ https://www.mdpi.com/2079-8954/4/4/37 ] (open access).
Refer to fig. 10 in it. The chaos-theoretical description deterministically selects a subset of the mathematically random points. So it is deterministic. … I knew this, but still ended up committing the above error. … However, note, as the author says:
> “Strange attractors stretch and fold state space in higher dimensions, allowing their fractal forms to fill space without ever producing the same value twice.”
Somehow, that last part (“without ever…”) had a way of interfering with my more rigorous knowledge, once I panned out a bit (while writing #416).
Obviously, a part of me wants to describe the chaos-theoretical situation as non-deterministic. After all, you provably don’t get the same value ever again, do you? … Obviously, this part overtakes my thinking when I am getting a bit too “philosophical”. I need to guard against the tendency.
OK, now let me come to your questions…
—
3. Thoughts on Gil’s questions:
> “1) What is the origin/meaning of probability in nature and why is there probability?”
My philosophic conviction is that there is no probability “all the way down”. The most fundamental physical laws, at the most foundational layer, must therefore be deterministic. But the “Law vs Systems” distinction applies. (I touched on this point a bit in my comment #438 above). This consideration in fact introduces unexpected consequences; it leads to some issues which we don’t know how to handle right. Off hand, I can think of two categories of such issues:
(i) the softer category of issues: The deterministic chaos. Its features share some of the characteristics which we would otherwise ascribe to non-deterministic systems. But the behaviour can be essentialized as being deterministic too, in a way.
(ii) the plain hard category of issues: The theory-breakdown points like: singularities, or pathologies like Norton’s dome, etc.
Now, note that even the apparatus of the Probability Theory fails (i.e., it has nothing to add) for the scenarios in the second category. (In fact, as usual, probabilistic analyses must start by subtracting some information which would otherwise be available!)
Further, when the number of DoF’s is very large, the chaos-theoretical description is practically indistinguishable from that based on a mathematically defined ideal randomness.
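As a minimal illustration of this point (the seed and the map below are arbitrary choices of mine, nothing more): the logistic map at r = 4 is fully deterministic, yet by a simple frequency count its coarse-grained bits look just like coin flips.

```python
# Deterministic chaos vs. randomness: the logistic map x -> 4x(1-x).
# Every trajectory is completely fixed by x0, yet the sign bits
# (x > 0.5 or not) pass a crude frequency test like fair coin flips.

def logistic_bits(x0, n):
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.123456789, 100_000)
freq = sum(bits) / len(bits)
print(f"fraction of 1s: {freq:.3f}")   # close to 0.5, like a fair coin
# Determinism: the same seed always reproduces the same sequence.
print(logistic_bits(0.123456789, 10) == logistic_bits(0.123456789, 10))   # True
```

Of course a frequency count is only the crudest test, but it makes the “practically indistinguishable” point concrete.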
So, perhaps, it is more pertinent to ask the mathematician: “Where do you get your idea of the perfect randomness from? Please identify.”
But the most basic point against the physical existence of probabilities is that the idea violates the law of identity.
All in all: (i) it’s a dumb idea; (ii) the emotional appeal of needing something like that can easily be dealt with if you can devise a satisfactory mechanism at a sufficiently basic level, say a chaos-theoretical one; and (iii) probability-based theories never add to knowledge anyway; they begin by subtracting information that would otherwise be available.
> “2) What is the origin/meaning of chaos and why is there chaos?”
We discussed that.
> “3) Why is there something rather than complete chaos?”
The alternative to “something” is not “chaos”; it is the existential nothing, the nought, the void, the “shunya”, the zilch, … .
Chaos (even in its non-scientific usage) evokes the image of something that twists and turns and morphs in every way unimaginable and unpredictable; something that can gulp or spit out any definite thing once in a while but you can’t tell which thing, when, where, how, or why. That’s the non-scientific chaos for you.
Now, hold on to that image. And observe that it has some definite things in it too, not just chaos. If it were nothing but only chaos, we couldn’t even grasp any thing about it. Even an image like this would be impossible. … Further, notice that even that chaotic thing has already been supposed to exist. So, in that sense, it is (at least posited) to be a something — at least in that imagination. It’s just that its behaviour — its actions — are completely unpredictable — to us.
So the issue you want to raise could be better framed as the following: Why do things behave lawfully rather than chaotically? (More philosophically: Why does the law of identity hold?)
I already gave a hint to the answer, so I won’t discuss it.
—
4. About QM:
You also said:
“I would speculate that quantum mechanics is a framework (perhaps “minimal” or even “unique” in some sense) that allows answers to, or at least “better understanding of”, these questions 1) 2) 3).”
Nope. It is true that the mainstream QM is only a framework, and not a complete description. But the mainstream QM theory is linear. So, it actually does not allow seeking answers to questions like the above.
To seek answers to questions like chaos, all that you need is an ontology like Newtonian mechanics, say the technique of Molecular Dynamics. The mainstream QM in principle falls short.
—
BTW, my following blog posts may be of interest:
“Fundamental Chaos; Stable World”, August 2019. [ https://ajitjadhav.wordpress.com/2019/08/28/fundamental-chaos-stable-world/ ]
“Determinism, Indeterminism, Probability, and the nature of the laws of physics—a second take…”, May 2019 [ https://ajitjadhav.wordpress.com/2019/05/01/determinism-indeterminism-probability-and-the-nature-of-the-laws-of-physics-a-second-take/ ]
Best,
–Ajit
Comment #506 January 31st, 2022 at 12:22 am
For an alternative perspective to the “classical life” presumption…let’s do a 180 and locate ourselves inside the wave function itself and explore the idea that certain quantum mechanical processes play a central role in our everyday choices and preferences. Before I elaborate on this proposal, if you’ll have it, I’d like to do a quick Gedanken experiment:
Please choose your favorite ice cream, picking from these three: vanilla, chocolate, or strawberry.
Now to elaborate, consider the possibility that you just conducted a quantum experiment on yourself, as follows: what is objectively described as measurement and quantum probabilities in a QM-theoretic framework forms a *duality* with choice and preference in our subjective experience – two sides of the same coin.
So, when we set up the Gedanken experiment we set up a basis for measurement, call it the “ice cream basis”. It has three basis states, call them |V>, |C>, and |S> for vanilla, chocolate, and strawberry respectively. If you were, say, on the fence between chocolate and vanilla, the wave function describing you *before* you chose would have been a superposition of |C> and |V> with roughly equal amplitudes for each, and a negligible amplitude for |S>. On the other hand, if it was a slam dunk to choose strawberry with every other choice having zero appeal, then your wave function would not have been in a superposition and would have been purely the |S> state.
In general, if you prefer A to B, then the probability of measurement outcome A is greater than B. Each time you make a choice, you are measuring some aspect of your mind.
After you chose, you remained in an eigenstate of the “ice cream operator” with eigenvalue of your chosen flavor: for example, consider the choice again. Did you choose the same flavor?
So, what do we get from such a proposal? We get a framework in which folks have probabilistic free will (although not Knightian freedom as described in Scott’s Ghost essay). For many though, I think this will still be satisfactory – you have freedom to choose, the only constraint upon you is to choose in line with your preferences which is something perhaps impossible to not do. You will be probabilistically predictable in a quantum sense, but is this really any different than knowing your good friend Alice loves chocolate and will choose |C> 95% of the time? You still don’t know what she will pick on any particular trip to the ice cream parlor. Furthermore, you get this freedom within the laws of physics and the causal closure of physics (CCP) is preserved.
Interestingly, as a bonus, there is an academic field called Quantum Cognition whose practitioners model the outcomes of psychological experiments using the framework of quantum mechanics. They are careful not to suggest quantum effects are actually occurring in the brain, probably because this still carries a stigma in scientific circles, but they argue the mathematics of QM is more successful at modeling human decision making than so-called ‘classical models’ of behavior. One example is called ‘question order bias’; the order in which subjects are asked questions can cause different outcomes even though the questions themselves are the same. In other words, ask question a then b then c, and you’ll likely get different answers than if you ask a, then c, then b. In QM, of course, the order of measurements is well understood to affect the outcome of those measurements.
Hopefully, this is compelling for its upside: we roll two otherwise unexplained subjective phenomena, choice and preference, into the laws of physics while rescuing a degree of free will by locating ourselves inside the wave function itself. Explanations of some psychological phenomena are freebies and go along for the ride.
What about the downside: is this even possible? QM in physics is typically applied to atomic-scale systems or super-cold temperatures; how could such physics be applicable in a warm, macroscopic object such as your body? Roughly 50 years ago the physicist Herbert Frohlich proposed a QM model – a particular Hamiltonian – that gives rise to a Frohlich condensate (something like a Bose-Einstein condensate). Frohlich showed that, in theory, this condensate could form even in the noisy, warm environment of a biological organism. His model did not depend on isolating the system from the surrounding environment to maintain coherence; rather, coherence was maintained by continuously pumping energy into the system (presumably the role of metabolism). Unfortunately, nearly half a century went by with no evidence of such a condensate, and many physicists soured on the idea. In 2015, however, a paper was published citing experimental evidence of just such a condensate in a biological protein (https://aca.scitation.org/doi/pdf/10.1063/1.4931825). That experiment took place in a petri dish under exposure to a THz laser, so it remains to be seen whether a Frohlich condensate can form in an actual living thing, powered by its own metabolism.
And, even supposing such a condensate did exist in the body, what could we be physically measuring when we make a choice in the “ice cream basis”? Some folks have suggested ion channels in the neurons of brains because these ion channels themselves have single ion diameters and operate on scales susceptible to quantum mechanical effects, yet, they can influence the firing of neurons and thereby have their effects amplified to macroscopic scales. These answers won’t come without much more research, even presuming quantum-life is true, but hopefully folks find this little Gedanken experiment thought provoking in an alternative way to traditional thinking on these topics.
Comment #507 January 31st, 2022 at 12:31 am
Scott #493 (January 30th, 2022 at 8:14 pm):
That took several minutes’ hard thought, mentally juggling and rearranging the 2 facts about QM I know ;). But if we forget about finite precision and collapsing/splitting wave functions, we are simply assuming we have a quantum computer. Why does a quantum computer make quantum computations? Because it is a quantum computer. Your assumptions turn the universal quantum computer into the perfect, fundamental being, and we don’t get to ask why it exists. Our only choice would be to make your assumptions scripture and to sing the praises of the quantum computer.
We know how we get chemistry and biology when we pile the computations high, and we know only of this one possible instantiation of physicality, so any anthropic-like considerations become dull tautologies. People should visit Legoland – it’s amazing what you can make out of lego, too.
Surely, for any scientific revolution to be launched before your ankles completely give way, the clue is where we definitely don’t have a clear description – the collapse/split of the wave function?
Comment #508 January 31st, 2022 at 12:33 am
Scott #421: “[…] I’ll be more-or-less hobbling from here till the end. Well, less exercise, so in expectation probably less time left […]”
Please don’t jump to such pessimism! Get good care, take good care, and you have reason to be hopeful. Plus I and a lot of folks are sending you supportive waves (both real and complex 😉)
Comment #509 January 31st, 2022 at 12:35 am
I’m a total amateur at QM. My amateur thought follows.
Suppose (1) we want to create a simulation of a universe as we observe it (so it’s mostly classical), (2) we want the hardware we run the simulation on to use the same principles as the universe we simulate (for simplicity), and (3) we want the simulation to be fairly fast: i.e., we don’t want to take a really long time to simulate one time step.
Requirement (1) means we don’t have to propagate a full many-worlds-interpretation-like wave function; rather, at each time step, we randomly (according to the Born rule) select one subspace from a lot of orthogonal ones, keep simulating it, and erase the rest, freeing up most resources at every time step.
A solution might be a quantum computer, since its full state space size is exponentially larger than its classical state space size (caveats in a moment). That means we need to make it only a little larger than a typical natural quantum system to be able to also cover all the classical DOFs in the universe in a single MWI-like world.
Caveats: I’m not sure that the quantum state space size is a good proxy for simulation power. Here I’m assuming it is. Also I have no idea if any of the details used here, such as pruning orthogonal subspaces, can be implemented on a QC without destroying the simulation.
To give some numbers to this, take something like 10^80 atoms in the observable universe and multiply by some rough number to get at the classical number of DOF: something like M = 2^300 classical DOF.
Then suppose the maximum natural quantum system’s state space in our universe is equivalent to approximately N = 10^4 qubits. By “natural” I mean the state space size needed to carry out chemistry.
So then there are roughly M/N separate quantum systems to simulate, plus roughly M classical DOF to simulate. We can do that on a quantum computer having log2((M/N)·2^N + M) qubits, or roughly log2(M) + N. Using the numbers above, that is 300 + 10^4.
Thus, a quantum computer that can simulate a complicated chemical quantum system needs to be only a little larger (~3% larger) to simulate one MWI-like world of the observable universe. In the unlikely event that I’ve made it this far without committing one or more egregious errors, this would make a quantum computer very natural hardware for a universe, satisfying all three of my design requirements.
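A quick script makes the arithmetic concrete (the figures M = 2^300 and N = 10^4 are, again, my rough guesses, nothing more); with them, the overhead over a single “natural” system comes out to roughly 3%:

```python
import math

# Back-of-the-envelope check of the numbers above (all figures are
# rough assumptions, not established values).
log2_M = 300        # classical DOF of the observable universe: M = 2^300
N = 10_000          # qubits for the largest "natural" quantum system

# Qubits to hold M/N quantum systems of 2^N states plus M classical DOF:
# log2((M/N) * 2^N + M) ~= log2(M) - log2(N) + N   (the 2^N term dominates)
qubits_needed = log2_M - math.log2(N) + N
overhead = (qubits_needed - N) / N
print(f"qubits needed: ~{qubits_needed:.0f}")
print(f"overhead over one natural system: ~{overhead:.1%}")
```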
A universe that runs this way would have a maximum in-universe QC size before it would exhaust the underlying hardware, so I can recklessly double down on this line of thought and predict (in doing so, I’m pretty sure I’m ignoring a ton of sensible things Scott says on this blog, but such is the right of the amateur) we might hit an upper limit to QC size in our universe.
Comment #510 January 31st, 2022 at 12:37 am
Scott #467:
But my position has been that I don’t, wouldn’t want to, and in fact cannot, know what specific physics theory to use in order to create a world like ours. And, also that none can.
As to the business of just creating any which “world”: Please consult any of the many modern physicists, computer scientists, mathematicians… and philosophers.
—
Scott #492:
Now you are doing much, much better. [See the effect of beginning to consider “our universe”?]
… Replacing “design requirement” by “feature”, I would say that I am definitely going to add this bit to the set of things I should be explaining (even if only briefly).
Best,
–Ajit
Comment #511 January 31st, 2022 at 1:29 am
It has been my first time contributing to this blog, so thanks to all for the interesting discussions, and sorry for wasting your time in case I did. As a last post, here is a tentative answer to Q1: Why didn’t God just make the universe classical and be done with it? What would’ve been wrong with that choice?
My answer: there was no such choice; the universe is both classical and quantum, depending on the way and the scale at which you look at it. Either a purely classical world or a purely quantum world would be physically inconsistent and would lead to absurd predictions; there is an abundance of examples. This may sound terribly neo-Bohrian, but I accept the heritage, modulo my claim that psi is predictively incomplete and needs a context to get a physical meaning.
Comment #512 January 31st, 2022 at 1:50 am
Hey Scott,
I’m wondering what you make of Tim Palmer’s suggestion that “quantum indeterminacy may perhaps be replaced by certain kinds of ‘hidden variable’ chaotic dynamic, provided that the chaos is sufficiently nasty.” https://royalsocietypublishing.org/doi/10.1098/rspa.1995.0145
Could this be a way out of the quantum mechanical realm back into the comfortable world of determinism? I think the idea is that instead of considering the collapse of the superposition to be random, it is determined by hidden variables that act in a manner so chaotic that we perceive it to be random, but nature can somehow compute it. Further, it may even be uncomputable to us. (I may not have worded that correctly, I’m not an expert – just interested!)
Thanks!
Comment #513 January 31st, 2022 at 4:19 am
Scott #484: I’m dead serious. People usually argue the opposite because they understand neither probability nor Many-Worlds.
You didn’t reply to my request to give a definition of objective probability. I don’t expect you to succeed, people have tried to do it for a century and failed. I just want you to be honest with yourself and realize that you can’t. This failure is so widely recognized that the consensus in philosophy is that objective probabilities do not exist, they are content to deal with subjective probabilities. Meanwhile the consensus in physics is that objective probabilities are obviously what quantum mechanics gives you, and they don’t worry about defining them.
Now subjective probability is conceptually prior to quantum mechanics, and thoroughly unproblematic. Objective probability, on the other hand, was introduced in our theories with the Born rule, and remained mysterious until the advent of Many-Worlds.
As for why is it a problem to have an undefined transition rule in your theory, that much should be clear. You might as well write that God decides to transition state A to B or C.
It’s true, people have done classical stochastic models for a long time. But you don’t need to know what probability is to do that; you just need to know how to deal with it, and how to get an RNG when you actually need to run the model. Like how people dealt with water for millennia before understanding what it was. They drank it, swam in it, piped it, froze it, evaporated it, made steam engines with it. All that without having the faintest clue about atomic theory and chemical bonds.
The reason why true randomness only appears through the Born rule is because true randomness is deterministic branching. I do think that’s obvious. If we lived in a Permutation City world where mind uploading and copying of agents was commonplace we’d also see true randomness appearing in this emergent level, but we don’t live in such a world, at least not yet.
Comment #514 January 31st, 2022 at 5:49 am
Scott #479:
> That just pushes the question back to: why do we need waves or particles? We need information, but why does it have to propagate in either of those ways, let alone in both of them?
I would like to re-emphasize that I really think that you are going to need *some* physical input for this discussion to go anywhere. You do not just need information.
(In fact, let me also point out that the physical input necessary to ‘derive’ special relativity is absolutely insufficient to conjure up a theory of everything. Similarly, I think that all the modern and possibly deep links between information theory and quantum gravity go too far down the rabbit hole, and therefore might not be the best starting point to answer your question. Sorry!)
> Furthermore, “wave/particle duality” strikes me as an old-fashioned way of talking. From a modern QFT standpoint, waves and particles are simply different ways to describe phenomena that can arise on quantum fields, the truly fundamental entities.
But you were precisely asking *why* these quantum fields were necessary, no? In other words: of course particle/wave duality is old-fashioned, but how many ways are there to arrive at a unified description? Quantum mechanics is one way. (And, as Weinberg argues, QFT is the sort-of unique combination of QM + SR.) If it is the unique way then are we not done? At least to me your question then seems to be answered if we accept the starting point that we require a unified theory of waves and particles.
Whether to investigate this possibility depends of course on how appealing you think that starting point is. Personally I would love to see if such uniqueness can be convincingly argued for. I think it would provide a wonderful supplement to any QM textbook, because the only experimental input you would need would be the photoelectric effect and the double-slit experiment for electrons.
Comment #515 January 31st, 2022 at 8:00 am
Scott, a question in MJ Geddes #490, “what is the physical interpretation of things like ‘negative probability’?” crystallizes for me that there is a fairly robust sense in which there is no such thing as a ‘negative probability’. There are, however, many examples of what I would call ‘negative and complex-valued pseudo-joint probabilities’, which have incompatible probability measures as marginals.
Gil Kalai #454 name-checked Itamar Pitowsky, who is as clear as anything I’ve seen about the relationship between joint probabilities and incompatible probabilities: by definition, the latter do not admit representation as a joint probability. One way I think of the situation is that ‘negative and complex-valued pseudo-joint probabilities’ are a consequence of trying to force incompatible probabilities into a single joint probability.
We know that incompatible relative frequencies occur often in our analysis of experimental results, and in particular they occur whenever we report a violation of Bell inequalities. So being able to model incompatible relative frequencies is a good thing (so Wigner functions are OK, yes), even though our records of experimental results are certainly “classical”, just ordinary numbers on paper or in computer memory, as at least some strands of the Copenhagen interpretation insisted.
Why QM? Given any large body of experimental data, on paper or perhaps Terabytes long, a list \([x]\), we can use arbitrary algorithms \(f_i\) to construct a list of summary values, which we call measurement results, \(M_i=f_i([x])\). Metadata about those Terabytes that derives from our knowledge of the experimental apparatus and procedure suggests that we choose specific algorithms, with a strong preference for terminating transformations, not just any algorithm at random. In particular, we distinguish different preparations, with which we associate sublists \([x]_j\), so we obtain a finite array of measurement results \(M_{ij}=f_i([x]_j)\).
Now, we can look for solutions of the linear equations \(M_{ij}=Tr[\hat\rho_j\hat O_i]\), which can always be solved for high enough dimension. Indeed, they can always be solved for high enough dimension even if we require that \(\hat\rho_j\) and \(\hat O_i\) must be diagonal. Thus, I suggest, both quantum and classical presentations of past data are always possible; the kicker, however, is that any given choice of dimension leads to different interpolated and extrapolated predictions for the measurement results of future experiments. A given choice of measurement results \(M_{ij}\) is effectively a compression of the \([x]\), with all the choices that implies, as well as the choice to construct the particular apparatus. Of course in the above there has been no mention of a Hamiltonian or Liouvillian dynamics, of space-time as a way to index our measurements, and a slew of other important aspects, but there’s this very low-level sense in which I take classical and quantum formalisms to be universally applicable. As physicists, we can’t step outside our actual records of experimental results [as people, we can also think many things that we do not write down or that are otherwise not part of those Terabytes, but of those we cannot speak in formal communication with the editors of physics journals.]
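As a sketch of the “always solvable in high enough dimension” claim (with randomly generated stand-in data, and note the resulting \(\hat\rho_j\) are not normalized density matrices): with diagonal matrices, \(Tr[\hat\rho_j\hat O_i]\) reduces to a dot product of the diagonals, so fitting the \(M_{ij}\) is just a matrix factorization, which an SVD supplies exactly.

```python
import numpy as np

# Claim: M_ij = Tr[rho_j O_i] is solvable with diagonal rho_j, O_i in high
# enough dimension. For diagonals o_i, r_j this reads M_ij = o_i . r_j,
# i.e. a factorization M = O R^T, which an SVD provides exactly.
# (The r_j here are not trace-1 positive matrices; this is only the
# linear-algebra point, not a physical state assignment.)
rng = np.random.default_rng(0)
M = rng.normal(size=(5, 7))              # arbitrary "measurement results"

U, s, Vt = np.linalg.svd(M, full_matrices=False)
O_diag = U * np.sqrt(s)                  # row i = diagonal of O_i
R_diag = Vt.T * np.sqrt(s)               # row j = diagonal of rho_j

M_fit = np.array([[np.sum(O_diag[i] * R_diag[j])   # trace of diagonal product
                   for j in range(M.shape[1])]
                  for i in range(M.shape[0])])
print(np.allclose(M, M_fit))             # True: the data is reproduced exactly
```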
What is not at all clear, however, is that such a linear model for the monolithic list \([x]\) is always the best way to analyze and present the results of experiments. I haven’t yet seen anything better than Hilbert spaces, POVMs, unitary dynamics, and all that, in such of the literature on generalized probability theory that I have seen, but people are certainly presenting alternatives (as just now on math.OA, “Dendriform algebras and noncommutative probability for pairs of faces”, https://arxiv.org/abs/2201.11747, which to me looks interesting).
I hope some of this is a little helpful. Notice that there’s no mention here of “systems” or “particles”, which to me should only be introduced with as much care as we always should take when introducing “causes” as an explanation for what in physics is typically a very complex system of correlations, at the Avogadro Number scale and beyond. I suppose we can say that Judea Pearl has shown that up to a point it can be done, but I think it’s not to be done glibly even when it might seem obvious.
Comment #516 January 31st, 2022 at 8:39 am
Clinton #450 (“Feynman, a great popularizer, clearly said that what made QM different was that it allowed negative probabilities.”):
BTW, notwithstanding that and similar remarks here and elsewhere, QT is a proper probability theory, not a so-called quasiprobability theory, and it doesn’t allow negative probabilities.
Comment #517 January 31st, 2022 at 8:41 am
Steven Evans #507:
Your assumptions turn the universal quantum computer into the perfect, fundamental being and we don’t get to ask why they exist. Our only choice would be to make your assumptions scripture and to sing the praises of the quantum computer.
You do realize, don’t you, that we’re now 515 comments deep in a thread about why the universe is quantum, why the right model of computation for our world appears to be quantum computation? That I could just shrug and accept it, as many of my brilliant colleagues do, but that this thread exists because I (unlike many of them) hold out the hope that it might eventually be possible to do better?
Comment #518 January 31st, 2022 at 8:43 am
Scott #383 & #407
Ok, here’s my attempt to say why I think symmetry really is as fundamental as physicists say it is, why you shouldn’t think about gauge theories as just a bizarre-seeming trick that works remarkably well for mysterious reasons, and why if you explained all this to a deity who was trying to invent rules for a new universe, she might find it so compelling that she dropped all her previous plans for a classical universe and started making a quantum one. I apologize in advance for the super-duper long comment, which honestly feels a bit inappropriate – in my defense, we’re already 500+ comments deep into this thread, so maybe one extra long one at this point isn’t hogging too much space?
First, you say symmetries are really just a restriction to a subspace of states in the theory, and the theory happily lives in this space. But this is not really what symmetries are. Crucially, symmetries don’t remove states from the theory; they relate what you see when you look at different states in the theory. For instance, Lorentz invariance tells you that if you know the energy of a particle at rest, then you know its energy at any velocity. This point is particularly clear for symmetries like boosts, which don’t commute with the Hamiltonian and so are not associated with a conserved charge, so there is no charge sector to restrict to. But even when they do commute with the Hamiltonian, so there is a conserved charge, symmetries are much more interesting than a restriction to a charge sector. In particular, we’re often interested in theories with multiple phases, one where the symmetry is preserved by the ground state and another where it is spontaneously broken. In either case, the symmetry commutes with the Hamiltonian. In the broken phase, the symmetry tells you things like the degeneracy of the vacuum, and it relates the different degenerate vacua to each other. Even better, the symmetry doesn’t need to be exact, but can be explicitly broken, and you can still systematically include the effects of the explicit breaking in the predictions of the symmetry as long as the breaking is small. This is what is at the heart of the chiral Lagrangian, which describes things like pion scattering and meson masses. When you quotient by a symmetry, by contrast, there cannot be any symmetry breaking, no matter how small. Anyway, you get the point – symmetries are not just a fictitious device that lives in our heads; they are a physical action that you take on the states of the theory.
I assume what you had in mind was that if you have a conserved charge, then you can project onto states with that charge and just get rid of all the other states. But even here, symmetries are usually more than that! We are often interested in continuous symmetries, and these are generally associated not just with a conserved charge but instead with something much more powerful, namely a conserved current and charge density, which satisfies a continuity equation. That means that you can measure how much charge is in any local region of space. Even if the total charge of the space is zero, you can dump some positive charge into some finite region (at the cost of putting some negative charge somewhere else), so in the finite region you are effectively seeing the predictions of the symmetry on charged states. Moreover, charge can’t just disappear from this region and pop up somewhere else – because of the continuity equation, the charge can only leave a region by passing through its boundary. So if you know the charge in some region at some initial time, and then you sit and monitor all the charge passing across the boundary, you also know the amount of charge at some later time. Again, the implications are much stronger than what you would get by just restricting to states with a fixed global charge. Even better, if your current is coupled to a gauge field, then you can measure the amount of charge in some region without actually looking inside that region, by using Gauss’ law.
This brings us to the role of gauge fields and gauge symmetry. Ironically, although they might seem deeper, the modern understanding is that it is actually the gauge symmetries that are just a quotient of the space, and have more to do with how we as humans describe them than what the theory is doing at a deep level. Most physicists prefer to use the term “gauge redundancy” rather than “gauge symmetry” for this reason. The actual invariant meaning of a gauge symmetry has more to do with the operators that make sense in the theory – things like Wilson lines, in particular. Often, gauge theories have equivalent descriptions where there are no gauge fields, and you don’t ever refer to gauge transformations at all, but the Wilson line operators are still there in some form. The descriptions that invoke a gauge redundancy are more about our desire as humans to write things in terms of local Lorentz-covariant fields, in the limit where the gauge theory has weakly coupled spin-1 massless particles. In fact, you can show that if you have spin-1 massless particles, and you want to write a Lorentz vector field for them, then on general grounds you have to identify field configurations that are related by gauge transformations – Weinberg proves this in section 5.9 of his QFT textbook. The theory itself doesn’t care about whether or not you use this description. So yeah, it’s a hack, it’s amazing that the people who invented gauge theories came up with it, but we understand very well now why one has to make *exactly this hack* and not any other one. On the other hand, what the actual underlying physics *does* care about is that the spin-1 particles are coupled to conserved currents! This follows from what are known as “soft photon” theorems, and they imply that massless spin-1 particles can only couple to conserved quantities. So conserved quantities are actually the fundamental thing here, since they are what allow spin-1 massless particles to have any interactions. 
Similar statements apply to massless spin-2 particles: they have to couple to a conserved energy and momentum, and if we want to write them in terms of fields, then we have to introduce diffeomorphism invariance, which puts us on a path towards deriving General Relativity.
So, why would our deity be moved by any of this? Well, one possible response to the “can’t you just get whatever interesting complicated dynamics you want using cellular automata?” question is that, yes, you can do basically whatever you want with them; that’s the problem! The system is too unconstrained and ad hoc. If you said “can’t your deity just, moment by moment, choose what every particle does in this universe”, the answer would also be yes, but you would immediately reject this option. However, the beauty of quantum mechanics is that, together with Lorentz invariance and some assumptions about particles and locality, it *drastically* constrains you to just a few choices. At one point, something like this was one of the main dreams of string theory – that the constraints of a UV-complete theory of quantum gravity would be so stringent that string theory would fix the parameters of the Standard Model to at most a handful of choices. Whatever you think of the progress of this program, it’s clear why it was so appealing. Ironically, though, something like this is actually true of *low-energy* quantum gravity. In that case, the constraints of Lorentz invariance, a spin-2 massless particle, and unitary quantum mechanics appear to uniquely land you on General Relativity at low energies. Similarly, the constraints of a weakly coupled spin-1 massless particle land you on Yang-Mills theories. Ah, but “isn’t GR + QM an unsolved problem?” I hear someone saying. One of the surprising little secrets of quantum gravity is that, for accessible regimes of physics, we already have a perfectly good theory of quantum gravity! It’s called the Effective Field Theory of GR, and it works great. This is not just the statement that quantum effects are small in practical situations, it’s that *we know how to calculate these quantum effects very precisely* in practical situations.
It’s a strange quirk of history that this isn’t better known – partly because the people who “discovered” it (primarily, to my knowledge, Ken Wilson and Steven Weinberg) did so almost in passing, while they were trying to understand nonrenormalizable theories more generally. So one of the most appealing things about quantum mechanics is its inflexibility – up to a few parameters, it fixes the low-energy physics. In fact, almost all of the particles we see in practice are around at low energies because of symmetry in some way or other. The only exception is the Higgs boson, which has no apparent symmetry reason to be part of the low-energy theory, and this is what mystifies people so much about it.
So now you go to your deity and say: look, you can keep doing what you are planning with your classical universe, and it’s going to be the absolute wild west in terms of the landscape of rules you are allowed to consider. Or, you can do this thing called quantum mechanics, and with a couple of assumptions about symmetry and massless particles with spin, the low-energy theory is nearly going to be chosen for you. Which do you think she would pick?
There are of course a number of objections you might make. For one, the Standard Model isn’t *that* constrained – you have to choose the gauge groups and their matter representations, plus the values of all the couplings – so maybe some classical theory could be more or less equivalently constrained just by demanding internal consistency? To which I would say: sure, maybe, show me the system you have in mind and we can compare. But if the comparison is with the wide space of, say, all cellular automata, then I think our quantum system pretty clearly wins. Another complaint you might have is that this isn’t a logical argument like Boltzmann’s derivation of thermodynamics; it’s essentially an appeal to aesthetics. I don’t have a great answer to this either, except to point out that the empirical fact that Nature has apparently, time and again, chosen fundamental laws that obey rigid mathematical structures and follow from a small set of simple, elegant principles is one of the great mysteries of our universe, and I think is implicitly part of the premise of Q1.
Comment #519 January 31st, 2022 at 9:18 am
I may be missing some subtlety concerning the idea of simulating classical universes to see if complex life evolves, but in this universe quantum processes are fundamental to complex life. No classical explanation can account for the efficiency and/or speed of photosynthesis, cellular respiration (electron transit through the mitochondrial membrane), the reaction rates of enzymes, etc. I would add to this list anything akin to human consciousness, based partially on the evidence of the impact of anesthetics on consciousness, which bind via quantum-mechanical London forces at receptor sites. If complex life arose in a classical universe, it would necessarily rely on far slower and less efficient processes than here.
When someone claims “intelligent design,” my thought is: well, not really that intelligent (say maybe 130 on the Stanford-Binet). Clearly they had great tools, but one can’t gloss over the fact that mistakes were made. It could, no question, have been done better by an intelligent designer with no constraints.
Your negative comments about our species align well with the Norse creation story where humans arose from the underarm perspiration of a god.
Comment #520 January 31st, 2022 at 9:52 am
Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality?
If the first, the Born rule is on the face of it no great mystery, “it’s just how we do things”, in the same way that action at a distance is no practical problem (perhaps a philosophical one; but we don’t do philosophy in science); the only thing that counts is that the results of the calculations are correct, that is, usefully mappable to arranged experiments (or natural occurrences). The question then still is: why does one method work, while others seemingly don’t? (This seems to be Scott’s stance Q2.) But that could merely (??!!) be a problem of “we build tools, both physically and mentally, so *that* they work”, which is where we stop worrying and use them. (We then use these tools to eventually create the next better generation of tools, a process to which there seems no obvious end.)
If the second: how do physical systems “know about” their wave function, let alone the result of “measurements”? Do parts of the system measure other parts? In what way are they even separated? Is this in any way dependent on human observers at all, or rather God’s Ineffable Code (so to speak)? The latter seems to be Scott’s hope, from statements like “QM (being) exactly true” as “one of the profoundest truths our sorry species had ever discovered” and the Einstein quote about whether the creator had any choice.
Comment #521 January 31st, 2022 at 10:00 am
I find it interesting to note that all digital computers are a (trivial?) illustration of the fact that a very classical and stable reality can be built on top of QM.
E.g. the hundreds of thousands of instances of a certain model of Lenovo Laptop running the same program give the same macroscopic output even though microscopically they’re all very different.
In other words, a given digital computer is a system which branches microscopically just as much as any other system, but within it lies a sort of “conspiracy” that creates a stable macroscopic state comprised of millions of abstract symbols.
But such symbols are “secret” and in the eye of the beholder: because digital computers are just extensions of our brains, that high-level reality only makes sense to us, and the same observation probably applies to the human brain: even though, moment by moment, my brain branches into many different paths microscopically, my thoughts are somewhat stable across many of those branches (maybe not as stable as a digital computation, but pretty close).
Of course, fundamentally, this can be boiled down to the observation that three rocks on the ground will represent the number 3 in a stable manner, regardless of all the jiggling, decoherence and randomness going on at the atomic level.
But rocks just sit there, and nothing interesting happens. Other classical systems exist (like amino acids) which can carry along a self-“complexification” of the mapping between such high-level stable symbols and stable properties of the environment, to the point where those symbols eventually capture questions such as “Why QM?”.
Comment #522 January 31st, 2022 at 10:01 am
If I want to construct a world from scratch and I got the power of a
probabilistic Turing machine (overused, let's have some variety) probabilistic register machine (+ an integrated hypercomputer for one specific occasion) at my disposal, I suppose I would first set some design goals that I would like to implement. Goals 1-3 should not be a problem. Even though, here in a 4D world, I unfortunately can't render it and no one can imagine a 7D image (I find it among the most mind-blowing things overall that the dimensionality of our world is so unbelievably low, as a side note), the spacetime algebra of the world can be computed.
I am not sure if Goal 4 can be implemented.
If Yes:
Nice. Let's see if I can inject the qualia of a 200-dimensional space into the consciousness of someone living somewhere in my 7-dimensional spacetime. Or the finite string w ∈ {1,0}* that is the mental impression “blue”. Btw., does every stone implement every finite state machine (http://consc.net/papers/rock.html), and if yes, and if qualia are just some information-theoretical set of states, does every stone have every possible quale over time?
If No:
Nice. What else do I need to add qualia to my “lifeless” world inhabitants? Can a world have some non-computable or even non-mathematical properties that are needed for qualia?
If not with math, how can they be "wired" into the world?
Which design goal could I have that would prevent me from using a probabilistic register machine? I could for example decide that I want spacetime to be R^n. If my space is R^n, I can't possibly update an uncountably infinite subset of elements in R^n in finite time with finitely many steps on my probabilistic RM.
I don't know why our world has QM; the trivial reason, I guess:
If you believe all possible worlds exist, then necessarily there is at least one world with QM. And if all possible worlds exist, there are "many" worlds far stranger than the QM world.
Or it is necessary to achieve some design goal.
Either way, great questions and the comment section has intriguing ideas, probably going to reread everything in summer when my mental abilities are at their peak.
Comment #523 January 31st, 2022 at 10:03 am
For the inclined reader, here is a nice elaboration of an example in physics where “from simple requirements of rational consistency we could have arrived at the Lorentz transformation. As Minkowski said, ‘Such a premonition would have been an extraordinary triumph for pure mathematics.’” (Which seems to be Scott’s dream as well.)
“Staircase Wit” https://www.mathpages.com/rr/s1-07/1-07.htm
(I am not in any way affiliated with the author.)
Comment #524 January 31st, 2022 at 10:16 am
Why is the universe English?
The Schrödinger equation’s conceptualization of a “wave” is, effectively, a function which defines a complex number for every point in a given space. A complex number, in the context of a fixed set of dimensions, is equivalent to an amplitude. In either case, what we’re really talking about is a value.
When you start examining what the limits of what a valid space must be for the purposes of the equation, and notice the existence of alternative coordinate systems, I just don’t think the question means anything. The universe isn’t “quantum mechanical” any more than it is “English”, “quantum mechanics” is just a way of talking about the properties of universes.
Now, it’s possible it may be a more or less intuitive way of approaching questions about the universe; that is, the real question isn’t “Why is the universe quantum mechanical”, but “Would quantum mechanics always be the sensible language to talk about the universe”, the answer to which is clearly no, because classical physics worked until it didn’t. What language you use to describe the universe is going to come down to what phenomena you need to describe.
Why isn’t the universe classical? Because classical physics is an incomplete language; it can’t describe certain phenomena. Observe that in classical physics, waves exist; the Schrodinger equation will apply. There will be quantum mechanical behavior, which classical physics would not be able to describe. So we’d have to invent quantum mechanics, or something like it, in order to describe that behavior. And then we’d be asking the same question.
Comment #525 January 31st, 2022 at 10:18 am
Given your own description of QM vs. classical mechanics as just the L2 norm vs. the L1 norm, it’s easy to see how L2 is much more special/symmetric than L1:
1) L2 norm is the only norm that is self-dual
2) The symmetry group of a (finite-dimensional) space with the L2 norm is infinite (all rotations), whereas for other norms (L1 included) the symmetry group is just a finite group of axis permutations and sign flips.
I obviously don’t know why the more symmetric option is usually the correct one, but it seems that whenever things could be more symmetric or less symmetric, the universe goes along with the more symmetric option. So in this sense QM is just another symmetry of the laws of physics. Just like we have rotational and translational symmetries of space, we also have a rotational symmetry of the space of “probability vectors”, which gives us QM.
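The symmetry contrast in (2) is easy to check numerically. A minimal sketch (nothing assumed beyond NumPy; the dimension and seed are arbitrary): a random rotation preserves every vector's 2-norm, but generically changes its 1-norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random rotation via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

v = rng.standard_normal(3)
w = Q @ v  # rotate v

# Rotations form a continuous symmetry group of the 2-norm...
print(np.isclose(np.linalg.norm(v, 2), np.linalg.norm(w, 2)))  # True
# ...but a generic rotation does not preserve the 1-norm
# (only signed axis permutations do):
print(np.isclose(np.linalg.norm(v, 1), np.linalg.norm(w, 1)))
```

The same check with any signed permutation matrix in place of Q would leave both norms unchanged, which is exactly the finite symmetry group mentioned above.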
Comment #526 January 31st, 2022 at 11:10 am
Jester #520
– Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality? –
I don’t see the exclusion here.
It is certainly something physicists/computer scientists do, but it is of course related to properties of nature/reality
It is the ultimate, possibly least constrained, embodiment of the scientific method.
You need a protocol to follow in order to start with the desired configuration, a protocol for waiting or doing other things, and a protocol to spot some desired configuration at the end; and you need a way to predict the success rates for each combination of protocols.
QM provides exactly that.
Coming to the supposedly incompatible second statement, the wave function is a mental representation of a specific protocol. It “belongs” to the QM user (when his confidence level is high enough that the relevant protocol was followed with no errors), but it of course represents both the system and the apparatus, i.e. “properties” of nature/reality, does it not?
Comment #527 January 31st, 2022 at 11:38 am
QM can be seen to arise from quite elementary considerations if you try to construct an (interesting) reality from scratch yourself:
First we start with one state, x, and allow it to vary (so it can do something non-trivial). So we need to give it a range of values, integers or reals seem reasonable first choice.
Now we have to apply a function to vary it, so multiplication or addition seems a good choice.
However, in the case of addition by anything other than 0 we would get a value tending to infinity, and in the case of multiplication by anything other than +1 or -1 we would get a state tending to zero or infinity.
Neither case seems interesting.
So rather than just a single real number, let’s multiply by a couplet, i.e. a number with a magnitude and phase, like a complex number of magnitude 1; now the state varies continuously without exploding to infinity or shrinking to zero.
THIS IS QUITE INTERESTING…
But we really would like a little more complexity, so let’s introduce a second state y, and by similar reasoning we should multiply by a unitary 2×2 matrix to get an interesting universe.
But although interesting, it is also predictable (predetermined).
So let’s introduce a spontaneous random change in x and y while keeping the overall modulus (x^2 + y^2) constant (otherwise we could get an exploding or shrinking universe)
Now we have a very interesting universe, which is unpredictable but does not explode to infinity or shrink to zero, and can be analysed statistically.
For a very large number of states, and large unitary matrices, the random change will barely be noticeable in the unitary dynamics (we can have very stable microscopic structures with decay lifetimes on huge timescales)
Also the Born Rule now is just the need to keep the states on the “sphere of rotation in C^n” – it’s Anthropic in origin, if it doesn’t exist the Universe has states tending to infinity or zero, so we would never emerge to observe such a Universe, we need this “Pythagorean Rule” to enable us to emerge from the Universe’s evolution…
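The two-state construction above can be played out in a few lines. A sketch (the particular unitary and starting state are arbitrary choices of mine): repeatedly applying a 2×2 unitary to the couple (x, y) keeps the state changing forever while |x|^2 + |y|^2 stays pinned to 1, so the universe neither explodes nor shrinks.

```python
import numpy as np

# An arbitrary 2x2 unitary: a rotation times an overall phase.
theta = 0.3
U = np.exp(0.7j) * np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])

state = np.array([0.6, 0.8], dtype=complex)  # |x|^2 + |y|^2 = 1

for _ in range(10_000):
    state = U @ state  # evolve: the state keeps moving on the unit sphere

print(np.linalg.norm(state))  # stays 1.0 (up to floating-point error)
```

Swapping U for a non-unitary matrix (say, 1.01 times a rotation) makes the norm drift exponentially, which is the "exploding or shrinking universe" the comment warns about.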
Comment #528 January 31st, 2022 at 11:47 am
Matt Leifer #358: Thanks for the extremely interesting comment—one of my favorites of the thread!
I’ve also marvelled at the fact that QM seems to admit such different ways of describing the same situations—even the extremes you call Church of the Larger Hilbert Space and Church of the Smaller Hilbert Space (CLHS and CSHS). Certainly, most other theories that one could make up would not admit such an enormous range of possible views of how to interpret them. But could the difficulty of the interpretation problem actually have been a “design goal,” a thing that we can imagine QM as having been “chosen” to satisfy? As if, in the heavens 14 billion years ago, there was one faction of radically subjectivist, QBist angels, but also a bitterly opposed faction of angels that wanted an objective state describing exponentially many possible experiences, and God had to pick a compromise theory that would satisfy both factions? 😀
Comment #529 January 31st, 2022 at 12:00 pm
Hello again Paul #516
Thank you again! That was a terrible sentence I wrote! Poor Richard probably rolled over in his grave.
I guess I was misremembering or mixing up things like this:
https://www.nature.com/articles/471296a
Which says “According to Feynman, the key difference in quantum theory is that the particle does not follow the classical path, or any single path. Rather, it samples every path connecting A and B, collecting a number called a phase for each one. Each of these, in concert, determines the probability that the particle will be detected at B.”
Thus I was leaving out that while the amplitude (phase) may possibly be negative, that is NOT the probability but rather “determines” the probability. Or maybe it is part of determining the probability?
Feynman’s paper
https://cds.cern.ch/record/154856/files/pre-27827.pdf
I went back and read over Feynman’s paper and find that Feynman appears to endorse something like using negative numbers in intermediate calculations as part of a probability theory. But please help me to understand that better if that is not what he is saying – because I do want to be clear on that. Here is another take by John Baez with more references:
https://johncarlosbaez.wordpress.com/2013/07/19/negative-probabilities/
What Feynman DOES say is: “It is not our intention to claim that quantum mechanics is best understood by going back to classical mechanical concepts and allowing negative probabilities … Rather we should like to emphasize the idea that negative probabilities in a physical theory does not exclude that theory, providing special conditions are put on what is known or verified.”
So, I thank you because I definitely do NOT want to give anyone the impression that Feynman was saying “just add negative probabilities.”
I’m probably getting into too much hot water by pulling Feynman into this – which is probably one of the reasons for Scott’s rule that we not try to pull in other sources but just make our own arguments 🙂 So let me stop doing that.
I want to get back to basics and consider only what the QT postulates say. (I’m looking at the Nielsen and Chuang version of the postulates.)
Postulate #1 allows negative complex numbers (amplitudes) encoding the state vector.
Postulate #3 the Born Rule interprets the 2-norm of those amplitudes to be encoding probabilities of the basis states.
Thus, you are absolutely correct that QT doesn’t have negative probabilities. The probabilities are always positive by the Born rule. What QT does allow is for the USE of negative numbers in the system state vector.
The (possibly) negative complex numbers should not be understood to be the probabilities. Only the squared magnitude of those numbers may be understood as the probability – per Born’s infamous best use ever of a footnote.
Does that then sound like the right way to say how negative numbers are involved?
One thing I would add is that QT requires that the normalization be kept “in place” through some sort of “computational overhead” over those (possibly) negative amplitudes in the state vector. In other words, those supernatural monks keeping track of all these calculations on some cosmic side ledger in a higher plane of existence must maintain things so that even though the amplitudes may be negative they must still be staying in that Born relationship with the amplitudes for all other possible states … just in case a physicist (or some part of the environment) suddenly asks them for their squared magnitude 😉
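That reading of the postulates can be made concrete in a few lines. A sketch (the particular amplitudes are arbitrary): the state vector contains negative and imaginary entries, yet the Born-rule probabilities are automatically nonnegative and sum to 1.

```python
import numpy as np

# Postulate 1: a normalized state vector -- amplitudes may be negative/complex.
psi = np.array([-0.5, 0.5j, np.sqrt(0.5)], dtype=complex)
assert np.isclose(np.linalg.norm(psi), 1.0)  # the "kept in place" normalization

# Postulate 3 (Born rule): probabilities are the squared magnitudes.
probs = np.abs(psi) ** 2
print(probs)        # [0.25 0.25 0.5 ] -- all nonnegative
print(probs.sum())  # 1.0 (up to floating point)
```

So "negative numbers are involved" only upstream of the Born rule: they live in the state vector and interfere there, but never survive the squaring as probabilities.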
Comment #530 January 31st, 2022 at 1:04 pm
Anbar #366:
… not(QM) was already logically incompatible with the 19th century experiments establishing Maxwell’s equations. Not sure how far back you need to go in terms of empirical evidence before classical explanations start requiring Rube Goldberg concoctions, but I would guess not much
… Am I victim of this self confident delusion you mentioned and missing something obvious?
Well, possibly! What I’ve found weird, in this thread, is to be sandwiched between two self-confident extremes:
(1) People lecturing me on why the “why QM?” question can obviously never be answered; how I need to learn to accept that certain things are true “just because.”
(2) People lecturing me on why the “why QM” question obviously has been answered, because how other than QM would you account for such-and-such empirically observed phenomenon?
In some sense, both of these extremes pointedly refuse to enter into the thought experiment that this whole post was about: namely,
Suppose you were designing a new universe from scratch. It wouldn’t have to look like this universe, but you might want it to produce rich, complex behavior in an elegant way, or something along those lines. What considerations would militate in favor of your choosing to make your universe quantum or classical?
There are also many dozens of comments that do directly engage the question, and I appreciate those enormously! But some fraction of comments continue to round down to either (1) or (2), no matter how often I try to clarify.
Comment #531 January 31st, 2022 at 1:09 pm
mjgeddes #367:
This is the mistake of nearly all the commenters in this thread; one simply cannot hope to understand QM merely by shuffling math symbols or firing off vague verbal ‘interpretations’ of abstract non-physical concepts like ‘wave functions’, one must obtain the underlying *physical* principles, expressed in terms of *non-commutative geometry*.
Dozens of commenters here have been talking about physical principles, including Lorentz invariance, the existence of stable bound states for atoms, the ability to resolve ultraviolet catastrophes, etc. etc. Why on earth would you identify “*physical* principles” with noncommutative geometry? The latter is just one particular mathematical idea for how to formulate quantum theories of gravity—an idea that doesn’t seem to have enjoyed great success in physics so far, although maybe that will change.
Comment #532 January 31st, 2022 at 1:09 pm
Anbar #526:
Indeed there is no exclusion.
“It is the ultimate, possibly least constrained, embodiment of the scientific method.”
I can see that, yes. Although perhaps a bit hopeful.
“You need a protocol to follow in order to start with the desired configuration, a protocol for waiting or doing other things, and a protocol to spot some desired configuration at the end”
I love this, because it both resembles “fixing of initial conditions, calculate, compare end result” of a physics calculation, as well as “input, program running, output” of a computer calculation, devised and directed by people.
“the wave function is a mental representation of a specific protocol. It “belongs” to the QM user”
I see. But then my point is: how does nature do it herself (so to speak)? How do atoms do it to form chemical bonds? “Where” is the wavefunction “there”? Surely atoms themselves do not undertake the above threefold action.
“it of course represents both the system and the apparatus, i.e. “properties” of nature/reality, does it not?”
I guess so; the point I am grappling with is, how does QM work from the perspective of (say) an atom? How does it know what to do? Because the QM calculations are (probably) not taking place locally where it is.
(Although perhaps these are improper, ill-defined questions, or naive… But I can’t help comparing it to, for example, an apple following the gradient of a gravitational field. Probably too naive; my apologies to the intellectual heavyweights around here, of whom I gather there are many.)
Comment #533 January 31st, 2022 at 1:17 pm
The most puzzling thing to me about QM is that nature seems to take all the possible futures as input to what makes it pick just one amongst those futures.
In other words, it’s for exactly the same reason that, on one hand, a QC seems to be doing some bookkeeping that’s vastly beyond what a classical computer can do (i.e. juggling all possibilities at once at no extra exponential cost), yet in the end so much of the information is discarded that there’s no super obvious practical win across the board.
Comment #534 January 31st, 2022 at 1:17 pm
Cleon Teunissen #370:
I am aware of course that the claim that Hamilton’s stationary action can be understood _classically_ is an unexpected one. Your _expectation_ is that Hamilton’s stationary action comes from QM.
I am aware: If a claim is highly _unexpected_ then the demonstration will have to be low friction, very accessible. (Conversely, if the demo would be opaque/dull then most likely the reader will dismiss it.)…
Alas, I took a look at your links and found them incomprehensible. You jump almost immediately into diagrams and equations, without ever explicitly stating what are the more basic principles from which you propose to derive Hamilton’s stationary action (if not QM), and crucially, why those more basic principles (whatever they are) don’t already implicitly presuppose the answer that you want.
I confess that I might be biased by having had this argument previously, with people who were 100% confident that they could explain the stationary-action principle in a purely classical way, but then every time I asked them to teach me, I got a huge, complicated runaround, never bottoming out in anything I understood the way the quantum-mechanical explanation does.
Comment #535 January 31st, 2022 at 1:29 pm
Clinton #374:
The Scott Fear:
Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.
The Clinton Fear:
Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.
I mean, either
(a) all of our experimental data continues to be consistent with the hypothesis that QM is the “actual operating system of the universe,” or else
(b) it doesn’t.
In case (a), we can simply continue regarding QM as the “actual operating system of the universe,” as best we can tell, subject to the usual proviso that in science you almost never “prove” your theories, you only accumulate more evidence for them or you rule them out.
In case (b), we might or might not be smart enough to come up with the deeper theory, but at least we’ll then know that QM was not the “actual operating system of the universe,” and that its appearance of being so was illusory!
Comment #536 January 31st, 2022 at 1:37 pm
Scott #530:
“Suppose you were designing a new universe from scratch.”
That is… a tall order.
Also, what do you mean by “universe” here? Something like our reality, made of atomic matter, inhabited by animals, etc.? A model universe like, for example, Minecraft, or the toy universes (models) built by theoretical physicists to probe/better understand its laws? Purely mathematical structures in the platonic sense?
I was under the impression that your challenge/questions were pertaining to our actual reality, and the curious and peculiar scientific relevance of QM in it.
As an aside: while QM and its formalism is indeed extremely important and unavoidable in many respects, there are many scientifically important questions it doesn’t (can’t?!) answer (like the particle families, whose properties are essentially empirical, especially mass). It is not the answer to every question in physics, let alone in the universe; so why are you so “hung up” on it? Apart of course from it being your expertise and you liking it, both of which are sufficient reasons to do it. Perhaps you overestimate its importance for universal, even quasi-philosophical questions?
Comment #537 January 31st, 2022 at 1:43 pm
Clinton #377:
“If (as I fear) QM is exactly true, then we might not ever be able to explain it in terms of anything deeper (but we can still try!).”
But should we try?
Yes, we should. 😀
A quick page search shows almost no mention in this thread of Gödel or the Halting Problem. That can’t be right…
For starters, QM, and other physical theories, are not “formal systems” of the kind that the incompleteness theorem talks about. Formal systems are things that we can use to reason about physical theories.
Yes, it’s possible that some theorems relevant to physics could be independent of ZFC or whatever—but even if so, such theorems would necessarily involve quantification over an infinite set of possible situations, or time evolution infinitely far into the future, or some other infinite element. Those theorems’ independence from ZFC would be no bar whatsoever to finding a unified theory of fundamental physics, which is more like finding the right “axioms” than like discovering all the possible consequences of those axioms (i.e., theorems). Sure, we might not discover that unified theory, but if so it will be for a different reason, such as lack of ingenuity or funding. As someone who knows the incompleteness theorem pretty well, I can tell you with certainty that the high-energy physicists will not be able to blame their failure on it. 🙂
Comment #538 January 31st, 2022 at 1:48 pm
I think I may have made this point in one of the comments before, without much traction:
it is well known that one cannot have [x,p]=i in a finite Hilbert space, so either it’s an approximation or the Hilbert space is truly infinite.
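The impossibility has a one-line proof via the trace: Tr[x,p] = Tr(xp) − Tr(px) = 0 in any finite dimension, while Tr(i·1) = i·d ≠ 0. A quick numerical sketch (my own arbitrary choices: a diagonal position operator and a periodic central-difference momentum):

```python
import numpy as np

d = 64
x = np.diag(np.linspace(-5.0, 5.0, d)).astype(complex)

# p ~ -i d/dx as a central finite difference with periodic wraparound.
p = np.zeros((d, d), dtype=complex)
for k in range(d):
    p[k, (k + 1) % d] = -0.5j
    p[k, (k - 1) % d] = 0.5j

comm = x @ p - p @ x
print(np.trace(comm))            # 0 -- holds for ANY finite matrices x, p
print(np.trace(1j * np.eye(d)))  # 64j -- so [x,p] = i*1 can never hold exactly
```

Whatever finite matrices you try, the commutator's trace vanishes identically, so the canonical commutation relation forces either an approximation or a truly infinite-dimensional Hilbert space, as the comment says.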
Comment #539 January 31st, 2022 at 1:49 pm
Incidentally, do we know why the “speed of light” has the value that it has, relative to other physical values? And if this ratio were changed, would that render the universe unstable in the calculations?
(Apologies if too off-topic; please disregard accordingly.)
Comment #540 January 31st, 2022 at 1:53 pm
Jim Graber #379: Yes, that paper is indeed an attempt to answer the question in my post. I’d have to study it more carefully, but my immediate reaction is: it seems to rely on the coincidence that, in our (3+1)-dimensional universe, spin-1/2 particles physically instantiate qubits, with the possible spin directions corresponding to points on the Bloch sphere. I’ve certainly marveled at the same fact, but doesn’t it seem a bit too specialized to be taken as the reason for the entire edifice of QM?
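The coincidence being marveled at here checks out directly. A sketch (the angles are arbitrary): the spin-1/2 state cos(θ/2)|0⟩ + e^{iφ} sin(θ/2)|1⟩ has Pauli expectation values equal to the unit vector (sin θ cos φ, sin θ sin φ, cos θ), i.e. the corresponding point on the Bloch sphere is literally a spin direction in 3-space.

```python
import numpy as np

theta, phi = 1.1, 2.3  # arbitrary Bloch-sphere angles
psi = np.array([np.cos(theta / 2),
                np.exp(1j * phi) * np.sin(theta / 2)])

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Expectation values <psi| sigma |psi> recover the spin direction:
bloch = [np.real(psi.conj() @ s @ psi) for s in (sx, sy, sz)]
expected = [np.sin(theta) * np.cos(phi),
            np.sin(theta) * np.sin(phi),
            np.cos(theta)]
print(np.allclose(bloch, expected))  # True
```

This only works because the qubit's state space (CP^1) happens to be a 2-sphere, which is the (3+1)-dimensional accident the reply is pointing at.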
Comment #541 January 31st, 2022 at 1:55 pm
fred #381:
What do you think of Sean Carroll’s program to show that everything, including spacetime, could be derived on top of the wave function as the most fundamental object?
I guess that’s one way to go about proving that QM is necessary, no?
I’m a fan of that program and I follow it with interest! Even if it succeeded, though, in a very precise sense this program would not show that QM was “necessary”: instead, it would show that QM was “sufficient.” 🙂
Comment #542 January 31st, 2022 at 2:01 pm
Stewart Peterson #386: Very briefly, the P vs. NP question talks only about computational problems where
(1) all of the required information is provided as part of the input;
(2) as soon as you see the input, you know an efficient, explicit algorithm to check any proposed answer; and
(3) the “only” difficulty is the exponential number of possible answers to check.
3SAT and Sudoku are examples of such problems; reversing physical evolution (in cases where information has gotten lost) is not an example.
That’s the reason why your 1:45AM thoughts can’t possibly have solved P vs. NP. 🙂
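Point (2) can be made concrete: checking a proposed answer to a 3SAT instance takes time linear in the formula, even though there are exponentially many candidate assignments to search. A minimal sketch (the clause encoding as triples of signed integers is my own illustrative convention, not anything from the thread):

```python
def check_3sat(clauses, assignment):
    """Verify an assignment in time linear in the formula.
    Each clause is a triple of nonzero ints: literal k means variable |k|,
    negated when k < 0; `assignment` maps variable number -> bool."""
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

# (x1 or x2 or not x3) and (not x1 or x3 or x2)
clauses = [(1, 2, -3), (-1, 3, 2)]
print(check_3sat(clauses, {1: True, 2: False, 3: True}))   # True
print(check_3sat(clauses, {1: False, 2: False, 3: True}))  # False
```

The verifier is trivially efficient; the "only" difficulty, per point (3), is that with n variables there are 2^n assignments to try.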
Comment #543 January 31st, 2022 at 2:06 pm
Clinton #529 (“I went back and read over Feynman’s paper and find that Feynman appears to endorse something like using negative numbers in intermediate calculations as part of a probability theory.”)
Sure. I endorse that too. Even in classical probability, e.g. P(A+B) = P(A) + P(B) + (-P(AB)).
Sure. OTOH, for the sake of a better understanding of what’s going on, I’d strongly recommend steering clear of “QT postulates” and towards the QPT literature (e.g.).
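The classical analogue can be checked with exact fractions (the die-roll events are arbitrary choices of mine): the union probability comes out right precisely because one term enters with a minus sign, even though every final probability is nonnegative.

```python
from fractions import Fraction

def p(event):
    """Probability of an event (a subset of {1..6}) for one fair die roll."""
    return Fraction(len(event), 6)

A = {2, 4, 6}  # "even"
B = {4, 5, 6}  # "greater than 3"

lhs = p(A | B)
rhs = p(A) + p(B) + (-p(A & B))  # the negative intermediate term
print(lhs, rhs)  # 2/3 2/3
```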
Comment #544 January 31st, 2022 at 2:06 pm
Philippe Grangier #388:
I suspect you are circular when answering wolfgang #245 by invoking decoherence. If you look for an answer to the first part of the question (why did God use quantum theory to make the universe ?) you cannot invoke the QM prediction of decoherence to answer the second part (why and how did He make it appear classical to us ?). Unless you consider that God made QM for the very purpose of using decoherence, which would be an answer to your initial question.
All I meant was that, if we’ve already accepted QM, and we know that the universe will start in a special initial state and then apply a Hamiltonian that gradually fills out the Hilbert space, then decoherence theory plus MWIism give us a ready-made explanation for why observers within our universe could perceive a classical world subject to occasional random jumps. I wasn’t proposing this as an answer to the “why QM?” question: at most it could be one piece of a much larger answer.
Comment #545 January 31st, 2022 at 2:10 pm
Philippe Grangier #390:
Q1’: Why (and how) did God make the universe both quantum and classical, depending on the way or the scale you look at it? What would’ve been wrong with choosing one possibility only?
I think the answer to this question is much easier, because choosing either one leads to obvious contradictions with empirical evidence. And thus God needs both of them to get a meaningful universe…
No, that doesn’t work. To say it for the nth time, the “empirical evidence” isn’t fixed in this exercise, but depends on what kind of universe God chooses (or what kind of universe we choose when roleplaying God), which is precisely the question at issue!
Comment #546 January 31st, 2022 at 2:11 pm
Michel #391:
As soon as we set up a ‘classical’ universe, we may get the reals as insufficient to manage reality. The reals automatically then give rise to the splitting field of complex numbers, which gives us a simpler to define universe, where more complexity is possible with fewer rules. In this way Q2 is (almost) inevitable.
Even if I accept that, it still doesn’t explain why complex numbers should appear in physics as quantum-mechanical amplitudes, rather than in any of a thousand other imaginable ways.
Comment #547 January 31st, 2022 at 2:14 pm
Veedrac #394:
I don’t know enough to tell you why quantum mechanics is the right answer, but I do think there is a simple answer to why-not-classical that you are hinting at here, which is that it’s too small. If stars did not shine nor meteors fall, but we still learned about earthly facts of physics, evolution, and the history we’ve had of evolutionary catastrophe and innovation, then were we wise enough, we could still deduce *purely on first principles* that the world is too small, and there must be more to our universe we could not see. We would then not be surprised in the least to find out about quantum mechanics—for sure there must be an infinity of branching realities, else how could we so unlikely possibly come about?
Again, unfortunately, I don’t think the “bigness” argument works on its own. There are already billions of galaxies that we can see, so why not simply postulate an infinity more that we can’t see, before going to a quantum-mechanical wavefunction?
Comment #548 January 31st, 2022 at 2:15 pm
Could it be that QM is the optimum operating system of the universe because a QC can simulate QM systems in a way where the systems and their simulations are perfectly indistinguishable? So that the simulation hypothesis could be perfectly realized (QM realities can be stacked on top of one another).
Something that’s not true of digital computers and “classical” physics: no digital computer with finite resources can perfectly simulate many non-trivial basic classical systems (like the 3-body problem).
Comment #549 January 31st, 2022 at 2:25 pm
Scott 262 and 485
“one of those possibilities implies unbounded speeds, therefore no true locality or isolation of subsystems, and therefore the other possibility is realized”
“you want to be able to pick stuff up and move it around without changing its structure, and you also want an upper limit on the speed with which you can do so, since otherwise you could get a giant mess where everything instantaneously affects everything else.”
Hi Scott, why is it obvious that the universe must allow true locality? Weren’t people fine with the idea that everything instantaneously affects everything else in the days of Newtonian physics?
Comment #550 January 31st, 2022 at 2:33 pm
Brooks #408:
I am late and not particularly knowledgeable, but IMO Q1 is just a specific case of the “why are so many of our conditions so perfect for us to exist in” question that is best-answered by the Anthropocene principle.
LOL, I’m now imagining an “Anthropocene principle” that’s almost the opposite of the anthropic principle … saying that the universe must be such that life will not only arise, but also quickly destroy itself! 🙁
Why quantum mechanics? Well, why DNA? Why gravity? There are probably lots of other ways things could have worked, but if they produced conditions for sentience, we (or our protoplasmic counterparts) would be asking “why froblits? Why BNM2? Why the general charge field?”
We can’t know whether quantum mechanics is the only way for a universe to work. We just have to be comfortable with that uncertainty.
The scientific spirit exists in a tension between two forces:
(a) being comfortable with uncertainty, and
(b) actually working to resolve your uncertainty.
Jumping prematurely to an unsatisfying explanation, and fetishizing your in-principle inability ever to explain something, are both failure modes, since if there is a good explanation (as there so often has been in science’s history), both attitudes will massively interfere with your ability to find it.
Comment #551 January 31st, 2022 at 2:36 pm
James Gallagher #409:
I think you must be misunderstanding me, Schrödinger Evolution preserves |psi|^100654444222 (for example)
No. No it does not. You are mistaken about this, and I did not misunderstand you. Unitary evolution preserves only the 2-norm. One can even prove that there are no nontrivial linear transformations that preserve other p-norms (except that stochastic evolution preserves the 1-norm on nonnegative real vectors only). Try some examples and see!
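The "try some examples" suggestion is easy to follow numerically. Here's a sketch in Python (assuming NumPy is available) that applies a random unitary to a unit vector and compares p-norms before and after: the 2-norm is preserved exactly, while the 1- and 4-norms generically change.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random 2x2 unitary via QR decomposition of a random complex matrix.
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(A)

psi = np.array([0.6, 0.8j])  # a unit vector in the 2-norm

for p in (1, 2, 4):
    before = np.linalg.norm(psi, p)
    after = np.linalg.norm(U @ psi, p)
    print(f"p={p}: before={before:.6f}, after={after:.6f}")
# Only the p=2 line matches before and after; the others drift.
```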
Comment #552 January 31st, 2022 at 2:41 pm
By the way the written quote from Haldane is-
“The Creator would appear as endowed with a passion for stars, on the one hand, and for beetles on the other”
It’s often said that, when asked by a women’s church group what he had deduced from his studies, he answered with-
“He has an inordinate fondness for beetles.”
Comment #553 January 31st, 2022 at 2:41 pm
Philippe Grangier #410:
In the last Växjö conference (August 2021) I felt that there was a consensus on some kind of trade-off…
I’ve attended only one Växjö quantum foundations conference, back in 2003. I enjoyed it a lot and would go again sometime, but I also found the meeting to be teeming with axe-grinders, spouters, word-redefiners, Bell’s-theorem deniers, local realist diehards, and other quasi-crackpots! While I was only a grad student then, it was enough to tell me not to put too much stock in any “Växjö consensus”… 😀
Comment #554 January 31st, 2022 at 2:46 pm
Russel #412:
But the inhabitants of such universes quickly reached a point where they had discovered all the laws – and after that, well, they got bored. Weltschmerz. There was no sense of wonder, nothing left to debate, no uncertainty, no further progress to be made, no problems to solve, and in consequence, none of the higher planes of happiness that God wishes for his creations.
And so God experimented with more complex universes. Too much complexity proved troublesome as well – the sentient beings that emerged were unable to solve anything, and gave up striving to understand. But somewhere in between there lay a sweet spot – where the sentient beings could perpetually live in wonder and debate and strive forward, attaining the satisfaction of discovery and understanding, but never the ennui and dissipation of having solved everything.
Thus the beautiful irony of my question – why is the universe quantum mechanical? Simply so that I could wonder why.
Next time you meet God, could you please ask Her why She didn’t create additional particles at the LHC energy scale, so that physicists in the early 21st century would have some more experimental clues to go on, just like their forebears did, and would not be at risk of “ennui and dissipation”? Many of my particle physics friends would like to know the answer. 😀
Comment #555 January 31st, 2022 at 2:54 pm
Jacques Pienaar #424:
If we are satisfied that our world-view stands up on its own principles, then it makes sense to hold on to it and ask: “why the quantum?” But if we suspect our present world-view might be too narrow (incidentally, this is not an unreasonable suspicion, given that the physicists who shaped the present world-view are disproportionately white men raised in the tradition of Western philosophy) then we should instead ask: “how do I change my world-view, so that the quantumness of the universe might fit in comfortably as an unquestioned postulate”?
Please enlighten me, then. In your view, which non-Western or non-white-male-dominated philosophical traditions shed the most light on the question of why our universe turns out to be describable by a complex unit vector evolving unitarily in a tensor product Hilbert space?
Comment #556 January 31st, 2022 at 2:59 pm
Aditya Prasad #436:
I might also point to Rovelli and his recent discovery of and affinity for the Buddhist philosophical notion of “emptiness.” In short, emptiness could be described as the realization that “there is no way that things ‘actually are.’”
Even if that were so, science would still be concerned with explaining how things appear to be. Is it your contention that Buddhist philosophy can help us with that?
Comment #557 January 31st, 2022 at 3:02 pm
Thanks so much to everyone who offered advice about my ankle! I can walk again today without too much problem, but will still need to see an orthopedist or physical therapist to figure out how to prevent this from recurring every few weeks…
Comment #558 January 31st, 2022 at 3:12 pm
Philippe Grangier #470:
Well, I guess I’m among the minority who did not grasp the concept, which was maybe not so clear in your initial questions, it may depend on your mindset. But at least I’m happy to read that the answer to ‘why QM rules this world’ is obvious for you.
To say it one more time:
(1) If you take the experimental data of this world as given, then it’s obvious that you need QM (or some theory to which QM is an excellent approximation) to explain that data.
(2) If you don’t take the experimental data of this world as given, but are designing a new world from scratch, then it’s far from obvious why or whether you’d choose to make your new world quantum.
How could I have said this more clearly?
Comment #559 January 31st, 2022 at 3:14 pm
Ted #471: Alas, perfectly reasonable comments often end up in my spam filter! I’m glad that you successfully posted another version of what you wanted to say. Feel free to email me if it happens again.
Comment #560 January 31st, 2022 at 3:15 pm
Jester #477:
Scott, do you include classical field theories (e.g. Maxwell) in classical mechanics, and Quantum Field theories (QED, QCD) in Quantum Mechanics; so the main difference is between classical vs. quantum, mechanics or not?
Yes.
Comment #561 January 31st, 2022 at 3:25 pm
Luke W #512:
I’m wondering what you make of Tim Palmer’s suggestion that “quantum indeterminacy may perhaps be replaced by certain kinds of ‘hidden variable’ chaotic dynamic, provided that the chaos is sufficiently nasty.”
Tim is now a full-on superdeterminist—not surprisingly, since that’s indeed the only way to torture a local hidden-variable model like what he wants into reproducing the Bell inequality violations. In a recent email exchange, Tim assured me that while superdeterminism might seem like a vacuous dead-end, equally able to “explain” anything whatsoever, all would become clear if only I understood the role of fractal cosmologies, p-adic numbers, and Fermat primes in his story.
If you can’t predict my reaction to that, check out our recent superdeterminism thread for clues. 🙂
Comment #562 January 31st, 2022 at 3:27 pm
Hi Scott. First, thanks for responding. I’m deeply impressed by your willingness to engage with us (and especially with crackpots like me!)
If you consider QM an explanation of “how things appear to be,” then yes, I am claiming that Buddhist philosophy (or realization) can help with that. The phrase “no way that things actually are” roughly translates to “realism is false,” not that there’s no structure to appearances.
My claim is that in Buddhist realization, there is the direct perception that one’s experience has certain features that are consistent with (one very straightforward interpretation of) the measurement problem. In particular, that all of (this) reality hinges on *you specifically* in a mind-shatteringly bizarre sense, and yet that other people are no less real or conscious, and are in the very same situation as you. (And also that there are countless other realities, all of which one discovers oneself manifesting in/as, as one approaches becoming a full Buddha.)
I’m not asking you to believe that this is true, but if it were true, it seems like it would motivate at least some of the structure of QM. In particular, it resolves your question to Yoni regarding why not simple classical indeterminism. Would you agree?
Comment #563 January 31st, 2022 at 3:37 pm
Mateus Araújo #513:
You didn’t reply to my request to give a definition of objective probability. I don’t expect you to succeed, people have tried to do it for a century and failed. I just want you to be honest with yourself and realize that you can’t. This failure is so widely recognized that the consensus in philosophy is that objective probabilities do not exist, they are content to deal with subjective probabilities. Meanwhile the consensus in physics is that objective probabilities are obviously what quantum mechanics gives you, and they don’t worry about defining them.
You and I part ways at this stop. I don’t accept that one has to give a definition of “objective probability” that would satisfy you, or the world’s philosophers, before one can use the concept in constructing a physical theory—it’s enough to know how to work with it. This is directly analogous to how Newton didn’t have to “define” force, Einstein didn’t have to “define” spacetime, etc. etc. It was enough for them to give mathematical descriptions of how these entities behaved in their theories.
Presumably you and I agree that even if the world were classical, we’d still use probability theory all the time to describe our knowledge, just like we use it now? Nevertheless, you maintain that the only possible kosher way (or at least, the only way known) to take this formalism that we all use and build it into the fundamental laws, is the indirect, amplitude-based way that QM does it? If so, I respect that you’ve staked out intellectual territory that you might inhabit alone, or fairly close to it! 🙂
Comment #564 January 31st, 2022 at 3:59 pm
Liam #518: Thanks for the extremely interesting comment—another of my favorites of this thread!
I have two followups:
(1) Every time I’ve struggled through an explanation of why gauge redundancies imply the existence of new forces, it felt to me like hocus-pocus. I.e., you take a step that I would never have contemplated, of just blatantly adding a new term to your Lagrangian to counterbalance the otherwise-bad effects of the gauge symmetry that you insisted on, and then voila, a new force! Is your position more like: (a) I should continue struggling with this until it doesn’t feel like hocus-pocus, or (b) it would be better to go in the opposite direction, of starting with the existence of spin-1 and spin-2 forces, and then seeing why they basically have to act like gauge forces?
(2) You argue, fascinatingly, that the QM/SR/QFT is preferable to classical cellular automata as the basis for a universe, precisely because the former is so much more constrained. One could of course wonder why being constrained is so wonderful—to me, it seems wonderful if and only if the rare universes that satisfy your constraints happen to be the sorts of universes you wanted anyway, which then brings us right back to the question of why you wanted them! 🙂
But let me take a different tack: the principles of QM/SR/QFT aren’t that constraining—or rather, they’re constraining except in all the ways in which they aren’t. They still leave it to God to choose the matter content, gauge groups, coupling constants, and even the dimensionality of spacetime. In other words, looking at the known laws of physics, we might say that:
(i) God decided, for whatever reasons, on QM and SR.
(ii) A great deal else follows from (i) as inevitable logical consequences, and as should surprise nobody, God decided on all of that other stuff too. 🙂
(iii) After (ii), God still had a huge (probably infinite) space of possible choices. So God made one such choice, freely and arbitrarily as far as we know today, just as if She had chosen the Game of Life from among all possible cellular automata.
Furthermore, we can imagine that, to the Mind of God, everything contained in (ii) would be so obvious as to pass without comment (God has, of course, mastered Weinberg’s QFT textbooks 🙂 ) … so that God’s “mental effort” would be focused entirely on (i) and (iii). If so, though, the content of (ii) no longer seems all that relevant to the “why QM?” question—precisely because we now understand its inevitability!
Comment #565 January 31st, 2022 at 4:14 pm
Jester #520:
Is “quantum mechanics” something physicists/computer scientists do, or an inherent property of nature/reality?
Again and again in this thread, people have asked questions like that, but I confess that I still don’t see how it helps us!
Clearly the answer to your question is “both”: yes, QM is a thing scientists do, but the reason why scientists do it is that reality has some inherent property that makes it the right thing to do, when you probe reality at the most fundamental level currently known. The question we’re asking is why reality has that inherent property.
Incidentally, this is also my central objection to QBism, Copenhagen, and all other subjectivist interpretations of QM. The advocates of those interpretations keep repeating, in a thousand variations:
“QM is not about Nature, it’s about us, how we talk about Nature, organize our knowledge, etc.!”
Clearly this strikes them as a profound insight. Whereas I, as it were, already assimilated the insight and just proceeded immediately to what I see as the real question, namely, “why does Nature have whatever inherent property makes it an overwhelmingly good choice to talk about it using QM—a property we might abbreviate as ‘being quantum’?” 😀
Comment #566 January 31st, 2022 at 4:21 pm
Scott #543:
I do not think very big finite classical universes are any simpler. You spend so much descriptive complexity on encoding a very big number in addition to your universe, and then randomly seeding each part of your universe in a distinct and interesting way, whereas you could just describe your universe in a way that directly spawns that big and varied spectrum instead. A classical universe does not want to be big, it wants to be the size of the light cone, or the interaction cone if not so similar to ours, and if you just make that interaction faster, you’ve really just shrunk the pieces.
More importantly, if you really do have that initialization of a really big universe, you’re still better off in an anthropic sense running a simple multiplicatively-growing universe off it, rather than a tiny classical one. Learning that there were other continents didn’t stop there being other stars, learning there were other stars didn’t stop there being galaxies, learning there were other galaxies didn’t stop reality constantly cloning itself in the quantum realm. The calculus doesn’t really change.
(Sure, you say, but why interference, why not have this quantum cloning be non-interacting? Well, I don’t know, but the non-interacting spectrum of universes would still have to be non-classical, else they’re all just the same universe. The argument that QM is overly complex is a lot weaker against that comparison.)
(Sure, you say, but why this anthropic assumption where your probability is proportional to complexities and quantities of observers? Well, I don’t know, but if it wasn’t the case that the universe was drawn from a computationally coherent distribution, I wouldn’t expect people to be able to reason about it probabilistically, nor would I expect to exist if anthropics did not.)
(Sure, you say, but the quote said ‘infinite’, not just ‘very big finite’. Well, I don’t know, but look, over there, a conveniently timed distraction.)
Comment #567 January 31st, 2022 at 4:35 pm
How one symbolically represents the world is part of the problem.
For starters, there is actually no such thing as a purely mathematical system, because mathematics only exists in the context of human consciousness and agency: people invented the symbols and the meaning of the symbols, people discern the symbols and people manipulate the symbols. A mathematical system is actually a larger thing that includes human consciousness and agency, but the essential human consciousness and agency aspect of a mathematical system is always discounted, i.e. regarded as being unworthy of consideration. To completely represent a stand-alone mathematical system, you would need to use symbols representing the equivalent of human consciousness and agency.
It’s a similar thing when using symbols to represent the world-system. There are necessary aspects of a system that can’t be represented by the usual mathematical symbols:
— It is logically necessary that any differentiated system (e.g. one that we would symbolically represent by equations, variables and numbers) can differentiate itself (i.e. discern difference in the aspect of the world that we would symbolically represent by the equations, variables and numbers).
— And it is logically necessary that any moving system can move (what we would symbolically represent by) the numbers that apply to the variables. But you can’t represent number movement with equations: despite the delta symbols, equations can only ever represent relationship.
That is why a computer program can symbolically represent a WHOLE system much better than mathematical symbols alone can, because a computer program has special symbols that can represent the discerning of difference, and special symbols that can represent steps (e.g. the assignment of numbers to variables).
Comment #568 January 31st, 2022 at 4:39 pm
Very interesting, I would read that book if you end up writing it!
I wonder how much would fall out of the universe being in some way fundamentally discrete in time/space/content yet having no particular grain and additionally having all those aesthetically pleasing symmetries/conservation laws. If there’s a minimum meaningful length but no preferred grid, does that imply things need to be fuzzy in an interesting way? (As should be obvious, I know very little on the subject.) I’m now trying to imagine what the Game of Life has to look like if there’s no grid, no tick of the clock, and no preferred direction or speed.
Comment #569 January 31st, 2022 at 4:45 pm
Assuming that someone has somehow managed to justify the unitary evolution / density matrix / partial trace parts of the QM formalism but not the Born rule, what is wrong with the following justification of the Born rule? (I feel like it is a “standard” argument, but my take on it might be slightly different from the usual one.)
Suppose we wanted to test whether some repeatable process produces some particular outcome with probability at least \(p\). Then we set up a system where we repeatedly set up the experiment from scratch, entangle our measuring device with the outcome, and increment a counter every time the outcome we are interested in occurs (and also increment another counter to keep track of how many experiments we’ve done). At the end of each experiment, we look at the value of our counters, and determine whether or not at least a \(p\) fraction of the experiments had the outcome we are interested in (we do this using a classical Turing machine, implemented in QM via the whole business of reversible logic gates and throwing away junk bits). At the end of each round of this system, we update the value of a single, fixed output bit (really a qubit) which stores the answer to the yes/no question “were at least a \(p\) fraction of the experiment outcomes the interesting one?”
As the system runs, the state of the output bit evolves. We only care about the output bit – we don’t care about all the complicated internal state of the experiment, or the measuring device, or the counters, or the multiply-by-\(p\) gates – we only care about this one, single (qu)bit. So we take a trace, and end up with the density matrix of the output bit, which is a simple two-by-two matrix. Now standard quantum mechanics tells you that this two-by-two density matrix will asymptotically approach either the matrix \(\begin{bmatrix} 0 & 0\\ 0 & 1\end{bmatrix}\) or \(\begin{bmatrix} 1 & 0\\ 0 & 0\end{bmatrix}\), depending on whether the probability \(p\) was larger or smaller than the Born rule’s prediction.
To me, this looks like a derivation of (a frequentist version of) the Born rule from the unitary evolution / partial trace setup together with the *topology* on the collection of two-by-two density matrices. Did I cheat somehow? Is the partial trace rule as hard to justify as the Born rule?
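The asymptotic claim above can be checked numerically without building the full reversible-circuit construction. In the following Python sketch (an abstraction of the commenter's setup, not a literal implementation of it), a qubit has Born probability \(q\) for the interesting outcome; after \(n\) independent runs, the total squared amplitude on branches where at least a \(p\) fraction of outcomes were interesting is a binomial tail weight, and it is exactly the relevant diagonal entry of the traced-out output bit's density matrix. It tends to 1 when \(q > p\) and to 0 when \(q < p\).

```python
import math

def output_bit_weight(q, p, n):
    """Total squared amplitude (a binomial tail weight) on branches where
    at least a p fraction of n independent runs give the interesting
    outcome, for a qubit whose Born probability of that outcome is q."""
    k_min = math.ceil(p * n)
    return sum(math.comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range(k_min, n + 1))

q = 0.36  # Born probability |alpha|^2 of the interesting outcome
for n in (10, 100, 1000):
    print(n, output_bit_weight(q, 0.3, n), output_bit_weight(q, 0.5, n))
# As n grows, the p=0.3 column tends to 1 (since q > 0.3)
# and the p=0.5 column tends to 0 (since q < 0.5).
```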
Comment #570 January 31st, 2022 at 4:52 pm
Scott #563: You still haven’t got the point. In Newton’s and Einstein’s theories the behaviour of force and spacetime are perfectly well-defined through their equations. You know how to transition from state A to state B. That’s all we need.
This is not the case for objective probability. You are in state A, and from that you sometimes go to state B or state C, in such a way that when you repeat the transition many times the frequency of going to state B will be close to 1/3. Probably. How can you not see that the transition rule is ill-defined?
In our world it’s easy to get around this, we just use a QRNG with bias 1/3, and we have a well-defined transition rule. We can’t invoke a QRNG if the universe is classical, though.
If our world were classical we would indeed use subjective probabilities, no worries. I don’t see what you mean by using them in our fundamental laws. Subjective probability? You mean the weights agents use to calculate expected utilities and make decisions in situations of insufficient knowledge? In our fundamental laws? That doesn’t make sense, we need objective probabilities for the fundamental laws. Or are you claiming that the quantum mechanical probabilities are subjective? In that case you are alone with the Bohmians in one corner, while pretty much all physicists agree with me that they are objective. Or what are you claiming? I genuinely don’t understand.
Comment #571 January 31st, 2022 at 5:05 pm
Mateus Araújo #570: Your position here is as baffling to me as the superdeterminists’ position is to both of us.
Why not just have a theory where the “true state of the world” is a classical probability distribution (p1,…,pN), where that distribution evolves in time by stochastic matrices, and where the interpretation of the probabilities is the obvious one—call the probabilities “objective,” “subjective,” or whatever other words you want, they’re the numbers that agents should use if they want to be right about what they’ll see next? I.e., simply the direct classical analogue of MWI, with probabilities in place of amplitudes? How could that possibly be problematic in any way that MWI itself isn’t problematic?
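The theory sketched here is easy to make concrete: the "true state of the world" is a probability vector, and the dynamics is left-multiplication by a column-stochastic matrix, which preserves the 1-norm on nonnegative vectors automatically. A minimal Python sketch, with made-up numbers for a 3-configuration world:

```python
import numpy as np

# "True state of the world": a probability distribution over 3 configurations.
p = np.array([0.5, 0.3, 0.2])

# A column-stochastic matrix: each column is nonnegative and sums to 1,
# so S @ p is again a probability distribution at every step.
S = np.array([[0.9, 0.2, 0.0],
              [0.1, 0.7, 0.5],
              [0.0, 0.1, 0.5]])

for _ in range(50):
    p = S @ p

print(p, p.sum())  # the distribution stays nonnegative and normalized
```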
Comment #572 January 31st, 2022 at 5:14 pm
NT #525:
So in this sense QM is just another symmetry of the laws of physics. Just like we have rotational and translational symmetries of space, we also have a rotational symmetry of the space of “probability vectors”, which gives us QM.
This is indeed an intriguing possibility, and not at all unrelated to what (e.g.) Lucien Hardy does in his derivation of QM!
On the other hand, it’s clear that our universe was not optimized to be “as symmetric as possible.” Many symmetries, like parity and time reversal, exist but are broken, while others, like supersymmetry, are badly broken if they exist. And this isn’t terribly surprising: presumably a maximally symmetric universe would be some perfect, isotropic sphere that nothing ever happened to! 🙂
Thus, one can’t just have a general heuristic of “our universe has property X because it’s more symmetric that way”—one really does have to explain why some symmetries were apparently more important to God than others.
Comment #573 January 31st, 2022 at 5:21 pm
Jester #536:
I was under the impression that your challenge/questions were pertaining to our actual reality, and the curious and peculiar scientific relevance of QM in it.
Of course!
The point, once again, is that to ask for explanations of our actual reality, means to ask “but why wasn’t it otherwise”? And that inherently requires considering other, hypothetical, non-realized ways that reality could have been.
As an aside: while QM and its formalism is indeed extremely important and unavoidable in many respects, there are many scientifically important questions it doesn’t (can’t?!) answer…
Again, of course! While one might eventually hope to explain every facet of existence, in this post I set ourselves the more … “modest,” “achievable,” “warmup” goal of merely explaining QM. 😀
Of course that’s probably still too hard, in which case, the usual approach of a scientist would be to narrow the question still further, and further, until they reached something that they could actually answer. And I strongly endorse that here.
Comment #574 January 31st, 2022 at 5:26 pm
Martin Mertens #549:
Hi Scott, why is it obvious that the universe must allow true locality? Weren’t people fine with the idea that everything instantaneously affects everything else in the days of Newtonian physics?
Oh, I don’t claim that this is obvious—not at all. I merely claim that it seems extremely natural, in a way that QM currently doesn’t (at least to me).
(Note also that, even in the 1600s, Newton was severely criticized by fellow natural philosophers for the instantaneous nature of his gravitational force law, and he accepted the criticism! He just said, reasonably, that it was the best he could currently come up with.)
Comment #575 January 31st, 2022 at 5:30 pm
Aditya Prasad #562:
I’m not asking you to believe that this [Buddhist philosophy] is true, but if it were true, it seems like it would motivate at least some of the structure of QM. In particular, it resolves your question to Yoni regarding why not simple classical indeterminism. Would you agree?
No, sorry, I wouldn’t. You’d need to spell out for me a little more explicitly why a classical probabilistic universe would’ve been incompatible with the Buddha’s teachings. Also, did any Buddhists predict as much before QM was experimentally discovered in the early 20th century?
Comment #576 January 31st, 2022 at 5:34 pm
Everyone: Having finally—finally!!—caught up in my responses in this thread, I’d now like to get back to the rest of my life. 🙂 Thanks so much for participating. Please confine all further comments to responses, rather than starting new topics, and then I’ll close the thread in another day or so (although I might do another post soon reflecting on what I’ve learned).
Comment #577 January 31st, 2022 at 5:45 pm
Thank you, Scott.
Is it fair to say then that your goal, other than the survey, is to do for “quantum” what quantum did for “classical”, i.e. the layer underneath, resp. show that this is not possible, thus “quantum” is the most basic layer, and everything else is on top of it?
(That would be neat, to understand at the end what you meant at the beginning! Also somewhat embarrassing… :-/ Ah well. Better late than never.)
Comment #578 January 31st, 2022 at 6:10 pm
Jester #577:
Is it fair to say then that your goal, other than the survey, is to do for “quantum” what quantum did for “classical”, i.e. the layer underneath, resp. show that this is not possible, thus ”quantum” is the most basic layer, and everything else is on top of it?
Showing either of those things is certainly a worthy goal, yes! But I’d content myself with the much more limited goal of writing the best available survey about the question—a survey that left no relevant known facts unexamined.
Comment #579 January 31st, 2022 at 6:25 pm
Scott #573
“While one might eventually hope to explain every facet of existence, in this post we set ourselves the more … “modest,” “achievable,” “warmup” goal of merely explaining QM. 😀 ”
No, I think we may as well try to ‘explain everything’, at least in terms of fundamental principles. Then I’m sure that QM, including the best interpretation, would be seen as natural and simple.
In fact, I believe I did exactly that (explain everything) in posts #187 and #207, the only problem being that my explanation is still too high-level and general to be that useful at the moment – I just need to fill in the details and work out the math 😀 But basically, ‘Self-Actualization’ is the answer to everything.
Meantime, in terms of specific guesses, I would reiterate what I said later in the thread: ‘geometry’ is the fundamental reality, QM is about ‘generalized statistical mechanics’, and the notion of ‘symmetry’ is very important.
Just an additional comment on statistical mechanics: it’s an interesting point that in classical mechanics, if you have complete information about physical states, the ‘entropy’ of anything is zero, whereas in QM that need not be true. So perhaps QM is inevitable because it enables the notion of ‘entropy’ to be objective?
But I would definitely look more closely at notions of ‘Symmetry’. I understand that symmetry is not everything in physics, but for *explaining QM* in particular, it might just be – Algebra is all about Symmetry.
Comment #580 January 31st, 2022 at 7:02 pm
@Scott #575: When a tree falls in a forest and there is no observer, the wave function does not collapse.
PS You are supposed to build quantum computers, so that Buddha can crack messages of the enemy. And the maximal speed is finite, so that non-Buddhist civilizations do not destroy the whole universe.
Comment #581 January 31st, 2022 at 7:23 pm
Matt Leifer #358: Thanks for the fascinating comment!
I’ve thought to myself before that there are two kinds of quantum mechanics practitioners in the world. (I know that this sounds like the setup to a joke, but it isn’t.) Please correct me if I’m wrong, but I think that my own distinction lines up very closely with your distinction between the CSHS and the CLHS, but it’s pithier (if less well-informed):
1. The first type of person thinks that a pure state is a special case of a general state that happens to be represented by a rank-1 density operator. (Or equivalently, a general state that has zero von Neumann entropy, or one whose uncertainty is “all quantum rather than classical”, although you need to be careful about how you interpret that last claim.)
2. The second type of person thinks that a general state is (loosely speaking) a special case of a pure state for which you don’t have access to all of the degrees of freedom.
(I don’t mean “special case” literally for perspective #2. I just mean more broadly that from perspective #1, pure states “sit inside of” mixed states, while from perspective #2, mixed states “sit inside of” pure states, but in a different sense.)
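The two perspectives can be made concrete in a few lines of NumPy (this is my illustrative sketch, not part of the original comment; the particular states chosen are arbitrary): a pure state’s density operator has rank 1 and zero von Neumann entropy, while tracing one qubit out of an entangled pure state leaves a mixed state, as in perspective #2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i p_i log2 p_i over the eigenvalues p_i of rho (in bits)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Perspective 1: a pure state is a rank-1 density operator with zero entropy.
plus = np.array([1, 1]) / np.sqrt(2)            # |+> = (|0> + |1>)/sqrt(2)
rho_pure = np.outer(plus, plus.conj())
print(np.linalg.matrix_rank(rho_pure))          # rank 1
print(round(von_neumann_entropy(rho_pure), 6))  # ~0.0 bits

# Perspective 2: a mixed state is what a pure state on a larger system looks
# like when you don't have access to all the degrees of freedom.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>)/sqrt(2), pure
rho_ab = np.outer(bell, bell.conj())
# Partial trace over the second qubit: reshape to (a, b, a', b'), trace b = b'.
rho_a = rho_ab.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_a.real)                               # I/2, the maximally mixed state
print(von_neumann_entropy(rho_a))               # 1.0 bit of entropy
```

The Bell-state example is exactly the sense in which mixed states “sit inside of” pure states: the maximally mixed qubit is what remains of a perfectly known two-qubit pure state once half of it is out of reach.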
Comment #582 January 31st, 2022 at 7:48 pm
The key point is that it not only needs to feel like there are many possible futures (and only one past), but that _which one gets chosen_ depends in some way on you, specifically. (I’m well aware that no self-respecting physicist believes that anyone has a _choice_ about which branch gets chosen, although even this isn’t as simple as it looks. All that’s needed for now is that you are the resolution point.)
The only way to accomplish this classically (AFAICT) amounts to essentially solipsism, which God also doesn’t want. You must be interacting with other equally real beings, each of whom has the same ultimate freedom, from their own perspective. And the most straightforward read of QM and the measurement problem (that doesn’t wave it away by pretending that decoherence solves it) lands you in something very much like this position.
It seems that no Buddhist was clever enough to develop this into a physical theory, I admit. And if you were to open up orthodox Buddhist scriptures, you’d be hard-pressed to find these constraints laid out so plainly. But you will repeatedly find analogies comparing life to a dream; awakening being about becoming lucid in that dream; exhortations to achieve total and ultimate freedom; constant reminders that other people, while not exactly “real” either, deserve the utmost compassion; etc. While I’m not sure that’s enough to give you the mathematical structure of QM, it sure seems to force something that behaves remarkably similarly to it. (And although I’m _far_ from enlightened myself, I’ve had enough personal experience to convince me that they’re not far off the mark in their claims.)
Comment #583 January 31st, 2022 at 8:01 pm
Following the (fun) line of thought advocated by Matt Leifer #358-
Is it only QM that has the special property of being self-consistently developed from more than one perspective? I am *not* a physicist, and hence might be wrong about the following (apologies if so), but my lay understanding is that even classical mechanics can be formulated from multiple perspectives (Newtonian / Lagrangian / Hamiltonian).
If this is so, even if we could argue for a physical principle of the form “A reasonable universe should have multiple workable mathematical formulations”, this might not help privilege QM over other theories (even classical mech) as much as we would like.
Comment #584 January 31st, 2022 at 8:57 pm
Scott Says:
Comment #517 January 31st, 2022 at 8:41 am
If everything observable to us is ultimately nothing but the output of a quantum computer, including all the comments, then that cannot contain the answer to why the quantum computer is and why it is not otherwise. These questions are then just patterns in the computer output (like 0.1134 looking like “hello”) which are meaningless to the computer. By making the assumptions you do, you render your question meaningless.
All we can definitively say based on the assumptions is that we are in a quantum computer, and this is what the quantum computer can do. If we want to know why physicality appears roughly at least as quantum computation, surely the biggest hint must be where that physical effect apparently, and mysteriously, disappears from view, namely the measurement problem.
Comment #585 January 31st, 2022 at 9:55 pm
Steven Evans #584: Your argument (such as it is) has nothing whatsoever to do with quantum mechanics, and is reminiscent instead of a centuries-old religious argument against scientific materialism itself. Namely,
“if what you atheistic scientists say is true, then according to your own theory, you can’t have any justified reasons for saying it. You’re just saying whatever you are because the initial conditions of the universe, the laws of motion, your evolutionary imperatives, or whatever determined that you’d say it. Therefore I don’t need to listen to you. HAHA CHECKMATE ATHEISTS!”
The scientists have any number of possible responses—e.g., “forgive us if we still disagree with you; God must not have given us the reasoning power to see why you’re right”—but maybe the best response is simply that this is a textbook case of the genetic fallacy. Even a fallible process can produce valid arguments. Even a mindless computer, if it’s formed or trained correctly, can generate valid arguments (as with automated theorem-proving programs). To reject an argument, therefore, it’s not enough to heap scorn on its origin; you actually have to look inside the argument and articulate what’s wrong with it. There are no “Fools’ Mates” here.
Comment #586 January 31st, 2022 at 9:55 pm
Scott#537
>For starters, QM, and other physical theories, are not “formal systems” of the kind that the >incompleteness theorem talks about. Formal systems are things that we can use to reason >about physical theories.
This is true. You cannot apply incompleteness proofs *to* quantum mechanics the way you apply them to the Peano axioms. QM is not a formal system, and even if one wanted to claim that it was, QM still uses real numbers, whose first-order theory Tarski showed to be decidable, unlike that of the naturals.
However, so many alternatives to standard QM (e.g. hidden variables) have been proposed precisely because of its seemingly “illogical” nature. QM does not follow pre-Godelian, Aristotelian logic, which is why it seems so strange.
But the quasi-paradoxical nature of quantum mechanics fits… perfectly… into the quasi-paradoxical nature of the self-application of formal systems (even formal systems which do not use the effective procedure that Godel and Turing use, e.g. Quine’s protosyntax).
If you want a cartoon version of this, look at Hawking’s Godelian argument for the impossibility of obtaining a ToE (which he later withdrew). He compares our knowledge of physics to 2D maps of the world, where one point must always be excluded, where one category of knowledge excludes another category of knowledge. He makes no mention of QM, yet he is describing complementarity without realizing it! His argument was wrong, but not without merit!
My point is that this is all a priori. The questions you asked about the Born rule, unitarity, complex numbers, etc., fit into this framework as explained in #481. The Born rule, unitarity, etc. are not a priori, but they fit into the a priori framework. My guess is that some of these ideas will evaporate with a quantum theory of gravity (Bryce DeWitt, for example, thought the notion of Hilbert space might not survive a quantum theory of gravity!!).
If you want to use theological speak this is “I am that I am”, only formalized.
Comment #587 January 31st, 2022 at 10:09 pm
Scott #535:
As Mark Twain said: If I’d had more time I would have written a shorter blog post …
Let me copy your previous post here to save the scrollback
*******************************************************************
The Scott Fear:
Scott fears QM is exactly true. By “exactly true” Scott means that QM is the actual operating system of the universe. And by “fear” what Scott means is that Scott may never know why QM must be the actual operating system of the universe.
The Clinton Fear:
Clinton fears QM is exactly true. By “exactly true” Clinton means that QM is the best model humans can find of the universe. And by “fear” what Clinton means is that Clinton can never know if Clinton is just stuck on some island in mathematical theoryspace or if Clinton is deceived by his own neural model of computation.
You said:
“I mean, either
(a) all of our experimental data continues to be consistent with the hypothesis that QM is the “actual operating system of the universe,” or else
(b) it doesn’t.
In case (a), we can simply continue regarding QM as the “actual operating system of the universe,” as best we can tell, subject to the usual proviso that in science you almost never “prove” your theories, you only accumulate more evidence for them or you rule them out.
In case (b), we might or might not be smart enough to come up with the deeper theory, but at least we’ll then know that QM was not the “actual operating system of the universe,” and that its appearance of being so was illusory!”
*******************************************************************
During working daylight hours I’m 110% on board with the scientific method all the way up to a Quine-Putnam ontological commitment to a universal quantum reality … psi-ontic for the win!
But … in the dark hours of the night when the cold shadows of mortality loom … there be doubt …
I’m afraid that either our mathematical enterprise or our neural model of computation could be acting like Descartes’ demon … cooking the books from which we read “our experimental data”. This is not an actual, intentional “demon”, of course, that I’m worried about … but an unintentional evolutionary or developmental accident of our mathematical logic or the grey stuff between our ears. This is not the “sensory” type of demon that Descartes had in mind, simply deceiving us about external “appearances” of reality. This would be a more indirect and subtle demon that could take the form of …
EITHER
(Demon X) Is the mathematical enterprise that gives “our experimental data” its nature (the mathematical/logical fruit “Adam” partook of when first he committed us to use formal systems based in first-order, or propositional, logic). By this, I mean that the experimental data itself could be doubted because it is fruit on a twig of probability theory, which is on a branch of measure theory, itself on the trunk of set theory, and all rising from the roots of first-order or propositional logic. It seems like once we primates climbed this particular tree to have a look around that we may have committed ourselves to end up … out on a QT limb. And … once we are up the math tree we find ourselves naturally dealing with symmetries, measure/probability theories … and inexorably QT is the best-looking branch for us to climb out on and have a look. But, what if we think that this is the “best we can tell” only because we happen to be up a propositional, first-order tree?
OR
(Demon Y) Is the evolved computational hardware of our neural model of computation. Neuroscientific evidence suggests that cognitively (the software) our brains are filling our heads full of illusions and delusions. And on the hardware side: synaptic sites on neuronal dendrites encode complex numbers in the transfer impedance of the membrane amplitude and the normalization of complex amplitudes over possible states for some receptive field represent probabilities. We may not have to go all the way up to a full neural quantum model of computation before we could come under something like a cognitive illusion where all information we acquire “appears” to us like normalized complex amplitudes.
X and Y are not necessarily exclusive.
The reason for bringing up these Cartesian demons is that … either of them could very well be misleading us into thinking that we can/should ask …
Q = Why should the universe have been quantum mechanical?
That is, either a mathematical/logical demon or a neurological demon could be behind condition (a) above.
I realize that I’m stepping outside of the agreed rules of science when I argue that we should not trust condition (a) in more than just the normal sense of remaining vigilant for inconsistent evidence. But then, I think that you also are stepping outside of the agreed rules by not accepting as an answer for Q the rule you yourself gave above that you “can’t prove your theories, you only accumulate more evidence for them or you rule them out.” Aren’t you trying to prove why the universe should have been QM?
So, help me with the exorcism 🙂
How do we exorcise Demon X ? And, remember, it’s no good pointing to condition (a) because the condition of (a) may be due to Demon X. The best argument I can think of to exorcise Demon X is that the possibility of a Demon X existing would cut down every potential tree we might want to climb … if the illusion comes with the nature of the tree …
How do we rule out Demon Y ? I mean … if the only form of information we can possibly know will always look to us fundamentally like a complex amplitude (or the normalized complex amplitude). How would we escape from that? I would like to think there could be some kind of computational answer here – something like universality that would say “No Demon Y can’t trap you in your own computational model.” But I’m not sure, especially if the model always converts all information into amplitudes – whatever form it is in externally – if that would even be a form of “information” …
Scott, thank you for hosting this discussion. Thank you (as always) for the incredibly charitable giving of your time and expertise.
Comment #588 January 31st, 2022 at 10:50 pm
Mateus Araújo #513: I would accept your argument about how branching gives rise to objective probabilities if it could explain the form of the resulting probabilities. This would be the case if the branches were equally likely (at least for a finite number of branches). However, the probability is not uniform; it’s given by the Born rule. So branching doesn’t derive the Born rule (though it’s certainly compatible with it); the rule is imposed separately. So I have a hard time accepting branching as the explanation of probability in our world. One might as well say that QM leads to objective probabilities via the Born rule; bringing in branching seems superfluous.
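The numerical gap between branch counting and the Born rule is easy to display (a toy illustration of mine, not from the comment above; the state is an arbitrary choice): for a state with unequal amplitudes, counting branches uniformly simply gives the wrong numbers, so the rule has to come in as an extra ingredient.

```python
import numpy as np

# A qubit state with unequal amplitudes: sqrt(1/3)|0> + sqrt(2/3)|1>.
psi = np.array([np.sqrt(1 / 3), np.sqrt(2 / 3)])
assert np.isclose(np.linalg.norm(psi), 1.0)        # properly normalized

born = np.abs(psi) ** 2                            # Born rule: p_i = |amplitude_i|^2
uniform = np.full(len(psi), 1 / len(psi))          # naive "count the branches"

print(born)     # approximately [0.333, 0.667]
print(uniform)  # [0.5, 0.5]
```

Both distributions are perfectly consistent probability assignments; only experiment (or an extra postulate) singles out the first one.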
Comment #589 January 31st, 2022 at 11:01 pm
Clinton #587: No, I really don’t think I’m violating my own rules. In science, you can’t demand that the universe give you a deeper explanation of something, but you can always ask. You can always look for a deeper explanation, bearing in mind that you might not find one. And even the deeper explanation probably won’t consist of a mathematical proof. I.e., even when there’s a rigorous derivation of an old theory’s conclusions from a new starting point (e.g., Kepler’s laws from Newtonian mechanics, Newton’s gravitational force law from GR, thermodynamics from statistical mechanics…), the new starting point still has to be accepted on the basis of “mere” empirical evidence and/or reasonableness, not proof. Nevertheless, this is the main way that scientific progress happens. And it’s been pretty successful.
And I see no reason why I need to believe someone who points to a theory originating in experience (like QM), and expresses certainty that this time we’ve surely hit rock bottom and will never, ever successfully explain it in terms of anything else. Maybe so, but what’s the source of the person’s certainty? What’s the harm in seeing if we can make the theory a little more natural- or inevitable-looking, e.g. by rederiving it from a different starting point?
Comment #590 January 31st, 2022 at 11:10 pm
Scott #555:
I thought you’d never ask!
First let me straighten out one thing: deriving a “complex unit vector evolving unitarily in a tensor product Hilbert space” is part of Q2; my comment was aimed at Q1. I’m only suggesting that the kinds of postulates you need in Q2 look “weird” to our intuition because we are looking at them from a particular angle; but maybe we can find a home for one of them (thereby resolving Q1) by shifting our perspective.
My current source of inspiration is the tradition of phenomenology, which — oh bitter irony! — is part of Western continental philosophy and most definitely historically dominated by white men such as Husserl, Heidegger, Sartre, and Merleau-Ponty. That said, phenomenology does have intriguing overlaps with Buddhism and has also influenced feminist philosophy via Simone de Beauvoir. But more to the point: you’ll be hard-pressed to find a philosopher of quantum physics today who admires phenomenology (and if you do, please introduce me); that is enough to place it outside of the usual boxes that we physicists are used to.
How specifically might phenomenology help to answer “Why the quantum”?
In “CSHS” interpretations, particularly “Copenhagenish” ones (borrowing terms from Matt Leifer #358) like those of Rovelli, QBism, Healey, Grangier, and others, there is an emphasis on “observation”. Although often used negatively, as in ‘an outcome doesn’t exist unless it is observed’, the implication is positive: that things do exist just in case they are observed. If you filter that idea through a (non-phenomenological) Western philosophical lens — the kind that makes a Cartesian split between a proactive ‘knower’ and a passive world just lying out there waiting to get ‘known’ — then it does sound like some variant of solipsism. But if you read it through the lens of phenomenology, the idea comes alive as a potentially constructive thesis, against whose backdrop a suitably formulated quantum principle just might appear natural.
Phenomenology tells us that the ground-level of reality lies in what is observed, and everything that claims existence in physics — black holes, atoms, the big bang — has grown out of and has its roots in perception. The difference is subtle but important. When observations are taken to confirm the existence of a physical object, we usually interpret our observations as revealing a pre-existing truth that was formerly hidden from observation. On this usual account, observation is secondary to what is real: the objects of physics lie behind observed phenomena, as behind a veil or a screen, like puppet masters putting on a show for us.
Phenomenology flips that view on its head. When observations “confirm” that an object exists, they just as much contribute to its existence. The objects of physics are not constructed behind the observed phenomena; they are built upon them and made out of them; they are imposed upon perception, and if they fit well, then we weave them into the tapestry of our reality.
Turning now to QM: one aspect of perception is that it is inherently contextual, since every observation occurs in some context. Therefore contextuality is the default, and it is non-contextuality that becomes mysterious. How is science able to transcend the contextuality of nature to make claims that are (largely) non-contextual? How is classical physics possible in this world? If we reject the initial supposition of “hidden variables” behind what we observe, then why can we in most cases get away with pretending that they are there? Notice how the shoe is on the other foot: we are now contemplating a world where quantum theory fits more comfortably than classical physics!
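As a concrete anchor for what “contextuality” means quantum-mechanically (my illustration, not part of the comment), the Mermin-Peres magic square shows that no assignment of fixed, context-independent ±1 values to nine two-qubit observables can reproduce the algebraic constraints QM places on them:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Nine two-qubit observables in a 3x3 grid (one standard layout of the square).
square = [
    [np.kron(I2, Z), np.kron(Z, I2), np.kron(Z, Z)],
    [np.kron(X, I2), np.kron(I2, X), np.kron(X, X)],
    [np.kron(X, Z),  np.kron(Z, X),  np.kron(Y, Y)],
]

# Every row of observables multiplies to +I ...
for row in square:
    assert np.allclose(row[0] @ row[1] @ row[2], np.eye(4))

# ... while the columns multiply to +I, +I, -I.
col_signs = []
for c in range(3):
    prod = square[0][c] @ square[1][c] @ square[2][c]
    col_signs.append(1 if np.allclose(prod, np.eye(4)) else -1)
    assert np.allclose(prod, col_signs[-1] * np.eye(4))
print(col_signs)  # [1, 1, -1]

# A noncontextual assignment of fixed +-1 values to the nine observables would
# force the product of the three row-products (+1) to equal the product of the
# three column-products (-1), since both multiply the same nine values.
# No such assignment exists: measured values must depend on context.
```

The contradiction is purely arithmetical: the same nine hypothetical values would need to multiply to +1 (row by row) and to −1 (column by column) at the same time.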
You will (rightly) complain that making vague references to “contextuality” is still a long way from deriving the very particular structures of contextuality that lead to Hilbert spaces. Obviously to do that we need to make some substantive postulates.
So, here is my constructive proposal. Take a list of all those postulates that people have proposed in reconstructions of QM — the ones which struck you as “weird” and “unmotivated” — and revisit them with a thoroughly phenomenological mind-set. Maybe this time one of them will suddenly jump out at you as a perfectly natural constraint to impose upon the phenomenological world, much like the “relativity principle” or “constant speed of light” seemed like perfectly natural postulates to make in Einstein’s world.
Addendum: Besides phenomenology, I also think neo-Kantianism, American pragmatism, and enactivist cognition might be fruitful places to look, and I am probably ignorant of many others. For those curious about phenomenology, a good entry point is Zahavi’s book.
Comment #591 February 1st, 2022 at 12:02 am
Jacques Pienaar #590: Would you agree that dinosaur fossils, let’s say, were really there in the ground for 65+ million years before anyone dug them up, records of a vanished world that actually existed—that none of this was “constructed by the act of observation”? If so, then is the thesis that while the early phenomenologists were wrong about all the stuff they thought they were talking about, they were nevertheless right about quantum states, which they didn’t know they were talking about? 🙂
More fundamentally, my objection is that this game seems too easy. The Continental philosopher gets to, as it were, fling mud indiscriminately at anything that seems like it might be important to Western science: objectivity, determinism, predictability, locality, noncontextuality, Boolean logic, reductionism… And then, if science painstakingly comes to the conclusion that one or more items on that list really do need to be rejected, or even just revisited or reinterpreted, the Continental philosopher gets to do a victory dance, as if he or she deserves glory for a brilliant premonition.
But despite what everyone says, questioning established postulates is trivially easy, like sailing out into the open sea! The hard part is reaching land—i.e., finding clearly-stated new postulates to replace the old ones.
Even so, I don’t know of a single thinker, before QM, who specifically proposed that the way we calculate probabilities might become wildly wrong at the scale of atoms—if anyone can suggest a reference, let me know! (If I remember right, William James did speculate that Newtonian determinism might be found to no longer hold at the atomic scale, and I do respect him for that.)
Comment #592 February 1st, 2022 at 12:02 am
Scott Says:
Comment #585 January 31st, 2022 at 9:55 pm
OK. But the quantum computer would need to output verifiable details of its origin; and since the same output of a computer can be achieved on different instantiations (a tape, a chip), there is no reason to think the computer can output verifiable details of its origin.
Also, if we assume physical observations are simply the output of a quantum computer then any conclusions would apply equally to a quantum computer that doesn’t underpin a physical universe. The main issue about how fire is breathed into the equations is being lost.
The measurement problem does seem to have some connection with how equations are made physical, though; because some of the physical effect apparently disappears from view.
Comment #593 February 1st, 2022 at 1:49 am
Scott #542:
First, thank you very much for addressing this question at all.
I had heard of constraints, or axioms, #2 and #3 in your formulation. The cryptosystem in #396 passes – as far as I can see – both tests. In the case of #2, the algorithm to check the proposed answer is the decryption algorithm. In the case of #3, the hashing function, each iteration of the preimage generator, and each iteration of the decryption algorithm are all in P for known (existing) cryptosystems.
In the case of your axiom #1, however, I had not heard that stated in so many words, and I thought I had read Cook’s problem statement fairly carefully. I see multiple plausible interpretations of your axiom #1:
1: “P vs. NP is indeterminate over any system containing an oracle.” It certainly is; I thought I had bypassed this by not using an oracle. The cryptosystem in #396 does not “hardwire” any of the information needed to solve the problem. This information is contained entirely in the private key and the public key. My approach looks like it starts with not enough information, but that’s because it’s deliberately run backwards: we start with all of the information, store a public key in one subroutine that will accept only one private key, scramble the private key within another subroutine, and force yet another subroutine to unscramble the private key. We verify, in other words, the original private key – not the fact that a key that we generate can be scrambled into the same ciphertext. The scrambling is not used for encryption – it’s very weak, and if used to generate keys, would be cracked by the first guess by the preimage generator. It’s used to create such a highly-many-to-one mapping that guessing the specific preimage that was used is difficult: the task in your #3.
2. “Lossy functions are not in P, by definition.” I think it’s unlikely that you meant this.
3. “Preimage generators are not in NP, by definition, because all of the information needed to solve the problem must be input to the algorithm performing the guessing.” If you say this is true, I defer to your judgment – but I still wonder how this is possible. If we have enough information to know which guess to make, to what extent are we really guessing? It’s like having a maze with arrows on the floor; the whole point of a maze is you don’t have enough information at the beginning to know which way to go. The lack of information is the reason you have to guess. Now, sure, all of the information must be available to the overall algorithm – but not necessarily to the guessing subroutine. If this case is outside the scope of the problem, by definition, that sure is interesting, since I haven’t seen any statement to that effect in the literature. I equally-sure am open to hearing it, though!
4. “Once you scramble the private key, the cryptanalysis is outside the scope of P vs. NP.” Depending on definitions which I’m sure you know better than I do, this may be true – but then, why would anyone think that P!=NP implies that public key cryptography is secure, or can be made to be secure, and that P=NP implies that it cannot be made to be secure? If it were outside the scope of the problem to brute-force the system, why would anybody be talking about P vs. NP in this context? After all, a brute-force attacker doesn’t have the private key, and therefore doesn’t have all the information needed to solve the problem. If this is what you meant, then brute-forcing a key is outside the scope of P vs. NP, whereas much of the literature seems to discuss just this. So, I’m assuming this isn’t what you meant, either.
5. “The analysis is valid; it just doesn’t imply anything about P vs. NP due to the particular way that the P vs. NP problem is defined by convention.” In this case, we can discard the P vs. NP talk and discuss determinism and non-determinism, as it occurs outside of the P vs. NP problem. (This addresses the original topic of the thread, and this is why I brought up the whole topic.) If correct, my initial argument in #386 would establish that information loss in the equation of state of a process causes the reversal of the process to be fundamentally non-deterministic, which is a useful result if we’re trying to identify whether all non-deterministic systems can be modeled deterministically. I argued above that this is not possible. This appears to be in agreement with current data: we can determine the probability distribution of a superposition, but the outcome of the collapse of the wavefunction is fundamentally unknowable. Information about what will happen has been destroyed by the operation that composed the system’s state. However – you wrote the book on this, literally, and if it were valid, you presumably would have thought of it.
6. “Peterson is a dummy; he hasn’t established even this much.” I’m more than willing to take this interpretation, but I’m curious as to why. I may have misinterpreted what you said; I certainly appear to have misinterpreted Cook’s problem statement if what you say is correct. Is there a formal definition of your axiom #1 somewhere in the literature? I read your 2008 paper on algebrization and didn’t find anything (other than interpretation #1 above) that looked like it in there.
Would it perhaps be better if I posted a single, cohesive, formal statement, instead of relying on a correction to a correction to an argument? That might clear up what I’m really trying to get at.
Comment #594 February 1st, 2022 at 1:50 am
Scott #470 “How could I have said this more clearly?” I feel compelled to answer this, in a post-last post…
You said it clearly, but it simply does not fit into my mindset. To take up the restaurant metaphor again: as a physicist, I consider it my duty to deal with the dish in front of me; there is a lot to explore and understand, and when doing that I’m standing on the shoulders of giants, as the saying goes. But as an ambitious theoretical computer scientist, you want much more: you want to be the Grand Chef and redesign the whole menu – well…
In order to get an idea about how physics works, I recommend that you have a look at the article ‘Why some physical theories should never die’, by Olivier Darrigol, available at http://www.sphere.univ-paris-diderot.fr/IMG/pdf/Martins_paper.pdf . In my words, it explains that a new, more refined physical theory never totally replaces the previous one; some bits and pieces always remain as a background, absolutely required just to build up new experiments based on well-known physics and techniques. Maybe this was not so apparent – though already true – in classical physics, until we came to QM, which makes no sense if taken out of the classical context that allows us to define experiments (sorry again for being neo-Bohrian). QM alone predicts ghostly behaviors and self-multiplying worlds, in complete contradiction with obvious empirical evidence. OK, you want the world to be ‘quantum only’, and you don’t want any reference to empirical evidence; unfortunately I cannot follow you on these grounds.
Comment #595 February 1st, 2022 at 2:47 am
Scott #517 “we’re now 515 comments deep in a thread about why the universe is quantum, why the right model of computation for our world appears to be quantum computation?”
I share the view that the question of the “right model of computation of the world” might be important also for understanding QM and “why QM”. I am not sure how common this view is among physicists and mathematicians, so the original post and the question “Why QM” make sense also for those who do not regard computation as part of the answer (and also my previous comment was computation-free).
Scott and I differ on the answer to the question about the “right model”.
The universe being quantum leaves two possibilities for “the right computational model of the world”:
The first is noisy quantum computation below the noise level required for quantum fault tolerance.
The second is noisy quantum computation above the noise level required for quantum fault tolerance.
Scott’s view is that the right model of computation of the world is “noisy quantum computation with noise rate below the noise level required for quantum fault tolerance,” while my view is that the right model of computation of the world is “noisy quantum computation with noise level above the noise level required for quantum fault tolerance”.
My brief speculation (#454) for “why QM?” is that QM is required to explain probability, to explain chaos, and (NEW) to explain the emergence of geometry/space-time. Furthermore, I think that understanding probability, chaos, and geometry (in the physical context) depends not only on identifying QM as the fundamental physics theory but also on realizing that “a world devoid of quantum computation” represents the correct computational model.
Comment #596 February 1st, 2022 at 2:53 am
Jester #532 and Scott
– I guess so; the point I am grappling with is, how does QM work from the perspective of (say) an atom? How does it know what to do? Because the QM calculations are (probably) not taking place locally where it is –
This is the way I see it
An atom “knows what to do” as much as we do. The difference between us and the atom is that we are complex enough, through the perks of evolution, to understand what’s happening (we have the circuitry within our bodies to generate meta-objects like the concept of a material object, to represent self-reference and recursivity, etc.)
Unlike the atom, we recognize that the world is predictable, and we try to work our way up from there using the recursive, self-referential language of meta-objects: QM is where we landed (so far, at least)
Back to Scott’s question whether anything like a complex object pondering the concept of recursion would be possible in a classical universe: I’ve been trying to argue that the answer is “possibly, but it would require Rube Goldberg concoctions that would make the string landscape look simpler than a Platonic solid by comparison”
Comment #597 February 1st, 2022 at 4:07 am
Scott #571: Since you think that I’m on a superdeterministic level of insanity, I think I’m forced to clarify that the stuff I’m saying about subjective and objective probabilities is uncontroversial. There’s a huge literature on the topic, and pretty much everybody who has studied it agrees with me. What is controversial is my claim that Many-Worlds solves the problem.
As for your classical Many-Worlds theory, it is fine; I mentioned it was a solution way back in comment #145. Now the state of the world is not a string of n bits that evolves probabilistically in an ill-defined way, but a vector of 2^n real numbers that evolves deterministically via a stochastic matrix. The transition rule is now well-defined. The weights p_i are literally the amount of worlds of each kind, and probability theory reduces to measure theory.
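For concreteness, here is a minimal numerical sketch of such a transition rule (the stochastic matrix below is random, standing in for whatever the actual dynamics would be):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                                   # n bits -> 2**n = 8 kinds of world

# State of the classical Many-Worlds theory: 2**n nonnegative weights
# summing to 1 (the "amount of worlds" of each kind).
state = np.zeros(2**n)
state[0] = 1.0                          # start with all worlds in |000>

# A column-stochastic matrix: each column is a probability distribution
# over successor configurations, so the evolution of the weight vector
# is deterministic and well-defined.
S = rng.random((2**n, 2**n))
S /= S.sum(axis=0)

for _ in range(10):
    state = S @ state
    assert np.isclose(state.sum(), 1.0)  # total measure is conserved
    assert (state >= 0).all()            # weights stay nonnegative
```

The analogy with QM is then direct: replace the weight vector by a vector of 2^n complex amplitudes and the stochastic matrix by a unitary.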
Now please don’t say that the probabilities are objective or subjective or whatever; that’s just intellectual vandalism. You were explicit that the vector of probabilities was the true state of the world, and that makes them objective probabilities. Objective probabilities are a property of the world; subjective probabilities are a property of your head. Objective probabilities exist without any agent to think about them; subjective probabilities do not. You can have subjective probabilities about deterministic phenomena, but there are no objective probabilities involved (except of course in the case of branching). Do you see that they are completely different things?
Comment #598 February 1st, 2022 at 4:29 am
Mateus Araújo #513:
Even if perfect “objective probabilities” might not exist, it still makes sense to try to explain which properties they should have. This makes it possible to better judge the quality of the approximations used as substitutes in practice, for the specific application at hand.
—
I wrote: “Indeed, the definition of a truly random process is hard.” having such practical issues in applications in mind. In fact, I had tried my luck at a definition, and later realized one of its flaws:
The “unpredictable for anybody” is a mistake. It must be unpredictable for both my opponents and my proponents; but if some entity like nature is neither my proponent nor my opponent (or at least does not act as such), then it is unproblematic if it is predictable for her. An interesting question is whether I myself am necessarily my own proponent, or whether I can act sufficiently neutrally that using a pseudorandom generator would not by itself violate the randomness of the process.
(Using a pseudorandom generator gives me a reasonably small ID for reproducing the specific experiment. Such an ID by itself would not violate classical statistics, but could be problematic for quantum randomness, which is fundamentally unclonable.)
—
I had tried my luck with highlighting important properties randomness should have for specific applications before:
The general attitude of that text is quite similar, but my newer elaborations are different in important details.
—
I guess you (=Mateus Araújo) had something completely different in mind, when asking for a definition of objective probability. But my attempts at definitions are not so far off, if you take into account my gut feeling that “true randomness” can be simulated/approximated by mathematical structures:
Comment #599 February 1st, 2022 at 4:52 am
Feynman, Witten, Maldacena and several others have urged us not to ignore John Wheeler’s argument. Yet his approach never really appears in lists of quantum interpretations or approaches to “unification” (Wheeler’s mantra was “How come the quantum? How come existence?”). His “it from bit” is often misinterpreted, sometimes as classical, because it doesn’t use the term “qubit”, even though “it from bit” preceded the term “qubit” by 5 years. The exception is people like Maldacena, Witten, and some others, who have understood Wheeler’s claims and defended them as possible, if not plausible. I am afraid the approach will go ignored for quite a bit longer, missing inroads to new areas of research.
The Gödel argument, when presented on a blackboard, is either accepted easily and with shock (“Oh I see, wow!”) or dismissed because of some misunderstanding about the claim. For example, one critic confused semi-decidability with undecidability, and thought this whole Wheeler approach was connecting semi-decidability with QM instead of undecidability. The conversation sounded like the cow/daughter confusion scene from Fiddler on the Roof.
If there is any fogginess, I encourage you to think twice about throwing away everything about this approach. I encourage you instead to focus on learning the subtle differences between semi-decidability and undecidability, between decoherence and measurement, then revisiting the argument. Primers on these subjects will appear on jawarchive.wordpress.com, as well as a proper formal treatment of Wheeler’s argument in the future for anyone who wishes to understand more.
Like Feynman, like Witten, like Maldacena, like many others, I encourage you to look into Wheeler’s approach. With a soon-coming formalization of Wheeler’s ideas with applications, hopefully any fogginess and misunderstanding will be cleared up! We can and we will understand Wheeler’s question “How come the quantum?” or as Scott puts it “Why quantum mechanics?”
Good luck everyone on your search!
Comment #600 February 1st, 2022 at 4:53 am
murmur #588: One needs branching in order to define what objective probabilities are. Saying that an event E happens with probability 1/3 means that we branch and in 1/3 of the worlds event E happens. This explains why it is impossible to predict whether E or ¬E happens: because both do happen, and you will have future selves experiencing each of them. It also explains the law of large numbers: if you do a sequence of measurements, in most worlds the frequencies you obtain will be close to 1/3 (it can be proven rigorously).
This doesn’t tell you what the probability of a given event is; to answer that you need to decide how to count branches, i.e., to define a measure over them. Well, since we have the law of large numbers we can find it out experimentally, and it is clear that if we count branches according to their 2-norm we fit the data very well. That’s enough for me.
You claim that the probability is not uniform, but this is not true. There’s no such thing as “uniform” by itself, mathematically we can only say uniform relative to some choice of measure (often the Lebesgue measure). Well the probability is uniform with respect to the 2-norm measure.
You seem to advocate for the measure where we count each measurement result equally. I already explained in my comment #362 why this doesn’t make sense, but maybe it’s worth citing a more sophisticated argument. A recent paper by Saunders shows that if you insist on counting branches equally but respect their decoherence structure, you go back to counting them according to their 2-norm, modulo some rounding errors.
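The “in most worlds the frequencies are close to 1/3” claim is easy to check numerically in a toy model (a brute-force sketch; the exact numbers are just binomial tail sums): take a binary measurement whose branches carry 2-norm weights 1/3 and 2/3, repeat it n times, enumerate all 2^n branches, and add up the measure of the branches whose observed frequency lands within 0.1 of 1/3.

```python
from itertools import product

# Branch weights of a single binary measurement (2-norm measure).
p = 1/3                                      # weight of outcome 0
results = {}
for n in [5, 10, 20]:
    typical = 0.0
    for branch in product([0, 1], repeat=n):   # all 2**n branches
        k = branch.count(0)                    # how often outcome 0 occurred
        weight = p**k * (1 - p)**(n - k)       # measure of this branch
        if abs(k / n - p) <= 0.1:              # frequency close to 1/3?
            typical += weight
    results[n] = typical
    print(n, round(typical, 3))
# The total measure of "typical" branches grows toward 1 as n grows.
```

So under the 2-norm measure, almost all of the branch measure ends up on worlds whose relative frequencies match the probabilities, which is exactly the law-of-large-numbers statement above.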
Comment #601 February 1st, 2022 at 6:50 am
Scott #104:
Wow, this is impressive. And it nicely mirrors the spirit of this blog post and the constructive atmosphere of the discussions in the comments.
What I like especially about it is the concrete illustration of how progress happens by small steps. (Well, I admit that developments like MIP*=RE are big steps, but if you dig into this big step, you learn that it was enabled by many smaller steps.) I find it important to encourage the public discussion of small steps. To the individual researcher, a small step (like a sharpened image) might feel nice on a personal level, but seem “well known” within a sufficiently small community at the larger level. And sharing such small steps in a way that others can enjoy them too can be a larger effort than is apparent from the finished 2-3 pages of text and images:
Comment #602 February 1st, 2022 at 8:05 am
It would be interesting to get feedback from Scott on whether this thread of 600+ posts has moved his needle in any direction, even just a tiny bit.
And if not, why does he think that’s the case?
E.g. a reflection of the blog’s audience, a reflection on the question itself, or something about himself, etc.
Comment #603 February 1st, 2022 at 8:18 am
gentzen #598: I’m afraid you misunderstood me. I’m certain that “perfect” objective probabilities do exist. A simple photon shining on a beam splitter is an example, and there are countless more from quantum mechanics. I don’t see why it is a mistake to ask for a phenomenon that is unpredictable to anybody; we do have that. What I’m asking for is a mathematical definition of objective probability; this is the century-old difficulty.
Comment #604 February 1st, 2022 at 8:27 am
The chapter “Question 10: Reconstructions” in Maximilian Schlosshauer (editor), “Elegance and Enigma – The Quantum Interviews” (2011), contains the opinions of 17 researchers on a question similar to this blog post’s, including Tim Maudlin and Lucien Hardy. Many participants in the comments here probably know this already, but I guess only very few know the following information from its preface:
(Sorry for replacing “unreliable” information by …)
I find the way that John Wheeler frames his “Why the quantum?” question in the conversation with Paul Davies in that older book unexpected and illuminating:
Comment #605 February 1st, 2022 at 8:33 am
Scott #288
“…if someone in the 1800s had asked “why is classical mechanics true? why does it have the features it does?,” with hindsight that would’ve been one of the best questions they could possibly have asked! Because it would’ve had nontrivial answers!”
Indeed, Kant’s 1786 “Metaphysical Foundations of Natural Science” is not trivial at all…
Comment #606 February 1st, 2022 at 8:41 am
Anbar #596:
“An atom “knows what to do” as much as we do. The difference between us and the atom is that we are complex enough (…) to understand what’s happening”
But this is what I meant when I asked whether QM is something we complex organisms do as calculations, and whether whatever happens in nature all by itself is THE SAME! So nature does QM without knowing it? But how? What is the “mechanism”?
Apparently this doesn’t strike anyone else as odd, though, so I guess I will ascribe it to my own peculiar view.
Anyway, thank you for your reply.
Comment #607 February 1st, 2022 at 8:48 am
Thanks, Scott, for quite a wild ride. I hope your foot will heal completely and sooner rather than later.
I’m glad I’ve followed along every twelve hours or so, otherwise I would surely have never started. The comments you’ve picked out as good have seemed good to me, but there have been many others that I’ve thought worthwhile. I look forward to your summary of what you’ve learned.
One aspect it’s been interesting to see has been the perennial misunderstanding of what you intend by your “Why?” questions. I think I still don’t quite understand what you think would be good answers, so please forgive me that my comments have focused more on “How is it that?” At least, I think that’s approximately what I have tried to do. I suppose I take the view that the way to have an answer to a “Why?” sneak up on me is to answer as many “How?” questions as I can think of, from as many points of view as I can find and partially assimilate from other people, and hope.
In response to J #583, therefore, “even classical mechanics can be formulated from multiple perspectives (Newtonian / Lagrangian / Hamiltonian)”, I find it helpful to add to that list Koopman’s Hilbert space formalism for classical mechanics (Koopmanian? Koopmannian?), and even more so to use an algebraic approach to Koopman classical mechanics (there are other formalisms, of course). That gives us, I suppose, a way to answer “How can we make classical and quantum similar enough that the question ‘Why should the universe have been quantum-mechanical?’ is not quite as much a concern?” I find Koopman approaches to CM helpfully different from Wigner function approaches to QM, although of course the latter are also good to know.
Another perspective I find helpful is an algebraic approach to probability theory, which becomes quantum probability theory with just the addition of noncommutativity, I think because that approach abstracts away from concerns about dynamics and causality, which we can add back in after we have understood the role of “collapse” when dynamics and causality are absent.
I apologize if all this is too mathematically distant from an answer to a simple “Why?”
Comment #608 February 1st, 2022 at 9:02 am
Mateus #597
I don’t follow your logic. It seems to me that maybe you reject the privileged position of an observer in quantum mechanics. A sufficiently large number of experiments (similar to GHZ, say) confirm a superposition state prior to measurement. There is no basis to subdivide the pre-measurement state with individual probabilities. Objective probabilities can only be assigned to the outcome of an observation, and those are objective probabilities as attested by voluminous experimental evidence. The objective probability is possible only in conjunction with an objective observation. There just is no reality to subdivide pre-measurement.
GHZ and similar experiments scare me, by the way: they provide good evidence that reality really does behave in accordance with QM.
Comment #609 February 1st, 2022 at 9:38 am
Scott #565:
You say QBism ignores the ‘real question’. Yet here is a quote from Fuchs’s 2016 paper ‘On Participatory Realism’ that emphatically says the opposite:
Fuchs has said similar things repeatedly in various places since the very beginning of QBism. Here’s one from a 2001 (!) paper he wrote while at Bell Labs, before he was even fully QBist (my emphasis):
If I had the time, I bet I could come up with a dozen more like that. So I am really baffled why you seem to think — along with Mateus Araújo #398 — that QBism says `nothing is real, nothing to get hung about’. Did you somehow skip these parts of QBist literature? Or are you simply committed to the idea that whatever is real has to reside in the quantum state?
Comment #610 February 1st, 2022 at 9:52 am
BTW, in case no one’s posted a link to it already, there’s a book about the modern attempts to answer this question (“By contrast, the focus of the new wave is the reconstruction of quantum theory from physical principles. Contemporary researchers are looking for an answer to Wheeler’s famous question “Why the quantum?” [17] and are driven to understand the origin of the formalism itself.”).
Comment #611 February 1st, 2022 at 10:30 am
Philippe Grangier,
“In an EPRB experiment Alice’s measurement does not « cause » Bob’s result”
OK.
“…which is anyway undefined as long as Bob has not decided about an orientation.”
I think I’ve corrected you twice: in my argument the detectors do not change orientation. You are again answering a different argument.
“Her measurement only allows Alice to make a contextual (and probabilistic) inference about Bob’s result.”
In my argument the inference is exact (100% probability).
My argument is different from the original EPR one. It does not depend on counterfactuals (unperformed measurements). It doesn’t matter how the records are manipulated and sent from A to B or whatever. At the end of the day you have a piece of paper showing two columns of space-like separated measurement results, perfectly anti-correlated. Just tell me how your model explains those printed values.
Comment #612 February 1st, 2022 at 10:55 am
Mateus Araújo #603:
Your challenge was to try to define objective probabilities, if I understood you correctly. I gave it a try, and I claim that it goes beyond mere subjective probability. I don’t want to claim that it exactly captures the required properties of objective probabilities, and my excuse is that all my attempts in the past later turned out to have “room for improvement”.
My understanding is that your own shot at your challenge was: “The only solution I know is defining true randomness as deterministic branching, aka Many-Worlds.”
A good mathematical definition should be invariant under isomorphism. But if I have a model of the world, I can always “adjoin” an “all knowing pure observer”, and get an isomorphic model. Therefore I must limit the “unpredictable for anybody” in my definition to the “relevant stakeholders” in such a way that it excludes at least superfluous “all knowing pure observers”.
Comment #613 February 1st, 2022 at 11:10 am
Jester #606
“ But this is what I meant when I asked whether QM is something we complex organisms do as calculations, and whether whatever happens in nature all by itself is THE SAME! So nature does QM without knowing it? But how? What is the “mechanism”? “
Why should there be a mechanism in the first place? The ether was not needed, in the end. Besides, it’s not like the Earth would need to calculate its orbit in a classical universe 🙂
I probably don’t understand what you mean by “how”… QM describes a self-consistent way (and possibly the most efficient, from a design point of view -> Q2) for complex systems to make sense of a predictable, non-Rube-Goldberg universe, like the one we seem to inhabit
Coming to Q1, the obstacle is that all the biological processes depending on molecular details – like those that make one molecule an enzyme while a similar molecule (or maybe a switched-off version of the same molecule) is not – would essentially need their own fundamental description and individual designs, rather than being emergent and naturally selected… definitely not a task worth the time of a timeless being
Comment #614 February 1st, 2022 at 11:31 am
I just finished implementing Heisenberg-Weyl gates in Cirq and remembered this post. Addressing Q2: it seems otherwise impossible to count in half-steps on a qutrit. I guess you could say a solution with only real numbers is possible if you do something like a half step being 0.5(|001>) + 0.5(|010>) + 0.0(|100>) and such, but that ends up being a non-smooth solution. I’m not sure whether a smooth solution using only classical probabilities exists, but regardless, the complex solution seems so much more elegant (I’m new to the field and am still kind of blown away by how nicely that turns out). So maybe “you can count to 3 by halves” is the axiom that unfolds into Q2.
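That intuition can be poked at numerically (a toy sketch, not anything from Cirq): take the qutrit shift X|j> = |j+1 mod 3> and build a “half step” H with H² = X from X’s eigendecomposition. The result is a perfectly good unitary, but it has negative entries, so it is not a stochastic matrix, i.e., no classical probabilistic process implements this half step (whether some other smooth classical workaround exists is a separate question, as the comment says):

```python
import numpy as np

# Cyclic shift ("counting") on a qutrit: X|j> = |j+1 mod 3>.
X = np.roll(np.eye(3), 1, axis=0)

# Build a "half step" H with H @ H = X from the eigendecomposition
# X = V D V^dag (the eigenvalues are the cube roots of unity), taking
# the principal square root of each eigenvalue.
w, V = np.linalg.eig(X)
H = V @ np.diag(np.sqrt(w.astype(complex))) @ V.conj().T

assert np.allclose(H @ H, X)                   # it really halves the step
assert np.allclose(H @ H.conj().T, np.eye(3))  # and it is unitary

# This particular root happens to come out real (numerically, a
# 60-degree rotation about the (1,1,1) axis), but it has negative
# entries, so it is not a stochastic matrix: its columns are not
# probability distributions, and classical probability can't use it.
assert H.real.min() < 0
```

In the unitary world there is no such obstruction: every unitary has unitary roots, so “counting by halves” (or by any fraction) is always available.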
Comment #615 February 1st, 2022 at 11:32 am
Stewart Peterson #593: P vs. NP has nothing to do with “information loss.” It’s about information that’s already present in the input—e.g., the existence or nonexistence of a satisfying assignment to a 3SAT instance, or (yes) a preimage of a hash function output—but that appears to be exponentially hard to access. If your knowledge is coming from Steve Cook’s Millennium Prize essay, then try, I dunno, my P vs. NP survey article, and come back when you’re done if you still think P vs. NP is relevant to your interests?
Comment #616 February 1st, 2022 at 11:33 am
Mateus Araújo #368: Thank you for the thoughtful reply – I really appreciate it. I certainly admire that you are clear about which bullet you are biting. I could carry on with quibbles and questions, but I will save them for another post in the future.
I have enjoyed reading your lively back-and-forth with Philippe Grangier, so thanks to you both for that!
Matt Leifer #358: This comment represents some of the best writing and clearest thinking on this subject that I have ever come across. I am going to print it out and sleep with it under my pillow. Thank you for sharing.
Mateus Araújo, many posts: I think Scott understands that there is an important philosophical distinction between subjective and objective probability, but is making the point that it is easy to imagine a universe where the only kind of probability that arises is subjective. While this would disqualify “QM as a source of objective probability” as an answer to Q2, it does not refute your argument about QM as a source of objective probability with respect to our universe.
Comment #617 February 1st, 2022 at 11:51 am
Philippe Grangier #594: I completely agree with you about the pattern of new physical theories “stealing” all the successes of the old ones—sometimes literally to the point of recycling the exact same equations, but assigning new, more sophisticated meanings to the variables, reducing to the old meanings in some appropriate limit. I believe Feynman and Penrose both wrote about this.
But here’s another pattern in the history of physics: experiments yield unexpected results, somewhat arbitrary-looking theories are carefully constructed to explain those results, and then someone realizes why it actually followed from more general principles that the theories must’ve taken the form they did. A few examples include:
– Michelson-Morley → Einstein
– Zoo of elementary particles → Standard Model with SU(3)×SU(2)×U(1) gauge group
– QFT with hacky renormalization recipes → Wilsonian effective field theory
In each of these cases, on the one hand, we can say that with hindsight, physics took a somewhat painstaking, roundabout route to a place where in principle, pure thought probably could’ve gotten faster. But on the other hand, physics can hold its head high, since this is how science is supposed to work—how it’s worked in the best cases—and given the limitations of human intelligence, no one in history has yet discovered a faster way in practice! 🙂
Comment #618 February 1st, 2022 at 12:10 pm
Mateus Araújo #597: Alright, alright, I retract my comment about your position being superdeterminism-level insane. 🙂
You’ve now ceded to me (and I’m sorry that I missed it earlier) the main point I cared about: namely, that you haven’t given any evidence or arguments against “classical probabilistic Many-Worlds”—which therefore remains 100% on the table, awaiting considerations that would rule it out in favor of the amplitude-based Everettverse.
I, in return, am happy to cede to you the main point that you cared about: namely, that in both the Everettverse and the Classical Probabiliverse, the probabilities of branches could properly be called “objective.”
To my way of thinking, there are some entities that are clearly “objective” (dinosaur fossils), and others that are clearly “subjective” (opinions), and probabilities are fascinating precisely in how they straddle the two realms. I.e., I could easily imagine two people who I’d regard as equally sane, both in possession of a final theory of physics, yet disagreeing with each other about whether to call some particular probability “objective.” On reflection, though, I agree with you that the Everettverse and the Classical Probabiliverse both settle this debate, in favor of the “objective” side, simply by fiat!
Here’s something to ponder, though: your position seems to commit you to the view that, even if we only ever saw classical randomness in fundamental physics, and never quantum interference, a Probabiliverse (with all possible outcomes realized) would be the only philosophically acceptable way to make sense of it.
But given how hard it is to convince many people of MWI in this world, which does have quantum interference, can you even imagine how hard it would be to convince them if we lived in a Classical Probabiliverse?
In fact, if anti-Everettians understood this about your position, they could use it to their advantage. They could say: “Everettians, like Deutsch, are always claiming that quantum interference, as in the double-slit experiment, forces us to accept the reality of a multiverse. But now here’s Mateus, saying that even without interference, a multiverse would still be the only way to make sense of randomness in the fundamental laws! So it seems the whole interference argument was a giant red herring…” 🙂
Comment #619 February 1st, 2022 at 12:12 pm
Scott #564:
To answer your question (1) – yes, I’m saying you should do (b), i.e. start with the existence of spin-1 or spin-2 massless particles and see why they basically have to act like gauge forces. The way the argument works is that you classify all finite-dimensional unitary representations of the Poincare group for massless particles with spin. You find that some of the Lorentz generators have to act trivially. Then for, say, spin-1, you look at the vector representation A_mu, and you find that the Lorentz generators that needed to act trivially actually generate a shift in A_mu. Stated in position space, this shift is the usual gauge transformation of A_mu. So gauge transformations need to “do nothing”, i.e. be a redundancy of description. Then you try to add interactions in your Lagrangian between the gauge field and other stuff, and to preserve this redundancy the gauge field has to couple to a conserved current. So you are forced into using gauge theory. If you have 30 minutes of spare time and just want to get the basic idea, I recommend reading just the first four pages (or at least the first two paragraphs!) of chapter 8 in Weinberg’s QFT textbook.
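The last step, that preserving the gauge redundancy forces coupling to a conserved current, can be sketched in one line (standard textbook material; signs and conventions are schematic):

```latex
% Under a gauge shift A_\mu \to A_\mu + \partial_\mu \lambda,
% an interaction term \int d^4x\, A_\mu J^\mu changes by
\delta S_{\rm int} \;=\; \int d^4x\, (\partial_\mu \lambda)\, J^\mu
\;=\; -\int d^4x\, \lambda\, \partial_\mu J^\mu ,
```

which vanishes for arbitrary λ (after integration by parts) if and only if ∂_μ J^μ = 0: the massless spin-1 field must couple to a conserved current, which is the statement that you are forced into gauge theory.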
Comment #620 February 1st, 2022 at 1:03 pm
fred #602:
It would be interesting to get feedback from Scott to see if this thread of 600+ posts has moved his needle in any direction even just a tiny bit.
Yes, it has! (A tiny bit 🙂 )
It’s solidified for me that the a-prioristic arguments for QM that have some purchase on me, that I want to explore further, can pretty much all be classified into a few broad families:
(1a) QM as an elegant way to “reconcile the discrete and the continuous,” producing a universe with continuous positions, momenta, etc., but still finite upper bounds on information content, as well as stable atoms with discrete, combinatorial Lego-like behaviors.
(1b) QM as an elegant way to preserve symmetries—rotational symmetry, Lorentz symmetry, etc.—even in the face of discreteness that would naïvely seem to break those symmetries.
(2a) QM as a way to get randomness that can never be explained by introducing local hidden variables, that must be “freshly generated on-the-fly” (under no-signalling assumptions), and that could conceivably even bear on questions of identity and free will.
(2b) QM as a way to let us keep arguing forever about what’s “ontic” and what’s “epistemic,” whether we’re forced to accept the reality of a multiverse or can reject the branches that we don’t see as unrealized hypotheticals, rather than clearly resolving such enormities one way or the other as most of the known alternatives would do.
Much of my remaining unease comes from the fact that we now have too many plausible-seeming explanations on the table! So: can all four of the explanations above be seen, somehow, as different facets of the same thing? Or should we simply say that the explanations, while different, are complementary? I.e., maybe God was trying to solve a multi-objective optimization problem? Like, maybe She needed QM for the certifiable randomness reason, and the Lorentz symmetry and reconciling discrete and continuous stuff just clinched the case for Her, or fell on Her lap as a welcome bonus?
Anyway, the other thing I learned from this thread is the huge variety of ways in which people can misunderstand me when I ask “why QM?”—which will be crucial knowledge when and if I write my survey. To me, it seems too obvious to state that what we mean is:
“Suppose I’m designing a new universe from scratch—if we like, to be run inside my computer. I will be the God of this new universe. Please walk me through, in detail, all the considerations that would militate for or against putting quantum mechanics into my universe. Don’t talk to me in philosophical generalities, but in mathematical and physical details: what options do I have, other than QM? What sort of universe will each option lead to? What mathematical consistency or other problems have I failed to foresee? If you can convince me that QM would clearly be the right choice—then, and only then, I can consider the question of why our universe is quantum to have been answered to my satisfaction, or at least, probably as well as it ever can be answered.”
But many, many commenters seemed to have trouble with this—either lecturing me on why learning more about my question is philosophically impossible and I should just give up, or else pointing to various known phenomena in our world (e.g., blackbody radiation) as so clearly, obviously “explaining the necessity of QM” that I must be dense not to have thought of them. So I’ll need to start my survey by directly engaging those views.
Anyway, thanks again everyone, for one of the more epic threads in this blog’s history! I’ll probably shut the thread down by tomorrow morning, so please get in any final comments before then.
Comment #621 February 1st, 2022 at 1:17 pm
Responding to your #618, Scott, I’ve come to think that the “Classical Probabiliverse” and the “Everettverse” (EV) are both quasi-deterministic in only slightly different ways. In the EV, all evolution is unitary, so that for any given measurement the same measurement perfectly repeated at an earlier or later time (with unitary evolution carefully adjusted for) must give the same result. This apparently means that for any and all measurements the future cannot be different from what the past thought it would be, which seems to imply that there cannot be any branching.
This makes some idealizing assumptions that an Everettian might not want to make, but if we do make them the Classical Probabiliverse apparently looks very similar to the EV. A Liouville state unitarily evolved into the future by the Liouvillian operator in CM looks very similar to an Everettian state unitarily evolved into the future by the Hamiltonian operator in QM. Without collapse of the wave function as a way to construct joint probabilities for noncommuting operators, an Everettian in general cannot perform incompatible measurements, which creates a further point of similarity with the usual idea that incompatible measurements cannot be modeled by CM.
FWIW, when I realized this about a year ago, I became much more OK with the EV. I feel that it makes some sense of why people might like the EV, in part because it’s so close to the Classical Probabiliverse. Of course this argument doesn’t prove that there is no branching, only that on a particular set of assumptions about QM there is no branching, which, as I say, some Everettians might not want to make.
Comment #622 February 1st, 2022 at 1:50 pm
Scott#620 « please get in any final comments before tomorrow morning. »
Well, hard to resist such an invitation, so here is a post-post-final post…
– I buy 1a and 1b, and I could tell more on that, but no time left.
– For 2a, my colleague Alexia and I have something to tell about ‘ontological randomness’, see https://arxiv.org/abs/1804.04807
– As a supporter of contextual objectivity in a single universe, I’m not interested in 2b.
Then some final speculation: if classical physics remains present and is not ‘emergent’, then it should show up somewhere in the theoretical description, right? I think it does, but not in naive textbook QM, which handles infinite limits in a wrong way. Von Neumann already noticed that the dimension of a countably infinite tensor product blows up to 2^aleph0, an uncountable infinity, see #301. Moreover, assuming Cantor’s continuum hypothesis (2^aleph0 = aleph1), there is ‘nothing’ between aleph0 and 2^aleph0. This has a strong flavour that there is also ‘nothing’ between a system and a context, except for the (in)famous Heisenberg cut. So maybe we are missing some pieces of transfinite mathematics to make sense of all that? Major progress in physics requires not only new equations, but new mathematical tools: differential equations, linear algebra, tensors… what will be the next one?
Comment #623 February 1st, 2022 at 1:55 pm
Scott #620
“So: can all four of the explanations above be seen, somehow, as different facets of the same thing”
But they are already!
1a and 1b clearly belong to the same theme (as your own choice of numbering shows), and 2a is a *logically necessary* consequence of 1a/b, i.e. one that follows from self-consistency alone.
It’s basically the entire reason why QM makes so much sense as a “fundamental choice for a timeless creator”.
2b, on the other hand, only results from conflating 1 and 2a.
Comment #624 February 1st, 2022 at 1:59 pm
Anbar #613:
Re mechanism: “Ether was not needed”
Not for the equations to work, agreed.
“it’s not like the Earth would need to calculate its orbit, in a classical universe” Also agreed, that is why I wrote somewhere above that “action at a distance is no practical problem”, as long as THE SCIENTIST wants to CALCULATE RESULTS.
But for some reason, I favor the idea of having a physical concept of what is actually going on in nature (reality), by itself. I find explanations that require someone doing calculations that do not take place in the things themselves as unsatisfactory.
But this quirk of mine is neither relevant for actual scientists who are happy just to do the calculations, nor for Scott in particular with his survey and Q(1/2). A bit frustrating, but here we are.
Comment #625 February 1st, 2022 at 2:00 pm
Scott #616: Indeed, I cede to you that a Classical Probabiliverse is a perfectly consistent way of having objective probabilities, without requiring quantum mechanics. In my comment #145 I rejected it because I had in mind simulating the Classical Probabiliverse with a cellular automaton or a deterministic classical computer, which would lead to a nasty exponential explosion. But you’re not proposing to simulate it, but rather have its laws as the fundamental laws of universe, so it’s fine.
In the foundations of probability there are three different, but closely related concepts: subjective probability, objective probability, and relative frequencies. Their closeness has repeatedly tempted people to reduce them to each other, and this always resulted in failure. We have to accept that this cannot be done, they are in fact different, and all three have important roles to play.
Indeed, I do believe that the existence of objective probabilities is enough to imply the existence of a multiverse, without requiring interference or entanglement. I’m also aware that extraordinary claims require extraordinary evidence, and if we did live in the Classical Probabiliverse I probably wouldn’t accept the argument myself.
Maybe you know the story of how, back in the 60s, Lewis arrived at something like the Classical Probabiliverse only from considering the semantics of modal logic. Unsurprisingly, the reception was hostile. The most generous reaction I’ve seen was that his semantics were pretty nice and we could even use them, if we added the caveat that his multiple worlds were obviously not real.
Comment #626 February 1st, 2022 at 2:04 pm
gentzen #612: My challenge was to get a mathematical definition; I don’t see any attempt in your comment to do that. Indeed, I claim that Many-Worlds solves the problem. Note that in Many-Worlds it makes no sense to predict which measurement result will happen: all of them do. Therefore even an “all-knowing pure observer” cannot do it.
OhMyGoodness #608: Indeed, I’m rejecting a privileged position for the observer. That is not only philosophically abhorrent, but also trivially contradicted by our observations. We can observe light that was emitted billions of years ago, long before any observer existed, and its spectrum shows that quantum mechanics worked the same way back then. I don’t see what the GHZ experiment has to do with anything.
Comment #627 February 1st, 2022 at 3:45 pm
“I, in return, am happy to cede to you the main point that you cared about: namely, that in both the Everettverse and the Classical Probabiliverse, the probabilities of branches could properly be called “objective.””
Would anyone really be willing to wager on a lengthy game of chance that uses a fair three-faced die, where I win if either of two faces comes up and my opponent wins if the one remaining face comes up? If anyone really believes my expectation of winning is subjective, or depends on unobservable branching of timelines, then put money in escrow and let’s start the game. When you have a bounded set of outcomes, and frequency analysis of those outcomes, then probability has objective value in allowing reasonable outcome expectations. It’s no different with quantum experiments that have a bounded set of outcomes and frequency analysis of individual outcomes. In both cases no a priori determinism is possible, but it’s still possible to form reasonable objective expectations of outcomes. I have no idea if branching occurs (my bias would be that it doesn’t), because it cannot be observed, but I can’t understand the logic of stating that probabilistic forecasts I see being satisfied in this universe are subjective, while branches in potential unobservable universes are objective. Both situations add some little piece of new reality to this universe that was previously in some sort of state of potential reality. I don’t understand why that transition would necessarily require branching to have logical value in this reality. Of course the expectation exists only in your mind prior to the outcome of the experiment, which then allows you to determine whether your expectations were in accordance with results.
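For what it's worth, the wager is trivial to simulate (a toy sketch of mine; the die, the win condition, and the sample size are just illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(42)

# A fair three-faced die; "I" win on faces 1 or 2, my opponent on face 3.
rolls = rng.integers(1, 4, size=100_000)   # uniform over {1, 2, 3}
win_freq = (rolls <= 2).mean()

# The empirical frequency settles near the 2/3 expectation,
# whatever one's prior feelings about branching timelines.
assert abs(win_freq - 2/3) < 0.01
```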
I know I’m rehashing arguments made innumerable times, but I’m sensitive when claims are made that probability has no objective basis. In some sense life is just a series of probabilistic decisions. Deterministic outcomes are rare compared with uncertain outcomes that admit only some (maybe naive, maybe formal) probabilistic evaluation.
Sounds like you have tough kids. 🙂
Comment #628 February 1st, 2022 at 3:47 pm
Scott #620:
Out of your options, (2a) jumps out at me as the approach most favoured by QBism (certainly it is the one that resonates with me personally).
Blake Stacey frames the core quantum principle as `vitality’, whereby nature actively resists a hidden-variable description, and a similar thought is expressed in our paper deriving the Born rule.
Our way of formulating (2a) is like this: classical physics is stale and boring in the sense that a single measurement can perform two functions at once: (i) it allows for tomography of the state, i.e. the probabilities assigned to its outcomes uniquely identify the state, and (ii) it perfectly distinguishes the maximal number of states of a system. One way to formulate the principle (2a) is to say that nature breaks this degeneracy: it forces the measurement that extracts `complete information’ from the system (the informationally-complete measurement) to have strictly more outcomes than the measurement that extracts `perfect information’ from the system (a PVM).
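For concreteness, here is a sketch of my own using the standard single-qubit example (not taken from the paper): a PVM on a qubit has 2 outcomes, while an informationally complete measurement needs at least 4, e.g. the tetrahedral SIC-POVM, whose 4 outcome probabilities really do pin down the state.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Bloch vectors of a regular tetrahedron -> the 4-outcome qubit SIC-POVM.
vecs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
E = [0.25 * (I2 + v[0]*sx + v[1]*sy + v[2]*sz) for v in vecs]

assert np.allclose(sum(E), I2)   # valid POVM: the effects sum to the identity

# The 4 probabilities determine any qubit state rho via the d=2 SIC
# reconstruction formula rho = sum_k (3 p_k - 1/2) Pi_k, with Pi_k = 2 E_k.
rho = 0.5 * (I2 + 0.3*sx - 0.4*sz)                 # an arbitrary test state
p = np.array([np.trace(rho @ Ek).real for Ek in E])
rho_rec = sum((3*pk - 0.5) * (2*Ek) for pk, Ek in zip(p, E))
assert np.allclose(rho_rec, rho)
```

No 2-outcome PVM could do this: two probabilities summing to 1 carry one real parameter, while a qubit state has three.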
I think Rovelli was aiming for something along these lines with his two postulates: (R1) There is a maximum amount of relevant information that can be extracted from a system; (R2) It is always possible to acquire new information about a system. (However, as Blake Stacey points out, these aren’t enough by themselves to rule out hidden variables.)
Note that positing that nature `resists a hidden variable description’ is not a natural postulate if you subscribe to a world-view in which observations just reveal to us a reality that is already `out there’. In such a world, why would we need to have `freshly generated randomness’? What calls for it?
(2a) would look more natural in a world which emphasizes free will and where reality is constantly on the make, being shaped in part by our observations of it. And that is precisely the kind of world that phenomenology and American pragmatism describe. For instance, the following quote from Merleau-Ponty:
When I read that, it sounds to me very much in the spirit of saying that “optimal” interrogations (aka PVMs) do not “reveal all the secrets” of a system, and each time we interrogate the system it changes such that we never complete our knowledge of it (“interrogation stands in essential correlation to becoming”).
Comment #629 February 1st, 2022 at 4:43 pm
Scott, thanks for clarifying how the axiomatic derivations of QM acquire a circular character: if I understand you correctly, the properties that we like to assume (symmetries, reversibility, action principle, etc.) are ultimately best *explained* by QM, which is exactly the theory we’re trying to justify!
As far as we know so far, cellular automata appear equally capable of supporting life-like processes. I too would love to see a bigger research programme that tests this hypothesis by exploring emergent properties of CAs.
Supposing that both CAs and QM can do the job, which should we prefer? In Comment #468, you argued that we should be biased against QM because it wastefully introduces an exponential amount of additional structure. While your point seems intuitive enough, I’d like to entertain an opposing view, that this is not the right measure of a theory’s complexity. Accepting the conventional wisdom on BQP, we see that QM doesn’t literally produce exponentially more resources. The wavefunction is a strange piece of math, and has a very large representation, sure, but it doesn’t correspond to an equally large space of empirical outcomes. I think it pays to rigorously consider our criterion for a “good” theory.
Solomonoff and others have argued that the right criterion is minimum description length: the shortest computer program that explains our lifetime’s dataset is inferred as the “true theory” describing our reality. Why choose this criterion? Because we need *some* sort of prior bias, else we’d run into No Free Lunch and Boltzmann Brain issues. Furthermore, it’s been shown that any kind of bias we *could* come up with would be dominated by the minimum-length approach (subject to certain assumptions such as the Church-Turing thesis). So it seems that, in all Universes that *allow* inference, the minimum description length approach should work.
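A toy two-part-code version of that criterion (my own illustration; the biased coin and the conventional half-log parameter cost are assumptions of the sketch, not anything from the comment): total description length is bits for the model plus bits for the data under the model, and the shorter total wins.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.random(1000) < 0.9          # 1000 flips of a coin biased 90/10

def codelength_bits(data, p, model_bits):
    """Two-part code: model description cost + data encoded under the model."""
    n1 = data.sum()
    n0 = data.size - n1
    return model_bits - n1 * np.log2(p) - n0 * np.log2(1 - p)

# Model A: fair coin, nothing to describe (0 parameter bits).
fair = codelength_bits(data, 0.5, model_bits=0)
# Model B: learned bias, charged ~(1/2) log2(n) bits for its one parameter.
p_hat = data.mean()
biased = codelength_bits(data, p_hat, model_bits=0.5 * np.log2(data.size))

assert np.isclose(fair, 1000.0)        # fair coin costs exactly 1 bit/flip
assert biased < fair                   # the shorter total description wins
```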
If all this talk of inference seems too subjective, the corresponding “objective reality” would be to imagine that the Universe is the output of a universal deterministic Turing machine, whose infinitely-long program tape is set uniformly at random. We might take this randomly programmed machine as our ultimate theory of the Universe. There’s a reasonably large (that is, O(1)) probability of this random program encoding a discrete cellular automaton. Similarly, since QM is simple to describe (e.g., in terms of Lucien Hardy’s axioms!), there’s a reasonably large probability that the program will consist of a short description of QM, with the remaining infinite sequence of bits serving as algorithmically random seeds for Born’s rule (up to any desired precision). Thus, this approach also addresses the complaints that some commenters raised against probabilistic theories, regarding the “correct interpretation” of probability: the fact of the matter is that the random program prior puts a substantial positive measure on output data that’s consistent with QM + Born’s rule.
Unfortunately, this approach doesn’t single out QM over CAs. Nonetheless, it gives them roughly equal status, which I find remarkable enough! Among theories with continuous space-times, it’s not at all clear to me that there are workable alternatives to QM. Plain stochastic transitions are very limiting: for example, how would you continuously apply a NOT gate over a finite and positive amount of time? In other words, what is the NOT gate’s “square root” in a classically stochastic theory? I admit that, for all we know, it might be possible to sacrifice some of these nice properties and still produce life. Nonetheless, QM does seem quite special at the very least!
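The square-root-of-NOT point can be checked numerically (a quick sketch of my own): the principal matrix square root of the bit-flip matrix necessarily has complex entries, while a brute-force search over 2x2 row-stochastic matrices finds none whose square comes anywhere near NOT.

```python
import numpy as np

# The NOT (bit-flip) operation as a matrix.
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Principal square root via eigendecomposition: X is symmetric with
# eigenvalues +1 and -1, so sqrt(-1) forces complex entries.
w, V = np.linalg.eigh(X)
S = V @ np.diag(np.sqrt(w.astype(complex))) @ V.conj().T

assert np.allclose(S @ S, X)         # S really is a square root of NOT
assert np.abs(S.imag).max() > 0.4    # ...and it is genuinely complex

# Brute force: no 2x2 stochastic matrix [[a,1-a],[b,1-b]] squares to NOT.
grid = np.linspace(0.0, 1.0, 201)
best = min(
    np.abs(np.array([[a, 1-a], [b, 1-b]]) @ np.array([[a, 1-a], [b, 1-b]]) - X).max()
    for a in grid for b in grid
)
assert np.isclose(best, 0.5)         # the error never drops below 1/2
```

(The 1/2 floor is provable: for M = [[a,1-a],[b,1-b]], the two offending entries of M² sum to (a-b)² + 1 ≥ 1.)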
By the way, if you have time for it, I’d love to hear you expand on your Comment #495 about different ways to bound information in continuous space-time.
Comment #630 February 1st, 2022 at 5:07 pm
Jacques Pienaar #609: Yes, I know that Chris Fuchs always says he ultimately wants to say something about the world! I’ve known Chris for ~23 years, and have had many conversations with him about … well, what else? … this stuff, quantum foundations. And I love Chris’s writings and have learned a lot from them.
The trouble is that, as far as I can see, the goal of saying something about the external world always remains at the aspirational level. And as soon as Chris stops expressing that aspiration, he goes right back to talking about agents, and their subjective probability assignments, and measurement as just a weird kind of Bayesian updating. And if you ask him (or David Mermin, or any other QBist) what happens when an agent itself is placed in superposition, whether two superposed agents then actually exist or not … well, you get a disquisition that I wouldn’t even be able to summarize here, since I can no longer piece any of it together once it’s finished! 🙂
Comment #631 February 1st, 2022 at 5:15 pm
Maybe the bounded uncertainty in a classical universe is just impractical and even implausible.
In a classical world, if we are given control over a range of inputs we can ensure a non-zero probability for some output, or class of outputs.
And in that way, diminish uncertainty by some amount.
But in a quantum world, with amplitudes, someone else could just destructively interfere those outputs out of existence, thereby restoring uncertainty?
That seems less problematic. I mean, imagine humanity being able to irreversibly reduce uncertainty by one bit every day.
Anyway, I see that efficient-factorization is still a top contender for Q1. 🙂
Comment #632 February 1st, 2022 at 5:30 pm
#620,
“So: can all four of the explanations above be seen, somehow, as different facets of the same thing? Or should we simply say that the explanations, while different, are complementary? I.e., maybe God was trying to solve a multi-objective optimization problem?”
The latter, i.e. multi-objective optimization. As I suggested in #207, it’s a balance between 2 factions or ‘houses’ of angels: Blue angels (Rationalists) and Green angels (Artists).
The Blues want reality to be *comprehensible* (data compression enables symmetry). The Greens want reality to be *interesting* ( ‘injecting’ more randomness enables growth of complexity)
The Blues set the *constraints* on reality (1a, 1b: bounds on information, symmetry, stability of matter), the Greens enable *possibilities* through which reality can get more interesting (2a, 2b: inject randomness, enable conscious minds to exist in order to structure information). But they’re complementary.
Ultimately it’s all about ‘Actualization’ (realization of possibility). From the perspective of agents within reality, the ‘objective’ picture can’t be completed except as a limit or ‘idealization’, so reality is, in a practical sense, always ‘under construction’.
Comment #633 February 1st, 2022 at 6:50 pm
Mateus Araújo #626:
I see, so you are missing the details of how to connect to “relative frequencies” (#625):
For my attempt, this means that the details of the following statement are missing:
You are right that even if I would allow indefinitely many games, and would declare that I believe that results with odds smaller than 1 : 100 000 000 are a definite proof of cheating, it would still be unclear how exactly I should determine whether I believe that my opponent (or nature, or fate, or whoever) has cheated. If the sequence of games runs long enough, extremely improbable results are bound to appear from time to time. I agree that those details are tricky, and it is quite possible that there is no good way to address them at all.
Comment #634 February 1st, 2022 at 7:18 pm
I certainly do not know enough physics to make any guesses on the big issues. As a merely intrigued observer, the fact that Stern-Gerlach experiments statistically generate values related to cosines fascinates me. Having read two Internet pages recently, a strange thought came into focus. The issues have to do with the distinction between “passive” and “active” changes to coordinate systems, and the fact that Einstein had specifically sought to free the laws of physics from canonical coordinatization.
As I understand it, relativity is a superposition of inertial and/or uniformly accelerated reference frames. Any measurement which would alter momentum for some component of the experiment would seem to be introducing a relationship between the initial reference frame and two new reference frames (equal and opposite). If the reference frames relate to the laws of physics as changed coordinate systems, then the relationship between “active” and “passive” is symmetric by virtue of being unknown — and, perhaps, unknowable.
The classical theory depends upon the smoothness of trigonometric functions — or, better, analytic functions. But there are two curious things surrounding these functions. One is that the theory of real closed fields is decidable. This decidability evaporates when analytic functions are introduced. The other has to do with the topics Cantor had been studying just prior to set theory — namely, sets of uniqueness for trigonometric series. Today, these sets are studied on a circle rather than the real line. They lead to many interesting questions.
Everyone knows about the Cantor set. When you take its Cartesian product and project the image of this Cantor square back onto the real line at a 45-degree angle, you obtain a fully measurable interval from -1 to 1. Two systems seem to relate to this — Rademacher functions and Walsh functions. The latter are good for many things in information theory. But they cannot patch together a manifold.
And, one can give an interesting little topology on [-1, 1]. It is called the either-or topology.
I do not know much about physics, but Hamiltonians encode the laws of motion so that energy is principal while relativity encodes the laws of motion to be free of canonical coordinates. So, I wonder if the trigonometric functions are the avenue through which the initial conditions of an experiment and the terminal conditions of an experiment relate to one another.
If so, quantum reality would seem necessary.
Comment #635 February 1st, 2022 at 7:26 pm
Mateus #626
Yes, same as now: photons from, say, quasars passed through two slits today will show interference patterns until an observer makes a measurement, and only then are they localized to a location. The quasar photons remain in a superposed state until a measurement is made by a current observer. Considering the relativistic time dilation of photons, it makes sense that the age of photons makes no difference to quantum-mechanical adherence, and so the quantum-mechanical properties remain in a superposed state until such time as a measurement is made. The current observer remains central to the quantum-mechanical state of the ancient photon: superposed, or a measured value.
Comment #636 February 1st, 2022 at 8:18 pm
Scott #589:
Just wanted to wrap up our sub-thread by going back to my first post … 600 comments ago!
[[[[ At what comment count do we cross the blog event horizon …? ]]]]
Way back then I gave these first answers to your questions, prefacing by saying these would be my answers “for the tenure committee” 😉 In other words, if I had to bet my kid’s college money … then this is my bet …
Q2:
Why C? The complex numbers are algebraically closed and complete. State vectors then come with a natural notion of “distinguishability” (orthogonality). And if we want to be able to take two state vectors and get a complex number (given multiplication and phase operators), then there’s an inner product space … and now it’s a complex Hilbert space (finite case assumed …)
Why unitary transformations? Linearity; not breaking the above chosen C Hilbert space
Why the Born rule? Gleason’s theorem, again consequence of choosing C Hilbert space
Why the tensor product? to combine the C Hilbert state spaces
Q1: Just repeated that God would start with C
Q: Why should the universe have been quantum-mechanical?
If the universe is made of complex amplitudes, or God made the universe out of complex amplitudes, then all else pretty much follows.
To actually give you an answer for Q1:
There is nothing (that I know of) that would have prevented God from, or have been wrong with God, making a classical universe – or something even more simplistic. I see no reason why we could not, for example, refer to instances of Conway’s Game of Life as “universes” obeying their own internal models with an infinite boundary at the horizon. Isn’t that the definition of a universe?
So, to summarize, my viewpoint is that it all begins with complex numbers, amplitudes: a complete, closed, continuous field as the most fundamental “stuff”.
All the other features {symmetries [group classifications related to the normed division algebras], interference/entanglement, complex Hilbert space [discrete states], unitary [don’t break the Hilbert space], randomness [Born rule/Gleason, why is everyone confused? The Born rule is how the universe DEFINES randomness, doesn’t it 🙂 ], tensor product (combine state spaces)} just naturally follow once we find ourselves in the complex field.
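That chain of features can be sanity-checked numerically; here is a small sketch of my own of the standard facts (the dimensions and the random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random state vector of complex amplitudes, normalized in the 2-norm.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# Unitaries are the maps that preserve the 2-norm (don't break Hilbert space).
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U = np.linalg.qr(H)[0]                  # QR of a random matrix gives a unitary
assert np.isclose(np.linalg.norm(U @ psi), 1.0)

# Born rule: outcome probabilities are |amplitude|^2, and they sum to 1.
probs = np.abs(U @ psi) ** 2
assert np.isclose(probs.sum(), 1.0)

# Tensor product: combining two 4-dimensional systems gives dimension
# 4 * 4 = 16, not 4 + 4 -- the exponential blow-up in disguise.
phi = np.kron(psi, psi)
assert phi.shape == (16,) and np.isclose(np.linalg.norm(phi), 1.0)
```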
That “safe” bet above being my “party-line” position … I then went down a couple of rabbit holes. What I do appreciate about you Scott is that you are not afraid to admit to looking down a rabbit hole or two, even if the usual answer is “No, that can’t be right.” and even if we expect that beforehand. You once said that hearing wrong ideas can actually be quite helpful because knowing WHY the idea fails can often shine more light on the right idea – indeed that is pretty much the sentiment behind this present quest. And, again, I agree with you above that we can always ask.
Thanks again! I look forward to your essay/book.
Comment #637 February 1st, 2022 at 8:29 pm
Scott #620:
“(2a) QM as a way to get randomness that can never be explained by introducing local hidden variables, that must be “freshly generated on-the-fly” (under no-signalling assumptions), and that could conceivably even bear on questions of identity and free will.”
Here’s something I think I can contribute! Free will is kind of a red herring, as free will is not linked to determinism or non-determinism. A slight majority of philosophers accept this (59%? see Compatibilism), but the gist of the argument is thus:
What does it mean to have free will? It’s hard to define. If presented with a choice, especially an important one, a person does not typically “roll a die”. They make a decision based on the sum total of their life experience. But they could just as easily roll a die, and that’s fine too. The injection of randomness does not increase the importance or morality of the action. Objectors usually say they do not want their decisions dictated by predictable laws. I think this is mistaken, because those laws give rise to an emergent consciousness that is clearly operating at a much higher level of abstraction than the dynamics at the base level. To use a less sophisticated analogy: a computer program sorting a list is not, in any useful way of thinking, just throwing electrons around. It’s running quicksort (or whatever).
What is the alternative? Even a supernatural explanation (which I don’t think anyone was advocating for) of consciousness merely moves the goal post; in the supernatural realm, what logical object is making the decisions? It must have a metaphysical definition at the very least, subject to the same question of determinism or not. While that is an interesting place to think about (esp after a few drinks), it does make me think there’s nothing lacking in the free will department with a classical, even deterministic universe. Chaos theory is more than enough for excitement.
I think randomness that cannot be introduced in terms of local hidden variables does have some applications, however. For one it permits the choice of those values to be based on non-local or inaccessible reasons.
Comment #638 February 1st, 2022 at 8:52 pm
Scott #267:
You took this in the opposite direction from the one I intended: the analogy with strict finitism was meant as a reductio ad absurdum. While it’s “metaphysically extravagant” to theorize that there are infinitely many prime numbers, it’s also completely reasonable. So why think it’s a mark against a theory that it depends on lots of extra structure? I’m sorry to keep beating this drum, but it does seem as if its explanatory power goes under-appreciated – all of this extra structure is predicted by the mathematical universe hypothesis, which suggests we live in the largest, most complex possible structure, not the smallest.
I think you were too quick to dismiss my answer in #14, and rereading your response it seems as if you misunderstood it, which must be my fault, but I’m not sure how the miscommunication happened. Let me back up. You raise two really interesting questions about quantum mechanics; there is a roll-your-sleeves-up, nuts-and-bolts question about why we have the particulars of QM, like the Hilbert space, gauge symmetries, spinors and complex amplitudes, etc. This is a fascinating topic for research, and most of the comments in this thread have engaged with this aspect of your prompt. And then there is what is to me the more interesting and fundamental question, why is the world *quantum*? More to the point, why is it *weird*? I think this gets at the intuition that motivated this discussion in the first place. I think it’s why you’re asking whether someone from the 19th century could be convinced that QM was necessary, and not asking whether someone from the 18th century could be convinced that Maxwell’s theory of electromagnetism was necessary. The latter is not interesting to you in the same way because classical EM just isn’t weird.
What is it about QM that makes it weird? It comes down to superposition. Everything else factors into the nuts-and-bolts mechanics question. If the world didn’t have superposition, it would be classical. You might still be interested in why these specific mathematical tools are needed to describe it, and it’s separately very interesting to think about whether superposition generates and packages together all this mathematical furniture, but presumably you wouldn’t have the same attitude of astonishment with respect to its weirdness. Without superposition, we don’t get Bell inequalities or spooky action at a distance, and we don’t get the weirdness of the MWI or QBism. Without superposition, the wavefunction represents epistemic uncertainty, not ontological uncertainty. Superposition is what makes QM so metaphysically uncomfortable, so seemingly ad hoc, extravagant, and counterintuitive.
I think I gave a really good motivation for superposition, in #14. It’s an extremely simple answer, in a way that is metaphysically satisfying; it’s very natural given the presumptions; and plausibly could have been believed by 19th century scientists – David Lewis’ On Possible Worlds could easily have been written a hundred years earlier, since it depends on no physical knowledge whatsoever, and you could get the same insight that follows from modal realism.
The basic idea is that consciousness supervenes over possible worlds, and that there are no physical haecceities. Let me give an illustrative example. Suppose that Scott and Scott* are two people who are qualitatively identical, living in causally disconnected parallel universes that are identical except for the fact that in one universe, some electron is in a spin up state, and in the other, that electron is in a spin down state. Are Scott and Scott* just the same person? It seems so. Their mental experiences are mathematically identical, and if we equate mathematical structures, there should only be one seat of consciousness experiencing the lives of both Scott and Scott* at once, making them one and the same person. If you accept that, Scott/Scott* therefore experiences the electron in a superposition of spin up and spin down, and this superposition is ontological, not epistemic. Moreover, Scott experiences all possible worlds that are consistent with his mental data, all at once in superposition. Is this not a compelling explanation for why we see superposition? From a simple answer to “why is there something rather than nothing?”, we get the mathematical universe hypothesis. And then from the further simple assumption that consciousness supervenes over the mathematical ensemble, we motivate the phenomenon of superposition! Notice that when Scott/Scott* make a measurement, they branch away from each other. We don’t experience superpositions directly. But the correlations that prove they are real can be explained by this ‘supervenience across parallel worlds’ perspective. That seems to me fairly remarkable!
Comment #639 February 1st, 2022 at 9:25 pm
Chris #638: Thank you for introducing me to the bizarrely wonderful word “haecceity”! Of course I agree that superposition is the defining feature of QM, the thing that makes it qualitatively weirder to us than Maxwell’s equations or anything else that preceded it (at least since Galileo and Newton).
But there has indeed been a miscommunication. If superposition is the defining feature of QM, then interference of amplitudes is the defining feature of superposition. It’s not some technical detail. It’s the only way we know that superpositions are really there in the first place, that they aren’t just probability distributions, reflecting our own ordinary classical ignorance about what’s going on.
Thus, to explain why the world is non-classical—to answer my question Q1—means to explain why the world exhibits constructive and destructive interference among the different possible pathways by which a thing could happen.
With me so far? Then now for the clincher: no amount of talk about David Lewis, or modal realism, or consciousness supervening over possible worlds, etc. etc. gets me anywhere toward understanding why we live in a universe where the different possible pathways interfere. You’re one of at least a dozen commenters on this thread with whom I part ways over this one essential point.
Why does it matter? Because as long as we’re talking about modal realism (or Tegmark’s “Mathematical Universe Hypothesis,” or the “Classical Probabiliverse,” or whatever), we’re still having a purely metaphysical debate, one that could’ve been had centuries or even millennia ago. We’re not using this massive empirical clue, this Clue of Clues even, that came to us from Nature herself—the interference, the amplitudes—this metaphysical enormity that logically didn’t have to be there, but is.
I’d like to use the Clue. I’d like to know what it’s telling us.
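To make the contrast concrete, here is a minimal numerical sketch (my own illustration, not part of the original exchange) of the point above: a classical stochastic mixing step applied twice leaves a coin at 50/50, while the same mixing applied to signed amplitudes cancels one branch destructively and returns the system to its starting state.

```python
import math

s = 1 / math.sqrt(2)

def hadamard(state):
    """Apply the 2x2 Hadamard matrix [[s, s], [s, -s]] to amplitudes (a0, a1)."""
    a0, a1 = state
    return (s * (a0 + a1), s * (a0 - a1))

def classical_mix(p):
    """The classical analogue: a doubly stochastic mixing of probabilities."""
    p0, p1 = p
    return (0.5 * (p0 + p1), 0.5 * (p0 + p1))

# Start in |0>, mix twice.
state = hadamard(hadamard((1.0, 0.0)))  # amplitudes interfere back to |0>
probs = classical_mix(classical_mix((1.0, 0.0)))  # probabilities stay 50/50

print(state)  # ~(1.0, 0.0): the |1> branches cancelled destructively
print(probs)  # (0.5, 0.5): no cancellation is possible with probabilities
```

The second amplitude comes back exactly to zero only because amplitudes can be negative; nonnegative probabilities can never cancel, which is the operational signature that distinguishes a superposition from classical ignorance.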
Comment #640 February 1st, 2022 at 9:38 pm
Scott #615:
I am clearly getting on your last nerve and really, truly, this was the last thing I wanted to do. I sincerely apologize. Rereading my last comment, the length was clearly way out of line, and abused my privilege (not right) to speak to you. I can summarize it thusly:
“All information must be present in the input. This is true, because you said so. You are a world expert and I am an undergraduate with a question. What I don’t understand is: the input to what? The input to the overall cryptosystem? The input to the hash function? The input to the decryption function? The input to the preimage generator? Or all of these? It seems like no cryptosystem would possibly work if all of them were operating on all of the information, including a way to deterministically derive the private key. Or am I missing a definition that I could find somewhere in the literature without further bothering you? I can think of good reasons why each one doesn’t apply, if you’re interested [followup: clearly, you’re not. -Ed.], but I want to accept correction and I am ready to listen.”
As to why I said anything at all:
I read your survey article, years ago in fact, and I thought I had carefully constructed a workaround to the obstacles that are mentioned in it. Clearly, that is not the case – again, because you say so; I trust your judgment. While working as a staff member at a secondary educational institution in 2014, I shopped my proof attempt around to every reasonably-qualified CS instructor on our staff, as well as anyone else they knew who was interested – full disclosure, that’s four people, three of whom held Ph.Ds in pure math – and all agreed that they couldn’t see anything wrong with it as far as internal consistency, and that I needed to contact a specialist in the problem to see if the case that I presented was in scope for the problem. Their advice for how to do so was to follow the conversation around a professional blog and, when the topic came up and my approach appeared to advance the discussion, to pose the idea as a question (not an answer). In the meantime, I decided to leave it alone, to the extent that I had not read my own proof attempt in years and tripped up (see #389) over my own definitions. So, really, I’m trying hard to not be a crackpot. You hear from enough of them. I will accept any correction that I am given – but I do wish to understand such correction.
Perhaps part of the proof attempt can be salvaged, once I do understand what is currently wrong with it?
Comment #641 February 1st, 2022 at 9:47 pm
I am very oblivious sometimes. It took me 3 weeks to remember my co-worker’s name, which is sad because she has my wife’s name!
I am pretty sure I am being oblivious to something glaringly obvious to others here, and I would like any commenter to point out what information I am missing.
I don’t understand this: Why does Wheeler’s approach get ignored?
1. It has a simple overarching principle: self-reference.
Just as GR is built out of a simple principle, namely the equivalence principle, Wheeler’s approach has a simple principle of self-reference.
2. It has solid maths behind it: undecidability. Although the details are not fully worked out at this time, the math is still there. The math still explains things not explained by our current understanding. It’s not difficult to understand if you have a grasp on QM and the halting problem, but knowing some GR does not hurt for certain applications.
3. It has applications beyond our current 1920’s QM – which is perfectly correct but lacking. A group of mathematical biologists at a conference in Prague (ALife conference) this last July discussed Wheeler’s self-referential U-diagram as a possible way forward in defining living systems. Additionally, as Witten has emphasized, you need a way to include the observer in the quantum system. This is important for quantum cosmology.
A lot of what is daily discussed – Bekenstein, Everett – had a heavy influence from Wheeler. Why ignore this idea, if Wheeler considered it even more important than his work with Bekenstein and Everett and others?
Wilczek said “the importance of Wheeler’s technical contributions to physics gives his statements a weight that, coming from any other source, they would not have”. Kip Thorne said Wheeler “probed beyond the frontiers of human knowledge”. Maldacena said “Wheeler was right about many things and so maybe he is right about this too”. Feynman, who pooh-poohed many ideas, specifically warned against dismissing this approach (see jaw100.wordpress.com for references).
I am not confused why Wheeler’s work is at the bottom of the list. I am confused why it never even *makes* the list in the first place during discussions about quantum foundations. There used to be entire conferences dedicated to Wheeler’s ideas in the ’70s, ’80s, ’90s, and ’00s.
I know I have a huge gaping blind-spot and it seems like this is a good forum to ask people involved in quantum information why this is a non-starter. Ed Witten, in his 2017 interview with Quanta, quoted Wheeler saying that it may take 10, 100, or even 1,000 years, to fully understand this approach, so Wheeler knew it might not catch on super quickly.
If someone in the comment section could clear the fog, and can explain the lack of interest, I would very sincerely appreciate it. Thank you in advance to anyone who might have an answer!
And thank you Scott for opening up this forum and for allowing everyone to speak freely. It has been an interesting post for sure. And please get better soon!
Comment #642 February 1st, 2022 at 10:59 pm
Baruch Garcia #641: So, you can feel gratified that tonight, your constant cheerleading for Wheeler finally provoked me to sit down and actually read his famous Information, Physics, Quantum: The Search for Links paper, rather than just endlessly encountering quotes from it. I also read Wheeler’s remembrance of Weyl, which hits on many similar themes.
I’d wanted to read Wheeler’s How Come the Quantum?—certainly its first page asks the same questions I was asking in this post, but more evocatively:
The quantum, foundation principle of twentieth century physics, and indispensable working tool for anyone who would make reliable predictions in the world of the small, still comes to many as strange, unwelcome, forced on man from outside against his will. The necessity of the quantum in the construction of existence: out of what deeper requirement does it arise? Behind it all is surely an idea so simple, so beautiful, so compelling that when—in a decade, a century, or a millennium—we grasp it, we will all say to each other, how could it have been otherwise? How could we have been so stupid for so long?
—but alas,
I couldn’t find a non-paywalled version, and however many millions of dollars the University of Texas spends on its library system, I can never, ever remember what you do to get past paywalls. (UPDATE: a helpful reader sent it to me! I read it, and wouldn’t change a word of what I wrote below.)

I’d known for decades, of course, that Wheeler, besides having coined or popularized the phrase “It from Bit” (and “black hole,” and “quantum foam,” and so much else), was famous for constantly asking “why the quantum?”—i.e., precisely the question of this post. In his late-in-life philosophizing mode, though, I’d always thought of him as a prophet, visionary, or gadfly more than a clear and precise thinker. He always seemed to be stringing together slogans and metaphors, issuing metaphysical edicts backed only by his own undoubted eloquence, and reverently quoting Bohr, and it never seemed to add up to a coherent argument for anything, except when Wheeler would descend to earth to explain some piece of established physics.
Now, though, that I’ve forced myself actually to read some of Wheeler’s philosophical writings, sentence by sentence, all the way through … well, my initial impression has been completely, 100% confirmed! 😀
While there are frequent mentions of Gödel—mostly in the context of Wheeler shoring up his belief that the continuum is just a convenient theoretical illusion, in math and physics alike—I saw nothing resembling a clear case that one can use Gödel’s Theorem to derive any part of the structure of QM. Do you claim that such a case exists?
Already, I worry that the answer is yes, you do, but it would take a hundred more paragraphs spilled across my screen to explain it—more U’s looking at themselves, more “law without law,” etc. So I should warn you in advance that I don’t need the paragraphs with evocative phrases! I need an argument, one that starts with clear premises and proceeds linearly to a clear conclusion, using math whenever necessary—but if so, invoking mathematical concepts like incompleteness, undecidability, etc. only with accepted meanings that I understand.
That’s what, frankly, I haven’t yet gotten from Wheeler—as great a man, as distinguished a physicist, and as inspiring a prophet as he was—or from Garcia, as cool a guy as he is! 🙂
Comment #643 February 2nd, 2022 at 12:03 am
Scott #639
I’m afraid you’re getting things backwards; interference is an artefact of the substance of matter being made of waves instead of particles. A wave goes through two slits at once and interferes with itself – that’s not at all interesting beyond the question of why matter is made of waves instead of particles, which is more along the lines of the nuts-and-bolts question. You think it matters because you assume the particle view and particle interactions are ontically prior – but that phenomenon is well explained by decoherence.
Comment #644 February 2nd, 2022 at 12:32 am
Chris #643: A billion times no! To describe quantum interference as merely about “matter being made of waves and not particles” is one of the worst rhetorical misdirections in the subject’s history (and there’s stiff competition!). It suggests some tame little ripples moving around in 3-dimensional space, going through the two slits and interfering, etc. But that’s not how it is at all.
If we have a thousand particles in an entangled state, suddenly we need at least 2^1000 parameters to describe the “ripples” that they form. How so? Because these aren’t ripples of matter at all; they’re ripples of probability amplitude. And they don’t live in 3-dimensional space; they live in Hilbert space.
In other words, what quantum interference changes is not merely the nature of matter, but much more broadly, the rules of probability. That change to the rules of probability is quantum mechanics. The changes to the nature of matter are all special byproducts—alongside the changes to the nature of light, communication, computation, and more.
The great contribution of quantum computation to the quantum foundations debate is simply that, at long last, it forced everyone to face this enormity, to stop acting like they could ignore it by focusing on special examples like a single particle in a potential and pretending the rest of QM didn’t exist. Indeed, by now QC’s success in this has been so complete that it’s disconcerting to encounter someone who still talks about QM in the old way, the way with a foot still in classical intuitions … so that a change to the whole probability calculus underlying everything that could possibly happen gets rounded down to “matter acting like a wave.”
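A quick parameter-counting sketch (my own illustration, not from the original comment) of the point about entangled states: a general pure state of n qubits is a vector of 2^n complex amplitudes, one per computational basis state, while a classical configuration of n bits needs only n values.

```python
# Illustration: the number of complex amplitudes needed to describe a
# general pure state of n qubits is 2**n (the Hilbert space dimension),
# versus n parameters for a classical bit string.
def amplitudes_needed(n_qubits: int) -> int:
    # One complex amplitude per computational basis state.
    return 2 ** n_qubits

print(amplitudes_needed(10))              # 1024 amplitudes for 10 qubits
print(len(str(amplitudes_needed(1000))))  # 2**1000 has 302 decimal digits
```

For a thousand entangled particles the description already dwarfs any classical wave picture: 2^1000 is a number with over 300 digits, vastly more than the number of atoms in the observable universe.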
Comment #645 February 2nd, 2022 at 1:05 am
Scott #644
Speaking of rhetorical misdirections! You are totally begging the question. Indeed, the wavefunction lives in a very high dimensional space, but that is no reason *whatsoever* to assume it’s not real. That is, in fact, the basis for the many-worlds Everett interpretation of QM. The way you’re talking makes it seem like you’re not at all familiar with it, which of course can’t be true. But then what explains your comment? I’m genuinely mystified.
Comment #646 February 2nd, 2022 at 1:21 am
Chris #645: I’m totally, 100% fine with regarding the wavefunction as real. So many commenters are obsessed with whether the wavefunction is “real,” but that wasn’t my interest at all in this post. My interest was in how the wavefunction behaves. And the main way it behaves, is that its complex-valued amplitudes undergo constructive and destructive interference. Indeed, that’s the only way we know the wavefunction is “there” at all—whatever you mean by “there”! To explain QM means to explain why these complex-valued amplitudes, these pre-probabilities that are able to interfere with one another, should be part of the basic architecture of the world. Any answer that doesn’t engage that question, isn’t even on the same intellectual continent as what interests me.
Comment #647 February 2nd, 2022 at 1:31 am
Scott #646
Well, ok, so after all that about how my describing matter as wavefunctions was the greatest rhetorical misdirection in the subject’s history, you then turn around and agree with it as if it were trivial, and now you’re acting as if the nuts-and-bolts question is the only thing you were interested in in the first place, instead of the question of why we should expect superposition, which just a few comments ago you agreed was the single feature that makes QM so qualitatively weird. I don’t disagree that the question of why matter should be wave-like, why it should have complex-valued amplitudes and so on, is an interesting research topic, but that’s not what makes QM astonishing. What makes it astonishing is superposition, and that is what I’ve offered motivation for.
Comment #648 February 2nd, 2022 at 1:46 am
Great discussion.
A few folk have mentioned the possibility that QM results from the need to have no preferred reference frames. Intuitively that makes sense to me, although I’m not sure there’s a good enough argument for why that can’t be achieved in classical physics.
It would be interesting to know what the relational QM people think about this question.
Comment #649 February 2nd, 2022 at 2:13 am
Scott #33 “Actually, you’ve made me realize that designing such a CA would be a phenomenal research project for anyone seeking to investigate Q1/Q2.”
I’ve been involved in that research project for many years, so I thought I should try to comment.
I regard QM as having two parts: the aspects that make distinct state finite in finite systems, and the aspects that make QM difficult to simulate on classical computers. The former was initially thought to be the essential novelty of QM, but with the advent of digital computation and communication it has become firmly part of the classical realm. In fact, the finite distinctness of continuous unitary transformations generalizes the finite maximum signaling rate of classical waves. Thus even if we lived in a world with only classical finite distinct state underlying it, the formalism of unitary evolution of linear superpositions in Hilbert space would still be a natural one to use to describe it, given our macroscopic perspective: this formalism allows us to express finite-state dynamics in terms of continuous macroscopic space and time, to reconcile finite-distinctness with continuous macroscopic symmetries, and to use the same finitely-distinct language to describe isolated systems at any scale.
Thus the only aspects of QM I regard as surprising are those that make quantum computation difficult to simulate classically. One approach to asking why these aspects might be necessary is exactly the one you outlined above: try to make an interesting physics with classical Cellular Automata, and see if there is a problem. Some issues that come up are discussed in this paper. I would define an interesting physics as one that has a macroscopic realm in which Darwinian evolution is possible, and even probable. The “probable” qualification is needed because it isn’t sufficient for a sparse and arbitrarily hard-to-find set of initial states to have the required properties. In fact, allowing interesting evolutions to be arbitrarily unlikely, all CA’s that can simulate a universal set of logic gates become equivalent.
Given this definition, one non-obvious requirement for an interesting physics is essentially macroscopic special relativity: we need to have the same laws of physics in motion as at rest, or it is very difficult to construct macroscopic organisms that remain alive if they move (or their world moves)! The Game of Life, for example, has no vestige of this property. Classical hydrodynamic lattice gases, on the other hand, have discrete versions of rotational symmetry microscopically that become effectively continuous at larger scales. Another requirement is a long time-evolution, and this suggests the CA should be reversible: on average, half of all configurations in an invertible finite-state dynamics are part of a single dynamical cycle. The irreversible Game of Life is again a good counterexample. From all but a sparse and hard-to-find set of initial states, it reverts to a collection of local independent oscillators in a number of space-updates that is sublinear in the number of bits in the space, rather than exponential in it. We can list many obvious requirements for an interesting macroscopic realm, such as macroscopic forces (both attractive and repulsive) and sources of free energy; others become apparent as we test our models.
One thing we learn by trying to construct interesting physics using reversible classical computational models is that the issues can be subtle. For example, macroscopic entropic time can be very different than microscopic computational time. Time for macroscopic creatures follows the increase of coarse-grained entropy, with macroscopic effects following their macroscopic causes, but this time may be very different than the order in which the computer updates its bits. The simplest example of this is a finite-sized reversible CA universe started from a low-entropy initial state that is not time-symmetric (I discuss this example in section 5.1 of my PhD thesis). Coarse-grained entropy increases regardless of whether the computation runs forward or backward from the low-entropy state, and the set of configurations visited form a long repeating cycle. Macroscopic creatures arising in either direction of entropy increase might think microscopic time flowed in their entropic time direction, but if the cycle of configuration updates is computed in only one time direction, some of the creatures would be wrong!
One can construct more exotic examples, where the microscopic bit updates do not solely follow either macroscopic entropic time direction. This can be done, for example, by turning the time evolution of the previous example into a spatial evolution in one added dimension that proceeds in both former time directions at once. If we then add to this dynamics a small amount of bidirectional interaction along the added dimension, we can get an evolving spacetime with essentially the same macroscopic dynamics as before, but where the exact microscopic evolution depends on both “past” and “future” information.
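As a toy illustration of the reversibility requirement discussed above (my own sketch, not one of the models from the comment): in a second-order cellular automaton, the next row is any function of the current row XORed with the previous row. Because XOR is invertible, the dynamics can be run backward exactly and the initial state recovered, unlike in the irreversible Game of Life.

```python
import random

def step(prev, cur):
    """One update of a 1D second-order reversible CA with periodic boundaries.

    next = f(cur) XOR prev, where f here is a rule-90-style XOR of neighbors.
    Any f works: reversibility comes entirely from the XOR with prev.
    """
    n = len(cur)
    f = [cur[(i - 1) % n] ^ cur[(i + 1) % n] for i in range(n)]
    return [f[i] ^ prev[i] for i in range(n)]

random.seed(0)
n = 16
row0 = [random.randint(0, 1) for _ in range(n)]
row1 = [random.randint(0, 1) for _ in range(n)]

# Run 100 steps forward from the pair (row0, row1).
a, b = row0, row1
for _ in range(100):
    a, b = b, step(a, b)

# Run 100 steps backward simply by swapping the roles of the two rows:
# the same update rule now retraces the trajectory in reverse.
x, y = b, a
for _ in range(100):
    x, y = y, step(x, y)

assert (y, x) == (row0, row1)  # the initial state is recovered exactly
```

The same trick fails for the Game of Life: its update map is many-to-one, so most configurations have no unique predecessor to step back to, and long cycles of the kind described above cannot form.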
Comment #650 February 2nd, 2022 at 2:24 am
OhMyGoodness #635: I’m not talking about the age of the photons, but about the nuclear fusion reactions that happened billions of years ago. Unless you’re willing to postulate a retrocausal action reaching back then, you’re forced to admit that quantum mechanics works the same way with or without observers.
Comment #651 February 2nd, 2022 at 2:52 am
I apologize (this is important only for me), but where I wrote “bounded set” I meant a set with a finite number of elements, and where I wrote “superimposed” I meant a state of superposition.
Photons are devilishly reliable little bosons. At 14 billion years old and just as good as the day they were created.
There was a large group of people on climate web sites arguing that cold bodies do not emit thermal photons toward hot bodies, since that would run against the temperature gradient. I pointed out the difficulty of arranging boson-boson interactions at low energies and referred them to the scattering literature. They really are little wisps of probability.
I would also like to thank you for your post. I saw many things here that hold interest for further study and thank you again for allowing me to post on your website. I really hope you are able to progress your work and able to use interference and amplitudes for deeper insights.
None of the titans of physics and mathematics included in your post had a high quality and interesting website as you do. 🙂
Comment #652 February 2nd, 2022 at 6:57 am
The Planck distribution, with the zero-point field (ZPF) term, is the foundation of quantum mechanics. The ZPF spectrum, with its spatial symmetries and cubic frequency dependence, is the only such spectrum that is also Lorentz invariant. The only synthetic truth “God” had to add is the scale, given by Planck’s constant and the speed of light. By the way, this spectrum is also invariant under dipole radiation, so a universe filled with dipoles, radiating and reradiating energy in equilibrium, would naturally maintain a ZPF of this form.
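For readers who want the formula being referred to, the standard Planck spectral energy density with the zero-point term included is (textbook form, my addition, not part of the original comment):

$$ \rho(\nu, T)\,d\nu \;=\; \frac{8\pi\nu^2}{c^3} \left( \frac{h\nu}{e^{h\nu/k_B T} - 1} \;+\; \frac{h\nu}{2} \right) d\nu $$

As $T \to 0$ only the zero-point piece survives, giving $\rho_0(\nu) = 4\pi h \nu^3 / c^3$. A spectrum proportional to $\nu^3$ is the unique frequency dependence whose energy density looks the same in every inertial frame, which is the Lorentz-invariance claim above.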
Comment #653 February 2nd, 2022 at 7:04 am
Mateus #650
I am not proposing any explanation. My only comment is that there is enormous evidence that quanta exist in a state of superposition until such time as an observer conducts a measurement. The observation (observer) is required to change from a state of superposition to some measured value.
Comment #654 February 2nd, 2022 at 8:03 am
OhMyGoodness #653: On the contrary, all the evidence we have shows that quantum mechanics worked the same way before observers began to exist. If there was a privileged role for observers we should expect a drastic change when humans arose. Suddenly everything started collapsing! That is, assuming that you define observers as humans. Or is being alive enough, and wavefunctions started collapsing when unicellular life appeared? Or is being human not enough, and wavefunctions only started collapsing when somebody first got a PhD in physics?
Comment #655 February 2nd, 2022 at 8:04 am
Suppose a wave exists in a one-dimensional space, evaluated across that one-dimensional space, such that every point in that space has an amplitude.
The amplitude is, from a dimensional framework considering only that single dimension, complex.
We could graph the wave in two-dimensional space, in which case the complex amplitude, for the example waves I have in mind, goes away; but the wave as considered in the two-dimensional space no longer satisfies the criteria of the Schrodinger Equation, as it no longer has a value for the entire space being considered.
In what sense is the complex amplitude not just an artifact of the fact that the amplitude of the wave exists in a dimension other than the dimensions being evaluated against?
Comment #656 February 2nd, 2022 at 8:57 am
Mateus #650
If you want a naive, purely qualitative, spur-of-the-moment, not-well-supported, science-fiction-based speculation (please don’t spend the time to fault it), it is that there is some principle which ensures we observe a consistent reality, one that complies with QM principles, whenever an observation converts a superposition to real values. This underlies entanglement, complementarity, astronomical observations, etc. It would presumably require some type of superluminal information transfer, so maybe it is active outside our spacetime.
I hope this idle speculation doesn’t offend technical sensibilities to an unbearable level.
Comment #657 February 2nd, 2022 at 9:14 am
This is a wonderful line of inquiry. Scott, I wonder if you wouldn’t mind taking your favorite comments/points from this enormous chain and putting them in an addendum to the original post, or a new post? I don’t know if I’m alone in this, but I find the formatting of the comment section rather hard to follow, although that could just be the amazingly high number of comments.
Comment #658 February 2nd, 2022 at 9:43 am
Scott #642: It’s perhaps not so much Gödel as the axiom of choice if we work with a nondifferentiable wave theory as a model. That is, it’s perhaps CAs either all the way down or else far enough down that the limiting case is an effective model. Or, taking ourselves to be working with a chaotic dynamics, as Tim Palmer does, it’s perhaps vortices all the way down. I don’t care even slightly what infinitesimally small stuff might be doing, because I don’t see any way for us to measure it. In any case, if a model is not differentiable, then measure theory may nonetheless still work well enough, even without determinism. Hence we work with a probability theory, and hope the axiom of choice doesn’t break everything we do.
In such contexts, however, we may sometimes perform measurements and use algorithms that result in incompatible probability measures. In that case, we don’t want to just discard that information because it doesn’t fit in a classical model that does not allow noncommutative operators, we want to encode that information using superpositions. Hence the quantum. Or, at least, hence superpositions and noncommutativity. [Note that this isn’t just waves, this is waves of incompatible probability measures, which is a mathematics of a higher order than waves of a classical field. Higher order caustics should be expected, I suppose. Situations in which low-dimensional Hilbert spaces give effective models may cut across our intuitions about classical fields. I note as well that von Neumann and Birkhoff used Koopman’s idea to use a Hilbert space for classical mechanics as a convenient way to prove their different versions of the ergodic theorem, and I suggest that this is a mathematics that can also be leveraged in discussing how to move forward Wheeler’s, CA, chaotic dynamics, and other programs.]
Comment #659 February 2nd, 2022 at 10:06 am
Scott #572: On the other hand, it’s clear that the universe was not optimized to be “as symmetric as possible.”
I’m going to venture a wild speculation here and say that the universe might actually be the simplest/most symmetric one that is compatible with intelligent life in it.
I’m conflating symmetry and simplicity because mathematically it shouldn’t matter if (say) we have a 3 parameter model, or a model with 5 parameters with some symmetries that allow us to quotient out 2 of those.
Obviously I’m aware that this is a vaguely formulated speculation.
For one, compatible with intelligent life could mean anything from “just Turing complete”, to “Turing complete + some ergodicity that guarantees that given long enough time from a random start, some structures resembling computation will inevitably emerge”.
More importantly, I don’t really know how to rigorously define complexity/simplicity of a model (or how to apply the usual asymptotic definitions to a fixed case). For this speculation to even be plausible, the definition of simplicity needs to assign relatively high complexity values to simple, but non-symmetric options like finite automata.
However, if the complexity measure actually subtracts the complexity of the symmetry group from the complexity of the initial structure (which I think it should), then this becomes at least slightly more likely.
Of course it’s still a vague and wild speculation, but if there’s a place to put out such speculations, comment section on a blog seems to be a good one, so here it is.
Comment #660 February 2nd, 2022 at 10:40 am
Scott #639
Time is an unwitnessable algebraic dimension. And, the witnessed continuum is dynamic. The pedagogy of physics presents the notion of energy as a dichotomy between potential energy and kinetic energy.
There is no absolute notion for potential energy. On that count, you should try and learn about “differential ontology,”
https://iep.utm.edu/diff-ont/
You will not find much, and what counts as “the foundations of mathematics” has, for the last century, only studied “objectual ontology.”
In the pedagogy of physics, real spaces serve as the objective ontology. This is the case because the explanatory power of physics relates to measurement. Obviously, to do so requires intervals to be interpretable with respect to ruled straightedges. But, the pragmatic emphasis in physics does not critically analyze the use of this mathematical ground.
The introductory pedagogy of physics defines kinetic energy using the vector called velocity. Presumably, this can be motivated by the witnessing of trajectories. So, kinetic energy can be understood as that part of the pedagogy which relates to parameterized arcs. Arc connectedness is stronger than path connectedness in that it assumes bijection with the presupposed interval. Path connectedness is a weaker notion requiring only continuity. On this alone, kinetic energy might be a vague concept. But what I wish to emphasize is that these connectedness properties are topological, and therefore relate to the topology of a dimension we cannot witness.
Most people who learn basic analysis will tend to understand “points” according to the Cantorian idea that infinitesimals do not exist. This would correspond with Cantor’s nested set theorem for closed sets of vanishing diameter. In so far as this is interpretable as Leibniz’ principle of the identity of indiscernibles, it is disputed in the foundations of mathematics. But, the notion of “point” becomes problematic because of the distinction between “static” and “invariant.” Obviously, if one is formulating explanations for a dynamic continuum, invariance ought to be emphasized.
While points, like time, may not be witnessable, we do seem to witness plurality in our spatial experience. This plurality may be said to correspond with separations. So, differential ontology is witnessable, whereas it is much harder to make the case for objectual ontology.
Using the expression “gauge space” quite differently from physicists, the mathematician Eric Schecter introduced classifications for spaces according to subsets of the metric space axioms. The homotopy type theory advocate Mike Shulman wrote an nlab page on these gauge spaces. In his fourth example bullet,
http://nlab-pages.s3.us-east-2.amazonaws.com/nlab/show/gauge%20space#examples
he explains that every topology defines a quasigauge space. Using standard membership notation and standard logical notation, he portrays separation by an open set U as
x in U AND NOT y in U
The denial of this formula is not “individual” or “singular” as one might understand it in the standard pedagogy of mathematics. It is a conditional statement which must accommodate the fact that the empty set is an open set.
If ‘0’ and ‘1’ are taken to be standard GF(2) representations of truth values, then it is numerical difference which grounds “truth.” This coincides with the standard definition of affine spaces from vector spaces and contradicts the necessary truth of reflexive equality statements in systems emphasizing objectual ontologies such as first-order logic.
More important, however, is that the conditional is a reflexive connective whose semantics is understood relative to a 16-element set of tables represented with vectors from a 4-dimensional vector space over GF(2). The Boolean polynomials associated with these tables decorate an order whose connectivity is that of a tesseract. As an ortholattice, that order corresponds with a 4-dimensional space.
The mathematician Robert Curtis decorated the Golay code so that two of its octads joined as “the complementary square” could be decorated with 4-dimensional vectors over GF(2). People interested in quantum gravity are very interested in this structure since the Kummer surfaces in mirror symmetry have sixteen exceptional points that can be mapped onto the vectors of the 4-dimensional vector space over GF(2).
As I understand matters (topologically), it is easy to see why people who use logic naively end up in the same place and then engage in rhetoric, unknowingly dismissing their own problem when it is expressed in different terms.
As outlined in Assmus and Salwach, the 16-element group of Pauli matrices contains two 2-(16,6,2) block designs. Of course, it took some work to recognize that they were talking about this group. One is a Kummer configuration. The other is not. Therefore, it is unrepresentable.
You can, as I have done, use the Kummer configuration mapped into a Rook’s graph to label the Pauli matrices. Then, using a uniform group signature, these labels can be used to generate the blocks of the non-representable design.
What I suspect is that, at least with spin, superposition is the manifestation of the non-representable block design in the 16-element group of Pauli matrices.
The distinction between differential ontology and objectual ontology in the modern era would correspond with the distinction between empiricism and idealism. Look up Bradley Regress on the Stanford Encyclopedia of Philosophy to see the role Russell plays in excluding the study of differential ontology from modern mathematical thought. Aristotle had criticized circularity in arguments. However, modern admonitions against circularity apply to much more since it places the demand on ontology as well. What motivates this is a correspondence theory of truth. Truth with respect to a differential ontology will be some sort of coherence theory of truth and will look like fictionalism.
In his “Principles of Mathematics” Bertrand Russell discusses a “relative view of quantity” which seems to coincide with a differential ontology relating to an order over magnitudes.
For what this is worth, Wittgenstein’s states of affairs in relation to the use of a negation connective appear to be comparable to my earlier comparison of Einstein’s reference frames and spin. As it historically evolved through Carnap and Kripke, modern accounts of modal reasoning conflate counterfactuality and partiality of information usually spoken of as Leibniz’ possible worlds.
One other thing. That neither Boolean algebras nor orthomodular algebras are faithful models for their respective logics was shown by Pavicic and Megill in 1999. People who give credence to claims made by symbolic logicians are simply denying this result. Because the faithful models involve a mapping into a hexagonal ortholattice which is not distributive, this probably relates to the use of a complete graph on six vertices discussed in Assmus and Salwach. In addition to his book on analysis, Eric Schechter wrote a book on mathematical propositions in which a “hexagon interpretation” for propositional logic is given. As one who has read Pavicic and Megill might expect, the hexagon interpretation is not distributive. For some people, it is the unfaithful Boolean model of logic which is confusing.
I hope these remarks have been specific enough for your aesthetic.
I apologize for any typographic errors. This is too much to edit on a mobile phone.
Comment #661 February 2nd, 2022 at 10:43 am
Scott, in a video of you I saw on YouTube, you gave an answer to Q1 having to do with information: if the universe were classical, you’d need an infinite amount of information to store the x coordinate of even one particle. That seems crap; we should want a universe that’s more discrete.
Isn’t Q1 answered by that?
Comment #662 February 2nd, 2022 at 11:24 am
Hamish Todd #661: We’ve been over this before on this thread, but since I can certainly no longer expect anyone to have read through 🙂 , briefly:
– If you start with continuous classical physics, you do indeed run into all sorts of problems related to infinite entropy, and QM is the primary known way to tame those problems—but I find it plausible that there might be many other ways that we’re prevented from seeing only by lack of imagination. Further research would help!
– Even more directly, entropy considerations do nothing to rule out classical theories that are discrete from the beginning (such as cellular automata).
Comment #663 February 2nd, 2022 at 11:27 am
NT #659:
Of course it’s still a vague and wild speculation, but if there’s a place to put out such speculations, comment section on a blog seems to be a good one, so here it is.
I laughed. That’s indeed what this thread has been for.
Comment #664 February 2nd, 2022 at 11:29 am
Jonah #657:
This is a wonderful line of inquiry. Scott, I wonder if you wouldn’t mind taking your favorite comments/points from this enormous chain and putting them as an addendum to the original post, or a new post? I don’t know if I’m alone in this, but I find the formatting of the comment section rather hard to follow, although that could just be the amazingly high number of comments.
Absolutely! I was already thinking of that, as the extraordinary length of this thread makes it hard to find some real gems. Your comment pushes me over the edge toward doing it.
Comment #665 February 2nd, 2022 at 11:35 am
Scott #551
I didn’t mean a 100654444222-norm, I just meant |psi| raised to that power! Yes, that is a very stupid thing to even suggest; my point was that it is too convenient to accept that the Born Rule is |psi|^2 just because it allows an easy explanation of it as a probability in an otherwise deterministic unitary evolution, and I then posted comment #527 to suggest how I could imagine the Born Rule might appear from elementary considerations.
Which was what you asked for in the OP after all!
Comment #666 February 2nd, 2022 at 12:11 pm
James Gallagher #665: The p-norm means the norm where you raise |ψ| to the p power.
Comment #667 February 2nd, 2022 at 12:19 pm
Scott #666,
Power of the Devil? Comment of the Devil?
Comment #668 February 2nd, 2022 at 12:28 pm
Brian La Cour #652:
The Planck distribution, with the zero-point field (ZPF) term, is the foundation of quantum mechanics. The ZPF spectrum, with its spatial symmetries and cubic frequency dependence, is the only such spectrum that is also Lorentz invariant. The only synthetic truth “God” had to add is the scale, given by Planck’s constant and the speed of light. By the way, this spectrum is also invariant under dipole radiation, so a universe filled with dipoles, radiating and reradiating energy in equilibrium, would naturally maintain a ZPF of this form.
Sorry, but you lost me at the first sentence! Planck’s blackbody distribution was the historic starting point for what eventually became QM. In what sense is it a logical foundation? Can you mathematically derive superposition, entanglement, complex amplitudes, unitary evolution, and the Born rule using the Planck distribution as the starting point, or point me to where that was done?
Comment #669 February 2nd, 2022 at 12:39 pm
Scott #666
erm, you mean the sum of pth powers raised to the power of (1/p)?
I don’t think this is really related to the Born Rule, I was just trying to point out (by massive exaggeration) how arbitrary the initial “discovery” of the Born Rule was.
I supplied info/links on how the original greats of QM were not too sure themselves whether it could just be |psi| or some higher quartic form.
I thought this was exactly the type of discussion you wanted?
But, it seems you think my ideas far too simplistic or even stupid?
Comment #670 February 2nd, 2022 at 12:47 pm
Norman Margolus #649: Thanks for your exceedingly interesting comment—especially, for your detailed “field reports” on the difficulties of getting classical CAs to generate the desired sorts of behavior from generic initial conditions. That of course couldn’t possibly be more relevant to the subject matter of this post.
In retrospect, I wish we’d talked more about this stuff when I was at MIT! I think the problem was that sooner or later, you’d make some confident pronouncement about the ability of classical models to reproduce quantum behavior that seemed so blatantly contrary to everything I understood, that I couldn’t get any further without resolving the point of contention (but then it never actually did get resolved). Indeed, one sees an example of that even in your comment:
Thus even if we lived in a world with only classical finite distinct state underlying it, the formalism of unitary evolution of linear superpositions in Hilbert space would still be a natural one to use to describe it, given our macroscopic perspective: this formalism allows us to express finite-state dynamics in terms of continuous macroscopic space and time, to reconcile finite-distinctness with continuous macroscopic symmetries, and to use the same finitely-distinct language to describe isolated systems at any scale.
Could you please walk me through, as if I’m very slow and dense, how beings in Conway’s Game of Life would find it natural to use “the formalism of unitary evolution of linear superpositions in Hilbert space” to describe their experience? Wouldn’t those beings then wonder why they couldn’t violate the Bell inequality, among a hundred other things? No need even to go to quantum computation.
Anyway, I was never able to get past the constant “transgressing the boundaries” (in Alan Sokal’s phrase) between classical and quantum, to all the other technical stuff you’ve learned from your CA investigations, but the latter is actually of great interest to me!
Comment #671 February 2nd, 2022 at 12:51 pm
James Gallagher #669: I’m all for freewheeling discussion of the space of logical alternatives to QM! The issue was that you made an importantly, relevantly false mathematical assertion: namely, that Schrödinger evolution preserves the p-norm for all even p. It does not. Do you now agree that this is false?
Comment #672 February 2nd, 2022 at 12:56 pm
Scott #671
No I didn’t!
I said the evolution preserves |psi|^83867264424829492748 (or whatever)
I made the comment without properly explaining I was trying to emphasise that the Born Rule really is an arbitrary axiom in QM
I feel bad, haven’t posted for months and now messing up one of the all-time great threads.
Sorry Scott and commenters
Comment #673 February 2nd, 2022 at 1:18 pm
James Gallagher #672:
I feel bad, haven’t posted for months and now messing up one of the all-time great threads.
Sorry Scott and commenters
Don’t worry, all is forgiven …
… if you’ll just explicitly agree that Schrödinger evolution does not preserve |ψ|^83867264424829492748 (or whatever) 🙂
It only preserves |ψ|^2. People should understand that.
Comment #674 February 2nd, 2022 at 1:29 pm
Scott #673
Oh, right, now I see your misunderstanding. I just meant (very simplistically) that since |psi| is preserved by the evolution, then so is any power of it.
Yeah, dumb, but then some people thought, nearly 100 years ago, that just the modulus of psi might be the probability, and we still haven’t really established why it’s the square of the modulus.
Comment #675 February 2nd, 2022 at 1:30 pm
Scott #674: James Gallagher never claimed that Schrödinger evolution preserves the p-norm for all even p. He only claimed that it preserves the 2-norm raised to some arbitrary power. Well, if you have some conserved quantity, and you apply some arbitrary function to it, you get a result which is still a conserved quantity.
Comment #676 February 2nd, 2022 at 2:01 pm
There’s a bug in the comment-editing feature that messed up the only equation in post #673 after submitting. It was perfectly readable in the preview!!! Here is the entire post again.
Scott #534: “I confess that I might be biased by having had this argument previously, with people who were 100% confident that they could explain the stationary-action principle in a purely classical way, but then every time I asked them to teach me, I got a huge, complicated runaround, never bottoming out in anything I understood the way the quantum-mechanical explanation does.”
I make a simple argument in a paper about counting distinct states in continuous unitary transformations, which I’ll summarize for your benefit here.
Classical systems achieve the maximum possible rate of distinct quantum state-change for their relativistic energy and momentum. To keep things simple, I’ll just justify this here by pointing out that this is exactly how we normally assign an entropy to a classical system: we count the maximum number of distinct states consistent with its volume in classical phase space. Although this counting is consistent with quantum mechanics, we can take it here as an independent assumption: that classical mechanics has an underlying finite-state classical combinatorics.
Relativistic energy, in units where \(h=2\), is the total number of distinct states traversed per unit of macroscopic entropic time, and momentum of a subsystem is similarly the number of distinct states traversed due to its macroscopic frame motion, per unit distance. Thus relativistic Lagrangian action for a system of particles, \( (H-\sum p_i v_i) dt\), is the total number of distinct states of the system in a short time, minus the states due to particle motion in this time — this is the total number of distinct states not due to motion. This short-time quantity is known to be maximized, and since energy is conserved on the actual dynamical path, the number of distinct states due to the modeled motion is minimized.
This can be interpreted as a generalization of the principle of maximum aging: we must minimize the amount of motion in order to maximize the number of distinct states in all subsystem rest frames.
Comment #677 February 2nd, 2022 at 2:04 pm
Ok, part of this may have been a trivial misunderstanding. When I write |ψ|^p, I mean the vector norm to the pth power, i.e. Σ_x |ψ(x)|^p. Taking the 2-norm and raising it to powers is obviously uninteresting.
But also: there really is something special about the 2-norm. It’s the only p-norm that has a nontrivial set of linear transformations that preserve it on all vectors (the 1-norm has such a set, but for real nonnegative vectors only). This is an extremely important point and it’s one we understand.
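For concreteness, here’s a minimal numerical sketch of that distinction (my illustration; the Hadamard matrix is just a convenient nontrivial unitary): it preserves Σ_x |ψ(x)|^2 but not Σ_x |ψ(x)|^4.

```python
import numpy as np

# One unitary suffices to show the asymmetry: the Hadamard matrix.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = np.array([1.0, 0.0])
phi = H @ psi                      # (1/sqrt(2), 1/sqrt(2))

def pnorm_p(v, p):
    # Sum_x |v(x)|^p, i.e. the p-norm raised to the p-th power
    return float(np.sum(np.abs(v) ** p))

print(pnorm_p(psi, 2), pnorm_p(phi, 2))  # both ~1.0 : preserved
print(pnorm_p(psi, 4), pnorm_p(phi, 4))  # ~1.0 vs ~0.5 : not preserved
```

The same check with any generic unitary and p ≠ 2 fails the same way; that failure is exactly why nontrivial norm-preserving dynamics singles out p = 2.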
Comment #678 February 2nd, 2022 at 2:41 pm
The fact that not too many comments relate to the speculation below is definitely telling me that it’s either completely flawed, trivial, or in some other way useless…
but (oh no, here he goes – sincere apologies!!!) it was so mind-blowing to learn that in QED the combination of constructive and destructive interference of possible photon paths basically “produces” the stationary action principle for them…
so my poor soul can’t rest until I get a small hint about what to read/learn to stop thinking that this could have something to do with Q1 🙂
I.e. what if there is just no way to make a “real” (as in “not simulated”) universe in which the classical laws of physics are “directly implemented”?
Of course, even for me this only makes sense when I regard the complex wavefunction as a real, inherent property of the universe that is the real mechanism (sorry Mr. Bohr!) for actually creating / “implementing” the classical laws. So not just a more fundamental/underlying descriptive physical law for the descriptive classical laws. At least I feel they can only be descriptive in the sense that photons most likely don’t know how to solve the Euler-Lagrange equations…
Oh shoot… I just realized that this last speculation about real vs. descriptive physical laws locates me still very much at the top of Mt. Stupid on the fake Dunning-Kruger curve… but hey! Even from here, this super interesting, 600+ comment long discussion is a fascinating view! And honest thanks to Scott for allowing dilettante interjections while the experts are discussing.
(since ontic vs. epistemic is an integral part of QM discussions, I hope it is ok to refer to a version of the curve that was not objectively a result of the Dunning Kruger study but only perceived by some people on the internet as such)
Comment #679 February 2nd, 2022 at 4:44 pm
Scott #670: “Wouldn’t those beings then wonder why they couldn’t violate the Bell inequality, among a hundred other things?”
I actually addressed this in my other post. Bell inequality violation only happens when you have certain rotations. Beings in the Game of Life, or any other cellular automaton with only pi/2 rotations, do not have Bell inequality violations. It’s actually an interesting question whether there is a local hidden variable formulation of quantum mechanics in a universe with only pi/2 rotations.
Maybe they only have Pauli gates, as in Gottesman–Knill, because their magnetic fields can only point in 6 directions. (The Game of Life obviously won’t have any magnetic fields; I’m talking about a hypothetical similar universe with only pi/2 rotations.)
Comment #680 February 2nd, 2022 at 5:08 pm
Scott #641
Yes, Wheeler’s essays are very poetic, but his notebooks are quite different. As Amanda Gefter wrote in “Trespassing on Einstein’s Lawn”, Wheeler was “obsessed” with Gödel. His notebooks mention Gödel thousands of times, yet his published essays only rarely mention him or undecidability. Now with the history out of the way, let’s get to the math.
When we are proving that a problem is undecidable, let’s say the halting problem, we can show that we have to choose between an answer that is computed (written down by a Turing machine) and one that is not self-contradictory. A generalized algorithm requires information to be both computed (written down by a Turing machine) and consistent (i.e., with no self-contradiction).
The act of a Turing machine writing an answer down I call “case 1”. The physical, mechanical act of writing is necessary for “proof”. The act of writing is necessary for an effective method/procedure. “Case 1” i.e. writing by a TM, is necessary for provability and hence completeness.
The lack of contradiction, i.e. consistency, is what I call “case 2”.
Then we have von Neumann’s Process 1 which refers to the measurement process and Process 2 which refers to the evolution of the state vector.
Process 1, like Case 1, is tangible, observable information, yet there is contradiction. Hold on. Not a contradiction of QM itself. If you start out with the same exact initial conditions, like preparing an electron in spin state z+, sometimes you will get x- and sometimes you will get a contradictory answer, x+. Then you can build the consistent probability distributions we are familiar with. This indeterminacy has no analog in classical mechanics (e.g. you can build a deterministic coin-flipper if you use a robot, but you cannot do the same with QM).
Process 2, like Case 2, is consistent. The Schrodinger equation evolves exactly the same, given the same initial conditions. Yet in both cases, there is no tangible observable information. No Turing machine has written down an answer, no silver halide crystal has darkened on a screen. No ink on paper, no dot on a screen.
This is just the beginning. I have more in comment #481 (and my FQXi essay which I do *not* expect us to engage in, but I am mentioning here only for reference purposes).
And even though I am “cool”, I am wrong plenty, believe me! I just like seeing where that is the case.
Thanks for your time!!
Comment #681 February 2nd, 2022 at 5:12 pm
Pardon my ignorance, but do we have satisfying answers to Q (and especially Q1) for attributes other than “quantum-mechanical”? 3-spatial-dimensioned? Relativistic? Atomic?
Relatedly, is there any particular reason to consider quantum mechanics less likely a priori than classical mechanics, keeping in mind that most math was developed to describe an approximately classical world?
Comment #682 February 2nd, 2022 at 5:37 pm
And then to consider that with superdeterminism this discussion arose in this universe (including this comment) 😉
(and no, I am not a determinist)
Comment #683 February 2nd, 2022 at 6:05 pm
hnau #681:
Pardon my ignorance, but do we have satisfying answers to Q (and especially Q1) for attributes other than “quantum-mechanical”? 3-spatial-dimensioned? Relativistic? Atomic?
This already came up many times in the thread.
For “relativistic,” I’d say the answer is a resounding “yes,” thanks to Einstein’s derivation of the Lorentz transformations from extremely natural principles.
For “atomic,” it seems extremely natural to design a universe where you can build many big things out of a few types of small things! Although I agree that one can conceive of other possibilities.
For “3-dimensional,” I’d say that the answer is currently “no,” and that’s almost as profound a question (not quite) as the “why QM?” question—maybe I’ll do another post about that one!
Relatedly, is there any particular reason to consider quantum mechanics less likely a priori than classical mechanics, keeping in mind that most math was developed to describe an approximately classical world?
Quantum mechanics represents one extremely specific choice for how probabilities are to be calculated—namely, as the squared absolute values of complex amplitudes. The term “classical,” by contrast, encompasses anything whatsoever with no probabilities at all, or with probabilities introduced just as basic primitives. A priori, then, if we weren’t biased by living in a world that happens to be described by QM, it might seem almost as weird to divide all logically possible theories into “quantum” and “classical,” as to divide all logically possible desserts into “Baked Alaska” and “not Baked Alaska.”
Or maybe not! But that’s exactly the question, isn’t it? 🙂
Comment #684 February 2nd, 2022 at 6:09 pm
Scott, you’ve probably seen this weird argument by Eliezer about QM as seemingly solving “the anthropic trilemma” (perhaps originally due to someone else). An excerpt from his post (also called The Anthropic Trilemma on LW) follows.
It’s a remarkable fact that the one sort of branching we do have extensive actual experience with—though we don’t know why it behaves the way it does—seems to behave in a very strange way that is exactly right to avoid anthropic superpowers and goes on obeying the standard axioms for conditional probability.
In quantum copying and merging, every “branch” operation preserves the total measure of the original branch, and every “merge” operation (which you could theoretically do in large coherent superpositions) likewise preserves the total measure of the incoming branches.
Great for QM. But it’s not clear to me at all how to set up an analogous set of rules for making copies of sentient beings, in which the total number of processors can go up or down and you can transfer processors from one set of minds to another.
It’s only an 8-minute read, but it has a summary at the end anyway.
It seems plausible that a big classical world + this reasoning about sensible(/non game-breaking) subjective measure would result in us finding ourselves in a QM world.
Comment #685 February 2nd, 2022 at 6:49 pm
Re complex numbers: But what is a number? Surely real-world numbers can only be relationships between categories, in the same sense that the laws of nature are relationships between categories, but only relationships where the numerator and denominator categories cancel out? Numbers are relationships, NOT pebble-like entities or finished products?
Comment #686 February 2nd, 2022 at 7:34 pm
Age bronze #679:
It’s actually an interesting question whether there is a local hidden variable formulation of quantum mechanics in a universe with only pi/2 rotations.
I can answer that question for you: the answer is no. The 3-qubit GHZ experiment requires Pauli measurements only (the X and Y bases), yet admits no local hidden-variable explanation.
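Here’s a minimal sketch of the arithmetic behind that claim, computing the four GHZ expectation values directly from the state vector:

```python
import numpy as np

# Pauli matrices and a 3-qubit GHZ state
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)      # (|000> + |111>)/sqrt(2)

def expval(op):
    return float(np.real(ghz.conj() @ op @ ghz))

e_xxx = expval(kron3(X, X, X))        # +1
e_xyy = expval(kron3(X, Y, Y))        # -1
e_yxy = expval(kron3(Y, X, Y))        # -1
e_yyx = expval(kron3(Y, Y, X))        # -1

# A local hidden-variable model would assign fixed values x_a, y_a, ... in
# {+1, -1}. Multiplying the last three constraints (each y appears twice,
# so squares to +1) forces x_a * x_b * x_c = -1, contradicting e_xxx = +1.
print(e_xxx, e_xyy, e_yxy, e_yyx)
```

Since every observable here is a Pauli operator, the whole experiment lives inside the stabilizer world, which is the point of the example.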
Comment #687 February 2nd, 2022 at 7:37 pm
Norman Margolus #679: In that case, I come back to the very same questions that I remember having when we discussed this a decade ago. Is there a least-action principle in Conway’s Game of Life? If so, what is it? If not, what is the set of classical cellular automata that DO satisfy a least-action principle? I could see them needing to be reversible; are there any other conditions?
Comment #688 February 2nd, 2022 at 8:49 pm
Well, my physics questions are these:
(1). What is the *physical* significance of complex numbers?
(2). The quantum ‘wavefunction’ is nothing like an ordinary physical wave of classical mechanics, so what is actually ‘waving’ in the quantum case?
I think if I had clear answers to (1) and (2), I could explain the interference of the complex-valued amplitudes, no problem.
Comment #689 February 2nd, 2022 at 10:01 pm
Scott # 530
> What I’ve found weird, in this thread, is to be sandwiched between two self-confident extremes:
> (1) People lecturing me on why the “why QM?” question can obviously never be answered; how I need to learn to accept that certain things are true “just because.”
> (2) People lecturing me on why the “why QM” question obviously has been answered, because how other than QM would you account for such-and-such empirically observed phenomenon?
As someone who simultaneously believes (1) and (2), I don’t think these are opposing extremes. I think these are different ways of expressing a typical physicist objection to this kind of question. The goal of physics is to explain the world we are in. Therefore, empirical facts are necessary. Counterfactuals about hypothetical worlds are not guaranteed to be meaningful. Of course it’s useful to consider models of reality that don’t exactly describe the world we are in (like, say, solving the Coulomb potential in quantum mechanics). But in physics we base these thought experiments on principles that describe our world, and we take these principles seriously *because* they explain many experimental facts. There’s no way to derive that the world *must* be Lorentz invariant, since we can imagine non-relativistic worlds, but we can show that assuming Lorentz invariance explains many phenomena we observe and predicts new phenomena we haven’t.
> In some sense, both of these extremes pointedly refuse to enter into the thought experiment that this whole post was about: namely,
> Suppose you were designing a new universe from scratch. It wouldn’t have to look like this universe, but you might want it to produce rich, complex behavior in an elegant way, or something along those lines. What considerations would militate in favor of your choosing to make your universe quantum or classical?
I think refusing to engage, *is* a form of engaging, in this case. I have no problem with asking the question and seeing what it leads to. But I think it’s also valid for people to point out that there are obstacles to this line of thinking leading to an interesting answer, and I don’t think it’s obvious from what you’ve written that these obstacles can be overcome. For example: how do you define a “success metric” on the space of theories that lead to “rich, complex behavior… or something along those lines”? In fact we could take non-relativistic Newtonian mechanics as an example. Let’s assume the world is just described by fluid mechanics, and we take Terry Tao’s ideas that fluid mechanics can describe a Turing machine (https://arxiv.org/abs/1402.0290). That’s my concrete proposal for a purely classical model of the world that does have complex behavior. Now by what measure do you propose to tell me quantum mechanics is superior to fluid dynamics?
If you don’t like pessimists, that’s fine, but I don’t think comments making these objections are invalid or necessarily failing to engage.
Comment #690 February 2nd, 2022 at 10:59 pm
Scott #687:
Given the absence of continuity in the action (that is, between possible points in the action), all (valid) trajectories through the action would seem to satisfy a stationary action principle.
More strongly, given the deterministic nature of the Game of Life, a least-action principle seems, from an observer looking at the objective rules of the game, kind of irrelevant; there’s exactly one possible trajectory. Does a trivial least-action principle count?
From the inside, I think, would be a different question; given that the Game of Life is, as I understand it, Turing-complete, I think the question, when changed to ask whether an observer inside the system would observe a least-action principle, is whether an arbitrary Turing-computable simulation must obey a stationary-action or least-action principle. I think the answer there is clearly no, at least until we restrict the simulation-space to universes which follow some kind of consistent internal logic.
That question still seems too big, so let’s consider the least convenient universe: If you constructed a classical universe which followed a greatest-action principle, what would that look like? My mental picture of a particle obeying a greatest-action principle ends up looking an awful lot, at least when considering the net effect of the particle’s fields on surrounding particles, like a wave obeying a least-action principle, at least insofar as we hold a given starting point in space and time, and a given ending point in space and time, constant. (As my mental model suggests that the trajectory that maximizes the action is going to be a spiral centered on the least-action-principle trajectory)
Comment #691 February 3rd, 2022 at 12:31 am
Re Lorraine Ford #685:
I should have added: Does a complex number, e.g. the square root of minus one, seen as a relationship, and nothing but a relationship, seem more feasible than a complex number seen as a finished product or a pebble-like entity?
Comment #692 February 3rd, 2022 at 12:33 am
Since the thread is still open, here is an answer to Scott #668 « Can you mathematically derive superposition, entanglement, complex amplitudes, unitary evolution, and the Born rule using the Planck distribution as the starting point, or point me to where that was done? »
This is an interesting question, but I’m afraid it has no answer. One cannot ‘derive’ the quantum formalism from the Planck distribution, or from any similar observation, because a physical observation (what I call empirical evidence) does not have enough mathematical structure to derive a whole theory. As a ‘proof’, it took more than 20 years of intense thinking to go from Planck’s law to real QM. On the way there must be an ‘inductive’ step, where physicists guess a mathematical structure, and then deductively check whether it does what they wish, and fits with the data.
As a brief sketch of our CSM approach already quoted, we start from physical axioms to show that probabilities are needed due to contextual quantization (just logic, no maths). Therefore we look for a suitable probability theory, connecting modalities which are appropriate contextual generalizations of ‘quantum states’. It turns out to be a good idea (one that we can justify but not prove) to get probabilities from projectors, with orthogonal projectors associated to mutually exclusive modalities, and the same projector associated to mutually certain modalities. This framework does correspond to the mathematical hypotheses of Uhlhorn’s and Gleason’s theorems (yeah!), and thus we deductively get unitary transforms and the Born rule. More details in https://arxiv.org/abs/2111.10758 .
The goal of any such construction is to have the inductive step as short and compelling as possible, but it’s not possible to remove it; otherwise it would not be physics, but mathematics, where hypotheses are the pure choice of the mathematician, without requiring any fit with any data. But maybe I’m repeating again and again the very idea that Scott does not want to hear…
Comment #693 February 3rd, 2022 at 1:16 am
Scott #687:
“Is there a least-action principle in Conway’s Game of Life?”
One issue I have here is that the Game of Life does not have momentum variables: the configuration at one moment in time alone uniquely determines the evolution of the system in the future. So you can’t even begin to write down a least-action-like boundary-value problem.
If you allow the rules of the cellular automaton to depend on the last *two* grid states, you can try to generate automata from a Lagrangian L(q_i, q_{i+1}) via a least-action principle (where q_i denotes the configuration of the grid at step i). For instance one Lagrangian I’ve played with in the past is
L(q_i, q_{i+1}) = d(q_i, q_{i+1})^2 + V(q_i),
where d() is the earth-mover distance and plays the role of kinetic energy, and V is some prescribed potential function. The action corresponding to L is not differentiable but you can still insist on least action in some local sense: declare a path (q_a, q_{a+1}, …, q_b) physical if toggling any one cell’s state in any of q_{a+1} through q_{b-1} increases the action.
The optimality conditions on the action define “Euler-Lagrange Equations” (q_i, q_{i+1}) -> q_{i+2}. These describe q_{i+2} in terms of systems of inequalities which can have zero, one, or many solutions. Your question, applied to this setting, becomes: does there exist some L for which q_{i+2} is guaranteed to have exactly one solution? Which is very interesting and not something I’ve thought about.
Multiple solutions are not necessarily an obstacle: evolve your automaton by choosing one solution (uniformly?) at random, MWI-style. If you do this, and you plot over time the expectation that each cell in the grid is filled, you get dispersion of probability packets over time, in a way that, with a lot of squinting, is reminiscent of the behavior of the Schroedinger equation.
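The construction above can be made concrete with a small toy model (my own illustration, not the commenter’s actual code): a 1D binary grid, the earth-mover distance between particle positions as the kinetic term, and a brute-force search over single-particle intermediate configurations with fixed endpoints. The `big` penalty for mismatched particle counts is my added modeling choice. Under these assumptions, the action-minimizing intermediate configuration is the midpoint, the discrete analogue of a straight-line trajectory.

```python
def emd(p, q, big=10**6):
    """Earth-mover distance between two binary configurations, treating
    filled cells as unit 'particles'. If particle counts differ, return
    a large penalty (an added modeling choice, not from the comment)."""
    a = sorted(i for i, bit in enumerate(p) if bit)
    b = sorted(i for i, bit in enumerate(q) if bit)
    if len(a) != len(b):
        return big
    return sum(abs(x - y) for x, y in zip(a, b))

def action(path, V=lambda q: 0):
    """Sum of L(q_i, q_{i+1}) = d(q_i, q_{i+1})^2 + V(q_i) over the path."""
    return sum(emd(path[i], path[i + 1])**2 + V(path[i])
               for i in range(len(path) - 1))

n = 5
q0 = (1, 0, 0, 0, 0)   # particle at site 0
q2 = (0, 0, 1, 0, 0)   # particle at site 2, two steps later

# enumerate all single-particle intermediate configurations
candidates = [tuple(1 if i == p else 0 for i in range(n)) for p in range(n)]
best = min(candidates, key=lambda q1: action([q0, q1, q2]))
print(best)   # (0, 1, 0, 0, 0): the midpoint minimizes the action
```

The d² term plays the role of velocity-squared kinetic energy, which is why uniform motion comes out as the minimizer here.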
Comment #694 February 3rd, 2022 at 3:02 am
Is there some consensus on what approach(es) for CAs is/are the most promising?
E.g. in terms of what is the interpretation of the cells’ content, like:
a) a more direct one like in #687, treating the cells as some kind of matter and then checking if principles like stationary-action can be constructed.
b) more indirect/general, “it from bit” approach with the cells contain information and the goal is to construct the CA in such a way that the emerging structures show some properties that would allow to understand them as a kind of matter
c) …
Comment #695 February 3rd, 2022 at 5:55 am
It might be simple.
It might be useful to have creation and annihilation operators: God creates the universe, and creation does not exist in classical mechanics.
A relativistic quantum theory might be necessary.
The Dirac equation implies the existence of antiparticles, so that the second quantization creates these new particles and antiparticles from the vacuum.
Comment #696 February 3rd, 2022 at 5:56 am
Scott #670: “Could you please walk me through, as if I’m very slow and dense, how beings in Conway’s Game of Life would find it natural to use “the formalism of unitary evolution of linear superpositions in Hilbert space” to describe their experience? Wouldn’t those beings then wonder why they couldn’t violate the Bell inequality, among a hundred other things? No need even to go to quantum computation.”
Maybe the best way to approach this is to argue that it is natural for us to use the QM formalism to describe classical mechanics if we want to get the connection between finite distinctness and energy and momentum right. Then, for creatures living in a universe that is classical at all scales, the same formalism would be appropriate at all scales. The discussion below is based on part of my paper, Counting distinct states in physical dynamics.
Now, classical mechanics can be regarded as the infinitely distinct \(h\to0\) limit of a unitary quantum evolution — this is the path integral viewpoint. In taking this limit, the role of quantum mechanics in absolutely defining macroscopic energy and momentum as maximum rates of distinct change in time and space respectively is lost. Thus if we want to assign an entropy to a classical mechanical system, we do that by effectively assuming it has an underlying maximally distinct unitary evolution with given energy and momentum, and do the appropriate counting of distinct quantum states in classical phase space.
We can do better though, in statistical mechanics, by explicitly modeling the right count of distinct states. We do this, for example, in the Ising model. There we treat a classical finite-state energetic model as a special case of a quantum model, and get critical exponents for phase change behavior that agree with real physical systems. We can do the same thing for classical mechanics, and get the correct local rates of state change in time and space dictated by energy and momentum by explicitly using finite-state classical reversible models, and treating them as unitary evolutions (quantum computations). Traditionally, we only use such rates of change implicitly in classical mechanics, to define a continuous dynamics (see comment #676).
We can, for example, use a classical lattice gas as our model. This is a reversible CA that alternates free motion of identical classical particles between lattice sites, with interaction steps where the bits at each lattice site interact independently. It is easy to evaluate the quantum average energy and momentum during the free motion phase, where every particle moves independently; these quantities are unchanged by interactions where classical energy and momentum are conserved. Of course what we do, in practice, is only evaluate relativistic energy due to particle motion, and never explicitly model the particle’s mass energy (due to some unknown internal rest frame dynamics). We instead infer the mass from the particle speed and momentum, which are determined by the spacing between distinct lattice sites and the rate of lattice updates.
In this analysis, the quantum description of the motion of each particle is isomorphic to a 1D classical wave shifting at a constant rate, and so distinct motion is related to bandwidth limits on rate of distinct classical signaling: we are really only using quantum mechanics here as classical Fourier analysis! (The interpretation of amplitudes is very interesting in this classical case.) Using magnitude of average momentum to bound distinctness, rather than using momentum bandwidth, is a direct generalization of the Nyquist rate. In either case, the maximally distinct wavefunctions are distinct shifts of sinc functions, and these are always centered on lattice sites during the site updating steps.
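The “quantum mechanics as classical Fourier analysis” point can be checked numerically: a cyclic shift of a sampled classical signal is exactly a phase multiplication in Fourier space, so it is a unitary operation that preserves the 2-norm. A minimal NumPy sketch (my illustration, not taken from the paper quoted above):

```python
import numpy as np

N = 16
rng = np.random.default_rng(0)
# an arbitrary complex "wavefunction" sampled on N lattice sites
psi = rng.normal(size=N) + 1j * rng.normal(size=N)

# shift-by-one-site implemented as a phase multiplication in Fourier space
k = np.arange(N)
shifted = np.fft.ifft(np.fft.fft(psi) * np.exp(-2j * np.pi * k / N))

assert np.allclose(shifted, np.roll(psi, 1))   # identical to a cyclic shift
assert np.isclose(np.linalg.norm(shifted), np.linalg.norm(psi))  # unitary
```

The shift operator is diagonalized by the DFT, with unit-modulus eigenvalues, which is the discrete version of constant-rate wave translation being a norm-preserving evolution.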
Comment #697 February 3rd, 2022 at 7:06 am
Scott #668
Actually, a great many quantum phenomena, including the ones you mention, can be explained using this as a starting point. See, for example, the following (and references within):
https://arxiv.org/abs/2004.08749 “Emergence of the Born rule in quantum optics”
https://arxiv.org/abs/2008.04364 “Entanglement and impropriety”
https://arxiv.org/abs/2101.03371 “Classical model of a delayed-choice quantum eraser”
Comment #698 February 3rd, 2022 at 7:07 am
Clinton #374: (Because nobody else is replying to this evidence)
Does a self-referential reply prove this thread incomplete … or that we can’t determine if it will ever halt 🙂
…
I’m a computer engineer – a practical, simple-minded person.
If I want to be able to introduce information into a computer then I do that by creating a bit at a particular physical location. All information in the computer, no matter how complex the computer, is fundamentally just bits at physical locations. Although it is a bit ridiculous to say … it is still true that you can imagine yourself going down to a physical location on the device and writing down either a “1” or a “0” which is then fed to the logic. All input for the computer has this form. We are just writing down numbers when we give input to the device.
For years I heard colleagues or the popular press say that the brain was “kind of like a classical computer” because it had these things called “spikes” that would “fire or not fire” and so the brain was a “binary-like” classical computer. I accepted this as being sensible. “Brain spikes” sounded to me kind of like on/off or “1” and “0” and even a simple-minded engineer can understand that.
Then I happened across Christof Koch’s The Biophysics of Computation and picked it up, because, well, it sounded like he might be about to give me something like a “VLSI Design and Layout” overview of the digital logic of the brain.
I was mistaken. The brain is nothing like a binary model of computation.
The part that disturbed me the most was thinking about it the way I like to think about my doofus computer engineering hardware realization of a bit … you know where I just “write down a number” – for a classical binary computer I’m only allowed to write down either a “1” or a “0”. That “1” or “0” is essentially a measurement of the position of a technological switch. And in a classical computer that measurement can ONLY be on/off. On page 67, Koch discusses what measure neuroscientists typically use to characterize the “synaptic efficiency” or “degree of coupling” between a synapse and the cell body. Again, what I’m looking for is “What number(s) would a computer designer effectively be writing down when they place a synapse on a dendrite?” Because that’s how models of computation work. Well, when they place a synapse at a specific location on a dendrite what the computer designer is encoding is, in general, to directly quote Koch, “a complex number” encoded in the transfer impedance. I mean, that’s the computational design decision you are allowed to make: “Where am I going to put this synapse on this dendrite?” That decision is a decision about encoding a complex number or a different complex number at a different location.
That can’t be right, can it? The fundamental neural input of placing a synapse on a dendrite encodes a complex number? You tell me. Here’s the original neuroscience.
https://www.jneurosci.org/content/jneuro/15/3/1669.full.pdf
Again, I’m thinking in the simplest terms I can as a doofus computer engineer. All I’m asking is “What is the difference between putting this synapse here at this location on the dendrite or a little farther along on the dendrite?” The difference is that by doing that I will be changing the complex number I am encoding – measured as the transfer impedance.
What I most certainly will NOT be doing is placing either a “1” or a “0” on the dendrite.
My first reaction was … If it is not binary then does this mean that the brain must be some kind of “analog computer”? But … I KNOW that analog computation is really NOT a good idea for a lot of reasons.
Koch (p.476) tells me that “despite all of the nonlinear mechanisms known to exist in the nervous system, evolution conspired to produce neurons that act in a perfectly linear manner.”
This means that we have a model of computation over the complex numbers and the operators are linear.
I look around for what else Koch may have to say and find that he says we can compute an inner product …
https://www.nature.com/articles/440611a
This article made me pause … Why is a neuroscientist talking about quantum mechanics? I remember taking some physics in college and something in there about quantum theory, we had to apply some of that stuff in my semiconductor devices courses … But, QT couldn’t have anything to do with the brain! QT is about atoms and such, right?
Neuroscientists describe in many ways how the brain encodes orthogonal states. So, it is discrete. And, in case you missed it, the accumulated evidence above is all we need to possibly have complex Hilbert space: complex numbers, orthogonal states, an inner product.
And then I stumble across Scott’s book and lecture notes QCSD
https://www.scottaaronson.com/democritus/
in which Scott says that quantum theory is “not about matter, or energy, or waves, or particles”. But, wait … Koch wrote the paper above in which he appeared to be assuming that quantum theory must be about the physics of atomic particles but if QM is not about atoms … then what is it about?
Scott: “it’s about information and probabilities and observables, and how they relate to each other”
Now I’m confused. I was taught that QM was about physics.
My undergraduate physics textbook was of no use to resolve my confusion … because it also presented QM as about atomic physics.
I look around and find the canonical source for Quantum Computing, Nielsen and Chuang’s Quantum Computation and Quantum Information, and come away understanding what Scott meant: Quantum theory is a computational theory. Full stop. Physicists were just the first scientific community to pull this particular calculator off the shelf and use it.
That’s … amazing. QT is a computational theory. I’m a computer engineer. My job is to physically realize computation.
Now I’m back to thinking about the fact that what is going on when a synapse gets placed on a dendrite is the encoding of a complex number. I mean you put a synapse right there on that dendrite and you just encoded a complex number. Move it over a little bit and you are encoding a different complex number.
The dendrite is a linear operator acting (like a linear matrix operator according to Koch) on the encoded (positive or negative) amplitudes. By the way, the brain can give you what is called an Excitatory Post Synaptic Potential (EPSP) or an Inhibitory Post Synaptic Potential (IPSP). When you place either on a dendrite then you just programmed a positive or negative complex number into some operator. Koch says we’ve got no problem having an inner product. Koch also describes many different logical operators (things like XOR gates) and mathematical operators (like Fourier transforms and phase operators) that the neurons have no problem encoding in the morphology of the dendrites. In fact, Koch says, it’s like the brain is made for computing linear matrix operators. But, surely, I think, the brain must be like really super clumsy (we would say not fault-tolerant) and not be able to maintain the coherence of the phase relationships between states, right? No, again Koch describes a host of fault-tolerant mechanisms and empirically demonstrated ability of the brain to maintain phase relationships.
And then I see that neuroscientists are suggesting normalization is canonical (https://www.nature.com/articles/nrn3136). So, now you have a computer engineer presented with a device with complex-number-encoded inputs and linear operators, AND on top of that it maintains normalization not just as a sum over the amplitudes of the basis states when present by themselves, but as a superposition over all possible states (Koch, p.23).
And what do neuroscientists think that the brain is producing by running complex number inputs through linear operators and maintaining a normalized condition over the outcomes?
Probabilities
Neuroscientists are telling us that the brain is a prediction machine.
https://ec.europa.eu/research-and-innovation/en/horizon-magazine/theory-predictive-brain-important-evolution-prof-lars-muckli
So … Let me get this straight …
Neuroscientists have evidence that the model of computation in the brain
(1) Encodes all information in the form of amplitudes (complex numbers)
(although I don’t see they appreciate the implications of this)
(2) Has orthogonal representations of states and an inner product (Hilbert space)
(3) Uses linear operators on those states (unitary if they preserve the inner product)
(4) Can maintain phase relationships (coherence)
(5) Uses interference of amplitudes (positive numbers EPSPs and negative numbers IPSPs)
(6) Produces a probabilistic interpretation of reality …
Gee, that all sounds familiar …
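For comparison, here is what the textbook formalism those six features gesture at looks like in code: complex amplitudes, a unitary (inner-product-preserving) operator, and Born-rule probabilities that stay normalized. This is a minimal NumPy sketch of the standard quantum formalism itself, not a claim about neurons:

```python
import numpy as np

# feature (1)/(6): a normalized vector of complex amplitudes
state = np.array([1 / np.sqrt(2), 1j / np.sqrt(2)])
assert np.isclose(np.sum(np.abs(state)**2), 1.0)

# feature (3): a unitary operator (here the Hadamard matrix)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
assert np.allclose(H.conj().T @ H, np.eye(2))   # preserves the inner product

out = H @ state
probs = np.abs(out)**2         # Born rule: probabilities from amplitudes
assert np.isclose(probs.sum(), 1.0)   # normalization survives the evolution
print(probs)   # [0.5 0.5]
```

The real test of whether a physical system implements this formalism, rather than merely resembling it, is whether it exhibits the distinctly quantum consequences (interference in the probabilities, Bell violations), not whether complex numbers appear somewhere in its description.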
How to reconcile that these six features are argued for by Koch in Biophysics of Computation or other neuroscientists in other places … but then Koch says “no quantum brain” in (https://www.nature.com/articles/440611a)? The answer appears to be that someone has convinced Koch that QM is about atomic physics and thus a quantum brain would have to be realized in atomic-scale units of computation (qubits in atomic-scale systems). Also, he appears to think that the main point for anyone wanting to have a quantum brain is that it would somehow magically explain consciousness. I have just two things to say about this (1) There is nothing in the quantum postulates about a requirement to use atomic-scale systems as the computational primitives, and (2) There is nothing in the quantum postulates about consciousness. Full stop.
Now, by the way, having a model of computation doesn’t have anything to do with what the device is programmed to accomplish. For example, just because I have a classical computer on my desk in front of me does NOT mean that this computer is automatically gifted with understanding classical probability theory or classical mechanics or classical logic. Programming is an entirely different problem. And, further, the computational power of any particular physical computer is going to be a function of not just its computational class but arbitrary things about its technology, memory size, gate speed, and architectural configuration. It is easily possible to have what is in theory a powerful class of computation only to be physically realized in something like an Altair 8800. It is possible to have, for example, a quantum computer that is programmed to do nothing but phase estimations or period finding or maybe QFTs …
Is our brain the Altair 8800 of quantum computer models?
Why do we care about this here in this thread?
One explanation for the six features above begins with the assumption that appears to also be behind the original Q in this thread: The universe actually is running a QM model of computation. If a species evolves a computational device selected for most closely modeling its environment, then it should evolve a computational device that most closely matches the computational model of its environment. But, why does the world appear classical? The moment the probabilities appear is exactly when QT becomes classical! It’s known as the third or “collapse” postulate. However, the brain uses the interference of amplitudes in its computation of those probabilities … just like the model says to do.
It seems that another explanation could be that the universe only appears QM because it can only appear to us as QM due to our neural model of computation. It could be as simple as this: The brain writes down all incoming information as amplitudes before it gives it to “you”. Then you claim: “The world sure does look like it’s being computed from nothing but amplitudes …”
However, I would like to believe that even if our brain is strictly contained within an external reality that was running a higher complexity class we would still be able to determine, following the empirical scientific evidence, the true model of that external reality. After all, we can simulate quantum systems on a classical computer – just not efficiently (that we know of). But … I’m not sure. Would we always be able to internally process empirical information that was only computable (or efficiently computable) strictly beyond our internal model of computation? This is why I believe this hypothesis is relevant in this discussion: because if we do have a quantum neural model and if the universe were running a much more highly efficient computational class so that we were never able to process empirical measurement in that higher class, then that would be an answer to “Q: Why should the universe have been quantum mechanical?” Because that was the limit of our own empirical information complexity class. And if neuroscientists are giving us EVIDENCE that this is our model of computation and if the computational complexity theorists (Scott) can show us that there is something like a complexity wall we run up against … then I think we might have found something like an answer to the Q in this thread.
If nothing else, the six facts above require some explanation. If they are not evidence of a computational device “trying to look like a quantum model” then … what would be?
It should be pointed out that the six items above are actual observable features of computational devices that you can be shown with a microscope or intracellular microelectrode. The neuroscientists didn’t just make up the fact that the transfer impedance at a synaptic site encodes a complex number. If someone wants to call this comment “crackpot” then I’ve got a few requests: Show me where in the photon we can write down in its hardware the complex numbers for states like I can by placing a synapse at a particular dendritic site? Show me the operator architecture in the electron like I can see it in the dendritic arborizations. Show me the normalization across nuclear spin states like we can measure it across the output of the neural operators. Remember, I’m a computer engineer, so show me the programmable hardware – not your mathematical scratchpad off to the side of the actual device.
And, so, while we are asking “Why should the universe have been quantum-mechanical?” I think we need to answer why the model of computation between our ears generating a probabilistic representation of reality looks an awful lot like the probabilistic model of computation we claim is running reality.
Comment #699 February 3rd, 2022 at 8:18 am
Baruch Garcia #680
The problem with invoking Goedel or Turing in discussions of mathematical physics is that their work appears in the context of specific paradigms. The so-called “crisis in geometry” led to the use of analytical methods whereby one can speak of instances of structures that cannot be constructed. But, extending this to “the arithmetization of mathematics” is an entirely different matter. Incompleteness has motivated a great deal of research. Dan Willard’s work on self-verifying theories shows that incompleteness is a phenomenon attached to multiplication as a total function. Arithmetical systems built upon difference and divisibility do not fall prey to the incompleteness phenomenon, provided they do not inadvertently introduce a total multiplication. Of course, the fragments of arithmetic used in such studies are contrived for the purpose of these very investigations.
Another aspect of Goedel’s work is that it effectively compares two infinite systems. Without intending to diminish Goedel’s work in any way, Aristotle recognized that both grammar and numbers shared structures which could be described in terms of prior and posterior elements. Goedel’s genius lay in recognizing a significant application of this situation. Goedel’s work had been directed primarily at Hilbert’s metamathematical approach to proof theory. The importance of recognizing that this is about metamathematics is that “the sensible impressions made by symbols on a page” are not the algebraic and analytical objects of ordinary mathematics. Nor, for that matter, are formal languages. The only explicit comparison of formal language study and ordinary mathematical language use I have ever found is in Section 74 of Kleene’s “Introduction to Metamathematics.” If one were to be studying ordinary mathematical practice, one would be studying a sequence of formal systems S_1, S_2, S_3, … This is not what Goedel’s work is addressing. Moreover, Tarski has a footnote in his paper introducing his T-schema which explicitly explains that he is not considering languages built up from definitions as in ordinary mathematics.
The distinction with respect to metamathematics is also important when it comes to Turing’s work as well. For many people, the introduction of completed infinities into mathematics is an abomination. Turing’s work is specifically motivated by the fact that a human being doing calculations on a sheet of paper is only using finitely many symbols at any given stage of a calculation. While the halting problem is incredibly interesting, what is perhaps more significant is that it forces a change to how one has to apply bivalent logic. This is made explicit in Markov’s “Theory of Algorithms” where he introduces strengthened implication based upon “givenness” of a single well-formed string rather than a definition of “meaningfulness” attached to the total collection of well-formed formulas obtained through a structural recursion over an alphabet. The latter construct is that used in first-order logic to avoid semantic paradoxes. That is the genius of Tarski’s work. But, because structural recursion presupposes completed infinity, recursion theory is used to verify the well-formedness of any given formula.
Although the primitive recursive arithmetic used by Goedel is quantifier-free, it is platonistic. By contrast, Skolem’s paper on arithmetic described a system built up through the introduction of symbols. This is much closer to bounded arithmetic. Arithmetic truths become truths as relations become populated. The change in logic associated with Skolem’s work lies with his use of “descriptive functions” whereby succession is understood through the use of definite descriptions. Logics which address descriptions in a meaningful way are called free logics. Because the standard account of recursive function theory requires an additional notion of “equality” by which defined symbols can be distinguished from undefined symbols, it is comparable to free logics. Typically all undefined symbols are identified as “the same” in recursion theory. This would correspond to the principle of indiscernibility of non-existents from negative free logic.
Truth, especially in mathematics, is not an easy thing. And, there are plenty of philosophers who will condemn any theory of truth that is not realist.
It is not enough to simply assume that encodings of syntax using ‘0’ and ‘1’ have any relationship to the transcendental logarithm function used to define bits in information theory. And, since metamathematics is fundamentally grounded on the “sensible impression of symbols on a page,” confusing it with the mathematics of analytic functions is a backdoor for the very problem that led to mathematicians using analytical methods in the first place.
The real problem with relying too heavily on mathematics is that the use of logic makes the notion of truth deflationary. To a very real extent, logic words now decorate geometries. So, the problem of which geometry becomes the problem of which logic.
Experiments in science are supposed to help with that.
Comment #700 February 3rd, 2022 at 9:53 am
Andrew Matas #689:
I think refusing to engage, *is* a form of engaging, in this case. I have no problem with asking the question and seeing what it leads to. But I think it’s also valid for people to point out that there are obstacles to this line of thinking leading to an interesting answer, and I don’t think it’s obvious from what you’ve written that these obstacles can be overcome. For example: how do you define a “success metric” on the space of theories that lead to “rich, complex behavior… or something along those lines”? … If you don’t like pessimists, that’s fine, but I don’t think comments making these objections are invalid or necessarily failing to engage.
See, your comment is totally, 100% fine, because you understand what I’m asking and you’re simply pessimistic about our ability to answer it (and you might be right!).
I got annoyed by the many comments that didn’t understand—or affected not to understand—the whole concept of explaining a physical theory as the inevitable consequence of more fundamental principles, as if we didn’t already have successful examples like Einstein’s derivation of the Lorentz transformations. As if I’d slap myself on the forehead and say: “Oh! The deep explanation for why QM is true is that people did some experiments in the early 20th century and they discovered that it’s true! Now why didn’t I think of that?” 🙂
Let’s assume the world is just described by fluid mechanics, and we take Terry Tao’s ideas that fluid mechanics can describe a Turing machine (https://arxiv.org/abs/1402.0290). That’s my concrete proposal for a purely classical model of the world that does have complex behavior. Now by what measure do you propose to tell me quantum mechanics is superior to fluid dynamics?
That’s a superb question, but possibly one that we can answer right now! If you look at Tao’s constructions of Turing machines out of dissipationless fluids, they’re incredibly finely engineered. In line with what Norm Margolus was saying in #649, it seems extremely plausible that, if you just give your fluid “generic” initial conditions, you’ll get lots of chaotic, turbulent behavior but never any complicated stable structures evolving. Or maybe not! Either way, it’s a great research direction on which actual progress seems possible.
Comment #701 February 3rd, 2022 at 10:07 am
Philippe Grangier #692:
The goal of any such construction is to have the inductive step as short and compelling as possible, but it’s not possible to remove it; otherwise it would not be physics, but mathematics, where hypotheses are the pure choice of the mathematician, without requiring any fit with any data. But maybe I’m repeating again and again the very idea that Scott does not want to hear…
At risk of repeating again and again what you don’t want to hear 🙂 … the historical route by which physical principles were first arrived at, is often wildly different from the route by which they’re best derived or justified after the fact. Examples abound, from thermodynamics to the Lorentz transformations to quantum field theory.
After the fact, then, it seems possible that physicists could’ve said: “if only we’d assumed the Planck black-body spectrum, plus these one or two other physical postulates that would’ve seemed perfectly reasonable and motivated to us at the time, we could’ve derived the entire structure of QM right away, and saved ourselves 20 years of struggle and effort!”
Anyway, whether that’s true or false can’t be decided via a-priori philosophizing … which is exactly why I was asking it! 🙂
Comment #702 February 3rd, 2022 at 10:12 am
Clinton #698: I’ll confess to being less impressed than you are by all your “evidence” that the brain implements QM. Linear algebra and complex numbers are both pretty ubiquitous in science and engineering, for reasons having nothing to do with QM (e.g., for complex numbers, the fact that \(e^{i\theta}=\cos\theta+i\sin\theta\) concisely summarizes all the trig identities…).
In general, whenever someone points to some apparently classical thing and claims that it’s “just like QM,” my immediate questions include:
– Can the thing violate a Bell inequality?
– Is measurement of the thing an inherently destructive operation?
– Does the thing need \(2^n\) parameters to describe a system with n components?
– Can the thing factor integers in polynomial time?
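The first item on this checklist is mechanically checkable on the classical side: enumerating every deterministic local hidden-variable strategy for the CHSH game shows the correlation value is capped at 2, whereas quantum mechanics attains \(2\sqrt{2}\) (Tsirelson’s bound). A small Python sketch of that enumeration (my illustration of the standard argument):

```python
from itertools import product

# A deterministic local strategy: Alice outputs a0 or a1 (each ±1)
# depending on her setting x ∈ {0,1}; likewise Bob outputs b0 or b1.
# CHSH value: S = a0*b0 + a0*b1 + a1*b0 - a1*b1.
best = max(a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
           for a0, a1, b0, b1 in product([-1, 1], repeat=4))
print(best)   # 2: the local hidden-variable (classical) CHSH bound
# quantum mechanics reaches 2*sqrt(2) ≈ 2.828 with entangled qubits
```

Shared classical randomness doesn’t help, since a randomized strategy is a mixture of deterministic ones, so any apparently classical system claiming to be “just like QM” must fail this test.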
Comment #703 February 3rd, 2022 at 10:54 am
mls #699
What a great comment! I learned a lot! In addition to Kleene’s section 74, read Quine’s chapter 6, which deals with reals. (also forgive my missing umlaut over Godel’s name!)
What you said is absolutely right. Incompleteness only applies to certain paradigms, there are plenty of decidable systems. At first glance, it seems misguided to try to connect anything in the set of undecidable problems to physics that uses real numbers. Not only that, but physical theories are not formal systems.
My point is that when you apply a formal language to itself, you get epistemological limits. For example, self-applying a Turing machine to find a generalized algorithm for *any* semantic property (Rice’s theorem), or self-applying a Zeno machine, or self-applying the syntax of first-order logic (as Quine and Smullyan have shown), or even self-applying an oracle to itself, leads to epistemological limits.
You may already know this, but just to make sure we are on the same page: Self-application has a specific technical meaning, which involves the use of quotation symbols (as Godel numbering does) to distinguish between the name and the object. This becomes important in defining what an effective procedure is and the T-schema. (More on the t-schema in another comment; this argument does not depend on Tarskian semantics).
The point is that *self-application* leads to epistemological limits, and quantum theory seems to respect these limits: for example, the distinction between von Neumann’s Process 1 (non-unitary measurement) and Process 2 (unitary evolution), entanglement (the first measured particle corresponds to an oracle), down to seemingly uninteresting properties like measuring a quantum system in an eigenstate of a chosen observable (a trivial semantic property).
In short, I am only looking at the category of proofs that uses self-application in this specific way. Other proofs, like the decidability of first-order reals, lie outside of this category. The main idea is self-reference (in the Godel-Tarski-Turing way which employs quotation symbols distinguishing the name from the object).
Does that clear up anything mls? Thanks!
Comment #704 February 3rd, 2022 at 11:00 am
Scott #702:
Not to argue for or against Clinton, whose positions don’t really pertain to questions I find interesting, but rather to address the questions you pose:
Suppose you want the position, within a centimeter, of a wave that is a meter across; not the center, the position. What kind of transformations do you need to perform on that wave to fit it into a centimeter, such that the question can even be answered? Is the position a “hidden variable” of the system? If all the transformations were performed using waves, would an observer embedded in those waves observe Bell inequality like behavior, even if the waves are “running on” classical particles? (Do the particles count as the universe, or do the waves, in that case?)
The second two questions are far too overdetermined; how we parameterize a system is not identical to the minimum number of possible parameterizations (see fractals; the apparent complexity derived from an observation of a system is not equivalent to the complexity of that system), and the ability to factor numbers took us a long time to figure out for QM, so it’s not exactly a meaningful question.
Comment #705 February 3rd, 2022 at 11:16 am
Clinton #698
Anesthesiologists tend toward a quantum mechanical basis of consciousness, I guess primarily because general anesthetics remove consciousness but leave other brain functions intact. The attached letter to the journal Anesthesiology is interesting and references new research suggesting that the isotope of Xenon with spin 1/2 is a materially worse general anesthetic than the majority of Xenon isotopes, which have spin 0.
https://pubs.asahq.org/anesthesiology/article/129/2/228/17988/Anesthetic-Action-and-Quantum-Consciousness-A
You can find papers easily that set out the widely accepted model of general anesthetic action through London forces interacting with proteins in non polar fluid filled pockets in the brain.
Comment #706 February 3rd, 2022 at 11:18 am
Sahil #684: That’s a fascinating puzzle, thanks for sharing it here. Unfortunately your argument does not work. The Classical Probabiliverse solves it the same way Many-Worlds does it, by having the total measure of worlds be conserved upon branching and merging.
Your requirement of having a well-behaved subjective probability only rules out a Permutation City kind of universe, where branching is literally creating another copy of the simulation. Now this kind of universe has been studied and ruled out by several people, including me (in section 5, where I call it Kent’s universe) and more recently Short (who calls it discrete many-worlds).
Comment #707 February 3rd, 2022 at 12:22 pm
Scott #687:
Norman Margolus #679: In that case, I come back to the very same questions that I remember having when we discussed this a decade ago. Is there a least-action principle in Conway’s Game of Life? If so, what is it? If not, what is the set of classical cellular automata that DO satisfy a least-action principle? I could see them needing to be reversible; are there any other conditions?
It is certainly sufficient that the CA follows a continuous classical mechanical dynamics that is derivable from a Lagrangian, with only the initial state constrained so that we get lattice behavior at integer times. An example is shown in Figure 14 of the counting paper. Some CA field models obey exact least-action principles, such as the one of Figure 4 of my finite-state classical mechanics paper. There are very good reasons to believe that reversibility is a requirement in general, as you suggest, which rules out Game of Life. If we want to interpret least-action as locally minimizing distinct motion while conserving energy, we are restricted to models where a conserved energy is defined.
Comment #708 February 3rd, 2022 at 12:39 pm
I’ve thought about this for decades; I respond a few times to queries like this, then give up.
This is two questions that will only be resolved “by the eventual death of the current proponent factions”.
First, why QM? or, better, why “quantum theory” (i.e. something like the Standard Model).
It all comes down to this: without quantum field theory you cannot make a theory with a positive norm that does not “collapse” (in the sense of a classical “planetary” atom collapsing to a point in the presence of spontaneous loss of energy to a classical gauge field … this applies both to QED and QCD). Alternatively, trying to apply Grassmann variables to a classical theory results in negative norms. This is well-known.
Second, what about the Born rule of QM and unitarity (and positive norms)? I started thinking about this back in 1971 when I first applied for a grant to study “intramolecular vibrational relaxation” of “isolated molecules”. Some people claimed that “intramolecular relaxation” could not occur in isolated molecules. This got me thinking about how the experiments actually worked. It turns out that once one describes the actual experiment, it “just all works” by considering unitary evolution of the molecule and photons by QED.
Eventually everybody agreed.
The answer is trivial: the Standard Model just works and is the entire story (i.e. quantum field theory, a unitary theory) (absent gravity and a non-zero Cosmological Constant). You just have to include the whole local environment, encompassing the entire apparatus of production and measurement, a macroscopic thing.
What people still attack me on is my simple description (which is correct and unambiguous) of how measurements work: it’s the de Broglie-Bohm formalism with the word “particles” replaced by “lines of constant flux”. Note of course this description requires the whole, entangled, wave function of the whole “apparatus”. It’s non-local, of course, as required by the Bell stuff.
What about wave function collapse? Wave functions of a “test particle” have extent. They DO eventually actually collapse, but this takes time. Remember that the wave function of the “test particle” is exceedingly entangled with the apparatus … this never goes away. The probabilities of the macroscopic apparatus eventually down-weight to vanishing the part of the wave function of the “test particle” that has not “collapsed”.
Am I saying that the measured outcome depends on the (unknown and unknowable, or “actual”) state of the whole world? Answer is one word: yes.
Also … you will note I’m not writing out any equations. Many people arguing these questions seem to demand that. It’s unnecessary … the very words “standard model” (including non-perturbative formalism), “unitary” and “no gravity and thus no Planck Scale” say it all.
You’re welcome to attack me. My comment about Everettism is that it can be forced into similar conclusions by renamings.
Comment #709 February 3rd, 2022 at 1:46 pm
Norman Margolus #707:
It is certainly sufficient that the CA follows a continuous classical mechanical dynamics that is derivable from a Lagrangian…
Aha! That strikes me as an absolutely crucial concession. Of course if the dynamics are derivable from a Lagrangian, then they’ll satisfy a least-action principle—the whole Lagrangian formalism was basically constructed around the least-action principle! But unless I’m mistaken, most classical dynamics that one could write down, even reversible ones, won’t be derivable from a Lagrangian.
So the question stands: why should the classical dynamics that were thought to describe our world, until the early 20th century, be derivable from a Lagrangian?
I think I know a pretty good answer to that question: namely, because those classical dynamics ultimately come from QM, which predicts that all the paths for which the action isn’t stationary will undergo destructive interference and cancel each other out. The question stands of whether you have an answer that doesn’t reference QM.
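That destructive-interference claim is easy to see numerically. A toy sketch (one “path” coordinate x with a made-up action S(x) = (x−1)² and ħ = 0.01, purely for illustration): contributions to Σ e^{iS/ħ} far from the stationary point of S nearly cancel, while those near it add coherently.

```python
import numpy as np

hbar = 0.01
S = lambda x: (x - 1.0) ** 2   # toy action, stationary at x = 1

def contribution(lo, hi, dx=1e-4):
    """Magnitude of the phase sum  |sum e^{iS(x)/hbar} dx|  over [lo, hi]."""
    x = np.arange(lo, hi, dx)
    return abs(np.sum(np.exp(1j * S(x) / hbar)) * dx)

near = contribution(0.95, 1.05)  # window around the stationary point
far = contribution(2.0, 4.0)     # rapidly oscillating phases cancel
print(near, far)  # far << near, even though the far interval is 20x longer
```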
Comment #710 February 3rd, 2022 at 1:50 pm
Crackpot #704:
the ability to factor numbers took us a long time to figure out for QM, so it’s not exactly a meaningful question.
I don’t see why not. What I’m really asking here is just whether the system is BQP-complete—that is, whether I can take an arbitrary quantum circuit acting on qubits, and map it onto the system in question. Since Shor’s algorithm is already known, the ability to use a system to factor follows immediately once its BQP-completeness is established.
But in my experience, the majority of people who point to classical systems that are “just like QM” don’t even get as far as asking the question of whether those systems are BQP-complete. They’re content to point to some particular occurrence of interference, or complex phases, or anything else that vaguely reminds them of QM.
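For concreteness, the classical scaffolding around Shor’s algorithm is just a few lines; only the order-finding step needs a quantum computer. A sketch, with brute-force order finding standing in for the quantum subroutine:

```python
import math

def order(a, N):
    """Smallest r > 0 with a^r ≡ 1 (mod N) — the step a quantum
    computer does in polynomial time; done here by brute force."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(a, N):
    """Given the order r of a mod N (with r even and a^{r/2} ≠ -1 mod N),
    recover nontrivial factors of N via two gcd computations."""
    r = order(a, N)
    assert r % 2 == 0
    y = pow(a, r // 2, N)
    return math.gcd(y - 1, N), math.gcd(y + 1, N)

print(shor_classical_part(7, 15))  # (3, 5)
```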
Comment #711 February 3rd, 2022 at 1:54 pm
OhMyGoodness #705
I’m sure that paper would be more interesting to me if I were an anesthesiologist 🙂
The first thing I would say about this is … what does QM have to do with consciousness? I don’t even know what consciousness is. Most of the time I only feel maybe partially aware of what’s going on anyway. My brain is definitely a very low complexity version of whatever class of computation it is. Maybe on my best days I would be something like a O(log log log n) quantum model 😉
As a computer engineer the question of consciousness has nothing to do with the postulates of quantum mechanics. The postulates are just (roughly) complex Hilbert state space, unitary operators, projection operators, tensor products. So, if somebody brings me a candidate computational device and it looks like there’s pretty decent evidence that some of those requirements could be met then I’ll give the evidence a chance at least.
Basically I don’t know what consciousness even is. And the postulates say I don’t need to know so I’m not worried about it because I don’t see how it bears on the argument I’m making which is just an argument about hardware. If I had to speculate I would maybe say that consciousness is just some kind of algorithm or routine that is programmed in the neural model. But whatever happens to be programmed into a computer doesn’t have bearing on its class of computation. In other words, delete that program and I think we’ve still got the same model of computation.
Another common argument against this idea is decoherence. But, again, nothing in the postulates is telling me I have to shut the door to candidate computational devices based upon some physical scale requirement. The postulates just tell me what the mathematical requirements are that the device has to physically realize. So, this is a straw-man argument (to me anyway) because nowhere in this is it claimed that the relevant computational units are atomic-scale devices. Again, as a computer engineer I am very open to the idea that there may be parts of the design landscape for computational architectures or devices that we are not fully aware of. We should not be surprised to be surprised … which is kind of the fundamental scientific attitude.
This idea seems to get something of a strong response. Why? This is just a probabilistic model of computation over the complex numbers. It wouldn’t make someone some kind of supercomputer. It’s not going to mean we can solve NP-hard problems in polynomial time. It’s not going to mean our brains are stretched out over the multiverse or any other kind of nonsense. In fact, what I understand is that for most purposes classical computing will still be what we go to even if we do get scalable, fault-tolerant quantum computers built. QCs, I understand, are just better at limited, select things where phase and periodicity can be leveraged. But, I don’t know, maybe dealing with phase and period-finding problems was important in the early evolution of vertebrate life on Earth? And, it would be very easy for us to put together “trivial” quantum computing devices. But … like I said … my brain feels pretty trivial most of the time.
But to the point of this thread there seems to be some complexity/computability loophole that we need to maybe think about a little bit. I could be wrong about that, of course, and would love to hear how. I would like to think that even if we did have some form of this model of computation for our brain’s operating system that it would not fundamentally limit the possible progress of answering questions like Scott’s which I believe are extremely interesting and valuable questions. But I just don’t know how to close this (potential) loophole. But again, with my brain you’re dealing with a quantum Altair 8800 at best 😉
Comment #712 February 3rd, 2022 at 1:56 pm
Scott#701
« the historical route by which physical principles were first arrived at, is often wildly different from the route by which they’re best derived or justified after the fact ». Sure, I agree on that, and the goal of current QM reconstructions is to find the best possible route, since by now we know much more than 100 years ago.
« if only we’d assumed the Planck black-body spectrum, plus these one or two other physical postulates that would’ve seemed perfectly reasonable and motivated to us at the time, we could’ve derived the entire structure of QM right away, and saved ourselves 20 years of struggle and effort! » Here I disagree, because nature provides you with surprises that can hardly be expected, and therefore science does not work this way. Just to take a personal example, our work https://www.nature.com/articles/35082512/ which is at the basis of current efficient quantum simulations with neutral atoms came (almost) as a surprise, that we could catch on the fly, this is the point. Then many developments and subtleties followed, this took 20 years, though they might have seemed ‘reasonable and motivated to us at the time’.
Good luck for the essay !
Comment #713 February 3rd, 2022 at 2:08 pm
Philippe Grangier #712: Oh, I’m well-aware that it doesn’t work that way in practice (except when it does, as with Einstein and GR, or Dirac and the positron…).
The question is whether it doesn’t work that way
(a) because it can’t, even in principle, or
(b) because no humans around happen to be smart enough.
Often, it seems to me, we get evidence after-the-fact that the truth was actually (b), when someone shows how theory X could have been mathematically deduced from the data that was available much earlier plus minimal reasonable assumptions. Even in those cases, it’s hardly surprising that the first derivation of theory X would make use of much more empirical data than was strictly needed! After all, the original discoverer was optimizing for being the first to figure out the truth, not at all for minimizing the needed experimental input.
Comment #714 February 3rd, 2022 at 2:23 pm
Scott #709:
So the question stands: why should the classical dynamics that were thought to describe our world, until the early 20th century, be derivable from a Lagrangian?
I think I know a pretty good answer to that question: namely, because those classical dynamics ultimately come from QM, which predicts that all the paths for which the action isn’t stationary will undergo destructive interference and cancel each other out. The question stands of whether you have an answer that doesn’t reference QM.
Having a relativistic dynamics that maximizes the number of distinct state changes in rest frames at each scale is something that favors as much interesting hierarchy as possible. That is precisely my counting interpretation of the least-action principle.
Comment #715 February 3rd, 2022 at 2:37 pm
Scott #702, “In general, whenever someone points to some apparently classical thing and claims that it’s “just like QM,” my immediate questions include:
– Can the thing violate a Bell inequality?
– Is measurement of the thing an inherently destructive operation?
– Does the thing need 2^n parameters to describe a system with n components?
– Can the thing factor integers in polynomial time?”
In mathematical terms, I take this to ask that the mathematical structure of “some apparently classical thing” must be isomorphic to the mathematical structure of a QM thing. If the mathematical structures are isomorphic, we won’t as much have to prove every single “but what about?”, although variant interpretations of the mathematics may for some questions introduce nontrivial issues.
In a specific sense, there is a noisy classical Klein-Gordon free field that is isomorphic to a quantized complex Klein-Gordon field and there is a noisy classical free field that is isomorphic to the quantized electromagnetic free field (see “Classical states, quantum field measurement”, Physica Scripta 2019, https://doi.org/10.1088/1402-4896/ab0c53. Sorry, that’s a link, but the idea is not difficult: we can think of it as just using a nonlocal involution and raising and lowering operators in the QFT to construct QND measurements.) Interacting QFTs in 3+1-dimensions are not well-defined, so we cannot construct isomorphisms for them until we have the renormalization problem better understood.
The construction concerned is not a geometric or other quantization, which is not an isomorphism. The isomorphism is nonlocal and makes QM analytic in a way that the classical theory is not, so this is not a panacea, however the mathematics on the classical side is effectively about CAs on the continuum constructed using elementary QFT methods, so that seems approximately in your wheelhouse. As I say above, if we can get some aspects of “how?” answered, hopefully answers to “why?” might sneak up on us.
Comment #716 February 3rd, 2022 at 2:42 pm
Scott #713: An even more ‘productive’ example: Einstein described the laser concept about one hundred years ago (1917, Zur Quantentheorie der Strahlung), Townes and Schawlow theoretically described 40 years later how it might be done, and Maiman did it. Now imagine a present world where we do not have lasers everywhere…
Comment #717 February 3rd, 2022 at 3:12 pm
Scott #642
We can suspend any talk of undecidability for the time being. This argument in #481 has two steps:
Step 1: Show how the Born rule, complex numbers, unitarity, etc fall into the framework of complementarity (where von Neumann’s Process 1/non-unitary measurement and Process 2/unitary evolution are complementary i.e. particle-wave duality.) I have extended this argument to the continuous nature of Process 2 vs discontinuous nature of Process 1, non-commutativity of operators, structure of density matrices, etc. if we ever do get to that point!
Step 2: Show how the logic of complementarity is described by self-reference.
Step 1 makes no reference to undecidability, just linear algebra and trigonometry! Maybe that’s a better place to start? The arguments are in the first half of #481. No Gödel or Wheeler, just math that is familiar to everyone!
Comment #718 February 3rd, 2022 at 3:31 pm
Scott #710:
“I don’t see why not. What I’m really asking here is just whether the system is BQP-complete—that is, whether I can take an arbitrary quantum circuit acting on qubits, and map it onto the system in question.”
– Suppose, for a moment, you ran a simulation of a quantum universe on a classical Turing machine; for deterministic purposes, we’ll assume it simulates all possible branch events. Yes, it would take an absurd amount of time, a classical Turing machine is incredibly slow at that task, and would require an absurd number of registers. However, from the perspective of somebody inside that quantum universe, time progresses “normally”; they will never notice how slow the system is running. And they could build a quantum computer in that simulation. Can you prove that a Turing machine is BQP-complete?
A subsystem can be BQP-complete even if the system itself is not. When I, and others, talk about wave interference – I think you misunderstand us as saying you can perform quantum computing on those waves. Which, well, maybe that’s what some people are saying; that’s not what I’m saying. I’m saying that those waves can possess quantum behavior from an inside-the-wave perspective.
Consider the bird’s eye perspective of MWI, and how it is in a fundamental sense deterministic, in a way that an inside-the-wavefunction perspective is not; the ability to do quantum computation is dependent on the wave functions splitting, and nothing splits from the global MWI perspective, stuff just interferes. Also, the parameterization of the system is wildly different from the inside-the-wavefunction perspective, and from the outside-the-wavefunction perspective.
Critically, information inside the wavefunction is not necessarily recoverable from outside the wavefunction. (Consider a universe of a continuous medium; any finite extent of the medium can correspond to a wave function of infinite internal complexity, meaning the task of just locating any particular piece of information in that extent may be impossible, and the task of recovering it would itself be destructive.)
Comment #719 February 3rd, 2022 at 4:19 pm
Clinton #711
Your post is contradictory (it seems to me) in that you use many statements about yourself to deny the ability to make statements about yourself. I assume that you are a person and this is not a Turing test.
Considering that you are a computer engineer and interested in hardware design-
Let’s consider that indeed the human brain provides a different model of computation than current digital computers. The highest synchronicity rate is the gamma range of 30-60 hertz, so very low clock speed. However, the human brain has thus far unparalleled ability to create novel things of value. It is not just deterministic software running on squishy hardware but is capable of developing Galois Theory, General Relativity, Quantum Mechanics, understanding the value of sending a giant mirror to L2, etc. Its evolutionary history in a seemingly classical world does allow it to make continuous risk assessments and make decisions and judgements in a complex risk environment. It is capable of considering if it has attributes of something more than a readily available digital computer and the ramifications of whatever the determination. It can make decisions based on quantum random processes, if it so chooses, and hence cannot be reliably computationally simulated by a digital device.
If you accept that the human brain does, in some ways, have advantages over a digital device, then hardware design to capture those differences in a device is a self evident worthy exercise. As a “hardware” designer you might pursue growing brain tissue, or upgrading existing brains in various ways, or by other means.
If this is a Turing Test, or you persist that you are just equivalent to an Altair 8800, then certainly enhanced performance of what the brain does well holds no interest for you. 🙂
Comment #720 February 3rd, 2022 at 4:22 pm
Baruch Garcia #703
I am familiar with how the treatment of self-reference with these historical approaches leads to epistemic limitations. I wrote comment #660 specifically with respect to epistemic limitation. In that comment I purposely attempted to make Dr. Aaronson aware of “differential ontology” as distinguished from “objectual ontology.” After all, he seemed to like the word “haecceity” so much, I thought a little more vocabulary might also interest him.
The link I provided for its explanation is
https://iep.utm.edu/diff-ont/
But if one prefers long drawn out deliberation which begins by denying any responsibility to connect with traditional developments of received views, the link,
https://www.intechopen.com/chapters/58822
also shows up on a search.
The expression “objectual ontology” as a contrastive is my own arising from reading “Understanding Identity Statements” by Morris. He contrasts Frege’s view and Wittgenstein’s view as “metalinguistic” and “objectual,” respectively.
The first of the above links associates objectual ontology with transcendence and differential ontology with immanence. Relative to modern usage, I read immanence as being comparable with emergence. The self-reference you are describing is in relation to transcendence.
Early authors certainly had the opportunity to study projective geometry from a foundational perspective. When Russell had been more Kantian, he suggested that projective geometry be studied as Kantian externality. Although I have no specific reference, I believe that the somewhat circular nature of projective geometry associated with its duals had been found objectionable by his empiricist contemporaries. Poincaré’s conventionalism seemed to have taken hold, and Russell looked elsewhere.
The correspondence theory of truth is very natural. A report from one linguistically competent person to another involving witnessable urelements is a falsehood if correspondence is not maintained. Tarski’s use of the liar paradox to demonstrate the materiality of his semantic conception grounds what philosophers of science would call bridge laws between paradigms. Appearances of circularity, as had probably been how Russell’s contemporaries viewed projective geometry, would have, and continue to be, challenged vehemently.
That we now have an entire field of categorial logic grounded in the duality of category theory speaks to the fact that mathematics has evolved from the priorities of nineteenth century philosophy.
I wish to be clear about this because self-application in Church’s lambda calculus involves a notion of function which need not be extensional in the comprehensionalist sense of logicism.
Let me give you a sense of how I see the difference.
Take the expression,
XOR( OR, NAND ) = NXOR
as an axiomatic stipulation between “logic word inscriptions.” When I speak of logic leading to deflationary truth, I mean precisely of how analytic philosophers have attributed “necessity” to stipulated rules. Conveniently, they deny meaning to the words of mathematicians, but retain the meanings for their logical connectives.
To interpret the stipulation above, write out fixed columnar representations for the exclusive disjunction, the inclusive disjunction, the denied conjunction, and the biconditional. The fixed representation is essential because it fixes the read order for rows spanning the first two columns and the row order for the pairs in each of those rows.
Treat the rows as vector components so that it makes sense to speak of componentwise evaluation. Write down a truth table without a third column using the format of the fixed representation.
Now, take the first component from the third column of OR and the first component from the third column of NAND and evaluate them as arguments to XOR. Place the result into the first row of the third column of the (currently empty) resultant table. Call this the componentwise evaluation of the first row.
Now perform componentwise evaluations for the remaining three rows. Unless I am mistaken, your resultant will match the truth table for your biconditional.
Church’s early account of function application involved functions of one argument. The self-application occurring here is only manifest when the axiomatic stipulation takes its own denoting symbol as an argument. But,
XOR( XOR, XOR ) = NTRU
certainly makes sense.
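If I’ve followed the construction correctly, it can be sketched in a few lines (a Python illustration, with each connective represented by its truth-table column over the fixed row order (T,T), (T,F), (F,T), (F,F); the names are the comment’s own inscriptions):

```python
# Each connective is its truth-table column over rows (T,T),(T,F),(F,T),(F,F).
OR   = (1, 1, 1, 0)
NAND = (0, 1, 1, 1)
XOR  = (0, 1, 1, 0)
NXOR = (1, 0, 0, 1)   # the biconditional
NTRU = (0, 0, 0, 0)   # the constant-false column

def apply_componentwise(op, f, g):
    """Evaluate op(f, g) row by row on the two argument columns."""
    # op's column read back as a function: (T,T)->op[0], ..., (F,F)->op[3]
    table = {(1, 1): op[0], (1, 0): op[1], (0, 1): op[2], (0, 0): op[3]}
    return tuple(table[(a, b)] for a, b in zip(f, g))

print(apply_componentwise(XOR, OR, NAND))   # (1, 0, 0, 1): matches NXOR
print(apply_componentwise(XOR, XOR, XOR))   # (0, 0, 0, 0): the self-application gives NTRU
```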
It is what Skolem calls “the recursive mode of thought” which forces mathematicians to violate the dichotomy between syntax and semantics which justifies the Quinean quote conventions.
In turn, consider the assertion, “mathematics is extensional.” That all looks good when talking about extensionality grounding the identification of two function specifications with a graph. But, as an assertion about *all* of mathematics, one must grant credence to the materiality of Fregean truth objects. The very fact that modern metamathematics speaks of “truth values” rather than truth objects indicates the same evolution from the priorities of nineteenth century philosophers as does category theory.
(The problem, of course, is that the continuum hypothesis is buried deep within all of the associated folklore.)
It is certainly true that I am not intimately acquainted with the self-referencing systems related to diagonal arguments. But, I also do not have any evidence that the abstract objects with which I personally navigate my environment have objective materiality. In so far as objectual ontology leads to transcendental justifications and in so far as such justifications ground indispensability arguments for the objective existence of mathematical objects, proponents of correspondence theories of truth for mathematics appear as if they are relying upon revealed truth rather than reason.
What I have described above yields 4096 stipulations which can be collated into Cayley tables. Each such Cayley table contains a subtable indexed with the inscriptions ‘TRU’ and ‘NTRU’ that corresponds to the usual truth table semantics. Also, I am well aware that these manipulations are metamathematical. It is common for foundational authors to describe metamathematics as “using mathematics to study mathematics.” There is a specific definition for relating “differences” to a vector space in order to obtain an affine space. This requires a Cartesian product. The XOR table obtained using these methods satisfies the relation on a Cartesian product found in that definition. Consequently, the vectors of GF(2)^4 may be attached to these logic word inscriptions.
You and I both recognize the implications of self-reference with regard to epistemology. The method you pursue implicitly gives credence to the folklore of the arithmetization of mathematics. Being somewhat slow-witted, I worked on much of this prior to the rise of the Internet and completely missed the fact that philosophers simply declared geometry to no longer be a meaningful part of mathematics.
My bad.
If you would like to see this monstrosity of syntax,
wfbg-at-xemaps-dot-com
I am an autodidact. So nothing I do will ever see publication.
If you are interested in these self-reference issues, one of the better papers I have read was by Haim Gaifman. I just cannot remember the title.
Comment #721 February 3rd, 2022 at 4:25 pm
Clinton #698:
“All information in the computer, no matter how complex the computer, is fundamentally just bits at physical locations.”
Actually, a computer merely symbolically represents information: in a computer, a number is symbolically represented by an array of higher and lower voltages, but a real-world number (e.g. a number associated with mass or position) clearly does not exist as an array of higher and lower voltages. A symbolic representation is not the same as the actual thing.
The thing about computers is that they can symbolically represent the equivalent of consciousness and agency, whereas mathematics alone can’t do that.
Comment #722 February 3rd, 2022 at 4:33 pm
Q1: Why didn’t God just make the universe classical and be done with it?
According to L. Susskind’s book The Theoretical Minimum: Classical Mechanics,
classical dynamical laws must be deterministic and reversible (each state has only one predecessor and one successor). This corresponds to the conservation of information.
What would’ve been wrong with that choice?
Wouldn’t that be super boring if the amount of information stays constant?
I don’t know how much effort it takes to create a universe, but I assume it’s so much that you would not want to waste it on a boring universe…
(If information gain in the sense of increasing Shannon entropy is a desirable property for an interesting universe, the QM mechanism of random selection from possibilities seems helpful too. And of course with the feature of destructive interference, for reasons given in #709 🙂 )
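Susskind's criterion can be made concrete: a deterministic, reversible law on a finite state space is just a permutation of the states, which is why information is conserved. A toy Python sketch (illustrative only; the state space and update map are mine):

```python
import random

# A toy "classical dynamical law" on a finite state space: a random
# permutation, so each state gets exactly one successor.
states = list(range(8))
rng = random.Random(0)
step = dict(zip(states, rng.sample(states, len(states))))

# Reversibility: inverting the map shows each state also has exactly
# one predecessor.
predecessors = {v: k for k, v in step.items()}
assert len(predecessors) == len(states)

# Information is conserved: stepping forward then backward recovers
# the original state.
for s in states:
    assert predecessors[step[s]] == s
```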
Comment #723 February 3rd, 2022 at 5:11 pm
Since the thread is still open, here’s my response to Scott #591.
In order to answer this, I can’t avoid bringing QBism into the discussion, because my personal approach to `why the quantum’ is a kind of QBism-phenomenology hybrid. Your point is similar to an objection that people often make against QBism, namely: if quantum states are just in the minds of agents, then how can the QBist explain the existence of things that were around before agents? The logic runs something like this:
(1) If something physically exists, then it must exist in some `state’ (eg a classical distribution on phase space or a quantum state);
(2) QBism says that “quantum states” (and probabilities) are epistemic and subjective: they are beliefs of rational decision-making agents;
(3) There were no agents in the distant past, eg when the Earth was forming, or when dinosaurs roamed the Earth (assuming none of them were smart enough to count as agents);
(4) Therefore by (2)&(3) there were no quantum states in the distant past;
(5) Therefore by (1)&(4) nothing existed in the distant past before agents.
QBism accepts (2-4), but rejects (1), hence also rejects the conclusion (5). To the QBist, the word “state” has a different meaning than the conventional one. Here are roughly the two definitions:
Conventional: A “state” is a description of a system, that tells us what its properties are and how it “really is”.
QBist: A “state” is a set of probabilities \( P_a(x) \) assigned by some agent, that tells us how likely the agent thinks it is that they will get result “x”, if they were to take action “a” on the system. Consequently, the QBist definition does not tie the existence of a system to its needing to have a state; a system presumably must exist in order to have a state, but if there’s no-one around to assign it a state, that doesn’t have to imply that the system doesn’t exist.
So, what precisely is the nature of something that existed before agents were around to observe it? Officially, QBism mostly leaves this issue open, and individual QBists tend to give different answers. Chris Fuchs has shared this quote with me (source here) that gives a “Whiteheadian” answer:
In other words, things have a way of `getting themselves made’ without needing agents to observe them; the agents can just `hitch a ride’ on this process if they happen to be around.
My own view, as expressed in my comment #590, is a little more extreme: I think we ought to take seriously the idea that reality is founded upon observation. When we say that “dinosaurs existed in the past”, the conventional view interprets this statement as telling us `where’ dinosaurs can be found in time, as though time were a sort of container for things, like saying the cereal exists “in the back of the cupboard”. Phenomenology rejects this view of time.
To a phenomenologist, saying that “dinosaurs existed in the past” means that they exist to us now (since we can think about them and talk about them) but that their mode of appearance has a `past tense’: they will not appear to us as living things that can eat us or stomp on us, but as sets of dry skeletons ensconced in a surrounding context of geological and paleontological evidence and theories. Time and history in phenomenology is not a container of existence, it is a mode of appearance.
I would say, then, that a velociraptor exists at many levels of reality. At the most base-level of reality, the velociraptor just “is” a collection of dried-out bones, paleontological records, and so on. As we delve into the historical meanings of these, we build up layers of the velociraptor’s “historical reality” that proceed from less to more abstract. As we go up the levels the records become more sparse and their meanings more contested, but the velociraptor becomes more alive. It will not devour me, but its existence as a living thing “in history” has very real and potentially observable consequences. For instance, by understanding the probable migration patterns of velociraptors we can predict where to dig to find more bones. In short, I’m a metaphysical pluralist about reality and about time.
As I said above, though, not all QBists would agree with me, and this is still very much a topic for QBism’s “inside baseball”. For the interested, there is a good discussion of these things in this phenomenology paper about QBism.
Comment #724 February 3rd, 2022 at 6:27 pm
Baruch Garcia #703
As an addendum let me direct your attention to the Mathoverflow link,
https://mathoverflow.net/questions/352298/could-groups-be-used-instead-of-sets-as-a-foundation-of-mathematics?r=SearchResults
This is a discussion among category theorists who conclude that groups could be foundational. But, the debate over category-theoretic foundations and comprehensionalist set theories is treacherous territory.
The syntactic methods I described do not emphasize the fact that the 4-dimensional vector space over GF(2) is an instance of a 16-element group with presentation
a^2 = b^2 = c^2 = d^2 = e,
where e is the identity element.
Unfortunately, everyone who wishes to declare a foundation for mathematics forgets that it is a claim of how people different from themselves study mathematics. No amount of philosophical rhetoric will reconcile “the arithmetization of mathematics” with “group theory can be a foundation for mathematics.”
I do not declare paradigmatic approaches different from what I do to be invalid. I am just trying to understand in an environment of non-cooperating actors.
Comment #725 February 3rd, 2022 at 6:40 pm
Jacques Pienaar #723: Thanks!! Your discussion makes it clear, again, that QBism is not the right interpretation for me. I have an extreme aversion to having to contort my language to say things like: “a velociraptor exists at many levels of reality. At the most base-level…” I prefer to say: of course velociraptors existed then, as surely as I exist now!
Having said that, I appreciate that QBists take the question seriously and think about it, and I’m actually partial to the idea of making a distinction between “this velociraptor existed” and “this velociraptor had some definite quantum state.” I explored such a distinction myself in my Ghost in the Quantum Turing Machine essay—or rather, I did ascribe a “state” to the velociraptor; it’s just that the exact polarization of a photon hitting the velociraptor’s tail could be in the sort of state that I called a “freebit.”
Comment #726 February 3rd, 2022 at 6:48 pm
Chris W. #722:
classical dynamical laws must be deterministic and reversible (each state has only one predecessor and one successor). This corresponds to the conservation of information.
What would’ve been wrong with that choice?
Wouldn’t that be super boring if the amount of information stays constant?
Everettian QM is also deterministic and reversible! And it’s just one illustration of why deterministic, reversible theories can be far from boring: even if information stays constant, it can take billions of years for the consequences of given information to manifest themselves!
Comment #727 February 3rd, 2022 at 6:49 pm
And now I’m really, actually going to close the thread, like, tonight, as I feel like every point that will be made, probably now has been made, multiple times! Thanks again everyone!
Comment #728 February 3rd, 2022 at 7:16 pm
Since comments are about to be closed, I will quickly protest against what Jacques Pienaar is saying in the name of phenomenology. The phenomenological analysis of how the experience of the past is constituted in the present, in no way requires one to dismiss the actual past in the way he is doing.
Comment #729 February 3rd, 2022 at 7:53 pm
The less patient among your readers would appreciate an abridged version of the thread.
Comment #730 February 3rd, 2022 at 8:21 pm
Scott #702:
Sorry if this is a repeat. I thought I replied to your reply but maybe I missed hitting submit. I’ll try to remember and rehash here.
I’ll confess to being less impressed than you are by all your “evidence” that the brain implements QM. Linear algebra and complex numbers are both pretty ubiquitous in science and engineering, for reasons having nothing to do with QM (e.g., for complex numbers, the fact that \( e^{i\theta} = \cos\theta + i\sin\theta \) concisely summarizes all the trig identities…).
As someone who has worked on DSP applications, I understand Euler’s formula. I have a T-shirt with the special case where \( \theta = \pi \).
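For readers who want to see the claim about Euler's formula in action, here is a quick numerical check (my illustration, not from either comment): complex multiplication reproduces the angle-addition identities, and θ = π gives the T-shirt identity.

```python
import cmath
import math

# Euler's formula e^{i*theta} = cos(theta) + i*sin(theta) packs the trig
# identities into complex multiplication: e^{i(a+b)} = e^{ia} * e^{ib}.
a, b = 0.7, 1.9
lhs = cmath.exp(1j * (a + b))
rhs = cmath.exp(1j * a) * cmath.exp(1j * b)
assert abs(lhs - rhs) < 1e-12

# Reading off real and imaginary parts gives the angle-addition formulas.
assert math.isclose(lhs.real, math.cos(a) * math.cos(b) - math.sin(a) * math.sin(b))
assert math.isclose(lhs.imag, math.sin(a) * math.cos(b) + math.cos(a) * math.sin(b))

# The T-shirt special case theta = pi: e^{i*pi} + 1 = 0.
assert abs(cmath.exp(1j * math.pi) + 1) < 1e-12
```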
I wouldn’t say that I am completely “impressed” by the evidence, more like “disturbed”. I definitely subscribe to the ECREE standard. I actually think that this hypothesis is weak in several aspects – especially in the details on normalization – kind of important – but the evidence is not zero.
What does impress me about synaptic-dendritic encoding is that it is encoding an amplitude (complex number) that is meaningful to the computational operation of the neuron. And if we move that synapse back and forth on the dendrite then what we are doing is changing the amplitude that we choose to encode. The representation of orthogonal states of course makes this a discrete model of computation and not analog. So, this is not at all like an analog device where we can talk about input using Euler’s formula and complex numbers.
Yes, also we can look at the physics inside a semiconductor device and find quantum systems and classical systems that involve complex numbers. But, the key here is that those systems are just supporting the operational definition of the device. The quantum things going on inside a transistor have nothing to do with the computational class of the transistor relative to the computer because what I am doing with a transistor is essentially giving someone a place to write down numbers, in that case only a “0” or a “1”. The placement of the synapse, at least to my simple eyes, appears to be a way to give a hardware designer the means to write down a complex number that is then fed into whatever operator the dendritic morphology is programmed to process. Again, at this level of a computer, it all boils down to “What kind of numbers do you allow me to write down at the input locations?”
In the same way, I have no interest in the atomic-scale quantum mechanics going on inside a neuron or inside the brain. That is just part of any physical device. The critical feature we care about is at the level that is computationally relevant to the input and output of the device. I’m aware of the Penrose/Hameroff proposal but that never made any sense to me because I can’t see how quantum effects at the atomic scale have anything to do with the computational level of the brain.
In general, whenever someone points to some apparently classical thing and claims that it’s “just like QM,” my immediate questions include:
Yes, I had the same questions. Everyone knows the brain is a classical model of computation, right?
– Can the thing violate a Bell inequality?
There are cognitive scientists who appear to be under the impression that they have data showing Bell inequality violation for human cognition. I don’t see anything wrong with the math … but maybe someone else here does?
Here’s just a random search result
https://arxiv.org/abs/2102.03847
Not all of this research is of the same caliber (not all research in any field is). But, to me, what they’ve got going for them is that they appear to explain long-standing cognitive fallacies and effects that have long resisted explanation by classical models. This next paper is a survey and also gives some criticism of the approach.
https://pubmed.ncbi.nlm.nih.gov/34546804/
– Is measurement of the thing an inherently destructive operation?
We need some mechanism for measurement operators. Two candidates, so far as I can tell: small-scale axon terminal arbor pruning and synaptic inhibition of dendritic synapses. What I understand we want is an operator to project onto a subspace spanned by the orthogonal states. The picture is of the destruction of the previously preserved superposition of amplitudes across all possible states. The latter option, synaptic inhibition, is the more interesting computational one to me, as it suggests a mechanism for suppressing coherent superpositions.
https://royalsocietypublishing.org/doi/10.1098/rsta.2018.0107
So, yes, the projection operators would be inherently destructive of the encoded superposition of phases.
The thing about the superposition of amplitudes in the neural model is you can actually take out your microelectrode and measure the amplitudes in orthogonal groups of coherent neurons. When a physicist is asking me to accept that the spin states of an electron are in a superposition he’s just showing me numbers on a scratchpad. Yes, yes, he can run the experiment and we can see that the results match the model. But still, where exactly is the universe keeping track of these amplitudes … ?
The reaction that the quantum brain hypothesis appears to get is one of incredulity or scorn. Which is funny … because that is usually the way that I feel when a physicist asks me to accept that an electron is keeping track of spin amplitudes, Hilbert space, unitary evolution, and has measurement operators at the ready just in case a physicist wanders along, sets up an experiment, and asks for its state 😉
– Does the thing need 2^n parameters to describe a system with n components?
On the one hand, if we think of the cognitive science results above as just “tests on some unknown systems”, imagining that no one told us where the data came from, then we should conclude that the data show a violation of Bell inequalities and thus require a model with 2^n parameters (so far as we know).
Think of it the way it is physically realized. One part of the neuronal group has amplitude for one state and another part of the group has amplitude for another state. The key is that there must be superposition across all possible states that relies on interference. Occupying both states 0 and 1 means that there are non-zero amplitudes in those parts of the neuronal group for which the squared magnitude of those amplitudes would represent the probability of the system being in that state. Yes, if we want to write this down then we are writing down 2^n parameters for n computational units. I mean, we are just recording a complex number here and another over there and requiring the hardware interference architecture to maintain the Hilbert space.
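The exponential parameter count is easy to make concrete. A short sketch (illustrative; the variable names are mine): writing down a general n-qubit state takes 2^n complex amplitudes, while n classical bits take only n values.

```python
import numpy as np

# An n-component quantum system needs 2**n complex amplitudes; n classical
# bits need only n values. The gap is what makes brute-force simulation hard.
n = 10
state = np.zeros(2**n, dtype=complex)  # 1024 amplitudes for just 10 qubits
state[0] = 1.0                         # the basis state |00...0>

# A uniform superposition (e.g. after a Hadamard on every qubit) still takes
# all 2**n entries to write down, each of magnitude 2**(-n/2).
uniform = np.full(2**n, 2**(-n / 2), dtype=complex)
assert uniform.size == 1024
assert np.isclose(np.vdot(uniform, uniform).real, 1.0)  # normalized
```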
And then also there is no “trivial lower limit” on the number of computational units required in a device in order for us to realize quantum computing. The requirement is that we can operate over the interference of amplitudes.
– Can the thing factor integers in polynomial time?
Do we have quantum computers yet that can factor integers? 🙂 I know I know we are on the way. Even once we have non-trivial quantum computers will they be able to handle any size integer input?
It seems entirely possible to have trivial realizations of a class of computation that are not powerful enough to realize every conceivable algorithm or input size, due to limited memory or to the functional or architectural details of the computer. As you know, even something as mechanically trivial as the DigiComp II turns out to have nontrivial computational power. Of course, who wants to implement 18×18 DSP multipliers using the DigiComp II model?!
We can have a special purpose hardware architecture that is not optimized for running other types of applications. So, a specialized processor may be less than ideal for other applications. The brain may simply not have the functional architecture for Shor’s algorithm or quantum modular exponentiation. Maybe the brain is only designed with functional blocks for things that were of evolutionary importance: phase estimation, period finding, amplitude amplification, or maybe even the QFT … and those functional blocks are configured over particular receptive fields for which they make sense? But, I think the question of “Why did the brain evolve these features that appear to be aiming at QT?” is an interesting question.
Look, Scott, I’m down with ECREE. And I feel like I selected a bad way to try to get to my central concern related to the question in this thread.
Maybe there is a way to forget the quantum brain hypothesis and think of my concern in a more general way.
Let’s say we have a finite universe simulated with (which as far as I can tell is the same thing as “running on”) computational class X. Now let’s have a finite system within that universe that is a physical realization of computational class Y. What must be true about the relationship between X and Y in order for Y to “understand” that X is the operating system of the universe? Is this always possible? Is there a computability/complexity reason why or why not? Is there some “minimum complexity distance” that must exist between X and Y in order for Y to be successful in learning X?
Earlier in our sub-thread you seemed to say, in effect, “Well, we don’t know, but we can try.” Seems like maybe we could do better at saying something about our chances?
How about that. I should have asked that first instead of going off the brain deep end 🙂
Comment #731 February 3rd, 2022 at 8:28 pm
Hi Lorraine Ford #698:
Thank you for the comment.
Ideas about consciousness and agency in computation or the limits of mathematics are way above my pay level. They are interesting, yes, but I work at the “gate-level of reality” 🙂
I would say that, as best I understand the idea of Turing universality, numbers are the symbols we use for computation or they are equivalent to any symbols that may be used. As to what a number “really is” or what “real-world numbers are” again that’s too hard for me. I just know that in the classical model I can only put a “1” or a “0” down for each memory or input location whether we call those symbols or numbers.
Comment #732 February 27th, 2022 at 6:28 pm
[…] remembered this recently, when Mateus Araújo put out the challenge to define true randomness beyond a mere unanalysed primitive, and I decided to give it a try. His replies included: “I’m afraid you misunderstood me. […]