>In order to avoid a time-evolved differential eqn (NS approach), the LS constraint cannot be satisfied instant by instant.

Ah, here's the point of confusion. In any given instance, between any two particular measurements, I see nothing wrong with the intermediate fields obeying *some* continuous constraint. After all, I expect the fields to be smooth/continuous themselves.

Here's an example (not actually what I'm going for, but useful for this discussion): Suppose a photon/EM-field in a double-slit experiment is a continuous field, with LS rules constrained by both the past preparation and the future measurement. This 4D story can be represented as a 3+1D description, but different future measurements could easily have different past behavior. So in 3+1D the field might obey differential equation A if future boundary/measurement A' is imposed (corresponding to an interference measurement), but would instead obey differential equation B if future boundary/measurement B' is imposed (corresponding to a which-slit measurement). So there's no master differential equation, and NS fails as an "explanation". And yet, both A and B might conserve the same quantities, so one still gets the behavior you're worried LS rules out. Does that sound reasonable?
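A loose numerical analogy (a toy ODE, not the actual photon/EM field): take the same "equation of motion" y'' = -y, pin the past datum y(0) = 1, and impose two different future boundary values y(T). The interior histories differ, yet the energy-like quantity E = (y'^2 + y^2)/2 is conserved along each of them.

import numpy as np

T = np.pi / 2
t = np.linspace(0.0, T, 201)

def boundary_pinned_solution(y0, yT):
    """General solution y = A cos(t) + B sin(t), fixed by y(0) = y0 and y(T) = yT."""
    A = y0
    B = (yT - A * np.cos(T)) / np.sin(T)
    y = A * np.cos(t) + B * np.sin(t)
    dy = -A * np.sin(t) + B * np.cos(t)
    return y, dy

for yT in (0.0, 1.0):                    # two different "future measurements"
    y, dy = boundary_pinned_solution(1.0, yT)
    E = 0.5 * (dy**2 + y**2)             # conserved along each history
    print(yT, y[len(t) // 2], E.std())   # interior values differ; E.std() is ~0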

Right now my research is taking me more towards something that roughly maps to stochastic quantum mechanics (esp. Larry Schulman's ideas, if you're familiar with his non-classical 'kicks'), where 'conserved' quantities are only conserved on average, in some statistical limit. But I see nothing fundamentally wrong with the approach in the previous paragraph, as an example of how LS might not map to an NS story.

Ken,

You say,

"How would the founders of quantum theory have met these challenges if they thought the universe ran according to the mathematics of the Lagrangian

Schema { not as a computer, but rather as a global four-dimensional problem that was solved \all at once"?"

How about modeling with a quantum computer? Computers are only a means of building a complex scenario that mimics a hypothetical situation. You're not saying that some aspect of the universe runs like a computer. Given our understanding of gravity's properties, it remains a mystery that only empirical evidence seems able to address, as I explore in my essay.

Liked your essay. A great deal of substance here.

Jim

    • [deleted]

    Hi Ken,

    We met at PI when you gave your seminar there a couple of years ago and then again at the Foundations meeting in Brisbane. I remember talking to you about the universe being a computer (or rather 'not' being one)! How have you been?

    Thanks for the well-written essay. I must say that I wholeheartedly agree with your critique of the universe being a computer and am particularly persuaded of this given the measurement problem. However, I am not sure that I understand how the Lagrangian Schema can actually solve these problems. I have two major concerns with this:

    1. You say that the Lagrangian Schema is "the cleanest formulation of general relativity, with the automatic parameter-independence that GR requires, and bypasses problematic questions such as how much initial data one needs to solve the Newtonian-style version." If I understand this correctly then I don't think I agree at all. The question of finding valid boundary data for the EH action is the "thick sandwich" problem. It is riddled with difficulties and may not even be well defined except for solutions with a lot of symmetry. In fact, this is where the Newtonian Schema does much better. There are theorems for solving the initial value problem in general. The Einstein equations are naturally interpreted as evolution equations on Superspace (in a particular foliation) and that is how numerical relativists actually solve non-trivial problems. I'm not aware of anyone being able to do the same with a boundary value problem. (A toy numerical contrast between the two kinds of problem is sketched after point 2.)

    2. I don't see how the Lagrangian Schema can solve the measurement problem without appealing to intuition from the Newtonian Schema. It's true that the LSU doesn't have dynamical equations, but it still gives you a partition function on configuration space. How do you interpret this for the whole universe? Is there just one measurement that gives the whole history of the universe? But then what does it mean to "update" our knowledge if there is no sense of "before" and "after" the update?
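    (On point 1: a minimal sketch of the bare contrast between the two kinds of problem, using a single toy ODE rather than anything like the full GR thick-sandwich problem. The Newtonian-style initial value problem fixes y and y' at t = 0 and marches forward; the Lagrangian-style boundary value problem fixes y at both ends and is solved "all at once".)

import numpy as np
from scipy.integrate import solve_ivp, solve_bvp

# Newtonian-style: initial value problem for y'' = -y, given y(0) and y'(0).
ivp = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, 10.0), [1.0, 0.0],
                dense_output=True)

# Lagrangian-style: boundary value problem, given y(0) = 1 and y(10) = 0.5.
x = np.linspace(0.0, 10.0, 50)
bvp = solve_bvp(lambda x, y: np.vstack((y[1], -y[0])),
                lambda ya, yb: np.array([ya[0] - 1.0, yb[0] - 0.5]),
                x, np.zeros((2, x.size)))

print(ivp.sol(5.0)[0], bvp.sol(5.0)[0])  # the two specifications select different histories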

    Like I said, I completely agree with your diagnosis of the problem. However, I think that one will need to go beyond just moving to the Lagrangian Schema to arrive at a sensible solution to the measurement problem. I hint at this in my own essay but I don't yet have a concrete proposal.

    Here is a crazy idea: maybe measurement is just not a computable process at all. Maybe the universe uses something like an Oracle to determine the outcome of a measurement process. I tried playing with a model like this a couple of years ago but got nowhere. I suppose it's probably too crazy to work...

    Cheers,

    Sean.

      • [deleted]

      Thanks, Ken, that helps greatly. You're not trying to rule out NS altogether, and NS does give rise to (3+1)D descriptions. You're saying LS is the master formalism by virtue of its unique, unambiguous approach to any boundary conditions, while NS is a slave thereto, by virtue of its disparate differential equations for different boundary conditions. Of course, this is contrary to the situation in physics today, whereby the vast majority of physicists subscribe to NSU and view LS as a mere curiosity, computational convenience and slave to NS. By realizing LS is the master formalism and subscribing to LSU, much (if not all) of the "mystery" of QM disappears and we open the door to a new path to unification.

      Oops... Somehow I got logged out!! I am "Anonymous"

      - Sean Gryb.

      • [deleted]

      The computer and the universe

      John Archibald Wheeler

      Abstract

      The reasons are briefly recalled why (1) time cannot be a primordial category in the description of nature, but secondary, approximate and derived, and (2) the laws of physics could not have been engraved for all time upon a tablet of granite, but had to come into being by a higgledy-piggledy mechanism. It is difficult to defend the view that existence is built at bottom upon particles, fields of force or space and time. Attention is called to the "elementary quantum phenomenon" as potential building element for all that is. The task of construction of physics from such elements is compared and contrasted with the problem of constructing a computer out of "yes, no" devices.

      Preparation for publication assisted by the University of Texas Center for Theoretical Physics and by National Science Foundation Grant No. PHY78-26592.

      http://www.springerlink.com/content/ck753337h0515573/

      Hi Ken,

      Very good essay, and after reading it I must strongly urge you to study Joy Christian's "Disproof of Bell's Theorem" (most of the book is also on arXiv.org as separate chapters). Joy's model is a physical model that has so far resisted any kind of computer simulation while completely explaining the correlations seen in EPR-Bohm type scenarios. Granted, his model is a bit difficult to understand at first, but once you understand some basics about Geometric Algebra, the model is actually fairly simple. Nature simply has a 50-50 chance of left or right handed orientation when the particle pairs are created. Now, I hadn't really thought about it before, but it might be nice to try to fit the model to LSU.

      Also, if you have a chance, you can check out my essay where I briefly argue that quantum mechanics is due to relativistic effects on the microscopic scale.

      Best,

      Fred

      • [deleted]

      Dear Ken,

      Although not directly related to your essay presented here, I have an idea that I hope can be of some interest to you, based on a realistic model in ordinary space-time. Nothing mathematically fancy: I find that the bosonic quantum field can be reconciled with a system of vibrations in space and time. The model has some unique features that seem to be extendable to gravity and the non-locality of quantum theory.

      Is there really no reality in quantum theory

      Best wishes for you in the contest.

      Hou Yau

        • [deleted]

        I am also living in the Bay Area and hope I can meet you one day.

        Hi Ken,

        Ha! I finally read it (though I should have done so months ago when you first sent it to me and I do apologize for that).

        Anyway, so, in general, I really like the idea (and your writing is impeccable as usual). But I did feel a bit unfulfilled at the end. For instance, it didn't seem entirely clear to me how the LSU approach solves the problem you pointed out in your first footnote: if the universe is (or at least appears to be) time-asymmetric, why are its laws time-symmetric?

        Also, is it really anthropocentric to try Method X, observe the outcomes of Method X and see that they match reality, and thus assume Method X must be at least partially correct as a model? I mean, the whole point is to match observation and measurement, and even in the quantum realm I think that there has to be some kind of objective reality given the fact that the scientific method even exists in the first place (i.e. while we may all view slightly different realities, we clearly have enough in common to allow us to communicate with one another and even to put quantum mechanics to practical use).

        Nevertheless, sometimes I wonder if we are even fully capable of non-anthropocentric thinking.

        My only other complaint about the essay is related to my usual diatribe about entropy - I hate associating it with "disorder." I see absolutely nothing bizarre about a low-entropy early universe (and Max Tegmark agreed with me on this at the last FQXi meeting, albeit quietly - I was not so quiet). But that's just a personal pet peeve of mine.

        Ian

          I actually agree with George on this point (you should see some of the interesting comments on his very stimulating essay, as well as those on Julian Barbour's - conversely, my own essay seems to be about the only defense of reductionism in this entire contest!).

          Ian

          Dear Ken,

          very fine essay; I have only one general remark/doubt: what is the reason for arguing in favour of such an obvious thesis? Let me explain: we know that even mathematics (arithmetic) cannot be completely axiomatized by any sufficiently powerful set of axioms (Gödel's theorems), so some simple algorithmic questions do not have definite answers. Computers are designed so that algorithms yield definite outcomes. It would be very strange, if not impossible, for the world to be a computer successfully realizing algorithms, since even mathematics shows the impossibility of this (for a suitably rich system).

          wishes,

          Jerzy

          c.f.: http://fqxi.org/community/forum/topic/1443 (What if Natural Numbers Are Not Constant?)

            • [deleted]

            Ken,

            I wonder what the Lagrangian Schema could do if a theory is inconsistent, e.g. if in general relativity the speed of light is both variable and constant. You wrote:

            "If one wants to "fit" quantum theory into the spacetime of GR, one must use the Lagrangian Schema, solving the problem "all at once"."

            I am afraid that would be a hopeless procedure:

            W. H. Newton-Smith, The rationality of science, Routledge, London, 1981, p. 229: "A theory ought to be internally consistent. The grounds for including this factor are a priori. For given a realist construal of theories, our concern is with verisimilitude, and if a theory is inconsistent it will contain every sentence of the language, as the following simple argument shows. Let 'q' be an arbitrary sentence of the language and suppose that the theory is inconsistent. This means that we can derive the sentence 'p and not-p'. From this 'p' follows. And from 'p' it follows that 'p or q' (if 'p' is true then 'p or q' will be true no matter whether 'q' is true or not). Equally, it follows from 'p and not-p' that 'not-p'. But 'not-p' together with 'p or q' entails 'q'. Thus once we admit an inconsistency into our theory we have to admit everything. And no theory of verisimilitude would be acceptable that did not give the lowest degree of verisimilitude to a theory which contained each sentence of the theory's language and its negation."
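            (The explosion argument in the quoted passage is short enough to formalize. Here is a minimal rendering in Lean 4, mirroring the quoted steps; the naming is mine and purely illustrative.)

-- From an inconsistency p ∧ ¬p, any sentence q follows.
theorem explosion (p q : Prop) (h : p ∧ ¬p) : q := by
  have hp  : p     := h.1         -- from 'p and not-p', 'p' follows
  have hpq : p ∨ q := Or.inl hp   -- from 'p', 'p or q' follows
  have hnp : ¬p    := h.2         -- from 'p and not-p', 'not-p' follows
  exact hpq.resolve_left hnp      -- 'p or q' together with 'not-p' entails 'q'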

            Pentcho Valev

              • [deleted]

              I think your essay is very interesting and makes a reasonable point. The Lloyd concept of the "quantum cosmic computer" is something which I have gone back and forth on. My sense of the issue is this: a physical system is modeled as a computer when it is convenient to think of it that way. When it is not so convenient then it is modeled by other means.

              A Lagrangian model of a physics system has the initial state and the final state specified. This carries over to the quantum path integral, where the extremization procedure is generalized to a variational calculus on many paths with quantum amplitudes. The analyst is then faced with the task of finding some dynamical process or quantum evolution which maps the initial state of the system to the final state. The analyst will then use what tools they have in their box to solve this problem.

               A quantum computer model approach with quantum gravity may work in the case of quantum black holes if the analyst knows the initial data in the formation of the black hole and detects the final outcome. The entropy of a black hole is given by the Bekenstein formula S = kA/4L_p^2. Here L_p = sqrt{Għ/c^3} is the Planck length, and L_p^2 is a unit of Planck area. A is the area of the black hole event horizon. The entropy is S = Nk/4, where N is the number of Planck units of area comprising the event horizon. This is given by the total density matrix of the black hole, where ρ is the quantum density matrix ρ = sum_n p_n |ψ_n⟩⟨ψ_n|. A trace over the density matrix in the Khinchin-Shannon formula determines the entropy. If you throw a bunch of quantum states into a black hole, the entropy of the black hole becomes

               S = -k sum_n ρ_n log(ρ_n) + S_{BH}.

               The black hole entropy increases. S_{BH} is the initial entropy of the black hole, which, if the analyst does not know the previous quantum states comprising the black hole, is determined by a trace over those states. You are not able to disentangle the entropy of your signal from the black hole by performing the proper partial trace. However, if you kept an accounting of all states in the black hole and of the joint entropy of the states you put into the black hole, which can be negative, then you could in principle extract the information you put into the black hole. That joint entropy can be negative is a consequence of quantum entanglement, and putting a quantum bit stream into a black hole entangles it with the black hole.
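               (A minimal numerical sketch of that entanglement point, using ordinary finite-dimensional states rather than anything black-hole-specific: for a Bell pair the joint von Neumann entropy is zero while either reduced state has entropy ln 2, so the conditional entropy S(A|B) = S(AB) - S(B) is negative.)

import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                        # drop numerical zeros
    return -np.sum(p * np.log(p))

# Bell state |phi+> = (|00> + |11>)/sqrt(2): pure and maximally entangled.
phi = np.zeros(4)
phi[0] = phi[3] = 1.0 / np.sqrt(2)
rho_ab = np.outer(phi, phi)                 # joint density matrix (pure state)

# Reduced state of subsystem B: partial trace over A.
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

S_ab, S_b = von_neumann_entropy(rho_ab), von_neumann_entropy(rho_b)
print(S_ab, S_b, S_ab - S_b)                # 0, ln 2, and a negative S(A|B)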

               A detailed understanding of this requires the use of error correction codes. The most general one is based on the Leech lattice Λ_{24}, which is constructed from a triplet of E_8 lattices; string theory has a heterotic sector with gauge group E_8 × E_8 or SO(32). The so-called sporadic groups, which arise from automorphisms and normalizers associated with the Leech lattice, include the Fischer-Griess (or Monster) group. If one has a complete data set this may then be used to model the black hole as a quantum communication channel or computer that processes input states into output states. The great gauge-like group is then a "machine" which one can use to model a process.

              This artificial tracing is related to the measurement issue, or the final outcome. In the case of a quantum computer there is no decision procedure for determining this, at least not at this time. As you indicate this is a weakness with the whole quantum cosmic computer conjecture. In addition, black holes have the convenience of having an asymptotic description, such as the curvature going to zero out at infinity. This tacitly means the computation process as a model can be realized by some observer, however idealized the observer is, so the process can be modeled as an algorithm.

               With the universe in total this picture becomes more problematic. If we are to think of an observer as reading the output, there is no boundary region where this observer can read this output without that procedure also being a part of the "computation." The problem of establishing a Cauchy region of initial and final data is much more difficult to work out.

              Cheers LC

                Torsten,

                Thank you for the comments... I actually quite like the continuum, although that probably didn't come across in this contest (see the last contest for more details). The question of whether EPR/Bell-inequalities can be explained by hidden variables or not actually depends on whether one is in an NSU or LSU framework. It works in the latter but not the former. Unfortunately, most analysis simply assumes NSU, which has biased many people against hidden variables, but that option opens up again with LSU.

                Best,

                Ken

                James,

                Thanks! I'm not sure I understand your point, but even if one had a working quantum computer, one couldn't "model" quantum phenomena with it; one could merely "reproduce" quantum phenomena. And, unfortunately, such a computer would give no more information about what was happening between quantum measurements than the bare phenomena themselves, because one can't make an "extra" intermediate measurement in the quantum computer without destroying the match to the actual phenomena in the first place. (Weak measurements aside for now...)

                Really, a quantum computer would *be* a quantum phenomenon, in and of itself. It would not be a useful "model", in that it would not add to our understanding... any more than pointing out that a star is an excellent model of that same star would add to our understanding.

                Please let me know if I'm totally off-target with your intended comment!

                Cheers,

                Ken

                Hi Sean! Yes, of course I remember you, and thanks for your very thoughtful comments.

                Your GR points are all perfectly valid, and you're right that the LSU-style "thick sandwich problem" is unsolved (specifically, the question of whether there is one unique classical solution for a given closed-hypersurface boundary metric, and how to find it). But the "problematic questions" that I had in mind were issues involving the intersection of GR with QM... namely, the question of whether one can even impose all of the needed boundary data without violating the HUP. (If not even the Big Bang can beat the HUP, having a formal solution of Newtonian-GR for a HUP-violating initial boundary seems sort of useless to me, even leaving aside whether "lapse functions" are natural to include in an initial boundary to begin with.)

                Even though the thick-sandwich problem is clearly in the LSU camp, the way it's normally phrased makes it clear that it was developed in an NSU-mindset. After all, who says that there has to be only one unique solution? (Esp. given what we know about QM.) I'd be far more interested in a result that showed *many* solutions for a given boundary, of which our actual universe would be merely one possibility. (And yet, if this result were discovered, I think many physicists would view this as a failure.) At the end of the day, though, it's not the job of LSU approaches to recover classical results, or even NSU-like results. Hopefully it will lead to *new* insights that actually look like our GR+QM universe.
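                (A familiar toy example of what boundary-value non-uniqueness looks like, in an ordinary ODE rather than GR: the equation y'' = -y with boundary data y(0) = y(π) = 0 is satisfied by y(t) = c sin(t) for every constant c, so the same boundary specification is compatible with a whole family of histories.)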

                As for the measurement problem, the clearest discussion I've written so far is my entry in the previous FQXi contest, but even reading that you'll still probably have many of the same questions. A stronger argument will require a fully-fleshed out toy model, one that I'm still plugging away on, but for now I'll leave you with the following thoughts: If I'm right that configuration spaces live in our heads, not in reality, then we need to delve deeper into the foundations of the path integral to find a realistic LSU story. The good news is that the action itself lives in spacetime. Sinha and Sorkin (1991) pointed out that by doubling the particle path integral (folding the paths back on themselves in time), you no longer need to square the integral to get out probabilities -- and at this point it almost looks like the configuration spaces used in stat mech (which I assume you'll agree would be perfectly acceptable for a realistic theory). But those particle paths still need negative probabilities to get interference. However, fields can interfere in spacetime, so extending these ideas to fields can arguably solve this problem (although this will require further alterations of the path integral, with some master restriction on field configurations to make it mathematically well-defined). The last piece of the puzzle -- how to objectively define an external measurement in the first place -- is addressed in the previous contest entry. The key is that a measurement on a subsystem is not merely the future boundary constraint itself, but also the chain of correlations that link it to the cosmological boundary. Thus, in a quantum eraser experiment, the future chain is broken, and all observers eventually concur that no "measurement" was ever made in the first place. Note this only works in an LSU; in a realistic NSU theory, one needs to know "right away" whether a given interaction is a measurement or not.
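                (Schematically, the doubling referred to above is the step of writing the squared amplitude as a single sum over path pairs -- this is the generic form, not necessarily Sinha and Sorkin's exact notation:

                P = |∫ D[x] e^{iS[x]/ħ}|^2 = ∫ D[x] ∫ D[x'] e^{i(S[x] - S[x'])/ħ},

                with both paths x and x' pinned to the same endpoints, so probabilities come from one sum over doubled paths rather than from squaring an amplitude afterwards.)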

                For your final idea, about some "Oracle" choosing amongst possibilities, I'm entirely on board with that notion and am growing more confident about it all the time. But it only makes sense for the choice to happen *once*; some global choice that picks one reality out of all possible universes (given all the boundary data, and some master constraint on the total Lagrangian density). Maybe it's now clearer why I'd prefer lots of solutions to the thick-sandwich problem. From our perspective, individual quantum experiments all might seem to have a different random choice, but really it's all one big choice that manifests itself as computable patterns between similar experiments (computable from the size of the different solution spaces.) Getting the probabilities right will be the ultimate test of all this, but if you read my conclusion again with this in mind, you might see what I'm going for.

                I'm looking forward to reading your own essay... And I hope we cross paths again soon!

                Cheers,

                Ken

                Dear Ken,

                I think there is a conflation between computational models and mathematical representations. It is true that mathematical representations almost always turn out to be computable models, especially when used numerically to approximate the solution to a real-world problem. But a mathematical model doesn't always come with its implementation (in fact it rarely does). That is, one has to find a computer implementation for a mathematical model. There is no one-to-one correspondence between models and algorithms. For example, many Turing machines can compute the same computable function, but each can do so in a completely different way (e.g. a function that can be computed in linear time can also be computed in exponential time); a toy illustration is sketched below.
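                A trivial sketch of that last point (toy functions, purely illustrative): the same mathematical function, computed by two very different algorithms.

def fib_exponential(n):
    """Naive recursion: an exponential-time algorithm for the Fibonacci function."""
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    """Iteration: a linear-time algorithm for the very same function."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# One function, many implementations -- there is no one-to-one model/algorithm map.
assert all(fib_exponential(n) == fib_linear(n) for n in range(20))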

                While I may agree that the working assumption of science is that nature is mathematical, I think it is far from obvious that science assumes that a particular implementation of a mathematical model is the specific way nature operates. This is also a common conflation, as when people think that saying a natural process is Turing computable means that it is computed exactly by something like a Turing machine (nobody thinks the brain is a Turing machine, but most science works under the assumption that the brain is Turing computable, which is completely different). In your essay you point out Seth Lloyd's claim that the universe is a computer; this illustrates my point, because Seth Lloyd does not think that the universe is a Turing machine, but rather a quantum computer, which in the light of your own arguments I find difficult to reject.

                On the other hand, I don't think that physicists will ever be able to close all loopholes in quantum theory at the same time, and it hasn't yet been done. While your ideas are provocative I still find the computable ground less subject to particular interpretations.

                  Dear Prof. Wharton,

                  as a physicist, you argue that the universe is not a computer. As a professional in the field of computers, it is hard for me to agree with you. To me the universe appears as an enormous computer, and I find parallels between the two everywhere, from the structure of space, with the visible universe confined to a 3-dimensional display akin to a 3D touch screen, to the origins of life itself. The most striking similarities I find are in our creation myths and in the ways in which complex systems are put together in practice. But my essay is not about that. In my essay I infringe into your territory, just as you infringe into mine, and argue that the organization of the space housing the universe dictates the laws of physics.

                  My analysis of the current state of physics zeroed in on the paradox of space, which ~100 years ago was substituted with wave-particle duality. The incongruous notion prevailing today, that waves can propagate in emptiness without a supporting medium, is called a workaround in my field, where such compromises are common and constitute the norm rather than the exception. The difference between you physicists and us programmers is that the programmers fully appreciate that, in the long run, such workarounds come with a heavy price and must be addressed sooner or later. Better sooner than later.

                  In contrast, you physicists seem a headstrong bunch. Having decided long before computers existed, when the ability to compute was deemed the height of human ability, that the understanding and visualization of the underlying reality can be discarded as long as the mathematics appears adequate, you as a group still stubbornly stick to it. Apparently you do not realize that you have put yourselves in danger of being replaced by the very calculators you still strive to emulate. The current motto in physics, "shut up and calculate," has put you on a par with the computing machines, as if you have forgotten that whatever a human can calculate a machine can do far better. As a professional in my field, I fully appreciate the fact that the main difference between calculators and humans is that humans understand how the calculations relate to physical reality and have a vision of how their own experience fits in it.

                  And so I find it ironic that modern-day physicists appear unaware that their phenomenal ability to compute could in fact be a vestige of our very origins as well as the origins of our universe. In arguing that the universe is not a computer, you physicists don't seem to appreciate that mathematics divorced from the understanding of the underlying reality, and from the vision that comes with it, is what has turned you into glorified calculators and thus raised the question of your own utility to the rest of humanity.

                    Dear M. V. Vasilyeva,

                    Perhaps if you were a little older and had experience with analog computers, you might not jump to such conclusions.

                    You say, "In contrast, you physicists seem a headstrong bunch."

                    But when your only tool is a hammer, everything looks like a nail. When one is mostly familiar with computer concepts, they seem to apply everywhere. Believe me, there's much more to it than you seem to see.

                    "And so I find it ironic that the modern day physicists appear unaware that their phenomenal ability to compute could in fact be the vestiges of our very origins as well as the origins of our universe. Arguing that the universe is not a computer you physicists don't seem to appreciate the fact that mathematics divorced from the understanding of the underlying reality, and the vision that comes with it, is what has turned you into the glorified calculators and thus put forth the question of your own utility to the rest of the humanity."

                    Since I have been far more professionally successful in the field of computer design (see here) than as a physicist, and have published university texts on computer design and a dissertation on an 'automata theory' of physics, I do know about computers. All I can say is that, no matter how much everything looks like a nail to you, it's a little deeper than that.

                    But I still enjoyed your essay.

                    Edwin Eugene Klingman