Michael,

Actually, I'm being much more heretical than you suggest. I'm not saying that we can't physically reproduce nature's computations, I'm saying that nature doesn't even *utilize* computations, at least not in the standard past-to-future sense of the word. I'm denying any map between reality and Hilbert space, as well as the premise that you're assuming when you say "used by NATURE as well". (I'm using nature=universe here, although I use only 'universe' in the essay.)

Hope that makes it sound a bit more interesting... :-) And yes, I do propose an alternative path forward; I'm not just pointing out unresolvable problems.

Best,

Ken

Dear Ken,

Very nice essay. I completely agree with your claim: the universe is not a (quantum) computer. Determinism has influenced physics for a long time. But non-local phenomena arise (like EPR), so we have to go deeper and find models that include indeterminism. Hidden variables are no real path, but non-computability is a way. What I mean are non-algorithmic real numbers: by definition there is no algorithm (or law) to calculate them, but they exist. So if our future is non-computable, then we keep the concept of free will, etc. But as a consequence we need the continuum (probably you do not like this point). That is one reason why I keep the concept of a smooth manifold.
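(As a concrete illustration of such a non-algorithmic real number, one standard construction, not taken from Torsten's essay, encodes the halting problem in a binary expansion:

x = sum_{n=1}^{infinity} h_n 2^{-n}, where h_n = 1 if the n-th Turing machine halts on empty input and h_n = 0 otherwise.

No algorithm can output the digits of x, since that would solve the halting problem, yet x is a perfectly well-defined point of the continuum.)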

If you like, have a look into my essay.

Best

Torsten

    Dear Prof. Wharton,

    Thanks for your reply, as it clarifies for me what an LSU represents for you within a broader context. As to your last remark, that it requires one to "relax the Principle of Sufficient Reason to the point where our universe is just one of many possible solutions to the same ultimate constraints", I am left somewhat confused. That is, are we talking about LSUs being differentiated only by their initial primal conditions, i.e. by what they are initially composed of, or rather by what order initially emerges from what would otherwise be the same primal conditions? In this respect I like the thought of Leibniz that, when such things are considered, a difference needs to be found between what is certain and what is necessary. From this I've always taken him to mean that one needs to look at the difference between what might exist and what is able to be realized as existing, as only the latter can be found to be a reality.

    Once again I thank you for your excellent essay, as I do think the LSU conceptualization is something worthwhile to have explored with respect to the pursuit of answers regarding the hows of the physical world. I also find that such a conceptualization may perhaps form a basis from which the whys of it may be addressed as well, for those who are so inclined to wonder about them.

    "We have said that the concept of an individual substance [Leibniz also uses the term haecceity ] includes once for all everything which can ever happen to it and that in considering this concept one will be able to see everything which can truly be said concerning the individual, just as we are able to see in the nature of a circle all the properties which can be derived from it. But does it not seem that in this way the difference between contingent and necessary truths will be destroyed, that there will be no place for human liberty, and that an absolute fatality will rule as well over all our actions as over all the rest of the events of the world? To this I reply that a distinction must be made between that which is certain and that which is necessary."

    -Gottfried Wilhelm Leibniz, "Discourse on Metaphysics" (1686)

    Kind Regards,

    Phil

    >In order to avoid a time-evolved differential eqn (NS approach), the LS constraint cannot be satisfied instant by instant.

    Ah, here's the point of confusion. In any given instance, between any two particular measurements, I see nothing wrong with the intermediate fields obeying *some* continuous constraint. After all, I expect the fields to be smooth/continuous themselves.

    Here's an example (not actually what I'm going for, but useful for this discussion): Suppose a photon/EM-field in a double-slit experiment is a continuous field, with LS rules constrained by both the past preparation and the future measurement. This 4D story can be represented as a 3+1D description, but different future measurements could easily correspond to different past behavior. So in 3+1D the field might obey differential equation A if future boundary/measurement A' is imposed (corresponding to an interference measurement), but would instead obey differential equation B if future boundary/measurement B' is imposed (corresponding to a which-slit measurement). So there's no master differential equation, and NS fails as an "explanation". And yet, both A and B might conserve the same quantities, so one still gets the behavior that you're worried LS rules out. Does that sound reasonable?
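    To make the boundary-value flavor of this concrete, here is a minimal toy sketch (my own generic illustration, not Ken's actual model, using scipy): the same "law" is solved all at once as a boundary value problem, and two different future boundary conditions pick out two different interior histories, with no single forward-in-time evolution covering both cases.

# Toy illustration: one law (u'' = -u), a fixed "past" boundary u(0) = 0,
# and two different "future measurements" u(T).  Each choice of future
# boundary selects a different interior history, solved all at once.
import numpy as np
from scipy.integrate import solve_bvp

def field_eqn(t, y):
    # y[0] = field u, y[1] = du/dt
    return np.vstack([y[1], -y[0]])

def make_bc(u_future):
    def bc(ya, yb):
        # past boundary fixed at zero, future boundary varies
        return np.array([ya[0], yb[0] - u_future])
    return bc

T = 2.0
t = np.linspace(0.0, T, 50)
y_guess = np.zeros((2, t.size))

for u_T in (0.5, -0.5):
    sol = solve_bvp(field_eqn, make_bc(u_T), t, y_guess)
    print(f"future u(T) = {u_T:+.1f}  ->  interior u(T/2) = {sol.sol(T / 2)[0]:+.3f}")

    (Both interior solutions respect the same conservation law, in line with the remark about A and B above; only the imposed boundary data differs.)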

    Right now my research is taking me more towards something that roughly maps to stochastic quantum mechanics (esp. Larry Schulman's ideas, if you're familiar with his non-classical 'kicks'), where 'conserved' quantities are only conserved on average, in some statistical limit. But I see nothing fundamentally wrong with the approach in the previous paragraph, as an example of how LS might not map to an NS story.

    Ken,

    You say,

    "How would the founders of quantum theory have met these challenges if they thought the universe ran according to the mathematics of the Lagrangian

    Schema { not as a computer, but rather as a global four-dimensional problem that was solved \all at once"?"

    How about modeling with a quantum computer? Computers are only a means to build a complex scenario that mimics a hypothetical situation; that is not the same as saying that some aspect of the universe runs like a computer. Even with our understanding of gravity's properties, gravity remains a mystery that only empirical evidence seems to address, as I explore in my essay.

    Liked your essay. A great deal of substance here.

    Jim

      Hi Ken,

      We met at PI when you gave your seminar there a couple of years ago and then again at the Foundations meeting in Brisbane. I remember talking to you about the universe being a computer (or rather 'not' being one)! How have you been?

      Thanks for the well-written essay. I must say that I wholeheartedly agree with your critique of the universe being a computer and am particularly persuaded of this given the measurement problem. However, I am not sure that I understand how the Lagrangian Schema can actually solve these problems. I have two major concerns with this:

      1. You say that the Lagrangian Schema is "the cleanest formulation of general relativity, with the automatic parameter-independence that GR requires, and bypasses problematic questions such as how much initial data one needs to solve the Newtonian-style version." If I understand this correctly, then I don't think I agree at all. The question of finding valid boundary data for the EH action is the "thick sandwich" problem. It is riddled with difficulties and may not even be well defined except for solutions with a lot of symmetry. In fact, this is where the Newtonian Schema does much better. There are theorems for solving the initial value problem in general. The Einstein equations are naturally interpreted as evolution equations on Superspace (in a particular foliation), and that is how numerical relativists actually solve non-trivial problems. I'm not aware of anyone being able to do the same with a boundary value problem. (A toy sketch of this evolution-style workflow appears after point 2 below.)

      2. I don't see how the Lagrangian Schema can solve the measurement problem without appealing to intuition from the Newtonian Schema. It's true that the LSU doesn't have dynamical equations, but it still gives you a partition function on configuration space. How do you interpret this for the whole universe? Is there just one measurement that gives the whole history of the universe? And then what does it mean to "update" our knowledge if there is no sense of "before" and "after" the update?
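      Two small toy sketches to accompany the points above (my own generic illustrations, not Sean's or Ken's constructions). For point 1, the initial-value workflow that numerical relativists rely on can be caricatured by marching a simple hyperbolic equation forward in time from data on an initial slice; the Einstein equations are vastly harder, but the spirit is the same:

# Toy stand-in for an initial value (evolution) formulation: the 1D wave
# equation u_tt = u_xx stepped forward in time from data on one slice.
import numpy as np

nx, dx, dt, steps = 200, 0.01, 0.004, 500      # dt < dx keeps the scheme stable
x = np.arange(nx) * dx
u = np.exp(-((x - 1.0) / 0.1) ** 2)            # initial field profile
v = np.zeros(nx)                               # initial time derivative

for _ in range(steps):
    u_xx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    v += dt * u_xx                             # update velocity, then field
    u += dt * v

print("max field amplitude after evolution:", u.max())

      For point 2, the partition function in question is, schematically (a generic expression, not a specific proposal from either essay),

      Z = \int D\phi \, e^{iS[\phi]/\hbar},  summed over entire four-dimensional field histories \phi compatible with whatever boundary data is imposed,

      so it assigns a single weight to whole histories rather than providing a rule for updating a state from one time slice to the next; that is precisely why the usual "before"/"after" language for updating knowledge becomes awkward.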

      Like I said, I completely agree with your diagnosis of the problem. However, I think that one will need to go beyond just moving to the Lagrangian Schema to arrive at a sensible solution to the measurement problem. I hint at this in my own essay but I don't yet have a concrete proposal.

      Here is a crazy idea: maybe measurement is just not a computable process at all. Maybe the universe uses something like an Oracle to determine the outcome of a measurement process. I tried playing with a model like this a couple of years ago but got nowhere. I suppose it's probably too crazy to work...

      Cheers,

      Sean.

        Thanks, Ken, that helps greatly. You're not trying to rule out NS altogether, and NS does give rise to (3+1)D descriptions. You're saying LS is the master formalism by virtue of its unique, unambiguous approach to any boundary conditions, while NS is a slave thereto, by virtue of its disparate differential equations for different boundary conditions. Of course, this is contrary to the situation in physics today, whereby the vast majority of physicists subscribe to NSU and view LS as a mere curiosity, computational convenience, and slave to NS. By realizing that LS is the master formalism and subscribing to LSU, much (if not all) of the "mystery" of QM disappears and we open the door to a new path to unification.

        The computer and the universe

        John Archibald Wheeler

        Abstract

        The reasons are briefly recalled why (1) time cannot be a primordial category in the description of nature, but secondary, approximate and derived, and (2) the laws of physics could not have been engraved for all time upon a tablet of granite, but had to come into being by a higgledy-piggledy mechanism. It is difficult to defend the view that existence is built at bottom upon particles, fields of force or space and time. Attention is called to the "elementary quantum phenomenon" as potential building element for all that is. The task of construction of physics from such elements is compared and contrasted with the problem of constructing a computer out of "yes, no" devices.

        Preparation for publication assisted by the University of Texas Center for Theoretical Physics and by National Science Foundation Grant No. PHY78-26592.

        http://www.springerlink.com/content/ck753337h0515573/

        Hi Ken,

        Very good essay. After reading it I must strongly urge you to study Joy Christian's "Disproof of Bell's Theorem" (most of the book is also on arXiv.org as separate chapters). Joy's model is a physical model that has so far completely resisted any kind of computer simulation while fully explaining the correlations seen in EPR-Bohm type scenarios. Granted, his model is a bit difficult to understand at first, but once you understand some basics about Geometric Algebra, the model is actually fairly simple. Nature simply has a 50-50 chance of left- or right-handed orientation when the particle pairs are created. Now, I hadn't really thought about it before, but it might be nice to try to fit the model to LSU.

        Also, if you have a chance, you can check out my essay where I briefly argue that quantum mechanics is due to relativistic effects on the microscopic scale.

        Best,

        Fred

        Dear Ken,

        Although it is not directly related to your essay presented here, I have an idea that I hope may be of some interest to you, based on a realistic model in ordinary space-time. Nothing mathematically fancy: I find that the bosonic quantum field can be reconstructed from a system with vibrations in space and time. The model has some unique features that seem to be extendable to gravity and to the non-locality of quantum theory.

        Is there really no reality in quantum theory

        Best wishes for you in the contest.

        Hou Yau

          Hi Ken,

          Ha! I finally read it (though I should have done so months ago when you first sent it to me and I do apologize for that).

          Anyway, so, in general, I really like the idea (and your writing is impeccable as usual). But I did feel a bit unfulfilled at the end. For instance, it didn't seem entirely clear to me how the LSU approach solves the problem you pointed out in your first footnote: if the universe is (or at least appears to be) time-asymmetric, why are its laws time-symmetric?

          Also, is it really anthropocentric to try Method X, observe that the outcomes of Method X match reality, and then conclude that Method X must be at least partially correct as a model? I mean, the whole point is to match observation and measurement, and even in the quantum realm I think that there has to be some kind of objective reality, given the fact that the scientific method even exists in the first place (i.e. while we may all view slightly different realities, we clearly have enough in common to allow us to communicate with one another and even to put quantum mechanics to practical use).

          Nevertheless, sometimes I wonder if we are even fully capable of non-anthropocentric thinking.

          My only other complaint about the essay is related to my usual diatribe about entropy - I hate associating it with "disorder." I see absolutely nothing bizarre about a low-entropy early universe (and Max Tegmark agreed with me on this at the last FQXi meeting, albeit quietly - I was not so quiet). But that's just a personal pet peeve of mine.

          Ian

            I actually agree with George on this point (you should see some of the interesting comments on his very stimulating essay, as well as those on Julian Barbour's - conversely, my own essay seems to be about the only defense of reductionism in this entire contest!).

            Ian

            Dear Ken,

            Very fine essay; I have only a general remark/doubt: what is the reason for arguing in favour of so obvious a thesis? Let me explain: we know that even mathematics (arithmetic) is not completely axiomatizable by any sufficiently powerful set of axioms (Gödel's theorems), so some simple algorithmic questions do not have definite answers. Computers are designed such that algorithms yield definite outcomes. It would therefore be very strange, if not impossible, for the world to be a computer successfully realizing algorithms, since even mathematics shows the impossibility of this (for a suitably rich system).

            wishes,

            Jerzy

            c.f.: http://fqxi.org/community/forum/topic/1443 (What if Natural Numbers Are Not Constant?)

              Ken,

              I wonder what the Lagrangian Schema could do if a theory is inconsistent, e.g. if in general relativity the speed of light is both variable and constant. You wrote:

              "If one wants to "fit" quantum theory into the spacetime of GR, one must use the Lagrangian Schema, solving the problem "all at once"."

              I am afraid that would be a hopeless procedure:

              W. H. Newton-Smith, The rationality of science, Routledge, London, 1981, p. 229: "A theory ought to be internally consistent. The grounds for including this factor are a priori. For given a realist construal of theories, our concern is with verisimilitude, and if a theory is inconsistent it will contain every sentence of the language, as the following simple argument shows. Let 'q' be an arbitrary sentence of the language and suppose that the theory is inconsistent. This means that we can derive the sentence 'p and not-p'. From this 'p' follows. And from 'p' it follows that 'p or q' (if 'p' is true then 'p or q' will be true no matter whether 'q' is true or not). Equally, it follows from 'p and not-p' that 'not-p'. But 'not-p' together with 'p or q' entails 'q'. Thus once we admit an inconsistency into our theory we have to admit everything. And no theory of verisimilitude would be acceptable that did not give the lowest degree of verisimilitude to a theory which contained each sentence of the theory's language and its negation."
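              As an aside, the "explosion" step in that passage (from a contradiction every sentence follows) is a one-line formal fact; here is a minimal sketch in Lean 4, added purely as an illustration:

-- Ex falso quodlibet: from p and not-p, any proposition q follows.
example (p q : Prop) (h : p ∧ ¬p) : q :=
  absurd h.1 h.2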

              Pentcho Valev

                I think your essay is very interesting and makes a reasonable point. The Lloyd concept of the "quantum cosmic computer" is something which I have gone back and forth on. My sense of the issue is this: a physical system is modeled as a computer when it is convenient to think of it that way. When it is not so convenient then it is modeled by other means.

                A Lagrangian model of a physics system has the initial state and the final state specified. This carries over to the quantum path integral, where the extremization procedure is generalized to a variational calculus on many paths with quantum amplitudes. The analyst is then faced with the task of finding some dynamical process or quantum evolution which maps the initial state of the system to the final state. The analyst will then use what tools they have in their box to solve this problem.
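                Schematically, the path-integral generalization referred to above is the standard expression (quoted here for concreteness, not as anything specific to Lawrence's argument)

                K(x_f, t_f; x_i, t_i) = \int D[x(t)] \, e^{iS[x]/\hbar},  with  S[x] = \int_{t_i}^{t_f} L(x, \dot{x}) \, dt,

                where the classical extremal path is recovered as the stationary-phase contribution to the sum over paths between the specified initial and final states.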

                A quantum computer model approach with quantum gravity may work in the case of quantum black holes if the analyst knows the initial data in the formation of the black hole and detects the final outcome. The entropy of a black hole is given by the Bekenstein formula S = kA/4L_p^2. Here L_p = sqrt{Għ/c^3} is the Planck length, and L_p^2 is a unit of Planck area. A is the area of the black hole event horizon. The entropy is equal to the number of Planck units of area that comprise the event horizon, S = Nk/4. This is given by the total density matrix of the black hole, where ρ is the quantum density matrix ρ_{op} = sum_n |ψ_n><ψ_n|. A trace over the density matrix in the Khinchin-Shannon formula determines the entropy. If you throw a bunch of quantum states into a black hole, the entropy of the black hole becomes

                S = -k sum_n ρ_n log(ρ_n) + S_{BH}.

                The black hole entropy increases. Here S_{BH} is the initial entropy of the black hole, which, if the analyst does not know the previous quantum states comprising the black hole, is determined by a trace over states. You are not able to disentangle the entropy of your signal from the black hole by performing the proper partial trace necessary to do so. However, if you kept an accounting of all the states in the black hole and of the joint entropy of the states you put into the black hole, which can be negative, then you could in principle extract the information you put into the black hole. That joint entropy can be negative is a consequence of quantum entanglement, and to put a quantum bit stream into a black hole is to entangle it with the black hole.
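                As a concrete handle on this bookkeeping (a generic illustration of the Khinchin-Shannon/von Neumann entropy mentioned above, not Lawrence's specific black-hole accounting), the trace-over-eigenvalues computation looks like this:

# Von Neumann entropy S = -k Tr(rho log rho) of a density matrix rho,
# in units where Boltzmann's constant k = 1.
import numpy as np

def von_neumann_entropy(rho):
    """Entropy computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]     # drop zero eigenvalues (0 log 0 -> 0)
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2.0                     # maximally mixed qubit: log 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))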

                A detailed understanding of this requires the use of error correction codes. The most general one is the Leech lattice Λ_{24}, which is constructed from a triplet of E_8 heterotic groups. String theory has a heterotic sector with E_8 ~ SO(32). The so-called sporadic groups associated with the Leech lattice form a system of automorphisms and normalizers which define something called the Fischer-Griess (or monster) group. If one has a complete data set, this may then be used to model the black hole as a quantum communication channel or computer that processes input states into output states. The great gauge-like group is then a "machine" which one can use to model a process.

                This artificial tracing is related to the measurement issue, or the final outcome. In the case of a quantum computer there is no decision procedure for determining this, at least not at this time. As you indicate this is a weakness with the whole quantum cosmic computer conjecture. In addition, black holes have the convenience of having an asymptotic description, such as the curvature going to zero out at infinity. This tacitly means the computation process as a model can be realized by some observer, however idealized the observer is, so the process can be modeled as an algorithm.

                With the universe as a whole, this picture becomes more problematic. If we are to think of an observer as reading the output, there is no boundary region where this observer can read this output without that procedure also being a part of the "computation." The problem of establishing a Cauchy region of initial and final data is much more difficult to work out.

                Cheers LC