Dear Ken,

Your passage

"Is the universe ebectively a quantum computer?

This essay argues "no" on both counts; we have erred by assuming the universe must operate as some corporeal image of our calculations "

suggests a certain attitude toward the mathematical details of the so-called "quantum computer" and "the universe as an approximation to a quantum computer". It is easy to see that today's quantum computer is merely a mathematical construction based on the algebra of complex numbers, in which qubits, algorithms, and a complex Hilbert vector space are used to imagine this sort of software for a future supercomputer (used by NATURE as well). Some theorems of complex computational mathematics could indeed be used in a philosophy of the Universe as a whole. But, unfortunately, we cannot deduce any serious technical content from such a poetical image as "the Universe as Computer" (probably inspired by the art of the Enlightenment). I suppose there is no real cognitive problem here...?

    You wrote:

    "Which past events cause

    the future boundary constraint? How do objects in

    the universe \know" what future boundary they're

    supposed to meet? Doesn't Bell's Theorem [13] prove

    that quantum correlations can't be....."

    All your statements are based on the constancy of the speed of light c.

    Please imagine our universe this way:

                Big Bang    Present    Big Crunch

    c           10^30       10^10      10^-10
    G           10^12       10^-8      10^-28
    h           10^-28      10^-28     10^-28
    alpha       10^-3       1/137      1
    e           0.1         e          12

    What is your question about this picture?

      Thanks for the reply, Ken.

      I'm trying to understand your desiderata, so I'll go with "my hoped-for-3+1D representation of what is happening between measurements." :-)

      My questions are an attempt to understand what you mean by an LS approach that doesn't allow for an NS approach but nonetheless allows for a 3+1D representation. Specifically, I'm interested in how that might be mathematically instantiated.

      In order to avoid a time-evolved differential eqn (NS approach), the LS constraint cannot be satisfied instant by instant. Your response seems to agree with that, so no confusion there. Also, it seems to me, L cannot have any symmetries since they lead to conserved currents cast in conservation equations, i.e., NS formalism. Thus, L has no symmetries and satisfies a global constraint that cannot be satisfied instant-by-instant, yet this formalism allows for a 3+1D representation.
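      The symmetry-to-conservation step here is Noether's theorem; as a generic one-variable illustration (standard textbook material, nothing specific to any L in this thread), time-translation invariance of L already hands back exactly the kind of instant-by-instant statement being avoided:

```latex
% For L(q,\dot q) with no explicit time dependence, define the energy
% function E = \dot q \,\frac{\partial L}{\partial \dot q} - L.
% Along any solution of the Euler-Lagrange equation,
\frac{dE}{dt}
  = \dot q \left( \frac{d}{dt}\frac{\partial L}{\partial \dot q}
  - \frac{\partial L}{\partial q} \right) = 0 ,
```

      i.e., a symmetric L yields a conservation law holding at every instant, which is an NS-style constraint.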

      Maybe it would be more productive to ignore my confusion and share your specific idea(s) for how this might work. I realize you don't have a finished product, so I don't expect anything precise. You allude to an idea at the end of your essay, so perhaps you could elaborate a bit on that.

      Hi Amanda,

      Glad you liked it, and thanks for the interesting question. Before your post, I had only read summaries of 'top down cosmology', and it originally struck me as making the time-reverse of the usual mistakes when it came to interpreting the wavefunction. But after your comment, I went back and read some of the original papers. I was pleasantly surprised to find that much of Hawking's original motivation mirrors my own complaints about the NSU -- especially the first 1.5 pages of Hawking's original http://arxiv.org/abs/astro-ph/0305562 . It's effectively a critique of the NSU, and concludes that the way forward is to look to the path integral (LSU).

      But that's where things go awry, because no one has ever come up with a realistic interpretation of the path integral -- even in laboratory experiments, let alone the whole universe. The '06 paper with Hertog simply assumes that once one has the "amplitude" (not probability!) of the universe the problem is solved, ignoring the fact that the quantum foundations community can't agree on what the amplitude/wavefunction means in the first place. So the motivation is great, but there's no LSU interpretation for Hawking to tap into -- because it hasn't yet been developed, probably for the reasons I outline in my essay.

      As it stands, though, top-down cosmology seems to hold that the *past* is a huge-dimensional configuration space (like I said, the time-reverse of the usual thinking), which is pretty confounding to someone like me, committed to the block universe. In fact, if you look closely, the argument goes like this: 1) We don't know the past, 2) We represent things we don't know in huge-dimensional configuration spaces, so 3) The past *is* a huge-dimensional configuration space. In other words, the time-reverse of the same anthropocentric (not anthropic) reasoning that I'm complaining about here.

      Still, thank you very much for drawing this connection; I'll definitely find it useful if I ever nail down this realistic re-interpretation of a (modified) path integral that I'm working on.

      Best,

      Ken

      Hi Tom,

      Thanks for the very nice comments! I'm way behind on reading essays, but I've added yours to the list... That's interesting about the connection you see with Joy Christian's work; I haven't yet put in enough effort to wrap my head around it, perhaps because I'm stuck in too much of a classical-spacetime mindset.

      Cheers!

      Ken

      Michael,

      Actually, I'm being much more heretical than you suggest. I'm not saying that we can't physically reproduce nature's computations, I'm saying that nature doesn't even *utilize* computations, at least not in the standard past-to-future sense of the word. I'm denying any map between reality and Hilbert space, as well as the premise that you're assuming when you say "used by NATURE as well". (I'm using nature=universe here, although I use only 'universe' in the essay.)

      Hope that makes it sound a bit more interesting... :-) And yes, I do propose an alternative path forward; I'm not just pointing out unresolvable problems.

      Best,

      Ken

      Dear Ken,

      Very nice essay. I completely agree with your claim: the universe is not a (quantum) computer. Determinism has influenced physics for a long time. But non-local phenomena arise (like EPR), so we have to go deeper and find models that include indeterminism. Hidden variables are no real path, but non-computability is a way. What I mean are non-algorithmic real numbers: by definition there is no algorithm (or law) to calculate them, but they exist. So if our future is non-computable, then we keep the concept of free will, etc. But as a consequence we need the continuum (probably you do not like this point). That is one reason why I keep the concept of a smooth manifold.

      If you like, have a look into my essay.

      Best

      Torsten

        Dear Prof. Wharton,

        Thanks for your reply, as it clarifies for me what an LSU represents for you within a broader context. Your last remark, that it requires one to "relax the Principle of Sufficient Reason to the point where our universe is just one of many possible solutions to the same ultimate constraints", leaves me somewhat confused. That is, are we talking about LSUs being differentiated only by their initial primal conditions, i.e. by what they are initially composed of, or rather by what order initially emerges from what would otherwise be the same primal conditions? In this respect I like the thought of Leibniz that, when such things are considered, a difference needs to be found between what is certain and what is necessary. I have always taken him to mean that one needs to look at the difference between what might exist and what is able to be realized as existing, as only the latter can be found to be a reality.

        Once again I thank you for your excellent essay, as I do think the LSU conceptualization is something worthwhile to have explored in pursuit of answers regarding the hows of the physical world. I also find that such a conceptualization may form a basis from which the whys of it may be addressed as well, for those so inclined to wonder about them.

        "We have said that the concept of an individual substance [Leibniz also uses the term haecceity ] includes once for all everything which can ever happen to it and that in considering this concept one will be able to see everything which can truly be said concerning the individual, just as we are able to see in the nature of a circle all the properties which can be derived from it. But does it not seem that in this way the difference between contingent and necessary truths will be destroyed, that there will be no place for human liberty, and that an absolute fatality will rule as well over all our actions as over all the rest of the events of the world? To this I reply that a distinction must be made between that which is certain and that which is necessary."

        -Gottfried Wilhelm Leibniz, "Discourse on Metaphysics" (1686)

        Kind Regards,

        Phil

        >In order to avoid a time-evolved differential eqn (NS approach), the LS constraint cannot be satisfied instant by instant.

        Ah, here's the point of confusion. In any given instance, between any two particular measurements, I see nothing wrong with the intermediate fields obeying *some* continuous constraint. After all, I expect the fields to be smooth/continuous themselves.

        Here's an example (not actually what I'm going for, but useful for this discussion): Suppose a photon/EM-field in a double-slit experiment is a continuous field, with LS rules constrained by both the past preparation and the future measurement. This 4D story can be represented as a 3+1D description, but different future measurements could easily have different past behavior. So in 3+1D the field might obey differential equation A if future boundary/measurement A' is imposed (corresponding to an interference measurement), but would instead obey differential equation B if future boundary/measurement B' is imposed (corresponding to a which-slit measurement). So there's no master differential equation and NS fails as an "explanation". And yet, both A and B might conserve the same quantities, so one still gets the behavior that you're worried that LS rules out. Does that sound reasonable?

        Right now my research is taking me more towards something that roughly maps to stochastic quantum mechanics (esp. Larry Schulman's ideas, if you're familiar with his non-classical 'kicks'), where 'conserved' quantities are only conserved on average, in some statistical limit. But I see nothing fundamentally wrong with the approach in the previous paragraph, as an example of how LS might not map to an NS story.
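        The two-boundary picture in the example above can be made concrete in a toy linear model (purely illustrative numerics, not the actual proposal under discussion): fix the field at both a "past" and a "future" boundary and solve the whole history at once, so that different future measurements yield different intermediate histories.

```python
# Toy contrast between the Newtonian Schema (march forward from initial
# data) and the Lagrangian Schema (impose past AND future boundary
# values, then solve the whole history "all at once").
# The oscillator u'' = -u and the grid are illustrative choices.
import numpy as np

N = 101                           # grid points
t = np.linspace(0.0, np.pi / 2, N)
h = t[1] - t[0]

def solve_globally(u0, uT):
    """Discretize u'' = -u and solve it as one global linear system,
    with u(0) and u(T) fixed as the past/future boundary conditions."""
    A = np.zeros((N, N))
    b = np.zeros(N)
    A[0, 0] = 1.0; b[0] = u0      # past boundary
    A[-1, -1] = 1.0; b[-1] = uT   # future boundary
    for i in range(1, N - 1):     # interior: u_{i-1} + (h^2 - 2) u_i + u_{i+1} = 0
        A[i, i - 1] = 1.0
        A[i, i] = h * h - 2.0
        A[i, i + 1] = 1.0
    return np.linalg.solve(A, b)

# Two different "future measurements" give two different histories,
# even though the local rule (u'' = -u) is the same everywhere.
hist_A = solve_globally(0.0, 1.0)
hist_B = solve_globally(0.0, -1.0)
print(hist_A[N // 2], hist_B[N // 2])  # interior midpoints differ
```

        Here hist_A and hist_B obey the same local rule at every interior point; only the imposed future boundary differs, yet the two 3+1D histories differ at every intermediate time.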

        Ken,

        You say,

        "How would the founders of quantum theory have met these challenges if they thought the universe ran according to the mathematics of the Lagrangian

        Schema { not as a computer, but rather as a global four-dimensional problem that was solved \all at once"?"

        How about modeling with a quantum computer? Computers are only a means to build a complex scenario that mimics a hypothetical situation, so that is not the same as saying some aspect of the universe runs like a computer. Even with our understanding of gravity's properties, it remains a mystery that only empirical evidence seems to address, as I explore in my essay.

        Liked your essay. A great deal of substance here.

        Jim

          Hi Ken,

          We met at PI when you gave your seminar there a couple of years ago and then again at the Foundations meeting in Brisbane. I remember talking to you about the universe being a computer (or rather 'not' being one)! How have you been?

          Thanks for the well-written essay. I must say that I wholeheartedly agree with your critique of the universe being a computer and am particularly persuaded of this given the measurement problem. However, I am not sure that I understand how the Lagrangian Schema can actually solve these problems. I have two major concerns with this:

          1. You say that the Lagrangian Schema is "the cleanest formulation of general relativity, with the automatic parameter-independence that GR requires, and bypasses problematic questions such as how much initial data one needs to solve the Newtonian-style version." If I understand this correctly then I don't think I agree at all. The question of finding valid boundary data for the EH action is the "thick sandwich" problem. It is riddled with difficulties and may not even be well defined except for solutions with a lot of symmetry. In fact, this is where the Newtonian Schema does much better. There are theorems for solving the initial value problem in general. The Einstein equations are naturally interpreted as evolution equations on Superspace (in a particular foliation) and that is how numerical relativists actually solve non-trivial problems. I'm not aware of anyone being able to do the same with a boundary value problem.

          2. I don't see how the Lagrangian Schema can solve the measurement problem without appealing to intuition from the Newtonian Schema. It's true that the LSU doesn't have dynamical equations but it still gives you a partition function on configuration space. How do you interpret this for the whole universe? Is there just one measurement that gives the whole history of the universe, but then what does it mean to "update" our knowledge if there is no sense of "before" and "after" our knowledge is updated?
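          The "thick sandwich" worry in point 1 shows up already in the simplest toy model (the oscillator below is only a stand-in chosen for illustration, nothing from the Einstein equations): the initial-value problem for u'' = -u is always uniquely solvable, but the two-point boundary problem degenerates on a resonant interval, where some boundary data admit no solution and others admit infinitely many.

```python
# Toy illustration of an ill-posed boundary-value problem whose
# initial-value cousin is perfectly well-posed. For u'' = -u on [0, pi],
# the data u(0)=0, u(pi)=1 admit NO solution, while u(0)=u(pi)=0 admit
# infinitely many (c*sin t), so the discrete BVP matrix is near-singular.
import numpy as np

N = 101
t = np.linspace(0.0, np.pi, N)   # pi is the resonant interval length
h = t[1] - t[0]

A = np.zeros((N, N))
A[0, 0] = A[-1, -1] = 1.0        # boundary rows fix u(0) and u(pi)
for i in range(1, N - 1):        # interior: u_{i-1} + (h^2 - 2) u_i + u_{i+1} = 0
    A[i, i - 1] = A[i, i + 1] = 1.0
    A[i, i] = h * h - 2.0

# sin(t) vanishes at both boundaries and solves the interior equation,
# so it is a near-null vector: boundary data cannot pin the history down.
print(np.linalg.norm(A @ np.sin(t)))   # tiny residual
print(np.linalg.cond(A))               # large condition number
```

          On a non-resonant interval the same matrix is well-conditioned, which is exactly why the thick-sandwich problem is delicate: well-posedness depends on the boundary configuration, whereas the IVP theorems hold uniformly.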

          Like I said, I completely agree with your diagnosis of the problem. However, I think that one will need to go beyond just moving to the Lagrangian Schema to arrive at a sensible solution to the measurement problem. I hint at this in my own essay but I don't yet have a concrete proposal.

          Here is a crazy idea: maybe measurement is just not a computable process at all. Maybe the universe uses something like an Oracle to determine the outcome of a measurement process. I tried playing with a model like this a couple of years ago but got nowhere. I suppose it's probably too crazy to work...

          Cheers,

          Sean.

            Thanks, Ken, that helps greatly. You're not trying to rule out NS altogether, and NS does give rise to (3+1)D descriptions. You're saying LS is the master formalism by virtue of its unique, unambiguous approach to any boundary conditions, while NS is a slave thereto, by virtue of its disparate differential equations for different boundary conditions. Of course, this is contrary to the situation in physics today, whereby the vast majority of physicists subscribe to NSU and view LS as a mere curiosity, a computational convenience, and a slave to NS. By realizing LS is the master formalism and subscribing to LSU, much (if not all) of the "mystery" of QM disappears and we open the door to a new path to unification.

            The computer and the universe

            John Archibald Wheeler

            Abstract

            The reasons are briefly recalled why (1) time cannot be a primordial category in the description of nature, but secondary, approximate and derived, and (2) the laws of physics could not have been engraved for all time upon a tablet of granite, but had to come into being by a higgledy-piggledy mechanism. It is difficult to defend the view that existence is built at bottom upon particles, fields of force or space and time. Attention is called to the "elementary quantum phenomenon" as potential building element for all that is. The task of construction of physics from such elements is compared and contrasted with the problem of constructing a computer out of "yes, no" devices.

            Preparation for publication assisted by the University of Texas Center for Theoretical Physics and by National Science Foundation Grant No. PHY78-26592.

            http://www.springerlink.com/content/ck753337h0515573/

            Hi Ken,

            Very good essay, and after reading it I must strongly urge you to study Joy Christian's "Disproof of Bell's Theorem" (most of the book is also on arXiv.org as separate chapters). Joy's model is a physical model that has so far completely resisted any kind of computer simulation while completely explaining the correlations seen in EPR-Bohm type scenarios. Granted, his model is a bit difficult to understand at first, but once you understand some basics of Geometric Algebra, the model is actually fairly simple: Nature simply has a 50-50 chance of left- or right-handed orientation when the particle pairs are created. Now, I hadn't really thought about it before, but it might be nice to try to fit the model to LSU.

            Also, if you have a chance, you can check out my essay where I briefly argue that quantum mechanics is due to relativistic effects on the microscopic scale.

            Best,

            Fred

            Dear Ken,

            Although not directly related to your essay presented here, I have an idea that I hope may be of some interest to you, based on a realistic model in ordinary space-time. Nothing mathematically fancy: I find that the bosonic quantum field can be recovered from a system with vibrations in space and time. The model has some unique features that seem to be extendable to gravity and to the non-locality of quantum theory.

            Is there really no reality in quantum theory?

            Best wishes for you in the contest.

            Hou Yau