The computer and the universe

John Archibald Wheeler

Abstract

The reasons are briefly recalled why (1) time cannot be a primordial category in the description of nature, but secondary, approximate and derived, and (2) the laws of physics could not have been engraved for all time upon a tablet of granite, but had to come into being by a higgledy-piggledy mechanism. It is difficult to defend the view that existence is built at bottom upon particles, fields of force or space and time. Attention is called to the "elementary quantum phenomenon" as potential building element for all that is. The task of construction of physics from such elements is compared and contrasted with the problem of constructing a computer out of "yes, no" devices.

Preparation for publication assisted by the University of Texas Center for Theoretical Physics and by National Science Foundation Grant No. PHY78-26592.

http://www.springerlink.com/content/ck753337h0515573/

Hi Ken,

Very good essay, and after reading it I must strongly urge you to study Joy Christian's "Disproof of Bell's Theorem" (most of the book is also on arXiv.org as separate chapters). Joy's model is a physical model that has so far completely resisted any kind of computer simulation while fully explaining the correlations seen in EPR-Bohm type scenarios. Granted, his model is a bit difficult to understand at first, but once you understand some basics of Geometric Algebra, the model is actually fairly simple. Nature simply has a 50-50 chance of left- or right-handed orientation when the particle pairs are created. Now, I hadn't really thought about it before, but it might be nice to try to fit the model to LSU.

Also, if you have a chance, you can check out my essay where I briefly argue that quantum mechanics is due to relativistic effects on the microscopic scale.

Best,

Fred

Dear Ken,

Although not directly related to your essay presented here, I have an idea, based on a realistic model in ordinary space-time, that I hope may be of some interest to you. Nothing mathematically fancy: I find that the bosonic quantum field can be reconstructed from a system with vibrations in space and time. The model has some unique features that seem to be extendable to gravity and to the non-locality of quantum theory.

Is there really no reality in quantum theory?

Best wishes for you in the contest.

Hou Yau

    Hi Ken,

    Ha! I finally read it (though I should have done so months ago when you first sent it to me and I do apologize for that).

    Anyway, so, in general, I really like the idea (and your writing is impeccable as usual). But I did feel a bit unfulfilled at the end. For instance, it didn't seem entirely clear to me how the LSU approach solves the problem you pointed out in your first footnote: if the universe is (or at least appears to be) time-asymmetric, why are its laws time-symmetric?

    Also, is it really anthropocentric to try Method X, observe the outcomes of Method X and see that they match reality, and then thus assume Method X must be at least partially correct as a model? I mean, the whole point is to match observation and measurement and even in the quantum realm I think that there has to be some kind of objective reality given the fact that the scientific method even exists in the first place (i.e. while we may all view slightly different realities, we clearly have enough in common to allow us to communicate with one another and even to put quantum mechanics to practical use).

    Nevertheless, sometimes I wonder if we are even fully capable of non-anthropocentric thinking.

    My only other complaint about the essay is related to my usual diatribe about entropy - I hate associating it with "disorder." I see absolutely nothing bizarre about a low-entropy early universe (and Max Tegmark agreed with me on this at the last FQXi meeting, albeit quietly - I was not so quiet). But that's just a personal pet peeve of mine.

    Ian

      I actually agree with George on this point (you should see some of the interesting comments on his very stimulating essay, as well as those on Julian Barbour's - conversely, my own essay seems to be about the only defense of reductionism in this entire contest!).

      Ian

      Dear Ken,

      very fine essay; I have only one general remark/doubt: what is the reason for arguing in favour of so obvious a thesis? Let me explain: we know that even mathematics (arithmetic) cannot be completely axiomatized by any sufficiently powerful set of axioms (Gödel's theorems), so some simple algorithms do not have definite answers. Computers are designed so that algorithms yield definite outcomes. It would be very strange, if not impossible, for the world to be a computer successfully executing algorithms, since even mathematics shows the impossibility of this (for a suitably rich system).

      wishes,

      Jerzy

      c.f.: http://fqxi.org/community/forum/topic/1443 (What if Natural Numbers Are Not Constant?)

        Ken,

        I wonder what the Lagrangian Schema could do if a theory is inconsistent, e.g. if in general relativity the speed of light is both variable and constant. You wrote:

        "If one wants to "fit" quantum theory into the spacetime of GR, one must use the Lagrangian Schema, solving the problem "all at once"."

        I am afraid that would be a hopeless procedure:

        W. H. Newton-Smith, The rationality of science, Routledge, London, 1981, p. 229: "A theory ought to be internally consistent. The grounds for including this factor are a priori. For given a realist construal of theories, our concern is with verisimilitude, and if a theory is inconsistent it will contain every sentence of the language, as the following simple argument shows. Let 'q' be an arbitrary sentence of the language and suppose that the theory is inconsistent. This means that we can derive the sentence 'p and not-p'. From this 'p' follows. And from 'p' it follows that 'p or q' (if 'p' is true then 'p or q' will be true no matter whether 'q' is true or not). Equally, it follows from 'p and not-p' that 'not-p'. But 'not-p' together with 'p or q' entails 'q'. Thus once we admit an inconsistency into our theory we have to admit everything. And no theory of verisimilitude would be acceptable that did not give the lowest degree of verisimilitude to a theory which contained each sentence of the theory's language and its negation."
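        Newton-Smith's derivation is the classical principle of explosion (ex falso quodlibet), and the chain of steps he spells out can be checked mechanically in a proof assistant. As one possible formalization (a sketch, in Lean 4; the argument itself is standard):

```lean
-- From a contradiction p ∧ ¬p, any proposition q follows
-- (ex falso quodlibet), compressed into one step.
example (p q : Prop) (h : p ∧ ¬p) : q :=
  absurd h.1 h.2

-- The same conclusion, following the quoted argument step by step.
example (p q : Prop) (h : p ∧ ¬p) : q := by
  have hp : p := h.1             -- from p ∧ ¬p, p follows
  have hpq : p ∨ q := Or.inl hp  -- from p, p ∨ q follows
  have hnp : ¬p := h.2           -- from p ∧ ¬p, ¬p follows
  exact hpq.resolve_left hnp     -- ¬p together with p ∨ q entails q
```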

        Pentcho Valev

          I think your essay is very interesting and makes a reasonable point. The Lloyd concept of the "quantum cosmic computer" is something which I have gone back and forth on. My sense of the issue is this: a physical system is modeled as a computer when it is convenient to think of it that way. When it is not so convenient then it is modeled by other means.

          A Lagrangian model of a physics system has the initial state and the final state specified. This carries over to the quantum path integral, where the extremization procedure is generalized to a variational calculus on many paths with quantum amplitudes. The analyst is then faced with the task of finding some dynamical process or quantum evolution which maps the initial state of the system to the final state. The analyst will then use what tools they have in their box to solve this problem.

          A quantum computer model approach with quantum gravity may work in the case of quantum black holes if the analyst knows the initial data in the formation of the black hole and detects the final outcome. The entropy of a black hole is given by the Bekenstein formula S = kA/4L_p^2. Here L_p = sqrt{Għ/c^3} is the Planck length, and L_p^2 is a unit of Planck area. A is the area of the black hole event horizon. The entropy is equal to the number of Planck units of area that comprise the event horizon, S = Nk/4. This is encoded in the total density matrix of the black hole, ρ = sum_n p_n |ψ_n)(ψ_n|, where the p_n are the probabilities of the component states. A trace over the density matrix in the Khinchin-Shannon formula determines the entropy. If you throw a bunch of quantum states into a black hole, the entropy of the black hole is

          S = -k sum_n p_n log(p_n) + S_{BH}

          The black hole entropy increases. S_{BH} is the initial entropy of the black hole, which, if the analyst does not know the previous quantum states comprising the black hole, is determined by a trace over states. You are then not able to disentangle the entropy of your signal from the black hole by performing the proper partial trace. However, if you kept an accounting of all states in the black hole and of the joint entropy of the states you put into the black hole, which is negative, then you could in principle extract the information you put into the black hole. That joint entropy can be negative is a consequence of quantum entanglement, and putting a quantum bit stream into a black hole entangles it with the black hole.
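          The entropy bookkeeping here (a trace over the density matrix in the Khinchin-Shannon/von Neumann formula) can be sketched numerically. The following toy example is not specific to black holes; it just computes S = -Tr(ρ log ρ) from the eigenvalues of a density matrix and checks that a pure state has zero entropy while an equal mixture has entropy log 2:

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S = -k Tr(rho log rho), computed from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # drop (near-)zero eigenvalues, since 0*log(0) = 0
    return float(-k * np.sum(p * np.log(p)))

# Pure state |0><0|: a single unit eigenvalue, so S = 0.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Equal mixture of |0> and |1>: eigenvalues (1/2, 1/2), so S = log 2.
mixed = np.array([[0.5, 0.0],
                  [0.0, 0.5]])

print(von_neumann_entropy(pure))   # → 0.0
print(von_neumann_entropy(mixed))  # → 0.693... (log 2)
```

          (Here k is set to 1; restoring Boltzmann's constant recovers the S = -k sum_n p_n log p_n form.)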

          A detailed understanding of this requires the use of error correction codes. The most general one is based on the Leech lattice Λ_{24}, which is constructed from a triplet of E_8 lattices. String theory has heterotic sectors with gauge groups E_8 × E_8 and SO(32). The so-called sporadic groups, such as the automorphism group of the Leech lattice, form a system of automorphisms and normalizers which define the Fischer-Griess (or monster) group. If one has a complete data set, this may then be used to model the black hole as a quantum communication channel or computer that processes input states into output states. This great gauge-like group is then a "machine" which one can use to model a process.

          This artificial tracing is related to the measurement issue, or the final outcome. In the case of a quantum computer there is no decision procedure for determining this, at least not at this time. As you indicate this is a weakness with the whole quantum cosmic computer conjecture. In addition, black holes have the convenience of having an asymptotic description, such as the curvature going to zero out at infinity. This tacitly means the computation process as a model can be realized by some observer, however idealized the observer is, so the process can be modeled as an algorithm.

          With the universe in total this picture becomes more problematic. If we are to think of an observer as reading the output, there is no boundary region where this observer can read this output without that procedure also being a part of the "computation." The problem of establishing a Cauchy region of initial and final data is much more difficult to work out.

          Cheers LC

            Torsten,

            Thank you for the comments... I actually quite like the continuum, although that probably didn't come across in this contest (see the last contest for more details). The question of whether EPR/Bell-inequalities can be explained by hidden variables or not actually depends on whether one is in an NSU or LSU framework. It works in the latter but not the former. Unfortunately, most analysis simply assumes NSU, which has biased many people against hidden variables, but that option opens up again with LSU.

            Best,

            Ken

            James,

            Thanks! I'm not sure I understand your point, but even if one had a working quantum computer, one couldn't "model" quantum phenomena with it; one could merely "reproduce" quantum phenomena. And, unfortunately, such a computer would give no more information about what was happening between quantum measurements than the bare phenomena themselves, because one can't make an "extra" intermediate measurement in the quantum computer without destroying the match to the actual phenomena in the first place. (Weak measurements aside for now...)

            Really, a quantum computer would *be* a quantum phenomenon, in and of itself. It would not be a useful "model", in that it would not add to our understanding... any more than pointing out that a star is an excellent model of that same star would add to our understanding.

            Please let me know if I'm totally off-target with your intended comment!

            Cheers,

            Ken

            Hi Sean! Yes, of course I remember you, and thanks for your very thoughtful comments.

            Your GR points are all perfectly valid, and you're right that the LSU-style "thick sandwich problem" is unsolved (specifically, the question of whether there is one unique classical solution for a given closed-hypersurface boundary metric, and how to find it). But the "problematic questions" that I had in mind were issues involving the intersection of GR with QM... namely, the question of whether one can even impose all of the needed boundary data without violating the HUP. (If not even the Big Bang can beat the HUP, having a formal solution of Newtonian-GR for a HUP-violating initial boundary seems sort of useless to me, even leaving aside whether "lapse functions" are natural to include in an initial boundary to begin with.)

            Even though the thick-sandwich problem is clearly in the LSU camp, the way it's normally phrased makes it clear that it was developed in an NSU-mindset. After all, who says that there has to be only one unique solution? (Esp. given what we know about QM.) I'd be far more interested in a result that showed *many* solutions for a given boundary, of which our actual universe would be merely one possibility. (And yet, if this result were discovered, I think many physicists would view this as a failure.) At the end of the day, though, it's not the job of LSU approaches to recover classical results, or even NSU-like results. Hopefully it will lead to *new* insights that actually look like our GR+QM universe.

            As for the measurement problem, the clearest discussion I've written so far is my entry in the previous FQXi contest, but even reading that you'll still probably have many of the same questions. A stronger argument will require a fully-fleshed out toy model, one that I'm still plugging away on, but for now I'll leave you with the following thoughts:

            If I'm right that configuration spaces live in our heads, not in reality, then we need to delve deeper into the foundations of the path integral to find a realistic LSU story. The good news is that the action itself lives in spacetime. Sinha and Sorkin (1991) pointed out that by doubling the particle path integral (folding the paths back on themselves in time), you no longer need to square the integral to get out probabilities -- and at this point it almost looks like the configuration spaces used in stat mech (which I assume you'll agree would be perfectly acceptable for a realistic theory). But those particle paths still need negative probabilities to get interference. However, fields can interfere in spacetime, so extending these ideas to fields can arguably solve this problem (although this will require further alterations of the path integral, with some master restriction on field configurations to make it mathematically well-defined).

            The last piece of the puzzle -- how to objectively define an external measurement in the first place -- is addressed in the previous contest entry. The key is that a measurement on a subsystem is not merely the future boundary constraint itself, but also the chain of correlations that link it to the cosmological boundary. Thus, in a quantum eraser experiment, the future chain is broken, and all observers eventually concur that no "measurement" was ever made in the first place. Note this only works in an LSU; in a realistic NSU theory, one needs to know "right away" whether a given interaction is a measurement or not.

            For your final idea, about some "Oracle" choosing amongst possibilities, I'm entirely on board with that notion and am growing more confident about it all the time. But it only makes sense for the choice to happen *once*; some global choice that picks one reality out of all possible universes (given all the boundary data, and some master constraint on the total Lagrangian density). Maybe it's now clearer why I'd prefer lots of solutions to the thick-sandwich problem. From our perspective, individual quantum experiments all might seem to have a different random choice, but really it's all one big choice that manifests itself as computable patterns between similar experiments (computable from the size of the different solution spaces.) Getting the probabilities right will be the ultimate test of all this, but if you read my conclusion again with this in mind, you might see what I'm going for.

            I'm looking forward to reading your own essay... And I hope we cross paths again soon!

            Cheers,

            Ken

            Dear Ken,

            I think there is a conflation between computational models and mathematical representations. It is true that mathematical representations almost always turn out to be computable models, especially when used numerically to approximate the solution to a real-world problem. But a mathematical model doesn't always come with its implementation (in fact it rarely does). That is, one has to find a computer implementation for a mathematical model. There is no one-to-one correspondence between models and algorithms. For example, many Turing machines can compute the same computable function, but each can do so in a completely different way (e.g. a function that can be computed in linear time can also be computed in exponential time).
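            The many-algorithms-one-function point is easy to make concrete. As a generic illustration (nothing here is specific to the essay under discussion), the two Python procedures below compute the same function -- the Fibonacci numbers -- one in exponential time and one in linear time:

```python
def fib_exponential(n):
    """Naive recursion: an exponential-time procedure."""
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    """Simple iteration: a linear-time procedure."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Different algorithms, different costs, same computable function.
assert all(fib_exponential(n) == fib_linear(n) for n in range(20))
print(fib_linear(20))  # → 6765
```

            Knowing that nature computes a function, in other words, fixes nothing about which of these "machines" nature is.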

            While I may agree that the working assumption of science is that nature is mathematical, I think it is far from obvious that science assumes that a particular implementation of a mathematical model is the specific way nature operates. A related and common conflation is thinking that saying a natural process is Turing computable means that it is computed exactly by something like a Turing machine (nobody thinks the brain is a Turing machine, but most science works under the assumption that the brain is Turing computable, which is completely different). In your essay you point out Seth Lloyd's claim that the universe is a computer; this illustrates my point, because in fact Seth Lloyd does not think that the universe is a Turing machine, but rather a quantum computer, which in the light of your own arguments I find difficult to reject.

            On the other hand, I don't think that physicists will ever be able to close all loopholes in quantum theory at the same time, and it hasn't yet been done. While your ideas are provocative I still find the computable ground less subject to particular interpretations.

              Dear Prof. Wharton,

              as a physicist, you argue that the universe is not a computer. As a professional in the field of computers, it is hard for me to agree with you. To me the universe appears as an enormous computer, and I find parallels between the two everywhere, from the structure of space, with the visible universe confined to a 3-dimensional display akin to a 3D touch screen, to the origins of life itself. The most striking similarities I find are in our creation myths and in the ways complex systems are put together in practice. But my essay is not about that. In my essay I infringe into your territory, just as you infringe into mine, and argue that the organization of the space housing the universe dictates the laws of physics.

              My analysis of the current state of physics zeroed in on the paradox of space, which ~100 years ago was substituted with the wave-particle duality. The incongruous notion prevailing today, that waves can propagate in emptiness without a supporting medium, is called a workaround in my field, where such compromises are common and constitute a norm rather than an exception. The difference between you physicists and us programmers is that the programmers fully appreciate that, in the long run, such workarounds come with a heavy price and must be addressed sooner or later. Better sooner than later.

              In contrast, you physicists seem a headstrong bunch. Having decided, long before computers, when the ability to compute was deemed the height of human ability, that the understanding and visualization of the underlying reality could be discarded as long as the mathematics appeared adequate, you as a group still stubbornly stick to it. Apparently you do not realize that you have put yourselves in danger of being replaced by the very calculators you still strive to emulate. The current motto in physics, "shut up and calculate," has put you on par with computing machines, as if you have forgotten that whatever a human can calculate, a machine can do far better. As a professional in my field, I fully appreciate the fact that the main difference between calculators and humans is that humans understand how the calculations relate to the physical reality and have a vision of how their own experience fits into it.

              And so I find it ironic that the modern day physicists appear unaware that their phenomenal ability to compute could in fact be the vestiges of our very origins as well as the origins of our universe. Arguing that the universe is not a computer you physicists don't seem to appreciate the fact that mathematics divorced from the understanding of the underlying reality, and the vision that comes with it, is what has turned you into the glorified calculators and thus put forth the question of your own utility to the rest of the humanity.

                Dear M. V. Vasilyeva,

                Perhaps if you were a little older, and had experience with analog computers, you might not jump to such conclusions.

                You say, "In contrast, you physicists seem a headstrong bunch."

                But when your only tool is a hammer, everything looks like a nail. When one is mostly familiar with computer concepts, they seem to apply everywhere. Believe me, there's much more to it than you seem to see.

                "And so I find it ironic that the modern day physicists appear unaware that their phenomenal ability to compute could in fact be the vestiges of our very origins as well as the origins of our universe. Arguing that the universe is not a computer you physicists don't seem to appreciate the fact that mathematics divorced from the understanding of the underlying reality, and the vision that comes with it, is what has turned you into the glorified calculators and thus put forth the question of your own utility to the rest of the humanity."

                Since I have been far more professionally successful in the field of computer design (see here) than as a physicist, and have published university texts on computer design and a dissertation on an 'automata theory' of physics, I do know about computers. All I can say is that, no matter how much everything looks like a nail to you, it's a little deeper than that.

                But I still enjoyed your essay.

                Edwin Eugene Klingman

                Dear Ken Wharton,

                A well written, accessible essay. I found it very interesting. I am not sure of the prevalence of the basic assumption that the universe is a computer. However, having chosen that assumption to talk about, you do a very good job of clearly communicating your viewpoint.

                I do like that your essay is forward thinking and suggesting a potentially useful direction for future research, rather than just pointing out the problems. Good luck in the competition. Regards Georgina.

                  Ken

                  Not 'start' with logic, but found the 'structure' emerging to be precisely that of Truth Propositional Logic (infinite hierarchical 'nested' compound propositions), and the kinetics analogous to dynamic logic's (PDL) 'Interleaved modes'.

                  I also started from a more Lagrangian than Newtonian view, but consider real data more than theory and don't habitually 'compartment' findings or assume interpretations. Issues with Bell are well covered elsewhere and in the references (lack of space), but I'll follow up yours.

                  All I'm after is falsification. To me it's entirely self-evident, and all attempts to falsify it have only resolved other anomalies. But because it's unfamiliar, all just seem to do what you have: write it off and ignore it as 'different'. So half of physics is looking for a better solution, but all of physics completely ignores it when one arises (and has done for 4 years). How do we overcome that? Any ideas?

                  Best wishes. I think you are correct, but it seems there may be no point in being so.

                  Peter

                  Hi Ken,

                  Thanks for your detailed response! If you have had a chance to read my essay, you'll probably find that we have very different intuitions for how to address some of these problems (in our approach, we treat configuration space as completely fundamental!). Nevertheless, I am very interested in hearing more about your ideas. I think I can see where you are going and would be interested to see a toy model worked out.

                  In regards to the "thick sandwich" problem, I can see now why the classical result doesn't bother you. However, our universe seems pretty classical now. It takes some real mental gymnastics to try to think in the way you are suggesting!

                  Where our intuitions do seem to overlap is with the Oracle issue. Even in the setting described in my essay, I think this is potentially the only way to really understand measurement although I didn't really say this in the text. I would be happy to discuss ways to make this idea more concrete. I've always found it very intriguing.

                  Hope to see you some time again!

                  Cheers,

                  Sean.

                  Dear Ken Wharton,

                  Do not belittle what you called "typical engineering-physics". I agree that your LSU exactly corresponds to the monist view of Einstein, Hilbert, and the present mainstream. You are not questioning the fundamentals; you are trying to defend and extend the philosophy on which spacetime and the time symmetries arose. I see my essay as a challenge to you because it clearly distinguishes between past and future.

                  You also seem to be not very precise when you write in the abstract "predict the future from what we know about the present" but then "predict the future from the past". This would mean the past is what we know about the present. Noticing your obvious (in words like "we know" and "predict") anthropocentric point of view, I prefer a notion of objective reality that I described in my essay.

                  You denied being a superdeterminist and declared that you agree with Huw Price, who is unknown to me, on the issue. Could you please explain his position and yours?

                  Let me out myself as a fan of Karl Popper: I see it as justified to assume potentially infinite influences, no matter whether the world is actually open in the sense of being potentially infinite, or there is objectively no chance of having a complete and trustworthy mathematical description of it.

                  Do not mistake this as support for Roger Schlafly's almost nihilistic attitude, which is certainly also welcome at FQXi. I am an optimist who hopes for revelations of foundational mistakes. Maybe my Figure 5 can be refuted. So far it seems to refute a basic assumption that led to Einstein's work.

                  Sincerely,

                  Eckard Blumschein