The big bang has a lot of empirical support. I don't particularly want to get into arguing the points for inflationary multiverse theory. It is likely, though, that just as solar systems operate not by some geometric order of planetary orbits but by a more fundamental set of principles, so too do the gauge field constructions in a vacuum nucleation or pocket universe. In cosmology from the medieval period through the Renaissance it was thought the solar system was arranged by a set of geometrically ordered "orbs"; Kepler worked on something like this. It appears likely that the universe is far grander, and that what we observe as the spacetime universe is just one bubble out of a vast number of such on an inflationary spacetime that is often called the multiverse.

Cheers LC

The physical world and universe are rather contrary to our common sense. For instance, you say that every particle is unique, but it is well known from the Pauli exclusion principle that this is not the case: two electrons in quantum entanglement are not distinguishable. The advancement of our understanding of the physical world is not going to conform ever closer to our common sense; it will challenge it.

LC

  • [deleted]

Lawrence,

I'm not really trying to start the cosmological argument, so much as trying to describe how various dichotomies (order/randomness, linear/non-linear, vector/scalar, node/network, organism/ecosystem) are aspects of a fundamental underlying process.

It really isn't so much an argument about cosmology as a deeper point: our logical concepts are generally based on one side of this relationship, that of the linear, ordered, singular organism, since it is the basis of our narrative, cause-and-effect descriptions of reality. The resulting atomized view affects many aspects of our understanding of, and relationship to, nature, from monotheism to the Big Bang.

It's not that the math is wrong, since it is distilled pattern, but how we apply it. For example:

" This means potentially there is only one electron in the universe, but where the huge number we observe are copies of that one state in different configuration variables."

"what we observe as the spacetime universe is just one bubble out of a vast number of such on an inflationary spacetime that is often called the multiverse."

I can see how the math for this might be quite logical, but in editing the variables, some important details might have been left on the cutting room floor. If the universe is one electron, might it be equally mathematically provable that every electron is a universe? If A=B, then does B=A? I tend to see multiverses as a version of M. C. Escher's sketches of waterfalls and stairways going in circles: quite interesting on paper, but problematic in reality.

So my point is, again, that we are not taking that scalar randomness into account as the background and balance to the logical vector.

  • [deleted]

This background balancing is a bit like looking into a mirror and trying to explain what we see as though there is some world opposite ours, rather than the principles of the mirror, so we keep coming up with all these shape explanations, from multiverses to supersymmetric particles.

  • [deleted]

Hi Lawrence,

this is a very intriguing essay that touches on a lot of things that I have dimly glimpsed in my own thinking. In particular, the issue of how undecidability relates to physics is something that is intermittently on my mind. There's been a recurring trend to look for undecidability as somehow related to the measurement problem, or other quantum mechanical weirdness, starting with maybe Zwick in 1978, and continued by people such as Mittelstaedt or Thomas Breuer (who uses diagonal arguments to establish the impossibility of perfect self-measurement in theories assumed to be universally valid, that is, theories that apply to observer as well as observed). A relatively recent development is the idea that the randomness of quantum measurement outcomes is related to the undecidability of the outcome from axioms encoded in the state preparation, as developed by Paterek et al. There's also interesting work by Karl Svozil, Christian Calude, and others investigating quantum randomness and uncertainty from the point of view of Chaitin's algorithmic information theory.

All of which is just to say that a lot of people have seen some common ground here, while apparently nobody has been able to find a rigorous formulation. Your take on the issue is a new one to me: as far as I understood, you seem to be saying that independent axioms may be repealed in order to allow greater mathematical freedom, citing the case of abandoning the parallel postulate in order to lead in a profitable way to new formulations of geometry. But of course, in any theory, all axioms are logically independent of one another, no? Otherwise, if any axiom can be derived from the other axioms, you can just strike it out, and you'll be left with the same theory. This was what drove the attempts to derive the parallel postulate from the other axioms: it was seen as a blemish on the theory, and it was hoped that the theory would hold up unchanged without it. The construction of geometries inequivalent to Euclid's by Lobachevsky and others ultimately was what killed this hope. (And besides, isn't Euclidean geometry decidable anyway?)

So the parallel postulate ultimately isn't derivable from the rest of the theory, in the same sense that, say, the existence of the square root of -1 isn't derivable from the field axioms: the incompleteness here is in a sense trivial, and different from the Gödelian case, in that one probably wouldn't want to insist that the field axioms be complete in the sense of deriving every true proposition they can express. So it seems to me that there's a difference between the independence of the parallel postulate and the independence of, say, the continuum hypothesis from Zermelo-Fraenkel.

Also, even though there are undecidable propositions about any sufficiently complex system (any system capable of universal computation), this does not imply any 'uncertainty' about the fundamental laws (though I'm not sure if you're arguing for that): take, for instance, a universal cellular automaton such as the Game of Life. Its 'fundamental laws' are simple enough, and can be completely specified without any problem; nevertheless, there exist undecidable propositions, such as whether or not a specific configuration will ever turn up. But of course, GoL can be simulated on a computer; so the mere existence of undecidability does not imply anything about the uncomputability of physical evolution. So this does not put the hypothesis of the universe being, in some sense, a 'giant computer' to rest: in fact, I would rather consider it evidence in its favour, since as I said, every universal system gives rise to undecidable propositions.
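The Game of Life point can be made concrete. Below is a minimal sketch (my own illustration, not anything from the essay) of the complete GoL update rule: the "fundamental laws" fit in a few lines, even though questions such as whether a given pattern ever appears are undecidable in general.

```python
# Minimal Game of Life update rule. The complete "fundamental laws"
# fit in a few lines, yet long-run questions about configurations
# are undecidable in general.
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count how many live neighbours each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider translates itself by (1, 1) every 4 generations.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g4 = glider
for _ in range(4):
    g4 = step(g4)
```

Despite the simplicity of `step`, no algorithm can decide for every starting set whether a chosen pattern will ever occur.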

But I'm not quite sure if I'm not arguing past your points; I need to re-read your essay in some calmer moment, I think.

    • [deleted]

    Lawrence,

    Rather than a top down platonic view of the entire universe as one electron and everything else as a reflection of it, what about a bottom up view, where every electron is its own unique view/reflection of the entire universe, necessarily most reflective of what it is most tied into/entangled with?

    Dear Lawrence,

    Very nice and rich essay! I must confess that I am still trying to connect all the dots, but from what I understand so far, especially after the clarifications in your comments, I very much agree with you.

    I may return with some questions.

    Best regards,

    Cristi Stoica

      Thanks for the positive response. I have been intending to write more fully on your essay, since I have now read it a couple of times. I will try to do that later today or tomorrow.

      LC

      This essay I kept on the level of modal logic so the presentation could remain somewhat informal. I think on a deeper level this connects with the Langlands program. The proof of the Taniyama-Shimura conjecture is a two-dimensional form of a more general theorem, or set of theorems, of elliptic curve cohomology. In four or eight dimensions there are similar results for conformal systems, such as the Cardy theorem in four dimensions. When extended to eight dimensions this climbs up the Cayley numbers into the exceptional group E8 and potentially the octonions. This may connect to some axiomatic basis for a putative proof of the Riemann hypothesis. I think the zeros of the zeta function and the prime distribution may be mapped into the set of eigenstates for quantum gravity.

      I have been reading the paper by Mathur, where he introduces early on the nature of wave functions on spatial splittings of the Schwarzschild spacetime. Wave functions of different wavelengths near the horizon, here considered classically, will scatter into the exterior and interior of the BH in different ways. Those wave functions, though, are defined by the action of field operators on a Fock space or vacuum occupation space. If the horizon is "quantum uncertain" I think there is then an underlying associative property for these field amplitudes.

      I have read a couple of papers by Mittelstaedt. Most of his work was done in the 1970-80 time period. I have not followed anything he may have done later.

      I have not yet scored any essays on FQXi. I have only read a handful at this time. I see that you have an essay here as well; I will try to get to it in the next few days. I generally score essays after I have some idea of what a number of them look like. I usually keep a copy of the essay page on file, where I put in a tentative score before I actually score them on the FQXi page.

      Cheers LC

      The undecidable reminds me of Wang tiles and how Egan used them in Diaspora. It may be possible to build such tiles out of amino acids and just mix them together to see what happens.

      Your idea is different and based on taking out associativity as an axiom, is that right? It's an interesting way of looking at it.

      Do try to make a better effort to be accurate. The real physical world and real Universe are unique and have to conform to our common sense assessment. There are no quantum entanglements in reality. You can waste your time as much as you like pretending to know how abstract invisible electrons operate. If each particle is not unique, why is CERN trying so hard to isolate the Higgs boson?

      That is one way to think of it. Another is if you think of the physics as having two representations. One representation involves quaternions where associativity holds. The other involves the octonions.

      A cloth item or garment will, when hung over a chair or a hanger or thrown on a surface, assume a shape that minimizes the stress per area on it. This is a function of the topology of the garment and the geometry, which involves these stresses per unit area. The elementary case is that of a skirt, where the boundary at the top is equal to the boundary at the bottom. These two boundaries define a simple cobordism. A more complicated case is that of a pair of pants. Here the boundary at the top is one circle and the boundary at the bottom consists of two circles. Think of the top circle on both the pants and the skirt as a group G which has some decomposition into SL(2,R)^n. The skirt provides a simple deformation retract of G to the bottom. The pair of pants has two circles we think of as a group G' that each decompose into SL(2,R)^{n/2}.

      The skirt represents the elementary view, say the 8-bit SLOCC. The pants are more complicated where this decomposes into two copies of a 4-bit SLOCC. These two copies have a duality relationship, such as with Yangians. The pants are then the perspective we have of physics, or fields that are quaternion valued. They are associative = "nice," or nice according to how we normally think of physics, and there are conformal dualities. The skirt represents the world more fundamentally. In moving between the two perspectives we turn on and off the axiom of associativity.

      Cheers LC

      One goal of physics is to reduce the number of fundamental degrees of freedom in the world. If all electrons are just projections of one electron into different configuration variables this is a huge reduction in the number of degrees of freedom.

      I don't deny that top down physics operates as well, such as what Ellis works on. Emergent structures at the top are constraints on lower level processes. In fact in my essay at the end I indicate how this sort of incompleteness argument could play into work on top-down causality.

      LC

      • [deleted]

      Lawrence,

      It just seems to me that if you are starting with the one electron, with all the rest as manifestations of the one, you are starting at the top, and it is what is beneath it that is emergent.

      I realize this fits in with a singularity-based universe, but the point I keep making is that if we view it as a bottom-up emergence from a fluctuating void, it actually fits with what we observe, and the sense of a singular focus is more an intuitive reflection of our own point of conscious reference.

      Think in terms of a pyramid; the base is distributed, while the apex is the point of reference. Or gas coalescing into a star. Or 1/0; nothing is more fundamental than something.

      Consider it in theological/spiritual terms; Is there an elemental spirituality to nature, from which complex organisms coalesce, or is there that one deity, of which we are all copies?

      Or simply biological terms; Did life emerge from seas of nucleating amino acids and build up ever more focused levels of complexity, or was it just a single spark of being from which every living organism is a copy?

      The base is not the point, but the field.

      The physical world just does not conform to that way of thinking. The trend since the time of Galileo has generally been in this direction. I think it is apparent that you do not have an extensive education in physics. As our knowledge of the universe has increased, it has become increasingly removed from our everyday expectations of things and from what might be called common sense.

      LC

      Hi Lawrence,

      I have found your essay interesting and difficult, so it took me a lot of time to study it, and maybe you will find my comment unclear. But I have to try.

      I would like to refer to your statements: 'The quantum computer perspective of the world describes the universe as some master quantum Turing machine that deterministically computes information states. This is really a modern version of the clockwork world. Conversely the path integral perspective tells a somewhat different story, for the evolution of a physical system or quantum states is due to the extremization of the action. In this setting the evolution is less due to the deterministic automata processes of a Turing machine than they are due to a continuous extremization.... The issue of determinism in a "clockwork" or computer fashion is equivalent to an ordering of states, as well as a statement about logical necessity, or modal logic...'

      The theories that combine digital physics with loop quantum gravity are well known and formulated, e.g., by Paola Zizzi (Computational LQG). There is no room here for citations. Additionally, according to Lee Smolin (LQG), self-organized critical systems are statistical systems that naturally evolve, without fine tuning, to critical states in which correlation functions are scale invariant. My own view seems to support Smolin's, in the sense that the universe is a dissipative coupled system that exhibits self-organized criticality. Structured criticality is a property of complex systems where small events may trigger larger events. This is a kind of chaos where the general behavior of the system can be modeled on one scale while smaller- and larger-scale behaviors remain unpredictable. A simple example of that phenomenon is a pile of sand.
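      The sandpile example can be made concrete with the Bak-Tang-Wiesenfeld model, the standard toy model of self-organized criticality (a sketch of my own, with an arbitrary 20x20 grid): most grain drops do nothing, but occasionally a single grain triggers a system-wide avalanche.

```python
# Bak-Tang-Wiesenfeld sandpile: a standard toy model of
# self-organized criticality. Grains are dropped one at a time;
# any site holding 4 or more grains topples, shedding one grain
# to each of its four neighbours (grains fall off the edge).
import random

N = 20                           # grid side length (illustrative choice)
grid = [[0] * N for _ in range(N)]

def drop(x, y):
    """Add one grain at (x, y); return the avalanche size (topplings)."""
    grid[y][x] += 1
    avalanche = 0
    unstable = [(x, y)] if grid[y][x] >= 4 else []
    while unstable:
        cx, cy = unstable.pop()
        if grid[cy][cx] < 4:     # may already have toppled
            continue
        grid[cy][cx] -= 4
        avalanche += 1
        for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
            if 0 <= nx < N and 0 <= ny < N:
                grid[ny][nx] += 1
                if grid[ny][nx] >= 4:
                    unstable.append((nx, ny))
    return avalanche

random.seed(0)
sizes = [drop(random.randrange(N), random.randrange(N))
         for _ in range(20000)]
```

      After a transient, the pile hovers at its critical state: most drops cause no topplings at all, while a few reorganize a large part of the pile, with no fine tuning of any parameter.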

      Why have I touched on that issue? It has a lot to do with determinism, computability, Turing machines and emergence. My conclusion is: while QM and GR are computable and deterministic (however non-scale-invariant), the universe's evolution (a naturally evolving SOC system) is non-computable and non-deterministic. This suggests that computability and determinism are not necessarily related.

      I would argue for a "top-down" physics with the emergence of higher-level properties. I think that there is not only a prospect for it playing a role; at the moment it plays a role in the emergence of biology and even consciousness.

      It is a bit hard for me to follow what you are saying at this point. The departure is seen here with your "see it in theological or spiritual terms." At the point where one invokes those sorts of arguments, one has departed from scientific discussion.

      LC

      • [deleted]

      Lawrence,

      It has to do with underlying premises on which even cultural norms are founded. Eastern belief systems haven't gravitated to a singular deity because the underlying assumption is more context-based than unit-based. The network side of the equation, rather than the node side.

      Biological evolution is a multi-billion-year experiment. It has resulted in mobile organisms with separate energy (lungs/gut/circulation) and information (nervous) processing systems, and the information processing system is further divided into a parallel/scalar processor and a linear/vector processor. Since this is the basis through which our perception of reality is filtered, it should be given some degree of attention.

      If physics can only speak to the extremes of the very large, the very small and the very abstract, is it any wonder the preachers and politicians have no competition for people's attention?

      What are your dreams of understanding nature worth, if the society in which you exist is falling apart and you have nothing to offer in the way of understanding why?

      Do you seriously believe that you and your fellow physicists' obsessive-compulsive abstraction addictions could in any way be superior to my common-sense grasp of reality? Unique is not optional. Each real snowflake is unique once; therefore everything real is unique, once. Each unique thing is neither superior to, inferior to, nor equivalent to any other unique thing. It is impossible for the unique to increase, decrease or remain equivalent to any other unique. Abstract knowledge of an abstract Universe may have increased for a certain selective, essentially religious group of white male physicists; fortunately, because reality is unique, it is an easy task to expose them for the pretentious fabulists they are.

      Hi Jacek,

      I have been curious about the role of LQG. I will confess that I am primarily oriented towards the string theory perspective. I do think LQG has some relevancy to physics, but it is uncertain what that is. LQG comes from the ADM approach to general relativity, which gives constraint equations NH = 0 and N^iH_i = 0 with no explicit time dependency. The lack of time dependency means that energy is not defined. This is a manifestation of Gauss' law, where on a general manifold there is no boundary over which to integrate to define mass-energy as the source of the field. So the Wheeler-DeWitt equation, and spinor variations on that theme in LQG, are in effect constraint systems. The problem is that we do not know exactly what this constrains.
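      For reference, the constraints mentioned above come from the standard ADM (3+1) splitting, where the lapse N and shift N^i enter the action only as Lagrange multipliers (schematic form, in the usual notation):

```latex
% ADM (3+1) action: N and N^i are Lagrange multipliers, so varying
% them yields the Hamiltonian and momentum constraints.
S = \int dt\, d^3x \left( \pi^{ij}\,\dot{g}_{ij}
      - N\,\mathcal{H} - N^{i}\,\mathcal{H}_{i} \right),
\qquad \mathcal{H} = 0, \quad \mathcal{H}_{i} = 0 .
```

      The total Hamiltonian is thus a sum of constraints, which is why there is no explicit time dependence and no straightforwardly defined energy, as the comment notes.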

      I have thought that LQG is some sort of "target" of a renormalization group flow in string/M-theory. However, string theory has as its IR limit a graviton in a weak-coupling regime on a background. This theory is renormalizable as a perturbative field theory, even if we know it is a weak-coupling approximation to quantum gravity. LQG is not renormalizable. So we are left with an open question: is LQG a strong-coupling S-dual of the weakly coupled string theory? There are some problems with an idea of this sort. In particular, LQG is not easily embedded into a larger unification scheme with gauge fields or supersymmetry.

      However, LQG is based upon basic general relativity in a way that is hard to ignore. I suspect it is not completely wrong, as is often thought in the string theory community. However, where it fits into things is unknown.

      LQG is related to Regge calculus, which can be run on a computer; it is one of the tools numerical general relativists use. However, this approach to general relativity is also a manifestation of the action principle. Underlying it all is an extremal principle. Extremization is a continuous process, and from a set-theory perspective it is a manifestation of a nondenumerable set of numbers, the reals. This is one motivation to suggest that the universe is not entirely a computer, even if it has some computer-like or algorithmic structures to it.
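      The extremal principle in that last paragraph can be illustrated with a toy discretized action (my own sketch, not anything from Regge calculus proper): relaxing the interior points of a discretized free-particle path so as to extremize the action recovers the straight-line trajectory, a continuous variational process rather than a rule-by-rule automaton update.

```python
# Discretized action for a free particle with fixed endpoints:
#   S = sum_i (m/2) * ((x[i+1] - x[i]) / dt)**2 * dt
# Relaxing the interior points toward dS/dx[i] = 0 recovers the
# extremal (straight-line) path x(t) = t/T.

m, dt, n = 1.0, 0.1, 21
x = [0.0] * n
x[0], x[-1] = 0.0, 1.0           # fixed endpoints

for _ in range(5000):            # simple gradient-descent relaxation
    for i in range(1, n - 1):
        # dS/dx[i] = m * (2*x[i] - x[i-1] - x[i+1]) / dt
        grad = m * (2 * x[i] - x[i - 1] - x[i + 1]) / dt
        x[i] -= 0.01 * grad
```

      The converged path satisfies x[i] = i/20, i.e. the straight line between the endpoints, which is the solution of the discrete Euler-Lagrange equations.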

      Cheers LC