Essay Abstract

The digital model of the universe, or "It From Bit," is not decidable. A model of the physical universe encoded by algorithmic means will not compute reality. One unknown domain argued to be outside any computerized model based on current quantum field theory is quantum gravity. A change in axiomatic basis is proposed to address field nonlocality in quantum gravity.

Author Bio

Doctoral work at Purdue. Worked on orbital navigation and currently work in IT and programming. I think it is likely there is some subtle, and in some ways simple, physical principle that is not understood, or some current principle that is an obstruction. Our inability to work quantum physics and gravity into a coherent whole is likely to be solved through new postulates or physical axioms, or the removal of current ones.


Of course this means I have an added information management issue. I almost didn't submit anything this time. However, this idea came to me last month, and after some calculations I decided to give it another try. This was a bit fun to work through and write up.

  • [deleted]

It looks like my previous post was a lucky rabbit's foot for you. In fact, after that post your Essay became the leading one.

Cheers,

Ch.

Glad to see you decided to join in. The essay covers a lot of great material and I will have to read it a few times.

Christian: I am the leader with one vote. This is not very strong, though I hope the cumulative count remains fairly strong. I think Phil has the most solid score overall.

Phil: This essay is fairly broad. The core issue is that It From Bit is undecidable, for any schema of that nature is based on an incomplete axiomatic system. This is a good thing as I see it. It means there are new foundations to think about.

  • [deleted]

" in a nonlocal manner a quantum particle 'knows' how to evolve by sampling all possible paths.......it can still be argued there is either an underlying or a dual perspective on nature which is continuous and not digital."

IOW: an aether

    • [deleted]

    Lawrence,

    "The core issue is that It From Bit is undecidable, for any schema of that nature is based on an incomplete axiomatic system."

    Doesn't that essentially falsify "it from bit?"

    • [deleted]

    Lawrence,

    What I mean to say, is that if it's undecidable, then you don't have "it."

    The quantum vacuum is not an aether exactly. It is not an aether in the old fashioned sense with a continuum of degrees of freedom. The vacuum does however admit configuration variables, and these can be continuous.

    LC

Euclid's five axioms were thought to contain all of geometry. The fifth axiom, that parallel lines never intersect, was thought for centuries to be provable from the other four. It is a proposition about geometry that is not provable from them. One can turn it on or off, and with the off condition you have geometries with curvature. Euclid's fifth axiom is not provable from the other four, and the same situation arises in any axiomatic system that enumerates its Godel numbers.

It from Bit amounts to saying that all of existence is computable, and computable by itself. In a general sense this is not decidable. The "Bit" part of this involves some algorithmic structure, and this in a Turing machine sense is not able to compute all possible states thought of as symbol strings. So any theory one has of "Bit" is not going to be axiomatically complete. There will exist states which exist that are not computable. One must then enlarge "Bit," or enlarge the axiomatic or algorithmic structure of "Bit," to include these. At least that would be needed if you think this is "It."
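The Turing-machine point above rests on the classic halting-problem diagonalization. A minimal sketch of that argument, with a hypothetical oracle `halts` and adversary `diagonal` (both names are illustrative, not from any real library): assuming a total decider exists leads to a contradiction when it is fed its own adversary.

```python
def halts(program, argument):
    """Hypothetical total decider: True iff program(argument) halts.
    No such function can actually exist; raising here stands in for
    the impossibility the argument establishes."""
    raise NotImplementedError("no total halting decider can exist")

def diagonal(program):
    """Adversary: does the opposite of whatever the oracle predicts
    about a program applied to itself."""
    if halts(program, program):
        while True:   # loop forever if the oracle says "halts"
            pass
    return "halted"   # halt if the oracle says "loops forever"

# The contradiction: consider diagonal(diagonal).
#   If halts(diagonal, diagonal) is True,  diagonal(diagonal) loops forever.
#   If halts(diagonal, diagonal) is False, diagonal(diagonal) halts.
# Either way the oracle is wrong, so `halts` cannot be total and correct.
```

This is the sense in which an algorithmic "Bit" cannot decide everything about the states it generates: any fixed formal system leaves such questions open.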

    LC

    • [deleted]

    Lawrence,

    So it is as Wolfram said; "You need a computer the size of the universe to compute the universe." ?

    The issue I have with "It from Bit," is the assumption that since the calculation can be made using any medium, then the medium is irrelevant. Yet if you have no medium, you have no message.

    If you have nothing, then you just have the zero/0. No one/1. Just a flatline, with no pulse, no binaries, no positive/negative. So there has to be some medium in order to have any laws, principles, axioms, computations, etc.

    That medium is what is present. The information is the changing form, evidence of events that no longer exist, not the substance.

    • [deleted]

    Lawrence,

    "One must then enlarge "Bit" or enlarge the axomatic or algorithmic structure of "Bit" to include these."

    What do you think of Lev's structs as a way to do this, whether practically or philosophically?

    There is another side of this that the particle view misses. The network, as opposed to the node, seems to be inherently about connections, while the particle view is about distinctions. How parts add up to a larger whole, not just a sum of the parts. Whether it is the organs of your body adding up to you, or the quanta in a Bose-Einstein condensate amounting to a larger quanta.

    I think this is where the scalar side of the brain works better, while the vector side just gets tangled in its "symbol strings."

    Various entries point out a bit only makes sense in context, so it is "Bit from It and It from Bit," in the sense that each is the lens through which the other is observed, but each view is still limited and there is no middle view where all sides are clear.

All physical theories are effective theories, or ultimately models. One should never take any theory as being somehow absolute. Even if we end up with a cosmology theory that is at the limits of our observing capabilities, we should never assume we have it all. "It From Bit" is really a way in which we could run quantum cosmology on a quantum computer. However, the algorithm that is run is ultimately a system of physical axioms (postulates) which are incomplete. They can never be complete. So the quantum cosmology run on our quantum computer is no more "It" than my piffle of a laptop can crack all public key encryption codes or RSA. With the universe at large, the quantum computer idea means the universe computes itself, which means by default that the universe is itself incomplete.

BTW, the quantum computer will at first be a boon for physics and cosmology, in particular with modeling the SLOCC systems for BPS and SUSY black holes. In the long run though, I look upon the quantum computer with trepidation and dread. If you think the world is getting loopy and strange due to information complexity, just wait until the quantum computer dominates the scene.

In my essay I draw a comparison between Godel's second theorem and David Hume's conclusion about the naturalist or "is-ought" fallacy. This was the basis for his argument that causality is not strictly proven by logic. To assume the occurrence of an event, or the existence of some physical state, is logically derived is a fallacy. The second theorem of Godel is related to this. It means that mathematics is in a way a bit of an empirical subject. If the universe then computes itself, it does so in the same way we study a subject like math or physics: the universe in effect discovers itself.

This then suggests that we can't assume that all of existence is defined by an algorithm which computes itself. The algorithm doing this is similar to a universal Turing machine that is incapable of determining its own halting status as it evaluates all other possible algorithms. This does not mean It From Bit is false, but it is not something which can be proven. The fact it can't be proven means that whatever algorithm or formal system of computing Bits we have is incomplete, and the "It" will as a result always be found to be larger. To assume otherwise is to commit the naturalist fallacy pointed out by Hume.

    Cheers LC

      • [deleted]

      Lawrence,

      " the universe is itself incomplete."

      And hopefully will remain so. When it is finished, it is finished.

      Remaining within the dynamic processes, it seems there are some revealing patterns. One of these seems to be that complexity is part of the overall pattern and is not just a linear progression into ever more complexity, but is a process that leads to breakdowns. This might be considered the Tower of Babel syndrome. You could say the algorithm informing it, or the energy motivating it, reaches limits and its wave of applicability peaks.

      As is happening in physics, or the world economy, it seems the alternative is breakdown and chaos, yet this too is part of the pattern, as the linear pushes into an increasingly disturbed non-linear environment.

      Rather than ask what comes next, perhaps we should back up and ask why each step becomes ever more complex. Why each floor of the structure requires more re-enforcement of the lower levels, why cars and buildings and society become ever more complex. Much of it has to do with the fact the environment is fundamentally non-linear and while progress seems like a vector, it is actually a scalar. Each level magnifies and multiplies the issues and the complexities, until they overwhelm the endeavor. Yet we continue to view reality as linear. We even see the universe as beginning at a point of origin and pushing outward, because we see the most basic unit of energy as a point that doesn't expand, but moves along a vector for billions of years. The expansion has been relegated to a statistical probability. Yet these "probabilities" and anomalies multiply until they overwhelm the model.

      There really is no way around this. There is no path to Nirvana, no all-encompassing algorithm, just a rising and falling of waves in an eternal sea. To be more "objective," we need to be able to be more objective about our own situation, in order to be able to ride these waves and know when to get off one and onto another. Even to accept our own mortality as part of this process.

      It is only when we insist our path is the only one, all others are wrong and it is going to the promised land, that we delude ourselves.

      Not trying to get too philosophical on your thread, but just trying to put the effect of complexity into a larger contextual process.

I am not sure what you mean by "linear." This is not covered at all in my essay, but I think there is an elementary quantum statistics of a 2+1 spacetime that underlies a lot of the complex physics of strings and supersymmetry. I am not going into that in detail, for it would be too much. However, I think there is a degeneracy of states or superselection from which heterotic string theory emerges. Since this involves the octonion group E_8, this touches on the matter of nonassociativity.

      LC

        • [deleted]

        Lawrence,

        I'm no match for you in terms of the leading edges of complex theory. I'm simply making a general point about the nature of complexity. For example, there are literally billions of microbes in a person's gut. What level of computational complexity would it take to describe every relationship? Necessarily it would be far beyond any computational ability we currently have, yet one could make the general statement that it is a digestive process. So do we have to construct a precise, bottom up model of the entire system in order to effectively understand it? We can't. We would drown in detail and lose sight of what we are trying to do. A map can't show every detail, or it is useless for any particular purpose.

        So the point is whether physics is drowning in detail, literally off in other universes, to the point of losing sight of what it is trying to understand. There is no ultimate algorithm which will explain the universe to humanity and when even the field is starting to throw up their collective hands over the fact the most developed concepts, such as string theory, have nothing to offer beyond a big question mark, then it might be time to consider if the path taken is anything more than a sticky trap. I know what I say has little weight, but I think you will find there will be more and more people like me. Eventually string theory is not going to be putting food on anyone's table.

        You and I have argued over my ideas enough, not to go there, but you do have the ability to clarify your arguments, as you did to Phil in the above comment, so keep it up and keep breaking down all those beautiful ideas and see what further patterns emerge and what are just empty bubbles. If it requires other universes, that should be a hint some factor has been overlooked.

        • [deleted]

        The level of complexity or the amount of information we observe is determined by the number of states, say the dimension of a Hilbert space or the size of a coarse grained phase space, call that dim H and entropy is S = ln(dim H). The amount of complexity we observe around us is huge. However, I think that much of the huge complexity around us is due to a redundant set of copies of fundamental states on different configuration variables. This means potentially there is only one electron in the universe, but where the huge number we observe are copies of that one state in different configuration variables. This huge redundancy has a relationship to the occurrence of event horizons and holography. I will have to leave this conjecture at this stage, for it gets a bit subtle.
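The entropy count in the comment above, S = ln(dim H), can be made concrete with a toy calculation. A minimal sketch (the function name `entropy_from_dim` is illustrative): for a system of n qubits, dim H = 2^n, so S = n ln 2, i.e. entropy grows linearly with the number of binary degrees of freedom even as the state space grows exponentially.

```python
import math

def entropy_from_dim(dim_h: int) -> float:
    """Entropy S = ln(dim H) in natural units (k_B = 1),
    where dim_h is the dimension of the (coarse-grained)
    Hilbert or phase space."""
    return math.log(dim_h)

# For n qubits, dim H = 2**n, so S = n * ln 2.
n = 10
S = entropy_from_dim(2 ** n)
assert abs(S - n * math.log(2)) < 1e-12
```

The redundancy conjecture in the comment would then show up as an effective dim H far smaller than the naive product over all apparent copies.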

        LC

What on earth does any of this incomprehensible abstract senseless physics babble have to do with reality? As I have thoughtfully pointed out in my understandable essay BITTERS, the Universe is unique, once, and every seeming piece of the real Universe is unique, once. Each real snowflake is unique. Each man-made particle is unique. Whereas scientists seek out repeatable abstract theories of abstract structures and abstract histories and abstract continuations, real unique has none of these abstract qualities.