Lawrence,
"The core issue is that It From Bit is undecidable, for any schema of that nature is based on an incomplete axiomatic system."
Doesn't that essentially falsify "it from bit?"
Lawrence,
What I mean to say is that if it's undecidable, then you don't have "it."
The quantum vacuum is not exactly an aether, at least not in the old-fashioned sense of a medium with a continuum of degrees of freedom. The vacuum does, however, admit configuration variables, and these can be continuous.
LC
Euclid's five axioms were thought to contain all of geometry. The fifth axiom, which in effect says that through a point not on a given line there is exactly one line that never intersects it, was thought for centuries to be provable from the other four. It is a proposition about geometry that is not provable from them. One can turn it on or off, and with it turned off you get geometries with curvature. The fifth axiom is not provable from the other four, nor from any axiomatic system that merely enumerates their Gödel numbers.
It from Bit amounts to saying that all of existence is computable, and computable by itself. In a general sense this is not decidable. The "Bit" part of this involves some algorithmic structure, and in a Turing machine sense this is not able to compute all possible states thought of as symbol strings. So any theory one has of "Bit" is not going to be axiomatically complete. There will exist states that are not computable. One must then enlarge "Bit," or enlarge the axiomatic or algorithmic structure of "Bit," to include these. At least that would be needed if you think this is "It."
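A minimal sketch of the diagonal argument behind that limit, in Python (an illustration added here; the function halts is a hypothetical oracle, not anything that can actually be written):

```python
# Sketch of the diagonal argument: no total routine `halts` can
# correctly decide, for every (program, input) pair, whether the
# program halts on that input.  `halts` is a hypothetical oracle.

def halts(program, data):
    """Hypothetical oracle: True if program(data) would halt."""
    raise NotImplementedError("no such total decider can exist")

def diagonal(program):
    # Do the opposite of whatever the oracle predicts for the
    # program run on its own source.
    if halts(program, program):
        while True:   # loop forever if the oracle says "halts"
            pass
    return "halted"   # halt if the oracle says "loops forever"

# Applying `diagonal` to itself forces `halts` to be wrong either way,
# so no fixed algorithmic structure decides every question about the
# symbol strings it can generate.
```

Any fixed algorithmic notion of "Bit" runs into the same obstruction, which is the sense in which it cannot be complete.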
LC
Lawrence,
So it is as Wolfram said: "You need a computer the size of the universe to compute the universe"?
The issue I have with "It from Bit" is the assumption that since the calculation can be made in any medium, the medium is irrelevant. Yet if you have no medium, you have no message.
If you have nothing, then you just have the zero/0. No one/1. Just a flatline, with no pulse, no binaries, no positive/negative. So there has to be some medium in order to have any laws, principles, axioms, computations, etc.
That medium is what is present. The information is the changing form, evidence of events that no longer exist, not the substance.
Lawrence,
"One must then enlarge "Bit" or enlarge the axomatic or algorithmic structure of "Bit" to include these."
What do you think of Lev's structs as a way to do this, whether practically or philosophically?
There is another side of this that the particle view misses. The network, as opposed to the node, seems to be inherently about connections, while the particle view is about distinctions. It is about how parts add up to a larger whole, not just a sum of the parts, whether it is the organs of your body adding up to you, or the quanta in a Bose-Einstein condensate amounting to one larger quantum state.
I think this is where the scalar side of the brain works better, while the vector side just gets tangled in its "symbol strings."
Various entries point out that a bit only makes sense in context, so it is "Bit from It and It from Bit," in the sense that each is the lens through which the other is observed; but each view is still limited, and there is no middle view where all sides are clear.
I respond in detail below so it is visible to all.
LC
All physical theories are effective theories, or ultimately models. One should never take any theory as being somehow absolute. Even if we end up with a cosmology theory that is at the limits of our observing capabilities, we should never assume we have it all. "It From Bit" is really a way in which we could run quantum cosmology on a quantum computer. However, the algorithm that is run is ultimately a system of physical axioms (postulates) which are incomplete. They can never be complete. So the quantum cosmology run on our quantum computer is no more "It" than my piffling laptop can crack every public-key encryption scheme such as RSA. With the universe at large, the quantum computer idea means the universe computes itself, which means by default that the universe is itself incomplete.
BTW, the quantum computer will at first be a boon for physics and cosmology, in particular for modeling the SLOCC systems for BPS and SUSY black holes. In the long run, though, I look upon the quantum computer with trepidation and dread. If you think the world is getting loopy and strange due to information complexity, just wait until the quantum computer dominates the scene.
In my essay I draw a comparison between Gödel's second theorem and David Hume's conclusion about the naturalistic or "is-ought" fallacy. This was the basis for his argument that causality is not strictly proven by logic. To assume that the occurrence of an event, or the existence of some physical state, is logically derived is a fallacy. Gödel's second theorem is related to this. It means that mathematics is in a way a bit of an empirical subject. If the universe then computes itself, it does so in the same way we study a subject like math or physics: the universe in effect discovers itself.
This then suggests that we can't assume all of existence is defined by an algorithm which computes itself. The algorithm doing this is similar to a universal Turing machine that is incapable of determining its own halting status as it evaluates all other possible algorithms. This does not mean It From Bit is false, but it is not something which can be proven. The fact it can't be proven means that whatever algorithm or formal system of computing Bits we have is incomplete, and the "It" will as a result always be found to be larger. To assume otherwise is to commit the naturalistic fallacy pointed out by Hume.
Cheers LC
Lawrence,
" the universe is itself incomplete."
And hopefully will remain so. When it is finished, it is finished.
Remaining within the dynamic processes, there seem to be some revealing patterns. One of these is that complexity is part of the overall pattern: not just a linear progression into ever more complexity, but a process that leads to breakdowns. This might be considered the Tower of Babel syndrome. You could say the algorithm informing it, or the energy motivating it, reaches limits and its wave of applicability peaks.
As is happening in physics, or the world economy, it seems the alternative is breakdown and chaos, yet this too is part of the pattern, as the linear pushes into an increasingly disturbed non-linear environment.
Rather than ask what comes next, perhaps we should back up and ask why each step becomes ever more complex. Why does each floor of the structure require more reinforcement of the lower levels; why do cars and buildings and society become ever more complex? Much of it has to do with the fact that the environment is fundamentally non-linear, and while progress seems like a vector, it is actually a scalar. Each level magnifies and multiplies the issues and the complexities, until they overwhelm the endeavor. Yet we continue to view reality as linear. We even see the universe as beginning at a point of origin and pushing outward, because we see the most basic unit of energy as a point that doesn't expand, but moves along a vector for billions of years. The expansion has been relegated to a statistical probability. Yet these "probabilities" and anomalies multiply until they overwhelm the model.
There really is no way around this. There is no path to Nirvana, no all-encompassing algorithm, just a rising and falling of waves in an eternal sea. To be more "objective," we need to be more objective about our own situation, in order to ride these waves and know when to get off one and onto another. Even to accept our own mortality as part of this process.
It is only when we insist our path is the only one, all others are wrong and it is going to the promised land, that we delude ourselves.
Not trying to get too philosophical on your thread, but just trying to put the effect of complexity into a larger contextual process.
I am not sure what you mean by "linear." This is not covered at all in my essay, but I think there is an elementary quantum statistics of a 2+1 spacetime that underlies a lot of the complex physics of strings and supersymmetry. I am not going into that in detail, for it would be too much. However, I think there is a degeneracy of states or superselection from which heterotic string theory emerges. Since this involves the exceptional group E_8, which is tied to the octonions, it touches on the matter of nonassociativity.
LC
Lawrence,
I'm no match for you in terms of the leading edges of complex theory. I'm simply making a general point about the nature of complexity. For example, there are literally billions of microbes in a person's gut. What level of computational complexity would it take to describe every relationship? Necessarily it would be far beyond any computational ability we currently have, yet one could make the general statement that it is a digestive process. So do we have to construct a precise, bottom up model of the entire system in order to effectively understand it? We can't. We would drown in detail and lose sight of what we are trying to do. A map can't show every detail, or it is useless for any particular purpose.
So the point is whether physics is drowning in detail, literally off in other universes, to the point of losing sight of what it is trying to understand. There is no ultimate algorithm that will explain the universe to humanity, and when even the field is starting to throw up its collective hands over the fact that the most developed concepts, such as string theory, have nothing to offer beyond a big question mark, it might be time to consider whether the path taken is anything more than a sticky trap. I know what I say has little weight, but I think you will find there will be more and more people like me. Eventually string theory is not going to be putting food on anyone's table.
You and I have argued over my ideas enough not to go there, but you do have the ability to clarify your arguments, as you did for Phil in the comment above, so keep it up: keep breaking down all those beautiful ideas and see what further patterns emerge and what are just empty bubbles. If it requires other universes, that should be a hint that some factor has been overlooked.
The level of complexity, or the amount of information we observe, is determined by the number of states, say the dimension of a Hilbert space or the size of a coarse-grained phase space; call that dim H, and the entropy is S = ln(dim H). The amount of complexity we observe around us is huge. However, I think that much of the huge complexity around us is due to a redundant set of copies of fundamental states on different configuration variables. This means potentially there is only one electron in the universe, where the huge number we observe are copies of that one state in different configuration variables. This huge redundancy has a relationship to the occurrence of event horizons and holography. I will have to leave this conjecture at this stage, for it gets a bit subtle.
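As a small worked instance of that formula (the N-qubit register here is an illustrative assumption, not something from the comment): for N qubits, dim H = 2^N, so

```latex
S = \ln(\dim \mathcal{H}) = \ln\bigl(2^{N}\bigr) = N \ln 2 .
```

Each additional qubit doubles dim H but adds only ln 2 (one bit, in natural-log units) to S, which is why the entropy is a logarithmic count of states rather than the state count itself.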
LC
What on earth does any of this incomprehensible, abstract, senseless physics babble have to do with reality? As I have thoughtfully pointed out in my understandable essay BITTERS, the Universe is unique, once, and every seeming piece of the real Universe is unique, once. Each real snowflake is unique. Each man-made particle is unique. Whereas scientists seek out repeatable abstract theories of abstract structures and abstract histories and abstract continuations, real unique has none of these abstract qualities.
Lawrence,
Let me go back and clarify my distinction of linear vs. non-linear. Obviously linear is sequential, yet this is a very broad category, from simple steps to complex changes. Non-linear is randomness. For example, think of molecules of water. Yet there is a great deal of order in this as well. For one thing, it can be measured as a scalar, be it temperature, pressure, or weight. There is also the entropic effect, as the various parts bounce into each other and trade energy, speeding up the slower ones and slowing the faster ones, to reach an equilibrium state. There is also Newton's dictum that every action is matched by an equal and opposite reaction. Logically the action, being so defined, is linear, while the reaction of its environment is non-linear, so there is a natural balance between the motion of the particular and the reaction of its environment.
Now these two processes are intimately entwined, like the non-linear gut activity propelling the organism. Evolution is a good example of the situation, as we think in terms of linear progression, such as through generations of organisms. Yet it is much more of a scalar process: progress needs to be supported by all the activities of the environment, since simple forward action tends to be balanced and negated by the larger quantity of activity. So there has to be constant feedback, and the resulting scalar activity is what actually determines the overall direction. It is sort of like how a tree has to grow out in all directions for the sequencing of rings to form, or like trying to introduce modern technology into a less complex society that lacks the broad cultural knowledge to use it.
So there is this mutual dichotomy of the actions of the particular and that of the mass of activity. Nodes and networks are a model of this relationship.
You might say the non-linear is the "coarse-graining" within the particular frame or space.
Now the problem, as I see it, is that there is an overwhelming bias to describe the particular, the node, as fundamental, but the logic doesn't really support this.
Consider how you conflate "state" with one electron. The idea of the entire universe as one "atom" was most forcefully put forward by Lemaître, in his original argument for what ended up being called Big Bang theory.
Yet "one" is something clearly defined as a unit, i.e., distinct from its environment. On the other hand, a state, particularly a neutral state, is more of a zero. There is no set of boundaries or distinctions. It could well be infinite, since once it is clearly finite, it becomes not so much a state as a set of all that it contains, and thus a unit within the larger context.
This goes to the heart of my arguments against Big Bang Theory, in that it argues for the entire universe as a particular unit and attempts to totally erase any concept of a larger context. Now this model is having to admit other universes, yet still cannot condone the idea of any environment to form and contain them. Yet whether we want to admit to a larger context or not, it is essential to creating the unit, by setting boundaries. As it is, our current theories have any number of holes, from the singularity to dark energy, to suggest outside connections.
The electron could as well be a fluctuation of the void. One node popping up in the infinite network of potential.
The big bang has a lot of empirical support. I don't particularly want to get into arguing the points for inflationary multiverse theory. It is, though, likely that just as solar systems operate not by some geometric order of planetary orbits but by a more fundamental set of principles, so too do the gauge field constructions in a vacuum nucleation or pocket universe. In medieval and Renaissance cosmology it was thought the solar system was arranged by a set of geometrically ordered "orbs"; Kepler worked on something like this. It appears likely that the universe is far grander, and what we observe as the spacetime universe is just one bubble out of a vast number of such on an inflationary spacetime that is often called the multiverse.
Cheers LC
The physical world and universe is rather contrary to our common sense. For instance, you say that every particle is unique, but it is well known from the Pauli exclusion principle that this is not the case. Two electrons are not distinguishable in quantum entanglement. The advancement of our understanding of the physical world is not going to conform ever closer to our common sense; it will challenge it.
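A standard way to express that indistinguishability, added here as an illustration (the two-electron spin singlet, not something taken from the essay):

```latex
|\Psi\rangle = \frac{1}{\sqrt{2}}\left( |\uparrow\rangle_{1}\,|\downarrow\rangle_{2} - |\downarrow\rangle_{1}\,|\uparrow\rangle_{2} \right)
```

Swapping the particle labels 1 and 2 returns the same state up to an overall minus sign, so the labels carry no physical identity; there is no fact of the matter about which electron is "which."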
LC
Lawrence,
I'm not really trying to start the cosmological argument, so much as trying to describe how various dichotomies, order/randomness, linear/non-linear, vector/scalar, node/network, organism/ecosystem, are aspects of a fundamental underlying process.
It really isn't so much to argue about cosmology, but to make the deeper point that our logical concepts are generally based on one side of this relationship, that of the linear, ordered, singular organism, since it is the basis of our narrative, cause-and-effect descriptions of reality; and the resulting atomized view affects many aspects of our understanding of and relationship to nature, from monotheism to the Big Bang.
It's not that the math is wrong, since it is distilled pattern, but how we apply it. For example:
" This means potentially there is only one electron in the universe, but where the huge number we observe are copies of that one state in different configuration variables."
"what we observe as the spacetime universe is just one bubble out of a vast number of such on an inflationary spacetime that is often called the multiverse.'
I can see how the math for this might be quite logical, but that in editing the variables, some important details might have been left on the cutting room floor. If the universe is one electron , might it be equally mathematically provable that every electron is a universe? If A=B, then does B=A? I tend to see multiverses as a version of C. S. Escher sketches of waterfalls and stairways going in circles. Quite interesting on paper, but problematic in reality.
So my point is, again, that we are not taking that scalar randomness into account as the background and balance to the logical vector.
This background balancing is a bit like looking into a mirror and trying to explain what we see as though there is some world opposite ours, rather than the principles of the mirror, so we keep coming up with all these shape explanations, from multiverses to supersymmetric particles.
Hi Lawrence,
this is a very intriguing essay that touches on a lot of things that I have dimly glimpsed in my own thinking. In particular, the issue of how undecidability relates to physics is something that is intermittently on my mind. There's been a recurring trend to look for undecidability as somehow related to the measurement problem, or other quantum mechanical weirdness, starting with maybe Zwick in 1978, and continued by people such as Mittelstaedt or Thomas Breuer (who uses diagonal arguments to establish the impossibility of perfect self-measurement in theories assumed to be universally valid, that is, apply to observer as well as observed). A relatively recent development is the idea that the randomness of quantum measurement outcomes is related to the undecidability of the outcome from axioms encoded in the state preparation, as developed by Paterek et al. There's also interesting work by Karl Svozil, Christian Calude, and others, in investigating quantum randomness and uncertainty from the point of view of Chaitin's algorithmic information theory.
All of which is just to say that a lot of people have seen some common ground here, while apparently nobody has been able to find a rigorous formulation. Your take on the issue is a new one to me: as far as I understood, you seem to be saying that independent axioms may be repealed in order to allow greater mathematical freedom, citing the case of abandoning the parallel postulate in order to lead in a profitable way to new formulations of geometry. But of course, in any theory, all axioms are logically independent of one another, no? Otherwise, if any axiom can be derived from the other axioms, you can just strike it out, and you'll be left with the same theory. This was what drove the attempts to derive the parallel postulate from the other axioms: it was seen as a blemish on the theory, and it was hoped that the theory would hold up unchanged without it. The construction of geometries inequivalent to Euclid's by Lobachevsky and others ultimately was what killed this hope. (And besides, isn't Euclidean geometry decidable anyway?)
So the parallel postulate ultimately isn't derivable from the theory in the same sense that, say, the existence of the square root of -1 isn't derivable from the field axioms: the incompleteness here is in a sense trivial, and different from the Gödelian case in the sense that one probably wouldn't want to insist that the field axioms are complete, i.e. that they derive every true proposition they can express. So it seems to me that there's a difference between the independence of the parallel postulate and the independence of, say, the continuum hypothesis from Zermelo-Fraenkel.
Also, even though there are undecidable propositions about any sufficiently complex system (any system capable of universal computation), this does not imply any 'uncertainty' about the fundamental laws (though I'm not sure if you're arguing for that): take, for instance, a universal cellular automaton such as the Game of Life. Its 'fundamental laws' are simple enough, and can be completely specified without any problem; nevertheless, there exist undecidable propositions, such as whether or not a specific configuration will ever turn up. But of course, GoL can be simulated on a computer; so the mere existence of undecidability does not imply anything about the uncomputability of physical evolution. So this does not put the hypothesis of the universe being, in some sense, a 'giant computer' to rest: in fact, I would rather consider it evidence in its favour, since, as I said, every universal system gives rise to undecidable propositions.
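To make the Game of Life point concrete, here is a minimal sketch (an illustration in plain Python, no external libraries) of the rules being fully specified and mechanically simulable, even though global questions such as "does this pattern ever appear?" are undecidable in general:

```python
# Minimal Game of Life step on an unbounded grid of live cells.
# The complete update rule fits in a few lines, yet some global
# questions about its configurations are undecidable.

from itertools import product

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    # Birth with exactly 3 neighbours; survival with 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A glider: the rules fix its entire future, and we can simply run them.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # the same glider shape, shifted one cell diagonally
```

The evolution is perfectly computable step by step; the undecidability only concerns unbounded questions about what the evolution will ever produce.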
But I'm not quite sure if I'm not arguing past your points; I need to re-read your essay in some calmer moment, I think.
Lawrence,
Rather than a top-down Platonic view of the entire universe as one electron and everything else as a reflection of it, what about a bottom-up view, where every electron is its own unique view/reflection of the entire universe, necessarily most reflective of what it is most tied into/entangled with?