Phillip:

"The information that described the state of the universe is holographic, so it can be represented on a surface. This is the compressed form of the data. What we observe in the bulk volume is an uncompressed form with lots of redundancy in the form of gauge symmetry."

The information content of an emission is not the same as the information content of the emitter that produced it. Every emission must pass through every spherical surface that surrounds the emitter with a radius less than the distance between the emitter and the receiver, if it is ever to be received in the first place. Thus, the entire information content of every long-range emission must be observable on those spherical surfaces. This is why the holographic principle exists, and why all long-range forces are inverse-square. It has nothing to do with the information content stored within the emitter, or with data compression used to produce the emission. Assuming otherwise is a major misunderstanding of Shannon's Information Theory within the physics community.
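(For concreteness, the geometric half of this argument in its standard textbook form, added here only as an illustration and assuming an idealized isotropic point emitter of total output P: whatever crosses one enclosing sphere must cross them all, so the intensity at radius r falls as the inverse square.)

\[
I(r)\cdot 4\pi r^{2} \;=\; P
\qquad\Longrightarrow\qquad
I(r) \;=\; \frac{P}{4\pi r^{2}} .
\]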

Rob McEachern

Don,

"There is an interesting assumption in information theory - that there is a limit to what can be compressed or represented by a 'unit' of information. There might be a limit, given today's mathematics, but will that always be the case?

How efficiently can I represent pi?"

There are two branches to information theory:

(1) Shannon's original theory has to do with how many discrete (quantized) samples are needed to perfectly reconstruct an arbitrary continuous function, such as those which might be solutions to the equations of mathematical physics. Shannon's Capacity Theorem specifies both the number of samples and the number of bits per sample required to achieve perfect reconstruction. Thus, it provides the missing link between the worlds of continuous functions and quantized results. It is easy to show, for example, that setting Shannon's Capacity equal to a single bit of information yields the minimum value of the Heisenberg Uncertainty Principle. In other words, the Heisenberg Uncertainty Principle simply means that every observation must contain one or more bits of information. Otherwise, it is not an observation at all - just noise. That is why you cannot determine the values of two variables like position and momentum - in the limit, they only encode a single bit of information, between the two variables! This is also the cause of the so-called "spooky action at a distance" and the correlations observed in experiments attempting to test Bell's Inequality Theorem.
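(A minimal numerical sketch of the "capacity equals one bit" idea, using the textbook Shannon-Hartley form C = T*B*log2(1 + S/N); the function name is purely illustrative, and the link to the Heisenberg limit via the time-bandwidth product is the claim being made above, not derived here.)

import numpy as np

# Shannon-Hartley: bits recoverable without error from a signal of
# bandwidth B (Hz), observed for time T (s), at signal-to-noise ratio snr.
def shannon_bits(T, B, snr):
    return T * B * np.log2(1.0 + snr)

# Demanding that an observation carry exactly one bit, at snr = 1,
# pins the time-bandwidth product T*B to 1 - the same kind of lower
# bound as the Fourier/Gabor limit that underlies Heisenberg uncertainty
# once E = h*f converts frequency spread into energy spread.
snr = 1.0
TB = 1.0 / np.log2(1.0 + snr)        # solve T*B*log2(1+snr) = 1 for T*B
print(TB)                            # -> 1.0
print(shannon_bits(1.0, TB, snr))    # -> 1.0 bit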

(2) Algorithmic Information Theory, which deals with data compression, AFTER the data has been represented in quantized form.

The physics community has been mostly interested in (2), which is very unfortunate, since it deals only with already-quantized observations and so has little relevance to physics. But (1) addresses the question - Why are observations and measurements quantizable in the first place? - which is of direct relevance to the correct interpretation of quantum theory.

Rob McEachern

Very interesting Rob. I hope you will be submitting an essay with these ideas.

Phillip,

I have submitted a couple of essays in the past that touched upon some of these issues, but as you yourself have observed, the physics community has little interest in ever looking at such things. These days I prefer to put things on vixra. You will find a couple of my posts on these matters there. Thanks for creating the site!

Rob McEachern

Robert,

if you are looking in, and apologies to Philip... your statement that all long-range emissions make their entire information content observable on all the intervening spherical surfaces, as a rationale for the inverse-square law, is rather intriguing. Could you elaborate a bit, please? Thanks, jrc

John,

My contention is simple: two things, be they particles, waves, fields or anything else, cannot interact if they cannot even detect each other's existence. Detection requires detection of information - at least one bit. Hence, all interactions are driven by information detection. Since the ability to detect information from a signal is a function of the signal-to-noise ratio, which is in turn a function of the distance squared, long-range interactions are governed by the inverse-square law. At short range, the emitter may appear as an extended source rather than a point source. Consequently, the situation with regard to the signal-to-noise ratio is more complicated than just an expanding sphere.
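(A toy sketch of the far-field part of that argument, assuming an isotropic point emitter and a fixed noise floor at the detector; the names and numbers are illustrative only.)

import numpy as np

# Power from an isotropic point emitter spreads over a sphere of area
# 4*pi*r^2, so the signal captured by a fixed-size detector - and hence
# the SNR against a fixed noise floor - falls as 1/r^2.
def snr_at_distance(emitted_power, r, noise_power, detector_area=1.0):
    received = emitted_power * detector_area / (4.0 * np.pi * r**2)
    return received / noise_power

for r in (1.0, 2.0, 4.0):
    print(r, snr_at_distance(1.0, r, 1e-3))
# Doubling the distance cuts the SNR by a factor of 4: the inverse-square
# behaviour invoked above for long-range (far-field, point-source) interactions.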

This is why quantum tunneling occurs - if an entity cannot even detect the existence of a barrier, and the barrier cannot detect the entity, then the barrier, in effect, is not even there.

It is also why the phenomenon of virtual particles exists. I like the analogy of two submarines, moving through an ocean of noise, trying to detect (and thus interact with) each other. If they make contact, they commence to interact, generating emissions that can be detected by others. But if they quickly lose contact (can no longer even detect each other), they return to running silent and running deep. And the ships on the surface, which themselves detected the subs' initial response, are left to wonder what just happened - was there really anything there?

Obviously, individual particles do not expand with distance, so their detection is not an inverse-square law - it is described, statistically, by the quantized behaviors which are the subject of quantum theory.

Rob McEachern

Thanks Robert,

signal-to-noise following the inverse square being the rationale - that clarifies. I am always hoping for something more than the observed measurement we have had since Newton. I guess I'm still eating Lotus. :-) jrc

Hi Philip, your essay is a pleasure to read, and once I had started I was compelled to read to the end. Your ideas about story telling resonate with my own thinking about how we relate to the world, especially via our senses and by imposing singular perspectives. I wonder why, then, at the end you say "If they don't know then they must be prepared to consider different philosophical options, letting the mathematics guide the way until the experimental outlook improves." Why do you say mathematics must guide the way? Why not biology first? (To elucidate the effects of building from a literal, human-centered perspective, or to seek and eliminate its effects.) Or why not all of the sciences leading together in a multidisciplinary effort? Well done, kind regards Georgina

    Robert,

    Thank you for your reply.

    As I understand Information theory, it is built upon the use of positional representations of a number in a specific base. The use of logs in the exposition means it presumes a single-base representation of number values (it is locked into one base, even if that base can change).

    This is all that is needed for Real numbers.

    However, the suggestion I make is that there could be more powerful methods of representation that are not limited to a single base. This would suggest that Information theory can be expanded, as could its uses. It also means the limits placed by the current theory, using logs, may not be absolute limits.

    Don

    Don,

    "As I understand Information theory, it is built upon the use of positional representations of a number in a specific base. The use of logs..."

    That is not correct. Shannon was interested in determining under what circumstances a receiver would be able to perfectly reconstruct a transmitted message, without producing any errors in the encoded message. For example, suppose you wanted to send some critical information (where even a single error in a received number would have very bad consequences), like a list of bank account numbers, via a radio message. In effect, the "base-2 log" function in his Capacity formula expresses the number of bits per sample needed to digitize the analog signal. Since the Capacity formula represents the maximum number of bits that can be received without error, it is important to use base 2. You could use a different base, but then you would have to do a conversion, to turn the number into the correct expression for counting bits.
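    (A small illustration of the base point, assuming the usual Shannon-Hartley capacity with Nyquist-rate sampling; changing the log base only changes the conversion factor needed to count bits.)

import math

# Bits per Nyquist sample (2B samples per second) from C = B*log2(1 + S/N):
snr = 15.0
bits_direct = 0.5 * math.log2(1.0 + snr)                  # base-2 log: already in bits
bits_via_ln = 0.5 * math.log(1.0 + snr) / math.log(2.0)   # natural log, then convert
print(bits_direct, bits_via_ln)                           # both -> 2.0 bits per sample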

    Rob McEachern

    Hi Georgina, thanks for your interesting question. I do think that the human side of experience and evolution is relevant to the philosophical side of how we should understand our place in the universe. This came up more in the previous essay. However, I stop short of thinking that biology is in any way fundamental. I think that mathematics is the right place to look for what is fundamental, and it is the more powerful tool for developing the harder side of the theory. Perhaps that is where our thinking diverges.

    Nevertheless, there are grey areas in my thinking, and I do hope that we get a few essays that argue the case for biology being fundamental, or that there is more to be learnt about foundations from biology. It seems like a more radical idea, but perhaps my view can be pushed a little in that direction. We will see.

    Yes, as we gather information our model of reality becomes a better approximation. Sometimes this can lead to a paradigm shift where the underlying principles are suddenly very different, even if the predicted measurements don't change by very much.

    Rob,

    You are missing my point. He is still using a single-base number system - that of logs. It doesn't matter which base he uses, as they are inter-convertible.

    We all use a single-base number system, be it decimal or binary. We don't really know of any other one, as we have been using decimals and logs for a few hundred years.

    Is that system the best one that can be built? Or can a better one, say using multiple bases in the same number representation, be devised? Such a system might be able to represent numbers we cannot today. If so, then these calculations MAY need to be revised (as would most other ones).

    A (poor) analogy is the Romans attempting to build space ships using their Roman numeral system. The calculations would be much too hard and many measurements could not be performed (as they could not properly represent Real numbers). They would not be able to make certain measurements we can today, without proper scaling of numbers.

    Might we be in a (somewhat) similar situation, where a more powerful numeric system could be devised that would alter what and how we measure and/or calculate?

    Then a different value, inexpressible today, would change what a bit can represent.

    As I understand things, Shannon is saying this is the absolute limit regardless of mathematical tools. I am suggesting his statement needs to be limited to the tools we are currently using and there might be a different way, since there might be better mathematical tools.

    Don

    Don,

    "We don't really know of any other one as we have been using decimals and logs for a few hundred years." Most modern, communications systems don't encode information with any numerical system - they use "alphabets" of peculiar waveforms. For one example, see the wikipedia article on Quadrature Amplitude Modulation. These strange alphabets get translated into bit-patterns, not numbers, by the receiver. A second translation step might interpret those patterns as sets of numbers. But it might also interpret them as sets of alphabetic characters, like the ones I typed and you are now reading.

    "Then a different value, inexpressible today, would change what a bit can represent." It already does that - it represents both nothing and everything. Bits of information are not like bits of data. It is not like a measurement that has a most and least significant bit or digit. It is like an index number - an index to a look-up table. Consequently, what that index/number "represents", is whatever totally arbitrary stuff you may have placed into the Table, at that index location. In other words, what it "represents", has no relation whatsoever, to its numerical value. You can change what the number represents, by simply changing whatever "stuff" is in the corresponding location of the Table. One such looked-up meaning, of an index/number, might say "compute the square-root of pi. Another table, for the very same index/number, might say "slap your face with your right hand." It is completely arbitrary - having no relationship whatsoever, to the value of the index/number.

    Don't feel bad if you don't understand; few physicists do either. It is related to the "measurement problem" - few physicists seem to realize that physical entities do not have to treat measurements as measurements. They may treat them as indices (symbols), resulting in a total disconnect between the "value" of the supposed measurement and its "meaning", in the sense of what behavior a system undertakes (compute sqrt(pi) versus slap your face) as a result of performing a measurement and obtaining some particular value for it.

    Rob McEachern

    5 days later

    Philip and Georgina,

    When it comes to "there is more to be learnt about foundations from biology.", I harken back to a day long ago when I was struck by the physical symmetry of the classic Platonic Solid, the Octahedron. It has a number of planar aspects we find replicated in chemical arrangements into molecules and interactions, and shares an internal angle with the narrow range of the Brewster Angle which polarizes light in a laser. And among the most primitive known viruses, are octahedral entities. I have ever since had a waking nightmare that science will someday discover 'the spark of life' in that symmetry, and a naturally occurring compounding of energy that animates even the simplest volume in seeking form. Not too far from many a primitive religious belief that all things are imbued with a 'spirit'. I doubt we as a species have the wisdom to know such things. We could become Borg! :-) jrc

    Thank you very much for the essay. Only real entities can act or be acted upon. If the various objects mentioned in your article (under different categories) are real, they would have objective reality and positive existence. These qualities can be provided only by their substance. Therefore, whichever entity provides substance to these real entities is more fundamental than any of them. You mentioned, "Our reality is what we experience". Our senses and instruments also have limited capability. Entities we do not sense or experience, but which have substance, are also real. Our inability to experience them would not make them unreal.

    An entity, its parameters, its properties or its actions cannot be defined by its own products. Therefore, products of substance cannot define substance. Since we ourselves are formed by fundamental substance, it is impossible for us to define substance, the most fundamental entity. The most logical candidate for the substance of all real entities is 'matter'.

      Philip,

      Most stories about fundamental aspects of nature start at a fairly high level. In other words, the aspects considered are not at all fundamental. Still, reality appears to exhibit structure, and that structure will be based on one or more foundations. The search for such a foundation has been undertaken several times, without much success. The reason is that physics took another route: it works by interpreting and precisely describing observations, and at the same time it mistrusts deduced statements. This attitude inhibits the exploration of existing foundations.

      Garrett Birkhoff and John von Neumann most probably discovered one of the foundations of physical reality. This entry point was never seriously explored. See: The Incredible Story About the Reality; http://vixra.org/abs/1801.0033

        Right, now that my essay is up to illustrate where I'm coming from, some more comments.

        I like how you characterized your approach over in my thread---as 'pushing back' the fundamentals. This somehow seems very intuitive: the more general the mathematical structure, the fewer assumptions have to be made, and the less attack surface there is for 'But why this?'-type questions.

        But does this process have an end? In some sense, you can always generalize further---throw away some more axioms, to put it starkly. When are we general enough? Is there some endpoint that does not contain any assumptions that can be rationally doubted---and even if so, does this say something about the world, or about the boundaries of our reason?

        Exceptional structures seem to be good candidates for endpoints, in particular because they lend themselves to chains that actually do seem to terminate. Octonions are the division algebra with the highest dimension, things stop there---but then, why division algebras? E8 is the largest exceptional simple Lie group, but why any of that?

        That said, I can certainly relate to the intuition that there's got to be some mathematical object of maximal symmetry, something ideally self-justifying, which---one might hope---gives rise to observed phenomena through some process of iterated emergence, be that symmetry breaking or multiple quantization. So this is kind of a point where I have my doubts whether the whole thing works---but would love to be proven wrong.

        The "Why this?" question is an important driver in my thinking. I should perhaps have mentioned it more as you have. Of course it is nothing new. Wheeler asked what gave the equations wings to fly? Hawking asked what breathes fire into the equations?

        I know some people see exceptional structures like E8 or the octonions as something that can answer this question. These things do seem to turn up, but for me they don't answer the "Why this?" question. I feel a bit like the annoying child who just keeps asking "why?" every time something is explained to them. If any information at all is needed to specify the way things are, then there is still a question to answer.

        One candidate for a solution is that you just keep pushing back through layers of emergence until you arrive at a system with no information, where everything is possible. This is like Tegmark's MUH. This has problems. Firstly, you need to define some measure or weight on the space of all mathematical possibilities, and that introduces an unsourced system of information. Again I would ask "Why this?" Secondly, it does not explain why there is order and symmetry in the universe. If anything can happen, we should live in a world where anything is possible. We might as well be living in a Disney cartoon.

        Emergence is usually something approximate. The Navier-Stokes equations emerge from inter-molecular forces in a fluid, but if you look closely enough you can see the molecules. I think the emergence of space and time is like that too: if we look closely enough we can, in principle, see the underlying structures from which they emerge. However, I think there is a layer beyond which we can't see, even in principle - a system that emerges from a universality principle on the ensemble of all possibilities, without any input of information. I know I keep saying this, but the important point that responds to your comment is why I think it must be that way. It is because that provides the only possible answer I can think of to the "Why this?" question that fits with our experience of a rational, structured universe without any external source of information.

        What is the difference between a necklace and a mouse? The mouse, or maybe it was moose, is a chain of Lie algebras. The vector spaces in the representations of these algebras form a sort of quiver. It would seem to me that in some setting, if the group is a quotient H = G/K, then this algebra corresponds to a Hermitian symmetric space. An elementary example is the Grassmannian manifolds. This is an interesting development, where the local charts on the manifold are made of vectors that locally form a Lie group, and the atlas construction is a moose, or what appears to be a necklace.
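        (For concreteness, the elementary example mentioned, written as such a quotient - a standard fact, added here only as an illustration.)

        \[
        \mathrm{Gr}(k,\mathbb{C}^{n}) \;=\; \frac{U(n)}{U(k)\times U(n-k)},
        \qquad
        \dim_{\mathbb{C}}\mathrm{Gr}(k,\mathbb{C}^{n}) \;=\; k(n-k).
        \]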

        My essay has finally shown up. I can now vote, and I gave your essay a boost.

        https://fqxi.org/community/forum/topic/2981

        LC