"... a group with one dimension for every degree of freedom in physics ..." If nature is infinite, then it is plausible to assume that physics has infinitely many degrees of freedom. If nature is finite, then nature might have only 78 degrees of freedom. Consider 3 copies of a model of 26-dimenional bosonic string theory, yielding 78 dimensions of bosonic waves. There might be a boson/fermion duality theorem derivable from Wolfram's cosmological automation. There could be 6 "barks" or "big quarks" each carrying a barkload of 12-dimensions of information, yielding 72 dimensions controlled by Fredkin's 6-phase clock, thus 78 dimensions of fermionic information. Each 12-dimensional barkload might represent 4 dimensions of spacetime, 3 dimensions of linear-momentum density, 3 dimensions of angular-momentum density, 1 dimension of quantum-spin density for matter, and 1 dimension of quantum-spin density for antimatter. By redundant representation of information, it might be possible to derive an 11-dimensional model of M-theory and a 12-dimensional model of F-theory -- the idea is that the interior of the multiverse would be 72-dimensional in terms of "barkload" data, and the measurable universes would all be 71-dimensional and located on the boundary of the multiverse.

Phil,

"I expect to find this symmetry in a pregeometric meta-law that transcends spacetime."

That says it pretty well. Like the shape of a Lotus petal bespeaking the whole form of the opening blossom. In spite of the possibility that not even the universe always works perfectly. Merry Christmas and a Happier New Year. jrc

    "The biggest difficulty faced by theoretical physicists of this generation is that positive experimental input on physics beyond the standard models is very hard to come by. That situation could change or it could continue for much longer. Without empirical data how is it possible to tell if the answer is string theory, loop quantum gravity, non-commutative geometry or something else? The theorists can still progress by working with the few clues they have, but success will depend on guessing correctly the answer to questions like 'what is "fundamental"?' If they don't know then they must be prepared to consider different philosophical options, letting the mathematics guide the way until the experimental outlook improves. If young researchers are all corralled into one pen it could turn out to be in the wrong place. The chances are they are going to be influenced only by the highest profile physicists. If those leaders say that symmetry is unimportant because it is emergent or that geometry is more fundamental than algebra, other possibilities may be neglected. It appears to me that there is a clear program that would combine the ideas of algebraic geometry with quantum field theory. It just requires mathematicians and physicists to bring their knowledge together."

    You nailed it!!!!

      Dear Philip,

      I knew there's a reason I always prioritize reading your contributions to these contests. Excellent work, and you certainly succeeded in your aim of provoking the readers' minds.

      I particularly like this sort of theory-independent view in terms of events: whatever the fundamental theory may turn out to be, it has to have events within it in some form, be those worldlines crossing, particles of all conceivable kinds interacting, string splittings or whatever. So let's not worry about those details for the moment, but rather, think in terms of those events, and the stories that can be told with them.

      One tiny bit of criticism I have is that there are many deep and possibly controversial ideas that aren't developed in the way they deserve (although that is likely due to the length restrictions, and I'm also aware that this is a criticism you could probably lob straight back at my own essay if/when it gets posted). In particular on topics where I perceive some confluence with my own thinking---like the relative nature of reality, or the idea that in terms of information, 'nothing' and 'everything' are really the same---I would have liked more discussion, just to see how somebody like you develops these notions.

      But these are the complaints of one having been hooked by your ideas, and now finding themselves jonesing for more. Which, as you said, is really all you intended with this essay.

      All the algebraic stuff has re-awoken that curious sense that if you could just take one further step back, you'd just see the big picture pop out. There are so many tantalizing hints and connections, it's hard to believe that there isn't some fundamental story to be told in these terms.

      But I think that's for someone smarter than me to discover. I may get back to meddling with this some day (although my love of the octonions means that I'm skeptical of requiring associativity---alternativity is really all you need!), but for now, I'll concentrate on other matters.

        "The mind itself is not fundamental. Neither are the biological processes by which it works, but the principles of information by which it functions are"

        The central principle of Shannon's information theory is that, in order to reduce any transmitted message to the fewest possible encoded bits, the transmitter must never send anything that the receiver already knows. For example, I don't need to keep telling you your name. But everything that you can predict is a subset of the things you know. It follows that everything you can predict is not even considered to be information in Shannon's theory. That fundamental reality is enough to make most physicists apoplectic. They are searching for the truth, but as the movie said, "You want the truth? You can't handle the truth." Because the truth is, the information content of most physical processes lies almost entirely within the unknown initial conditions required to solve the equations of mathematical physics, not in the long-sought equations themselves. This is what "emergence" emerges from.
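        A quick way to see the difference this makes (a minimal sketch in Python, with zlib standing in for an ideal encoder; the byte strings are arbitrary examples, not anything from Shannon):

            import os
            import zlib

            predictable = b"your name is Rob " * 1000  # the receiver could predict this
            unpredictable = os.urandom(17000)          # nothing to predict

            # A predictable message compresses to a tiny fraction of its length;
            # an unpredictable one barely compresses at all.
            print(len(zlib.compress(predictable)))    # a few dozen bytes
            print(len(zlib.compress(unpredictable)))  # ~17000 bytes, often slightly more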

        Rob McEachern

          I hope you do get round to submitting an essay this year. There is some overlap between our philosophies which helps me find ways to expand my own viewpoint.

          There are two sides to my essay, the philosophical and the mathematical. On the philosophical side it is partly about finding the right words to express ideas in a way that makes them sound reasonable. I think in terms of a high degree of emergence, so fundamentals must take us away from anything we know in conventional physics. This is bound to be ambitious and speculative to a high degree, but structures like space and time and particles have properties that are too specific to be fundamental, in my opinion. Information, events and the relationships between them are much more generic. I think you have a similar view.

          The mathematical side is more important of course. Without mathematics to interpret the philosophy there is no end point. I have some mathematical ability in problem solving and algorithms but the more abstract ideas needed to develop these ideas are outside my comfort zone. I feel like an art critic who can appreciate what is good and can talk about how things should be, but without actually having enough skills and creativity to do it myself.

          I agree that non-associativity is likely to play a part. I see octonions as just one algebraic structure with some nice properties that plays a role in certain promising solutions. The starting point must be something much more general, like free universal algebra or higher category theory. Simple categories are associative, but with higher categories it is more natural to relax and weaken the structure to allow more interesting properties. The identities that define associativity are replaced with isomorphisms. Symmetry always arises as a useful tool in any algebraic structure. For example, if you want to understand the octonions you will certainly want to know their automorphism group, and then there are the exceptional Lie algebras up to E8 that are also related to the octonions. I think symmetry has to be generalised to supersymmetry, quantum groups, n-groups etc., so there is a long way to go.

          The free Lie algebra that I discuss in the essay is just a starting point that is simple enough to illustrate my point. It provides the important mapping from algebra to geometric structures using iterated integrals along paths. I suspect that there are generalisations of this where iterated integrals map more general algebras onto branching networks, like Feynman diagrams. I don't know if I will ever get my mind round it well enough to formulate something that works.
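          For any reader who wants something concrete to play with: the first two levels of the iterated-integral map (the "signature" of a path) are easy to compute for a piecewise-linear path. Here is a minimal Python sketch (the function name and example path are mine, not from the essay):

              import numpy as np

              def signature_level2(path):
                  """Level-1 and level-2 iterated integrals of a piecewise-linear
                  path, given as an (n_points, dim) array of waypoints."""
                  dim = path.shape[1]
                  s1 = np.zeros(dim)          # level 1: total displacement
                  s2 = np.zeros((dim, dim))   # level 2: iterated integrals over dx_i dx_j
                  for delta in np.diff(path, axis=0):
                      # Chen's identity for appending one linear segment:
                      s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)
                      s1 += delta
                  return s1, s2

              # Example: a triangular loop in the plane.
              path = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
              s1, s2 = signature_level2(path)
              print(s1)                   # zero displacement around a closed loop
              print(s2[0, 1] - s2[1, 0])  # antisymmetric part = 2 x signed area = 1.0

          The antisymmetric level-2 component is the signed (Levy) area enclosed by the path - the simplest case of the algebra recovering geometry.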

          Philip Gibbs

          I thank you for this interesting article. It has provoked my thinking. We should, as you say, regard science as finding better and better approximations. You are also right in regarding information as fundamental in the field of physics. It is dangerous to listen to only one guru, as you say.

          Best regards, John-Erik Persson

          Good luck.

            Rob, you are right to highlight the principle of redundant information. Imagine you wanted to send some information into space to tell any aliens something about us. You might send a bitmap photo image, for example. To keep the transmission short you could compress the data, but the aliens would not have the decompression algorithm. When data is maximally compressed it becomes a stream of random-looking bits that is impossible to decode without the algorithm. You could send the algorithm in some uncompressed form, but that is adding extra information. The point is that fully compressed data without redundancy is incomprehensible.
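            A toy version of that point (a minimal sketch in Python, with zlib standing in for whatever compressor the aliens lack; the "image" is an arbitrary example):

                import zlib

                bitmap = b"\x00\xff" * 5000      # a highly regular "image"
                packed = zlib.compress(bitmap, 9)

                print(len(bitmap), len(packed))  # 10000 -> a few dozen bytes
                # Without the decompression algorithm, the packed bytes are
                # noise-like; the redundancy that made the data readable is gone.
                print(packed[:16].hex())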

            The information that describes the state of the universe is holographic, so it can be represented on a surface. This is the compressed form of the data. What we observe in the bulk volume is an uncompressed form with lots of redundancy in the form of gauge symmetry. In this form it is comprehensible to us. We observe and understand the universe in its expanded version, not the compressed holographic form.

            Philip & Andrew,

            There is an implicit assumption when depending upon mathematics "to guide the way" for new directions in physics. That assumption is that our current mathematics is adequate to the tasks we attempt to use it for. If it is not, then we will find it very difficult to make much progress. Mathematics likely suffers from the same effect as you describe for physics - the pen and corral situation.

            I will suggest that this is actually the problem faced by physics -- and, since physics tends to lead the other scientific disciplines, by all of science: the mathematical tools we currently have are not adequate to the tasks science puts to them.

            The limitations of our mathematical tools might actually be keeping us from seeing aspects of our universe, which would be even more reason to consider fundamental reviews of mathematics and its limitations (especially on how it is applied).

            I believe we will find a guide to a new direction this way.

            Don

            Philip & Robert,

            There is an interesting assumption in information theory - that there is a limit to what can be compressed or represented by a 'unit' of information. There might be a limit, given today's mathematics, but will that always be the case?

            How efficiently can I represent pi? Using decimal notation, it is an infinite non-repeating sequence. If I use pi as the base of the numeric system, then pi is written 10 -- possibly a tremendous compression of information, although not without its problems for other values. What if a numeric system were found that used different bases within the same representation of a number -- might it supplant our current system?
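            That "base pi" idea can be made precise as a greedy beta-expansion (a minimal sketch in Python; the function name is mine):

                import math

                def base_beta_digits(x, beta, n):
                    """First n digits of x in (possibly irrational) base beta,
                    computed greedily, plus the position k of the leading digit."""
                    k = math.floor(math.log(x, beta))  # exponent of the leading digit
                    digits = []
                    for p in range(k, k - n, -1):
                        d = math.floor(x / beta**p)
                        digits.append(d)
                        x -= d * beta**p
                    return digits, k

                print(base_beta_digits(math.pi, math.pi, 5))  # ([1, 0, 0, 0, 0], 1): pi = "10"
                print(base_beta_digits(10.0, math.pi, 8))     # 10 gets a messy non-terminating expansion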

            If context and perspective can make such a difference in the presentation of information, can we be sure that the limitations of our current representational structures will not be radically altered in the future? Is a positional numeric system the optimal way to present the value of pi? Like optimization concerns in general, there might not always be an optimal solution. This could suggest there is no limit to what can be represented as (a unit of) information.

            This also appears to be the implicit assumption of any final Unification Theory - that there is an optimal way (usually assumed to be mathematical) to characterize all phenomena in the universe. If mathematics cannot present an optimal solution then likely neither can physics.

            Don

            Philip:

            "The information that described the state of the universe is holographic, so it can be represented on a surface. This is the compressed form of the data. What we observe in the bulk volume is an uncompressed form with lots of redundancy in the form of gauge symmetry."

            The information content of an emission is not the same as the information content of the emitter that produced it. Every emission must travel through every spherical surface surrounding the emitter with a radius less than the emitter-receiver distance, if it is ever to be received in the first place. Thus, the entire information content of every long-range emission must be observable on those spherical surfaces. This is why the holographic principle exists, and why all long-range forces are inverse-square. It has nothing to do with the information content stored within the emitter, or with data compression used to produce the emission. Assuming otherwise is a major misunderstanding of Shannon's information theory within the physics community.
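            In equation form, the inverse-square part is just flux conservation (the standard textbook argument, stated here for concreteness): if an emitter radiates total power P, every sphere of radius r intercepts all of it, so the flux density at distance r is I(r) = P / (4*pi*r^2), which falls off as the inverse square of the distance.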

            Rob McEachern

            Don,

            "There is an interesting assumption in information theory - that there is a limit to what can be compressed or represented by a 'unit' of information. There might be a limit, given today's mathematics, but will that always be the case?

            How efficiently can I represent pi?"

            There are two branches to information theory:

            (1) Shannon's original theory has to do with how many discrete (quantized) samples are needed to perfectly reconstruct an arbitrary band-limited continuous function, such as those which might be solutions to the equations of mathematical physics. Shannon's capacity theorem specifies both the number of samples and the number of bits per sample required to achieve perfect reconstruction. Thus, it provides the missing link between the worlds of continuous functions and quantized results. It is easy to show, for example, that setting Shannon's capacity equal to a single bit of information yields the minimum value of the Heisenberg uncertainty principle (a rough outline of this step is sketched below). In other words, the Heisenberg uncertainty principle simply means that every observation must contain one or more bits of information. Otherwise, it is not an observation at all - just noise. That is why you cannot determine the values of two variables like position and momentum: in the limit, they encode only a single bit of information between the two variables! This is also the cause of the so-called "spooky action at a distance" and the correlations observed in experiments attempting to test Bell's inequality.

            (2) Algorithmic information theory, which deals with data compression AFTER the data has been represented in quantized form.

            The physics community has been mostly interested in (2), which is very unfortunate, since it deals only with already-quantized observations and so has little relevance to physics. But (1) addresses the question - Why are observations and measurements quantizable in the first place? - which is of direct relevance to the correct interpretation of quantum theory.
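            As for the capacity-to-Heisenberg step mentioned under (1), here is a rough outline (a sketch under textbook assumptions, not a full derivation): the Shannon-Hartley capacity of a signal of duration T and bandwidth B is C = T*B*log2(1 + S/N) bits. Setting C = 1 bit at S/N = 1 gives a time-bandwidth product of T*B = 1, which parallels the Gabor limit delta-t * delta-f >~ 1; multiplying by Planck's relation E = h*f then gives delta-E * delta-t >~ h, an uncertainty relation at the Heisenberg scale.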

            Rob McEachern

            Very interesting Rob. I hope you will be submitting an essay with these ideas.

            Philip,

            I have submitted a couple of essays in the past that touched upon some of these issues, but as you yourself have observed, the physics community has little interest in ever looking at such things. These days I prefer to put things on viXra; you will find a couple of my posts on these matters there. Thanks for creating the site!

            Rob McEachern

            Robert,

            if you are looking in (and apologies to Philip)... your statement that all long-range emissions make their entire information content observable on all the intervening spherical surfaces, as a rationale for the inverse-square law, is rather intriguing. Could you elaborate a bit, please? Thanks, jrc

            John,

            My contention is simple: two things, be they particles, waves, fields or anything else, cannot interact if they cannot even detect each other's existence. Detection requires the detection of information - at least one bit. Hence, all interactions are driven by information detection. Since the ability to detect information from a signal is a function of the signal-to-noise ratio, which is in turn a function of the distance squared, long-range interactions are governed by the inverse-square law. At short range, the emitter may appear as an extended source rather than a point source, so the signal-to-noise situation is more complicated than just an expanding sphere.
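            A toy numerical version of that scaling (a minimal sketch in Python; the power and noise values are arbitrary assumptions):

                import math

                P_SIGNAL = 1.0  # emitted power (arbitrary units)
                NOISE = 1e-6    # assumed constant background noise power at the receiver

                def snr(r):
                    # The emission spreads over a sphere of area 4*pi*r^2, so the
                    # received signal power, and hence the SNR, falls as 1/r^2.
                    return (P_SIGNAL / (4 * math.pi * r**2)) / NOISE

                for r in (1, 10, 100):
                    print(r, snr(r))  # each factor of 10 in distance costs a factor of 100 in SNR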

            This is why quantum tunneling occurs: if an entity cannot even detect the existence of a barrier, and the barrier cannot detect the entity, then the barrier, in effect, is not even there.

            It is also why the phenomenon of virtual particles exists. I like the analogy of two submarines, moving through an ocean of noise, trying to detect (and thus interact with) each other. If they make contact, they commence to interact, generating emissions that can be detected by others. But if they quickly lose contact (can no longer even detect each other) they return to running silent and running deep. And the ships on the surface, which themselves detected the subs' initial response, are left to wonder what just happened - was there anything really there?

            Obviously, individual particles do not expand with distance, so their detection does not follow an inverse-square law - it is described, statistically, by the quantized behaviors which are the subject of quantum theory.

            Rob McEachern

            Thanks Robert,

            signal-to-noise following the inverse square being the rationale. That clarifies things; I am always hoping for something more than the observed measurement we have had since Newton. I guess I'm still eating Lotus. :-) jrc

            Hi Philip, your essay is a pleasure to read, and once I had started I was compelled to read to the end. Your ideas about storytelling resonate with my own thinking about how we relate to the world, especially via our senses and by imposing singular perspectives. I wonder, then, why at the end you say, "If they don't know then they must be prepared to consider different philosophical options, letting the mathematics guide the way until the experimental outlook improves." Why do you say mathematics must guide the way? Why not biology first? (To elucidate the effects of building from a literal human-centered perspective, or to seek out and eliminate its effects.) Or why not all of the sciences leading together in a multidisciplinary effort? Well done, kind regards Georgina