• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

"From Tom's link: 'Machine language; A set of instructions for a specific central processing unit, designed to be usable by a computer without being translated. Also called machine code.'

This is what I meant and what Rick said it was."

What you meant and what Rick claimed is not what I meant, which should be clear to you if you look further down to definitions number 2 and 3, and which should have been clear from the context of my remarks.

"Anyway my point made was that I questioned that the successful simulations in various computer languages do not add proof for the program being run."

A simulation of a physical experiment doesn't prove anything except the feasibility of the experiment.

"It also doesn't do harm. What I think it does do is demonstrate that Joy's model can be simulated by computer program."

It's more than that. Rick is incorrect that every simulation of a continuous function is a continuous function, and in this forum I gave a *specific* example of a continuous function (Chaitin's constant, the halting probability of a Turing machine) that is *not* transportable to different machine languages. The same algorithm generates different results.

It is a short step of logic from the contrast between the random coin-toss probability one sees in Chaitin's result, which is not transportable, and Joy Christian's framework, which is, to the hypothesis that the simulation of a continuous function (in nature) is a continuous function (in mathematical analysis). *It's strong evidence* that Joy's model is analytical. Chaitin's number shows that there is randomness in arithmetic; Joy's model points toward the absence of randomness in the foundation of nature. We invented arithmetic (Kronecker's belief notwithstanding); we didn't invent nature.
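A toy sketch of the machine-relativity claim above, assuming Python. The two "machines" here are invented interpreters, and nothing in this sketch is a real universal Turing machine, so it illustrates only the idea, not Chaitin's actual construction: the same definition of a halting probability, interpreted by different machines, yields different numbers.

```python
from itertools import product

def omega(halts, length=8):
    # Toy "halting probability": programs are all bit strings of one
    # fixed length (a trivially prefix-free set); omega is the
    # probability that a uniformly random program halts.
    progs = ["".join(bits) for bits in product("01", repeat=length)]
    return sum(1 for p in progs if halts(p)) / len(progs)

# Two invented "machine languages": same definition, different machines.
machine_a = lambda p: p.count("1") % 2 == 0   # halts on even bit-parity
machine_b = lambda p: int(p, 2) % 3 == 0      # halts on multiples of 3

print(omega(machine_a))  # 0.5
print(omega(machine_b))  # 0.3359375
```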

Best,

Tom

Me: "From Tom's link: 'Machine language; A set of instructions for a specific central processing unit, designed to be usable by a computer without being translated. Also called machine code.'

I am separating this out thinking maybe it will go away:

Me: "This is what I meant and what Rick said it was."

Tom: "What you meant and what Rick claimed is not what I meant, which should be clear to you if you look further down to definitions number 2 and 3, and which should have been clear from the context of my remarks."

Each definition said the same thing: machine language is the level at which the CPU operates. It consists only of strings of ones and zeros, which represent the presence or absence of charge at the inputs of transistors. Not only does the computer operate at this level alone; all other levels of computer language are meaningless to the CPU.
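A rough analogue of this point can be poked at in Python: the interpreter executes bytecode, not the source text, much as a CPU executes machine code, not C. (Python bytecode is not machine language; this is only an analogy.)

```python
import dis

def add(a, b):
    return a + b

# After compilation the source text "a + b" is gone; what the
# interpreter actually executes is an opaque instruction stream.
print(add.__code__.co_code)  # raw bytes (contents vary by Python version)
dis.dis(add)                 # human-readable disassembly of those bytes
```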

James Putnam

Hi Joy,

"While I agree with you to certain extent, you are underestimating the sociological value in the five different simulations of my model, authored by four different people, and written in four different programming languages."

I think I do see this. When the first simulation appeared, one or perhaps more specific program lines drew negative comments; but as each new simulation came out, the challengers had to find new program lines with which to renew their objections to the original simulation. I am not aware that anyone was able to keep a challenge alive by addressing the code of each new simulation. They should have been able to do this for any valid challenge.

I recognize the simulations as an important success: they prove that your model can be simulated by a computer. It is being simulated on various computers using various computer languages. Importantly, this probably includes various machine languages.

Perhaps this is well said or perhaps not. I am limited. What I will say confidently is that those programmers who simulated your model performed a valuable service.

James Putnam

Tom,

" we didn't invent nature."

Nor can we bottle her.

The problem with nature is that, prior to the calculation, you don't have all the input, which makes it difficult to calculate the output.

Regards,

John M

Tom,

"...the crux of your question, which I think can be reformulated as, 'Do prior results determine future probabilities?'"

I wasn't thinking about probabilities in particular. I was thinking that prior meaningful results of any experiment will, if used as inputs, affect future results of any new experiment. This effect will produce results that may form continuous functions(?) that will not be the same as those that would have resulted from random(?) inputs.

The question marks indicate that, for the first, I am not certain the word 'function' is correct. The second question mark represents my opinion about what constitutes randomness. I see the word used when meaning is associated with it, as, for example, with results that produce a bell curve; the input data clearly had direction. For me, randomness is meaninglessness, or having no associated direction. Otherwise I find the word misleading.

James Putnam

Tom,

Me: "It [The computer simulations of Joy's model.] also doesn't do harm. What I think it does do is demonstrate that Joy's model can be simulated by computer program."

Tom: "It's more than that. Rick is incorrect that every simulation of a continuous function is a continuous function, and in this forum I gave a *specific* example of a continuous function (Chaitin's constant, the halting probability of a Turing machine) that is *not* transportable to different machine languages. The same algorithm generates different results.

It is a short step of logic from the contrast between the random coin-toss probability one sees in Chaitin's result, which is not transportable, and Joy Christian's framework, which is, to the hypothesis that the simulation of a continuous function (in nature) is a continuous function (in mathematical analysis). *It's strong evidence* that Joy's model is analytical. Chaitin's number shows that there is randomness in arithmetic; Joy's model points toward the absence of randomness in the foundation of nature. We invented arithmetic (Kronecker's belief notwithstanding); we didn't invent nature."

Me: I guess I don't know how you are using the words 'machine language' and 'random'. If by machine language you still refer to higher-level computer languages, then I can't evaluate your statement.

With regard to the absence of randomness in nature, so long as my use of the word is understood to mean lacking meaning and direction, then I certainly agree with that. Any existence of meaninglessness, if it is possible, would destroy all meaning. I have stated this in the past as: The existence of randomness anywhere in the universe would destroy order in the universe.

Your example of the randomness of a coin toss says to me that you don't define random as lacking direction. It appears to me that your meaning of random is analogous to chance. I don't assume agreement with what I have said. I am not attempting to describe Joy's model. I guess that makes my part of this message off topic.

James Putnam

Never said this: Tom: "It's more than that. Rick is incorrect that every simulation of a continuous function is a continuous function...."

For the record, I have written tens of thousands of lines of assembler code, i.e., machine code, and tens of thousands of lines of high-level-language code (C, C#, VB, JavaScript) in my 35-year career doing engineering design and management. I have also done the hardware design on more than 40 microprocessor-based products. I did not have to look up the definition of machine code to know what it is. Tom, you looked like a fool today, claiming to know more than I do without knowing me, while stating something obviously incorrect in the same breath.

There is nothing mystical going on in computer programs. Lack of portability ALWAYS has a very straightforward explanation that lies entirely within the domain of the programming: in the high-level-language program and/or the compiler implementation. There is near-zero chance it lies in the machine code, and zero chance in the implemented algorithm itself if all implementations are true to it. If you think otherwise, Tom, you are wrong again.
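For what it's worth, the most common mundane culprit of the kind Rick alludes to is floating-point evaluation order, which optimizers and languages are free to regroup. A standard textbook example, not code from any of the simulations under discussion:

```python
# IEEE-754 doubles: addition is not associative, so the grouping of a
# sum (which an optimizer is free to change) alters the result.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a)       # 0.6000000000000001
print(b)       # 0.6
print(a == b)  # False -- same algorithm, different evaluation order
```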

Rick

  • [deleted]

Let's see;

Physical reality is required for processes to commence = bottom up

Evolving processes are required for physical reality to exist = top down

I might have it reversed, but it doesn't matter because my point was and remains that we can't choose one or the other, and neither does nature, because both are essential to reality. I think one can build upward from primitive structures, to create form of a specific type, but one can also whittle away from a larger field of possibilities. Additive and subtractive synthesis can bring one to exactly the same place. Add lumps of clay or sculpt down from the edges of a block? But this is sort of like the last contest question; is It or Bit the coin of the realm?

I think Fred has it right, in his comments above. The Sedenion algebra is simpler, in terms of rules. We cast out zero divisors and get the Octonions, apply the associative law and get the Quaternions, add commutativity to get the Complex numbers, and unitality to get to the Reals. Though we call H and O hyper-complex numbers, they are simpler than C and R in terms of the number of rules and conditions we must apply to define them. So I dispute your assertion that physical reality is necessary in order to process information deterministically. This seems exactly backwards: deterministic information processing is essential to physical reality.
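The quaternion rung of that ladder is easy to check concretely. A minimal sketch (a hand-rolled Hamilton product, not any library's API) showing that commutativity genuinely fails in H -- the rule one adds to descend to C:

```python
def qmul(p, q):
    # Hamilton product of quaternions represented as (w, x, y, z)
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j = (0, 1, 0, 0), (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  = k
print(qmul(j, i))  # (0, 0, 0, -1) = -k: i*j != j*i
```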

Regardless; topology itself CAN do the job. According to Myhrvold topology is the most efficient way to store information, so the exotic versions of figures like S7 may have a purpose. And of course; one can have flows on a simply connected surface like S4 or S7 which come to encompass the entire figure. So geometrical dynamism is deterministic. One could say that it is the physical reality of these figures that gives meaning to their dynamism - but it may be the dynamism of higher-order spheres that gave rise to physical reality instead.

Octo-It from Octo-Bit! But it works the other way too.

Jonathan

Oh well..

That was me above. But I'd like to clarify anyway. I'll start by saying that the bottom-up approach, in NKS (Wolfram's New Kind of Science) terms, leads to a formulation of the number families in the terms that Fred suggests, where the addition of rules makes the properties of each successive kind - from S to O, H, C, and R - progressively more restricted.

But in conventional Math, or in the historical discovery of the number types, the Reals were defined or discerned first, and then the Complex and Hyper-complex types. So that's a different version of bottom-up. I think the existence of something like the Bott periodicity (a repeating hierarchy of 8), or of the essential number types with Normed Division Algebras, predates our knowledge, however - they were at work before we discovered them.

I find compelling the idea of an octonionic phase in the primordial universe, as an approach to quantum gravity, and I've done some research in that area. So in my view; I AM talking about physically real octonions or S7 being a computer that is contiguous to and continuous with the universe of form we observe. And that is the context in which I interpret Joy's work, and make my statements above.

Regards,

Jonathan

Let me elaborate a bit..

In constructive geometry; there is a sense that the properties of objects and spaces remain undefined, until we define a way to make a determination - which is both a measurement and a geometric construction. The nice thing about constructivist proofs is that they actually give you a way to create what you are proving, as well as plumb its depths. Such a proof is easy to turn into a computer program.
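A standard illustration of that last sentence (my example, not one raised in the thread): Euclid's proof that there are infinitely many primes is constructive, and it translates directly into a program that manufactures the object the proof promises.

```python
def new_prime(primes):
    # Euclid's construction: multiply the given primes, add 1, and take
    # the smallest factor > 1 of the result -- a prime not in the list.
    n = 1
    for p in primes:
        n *= p
    n += 1
    d = 2
    while n % d:
        d += 1
    return d

print(new_prime([2, 3, 5]))  # 31 (2*3*5 + 1 is itself prime)
print(new_prime([2, 7]))     # 3  (2*7 + 1 = 15 = 3*5)
```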

The subject of distance estimation and triangulation comes to mind. How does one gauge dimensionality? Can one determine the dimension of an object or space with but a single observation? Likewise, seen from the outside; can one place a single point in the middle of an expanse, and tell whether it is 2-d, 3-d, or 4-d? Obviously; one cannot unambiguously rule out a higher or lower dimensional space with a single point of view, and no prior knowledge. A 0-d and an infinite-dimensional space will equally well accommodate a single point.

But making more observations, or adding more points, will allow progressively higher dimensional objects and spaces to be determined or defined. Anyhow; this suggests a bimetric approach with an upper and lower bound that both evolve with time - to describe the first moments of the universe's evolution.
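That idea can be put in a minimal sketch (the function name and the elimination routine are mine, invented for illustration): one point determines nothing, while additional points in general position pin the dimension down.

```python
def affine_dim(points, tol=1e-9):
    # Dimension of the affine span of a point list, via Gaussian
    # elimination (pure Python) on the vectors p - points[0].
    if len(points) < 2:
        return 0
    rows = [[a - b for a, b in zip(p, points[0])] for p in points[1:]]
    rank = 0
    for c in range(len(rows[0])):
        pivot = next((k for k in range(rank, len(rows))
                      if abs(rows[k][c]) > tol), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for k in range(len(rows)):
            if k != rank and abs(rows[k][c]) > tol:
                f = rows[k][c] / rows[rank][c]
                rows[k] = [x - f * y for x, y in zip(rows[k], rows[rank])]
        rank += 1
    return rank

line = [(t, 2*t, 3*t) for t in range(4)]        # collinear points in 3-space
plane = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
print(affine_dim(line[:1]), affine_dim(line), affine_dim(plane))  # 0 1 2
```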

All the Best,

Jonathan

Whoops..

That should read "making more observations, or adding more points, will allow progressively higher AND LOWER dimensional objects and spaces to be determined or defined." It is our bias that makes us believe that 8-d reality would be built up from 2-d primitives, rather than having a 4-d reality emerging from an 8-d one, but Joy has talked about this too. If Grassman's prescription was to treat points, lines, and triangles or spheres, on an equal footing - rather than regarding the primitives of one dimensionality to be more elementary or essential than another - perhaps S15 and S7 should be seen as basic to reality.

Jonathan

Hi Jonathan,

Yes, it is a bit counter-intuitive to see it that way. We are indeed biased though. But construct an arrow (vector) from a single point in the void and it can point in an infinite number of directions. If that is all there was in the void, how do you determine dimensionality? It is the interactions of all quantum objects that get the dimensionality we perceive down to 4 via more and more "math" rules being added. Nature does it automatically and Joy has shown the path Nature takes, IMHO.

However, I don't believe that octonions have anything to do with gravity specifically. There is no reason why gravity can't emerge from the Standard Model of particle physics. The missing part is a more complete understanding of what mass is. When we can show how the mass of all the elementary fermions is produced by Nature fundamentally, I believe gravity will emerge. And it is all about the geometry of interactions of all quantum objects. A very difficult task to deduce that geometrical structure though. There is more to it than just S15, S7, etc. because those topologies can produce an even richer structure. But I think we do have a good path to follow now.

Best,

Fred

James,

"With regard to the absence of randomness in nature, so long as my use of the word is understood to mean lacking meaning and direction, then I certainly agree with that."

Well, if meaning is independent of language, then apparently random events don't have any meaning of their own.

"Any existence of meaninglessness, if it is possible, would destroy all meaning. I have stated this in the past as: The existence of randomness anywhere in the universe would destroy order in the universe."

That's just the thing. Joy's measurement framework demonstrates that apparent randomness adds meaning to order. It's classical randomness, though -- a deterministic framework that applies from the cosmological initial condition to microscale events.

One shouldn't confuse randomness with probabilism.

Best,

Tom

Tom,

"One shouldn't confuse randomness with probabilism."

Exactly.

What if there is no initial condition, no proverbial beach we walked out from? That the center point in the horizon is only ever our particular here and now?

Regards,

John M

"What if there is no initial condition, no proverbial beach we walked out from? That the center point in the horizon is only ever our particular here and now?"

Sure, John. That's exactly what I said, with the same metaphor (Poincaré disk), in my FQXi essay before last. It supports my conclusion that the source of all information is a point at infinity.

However, measurement results are always finite, so the choice of initial condition affects what we observe here and now (local reality). The whole question of whether reality is probabilistic (standard quantum physics) or determined (classical physics) rests on the choice function. Who does the choosing?

Best,

Tom

Fred,

What if, rather than just fields interacting, they fundamentally expand and contract as well? Then gravity is just such a contracting field, and the intergalactic medium is the balancing expanding field. Physically, gravity is the contraction of mass, and radiation expands, so there seems to be a convective cycle of expanding radiation and contracting mass. We know that when mass breaks down and releases radiation, it creates pressure, everything from basic heat to atomic shock waves. So wouldn't there be an opposite side of this cycle, where radiation cools in the intergalactic medium to the point that it starts to condense out as gases and begins the condensation process back in the other direction? Then gravity is the vacuum effect of this contracting field, while the expansion of the universe is the expansion of the intergalactic medium, which is concurrently balanced by these gravitational vortices. The material not radiated out by stars is shot out the galactic poles as quasars.

Not all the gaps are filled in, but it is a pattern that might be worth examining, to see if it is pointing in that direction.

Jonathan,

"But making more observations, or adding more points, will allow progressively higher dimensional objects and spaces to be determined or defined. Anyhow; this suggests a bimetric approach with an upper and lower bound that both evolve with time - to describe the first moments of the universe's evolution."

Or, conversely, the space and the points are already there, and it is only our perception of them that grows, and it is that directionality of our growth which biases our perception. We are like one molecule of water, so we experience water from the perspective of a sequence of encounters with other molecules, but the feedback loop inherent in the thermodynamic properties balances out our sense of direction. Which doesn't negate our directionality, only balances it. It from bit and bit from it.

Regards,

John M

"The problem with nature is that, prior to the calculation, you don't have all the input, which makes it difficult to calculate the output."

John, that's exactly why quantum theorists and quantum computing specialists use Bayesian probability analysis, in the belief that prior probabilities change future output.

Consider Chaitin's number, though, in the same context. It is only a series of binary digits; remarkably, though, the order of the digits differs from one machine language to another, though they are supposedly calculating the same thing. If one chooses one of the digits as an initial condition, there is only a random 50-50 probability that the next digit will match. There is no prior probability for the next coin toss.
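The contrast between the two views of "prior probability" can be made concrete in a toy sketch (the function name and the uniform Beta prior are my choices, not anything from the thread): for a coin known to be fair, history is irrelevant to the next toss; for a coin of unknown bias, Bayesian updating makes history matter.

```python
from fractions import Fraction

def bayes_next_heads(flips, a=1, b=1):
    # Posterior predictive P(next = H) under a Beta(a, b) prior on the
    # coin's unknown bias (Laplace's rule of succession for a = b = 1).
    heads = flips.count("H")
    return Fraction(a + heads, a + b + len(flips))

fair_next = Fraction(1, 2)       # known-fair coin: history is irrelevant
print(bayes_next_heads("HHHH"))  # 5/6 -- past heads raise the estimate
print(fair_next)                 # 1/2, regardless of history
```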

Tom

Fred,

" ... construct an arrow (vector) from a single point in the void and it can point in an infinite number of directions. If that is all there was in the void, how do you determine dimensionality?"

The vector field that points everywhere has to live in S^1 (S^0 is left-right symmetric). So right away we know that the odd dimension topology is infinitely orientable by the embedded even dimension topology of S^0, left-right and up-down continuously. Is that also true for even dimension topology > 0? S^2 won't allow an embedded S^1 object a continuous mirror image transformation of itself (the space isn't simply connected). The next odd dimension topology, S^3, is again infinitely orientable.

Almost all the participants here think in terms of factorizability in the division algebras of S^0, S^1, S^3, S^7. I think only in terms of continuity and arithmetic modulo 2. It gets us to the same place -- however, I think it helps explain why so many cling to probabilistic arithmetic descriptions of natural events:

Odd-even parity is conserved only on the even topology S^0 and not on even topologies > 0. There is no conservation of parity on the equator of S^3, where the only possible results are 1, - 1 and i. Left-right orientability is of odd parity only -- so if we think only in terms of linear algebra, it is impossible to get other than a result of - 1 (see attachment). That is, the linear array gotten by squaring all the terms is

- 1, - 1, 1.

That a unit parallelized S^3 will admit the normed division algebras is not for me a more significant fact than that parity is restored by left-right symmetry in a continuous function over the orientable manifold.

One can't think of these as left side and right side algebras, because the S^3 manifold is simply connected; i.e., functions continuous from 1 or - 1 (the imaginary condition is static) are not linear. Vector algebra can take one only so far in describing why nature conserves parity in all (classical) interactions and why the sum of terms is zero. (The sensitivity of weak atomic interactions to chirality that results in apparent parity violations helps verify the case, because one sign change among 3 nonzero elements results in linear independence.)

Best,

Tom

Attachment #1: S3__equator_viewed_from_simple_pole_at_infinity.docx

Tom,

Me: "With regard to the absence of randomness in nature, so long as my use of the word is understood to mean lacking meaning and direction, then I certainly agree with that."

Tom: "Well, if meaning is independent of language, then apparently random events don't have any meaning of their own."

There is no 'apparently'. The word random means there is no direction or meaning in the hypothetical existence of physical randomness. The hedging word 'apparently' shows that you are not talking about randomness.

Me: "Any existence of meaninglessness, if it is possible, would destroy all meaning. I have stated this in the past as: The existence of randomness anywhere in the universe would destroy order in the universe."

Tom: "That's just the thing. Joy's measurement framework demonstrates that apparent randomness adds meaning to order. It's classical randomness, though -- a deterministic framework that applies from the cosmological initial condition to microscale events."

In other words the use of the word 'randomness' in the above paragraph does not mean physical randomness.

Tom: "One shouldn't confuse randomness with probabilism."

My use of the word random is consistent with the definition of random. One shouldn't confuse randomness with chance. It gives the false impression that order can arise from disorder.

James Putnam

Tom,

It seems to me you can have entirely deterministic processes, with ultimately random input.

I'm still not convinced the desire for a universal initial condition isn't more a consequence of our needs than nature's. Lemaître stated as much.

Regards,

John M