Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality

""To readers: Length and time are primary because they are naturally fundamental indefinable properties. They are indefinable because they are the properties of empirical evidence.""

"No they're not. Meter sticks don't come with pre-marked gradations, and clocks don't come with pre-marked faces."

The sticks and clocks aren't primary, nor are their markings. What is primary is object length and action timing, so that acceleration can be recorded and evaluated for meaning. The sticks, clocks, and their markings are tools. We can decide on different versions or choices of 'sticks', and 'clocks' don't need faces.

James Putnam

"What is primary is object length and action timing so that acceleration can be recorded and evaluated for meaning."

James, lengths and action of measure zero (at rest) accelerate only relatively. I'm not getting into this anymore.

At least two physicists working today on a deterministic theory of quantum correlations actually met and spoke with John Bell. One is Joy Christian. Another is Gerard 't Hooft.

The idea of superdeterminism (which Sabine Hossenfelder also entertains) is Bell's own loophole, which discards free will in favor of pre-ordained correlations. ("Everything is correlated with everything else -- not a little bit, but very, very strongly.")

While 't Hooft's framework is particle physics and quantum fields, and Christian's is the classical mechanics of continuous measurement functions, one has to expect that all deterministic theories of quantum correlations intersect in important ways, as shown particularly in slides 9, 15 and 23 of this 2000 't Hooft public lecture.

Why do I think Christian's classical framework is a superior construction? It preserves free will against the mystical conspiracy of superdeterminism; i.e., individual free will is hidden in plain sight, covariant with the free will of nature. That is, randomly covariant, such that nature on every scale participates in every event bifurcation -- which is exactly what we observe to happen on the classical scale, through the sensitive dependence on initial conditions that characterizes deterministic chaos. No conspiracies. No mysticism. Just a direct, rational correspondence between the measure space and the locally real measurement result.
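For a concrete picture of that sensitive dependence, here is a minimal sketch using the logistic map as a generic stand-in for a chaotic deterministic system -- purely an illustration, not part of Christian's or 't Hooft's construction:

```python
# Minimal sketch: sensitive dependence on initial conditions in the
# logistic map x -> r*x*(1 - x), a standard toy model of deterministic
# chaos (illustration only).

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.300000000)
b = logistic_orbit(0.300000001)   # perturbed by 1e-9

# The two deterministic trajectories separate to order-one differences
# within a few dozen iterations.
for n in (0, 10, 20, 30, 40):
    print(n, abs(a[n] - b[n]))
```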

Tom

""What is primary is object length and action timing so that acceleration can be recorded and evaluated for meaning.""

"James, lengths and action of measure zero (at rest) accelerate only relatively. I'm not getting into this anymore."

Objects accelerate. 'Relatively' has nothing to do with changing anything that I have said. I say it correctly and I stay with it to make sure it stays said correctly.

James Putnam

It's pretty simple really..

From our viewpoint, the universe went through an octonionic phase near or during the Planck Era, which initiated the inflation or expansion of a 3-sphere within the larger structure of S7. At this point, the octonionic reality appears to be coming from a smaller state. But this is in part because the coordinates with respect to that space - the 7-sphere - are anti-commuting. In effect, the current S3-shaped universe is a bulb-like protrusion on the surface of S7 - but it is also properly subsumed by the larger space of that figure.

Size/Distance is a further refinement of Interiority/Exteriority. This speaks to the point Connes makes in NCG 2000 (smooth > topological > measurable), because size/distance relates to measurability and interiority/exteriority relates to topology. In terms of early childhood learning, topology pertains to object constancy, and measurability pertains to the estimation of size and distance that is generally learned by age 2 1/2. I'll have more to say about R2D2 and C3PO vs. the Terminator later, but simply put: an exciting upshot of Joy's work is that it leads to the possibility of robots and computers more like the former than the latter.

All the Best,

Jonathan

To an extent..

One can have a theory that is super-deterministic from one perspective, yet allows absolute freedom of choice from another. Let us examine Joy's theorema egregium, which posits that the full embedding space of quantum correlations is S7. Classical physics works quite effectively from the assumption that the space we inhabit is 3-d and Euclidean. However, the characteristics of parallelized S3 reproduce the appearance of such a space quite effectively, because parallelization creates flatness.
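As a minimal illustration of why parallelization is tied to flatness: S3 is parallelizable because the unit quaternions form a group under multiplication, so the norm is preserved and one never leaves the sphere, while the imaginary units anti-commute. A quick numeric sketch of those two facts (generic quaternion arithmetic only, not Professor Christian's actual measurement functions):

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def norm(q):
    return np.sqrt(np.dot(q, q))

rng = np.random.default_rng(0)
p = rng.normal(size=4); p /= norm(p)   # random point on S3
q = rng.normal(size=4); q /= norm(q)   # another random point on S3

# The norm is multiplicative, so the product stays on S3:
print(norm(qmul(p, q)))                # ~1.0

# The imaginary units anti-commute: i*j = -j*i
i = np.array([0., 1., 0., 0.])
j = np.array([0., 0., 1., 0.])
print(qmul(i, j), qmul(j, i))          # [0,0,0,1] and [0,0,0,-1]
```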

So what if..

There is an S4's worth of 3-spheres, S3, within the 7-sphere, S7. So nature's form of super-determinism amounts to the selection of but one 3-sphere from among that vast range of possibilities. And we still have what appears to be absolute freedom of choice within a well-ordered but more limited range of possibilities, afforded to and by a specific range of sub-atomic particle species or combinations thereof. Now from such a perspective, a change in the orientation of the embedded S7 might appear to be a nuclear flavor change, or some such - and so would be disallowed. But within the range of such constraints, we do have an S3's worth of freedom of choice.
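For reference, the standard statement behind "an S4's worth of 3-spheres" is the quaternionic Hopf fibration; assuming that is what is meant here, it reads

\[
S^3 \;\hookrightarrow\; S^7 \;\xrightarrow{\;\pi\;}\; S^4,
\qquad
\pi(q_1, q_2) = [\,q_1 : q_2\,] \in \mathbb{HP}^1 \cong S^4,
\quad (q_1, q_2) \in \mathbb{H}^2,\ |q_1|^2 + |q_2|^2 = 1,
\]

so S7 is partitioned into disjoint S3 fibres, one for each point of the base S4.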

Regards,

Jonathan

The above should imply..

Nature as a whole has more freedom of choice than we do, or a higher-dimensional range compared to us, but we still have absolute freedom of choice within a specific range. This allows significant freedom of choice that is built-in, or automatic. But it also allows for nature to be super-deterministic, on some level.

All the Best,

Jonathan

I can express some satisfaction that..

I got to hear Professor 't Hooft lecture on topics related to those discussed in the article, on two separate occasions (FFP10 and 11), and to converse with him on that subject. I must add that, if you watch him for a while, it is easy to see that Gerard is always thinking, and always considering possibilities, but he has a keen ability to get directly to the point or the heart of matters - if asked a relevant Physics question.

In any case, I think there may be a way to weave Joy's approach together with 't Hooft's (ask me later), but in the meantime I agree that Joy's approach may be inherently superior in some regards. The impressive computational evidence from simulations continues to build up. Let the evidence show the true path.

All the Best,

Jonathan

Great link, John. I see some pretty weak assumptions, IMO:

" ... the only way for the light energy to find a reaction centre is to bounce through the protein network at random, like a ricocheting billiard ball. This process would take too long, much longer than the nanosecond or so it takes for the light energy to dissipate into the environment and be lost.

"So the energy transfer process cannot occur classically in this way. Instead, physicists have gathered a variety of evidence showing that the energy transfer is a quantum process."

Once again, researchers are relying on the probabilistic, linear nature of quantum theory (superposition, non-locality) to arbitrarily rule out classical processes. In fact, complex systems science is scale invariant, and principles such as the law of requisite variety, small-world networks, and nonlinear feedback functions have the potential to explain microscale effects without the conventional quantum assumptions.
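To make the small-world part of that point concrete: in a Watts-Strogatz graph, a tiny rewiring probability already collapses the average path length, so an excitation need not "ricochet at random" through many steps to reach a target. A minimal sketch with networkx, using illustrative parameters only, not a model of any actual pigment-protein network:

```python
import networkx as nx

# Watts-Strogatz small-world graphs: n nodes, each joined to k nearest
# neighbours, with rewiring probability p.  Parameters are illustrative.
n, k = 1000, 6
for p in (0.0, 0.01, 0.1):
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=1)
    L = nx.average_shortest_path_length(G)
    print(f"p = {p:>5}: average path length = {L:.1f}")

# Output shows the characteristic small-world collapse: the p = 0 ring
# lattice has a long average path (~n/2k), while a few rewired edges
# shrink it to order log(n).
```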

Tom

Tom,

I agree it was pretty speculative, and there did seem to be a certain top-down, whole-body process and a bottom-up, path-of-least-resistance action working together. I just thought it might be an interesting example of how these effects work out in nature. As you point out, there are lots of theoretical loose ends that are not sufficiently tied up. I think that when we really do start putting the pieces of the puzzle together, it will be more comprehensible and efficiently organized than all the parts scattered about currently seem.

Regards,

John M

This could be an example of where geometry mimics quantum 'weirdness.'

Following on the idea laid out by Tom above, one could examine the snowflake and its regular, self-similar, but varied forms. Its pattern arises from the molecular bond angles of the water molecule, in the presence of the varying processes that shape its growth. But it creates a fractal pattern, which displays elements of form and shape that are scale invariant. So instead of light rays bouncing around at random, obliquely incident light rays tend to be systematically directed or reflected away - making fresh snow appear brilliantly white.

What if the matrix in which light-converting molecules naturally appear is a pattern that acts like a super-efficient collector? When some early experiments failed to bear out the high rates of energy conversion seen in naturally occurring processes, I was curious. But when I looked into the experimental procedure, it led me to believe that the authors of that research had systematically excluded any structural bonding that would have held key molecules at a specific angle with respect to the incident light. In situ or in vivo settings, however, would undoubtedly involve self-similar structures, so there could be geometrical scale-invariance and resolution-dependencies working together in a semblance of quantum/classical cooperation.

Regards,

Jonathan

That is..

In self-similar structures, inward and outward, or larger and smaller, appear the same from the viewpoint of an observer on the microscale, but are obviously directional and asymmetrical from above, from the outside, or from a distance great enough to see the shape of the structure in its entirety. This sort of form is observable in the Mandelbrot set, and in its associated Julia sets, at the Misiurewicz points. The structure becomes more and more perfectly symmetrical and self-similar as one zooms in further and further, but displays an obvious asymmetry when one views the entire branch upon which it sits. I see this as a kind of geometric spreading into the periphery of the form at the boundary of M.
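A minimal escape-time sketch of the kind of zoom being described, centered on c = i (a standard example of a Misiurewicz point on the boundary of M); the grid sizes and depths below are illustrative only:

```python
import numpy as np

def escape_time(c, max_iter=200):
    """Standard escape-time count for z -> z**2 + c starting from 0."""
    z = 0.0 + 0.0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

def patch(center, width, size=21):
    """Escape times on a small square grid of side `width` around `center`."""
    xs = np.linspace(center.real - width / 2, center.real + width / 2, size)
    ys = np.linspace(center.imag - width / 2, center.imag + width / 2, size)
    return np.array([[escape_time(complex(x, y)) for x in xs] for y in ys])

# Zoom toward c = i.  Each patch is 10x smaller than the last, yet the
# escape-time field keeps producing the same kind of filamentary detail.
for width in (1.0, 0.1, 0.01, 0.001):
    p = patch(1j, width)
    print(f"width {width:>6}: escape times range {p.min()}..{p.max()}")
```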

Anyhow, Nature is likely exploiting this aspect of fractal geometry in the conversion of light into cellular energy.

All the Best,

Jonathan

I read this earlier, John..

I think perhaps there are a surprising number of computer systems which have reached the Turing limit, or where the architects and authors of such systems are trying to exceed it, without knowing that there is a limit to how much needless complexity can be bundled into a software product when the methodology used is copy-and-paste programming. Very few modern programmers know the virtues of compact and efficient coding, because they are not concerned with, or trained in, making use of severely constrained systems.

Contrast this with some of the demoscene pioneers, who were able to squeeze the code to generate several minutes of video - rendered on the fly - into a program file only 64 KB in size! Instead of storing bitmaps for the surfaces of objects, they devised a way to use procedural texture maps, which I think mimics Nature's way of doing such things (tiger stripes, leopard or ocelot spots, and so on). If more people designing software today saw efficient coding as an essential design concern, we wouldn't have so many problems with sites like healthcare dot gov crashing.
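For a sense of what "procedural texture" means in practice: instead of storing a bitmap, each texel is computed from its coordinates on demand. Below is a toy stripe generator in that spirit -- a made-up warped-sine formula, not the demoscene authors' actual code:

```python
import numpy as np

def procedural_texture(width, height, freq=12.0, warp=3.0, seed=0):
    """Compute a grayscale texture from coordinates alone (no stored bitmap).

    A sine wave across x gives stripes; warping the phase with a second,
    lower-frequency sine bends them into irregular, animal-coat-like bands.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi)
    y, x = np.mgrid[0:height, 0:width] / float(max(width, height))
    bands = np.sin(freq * x + warp * np.sin(freq * 0.5 * y + phase))
    return (bands > 0.3).astype(np.uint8) * 255  # threshold to stripes

tex = procedural_texture(256, 256)
# The whole 256x256 image is reproduced from a few dozen bytes of code and
# parameters, which is the point of the 64 KB demo comparison above.
print(tex.shape, tex.dtype, int(tex.mean()))
```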

All the Best,

Jonathan

I wanted to offer..

At some point in the recent past, a paper came across my desk that referenced work by Terrence Barrett on a reformulation and extension of Maxwell's theory and Yang-Mills theory, using a topological basis. My thought is that this work might be relevant in the context of this discussion of Joy's work, in that adopting Professor Christian's framework might have a ripple effect, forcing us to re-evaluate and adjust other theories - and I think Barrett's work is a step in that direction.

So I have attached a paper and a book chapter.

Regards,

Jonathan

Attachment #1: aflb26jp055.pdf
Attachment #2: 6693_chap01.pdf