• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

The above should imply..

Nature as a whole has more freedom of choice than we do, or a higher-dimensional range of choices compared to ours, but we still have absolute freedom of choice within a specific range. This allows significant freedom of choice that is built in, or automatic. But it also allows nature to be super-deterministic, on some level.

All the Best,

Jonathan

I can express some satisfaction that..

I got to hear Professor 't Hooft lecture on topics related to those discussed in the article on two separate occasions (FFP10 and 11), and to converse with him on that subject. I must add that, if you watch him for a while, it is easy to see that Gerard is always thinking and always considering possibilities, but he has a keen ability to get directly to the point, or the heart of the matter, if asked a relevant Physics question.

In any case, I think there may be a way to weave Joy's approach together with 't Hooft's (ask me later), but in the meanwhile I agree that Joy's approach may be inherently superior in some regards. The impressive computational evidence from simulations continues to build up. Let the evidence show the true path.

All the Best,

Jonathan

Great link, John. I see some pretty weak assumptions, IMO:

" ... the only way for the light energy to find a reaction centre is to bounce through the protein network at random, like a ricocheting billiard ball. This process would take too long, much longer than the nanosecond or so it takes for the light energy to dissipate into the environment and be lost.

"So the energy transfer process cannot occur classically in this way. Instead, physicists have gathered a variety of evidence showing that the energy transfer is a quantum process."

Once again, researchers are relying on the probabilistic, linear nature of quantum theory (superposition, non-locality) to arbitrarily rule out classical processes. In fact, complex-systems science is scale invariant, and principles such as the law of requisite variety, small-world networks, and nonlinear feedback functions have the potential to explain microscale effects without the conventional quantum assumptions.
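To make the small-world point concrete, here is a minimal toy sketch, not a model of photosynthesis, just an illustration of the Watts-Strogatz effect: on a ring lattice, a handful of random shortcuts collapse the average path length, so a signal need not ricochet at random to cross the network quickly. The function names and parameters are mine, chosen for illustration.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours per side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    return adj

def rewire(adj, p, rng):
    """Watts-Strogatz style: replace each edge with a random shortcut with prob p."""
    n = len(adj)
    for i in list(adj):
        for j in list(adj[i]):
            if j > i and rng.random() < p:
                adj[i].discard(j)
                adj[j].discard(i)
                new = rng.randrange(n)
                while new == i or new in adj[i]:
                    new = rng.randrange(n)
                adj[i].add(new)
                adj[new].add(i)
    return adj

def avg_path(adj):
    """Mean shortest-path length, via breadth-first search from every node."""
    total = count = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

rng = random.Random(1)
regular  = avg_path(ring_lattice(100, 2))                 # purely local links
shortcut = avg_path(rewire(ring_lattice(100, 2), 0.1, rng))  # ~10% shortcuts
```

With only local links the mean path scales with the ring size; a few random shortcuts bring it down sharply, which is the sense in which a small-world topology can move energy or information across a network far faster than a random walk would suggest.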

Tom

Tom,

I agree it was pretty speculative, and there did seem to be a certain top-down, whole-body process and a bottom-up, path-of-least-resistance action working together. I just thought it might be an interesting example of how these effects work out in nature. Like you point out, there are lots of theoretical loose ends that are not sufficiently tied up. I think that when we really do start putting the pieces of the puzzle together, it will be more comprehensible and efficiently organized than all the parts scattered about currently seem.

Regards,

John M

This could be an example of where geometry mimics quantum 'weirdness.'

Following on the idea laid out by Tom above, one could examine the snowflake and its regular, self-similar, but varied forms. Its pattern arises from the molecular bond angles of the water molecule, in the presence of varying evolutive processes that shape its growth. But it creates a fractal pattern, which displays elements of form and shape that are scale invariant. So instead of light rays bouncing around at random, obliquely incident light rays tend to be systematically directed or reflected away - making fresh snow appear brilliantly white.

What if the matrix in which light-converting molecules naturally appear is a pattern that acts like a super-efficient collector? When some early experiments failed to bear out the high rates of energy conversion in naturally occurring processes, I was curious. When I looked into the experimental procedure, it led me to believe that the authors of that research had systematically excluded any structural bonding which would have held key molecules at a specific angle with respect to the incident light. But in situ or in vivo settings would undoubtedly involve self-similar structures, so there could be geometrical scale-invariance and resolution-dependencies working together, in a simile of Quantum/Classical cooperation.

Regards,

Jonathan

That is..

In self-similar structures, inward and outward, or larger and smaller, appear the same from the viewpoint of an observer on the microscale, but are obviously directional and asymmetrical from above, from the outside, or from a distance great enough to see the shape of the structure in its entirety. This sort of form is observable in the Mandelbrot Set, and in its associated Julia sets, at the Misiurewicz points. The structure becomes more and more perfectly symmetrical and self-similar as one zooms in further and further, but displays an obvious asymmetry when one views the entire branch upon which it sits. I see this as a kind of geometric spreading of the form at the boundary of M into the periphery.
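As a concrete illustration (a toy sketch; the function names and the zoom factor are mine): c = i is a Misiurewicz point, since the orbit 0 → i → -1+i → -i → -1+i → ... is pre-periodic, and sampling the escape-time pattern around it at two scales exhibits the asymptotic self-similarity described above.

```python
def escape_time(c, max_iter=200):
    """Iterations before |z| exceeds 2 under z -> z^2 + c (max_iter if bounded)."""
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

def grid(center, radius, n=9):
    """Escape times on an n x n grid of half-width `radius` around `center`."""
    step = 2 * radius / (n - 1)
    return [[escape_time(complex(center.real - radius + i * step,
                                 center.imag - radius + j * step))
             for i in range(n)] for j in range(n)]

# c = i is a Misiurewicz point: its orbit falls onto the cycle {-1+i, -i},
# and the boundary of M is asymptotically self-similar around it.
m = 1j
wide  = grid(m, 0.01)    # the pattern at one scale
close = grid(m, 0.001)   # a 10x zoom on the same point
```

Rendering `wide` and `close` side by side (as escape-count shadings) shows essentially the same filamentary pattern repeating under the zoom, while a view of the whole branch containing c = i remains visibly asymmetrical.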

Anyhow, Nature is likely exploiting this aspect of fractal geometry in the conversion of light into cellular energy.

All the Best,

Jonathan

I read this earlier John..

I think there are a surprising number of computer systems which have reached the Turing limit, or whose architects and authors are trying to exceed it, without knowing that there is a limit to how much needless complexity can be bundled into a software product when the methodology used is copy-and-paste programming. Very few modern programmers know the virtues of compact and efficient coding, because they are neither concerned with nor trained in making use of severely constrained systems.

Contrast this with some of the demoscene pioneers, who were able to squeeze the code to generate several minutes of video - rendered on the fly - into a program file only 64 KBytes in size! Instead of storing bitmaps for the surfaces of objects, they devised a way to use procedural texture maps, which I think mimics Nature's way of doing such things (Tiger stripes, Leopard or Ocelot spots, and so on). If more people designing software today saw efficient coding as an essential design concern, we wouldn't have so many problems with sites like healthcare dot gov crashing.
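The procedural-texture idea can be shown in a toy sketch (not actual demoscene code; the stripe formula and its parameters are invented for illustration): a few lines of arithmetic generate a "tiger stripe" pattern at any resolution, on the fly, with no stored bitmap at all.

```python
import math

def stripe_texture(x, y, freq=10.0, warp=2.0):
    """Procedural 'tiger stripe' value in [0, 1]: a sine wave over x,
    warped by a second sine over y so the bands wobble organically."""
    return 0.5 + 0.5 * math.sin(freq * x + warp * math.sin(freq * 0.7 * y))

# Render at any resolution from the same few bytes of code -- no bitmap stored.
# Here: a 60 x 20 ASCII shading, ' ' for dark through '#' for bright.
rows = ["".join(" .:#"[int(stripe_texture(x / 20, y / 20) * 3.999)]
                for x in range(60)) for y in range(20)]
```

The same function evaluated at finer steps yields an arbitrarily high-resolution image, which is exactly the trade of storage for computation that made 64 KB productions possible.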

All the Best,

Jonathan

I wanted to offer..

At some point in the recent past, a paper came across my desk that referenced work by Terence Barrett on a reformulation and extension of Maxwell and Yang-Mills theory, using a topological basis. My thought is that this work might be relevant in the context of this discussion of Joy's work, in that adopting Professor Christian's framework might have a ripple effect, forcing us to re-evaluate and adjust other theories - and I think Barrett's work is a step in that direction.

So I have attached a paper and a book chapter.

Regards,

Jonathan

Attachment #1: aflb26jp055.pdf
Attachment #2: 6693_chap01.pdf

    Jonathan,

    Thanks. I posted that to see if there was any feedback on Woit's recent positions, though that mostly seems to be course work.

    I do think the solution to gravity won't be found in ever more complex topologies, but in fully understanding the interrelationship of energy and mass. Quite simply, a given amount of mass occupies far less space than the equivalent amount of energy, so getting from energy to mass would seem to necessitate a significant vacuum effect. Much as getting from mass to energy is the source of the pressure that powers human industry.

    Regards,

    John M

    Hi Jonathan,

    As I understand it, superdeterminism is absolute, with no domain-range dependence. That would obviate individual free will at any scale.

    Einstein once said " ... there are two ways to look at the world; either everything is a miracle, or nothing is a miracle." I think that if nothing is a miracle, free will must apply up and down the scale, on a continuum of consciousness. Does that obviate determinism, super or otherwise?

    If it did, there would remain the miracle of comprehensibility; i.e., with every event being random, one is challenged to describe how it is that structures are coherent.

    The coherence and comprehensibility of the natural world suggests to me that determinism -- teleological certainty -- does not originate in a singular past initial condition. It originates in present measurement functions continuous from an initial condition chosen by classical randomness (coin toss probability).

    Joy's measurement framework for local quantum correlations has tremendous explanatory power, within both general relativity and quantum field theory, because it *is* a field framework -- with all the attributes of linear field determinants -- yet analytically continuous, singularity free, with strongly correlated elements on any time scale. The computer simulation contradicts critics' claims that the measurement results are trivial and constant -- the same linearity generated by multiply connected Bell-Aspect results is subsumed in Joy's simply connected nonlinear framework.

    So nature's free will is precisely equal to observer free will; i.e., classical randomness free of domain and range dependence, and without the assumption of a singular fixed cause ("absolute being"). This is consonant with Joy's earlier work regarding relative becoming, which is also full of insight on breaking down the local-global distinction without sacrificing the free will hypothesis which necessitates "the experimental metaphysics of time."

    All best,

    Tom

    As I see it..

    All of the descriptions of super-determinism I've found are extensions of conventional assumptions about the nature of spacetime, such as the block-time view. I think that, rather than being a matter of scale over which super-determinism might apply, it is a question of topology - such as how a simply connected surface or space is fundamentally different from a manifold that is not simply connected.

    That is, the crux of Joy's work is the assumption that - regardless of appearances - we do NOT live in a 3-d semi-Euclidean space with Riemannian curvature. Instead, we live in a simply-connected space that appears to be Euclidean because of parallelization. But this assumption is seldom entertained, and it does not correspond to the workings of any background space in which super-deterministic theories are framed (so far as I know).

    So I would continue to assert that if the dimensionalities of the source and target environments are different, reality CAN be super-deterministic in the realm of the source but embrace free will in the target space. Specifically, even if nature makes strong and/or inflexible choices in 8-d space, this does not prevent there being free will in a 4-d subspace - which is indistinguishable from absolute free will within that space. However, that range will be severely constrained when compared to absolute freedom of choice in 8-d. It may in fact be true that super-determinism in octonionic space is needed to assure freedom in quaternionic space - where we reside.

    Regards,

    Jonathan

    One could restate the above as follows..

    When discussing action from spaces that are non-commutative and non-associative, the possibility exists that - because the sense of direction for size/distance, or for interiority/exteriority, is reversed - constraints in the higher-d spaces may translate into freedoms in spaces where coordinates commute and associate directly.

    Specifically, if the constraints appear in terms that anti-commute or anti-associate, their direction is reversed. A Physics example would be the asymptotic freedom of quarks in a quark-gluon plasma: the quarks are free to move at will within the constrained region, but not to exit that region unbound.
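    The non-commutative and non-associative behaviour invoked here is easy to verify directly. A minimal sketch of the Cayley-Dickson construction, which doubles the reals into the complexes, quaternions, and octonions, shows that quaternion multiplication fails to commute (i*j = -j*i) while octonion multiplication fails even to associate:

```python
def conj(x):
    """Cayley-Dickson conjugate: negate every component except the real part."""
    return [x[0]] + [-v for v in x[1:]]

def cd_mul(x, y):
    """Cayley-Dickson product on component lists of length 1, 2, 4, 8, ..."""
    n = len(x)
    if n == 1:
        return [x[0] * y[0]]
    h = n // 2
    a, b = x[:h], x[h:]
    c, d = y[:h], y[h:]
    # (a, b)(c, d) = (ac - d*b, da + bc*)  with * the conjugate
    left  = [p - q for p, q in zip(cd_mul(a, c), cd_mul(conj(d), b))]
    right = [p + q for p, q in zip(cd_mul(d, a), cd_mul(b, conj(c)))]
    return left + right

def unit(n, k):
    """k-th basis element of the n-dimensional algebra."""
    e = [0.0] * n
    e[k] = 1.0
    return e

# Quaternions (n = 4): i*j = k but j*i = -k, so multiplication does not commute.
i, j = unit(4, 1), unit(4, 2)
assert cd_mul(i, j) != cd_mul(j, i)

# Octonions (n = 8): (e1*e2)*e4 = e7 but e1*(e2*e4) = -e7 -- associativity fails.
e1, e2, e4 = unit(8, 1), unit(8, 2), unit(8, 4)
assert cd_mul(cd_mul(e1, e2), e4) != cd_mul(e1, cd_mul(e2, e4))
```

    The sign flips under reordering and regrouping are precisely the "reversed direction" of the anti-commuting and anti-associating terms mentioned above.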

    All the Best,

    Jonathan

    Granted that the measure space is a subset of the complete physical space. However:

    Unless free will applies over the complete physical space, it does not apply anywhere. It can't -- since the metaphysically real physical space subsumes the space of real measurement events.

    This is another of those times when treating quaternions and octonions as if they were physical things makes me uncomfortable. Physical things are only measurement events; EPR, Bell, and Joy Christian all agree on that.

    If some event in physical space were the random cause of an event in measure space, then the probabilistic measures of conventional quantum theory would apply both to the physical space and the measure space. On the other hand, if classical randomness (coin-toss probability) is a property of both the physical space and the measure space -- the world is deterministic; as Joy's framework shows, nonlinear random input to the continuous measurement results in a smooth function. This could only hold if the randomness of the metaphysically real physical space were equal to the randomness of the real physical measure space.

    The difference between "random" and "probabilistic" is critical. Binary random events imply perfect information, as if nature and observer independently possess a qubit and are free to choose one value or the other for any measurement event. Nature's choice is the hidden variable -- and because it is as random as the observer's choice, the continuous sinusoidal function is smooth with probability 1. "Relative becoming" is a deterministic schema, as is chaos theory.

    That the observer is also part of the complete physical space makes the case for deterministic randomness. Both free-will-determinism, and no-free-will determinism, are absolute. In neither case can probabilism apply. ("Either everything is a miracle, or nothing is a miracle.")

    Best,

    Tom

    I must contemplate this further...

    I agree with you up to a point, Tom, but I feel there is something still unexamined, or not recognized to be an implied assumption. In the example above, the QGP could be considered a pre-topological form of matter. That is, if the individual quarks are not topologically complete, surface-bearing objects in a 3-d space, then their combination is the birth of topology. So the asymptotic freedom of quarks is limited, because once there is enough (3-d) space for them to spread out into, they MUST link up in order to exist within that space.

    The sticking point (regarding your argument) comes in mainly in such extreme cases, be it the quark-gluon plasma, the primordial origin of the universe, the rim of a black hole, or other regimes where energy greatly dominates matter. If the energy bath is hot enough, it forbids the formation of the familiar particle families. My main question at this point is whether it is the effusiveness and incompressibility of energy in the matter-free regime that 'pushes things out' into the 3-d realm, or the linking up of sub-units, creating a quenching effect on the available energy, that does the trick.

    Regards,

    Jonathan

    Tom,

    "it preserves free will against the mystical conspiracy of superdeterminism; i.e., individual free will is hidden in plain sight, covariant with the free will of nature. That is, randomly covariant, such that nature on every scale participates in every event bifurcation -- which is exactly what we observe to happen on the classical scale, by a sensitive dependence on initial condition which characterizes deterministic chaos."

    "So nature's free will is precisely equal to observer free will; i.e., classical randomness free of domain and range dependence, and without the assumption of a singular fixed cause("absolute being"), consonant with Joy's earlier work regarding relative becoming, which is also full of insight on breaking down the local-global distinction without sacrificing the free will hypothesis which necessitates "the experimental metaphysics of time.""

    " On the other hand, if classical randomness (coin-toss probability) is a property of both the physical space and the measure space -- the world is deterministic; as Joy's framework shows, nonlinear random input to the continuous measurement results in a smooth function. This could only hold if the randomness of the metaphysically real physical space were equal to the randomness of the real physical measure space."

    " Nature's choice is the hidden variable -- and because it is equally random with the observer's choice, the continuous sinusoidal function is smooth with probability 1. "Relative becoming" is a deterministic schema, as is chaos theory."

    When I posit something similar, that time is the effect of change, turning future probabilities into past certainties, and that while the process may be deterministic, the input is random, so the output cannot be fully determined prior to the event, you hold rather tightly to the spacetime model, which is deterministic, since past and future are not differentiated and there is no preferred direction. Yet here you argue for a model that is much more realistic.

    "In neither case can probabilism apply."

    I would argue that the future, or at least the near future, is probabilistic rather than random, since the range of potential outcomes becomes further constrained the nearer an event comes. Input can be random, but momentum and inertia limit the effect truly random input can have. Its 'space' is being limited, so the approaching events become more probable and less random.

    Jonathan,

    "If the energy bath is hot enough, this forbids the formation of the familiar particle families. My main question at this point is whether it is the effusiveness and incompressibility of energy, in the matter free regime, that 'pushes things out' into the 3-d realm, or is it the linking up of sub-units which creates a quenching effect on the available energy which does the trick."

    Why not extend this dichotomy of energy and structure to the processing of the entire universe? Energy expands outward until it becomes subsumed into structure, which then compresses until such time as it heats up and breaks the structure down, radiating back out. Think on just how much this creates and defines life processes, as new energy is constantly growing up and out, while old structure is holding onto form and pressing down and inward. Then on to convective processes that form geological and stellar currents, then galactic structures, pulling in form and radiating away energy. Could it be that in the intergalactic deep, even light cools enough that it 'crusts' and becomes a gas? Such as at 2.7 K?

    Try linking it up with complexity theory, with 'energy' as chaos and 'structure' as order. Then this complex reality is that dynamic of energy/chaos/randomness pushing out, while structure/order/determinism, i.e. probability, is compressing inward. Then time is the effect of this relationship, with the past as what is ordered/determined, while the future is the energy constantly pushing against this ordered form at all its weak spots, such that the new arises, either by motivating the old or squeezing through the cracks, as either evolution or revolution.

    Regards,

    John M