• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

Joy, this is just beautiful! My quick impression is that it may actually be experimentally replicable with electron input. It already reads like an experimental computation model.

Still nothing from the cone of silence where the critics are huddled?

(Great work by Michel Fodje!)

And I think that because the statistical function of Fodje's simulation verifies the topological structure of Roth's simulation, we can conclude, significantly:

The simulation of a continuous function is a continuous function.

Does it get more physically real and local than that?

Thanks, Tom.

I forgot to mention that the top part of the figure shows the familiar cosine correlation, whereas the bottom part shows the actual quantum probabilities predicted by my local model.

Image 1

I picked up a portion of a comment you made to a correspondent in sci.physics.foundations that I think is relevant:

"It is also important to realize that an orientation of a manifold (like that of a 3-sphere) is both a local and a global property at the same time. It is local in the sense that a basis is chosen at each point of the 3-sphere. But since this choice must be made consistently throughout the 3-sphere, its orientation is also a global (or a topological) property."

This local-global relation is an absolute breakthrough accomplishment, for future research directions -- including quantum computing without entanglement.

Yes, beautiful and impressive work for mathematicians. While not wanting to be a kill joy, it is only mathematicians who will insist that all lines and all planes, no matter their size, contain the same number of points, as my post on Nov. 5, 2013 @ 11:06 GMT discusses.

Physicists are now belatedly catching up with the idea that this is unlikely to be so. Einstein was already onto this long before, as mentioned in a link. In a likely Quantum gravity theory there is likely to be a Planck limit at 10^-35 m. That being so, this line _____ and this _____________________ cannot contain the same number of points, as Tom claims they do, since a Planck length cannot contain more than one point, being indivisible.

I therefore stand to be corrected: both Bell's theorem and its Disproof, being founded on the assumption that there is no limit to the divisibility of distance at the Planck length, will not be of much use to physicists standing at the gate between Quantum and Classical physics, waiting and ready to "Tear down this wall"!

Akinbo

There is no wall, Akinbo.

"In a likely Quantum gravity theory there is likely to be a Planck limit at 10^-35 m."

What makes you think that?

Thank you, Mr. Moldoveanu, for your astute observation. I am sorry to say, however, that you are as mistaken as always (please recall that all your previous claims about my analytical model have been demonstrated to be false).

To begin with, fair or unfair, no one is doing a "sampling" of any kind in the simulation. The property p, or equivalently the parameter t, is part of the hidden variable set. It is a part of lambda. It is randomly chosen by Nature in each run of the experiment, just like the other hidden variable e. It took me only five seconds to "uncover" this (I also have certain advantages, but I prefer not to mention them).

Michel's simulation is simply a different implementation of the original simulation by Chantal. What Michel has done is take the C_a and C_b functions from my paper, streamline the notation, and use the sin^2(t) function appearing in the integration over the probability density function discussed in footnote 1 of my paper to generate the correct statistics. In other words, Michel is doing the same statistical computation to arrive at the same correlation, but without using the C_ab part of Chantal's simulation, which you have deliberately misinterpreted. Thus the simulations of Chantal and Michel are two complementary simulations of one and the same analytical model.

I will not dignify the rest of your derogatory comments about Chantal's simulation with a response. I will let the readers judge her beautiful work for themselves.

Image 1

Thank you again, Mr. Moldoveanu. This is the first ever constructive contribution you have made as far as I can remember.

You have just demonstrated that the correct line of code to capture the 3-sphere topology of my analytical model is:

out = p < abs(C) and numpy.sign(C) or 0.0  # if |C| > p then the particle is detected by the sign(C) channel

If this line of code is replaced by any incorrect line of code representing the topology of a real line, then the resulting correlations cannot be as strong as those between the points of a parallelized 3-sphere.
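For readers who do not parse old-style Python idioms at a glance: the quoted `and`/`or` construction is equivalent to a conditional expression, returning sign(C) when |C| exceeds the threshold p, and 0.0 (no detection) otherwise. A minimal self-contained restatement (the variable names follow the quoted line; nothing else from the simulation is reproduced here):

```python
import numpy

def detect(C, p):
    # Equivalent of: out = p < abs(C) and numpy.sign(C) or 0.0
    # The and/or form yields the same result whenever sign(C) is
    # nonzero, which is guaranteed when |C| > p >= 0.
    return numpy.sign(C) if p < abs(C) else 0.0
```

For example, detect(0.9, 0.5) gives 1.0, detect(-0.9, 0.5) gives -1.0, and detect(0.3, 0.5) gives 0.0 (no detection).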

Let me also remind you once again that, fair or unfair, no one is doing a "sampling" of any kind in Michel's simulation. The property p, or equivalently the parameter t, is part of the hidden variable set. It is a part of lambda. It is randomly chosen by Nature in each run of the experiment, just like the other hidden variable e. The correct line of code from Michel's simulation you have just quoted is thus implementing the correct topological structure of the parallelized 3-sphere.

I hope I do not have to teach you that any simulation code by itself has no meaning without knowing what it is that is being simulated, or demonstrated. In the case of my local model what is being simulated is the correlation between the points of a parallelized 3-sphere. Unless one understands the meaning of the words "parallelized 3-sphere" (and you have amply demonstrated that you don't), there is no hope of understanding either my analytical model, or any simulation that confirms its validity.

Attachment #1: 22_whither.pdf

To spell out what is already so very clearly stated by Michel here: in the measurement results A(x, lambda), the initial or complete state lambda is a set {e, t}, where e in [0, 2pi) and t in [0, pi/2) are two random hidden variables, shared by both Alice and Bob, such that the constraint

|cos (x - e)| > ½ sin^2(t)

holds for all freely chosen detector angles x. This constraint arises from the geometry and topology of the 3-sphere. One can understand this clearly from footnote 1 on pages 18 to 19 of the attached paper. The two graphs above only reinforce the crucial role played by the geometry of the 3-sphere in my model. Of course, one is free to endlessly misinterpret everything I say without ever reading---let alone understanding---what is so clearly discussed in so many of my papers on the subject to date.

Regardless of the interpretation of the hidden variable lambda, Michel's simulation clearly refutes the generally held belief that results A(a, lambda) and B(b, lambda) observed by Alice and Bob, with only freely chosen detector directions a and b and shared randomness lambda, cannot exhibit strong correlations seen in his simulation.
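To make the role of the constraint concrete, here is a minimal sketch of a simulation in this style. It is a condensation for illustration only, not Michel's actual code: e and t are drawn uniformly, the threshold is p = (1/2) sin^2(t), and a station at angle x registers sign(cos(x - e)) only when |cos(x - e)| > p.

```python
import math
import random

def simulate(a, b, n=20000, seed=1):
    """Correlation of paired outcomes at detector angles a and b.

    Each run, the shared hidden state lambda = {e, t} is drawn at
    random; a station at angle x outputs sign(cos(x - e)) when
    |cos(x - e)| > 0.5 * sin(t)**2, and registers no detection otherwise.
    """
    rng = random.Random(seed)
    products = 0.0
    coincidences = 0
    for _ in range(n):
        e = rng.uniform(0.0, 2.0 * math.pi)   # e in [0, 2*pi)
        t = rng.uniform(0.0, math.pi / 2.0)   # t in [0, pi/2)
        p = 0.5 * math.sin(t) ** 2            # detection threshold
        Ca = math.cos(a - e)
        Cb = math.cos(b - e)
        A = math.copysign(1.0, Ca) if abs(Ca) > p else 0.0
        B = math.copysign(1.0, Cb) if abs(Cb) > p else 0.0
        if A and B:                           # coincident detections only
            products += A * B
            coincidences += 1
    return products / coincidences if coincidences else 0.0
```

By construction the coincident outcomes at a = b are identical, so the correlation there is exactly +1 (and exactly -1 at a - b = pi); intermediate angles can then be compared against the cosine curve shown in the figure.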

Tom,

On the Planck length, I throw the question back to you, what makes you think not?

However, let me not cause a distraction on this thread. Let's all enjoy the beautiful mathematics. We may take this issue up elsewhere, concerning whether space is digital or not.

Regards,

Akinbo

Thanks for showing up, Florin.

All spaces of a continuous measurement function are make-believe, because a computer is a 3-dimensional instrument. What you, Vongehr, Gill and others have never understood is that computing in a 4-dimensional continuum introduces a degree of freedom that differentiates the choice function of the experimenter, or programmer, from random input that simulates the choice function of naturally occurring classical randomness.

The computer doesn't care what space it is computing in. Until you realize that the Python implementation of a Java program implies that the simulation of a continuous function is a continuous function, you are going to be stuck in an outdated belief that 3-dimensional computability = reality. Welcome to the 21st century, where Flatland is only a memory.

Best,

Tom

Tom,

As usual, Mr. Moldoveanu is making a blatantly false claim about Michel's and Chantal's simulations of my model, without understanding what the initial state lambda is in Michel's simulation (despite it being very clearly defined by Michel).

Best,

Joy

I know, Joy, but I don't think he is making the claim to intentionally mislead. After all, the programmers who wrote these simulations are his own colleagues. He has to argue with them, not us theorists.

I am convinced that he thinks -- along with other believers -- that reality cannot be expressed in any but probabilistic terms (Vongehr has raised this belief to a level of religious fervor) and any *objective* evidence to the contrary must be a trick of some kind.

Sooner or later, one accepts that classical randomness is self-correcting, that nature fulfills all evolutionary possibilities completely and continuously -- or one condemns oneself to a prison of existential nihilism.

If it were true that the subjective mind creates external reality by changing the probability of physical manifestation in a moment -- that reality is observer-created -- then it would be impossible for free will to exist as anything other than illusion. All the probabilities would self-annihilate in a single bounded lump of perfect three-dimensional information -- like a six-sided die with no markings, no pips.

All along, probabilists have assumed that the pips we draw on the two-dimensional surface of a three-dimensional reality represent the total sum of information in a moment. They have forgotten that we don't live in three dimensions, and that the moment is much larger than even the superposition of states that they imagine. Or can imagine.

All best,

Tom

Here's a good example of what I'm talking about, picked up at sci.physics.foundations from John Reed:

"I have looked at the code for this new simulation. If we lived in a two dimensional world, it would mean something, but I don't see any connection to the previous simulations or the experiments dealing with Bell's theorem. There is no reference to Clifford algebra, bivectors or a parallelized three sphere. It's all now just picking random numbers and taking the cosine of their difference. What is the meaning of that? The experiments are done in three dimensions."

Continuous binary random input (the ThrowDie function in Chantal's program) is not a case of picking random numbers; it is a case of input to the continuous function that uses classical randomness to simulate 4-dimensional continuity.
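To sketch the distinction in code (a hypothetical illustration only; throw_die below is my stand-in, not Chantal's actual ThrowDie routine): a stream of binary random choices can serve as input that builds up a continuous quantity, rather than as a set of "picked numbers" in its own right.

```python
import math
import random

def throw_die(rng):
    # Hypothetical stand-in for a binary random source such as
    # Chantal's ThrowDie: returns +1 or -1 with equal probability.
    return 1 if rng.random() < 0.5 else -1

def random_orientation(rng, bits=16):
    # Feed a stream of binary choices into a continuous quantity:
    # each throw contributes one binary digit of a number in [0, 1),
    # which is then scaled to an angle in [0, 2*pi).
    x = 0.0
    for k in range(bits):
        bit = (throw_die(rng) + 1) // 2   # map -1/+1 to 0/1
        x += bit * 2.0 ** -(k + 1)
    return 2.0 * math.pi * x
```

Here the binary input merely drives the continuous function; the resulting angle, not any individual throw, is the physically meaningful quantity.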

I have heard Joy explain over and over and over to countless individuals that the same classical randomness that applies to Newtonian physics produces strong quantum correlations without assuming entanglement, superposition and nonlocality. The measurement framework is entirely *classical.* Therefore, manifestly local.

Clifford algebra (CA) and its bivectors apply to the spacetime algebra (Hestenes), which is simply another mathematical translation of classical (Minkowski) spacetime. The parallelized 3-sphere is the physical measure space of the simulation. The elements are all represented.

Tom

One philosophical result of John Bell's choice of measurement space is a school called "super-determinism," which maintains that deterministic theories imply the absence of free will. Bell addressed the issue in a BBC interview with Paul Davies in 1985:*

"There is a way to escape the inference of superluminal speeds and spooky action at a distance. But it involves absolute determinism in the universe, the complete absence of free will. Suppose the world is super-deterministic, with not just inanimate nature running on behind-the-scenes clockwork, but with our behavior, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined, including the 'decision' by the experimenter to carry out one set of measurements rather than another, the difficulty disappears. There is no need for a faster than light signal to tell particle A what measurement has been carried out on particle B, because the universe, including particle A, already 'knows' what that measurement, and its outcome, will be."

I think that believers in Bell's result (i.e., quantum configuration space cannot be mapped to physical space without a nonlocal model) tend to falsely associate Joy's measurement framework with superdeterminism and the absence of free will. This is true only if one identifies "free will" with the choice of an experimenter to decide what measurement criteria to use.

Joy's measurement framework does not constitute a superdeterministic theory -- indeed, the experimenter's choice is irrelevant, because the experimenter is only another element of nature's choice. That Bell characterizes nature as "inanimate" betrays an ontological bias toward particle reality (and probability), while Joy's epistemic framework allows no bias of nature toward a predetermined outcome. The experimenter has just as much free will as the external reality.

*Picked up from a Wikipedia article which continues:

"Superdeterminism has also been criticized because of perceived implications regarding the validity of science itself. For example, Anton Zeilinger has commented:

'[W]e always implicitly assume the freedom of the experimentalist... This fundamental assumption is essential to doing science. If this were not true, then, I suggest, it would make no sense at all to ask nature questions in an experiment, since then nature could determine what our questions are, and that could guide our questions such that we arrive at a false picture of nature.'"

Joy's framework explicitly verifies that nature does *not* determine the questions, nor the outcome. It is perfectly consistent with John Wheeler's participatory universe ("The situation cannot declare itself until you've asked your question. But the asking of one question precludes the asking of another."). The initial condition of the continuous measurement function is indifferent to which element of nature, the experimenter or the classically random outcome, decides the question.

Tom

You say it well Tom..

Just upstairs, you state "The initial condition of the continuous measurement function is indifferent to which element of nature, the experimenter or the classically random outcome, decides the question." This sums up an important insight nicely: both nature and the experimenter may determine some outcomes, but an observer can't know whether what's observed reflects nature's choice or the orientation of the observer and/or observing apparatus.

I think part of it is that for most people there is a disconnect between a continuous range and discrete outcomes, but that is a built-in part of nature's way of representing things. And speaking of what's built in, I like Fred's comment further up that nature created the number types in the reverse order from the one in which we humans found them: the Reals are a limited special case, whereas the Octonions are the most general type, and the existence of the Reals - via the sums of squares - proves that the Complex, Quaternion, and Octonion numbers must be part of what defines reality as well.

So in a way, the Octonions had to exist for the Reals to come to be.
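Jonathan's "sums of squares" remark can be made concrete with the composition-algebra property |pq| = |p||q|, which holds exactly for the Reals, Complexes, Quaternions, and Octonions (and, by Hurwitz's theorem, only for those). A quick numerical check for the quaternion case, using Hamilton's multiplication rule (my own illustration, not code from the thread):

```python
import math

def qmul(p, q):
    # Hamilton product of quaternions p = (w, x, y, z), q likewise.
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qnorm(p):
    # Euclidean norm: the "sum of squares" under the square root.
    return math.sqrt(sum(c * c for c in p))
```

For p = (1, 2, 3, 4) and q = (4, 3, 2, 1), both qnorm(qmul(p, q)) and qnorm(p) * qnorm(q) come out to 30.0 (up to rounding).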

Just my two cents,

Jonathan

    Tom,

    There is no translation. Chantal's code is the drawing of a cosine curve sprinkled with some random noise, while Michael's code obtains the correlations from experimental outcomes.

    So one is a decoy, the other one is the real deal.

    On a scale from 1 to 10 I would score Chantal and Michael as follows:

                Code quality    Code value
    Chantal          10              2
    Michael           7              8

    Thank you, once again, Mr. Moldoveanu. I am in no position to judge your programming skills, but your latest attempt to misrepresent my model does not surprise me very much. Michel's simulation is not based on "sampling" of any kind. It is based on the geometrical constraints of a parallelized 3-sphere. Nor has he used any trial and error to arrive at his simulation. Some of us know well that you have a habit of making blatantly false claims about my model, which have all been refuted several times over.

    What you have failed to understand for many years is a very basic point about my model: EPR-Bohm correlations are correlations among the points of a unit parallelized 3-sphere. They are NOT correlations among the points of a real line, on which you are horribly stuck. As so very clearly stated by Michel, in the measurement results A(x, lambda) the initial or complete state lambda is a set {e, t}, where e in [0, 2pi) and t in [0, pi/2) are both part of the hidden variable set in the simulation, such that the constraint

    |cos (x - e)| > ½ sin^2(t)

    holds for all freely chosen detector angles x. This constraint arises from the intrinsic geometry and topology of the 3-sphere. It has nothing whatsoever to do with sampling of any kind. One can understand the constraint very clearly from footnote 1 on pages 18 to 19 of the attached paper. Accordingly, given the initial state {e, t}, the measurement result is deterministically fixed by the geometry of the 3-sphere, giving a definite value +1 or -1, with 100% efficiency. Thus the issue of the detection loophole, or any other loophole, has no relevance whatsoever within my analytical model, or within its numerical simulation by Michel.

    Attachment #1: 23_whither.pdf
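One checkable consequence of the constraint holding "for all freely chosen detector angles x" is rotational invariance: since e is uniform on [0, 2*pi), the fraction of hidden states {e, t} satisfying |cos(x - e)| > (1/2) sin^2(t) cannot depend on x. A small numerical check (my own sketch, not part of Michel's code):

```python
import math
import random

def detection_rate(x, n=20000, seed=7):
    """Fraction of random hidden states {e, t} satisfying the
    constraint |cos(x - e)| > 0.5 * sin(t)**2 at detector angle x,
    with e uniform on [0, 2*pi) and t uniform on [0, pi/2)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        e = rng.uniform(0.0, 2.0 * math.pi)
        t = rng.uniform(0.0, math.pi / 2.0)
        if abs(math.cos(x - e)) > 0.5 * math.sin(t) ** 2:
            hits += 1
    return hits / n
```

The estimated rate comes out the same, within sampling error, for any choice of x, consistent with the claim that the constraint does not single out any detector direction.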