• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

Thank you again, Mr. Moldoveanu. This is the first ever constructive contribution you have made as far as I can remember.

You have just demonstrated that the correct line of code to capture the 3-sphere topology of my analytical model is:

out = p < abs(C) and numpy.sign(C) or 0.0  # if |C| > p then particle is detected by sign(C) channel

If this line of code is replaced by any incorrect line representing the topology of a real line, then the resulting correlations cannot be as strong as those between the points of a parallelized 3-sphere.

Let me also remind you once again that, fair or unfair, no one is doing a "sampling" of any kind in Michel's simulation. The property p, or equivalently the parameter t, is part of the hidden variable set. It is a part of lambda. It is randomly chosen by Nature in each run of the experiment, just like the other hidden variable e. The correct line of code from Michel's simulation you have just quoted is thus implementing the correct topological structure of the parallelized 3-sphere.

I hope I do not have to teach you that any simulation code by itself has no meaning without knowing what it is that is being simulated, or demonstrated. In the case of my local model, what is being simulated is the correlation between the points of a parallelized 3-sphere. Unless one understands the meaning of the words "parallelized 3-sphere" (and you have amply demonstrated that you don't), there is no hope of understanding either my analytical model or any simulation that confirms its validity.

Attachment #1: 22_whither.pdf

To spell out what is already so very clearly stated by Michel here: in the measurement results A(x, lambda), the initial or complete state lambda is a set {e, t}, where e in [0, 2pi) and t in [0, pi/2) are two random hidden variables, shared by both Alice and Bob, such that the constraint

|cos (x - e)| > ½ sin^2(t)

holds for all freely chosen detector angles x. This constraint arises from the geometry and topology of the 3-sphere. One can understand this clearly from footnote 1 on pages 18 to 19 of the attached paper. The two graphs above only reinforce the crucial role played by the geometry of the 3-sphere in my model. Of course, one is free to endlessly misinterpret everything I say without ever reading---let alone understanding---what is so clearly discussed in so many of my papers on the subject to date.
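To make the role of the constraint concrete, here is a minimal Python sketch of such a detection rule. This is an illustration under stated assumptions, not Michel's actual code: I assume C = cos(x - e) and a detection threshold p = (1/2) sin^2(t), so that "detected" means exactly |cos(x - e)| > (1/2) sin^2(t), and the correlation is the average of the product A*B over coincident detections.

```python
import math
import random

# Illustrative sketch only -- NOT Michel's actual code.  Assumptions:
# C = cos(x - e) and detection threshold p = (1/2) sin^2(t), so that
# "detected" means exactly |cos(x - e)| > (1/2) sin^2(t).

def detect(x, e, t):
    """Return +1.0/-1.0 if detected in the sign(C) channel, else 0.0."""
    C = math.cos(x - e)
    p = 0.5 * math.sin(t) ** 2
    return math.copysign(1.0, C) if abs(C) > p else 0.0

def correlation(a, b, runs=100_000, seed=0):
    """Average the product A*B over coincident detections only."""
    rng = random.Random(seed)
    products = []
    for _ in range(runs):
        e = rng.uniform(0.0, 2.0 * math.pi)    # shared hidden variable e
        t = rng.uniform(0.0, math.pi / 2.0)    # shared hidden variable t
        A = detect(a, e, t)                    # Alice's result
        B = -detect(b, e, t)                   # Bob's result (opposite channel)
        if A and B:                            # keep coincidences only
            products.append(A * B)
    return sum(products) / len(products)

corr_equal = correlation(0.0, 0.0)        # equal detector settings
corr_45 = correlation(0.0, math.pi / 4)   # settings 45 degrees apart
print(corr_equal, corr_45)
```

At equal settings the sketch gives perfect anticorrelation (-1), as an EPR-Bohm pair must; how closely the curve at other angles tracks -cos(a - b) depends on the details of the detection rule, which is precisely the point under dispute in this thread.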

Regardless of the interpretation of the hidden variable lambda, Michel's simulation clearly refutes the generally held belief that results A(a, lambda) and B(b, lambda) observed by Alice and Bob, with only freely chosen detector directions a and b and shared randomness lambda, cannot exhibit strong correlations seen in his simulation.

Tom,

On the Planck length, I throw the question back to you, what makes you think not?

However, let me not cause a distraction on this thread. Let's all enjoy the beautiful mathematics. We may take this issue up elsewhere, concerning whether space is digital or not.

Regards,

Akinbo

Thanks for showing up, Florin.

All spaces of a continuous measurement function are make-believe, because a computer is a 3-dimensional instrument. What you, Vongehr, Gill and others have never understood is that computing in a 4-dimensional continuum introduces a degree of freedom that differentiates the choice function of the experimenter, or programmer, from the random input that simulates the choice function of naturally occurring classical randomness.

The computer doesn't care what space it is computing in. Until you realize that the Python implementation of a Java program implies that the simulation of a continuous function is a continuous function, you are going to be stuck in an outdated belief that 3-dimension computability = reality. Welcome to the 21st century, where Flatland is only a memory.

Best,

Tom

Tom,

As usual, Mr. Moldoveanu is making a blatantly false claim about Michel's and Chantal's simulations of my model, without understanding what the initial state lambda is in Michel's simulation (despite it being very clearly defined by Michel).

Best,

Joy

I know, Joy, but I don't think he is making the claim to intentionally mislead. After all, the programmers who wrote these simulations are his own colleagues. He has to argue with them, not with us theorists.

I am convinced that he thinks -- along with other believers -- that reality cannot be expressed in any but probabilistic terms (Vongehr has raised this belief to a level of religious fervor) and any *objective* evidence to the contrary must be a trick of some kind.

Sooner or later, one accepts that classical randomness is self-correcting, that nature fulfills all evolutionary possibilities completely and continuously -- or one condemns oneself to a prison of existential nihilism.

If it were true that the subjective mind creates external reality by changing the probability of physical manifestation in a moment, that reality is observer-created, it would be impossible that free will exists as other than illusion. All the probabilities would self-annihilate in a single bounded lump of perfect three dimension information -- like a six-sided die with no markings, no pips.

All along, probabilists have assumed that the pips we draw on the two dimension surface of a three dimension reality represent the total sum of information in a moment. They have forgotten that we don't live in three dimensions, and that the moment is much larger than even the superposition of states that they imagine. Or can imagine.

All best,

Tom

Here's a good example of what I'm talking about, picked up at sci.physics.foundations from John Reed:

"I have looked at the code for this new simulation. If we lived in a two dimensional world, it would mean something, but I don't see any connection to the previous simulations or the experiments dealing with Bell's theorem. There is no reference to Clifford algebra, bivectors or a parallelized three sphere. It's all now just picking random numbers and taking the cosine of their difference. What is the meaning of that? The experiments are done in three dimensions."

Continuous binary random input (the ThrowDie function in Chantal's program) is not a case of picking random numbers; it is a case of input to the continuous function that uses classical randomness to simulate 4-dimension continuity.

I have heard Joy explain over and over and over to countless individuals that the same classical randomness that applies to Newtonian physics produces strong quantum correlations without assuming entanglement, superposition and nonlocality. The measurement framework is entirely *classical.* Therefore, manifestly local.

CA, and the bivectors thereof, apply to the spacetime algebra (Hestenes), which is simply another mathematical translation of classical (Minkowski) spacetime. The parallelized 3-sphere is the physical measure space of the simulation. The elements are all represented.

Tom

One philosophical result of John Bell's choice of measurement space is a school called "super-determinism," which maintains that deterministic theories imply the absence of free will. Bell addressed the issue in a BBC interview with Paul Davies in 1985:*

"There is a way to escape the inference of superluminal speeds and spooky action at a distance. But it involves absolute determinism in the universe, the complete absence of free will. Suppose the world is super-deterministic, with not just inanimate nature running on behind-the-scenes clockwork, but with our behavior, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined, including the 'decision' by the experimenter to carry out one set of measurements rather than another, the difficulty disappears. There is no need for a faster than light signal to tell particle A what measurement has been carried out on particle B, because the universe, including particle A, already 'knows' what that measurement, and its outcome, will be."

I think that believers in Bell's result (i.e., quantum configuration space cannot be mapped to physical space without a nonlocal model) tend to falsely associate Joy's measurement framework with superdeterminism and the absence of free will. This is true only if one identifies "free will" with the choice of an experimenter to decide what measurement criteria to use.

Joy's measurement framework does not constitute a superdeterministic theory -- indeed, the experimenter's choice is irrelevant, because the experimenter is only another element of nature's choice. That Bell characterizes nature as "inanimate" betrays an ontological bias toward particle reality (and probability), while Joy's epistemic framework allows no bias of nature toward a predetermined outcome. The experimenter has just as much free will as the external reality.

*Picked up from a Wikipedia article which continues:

"Superdeterminism has also been criticized because of perceived implications regarding the validity of science itself. For example, Anton Zeilinger has commented:

'[W]e always implicitly assume the freedom of the experimentalist... This fundamental assumption is essential to doing science. If this were not true, then, I suggest, it would make no sense at all to ask nature questions in an experiment, since then nature could determine what our questions are, and that could guide our questions such that we arrive at a false picture of nature.'"

Joy's framework explicitly verifies that nature does *not* determine the questions, nor the outcome. It is perfectly consistent with John Wheeler's participatory universe ("The situation cannot declare itself until you've asked your question. But the asking of one question precludes the asking of another."). The initial condition of the continuous measurement function is indifferent to which element of nature, the experimenter or the classically random outcome, decides the question.

Tom

You say it well Tom..

Just upstairs, you state "The initial condition of the continuous measurement function is indifferent to which element of nature, the experimenter or the classically random outcome, decides the question." This sums up an important insight nicely, as both nature and the experimenter may determine some outcomes, but an observer can't know whether what's observed reflects nature's choice or the orientation of the observer and/or observing apparatus.

I think part of it is that for most people there is a disconnect between a continuous range and discrete outcomes, but that is a built-in part of nature's way of representing things. And speaking of what's built-in, I like Fred's comment further up that nature created the number types in the reverse order we humans found them: the Reals are a limited special case, while the Octonions are the most general type, and the existence of the Reals - via the sums of squares - proves that the Complex, Quaternion, and Octonion numbers must be part of what defines reality as well.

So in a way; the Octonions had to exist, for the Reals to come to be.
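The "sums of squares" point above can be made concrete: the quaternion norm is multiplicative, |pq| = |p||q| (Euler's four-square identity), and the analogous eight-square identity holds for the octonions; this norm, a sum of squares, is what ties the higher division algebras back to the Reals. A small Python check for the quaternion case (illustrative only, not anyone's simulation code):

```python
import math

# The Hamilton product of two quaternions (a, b, c, d) = a + bi + cj + dk.
def qmul(p, q):
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

# The quaternion norm is a real sum of squares.
def norm(q):
    return math.sqrt(sum(x * x for x in q))

p, q = (1.0, 2.0, 3.0, 4.0), (0.5, -1.0, 2.5, 0.0)
lhs = norm(qmul(p, q))   # |pq|
rhs = norm(p) * norm(q)  # |p||q|
print(lhs, rhs)          # equal: the norm is multiplicative
```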

Just my two cents,

Jonathan

    Tom,

    There is no translation. Chantal's code is the drawing of a cosine curve sprinkled with some random noise, while Michael's code obtains the correlations from experimental outcomes.

    So one is a decoy, the other one is the real deal.

    On a scale from 1 to 10 I would score Chantal and Michael as follows:

                 Code quality   Code value
    Chantal           10             2
    Michael            7             8

    Thank you, once again, Mr. Moldoveanu. I am in no position to judge your programming skills, but your latest attempt to misrepresent my model does not surprise me very much. Michel's simulation is not based on any "sampling" of any kind. It is based on the geometrical constraints of a parallelized 3-sphere. Nor has he used any trial and error to arrive at his simulation. Some of us know well that you have a habit of making blatantly false claims about my model, which have all been refuted several times over.

    What you have failed to understand for many years is a very basic point about my model: EPR-Bohm correlations are correlations among the points of a unit parallelized 3-sphere. They are NOT correlations among the points of a real line, on which you are horribly stuck. As so very clearly stated by Michel, in the measurement results A(x, lambda) the initial or complete state lambda is a set {e, t}, where e in [0, 2pi) and t in [0, pi/2) are both part of the hidden variables in the simulation, such that the constraint

    |cos (x - e)| > ½ sin^2(t)

    holds for all freely chosen detector angles x. This constraint arises from the intrinsic geometry and topology of the 3-sphere. It has nothing whatsoever to do with sampling of any kind. One can understand the constraint very clearly from footnote 1 on pages 18 to 19 of the attached paper. Accordingly, given the initial state {e, t}, the measurement result is determined deterministically by the geometry of the 3-sphere, giving a definite value +1 or -1, with 100% efficiency. Thus the issue of the detection loophole, or any other loophole, has no relevance whatsoever within my analytical model, or within its numerical simulation by Michel.

    Attachment #1: 23_whither.pdf

    Let me point out the dishonesty in some of the assertions behind Mr. Moldoveanu's claims about Chantal's and Michel's simulations. I have done the same for his false claims about my analytical model many times before, so I won't bother with that here (in any case, it has been verified by many competent physicists by now, such as Lucien Hardy).

    To begin with, in the real experiments what is observed is what is predicted by quantum mechanics (if anyone claims otherwise, then they have observed actual violations of quantum mechanics and deserve a Nobel Prize). Quantum mechanics does not predict individual outcomes observed at one of the two stations. It predicts *probabilities* of observing *simultaneous* occurrences of measurement events at the two stations.

    Accordingly, in actual experiments what is observed are *coincidences* of measurement events---for example, coincidences of results A = +1 by Alice and B = -1 by Bob. This means that what is actually observed are simultaneous occurrences like {A = +1, B = -1}, or, equivalently, AB = -1. The question then is: what are the probabilities of observing AB = +1 and AB = -1? That is what is predicted by quantum mechanics, and that is what is observed. And---here is the point---that is what is predicted by my analytical model, and that is what is simulated by Chantal in her simulation.

    Of course, one can try to observe results only at Alice's station. But then all we find is a 50/50 occurrence of A = +1 and A = -1, and that is trivial to simulate. So the dishonest claim made about "experimental outcomes" is just that---dishonest. It smuggles in something that is not even predicted by quantum mechanics, let alone actually observed.
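    To illustrate what "observing coincidences" means, here is a small Python sketch (mine, not Chantal's code): for the EPR-Bohm singlet state, quantum mechanics predicts P(AB = -1) = (1 + cos(a - b))/2 and P(AB = +1) = (1 - cos(a - b))/2, i.e. the correlation E(a, b) = -cos(a - b). The sketch samples coincident pairs with those probabilities and recovers E from the counts, exactly as an experimenter would.

```python
import math
import random

# Illustrative sketch, not Chantal's code.  For the singlet state QM
# predicts P(AB = -1) = (1 + cos(a - b))/2, hence E(a, b) = -cos(a - b).
# We sample coincidences with that probability and recover E from counts.

def estimate_E(a, b, runs=200_000, seed=1):
    rng = random.Random(seed)
    p_minus = (1.0 + math.cos(a - b)) / 2.0   # P(AB = -1)
    n_plus = n_minus = 0
    for _ in range(runs):
        if rng.random() < p_minus:
            n_minus += 1   # a coincidence like {A = +1, B = -1}
        else:
            n_plus += 1    # a coincidence like {A = +1, B = +1}
    return (n_plus - n_minus) / runs          # estimate of E(a, b)

E = estimate_E(0.0, math.pi / 3)   # QM predicts -cos(60 deg) = -0.5
print(E)
```

    Note that each station seen alone still shows a 50/50 split of +1 and -1; the physics lives entirely in the coincidence counts.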

    I have already commented on the dishonest claims made by Mr. Moldoveanu about Michel's simulation. His main dishonesty there is to neglect the actual random variable (or the actual initial state), which is the set {e, t}, and consider only e, so that he can misinterpret the simulation according to his own whim.


    Detector efficiency, detection rate, unfair bias, fair or unfair sampling, are all completely irrelevant once the correct initial state {e, t} is correctly taken into account.

    " ... by simply using a linear progressive bias the correlation curves can approach the QM curve."

    Florin, one can take any of the machine languages separately in which Joy's measurement framework has been simulated, and bias it toward a linear result. This is a limitation of all machine languages (difference vs. differential) in simulating a continuous function.

    The truly surprising and meaningful result of transporting the algorithm to four different programming languages is evidence that *the simulation of a continuous function is a continuous function.* That is, every time we use classically random input, regardless of the programming language, we get a smooth continuum of correlated points. This makes the simulated measurement framework a true nonlinear simulation of continuously random input -- in other words, random input to a 3 dimension simulation generates a 4 dimension reality, which is the one in which we live.

    Florin, programming skill -- and even the debatable efficiencies of programming languages -- is entirely irrelevant.

    The *science* of the simulation outcome is independent of computer results. Until you (and Vongehr and Gill) understand the meaning of a measurement function continuous from an initial condition, you are getting nowhere toward understanding why quantum correlations are a natural result of continuous classically random nonlinear input to the measurement function.

    Hi Jonathan,

    "So in a way; the Octonions had to exist, for the Reals to come to be."

    This is the kind of thing that makes mathematicians crazy :-) -- we don't tend to think that "pi in the sky" is real, until confronted with the consistency of numerical results that should live in disconnected spaces.

    It is one thing to speak of simply-connected spaces in an objective, abstract, mathematical way. To have evidence of that physical reality (Joy's measurement framework) is a wonderful leap in intellectual history.

    All best,

    Tom

    Florin,

    Let me try and explain it this way:

    Your consistent misspelling of Michel's name ("Michael") illustrates how languages differ in linearly ordered terms without affecting the meaning. The French spelling Michel means the same thing as the English transliteration of the Hebrew name Michael. Same goes for the Spanish Miguel and the Russian Mikhail.

    The scientific meaning of a continuous measurement function simulation is indifferent to how the terms are linearly ordered. What we want to find out is what happens when totally disordered elements of reality are subjected to a classically random function (coin-toss probability) -- is the result more disorder (noise), or newly ordered relations (information, such as names found in a look-up table)?

    What we see is that the probability -- given a function continuous from the initial condition -- that the function generates ordered relations, is exactly the same as the probability for quantum correlations under the assumption of linear superposition (with implications for nonlocality and probability theory).

    It makes the difference between a probabilistic measure schema on a disconnected space, and a deterministic classically random and continuous measure space. Christian's measure space assumes only initial condition and a measurement function continuous from the initial condition. That assumption cannot be made on the real line of ordered relations -- it requires the nonlinearity that we witness in almost all natural phenomena. The inescapable conclusion is the lack of boundary between quantum and classical domains.