• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

Hi Tom,

Let us not forget that

(1) Professor Scott Aaronson claimed on his blog that the correlation predicted by my model is always constant and equal to -1.

(2) The exquisitely qualified FQXi panelist whose report I have seen claimed that the correlation predicted by my model is always constant and equal to -1.

(3) Professor Richard Gill claimed on these pages that the correlation predicted by my model is always constant and equal to -1.

(4) Professor James Owen Weatherall claimed in his paper that the correlation predicted by my model is always constant and equal to -1.

(5) A distinguished Editorial Board Member of Physical Review claimed in her report that the correlation predicted by my model is always constant and equal to -1.

(6) Several less-than-distinguished critics of my work also claimed that the correlation predicted by my model is always constant and equal to -1.

(7) I was declared a completely exposed charlatan and a crackpot, and my tiny research funding from FQXi was cut off.

Now compare the above distinguished opinions with the following opinion of a humble machine:

Image 1

Image 2

    Joy, I've spent years now trying to understand why your critics' arithmetic differs from yours. And mine.

    I concluded at some point that because they always assume the quantum probability measure, they deny the continuous function that completes the correlation measure. They never see it -- it never happens. Anything that might have happened is "nonlocal."

    If one computes only on the basis of first order arithmetic, the probability is compelled to collapse to unity, as the wave function of a quantum observation is believed to collapse.

    Introduce second order arithmetic (analysis), and the game changes -- there is no collapse, no nonlocality.

    The error -- just as you always claimed -- is built right into the assumption of what space one is working in. Where first-order arithmetic applies, a many-sided die gives one real result with n results in linear superposition; where second-order arithmetic applies, there is no probability for a linear outcome. The order relation (the primitive binary relation) in second-order arithmetic fluctuates (0,1) (1,0) continuously. If one judges this fluctuation by first-order axioms, one reasons that because the statement 0 < 1 is true and 1 < 0 is false on the positive real line R, whatever is less than 0 (the "distinguished member") is -1 and therefore mathematically illegal.

    In the analytical case, however, because we are not confined to the space of the real line (the topological space S^0), the distinguished member is a complex double zero {0,0}, such that a measurement function continuous from the initial condition, assuming the primitive binary relation and nondegeneracy, is either [0, +/- 1] or [+/- 1, 0], in which the closed interval makes the difference between judging results probabilistically on the open interval (-oo, +oo) and finding true deterministic correlation of left and right independent variables on parallelized topological spheres. S^0 is trivial; S^1 has the complex {0,0} but still allows the open interval. Only at S^3 do we encounter a closed manifold suitable for linear independence of the random variables; we know by complex function analysis that the only allowable results on the S^3 equator are +1, -1 and i (sqrt(-1)). We don't even need the complete physical space of S^7 to make the case for this subset of the parallelizable spheres, S^1, S^3, S^7 (for those unfamiliar with the topology, the notation means the unit spheres in Euclidean spaces of two, four and eight dimensions, accommodating the division algebras from the algebraically closed complex plane (S^1) through quaternion algebra (S^3) to octonion algebra (S^7)).
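    One closure property behind the S^3 claim can be checked concretely: the unit quaternions are exactly the points of S^3, and because the quaternion norm is multiplicative, the product of two unit quaternions is again a unit quaternion. A minimal Python sketch of that fact (the function names are mine, not from any paper under discussion):

```python
import random

def qmul(p, q):
    # Hamilton product of quaternions p = (w, x, y, z) and q = (w, x, y, z)
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def qnorm(q):
    # Euclidean norm of the quaternion viewed as a point of R^4
    return sum(c * c for c in q) ** 0.5

def random_unit_quaternion():
    # A uniformly random point of S^3: normalize a 4D Gaussian sample
    q = [random.gauss(0.0, 1.0) for _ in range(4)]
    n = qnorm(q)
    return tuple(c / n for c in q)

# |pq| = |p||q|, so unit quaternions (points of S^3) are closed
# under multiplication -- S^3 carries a group structure.
p, q = random_unit_quaternion(), random_unit_quaternion()
print(abs(qnorm(qmul(p, q)) - 1.0) < 1e-12)  # True
```

    The same multiplicativity of the norm is what singles out the division algebras (complex numbers, quaternions, octonions) attached to S^1, S^3 and S^7.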

    Let the critics come forward with counterarguments, if they have them. They certainly weren't shy about expressing their opinions when it wasn't clear whether the continuous function simulation of nonlinear input could be programmed digitally (thanks again to courageous Chantal Roth) -- what now? Nothing to say?

    All best,

    Tom

    Joy,

    Congratulations. For your critics to retain credibility, they must publicly don the hair shirt. The sign of a good scientist is admitting when one is wrong.

    I also hope you'll be generous and magnanimous in victory, another sign of a good scientist (if you're still alive when the win is noticed!).

    But I'm disappointed to have had no response or comment from you yet on my geometrical analogue of your finding. I have no desire to steal your thunder or give others an opportunity to drag it down, but I believe important physical insights emerge to consolidate the principle.

    I've had success making the point that Niels Bohr would likely say today that we could update 'what we can say' about the lack of structure of a particle, and test the effects of, for instance, toroidal geometry and chirality. The religious adherence to singlet states alone is now untenable.

    Have you yet researched the findings of an 'orbital asymmetry' of time-resolved single-particle correlations, predicted in my essay and now found in Alain Aspect's discarded data (not in his main paper)?

    Best wishes.

    Peter

    Hi Peter,

    Thank you for your kind words. I thought we had discussed your ideas before, but perhaps not the latest details you mention. Aspect's experiment has been superseded by many more careful and sophisticated experiments. The state-of-the-art experiments are now moving towards completely loophole-free tests. All of these experiments confirm the quantum mechanical predictions. Therefore your obligation is to reproduce the quantum mechanical predictions. I have not seen any derivations from you, even for the simplest cases. I speak equations. You seem to speak a language that I do not understand.
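    For reference, the prediction at issue is easy to state: quantum mechanics gives the singlet correlation E(a, b) = -cos(a - b) for analyzer angles a and b, and the CHSH combination of four such correlations reaches 2*sqrt(2) in magnitude, beyond the local-realistic bound of 2. A short sketch of this textbook prediction (not of any model discussed in this thread):

```python
import math

def E(a, b):
    # Quantum-mechanical singlet correlation for analyzer
    # angles a, b in radians: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # ≈ 2.828 = 2*sqrt(2), exceeding the Bell bound of 2
```

    Note that E(a, a) = -1 only at equal settings; the correlation is a continuous function of the angle difference, which is exactly what the "always constant and equal to -1" dispute above is about.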

    Best,

    Joy

    Hi James,

    I better understand what you mean, now. I also wondered whether writing a digital program of a continuous function with nonlinear input would be a "monumental task." The technical question is, "Is such a function algorithmically compressible?"

    Precise numerical implementation of a continuous function isn't possible in principle; however, if we accept that the conversion of a differential equation to a difference equation (approximating the continuous line by arbitrarily close points, as mentioned) is sufficiently smooth -- then the problem remains how to randomize the input such that we know that, wherever and whenever we insert on the line a pair of randomly generated dichotomous variables fluctuating between +1 and -1, the discrete choices remain linearly independent of the continuous function.
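    The conversion mentioned here, from a differential equation to a difference equation, is standard numerical practice. A generic sketch (Euler's method applied to an illustrative equation of my choosing, not the actual simulation under discussion) showing that the discrete approximation can be brought arbitrarily close to the continuous solution:

```python
import math

def euler(f, y0, t_end, n_steps):
    # Convert dy/dt = f(y) into the difference equation
    # y_{k+1} = y_k + h * f(y_k), with step h = t_end / n_steps.
    h = t_end / n_steps
    y = y0
    for _ in range(n_steps):
        y = y + h * f(y)
    return y

# Illustrative equation: dy/dt = -y, y(0) = 1, so y(1) = exp(-1).
exact = math.exp(-1.0)
for n in (10, 100, 1000):
    approx = euler(lambda y: -y, 1.0, 1.0, n)
    print(n, abs(approx - exact))  # error shrinks roughly like 1/n
```

    Whether a given continuous model is "algorithmically compressible" in this sense is exactly the question of whether such a discretization converges fast enough to be computed.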

    In all the simulations proposed to show failure, at least those I saw, there was no randomized input. The critics simply did not seem to understand that an arbitrary choice of vector by the programmer cannot be counted as the initial condition of a function continuous from the initial condition; they create a dependent condition based on the experimenter's choice -- and then, when they don't get Joy's predicted correlations on their assumed probability space, declare that something is wrong with the mathematics, rather than with their own assumptions about the initial condition and a probability space that isn't there in the first place. They don't grasp how Joy has left the choice of initial condition to nature and taken the experimenter out of it.

    The second worry I had, apart from the algorithmic compressibility of the framework, is that even if it should prove simulable, there remains the question: "Is the simulation of a continuous function itself a continuous function?" Only today has my last worry been laid to rest, having heard of the algorithm being replicated in at least two other machine languages. This is important, because it answers "yes" to the question and obviates any doubt that the model is independent of experimenter bias and that the variables are linearly independent.

    Here is a prediction you can quote me on:

    Because Gregory Chaitin has shown that linear arithmetic functions have a built-in uncertainty (Chaitin's Omega number is sensitive to the machine language running it), quantum computing algorithms based on linear superposition and quantum entanglement will fail. Preparation of the state vector in the complex Hilbert space biases the function. Better to start looking for nonlinear solutions and the arithmetic nonlinearity that powers them.

    Best,

    Tom

    Joy,

    I'm speaking the 3D+1 geometry of non-linear optics. We're observing the same volcano from opposite sides, so we see the same thing, yet nothing the same!

    I derive the QM prediction geometrically in the essay, using helical dynamics obeying Malus's law: "IQbit: The Intelligent Bit."

    Most recent experiments use 'weak measurement' statistical methods which are 'blind' to the cosine curves, since they can't correlate time-resolved individual photon pairs. Now that single-photon production is more of a reality, I've proposed a refined Aspect experiment which should reproduce the 'orbital asymmetry' present in Aspect's discarded data.

    Tom, Joy,

    You should find compressible continuous functions in helical paths. I've been discussing this in an APS Quantum Physics blog. I reproduce a recent post below, and can probably dig out a thesis on the Gottfried-Jackson angle and the helical frame if you can't find anything helpful.

    Post: "I haven't found any hint that it may represent a longitudinal component, but studying related experimental results (grazing grounds I've always found useful) does give consistent hints about axial helicity, consistent for instance with Jackson-Minkowski rotation viewable from the Gottfried-Jackson frame. The latter has nothing to do with me; it concerns the 'GJ angle' between the lab-frame momentum of the Higgs and an emitted photon in the resonance rest frame. Rotating by the angle alpha on the y-axis gives the helicity rest frame (alpha differentiates the z-axis in that frame and the approaching photon's 3-momentum vector). I'll find some authoritative links on that if you want.

    But translating the gobbledygook into possible physical models with a simplified description, it looks something like this: the (spin-0, but possibly spin-2!) Higgs may facilitate a marriage between particle and antiparticle into a coherent (massive) and conserved toroidal dynamic, perhaps by adding a second, supplementary 'winding' spin to the body. So the primary spin may be considered as the 'ring' rotating (giving the primary helicity on axial translation [motion]), but the torus is held together by the poles (of the dipole) spinning, which produces the twin counter-'winding' of the torus.

    The logical analogies of this are quite wide. For instance, spin 1/2 and 1-1/2 can be physically represented by the charges ending up in the same place after only a half-integer rotation of the ring. Sure, it's all circumstantial and highly speculative, but only in a similar way to jigsaw puzzles, and I've tried hard to falsify it and can't. I really thought Bell's theorem would destroy it, but, as you've seen, it avoids Bell's (Bohr's 1920s) assumption of singlet spin states because it has another couple hidden away!

    The 'no-mirror symmetry' matter is also critical to the EPR case. Do take a look at that aspect and comment, as it seems to act as 'information pre-held' between the entangled particles." [PJ, 3 days ago]

    best wishes

    Peter

    Let's take a trip back in time, to just a few months ago, when Richard Gill said, "We can be grateful for Christian that he has had the generosity to write his one page paper with a more or less complete derivation of his key result in a more or less completely explicit context, without distraction from the author's intended physical interpretation of the mathematics. The mathematics should stand on its own, the interpretation is 'free'. My finding is that in this case, the mathematics does not stand on its own."

    Now that we know that that finding is wrong -- and instead that the mathematical interpretation of Bell's theorem is not free, and in fact imposes a dependent condition on the outcome -- where's Gill?

    And where's Aaronson, who said, "I can't think of any better illustration than the comment thread below for my maxim that *computation is clarity.* In other words, if you can't explain how to simulate your theory on a computer, chances are excellent that the reason is that your theory makes no sense!"

    The simulation of Joy's function looks pretty clear to me.

    C'mon, guys. You dished it out.

    Tom

      Tom,

      As you know, the derivation of the correlation in my one-page paper stands on its own without the backing of the simulation. But, as one of my friends recently noted, in science "it seems to matter to have influential names on one's side." He then proceeded to ask me whether Lucien Hardy agreed with my calculation of the expectation value derived in my one-page paper. The answer is: Yes. Equations (1.22) to (1.25) of my book, as well as those in my one-page paper, have been explicitly verified---in great detail---by Lucien Hardy, Manfried Faber, and several other competent and knowledgeable physicists and mathematicians around the world. In fact any attentive reader with only basic skills in mathematics should be able to reproduce these equations rather effortlessly. The simulation discussed in the attached paper is thus just icing on the cake. It is no more than a feel-good factor. As you well know, the real beef is in the analytical model itself.

      Best,

      Joy

      Attachment #1: 18_whither.pdf

      Joy, I have tried my hardest to understand the sociology of this conflict, and I cannot. Even considering that the stakes are high, and that your idea is new, there are numerous high-stakes, new-idea propositions in theoretical physics that don't attract the level of disdain and uncollegial criticism that yours does.

      Einstein was known to warn against the "cult of personality" in matters of scientific importance. He rejected all attempts to enshrine his own work into such a cult, although not altogether successfully.

      If your research were such that it substituted techno-babble and mathematical esoterica for meaningful content (as so many papers these days do) -- I could understand the reaction. Such is not the case, however:

      Nothing could be clearer than your eqn 3 in the above attached paper, and the conclusion, "Unless based on a prescription of this precise form, any Bell-type analysis simply does not get off the ground, because without completeness there can be no theorem."

      Your argument is completely kosher and coherent -- from the precise prescription for a simply connected co-domain to the continuous measurement function that demonstrates the case.

      That's exactly what science is supposed to do: conjecture and predict.

      All best,

      Tom

      Tom,

      I was just reading Alfred Wegener's biography. I think the hostility, disdain, and neglect his work received is unparalleled in the recent history of science. But let us not worry too much about the sociology of science.

      You have put your finger on the key equation---Equation 3 of my paper. The rest are details, as Einstein would have put it. The strong correlation necessarily follows from that prescription.

      Best,

      Joy

      Hi Joy,

      Seeing that an old George Musser blog on quantum teleportation has attracted new comments, I did some surfing and found what I think is a meaningful discussion among Lawrence Crowell, Ray Munroe (rest in peace), Fred, me and you, over topological foundations, beginning with Lawrence's post on 22 December 2011, 17:40 GMT.

      It would be nice if we could stimulate a new dialogue in that collegial manner -- Ray was wrong; however, he was wrong for the right reasons. One really does need to understand the topological principles that drive your result, and to understand that the topology isn't something you made up to suit the case.

      Best,

      Tom

      Bee Hossenfelder is the most honest blogger I know, on science or any other subject. Her recent take on the subject of quantum foundations strikes a chord here:

      "Quantum foundations polarizes like no other area in physics. On the one hand there are those actively participating who think it's the most important thing ever but no two of them can agree on anything. And then there's the rest who thinks it's just a giant waste of time. In contrast, most people tend to agree that quantum gravity is worthwhile, though they may differ in their assessment of how relevant it is. And while there are subgroups in quantum gravity, there's a lot of coherence in these groups (even among them, though they don't like to hear that).

      "As somebody who primarily works in quantum gravity, I admit that I'm jealous of the quantum foundations people. Because they got data. It is plainly amazing for me to see just how much technological progress during the last decade has contributed to our improved understanding of quantum systems. May that be tests of Bell's theorem with entangled pairs separated by hundreds of kilometers, massive quantum oscillators, molecule interferometry, tests of the superposition principle, weak measurements, using single atoms as a double slit, quantum error correction, or the tracking of decoherence, to only mention what popped into my head first. When I was a student, none of that was possible. This enables us to test quantum theory now much more precisely and in more circumstances than ever before.

      "This technological progress may not have ignited the interest in the foundations of quantum mechanics but it has certainly contributed to the field drawing more attention and thus drawing more people. That however doesn't seem to have decreased the polarization of opinions, but rather increased it. The more attention research on quantum foundations gets, the more criticism it draws."

      Because Bee is a phenomenologist, I forgive her infatuation with technology. Fact is, new theoretical examinations of quantum foundations also tell us which technologies may *not* be viable -- such as quantum computing based on superposition and quantum entanglement. As for the successful tests of conventional quantum theory in the middle paragraph above: regardless that they "got data," it is not data that transfers to any useful technology -- in fact, the data have no way to show more than a demonstration of what the theory assumes.

      Where technology meets real world applications, the technology has nothing to do with quantum foundations. It operates on principles of electromagnetic theory and statistical mechanics that easily coexist with an incomplete mathematical framework of quantum theory.

      The prize, where quantum foundations makes a technology difference, is information-theoretic. And that's where it has to be -- like the classical physics of relativity -- mathematically complete. Already we see quantum computing theories that do not require entanglement making headway, like the D-wave program. Quantum discord also does not depend on entanglement.

      Tom

      Hi Tom,

      Another problem quantum theory has is that, I believe, all its results are classical. It's a duality situation for sure; one does not exist without the other. If the folks trying to build quantum computers stop trying to use entanglement and instead use what Joy has discovered, they might be able to do something better than ordinary computers, since there is a difference between the probabilities. The D-Wave machine might actually be adaptable to that approach.

      Best,

      Fred

      Hi Fred and Tom,

      Quantum computers would almost certainly require quantum entanglement for exponential speedup. If quantum entanglement is not a fundamental feature of nature, then that is almost certainly very bad news for exponential speedup, and hence for quantum computers.

      Now within my local-realistic framework there is no entanglement of any kind. All physical systems, classical or quantum, are described by intrinsically factorizable measurement results and classical (albeit nonlinear) probabilities, as discussed in the appendix of the attached paper. This goes against the deeply held belief that certain probabilities and associated correlations cannot exist without quantum entanglement. But I have *derived* both quantum probabilities and quantum correlations without any kind of entanglement, as shown in the charts below. Therein lie the origins of the hostility and disdain against my work and against me personally. If I am right, then many people are wrong. It is much easier to discredit me personally than to try to understand what I have found.

      Best,

      Joy

      Image 1

      Image 2

      Attachment #1: 19_whither.pdf

        • [deleted]

        Quite right, Joy. The high vulnerability of quantum entanglement to decoherence should have been a clue long ago: were entanglement a real physical phenomenon, it would exert a physical effect on neighboring states -- I mean that if superposition is a protected state, it should interfere with the wave function of its environment rather than being semi-stable within it.

        Nature is not fragile. Computation is performed every moment on every hardware substrate without regard to special conditions. We have spent so much effort learning how to prepare against nature's propensity for decoherence and thermal equilibrium, that we have forgotten how nature uses decoherence to create new forms. Classical probability is a strong manifestation of this principle, because for every question answered (yes/no) there are a countable infinity of questions that remain locally asymmetric to the result; however, even with infinite separation between the two possible answers, evolution of the state is guaranteed unitary with no further assumptions. "All physics is local."

        Computing by quantum entanglement assumes that infinities are tamed in the superposition of states, with the further assumption of nonlocality. I have you to thank, Joy, for making it clear to me that these assumptions are superfluous and illusory.

        And it isn't that I don't think computation can be exponentially faster -- distributed and parallel systems of computing that mimic nature's nonlinear way of converting infinite possibilities to local probabilities are identical to a simply connected topological network.

        All best,

        Tom

        Still hoping that this thread will attract interactive exchange with those who know something about the foundations of quantum theory ...

        Lucien Hardy has asked a pointed question: " ... could a 19th century theorist have developed quantum theory without access to the empirical data that later became available to his 20th century descendants?"

        Hardy's answer is an axiomatic treatment of quantum theory that peels back the veil of Hilbert space formalism, to bare the skeleton of essential assumptions:

        "Axiom 1 *Probabilities.* Relative frequencies (measured by taking the proportion of times a particular outcome is observed) tend to the same value (which we call the probability) for any case where a given measurement is performed on an ensemble of n systems prepared by some given preparation in the limit as n becomes infinite.

        "Axiom 2 *Simplicity.* K is determined by a function of N (i.e. K = K(N)) where N = 1, 2, . . . and where, for each given N, K takes the minimum value consistent with the axioms.

        "Axiom 3 *Subspaces.* A system whose state is constrained to belong to an M dimensional subspace (i.e. have support on only M of a set of N possible distinguishable states) behaves like a system of dimension M.

        "Axiom 4 *Composite systems.* A composite system consisting of subsystems A and B satisfies N = N_A N_B and K = K_A K_B

        "Axiom 5 *Continuity.* There exists a continuous reversible transformation on a system between any two pure states of that system."

        One is compelled to see the beauty in this formalization, if one has a mathematical soul. For the consistency of calculation to meet the consistency of observation there need be continuity between the mathematical model and the physical result, without sacrificing the independence between the model's formal language and the physical manifestation. So it is with elegant understatement that Hardy adds:

        "The first four axioms are consistent with classical probability theory but the fifth is not (unless the word 'continuous' is dropped). If the last axiom is dropped then, because of the simplicity axiom, we obtain classical probability theory (with K = N) instead of quantum theory (with K = N2). It is very striking that we have here a set of axioms for quantum theory which have the property that if a single word is removed -- namely the word 'continuous' in Axiom 5 -- then we obtain classical probability theory instead."

        My personal investment in researching the mathematical physics of continuous functions goes back about 10 years. I think this recent preliminary result, which I am in the process of formalizing, supports the case that reversible transformations of pure states are native to the plane. There can hardly be a purer quantum arithmetic state than a pair of distinct odd prime integers.

        Tom
