• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

Tom, Joy,

You should find compressible continuous functions in helical paths. I've been discussing this on an APS Quantum Physics blog; I reproduce a recent post below, and can probably dig out a thesis on the Gottfried-Jackson angle and helical frame if you can't find anything helpful.

Post; "I haven't found any hint that it may represent a longitudinal component, but studying related experimental results (grazing grounds I've always found useful), does give consistent hints about axial helicity consistent for instance with Jackson-Minkowski rotation viewable from the Gottfried-Jackson frame. The latter is nothing to do with me; but on the 'GJ angle' between the lab frame momentum of the Higgs and an emitted photon in the resonance rest frame. Rotating by the angle alpha on the y-axis gives the helicity rest frame. (alpha differentiates the z-axis in that frame and the approaching photons 3-momentum vector). I'll find some authoritative links on that if you want.

But translating the gobbledygook into possible physical models with a simplified description, it looks as though the (spin-0, but possibly spin-2!) Higgs may facilitate a marriage between particle and antiparticle into a coherent (massive) and conserved toroidal dynamic, perhaps by adding a second, supplementary 'winding' spin to the body. So the primary spin may be considered as the 'ring' rotating (giving the primary helicity on axial translation [motion]), while the torus is held together by the poles (of the dipole) spinning, which produces the twin counter 'winding' of the torus.

The logical analogies of this are quite wide. For instance, spin 1/2 and spin 1-1/2 can be physically represented by the charges ending up in the same place after only a half-integer rotation of the ring. Sure, all circumstantial and highly speculative, but only in a similar way to jigsaw puzzles, and I've tried hard to falsify it and can't. I really thought Bell's Theorem would destroy it, but, as you've seen, it avoids Bell's (Bohr's 1920s) assumption of single spin states because it has another couple hidden away!

The 'no-mirror symmetry' matter is also critical to the EPR case. Do take a look at that aspect and comment, as it seems to act as 'information pre-held' between the entangled particles." [PJ, 3 days ago]
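For reference, the rotation described in the quoted post can be sketched as follows (a minimal sketch under the usual convention, not taken from a specific source): with the z-axis of the Gottfried-Jackson frame along the chosen reference momentum, a rotation by alpha about the y-axis carries it onto the helicity-frame z-axis:

\[
R_y(\alpha) = \begin{pmatrix} \cos\alpha & 0 & \sin\alpha \\ 0 & 1 & 0 \\ -\sin\alpha & 0 & \cos\alpha \end{pmatrix},
\qquad
\hat{z}_{\mathrm{hel}} = R_y(\alpha)\,\hat{z}_{\mathrm{GJ}} .
\]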

best wishes

Peter

Let's take a trip back in time, to just a few months ago, when Richard Gill said, "We can be grateful for Christian that he has had the generosity to write his one page paper with a more or less complete derivation of his key result in a more or less completely explicit context, without distraction from the author's intended physical interpretation of the mathematics. The mathematics should stand on its own, the interpretation is 'free'. My finding is that in this case, the mathematics does not stand on its own."

Now that we know that that finding is wrong -- and instead that the mathematical interpretation of Bell's theorem is not free, and in fact imposes a dependent condition on the outcome -- where's Gill?

And where's Aaronson, who said, "I can't think of any better illustration than the comment thread below for my maxim that *computation is clarity.* In other words, if you can't explain how to simulate your theory on a computer, chances are excellent that the reason is that your theory makes no sense!"

The simulation of Joy's function looks pretty clear to me.

C'mon, guys. You dished it out.

Tom

    Tom,

    As you know, the derivation of the correlation in my one-page paper stands on its own, without the backing of the simulation. But, as one of my friends recently noted, in science "it seems to matter to have influential names on one's side." He then proceeded to ask me whether Lucien Hardy agreed with my calculation of the expectation value derived in my one-page paper. The answer is: Yes. Equations (1.22) to (1.25) of my book, as well as those in my one-page paper, have been explicitly verified---in great detail---by Lucien Hardy, Manfried Faber, and several other competent and knowledgeable physicists and mathematicians around the world. In fact, any attentive reader with only basic skills in mathematics should be able to reproduce these equations rather effortlessly. The simulation discussed in the attached paper is thus just icing on the cake. It is no more than a feel-good factor. As you well know, the real beef is in the analytical model itself.

    Best,

    Joy

    Attachment #1: 18_whither.pdf

    Joy, I have tried my hardest to understand the sociology of this conflict, and I cannot. Even considering that the stakes are high, and that your idea is new, there are numerous high-stakes, new-idea propositions in theoretical physics that don't attract the level of disdain and uncollegial criticism that yours does.

    Einstein was known to warn against the "cult of personality" in matters of scientific importance. He rejected all attempts to enshrine his own work into such a cult, although not altogether successfully.

    If your research were such that it substituted techno-babble and mathematical esoterica for meaningful content (as so many papers these days do) -- I could understand the reaction. Such is not the case, however:

    Nothing could be clearer than your eqn 3 in the above attached paper, and the conclusion, "Unless based on a prescription of this precise form, any Bell-type analysis simply does not get off the ground, because without completeness there can be no theorem."

    Your argument is completely kosher and coherent -- from the precise prescription for a simply connected co-domain to the continuous measurement function that demonstrates the case.

    That's exactly what science is supposed to do: conjecture and predict.

    All best,

    Tom

    Tom,

    I was just reading Alfred Wegener's biography. I think the hostility, disdain, and neglect his work received is unparalleled in the recent history of science. But let us not worry too much about the sociology of science.

    You have put your finger on the key equation---Equation 3 of my paper. The rest are details, as Einstein would have put it. The strong correlation necessarily follows from that prescription.

    Best,

    Joy

    Hi Joy,

    Seeing that an old George Musser blog on quantum teleportation has attracted new comments, I did some surfing and found what I think is a meaningful discussion among Lawrence Crowell, Ray Munroe (rest in peace), Fred, me and you, over topological foundations, beginning with Lawrence's post on 22 December 2011, 17:40 GMT.

    It would be nice if we could stimulate a new dialogue in that collegial manner -- Ray was wrong; however, he was wrong for the right reasons. One really does need to understand the topological principles that drive your result, and to understand that the topology isn't something that you made up to suit the case.

    Best,

    Tom

    Bee Hossenfelder is the most honest blogger I know, on science or any other subject. Her recent take on the subject of quantum foundations strikes a chord here:

    "Quantum foundations polarizes like no other area in physics. On the one hand there are those actively participating who think it's the most important thing ever but no two of them can agree on anything. And then there's the rest who thinks it's just a giant waste of time. In contrast, most people tend to agree that quantum gravity is worthwhile, though they may differ in their assessment of how relevant it is. And while there are subgroups in quantum gravity, there's a lot of coherence in these groups (even among them, though they don't like to hear that).

    "As somebody who primarily works in quantum gravity, I admit that I'm jealous of the quantum foundations people. Because they got data. It is plainly amazing for me to see just how much technological progress during the last decade has contributed to our improved understanding of quantum systems. May that be tests of Bell's theorem with entangled pairs separated by hundreds of kilometers, massive quantum oscillators, molecule interferometry, tests of the superposition principle, weak measurements, using single atoms as a double slit, quantum error correction, or the tracking of decoherence, to only mention what popped into my head first. When I was a student, none of that was possible. This enables us to test quantum theory now much more precisely and in more circumstances than ever before.

    "This technological progress may not have ignited the interest in the foundations of quantum mechanics but it has certainly contributed to the field drawing more attention and thus drawing more people. That however doesn't seem to have decreased the polarization of opinions, but rather increased it. The more attention research on quantum foundations gets, the more criticism it draws."

    Because Bee is a phenomenologist, I forgive her infatuation with technology. The fact is, new theoretical examinations of quantum foundations also tell us what technologies may *not* be viable -- such as quantum computing based on superposition and quantum entanglement. The successful tests of conventional quantum theory listed in the middle paragraph above, however much they "got data," do not yield data that transfers to any useful technology -- in fact, the data can show no more than a demonstration of what the theory assumes.

    Where technology meets real world applications, the technology has nothing to do with quantum foundations. It operates on principles of electromagnetic theory and statistical mechanics that easily coexist with an incomplete mathematical framework of quantum theory.

    The prize, where quantum foundations makes a technological difference, is information-theoretic. And that's where it has to be -- like the classical physics of relativity -- mathematically complete. Already we see quantum computing approaches that do not require entanglement making headway, like the D-Wave program. Quantum discord also does not depend on entanglement.

    Tom

    Hi Tom,

    Another problem quantum theory has is that I believe all its results are classical. It's a duality situation for sure: one does not exist without the other. If the folks trying to build quantum computers stop trying to use entanglement and instead use what Joy has discovered, they might be able to do something better than ordinary computers, since there is a difference between the probabilities. The D-Wave approach might actually lend itself to that.

    Best,

    Fred

    Hi Fred and Tom,

    Quantum computers would almost certainly require quantum entanglement for exponential speedup. If quantum entanglement is not a fundamental feature of nature, then that is almost certainly very bad news for exponential speedup, and hence for quantum computers.

    Now within my local-realistic framework there is no entanglement of any kind. All physical systems, classical or quantum, are described by intrinsically factorizable measurement results and classical (albeit nonlinear) probabilities, as discussed in the appendix of the attached paper. This goes against the deeply held belief that certain probabilities and associated correlations cannot exist without quantum entanglement. But I have *derived* both quantum probabilities and quantum correlations without any kind of entanglement, as shown in the charts below. Therein lies the origin of the hostility and disdain directed against my work and against me personally. If I am right, then many people are wrong. It is much easier to discredit me personally than to try to understand what I have found.

    Best,

    Joy

    Image 1

    Image 2

    Attachment #1: 19_whither.pdf
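    For reference, the "quantum" probabilities and correlation at issue in the EPR-Bohm (singlet) case are the standard textbook predictions (quoted here as the target of the derivation, not as steps within the model), with theta_ab the angle between the detector settings a and b:

    \[
    P(+,+) = P(-,-) = \tfrac{1}{2}\sin^2\!\frac{\theta_{ab}}{2},
    \qquad
    P(+,-) = P(-,+) = \tfrac{1}{2}\cos^2\!\frac{\theta_{ab}}{2},
    \]
    \[
    E(\mathbf{a},\mathbf{b}) = \sum_{A,B=\pm 1} A\,B\; P(A,B) = -\cos\theta_{ab} = -\mathbf{a}\cdot\mathbf{b} .
    \]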

      Quite right, Joy. The high vulnerability of quantum entanglement to decoherence should have been a clue long ago that, were entanglement a real physical phenomenon, it would exert a physical effect on neighboring states -- I mean that if superposition were a protected state, it should interfere with the wave function of its environment rather than remaining semi-stable with respect to the environment.

      Nature is not fragile. Computation is performed every moment on every hardware substrate without regard to special conditions. We have spent so much effort learning how to prepare against nature's propensity for decoherence and thermal equilibrium, that we have forgotten how nature uses decoherence to create new forms. Classical probability is a strong manifestation of this principle, because for every question answered (yes/no) there are a countable infinity of questions that remain locally asymmetric to the result; however, even with infinite separation between the two possible answers, evolution of the state is guaranteed unitary with no further assumptions. "All physics is local."

      Computing by quantum entanglement assumes that infinities are tamed in the superposition of states, with the further assumption of nonlocality. I have you to thank, Joy, for making it clear to me that these assumptions are superfluous and illusory.

      And it isn't that I don't think that computation can be exponentially faster -- distributed and parallel systems of computing that mimic nature's nonlinear way of converting infinite possibilities to local probabilities are identical to a simply connected topological network.

      All best,

      Tom

      Still hoping that this thread will attract interactive exchange with those who know something about the foundations of quantum theory ...

      Lucien Hardy has asked a poignant question: " ... could a 19th century theorist have developed quantum theory without access to the empirical data that later became available to his 20th century descendants?"

      Hardy's answer is an axiomatic treatment of quantum theory that peels back the veil of Hilbert space formalism, to bare the skeleton of essential assumptions:

      "Axiom 1 *Probabilities.* Relative frequencies (measured by taking the proportion of times a particular outcome is observed) tend to the same value (which we call the probability) for any case where a given measurement is performed on an ensemble of n systems prepared by some given preparation in the limit as n becomes infinite.

      "Axiom 2 *Simplicity.* K is determined by a function of N (i.e. K = K(N)) where N = 1, 2, . . . and where, for each given N, K takes the minimum value consistent with the axioms.

      "Axiom 3 *Subspaces.* A system whose state is constrained to belong to an M dimensional subspace (i.e. have support on only M of a set of N possible distinguishable states) behaves like a system of dimension M.

      "Axiom 4 *Composite systems.* A composite system consisting of subsystems A and B satisfies N = N_A N_B and K = K_A K_B

      "Axiom 5 *Continuity.* There exists a continuous reversible transformation on a system between any two pure states of that system."

      One is compelled to see the beauty in this formalization, if one has a mathematical soul. For the consistency of calculation to meet the consistency of observation, there needs to be continuity between the mathematical model and the physical result, without sacrificing the independence between the model's formal language and the physical manifestation. So it is with elegant understatement that Hardy adds:

      "The first four axioms are consistent with classical probability theory but the fifth is not (unless the word 'continuous' is dropped). If the last axiom is dropped then, because of the simplicity axiom, we obtain classical probability theory (with K = N) instead of quantum theory (with K = N2). It is very striking that we have here a set of axioms for quantum theory which have the property that if a single word is removed -- namely the word 'continuous' in Axiom 5 -- then we obtain classical probability theory instead."

      My personal investment in researching the mathematical physics of continuous functions goes back about 10 years. I think this recent preliminary result that I am in the process of formalizing supports the case that reversible transformations of pure states are native to the plane. There can hardly be a purer quantum arithmetic state than a pair of distinct odd prime integers.

      Tom

      Some day I may get the link right on the first try, but don't count on it.

      (My apologies. I wish there were some way to verify before posting. If this one doesn't work, then use the one in the post which started this thread.)

      "Still hoping that this thread will attract interactive exchange with those who know something about the foundations of quantum theory ..."

      OK Tom, I'll bite. Hardy's Abstract states that:

      "This work provides some insight into the reasons why quantum theory is the way it is. For example, it explains the need for complex numbers..."

      Elsewhere in these discussions, I have pointed out that complex numbers, arising from the introduction of Fourier Transforms into QM, are not in fact NEEDED at all. They are merely sufficient. The unmeasurable phase of the Fourier Transform is what provides the continuous transform between "pure states" (Axiom 5). But that is also unnecessary, since the Fourier Transform itself is unnecessary.

      The Fourier Transform, combined with the summation of the squared real and imaginary parts, is algebraically identical to a filter-bank that computes classical probabilities by merely counting discrete detections. In other words, it forms a histogram, which is why probabilities appear at all.

      Let me restate this more bluntly:

      Histograms are used to "measure" probability distributions.

      Physicists have unwittingly fabricated a mathematical structure for quantum theory, that is identical to a histogram. Consequently, the theory only produces descriptions of probability distributions of measurements, rather than specific measurements, as in classical theories, which do not construct histograms.

      Consider a purely classical analogy:

      If you created a frequency-modulated radio signal with discrete states (frequency shift keying) and then attempted to characterize those states via Fourier Transforms, you would encounter all the same problems that physicists have encountered in QM. That is why transforms are not used to "measure" such signals. Instead, a mathematical structure that actually measures the frequency, rather than the probability (histogram) of the frequency, is used to characterize the states and thus demodulate the signal.

      Rob McEachern
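      To make the filter-bank/histogram analogy concrete, here is a toy numerical sketch (my own construction with assumed parameters, not Rob's): a signal is built from a sequence of discrete tone bursts, and accumulating the squared real and imaginary parts of the DFT per frequency bin simply counts how many bursts carried each tone.

```python
import numpy as np

# Toy check of the filter-bank-as-histogram analogy (assumed parameters).
# Each burst is a pure cosine at one of a few discrete frequencies chosen
# to sit exactly on DFT bins, so there is no spectral leakage.

fs = 1000                              # sample rate in Hz (assumed)
N = 100                                # samples per tone burst
tones = [50, 120, 50, 200, 120, 50]    # sequence of discrete "detections" (Hz)

t = np.arange(N) / fs
signal = np.concatenate([np.cos(2 * np.pi * f * t) for f in tones])

# Accumulate the "sum of squared real and imaginary parts" per bin.
power = np.zeros(N // 2 + 1)
for burst in np.split(signal, len(tones)):
    X = np.fft.rfft(burst)
    power += X.real**2 + X.imag**2

norm = (N / 2) ** 2                    # peak power of one unit-amplitude cosine burst
for f in sorted(set(tones)):
    k = round(f * N / fs)              # DFT bin carrying tone f
    print(f"{f} Hz: bin power / norm = {power[k] / norm:.2f}, "
          f"actual count = {tones.count(f)}")
```

      Dividing the accumulated bin power by the single-burst peak recovers the occurrence count of each tone -- which is exactly a histogram over the discrete states.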

      Hi Joy,

      For sure there wouldn't be an exponential speedup as many are dreaming of. But there may be a little bit of a gain leveraging quantum-like probabilities based on your model. I guess due to your model, they shouldn't be called "quantum" probabilities any more. So what to call them?

      Best,

      Fred

      Hi Fred,

      I agree. The probabilistic predictions of my model and those of quantum mechanics are exactly the same, but without the unnecessary baggage of "entanglement."

      For probabilities I was going to suggest the name "non-commutative probabilities", but that too seems misleading. If we look at the product (A33) of my paper that leads to the "quantum" probabilities in the EPR case, then it is clear that non-commutativity does not play any role in the derivation of the probabilities. We can reverse the order of the product in (A33) and still get the same probabilities, even though the direction of the bivector in the product-quaternion on the LHS of (A33) would be different. Perhaps "probabilities due to non-commutativity" is more appropriate (because, after all, the quaternions that make up the 3-sphere do not commute), but that name seems too much of a mouthful.

      Best,

      Joy
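      As a side note on the commutativity point, the following is a generic quaternion identity (a sketch of the general algebraic fact, not a step from the paper): reversing the order of a quaternion product leaves the scalar part unchanged while the vector (bivector) part flips its cross-product contribution, so quantities built from the scalar part are insensitive to the ordering.

```python
import numpy as np

# Generic quaternion check: the scalar part of p*q equals the scalar part
# of q*p, but the vector (bivector) parts differ because the cross product
# flips sign under reversal of the product order.

def qmul(p, q):
    """Hamilton product of quaternions represented as (scalar, 3-vector)."""
    p0, pv = p
    q0, qv = q
    return (p0 * q0 - np.dot(pv, qv),
            p0 * qv + q0 * pv + np.cross(pv, qv))

rng = np.random.default_rng(0)
p = (rng.normal(), rng.normal(size=3))
q = (rng.normal(), rng.normal(size=3))

pq, qp = qmul(p, q), qmul(q, p)
print("scalar parts equal: ", np.isclose(pq[0], qp[0]))       # True
print("vector parts differ:", not np.allclose(pq[1], qp[1]))  # True
```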

      Hi Fred, Joy,

      What's wrong with "classical probability?" A quantum, whether it applies to a microscopic particle or a universe, is a probable result of a binary choice. It's only noncommutative following measurement. It's otherwise reversible in principle, as an element of a continuous function.

      In other words, taking the classical case of a planetary orbit: the trajectory is reversible before the initial condition prescribes a trajectory. That doesn't mean the two orbits are in superposition. The simple beauty of Joy's framework captured my imagination from the start, because it depends on only two fundamental principles: the topological space and the initial condition. The complete measurement function follows.

      All best,

      Tom

      Tom,

      I too prefer the name "classical probabilities", for that is what they truly are within my framework. But in the literature following Bell one means something different by "classical probabilities." They are not supposed to be able to produce correlations as strong as, say, the EPR-Bohm correlations. So a different name is surely called for. I suppose one can contrast the name "weak probabilities" with the EPR-Bohm like "strong probabilities", both being purely "classical" within my framework.

      Best,

      Joy

      Hi Rob,

      Really glad to see you here. I appreciate that you're one who can always be counted on to wade into the foundational details.

      The Fourier transform is popular, I expect, because it makes a lot of calculation easier. In fact, the same applies to the complex plane in general -- I expect that some quantum theorists may take Hilbert space, and the quantum-theory formalisms associated with it, as something special and perhaps even physical, but no mathematician is likely to make that mistake.

      The algebraically closed property of the complex plane, however, *is* important to any spacetime geometry -- because we can get nonlinear functions from fewer assumptions and still recover the real-valued numbers that you demand for real physical results. Personally, I find the primary importance of analysis to physical applications in the constructivist realization that every real-valued function of a real variable is continuous. This is critical to any constructive theory of complete measurement functions, such as Joy Christian's, whether the theory applies to physics or is only mathematical. There cannot be any representation of a probability space that creates a gap between everywhere simply connected points.

      So I strongly agree with you that -- as you imply -- *if* probability measure is a *foundational assumption* of how nature works, then getting rid of complex numbers will leave only classical probability.

      And that's what Lucien Hardy is getting at, too -- he's enumerated five axioms of which four incorporate both classical and quantum probability, and one which obviates continuous function classical physics at the foundational level.

      You quote from Hardy's abstract:

      "This work provides some insight into the reasons why quantum theory is the way it is. For example, it explains the need for complex numbers..."

      And after arguing that quantum probability measures do not require Fourier transforms and therefore complex analysis, you say:

      "Let me restate this more bluntly:

      "Histograms are used to 'measure' probability distributions.

      "Physicists have unwittingly fabricated a mathematical structure for quantum theory, that is identical to a histogram. Consequently, the theory only produces descriptions of probability distributions of measurements, rather than specific measurements, as in classical theories, which do not construct histograms."

      I agree! It is only by the sum of histories and normalization that one recovers unitarity, in order to make quantum results coherent and mathematically compatible with observed outcomes.

      In lecture notes on the first law of thermodynamics, the author is careful to point out right away that "The value of a state function is independent of the history of the system." A continuous change of state (a measurement function continuous from the initial condition) cannot be cumulative when there is no probability measure to normalize; the usefulness of a histogram in this case is limited to showing that unitary evolution is scale invariant -- that is, by both classical and quantum predictions, correlated values are independent of the time at which they are measured. The difference between the probabilistic measure and the continuous-function measure is that, by assuming a probability on a measure space and normalizing it, one gets only what one assumes to be true. The continuous measurement function (Joy's) gets the true result by a frequentist statistical analysis, independent of assuming some probability on the closed interval [0,1].

      All best,

      Tom
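      To illustrate the frequentist point in the last paragraph (a toy sketch of counting-based estimation, not Joy's model): if pair outcomes are generated from the standard singlet probabilities, the correlation E(a,b) is recovered purely by counting relative frequencies of agreement and disagreement, with no normalized probability measure assumed in the estimator itself.

```python
import numpy as np

# Frequentist toy estimate of E(a,b) from coincidence counts (assumed setup).
# Outcome products A*B are drawn from the standard singlet probabilities; the
# estimator is plain counting: (agreements - disagreements) / total pairs.

rng = np.random.default_rng(1)
theta = np.pi / 3                      # angle between detector settings a and b
n = 200_000                            # number of particle pairs (assumed)

p_same = np.sin(theta / 2) ** 2        # P(A*B = +1) for the singlet state
products = rng.choice([+1, -1], size=n, p=[p_same, 1 - p_same])

E_counted = products.mean()            # relative-frequency estimate
E_quantum = -np.cos(theta)             # the quantum prediction, -a.b
print(f"counted E = {E_counted:+.4f}, quantum prediction = {E_quantum:+.4f}")
```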