• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality

Hi Joy,

Seeing that an old George Musser blog on quantum teleportation has attracted new comments, I did some surfing and found what I think is a meaningful discussion among Lawrence Crowell, Ray Munroe (rest in peace), Fred, you, and me about topological foundations, beginning with Lawrence's post of 22 December 2011, 17:40 GMT.

It would be nice if we could stimulate a new dialogue in that collegial manner -- Ray was wrong; however, he was wrong for the right reasons. One really does need to understand the topological principles that drive your result -- and to understand that the topology isn't something you made up to suit the case.

Best,

Tom

Bee Hossenfelder is the most honest blogger I know, on science or any other subject. Her recent take on the subject of quantum foundations strikes a chord here:

"Quantum foundations polarizes like no other area in physics. On the one hand there are those actively participating who think it's the most important thing ever but no two of them can agree on anything. And then there's the rest who thinks it's just a giant waste of time. In contrast, most people tend to agree that quantum gravity is worthwhile, though they may differ in their assessment of how relevant it is. And while there are subgroups in quantum gravity, there's a lot of coherence in these groups (even among them, though they don't like to hear that).

"As somebody who primarily works in quantum gravity, I admit that I'm jealous of the quantum foundations people. Because they got data. It is plainly amazing for me to see just how much technological progress during the last decade has contributed to our improved understanding of quantum systems. May that be tests of Bell's theorem with entangled pairs separated by hundreds of kilometers, massive quantum oscillators, molecule interferometry, tests of the superposition principle, weak measurements, using single atoms as a double slit, quantum error correction, or the tracking of decoherence, to only mention what popped into my head first. When I was a student, none of that was possible. This enables us to test quantum theory now much more precisely and in more circumstances than ever before.

"This technological progress may not have ignited the interest in the foundations of quantum mechanics but it has certainly contributed to the field drawing more attention and thus drawing more people. That however doesn't seem to have decreased the polarization of opinions, but rather increased it. The more attention research on quantum foundations gets, the more criticism it draws."

Because Bee is a phenomenologist, I forgive her infatuation with technology. The fact is, new theoretical examinations of quantum foundations also tell us which technologies may *not* be viable -- such as quantum computing based on superposition and quantum entanglement. The successful tests of conventional quantum theory listed in the middle paragraph above may have "got data," but it is not data that transfers to any useful technology; in fact, the data can show no more than a demonstration of what the theory assumes.

Where technology meets real-world applications, the technology has nothing to do with quantum foundations. It operates on principles of electromagnetic theory and statistical mechanics that coexist easily with an incomplete mathematical framework of quantum theory.

The prize, where quantum foundations makes a difference to technology, is information-theoretic. And that's where it has to be mathematically complete -- like the classical physics of relativity. Already we see quantum computing approaches that do not require entanglement making headway, like the D-Wave program. Quantum discord also does not depend on entanglement.

Tom

Hi Tom,

Another problem quantum theory has is that, I believe, all its results are classical. It's a duality situation for sure: one does not exist without the other. If the folks trying to build quantum computers stop trying to use entanglement and instead use what Joy has discovered, they might be able to do something better than ordinary computers, since there is a difference between the probabilities. The D-Wave might actually be amenable to that approach.

Best,

Fred

Hi Fred and Tom,

Quantum computers would almost certainly require quantum entanglement for exponential speedup. If quantum entanglement is not a fundamental feature of nature, then that is almost certainly very bad news for exponential speedup, and hence for quantum computers.

Now within my local-realistic framework there is no entanglement of any kind. All physical systems, classical or quantum, are described by intrinsically factorizable measurement results and classical (albeit nonlinear) probabilities, as discussed in the appendix of the attached paper. This goes against the deeply held belief that certain probabilities and associated correlations cannot exist without quantum entanglement. But I have *derived* both quantum probabilities and quantum correlations without any kind of entanglement, as shown in the charts below. Therein lie the origins of the hostility and disdain against my work and me personally. If I am right, then many people are wrong. It is much easier to discredit me personally than to try to understand what I have found.

Best,

Joy

Image 1

Image 2

Attachment #1: 19_whither.pdf


    Quite right, Joy. The high vulnerability of quantum entanglement to decoherence should have been a clue long ago that, were entanglement a real physical phenomenon, it should exert a physical effect on neighboring states -- I mean that if superposition is a protected state, it should interfere with the wave function of its environment rather than remaining semi-stable with respect to the environment.

    Nature is not fragile. Computation is performed every moment on every hardware substrate without regard to special conditions. We have spent so much effort learning how to guard against nature's propensity for decoherence and thermal equilibrium that we have forgotten how nature uses decoherence to create new forms. Classical probability is a strong manifestation of this principle, because for every question answered (yes/no) there is a countable infinity of questions that remain locally asymmetric to the result; however, even with infinite separation between the two possible answers, evolution of the state is guaranteed unitary with no further assumptions. "All physics is local."

    Computing by quantum entanglement assumes that infinities are tamed in the superposition of states, with the further assumption of nonlocality. I have you to thank, Joy, for making it clear to me that these assumptions are superfluous and illusory.

    And it isn't that I don't think that computation can be exponentially faster -- distributed and parallel systems of computing that mimic nature's nonlinear way of converting infinite possibilities to local probabilities are identical to a simply connected topological network.

    All best,

    Tom

    Still hoping that this thread will attract interactive exchange with those who know something about the foundations of quantum theory ...

    Lucien Hardy has asked a poignant question: " ... could a 19th century theorist have developed quantum theory without access to the empirical data that later became available to his 20th century descendants?"

    Hardy's answer is an axiomatic treatment of quantum theory that peels back the veil of Hilbert space formalism, to bare the skeleton of essential assumptions:

    "Axiom 1 *Probabilities.* Relative frequencies (measured by taking the proportion of times a particular outcome is observed) tend to the same value (which we call the probability) for any case where a given measurement is performed on an ensemble of n systems prepared by some given preparation in the limit as n becomes infinite.

    "Axiom 2 *Simplicity.* K is determined by a function of N (i.e. K = K(N)) where N = 1, 2, . . . and where, for each given N, K takes the minimum value consistent with the axioms.

    "Axiom 3 *Subspaces.* A system whose state is constrained to belong to an M dimensional subspace (i.e. have support on only M of a set of N possible distinguishable states) behaves like a system of dimension M.

    "Axiom 4 *Composite systems.* A composite system consisting of subsystems A and B satisfies N = N_A N_B and K = K_A K_B

    "Axiom 5 *Continuity.* There exists a continuous reversible transformation on a system between any two pure states of that system."

    One is compelled to see the beauty in this formalization, if one has a mathematical soul. For the consistency of calculation to meet the consistency of observation, there needs to be continuity between the mathematical model and the physical result, without sacrificing the independence between the model's formal language and the physical manifestation. So it is with elegant understatement that Hardy adds:

    "The first four axioms are consistent with classical probability theory but the fifth is not (unless the word 'continuous' is dropped). If the last axiom is dropped then, because of the simplicity axiom, we obtain classical probability theory (with K = N) instead of quantum theory (with K = N2). It is very striking that we have here a set of axioms for quantum theory which have the property that if a single word is removed -- namely the word 'continuous' in Axiom 5 -- then we obtain classical probability theory instead."

    My personal investment in researching the mathematical physics of continuous functions goes back about 10 years. I think this recent preliminary result, which I am in the process of formalizing, supports the case that reversible transformations of pure states are native to the plane. There can hardly be a purer quantum arithmetic state than a pair of distinct odd prime integers.

    Tom

    Some day I may get the link right on the first try, but don't count on it.

    (My apologies. I wish there were some way to verify before posting. If this one doesn't work, then use the one in the post which started this thread.)

    "Still hoping that this thread will attract interactive exchange with those who know something about the foundations of quantum theory ..."

    OK Tom, I'll bite. Hardy's Abstract states that:

    "This work provides some insight into the reasons why quantum theory is the way it is. For example, it explains the need for complex numbers..."

    Elsewhere in these discussions, I have pointed out that complex numbers, arising from the introduction of Fourier Transforms into QM, are not in fact NEEDED at all. They are merely sufficient. The unmeasurable phase of the Fourier Transform is what provides the continuous transform between "pure states" (Axiom 5). But that is also unnecessary, since the Fourier Transform itself is unnecessary.

    The Fourier Transform, combined with the summation of the squared real and imaginary parts, is algebraically identical to a filter-bank that computes classical probabilities by merely counting discrete detections. In other words, it forms a histogram, which is why probabilities appear at all.
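
    Here is a minimal numerical check of that algebraic identity (my own sketch, with an arbitrary test signal): the squared-magnitude DFT bins coincide exactly with the squared-and-summed outputs of a bank of cosine/sine correlators -- a set of bins, not a measurement of any single quantity.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.standard_normal(64)               # any real-valued test signal
        n = np.arange(len(x))

        fft_power = np.abs(np.fft.fft(x))**2      # squared real plus squared imaginary parts

        filterbank_power = np.empty(len(x))
        for k in range(len(x)):
            c = np.dot(x, np.cos(2 * np.pi * k * n / len(x)))    # in-phase correlator
            s = np.dot(x, np.sin(2 * np.pi * k * n / len(x)))    # quadrature correlator
            filterbank_power[k] = c**2 + s**2

        print(np.allclose(fft_power, filterbank_power))          # True: identical bin contents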

    Let me restate this more bluntly:

    Histograms are used to "measure" probability distributions.

    Physicists have unwittingly fabricated a mathematical structure for quantum theory that is identical to a histogram. Consequently, the theory only produces descriptions of probability distributions of measurements, rather than specific measurements, as in classical theories, which do not construct histograms.

    Consider a purely classical analogy:

    If you created a frequency-modulated radio signal with discrete states (frequency shift keying) and then attempted to characterize those states via Fourier Transforms, you would encounter all the same problems that physicists have encountered in QM. That is why transforms are not used to "measure" such signals. Instead, a mathematical structure that actually measures the frequency, rather than the probability (histogram) of the frequency, is used to characterize the states and thus demodulate the signal.
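
    As a toy illustration of measuring the frequency instead of binning it (my own sketch, with made-up values for the sample rate, baud rate and tone frequencies), the following counts zero crossings in each symbol interval and recovers the transmitted bits without constructing any histogram:

        import numpy as np

        fs = 8000.0                     # sample rate in Hz (assumed)
        baud = 100                      # symbols per second (assumed)
        f0, f1 = 500.0, 1000.0          # FSK tone frequencies for bits 0 and 1 (assumed)
        bits = [1, 0, 1, 1, 0, 0, 1, 0]

        samples_per_symbol = int(fs / baud)
        t = np.arange(samples_per_symbol) / fs
        signal = np.concatenate([np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits])

        decoded = []
        for i in range(len(bits)):
            chunk = signal[i * samples_per_symbol:(i + 1) * samples_per_symbol]
            # Direct frequency measurement: count sign changes (zero crossings).
            crossings = np.count_nonzero(np.signbit(chunk[1:]) != np.signbit(chunk[:-1]))
            freq_estimate = crossings * fs / (2 * len(chunk))
            decoded.append(1 if freq_estimate > (f0 + f1) / 2 else 0)

        print(decoded == bits)          # True: states recovered by measuring, not binning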

    Rob McEachern

    Hi Joy,

    For sure there wouldn't be the exponential speedup many are dreaming of. But there may be a little bit of a gain from leveraging quantum-like probabilities based on your model. I guess that, given your model, they shouldn't be called "quantum" probabilities any more. So what to call them?

    Best,

    Fred

    Hi Fred,

    I agree. The probabilistic predictions of my model and those of quantum mechanics are exactly the same, but without the unnecessary baggage of "entanglement."

    For the probabilities I was going to suggest the name "non-commutative probabilities", but that too seems misleading. If we look at the product (A33) of my paper that leads to the "quantum" probabilities in the EPR case, then it is clear that non-commutativity does not play any role in the derivation of the probabilities. We can reverse the order of the product in (A33) and still get the same probabilities, even though the direction of the bivector in the product-quaternion on the LHS of (A33) would be different. Perhaps "probabilities due to non-commutativity" is more appropriate (because, after all, the quaternions that make up the 3-sphere do not commute), but that name seems too much of a mouthful.
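
    The quaternionic fact being relied on here can be checked generically (this sketch is mine and does not reproduce Eq. (A33) itself): for any two quaternions, the scalar part of their product is independent of the order of multiplication, while the vector (bivector) part is not. So if the probabilities descend from the order-independent part, reversing the product leaves them untouched.

        import numpy as np

        def qmul(p, q):
            # Hamilton product; a quaternion is stored as (scalar, x, y, z).
            a1, b1, c1, d1 = p
            a2, b2, c2, d2 = q
            return np.array([
                a1*a2 - b1*b2 - c1*c2 - d1*d2,
                a1*b2 + b1*a2 + c1*d2 - d1*c2,
                a1*c2 - b1*d2 + c1*a2 + d1*b2,
                a1*d2 + b1*c2 - c1*b2 + d1*a2,
            ])

        rng = np.random.default_rng(1)
        p, q = rng.standard_normal(4), rng.standard_normal(4)

        pq, qp = qmul(p, q), qmul(q, p)
        print(np.isclose(pq[0], qp[0]))        # True: scalar parts agree
        print(np.allclose(pq[1:], qp[1:]))     # False: bivector parts point differently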

    Best,

    Joy

    Hi Fred, Joy,

    What's wrong with "classical probability?" A quantum, whether it applies to a microscopic particle or a universe, is a probable result of a binary choice. It's only noncommutative following measurement. It's otherwise reversible in principle, as an element of a continuous function.

    In other words, take the classical case of a planetary orbit -- the trajectory is reversible before the initial condition prescribes it. That doesn't mean the two possible orbits are in superposition. The simple beauty of Joy's framework captured my imagination from the start, because it depends on only two fundamental principles: the topological space and the initial condition. The complete measurement function follows.

    All best,

    Tom

    Tom,

    I too prefer the name "classical probabilities", for that is what they truly are within my framework. But in the literature following Bell one means something different by "classical probabilities." They are not supposed to be able to produce correlations as strong as, say, the EPR-Bohm correlations. So a different name is surely called for. I suppose one can contrast the name "weak probabilities" with the EPR-Bohm-like "strong probabilities", both being purely "classical" within my framework.

    Best,

    Joy

    Hi Rob,

    Really glad to see you here. I appreciate that you're one who can always be counted on to wade into the foundational details.

    The Fourier transform is popular, I expect, because it makes a lot of calculation easier. In fact, the same applies to the complex plane in general -- I expect that some quantum theorists might take the Hilbert space, and the quantum theory formalisms associated with it, as something special and perhaps even physical, but no mathematician is likely to make that mistake.

    The algebraically closed property of the complex plane, however, *is* important to any spacetime geometry -- because we can get nonlinear functions from fewer assumptions and still recover the real-valued numbers that you demand for real physical results. Personally, I find that the primary importance of analysis to physical applications lies in the realization that, constructively, all real functions of a real-valued variable are continuous. This is critical to any constructive theory of complete measurement functions, such as Joy Christian's, whether the theory applies to physics or is only mathematical. There cannot be any representation of a probability space that creates a gap between everywhere simply connected points.

    So I strongly agree with you that -- as you imply -- *if* probability measure is a *foundational assumption* of how nature works, then getting rid of complex numbers will leave only classical probability.

    And that's what Lucien Hardy is getting at, too -- he has enumerated five axioms, of which four incorporate both classical and quantum probability, and one which obviates continuous-function classical physics at the foundational level.

    You quote from Hardy's abstract:

    "This work provides some insight into the reasons why quantum theory is the way it is. For example, it explains the need for complex numbers..."

    And after arguing that quantum probability measures do not require Fourier transforms and therefore complex analysis, you say:

    "Let me restate this more bluntly:

    "Histograms are used to 'measure' probability distributions.

    "Physicists have unwittingly fabricated a mathematical structure for quantum theory, that is identical to a histogram. Consequently, the theory only produces descriptions of probability distributions of measurements, rather than specific measurements, as in classical theories, which do not construct histograms."

    I agree! It is only by the sum of histories and normalization that one recovers unitarity, in order to make quantum results coherent and mathematically compatible with observed outcomes.

    In lecture notes on the first law of thermodynamics the author is careful to point out right away that "The value of a state function is independent of the history of the system." A continuous change of state (measurement function continuous from the initial condition) cannot be cumulative when there is no probability measure to normalize; the usefulness of a histogram in this case is limited to showing that unitary evolution is scale invariant -- that is, by both classical and quantum predictions, correlated values are independent of the time at which they were measured. The difference between the probabilistic measure and the continuous-function measure is that by assuming probability on a measure space and normalizing it, one gets only what one assumes true. The continuous measurement function (Joy's) gets the true result by a frequentist statistical analysis, independent of assuming some probability on the closed interval [0,1].
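
    To show what the frequentist analysis amounts to in the simplest possible terms, here is a generic sketch (mine, and not a simulation of Joy's model): the correlation is estimated purely by counting, as the average of products of recorded +/-1 outcomes, with no probability measure on [0,1] assumed anywhere.

        import numpy as np

        def correlation(a_outcomes, b_outcomes):
            # Frequentist estimate of E(a, b): relative frequency of agreements
            # minus disagreements among the paired +/-1 outcomes.
            a, b = np.asarray(a_outcomes), np.asarray(b_outcomes)
            assert np.all(np.abs(a) == 1) and np.all(np.abs(b) == 1)
            return np.mean(a * b)

        # Aggregated fair coin tosses at the two stations, independent of each other:
        rng = np.random.default_rng(2)
        a = rng.choice([-1, 1], size=100000)
        b = rng.choice([-1, 1], size=100000)
        print(correlation(a, b))               # hovers near 0 for independent tosses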

    All best,

    Tom

    Joy,

    Dang, it's hard enough to explain a measurement function continuous from the initial condition without having to invent new terms for probability! :-) Sigh.

    I guess that until one understands topology enough to know the difference between simply connected, and disconnected and multiply connected spaces, it's going to be an uphill struggle. Heck, we haven't even gotten as far as critics' accepting that your framework contains no probabilistic measure space -- the statistical analysis is all based on a frequentist interpretation of aggregated random coin tosses. To me, that's what "classical probability" means, but I'm no expert in the literature.

    Best,

    Tom

    Interestingly, none of the detractors of my work is among the invited speakers.

    Tom,

    "In all previous simulations that were proposed to fail, that I saw -- there was no randomized input. The critics simply did not seem to understand that the arbitrary choice of vector by the programmer cannot be counted as the initial condition of a function continuous from the initial condition; they create a dependent condition based on the experimenter's choice -- and then when they don't get Joy's predicted correlations on their assumed probability space -- declare that something is wrong with the mathematics, rather than with their own assumptions about the initial condition and a probability space that isn't there in the first place. They don't grasp how Joy has left the choice of initial condition to nature and taken the experimenter out of it."

    This was very illuminating. Thank you for posting it.

    James Putnam

    I love Vienna, and the university. My wife and I were there summer of 2002 for the Karl Popper centenary. My paper fell flat, though I was so dazzled by the famous scholars in attendance, that it didn't matter to me all that much.

    There was a heat wave in Europe, and ice cold beers from the little taprooms lining the streets were a refreshing delight!

    I wonder if Vienna will be hot again this year. :-)

    Tom