This is starting to make sense, Rob...

The detection process has incorporated a kind of hysteresis effect, where it gets stuck in place once it assumes one value or another. I will examine the 'One time pad' reference and comment further later.

All the Best,

Jonathan

Jonathan,

After reading about the one time pad, read about double and triple Stern-Gerlach experiments (attached below), where measurements are either repeated with A PRIORI known phase angles, or not. Then, it should do more than just start to make sense. In the former case, you recover the one and only bit value that is present within the object being observed. In the latter, you observe the result of measuring only the random noise that is also present within the object being observed, since the detection process produces a "no signal" output when the detector is orthogonal to the polarization axis.

Rob McEachern

Attachment #1: Stern-Gerlach_experiments.jpg

Hi Rob

Treating the quantum correlation as a problem in communications and signal processing is a novel approach, which has clearly allowed you to produce some interesting results. Signal, noise and bit rate, related through Shannon's theorem, are fundamental to the concept that quantum correlations are associated with processes that provide one bit of information per sample.

In your coin image model, the "ocean of noise" is an essential consideration, which seems to be missing from quantum mechanics, aside from when noise is studied explicitly. Far from being a nuisance, noise of this sort is more like a resource.

I would be suspicious of some system that could produce a detection rate close to the maximum you calculated - it would have to approach perfection in making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched. That would be spooky!

Apart from the larger question you ask, classical models like yours are being used for simulating quantum computation. I tried to speed up your coin model by reducing it to a signal vector plus a noise vector, but that likely involved some over-simplification.

The noisy vector model is simple enough to allow a probabilistic treatment of threshold crossings, thus avoiding time-consuming Monte Carlo trials. Some notes on the vector model have been posted at sites.google.com/site/quantcorr. There is also a file with C functions for calculating correlations based on the "geometric probability" of the noise vector crossing a threshold.
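
To give a flavour of what I mean, here is a minimal sketch (my own toy construction, not the posted C code; it assumes Gaussian noise along the measurement axis, and the signal amplitude, noise level and threshold are placeholders):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def crossing_probability(delta, signal=1.0, sigma=0.3, threshold=0.5):
    """Analytic probability that the signal-plus-noise projection exceeds the
    threshold (a Gaussian stand-in for the geometric calculation in the notes)."""
    mean = signal * math.cos(delta)          # projection of the signal vector
    return 0.5 * math.erfc((threshold - mean) / (sigma * math.sqrt(2.0)))

def crossing_rate_mc(delta, signal=1.0, sigma=0.3, threshold=0.5, trials=200_000):
    """Monte Carlo estimate of the same probability, for comparison."""
    samples = signal * math.cos(delta) + rng.normal(0.0, sigma, trials)
    return float(np.mean(samples > threshold))

delta = math.radians(30.0)                   # angle between coin/vector and detector
print(crossing_probability(delta))           # closed form, no trials needed
print(crossing_rate_mc(delta))               # agrees closely with the analytic value
```

The point is just that one closed-form evaluation replaces a large number of Monte Carlo trials.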

The classical concepts you employ seem so powerful, and reasonable, that I tend to agree that there must be some deep involvement in quantum phenomena which has been overlooked.

Colin

Colin,

"Treating the quantum correlation as a problem in communications and signal processing is a novel approach..." all too true, unfortunately, even though Information Theory is now 70 years old. In My 2012 FQXi essay, I noted that "In one of the great scientific tragedies of the past century, "Modern Physics" was developed long before the development of Information Theory."

"it would have to approach perfection..." That is what error detection and correction coding is all about. Shannon proved (at least in the case of a multi-bit signal) that it should always be possible to create such a code, resulting in perfect detection, right up to the Shannon limit. The final generation of telephone modems (before they became obsolete) came pretty close.

"...making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched..." In another context, this is exactly what raised-cosine filters and square-root-raised-cosine filters are all about: placing nulls at discrete points, to completely eliminate unwanted components, while completely preserving the desired fundamental, and without requiring an impossibly sharp filter cut-off.

"there must be some deep involvement in quantum phenomena which has been overlooked" Exactly. I believe the fact that Shannon's very definition of a "single bit of information" turns out to be the Heisenberg Uncertainty Principle, is that overlooked item: you cannot make multiple, independent measurements, on a single bit of information.

A second thing that has been overlooked is that:

The "squared" Fourier transforms at the heart of the mathematical description of every wave-function, are equivalent to the mathematical description of a histogram process, which is why the process yields probability estimates (the Born Rule) - regardless of the nature (particle or wave) of the entities being histogrammed. In other words, the math only describes the histogramming of observed events, not the nature of the entities causing the events, as has been assumed, in the standard interpretations of quantum theory. When you compute a Power Spectrum, you get the total energy accumulated in each "bin". And when the input arrives in discrete quanta, dividing the MEASURED energy in each bin, by the energy/quanta, enables one to INFER the number of received quanta in each bin. There is no INTERFERENCE pattern. Rather, there is an INFERENCE pattern. And if you compute the Power Spectrum of a double (or single, or triple...) slit's geometry, you get the famous INFERENCE pattern: independent of either quantum or classical physics. Particles/waves striking the slits merely act like radio-frequency carriers. All the information content within the INFERENCE pattern, is spatially modulated onto those carriers, by the slit geometry. In other words, the pattern is a property of the slits themselves, not the things passing through the slits.

A third and related overlooked item is that QM only describes the statistics of DETECTED entities. It does not describe undetected entities at all. That is why it is unitary: probabilities will always add to unity when you only compare them to observed counts that have been normalized by the number of DETECTED entities.

Rob McEachern

5 days later

Rob,

For the first time, I can confirm that a viXra paper has content at least as good as the average arXiv or Nature Communications paper, although it just summarizes what you already told us on FQXi.

I am not sure how to better reach those who are not familiar with Shannon and his anti-block-universe opinion.

I see similar or possibly even related hurdles in the case of the two notions of infinity: the logical Galilean one and the pragmatic Leibniz/Bernoulli one. The former is absolute, without a reference; the latter is relative: "larger than any reference".

Your hint about the squared FT seems to confirm my insight that the cosine transform yields the same result as the FT, except for an arbitrarily chosen phase.

++++

2 months later
7 days later

Hi Rob,

"Schroedinger's coin" is an interesting take on the famous cat experiment, given your hypothesis that no more than one bit of Shannon information can be obtained from sampling a quantum process. The 2-dimensional nature of your coin simulation (or the vector model) likely restricts consideration of your idea to 2-d simulations or photon-based experiments. It seems that particles like electrons with Pauli spin matrices would require an extension to quaternions to account fully for their 3-dimensional spin, although I would guess that some experiments could demonstrate quantum correlation.

Your view on interference patterns (discussed previously in this thread) is somewhat in line with that of de Broglie (1926), who found equations for "pilot waves" to guide particles/waves - equations involving the geometry relevant to the experiment. For example, de Broglie could account for the diffraction pattern observed from passing particles/waves through a circular hole. The Wikipedia entry on "quantum potential", a term introduced later by Bohm (1952), is relevant. Subsequently, Bohm and Hiley (1979) reformulated the concept as (Fisher) information potential. That same year, the two-slit experiment was explained in terms of Bohmian trajectories. So there is a version of QM that treats interference something like what you might expect.

The Wiki on quantum potential has a section, "Quantum potential as energy of internal motion associated with spin", which implies that the quantum potential may be energy of some sort. De Broglie thought of all matter and radiation as being composed of tiny "particles". The uncertainty principle dictates that the smallest quantities in terms of energy and momentum require the greatest extent in time and space; thus de Broglie's vision of tiny particles could also be described as waves comprising a quantum field.

Colin

Colin,

"The 2-dimensional nature of your coin simulation (or the vector model) likely restricts consideration of your idea to 2-d simulations or photon-based experiments."

Actually, there are too many degrees of freedom with 2 dimensions. Rob claims 1 bit of information recovered from each measurement; 1 dimension is sufficient. One records heads (H) at one observation, tails (T) at a later observation: a qubit, which requires an interval of time.

H ≠ T. This is an irreducible level of superposition.

    Tom,

    "Rob claims 1 bit of information recovered from each measurement" No. I claim that there is only 1 bit of information that can ever be recovered from ANY set of measurements, that obeys the limiting case of the Heisenberg Uncertainty principle; that is why both position and momentum cannot be measured.

    Also, in Bell tests, no "interval of time" is required: both observers can perform their individual measurement, on their individual member of an entangled pair, simultaneously.

    Rob McEachern

    Rob,

    I stand corrected.

    As far as Bell tests go, with two independent observers there absolutely is an interval of time between them. Entanglement is a convenient fiction.

    Colin,

    The determination of the "heads vs. tails" of an ordinary coin is a 3-D problem. I deliberately made the simulation a 2-D simulation of polarity, to greatly reduce computing requirements and to make it directly analogous to polarity measurements made on photons.

    I have been working on a "slideshow" to explain how the misinterpretation of quantum theory began (with de Broglie's "association" of a wave with a particle) and why it persists. I have not finished it, but your post inspired me to post it on viXra so we could discuss it; the file is too large to post here. I'll let you know when it appears.

    If you do not already have a copy of David Bohm's book, "Quantum Theory", I would urge you to get one; I am using it to provide context for my own reinterpretation of QM.

    Rob McEachern

    Hi Tom,

    When it comes down to it, the projection of the coin (or vector) onto Alice's or Bob's rotated instrument reduces each measurement to a single dimension, which is determined by Alice or Bob when they set the angle on their instruments. This measurement is then compared to a fixed threshold to determine whether the polarity is positive, negative, or undetectable.
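
    A minimal sketch of the decision step I have in mind (my own construction; the noise level and threshold are placeholders, not the values used in Rob's simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(coin_angle, detector_angle, noise_std=0.3, threshold=0.5):
    """Project a noisy 'coin' onto the chosen detector axis, then make a ternary decision."""
    projection = np.cos(coin_angle - detector_angle) + rng.normal(0.0, noise_std)
    if projection > threshold:
        return +1      # positive polarity
    if projection < -threshold:
        return -1      # negative polarity
    return 0           # below the threshold: no detection
```

    Each observer's angle choice fixes the single axis onto which everything is projected, which is the sense in which each measurement is one-dimensional.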

    What I had in mind is that modeling electron spin requires quantum probabilities and, as you say, qubits to determine the state of electron spin. But the decision process ought to be similar, comparing a probability to a threshold to make a decision on the state of polarity.

    Colin

    From your slideshow: "Quantized Observations result from observations having: a small Information content not a small physical size of the object being observed!" You make an important point. Observations of quantum processes are not about small scale, but limited information. In a previous FQXi contest essay, I argued that the cosmological redshift is compatible with tired light under the assumption that quantum uncertainty extends to cosmological scale, with a complementary relationship between distant time dilation at the source of light, and energy lost in transit noted at the local light receptor. Hopefully, that assumption seems a little less outrageous now.

    Your discussion of Maxwell's demon is also relevant. Gravity is an obvious suspect: it is the force that makes hot air rise to the ceiling of a room, and cold air fall to the floor, without a partition between the hot and cold regions.

    There seems to be a typo at the top of page 30. "Shannon discovered the *maximum* number of bits of information, required to perfectly reconstruct an arbitrary curve" should be *minimum*.

    Fisher's measure of information looks like it was made for hidden variable problems. Wikipedia says it is "the amount of information that an observable random variable X carries about an unknown parameter of a distribution that models X."

    Bohm's "Quantum Theory", published in 1951, sounds like a classic and I could use an introduction to his work. I will get that book. I wonder how it compares to "The Undivided Universe" (1993) written with Hiley and published four decades later.

    Colin

    In regard to the "minimum" versus the "maximum" on page 30: they become equal at the limit, which is why I "put it another way" in the middle of the page. I can see that the language is a bit ambiguous. The point I was trying to make is that additional measurements, of supposed additional components, will not yield more information - the maximum amount has already been attained by the previous measurements; consequently, all subsequent measurements must be correlated with the first.

    Maxwell's demon is precisely a pre-quantum-theory example of a "decision making" process, like "calling" a coin, that is being misinterpreted in quantum theory as "wave-function collapse." It is all related to Shannon's insights into the nature of information.

    Bohm's book is more than just a classic; it is a treasure trove of insights into what is really going on in the quantum world (he seems to have written it as an attempt to understand the theory himself, as much as to explain it to others), and in particular, from a particle-scattering, rather than physical wave-propagation, point of view. Here is an interesting quote from the Wikipedia article on Bohm, regarding his Ph.D. research:

    "the scattering calculations (of collisions of protons and deuterons) that he had completed proved useful to the Manhattan Project and were immediately classified. Without security clearance, Bohm was denied access to his own work; not only would he be barred from defending his thesis, he was not even allowed to write his own thesis in the first place!"

    What I find interesting about his ideas on particle scattering is their connection to things like the double-slit experiment. He showed (section 21.23) that potentials with sharp edges will produce oscillatory scattering cross-sections: AKA interference patterns.

    The one thing that he could not figure out is why quantum scattering, unlike classical scattering, is such that "the deflection process is described as a single indivisible transition", rather than being like the continuous deflection of a mass in a gravitational field. He remarks upon this fact in several places.

    The missing answer is an information-mediated decision process, like Maxwell's Demon: if the field cannot "detect" a particle (recover a single bit of information) and the particle cannot detect the field (think of the symmetric interaction of identical particles), then there can never be any interaction whatsoever, except at the points at which it *is* possible to recover such a bit. So quantum particles scatter off ripples in the field within the slits, just like bullets scattering off one specific point on a rippled steel plate, thereby producing an "interference" pattern - with no wave propagation required.

    Rob McEachern

    Thanks for explaining the significance of a decision process being involved in quantum scattering, as opposed to the gravitational analogy. An example like that really helps to put what you are getting at in context.

    I recall one of my high school teachers demonstrating the double slit experiment using microscope slides spray-painted black, with the two slits etched onto one of the slides using a pair of back-to-back razor blades. The separation between the slits was just right for the light that was used.

    Had to chuckle - "not even allowed to write his own thesis in the first place!"

    Colin

    As usual, the decrease in system entropy is balanced by an increase in the demon's entropy from storing information about the system. (But this raises the question: could a demon built into fundamental physical processes use information about physics, as opposed to temporary information about the state of the system, to maintain a state of low entropy? That seems more like what happens with hot air rising, and cold air falling.)

    Anyway, it really looks like an elegant experiment in quantum thermodynamics.

    15 days later

    Hi Rob,

    You write 'So is "spooky action at a distance" just a grossly misunderstood classical phenomenon?'

    I have the same question. Maybe you will be interested in this article on the EPR paradox vs. Bell's inequality:

    http://file.scirp.org/pdf/JMP_2015103010590224.pdf

    It deals with a classical explanation, based on the distinction between

    - the angles of the setup (alpha, beta)

    - and the polarization measurements (a, b)

    Hi Olivier,

    I'll take a look at your paper.

    You might find my comment about Bell Tests, Schrödinger's coin, and One-Time Pads interesting in this context.

    Also, note that all Bell-type theorems and experiments only deal with the EPR-B paradox rather than the much more general, original EPR paradox. In other words, Bell only deals with David Bohm's version of the paradox, which is restricted to observables with only two observed states, like spin-up and spin-down. But the original EPR paradox deals with the Heisenberg Uncertainty Principle. Why can't CONTINUOUS variables (like position and momentum), which can take on any value in the classical realm, be simultaneously measured in the quantum realm? The fact that Shannon's Capacity, when evaluated at number-of-bits-of-information = 1, turns out to be identical to the Heisenberg Uncertainty Principle provides the answer as to why that happens in general, and not just in the restricted Bohm version of the paradox.

    Rob McEachern