Quickfire Quantum Qs
This is a place to post quick queries about quantum-related issues.
:) What is a particle?
It has recently been demonstrated that Quantum Correlations can be Produced Classically with detection efficiencies higher than supposedly possible for any non-quantum system. (Note: The paper reports double-detection efficiencies (0.72) rather than the more commonly reported conditional detection efficiencies. For the model presented, the latter is equal to the square root of the former: sqrt(0.72) = 0.85)
In the attached figure below, it can be observed that the classical and quantum curves are intimately related: the quantum correlation curve is simply the scaled first harmonic in the Fourier series defining the classical correlation curve. This relationship is purely mathematical, independent of either quantum or classical physics. The classical curve consists of a series of discrete, odd harmonics with rapidly diminishing amplitudes; the quantum correlation curve is simply the first harmonic of this series.
This suggests that the quantum correlation curve is obtained by a process that merely filters out (fails to detect) the particle-pairs that contribute to the upper harmonics in the classical curve, and then re-normalizes (scales) the result to make the correlation peaks equal to +/- 1. Such re-normalization is "built into" the expression for computing the quantum correlations: the denominator is set equal to the number of particle-pairs detected.
Since the peak of the classical curve is pi/2 and the peak of the first harmonic is 4/pi, this suggests that the re-normalization, corresponding to the double detection efficiency, must equal the ratio (4/pi)/(pi/2) = 8/pi^2 = 0.81, yielding a conditional detection efficiency of 0.90. In other words, a classical process should be capable of perfectly duplicating the quantum correlation curve with detection efficiencies of 90%, higher than anything reported in supposedly "loophole-free" Bell-inequality tests.
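For anyone who wants to check the arithmetic, here is a minimal sketch in C. It treats the classical curve as the standard triangular correlation with peak pi/2 - a simplified stand-in for the figure, not a reproduction of the paper's full simulation:

#include <stdio.h>
#include <math.h>

#define PI 3.14159265358979323846

int main(void) {
    /* Renormalization ratio quoted above: (4/pi)/(pi/2) = 8/pi^2 */
    double double_eff = (4.0 / PI) / (PI / 2.0);
    printf("double detection efficiency      = %.2f\n", double_eff);       /* 0.81 */
    printf("conditional detection efficiency = %.2f\n", sqrt(double_eff)); /* 0.90 */

    /* Assumed classical curve: triangular correlation with peak pi/2, whose
       Fourier series contains only odd harmonics, (4/pi) cos(n a) / n^2.
       The n = 1 term alone, rescaled to peak +/- 1, is the cosine curve. */
    for (int k = 0; k <= 8; k++) {
        double a = k * PI / 8.0;
        double classical = 0.0;
        for (int n = 1; n <= 99; n += 2)
            classical += (4.0 / PI) * cos(n * a) / (double)(n * n);
        double first = (4.0 / PI) * cos(a);
        printf("a=%5.2f  classical=%+6.3f  first harmonic=%+6.3f  rescaled=%+6.3f\n",
               a, classical, first, first / (4.0 / PI));
    }
    return 0;
}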
So is "spooky action at a distance" just a grossly misunderstood classical phenomenon?
Rob McEachern
Attachment #1: Classical_and_Quantum_Correlation_Curves_2.jpg
Thanks Rob..
For taking us to the other side of the (quantum) mirror.
All the Best,
Jonathan
This would suggest that..
What we are seeing as quantumness is simply nature's truncation of (or failure to represent and/or propagate) the higher harmonics of the (Classical) variational waveform via microscale dynamics.
All the Best,
Jonathan
Jonathan,
Actually, what it suggests is that nature's "identical particles" have exactly the same interaction behavior that identical submarines attempting to detect each other have, and that this results in behaviors identical to "quantum tunneling" and "virtual particles".
If they cannot detect each other's existence, in an ocean of noise, then they can sail (tunnel) right past each other as though the other does not even exist, with no interaction whatsoever. But when they do detect each other, they sound general quarters, "ALL AHEAD FULL! DIVE! DIVE DIVE!" and make such a disturbance that even a distant destroyer (observer) on the surface can detect the sudden appearance of the formerly undetectable, "virtual" subs. But if the subs subsequently lose contact (the ability to detect a single bit of information) with each other, then they return to running silent, running deep (not interacting), and they disappear, back into the ocean of noise from which they first materialized; and the distant observer is left to wonder if they were ever really there.
Rob McEachern
That's a cool image..
Silent running and non-interacting until spotted. Hmm. I must think on this.
Regards, JJD
Jonathan,
It all results from behaviors being driven by a single bit of information: unlike more familiar, classical interactions, it is all or nothing. If the required bit cannot be detected, then it triggers no response whatsoever. But if it is detected, it triggers an a priori established response. It must be a priori, since a suitable behavior cannot be deduced from a single bit.
While you are thinking on this, think also about a One Time Pad in which each bit is manifested as one of the coins (matched filters) described in the above paper: each coin in the message sequence must be matched with another in the pad, at the exact A PRIORI KNOWN phase angle, in order to recover the underlying message with few, if any, bit errors.
It is foolhardy to attempt to recover the underlying message by using random coin phase angles, since that will result in large numbers of bit-errors. But that is exactly what Bell-type tests do; leaving physicists to wonder why they cannot decode the underlying significance of their own experiments. Spooky ERRORS at a distance.
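A rough sketch in C of the effect, using a deliberately simplified stand-in for the coins (one noisy cosine sample per bit instead of the paper's full 2-D coin images; the noise level SIGMA is just an illustrative value): decoding with the a priori known phase yields almost no bit errors, while decoding with random phases yields roughly 50% errors - pure guessing.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define PI     3.14159265358979323846
#define NBITS  10000
#define SIGMA  0.3     /* assumed noise level, for illustration only */

/* crude zero-mean, unit-variance noise (sum of 12 uniforms) */
static double noise(void) {
    double s = 0.0;
    for (int i = 0; i < 12; i++) s += (double)rand() / RAND_MAX;
    return s - 6.0;
}

/* decode one bit from a noisy sample taken at phase offset delta */
static int decode(int bit, double delta) {
    double sample = bit * cos(delta) + SIGMA * noise();
    return (sample >= 0.0) ? +1 : -1;
}

int main(void) {
    int err_known = 0, err_random = 0;
    srand(1);
    for (int i = 0; i < NBITS; i++) {
        int bit = (rand() & 1) ? +1 : -1;
        /* matched: the a priori known phase (0) is used to decode */
        if (decode(bit, 0.0) != bit) err_known++;
        /* unmatched: a random phase is used instead */
        double delta = 2.0 * PI * rand() / RAND_MAX;
        if (decode(bit, delta) != bit) err_random++;
    }
    printf("bit errors, known phase : %d / %d\n", err_known,  NBITS);
    printf("bit errors, random phase: %d / %d\n", err_random, NBITS);
    return 0;
}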
Rob McEachern
This is starting to make sense Rob..
The detection process has incorporated a kind of hysteresis effect, where it gets stuck in place once it assumes one value or another. I will examine the 'One time pad' reference and comment further later.
All the Best,
Jonathan
Jonathan,
After reading about the one time pad, read about double and triple Stern-Gerlach experiments (attached below), where measurements are either repeated with A PRIORI known phase angles, or not. Then it should do more than just start to make sense. In the former case, you recover the one and only bit value that is present within the object being observed. In the latter, you observe the result of measuring only the random noise that is also present within the object being observed, since the detection process produces "no signal" output when the detector is orthogonal to the polarization axis.
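To make the contrast concrete, here is a small sketch in C using the standard textbook cos^2(theta/2) rule for sequential spin-1/2 Stern-Gerlach measurements (the textbook rule, not my coin model): repeating the measurement at the a priori known angle always returns the same bit, while an orthogonal measurement returns nothing but a coin flip.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define PI      3.14159265358979323846
#define TRIALS  100000

/* Textbook spin-1/2 rule: after selecting "up" along the first magnet's
   axis, the chance of "up" along a second magnet rotated by theta is
   cos^2(theta/2).  Used here only for comparison. */
static int second_measurement(double theta) {
    double p_up = cos(theta / 2.0) * cos(theta / 2.0);
    return ((double)rand() / RAND_MAX < p_up) ? +1 : -1;
}

int main(void) {
    double angles_deg[] = { 0.0, 90.0, 180.0 };
    srand(1);
    for (int k = 0; k < 3; k++) {
        double theta = angles_deg[k] * PI / 180.0;
        int up = 0;
        for (int i = 0; i < TRIALS; i++)
            if (second_measurement(theta) == +1) up++;
        printf("second magnet at %5.1f deg: fraction 'up' = %.3f\n",
               angles_deg[k], (double)up / TRIALS);
    }
    /* 0 deg -> 1.000 (the bit is recovered); 90 deg -> ~0.5 (pure noise) */
    return 0;
}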
Rob McEachern
Attachment #1: Stern-Gerlach_experiments.jpg
Hi Rob
Treating the quantum correlation as a problem in communications and signal processing is a novel approach, which has clearly allowed you to produce some interesting results. Signal, noise and bit rate, related through Shannon's theorem, are fundamental to the concept that quantum correlations are associated with processes that provide one bit of information per sample.
In your coin image model, the "ocean of noise" is an essential consideration, which seems to be missing from quantum mechanics, aside from when noise is studied explicitly. Far from being a nuisance, noise of this sort is more like a resource.
I would be suspicious of some system that could produce a detection rate close to the maximum you calculated - it would have to approach perfection in making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched. That would be spooky!
Apart from the larger question you ask, classical models like yours are being used for simulating quantum computation. I tried to speed up your coin model by reducing it to a signal vector plus a noise vector, but that likely involved some over-simplification.
The noisy vector model is simple enough to allow a probabilistic treatment of threshold crossings, thus avoiding time-consuming Monte Carlo trials. Some notes on the vector model have been posted at sites.google.com/site/quantcorr. There is also a file with C functions for calculating correlations based on the "geometric probability" of the noise vector crossing a threshold.
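For anyone who wants to experiment, a bare-bones Monte Carlo version of the vector-plus-threshold idea looks something like the following (the noise level and threshold are illustrative guesses, not the tuned values in the notes or the C functions linked above; the cosine column is printed only as a reference curve):

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define PI        3.14159265358979323846
#define TRIALS    200000
#define SIGMA     0.4      /* assumed noise amplitude   */
#define THRESHOLD 0.5      /* assumed detection cut-off */

static double gnoise(void) {             /* crude Gaussian-ish noise */
    double s = 0.0;
    for (int i = 0; i < 12; i++) s += (double)rand() / RAND_MAX;
    return s - 6.0;
}

/* Projection of the shared "signal vector" (orientation lambda) plus a
   private noise contribution onto a detector set at angle a; returns
   +1, -1, or 0 for "no detection" when below threshold. */
static int measure(double lambda, double a) {
    double proj = cos(lambda - a) + SIGMA * gnoise();
    if (fabs(proj) < THRESHOLD) return 0;
    return (proj > 0.0) ? +1 : -1;
}

int main(void) {
    srand(1);
    for (int k = 0; k <= 8; k++) {
        double b = k * PI / 8.0;          /* Bob's angle; Alice fixed at 0 */
        long sum = 0, pairs = 0;
        for (int i = 0; i < TRIALS; i++) {
            double lambda = 2.0 * PI * rand() / RAND_MAX;  /* shared, random */
            int A = measure(lambda, 0.0);
            int B = measure(lambda, b);
            if (A != 0 && B != 0) { sum += A * B; pairs++; }
        }
        printf("angle %5.2f  correlation %+.3f  (cosine: %+.3f)\n",
               b, (double)sum / pairs, cos(b));
    }
    return 0;
}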
The classical concepts you employ seem so powerful, and reasonable, I tend to agree that there must be some deep involvement in quantum phenomena which has been overlooked.
Colin
Colin,
"Treating the quantum correlation as a problem in communications and signal processing is a novel approach..." all too true, unfortunately, even though Information Theory is now 70 years old. In My 2012 FQXi essay, I noted that "In one of the great scientific tragedies of the past century, "Modern Physics" was developed long before the development of Information Theory."
"it would have to approach perfection..." That is what error detection and correction coding is all about. Shannon proved (at least in the case of a multi-bit signal) that it should always be possible to create such a code, resulting in perfect detection, right up to the Shannon limit. The final generation of telephone modems (before they became obsolete) came pretty close.
"...making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched..." In another context, this is exactly what raised-cosine filters and square-root-raised-cosine filters are all about: placing nulls at discrete points, to completely eliminate unwanted components, while completely preserving the desired fundamental, and without requiring an impossibly sharp filter cut-off.
"there must be some deep involvement in quantum phenomena which has been overlooked" Exactly. I believe the fact that Shannon's very definition of a "single bit of information" turns out to be the Heisenberg Uncertainty Principle, is that overlooked item: you cannot make multiple, independent measurements, on a single bit of information.
A second thing that has been overlooked is that:
The "squared" Fourier transforms at the heart of the mathematical description of every wave-function, are equivalent to the mathematical description of a histogram process, which is why the process yields probability estimates (the Born Rule) - regardless of the nature (particle or wave) of the entities being histogrammed. In other words, the math only describes the histogramming of observed events, not the nature of the entities causing the events, as has been assumed, in the standard interpretations of quantum theory. When you compute a Power Spectrum, you get the total energy accumulated in each "bin". And when the input arrives in discrete quanta, dividing the MEASURED energy in each bin, by the energy/quanta, enables one to INFER the number of received quanta in each bin. There is no INTERFERENCE pattern. Rather, there is an INFERENCE pattern. And if you compute the Power Spectrum of a double (or single, or triple...) slit's geometry, you get the famous INFERENCE pattern: independent of either quantum or classical physics. Particles/waves striking the slits merely act like radio-frequency carriers. All the information content within the INFERENCE pattern, is spatially modulated onto those carriers, by the slit geometry. In other words, the pattern is a property of the slits themselves, not the things passing through the slits.
A third and related overlooked item is that QM only describes the statistics of DETECTED entities. It does not describe undetected entities at all. That is why it is unitary: probabilities will always sum to unity when you compare them only to observed counts that have been normalized by the number of DETECTED entities.
Rob McEachern
Rob,
For the first time, I can confirm that a viXra paper's content is at least as good as the average arXiv or Nature Communications paper, although it just summarizes what you already told us on FQXi.
I am not sure how to better reach those who are not familiar with Shannon and his anti-block-universe opinion.
I see similar, or possibly even related, hurdles in the case of the two notions of infinity: the logical Galilean one and the pragmatic Leibniz/Bernoulli one. The former is absolute, without a reference; the latter is relative: "larger than any reference".
Your hint about the squared FT seems to confirm my insight that the cosine transform yields the same result as the FT, except for an arbitrarily chosen phase.
This discussion (posts #22 and #23) may be of interest: The connection between Bell's Inequality Theorem and Schrödinger's cat
Rob McEachern
Hi Rob,
"Schroedinger's coin" is an interesting take on the famous cat experiment, given your hypothesis that no more than one bit of Shannon information can be obtained from sampling a quantum process. The 2-dimensional nature of your coin simulation (or the vector model) likely restricts consideration of your idea to 2-d simulations or photon-based experiments. It seems that particles like electrons with Pauli spin matrices would require an extension to quaternions to account fully for their 3-dimensional spin, although I would guess that some experiments could demonstrate quantum correlation.
Your view on interference patterns (discussed previously in this thread) is somewhat in line with that of de Broglie (1926), who found equations for "pilot waves" to guide particles/waves - equations involving the geometry relevant to the experiment. For example, de Broglie could account for the diffraction pattern observed when particles/waves pass through a circular hole. The Wikipedia entry on "quantum potential", a term introduced later by Bohm (1952), is relevant. Subsequently, Bohm and Hiley (1979) reformulated the concept as a (Fisher) information potential. That same year, the two-slit experiment was explained in terms of Bohmian trajectories. So there is a version of QM that treats interference something like what you might expect.
The Wiki on quantum potential has a section, "Quantum potential as energy of internal motion associated with spin", which implies that the quantum potential may be energy of some sort. De Broglie thought of all matter and radiation as being composed of tiny "particles". The uncertainty principle dictates that the smallest quantities of energy and momentum require the greatest extent in time and space, so de Broglie's vision of tiny particles could also be described as waves comprising a quantum field.
Colin
Colin,
"The 2-dimensional nature of your coin simulation (or the vector model) likely restricts consideration of your idea to 2-d simulations or photon-based experiments."
Actually, there are too many degrees of freedom with 2 dimensions. Rob claims 1 bit of information is recovered from each measurement; 1 dimension is sufficient. One records heads (H) at one observation and tails (T) at a later observation - a qubit - which requires an interval of time.
H ≠ T. This is an irreducible level of superposition.
Tom,
"Rob claims 1 bit of information recovered from each measurement" No. I claim that there is only 1 bit of information that can ever be recovered from ANY set of measurements, that obeys the limiting case of the Heisenberg Uncertainty principle; that is why both position and momentum cannot be measured.
Also, in Bell tests, no "interval of time" is required: both observers can perform their individual measurement, on their individual member of an entangled pair, simultaneously.
Rob McEachern
Rob,
I stand corrected.
As far as Bell tests go, with two independent observers there absolutely is an interval of time between them. Entanglement is a convenient fiction.
Colin,
The determination of the "heads vs. tails" of an ordinary coin is a 3-D problem. I deliberately made the simulation a 2-D simulation of polarity, to greatly reduce computing requirements and to make it directly analogous to polarity measurements made on photons.
I have been working on a "slideshow" to explain how the misinterpretation of quantum theory began (with de Broglie's "association" of a wave with a particle) and why it persists. I have not finished it, but your post inspired me to post it on viXra so we could discuss it; the file is too large to post here. I'll let you know when it appears.
If you do not already have a copy of David Bohm's book, "Quantum Theory", I would urge you to get one; I am using it to provide context for my own reinterpretation of QM.
Rob McEachern
Hi Tom,
When it comes down to it, the projection of the coin (or vector) onto Alice's or Bob's rotated instrument reduces each measurement to a single dimension, which is determined by Alice or Bob when they set the angle on their instruments. This measurement is then compared to a fixed threshold to determine whether the polarity is positive, negative, or undetectable.
What I had in mind is that modeling electron spin requires quantum probabilities and, as you say, qubits to determine the spin state. But the decision process ought to be similar: comparing a probability to a threshold to make a decision on the state of polarity.
Colin