That paper states that "When the SG magnets are oriented the same way (case (a)), the outcomes are always the same due to conservation of spin angular momentum between the pair of particles."

But that statement assumes that the "outcomes" of every detection event are in fact correct, and thus indicative of the "true" state of the entity being measured. It has been demonstrated that such perfectly correct outcomes are not even a logical possibility, in purely classical systems, whenever the system has been constructed such that it manifests only one, single bit of information. In other words, even when the system is constructed from perfectly anti-parallel "entangled particles", the actually detected "outcomes" cannot possibly be anti-parallel in every case ("bit-errors" are inevitable in some detections), and the probability of a detection failing to be anti-parallel is a function of the misalignment between the polarization axis of the entity being measured and the axis of the measuring device.
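A minimal Monte Carlo sketch of that claim (illustrative only, not the code from the paper; the noise level, trial count, and hard sign decision are assumed for the example): a one-bit "polarized object" is projected onto a detector axis, a hard up/down decision is made, and the bit-error rate grows with the misalignment angle.

```python
import numpy as np

rng = np.random.default_rng(0)

def bit_error_rate(misalignment_deg, noise_sigma=0.3, trials=100_000):
    """Estimate how often a noisy, one-bit "polarized object" is read incorrectly.

    The object's true state is +1 along its own polarization axis.  The detector
    projects onto an axis rotated by misalignment_deg and then makes a hard sign
    decision.  The additive noise models the fact that the object carries only
    about one bit of information about its orientation.
    """
    theta = np.radians(misalignment_deg)
    projection = np.cos(theta) + noise_sigma * rng.standard_normal(trials)
    decisions = np.sign(projection)          # +1 = "up", -1 = "down"
    return np.mean(decisions != +1)          # fraction of flipped bits

for angle in (0, 30, 60, 85, 90):
    print(f"{angle:3d} deg misalignment -> bit-error rate ~ {bit_error_rate(angle):.3f}")
```

At 0 degrees the errors essentially vanish; near 90 degrees the decision degrades toward a coin toss, which is exactly the angle dependence described above.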

The paper then says "Instead, what happens in case (b) trials is that Bob's outcomes corresponding to Alice's outcomes for a particular setting correctly average to what one would expect for conservation of spin angular momentum (Figure 4)."

That statement is correct - precisely because the average is based upon coincidence-detection, which systematically fails to detect the entangled pairs most prone to producing bit-errors; the mechanism that causes the bit-errors is the same one that causes coincidence-detection to miss some particle pairs.
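A companion sketch of that selection effect (again with purely illustrative assumptions: a common detector axis at 60 degrees to the pair axis, Gaussian noise, and a simple detection threshold standing in for the coincidence criterion): pairs whose signals are too weak to clear the threshold are silently dropped, and those are precisely the pairs most likely to produce bit-errors, so the surviving "coincidences" look more perfectly anti-correlated than the full ensemble.

```python
import numpy as np

rng = np.random.default_rng(1)
trials, noise_sigma, threshold = 200_000, 0.4, 0.6      # illustrative values
detector_angle = np.radians(60)                          # detector vs. pair axis

# Anti-parallel pair: Alice's object points along the +axis, Bob's along the -axis.
# Each side projects its object onto the (common) detector axis and adds noise.
alice_signal = +np.cos(detector_angle) + noise_sigma * rng.standard_normal(trials)
bob_signal   = -np.cos(detector_angle) + noise_sigma * rng.standard_normal(trials)

alice_bit, bob_bit = np.sign(alice_signal), np.sign(bob_signal)
anti_correlated = alice_bit != bob_bit

# "Coincidence": keep a pair only when both signals clear the threshold.  Weak,
# noise-dominated signals -- the ones most prone to bit-errors -- are exactly
# the ones that fail to register as coincidences.
coincidence = (np.abs(alice_signal) > threshold) & (np.abs(bob_signal) > threshold)

print(f"anti-correlated fraction, all pairs:    {anti_correlated.mean():.3f}")
print(f"anti-correlated fraction, coincidences: {anti_correlated[coincidence].mean():.3f}")
print(f"pair-detection efficiency:              {coincidence.mean():.3f}")
```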

Rob McEachern

I love your stubborn "Shannon bit noise explains quantum phase..." argument.

The basic issue with quantum measurements is that once an observer measures a spin-state outcome for a single electron of an entangled pair, that observer cannot then know what the precursor spin state was for that electron. This means that before observation, the quantum spin state exists as a superposition of both spin states.

Classical electron spin states represent revealed knowledge, in that when a classical observer measures a classical electron, the classical measurement reveals with certainty the spin state that existed before the measurement, albeit within the classical Shannon noise level of the measurement. In the absence of perturbations, a classical spin-state outcome coming from a precursor necessarily means that spin state existed prior to the measurement, and the measurement simply revealed that hidden knowledge.

The revealed knowledge of classical Shannon noise has no bit limit, since a higher-resolution classical measurement bit is always possible. A single electron represents a limit for a Shannon bit, but a single electron is a qubit, since it has quantum phase.

An electron simply cannot exist as a classical bit, so this classical explanation for quantum phase simply invents a new particle of matter called a classical bit. However, there is no way to measure a classical bit of noise without quantum phase.

By carefully fitting bit errors as a function of quantum phase angle, a classical bit can fit quantum phase correlations like Bell's. There is of course a classical noise fit to the Stern-Gerlach quantum phase as well. These classical noise fits make no useful predictions for any other measurements. In fact, a classical noise "hidden" function that fits a quantum phase property is proof of the validity of a hidden quantum phase, not proof of classical noise... Here is the figure from 2019jul... Attachment #1: 1_mceachernCorrelate.jpg

Hi, I liked it also, but a question intrigues me: what really is an electron? I have my own idea in my model with these spheres, but they are intriguing in fact.

Steve Agnew, do you know the Dirac electrons well, and the fact that they behave as if massless? It seems relevant for electronics and the phases. Do you know the method for producing this behaviour? Is it with crystals and changes in temperature, and if yes, why exactly?

Graphene seems to be the answer, with the changes from amorphous to crystalline due to heat; the materials of course are important, but what is an electron, and why?

Steve Agnew:

Everyone will either have to learn to love my argument, or at least live with it, since the argument is correct.

The entire point of Shannon's Information Theory is to avoid ever making any "measurements" at all, because measurements always involve making errors in the real, non-ideal world. To completely eliminate every error, it is necessary to substitute error-free "decisions" for error-prone "measurements".

"The basic issue with quantum measurements is that once an observer measures two entangled spin states from a single electron outcome..." then that observer has totally skewed up! An observer must never even attempt to make any such "measurements", because any such attempt is doomed to fail - many of the "measurements" will inevitably produce the exact opposite result, from the "true" result.

To put it bluntly (I never have been "politically correct"), it is an exercise in stupidity to ever even perform a Bell test, unless the observer first figures out how to absolutely guarantee that the axis of the measuring device will always be perfectly aligned with the axis of the entity to be measured; that is the only configuration in which the errors can be eliminated, even in the classical case that the Bell-test statistics are being compared with.

Shannon's "decision" theory of information, is founded upon exploiting "exclusion" zones; think of the Pauli exclusion principle or the Demilitarized Zone separating North and South Korea. Everything of any consequence, is being systematically excluded from such a zone, meaning that nothing of consequence, is "allowed" to even exist within such a zone. Hence, anything "measured" within such a zone, must be something that should not be there. (For example, there are no "allowed" integers, between the values of 1 and 2 and there are no "allowed" letters between "a" and "b" in the English alphabet).

That is what "quantization" is all about - "quantizing" information (not space and/or time, as virtually all physicists have supposed) by enforcing "exclusion zones" between every valid value; all "valid" values of all physical states are necessarily discrete rather than continuous, because only discrete (hence potentially error-free) values can enable perfectly repeatable (deterministic) behaves to ever exist (emerge from chaos). In other words, it is the existence of discrete, "valid" states, that enable "Cause and Effect" itself to exist as a physical phenomenon. Continuous states can and do exist; but they cannot enable perfect "cause and effect" because there is no "perfectly measurable" cause to trigger a correspondingly perfectly reproducible effect. Deteminism exists precisely where discrete states are being exploited to ensure perfect (error free) information recovery.

In the case of the polarized coin measurements, there is an "exclusion zone" being constructed between 0 degrees and 180 degrees; hence, nothing "valid" can ever exist between 0 and 180 degrees. So if you think you "measured" something within the exclusion zone between 0 and 180 degrees, then you measured something that should not be there - something invalid! The polarized objects being measured, in the classical case, were constructed to make that true! There is nothing "valid" to ever be measured at angles other than 0 or 180 degrees, since the exclusion zone was constructed such that only noise, distortion and inter-symbol interference can exist between 0 and 180 degrees - nothing "valid" (AKA capable of being correctly "measured") exists there. It is an observed fact that quantum entities behave in exactly the same manner - that is where the name "Quantum" came from in the first place!

"Until Shannon, it was simply conventional wisdom that noise had to be endured. Shannon's promise of perfect accuracy was something radically new."

Rob McEachern

It is not clear what you mean by your argument being correct... your arguments now are a random laundry list of Stern-Gerlach, double-slit diffraction, and Shannon noise theory. There is absolutely nothing wrong with Shannon noise... it is just the classical noise of chaos. What is wrong is the approximation that Shannon noise is always completely independent of either sender or receiver, as the attached Wiki diagram shows.

This is a very good approximation for a coin flip, but it is invalid for the quantum phase of Stern-Gerlach or 2-slit diffraction. Note that even with a coin flip, there are in-between outcomes because the coin is 3-D and the coin's edge can affect the outcome. So a coin can end up on its edge, as the attached pic shows, or you could catch and manipulate the coin to influence the outcome, which is a common magic trick after all.

Since Shannon never addressed the role of quantum phase in his information theory, Shannon's bits lack the quantum phase of qubits. But Shannon did know about von Neumann's quantum phase, and that is quantum information theory.

I agree with you that Shannon theory injects classical "random" bit noise independent of either precursor or outcome. However, Shannon noise injection has no lower classical limit: if the Shannon noise RMS voltage is an order of magnitude below the bit voltage, Shannon noise will affect very few results; two orders of magnitude, and so on without any limit, until eventually reaching the limit of the wavelength of the universe.
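That scaling is easy to check numerically (a minimal sketch, assuming an ideal hard binary decision corrupted by zero-mean Gaussian noise; the signal-to-noise ratios below are illustrative): the probability that the noise flips a bit is 0.5*erfc(A / (sigma*sqrt(2))), which collapses rapidly once the noise RMS drops below the bit level.

```python
from math import erfc, sqrt

def bit_error_probability(signal_level, noise_rms):
    """Probability that zero-mean Gaussian noise flips a hard +/- bit decision."""
    return 0.5 * erfc(signal_level / (noise_rms * sqrt(2.0)))

for ratio in (1, 2, 3, 5, 10):
    p = bit_error_probability(1.0, 1.0 / ratio)
    print(f"bit level / noise RMS = {ratio:3d}  ->  P(bit error) ~ {p:.3e}")
```

With the noise an order of magnitude below the bit level, the flip probability is astronomically small, consistent with "very few results" being affected.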

It is possible to tune a classical Shannon noise function to mimic outcomes of quantum phase noise, but without quantum superposition and entanglement, such tuning does not result in any useful predictions. In particular, classical and quantum noise behave differently with temperature, pressure, or other environmental effects.

Even one electron represents a measurable quantum event well above one bit of Shannon noise in many common measurements. One electron can be in a superposition state, and one electron can interfere with itself, resulting in two possible paths or spin-state outcomes. Those outcomes are necessarily subject to quantum phase uncertainty, but not to the chaos of Shannon noise... Attachment #1: ShannonNoise.JPG, Attachment #2: headEdgeTails.jpg

"A question intrigues me, what is really an electron, I have my idea in my model and these spheres but they are intriguing in fact."...Steve Defourny.

An electron definition is really quite tricky since, as a fundamental particle, it is like asking why the universe exists or why matter exists at all. Electrons exist because they exist, which is an identity and hardly helpful. Science says you must simply believe in electrons and then gives their mass, charge, spin, and so on.

You may want to know what is a matter-action electron. Since the only fundamental matter-action particle is the dust of quantum aether, it is quantum aether dust that makes up electrons and all fundamental particles, including photons. At the CMB, only a small fraction (~1e-7) of aether was frozen into quarks, and then quarks froze into electrons and other particles.

Before the CMB, forces were too weak and there was aether equilibrium. Since the CMB cooling, the reverse reaction has been kinetically limited except for very high energies like stars and black holes...and particle accelerators. Antimatter is then what made up the precursor to our matter universe.

The only matter that survived CMB cooling was neutral hydrogen and small amounts of other neutral atoms according to the Schrödinger equation. The Rydberg energy of hydrogen ionization stabilizes our universe by the Schrödinger equation and all else follows...

So a proton and a neutron are both made up of three quarks as pairs. Quarks are then part of the same Rydberg energy stabilization as hydrogen and all matter. An electron is then made up of three quarks as pairs as well. Electron mass is a result of the same Rydberg stabilization energy that forms quarks into protons and then electrons into hydrogen.

Although the freezeout of CMB matter involved many possible outcomes including antimatter outcomes, only those outcomes that met the Rydberg energy formed stable hydrogen and thus only about 1e-7 of aether makes up observable matter. Unlike a proton or neutron which have both mass and charge radii, an electron has only a charge and not a mass radius. This is because of the very different quark quantum phases for an electron versus a proton or neutron.

The matter-action outcomes are, of course, not accepted at all by Science, but have not been falsified either...

    You seem to have no idea at all in regards to what Shannon's Theory was originally all about: To ensure error-free information recovery (no uncertainty! - none!), rather than transmitting any physically-meaningful "data" that can be "measured", an emitter needs to transmit a sequence of intrinsically meaningless random noise. That entire sequence of transmitted noise is "detected" in one, single, irreducible integration "event" (incorrectly interpreted, in quantum theory, as the collapse of a wavefunction) via a correlation process (a matched filter, which has a priori knowledge of the sequences to be transmitted). Each entire sequence is thus either successfully "detected", or not - all or nothing - there is nothing being "measured". Whatever "meaning" or significance is attached to each such detection event is just that: "attached" extrinsically. Shannon's theory is all about the design of such correlation-based detection; how do the duration and bandwidth of the "noise-like" sequence affect the maximum correlation "processing gain", and thus the probability of never mistaking the detection of the desired sequences for the detection of any other noise sequence? Longer sequences with larger bandwidths result in greater processing gain, and thus less likelihood of ever mistaking a desired sequence for any other sequence.
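    A minimal sketch of that correlation-based scheme (illustrative only, and not the code from the paper; the sequence lengths, channel-noise level, and trial counts are assumed values): the receiver knows both candidate noise-like sequences a priori and makes one all-or-nothing decision per transmission, and the reliability of that decision grows with the length (time-bandwidth product) of the sequence.

```python
import numpy as np

rng = np.random.default_rng(2)

def fraction_decided_correctly(sequence_length, channel_noise_sigma=4.0, trials=2_000):
    """Correlation ("matched filter") detection of a priori known noise-like sequences.

    Nothing is "measured": the receiver correlates the received waveform against
    both known sequences and makes one all-or-nothing decision per transmission.
    """
    seq0 = rng.choice([-1.0, 1.0], size=sequence_length)   # encodes bit 0 / "spin down"
    seq1 = rng.choice([-1.0, 1.0], size=sequence_length)   # encodes bit 1 / "spin up"
    sent_bits = rng.integers(0, 2, size=trials)
    correct = 0
    for bit in sent_bits:
        transmitted = seq1 if bit else seq0
        received = transmitted + channel_noise_sigma * rng.standard_normal(sequence_length)
        decided = int(received @ seq1 > received @ seq0)    # decision, not measurement
        correct += (decided == bit)
    return correct / trials

for n in (1, 4, 16, 64, 256):
    print(f"sequence length {n:4d} -> fraction decided correctly ~ {fraction_decided_correctly(n):.3f}")
```

    As the sequence shrinks toward a single sample, the processing gain disappears and the decision degrades toward a coin toss, which is the regime discussed below.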

    In other words, rather than directly transmitting any physically-meaningful "data" that might actually be desired (and subjected to an attempt at "measurement") by some intended receiver, an emitter must instead deliberately transmit the only type of "signal" (random "noise-like" sequences with suitably high processing gain) that can ever be unmistakably detected by any receiver. The actually desired "data" can subsequently be deduced (not measured!) via an a priori agreed-upon "encoding" of which data (such as a bit-value = 1, or "spin up") corresponds to which a priori known random noise sequence.

    The issue of concern in quantum physics is what happens when the sequence duration and its bandwidth are simultaneously reduced to the minimum possible time-bandwidth product that would still enable the sequence to be just barely detectable (in the presence of "channel noise"), even under the most optimal conditions possible (the receiver has complete a priori knowledge of the optimal matched filter, etc.). That is what the Heisenberg Uncertainty Principle is ultimately all about. And that is what is being demonstrated in my paper - even though the matched filter being used in the paper is not actually the optimal filter, it nevertheless is already "good enough" to reproduce the entire quantum correlation curve (not just the few points produced by typical "Bell tests"), with detection efficiencies greater than what has supposedly been "proven" to be the maximum possible for any conceivable classical system (AKA "local realism").

    Rob McEachern

    Hi Steve, thanks for developing it; it is well explained. I like the works of Dirac, and the matter-action of course seems relevant. We need to know more, because as I said we unfortunately have limitations in our knowledge about the main origin of our reality, and we also don't really know these fundamental mathematical and physical objects. Your explanations were a pleasure to read; I am studying at the same time. Thanks still. Regards

    Steve, if we take the matter-action and correlate it with the Einstein-Hilbert action and the field equations, we consider this general relativity and still these photons as the primordial essence. I like these works, but we cannot affirm that this is the main origin of our universe; Einstein recognised this. That is why, for me, they cannot renormalise and quantise this quantum gravitation: in fact the aim is not to unify the GR and the QM, but we need to superimpose a deeper logic for the matter-actions, as you said. The electrons are fascinating, but as I said we don't really know what they are and why they have their properties like mass, spin and others; we just see the effects and we are unfortunately limited about their main essence. The strings and geometrical algebras consider an origin of fields and try to rank them with the non-commutativity and the non-associativity also, like the Lie groups, but for me, and it is just my opinion, it is not sufficient and it lacks many things. I really doubt that this universe comes from an infinite heat, then photons, and then that they oscillate and vibrate to give the geometries, topologies.... We cannot affirm these things; this reasoning just permits us to better encircle our standard model and its fields, but it does not explain the generality of our universe. You speak about the aether, but it is still a luminiferous aether and it seems to lack something; that is why I consider two other aethers superimposed with the cold dark matter and this DE vacuum space for the main codes, which can permit a better understanding of this matter-action in its deepest meaning. As you said, the universe is the universe because it is like this, but philosophically we need to better understand its origin and these fundamental objects. We just analyse the surfaces of the problem at present, and this standard model is so limited; we still know so little, and we must accept our limitations and the fact that we know such a small part of the puzzle. I really doubt that the fields are the solution; I prefer to consider coded particles giving fields. The electrons are intriguing because these fermions are important for matter, like these photons; they permit their properties, but unfortunately we do not know their main causes and codes, if I can say. All this to say that the tensors of the GR, with the works of Ricci, Riemann, Einstein, Hilbert, are for this luminiferous aether, but they do not explain the deep unknowns. I believe strongly that this relativity is a prison even if I recognise it as correct; I just say that we need to superimpose other parameters, and without this we cannot explain our unknowns, it is not possible. The vacuum, the aether of space and main codes, plus this cold dark matter, the second fuel in the cold, can permit us to explain these unknowns, and the antiparticles, the evolution also, like the life-death even and the recycling. All this becomes philosophical, and we need this philosophy and different ideas to try to understand this origin. We must probably think beyond the box. Regards

    If we take the charged point particles for the Einstein gravitational field, we see the motions and the effects on this spacetime that we observe, but all this is observations. I humbly believe that this is the problem when the searchers try to unify this GR and the QM to reach this quantum gravitation; in fact we must consider a different logic, because what we search for is not, for me, about the observations and the GR: it is a force between particles and their motions, orbital and spinal, and that is totally different. It seems to me we must respect the Newtonian mechanics; that is why there is a problem of renormalisation. Now, in considering other particles than photons, to be simplistic, for the bosonic fields, and that this cold dark matter permits us to explain the balance of heat for all particles, that explains that the standard model is encircled by a colder, deeper logic explaining the antiparticles but also this quantum gravitation, because the main codes are in this energetic vacuum of this DE, and after we have the two fuels that I explained permitting the fields. The quantum gravitation is therefore not an emergent electromagnetic force; in fact the gravitation is the main chief of the orchestra and this electromagnetism is the emergence, which is totally different. The last works of Wilczek on these gravitons and their noises permit a better understanding of the general relativity, but those points are not the quanta of this weakest gravitational quantum force; they are just points of gravitational waves. So I return to this prison of this relativity and the fact of considering only these photons as primordial essence. Einstein worked well indeed, but it is just observations and the photons, and now they all try with different oscillations and geometrical algebras to find our unknowns in this logic, but that cannot explain the deepest unknowns like this DM, this DE, this QG, this consciousness even, and others; it really lacked the insertion of deeper reasonings, and all this is philosophical also. I cannot affirm that my 3 finite series of 3D spheres, the vacuum space and the two fuels are true, but it is an idea like the others, and the spheres seem fundamental; after all, inside this universe we have only spheres. I have calculated that the Dirac large number converges with the finite number of cosmological spheres, and probably these 3 main finite series, also quantum, are like this; why, I don't know, it is simply the choice of the primordial fractal from this infinite eternal consciousness. It is my interpretation, but I respect the others, and nobody knows the truth, whether we are a mathematical accident or something else; we just discuss. And that implies that we have a central one for all finite series; now philosophically that implies something paradoxical about the central cosmological sphere, since it is there that all comes from. Maybe we have a multiverse, I don't know, but it implies intriguing things about the central ones, multispheres or a universal sphere; in all cases we always return to a necessary uniqueness. But already our universe is difficult to understand, so let's be humble. If the fractal of spheres is a primordial thing, then it becomes even more complex, since each cosmological sphere is unique in its complexity.
The matter-actions are more than we can imagine; it is neither about the GR nor about our limited analysis, it is deeper, it seems evident. Now, if my reasoning is correct in considering these 3 main finite series, that becomes relevant for ranking our particles: we have mainly particles and fields due to these encoded particles of DM and photons, the cold and heat dancing together under the codes of this space.

    Dear Robert,

    For me, your theory about the issue of (non-)local quantum behaviour is similar to the question of whether mathematics is something merely invented or something really discovered by man.

    If local quantum behaviour for entanglement is thought to be possible only due to in-principle undetectable physical mechanisms, the question arises whether these mechanisms are just invented (to save local-realism) by man or are really discovered by man (to demonstrate local-realism).

    Using in-principle undetectable physical mechanisms to explain something may result in a nice explanation, but as we know not all nice explanations meet reality as it is independent of our thoughts.

    Therefore, for me there is a difference between a theory that cannot be falsified because all the contents of the theory perfectly match physical reality (the theory spot-on describes what physically goes on) and a theory that cannot be falsified because parts of its contents are defined as in-principle undetectable. Hence my question: is your theory falsifiable, and if not, why?

    Stefan

    Hi Stefan, you speak about a relevant point of view in fact: how can we consider the reliability of the mathematical tools we use and their partitions? I believe strongly, like Max Tegmark said, that these maths, when they are well used and concrete, permit us to describe the quantum mechanics correctly in its effects, but of course we have limitations and unknowns. The maths permit us to prove, but can also extrapolate assumptions not yet proved, like the reversibility of this time for example, the wormholes or other mathematical symmetries; we must probably be prudent in our conclusions. It is the same with the string theory and the dimensions: they begin in 1D with these strings at this Planck scale, then connect with the 1D main fields, and then extrapolate mainly to the 11D, but we cannot affirm that all this is true; as I said, we don't know our fundamental objects and the real philosophical origin of our universe. I liked the ideas of Robert also, and your question is indeed relevant: is it falsifiable, and if not, why? Regards

    We arrive at these hidden variables and new parameters to superimpose. The local realism is described with the current mathematical and physical tools, but we analyse the surfaces of the problems through the effects; the problem is mainly that we don't know the real origin of the mechanisms and the main causes. The fields of course are relevant, and the strings permit a better understanding of their behaviours; they permit a better ranking of the bosonic fields of this standard model, but unfortunately we do not have the main cause and the philosophical origin. The mathematics are not really invented by the humans; we have just invented the language to interpret them. A number 1 will always be a number 1, after all, even on another planet with an advanced civilisation; they use the same tools even if the language is different. Maybe the only thing to be prudent about with the maths is the extrapolations: we cannot conclude and affirm all the mathematical extrapolations, we can just accept the proved, rigorous laws, axioms, equations, and in physics it is the same. The universe seems to use precise mathematical and physical laws. Tegmark has described this universe with more than 100 mathematical equations, I believe, and he has concluded a multiverse; it is relevant, even if I consider only one universe. We cannot affirm it in fact; it is still beyond our understanding.

    If Max Tegmark is right about the fact that the maths are fundamental, that becomes very relevant, because there exists a universal partition permitting us to explain the majority of unknowns. But we are still young, considering the evolution and our knowledge; this partition seems beyond our understanding, but it exists. I believe strongly that these 3D spheres, these 3 main finite series, are the secret, but I cannot find the partition of these spheres due to their number and their complexity; furthermore the evolution also is essential, and this thing is difficult to consider in our quantum mechanics. It could be relevant to make simulations, inserting all the mathematical equations plus these finite series, considering the physical properties and playing with the combinations and variables, but a computer is made by us, with limitations also in the equations and concepts used; the problem seems to be there, it lacks many things. That is why we cannot reach a quantum computer and explain our unknowns, just because we don't know the main partition, the fundamental mathematical and physical objects, and the philosophical origin, but we evolve each day.

    Boy, you really love this Shannon-Hartley law of bits versus bandwidth and you are right, Shannon-Hartley is all about transmitting Gaussian white noise, not really information. I have to admit that you have studied Shannon-Hartley law much more than I have, but looking carefully at your example of missed Shannon bit detections shows that your arguments actually prove the fundamental quantum notion: measuring a bit polarization outcome does not reveal with certainty the polarization of its precursor.

    Quantum phase shows this uncertainty just as your tuned Shannon bit filter shows this uncertainty... so your tuned Shannon bit filter actually validates the uncertainty principle of quantum phase. Here is a pic to show the difference between quantum and Shannon bits... Attachment #1: MceachernMissedDetections.JPG

    Robert, Doc and Steve,

    i.e. "measuring a bit polarization outcome does not reveal with certainty the polarization of its precursor."

    from widely divergent perspectives, you all seem to be in agreement on that! That individual divergence spans the paradigm difference between an assumed ideal, uniform particle species, and the contrary realistic view that there is no such thing as a truly ideal particle that is uniform as a species.

    At least this does go to the question of what actually happens in the Near Field of the Transition zone, because as it is, the practicing general consensus has been assuming just the opposite: that what is detected on reception is a perfect replication of what happens at the source.

    I'd say this dialogue you have going is displaying real progress. jrc

    Stefan asked:

    "Therefore my question: is your theory falsifiable..."

    Of course it is. Just demonstrate that the code in my paper either (1) does not in fact do what I claim it will do, when executed on your own computer, or (2) does in fact do what I claim, but not for any of the reasons that I have claimed. And consider this: I chose to attack this particular "quantum correlation" issue and provided my code precisely to make it very simple (just download the code and run it) for everyone to "hit me with your best shot." So here we are, four years later; the code has now been viewed by thousands, rewritten and reproduced by a few, and yet no one has publicly claimed to have falsified it.

    In this regard, note that I am claiming to have falsified Bell's theorem and related Bell tests via (2) above and not by (1); "Quantum Correlations" really do happen, but they are being caused by an entirely different mechanism than the one supposed by Bell and others. I am merely exploiting a "loophole" that they failed to close. And they failed to close it precisely because they failed to realize that it even existed, which is exactly what Einstein et al. predicted would happen with the EPR paradox. So there is no great surprise that this has eventually happened.

    The issue has nothing to do with "undetectable physical mechanisms". It is simple, World War II era radar signal detection theory, combined with Shannon's World War II era Information Theory. Here is a simple, Korean "exclusion zone", detection problem, that may help to put this all in perspective:

    Your mission (and it is not always impossible), should you decide to accept it, is to detect whether an intruder ever appears in the exclusion zone separating North and South Korea and, if one does, to correctly decide whether they entered from North Korea (up) or from South Korea (down). You are allowed to adjust these two parameters as required: (1) the "bandwidth" of the exclusion zone - how wide it is, and (2) the time interval between successive observations looking for any possible intruder. Thus, for example, if you think the intruder will be on foot and running no faster than 30 km/h, then you might wish to make the width of the exclusion zone wide enough to ensure that such a runner could not possibly run from either edge of the zone across a "decision point" in the middle of the zone, within the time interval between observations that you have also chosen. On the other hand, if the intruder is in a jet fighter, screaming along at 2000 km/h, you might wish to construct a wider exclusion zone than would be necessary in the case of the runner, and employ shorter time intervals between observations, to ensure that even such a fast intrusion would never prevent you from correctly deciding whether the intruder is an "up" intruder or a "down" intruder; they cannot cross the decision "threshold" before being observed.
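    A small arithmetic sketch of that trade-off (the observation intervals below are illustrative assumptions; the speeds echo the runner and jet-fighter examples above): the zone only works if an intruder cannot cross the mid-zone decision threshold between two successive looks, i.e. if the zone is wider than twice the distance covered per observation interval.

```python
def minimum_zone_width_km(intruder_speed_kmh, observation_interval_s):
    """Width needed so an intruder cannot cross the mid-zone decision threshold
    (located at half the width) between two successive observations."""
    distance_per_look_km = intruder_speed_kmh * observation_interval_s / 3600.0
    return 2.0 * distance_per_look_km   # threshold sits in the middle of the zone

# Illustrative numbers only:
for speed_kmh, interval_s in ((30, 60.0), (2000, 60.0), (2000, 1.0)):
    width = minimum_zone_width_km(speed_kmh, interval_s)
    print(f"{speed_kmh:5d} km/h intruder, observed every {interval_s:5.1f} s "
          f"-> zone must be wider than {width:6.2f} km")
```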

    The point is, your ability to correctly decide the "up" or "down" intrusion event is highly dependent on the time-bandwidth product that has been built into the detection mechanism. That is what Shannon's Information Theory is all about. So what is going to happen, in the case in which you reduce both the time interval and the bandwidth of the detector to values so very small that even a tiny amount of "noise" anywhere in the situation makes it impossible, even in principle, to correctly decide the issue? You had better pray, very hard, that such non-zero amounts of noise can never exist in the real world! Unfortunately, that prayer, the prayer of all physicists, does not appear to have been answered; "identical" particles (the "intruders") do not seem to exhibit identical "noise" - hence they appear to behave as if they are a bit "fuzzy" around the edges - just enough to make it impossible to ever correctly decide the detection problem, when the "exclusion zone" is effectively smaller than the "fuzz" around the edges. So even though you may frequently succeed at detecting the existence of any intruder within the "exclusion zone", you may nevertheless fail to correctly decide the "up" versus "down" problem in most intrusion events.

    That is the situation being systematically constructed within my code; it is constructing "locally real" objects that have just the right amount of "noise" to prevent correct "up" versus "down" bit-decisions, whenever the polarized objects being measured have their polarization axis misaligned with the axis of the detector. And that brings us back to the original subject matter of this discussion - is there a preferred reference frame for making such measurements? Yes there is - the polarization axis of the entity to be measured. Because the entire detection process has been so delicately balanced, at the very limit (Shannon's limit) of what can be done, every Bell test will collapse (produce too many bit-errors) when "lop-sided", off-axis measurements are made.

    Rob McEachern