The preprint mentioned in the post may be found at http://arxiv.org/abs/1106.2133
Thank you, Stephane. I will have a look at your paper.
Joy
"No one can disprove an inequality like 2 < 3."
Exactly so. Leslie Lamport ("Buridan's Principle," 1984) addressed this problem of making a decision (or measurement) in a bounded length of time:
"A real Stern-Gerlach apparatus does not produce the discrete statistical
distribution of electron trajectories usually ascribed to it in simplified
descriptions. Instead, it produces a continuous distribution having two maxima,
but with a nonzero probability of finding an electron in any finite region
between them. Trying to decide if the electron is deflected up or down then
becomes just another instance of the problem of making a discrete decision
based upon a continuous input value, so nothing has been gained by
measuring the discrete spin value.
"Validity of Buridan's Principle implies the following:
"Buridan's Law of Measurement. If x < y < z, then any measurement performed in a bounded length of time that has a nonzero probability of yielding a value in a neighborhood of x and a nonzero probability of yielding a value in a neighborhood of z must also have a nonzero probability of yielding a value in a neighborhood of y.
"If this law is not valid, then one can find a counterexample to Buridan's Principle, with the discrete decision being: 'Is the value greater or less than y?' There does not seem to be a quantum-mechanical theory of measurement
from which one can derive Buridan's Law of Measurement."
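Lamport's Stern-Gerlach point is easy to see numerically. Here is a minimal sketch, assuming numpy; the 50/50 mixture and the spread are illustrative choices, not Lamport's numbers. A continuous two-peaked distribution still assigns nonzero probability to any neighborhood of the midpoint:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Model the continuous deflection as a 50/50 mixture of Gaussians
    # centered at -1 ("down") and +1 ("up"); the spread is illustrative.
    peak = rng.choice([-1.0, 1.0], size=n)
    deflection = peak + 0.3 * rng.standard_normal(n)

    # Probability of a result in a neighborhood of the midpoint y = 0:
    eps = 0.05
    print(np.mean(np.abs(deflection) < eps))  # small, but clearly nonzero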
Lamport goes on to describe the experimental challenge in terms of classical continuous functions:
"Buridan's Principle rests upon mathematical concepts of continuity and boundedness that are not physically observable. No real experiment, having finite precision, can demonstrate the presence or absence of continuity, which is defined in terms of limits. No experiment can demonstrate that an arbiter requires an unbounded length of time to reach a decision. An experiment in which the arbiter failed to decide within a week does not prove that it would not always decide within a year.
"To understand the meaning of Buridan's Principle as a scientific law,
consider the analogous problem with classical mechanics. Kepler's first law states that the orbit of a planet is an ellipse. This is not experimentally verifiable because any finite-precision measurement of the orbit is consistent with an infinite number of mathematical curves. In practice, what we can deduce from Kepler's law is that measurement of the orbit will, to a good approximation, be consistent with the predicted ellipse."
Joy Christian's experimental parameters are classical, as were Bell's. His measurement criteria, therefore, are predictive without being probabilistic. Any experimental model is finite in space and bounded in time. Quantum mechanical experiments assume that reality is finite in time (t = 1) and unbounded in space, therefore nonlocal. A local realistic model finite in space and unbounded in time is a classical measurement scheme that -- like Kepler's orbits -- makes a closed judgment, to arbitrary accuracy, of determined particle paths and momenta as t --> T, according to the specified topology in which the functions are complete, continuous and real.
Tom
An ad hominem and an irrelevant, trivial error in the formulation (an inequality instead of a theorem) are all you have to answer? Not much.
It would be interesting, at least for me, if you would support the claim of "the prejudices and ignorance of Ilja Schmelzer" with some evidence.
I think this should be quite easy, since my argument is quite simple.
The experimental data in Bell-type experiments are frequencies p(A,B|a,b). These are used to define the expectation values E(a,b) = sum AB p(A,B|a,b). Here A and B are classical results of observations, which are well-defined and equal to ±1. A model which explains Aspect-like experiments should be able to predict the observed frequencies p(A,B|a,b), which has not been done.
Or at least I have not yet seen it done. Feel free to do it here.
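To make the bookkeeping concrete, here is a minimal sketch (assuming numpy) of the computation being asked for; the input probabilities are the standard quantum-mechanical singlet predictions, used purely as illustrative numbers:

    import numpy as np

    def singlet_probs(theta):
        # QM predictions p(A,B|a,b) for the singlet at relative angle theta:
        # p(+,+) = p(-,-) = sin^2(theta/2)/2, p(+,-) = p(-,+) = cos^2(theta/2)/2
        same = 0.5 * np.sin(theta / 2.0) ** 2
        diff = 0.5 * np.cos(theta / 2.0) ** 2
        return {(+1, +1): same, (-1, -1): same, (+1, -1): diff, (-1, +1): diff}

    def expectation(probs):
        # E(a,b) = sum over outcomes A, B of A*B*p(A,B|a,b)
        return sum(A * B * p for (A, B), p in probs.items())

    theta = np.pi / 3.0
    print(expectation(singlet_probs(theta)))  # -0.5
    print(-np.cos(theta))                     # -0.5, i.e. E = -cos(theta)

Any candidate local model would have to supply its own p(A,B|a,b) as input to exactly this kind of calculation.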
Here is one central post from the thread with my argument in more detail.
Here are my papers in detail. Read them. Understand them. And then there can be any chance of a dialogue between us. So far what I have seen from you is nothing but bookish knowledge, prejudices, and a total lack of understanding of my argument---the proof of this fact is already there in your very post if you have the eyes to see it.
I have asked you a quite simple question. Again, even simpler: Have you provided, in one of your papers, a local model which predicts probabilities p(A,B|a,b) so that the corresponding expectation values E(a,b)=sum AB p(A,B|a,b) violate Bell's inequalities?
If yes, tell me the paper and the pages. If not, that's my point.
In that case, explain where the error in my argument lies. I think the probabilities p(A,B|a,b) are predicted by QM, and the corresponding frequencies are measured in experiments in a quite open way, without anything hidden. So any realistic model has to recover them.
What's your problem with explaining to me the simple error in such a simple argument? Of course I have no eyes to see my own errors, else I would not make them -- that's a tautology. So please help a poor soul, who is unable to understand your deep thoughts, to see his trivial error.
Otherwise, I wish you success with publishing your model in a good journal - it would be a nice opportunity for me to receive some explanation of my error through peer review, or to publish a rebuttal there.
They should put this somewhere with open access, like arxiv.org. I don't recognize hidden science, or science which charges $31.50 for research usually paid for by the taxpayers.
Judging from the abstract alone, it may simply be using the detector-efficiency loophole; in that case it would be of no interest at all.
The topology of the measurement results is trivial: these results are the values ±1, that's all. There is nothing to change. Christian is free to speculate about the hidden parameters -- they may be, of course, whatever Christian proposes -- but the observed measurement results are simply ±1, nothing else, with nothing but the trivial discrete topology.
And, Graham, there is no reason to wonder about a silence. arxiv:1109.0535 lists several other refutations.
In almost all of my papers you will find a local-realistic model that exactly reproduces all of the predictions of quantum mechanics for the singlet state. And by all I mean all. I have no interest in educating you otherwise.
Hi Ilja,
We had a big discussion about De Raedt et al. on sci.physics.foundations. I guess you missed it. You can find almost all of their papers at,
http://rugth30.phys.rug.nl/dlm/
Click on the download link. There is a new one on arXiv,
http://www.arxiv.com/abs/1108.3583
Joy Christian and De Raedt et al. successfully demonstrate that Bell's theorem doesn't make proper contact with physical reality. Plus, De Raedt et al. show extensively that the EPRB-type experiments are flawed, mostly by the so-called time-coincidence "loophole," not the detector-efficiency loophole. The time-coincidence loophole is not really a loophole; it is a "problem" that must be resolved for the experiments to be valid.
Best,
Fred Diether
moderator sci.physics.foundations
The topology of the co-domain of the measurement results is *non-trivial.*
The co-domain of Bell's measurement functions A(a, L) = +1 or -1 must be a unit parallelized 3-sphere; otherwise the EPR criterion of completeness is not satisfied and Bell's argument becomes a non-starter.
When Bell's prescription A(a, L) is so completed (as shown, for example, in the attached papers), then the correlations are necessarily equal to -a.b, by the very topology of the co-domain S^3 of the measurement results. The "trivial discrete topology" of S^0 cannot possibly accommodate the completeness criterion accommodated by the simply-connected topology of the parallelized 3-sphere.
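For reference, the target value both sides agree on can be checked directly. Here is a minimal sketch (assuming numpy) verifying that the quantum singlet expectation equals -a.b; this confirms only the prediction to be reproduced, and does not by itself test the topological claim:

    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    def spin_along(n):
        # Pauli spin operator along the unit vector n
        return n[0] * sx + n[1] * sy + n[2] * sz

    # Singlet state (|01> - |10>)/sqrt(2) in the basis |00>, |01>, |10>, |11>
    singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

    def E(a, b):
        # Quantum expectation <psi| (sigma.a tensor sigma.b) |psi>
        op = np.kron(spin_along(a), spin_along(b))
        return np.real(singlet.conj() @ op @ singlet)

    a = np.array([0.0, 0.0, 1.0])
    b = np.array([np.sin(1.0), 0.0, np.cos(1.0)])
    print(E(a, b), -np.dot(a, b))  # both ~ -cos(1.0) ~ -0.5403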
Further details of the topological mistake Bell made in his very first equation can be found in the first chapter of my book, as well as in the attached papers.
Joy Christian

Attachment #1: 21_pseudo.pdf
Attachment #2: 89_Gill.pdf
PS: There is in fact a much closer link between my work and teleparallel gravity. You can find a more recent discussion about this on my blog.
Very impressive! However, I'd like to present an argument that quantum entanglement and non-locality are not illusions. At the risk of appearing naive and being dismissed, my argument will not involve technical language, topology, or algebra (these are confusing to many). I'll use extracts in essay form (from my entry "Unified Field, Relativity and Quantum Mechanics Meet String Theory, Parallel Universes, the Mathematical Universe, and TOE" in FQXi's current contest - and from other essays where necessary).
FIRST, I DON'T THINK THERE ARE FOUR FUNDAMENTAL FORCES OF NATURE - BUT ONLY ONE.
Suppose Albert Einstein was correct when he said gravitation plays a role in the constitution of elementary particles (in "Do Gravitational Fields Play An Essential Part In The Structure Of The Elementary Particles Of Matter?", a 1919 submission to the Prussian Academy of Sciences). Einstein also said gravity and electromagnetism may be related - in his paper to the Prussian Academy, he said "Therefore, by equation (1), we cannot arrive at a theory of the electron by restricting ourselves to the electromagnetic components of the Maxwell-Lorentz theory ..." A wave packet consisting of gravitation and EM (modified gravitation - see explanation later) would possess what we call mass because of that force's effect on other particles. Where does this leave the Standard Model Higgs field and boson? Also - Steven Weinberg, Abdus Salam and Sheldon Glashow shared the 1979 Nobel prize in physics for electroweak unification (of the weak force and electromagnetism). I suggest it's possible to alter the physics and mathematics of their electroweak theory to agree with the insights of a man called Einstein (especially when, as later parts of this article show, his insights lead to resolution of the dark matter problem and a minor revision of gravitational theory that explains all 3 of Kepler's laws of planetary motion - dark matter is also explicable in terms of gravity). And I suggest the theories of the scientists who proposed quarks as elementary constituents of matter, George Zweig and Murray Gell-Mann, could also be adapted to fit Einstein's insights. After all, Stephen Hawking and Leonard Mlodinow wrote on p.49 of their book "The Grand Design" (Bantam Press, 2010), "It is certainly possible that some alien beings ... would make the same experimental observations that we do, but describe them without quarks."
Speaking of the electroweak force, here's a little bit about "the nuclear forces as modified gravity" - The strong force binds protons and neutrons (nucleons) together to form the nucleus of an atom. It's also the force (carried by gluons) that holds quarks together to form protons, neutrons and other hadron particles. It's 10^38 (100 trillion trillion trillion) times the strength of gravity because it's the product of the electromagnetic force (10^36 times gravity's strength) combined with 10^2 (100) gravitons per electromagnetic photon (the graviton is a hypothetical elementary particle that mediates the force of gravitation). The weak force is responsible for the radioactive decay of subatomic particles and initiating hydrogen fusion in stars. The weak force is 10^25 (10 million billion billion) times gravity's strength because it's the product of the electromagnetic force combined with 100 billion anti-gravitons. That is, it's 10^36 times the strength of gravity divided by 10^11.
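The powers of ten in that argument can at least be checked as plain arithmetic; a minimal sketch, verifying only the stated ratios and not the physical claims:

    # Strong force: EM (10^36 x gravity) combined with 10^2 gravitons per photon
    strong = 10**36 * 10**2
    assert strong == 10**38

    # Weak force: 10^36 x gravity divided by 10^11 (100 billion antigravitons)
    weak = 10**36 // 10**11
    assert weak == 10**25

    print(strong, weak)  # matches the text's 10^38 and 10^25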
If the nuclear forces may be different facets of gravitation, is it possible that electromagnetism also has no existence independently of it? This is possible if all forces have a mathematical origin, in which case a few ideas can be borrowed from string theory's ideas of everything being ultimately composed of tiny, one-dimensional strings that vibrate as clockwise, standing, and counterclockwise currents in a four-dimensional looped superstring. We can visualize tiny, one-dimensional binary digits of 1 and 0 (base 2 mathematics) forming currents in a Mobius loop - or in 2 Mobius loops, clockwise currents in one loop combining with counterclockwise currents in the other to form a standing current. Combination of the 2 loops' currents requires connection of the two as a four-dimensional Klein bottle whose construction from binary digits would make it malleable and flexible, deleting any gap and molding its border to perfectly fit surrounding subuniverses. This Klein bottle could possibly be a figure-8 Klein bottle because its similarities to a doughnut's shape describe an idea suggested by mathematics' "Poincare conjecture". The conjecture has implications for the universe's shape and says you cannot transform a doughnut shape into a sphere without ripping it. One interpretation follows: this can be viewed as subuniverses shaped like figure-8 Klein bottles gaining rips called wormholes when extended into the spherical spacetime that goes on forever (forming one infinite superuniverse, which is often called the multiverse when subuniverses - which share the same set of physics' laws - are incorrectly called parallel universes that are wrongly claimed to each possess different laws). Picture spacetime existing on the surface of this doughnut which has rips in it. These rips provide shortcuts between points in space and time - and belong in a 5th-dimensional hyperspace. The boundary where subuniverses meet could be called a cosmic string (cosmic strings would be analogous to the cracks that form when water freezes into ice, i.e. they would form as subuniverses cool from their respective Big Bangs).
Look at the illustration below of a loop (in this case, a Mobius strip). The bottom of it looks like part of a circle while the top has a twist. As a starting point, this particular orientation is referred to here as "spin 1" - it only looks the same if it's turned round a complete revolution of 360 degrees, like the Ace of Spades card pictured in "A Brief History of Time" by Stephen Hawking - Bantam Press, 1988. All the particles of matter in the universe possess spin ½ and need to be turned through two complete revolutions to look the same, just as you have to travel around a Mobius strip twice to reach the beginning (implying that electrons etc. have a structure originating with the mathematical Mobius). A photon has spin 1 and when it interacts with a graviton in a wave packet* (gravitons have spin 2 and look the same if turned round 180 degrees or half a revolution, like the double-headed Queen of Spades in "A Brief History of Time"), the particles' orientations can be the same i.e. they can both have their twist at the top.
* (A wave packet is a short "burst" or "envelope" of wave action that travels as a unit, and is interpreted by quantum mechanics as a probability wave describing the probability that a particle will have a given position and momentum). It acts like 2 hands coming together and catching a ball. Actually, photons are absorbed and emitted just as in laser cooling but instead of a laser beam slowing down atoms, the envelope slows (and traps) photons.
Mobius Loop
If oriented the same way, the electromagnetic and gravity waves forming the wave packets undergo constructive interference and reinforce to produce mass - a massive W+, W- or Z^0 (the carriers of the weak force) that must be turned 360 degrees to look identical i.e. they have spin 1. Slight imperfections in the way the Mobius loops fit together determine the precise nature of the binary-digit currents and therefore the exact mass or charge. If oriented dissimilarly, they undergo destructive interference and partly cancel (there's little or no twist now - both top and bottom of the new Mobius resemble parts of a circle) to create masslessness - a massless, chargeless gluon (carrier of the strong force) that is identical if turned 360 degrees and similarly possesses spin 1. Quarks - in this interpretation, the gravitational and electromagnetic interference caused by a particular positioning of a Mobius strip - combine into protons, mesons and neutrons but are never found in isolation and cannot be observed directly. (In this explanation, the strong and weak nuclear forces have no existence independently of gravitation and electromagnetism. Since EM is modified gravitation according to this article, it's perfectly OK to simply say "independently of gravitation".) They could simply be products of graviton-photon interaction: the strong nuclear force - which is 10^38 times gravity's strength - could be gravity "added to" electromagnetism, while the weak nuclear force - 10^25 times gravity's strength - could be gravity "subtracted from" electromagnetism [identical to the antigravitons of antigravity being added to electromagnetism]. The 2nd example assumes combining with 100 billion antigravitons, while the 1st assumes the presence of 100 gravitons per electromagnetic photon, and I believe these "assumptions" are justifiable by photon-graviton oscillation or transmutation (which makes electromagnetism "modified gravitation" and, nonlinearly, also makes gravitation "modified EM", i.e. gravity can be produced by the on-off pulses of EM known as 1's and 0's). An antiphoton would be formed by the fitting together of a force-carrying, spin -2 antigraviton with a spin 1 photon: (-2)+(+1) = -1. If it's correct that "antiparticles are identical in mass to matter particles but opposite in one key property," we would expect the antiparticle of a massless, chargeless photon to have a spin of negative 1.
It's essential to remember that this article is not saying electromagnetism and the nuclear forces do not exist. It's saying they don't exist independently of gravitation, which is the cause of all repelling and attracting. Like everything, gravity obeys fractal geometry. On the cosmic scale, it pushes planets toward stars (which the inertia of the planets causes them to orbit and be "attracted" to) and pushes galaxy clusters apart (it's called dark energy in this case, the continued production of gravity by BITS or BInary digiTS causing accelerating cosmic expansion).
SECOND - BELL'S THEOREM, QUANTUM ENTANGLEMENT AND NONLOCALITY
"Hidden variables" is an interpretation of quantum mechanics which is based on
belief that the theory is incomplete (Albert Einstein is the most famous proponent
of hidden variables) and it says there is an underlying reality with additional
information of the quantum world. I suggest this underlying reality is binary digits
generated in 5D hyperspace. These allow time travel by making it possible to
warp space (wormholes being one example of doing this) simultaneously
adding precision and flexibility to the elimination of distances; and the "fitting
together" of subuniverses to form a continuous superuniverse.
"Empty" space (according to Einstein, gravitation is the warping of this) seems to be made up of what is sometimes referred to as virtual particles by physicists since the concept of virtual particles is closely related to the idea of quantum fluctuations (a quantum fluctuation is the temporary change in the amount of energy at a point in space). The production of space by BITS (BInary digiTS) necessarily means there is a change in the amount of energy at a certain point, and the word "temporary" refers to what we know as motion or time (in a universe made of Binary digiTS, motion would be a succession of "frames"). Vacuum energy is the zero-point energy (lowest possible energy that a system may have) of all the fields (e.g. electromagnetic) in space, and is an underlying background energy that exists in space even when the space is devoid of matter. Binary digits might be substituted for the terms zero-point energy (since BITS are the ground state or lowest possible energy level) and vacuum energy (because BITS are the underlying background energy of empty space). Relativistically, space can't be mentioned without also mentioning time, whose warping can therefore also be viewed as gravitation (since "dark matter" is invisible but has gravitational influence, its existence could be achieved by ordinary matter travelling through time).
I call hidden variables (or virtual particles) binary digits generated in a 5th-dimensional hyperspace, which makes them - as explained in the next sentence - a non-local variety, in agreement with the limits imposed by Bell's theorem. (Bell's theorem is a mathematical proof discovered by John Bell in 1964 that says any hidden variables theory whose predictions agree with quantum mechanics must be non-local, i.e. it must allow an influence to pass between two systems or particles instantaneously, so that a cause at one place can produce an immediate effect at some distant location [not only in space, but also in time].) Comparing space-time to an infinite computer screen and the 5th dimension to its relatively small - in this case, so tiny as to be nonexistent in spacetime (at least to observation) - Central Processing Unit, the calculations in the "small" CPU would create and influence everything in infinite space and infinite time. This permits a "distant" event to instantly affect another (exemplified by the quantum entanglement of particles separated by light years) or permits effects to influence causes (exemplified by the retrocausality or backward causality promoted by Yakir Aharonov and others; see "Five Decades of Physics" by John G. Cramer, Professor of Physics, University of Washington - http://www.physics.ohio-state.edu/~lisa/CramerSymposium/talks/Cramer.pdf). This means quantum processes, in which effects and causes/distant events are not separated, wouldn't be confined to tiny subatomic scales but would also occur on the largest cosmic scales.
"The path to truth is thinner than a razor's edge."
"The fact that a great many people believe something is no guarantee of its truth."
- W. Somerset Maugham, The Razor's Edge
Hi Fred,
Along this line -- I think the following post of mine was deleted from the now-closed "Disproofs" thread, and I thought I had not saved it to an offline file. I just found that I did save it, though; I am posting it again because I think it makes a critical point about seeing only what one expects to see:
"Joy,
As confounding as it is, I think we should be charitable with Weatherall and the esteemed editors of the publication. They are only following a long-entrenched premise of quantum mechanics that I have come to recognize is exceedingly easy to accept and impossible to falsify:
"Before I say what the model is, let me motivate it a little. The experiment whose outcomes we are trying to reproduce involves spin measurements. Quantum mechanical spin does not have a direct classical analogue, but the spin operators satisfy the same algebraic relations as rotations in three dimensional Euclidean space. This is suggestive: one might take it to imply that the right way of thinking about spin is as some sort of rotation. Following this idea, one might reason that when an experimenter measures a particle to be spin up or spin down about a given vector, she is actually measuring the orientation of the body's rotation." Weatherall, p. 8
This justifies a later assumption: "A and B are explicitly deterministic, since given a value for the hidden variable, and given a choice of measurement vector, the observables take unique, determinate values. And they are local, in the sense that the value of the observable A is independent of Beta_a, and B is independent of Alpha_a.
"We assume an isotropic probability density over the space of hidden variables. This density is simply a map rho: Lambda --> [0, 1] that assigns equal probability to both cases."
I cited the paragraph immediately above in my earlier criticism of the Weatherall proposition. The unfalsifiable premise is the "equally likely" hypothesis. The rub is that if one imposes the equally likely hypothesis on a theory of continuous measurement functions, one cannot avoid getting what Weatherall calculates for the values of A and B (= 0): " ... on average, neither body is rotating about any direction. This result is consistent with the quantum mechanical expectation values and with experiment."
It certainly is. It is nothing more than question-begging, though. If the equally likely hypothesis holds over the space of hidden variables, then the result is unavoidable. That is not a scientific conclusion, in the sense of a correspondence between two independent variables; the variables are dependent on the assumption of a probabilistic measure space.
What has always frustrated the hell out of me is that even when your critics KNOW that your proposition is not based on an assumption of probability, they insert a probability space into their criticisms anyway. They don't consider it nonsense, because they mistake probability laws for physical laws. To use that classical example of Kepler orbits again: it's as if, until one measures the curve between t and t', no orbital trajectory exists. They neglect the fact that if this were true, we would be able to measure the trajectory in a direction *different* from the one it naturally lives in. Quantum mechanics assumes no continuous map t --> t'.
If one is immersed in analysis and topology, one knows that orientability is not a product of observer orientation; rather, orientation is that which the topology chooses. Yes, of course one can beg the question and assume " ... spin operators satisfy the same algebraic relations as rotations in three dimensional Euclidean space," but that doesn't really mean anything -- for if one simply accepts that a probabilistic measure at every t = 0 averages over T to zero (as it must, by the rules of arithmetic), one gets only what one assumes. Give me any purported proof of Bell's theorem (and I have examined a great many of them -- Herbert, Gill, Motl, Aaronson, et al.) and I can explicitly show that the proof is nonconstructive, by double negation. Why should one be bothered by that? -- well, if the world were only as it appears, then a proof of that proposition by double negation only proves the null value of double negation itself, and no new knowledge emerges.
Your framework is in the true sense of science as expressed by Bronowski -- a "search for unity in hidden likenesses" -- the unity of quantum curvature with the curvature at the beginning and end of the universe.
Frankly, I'm happy that Weatherall's critique has been elevated to "official" status. Most of the criticisms of your proposition, notably Moldoveanu's and Vongehr's, are simply childishly naive. Weatherall manages to actually grasp the problem, and his wrong direction in solving it promises to reopen the real debate between EPR and Bell. We might once more be able to speak of scientific falsifiability with the credibility it deserves.
All best,
Tom"
Hi Tom,
They say here in Hollywood that any publicity is good publicity. So in a way it is true that the real debate is only now getting started. It is really too bad, though, that the playing field for the debate is not level.
The notion that QM spin has no classical analogue is another reason Joy gets so much pushback. That notion, too, fits the quote from Maugham. A classical analogue is easy with parallelized 3-sphere topology.
Best,
Fred
Hi Fred and Tom,
Thanks for your comments. Here is a well known story you might enjoy:
For five years, from December 1903 to September 1908, two young bicycle mechanics from Ohio repeatedly claimed to have built a heavier-than-air flying machine and to have flown it successfully. But despite scores of public demonstrations, affidavits from local dignitaries, and photographs of themselves flying, the claims of Wilbur and Orville Wright were derided and dismissed as a hoax by Scientific American, the New York Herald, the US Army and most American scientists. Experts were so convinced, on purely scientific grounds, that heavier-than-air flight was impossible that they rejected the Wright brothers' claims without troubling to examine the evidence. It was not until President Theodore Roosevelt ordered public trials at Fort Myer in 1908 that the Wrights were able to prove their claim conclusively, and the Army and scientific press were compelled to accept that their flying machine was a reality. In one of those delightful quirks of fate that somehow haunt the history of science, only weeks before the Wrights first flew at Kitty Hawk, North Carolina, the professor of mathematics and astronomy at Johns Hopkins University, Simon Newcomb, had published an article in The Independent which showed scientifically that powered human flight was 'utterly impossible.' Powered flight, Newcomb believed, would require the discovery of some new unsuspected force in nature. Only a year earlier, Rear-Admiral George Melville, chief engineer of the US Navy, wrote in the North American Review that attempting to fly was 'absurd'. It was armed with such eminent authorities as these that Scientific American and the New York Herald scoffed at the Wrights as a pair of hoaxers.
In January 1905, more than a year after the Wrights had first flown, Scientific American carried an article ridiculing the 'alleged' flights that the Wrights claimed to have made. Without a trace of irony, the magazine gave as its main reason for not believing the Wrights the fact that the American press had failed to write anything about them: "If such sensational and tremendously important experiments are being conducted in a not very remote part of the country, on a subject in which almost everybody feels the most profound interest, is it possible to believe that the enterprising American reporter, who, it is well known, comes down the chimney when the door is locked in his face -- even if he has to scale a fifteen-storey skyscraper to do so -- would not have ascertained all about them and published them broadcast long ago?"
Hi Joy and Fred,
Now that's a familiar story! LOL!
Thanks. I was going to compose something about having to wait for the mathematical understanding of already-known theorems to catch up to Joy's model -- and this is a perfect example. Bernoulli's principle of fluid flow was known for about 150 years before the Wright brothers, which should have suggested to informed scientists that heavier-than-air flight was far from impossible.
A hundred years of topology research should tell informed scientists today that local realism does not necessarily depend on nonlocal events.
All best,
Tom
Just to add another fact to puncture the mythology:
The SA article of 1905 claims "If such sensational and tremendously important experiments are being conducted in a not very remote part of the country, on a subject in which almost everybody feels the most profound interest, is it possible to believe that the enterprising American reporter, who, it is well known, comes down the chimney when the door is locked in his face -- even if he has to scale a fifteen-storey skyscraper to do so -- would not have ascertained all about them and published them broadcast long ago?"
I can testify from first-hand experience as a journalist that this is not only possible to believe; it is precisely the fact that reporters are pack animals catering to narrow corporate interests that *ensures* the suppression of new ideas in the popular press.
It is the major reason I abandoned my successful broadcast journalism career, and I can further testify that the field has not only not improved since 1974, it has gotten far worse.
That's why I admire some courageous freelancers here like Zeeya Merali. They fan the dying ember of what hope I have left.
All best,
Tom
To try and get a new technical discussion started ...
Suppose we contrast the Christian correlation function C_ab with the CHSH correlation function.
The former is deterministic, admitting random input; the latter is probabilistic, producing random output.
Could these functions be dual to each other? -- if so, it vindicates Joy Christian's claim that entanglement is an illusion, without obviating the computability and mathematical validity of the Bell-Aspect experimental results. The discrete output of Bell-Aspect -- on the assumption of entanglement -- would be identical to the truncated output, from a continuously randomized input function (the ThrowDie function in Chantal Roth's programming terms), of Joy Christian's model.
See how the contrast makes duality possible? -- Bell-Aspect simply assumes entanglement, while the Christian framework follows Newton's prescription, "hypotheses non fingo." Entanglement is superfluous to deterministic, continuous, natural (and locally real) functions.
What makes Bell's theorem important to computer-controlled applications, particularly security, is the assured pseudo-random integrity of the output, based on the assumption of quantum entanglement, in which nonlocal results remain out of reach of computation by an adversary who has no knowledge of how the pseudo-random string was obtained.
Representative of such information technology is this 2010 paper by Stefano Pironio et al.:
"We quantify the Bell inequality violation through the CHSH correlation function [19]
I = SIGMAx,y (- 1)^xy [P(a = b|xy - P(a != b|xy]
where P(a = b|xy) is the probability that a = b given settings (x; y) and P(a != b|xy] is defined analogously. Systems that admit a local, hence deterministic [20], description satisfy I =< 2. Certain measurements performed on entangled states, however, can violate this inequality."
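For concreteness, here is a minimal sketch (assuming numpy) evaluating that CHSH quantity with the quantum singlet correlation E = -cos(alpha - beta); the settings below are the standard CHSH angles, an illustrative choice not taken from Pironio et al.:

    import numpy as np

    def E(alpha, beta):
        # Quantum singlet correlation for coplanar measurement axes
        return -np.cos(alpha - beta)

    a = {0: 0.0, 1: np.pi / 2}          # Alice's settings x = 0, 1
    b = {0: np.pi / 4, 1: -np.pi / 4}   # Bob's settings y = 0, 1

    # P(a=b|xy) - P(a!=b|xy) is just the correlation of the +/-1 outcomes,
    # so I = sum over x,y of (-1)^(x*y) * E(a_x, b_y).
    I = sum((-1) ** (x * y) * E(a[x], b[y]) for x in (0, 1) for y in (0, 1))
    print(abs(I), 2 * np.sqrt(2))  # ~2.828 > 2: the CHSH bound is violated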
A measurement on an assumed entangled state, however, is an average of trials on a probabilistic space. A deterministic measure on a complete -- i.e., classically continuous -- space is an arbitrary choice of boundary conditions on an arbitrary interval. In other words, does nonlinear input to a continuous function guarantee precisely correlated linear output?
To answer the above question in the affirmative, we need an arithmetically continuous model computable from a discrete initial condition. I've been working on a proof, with the strategy outlined here and here.
Dual quantum correlation functions that generate identical results -- one function assuming non-observable quantum entanglement; the other, a correlation measurement continuous from the initial condition -- would tell us that entanglement is an artifact of an artificial probability space and not physically real.
What justifies the probability measure space? -- only the assumption that fundamental reality is probabilistic. This assumption is made apparent in Pironio et al.'s conclusion:
"Stepping back to the more conceptual level, note that Eq. (3) relates the random character of quantum theory to the violation of Bell inequalities. This bound can be modified for a situation where we assume only the no-signalling principle instead of the entire quantum formalism (see Figure 2 and 3 and Appedenix A.3). Such a bound lays the basis for addressing in a statistically significant way one of the most fundamental questions raised by quantum theory: whether our world is compatible with determinism (but then necessarily signalling), or inherently random (if signalling is deemed impossible)."
The no-signalling condition is a red herring; Joy's framework is completely compatible with the no-signalling condition -- it couldn't *not* be, in that it is fully relativistic. Non-relativistic quantum theory gets "inherently random" nonlocal results on the assumption that locality forbids signalling -- which only assumes what it means to prove. Joy's framework generates manifestly local quantum correlations that still forbid signalling, a la special relativity.
Tom