Declan

My last post must have fallen foul of special symbols in the text, so I am trying again.

Have you seen an R language computer simulation of Pearle's (1970) model for the EPR-Bohm correlations by Richard Gill in July 2015 at http://rpubs.com/gill1109/pearle2 ?

The third graph shows that data are less likely to be gathered at theta = 90 and 270 degrees.

The model uses a particular function to decide on whether data are excluded and this is the relevant part of the R code:

U is uniformly distributed

s is given by (2/sqrt(3*U+1)) - 1

where Pearle's "r" is the arc cosine of "s", divided by pi/2.
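
In case special symbols get garbled again, here is that exclusion function spelled out as a short Python sketch of my reading of it (the variable names are mine; this is not Gill's R code):

import numpy as np

U = np.random.default_rng(1).random(100_000)  # U uniformly distributed on [0, 1]
s = 2 / np.sqrt(3 * U + 1) - 1                # s = (2/sqrt(3*U+1)) - 1
r = np.arccos(s) / (np.pi / 2)                # Pearle's "r": arc cosine of s, divided by pi/2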

This seems to be similar to your method of excluding data, but I am not sure how close it is. Does your model give a curve which asymptotically approaches the cosine curve as the number of pairs of particles emitted increases? The points plotted in your Figure 2 do not look like an exact match, but could that be caused by a small sample size? If your function is different, which gives the better fit to the cosine curve?

Best wishes

Austin

Dear Austin,

Thank you for bringing that to my attention; I had not seen it before and it is great supporting evidence. The exact shape of the correlation curve can be tweaked by changing the algorithm for non-detects, and there may be different algorithms for different types of detectors. In my last graph the Steering Inequality is modeled, so some of the non-detects (Alice's) are admitted as valid results. This has the effect of changing the correlation curve away from a perfect cosine, primarily because perfect correlation and anti-correlation cannot be achieved at 0 and 180 degrees in such a model (as non-detects are included in the data).

Regards,

Declan

Dear Declan

The paper that I mentioned by Gill shows the possibility of obtaining an asymptotically exact cosine curve by removing data according to a formula. A few years ago I wrote a Visual Basic program (mimicking Chantal Roth's original Java program) to obtain the cosine curve, but it does remove a lot of data (about 25%, from memory?).

I am quite suspicious that you have an asymmetry between Alice and Bob, in Bob's favour, with respect to missing data: should there not be symmetry? But I confess that I am unclear as to how that asymmetry arose in the missing data. There is also the issue of how experienced experimentalists can keep on overlooking, if that is what they are still doing, remaining sources of 'missing data' after taking such pains to exclude all possible sources.

So does nature actually give a cosine curve or does it really give a sawtooth curve which morphs into a cosine curve when there are missing data? If the latter, I do not know enough to see what damage that would do to QM. I am not sure if 'entanglement' can be excised from QM without harming the rest of QM?

There is also a third option (Joy Christian's option) which is that space is not as flat as it may seem. There are 1500 posts at

http://retractionwatch.com/2016/09/30/physicist-threatens-legal-action-after-journal-mysteriously-removed-study/

discussing this option. That discussion closed after exhausting the time available to the chair/referee, rather than coming to an agreement. The maths there uses Clifford algebra.

There is an interesting brief essay in this contest by Richard Conn Henry whose main point, as I interpret it, is that we do not yet have the correct topology of 4-space. Having the correct topology may be relevant to the third option above?

It is one thing to remove simulated data from a program after generating it in assumed 'normal' space, but the question arises whether those data were removed illegitimately or whether they never really could have existed in a space of the correct topology.

My own essay in this contest suggests (page 7) that there should be no random outcomes for Alice and Bob, as everything seems pre-determined. That would imply hidden variables, and hence enforce a sawtooth curve and be incompatible with QM, assuming space is normal or flat or R3. But with a modified topology of space there is the possibility of rescuing the cosine curve, though I am not sure of the effect on QM as 'entanglement' would cease to exist. The final paragraph on page 8 of my essay implies that spin may be a vital component of emergent space. And I do not know what effect that quantum spin would have on the topology of space.

Best wishes

Austin

Dear Austin,

That is how the latest loophole-free experiments using a Steering Inequality are formulated. Bob is considered the trusted partner and Alice the untrustworthy partner whose non-detects form part of the data. See the paper in Ref 2 of my essay for details on this. When I first encountered the Steering Inequality experiments I was very doubtful about the asymmetry too, but the QM theorists have devised this technique to, supposedly, close the detection loophole by including non-detects in the statistics.

If all data is rejected past a cutoff threshold angle then the curve becomes a sawtooth. If a random element is introduced then the sawtooth is rounded off to be more like a cosine. The degree of rounding can be controlled by changing the random element/algorithm.
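
As a generic illustration of how the rejection rule reshapes the curve (this is only a toy hidden-variable sketch in Python, not the actual algorithm from my model): without any rejection the bare sign model gives the sawtooth, and a per-trial random rejection threshold rounds the corners off.

import numpy as np

rng = np.random.default_rng(0)

def correlation(delta_deg, reject=False, n=200_000):
    """Toy hidden-variable model (illustration only): outcomes are the signs of
    the projections of a random polarization angle onto the two detector settings.
    With reject=True, a trial is kept only if both projections exceed a per-trial
    random threshold (a non-detect rule with a random element)."""
    lam = rng.uniform(0, 2 * np.pi, n)            # hidden polarization angle
    pa = np.cos(lam)                              # projection onto Alice's setting (0 degrees)
    pb = np.cos(lam - np.radians(delta_deg))      # projection onto Bob's setting (delta degrees)
    keep = np.ones(n, dtype=bool)
    if reject:
        s = rng.uniform(0, 1, n)                  # random rejection threshold for each trial
        keep = (np.abs(pa) > s) & (np.abs(pb) > s)
    return np.mean(np.sign(pa[keep]) * np.sign(pb[keep]))

# No rejection: a sawtooth (triangle) correlation curve.
# Random rejection threshold: the sharp corners are rounded off, more like a cosine
# (the exact shape depends on the chosen rejection rule).
for d in range(0, 181, 30):
    print(d, round(correlation(d), 3), round(correlation(d, reject=True), 3))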

Even with the best experimental techniques I think the detection efficiency is still fairly poor (something like 60-79%, but I could be wrong). There is no way that I know of to identify double non-detects, and no symmetrical Steering Inequality that I know of.

My essay is aimed at showing that, despite the QM theorists' claims, the detection loophole has not been closed and the experimental results can still be explained Classically.

Regards,

Declan

Dear Declan

Thanks for the pointers. I will need to read more about trusted and untrustworthy partners to overcome my suspicions. It is reassuring that you also were doubtful about the asymmetry at first.

Best wishes

Austin

Dear Declan! Many thanks for supporting my work by rating it up; on Jan 20 I already rated yours with 10. See my post above. I can learn a lot of things at FQXi by reading the key essays, which are connected to my own research on the logic of exponentiality and the fundamentals of science. Best: Stephen

    Dear Stephen,

    Ok, many thanks - I was wondering where that 10 came from...

    Best Regards,

    Declan

    Dear Declan

    I also have been investigating classically produced quantum correlations, in the context of Rob McEachern's one bit hypothesis. The simplified model I came up with had a similar amount of noise obscuring the cosine, but when the number of trials was increased to one billion, a residual systematic error of about 1 percent was discovered. This was verified with a purely probabilistic treatment instead of random trials. (see vixra.org/abs/1705.0377)

    I have a vague interest in simulating quantum computation, and it seems to me that a systematic error would likely limit the number of qubits. Can you increase the number of trials to see if there might be some unwanted correlation? I think the real problem is to find a method that actually converges to the cosine.

    Richard Gill found such a method in the work of Philip Pearle, as pointed out by Austin. I recall following (or trying to follow) the discussion mainly between Gill and Joy Christian, and then forgot about it until Austin's post. (see another of Gill's papers arxiv.org/abs/1505.04431) Having some experience now with the problem, I see that Gill does a nice job of explaining the procedure, but I find R code can be quite obscure. Here are some basic details.

    The method requires three(!) random numbers R1,R2,R3 generated uniformly over the interval 0-1 for each trial. Gill stores a large set of transformed random numbers, z,x,s, to be reused in trials for any combination of settings by Alice and Bob.

    The first two random numbers are transformed to cover a spherical shell, and then projected onto a plane running through the center, forming a disc. Points on the plane are taken as 2d vectors (z,x), so the distribution of their magnitude is biased towards the edge of the disc, where it is most dense.

    The third random number sets the threshold, s, for detection, with another carefully crafted distribution.

    z = 2 R1 - 1

    x = sqrt(1 - z^2) cos(2 pi R2)

    s = [ 2 / sqrt(3 R3 + 1) ] - 1

    A unit vector (az,ax) in the z-x plane sets Alice's angle, with (bz,bx) for Bob. Projections are calculated as follows

    pa = (z,x) . (az,ax) = z az + x ax

    pb = (z,x) . (bz,bx) = z bz + x bx

    A detection occurs when the absolute value of both pa and pb is greater than s. The correlation for a detected event is given by the product of the signs of their projections

    C = sign(pa) sign(pb)

    The average correlation converges to the cosine expectation as the number of trials is increased. Unfortunately, it is not clear to me how to measure noise for McEachern's hypothesis, but that is a problem for another day.
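
    For concreteness, here is how those steps might look as a short Python sketch (my own translation of the procedure described above, not Gill's R code; the seed, batch size and test angles are only illustrative):

    import numpy as np

    rng = np.random.default_rng(42)

    def pearle_trial_batch(alpha, beta, n=1_000_000):
        """One batch of trials of the Pearle detection model, with Alice's and
        Bob's settings alpha, beta given as angles (radians) in the z-x plane."""
        R1, R2, R3 = rng.random((3, n))                  # three uniform numbers per trial
        z = 2 * R1 - 1                                   # z = 2 R1 - 1
        x = np.sqrt(1 - z**2) * np.cos(2 * np.pi * R2)   # x = sqrt(1 - z^2) cos(2 pi R2)
        s = 2 / np.sqrt(3 * R3 + 1) - 1                  # s = [2 / sqrt(3 R3 + 1)] - 1
        pa = z * np.cos(alpha) + x * np.sin(alpha)       # pa = z az + x ax
        pb = z * np.cos(beta) + x * np.sin(beta)         # pb = z bz + x bx
        detected = (np.abs(pa) > s) & (np.abs(pb) > s)   # keep only double detections
        corr = np.mean(np.sign(pa[detected]) * np.sign(pb[detected]))
        return corr, detected.mean()                     # correlation and detection rate

    corr, rate = pearle_trial_batch(0.0, np.radians(30))
    print(corr, np.cos(np.radians(30)), rate)            # correlation should settle near cos(30 deg) as n grows

    The detection rate printed alongside shows how much data the threshold removes at these settings.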

    Thanks for entering an essay on the technique of quantum steering, which I will have to experiment with.

    Cheers,

    Colin

      Dear Colin,

      Ok thanks for the comment. Austin brought the Pearle R code to my attention a few days ago (see his comment and my reply above). That is good supporting evidence for my essay.

      My model does a very similar thing, although my algorithm for non-detects is based on cosine squared (Born's rule - see past and current essays by Peter Jackson on the origin of cos squared dependency).
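
      To give a flavour of what a cosine-squared rule means (this Python fragment is only an illustration of the general idea, not the actual algorithm from my model):

      import numpy as np

      rng = np.random.default_rng(7)

      def is_detected(hidden_angle, detector_angle):
          """Toy rule: the detection probability follows a cos^2 (Malus's law /
          Born rule) dependence on the angle between the particle's hidden
          polarization and the detector axis; otherwise the trial is a non-detect."""
          return rng.random() < np.cos(hidden_angle - detector_angle) ** 2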

      It's good to have some like-minded people in the contest. We should support each other's work with a decent rating.

      Regards,

      Declan

      Dear Declan,

      I read your essay, but due to my background in philosophy I have no means to evaluate - nor even to fully comprehend - your physics hypothesis. As far as I understand, it is quite revolutionary, if true - but sadly the history of science is full of thunderous silences.

      Could you please sum up your discovery in non-specialist terms? If it's possible, of course.

      Bests,

      Francesco D'Isa

        Dear Francesco,

        Thank you for your kind comment.

        Ok, I will try:

        Most of Physics makes sense in terms of Classical Physics, which people can understand and which is what is termed "Local and Real": we can 'see' what is going on and can build a model of it.

        Quantum Mechanics, on the other hand, has some strange counterintuitive aspects, with entanglement being the main one: it appears to be "spooky action at a distance", as Einstein famously called it. It asserts that particles can cooperate across vast distances instantaneously, as if by magic.

        Experiments (usually termed EPR experiments, based on the original thought experiment devised by Einstein, Podolsky and Rosen) have been done that appear to confirm that QM entanglement is actually occurring; however, due to the practical difficulties in performing the experiment, there are possible ways that the correlation found in the experiment can be explained Classically. These are called 'loopholes'. Thus entanglement may not be occurring at all.

        In order to resolve this situation, supposedly 'loophole-free' experiments have recently been devised to settle the question once and for all. In these experiments an attempt is made to close one of the loopholes, termed the detection loophole, by using a Steering Inequality rather than the usual Bell test or CHSH Inequality used to determine whether entanglement is occurring in the original EPR experiments. The Steering Inequality includes non-detects in the statistical calculations so that detector inefficiency cannot be used as a means to explain the experimental result Classically.

        My work shows that this loophole has not been closed, and that even when the Steering Inequality is used, the experimental results can be explained Classically. As a result, the main problem with modeling the world in a 'Local and Real' way is overcome, and entanglement is shown not to be occurring. This may then allow the other aspects of QM to be unified with Classical Physics and Relativity.

        I hope this answers your question?

        Best Regards,

        Declan

        Dear Declan,

        thank you very much. It seems a very important discovery! I will rate it high to raise the curiosity of the specialists.

        Best regards and wish you luck!

        Francesco D'Isa

        Dear Francesco,

        Thank you very much, it would be good if it could get looked at by the establishment.

        Do you have an essay too?

        Best Regards,

        Declan

        Dear Declan,

        you are welcome, it seems like an important discovery that needs some more attention and review!

        Yes, I have one about absolute relativism; maybe you would be interested in it, as it takes a more philosophical point of view on the matter.

        bests,

        Francesco

        Excuse me, the link doesn't work. Anyway it is here:

        https://fqxi.org/community/forum/topic/3044

        Dear Wilhelmus,

        Well, that is not a problem if you realize and accept that there is no such thing as point particles, but instead every particle is a 3D wave structure that extends to infinity. So by necessity the particle travels through both slits and interferes with itself.

        Regards,

        Declan

        Dear Eckard,

        Ok I will try to find it when I get a chance...

        Regards,

        Declan

        Dear Declan

        Please note that I have posted a reply to you on my own thread. Some of it is better placed on your thread so here it is ...

        I still have a couple of points about your contest paper which I will likely post in your thread. But I have not got to grips with the Steering Inequality yet. Is the Steering Inequality a theoretical device (the 2011 paper) which is not part of the recent experimental results (the 2015 paper)? The 2015 paper did not mention 'steering' AFAIK, and anyway they only had 245 pairs of outcomes. And surely they did not contaminate an experimental finding with results other than +1 or -1? I presume that the Steering Inequality implies that some genuine pairs went undetected, and that those data would have reduced the absolute size of the correlation if they had been detected in the 2015 experiment? Or am I missing something from the 2015 experimental design?

        Best

        Austin

          Dear Austin,

          When you talk about the 2015 paper, do you mean "Realization of mutually unbiased bases for a qubit with only one wave plate: theory and experiment" that I referenced in my essay?

          If so, I included this reference as it provides information about the detector angles for MUBs where quarter and half wave plates are used (as is the case in the Ref 2 paper).

          In that paper there are -1, +1 and 0 results recorded, and a Steering Inequality is used to analyze the results.

          Regards,

          Declan

          Hi Declan Andrew Traill,

          Wonderful work: "Quantum Mechanics claims that particles can become entangled such that there is a correlation in the detected results from EPR type experiments that cannot be explained by Classical Physics. This paper shows that the result can be fully explained by Classical Physics, and that the correlation curve for different angles between the two detectors can be reproduced when modeled this way." Dear Declan Andrew Traill, you are clearing up a fundamental misunderstanding - a very nice idea! I highly appreciate your essay and hope you will please spend some of your valuable time on the Dynamic Universe Model also and give some of your valuable and esteemed guidance.

          Some of the main foundational points of the Dynamic Universe Model:

          -No Isotropy

          -No Homogeneity

          -No Space-time continuum

          -Non-uniform density of matter, universe is lumpy

          -No singularities

          -No collisions between bodies

          -No blackholes

          -No wormholes

          -No Bigbang

          -No repulsion between distant Galaxies

          -Non-empty Universe

          -No imaginary or negative time axis

          -No imaginary X, Y, Z axes

          -No differential and Integral Equations mathematically

          -No General Relativity and Model does not reduce to GR on any condition

          -No Creation of matter like Bigbang or steady-state models

          -No many mini Bigbangs

          -No Missing Mass / Dark matter

          -No Dark energy

          -No Bigbang generated CMB detected

          -No Multi-verses

          Here:

          -Accelerating Expanding universe with 33% Blue shifted Galaxies

          -Newton's Gravitation law works everywhere in the same way

          -All bodies dynamically moving

          -All bodies move in dynamic Equilibrium

          -Closed universe model no light or bodies will go away from universe

          -Single Universe no baby universes

          -Time is linear as observed on earth, moving forward only

          -Independent x, y, z coordinate axes and Time axis; no interdependencies between axes

          -UGF (Universal Gravitational Force) calculated on every point-mass

          -Tensors (Linear) used for giving UNIQUE solutions for each time step

          -Uses everyday physics as achievable by engineering

          -21000 linear equations are used in an Excel sheet

          -Computerized calculations uses 16 decimal digit accuracy

          -Data mining and data warehousing techniques are used for data extraction from large amounts of data.

          - Many predictions of the Dynamic Universe Model came true. Have a look at

          http://vaksdynamicuniversemodel.blogspot.in/p/blog-page_15.html

          I request you to please have a look at my essay also, and give some of your esteemed criticism.

          Dynamic Universe Model says that the energy in the form of electromagnetic radiation passing grazingly near any gravitating mass changes in frequency and finally will convert into neutrinos (mass). We all know that there is no experiment or quest in this direction. Energy conversion happens from mass to energy with the famous E=mc2; the other side of this conversion was not thought of. This is a new fundamental prediction by the Dynamic Universe Model, a foundational quest in the area of Astrophysics and Cosmology.

          In accordance with the Dynamic Universe Model, a frequency shift happens on both sides of the spectrum when any electromagnetic radiation passes grazingly near a gravitating mass. With this new verification, we will open a new frontier that will unlock a way for the formation of the basis for continual Nucleosynthesis (continuous formation of elements) in our Universe. The amount of frequency shift will depend on the relative velocity difference. All the papers of the author can be downloaded from "http://vaksdynamicuniversemodel.blogspot.in/"

          I request you to please post your reply in my essay also, so that I can get an intimation that you replied.

          Best

          =snp