Essay Abstract

THE UNDECIDABILITY OF THE EPR PARADOX IS RESOLVED -- B. Gilbert

The quantum mechanics (QM) community seems to have a public relations machine to be envied; their hubris and denigrating language, complete with intellectual bullying tactics, are seen to great effect in physics forums and papers. J. S. Bell, although sceptical of QM, and particularly of the Einstein-Podolsky-Rosen (EPR) experimental results, still accepted the photon, and therefore never bothered to create a Bell inequality to test Maxwellian wave models, only "classical billiard ball models". I wish to propose a new loophole, which I will call the Maxwellian wave model loophole, or "Max's loophole" (ML). Max is very appropriate, because Max Planck subscribed vehemently to the Maxwellian wave model, much to the chagrin of Albert Einstein. Max proposed his quantization to apply only to massive particles, atoms and electron energies, with the energy propagated via Maxwellian waves. Albert, on the other hand, insisted that the energy must propagate in little packets, to respect the laws of energy conservation. Max proposed other solutions to the energy conservation issue; these included loading theory (Eric) and/or an energy sea (Dirac, SED, Santos). But Albert wasn't having any of it, and published his version without consulting Max, even though they had been working as a team up until that point. Max was very upset, and their relationship never recovered. Zero-point energy (ZPE), the zero-point field (ZPF) and quantum noise are modern contenders for providing the missing energy that Albert was justifiably concerned about. This loophole is for QM to attack, as I propose classic folk should be the noisy aggressors, instead of humble, self-deprecating defenders.

Author Bio

Retired from Telecom Research Laboratories. Ex-Fellow of the HPS department of the University of Melbourne. Co-authored 10 papers in international peer-reviewed physics journals.


Dear Mr. Gilbert:

I am another quantum skeptic, regarded as a heretic.

I agree with you that a photon is properly an electromagnetic wave packet of finite extent, which is never the model taken for comparison with quantum predictions.

I have further argued that a single photon is a circularly polarized wave packet, carrying quantized spin. So is an electron.

If you are interested, this was addressed in my previous FQXi essay, "Fundamental Waves and the Reunification of Physics", and again in my new FQXi essay, "The Uncertain Future of Physics and Computing".

Alan Kadin

    Dear Barry,

    It's good to see that you finally decided to enter an essay in this competition.

    Your essay is short, but makes a very strong point on EPR loopholes.

    I downloaded "The Ghost in the Atom" by P. Davies and J. Brown. It is a good read. I noted in the foreword that the editors stated "A final thought and a note of caution; when we commissioned the interviews, several of our contributors (who shall remain nameless!) expressed the view that there is now no real doubt over how quantum theory should be interpreted. At the very least, we hope this book will show that there is little justification for such complacency."

    They were prescient, for as we now know there are serious doubts as to how to interpret a statistical theory as a framework for physics.

    Check out my essay "Wandering towards a 'Theory of Everything' and how I was stopped from achieving my goal by Nature", where I make observations about wave/particle duality, photons and the nature of time.

    I hope others will read, challenge and discuss your most interesting ideas.

    Good luck,

    Lockie Cresswell

      Dear Dr. Gilbert

      I know what it is like to be a straw man. All believers in the aether have felt like that since Einstein's SR. I think secretly even Albert felt like a strawman himself, after he was pigeonholed by Shankland.

      If I have read your essay correctly, it would seem that the graph you referenced from Wikipedia needs to have its labelling reversed, with the classical viewpoint represented by the sinusoid and the quantum viewpoint by the straight-line sections. How are the QM group going to wriggle out of that! Max's loophole would have dear old Max Planck laughing in his grave. I'm sure he would approve.

      Good luck in your essay,

      Marts

      Wave models have been challenged:

      see

      https://www.scirp.org/journal/paperinformation.aspx?paperid=93056

      There are other experiments rejecting wave models.

        Dear Mr. Hodge

        Thank you for reading my paper and your comments.

        I read your paper on the link you provided and it appeared well thought out and reasonable. However, I must point out a couple of things:

        Firstly, the laser pointer, if typical of most laser pointers, would have a far field at 0.5 to 1 km, and the slit-width dimensions will create their own far fields at a similar distance. The far field, as you know, is related to the wavelength and the dimensions of the source; you can calculate it yourself. It is also defined as the distance at which the E and H fields are orthogonal and in phase, therefore your screen for this experiment would be valid only at this distance. Also, as you would know, between your source slit and the screen a Fourier transform takes place at a distance of 500 metres. In between those distances, the pattern progresses from blurry shadows of your slits and nails etc. to a full-blown interference pattern at 500 metres; what you get at 5 or 6 metres is anybody's guess, not to mention that the nail itself will diffract and try to produce its own pattern as well. The experiment would need overhauling before it could be seen as a refutation of Maxwell's wave theory, which seems to work well in the 5G network. It's generally a good idea to know something about the theory you are about to refute.
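        For anyone who wants to check such far-field claims themselves, the usual rule of thumb is the Fraunhofer distance d = 2D^2 / lambda, where D is the largest dimension of the source aperture. The sketch below uses illustrative values only (a 650 nm pointer with an assumed 3 mm exit aperture; the actual dimensions of Mr. Hodge's setup are not given in the post):

```python
def fraunhofer_distance(aperture_m: float, wavelength_m: float) -> float:
    """Rule-of-thumb far-field boundary: d = 2 * D^2 / lambda."""
    return 2.0 * aperture_m**2 / wavelength_m

# Illustrative values: red laser pointer (650 nm), 3 mm exit aperture.
d = fraunhofer_distance(3e-3, 650e-9)
print(f"far field begins at roughly {d:.0f} m")  # ~28 m for these values
```

Because the boundary scales as D squared, a modestly larger effective aperture (a beam expanded over a slit assembly, say) pushes the far field out to hundreds of metres quickly.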

        Wave theory, although challenged many times, still serves us well in the real-world stealth and communication industries. A. Zeilinger, the eminent physicist of EPR fame, claimed to get interference between buckyballs; he made the same mistake and operated in the near field. I managed to co-author a paper in a peer-reviewed physics journal to publish our critique of his work:

        On the interference of Fullerenes and other Massive Particles

        Sulcs, Gilbert, Osborne

        Foundations of Physics, Vol 32, No. 8, August 2002

        Cheers

        Barry

        Dear Mr. Kadin:

        It is a long time between heretics, we've been on the endangered species list for quite some time now.

        I am rather flattered that you bothered to read my essay considering your Bio compared to mine.

        I thoroughly enjoyed your essay and am in total agreement with most of the subjects you addressed.

        I agree that the quantum computer (QC) is quantum mechanics' (QM) biggest mistake.

        By trying to actually create something useful and practical using pre-renaissance alchemy and voodoo, they will fail spectacularly. They would be better off sticking to useless EPR tests at ever greater distances, with ever more bizarre random number generators. Methinks they protest too much; they must have inner doubts, to spend 50 years proving the same point. Another 60 tests and their score will be 100 to 2 in favour of Bohr. If Popper is to be believed, the first experiment to disprove Bohr's theory (the second experiment ever performed) is all you need. The QC has two strikes against it. Firstly, noise (decoherence), even at 20 mK; then they have zero-point or quantum noise to contend with. Massive parallelism may save the day: instead of 54 qubits, try 54,000 qubits, 1000 sets of 54 in parallel (broadly equivalent to optical stacking in astrophotography to improve the signal-to-noise ratio). Secondly, you have that spooky entanglement, which has never been demonstrated to wave mechanics (WM, a new term). The best they can hope for is a damn fast analogue computer, because that is what a failed QC reverts to in the absence of entanglement.

        AI is set back by the digital computer and its heritage: primitive programming languages that are all virtually the same, serial state machines with the odd branch or loop, totally dependent on the programmer's logic. Computers should rely on about half programmer's logic and the rest environmental learning, involving "heuristics": guided and supervised trial and error, just like humans. They might also benefit from some analogue computation. Digital computers are obscenely accurate at 64 bits; this slows them down, although their figure of merit may be good. If we define the figure of merit as the product of speed and accuracy, how does that compare with an analogue computer? Consider 64 paralleled 4-bit digital computers; this concept may approach the human brain's figure of merit (fast but not very accurate). Also consider letting the least significant bit be randomised; this means the computer will never give the same answer repeatedly, similar to us. The programming can also introduce sophisticated weighted randomness, with weighted time constants. In other words, it starts off dumb and gets better, then starts to forget; you can archive the trivial stuff instead of forgetting it. This equates to long- and short-term memory: trivia slowly fades, important stuff stays put.
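        The randomised least-significant-bit idea can be sketched in a few lines. Everything here (the 4-bit range, the function name, the input scaling) is an illustrative assumption, not a reference design:

```python
import random

def noisy_4bit(x: float, lo: float = 0.0, hi: float = 1.0) -> int:
    """Quantize x in [lo, hi] to a 4-bit code, then randomise the LSB."""
    q = round((x - lo) / (hi - lo) * 15)  # 4-bit code, 0..15
    q = max(0, min(15, q))
    return q ^ random.getrandbits(1)      # flip the least significant bit at random

# Repeated conversions of the same input now differ, but only in the LSB:
codes = {noisy_4bit(0.5) for _ in range(200)}
print(sorted(codes))
```

The answer is never exactly repeatable, yet the error is bounded at one least-significant bit, which is the trade the comment above is describing.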

        The analogue computers some of us remember may have been implemented with operational amplifiers etc., but other examples are wind tunnels, wave tanks, pulleys and strings, small models and so on. NASA uses many and varied analogue computers; they are very fast, with a good figure of merit.

        Spatial filtering: images from deep space can be cleaned up almost instantly with spatial filtering using lenses. Your camera is an exceedingly fast Fourier transformer, turning what appears to be stray random light into an image at the focal plane, as does your eye. It performs at the speed of light, uses no power, and is limited only by the Rayleigh limit. It is not the conventional "fast" Fourier transform, but a continuous, complex, slow transform, executed with massive parallelism at light speed. Who would have thunk it?

        The massive parallelism is approximately the area of the lens measured in wavelengths across the lens; that is the number of parallel processors, about 10 billion. With each processor completing a pixel's transform in about 1 ns, that gives on the order of 10^19 words per second.
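        The arithmetic behind that estimate can be checked explicitly. The lens diameter and wavelength below are illustrative assumptions chosen to reproduce the "about 10 billion" processor count; with one transform per wavelength-sized cell per nanosecond, the product comes out near 10^19 words per second:

```python
import math

D = 50e-3     # lens diameter (m) - illustrative assumption
lam = 500e-9  # wavelength (m)    - illustrative assumption

# "Area of the lens in wavelengths": number of lambda^2 cells in the aperture.
n_cells = math.pi * (D / 2) ** 2 / lam**2
throughput = n_cells * 1e9  # one transform per cell per 1 ns

print(f"parallel cells: {n_cells:.1e}")         # ~7.9e9, about 10 billion
print(f"throughput: {throughput:.1e} words/s")  # ~7.9e18, order 10^19
```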

        You philosophers out there don't have all the cool magic stuff

        I think the lens beats the pants off a digital computer and may be potentially similar to the D-Wave computer that is sold around the world as a QC. The D-Wave has not passed a Bell test for entanglement, but if it conceptually approaches the speed of our camera lens, then it's no wonder they think it's a QC. A similar 54-qubit, 20 mK computer apparently beat the socks off an IBM supercomputer, and I've been getting bruised and battered something shocking over it by my own radio frequency (RF) colleagues, as well as by dyed-in-the-wool QM's on some other forums I'm on.

        I only hope the FQXi community accepts my explanation and allows me to heal a bit.

        I'd like to elaborate a bit about my RF colleagues. You see, they don't have a clue about QM by and large, but they do know quite a lot about Maxwell from their EM textbooks. They are professional and expect all other professionals to be as strait-laced as themselves, so when they read in lightweight physics journals the exaggerated, inflated claims, along with spiffy graphics and other added value from Hollywood/Disney-style articles, they're totally intoxicated. Me too; I've had to stop reading the stuff lest I fall under the spell. I subscribe to some of these journals, both for hard copies and for electronic access to archives, 175 years of archives in one case (hope that's not a clue). I love the archives, but most of the recent hard copies are still in their protective wrappers; you see, they scare me with their piffle on physics. The other disciplines are OK, so I've been known to sneak a peek at articles on biology, geology, history, palaeontology and other interesting ologies.

        QM's love to claim anything that's successful: lasers, transistors, smart phones etc.

        It's pretty simple: you just rewrite history.

        Maxwell's equations are a work in progress and underpin all modern technology: optical fibres, 5G, smart phones (250 billion field-effect transistors (FETs)). Shockley, a QM, failed to produce a working FET after 10 years of effort. He did manage to share the Nobel prize with Bardeen and Brattain, even though he played no part in their purely accidental discovery of the point-contact transistor, which they knew nothing about beforehand. After all, their brief was to replace the vacuum tube with a solid-state version, a FET. That said, Shockley did manage to improve the point-contact transistor into a junction transistor, a much superior device that served us for many years. The modern FET is a simpler device that operates similarly to the old vacuum-tube triode, except that a semiconductor replaces the vacuum. The FET concept was patented in 1926 by Lilienfeld and finally developed in 1959 by Dawon Kahng and Martin Atalla. What were the QM's doing in all those 33 years to address the simple problem that plagued the experimenters? Nothing!

        The problem was that an electric field could not penetrate their pure silicon semiconductor. QM's weird wave-function model failed spectacularly. (I love using this phrase; I've sneakily borrowed it from the QM lexicon.)

        The problem was solved by simply oxidising the surface of the chosen semiconductor; it could even have been discovered accidentally. QM owes more to the transistor and the persistent experimenters than the transistor owes to QM. The laser is a similar story; I'll save that for another time.

        Even today the laser is erroneously thought to produce photons that obey Poisson statistics. This is demonstrably wrong: if you inspect an inverse Fourier transform of a narrow-linewidth laser, it's obviously sub-Poissonian. The error comes about in the measurement technique: the huge attenuation required to reduce a high-intensity laser beam to single photons introduces the Poisson distribution, and the thermal properties of the attenuator modulate the beam.
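        It is at least a standard result that heavy attenuation drives any photon-number distribution toward Poisson: random (binomial) deletion with transmission eta maps a Fano factor F to 1 - eta * (1 - F), so F tends to 1 as eta shrinks, whatever the source statistics. A minimal Monte Carlo sketch (the fixed-photon-number source is an idealisation for illustration, not a model of any particular laser):

```python
import random

def fano(samples):
    """Fano factor: variance / mean of a list of photon counts."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return var / mean

def attenuate(n_photons: int, eta: float) -> int:
    """Each photon independently survives with probability eta (binomial thinning)."""
    return sum(random.random() < eta for _ in range(n_photons))

random.seed(1)
N = 100    # idealised sub-Poissonian source: exactly 100 photons per pulse (F = 0)
eta = 0.1  # heavy attenuation
out = [attenuate(N, eta) for _ in range(50_000)]

# Theory: F' = 1 - eta * (1 - F) = 0.9 - near-Poissonian despite F = 0 at the source.
print(f"Fano factor after attenuation: {fano(out):.3f}")
```

At the etas used to reach the single-photon level (10^-6 and below), the output Fano factor is indistinguishable from 1 regardless of the input, so attenuated-beam measurements cannot by themselves settle what the source statistics were.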

        I believe there is a crisis in physics due to the introduction of QM. While they are dreaming in their sandpit, engineers are left to do all the heavy lifting with modern technology: rockets, the internet, smart phones, solid-state discs, 5G, frequency-domain multiplexing on the optical fibre network, stealth technology; the list goes on. Mr Kadin, I believe that people with your reversion to a scientific method, free of supernatural magical influences from the dark ages, could lead to a renaissance, with one proviso: abandon QM completely, and revitalise or simply reintroduce Maxwell's equations to the physics community. The equations originally comprised 20 equations in 20 unknowns; Heaviside and others simplified them to 4 equations in 4 unknowns. I have read that the original equations are better at addressing some of the more esoteric problems in astrophysics. I predict that Maxwell's equations, along with dust in the intergalactic medium (IGM), are going to play an ever-increasing role in the paradoxes of dark matter, dark energy, the CMB, non-Doppler red-shift etc. There is another elephant in the IGM: the so-called vacuum. It is full of EM fields, EM waves, gravity fields and waves; what's more, waves are simply disturbances in fields (sorry Lockie!!). I agree with my friend Lockie Cresswell that all particles in the observable universe (within the event horizon) are in continual communication via these waves and fields (sorry again), at the speed of light, at all times.

        This resolves the "measurement problem" in QM. Any particular measurement you perform is in constant contact with the local environment via these fields, including the experimenter and his equipment; so-called "noise" is simply signals from the local environment, and there is also noise from the distant environment. Your noise is simply other people's signals.

        This noise is not intrinsically random (as per the uncertainty principle (UP)) but chaotic, with causal sources. This noise is the equivalent of QM's UP, although it is not equivalent to the UP in all cases.

        This soup of fields and waves has a defined reference frame: the physical matter in the universe (stars, galaxies etc.), which is in fact the source of the fields and waves. We now have an ether; what's more, it's not Lorentz invariant, and this new ether is revealing itself in a measurable drift in the CMB. Remember Mach's principle and Newton's water bucket? This interpretation is not a theory; it's an assemblage of facts and observations. I'm therefore calling it the "observable ether". Einstein in his later years almost insisted on an ether; however, I'm not sure it's the same as the one I have just described. I also do not propose that this ether is the medium required for radiation to take place (the Michelson-Morley ether); I leave that for others to ponder.

        Subjects I could expand upon, within the guidelines of the comp:

        Fields:

        stochastic electrodynamics (SED)

        zero-point radiation without the UV catastrophe (really a gamma catastrophe); we'll call this gamma-limited SED, or GLSED

        SEDS is SED with spin

        zero point with real sources

        zero point without Lorentz invariance, therefore an ether

        total energy of the zero point, capped by the lack of reasonable, realistic gamma sources

        zero point has effective mass via E = mc^2

        does zero point exert friction (tidal forces)?

        Zwicky and his friction, a modern interpretation: photons and particles slowed by friction caused by tidal interaction with the new "observable ether"; a modern version of "tired light", not Hoyle's old tired light. It is prone to be used as another straw man, so it may need a better name: "weary light"

        Dust:

        metal and carbon dust from supernovae pervading the IGM, new finding

        attenuation constant of the IGM: can light really travel 13 billion light years without attenuation, or without getting "weary"?

        radiation pressure on dust, increases its temperature

        CMB from warm dust? Grote Reber's theory.

        dielectric constant or refractive index of dusty IGM?

        dust lensing, similar to gravity lensing?

        non Doppler red-shift, various proponents, Grote Reber etc.

        dust or smoke does not blur; demonstrably, look at the moon through smoke: it is red and sharp

        dark energy? Due to optical attenuation, standard candles dimmed by dust?

        dark matter? Due to the mass of dust, are galaxies much bigger than we measure, thanks to supernova metals and carbon?

        Cheers

        Barry

        Dear Lockie

        Thanks for threatening to put a nasty runic spell on me if I did not enter this comp. I'm enjoying it so far. I think I'll study more of the foundations of physics before I get philosophical.

        Thanks also for your kind comments on my rather short essay. I liked your inclusion of the remarks in the foreword of "The Ghost in the Atom" by Paul Davies and J. Brown; my interpretation is that they were bullied by QM's to bias their account of the interviews. I wonder if Paul will get around to reading this post; he may wish to comment?

        Good luck with your essay.

        Barry

        18 days later

        Dear Barry,

        I greatly appreciated your work and discussion. I am very glad that you are not thinking in abstract patterns.

        "This loop hole is for QM to attack, as I propose classic folk should be the noisy aggressors, instead of humble self deprecating defenders".

        While the discussion lasted, I wrote an article: "Practical guidance on calculating resonant frequencies at four levels of diagnosis and inactivation of COVID-19 coronavirus", due to the high relevance of this topic. The work is based on the practical solution of problems in quantum mechanics, presented in the essay FQXi 2019-2020 "Universal quantum laws of the universe to solve the problems of unsolvability, computability and unpredictability".

        I hope that my modest results of work will provide you with information for thought.

        Warm regards,

        Vladimir

        4 months later

        Hello Barry. There is so much confusion in these discussions that I need to start from basics. I am talking to the community, not just you, Barry. Wikipedia has the graph mislabeled: in this case QM agrees with Malus and classical waves, while the triangle wave is a prediction of Bell's theorem and billiard balls. Evidently, the assumptions behind Bell's theorem were too much like classical particles (not photons). Bell only applies to classical particles, like BBs.

        Let me clarify. There are classical waves and classical particles. Then there are QM waves and QM particles; in QM, people usually say wave-functions and just particles. To talk of photons is a QM concept that leads to instant confusion: some will think of a wave packet, or a detector click, or who knows. I resort to a quote in Bohr's book where he quotes Einstein: a photon will go only one way or another at a beam-splitter, but if you re-converge the beams you will see an interference pattern over time. The photon concept embraces particle and wave concepts, both crammed together. There is no way to understand that in terms of space and time. It is a dualistic concept; a concept, not a thing.

        It is of no help to talk of a wave packet unless one makes the distinction between classical (spreads) and quantum (holds together). My models take light as a classical spreading wave packet only at the instant of emission. A classical emitted wave packet will not hold itself together; it will mix with all the other waves and spread. If a classical wave packet were to hold together, it would be a particle. So, what is a particle, and what is a wave? A particle holds itself together and a wave spreads. Using that distinction avoids confusion. A QM particle (photon) does both: instant confusion.

        Some will invoke "shut up and calculate" because they think the experiments say so. No. There are false interpretations in experiments, as I have outlined in my essays. Many of these false assumptions are due to polarizers routing classical polarized wave packets that make detectors click anti-coincidently, giving the illusion of particles (or QM). Light is just a classical wave, but there are thresholds in absorption and emission that explain the particle-like properties.

        Is a gamma ray a wave packet? A gamma ray is emitted like a classical wave packet, but it does not hold itself together, as shown by my experiments. For matter, it is more complicated: my essays explain, with reference to experiment, how matter (with rest mass) is a soliton.

        Barry, I am very happy that you seem to understand the threshold concept. For anyone who does not know me, I am "Loading Theory (Eric)." You can see my essays on this forum https://fqxi.org/community/forum/topic/1344 and on my website http://www.unquantum.net My latest writing is best: https://vixra.org/abs/2008.0071 Thank you. ER 9, 2020
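        The sinusoid-versus-triangle distinction in the comment above is easy to quantify. Below, `malus` is the cos^2(theta) transmission law (the wave / QM curve) and `triangle` is the piecewise-linear "billiard ball" prediction on 0 to 90 degrees; the two agree at 0, 45 and 90 degrees, and the largest gap falls midway between, e.g. at 22.5 degrees. A sketch for illustration only:

```python
import math

def malus(theta: float) -> float:
    """Wave / QM prediction: transmitted fraction cos^2(theta)."""
    return math.cos(theta) ** 2

def triangle(theta: float) -> float:
    """Straight-line ('billiard ball') prediction for 0 <= theta <= pi/2."""
    return 1.0 - theta / (math.pi / 2)

for deg in (0, 22.5, 45, 67.5, 90):
    th = math.radians(deg)
    print(f"{deg:5.1f} deg  malus={malus(th):.4f}  triangle={triangle(th):.4f}")
```

At 22.5 degrees the curves differ by about 0.10 (0.8536 vs 0.75), which is why that angle, and its reflections, are the standard test settings in these discussions.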
