Dear Mr. Kadin:
It's been a long time between heretics; we've been on the endangered species list for quite some time now.
I am rather flattered that you bothered to read my essay considering your Bio compared to mine.
I am no philosopher; I am an experimentalist and a fan of Maxwell, Planck and Newton.
I thoroughly enjoyed your essay and am in total agreement with most of the subjects you addressed.
I agree that the quantum computer (QC) is quantum mechanics' (QM's) biggest mistake.
Trying to actually create something useful and practical using pre-Renaissance alchemy and voodoo will fail spectacularly. They would be better off sticking to useless EPR tests at ever greater distances with ever more bizarre random number generators. Methinks they protest too much; they must have inner doubts to spend 50 years proving the same point. Another 60 tests and their score will be 100 to 2 in favour of Bohr. If Popper is to be believed, the first experiment to disprove Bohr's theory (the second experiment ever performed) is all you need. The QC has two strikes against it. Firstly, noise (decoherence), even at 20 mK; then they have zero-point or quantum noise to contend with. Massive parallelism may save the day: instead of 54 qubits try 54,000 qubits, 1,000 sets of 54 in parallel (broadly equivalent to stacking exposures in astrophotography to improve the signal-to-noise ratio, as sketched below). Secondly, you have that spooky entanglement that has never been demonstrated in wave mechanics ((WM), new term). The best they can hope for is a damn fast analogue computer, because that is what a failed QC reverts to in the absence of entanglement.
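To put a number on the stacking analogy: averaging N independent noisy copies of the same signal improves the signal-to-noise ratio by roughly the square root of N, which is exactly why astrophotographers stack frames. A few lines of Python (my own toy, nothing to do with any actual QC architecture) show the effect:

```python
import numpy as np

rng = np.random.default_rng(1)

signal = np.sin(np.linspace(0, 2 * np.pi, 500))   # the "true" signal
noise_sigma = 2.0                                  # each single copy is badly buried in noise

def snr(n_copies):
    """Average n_copies independent noisy copies and report the resulting SNR."""
    frames = signal + rng.normal(0, noise_sigma, size=(n_copies, signal.size))
    stacked = frames.mean(axis=0)
    residual = stacked - signal
    return signal.std() / residual.std()

for n in (1, 54, 1000):
    print(f"{n:5d} copies stacked: SNR ~ {snr(n):.1f}")   # grows roughly as sqrt(n)
```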
AI is set back by the digital computer and its heritage: primitive programming languages that are all virtually the same, serial state machines with the odd branch or loop, totally dependent on the programmer's logic. The computers should rely on about half programmer's logic and the rest environmental learning, involving "heuristics", guided and supervised trial and error, just like humans. They might also benefit from some analogue computation. Digital computers are obscenely accurate at 64 bits; this slows them down, although their figure of merit may be good if we define the figure of merit as the product of speed and accuracy. How does that compare with an analogue computer? Consider 64 paralleled 4-bit digital computers; this concept may approach the analogue figure of merit (fast but not very accurate). Also consider letting the least significant bit be randomised; this means the computer will never give the same answer repeatedly, similar to us. The programming can also introduce sophisticated weighted randomness, with weighted time constants. In other words, it starts off dumb and gets better, then starts to forget; you can archive the trivial stuff instead of forgetting it. This equates to long- and short-term memory: trivia slowly fades, important stuff stays put.
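Here is a toy sketch of those two ideas in code (my own illustration, with made-up names and numbers): a 4-bit adder whose bottom bit is randomised so the same question never reliably gets the same answer, and a weighted forgetting curve where trivia fades and the important stuff stays put.

```python
import math
import random

def noisy_4bit_add(a, b):
    """Add two 4-bit values, then randomise the least significant bit,
    so the same inputs never reliably give the same answer twice."""
    total = (a + b) & 0xF                         # keep the result to 4 bits
    return (total & 0xE) | random.randint(0, 1)   # overwrite the bottom bit at random

def retention(importance, age, tau=100.0):
    """Weighted forgetting: low-importance trivia fades fast, while
    important memories get a long time constant and stay put."""
    return math.exp(-age / (tau * importance))

print([noisy_4bit_add(3, 6) for _ in range(6)])         # e.g. [8, 9, 8, 8, 9, 8]
print(retention(0.1, age=50), retention(5.0, age=50))   # trivia ~0.7%, important ~90%
```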
The analogue computers some of us remember may have been implemented with operational amplifiers etc., but other examples are wind tunnels, wave tanks, pulleys and strings, small models and so on. NASA uses many and varied analogue computers; they are very fast, with a good figure of merit.
Spatial filtering: collages of images from deep space can be cleaned up almost instantly by spatial filtering with lenses. Your camera is an exceedingly fast Fourier transformer, turning what appears to be stray, random light into an image at the focal plane, as does your eye. It performs at the speed of light, uses no power and is limited only by the Rayleigh limit. It is not the conventional "fast" Fourier transform but a continuous, complex, slow transform executed with massive parallelism at light speed. Who would have thunk it?
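For anyone who would rather see the idea on a screen than on a bench, here is a toy digital stand-in (my own sketch using numpy, not an optical measurement): transform a noisy image to the "focal plane", pass only the low spatial frequencies through a pinhole-sized mask, and transform back. A lens with a pinhole in its focal plane does the equivalent optically, at the speed of light.

```python
import numpy as np

rng = np.random.default_rng(2)

# A crude stand-in for an image: a smooth pattern plus speckly noise.
x = np.linspace(0, 4 * np.pi, 256)
clean = np.outer(np.sin(x), np.cos(x))
noisy = clean + rng.normal(0, 0.5, clean.shape)

# "Spatial filtering": go to the Fourier (focal) plane, keep only the
# low spatial frequencies, then transform back.
F = np.fft.fftshift(np.fft.fft2(noisy))
rows, cols = F.shape
r = 12                                              # radius of the "pinhole"
Y, X = np.ogrid[:rows, :cols]
mask = (Y - rows // 2) ** 2 + (X - cols // 2) ** 2 <= r ** 2
filtered = np.fft.ifft2(np.fft.ifftshift(F * mask)).real

print("noise rms before:", round(np.std(noisy - clean), 3))
print("noise rms after: ", round(np.std(filtered - clean), 3))
```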
The massive parallelism is roughly the area of the lens measured in wavelength-sized patches; that is the number of parallel processors, about 10 billion. Divide by the roughly 1 ns it takes each pixel to perform its transform and you get on the order of 10^19 to 10^20 words per second.
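For the curious, here is the back-of-envelope arithmetic in a few lines, with assumed numbers (a 10 cm aperture, 500 nm light, 1 ns per transform); the exact figures don't matter, only the order of magnitude.

```python
import math

# Assumed numbers for illustration only.
aperture_d = 0.10          # lens diameter, metres
wavelength = 500e-9        # metres
transform_time = 1e-9      # seconds per "pixel" transform

cells_across = aperture_d / wavelength              # ~2e5 wavelengths across the lens
parallel_cells = math.pi / 4 * cells_across ** 2    # ~3e10 wavelength-sized patches
rate = parallel_cells / transform_time              # ~3e19 "words" per second

print(f"{parallel_cells:.1e} parallel cells, {rate:.1e} words/s")
```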
I think the lens beats the pants off a digital computer and may be potentially similar to the D-Wave computer that is sold around the world as a QC. The D-Wave has not passed a Bell test for entanglement, but if it could conceptually approach the speed of our camera lens, then it's no wonder they think it's a QC. A similar 54-qubit, 20 mK computer apparently beat the socks off an IBM supercomputer, and I've been getting bruised and battered something shocking over it by my own radio frequency (RF) colleagues, as well as by dyed-in-the-wool QM's on some other forums.
I'd like to elaborate a bit about my RF colleagues. You see, they don't have a clue about QM by and large, but they do know quite a lot about Maxwell from their EM textbooks. They are professionals and expect all other professionals to be as strait-laced as themselves, so when they read, in lightweight physics journals, the exaggerated, inflated claims along with spiffy graphics and other added value from Hollywood/Disney-style articles, they're totally intoxicated. Me too; I've had to stop reading the stuff lest I fall under the spell. I subscribe to some of these journals, both for hard copies and for electronic access to archives, 175 years of archives in one case (hope that's not a clue). I love the archives, but most of the recent hard copies are still in their protective wrappers; you see, they scare me with their piffle on physics, though the other disciplines are OK. I've been known to sneak a peek at articles on biology, geology, history, palaeontology and other interesting ologies.
QM's love to claim anything that's successful: lasers, transistors, smart phones, etc.
It's pretty simple: you just rewrite history.
Maxwell's equations are a work in progress and underpin all modern technology: optical fibres, 5G, smart phones containing 250 billion field-effect transistors (FETs). Shockley, a QM, failed to produce a working FET after 10 years of effort. He did manage to share the Nobel prize with Bardeen and Brattain, even though he played no part in their purely accidental discovery of the point-contact transistor, which they knew little about; after all, their brief was to develop a FET. That said, Shockley did manage to improve the point-contact transistor into the junction transistor, a much superior device that served us for many years. The modern FET is a simpler device that operates much like the old vacuum-tube triode, except that a semiconductor replaces the vacuum. The FET concept was patented in 1926 by J. E. Lilienfeld and finally developed in 1959 by Dawon Kahng and Martin Atalla. What were the QM's doing in all those 33 years to address the simple problem that plagued the experimenters? Nothing!
The problem was that an electric field could not penetrate their pure silicon semiconductor. QM's weird wave-function model failed spectacularly (I love using this phrase; I've sneakily borrowed it from the QM's lexicon).
The problem was solved by simply oxidising the surface of the chosen semiconductor; it could even have been discovered accidentally. The QM's owe more to the transistor and the persistent experimenters than the transistor owes to QM. The laser is a similar story; I'll save that for another time.
Even today the laser is erroneously thought to produce photons that obey Poisson statistics. This is demonstrably wrong: if you inspect the inverse Fourier transform of a narrow-linewidth laser, it is obviously sub-Poissonian; the error comes about in the measurement technique. The huge attenuation required to reduce high-intensity laser beams to single photons introduces the Poisson distribution, as the thermal properties of the attenuator modulate the beam.
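There is also a purely statistical side to this, quite apart from the attenuator's thermal behaviour: randomly deleting photons from a nearly regular (sub-Poisson) stream, which is what heavy attenuation does, pushes the counting statistics toward Poisson (the Fano factor climbs toward 1). A toy simulation (mine, not a measurement) shows it:

```python
import numpy as np

rng = np.random.default_rng(0)

# A nearly regular (sub-Poisson) photon stream: n photons per counting window.
n_per_window = 10_000
windows = 100_000

def fano(counts):
    """Fano factor: variance/mean of the photon counts (1 for Poisson light)."""
    return counts.var() / counts.mean()

for p in (1.0, 0.1, 1e-3, 1e-5):        # survival probability of each photon
    counts = rng.binomial(n_per_window, p, size=windows)
    print(f"attenuated to p={p:g}: Fano factor = {fano(counts):.3f}")
```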
I believe there is a crisis in physics due to the introduction of QM. While they are dreaming in their sandpit, engineers are left to do all the heavy lifting, developing modern technology: rockets, the internet, smart phones, solid-state discs with thousands of billions of transistors, 5G, frequency-domain multiplexing on the optical fibre network, stealth technology; the list goes on. Mr Kadin, I believe that people with your reversion to a scientific method free of supernatural, magical influences from the dark ages could lead to a renaissance. One proviso: abandon QM completely, and revitalise or simply reintroduce Maxwell's equations to the physics community. Maxwell's original formulation had 20 equations and 20 unknowns; Heaviside and others simplified them to 4 equations and 4 unknowns. I have read that the original equations are better at addressing some of the more esoteric problems in astrophysics. I predict that Maxwell's equations, along with dust in the intergalactic medium (IGM), are going to play an ever-increasing role in the paradoxes of dark matter, dark energy, the CMB, non-Doppler red-shift, etc. There is another elephant in the IGM: the so-called vacuum. This so-called vacuum is full of EM fields, EM waves, gravity fields and waves (waves being simply disturbances in fields). I agree with my friend Lockie Cresswell that all particles in the observable universe (within the event horizon) are in continual communication via these waves and fields, at the speed of light, at all times.
This resolves the "measurement problem" in QM. Any particular measurement you perform is in constant contact with the local environment via these fields, including the experimenter and his equipment. The universe is an observer as well as the experimenter, and the so-called "noise" in the experiment is simply signals from the local environment; there is also noise from the distant environment. Your noise is simply other people's signals!
This noise is not intrinsically random (as per the uncertainty principle (UP)) but chaotic, with causal sources. This noise is the equivalent of QM's UP. This soup of fields and waves has a defined reference frame: the physical matter in the universe (stars, galaxies etc.), which is in fact the source of the fields and waves. We now have an ether; what's more, it's not Lorentz invariant, and this new ether is revealing itself in a measurable drift in the CMB. Remember Mach's principle and Newton's water bucket? This interpretation is not a theory, it's an assemblage of facts and observations; I'm therefore calling it the "observable ether". Einstein in his later years almost insisted on an ether, though I'm not sure it's the same as the one I have just described. I also do not propose that this ether is the medium required for radiation to take place (the Michelson and Morley ether); I leave that for others to ponder.
Subjects I could expand upon somehow, sometime, that I'm working on:
Fields:
stochastic electrodynamics (SED)
zero point radiation without the UV catastrophe (really a gamma-ray catastrophe); we'll call this gamma-limited SED, or GLSED
SEDS is SED with spin
zero point with real sources
zero point without Lorentz invariance, therefore an ether.
total energy of the zero point, capped because there are no reasonable, realistic gamma sources.
does the zero point have effective mass via E = mc^2?
does the zero point exert friction (tidal forces)?
Zwicky and his friction, modern interpretation: photons and particles slowed by friction caused by tidal interaction with the new "observable ether"; a modern version of "tired light", not Hoyle's old tired light. It is prone to be used as another straw man, so it may need a better name: "weary light."
Dust:
metal and carbon dust from supernovae pervading the IGM, new finding
attenuation constant of the IGM: can light really travel 13 billion light years without attenuation or getting "weary"?
radiation pressure on dust increases its temperature
CMB from warm dust? Grote Reber's theory.
dielectric constant or refractive index of dusty IGM?
dust lensing, similar to gravity lensing?
non-Doppler red-shift, various proponents, Grote Reber etc.
dust or smoke does not blur; this is demonstrable: look at the moon through smoke, it's red and sharp
dark energy? Due to optical attenuation, standard candles dimmed by dust?
dark matter? Due to the mass of dust, galaxies much bigger than we measure, owing to supernova metals and carbon?
Cheers
Barry