Dear Alan Kadin,
Shouldn't Harvard be proud of you? I just wonder why McEachern didn't take issue. Klingman recently expressed his support for you in a comment on my essay.
Eckard
Dear Alan Kadin,
Wow!!! This is an opinion essay, and I really like this style. From the computational viewpoint, this is really interesting. In particular, you mentioned:
3) No Quantum Computing
Quantum computing is unachievable for both fundamental and practical reasons, and will not be the future of computing; the experimental evidence thus far has been misinterpreted.
On this point, which concept of quantum computing is unachievable? My essay pointed out a different perspective on random number generation. Does this not bear on both the fundamental and the practical reasons?
Best wishes,
Yutaka
Dear Alan,
I was really excited by your essay; it gave me the sort of feeling that you get when you've just thought of something special.
Sometimes just viewing things slightly differently can cause big changes.
You said that De Broglie and Schrodinger believed the waves of the wave equation were real, whereas Bohr and Heisenberg thought it gave the probability of an electron being at a particular point. It is possible to merge the two points of view, so the waves are real, but the amplitude of a wave at a particular point also gives the probability of a new wave starting there.
This rather insignificant rethink actually has surprisingly large implications, while retaining the same maths. It results in something like the moving 'wavicles' that you describe, but also in some quite unexpected things: for example, Special Relativity becomes a logical consequence of Quantum Theory. Actually, this rethink has implications for just about every idea you mention in your essay.
If you're interested, it's discussed in detail in my essay.
Thank you for a lovely piece,
All the best,
David
Dear Alan M. Kadin
I have my printed copy of your rise-and-fall-of-wave-particle-duality paper, with many notes and inserts. Your standing-wave discussion gained a place in my universe. I have retained the paper from the prior contest, as it remains closer in overview to my theory ideas than anything else I have seen. I do not address all the history you do, and I am less professional regarding the technical physics. My work is more encompassing across the universe and displaces many questionable concepts. You can see this in my paper. There are no direct conflicts with Relativity or QM. This cosmology covers many facets of astrophysics. The way is open, provided we get others to agree with you when you say 'mathematical proofs will not provide the answers'.
Believers can then review how wave-particle duality becomes useless. Moving on then, the cause of orbitals awaits. Then overcoming the ignorance of the 'reality of pushing gravity' and the EM nature of space will awaken the world. It may be that nothing is enough to be the 'Theory of Everything', but a thorough alternative perspective on most concepts should spur some interest.
By the way, a point does spin, and it deflects gravity. You started here with 'nothing is really spinning' and advanced to 'the coherent rotation of a vector field around a spin axis'. Spin is no longer mysterious.
Spin is within electrons when they are created by opposing beams intersecting each other. If interested, you will need to see a creation of matter paper.
You encourage us all when you say the conscious mind is slow, as that always seemed to be a flaw of mine.
You are a very involved person, but 'The Universe is Otherwise', and it might be interesting enough to discuss this with some interchange outside of the contest.
Paul Schroeder
Dear Dr. Kadin
Thanks for calling my attention to your essay, which I enjoyed and found quite attractive. I guess we have several points in common. I started my scientific career in the field of superconductors; I agree with your view that we should build a neoclassical theory unifying quantum and relativistic effects (I see no future in current approaches to the unification of physics); and I also agree that solitons can help to solve most of our present problems, etc. Overall, I share the same vision as you. Something that drew my attention is the proposal that you put forth about the electron spin. I have seen that other researchers have elaborated the origin of the spin in the zero-point field in a way similar to yours, assuming a circular polarization of the electromagnetic field. This was explained within the framework of stochastic electrodynamics; are you aware of this theory?
As for QC, I also agree: NNs seem to have a more promising future than QC.
I also noticed that you have published several manuscripts on viXra and arXiv but not in orthodox journals. Have you submitted your manuscripts to such journals? What can you tell me about that?
Congratulations on being a winner of the 2017 contest; I was also a winner in 2012.
Best regards
Israel Perez
Dear Mr. Kadin:
It is a long time between heretics; we've been on the endangered species list for quite some time now.
I am rather flattered that you bothered to read my essay, considering your bio compared to mine.
I am no philosopher; I am an experimentalist and a fan of Maxwell, Planck and Newton.
I thoroughly enjoyed your essay and am in total agreement with most subjects you addressed.
I agree that the quantum computer (QC) is quantum mechanics' (QM's) biggest mistake.
Trying to actually create something useful and practical using pre-Renaissance alchemy and voodoo will fail spectacularly. They would be better off sticking to useless EPR tests at ever greater distances and ever more bizarre random number generators. Methinks they protest too much? They must have inner doubts to spend 50 years proving the same point. Another 60 tests and their score will be 100 to 2 in favour of Bohr. If Popper is to be believed, the first experiment to disprove Bohr's theory (the second experiment ever performed) is all you need. The QC has two strikes against it. Firstly, noise (decoherence), even at 20 mK; then they have zero-point or quantum noise to contend with. Massive parallelism may save the day: instead of 54 qubits try 54,000 qubits, 1000 sets of 54 in parallel (broadly equivalent to optical stacking in astrophotography to improve the signal-to-noise ratio). Secondly, you have that spooky entanglement, which has never been demonstrated, according to wave mechanics (WM, a new term). The best they can hope for is a damn fast analogue computer, because that is what a failed QC reverts to in the absence of entanglement.
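As a rough illustration of the optical-stacking analogy only, here is a minimal Python sketch, with made-up signal and noise values and no connection to any real qubit hardware: averaging N independent noisy repetitions improves the signal-to-noise ratio by roughly sqrt(N).

    # Minimal sketch: averaging N independent noisy copies improves SNR ~ sqrt(N).
    # All numbers here are illustrative, not tied to any real hardware.
    import numpy as np

    rng = np.random.default_rng(0)
    signal = 1.0          # true underlying value
    noise_sigma = 5.0     # per-measurement noise, much larger than the signal

    for n_copies in (1, 1000):
        # n_copies parallel noisy measurements, then average (the "stack")
        samples = signal + noise_sigma * rng.normal(size=(100_000, n_copies))
        stacked = samples.mean(axis=1)
        snr = signal / stacked.std()
        print(f"{n_copies:5d} copies stacked: SNR ~ {snr:.2f}")
    # Expect the SNR to grow by about sqrt(1000) ~ 32 when going from 1 to 1000 copies.

The sketch says nothing about whether stacking rescues a quantum computer; it only shows the square-root gain that the astrophotography analogy relies on.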
AI is set back by the digital computer and its heritage: primitive programming languages that are all virtually the same, serial state machines with the odd branch or loop, totally dependent on the programmer's logic. The computers should rely on about half programmer's logic and the rest environmental learning, involving "heuristics," guided and supervised trial and error, just like humans. They might also benefit from some analogue computation. Digital computers are obscenely accurate at 64 bits; this slows them down, although their figure of merit may be good if we define the figure of merit as the product of speed and accuracy. How does that compare with an analogue computer? Consider a 64-way paralleled 4-bit digital computer; this concept may approach the analogue figure of merit (fast but not very accurate). Also consider letting the least significant bit be randomised; this means the computer will never give the same answer repeatedly, similar to us. The programming can also introduce sophisticated weighted randomness, with weighted time constants. In other words, it starts off dumb and gets better, then starts to forget; you can archive the trivial stuff instead of forgetting. This equates to long- and short-term memory: trivia slowly fades, important stuff stays put.
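To make the randomised-least-significant-bit idea concrete, here is a minimal Python sketch of a 4-bit quantiser with a dithered LSB; the input value and the 64-unit count are arbitrary illustrative choices, not a hardware design. Repeated runs give slightly different answers, while averaging many cheap low-precision units recovers most of the accuracy.

    # Minimal sketch of a 4-bit quantiser with a randomised least significant bit
    # ("dithering"). Illustrative only; not a real hardware design.
    import numpy as np

    rng = np.random.default_rng()
    LEVELS = 2**4                       # 4-bit resolution: 16 levels over [0, 1)

    def quantise_4bit_dithered(x):
        # Add up to +/- half an LSB of random dither, then round to 4 bits.
        x = np.asarray(x, dtype=float)
        dither = (rng.random(x.shape) - 0.5) / LEVELS
        return np.clip(np.round((x + dither) * LEVELS), 0, LEVELS - 1) / LEVELS

    x = 0.4137                           # value to represent
    single = [float(quantise_4bit_dithered(x)) for _ in range(3)]
    print("three single 4-bit answers:", single)          # not identical run to run

    # 64 low-precision units in parallel, averaged: much closer to the true value
    parallel = float(quantise_4bit_dithered(np.full(64, x)).mean())
    print("64 dithered units averaged:", round(parallel, 4), " true value:", x)

The design choice is the usual dithering trade: the randomised LSB removes the fixed quantisation bias at the price of run-to-run variation, and parallel averaging buys the accuracy back.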
The analogue computers some of us remember may have been implemented with operational amplifiers etc., but other examples are wind tunnels, wave tanks, pulleys and strings, small models, etc. NASA uses many and varied analogue computers; they are very fast, with a good figure of merit.
Images from deep-space collages can be cleaned up almost instantly by spatial filtering using lenses. Your camera is an exceedingly fast Fourier transformer, turning what appears to be stray random light into an image at the focal plane, as does your eye. It performs at the speed of light, uses no power, and is limited only by the Rayleigh limit. It is not the conventional "fast" Fourier transform, but a continuous, complex, slow transform, executed with massive parallelism at light speed. Who would have thunk it?
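For anyone who wants to play with the idea numerically, here is a minimal Python/numpy sketch of spatial filtering: the 2-D FFT stands in for the lens taking the image to the focal plane, a circular aperture mask is applied there, and the inverse FFT forms the cleaned image. The image, noise level and aperture radius are all made-up illustrative values.

    # Minimal sketch of spatial filtering: FFT -> aperture mask at the "focal
    # plane" -> inverse FFT, the numerical analogue of a lens plus a pinhole.
    # The image here is synthetic and purely illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 256
    y, x = np.mgrid[0:N, 0:N]
    clean = np.exp(-((x - N/2)**2 + (y - N/2)**2) / (2 * 30.0**2))   # smooth blob
    noisy = clean + 0.5 * rng.normal(size=(N, N))                    # add noise

    # "Focal plane": 2-D Fourier transform of the noisy image
    spectrum = np.fft.fftshift(np.fft.fft2(noisy))

    # Circular low-pass aperture: keep only the low spatial frequencies
    radius = 12
    fy, fx = np.mgrid[-N//2:N//2, -N//2:N//2]
    aperture = (fx**2 + fy**2) <= radius**2
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * aperture)).real

    err_before = np.abs(noisy - clean).mean()
    err_after = np.abs(filtered - clean).mean()
    print(f"mean error before: {err_before:.3f}, after low-pass: {err_after:.3f}")

The digital version does serially what the lens does all at once; that contrast is exactly the point being made here.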
The massive parallelism is approximately the area of the lens in terms of wavelengths across the lens; that is the number of parallel processors, about 10 billion. Divide this by the ~1 ns it takes each pixel to perform the transform and you get of order 10^19 to 10^20 words per second.
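The back-of-the-envelope arithmetic, as a tiny Python sketch; the lens diameter (5 cm), wavelength (500 nm) and 1 ns transform time are my assumed illustrative values, not measurements:

    # Back-of-the-envelope throughput of a lens as a parallel Fourier transformer.
    # Lens diameter, wavelength and transit time are assumed values for illustration.
    D = 0.05              # lens diameter in metres (5 cm, assumed)
    wavelength = 0.5e-6   # visible light, ~500 nm
    t_transform = 1e-9    # ~1 ns per transform (assumed)

    n_across = D / wavelength              # resolution elements across the lens
    n_parallel = n_across**2               # "parallel processors" over the area
    rate = n_parallel / t_transform        # transforms per second

    print(f"elements across the lens : {n_across:.1e}")
    print(f"parallel elements (area) : {n_parallel:.1e}")   # ~1e10
    print(f"throughput (words/second): {rate:.1e}")         # ~1e19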
I think the lens beats the pants off a digital computer, and it may potentially be similar to the D-Wave computer that is sold around the world as a QC. The D-Wave has not passed a Bell test for entanglement, but if it could conceptually approach the speed of our camera lens, then it's no wonder they think it's a QC. A similar 54-qubit, 20 mK computer apparently beat the socks off an IBM supercomputer, and I've been getting bruised and battered something shocking over it by my own radio frequency (RF) colleagues, as well as by dyed-in-the-wool QMs on some other forums.
I'd like to elaborate a bit about my RF colleagues. You see, they don't have a clue about QM by and large, but they do know quite a lot about Maxwell from their EM textbooks. They are professional and expect all other professionals to be as strait-laced as themselves, so when they read, in lightweight physics journals, the exaggerated, inflated claims along with spiffy graphics and other added value from Hollywood/Disney-style articles, they're totally intoxicated. Me too; I've had to stop reading the stuff lest I fall under the spell. I subscribe to some of these journals, both for hard copies and electronic access to archives, 175 years of archives in one case (hope that's not a clue). I love the archives, but most of the recent hard copies are still in their protective wrappers; you see, they scare me with their piffle on physics, while the other disciplines are OK. I've been known to sneak a peek at articles on biology, geology, history, palaeontology and other interesting ologies.
QMs love to claim anything that's successful: lasers, transistors, smartphones, etc.
It's pretty simple: you just rewrite history.
Maxwell's equations are a work in progress and underpin all modern technology: optical fibres, 5G, smartphones containing 250 billion field-effect transistors (FETs). Shockley, a QM, failed to produce a working FET after 10 years of effort. He did manage to share the Nobel Prize with Bardeen and Brattain, even though he played no part in their purely accidental discovery of the point-contact transistor, which they knew little about. After all, their brief was to develop a FET. That said, Shockley did manage to improve the point-contact transistor into the junction transistor, a much superior device that served us for many years. The modern FET is a simpler device that operates similarly to the old vacuum-tube triode, except that a semiconductor replaces the vacuum. The FET concept was patented in 1926 by J. E. Lilienfeld and finally developed in 1959 by Dawon Kahng and Martin Atalla. What were QMs doing all those 33 years to address the simple problem that plagued the experimenters? Nothing!
The problem was that an electric field could not penetrate their pure silicon semiconductor. QM's weird wave-function model failed spectacularly. (I love using this phrase; I've sneakily borrowed it from the QMs' lexicon.)
The problem was solved by simply oxidising the surface of the chosen semiconductor; it could even have been discovered accidentally. QMs owe more to the transistor and the persistent experimenters than the transistor owes to QM. The laser is a similar story; I'll save that for another time.
Even today the laser is erroneously thought to produce photons that obey Poisson statistics. This is demonstrably wrong: if you inspect the inverse Fourier transform of a narrow-linewidth laser, it is obviously sub-Poissonian; the error comes about in the measurement technique. The huge attenuation required to reduce high-intensity laser beams to single photons introduces the Poisson distribution; the thermal properties of the attenuator modulate the beam.
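A side note on the statistics, separate from the thermal-modulation mechanism suggested above: strong random attenuation by itself pushes any photon-counting distribution toward Poisson (binomial thinning). A minimal Python sketch of that effect, with made-up source counts and transmission values chosen purely for illustration:

    # Minimal sketch: strong attenuation (random deletion of photons, i.e.
    # binomial thinning) pushes even a sub-Poissonian source toward Poisson
    # statistics. Fano factor = variance/mean (1 for Poisson, <1 for sub-Poisson).
    # The input distribution and transmissions are made-up, illustrative values.
    import numpy as np

    rng = np.random.default_rng(2)
    n_trials = 200_000

    # Sub-Poissonian source: mean 100 photons per window, narrow spread
    source = np.clip(rng.normal(100, 2, n_trials).round(), 0, None).astype(int)

    def fano(counts):
        return counts.var() / counts.mean()

    print(f"source Fano factor: {fano(source):.3f}")        # well below 1

    for transmission in (0.5, 0.05, 0.001):
        detected = rng.binomial(source, transmission)       # random deletion
        print(f"transmission {transmission:5.3f}: Fano = {fano(detected):.3f}")
    # The Fano factor approaches 1 (Poisson-like) as the attenuation gets stronger.

This covers only the counting-statistics side of the argument; it says nothing about the attenuator's thermal behaviour.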
I believe there is a crisis in physics due to the introduction of QM. While they are dreaming in their sandpit, engineers are left to do all the heavy lifting, developing modern technology: rockets, the internet, smartphones, solid-state discs with thousands of billions of transistors, 5G, frequency-domain multiplexing on the optical fibre network, stealth technology; the list goes on. Mr Kadin, I believe that people with your reversion to a scientific method, free of supernatural magical influences from the dark ages, could lead to a renaissance. One proviso: abandon QM completely, and revitalise or simply reintroduce Maxwell's equations to the physics community. The original formulation had 20 equations and 20 unknowns; Heaviside and others simplified them to 4 equations and 4 unknowns. I have read that the original equations are better at addressing some of the more esoteric problems in astrophysics. I predict that Maxwell's equations, along with dust in the intergalactic medium (IGM), are going to play an ever-increasing role in the paradoxes of dark matter, dark energy, the CMB, non-Doppler redshift, etc. There is another elephant in the IGM: the so-called vacuum. This so-called vacuum is full of EM fields, EM waves, gravity fields and waves; what's more, waves are simply disturbances in fields. I agree with my friend Lockie Cresswell that all particles in the observable universe (within the event horizon) are in continual communication via these waves and fields, at the speed of light, at all times.
This resolves the "measurement problem" in QM. Any particular measurement you perform is in constant contact with the local environment via these fields, including the experimenter and his equipment. The universe is an observer as well as the experimenter, and the so-called "noise" in the experiment is simply signals from the local environment; there is also noise from the distant environment. Your noise is simply other people's signals!
This noise is not intrinsically random (as per the uncertainty principle (UP)) but chaotic, and it has causal sources. This noise is the equivalent of QM's UP. This soup of fields and waves has a defined reference frame: the physical matter in the universe (stars, galaxies, etc.), which is in fact the source of the fields and waves. We now have an ether; what's more, it's not Lorentz invariant, and this new ether is revealing itself in a measurable drift in the CMB. Remember Mach's principle and Newton's water bucket? This interpretation is not a theory; it's an assemblage of facts and observations. I'm therefore calling it the "observable ether". Einstein in his later years almost insisted on an ether, though I'm not sure it's the same as the one I have just described. I also do not propose that this ether is the medium required for radiation to take place (the Michelson and Morley ether); I leave that for others to ponder.
Subjects I could expand upon somehow, sometime, that I'm working on:
Fields:
stochastic electrodynamics (SED)
zero-point radiation without the UV catastrophe (really a gamma-ray catastrophe); we'll call this gamma-limited SED, or GLSED
SEDS is SED with spin
zero point with real sources
zero-point without Lorentz invariance, therefore ether.
total energy of the zero point, capped because there are no reasonable, realistic gamma sources.
does the zero point have effective mass via E = mc^2?
does the zero point exert friction (tidal forces)?
Zwicky and his friction, modern interpretation: photons and particles slowed by friction caused by tidal interaction with the new "observable ether"; a modern version of "tired light", not Hoyle's old tired light. It is prone to be used as another straw man, so it may need a better name: "weary light."
Dust:
metal and carbon dust from supernovae pervading the IGM, new finding
attenuation constant of the IGM: can light really travel 13 billion light years without attenuation or getting "weary"?
radiation pressure on dust increases its temperature
CMB from warm dust? Grote Reber's theory.
dielectric constant or refractive index of dusty IGM?
dust lensing, similar to gravity lensing?
non Doppler red-shift, various proponents, Grote Reber etc.
dust or smoke does not blur; this is demonstrable: look at the moon through smoke, it's red and sharp
dark energy? Due to optical attenuation, standard candles dimmed by dust?
dark matter? Due to the mass of dust; galaxies much bigger than we measure due to supernova metals and carbon?
Cheers
Barry
Dear Alan Kadin
Thank you for an enjoyable and very readable essay.
I agreed with you in some important places, for example that QC will end with the whimper of a damp squib. This is a consequence of my essay too, although I did not have space to spell it out. My essay subverts Bell's theorem by having time travel backwards within antiparticles. Antiparticles are assigned as travelling backwards in time on Feynman diagrams, and I just went further with that idea, despite knowing that professional physicists see it as a mathematical trick which is convenient but unreal. The upshot for me is that entanglement is not spooky, is not non-local, and involves no action-at-a-distance. And hence no huge benefits in quantum computing.
Before I forget, let me comment as an aside on your 'consciousness' wording late in your essay: "The sense of consciousness is largely the continuing recognition of oneself in the environment, mapping onto previous incarnations of the self. The second key feature of consciousness is the creation of a narrative, a coherent story of oneself in the environment. This narrative continues from the past and projects to the future, and includes decision points. Note that these features do not necessarily include linguistic competence or intelligent thinking."
I do not normally read essays about consciousness so came across your wording almost by accident. I wrote about my own memories of babyhood here: https://ben6993.wordpress.com/2008/09/13/early-memories-as-a-baby/
EXTRACT: Age two months. My very first thoughts are: "I must remember this time. I must, must, must remember this time. I will try my hardest to remember. I will be the first one ever to remember being born again." If you read further you will find that I, as an adult, rationally attribute this to remembering being awake before, rather than having been born before. Or, as you put it, "previous incarnations of oneself". Anyway, that is just an aside, as your words, including the linguistic-competence comments, chimed with my memories expressed in my babyhood essay.
But back to the physics. Who needs QC if one has time travel via antiparticles? Well, that was supposed to be humorous as I do not think it likely that there is signalling from the future back to the present. And even if there were signalling, if something is uncomputable to infinity we will not get answers from the future.
I need to read more about solitons, so I cannot comment about that idea, which seems potentially useful.
Your conclusions 1 and 6 may be at odds with each other with respect to uncertainty? As an ex-statistician, I would not like to deprive uncertainty of its existence, even for quantum uncertainty. So conclusion 6 is fine. I have already agreed about there being no spooky entanglement.
I am less sure about Conclusion 2. In my model, not discussed in this year's essay, dimensions are a property of particles/fields. Space, or the metric of space, requires negotiation between particles, which is why the metric of space collapses at a node in Penrose's CCC. But the property of dimension lives on within a particle even at the node or Big Bang. So, for me, dimensions exist as long as particles/fields exist, and for me they always exist and are neither created nor annihilated. But space and its metric are emergent and can cease to exist, for example at a CCC node. Also, my model has more than 4D, which goes against the importance of real space. So, given all my biases, I do not really subscribe to Conclusion 2.
Best wishes
Austin
Dear Alan,
Glad to read your work again.
I greatly appreciated your work and discussion. I am very glad that you are not thinking in abstract patterns.
"The amplitude of a soliton is fixed; neither larger nor smaller wavepackets are possible. his suggests that a quantum "particle" may be more properly a "wavicle":a localized soliton-like wave packet, rather than a statistical distribution of point particles. Furthermore, two solitons tend to repel each other;they cannot be in the same place at the same time".
"Thisalternative quantum model makes predictions that are sharply different from those of the orthodox quantum theory[17]".
While the discussion was ongoing, I wrote an article, "Practical guidance on calculating resonant frequencies at four levels of diagnosis and inactivation of COVID-19 coronavirus", due to the high relevance of this topic. The work is based on the practical solution of problems in quantum mechanics presented in my FQXi 2019-2020 essay, "Universal quantum laws of the universe to solve the problems of unsolvability, computability and unpredictability".
I hope that my modest results of work will provide you with information for thought.
Warm Regards,