Dear Rob,

As I expected, your comments largely confirm my essay.

In particular I hope our different approaches may eventually agree on consequences concerning QM. You didn't quote my complete sentence. I wrote:

"Unfortunately, one cannot even prove the theory of quantum mechanics wrong, because it was not logically derived but just heuristically fabricated by Schrödinger as well as by Heisenberg."

I meant the impossibility of proving something logically wrong that was not logically derived.

I vehemently maintain my statement: "So far it is reasonable practice to tolerate so-called non-causalities, for instance in the current theory of signal processing, for the sake of elegant calculability."

You obviously don't believe me when you argue: "It is not necessary to tolerate it - the problem you are addressing can be (and always is, in any properly functioning system) entirely avoided by simply introducing enough delay into the processing that everything being processed already exists in the past (via a delay sampling buffer), so the future has no relevance to the only thing that is actually being processed (the previously buffered-up past signal)."

While such more or less unnecessary maneuvers may hide the non-causalities to some extent, they obviously do not work even for ideal low-pass filters.

Concerning the digital vs. analog issue, I imagined, for instance, an originally acoustic or optical (i.e. analog) content.

As for the laws abstracted from reality, I don't suspect the primary problem lies in the finiteness of observation. I rather understand the brain as selecting, combining and checking patterns which were positively evaluated by the amygdala. Nonetheless, I share your attitude concerning wild speculations. When Jonathan Dickau mentioned, on p. 7 of his current essay with reference to [11], his "wild suspicion", I felt like a layman.

Having participated for many years in discussions among leading experts on auditory perception, including the outsider Steven Greenberg who picked me up, I don't deny tonotopy in the cochlea, CN, ICC and beyond. If there is a mistake, then it lies not in physiology but in the standard theory of signal processing.

Let me take a break for today.

Yours,

Eckard

Eckard,

Much of what Schrödinger and Heisenberg did, was "logically derived"; the problem is, some of the axioms their derivations were founded upon (such as perfectly identical particles, and using a single Fourier transform (wavefunction) to describe multiple particles) are demonstrably invalid, in any realistic rather than idealistic (noise-free) conception of reality, resulting in an entire century of misinterpretation, regarding what the math is actually describing.

"they do not work even for ideal low-pass filters" They fail only for IIR filters; they work perfectly for FIR filters. Similarly, all actual measurements are of finite duration. Consequently, all Fourier transforms of such data only integrate over finite intervals - you can use an infinite integral, but the integrand itself will be identically zero beyond the limits of the actual data. Thus, formal integration beyond the limits of the actual data contributes nothing to the final result, so there is no problem with causality, when the math is correctly interpreted. Idealistic functions (such as a harmonic oscillator), extending to infinity, are just that - idealistic - and do not correctly represent reality as it is actually observed (unless one assumes that reality is perfectly predictable - in which case, one can integrate over the perfect prediction, rather than any actual data!), because all observations are of finite duration.
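The FIR case can be made concrete with a small NumPy sketch (illustrative filter length and cutoff; a hypothetical example, not anything from our exchange): truncating the ideal sinc response and shifting it right by half the filter length yields a strictly causal filter, so already-computed past outputs provably cannot depend on future inputs.

```python
import numpy as np

# Windowed-sinc low-pass FIR filter: the ideal (infinite) sinc is
# truncated to N taps, then shifted right by (N-1)/2 samples so every
# tap index is >= 0. The filter is then strictly causal, at the cost
# of a fixed, known delay.
N = 41                       # number of taps (odd, for symmetry)
fc = 0.1                     # cutoff as a fraction of the sample rate
n = np.arange(N) - (N - 1) / 2
h = 2 * fc * np.sinc(2 * fc * n) * np.hamming(N)

# Causal convolution: y[k] uses only x[k], x[k-1], ..., x[k-N+1]
x = np.random.randn(1000)
y = np.convolve(x, h)[:len(x)]

# Changing "future" input samples cannot change past outputs:
x2 = x.copy()
x2[500:] = 0.0
y2 = np.convolve(x2, h)[:len(x2)]
assert np.allclose(y[:500], y2[:500])
```

The assertion holds exactly: zeroing all inputs from sample 500 onward leaves the first 500 output samples untouched, because the buffered-delay trick has made the filter causal.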

"What about the laws abstracted from reality" In my view, the existing "laws of physics" are not "fundamental", "preexisting" or "fixed", and so do not cause effects, even though it seems "as if" they do. Rather, repeatable effects evolve into being, as the information content of interacting entities changes over time. Subsequently, later observers will be able to create their mathematical "laws" to describe these newly emergent, repeatable effects, that never previously existed. So the effects being described today may not have even existed in the distant past. In other words, if an observer had existed in the distant past, any perceivable regularities would very probably have been rather different from those perceivable today - so the laws formulated to describe those past effects would have been different from those describing the observable effects today.

Rob McEachern

Robert,

To avoid useless quarreling with a believer in the complex FT over whether or not truncation (FIR) in combination with a shift rescues the FT, I would like to mock: May we attribute to the complex wave function a behavior similar to phase deafness in hearing, instead of the mysterious so-called collapse of the wave function?

Eckard

Robert,

Did you notice my new claim to have found out, after carefully reading Fourier's original work, where he was wrong? I hope you are beginning to understand my argument concerning redundancy. I know, "redundant" can also mean "out of a job", and it also has a positive meaning in the sense of an additional reserve. My argument: more freedom, and in particular the need to arbitrarily choose something, is not always acceptable.

As for Schrödinger, I refer to "Schrödinger, Life and Thought" and to his 4th communication.

Eckard

On p. 2 of Robert McEachern's 2015 essay I found the expression "Fourier Uncertainty Principle", which was certainly meant in the sense of Heisenberg's uncertainty relation. Let me reiterate: Fourier was wrong when he believed that the complex transformation he advertised is as extended as nature itself.

It is undoubtedly often advantageous to calculate as if this were the case. However, be careful ...

Eckard Blumschein

    Eckard,

    As I have stated in past conversations, I know of no instance in which nature makes use of any orthogonal transform, such as a Fourier transform. So the properties of such transforms do not coincide with any natural phenomenon. Humans developed such transforms, and subsequently assumed that they can be used to describe natural phenomena. They can - but the question remains, how precisely do they correspond to the actual phenomenon? Do they exactly reproduce every aspect of the phenomenon? I do not know of a single case in which that is true. They are useful tools for describing behaviors, just like English is a useful tool for describing behaviors. But neither, by itself, is capable of providing enough detail to enable an engineer to perfectly emulate any natural phenomenon being described. In other words, transforms and language merely provide a mechanism for talking about phenomena, but what is being talked about matters more than the mechanism does. If you use transforms (or English) to talk about nonsense (like wavefunctions collapsing), then you are still just talking about nonsense. The fact that you can very precisely describe this nonsense does not change the fact that it is still just nonsense.

    Here is the issue that I have been trying (without much success) to get physicists to understand: "in general, nature did not yet and can certainly never obey manmade arbitrarily constructed mathematics"

    In Shannon's Information theory, nothing is arbitrary; in order to recover "information" (not just data!) a receiver must know, a priori, exactly how to recover any information encoded into a signal. The proper technique for performing such a recovery cannot be deduced from analyzing the signal. Thus, the fundamental assumption underlying the entire scientific method, namely that it ought to always be possible to deduce the required recovery technique, by studying the data, is false. There are cases in which it is simply not possible. But this does not mean that the information can never be recovered! It just means that you have to already know exactly how to do it, independently of the data to which the technique is going to be applied. This is exactly the problem with quantum theory and the Heisenberg Uncertainty Principle.

    For example, in a "Bell test", in order to correctly determine the polarization of a photon, you must know a priori how to get the polarization axis of the detector perfectly aligned with the polarization of the photon, prior to ever attempting the measurement! In other words, the phase angle between the detector and the photon must be either 0 or 180 degrees, before the measurement is ever even attempted! But the entire point of every Bell test is to avoid doing that! Thus, the entire experiment is just producing garbage results! My point is, it is not just the phase angle in a Fourier representation that matters. Everything matters! You have to know exactly what measurements can be made and exactly how they need to be made, before even attempting to make them. Failing to understand that fact is why physicists have failed to understand quantum theory.

    The ancient Greek philosophers debated whether or not it is possible to find something, when you do not know exactly what you are looking for. Shannon definitively answered that question: there are some "things" that can never be reliably found, unless you do know exactly how to find them. He called such things "information". The Heisenberg Uncertainty principle, in its limiting case, defines a single bit of information; that is why physicists have so much trouble trying to measure it (at arbitrary angles), as in a Bell test. They have yet to realize what they are even looking for, or looking at - they are witnessing the behavior of "information", not collapsing wavefunctions, non-localities, or any of the other nonsensical interpretations that they have concocted.

    Rob McEachern

    Dear Robert,

    I regret having not yet found a new essay by you, because you as a native speaker seem to be in a better position to reveal mistakes that might affect theories. Admittedly, I prefer to abstain from using the insulting word nonsense.

    I am also waiting for a hopefully proficient essay by Klingman instead of the Swedish essay on ether wind.

    As for my definitely unwelcome criticism of the FT without Heaviside's fictitious split, I feel forced to it for two reasons:

    - There is no reason known to me to accept that the frequency analysis of a measured signal may depend on the arbitrary (redundant) choice of a reference point. Phase only makes sense to me as a relative measure.

    - The conjecture of causality is indispensable to me.

    Yours,

    Eckard


    Eckard,

    You are correct: "There is no reason known to me to accept that the frequency analysis of a measured signal may depend on the arbitrary (redundant) choice of a reference point."

    When I first studied FM-signal demodulation techniques (there are many), I was intrigued to discover that communications engineers do not use the concept of "Fourier frequency" at all, as it appears in connection with a Fourier transform. Instead, they use the concept of "instantaneous frequency" which is the first derivative of the "instantaneous phase", with respect to time. Since the derivative removes any constant phase offset, any "phase reference point" is immediately rendered irrelevant to the process. This is one of the reasons why frequency modulation was preferred over phase modulation, in early, analog radio systems - the demodulator never has to concern itself with trying to establish a phase "reference point". The same is true of the auditory system - the actual signal processing in such systems, has little to do with Fourier phases, frequencies or superpositions.
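    A minimal numerical sketch (assuming NumPy; the tone parameters are arbitrary illustrative values) makes the point: since the derivative removes any constant offset of the instantaneous phase, the estimated frequency is identical for every choice of phase "reference point".

```python
import numpy as np

# Instantaneous frequency = time derivative of unwrapped instantaneous
# phase. Any constant phase offset - i.e. any "reference point" for
# phase - drops out under differentiation.
fs = 1000.0                      # sample rate in Hz
t = np.arange(1000) / fs
f0 = 50.0                        # tone frequency in Hz

def inst_freq(phase_offset):
    z = np.exp(1j * (2.0 * np.pi * f0 * t + phase_offset))  # complex tone
    phi = np.unwrap(np.angle(z))              # instantaneous phase
    return np.diff(phi) * fs / (2.0 * np.pi)  # derivative -> frequency

# Two different phase reference points give the identical frequency:
f_a = inst_freq(0.0)
f_b = inst_freq(1.234)
assert np.allclose(f_a, f_b)
assert np.allclose(f_a, f0)
```

This is exactly why an FM demodulator never needs to establish a phase reference before recovering the signal.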

    So the reason that I do not share your concern about "inappropriate" properties of Fourier transforms is because I, like "mother nature", have learned to avoid ever using them for inappropriate forms of data analysis - anything other than crude, imprecise estimates of "blobs" of energy, distributed across a "power spectrum". That is all that wavefunctions-squared represent - crude estimates of a received energy distribution, which, when divided by the energy-received-per-quantum, yield an estimate of the number of received quanta (AKA a histogram or probability distribution). That is all there is to quantum theory! It has little to do with the behaviors of waves or particles or anything else! Quantum theory is nothing more than a Fourier transform based description of a histogram! Assuming that the theory is describing anything else is the cause of all the nonsensical interpretations. And that is why I do not abstain from using the word "nonsense"; the problem is obvious, once you understand what a Fourier transform based wavefunction is really describing. It is not describing a superposition at all. It is only describing a histogram - an energy-detecting filter-bank, in which each filter is just a "matched filter" for whatever you are trying to detect! When you employ an inappropriate "matched filter", as is the case in every Bell test, you get a bogus estimate of the number of correctly detected quanta - resulting in spurious correlations, the belief in "spooky action at a distance", and all the other misinterpreted nonsense being written about for the past eighty years.
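    For illustration (a hypothetical NumPy sketch with arbitrary values): a matched filter is nothing but correlation against a template that the receiver already knows, a priori and exactly - which is precisely why it can find a waveform buried in noise.

```python
import numpy as np

# Minimal matched-filter sketch (illustrative values): the detector
# correlates the received data against a template it knows exactly,
# a priori. Without that prior knowledge, no analysis of the data
# alone could reliably recover the buried waveform.
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 0.05 * np.arange(100))  # known waveform
received = rng.normal(0.0, 1.0, 1000)                 # background noise
received[400:500] += 5 * template                     # buried signal

# The peak of the correlation marks where the template was detected:
score = np.correlate(received, template, mode='valid')
detected = int(np.argmax(score))
assert abs(detected - 400) < 5
```

Employ the wrong template (a mismatched filter), and the peak - and the resulting count of "detections" - becomes meaningless.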

    Rob McEachern

    Robert,

    "...crude estimates of received energy distributions, which..., yields an estimate of the number of received quanta ."

    Yes, I must agree. The key word is 'received' and the qualifier is 'quanta'. We only detect (and only to an arbitrary degree, by prescribed observation) the transition zone at the antenna, not the source. We assume a symmetrical form of emission, but we can detect neither it nor the natural form of the far field. All of physics, including classicism, neglects that we are conjuring a hypothesis based entirely on the confusion of cross-sectional decay rates of intensity in the near field, which physically displays not only inverse-square, but also inverse-cube and inverse-exponential rates of change of intensity. And any detection only gives us the interactive responses in close proximity to a macroscopic 'antenna'.

    So what registers as 'received' is a product of the response of the aggregate molecular domains of electromagnetic response, of which the Planck-value quantum is the least observable average. For all we know (and all we can do is conjecture), an emission of energy has the physical form of a linear 'jet', and the entire quantum response is solely due to how material particles, in a yet-to-be realistically formulated model, respond to energy 'loads'.

    Given those physical limitations on detection, and interpretive observation, any type of analysis is not about the EM physical form, but about how we experimentally detect its reception. best jrc

    Dear Robert,

    Let me explain why I prefer abstaining from use of the insulting word nonsense: Strong words cannot make arguments stronger. On the contrary, they tend to indicate weakness.

    Consider for instance Knoll's provocative "Remembering the Future". Knoll is aware that he contradicts common sense. He knows that this makes him attractive, and many friends of Einstein's BU (block universe) will appreciate his musings accordingly.

    I should add that Ben Akiba claimed: "Any future event did already exist in the past". Corresponding religions, including rebirth, eternal life with the rewarding virgins in heaven, and fatalism, perhaps arose from the observed cyclicity of the four seasons.

    Eckard

    Eckard,

    That states it quite simply, and nature is not likely to be adequately described by any one particular method of analysis. Let me again commend you on the effort you have put into writing and conversing in the casual idioms of the English language. You must have labored long on your essay, and it does read well for those of us who have not had to learn any other language. Thanks again, this has been a learning experience for me, and good luck with the judging. best - jrc

    Eckard,

    I believe that your claim is valid for minor, evolutionary advances in knowledge, but not for revolutions in wisdom. Strong arguments that are being systematically ignored serve no purpose.

    "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." Max Planck

    Socrates, Galileo and Planck all eventually came to understand, that simply publishing a revolutionary, strong argument, falsifying existing "wisdom", is useless; because no one with a vested interest in the old ideas, will ever even consider it; it will be dismissed, outright, as "crackpot" at best or "heresy", at worst, that ought to be burned (along with the author). Physicists know the odds and know how to play them - what are the odds that legions of the "best and brightest" could be so completely wrong, for so long?

    Socrates et al, eventually realized that anyone attempting to get a radical, new idea accepted, will have to "throw down the gauntlet" and insult the intelligence of the "powers that be", to ever get them "off the fence" and either disprove the claim, or accept that their own long-cherished notions are wrong. In his old age, Socrates insulted the jury at his trial, rather than simply choosing to exile himself and thus be forgotten, as everyone expected him to do. And in his old age, Galileo eventually resorted to publicly calling the pope a "simpleton", in order to finally get the pope "off the fence". I too, am getting old.

    In the past, you yourself have commented on the fact that no one here, from academia, has even engaged, much less refuted, my simple "single bit of information" argument regarding Bell's theorem; they cannot find any flaw in it. Nor do they want it to be true - because (1) it will destroy their own "legacy" and (2) they are all deathly afraid of publicly admitting that they can find no flaw in it, for fear that someone else may eventually do so, thereby making them appear foolish and "lose face." Like people living on the flanks of Vesuvius, or near the San Andreas fault, they all know a "Big One" is inevitable, but they all cling to the hope that it will not happen now, after they have spent their entire careers, betting heavily, on the "wrong horse" - hence Planck's comment about waiting for their death. They are too afraid, to climb upon the shoulders of any giants, until after they have assured themselves that they will not be pushed off by their own "peers". In such situations, publicly challenging "their manhood" is often the only way to get them to ever commit to such an endeavor.

    Rob McEachern

    Dear Robert,

    Well, when I suggest calculating as if there were no causality, this is no revolution in wisdom. My position in this case is not even at odds with idealism and the Bible: In the BEGINNING WAS the word.

    Don't get me wrong: While science doesn't even need a beginning of time, the word WAS is based on trust in temporal order. The invariance of the laws of nature under a shift along the time scale is quite understandable: the laws were found by means of abstraction. They thereby lost their immediate link to the (conjectured) reality. In order to accept this compelling but systematically ignored argument, one doesn't even need to know Bell's theorem.

    When I selected just a few suggestions to calculate "as if", I already tried to indicate how they are interrelated.

    Next time I will, as promised, try to explain how to understand and how to apply the notion of being infinite for a standing wave between Neumann or Dirichlet mirrors.

    Best,

    John,

    "... nature is not likely to be adequately described by any one particular method of analysis." ???

    A Pole wrote: "The map is not the territory." To that extent I partially agree. The mathematical map in terms of abstracted laws extends from minus infinity to plus infinity. Nature is the unchangeable territory. It does not yet include what is still open to influences in the merely more or less predictable future. Fourier got utterly popular by providing something that is very elegant on the abstract level of theory but contradicts common sense: Complex analysis seems to allow a spectral analysis not just of data from the past but also of not yet available future data. Consequently, complex analysis implies denying the distinction between past and future, as Einstein and Hilbert actually did. Well, within a model or a record there is no "now": The map, like a photo, is not the territory.

    I already mentioned that science doesn't require a beginning of time (point of creation), and I add we don't need an end of time (doomsday) either. However, engineers like me need the here and the now.

    Kind regards,

    Eckard

    4 days later

    While perhaps nobody may deny that the complex Fourier transform introduces an arbitrary reference point, YouTube is promoting a video: "But what is the Fourier Transform? A visual introduction".

    The video repeatedly and nicely illustrates how an endless sinusoidal function of time can be wrapped around a circle. Is there a problem? Yes: One may calculate as if there were no causality, but in reality the past is closed and the future is open. Accordingly, there is a border between past and future that can be shifted at will on the level of abstract models, but definitely not at the basic level of physical reality.

    Fourier's theory, as well as the video, neglects something that is also quite plausible: An endless path is (only) imaginable along a closed loop, no matter whether it forms a circle or an interval between two mirrors. In the latter case, a standing wave obeys Neumann or Dirichlet boundary conditions at the surfaces of the mirrors.

    Endlessness at both sides implies the need to choose a point of reference at will.

    Eckard:

    I read your essay with great interest. You'll see mine has some common ground. However, I understand your view to be that the characteristics you mention may be used by physics, but with caution in relating them to observation (not sure you meant the "observation" part). Could you accept the idea that the use of these math characteristics produces problematical physics and should indicate a model that needs a redo (my thesis)?

    One thing I treat lightly (for lack of space) is the place of error analysis/statistics as misleading and inadequate for physical models. This point was explored in Nielsen, Guffanti & Sarkar, arXiv:1506.01354, "Marginal evidence for cosmic acceleration from Type Ia supernovae". This point was further explored in Sabine Hossenfelder's recent interview of S. Sarkar https://www.youtube.com/watch?v=B1mwYxkhMe8&list=PLwgQsqtH9H5fe4B5YCF3vcZgIkMMULS7z

    Let me add a bit to a previous comment on your essay: in truncated Fourier analysis, the first term after the truncation is the uncertainty (Heisenberg's uncertainty?).

    Would you comment on the idea that all the added dimensions, imaginary numbers, and things like Fourier constants do not improve understanding or physics? They merely mask better physics.

    A bit on numbers. As you see, I hold only cardinal numbers as useful, with irrational and transcendental functions contributing to the error between observation and math. I understand the natural interest in describing the extension of a point to a line would want to include such numbers. But I reject imaginary numbers as an unnecessary crutch.

    I also note that Turing's proof includes the ordinal numbers, which makes the proof nonphysical. Similar nonphysical elements are in Gödel. I wish I had spent a bit more space on this point - but space.

    Hodge

      JC Hodge,

      My main credo/topic is causality. You wrote: "greater Understanding and greater Wisdom yields survival and population growth." Are you sure that survival and unlimited population growth don't eventually exclude each other? This perhaps shocking question of mine is intended to raise awareness of the importance and the risks of idealization in general, including in physics. Be careful, the map is not the territory. Nonetheless, I suggest continuing to deliberately calculate, to some extent, as if there was no causality.

      By the way, Heisenberg's uncertainty is not bound to the complex Fourier transform. It relates to conjugate pairs like time and frequency with the real-valued cosine transform too.

      Eckard

      Tim Palmer called some questions of mine "deep questions!!". I guess typical mistakes in mathematics cannot be so deeply rooted that they cannot be clearly addressed, and most likely it is often possible to find out what went wrong in history.

      My reason to delve into the fundamentals of mathematics was the rejection of my suggestion to allow R+ and the cosine transform instead of R. My argument was: In order to describe the past (or the future) alone, one does not need time values that extend from minus infinity to plus infinity. In particular, data that are not yet available cannot be analyzed. I am arguing that R+ and the cosine transform CT relate to R and the FT as N relates to Z. R and the FT differ from R+ and the CT only in that they need an arbitrarily chosen reference point. Instead of accepting R+ and the CT, some mathematicians denied the possibility of separating R+ from R. Indeed, modern topology doesn't allow a discrete cut. I blame Hausdorff and Dirac for making this mistake very obvious. I found out, as I indicated in my essay, that the notion of continuity as used by Rolle is inappropriate in the case of a discrete jump. By the way, Rolle understood, in contrast to Descartes, that -1 is larger than -2.
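      The relation between R+ with the CT and R with the FT can be sketched numerically (a hypothetical NumPy illustration with arbitrary data): even-extending data given only for t >= 0 makes the Fourier spectrum purely real, identical to a cosine (DCT-I) transform, with no arbitrary phase reference left.

```python
import numpy as np

# A signal known only for t >= 0 can be analyzed by a real cosine
# transform via even extension - no arbitrary phase reference point
# is needed. The FFT of the even-symmetric extension is purely real
# and equals the DCT-I of the data.
x = np.random.default_rng(1).standard_normal(8)   # data on "R+"
N = len(x)
xe = np.concatenate([x, x[-2:0:-1]])   # even extension, length 2*(N-1)
X = np.fft.fft(xe)

# The spectrum of the even extension has no imaginary part:
assert np.allclose(X.imag, 0.0, atol=1e-10)

# ... and its real part is exactly a cosine (DCT-I) transform:
n = np.arange(1, N - 1)
C = np.array([x[0] + (-1)**k * x[-1]
              + 2.0 * np.sum(x[1:-1] * np.cos(np.pi * k * n / (N - 1)))
              for k in range(N)])
assert np.allclose(X.real[:N], C)
```

In this sense the cosine transform on R+ carries the same information as the complex FT of the (even-extended) data on R, minus the redundant reference point.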

      Eckard

      Note the "and", implying both. Certainly Malthus identified a rule of nature. Technology has allowed a greater population level. But Malthus was correct: population increases faster than even technology. So, either humanity deals with Malthus' nature, or nature will do the population limiting itself. Humanity is a "keystone" species ("The Serengeti Rules"), which means starvation is nature's way of dealing with us. Such has been the case throughout history. Nature's contraction comes at the end of a warm period, where food production allows the overpopulation. Then a cool period restricts food production to levels below that required by the population. Civilization collapse follows. So, now we are facing a coming cool period. Are humanity's morals going to prevent a collapse of civilization? I think not. The moral of supporting the weak and non-producers may guarantee collapse. Note the Polynesians on isolated islands had to deal with the food limit by supporting infanticide (of the weak) and suicide.

      I like your "territory" analogy. Certainly, the line (math) on a map (transformation) is not the road (physics).

      John C Hodge,

      My message is: Be careful when calculating as if an ideal map were the territory, even though it is of course different. In the case of ethics, the biblical ideal is to get more power by getting more followers. As far as I know, Malthus did not yet vote for a more comprehensively responsible ethics, because he focused on nutrition and ignored that the main risk is not directly malnutrition but the side effects of more efficient methods to exhaust and irreversibly poison nature.

      At first, I suggest dealing with the ideals in mathematics and in physics. Is my distinction between Euclid's ideal point and the mathematician's dot plausible to you? Wilhelm Busch mocked:

      Who cannot imagine a point is simply too lazy for that. (My source: Mückenheim "Die Geschichte des Unendlichen").

      Eckard
