Dear John, dear all,

"Fourier Transforms of a Heaviside Step Function: a Tragedy" ??

I would rather apply the word tragedy to human lives. Just a few examples:

Archimedes was killed: "Do not disturb my circles!"

Martin Luther saved Stifel, who had calculated and predicted an imminent doomsday

Georg Cantor went insane, claimed to have received the CH directly from God, and died in an asylum

Kronecker was hounded and died

Ritz and also Minkowski suddenly fell ill and died

Suicides by Boltzmann, Hausdorff, Turing

Gödel's paranoia

Grothendieck's disappearance

In the case addressed by Buehler, there is a quite simple logical solution; admittedly, one has to free oneself from traditional formalism and go back to basics, as I tried to indicate in § 2 of my essay.

My style of teaching is a bit different from Feynman's. I hope you may find the solution yourself soon; I will give you just a hint. Feynman allegedly refused to explain half spin. Why didn't he just mention that a full circle (360°) of cos equals 720° of cos squared?

Once again, the solution is easy to find for anybody ready for critical thinking. Don't shy away from questioning the very basis of mathematics. Admittedly, I was inspired by a Professor Schwarz of South Africa whom I met in Milano in 1992; Dean Mückenheim provided me with many details, and I recall having read a lot of literature in German, e.g. Hans Gericke and Oskar Becker.

Best,

Eckard

We should not neglect what John addressed: My suggestion "calculate as if there was no causality but be careful" also relates to the artificial boundaries of the interval under consideration.

However, I claimed to have a simple logical solution to the Fourier transform of the Heaviside function. Here it is:

H(t) can be split into two fictitious parts, the even one = ½ and the odd one = ½ sign(t).

Notice: Frequency analysis of measured (i.e. past) data requires H(-t) and sign(-t).

With cos(ωt) + i sin(ωt) as the kernel of the Fourier transform, integration from minus infinity to plus infinity yields the real part of H(ω) = ½ and the imaginary part = 1/(iω).
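For readers who want to check the odd part numerically, here is a sketch (my own illustration, not part of the original argument; the regularizing factor e^(-ε|t|), the choice ω = 2, and the integration grid are arbitrary). The transform of ½ sign(t) should approach 1/(iω), i.e. an imaginary part of -1/ω:

```python
import numpy as np

# Regularized Fourier integral of the odd part (1/2)*sign(t):
#   F(omega) = integral of (1/2)*sign(t) * exp(-i*omega*t) * exp(-eps*|t|) dt
# By symmetry the real part vanishes; the imaginary part equals
#   -integral_0^inf sin(omega*t) * exp(-eps*t) dt
# and approaches -1/omega as eps -> 0, i.e. F(omega) -> 1/(i*omega).
omega = 2.0          # arbitrary test frequency
eps = 1e-3           # small convergence factor (arbitrary)
t = np.linspace(0.0, 1e4, 1_000_001)
f = -np.sin(omega * t) * np.exp(-eps * t)              # integrand, t > 0
imag_part = np.sum((f[1:] + f[:-1]) / 2 * np.diff(t))  # trapezoid rule
print(imag_part)     # close to -1/omega = -0.5
```

For ω = 2 the printed value comes out very close to -0.5, matching 1/(iω) = -i/ω.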

Buehler's example doesn't share the widespread mistake of defining H(t) with t>=0 instead of t>0, and it illustrates that calculating as if setting t=0 in H(t) were correct may lead to wrong results. The use of distributions is not easy and perhaps unnecessary.

Notice: Euclid's ideal point, something that has no parts, contradicts a notion of number which is, as illustrated by Hausdorff, rather based on embedded dots.

By the way, Heaviside hated geometrical evidence. Gauss criticized the desire for unnecessary acuity. Why? A point "at zero" cannot be split into a positive and a negative part. The only solutions are to calculate as if, or to have a 0+ and a 0-. According to Salviati, the relations larger than and smaller than are invalid for indefinitely large (as well as indefinitely small) numbers. We may add: they are invalid for any truly real numbers, not only for infinity and zero.


Anyway, where there is no natural reference point (there is one with the t=0 of H(t), but not with Fourier), an arbitrary choice is unavoidable.

Thanks Eckard,

that is reasonable and easy enough to follow. The clarity rests on the distinction that "greater than or equal to" zero is fundamentally different from simply "greater than" zero. So Fourier requires the choice of calculating one or the other sign first, and separately, so as not to become confused with corresponding terms of steps of the opposite sign. Some may say that casts the whole exercise in an arbitrary measure of observer dependence, but in physical fact the observation of any signal is fundamentally arbitrary. We set the criteria of what we want to observe in the first place. When we get a noise-free result returned, then we have the information we want. So in that sense, yes, I can see how Heaviside's method would be 'noisy'. best jrc

Yes, Eckard,

on giving it further consideration, I think that with that clarification your essay makes the case quite well. It all holds together very nicely. Perhaps for the present we must content ourselves with the only definite thing about 'zero' being that it has an indefinite value. cheers - jrc

Thanks for your comments on my page Eckard...

This essay looks very interesting, judging by the abstract, and I shall look forward to reading it. For the record; Hilbert was not alone, and many people have put faith in the 'excluded middle' when in fact there was a middle ground, spawning what I call 'false dilemmas,' and much confusion of course.

More later,

Jonathan

Dear Eckart,

the more I read your essay and follow the discussion, the less I see the point you're trying to make. Fourier states in J.B.J. Fourier, Theorie de la chaleur dans les solides, 1807:

The integrals we have obtained are not only general expressions that satisfy the differential equations; they represent in a different way the natural effect, which is the object of the problem. This is the main condition that we have always had in view, and without which the results of the operations would appear useless transformations. [note the terms 'differential equations' and 'natural effect']

What he says is that mathematics is a desert with a few oases called physics. My personal guess is that less than one percent of the totality of known math has correspondence in the 'world'. And this tiny part must be used, not questioned, for there is no ever-knowable connection between these bits of math and the PHENOMENA.

Best regards,

Heinz

    Dear Heinz,

    Of course, we may follow Fourier (and not just him) and calculate as if there were no causality. Having quoted from page 7 of the English translation of Fourier's 1822 theory: "... mathematical analysis is as extensive as nature itself, it defines all perceptible relations, measures times, spaces, ..." I maintain my objection: Fourier was wrong in this decisive respect. My argument is quite compelling: measured data which are available for mathematical analysis definitely do not extend from minus infinity to plus infinity; they include only the past. In other words, Fourier was wrong because he uncritically adopted a widespread fatalistic philosophy that generalized too much (cf. the words 'general expressions' in what you quoted).

    Does this explain my point better?

    Best regards, Eckard

    Dear Eckart,

    Fourier was a pre-modern, a man of classical physics; that's why I highlighted 'differential equations'. TIME to him was something totally different from what it is for historians and logicians, i.e. the romantics. If my sources are correct, he wrote the variables on both sides of the FT as 'x' and 'u' (which are still used in Fourier optics), not as 't' and 'f or omega'. So we disagree on the concept of TIME, with no chance of reconciliation. Nevertheless, good luck in the contest!

    best regards,

    Heinz

    P.S. Hegel abhorred FICHTE'S Dreischritt of thesis-antithesis-synthesis, because for him dialectics is not a process set in grammatical-logical-historical time but a principle of the mind. That's why for him 'evolution occurs at a single stroke'.

    Dear Heinz,

    At the time of Lagrange, Laplace, and Fourier, differential equations were not new, and heat conduction is not thinkable without time. Admittedly, I don't understand why you are not in a position to accept compelling arguments, or at least to write my name correctly: Eckard (neither Eckhard nor Eckart). Why do you disagree with my concept of time? I wonder whether there is, for engineers, an acceptable alternative to time as something that includes past and future in the common-sense way.

    Even though I am only a bit familiar with Fourier acoustics, and not at all with Fourier optics, I am aware of wave numbers k, evanescent modes, etc. Complex spatial frequencies correspond to ordinary complex frequencies as elapsed time corresponds to the likewise always positive quantity radius r, not to spatial coordinates x, y, z.

    Best regards,

    Eckard

    Eckard,

    I enjoyed your essay.

    A few comments:

    "There is a decisive advantage of digital over analog technology: Digital signals may cope with the noise-caused loss of decidability." Any actual advantage comes from choosing to "represent" only discrete symbols, like the letters of an alphabet, rather than from merely representing an analog signal via digitized samples taken from the analog signal. In other words, it is what is being represented (a discrete stream of alphabetic symbols - the most familiar being a two symbol alphabet - a bit), rather than how it is being represented (either analog or digital), that matters, when attempting to cope with any impairment, such as noise, distortion, or interference.

    "However, since the laws of nature were abstracted from reality, they are no longer temporally or locally bound to concrete points on the actual scales of time or space." Too many physicists have lost sight of the fact, that the laws have only been abstracted from a small temporal fraction of reality, and from only a small spatial fraction, as well. That fact has a direct impact upon the finite information content of both the observations themselves and the laws being abstracted from them. Such laws, with only a finite (and very small) information content, can never completely represent any infinite reality, or even any finite reality, that happens to be greater than the fraction that has actually been observed. Assuming (as much of mathematical physics does) that everything that has never been observed, is going to behave in precisely the manner as everything that has been observed, is seldom a very good assumption.

    "So far it is reasonable practice to tolerate so called non-causalities for instance in the current theory of signal processing for the sake of elegant calculability." It is not necessary to tolerate it - the problem you are addressing can be (and always is, in any properly functioning system) entirely avoided, by simply introducing enough delay into the processing, that everything that is being processed, already exists in the past (via a delay sampling buffer), so the future has no relevance to the only thing that is actually being processed (the previously buffered-up, past signal).

    "Notice: Expansion of mathematics at will cannot expand nature... The practice to freely define axioms more or less at will... It opened the door for a considerable expansion of mathematical theories." I agree. In that regard, you may recall my 2015 essay, stating that it is precisely the different nature of their axioms, that distinguishes math from physics.

    "For instance the human ear as a frequency analyzer... " Be careful. The auditory system does not analyze frequency, anymore than the visual system does. The bad assumption, that the perceptions of pitch and color, are being generated by any sort of frequency analysis, has confused scientists for generations; they are generated from ratios of amplitudes, which is why they are so insensitive to phase. I do not dispute that it may indeed seem "as if" the auditory system is analyzing frequency. I am only advising that one needs to "be careful" - there are other signal processing techniques that correlate much better with pitch perception, than any sort of frequency analysis does.

    "Unfortunately, one cannot even prove the theory of quantum mechanics wrong..." Alas, it all depends on what the word "wrong" represents. It is easy to prove that QM is wrong, if it is supposed (AKA interpreted) to be describing the behavior of substances (analogous to a drug), rather than merely describing the behavior of a test (like a drug test) for the existence of some particular substance, at some particular place and some particular time. The point is, drug tests are known to exhibit "false positives"; QM only correctly predicts the likelihood that something will be detected, but makes no prediction at all, regarding the likelihood that what was in fact detected, was the same "something" that it was supposed to detect - such as actually being "up" when the detector mistaken called it "down". It is easy to show that such "false positives", occurring in the polarization tests associated with Bell's theorem, can reproduce the supposedly, impossible-to-produce-classically correlations. In short, the equations of QM do not represent (are being misinterpreted) what any of the well-known interpretations believe that they represent; The do not represent the behavior of any substance (like a photon or an electron). They only describe the behavior of a frequently "false positive" test for the substance.

    Rob McEachern

      Dear Rob,

      As I expected, your comments largely confirm my essay.

      In particular I hope our different approaches may eventually agree on consequences concerning QM. You didn't quote my complete sentence. I wrote:

      "Unfortunately, one cannot even prove the theory of quantum mechanics wrong, because it was not logically derived but just heuristically fabricated by Schrödinger as well as by Heisenberg."

      I meant the impossibility of proving something logically wrong that was not logically derived.

      I vehemently maintain my utterance: "So far it is reasonable practice to tolerate so called non-causalities for instance in the current theory of signal processing for the sake of elegant calculability."

      You obviously don't believe me when arguing: "It is not necessary to tolerate it - the problem you are addressing can be (and always is, in any properly functioning system) entirely avoided, by simply introducing enough delay into the processing, that everything that is being processed, already exists in the past (via a delay sampling buffer), so the future has no relevance to the only thing that is actually being processed (the previously buffered-up, past signal)."

      While such more or less unnecessary maneuvers may hide the non-causalities to some extent, they do already obviously not work for ideal low pass filters.

      Concerning the digital vs. analog issue, I imagined for instance an originally acoustic or optical (i.e. analog) content.

      As for the laws abstracted from reality, I don't suspect the primary problem lies in the finiteness of observation. I rather understand the brain as selecting, combining and checking patterns which were positively evaluated by the amygdala. Nonetheless, I share your attitude concerning wild speculations. When Jonathan Dickau mentioned his "wild suspicion" on p. 7 of his current essay, with reference to [11], I felt like a layman.

      Having participated for many years in discussions among leading experts on auditory perception, including the outsider Steven Greenberg who picked me up, I don't deny tonotopy in the cochlea, CN, ICC and beyond. If there is a mistake, then it is not in physiology but in the standard theory of signal processing.

      Let me take a break for today.

      Yours,

      Eckard

      Eckard,

      Much of what Schrödinger and Heisenberg did was "logically derived"; the problem is, some of the axioms their derivations were founded upon (such as perfectly identical particles, and using a single Fourier transform (wavefunction) to describe multiple particles) are demonstrably invalid in any realistic, rather than idealistic (noise-free), conception of reality, resulting in an entire century of misinterpretation regarding what the math is actually describing.

      "they do already obviously not work for ideal low pass filters" They only do not work for IIR filters, but they work perfectly for FIR filters. Similarly, all actual measurements are of finite duration. Consequently all Fourier transforms of such data, only integrate over finite intervals - you can use an infinite integral, but the integrand itself will be identically zero, beyond the limits of the actual data. Thus, formal integration beyond the limits of the actual data, contributes nothing to the final result, so there is no problem with causality, when the math is correctly interpreted. Idealistic functions (such as a harmonic oscillator), extending to infinity, are just that - idealistic - and do not correctly represent reality as it is actually observed (unless one assumes that reality is perfectly predictable - in which case, one can integrate over the prefect prediction, rather than any actual data!), because all observations are of finite duration.

      "What about the laws abstracted from reality" In my view, the existing "laws of physics" are not "fundamental", "preexisting" or "fixed" and so do not cause effects, even though it seems "as if" they do. But be careful, rather, repeatable effects evolve into being, as the information content of interacting entities changes over time. Subsequently, later observers will be able to create their mathematical "laws" to describe these newly emergent, repeatable effects, that never previously existed. So the effects being described today, may not have even existed, in the distant past. In other words, if an observer existed in the distant past, any perceivable regularities would have very probably been rather different, from those perceivable today - so the laws formulated to describe those past effects, would have been different from those describing the observable effects today.

      Rob McEachern

      Robert,

      To evade useless quarreling with a believer in the complex FT over whether or not truncation (FIR) in combination with a shift rescues the FT, I would like to mock: may we attribute to the complex wave function a behavior similar to phase deafness in hearing, instead of the mysterious so-called collapse of the wave function?

      Eckard

      Robert,

      Did you notice my new claim to have found out, after carefully reading Fourier's original work, in what respect he was wrong? I hope you are beginning to understand my argument concerning redundancy. I know "redundant" can also mean out of a job, and it also has a positive meaning in the sense of an additional reserve. My argument: more freedom, and in particular the need to arbitrarily choose something, is not always acceptable.

      As for Schrödinger, I refer to "Schrödinger, Life and Thought" and to his 4th communication.

      Eckard

      On p. 2 of Robert McEachern's 2015 essay I found the expression "Fourier Uncertainty Principle", which was certainly meant in the sense of Heisenberg's uncertainty relation. Let me reiterate: Fourier was wrong when he believed that the complex transformation he advertised is as extensive as nature itself.

      It is undoubtedly often advantageous to calculate as if this was the case. However, be careful ...

      Eckard Blumschein

        Eckard,

        As I have stated in past conversations, I know of no instance in which nature makes use of any orthogonal transform, such as a Fourier transform. So the properties of such transforms do not coincide with any natural phenomenon. Humans developed such transforms, and subsequently assumed that they can be used to describe natural phenomena. They can - but the question remains: how precisely do they correspond to the actual phenomena? Do they exactly reproduce every aspect of the phenomenon? I do not know of a single case in which that is true. They are useful tools for describing behaviors, just like English is a useful tool for describing behaviors. But neither, by itself, is capable of providing enough detail to enable an engineer to perfectly emulate any natural phenomenon being described. In other words, transforms and language merely provide a mechanism for talking about phenomena, but what is being talked about matters more than the mechanism does. If you use transforms (or English) to talk about nonsense (like wavefunctions collapsing), then you are still just talking about nonsense. The fact that you can very precisely describe this nonsense does not change the fact that it is still just nonsense.

        Here is the issue that I have been trying (without much success) to get physicists to understand: "in general, nature did not yet and can certainly never obey manmade arbitrarily constructed mathematics"

        In Shannon's Information theory, nothing is arbitrary; in order to recover "information" (not just data!) a receiver must know, a priori, exactly how to recover any information encoded into a signal. The proper technique for performing such a recovery cannot be deduced from analyzing the signal. Thus, the fundamental assumption underlying the entire scientific method, namely that it ought to always be possible to deduce the required recovery technique, by studying the data, is false. There are cases in which it is simply not possible. But this does not mean that the information can never be recovered! It just means that you have to already know exactly how to do it, independently of the data to which the technique is going to be applied. This is exactly the problem with quantum theory and the Heisenberg Uncertainty Principle.

        For example, in a "Bell test", in order to correctly determine the polarization of a photon, you must know a priori how to get the polarization axis of the detector perfectly aligned with the polarization of the photon, prior to ever attempting the measurement! In other words, the phase angle between the detector and the photon must be either 0 or 180 degrees, before the measurement is ever even attempted! But the entire point of every Bell test is to avoid doing that! Thus, the entire experiment is just producing garbage results! My point is, it is not just the phase angle in a Fourier representation that matters. Everything matters! You have to know exactly what measurements can be made and exactly how they need to be made, before even attempting to make them. Failing to understand that fact is why physicists have failed to understand quantum theory.

        The ancient Greek philosophers debated whether or not it is possible to find something, when you do not know exactly what you are looking for. Shannon definitively answered that question: there are some "things" that can never be reliably found, unless you do know exactly how to find them. He called such things "information". The Heisenberg Uncertainty principle, in its limiting case, defines a single bit of information; that is why physicists have so much trouble trying to measure it (at arbitrary angles), as in a Bell test. They have yet to realize what they are even looking for, or looking at - they are witnessing the behavior of "information", not collapsing wavefunctions, non-localities, or any of the other nonsensical interpretations that they have concocted.

        Rob McEachern

        Dear Robert,

        I regret not yet having found a new essay by you, because as a native speaker you seem better placed to reveal mistakes that might affect theories. Admittedly, I prefer to abstain from using the insulting word nonsense.

        I am also waiting for a hopefully proficient essay by Klingman instead of the Swedish essay on ether wind.

        As for my definitely unwelcome criticism of the FT without Heaviside's fictitious split, I feel compelled for two reasons:

        - There is no known to me reason to accept that the frequency analysis of a measured signal may depend on the arbitrary (redundant) choice of a reference point. Phase only makes sense to me as a relative measure.

        - The conjecture of causality is indispensable to me.

        Yours,

        Eckard


        Eckard,

        You are correct: "There is no known to me reason to accept that the frequency analysis of a measured signal may depend on the arbitrary (redundant) choice of a reference point."

        When I first studied FM-signal demodulation techniques (there are many), I was intrigued to discover that communications engineers do not use the concept of "Fourier frequency" at all, as it appears in connection with a Fourier transform. Instead, they use the concept of "instantaneous frequency" which is the first derivative of the "instantaneous phase", with respect to time. Since the derivative removes any constant phase offset, any "phase reference point" is immediately rendered irrelevant to the process. This is one of the reasons why frequency modulation was preferred over phase modulation, in early, analog radio systems - the demodulator never has to concern itself with trying to establish a phase "reference point". The same is true of the auditory system - the actual signal processing in such systems, has little to do with Fourier phases, frequencies or superpositions.
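        A numerical sketch of this instantaneous-frequency idea (my own toy example, not from either essay; the FFT-based analytic signal plays the role of a Hilbert transformer, and the FM test tone and the two phase offsets are arbitrary choices): differentiating the instantaneous phase recovers the frequency, and a constant phase offset drops out entirely.

```python
import numpy as np

def analytic_signal(x):
    # Analytic signal via the FFT (assumes an even-length real input).
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = h[len(x) // 2] = 1.0
    h[1:len(x) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
f_true = 50 + 10 * np.sin(2 * np.pi * 2 * t)   # slowly varying FM tone
phase = 2 * np.pi * np.cumsum(f_true) / fs

results = []
for offset in (0.0, 1.234):                    # two constant phase offsets
    x = np.cos(phase + offset)
    inst_phase = np.unwrap(np.angle(analytic_signal(x)))
    f_inst = np.diff(inst_phase) * fs / (2 * np.pi)
    results.append(f_inst[400:600])            # interior, away from edges

# The derivative removes the constant offset: both runs agree.
print(np.max(np.abs(results[0] - results[1])))
```

In the interior, both runs track f_true closely and are indistinguishable from one another, regardless of the offset, just as the FM-demodulation argument says.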

        So the reason that I do not share your concern about "inappropriate" properties of Fourier transforms, is because I, like "mother nature", have learned to avoid ever using them for inappropriate forms of data analysis - anything other than crude, imprecise estimates of "blobs" of energy, distributed across a "power spectrum". That is all that wavefunctions-squared represent - crude estimates of a received energy distribution, which, when divided by the energy-received-per-quantum, yields an estimate of the number of received quanta (AKA a histogram or probability distribution). That is all there is to quantum theory! It has little to do with the behaviors of waves or particles or anything else! Quantum theory is nothing more than a Fourier transform based description of a histogram! Assuming that the theory is describing anything else is the cause of all the nonsensical interpretations. And that is why I do not abstain from using the word "nonsense"; the problem is obvious, once you understand what a Fourier transform based wavefunction is really describing. It is not describing a superposition at all. It is only describing a histogram - an energy-detecting filter-bank, in which each filter is just a "matched filter" for whatever you are trying to detect! When you employ an inappropriate "matched filter", as is the case in every Bell test, you get a bogus estimate of the number of correctly detected quanta - resulting in spurious correlations, the belief in "spooky action at a distance", and all the other misinterpreted nonsense being written about for the past eighty years.
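        The matched-filter remark can be illustrated with a toy detector (my own sketch; the windowed-sine template, the noise level, and the pulse position are arbitrary choices): correlating a noisy record against the known template peaks at the location of the buried pulse.

```python
import numpy as np

rng = np.random.default_rng(2)
template = np.sin(2 * np.pi * np.arange(32) / 8) * np.hanning(32)
signal = rng.normal(0.0, 1.0, 512)              # unit-variance noise
true_pos = 200
signal[true_pos:true_pos + 32] += 5 * template  # pulse buried in noise

# Matched filter: correlate the record with the known template.
score = np.correlate(signal, template, mode='valid')
print(np.argmax(score))   # at (or next to) true_pos
```

The detector only works because the template is known a priori, which is exactly the point being made about "knowing what you are looking for".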

        Rob McEachern

        Robert,

        "...crude estimates of received energy distributions, which..., yields an estimate of the number of received quanta ."

        Yes, I must agree. The key word is 'received' and the qualifier is 'quanta'. We only detect (and only to an arbitrary degree, by prescribed observation) the transition zone at the antenna, not the source. We assume a symmetrical form of emission, but we can neither detect it nor the natural form of the far field. All of physics, including classicism, neglects that we are conjuring a hypothesis based entirely on the confusion of cross-sectional decay rates of intensity in the near field, which physically display not only inverse-square, but also inverse-cube and inverse-exponential rates of change of intensity. And any detection only gives us the interactive responses in close proximity of a macroscopic 'antenna'.

        So what registers as 'received' is a product of the response of the aggregate molecular domains of electromagnetic response, of which the Planck value Quanta is the least observable average. For all we know, and all we can do is conjecture, an emission of energy has the physical form of a linear 'jet', and the entire Quantum response is solely due to how material particles in a yet to be realistically formulated model, respond to energy 'loads'.

        Given those physical limitations on detection, and interpretive observation, any type of analysis is not about the EM physical form, but about how we experimentally detect its reception. best jrc

        Dear Robert,

        Let me explain why I prefer to abstain from using the insulting word nonsense: strong words cannot make arguments stronger. On the contrary, they tend to indicate weakness.

        Consider for instance Knoll's provocative "Remembering the Future". Knoll is aware that he contradicts common sense. He knows that this makes him attractive, and many friends of Einstein's BU (block universe) will appreciate his musings accordingly.

        I should add that Ben Akiba claimed "Any future event did already exist in the past". Corresponding religions, including rebirth, eternal life with rewarding virgins in heaven, and fatalism, perhaps arose from the observed cyclicity of the four seasons.

        Eckard