It seems to have escaped Wolpert's somewhat limited attention that no two real snowflakes are identical; and since all of the laws of the universe must be consistent throughout, no two quanta can be identical either. And if no two quanta can be identical, all abstract information concerning their utility is utter codswallop.

Joe Fisher

Fascinating. Another important part of quantum mechanics is the Born Rule. This is not the MOST relevant possible thread for my point, but I also want to test formatting (whether character versions of subscripts and superscripts will show up as such, rather than as format commands). The Born Rule in quantum mechanics states that the probability of detecting a particle, etc., is proportional to the square of the absolute value of the net wavefunction at that place and time. Despite inviting comparison to energy density being proportional to field amplitude squared, the BR is often presented as mysterious, as if it were a free parameter of nature rather than something that makes logical sense. I came up with a simple way to show that the known form of the BR is necessary, if we neglect complicated and unusual alternatives. We also reasonably assume simple additive superposition of amplitudes, basic linearity (e.g. of filter response), and that exponents must be positive or zero (to avoid the zero-amplitude crisis).
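
In symbols (a standard statement of the rule, not specific to this thread): the detection probability density is P(x, t) = |ψ(x, t)|², with the wavefunction normalized so that ∫ |ψ(x, t)|² dx = 1.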

My proof derives from the need to conserve the total number of particles transiting a Mach-Zehnder interferometer with asymmetrical beamsplitters. The total is normalized as unity. An ABS splits an incoming beam into unequal outputs. Hence a ≠ b, where a is the transmitted amplitude and b is the reflected amplitude. These may have different phases and thus complex values, but the proof can proceed because the phases are equal where the amplitudes combine in the fully constructive output channel. This demonstration may not show universal applicability of squared moduli, but it does rule out alternatives.

We know from the BR that the corresponding intensities equal a² and b², and hence in the idealized case of no absorption used for modeling: a² + b² = 1. But did that have to be true, instead of, say, cubed amplitudes, such that a³ + b³ = 1? If we simplify by considering one-term exponent laws, then consistency says "yes." (Further exploration is welcome, but the case implies any alternative would be contrived.) So, consider an MZI with an asymmetrical BS at each end. The first, ABS₁, transmits with amplitude a and reflects with amplitude b. Considering simple exponents n (which don't have to be integers), we need aⁿ + bⁿ = 1. So far, we have no way to narrow that down. These beams are recombined in ABS₂. This latter follows typical practice of outputting maximum constructive interference (no phase difference) in the lower, "A" channel. However, it reverses the transmission/reflection amplitudes relative to ABS₁. So: the originally transmitted beam is reflected at ABS₂ into Ch. A, for a final output amplitude there of a². The originally reflected beam is transmitted at ABS₂ into Ch. A, for a final output amplitude there of b². Superposition gives the total amplitude as a² + b².
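
To make the bookkeeping concrete, here is a minimal Python sketch of that argument (my own illustration; the function name and the choice a = 0.6 are assumptions for the example, not part of the original setup). It normalizes a and b under a trial law aⁿ + bⁿ = 1 at ABS₁, propagates both arms through the reversed ABS₂ in a simple real-amplitude model, and checks whether the same trial law still conserves the total at the outputs:

    def conservation_error(n, a=0.6):
        """Enforce a**n + b**n = 1 at ABS1, then test whether the same
        exponent law conserves the particle total at the MZI outputs."""
        b = (1.0 - a**n) ** (1.0 / n)    # reflected amplitude under the trial law
        # ABS2 swaps transmission and reflection, so the constructive
        # channel A receives a*a (transmitted arm, then reflected) plus
        # b*b (reflected arm, then transmitted), combining in phase:
        amp_A = a * a + b * b
        amp_B = a * b - b * a            # destructive channel B: exactly zero here
        total = amp_A ** n + amp_B ** n  # trial law applied to the outputs
        return abs(total - 1.0)

    for n in (1.0, 2.0, 3.0):
        print(f"n = {n}: conservation error = {conservation_error(n):.6f}")
    # Only n = 2.0 comes out exact: a**2 + b**2 = 1 forces (a**2 + b**2)**2 = 1.

Only the squared law survives; any other single-exponent law either loses or manufactures particles at the output.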

That already looks promising, but we aren't done yet. First, we have to ensure that the output at Channel B must be zero. We can: since the Ch. A output is already the maximum possible, pairing it with anything other than zero amplitude would be a contradiction. Suppose instead that zero output were paired with something less than the maximum possible amplitude. Then pairing the maximum with any value of zero or more would produce a larger total than before. But the totals must always be the same, so zero and maximum are paired. (It may seem obvious, but it's good to show the formal necessity.)

Now, we can proceed to satisfy the following equation:

(a² + b²)ⁿ = 1

a² + b² = 1^(1/n) = 1

That is basically it. If the rule had been, say, the amplitude itself or its cube, it could not be that a³ + b³ = 1 (or a + b = 1) and a² + b² = 1 both hold, given a ≠ b. Note: this whole argument only makes sense if we assume, or accept, that there really is a number of particles output according to some rule, and not just two "branches" of arbitrary relative amplitudes. The whole idea of probability falls apart in the latter case, despite awkward attempts by MWI supporters to contrive an equivalence.
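
A quick numeric check of that incompatibility (the numbers are my own, purely for illustration): take a = 0.6 and b = 0.8, so a² + b² = 0.36 + 0.64 = 1. Then a + b = 1.4 and a³ + b³ = 0.216 + 0.512 = 0.728; neither the linear law nor the cube law can conserve the total alongside the squared law.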

I dislike Wolpert's tongue-in-cheek manner. It reminds me of someone who tried to catch attention with the words "I am God. I don't exist".

Nonetheless, I see the addressed topic as related to the usual but questionable use of the notion of number, as well as to Einstein's denial of the distinction between past and future, which is still mandatory in physics.

Eckard

    It is easy to demonstrate that the Heisenberg Uncertainty Principle is merely the limiting case of the Shannon Capacity Theorem for information content: it corresponds to a measurement that contains only a single bit of information.

    The Shannon Capacity can be stated as:

    The total amount of information contained in a series of measurements, of finite duration, can never exceed the number of bits required to digitize (and perfectly reconstruct, down to the noise level) the band-limited input signal being measured:

    (number of bits of information) is less than or equal to (number of samples)(number of bits per sample)

    The limiting case is when there is only one independent sample, with only a single significant bit possible (limited by the signal-to-noise ratio): hence, one bit of information.
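
    Spelling out the arithmetic of that limiting case (assuming the standard Shannon-Hartley form; this is a heuristic sketch, not a rigorous derivation):

    (number of bits) is less than or equal to (2BT) x (1/2) log2(1 + S/N)

    where B is the bandwidth, T is the duration, S/N is the signal-to-noise ratio, and 2BT is the number of independent Nyquist samples. One independent sample means 2BT = 1, i.e. a time-bandwidth product of T x B = 1/2, the best achievable joint time-frequency resolution. Multiplying by Planck's relation E = hf then turns that into (delta E) x (delta t) on the order of h/2, which matches Heisenberg's relation up to the usual numerical factor (the textbook bound, with standard-deviation definitions, is h-bar/2).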

    Rob McEachern

    The problem with my earlier posts, which were cut off, seems to be that the system interpreted the mathematical symbol for "less than", followed by the symbol for "equals", as a command to end the post. So I replaced the symbols with words.

    Rob McEachern

    Robert, I've had similar problems and I sympathize. Remember that this system is based more on LaTeX than HTML, and use the preview function. You might be able to find a character that looks like greater-than or less-than but won't be interpreted as a command. BTW, have you seen anything like my BR proof before? Cheers, and happy Fourth as it may apply.

    Neil,

    I have a rather different take on the Born Rule:

    Wave-functions are described mathematically in terms of Fourier Transforms. The transforms are invariably misinterpreted as physical (rather than purely mathematical) superpositions. However, Fourier Transforms have another, very different, physical interpretation, commonly used in communications theory: they are tuned filter-banks, in which the "power spectrum" measures the power received within each narrow "bin" of the transform. If the input signal consists of a set of identical particles or wavelets, each carrying an identical amount (quantum) of energy, then the power spectrum is in fact nothing more than a histogram, in which the total energy received in each bin, divided by the energy per particle, equals the number of particles received per bin. That is why the whole process corresponds to a probability measurement - it literally is a histogram.
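
    That filter-bank reading is easy to check numerically. A minimal Python sketch (my own construction; the bin count, particle number, and binomial frequency distribution are illustrative assumptions, not anything from the post above): accumulate the unitary-FFT power spectrum of many identical unit-energy wavelets, divide by the energy per wavelet, and compare against a plain histogram of the wavelet frequencies:

        import numpy as np

        rng = np.random.default_rng(0)
        N_FFT = 64          # FFT length = number of "tuned filter" bins
        N_PARTICLES = 2000  # identical wavelets, one unit of energy each

        t = np.arange(N_FFT)
        energy_per_bin = np.zeros(N_FFT)

        # Each wavelet sits on one FFT bin frequency, drawn from an
        # arbitrary distribution (binomial here, purely for illustration).
        bins_drawn = rng.binomial(N_FFT - 1, 0.5, size=N_PARTICLES)

        for k in bins_drawn:
            wavelet = np.exp(2j * np.pi * k * t / N_FFT) / np.sqrt(N_FFT)  # unit energy
            spectrum = np.fft.fft(wavelet) / np.sqrt(N_FFT)  # unitary FFT (Parseval)
            energy_per_bin += np.abs(spectrum) ** 2   # detector accumulates energy

        counts_from_spectrum = energy_per_bin / 1.0   # divide by energy per particle
        counts_direct = np.histogram(bins_drawn, bins=np.arange(N_FFT + 1))[0]
        print(np.allclose(counts_from_spectrum, counts_direct))  # True

    The accumulated power spectrum reproduces the frequency histogram exactly, which is the sense in which the "probability" read off a power spectrum literally is a count of particles per bin.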

    • [deleted]

    Reality combines information about itself, making it perceptually unified and objective. In order for Heisenberg's Uncertainty Principle to work, there must be information that is excluded from reality at one moment in time and then "collapses" the next, transforming the unreal into the real. However, this is speaking colloquially, since reality is more than the sum of its perceptions and the information gained or yielded by them. Each inference device (a human, a god, or an intelligent observing machine), moving forward each moment in time, must actually be conscious of the information contained in the others' minds in order to predict what those conscious entities will do. So information plays a fundamental role in determining reality, or the state of the universe. Wheeler understood this and posited an "observer-participant universe".

      Claude Shannon understood it long before Wheeler. The Heisenberg uncertainty principle can be simply derived from Shannon's Capacity theorem, by considering the minimum possible amount of information (one bit) that can ever be extracted from a measurement; it corresponds to a single, independent measurement "sample", with a single significant bit. It has nothing to do with any "collapse", nor is consciousness required. But you are correct in that information acquisition is dependent upon a priori knowledge.

      • [deleted]

      A priori knowledge is knowledge gained by reasoning, and reasoning, as Kant noted, yields theoretical knowledge, which, if successful, is reality.

        A virus entering your body has a priori knowledge concerning how to infiltrate your cells. It did not obtain that knowledge by reasoning, or by any other process known to Kant.

        Rob McEachern

        If I may say, down the road it is easy to mistake chance for design, and to credit a virus with a priori knowledge when it was only lucky to be the one out of a billion that infiltrated a cell.

        Starting out, billions are released from a sneeze; some are blown away by the wind, some are inhaled, and millions may find their way to the doorstep of a cell, but only a small number enter. Even of this small number, many wander about aimlessly or get arrested by the cell police, and then a very tiny few get to multiply and cause illness. It is these few that get credited, down the road, with a priori knowledge, whereas they have no idea whatsoever what they are doing.

        In summary, what looks at the end of the chain like something designed to be so may merely be the outcome of chance.

        Regards,

        Akinbo

        You are confusing causes and effects. A virus contains genetic information - a priori knowledge about how to behave, given an opportunity to behave. How that came to be, whether by chance, by design, or by some other mechanism, is irrelevant. The knowledge exists within the virus before (a priori) its arrival near a cell. Consequently, it does not have to rely upon trial-and-error (chance) in order to infiltrate a cell.

        Rob McEachern

        • [deleted]

        One must first ask the right questions: can a three-dimensional particle, transmuted to a two-dimensional space, still exist as an observable? Just as an observable god A, within a Universe, cannot be sure of an unobservable god B external to the observable Universe A?

        Two-dimensional quantum mechanics exists (unobservable) within three-dimensional relative space-times. XYZ becomes XY, YZ, XZ, etc.; QM loses a factor of observation, or suffers information loss?

        You cannot expect to make a measuring device if the device itself loses an observational contributing factor.

        Can you really infer that 3-D bits can exist within 2-D waves?

        • [deleted]

        Information equates to mind, which equates to reality.

        • [deleted]

        EINSTEIN ON LSD

        There is a possibility Einstein was using LSD, because he "invented" many concepts which are still today a heavy burden on physics. Let's see the most influential ones:

        - coordinate time

        - proper time

        - time dilatation

        - space-time (where time is a 4th dimension of space)

        - constancy of light speed

        - length contraction

        - internal observer

        - external observer

        - empty space

        - graviton

        Attachment #1: Einstein_on_SLD.pdf

          I'm thinking that the timeline might contradict your hypothesis. SR dates to 1905. GR dates to 1915. According to Wikipedia, LSD was synthesized in 1938. Its psychoactive properties were discovered in 1943. Of course, a related compound occurs naturally in a rye fungus.

          Regards,

          Gary Simpson

          Gary,

          You are correct. More than that, the much-publicized hype about the psychedelic era is largely a fabrication of sensationalism, wholly ignorant of the realities of drug usage and its effects. The 'hippies' were a diverse mixture of competing conceptual movements which developed in an era of blossoming progressive intellectualism that was truthfully a response to the global awareness that nuclear weapons could extinguish all life. Many of those that 'looked the part' were actually spiritually inclined and averse to any intoxicants, and most others indulged only to a degree of what might be called a communion of safe passage, lest they be attacked by the more paranoid elements associated with trafficking. Experimentation by young adults, especially on college campuses, made drug use and progressive movements coincidental in time and location.

          It was really the bureaucratic reaction against such 'give peace a chance' movements that labeled progressives as 'drug crazed', and there is plenty of archival evidence of that sensationalist propagandizing, which actually had the consequence of promoting curiosity and defiance and exacerbating drug usage. COINTELPRO destroyed countless innocent lives, and much of the reactionary response to drug use was counter-productive to interdiction and often corrupt. Witch-hunting.

          As for the psychotropic effect of LSD, it is just the opposite of what Amrit suggests. The drug causes a constant reiteration of a relatively focused perception, be it a mood, idea or sensation. The much-hyped 'expansion of consciousness' reported by those who experienced 'good trips' was more a post-experience attempt to rationalize the indulgence. It does not induce an evolutionary thought process. THC, on the other hand, which was and is often a drug of choice for artists, musicians and other creative types, does lend itself to an exploratory reverie, due to its mild euphoric and relaxant qualities, with little in the way of side effects unless indulged compulsively by those who would do the same with alcohol or gambling. I rather expect Einstein would have been exposed to absinthe, not pot, in Vienna. I think where you would find archival evidence of a real impact of drug use on scientific thinking is in the famous cocaine addiction of Sigmund Freud, who did eventually come to recognize it himself, and that the drug did have the effect of producing a channelized, compulsive psychoactive drive. The modern cocaine epidemic has also impacted financial markets, in that the very drive that Freud finally admitted is often sought by many of compulsive ambition. And it might be added that the financial markets have also become addicted to Quantum Mechanics.

          Keep yer glasses an yer cigaretts below the winder soes nobody hassels ya. 'Toronto John McCaul'