• [deleted]

"Once you start thinking of an agent as a quantum system in his or her own right," says Renner, "things get complicated." The initial conditions get complicated, but the equations of motion might, and presumably will, still be simple, as they are in classical physics. Planck's constant, that Lorentz-invariant scale of action, is still the one that rules them all when it comes to measurement incompatibility and quantum fluctuations, come whatever else may.

I look forward to them finding ways to finesse that, making Planck's constant locally look as if it's smaller, even while in principle it's still a universal constant.

"One of the foundational insights of quantum theory is that, just by observing a system, you change it."

That is not an insight. It is a delusion. Like the famous cat, a coin is neither heads nor tails until some observer decides to "call it", but that act of observation does not change the state of the coin - it does not collapse into a one-sided coin. The observation merely changes the observer's mental state that models the coin, not the coin itself.
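The coin analogy can be put in code (a toy sketch of the commenter's claim, with all names my own): the coin's state is fixed at the flip; observation only replaces the observer's probabilistic model with knowledge.

```python
import random

def flip_coin(rng):
    # The coin lands in a definite state at the moment of the flip.
    return rng.choice(["heads", "tails"])

def observe(coin_state, observer_model):
    # Observation replaces the observer's uncertain model with knowledge.
    # The coin's state is returned unchanged: nothing "collapsed" but the model.
    new_model = {"heads": 1.0 if coin_state == "heads" else 0.0,
                 "tails": 1.0 if coin_state == "tails" else 0.0}
    return coin_state, new_model

rng = random.Random(42)
coin = flip_coin(rng)                 # definite, but unknown to the observer
model = {"heads": 0.5, "tails": 0.5}  # observer's prior belief
coin_after, model_after = observe(coin, model)

assert coin_after == coin             # the coin itself is unchanged
assert model_after != model           # only the observer's model changed
print(coin_after, model_after)
```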

Rob McEachern

    Linguistically, live and dead cat are both the cat object. The live cat, though, has functioning aerobic respiration and many processes occurring in the body that rely upon that biochemistry. The dead cat is not respiring. Many processes are not functioning because of that, and other biochemistry, such as autolysis, the breakdown of cells, is happening. The live and dead cat are not the same object if the biochemistry is considered. Linguistically, broken and intact poison flask are both the flask object. Their topology is very different, though. Shards of glass are different objects from the intact flask if topology is considered. Linguistically, decayed and non-decayed radioactive particle are both the particle. However, if an alpha or beta particle is lost, they are not the same object anymore. Different objects can't be in a state of superposition, only different states that might be observed pertaining to the same object. So the thought experiment is not a good analogy.

    • [deleted]

    Infinitely trolling with a shallow running crank bait is what's visible.

    If the structure and chemistry of the particle before and after decay are considered, they are different objects. Yet linguistically both are referred to as the particle, so they are seemingly the same object. This may seem a bit pedantic, but I think the use of language is failing to clearly categorize the objects as different things, rather than the same thing in different observable states, before and after radioactive decay has happened, releasing the poison.

    2 months later
    • [deleted]

    "Let's say we would like to decide whether there's really a superposition of the dead cat and living cat," says Renner, returning to the Schrödinger's cat paradox. "If we want to do that we have to control the system extremely well," in particular the wavefunction, says Renner. That level of control would require an exquisitely precise clock--one that might be impossible to build.

    If Renner and del Rio can show that such a precise quantum clock is a physical impossibility, that would mean that there is no way to discriminate between a superposition and a mixture in a "macroscopic" object like a cat, and the difference between the two states would lose its meaning. "Then the distinction between superpositions and mixtures is just a mathematical curiosity without a 'physically existing' counterpart," says Renner. "The paradox would dissolve."

    A few questions:

    1) Are they arguing that we cannot know time accurately due to the time-energy uncertainty principle? If so, can they use an observable that commutes with time?

    2) What is the theoretical limit of an optical frequency comb?

    3) Bell's inequalities tell us there are no local hidden variables, but they do not rule out nonlocal hidden variables. If the collapse of a wavefunction is a nonlocal process, like measuring the spin between two entangled particles, then how does that impact our understanding of causality and time? Is there a second, higher speed limit for a collapsing wavefunction, or must it be instantaneous to prevent violating the conservation of angular momentum?

    4) Why is everyone obsessed with quantum gravity?

      Brian,

      A few answers:

      1) Are they arguing that we cannot know time accurately due to the time-energy uncertainty principle? If so, can they use an observable that commutes with time?

      They seem to be arguing that the uncertainty principle is the ultimate limit, but perhaps other physical circumstances limit what can be done, even before the uncertainty principle limit is reached.

      2) What is the theoretical limit of an optical frequency comb?

      The limit of all observables is given by Shannon's Capacity Theorem, which, in the case of the Heisenberg uncertainty principle, reduces to the statement that every set of measurements must contain one or more bits of information; if you have failed to extract even a single bit of information from within all the data bits comprising your set of measurements, then you have failed to make anything worthy of being called a measurement.
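The capacity limit being invoked is the Shannon-Hartley formula, C = B·log2(1 + S/N). A minimal sketch (the bandwidth, duration, and SNR values are illustrative, not taken from the post):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley: maximum error-free information rate of a noisy channel.
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A measurement of duration T over bandwidth B at signal-to-noise ratio S/N
# can convey at most C*T bits. The point above: if C*T < 1 bit, the
# "measurement" cannot convey even a single bit, so it conveys nothing at all.
B = 1.0      # Hz (illustrative)
T = 1.0      # seconds (illustrative)
snr = 1.0    # unit signal-to-noise ratio (illustrative)
bits = shannon_capacity(B, snr) * T
print(bits)  # 1.0: exactly at the single-bit threshold
```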

      3) Bell's inequalities tell us there are no local hidden variables, but they do not rule out nonlocal hidden variables. If the collapse of a wavefunction is a nonlocal process, like measuring the spin between two entangled particles, then how does that impact our understanding of causality and time? Is there a second, higher speed limit for a collapsing wavefunction, or must it be instantaneous to prevent violating the conservation of angular momentum?

      Bell's inequality is derived from the false assumption that something ELSE always remains to be measured, after the first measurement of an entangled pair has been performed. But that is obviously false, when the entity being measured manifests only a single bit of information - the Heisenberg limit. In this peculiar case, not only are there no hidden variables, there are no variables (plural) at all - there is only one bit.
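Brute-force enumeration can illustrate the classical bound behind Bell's inequality (a standard textbook check, offered as context rather than as McEachern's own argument): every deterministic local hidden variable strategy yields a CHSH value of at most 2.

```python
from itertools import product

# Each local strategy assigns an outcome of +1 or -1 to each of the two
# measurement settings on each side: (a0, a1) for Alice, (b0, b1) for Bob.
best = 0
for a0, a1, b0, b1 in product([+1, -1], repeat=4):
    # CHSH combination of the four correlators E(x, y) = a_x * b_y
    S = a0*b0 + a0*b1 + a1*b0 - a1*b1
    best = max(best, abs(S))

print(best)  # 2: the classical (local hidden variable) bound,
             # whereas quantum mechanics reaches 2*sqrt(2)
```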

      4) Why is everyone obsessed with quantum gravity?

      Because, like oil and water, gravity and quantum theories do not mix, but everyone thinks that they should be modified so that they do.

      • [deleted]

      R. McEachern,

      Very good and succinct answers, 1 through 4.

      8 days later

      No one knows how this particular play will turn out, including del Rio and Renner.

      a month later
      • [deleted]

      Our understanding of wave function collapse uses the language of "standard analysis," which is not a good language for talking about existence-- which is the problem here. A "limit" necessarily involves statements about numbers on a real number line-- so before the "limit" can exist, there must first exist this number line, stretching in front of and in back of me, who am, on the number line, performing this algorithm for "limit". The pre-existing number line goes into the vanishing points on both horizons.

      This kind of language doesn't help me to see "what exists" in that imagined point on a number line of time, along which travels the wave function.

      So when the wave function "collapses," I am stuck with the language of standard analysis and therefore "limits" on pre-existing "number lines," to say, first, "what existed," and so "what collapsed."

      Rather, the language of nonstandard analysis gives me something that exists at the imagined "limit"-- "the monad." On which, one can build a mathematical game, called "the Born Infomorphism." Possibilities exist in the "nonstandard future" part of the monad. The scoreboard exists in the nonstandard past. It is just David Bohm's model of the computer-guided ship, guided by radio waves to its slip upriver.

      In nonstandard proper time-- "properTime = (now, properTime)"-- one player is the radio antenna, who places possibilities in the nonstandard future. In the standard present instant at the core of the monad, the other player in the game-- the quantum particle-- chooses where to move. After the move, the quantum particle finds out whether or not the radio tower in the game wanted the quantum particle to move to that configuration. That information is placed in the scoreboard, which exists in the nonstandard past. Since after each move the radio tower lets the quantum particle know where it would have liked the quantum particle to have moved, whether it did so or not, the probability with which the radio tower selects a possibility will match the probability with which the particle chooses that possibility. It's an old laboratory finding called "probability learning," which is explained by regret....
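The "probability learning" finding can be sketched as a toy simulation (my own construction under the stated assumptions, not the poster's model): a learner who chooses each option with the frequency at which the "radio tower" has been observed to prefer it ends up matching the tower's probability rather than maximizing.

```python
import random

rng = random.Random(0)
p_tower = 0.7      # probability the tower prefers option "A" (illustrative)
preferred_a = 0    # rounds in which the tower revealed a preference for A
choices_a = 0      # rounds in which the learner chose A
n_rounds = 100_000

for t in range(1, n_rounds + 1):
    # Learner chooses A with probability equal to the observed frequency
    # with which the tower has preferred A so far (probability matching).
    p_est = preferred_a / (t - 1) if t > 1 else 0.5
    if rng.random() < p_est:
        choices_a += 1
    # After the move, the tower reveals which option it preferred this round,
    # whether or not the learner chose it (the "regret" feedback).
    if rng.random() < p_tower:
        preferred_a += 1

print(choices_a / n_rounds)  # ~0.7: choice frequency matches p_tower
```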

      In each monad, a play of this game of existence is a "collapse" of the wave function. The particle jumps from trajectory to trajectory in those computer generated graphics in Bohm and Hiley's book. When it happens to hit the detection screen, who knows what trajectory it would have been following? The distribution follows the Schrodinger equation.

      In language involving nonstandard analysis, non-wellfounded sets, mathematical games and "infomorphisms", the phrase "wave function collapse" means something else.

        "The particle jumps from trajectory to trajectory in those computer generated graphics in Bohm and Hiley's book." But there remains no good reason to suppose that any such jumping happens in reality.

        "When it happens to hit the detection screen, who knows what trajectory it would have been following?" No one, precisely because no one ever even attempted to follow it.

        "The distribution follows the Schrodinger equation." Precisely, because, rather than attempting to follow anything along any trajectory, the equation, together with the Born Rule, merely describes the detection statistics that can be observed, by a set of stationary detectors, sitting wherever the equation happens to specify. An animal-trap does not reveal the path the animal took to arrive at the trap.

        The problem arose when Schrodinger switched from using a single equation/wavefunction to describe a single particle's trajectory, to using the same, single equation/wavefunction to describe something (he knew not what) about ALL particles simultaneously. This does not work correctly - precisely because the latter enables the "jumping" in the solution (entirely due to noise), which does not correspond to any phenomenon in the real world; the "cost-function" being minimized (least-squared error) behaves differently (produces a very different solution to the equation - one enabling "jumping") in the case of ANY noise in ANY measurement, or ANY error in ANY potential used in the equation. It will drive ALL noise and ALL errors to zero, via the "jumping".

        Rob McEachern

        How to interpret a "wave function collapse"? Robert McEachern has, to me, the most convincing answer: the many delusive worlds of wave function models turn out to be conceptually different from just one reasonably assumed, obvious reality. We don't need non-standard analysis to understand this and related weirdness.

        Having looked at Robert's PowerPoint presentation, I merely criticize his naive use of the Fourier transformation, with integration from minus infinity to plus infinity over time. Shannon understood that the definitely real, unchangeable past is essentially different from the many predictable and influenceable possible futures, which are permanently collapsing with growing time. Really already elapsed (past) time is not delusive.

        Every human so far was born from exactly one woman and one man, no matter whether or not his family tree is known. Theoretically he has a huge number of ancestors after many generations. However, the hypothetical family tree of his great-great-grandchildren will collapse, as do wave functions.
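The ancestor arithmetic behind this pedigree collapse is simple doubling (the generation count below is illustrative, not from the comment):

```python
# Without pedigree collapse, generation n back holds 2**n distinct ancestor
# "slots" in the hypothetical family tree.
generation = 30          # roughly 750-900 years at 25-30 years per generation
hypothetical = 2 ** generation
print(hypothetical)      # 1073741824: over a billion slots, far more people
                         # than were alive then, so the tree must collapse
                         # onto shared ancestors
```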

          Eckard,

          "I just criticize his naive use of Fourier transformation with integration from minus infinity to plus infinity over time."

          Perhaps it is not quite as naive as it appears. As you know, physics seeks to discover and model predictable phenomena. If perfect predictions are possible, they enable perfect predictions of the past and future from a finite-duration set of observations. So instead of just integrating over the finite duration of actual measurements, one could integrate over the infinite-duration predictions made by the model. This is why the process works for predictable phenomena, and why there is an "unreasonable effectiveness of mathematics" when applied to perfectly predictable phenomena. It is also why it is rather less effective when applied to unpredictable phenomena. The latter is what Shannon's theory is all about.
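The point about integrating over model predictions rather than measurements can be illustrated with a toy example (my construction; the signal and parameters are illustrative): for a noise-free, perfectly predictable signal, a model fitted to a short measured stretch predicts values far outside that stretch exactly.

```python
import math

def signal(t):
    # The "perfectly predictable phenomenon": a pure tone of known frequency.
    return 3.0 * math.cos(2.0 * math.pi * 0.1 * t)

# "Measure" only a finite duration: t = 0..9.
ts = [float(t) for t in range(10)]
ys = [signal(t) for t in ts]

# Fit the one free parameter (amplitude) by least squares on the finite data.
cs = [math.cos(2.0 * math.pi * 0.1 * t) for t in ts]
amplitude = sum(y * c for y, c in zip(ys, cs)) / sum(c * c for c in cs)

# The fitted model now predicts far outside the measured interval, which is
# what licenses integrating the model over infinite duration.
prediction = amplitude * math.cos(2.0 * math.pi * 0.1 * 1000.0)
print(abs(prediction - signal(1000.0)))  # 0.0 up to float rounding
```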

          Rob McEachern

          Rob,

          PREdiction of the past would imply a reversed direction of time. Claude Shannon was not naive. He didn't accept Laplace's determinism in place of common sense. He meant that the past is known in principle but cannot be changed, while the future can be influenced but is not known for sure.

          I consider you one of the few who understand that reality is quite different even from the best theory. If one integrates as if there were no fundamental difference between past and future, then one ignores that the restriction to a limited number of Laplacean initial conditions implies the loss of a perhaps infinite number of unconsidered influences.

          When analysing measured data, future data are not yet available.

          Perfect PREdictions may only seem possible to the extent one feels safe to exclude unseen erratic influences. In other words, perfectly predictable phenomena belong to models, not to reality.

          Maybe you mistook my criticism. I didn't blame you personally, but Laplace, Fourier, and current mainstream physics. Integration either from minus infinity to t=0, in the case of analysis, or from t=0 to plus infinity, would not ignore the conceptual difference between the past and future of reality. Both of these half-sided integrals are likewise infinite.

          In contrast to Wigner, I don't see an unreasonable effectiveness of mathematics. Decomposition into Fourier components is equivalent to decomposition into cosine components, even if this looks stunning.

          Of course, we certainly agree:

          Actual measurements imply additional deviations from reality: They are restricted to finite duration and to a finite number of sampled data.

          And Shannon's theory contradicts Laplace's determinism of the future.

          Eckard

          Eckard,

          As you may recall from my 2012 FQXi essay, I make a big distinction between computational models and physical models of reality. I think things like Fourier transforms are very useful as computational models/tools. But, like you, I believe they are not good physical models. The actual, physical processes occurring in the world, are not based on any infinite, orthogonal functions. Attempting to interpret them as if they are, is a long-standing problem.

          Rob McEachern

          Rob,

          On 28.12.2012, my last sentence was: "Again: Do not confuse mathematics with reality. A vector may be useful to mathematically describe reality. It is not reality." I am still claiming that FT, DFT, CT, DCT and even filter banks are just models of reality, not the reality itself.

          Could you please specify on which page of your 2012 essay you "made the big distinction between computational and physical models of reality"?

          FT and DFT are certainly, to some extent, very useful models. I merely criticize that they ignore what seriously worried the late Einstein: the now. Outside tradition, there are uncommon models that don't ignore the now, i.e. the distinction between past and future, in particular CT on R.

          Eckard

          Eckard,

          The distinction between computational and physical models of reality was the subject matter of the entire essay:

          On the first page, the sub-title of the essay is "Confusing Mathematics for Physics"

          On the second page: "Physicists fail to distinguish between the properties of the "reality" they are attempting to describe, and the properties of their mathematical "descriptions of that reality"."

          More of the same can be found in this slideshow

          Rob McEachern

          Rob,

          Because you distinguished "between the properties of a mathematically constructed map describing a territory, and those of the territory itself", I guess you didn't clearly separate the "physical models of reality" from "reality itself", which I consider a ubiquitous conjecture.

          I see a filter bank as a physical model of the conjectured-as-real function of the cochlea, in contrast to the (ironically thought of as underlying) FT description. Physical in this case means it consists of lumped elements, like R and C, the combination of which is thought to behave according to the FT. I see, however, a serious difference:

          Such a concrete physical model doesn't ignore the difference between past and future, which is missing from the mathematical FT model due to abstraction. A mathematically "implemented" filter bank is in this respect also unrealistic.

          You are dealing with realistic band-limited signals. Filter banks are always band-limited. I understand you.

          My criticism of not completely appropriate mathematics arose from awareness of a few misconceptions, so far denied by mainstream theory, that I had to teach for decades.

          Eckard,

          "A mathematically "implemented" filter bank is in this respect also unrealistic." A cochlea filter constructed from lumped elements is also unrealistic. The physical ones are built from things like neurons. The math is just being used to describe some approximate aspects of their behavior.

          As far as the "difference between past and future" is concerned, with regards to filtering, bear in mind that a simple delay-line can be used to effectively shift the dividing line between the past and future, relative to the time at which the filter is functioning. The use of "window" functions also effectively changes an infinite transform into a finite one; the infinitely long tails of the windowed transform contribute nothing to the final result. This is an example of a computational model that may exactly reproduce a physically observed behavior, but represents an "unphysical" model of how nature goes about producing those same behaviors.

          From an engineering standpoint, the "unphysical" computational model may be highly superior to the actual physical model, since algorithms like Fast Fourier Transforms can be used to reduce the computational burden by many orders of magnitude. But from the physics perspective, it is important to remain cognizant of the fact that nature does not appear to ever employ such algorithms. I think this is related to why so few people understand the actual physical mechanisms that underlie either color or acoustic-pitch perception.
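The window-function remark can be illustrated with a minimal sketch (my construction, using a hypothetical triangular window): outside the window the weights are exactly zero, so the formally infinite sum collapses to a finite one.

```python
def window(t, half_width):
    # Triangular window: nonzero only for |t| < half_width, zero elsewhere,
    # so samples outside the window contribute nothing to the sum.
    return max(0.0, 1.0 - abs(t) / half_width)

def windowed_sum(samples, half_width):
    # Formally this runs over every sample in the record, however long ...
    return sum(x * window(t, half_width) for t, x in samples)

# A long record of samples (t, x) vs. only the finite stretch in the window.
record = [(t, 1.0) for t in range(-1000, 1001)]
inside = [(t, x) for t, x in record if abs(t) < 10]

full = windowed_sum(record, 10)
finite = windowed_sum(inside, 10)
print(full == finite)  # True: the tails contribute exactly nothing
```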

          Rob McEachern

          Rob,

          Output before any input can be seen more or less obviously in FFT-based spectrograms. It is unrealistic IN RESPECT OF causality, and is missing from models consisting of lumped, physically real elements.

          Can "a simple delay-line be used to effectively shift the dividing line between the past and future"? No. Such a delay doesn't change the border between past and future in reality, which is given in the original signal. It merely delays its arrival, more or less. By the way, in reality there are no negative delays, even if the difference between two delays of the same original may be considered negative.

          "The physical ones are built from things like neurons." In my understanding, OHCs, IHCs, and the BM are not neurons. I know a lot of specialists worldwide who understand the physiology of the cochlea pretty well. Of course, cochlear function doesn't yet directly explain pitch perception.

          Have you already tried dealing with MPEG and the Fast Cosine Transformation / DCT instead of the FFT / DFT?

          A symmetrical time window that includes not just the existing traces of the past but also future data is not convincing to me. Happily, I was not forced to touch such incomprehensibilities with my students.