Essay Abstract

The principle of linear superposition is a hallmark of quantum theory. It has been confirmed experimentally for photons, electrons, neutrons, atoms, and molecules having masses up to ten thousand amu, and also in collective states such as SQUIDs and Bose-Einstein condensates. However, the principle does not seem to hold for positions of large objects! Why, for instance, is a table never found to be in two places at the same time? One possible explanation for the absence of macroscopic superpositions is that quantum theory is an approximation to a stochastic nonlinear theory. This hypothesis may have its fundamental origins in gravitational physics, and is being put to the test by modern ongoing experiments on matter-wave interferometry.

Author Bio

Angelo Bassi works on foundations of quantum mechanics and has a Ph.D. in Physics from the University of Trieste. After completing post-docs at ICTP and Ludwig Maximilian University of Munich, he joined the University of Trieste as faculty. Tejinder Singh is Professor at the Tata Institute of Fundamental Research in Mumbai. His research interests are in quantum gravity and the quantum measurement problem. Hendrik Ulbricht received his Ph.D. in the surface science experimental group of Gerhard Ertl, and after completing post-docs at Vanderbilt and Vienna he is now Reader at Southampton University, where he leads an experimental effort on matter-wave interferometry and quantum nanophysics.


Dear Drs. Bassi, Singh, and Ulbricht,

I enjoyed your essay and the fact that you use real numbers for particle masses and slit dimensions. I was somewhat disappointed to see the emphasis on GRW 'theory' which, as a "phenomenological modification of quantum theory", is as ugly as sin and also violates energy conservation, in addition to requiring two new unexplained universal constants that seem to show up nowhere else (unlike G, c and h). But I was glad that you noted "beyond doubt" that there should be deeper principles underlying this radical approach (though none are known).

You suggest, with Penrose and others, that gravity is responsible for the absence of macroscopic superpositions. Although you note that the GRW/CSL approach is non-relativistic and efforts to relativize it have failed, still you see collapse as "instantaneous and non-local", per Bell.

You are left with a century-old prediction that fails at macroscopic dimensions and a phenomenological model that can only be characterized as 'ugly'.

Although you've invested quite a bit in this model, you do suggest that it is possible that linear superposition is a wrong assumption and that something radical is needed. I invite you to read my essay, The Nature of the Wave Function, for a gravity-based approach that radically departs from the century-old assumption of superposition and collapse. I hope you find it interesting, and invite your comments.

Best of luck in the contest,

Edwin Eugene Klingman

    Dear Dr. Klingman,

    We refer to collapse models as phenomenological modifications of quantum theory precisely because they lack the degree of universality and beauty typical of theories such as classical mechanics or electromagnetism. Therefore the comment "as ugly as sin", apart from not being a scientific one, does not come as a surprise. Moreover, phenomenological models contain phenomenological constants, which are expected to be justified by the underlying theory. It has always been like this in the history of science, so there is not much to be surprised about in the introduction of two new constants.

    However, in spite of being phenomenological models, collapse models have some important scientific merits.

    1. They provide a consistent resolution to the measurement problem of quantum mechanics. One may not like the way the problem is solved, but it is a consistent solution. And this is important.

    2. They achieve what many people previously thought was impossible: reconciling the linear evolution given by the Schroedinger equation and the collapse of the wave function within a single dynamical principle. The new dynamics is compatible with all known experimental facts, and moreover explains why we see the world the way we see it. Again, one may not like the explanation, but it is a consistent explanation.

    3. They are the only mathematically well-formulated models that make predictions different from those of standard quantum mechanics. Having models against which quantum mechanics can be tested experimentally is of paramount importance for devising novel experiments and, ultimately, for the development of physics.

    4. They suggest a direction in which to look for the underlying theory of nature. The direction might eventually prove wrong, but it is valuable to have a clear direction to follow.

    Regarding the energy non-conservation, take as an example a particle in a gas. Its energy is not conserved, because the particle relaxes to thermal equilibrium. No one, however, is shocked by this fact. The non-conserved energy goes to the bath; the overall energy is conserved. In collapse models the same thing happens: the non-conserved energy goes to the noise responsible for the collapse. When we have the underlying theory (the analog of classical mechanics for explaining the behavior of a particle in a gas), we will also restore energy conservation.

    Regarding non-locality, the issue is simple. The violation of Bell inequalities is an experimental fact. Within the framework of a theory (or model...) containing just the wave function and its dynamics and nothing else, the only way to comply with the violation of Bell inequalities is to have a superluminal collapse of the wave function.
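    For concreteness, here is a minimal sketch of that experimental fact, assuming the standard textbook quantum prediction for the singlet state, E(a, b) = -cos(a - b), and the usual optimal measurement angles; the CHSH combination reaches 2*sqrt(2), above the local-realist bound of 2.

# Minimal sketch: CHSH value for the singlet state at the standard optimal
# angles, assuming the textbook correlation E(a, b) = -cos(a - b).
# The result, 2*sqrt(2) (Tsirelson's bound), exceeds the local-realist bound of 2.

import math

def E(a, b):
    """Singlet-state spin correlation for measurement directions at angles a, b (radians)."""
    return -math.cos(a - b)

a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime))
print(f"CHSH value S = {S:.4f}  (local-realist bound: 2, Tsirelson bound: {2 * math.sqrt(2):.4f})")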

    We look forward to reading your essay.

    Regards,

    Authors

    • [deleted]

    Tejinder,

    "In order to describe dynamical evolution the theory depends on an external classical time, which is part of a classical spacetime geometry."

    The point I make in my entry is that the classical timeline is an effect of the underlying dynamic, such that it isn't the present moving from past to future, but the changing configuration of what is, that turns future into past. Not the earth traveling the fourth dimension from yesterday to tomorrow, but tomorrow becoming yesterday because the earth rotates. Duration is simply what happens within the present, between events, not a property external to the present. This makes time an emergent effect of action, similar to temperature. So without an external dimension of time, there can be no dimensionless point on this timeline. A frozen point in time would be equivalent to a temperature of absolute zero; the complete absence of motion. It would be like trying to take a picture with the shutter speed set at zero. The result would not be a frozen moment in time, but nothing. Which means an entity, micro or macro, cannot be isolated from its action. That macroscopic table only exists because of the dynamic activity sustaining it. So, effectively, the particle is actually just a really small wave.

    Consider this exchange from an interview with Carver Mead;

    "So how did Bohr and the others come to think of nature as ultimately random, discontinuous?

    They took the limitations of their cumbersome experiments as evidence for the nature of reality. Using the crude equipment of the early twentieth century, it's amazing that physicists could get any significant results at all. So I have enormous respect for the people who were able to discern anything profound from these experiments. If they had known about the coherent quantum systems that are commonplace today, they wouldn't have thought of using statistics as the foundation for physics.

    Statistics in this sense means what?

    That an electron is either here, or there, or some other place, and all you can know is the probability that it is in one place or the other. Bohr ended up saying that the only statements you can make at the fundamental level are statistical. You cannot grasp the reality itself, only probabilities related to it. They really, really wanted to have the last word, and the only word they had was statistical. So they made their limitations the last word, saying, "Okay, the only knowledge that there is down deep is statistical knowledge. That's all we can know." That's a very dangerous thing to say. It is always possible to gain a deeper understanding as time progresses. But they carried the day.

    What about Schrodinger? Back in the 1920s, didn't he say something like what you are saying now?

    That's right. He felt that he could develop a wave theory of the electron that could explain how all this worked. But Bohr was more into "principles": the uncertainty principle, the exclusion principle--this, that, and the other. He was very much into the postulational mode. But Schrodinger thought that a continuum theory of the electron could be successful. So he went to Copenhagen to work with Bohr. He felt that it was a matter of getting a "political" consensus; you know, this is a historic thing that is happening. But whenever Schrodinger tried to talk, Bohr would raise his voice and bring up all these counter-examples. Basically he shouted him down.

    It sounds like vanity.

    Of course. It was a period when physics was full of huge egos. It was still going on when I got into the field. But it doesn't make sense, and it isn't the way science works in the long run. It may forestall people from doing sensible work for a long time, which is what happened. They ended up derailing conceptual physics for the next 70 years."

    .........

    "So early on you knew that electrons were real.

    The electrons were real, the voltages were real, the phase of the sine-wave was real, the current was real. These were real things. They were just as real as the water going down through the pipes. You listen to the technology, and you know that these things are totally real, and totally intuitive.

    But they're also waves, right? Then what are they waving in?

    It's interesting, isn't it? That has hung people up ever since the time of Clerk Maxwell, and it's the missing piece of intuition that we need to develop in young people. The electron isn't the disturbance of something else. It is its own thing. The electron is the thing that's wiggling, and the wave is the electron. It is its own medium. You don't need something for it to be in, because if you did it would be buffeted about and all messed up. So the only pure way to have a wave is for it to be its own medium. The electron isn't something that has a fixed physical shape. Waves propagate outwards, and they can be large or small. That's what waves do.

    So how big is an electron?

    It expands to fit the container it's in. That may be a positive charge that's attracting it--a hydrogen atom--or the walls of a conductor. A piece of wire is a container for electrons. They simply fill out the piece of wire. That's what all waves do. If you try to gather them into a smaller space, the energy level goes up. That's what these Copenhagen guys call the Heisenberg uncertainty principle. But there's nothing uncertain about it. It's just a property of waves. Confine them, and you have more wavelengths in a given space, and that means a higher frequency and higher energy. But a quantum wave also tends to go to the state of lowest energy, so it will expand as long as you let it. You can make an electron that's ten feet across, there's no problem with that. It's its own medium, right? And it gets to be less and less dense as you let it expand. People regularly do experiments with neutrons that are a foot across.

    A ten-foot electron! Amazing

    It could be a mile. The electrons in my superconducting magnet are that long.

    A mile-long electron! That alters our picture of the world--most people's minds think about atoms as tiny solar systems.

    Right, that's what I was brought up on-this little grain of something. Now it's true that if you take a proton and you put it together with an electron, you get something that we call a hydrogen atom. But what that is, in fact, is a self-consistent solution of the two waves interacting with each other. They want to be close together because one's positive and the other is negative, and when they get closer that makes the energy lower. But if they get too close they wiggle too much and that makes the energy higher. So there's a place where they are just right, and that's what determines the size of the hydrogen atom. And that optimum is a self-consistent solution of the Schrodinger equation."
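    A minimal numerical sketch of the confinement point made in the quote above, assuming the textbook particle-in-a-box estimate E_1 = h^2/(8 m L^2): the ground-state energy of an electron rises as the container shrinks, and becomes tiny for a container many feet across.

# Sketch assuming the textbook particle-in-a-box formula E_1 = h^2 / (8 m L^2):
# ground-state energy of an electron confined to a 1-D box of width L.
# Smaller containers mean shorter wavelengths, hence higher energy.

h = 6.626e-34        # Planck constant, J s
m_e = 9.109e-31      # electron mass, kg
eV = 1.602e-19       # joules per electron volt

def ground_state_energy_eV(L):
    """Ground-state energy (eV) of an electron in a 1-D box of width L (metres)."""
    return h**2 / (8 * m_e * L**2) / eV

for L in (1e-10, 1e-9, 1e-2, 3.0):   # atom-sized box, nanometre, centimetre, roughly ten feet
    print(f"L = {L:8.1e} m  ->  E_1 = {ground_state_energy_eV(L):.3e} eV")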

      • [deleted]

      It is only a chat in a blog, not a complete theory: I read your article and started thinking about your interesting essay.

      I think that, because of the third law of thermodynamics, there exists a single quantum function for a macroscopic object: a superconductive apple (a superconductive Newton's apple) is a quantum object that produces tunneling, double-slit interference and other quantum effects.

      The problem with a real apple is the unattainability of absolute zero, because of the phonon oscillations (thermal absorption); is it possible to build a material transparent to phonons, like diamond, or glasses, or an irregular-lattice material (only apple-shaped)?

      Do there exist in the Universe macroscopic objects near absolute zero?

      I think that a black dwarf is a macroscopic object with a single quantum function that emits like a single neutron in a gravitational field (hypothetical observable quantum jumps of the black dwarf).

      Regards,

      Domenico

        Dear Dr. Singh,

        I certainly agree that 'ugly as sin' is not a scientific characterization, and that all phenomenological models (MOND, etc) share this feature to some degree. I further agree with your reasons for taking GRW seriously, so it really is just a case of "one may not like the explanation". I do like your explanation for 'restoring' energy conservation.

        I also agree that "The violation of Bell inequalities is an experimental fact". It is the assumptions underlying Bell's inequality that I believe to be wrong. Because this topic has been discussed in great detail on other threads, I will not clog your space here. Thank you for agreeing to read my essay, I look forward to your comments.

        Best,

        Edwin Eugene Klingman

        • [deleted]

        Tejinder:

        Way over my head, but interesting. Does nature recognize fancy mathematics? My essay is perhaps overly simplified, but it addresses the real problem of physics: wherein lies "consciousness"? Very murky, but emergentism (growth) and panpsychism (memory) are suggested as properties that align them with the probabilities of a 1-D, 2-D, and 3-D geometric world. See:

        To Seek Unknown Shores

        http://fqxi.org/community/forum/topic/1409

          • [deleted]

          Dear authors,

          there is little in your essay about decoherence theory as an explanation of why macroscopic superpositions are so difficult to achieve. What do you think about this theory and its recent experimental tests that appear to confirm the framework?

          Would be happy about some answers.

          Best regards,

          Michael Lee

            • [deleted]

            A clearly written essay which, unlike many others proposed here, concerns physics and not science fiction.

            One may like these models or not, but one has to acknowledge that they give a logical and fully consistent explanation of the failure of the superposition principle for macroscopic objects.

            B. V. Oman

              Hello Michael,

              You may kindly want to have a look at the article by Stephen Adler at

              http://arxiv.org/abs/quant-ph/0112095

              and Section I of our review article

              http://arXiv.org/abs/arXiv:1204.4325

              • [deleted]

              Dear colleagues,

              I found the essay quite stimulating. Congratulations.

              Regarding the issue at hand here, I view very favorably the idea that Quantum Gravity could be behind the modifications that look effectively like CSL.

              In that case I think the issue of energy conservation should be considered in a rather different light. In General Relativity there are no generic laws of energy conservation (energy becomes a well-defined concept only in rather special situations, such as space-times with time-like Killing fields). The only thing that is relevant in a general context is the conservation of the energy-momentum tensor (i.e. it should have zero divergence). But that, again, relies on physics taking place in a well-defined space-time metric. What will be the form of whatever is left of such notions in a situation where the metric is ill defined, or fuzzy, or fluctuating is, I believe, anybody's guess.
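              For concreteness, the standard statement behind this point, as a brief textbook sketch: covariant conservation of the energy-momentum tensor always holds, but a conserved energy current exists only when the space-time admits a suitable Killing field,

\nabla_\mu T^{\mu\nu} = 0 \quad \text{(generic)}, \qquad
J^\mu \equiv T^{\mu\nu}\,\xi_\nu \ \Rightarrow\ \nabla_\mu J^\mu = T^{\mu\nu}\,\nabla_{(\mu}\xi_{\nu)} ,

              which vanishes when \xi satisfies the Killing equation \nabla_{(\mu}\xi_{\nu)} = 0. In a fuzzy or fluctuating metric even this structure is in question.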

              5 days later
              • [deleted]

              Interesting essay. Nice job.

                Dear authors

                I really like your essay and the way that it tackles a crucial issue for present-day physics that so many choose to sweep under the carpet. You suggest "1. Given a system of n distinguishable particles, each particle experiences a sudden spontaneous localization (i.e. collapse of position state) with a mean rate lambda, to a spatial region of extent r_C." I agree that this needs investigation. But my own view is that this would very likely depend on the local context, in much the same way that state vector preparation does (see here for a discussion). Thus the rate lambda would be environmentally dependent. Penrose's idea is one way that this dependence might occur; but it could be that it is a far more local effect than that (i.e. on the scale of the measuring apparatus).

                George Ellis

                  Dear Angelo, Tejinder and Hendrik,

                  I think the criterion for this departure from linear QM may come with horizon scales. The de Broglie relation tells us the wavelength of a particle with momentum p is λ = h/p. If we use the momentum p = mc (thinking in a relativistic sense of p = E/c) we may estimate the wavelength for a Planck-momentum particle, p = m_p c = 6.5x10^5 g cm/s, for m_p the Planck mass. The wavelength for such a particle is then 1.0x10^{-32} cm, which is close to the Planck length scale L_p = sqrt{Għ/c^3} = 1.6x10^{-33} cm.
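                  A quick numerical cross-check of these figures, as a sketch assuming standard CGS values for h, hbar, G and c:

# Cross-check of the estimate above, assuming standard CGS constants.

import math

h    = 6.626e-27     # Planck constant, erg s
hbar = 1.055e-27     # reduced Planck constant, erg s
G    = 6.674e-8      # Newton constant, cm^3 g^-1 s^-2
c    = 2.998e10      # speed of light, cm/s

m_p = math.sqrt(hbar * c / G)        # Planck mass, ~2.2e-5 g
p   = m_p * c                        # Planck momentum, ~6.5e5 g cm/s
lam = h / p                          # de Broglie wavelength, ~1.0e-32 cm
L_p = math.sqrt(G * hbar / c**3)     # Planck length, ~1.6e-33 cm
T_p = L_p / c                        # Planck time, ~5.4e-44 s

print(f"m_p = {m_p:.2e} g,  p = {p:.2e} g cm/s")
print(f"lambda = {lam:.2e} cm  vs.  L_p = {L_p:.2e} cm,  T_p = {T_p:.2e} s")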

                  A quantum system is measured by a reservoir of states. The superposition of states in that system is replaced by entanglements with the reservoir of states. The standard measuring apparatus is on the order of a mole or many moles of atoms or quantum states. This then pushes the effective wavelength of this measurement, or maybe more importantly the time scale for the reduction of the measured quantum states, to an interval shorter than the Planck time T_p = L_p/c. This might mean that measurement of quantum systems, and associated with that the stability of classical states (the table not being in two places at once, and Schrodinger's cat), involves this limit that is associated with quantum gravity.

                  John Wheeler discussed how there may be different levels of collapse. With gravity there is the collapse of a star into a black hole. He then said this may be related to the "collapse" (to use an overplayed buzzword) of a wave function. He said the dynamics of black hole generation and the problems with quantum measurement might well be related, or are two aspects of the same thing. It might also be pointed out that there are theoretical connections between QCD and gravitation, where data from RHIC, and some hints from the heavy-ion work at the LHC, suggest that gluon chains or glueballs have black-hole-like amplitudes similar to Hawking radiation.

                  In an ideal situation it might then be possible to maintain a system with around 10^{20} atoms in a quantum state. Any departure from such idealism may reduce this to a lower number. This does leave open the question of how the physics of superfluidity, superconductivity and related collective overcomplete or coherent states fits into this picture.

                  My essay is not related to this topic directly, though my discussions on replacing unitarity with modular forms and meromorphic functions could have some connection.

                  Good luck with your essay.

                  Cheers LC

                    Dear George,

                    Thanks for liking our essay, and for your interesting viewpoint.

                    What you suggest might very well be the case for a consistent theory of spontaneous wave function collapse. However, as you know, it is not what is assumed to happen in collapse models such as CSL. There, the collapse rate lambda is a uniquely fixed constant, which does not depend on anything. If collapse models were a fundamental theory, it would play the role of a new constant of nature. The equations of motion then tell you that, when you have a system of particles, the collapse rate of the center of mass scales with the size of the system. This scaling seems to be something like the contextuality feature proposed by you. But the value of lambda always remains the same.
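                    As a rough sketch of this amplification, assuming GRW-type scaling where the centre-of-mass rate grows as N times lambda (the standard GRW value lambda ~ 10^-16 per second is assumed; CSL scales differently in detail), the effective collapse time drops from cosmological to microscopic as N grows:

# Rough sketch of the amplification mechanism, assuming GRW-type scaling
# (effective centre-of-mass rate ~ N * lambda) and the standard GRW value
# of lambda.  CSL scales differently in detail.

lam = 1e-16                           # spontaneous localization rate per particle, 1/s

for N in (1, 1e6, 1e18, 1e24):        # single particle ... dust grain ... macroscopic body
    rate = N * lam                    # effective collapse rate of the centre of mass
    tau = 1.0 / rate                  # mean time before a localization event
    print(f"N = {N:8.0e}  ->  rate = {rate:8.1e} /s,  collapse time ~ {tau:8.1e} s")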

                    We are currently reading the detailed paper on quantum measurement that you mention above. Your essay here on top-down causation is fascinating. Do you have a picture of how the corresponding mathematical models could be built, specifically in the context of quantum measurement?

                    Regards,

                    Authors

                    Thank you Daniel. We broadly agree with your viewpoint.

                    Authors