Dear Lawrence,

1. You seem to subscribe to the idea that decoherence solves the measurement problem, if we interpret what you write correctly. We strongly dispute the claim that decoherence alone provides a solution to the measurement problem. See [Adler's paper against decoherence] for a thorough criticism, which we find convincing.

2. About John Wheeler's ideas. They are certainly very appealing, and there could be much truth in them. However, they have not so far been translated into consistent mathematical models. In our essay, we deliberately stick to ideas which find expression in well-defined mathematical models, such as collapse models and trace dynamics. Moreover, collapse models make precise predictions, which can be tested experimentally. In this way, one has what we think is a perfect match between speculation, mathematical modeling and experimental analysis.

3. Regarding superfluidity, superconductivity and related collective coherent (or overcomplete) states: they can be described very well within collapse models, and the answer is that they behave as we see them behaving. In other words, collapse models do not predict a (too) different behavior for such collective phenomena with respect to standard quantum mechanics. The reason is that these phenomena do not involve the *superposition of a large number of particles in two appreciably different positions in space*, which is the only type of superposition strongly suppressed by collapse models.
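To give a rough sense of the suppression mechanism (a schematic sketch in our notation, using the standard GRW numbers, which are not spelled out in the essay): each nucleon is localized at a rate λ ≈ 10^{-16} s^{-1} over a length r_C ≈ 10^{-7} m, and for a rigid body of N nucleons superposed over a distance larger than r_C the rates add up, Λ ≈ Nλ, so that N ~ 10^{18} gives a collapse time 1/Λ ~ 10^{-2} s. A superfluid or superconducting state involves no such displacement of many particles beyond r_C, and so is left essentially untouched.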

Regards,

Authors

As noted at the beginning of your article:

"The principle of linear superposition...Along with the uncertainty principle, it provides the basis for the mathematical formulation of quantum theory." You then suggest that it might only hold as an approximation.

I view the problem rather differently. Fourier Analysis is the actual mathematical basis for quantum theory. Superposition and the uncertainty principle are merely properties of Fourier Analysis. In other words, they are not properties of the physical world at all, but merely properties of the mathematical language being used to describe that world. Even the well-known, double-slit "interference pattern" is just the magnitude of the Fourier Transform of the slit geometry. In other words, the pattern exists, and is related to the structure of the slits, as a mathematical identity, independent of the existence of waves, particles, physics or physicists.
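To make that last claim concrete (a standard Fourier identity; the slit width a and centre-to-centre separation d are simply illustrative labels of mine): the squared magnitude of the Fourier transform of a two-slit aperture is |Â(ν)|^2 ∝ cos^2(π d ν) · sinc^2(π a ν), where ν is the spatial frequency conjugate to position across the slits. The cos^2 fringes and the sinc^2 envelope are fixed by the slit geometry alone; mapping ν onto an angle at a screen is the only place physics enters.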

For the better part of a century, physicists have been mistaking attributes of the language they have chosen to describe nature for attributes of nature itself. But they are not the same thing.

Fourier Analysis, by design, is an extremely powerful technique, in that it can be made to "fit" any observable data. Hence it comes as no surprise that a theory based on it "fits" the observations.

But it is not unique in this regard. And it is also not the best model, in that it assumes "no a priori information" about what is being observed. Consequently, it is a good model for simple objects, which possess very little a priori information. On the other hand, it is, in that regard, a very poor model for human observers; assuming that it is a good one is the source of all of the "weirdness" in the interpretations of quantum theory.

Putting Fourier Analysis into the hands of physicists has turned out to be a bit like putting machine guns into the hands of children - they have been rather careless about where they have aimed it. Aiming it at inanimate objects is acceptable. Aiming at human observers is not.

    • [deleted]

    Dear authors,

    When reading your excellent essay, a (perhaps silly) question comes to my mind. You write: "Suppose one has prepared in a controlled manner a beam of very large identical molecules...". What I wonder is: Mustn't there be an upper limit where the very large (hence complex) molecules can no longer be assumed to be positively identical? Might the lack of controlled identity be the limit where linear superposition no longer holds? Might it be a question of molecular complexity, rather than size/weight?

    Best regards!

    Inger

      Thanks for that. I don't yet have a mathematical model in the case of measurement: am thinking about it. The first step is to look at state vector preparation, which is an analogous non-unitary process, involving a projection operator depending on the local macro context. With that in place, the steps to a contextual measurement model - maybe with a new universal constant, as you say - may become clearer. But the essential comment is that the local measurement context may be the "hidden variable" (it's non-local as far as the micro system is concerned, so the non-locality criterion is satisfied). It's hidden simply because we don't usually take it into account.
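      A minimal way to write that preparation step (just a sketch in my notation): the local macro context selects a projector P, and the prepared state is the renormalized projection |ψ> → P|ψ>/||P|ψ>||, which is non-unitary precisely because the normalization depends on the state itself.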

      George

      • [deleted]

      Dear Authors,

      You are tackling one of the elephants in the physics room, and, as George Ellis commented above, this one is hard to sweep under the rug.

      Your solution of Continuous Spontaneous Localization [CSL] seems like a good idea to me.

      My own work points to the Planck mass as the upper limit of all quantum phenomena, including superposition. See my essay, "An Elephant in the Room", for two methods that show this. It is a different elephant from yours (there are plenty of elephants to go around).

      Here is a rough outline of an experiment that I believe can be performed and that would bear on your theory:

      1. Choose a crystal such as diamond to investigate. This is because a diamond is considered to be a single molecule, independent of the number of carbon atoms.

      2. Create bins of diamonds with increasing numbers of carbon atoms up to the Planck mass.

      3. Have the University of Vienna group test these diamond bins for interference.

      4. I suspect that interference phenomena will gradually decrease with mass and will disappear at the Planck mass. This experiment (if it can be performed) should provide confirmation of your theory.
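      For orientation (my own rough arithmetic, not a design number): the Planck mass is about 2.2 × 10^{-8} kg ≈ 1.3 × 10^{19} amu, so at 12 amu per carbon atom a diamond at the Planck mass would contain roughly 10^{18} carbon atoms - a grain of around twenty micrograms.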

      Good to see you in this contest.

      Don Limuti

        Inger,

        Your question is not silly at all. It is very near the heart of the issue. One need only go a little bit deeper to arrive at "the issue."

        What is the significance of the particles all being "identical" in the first place? If they remain forever identical, then they cannot change with the passage of time. If they cannot change with the passage of time, then they cannot store any information whatsoever within their internal structure.

        But a larger entity, constructed from a number of such identical particles, can store information in the relationships (like distances) between them. Entities that store information can behave towards other entities in a "symbolic" manner, and not just a "physical" manner. Even a tiny virus particle has genetic information stored within it that enables it to exhibit such "symbolic" behavior.

        What is the significant difference between "symbolic" and "physical" behavior? It is this: in physical behavior, observed measurements are treated as "real numbers"; in symbolic behavior, they are treated like "serial numbers." Real numbers have most significant and least significant digits. Serial numbers, like credit-card numbers, do not: change one digit anywhere, and it symbolizes someone else's account number; introduce one genetic mutation, and it may code for a different protein.

        All the "interpretations" of mathematical models in physics have assumed that entities only interact "physically." That is true for entities devoid of any information storage capacity, like subatomic particles. But it is not true of macroscopic entities, especially human observers. Physical behaviors can be viewed as encoded into the equations. But symbolic behaviors are coded into the initial conditions. By ignoring the exact (individual digits) of the initial conditions of the information stored within complex entities, physicists has thrown the baby out with the bath-water.

        All the supposed "weirdness" in the "interpretations" of quantum theory, derives from the fact that physicists have failed to take into account that human observers interact "symbolically" with their experiments, as well as "physically."

        • [deleted]

        Dear Robert,

        Thank you very much for your enlightening reply! You gave me more than a hint of the role of information theory in physics, which I would like to follow up further. I entered this essay contest in order to have the opportunity to ask some silly questions of people who are more knowledgeable than me - and kind enough to answer. See, if you like, my essay "Every Why Hath a Wherefore".

        You saved my day!

        Inger

        • [deleted]

        Dr. Singh and Colleagues:

        You ask an important fundamental question about quantum linear superposition. But implicit in that question is the assumption that linear superposition should be universal. Instead, I would suggest that linear superposition applies ONLY to primary quantum fields such as electrons and photons. Please see my essay "The Rise and Fall of Wave-Particle Duality", http://fqxi.org/community/forum/topic/1296. In this picture, Quantum Mechanics is not a universal theory of all matter, but rather a mechanism for generating localized particle properties from primary continuous fields, where these localized (but not point) particles then follow classical trajectories (as derived from the quantum equations). Composites of fundamental fields such as nucleons and atoms are localized composite objects WITHOUT wave properties of their own, and hence completely without linear superposition. Beams of neutrons or atoms do not require de Broglie waves for quantum diffraction from a crystal lattice, which instead reflects quantized momentum transfer between the beam particle and the crystal. Remarkably, this re-envisioned quantum picture is logically consistent and avoids quantum paradoxes. Even more remarkably, this interpretation seems to be virtually new in the history of quantum theory, although it could have been proposed right at the beginning. The FQXi contest would seem to be an ideal venue to explore such concepts, but they have drawn relatively little attention.

        Thank you.

        Alan M. Kadin, Ph.D.

          I don't think decoherence solves the measurement problem per se. It does indicate how superpositions of a quantum system are taken up by a reservoir of states through entanglement. This then reduces the density matrix of the system to a diagonal matrix whose entries correspond to probabilities. Decoherence does not tell us which outcome actually happens.

          I framed this within the decoherence perspective. It seemed as if the criterion for this sort of nonlinear quantum physics would be met when the state reduction occurs on a timescale comparable to the Planck time. This can happen for a system of approximately 10^{18} amu, or proton masses. This might be the maximal size at which a system can have quantum properties.

          Cheers LC

          Dear Dr. Bassi and Dr. Singh,

          It was a pleasure to meet you at the Quantum Malta conference and I am delighted to see that you have made what is in my view one of the two most important features of quantum theory the subject of your paper.

          I agree with the belief that quantum superposition does not hold for macroscopic objects (but for different reasons which are outlined in my paper) and am glad that the predictions of CSL are being put to the experimental test. I just hope that it won't take 20 years, as you suggest in your paper, to test the theory in an adequate regime.

          All the best,

          Armin

            Dear Don,

            The logic of your proposed experiment is basically what the experiments are aiming for: to increase the mass of the particles in matter-wave experiments. However, it is technically very challenging to perform these de Broglie interferometry experiments. Problems include: the generation of intense beams of particles at slow speeds, the implementation of an appropriate interferometer to see interference patterns of molecules with higher and higher masses, and the detection of single molecules with sufficient temporal and spatial resolution. On top of that, everything has to be implemented under ultra-high vacuum conditions. Diamonds would be possible, but there are many other molecules, nanoparticles and clusters which have to be considered for such experiments. They have to be chosen depending on their particular properties for beam generation, interferometry and detection. It is a huge puzzle with many experimental options. To give you an idea of the complexity and of the parameters which have to be considered for the experiment, see the experimental section of our recent review (Bassi et al. 2012, arXiv:1204.4325) and Hornberger 2012. It would be great to perform an experiment as you suggest, but it will take some time to work through all the experimental options and find the optimal setup.

            Your results about the Planck mass as the cut-off are intriguing. Curiously enough, as you know, the Planck mass is already essentially in the macro-regime. Various studies based on a gravity-induced quantum-classical transition, as reviewed for instance in our above-mentioned article, suggest that the transition happens a few orders of magnitude below the Planck mass. It would be interesting to try and understand why you get a different result.
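            (For orientation, with standard numbers added here for concreteness: the Planck mass is √(ħc/G) ≈ 2.2 × 10^{-8} kg ≈ 22 micrograms ≈ 1.3 × 10^{19} amu, whereas the heaviest molecules interfered so far are of order 10^{3}-10^{4} amu.)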

            Regards,

            Authors

            Dear Inger,

            No, it is only the mass. If you take a look at some of the recent publications on molecule interferometry (Gerlich 2011, Nat. Comm. 2, 263), you will find that the molecules are already very complex. Nevertheless, one always finds the maximum predicted quantum visibility in the interference experiments. Why is that so? First, what we observe is single-particle interference, otherwise it would hardly be a quantum experiment. Roughly speaking, this means every particle interferes only with itself, and a particle is by definition identical to itself - that is what we mean when we say we probe quantum superposition.

            You could then argue that other properties of the molecule play a role: internal states such as rotation, vibration or the conformation of the molecule. But again, we see no indication in the experiments that those properties influence the centre-of-mass motion. These internal molecular properties are simply not coupled to the motion of the particles. This means that in matter-wave experiments only the mass of the particle and its propagation speed matter. Both speed and mass define the de Broglie wavelength of the particle.
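            For concreteness (illustrative numbers of our own, not from any particular experiment): with λ_dB = h/(mv), a molecule of mass m ≈ 10^{4} amu ≈ 1.7 × 10^{-23} kg travelling at v ≈ 100 m/s has λ_dB ≈ 6.6 × 10^{-34} / (1.7 × 10^{-21}) m ≈ 4 × 10^{-13} m - a sub-picometre wavelength, far smaller than the molecule itself.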

            There is of course a dependence on the particle mass distribution, and that comes from the fact that you have to sum many single-particle interference events to observe a nice interference pattern, as for instance in Juffmann 2012 [Nature Nanotechnology, 2012]. As the interferometer is sensitive to a narrow band of particle de Broglie wavelengths, one needs particles of almost the same mass to collect a nice interference pattern. This is taken care of by chemical purification of the molecules after synthesis and also by mass-selective detection with a mass spectrometer in the present experiment. But again, this mass dispersion is not a fundamental limitation for molecule interference experiments; it is a technical issue. The question we ask with such experiments is whether there is a fundamental reason for the quantum-to-classical transition - something we cannot overcome by technology.

            Regards,

            Authors

            Dear Dr. Kadin,

            Thank you for your comments and for your intriguing essay. Experiments which perform matter-wave interferometry with atoms and with molecules as large as fullerenes already establish their wave nature and the validity of superposition for them [e.g. please see arXiv:1204.4325]. We wonder how your proposal can be made consistent with these experimental results.

            Regards,

            Authors

            Dear Armin,

            Thank you for your comments, and good to see your essay here.

            Regards,

            Authors

            • [deleted]

            Hi Tejinder,

            Can you point me to information on testing particles for interference (that I would understand)?

            In the essay (http://www.fqxi.org/community/forum/topic/1403) I logically derive the Planck mass (via two methods) as the ultimate mass for a particle. This does not mean there are any particles in nature that can make it to this mass. I define a particle as an object with mass that shows the property of interference. It does not surprise me that real particles never get close to the Planck mass.

            This is why I was interested in diamonds. They are peculiar because they are hard crystals that are thought to be quantum mechanical at all sizes. I think they have a chance of getting close to the Planck mass.

            Let me know what you think,

            Thanks,

            Don L.

            Dear Authors,

            Thank you for an interesting essay.

            Maybe I am missing something, but, on the face of it, there may be some contradiction between the following statements in your essay:

            1). "When one considers doing such an interference experiment for bigger objects such as a beam of large molecules the technological challenges become enormous."

            2)."However when we look at the day to day world around us linear superposition does not seem to hold! A table for instance is never found to be `here' and `there' at the same time. In other words, superposition of position states does not seem to hold for macroscopic objects. In fact already at the level of a dust grain, which we can easily see with the bare eye, and which has some 1018 nucleons, the principle breaks down."

            So if the technological challenges are enormous for large molecules, one would think they are downright prohibitive for dust grains or tables; perhaps, then, the principle does not break down, but we just cannot overcome the technological challenges to demonstrate it for such objects? The following analogy may be appropriate: we cannot demonstrate reversibility for large objects (e.g., when we break a vase), and thermodynamics is based on irreversibility, but that does not mean that reversibility fails for large objects.

            Another remark. For what it's worth, I expect interference to exist for arbitrarily large objects. My reasoning is based on the following almost forgotten ideas of Duane (W. Duane, Proc. Natl. Acad. Sci. 9, 158 (1923)) and Lande (A. Lande, British Journal for the Philosophy of Science 15, 307 (1965)): the direction of motion of the electron in the interference experiment is determined by the momentum transferred to the screen, and this momentum corresponds to quanta (e.g. phonons) with spatial frequencies from the spatial Fourier transform of the matter distribution of the screen. So I tend to draw the following conclusion: when the mass of the incident particle increases, the momentum transferred to the screen remains the same, but the angle of deflection of the incident particle becomes smaller, as its momentum is greater. So the mass of the incident particle is in some sense an "external" parameter for the interference experiment.
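            (To spell the step out quantitatively, in my own notation: a screen or grating of spatial period d can exchange transverse momentum only in quanta Δp = h/d, so the deflection angle of an incident particle of momentum p = mv is roughly θ ≈ Δp/p = h/(d·mv) = λ_dB/d, which indeed shrinks as the mass, and hence p, grows.)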

            Thank you

            Best regards

            Andrey Akhmeteli

              Dear Andrey,

              Thank you for your comments.

              There is no contradiction actually. When doing an interference experiment with a large molecule, one overcomes the technological challenges to prepare an initial superposed state, and then essentially one waits and watches. If quantum theory is right, the superposition will last forever, an interference pattern will be seen, and indeed it will have been shown that the observed absence of superpositions in daily life is because of practical limitations. On the other hand, if CSL is right, then the superposed state which one has prepared after overcoming the technological challenges will not last forever, and interference will not be seen. This would mean that the absence of macroscopic superpositions is not because of technological challenges, but because of new fundamental physics to which quantum theory is an approximation.

              With regards,

              Authors

              Dear Don,

              Please have a look at this review:

              http://in.arxiv.org/abs/1109.5937

              • [deleted]

              Hi, the post above was by Don Limuti, and not anonymous. The time-out got me.