Hello Folks,

I've given this topic a lot of thought, so I figured I should weigh in. I gave a presentation at the 10th Frontiers in Fundamental Physics conference last Fall, about a common basis for non-locality and entropy, which focused on the role played by decoherence. I have to admit to incomplete knowledge of the subject, however. My proceedings paper here did not pass peer-review, but subsequent correspondence with H.D. Zeh, Erich Joos, and others, has given me a pretty clear notion of what was wrong with it (I will revise), and a clearer idea of what decoherence really means. I'll share what I'm fairly certain of here.

First off: the wavefunction does not collapse. The global wavefunction remains a unified and coherent entity which evolves according to Schrödinger's equation. However, its nature is essentially non-local. It is field-like, wave-like, and pervasive. When we introduce a local frame of reference, that reference frame is automatically identified with the material or particle-like nature, by virtue of its locality. Any local observer exists in contrast with the non-local reference frame of the global wavefunction, and induces (or observes) components of that wavefunction to decouple whenever an interaction or observation takes place.

Non-local components of the wavefunction that had been global become associated with one or another discrete or observable entity. In the process, what happens is exactly as Dieter says: "various systems (the observed one, the apparatus, the observer, and the environment) get entangled." In Bell and GHZ experiments, sequential weak measurements are made that effectively break off one component at a time, so that wavefunction components of the particle under study (corresponding to energetic degrees of freedom) are transferred from that entity to the measurement apparatus, until all the degrees of freedom are linked to it. Thus, components preserving the entanglement of one particle with its distant counterpart become dedicated instead to entangling the particle with the measurement system.

So there is no collapse as such. There is always an evolving wavefunction. But wavefunction components which had been associated with one system become linked with another, leaving the two systems entangled. One of the chief observations of decoherence theory, therefore, is that there are no isolated systems. Instead, entanglement is universal, and the manner in which various sub-systems are entangled evolves over time.

I shall have lots more comments on this topic, especially if someone responds to these. I want to talk about the observer's role as participant. I just skimmed Professor Leiter's Journal of Cosmology paper and feel that it bears some commentary relating to this subject - especially as it seems to offer some confirmation of my non-locality and entropy linkage. I thought the new paper explained his ideas much more lucidly than his FQXi contest essay, and I commend him on his clarity.

All the Best,

Jonathan

2 months later
  • [deleted]

I have read a lot about the "action at a distance" problem of quantum mechanics -- it was mentioned in Craig Callender's article in the June issue of "Scientific American" magazine. Murray Gell-Mann, in his book "The Quark and the Jaguar", says that "action at a distance is just a misinterpretation of what quantum mechanics says." Any comments?

3 months later

Does God allow us to apply 'natural number' to 'electron'? Especially since we know that an electron isn't an 'apple', or Richard Feynman's 'clicks', as he said that all the surprising wisdom of quantum mechanics is hiding in the double-slit experiment. I think maybe the field of application of the natural numbers is restricted by nature, e.g. by quantum phenomena. If we do not reconstruct quantum phenomena on the old picture (natural numbers), could that be thought of as another reality?

22 days later

Jonathan,

Very interesting. Here is what I think. In nature, what defines a particle is the number of constraints on its freedom, i.e. its quantum numbers (Pauli exclusion). When we observe or interact with it, we are adding one more constraint or quantum number. We don't collapse the wave function; we simply create a new one. I call that "temporary quantization". Any parameter may, under constraint, give rise to a quantum number. Light can travel in any direction. Send it through a slit and you add one quantum number, and the output, as direction, is now quantized: diffraction.
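
(A standard back-of-the-envelope estimate, added here for concreteness and not part of Marcel's post: confining light of wavelength $\lambda$ to a slit of width $a$ means $\Delta x \approx a$, so $\Delta p_x \gtrsim \hbar/a$, giving an angular spread $\Delta\theta \approx \Delta p_x / p \approx \lambda/(2\pi a)$, of the same order as the single-slit diffraction minimum at $\sin\theta = \lambda/a$. The constraint on position is what spreads, and structures, the output direction.)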

Marcel,

2 months later
  • [deleted]

Pankaj, I'd like to take your reference to Buddhism a step further and say that what emerged from the South Asian subcontinent 4500 years ago in the Vedas - obviously not Buddhist, but surely sharing a common thread - was dialectics. That is, something is apprehended in terms of what it is not. It is a process epistemology, in which we cannot identify anything except in terms of its "other". A modern rendition of this is Hegel's Phenomenology. It seems that paradoxes, such as the wave-particle duality, arise because people try to see something in isolation - an either-or way of thinking. Yet if one views a wave (a continuum) in terms of particles (discreteness), it starts to make sense. In logic, these are called "duals", and the more we open our eyes, the more of them we see.

14 days later
  • [deleted]

Logically it makes sense why quantum objects act like particles when observed and like waves when not observed. Quantum physics is common sense to anyone who understands the statistics of 2 coin flips and how those statistics are affected by observing the coins. I will explain why the double-slit experiment is the same experiment as something you can do with 2 coins.

In the double-slit experiment, an electron (or other particle/wave) has 2 holes it can go through and is then detected hitting somewhere on the back wall. If it goes through the left hole, statistically it will paint one pattern on the back wall. If it goes through the right hole, it paints a different pattern. It goes through each hole equally often. Most people's common sense tells them that statistically it doesn't matter whether you know which hole it went through, because you can simply average the 2 patterns to get the pattern on the back wall for when it could go through either hole. But it's a very different pattern from the average of the left-hole pattern and the right-hole pattern. It's the same pattern as waves interfering with each other.
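
(A standard textbook way to write that difference, added here for concreteness and not part of the original post: if $\psi_L$ and $\psi_R$ are the amplitudes for the electron to pass through the left and right slits, then the pattern with both slits open and no which-slit observation is

$$P(x) = |\psi_L(x) + \psi_R(x)|^2 = |\psi_L(x)|^2 + |\psi_R(x)|^2 + 2\,\mathrm{Re}\big[\psi_L^*(x)\,\psi_R(x)\big],$$

and the cross term on the right is the interference that simply averaging the two single-slit patterns, $\tfrac{1}{2}\big(|\psi_L(x)|^2 + |\psi_R(x)|^2\big)$, leaves out.)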

Sometimes electrons act like particles and sometimes like waves, but why? I'm going to explain why that happens using common sense instead of equations. The problem is that most people don't have all the parts of common sense that they think they have. If you understand the following about 2 coin flips, and you see the patterns created in the double-slit experiment, then you can put them together and understand why electrons (and other particles/waves) sometimes act like particles and sometimes act like waves. Logically, without considering the specific equations of physics, we can know there has to be something like that in physics somewhere. Here's the 2-coin question:

If I flipped 2 coins and at least 1 coin landed heads, then what's the chance both landed heads?

It's 1/3, not 1/4 or 1/2 like most people think, because there are 4 ways 2 coins can land and I only excluded "both tails" when I said "at least 1 coin landed heads". That leaves 3 possibilities, and I asked for the chance of 1 of those 3 things, which happen equally often. It's 1/3. If you still don't believe it, flip 2 coins many times and only ask the question when at least 1 of them lands heads, and you will see that 1/3 of the time you ask the question, they both landed heads. The flaw in human minds is the urge to choose 1 of the coins and say it certainly landed heads, but I did not tell you that any specific coin landed heads, and it does change the answer if you take that shortcut.

Most people's common sense tells them that since it's a symmetric question (between the 2 coins), it can't matter if they start with 1 of the coins that landed heads, and they think they will get the same answer as when not knowing whether a specific coin is heads or not. How could it matter? We know at least 1 of the 2 coins landed heads, so I'll just define a variable called coinX=heads and figure out whether coinY=heads or coinY=tails. Since coinY was randomly flipped and coins have a 1/2 chance of heads, the chance both are heads must be 1/2. But then they think about the extra information I told them: at least 1 coin landed heads. That has to change something, so how could coinY have a 1/2 chance of heads both by itself and together with coinX? CoinX and coinY are symmetric. You can swap them in this question and not change the answer. So whatever is true of coinY has to also be true of coinX on average. So maybe the chance both are heads is 1/4. Most people go back and forth between 1/2 and 1/4, but the answer is 1/3, as I explained above.
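
(Editor's aside, not part of the original post: the 1/3 figure is easy to check with a short simulation. This is just a sketch in Python; the variable names are mine.)

import random

trials = 1_000_000
asked = 0        # times "at least 1 coin landed heads" held, so the question gets asked
both_heads = 0   # times both coins were heads, among those

for _ in range(trials):
    nickel = random.choice("HT")
    dime = random.choice("HT")
    if nickel == "H" or dime == "H":      # exclude only the "both tails" case
        asked += 1
        if nickel == "H" and dime == "H":
            both_heads += 1

print(both_heads / asked)   # converges to about 0.333, i.e. 1/3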

How is the 2-coin experiment related to the double-slit experiment?

The patterns of the 2 coins (how often they land heads) individually cannot always be averaged to get the pattern of both coins together. If at least 1 coin landed heads and you observe a specific coin being heads, then the chance they are both heads is 1/2. If at least 1 coin landed heads but you don't observe any coin, then the chance both are heads is 1/3.

Logically, observing a specific case of something you know has to be true in general about the 2 coins produces a different outcome than only knowing it's true in general.
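
(Again an editor's aside, not from the post: the same kind of simulation shows the contrast described above. Condition on a specific observed coin instead of on "at least 1 coin landed heads" and the answer moves from 1/3 to 1/2.)

import random

trials = 1_000_000
nickel_seen_heads = 0   # times the observed coin (say the nickel) is heads
both_heads = 0

for _ in range(trials):
    nickel = random.choice("HT")
    dime = random.choice("HT")
    if nickel == "H":                 # you looked at the nickel and saw heads
        nickel_seen_heads += 1
        if dime == "H":
            both_heads += 1

print(both_heads / nickel_seen_heads)   # converges to about 0.5, i.e. 1/2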

The analogy to quantum physics is that when you observe a heads or tails, you collapse the wavefunction (including the other coin you didn't observe) to a particle, and the other becomes a different wavefunction, but if you do not observe any heads or tails then it's a symmetric wavefunction between the 2 coins.

I can say the same thing about the 2 holes in the double-slit experiment. If I put an electron detector past the left hole, and shoot an electron that could go through either hole, and the detector observes or does not observe an electron, then I get a different pattern (statistically, of where the electrons hit the back wall) than if the detector were not observing the space between the left hole and the back wall. If any part of the possible paths is observed (as containing or not containing an electron), then the other possible paths are affected even though they were not observed. The electron could have gone through both slits or neither or left or right, but still the path on the right is affected by observing the path on the left.

Most quantum physics scientists explain it as the electron going through the left hole, the right hole, both holes simultaneously, or bouncing off the thing containing the 2 holes without going through either hole. If the electron does not go through either hole, they do not count that in any of the patterns on the back wall.

In the 2-coin experiment and double-slit experiment, there are 4 possibilities, and 1 is excluded. I need to label the 2 coins for this, like the left and right holes/slits are labeled "left" and "right". One coin is a nickel and the other is a dime. This is not the only way to pair the 4 possibilities. It's just a way to explain that they are the same problem:

(1) Nickel heads. Dime tails. Electron left slit. Electron not right slit.

(2) Nickel tails. Dime heads. Electron not left slit. Electron right slit.

(3) Both coins heads. Electron goes through both slits.

(4) Both coins tails. Electron bounces off the thing containing the slits and does not go through either. It is not true that "at least 1 coin landed heads" so I don't ask the question or keep statistics of it. The electron didn't hit the back wall, so it's not part of the statistical patterns.

In the design of both experiments (2-coin and double-slit), cases (3) and (4) are opposites and cases (1) and (2) are symmetric. Exactly 1 of (3) and (4) is not counted in the statistics, but the chance is equally balanced between (1) and (2). It works the same way if you swap the left and right slits, or swap the nickel and dime, or swap heads and tails, or swap going through a slit with not going through a slit. It's practical to test it going through the slit but not practical to test it after it bounces, because bouncing is an observation by the thing it bounced on.

Quantum physics is a kind of statistics. So is the 2-coin experiment. In the double-slit experiment and the 2-coin experiment, observing any part changes the outcome statistically. I'm not saying the math of the double-slit experiment is exactly the math of a Bayesian network (which is the kind of statistics used for the 2-coin experiment), but I have explained enough similarities that quantum physics scientists should take this seriously.

The double-slit experiment is a variation of the 2-coin experiment that uses continuous angles instead of only heads/tails.

That is why observing things changes the outcome and why electrons/photons/etc act like particles when observed and act like waves when not observed.

If I flipped 2 coins and at least 1 coin landed heads, then what's the chance both landed heads? The most important thing to remember is that the question is symmetric between the 2 coins, but observing either of those coins changes the outcome, like observing what goes through either slit changes the outcome.

Quantum physics is common sense to anyone who understands the statistics of 2 coin flips.

4 months later
  • [deleted]

Dear Lisi,

You bring up a good question about the role of the observer in quantum mechanics. Answers to all the fundamental questions lie in the answer to a simple question: who am I? A fully realized observer becomes one with the absolute, or attains singularity. Everything emerges from this state of singularity from within the absolute. We are all capable of attaining this state if we carefully follow ourselves to that ultimate absolute truth (sridattadev-theoryofeverything.blogspot.com/2010_01_01_archive.html).

Love,

Sridattadev.

9 days later
  • [deleted]

All form is the result of resonances of space, or D, which are products of velocity and time. Particles are merely the interference patterns of those resonances and can be associated with various matrices associated with resonances. In certain conditions it is clear that certain particles are associated with resonances, but at scales that subdivide those resonances, those measurements fail, since the subdivisions and associated particles are less than the products that resulted in said resonances. But all form is the result of matrices of velocity and time acting in various dimensions of resonance D of the ether space.

The concept of an ether field means that there would be propagation of effects and action at a distance, but also, due to the inductive nature of fields, there has to be immediate action, since the existence of one is tied automatically to another in v and t. The differences between the natures of related fields are merely functions of time and velocity. The difference between a magnetic field and a current is time and velocity respectively, and the similarity is D, or resonance space. The force is a product of space, or D. So maybe in evaluating those differences, some are instantaneous and others are propagating, and thus from my point of view, assigning which is special, which relates to time, and which relates to velocity has meaning through propagation and induction. And then perhaps a propagating resonant field does both: it acts at a distance, while those that are instantaneous are related to time and velocity.

All things are products of time and velocity. Equivalency means that all things through all dimensions are products of time and velocity held together by a math or matrix, and related as scalar (additive) or proper (multiplicative) within various dimensions connected through states of resonance and induction. Space is made from loops of a velocity and time matrix, woven together with properties almost like a fluid. The geometry, compression, and configuration of this through multiple dimensions of inductive states, through dipole action, results in a continuous continuum that is multidimensional.

So the atom is the perfect trans-dimensional model, exhibiting all states and transitions of the matrix or space, so there is no distinguishing between the cores and the space. The particle is merely the overlapping of the matrix in nodes of interference. The centres of all particles are the singularities, of the boundaries that go from the interval to infinity. One of the controls is the number pi. The motions of these various dimensional or resonant states act together, one relating to another. So the entire system of space and particle is an inertial field or matrix. All these relationships at the smallest level are products of time and velocity.

So then the idea of mass, particularly as used to define force and energy, is an arbitrary assignment, since the smallest denominator has nothing to do with mass directly, and the defining of systems of ether and atomic states in terms of a larger configuration or "energy" in joules is not periodic, meaning not a system of assignment of parts and properties as we have in the periodic table of chemistry. If we redefine all properties as arising from three units, including v and t, then the associated properties are an adjunct to the real structure, and that v or t is the energy (new word) that is in all things and in particles associated with various states. The Higgs is more a test of the system of units, of the defining of mass, than the finding of a particle.

The evaluation of all particles in terms of mass and energy is linear and not periodic. Mass is a higher periodic state than velocity and time, which are at the bottom: v and t are the basics, and through the periodic development of v and t we finally get the effect of mass. Therefore to evaluate those things which arise out of simpler relationships of vt, such as massless particles, alongside something much more evolved, such as mass, does not provide clarity about the structure of the real universe.

a month later
  • [deleted]

Close scrutiny of QM shows that there is no such thing as a classical probability in it. For example, there is no joint probability for position and momentum. The observer can decide up to the last moment which he will measure, and the remainder of the entangled system, which may be very far away, can't know, and yet behaves as if it did.

Consider a very simple thought experiment that nevertheless contains the gist of the measurement problem: a particle is emitted in a spherically symmetric state, and there are two detectors diametrically opposed and at the same distance from the source. As soon as one detector sees the particle, the other detector can't, although there has not been enough time to signal it that the particle has already been detected. Worse still, the description of the process depends on the frame from which it is seen.
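
(A minimal formalization, added here and not part of the original post: write the emitted particle's state as $|\psi\rangle = \tfrac{1}{\sqrt{2}}\big(|A\rangle + |B\rangle\big)$, where $|A\rangle$ and $|B\rangle$ are the components heading toward the two detectors. Each detector alone fires with probability 1/2, yet the probability that both fire is zero: a detection at one detector updates the state to that component alone, instantaneously in the description, even though the detectors are too far apart to signal each other in time.)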

A purely classical probabilistic description can't be used because of this seemingly non-local behavior. It is not an issue of lack of information; it is much more fundamental.

    9 months later
    • [deleted]

    Wave and corpuscle are theoretical concepts; they don't exist in Nature. Now, what is observed exhibits both behaviours: those described by a wave and those described by a corpuscle. There are interference patterns, and when a particle is observed at a space-time point, we know that it can't be observed at a spacelike interval from this point.

    QFT is another description of that, but it also relies on a wave equation and the projection postulate, even if abstracted. The difference is the classical system that is quantized, but quantization itself is where the wave-corpuscle duality resides.

    A corpuscle is a little body, while a particle is a small part of something.

    a month later
    • [deleted]

    What? Any disturbance that lets you know something happened will absorb energy.

    No. Heisenberg knew any disturbance will absorb energy and change the system. So what is this entangled thing? Psychic forces over distance? You do not need it. It just needs acknowledgement of the pre-loaded state in the unquantum loading theory. Then you get to change the sign on the uncertainty principle and you get to see what is happening from the inside-out.

    • [deleted]

    My real experiment(s) use a beam-split geometry with one emission at a time. This is even simpler than diametrically opposed detectors with two emissions in coincidence, but it answers the same question. I did it for gamma and alpha rays. The beam splits at rates exceeding chance, contradicting quantum mechanics. A particle should go one way or the other, but it split like a wave, going both ways! No one had tested this beam-split test with gamma or alpha rays previously, because they were thought to be very particle-like. It turns out that there are properties of the alpha and gamma that are required to make the measurement see through what would otherwise be noise. With visible, singly emitted hν you will get noise (the chance coincidence rate) and think it goes one way or the other, like a photon. My paper will appear soon in the contest, but you can find my work from one word: unquantum

    Thank you

    ER

    a month later
    • [deleted]

    Please let me chime in again. The issue of possibilities interfering with each other can be resolved by experiment. My experiments have not been popularized and may be shocking, but they are all well described in my essay. The model of a collapse of the wave function psi, where psi is either probabilistic or material, is based on past experiments interpreted such that a single particle is emitted, and psi is used to determine where and when that SINGLE particle is absorbed. Right? Well, what if there were an experiment demonstrating that there were TWO particle-like detection events corresponding to a single emission event? That would change everything. Right? Well, that is what my experiments show, for both matter and light. A gamma-ray source that emits one quantum at a time is placed before a beam-splitter, and two detectors read detection events in coincidence at rates exceeding chance. A similar experiment was performed with alpha rays (helium nuclei). Examining the case for (gamma, alpha): this seems to violate energy conservation. But energy is conserved if we give up the always-applicable (photon, particle atom) model and accept that there was (energy, proto-helium) in a detection-loading-center ahead of time that loaded up and triggered a threshold detection event in each detector: the loading theory. The experiments show the flaw of QM, that there is no wave function collapse, and a resolution of the measurement problem. Please see my current FQXi essay: A Challenge to Quantized Absorption by Experiment and Theory
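
    (For reference, and not part of the post: the usual instrumentation estimate of the accidental coincidence rate for two detectors with singles rates $R_1$ and $R_2$ and coincidence resolving time $\tau$ is roughly $R_{acc} \approx 2\,\tau R_1 R_2$; "rates exceeding chance" here means coincidence counts above that background.)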

    Thank you, Eric Stanley Reiter

    a month later
    • [deleted]

    A few minutes ago I learned about the Scientific American Contest for Fringe Scientists. I have been preparing an essay on Quantism, but unfortunately the deadline expired yesterday. Instead of submitting the full essay, I will only post part of it.

    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%

    Understanding Why Nobody Really Understands Quantum Mechanics

    Daniel Crespin

    Universidad Central de Venezuela

    dcrespin@gmail.com

    During almost nine decades Quantism has made highly successful predictions. But Quantism is controversial and still generates endless debate. The present essay is an attempt to find the causes of this paradoxical situation. The original sin of Quantism is its inattention to the conjugate momenta $\phi$ of wave functions $\psi$. But stationary states can still be correctly calculated by Quantism because they have zero momentum. However, the time-dependent, unitary, quantum evolution equation is physically incorrect. The probabilistic interpretation of states appeared as an ersatz for a good deterministic evolution equation. Unitary evolution and the probabilistic interpretation contradict each other and, together with questionable ad-hoc principles, form the core of Quantism.

    A comparison between Classical Mechanics and Quantism reveals that, at its most basic level, Quantism is incomplete.

    The first link in the chain of quantum mistakes is the absence of conjugate momenta in (the Hamiltonian formalism of) Quantism. Therefore, the customary quantum states provide an incomplete description of the physical states (of the electron).

    Next, because conjugate momenta are absent, the kinetic energy term is also absent from the total energy function (the term $\nabla ^2 \psi$ is in fact internal energy, not kinetic).

    With an incomplete energy function, a senseless unitary evolution equation was introduced which does not correspond with the Physics of atoms.

    But stationary states necessarily have zero momentum. Therefore the momentum is not required for the calculation of stationary states and stationary energies, and in this task Quantism excels.

    However, the time dependent, unitary, quantum evolution equation remains physically incorrect.

    Finally, the scaffolding of Quantism is completed by the probabilistic interpretation of states, and supported by ideological constructs (uncertainty, wave-particle duality, special role for the observer, etc.)

    To obtain a correct description of bound electrons, momenta should be reintroduced and states have to be renormalized. This is achieved ---for the hydrogen atom, say--- taking as configuration space the projective space $PE$ (definitely not a linear space) associated to the linear space $E$ of real-valued wave functions. The correct space of states is then the cotangent bundle $T^*PE$. States are cotangent vectors $(\psi,\phi)$. The normalized square $\psi^2$ of a configuration $\psi$ is interpreted ---an important idea due to Schrödinger--- as charge density. The energy function $f:T^*PE\to \mathbb{R}$ equals the sum of (electrostatic Coulomb) potential, internal ($\nabla^2 \psi$) and kinetic ($\|\phi\|^2$) terms.
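
    (Gathering those pieces into one display ---my paraphrase of the prose; the author does not write the formula out, and signs and normalizations are not specified--- the proposal reads something like
    $$ f(\psi,\phi) \;=\; \underbrace{\textstyle\int V\,\psi^2}_{\text{Coulomb potential}} \;+\; \underbrace{\textstyle\int |\nabla\psi|^2}_{\text{internal}} \;+\; \underbrace{\|\phi\|^2}_{\text{kinetic}}, \qquad f : T^*PE \to \mathbb{R}, $$
    with states $(\psi,\phi) \in T^*PE$ and $\psi^2$ read as charge density.)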

    Comparison of classical Hamiltonians with quantum Hamiltonians makes it obvious that Quantism disregards wave momenta.

    When bound electrons make their continuous transitions between stationary states, the photons ---carriers of radiated and absorbed electromagnetic energy--- should be represented by momenta, but in Quantism they are not. Without momenta it is impossible to obtain correct Hamiltonian evolution equations. This explains the painful failure of the time-dependent equation of Quantism.

    The lack of the kinetic energy term generally goes unnoticed because the quantum misnomer "kinetic energy" assigned to $\nabla^2\psi$ covers for the absence in case of an energy roll call.

    As already mentioned, stationary states have zero momentum. Hence the motionless states and their stationary energies can be calculated ignoring momenta. This explains the remarkable computational success of Quantism.

    The artificial introduction of unitary evolution equations ---initially by Erwin Schrödinger, later refined by many quantum theorists--- is useless because trajectories of unitary evolution never approach stationary states and, in particular, none of these trajectories runs from one stationary state to another located at a different energy level.

    Historically, the inadequacy of the unitary evolution equation was improperly solved by Quantism with the invention of the probabilistic interpretation of states. These probabilities supposedly "explain" why systems "prefer" stationary states, and how systems make "discontinuous probabilistic jumps" between such states. This is also called "collapse of the wave packet". But not only do probabilistic jumps and collapses convert unitary evolution into a useless decorative fixture, they also violate it. Nevertheless, for Quantism it is important to retain both the victim and the offenders.

    The unitary evolution is kept by Quantism perhaps because any theory aspiring to respectability should brandish some mathematical evolution equation. Isaac Newton inaugurated this tendency. Note that all evolution equations ---unitary ones included--- are strict deterministic/causal/continuous laws expressed in the language of Infinitesimal Calculus.

    The probabilistic jumps are included in Quantism because they provide a surrogate dynamics required to patch the failure of unitary evolution.

    For concreteness assume whenever necessary that the system is the hydrogen atom. The above comments can then be rephrased as follows.

    Momenta (photons) were overlooked (Schrödinger; a lapse?), but do not hinder the calculation (Schrödinger) of stationary states and energies. In physical systems transitions exist, called "quantum jumps" (Bohr), connecting stationary states at different energy levels. Evolution equations (all the rage since Newton) were expected from any reputable physical theory. Unaware of the transitions, Schrödinger introduced the artificial unitary evolution. However, unitary evolution contradicts physically undeniable transitions (Bohr; also Schrödinger's "verdammte Quantenspringerei", damned quantum jumping). The even more artificial probabilistic interpretation of states was invented (Born) to explain energy transitions. Some opposed it (Planck, Einstein, de Broglie, Schrödinger). Others, with uncertain reasons, applauded (Heisenberg, Bohr, Born). Then quantum electrodynamics arose (Dirac), sick from birth with probabilistic maladies. The rest is history.

    Rife with contradictions and ad-hoc principles, Quantism makes correct calculations of stationary states and energies, maintains an incorrect and useless unitary evolution, adopts the probabilistic jumps to fill the gap left by unitary evolution, and creates a chaos where partial success coexists with paradoxes, contradictions and despair.

    The following disturbing fact then arises: Quantism is a mixture of virtuoso calculations providing valid mathematical expressions for stationary states and energies, with crackpot Physics.

    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

    TO BE CONTINUED, TIME AND RESOURCES ALLOWING.

    2 years later

    Dear Friends,

    Are all the particles in this universe 'particles' or 'waves', or do they have a dual nature, as currently believed in the 'wave-particle duality' of light and electrons? In the attached files an attempt is made to explain this 'wave-particle duality'. Your comments and suggestions will boost our understanding.

    Hasmukh K. Tank

    Attachment #1: Explanation_for_the_Wave-Particle-Duality.pdf
    Attachment #2: Will_the_QM_waves_of_equal_wavelengths_of_electrons_and_protons_interfere_A_question_by_Hasmukh_K_Tank.pdf

    What if a ghost or a spirit is its own kind of quantum field? If this turned out to be true, would it shake the physics community to its very core?

    The strongest evidence I have found for the existence of ghosts is from a TV series called The Haunted, produced by the network Animal Planet. I am of the opinion that a ghost behaves like a quantum field. In a quantum field, if you give it energy, its particle will manifest. I believe that a ghost acts like a field: if you give it energy (or it takes energy), it can manifest in the physical.

      If hauntings by ghosts really did occur, they would shake the scientific community to its core. The docu-drama TV series "The Haunted" presents eyewitness accounts from homeowners, business owners, and paranormal investigators. There is also video footage of paranormal activity, including attacks (scratching, biting, shoving) from invisible attackers. The witnesses and the video are very believable. There is very little, if any, "woo"; it's just the facts.

      Why can't there be living organic quantum fields?