
Will the Uncertainty Relations Survive as Fundamental Pieces in the Future of Physics?

by Spiridon DUMITRU

In this short text (by direct reference to two of my recent papers [1, 2]) we try to show the incorrectness of the widely agreed idea that the Uncertainty Relations (UR) are fundamental pieces for the (present and) future of physics. The popularity of this idea is associated with the supposition that, in physics, UR have a crucial significance. Mostly that supposition is itemized through a set of Assertions (A) such as:

• A1: In the experimental reading, UR are crucial marks (error descriptors) of measurements in Quantum Mechanics (QM) as compared with those in Classical Physics (CP).

• A2: Regarded from a theoretical/mathematical perspective, UR are considered a distinguishing sign between QM and CP.

• A3: In both the experimental and the theoretical view, UR are considered to be indissolubly connected with the description of the uncertainties (errors) specific to Quantum Measurements (QMS).

• A4: Being an essential element of UR, Planck's constant ħ is taken to be an exclusively QM symbol without any kind of CP analogue.

• A5: UR imply the existence of certain 'impossibility' (or 'limitative') principles in foundational physics.

• A6: UR are even regarded as the expression of "the most important principle of twentieth-century physics".

In the mainstream of publications, due to the mentioned crucial significance/assertions, UR are commonly accepted as fundamental pieces for the present and future of physics. We know of only a single notification [3] contrasting with this agreement. Note, however, that the alluded notification relied not on an analysis of UR per se but on some intuitive considerations about the future role of ħ as a fundamental constant in physics.

Recently, in [1], we investigated the deep truth of the mentioned crucial significance/assertions regarding UR. In the main, the paper [1] proves the falsity of the noticed assertions and, consequently, the incorrectness of the derived supposition about the crucial significance of UR. The proof is grounded partly on the results of our other paper [2] and partly on some additional irrefutable arguments. So we find that UR must naturally be assumed either: (i) as provisional fictions destitute of durable physical significance (in an experimental perspective), or (ii) as simple fluctuation formulae for random observables (in a theoretical approach).

Moreover, one finds that UR and their adjustments have no connection with the description of QMS. The details of our investigations yield a class of solid arguments which advocate and consolidate Dirac's intuitive guess [3] that "uncertainty relations in their present form will not survive in the physics of the future". Our investigations thus consolidate the viewpoint that UR will in fact not survive as fundamental pieces in the future of physics.

Additionally, in [1, 2] we present some serious reasons for the necessity of a UR-disconnected quantum philosophy. In particular, we plead for the idea that QMS aim to record quantum (random) observables and, consequently, must be understood and described not as a single trial (which gives a unique value) but as a statistical sampling (which yields a spectrum of values). Certainly, in such an understanding, the conception of "wave function collapse" becomes obsolete. In our vision, a QMS description must be regarded as a procedure distinct from and independent of QM (which deals only with the intrinsic properties of the studied systems). In this regard, a QMS can be depicted as a distortion of the information about the measured system. For a given system, the respective distortion can be described [2] as a process which changes linearly, from the intrinsic to the recorded expressions, the probability density and current (given in terms of the wave function) but preserves the mathematical picture of the QM operators (regarded as generalized random variables). The alluded UR-disconnected quantum philosophy facilitates [4] a promising approach to the existing (but often ignored) defects regarding QM foundations and interpretation.

On the other hand it is a known fact that FQXi (The Foundational Questions Institute) has as programmatic missions:

(a) "To catalyze, support, and disseminate research on questions at the foundations of physics ... and innovative ideas integral to a deep understanding of reality but unlikely to be supported by conventional funding sources".

(b) To "encourage ... people to not passively swim with the mainstream, but to take a risk and try something genuinely novel".

Due to the things presented above, I dare to suggest that this text, as well as my papers [1, 2], can be of real interest for many readers of the debates (Forum and Blogs) promoted by the FQXi COMMUNITY. Consequently, I appeal to the FQXi Staff and Board of Editors to include the present text among those debates. Moreover, I invite the alluded readers, concerned with QM problematics, to examine and comment on this text and, especially, my papers [1, 2] (accessible free of charge, as indicated below).

***

[1] Dumitru S. Do the Uncertainty Relations Really have Crucial Significances for Physics? Progress in Physics, v. 4, October 2010, pp. 25-29.

[2] Dumitru S. Reconsideration of the Uncertainty Relations and Quantum Measurements (Special Report). Progress in Physics, v. 2, April 2008, pp. 50-68.

NOTE: Progress in Physics is an American scientific journal, registered with the Library of Congress (DC, USA): ISSN 1555-5534 (print version) and ISSN 1555-5615 (online version). Also Progress in Physics is an open-access journal at the address:

http://ptep-online.com/index.html - see also the attached files to the present text.

[3] Dirac P.A.M. The evolution of the physicist's picture of nature. Scientific American, v. 208, May 1963, pp. 45-53.

[4] Dumitru S. Possible "Therapeutic Remedies" for the Quantum Mechanics Defects: Reconsiderations versus Myths (preprint, to be published).

Attachment #1: Do_the_Uncertainty_Relations_Really_have_Crucial_Significances_for_Physics..pdf
Attachment #2: Reconsideration_of_the_Uncertainty_Relations_and_Quantum_Measurements.pdf

    Heisenberg's Relations vs. Uncertainty

    Quantum Mechanics, in particular the Uncertainty Relations, does indeed need a good interpretation. Well, I think that it is more than a matter of interpretation. If its internal logic were self-consistent, then no interpretation would be needed. The long discussions about interpretations actually reveal the existence of internal inconsistencies in the formalism of Quantum Mechanics. The "no interpretation" alternative, the "operational interpretation", tries to ignore the inconsistencies by avoiding discussion of reality, focusing only on the operations we perform when making quantum-mechanical experiments. I think that what is really needed is to resolve the internal conflicts of Quantum Mechanics. Actually, I think that the expression "interpretation of Quantum Mechanics" is used in fact for alternative theories, which propose mechanisms by which QM is implemented. Because what we can observe is already described by QM, such mechanisms are usually hidden, practically impossible to observe. So, in my opinion, they are named "interpretations" and not "theories" because of the exigency of modern science to call them "theories" only if they are testable. We may call them "hypotheses": they are not really interpretations, since they actually propose new mechanisms, but they cannot be tested, so they don't qualify under the modern definition of the word "theory". Of course, it can be argued that the assumption (superstition?) that Nature really gave us access to all its mechanisms, as if She had the purpose of allowing us to test every statement we can make about them, should be kept open to debate.

    Seeing the Uncertainty Relations as fundamental is indeed problematic for several reasons. First, they are in fact a mix of two principles. The second of these principles is the Born rule, which gives the probability of obtaining a given state as the outcome of an observation of a quantum state. The Born rule, by specifying the probability, provides the probabilistic interpretation of the wavefunction. Since the Born rule already contains the probabilities, I think it would be better if we could see the Heisenberg Relations separated from the probabilities.

    If we take the solutions of Schrödinger's equation - that is, the wavefunctions - as fundamental, then the basic Heisenberg relations appear from their very properties. We just take the relations between the size of the interval in time (position) and the size of the interval in frequency (wave vector), known from Fourier analysis. These relations are much more general: if we represent the same wavefunction in two different bases in the space of all possible wavefunctions, there is always such a relation between the corresponding intervals. Of course, an observable (Hermitian operator) comes with its own set of eigenfunctions, which are orthogonal, so it is natural to obtain similar relations if we refer only to the observables and their commutation relations.
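    The Fourier reciprocity invoked above can be checked numerically. Here is a minimal sketch (assuming NumPy; the grid and the width are my illustrative choices, not from the post): for a Gaussian pulse, the product of the time spread and the angular-frequency spread sits at the minimum value 1/2.

```python
import numpy as np

# Discretized Gaussian pulse; the Fourier uncertainty relation
# sigma_t * sigma_omega >= 1/2 holds, with equality for a Gaussian.
t = np.linspace(-50, 50, 4096)
dt = t[1] - t[0]
sigma = 2.0
psi = np.exp(-t**2 / (4 * sigma**2))            # Gaussian with sigma_t = sigma

# Spectrum via FFT, on the matching angular-frequency grid
omega = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)
phi = np.fft.fft(psi)

def spread(x, f):
    """Standard deviation of x under the probability weight |f|^2."""
    p = np.abs(f)**2
    p /= p.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean)**2 * p).sum())

product = spread(t, psi) * spread(omega, phi)
print(product)   # ~0.5, the Gaussian minimum-uncertainty value
```

    A non-Gaussian pulse gives a strictly larger product, which is exactly the Fourier-analysis content of the Heisenberg relations.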

    Therefore, the Uncertainty Relations come directly from the wave nature of the solutions to Schrödinger's equation, combined with the Born rule. By "Heisenberg Relations", I will refer to the relations as they appear from the wave nature of the wavefunction, reserving the names "Heisenberg Uncertainty Relations" or "Uncertainty Relations" for their probabilistic interpretation.

    In a similar way, the entanglement between two or more particles is in fact a property of the tensor products between wavefunctions representing single particles. When the total state cannot be represented as a pure tensor product (which can be a combination of symmetric and antisymmetric products), but only as a superposition, we have entanglement. When we appeal to the Born rule, the entanglement manifests as correlations between the possible outcomes of the observation of the particles.
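    The criterion just stated - pure tensor product versus superposition - can be made concrete with the Schmidt decomposition. A small sketch (assuming NumPy; the example states and the helper name `schmidt_rank` are mine, for illustration): writing a two-particle state as a coefficient matrix, it is a pure tensor product exactly when that matrix has rank 1.

```python
import numpy as np

# A two-particle state |psi> = sum_ij c[i,j] |i>|j> is separable
# (a pure tensor product) iff its coefficient matrix c has rank 1;
# Schmidt rank > 1 means entanglement.
def schmidt_rank(c, tol=1e-12):
    return int(np.sum(np.linalg.svd(c, compute_uv=False) > tol))

product = np.outer([1, 0], [0, 1])                  # |0>|1>, separable
singlet = np.array([[0, 1], [-1, 0]]) / np.sqrt(2)  # (|01> - |10>)/sqrt(2)

print(schmidt_rank(product))   # 1 -> pure tensor product
print(schmidt_rank(singlet))   # 2 -> entangled
```

    The Born-rule correlations mentioned above appear precisely for states whose Schmidt rank exceeds 1.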

    The Born rule has thus been tested by all experiments in QM, involving entanglement or not. Being probabilistic, its predictions can be tested only statistically, but this doesn't mean that they reveal an intrinsically probabilistic reality.

    One central problem of Quantum Mechanics is to accommodate the unitary evolution described by Schrödinger's equation with the apparent collapse of the wavefunction due to observation. There is clearly a contradiction here. If we introduce an internal mechanism to explain this collapse, then we have to make this mechanism able to explain both the unitary evolution and the collapse. This is difficult, because both processes are very simple. In a vector space, what can be simpler than unitary transformations and projections? Any hidden mechanism would have to compete with them. This is why it is so difficult to explain QM in terms of hidden variables, of the multiverse, of nonlinear collapse, or of spontaneous diagonalization of the density matrix caused by the environment.

    On the other hand, there are already enough unknown factors even if we consider the wavefunction as the only real element. Schrödinger's equation gives us the evolution; it doesn't give us the initial conditions. The initial conditions can be partially obtained from observation. Due to the particular nature of quantum observation, our choice of what to observe is also a choice of what the initial conditions were (yes, in the past). This is why the initial conditions are delayed until the measurement is taken. To this, let us add that we do not observe the initial conditions of just a particle, but of that particle and every system with which it interacted in the past - such as the preparation device, which ensures the state of that particle at a previous time. Since such a device is large and complex, we don't really know its initial conditions, so when we observe the particle, we also observe the preparation device and everything with which they interacted. Therefore, there are many more factors to introduce into Schrödinger's equation. These factors are complex enough to make the conclusion that the wavefunction collapse is discontinuous not as necessary as it initially seemed. It is possible to have a unitary evolution leading from the state before the preparation to that after the measurement, given that we need to account for the interaction with the preparation device, which also has much freedom in its initial conditions. I described these ideas here, and there is also a video. In this view, the wavefunctions are real, and therefore the Heisenberg Relations are real too. By applying the Born rule to them, their probabilistic meaning follows: the Heisenberg Uncertainty Relations. It would be nice to have an explanation for the Born rule as well, because it is very plausible that it just follows somehow from a measure defined over the space of all possible wavefunctions.

      • [deleted]

      Two thoughts occur to me as a philosopher - dialectics and Kant.

      First, what emerged from the South Asian subcontinent 4500 years ago, à la the Vedas, was dialectics. That is, something is apprehended in terms of what it is not. It is a process epistemology, in which we cannot identify anything except in terms of its "other". A modern rendition of this is Hegel's Phenomenology. It seems that paradoxes, such as the wave-particle duality, arise because people try seeing something in isolation - an either-or thinking. Yet, if one views a wave (a continuum) in terms of particles (discreteness), it starts to make sense. In logic, these are called "duals", and the more we open our eyes, the more of them we see, and the fewer phenomena appear as paradoxes.

      Second, I am reminded of Kant's appearance and reality, where reality is the "full life" of something, and the appearance is an instance of it immediately in our time in front of us. A wave is a particle's "full life", in the same sense that a video is the "full life" of a particular scene. Perhaps this is an approach to explaining the "why" of the Born Rule.

      6 days later
      • [deleted]

      Excellent subject. I hope to be able to contribute to this topic.

      James

      7 days later
      • [deleted]

      The most important basis of String Theory is the Uncertainty Relations of Heisenberg, since only in this way can one be sure that the particles oscillate in some way in the electromagnetic field with constant energies in the atom indefinitely. It is also no accident that the uncertainty measure is equal to the fermion spin, which is one of the most important features of the particles. The attached document also uses the Uncertainty Relations as one of its most important bases to explain the most basic interactions, using also the Spontaneously Broken Symmetries and the Planck Distribution Law, and gives a physical explanation for a 3-geometric-dimensional String Theory.

      Attachment #1: 1_PhysicsUnified.tif

      4 months later
      • [deleted]

      A simple idea that can't be right: Is Heisenberg's Uncertainty Principle just a consequence of the limitations of what an elemental entity can know?

      As I understand it, all knowledge of "other" is acquired by one photon interacting with "me" (an elemental entity) and thus transmitting information about one "other" (another elemental entity) with which it has interacted at some time in the past. If I assume that the photon's information is "fundamental" (i.e. contains only information about the instantaneous state of "other"), then I would expect that I could not derive information about the state of "other" at an earlier or later time (i.e. its trajectory). Were I to have exact information about "other's" location and momentum, then I would be able to know at least its immediate past and future state (i.e. its trajectory). However, since the photon has no information about "other" before or after its interaction, it cannot provide exact information about both location and momentum. In simple terms, I cannot know what the photon has not observed.

      This naïve hypothesis is surely wrong since I am not a physicist, philosopher, or mathematician. I appeal to those who know to explain my error.

      3 months later
      • [deleted]

      There is de Broglie-Bohm theory, which is deterministic, but in agreement with quantum theory if we presuppose quantum equilibrium.

      This already seems sufficient to prove that the uncertainty relations do not have to be fundamental. They may simply be restrictions caused by our insufficient possibilities of measurement.

      24 days later
      • [deleted]

      I think the fundamental problem in physics is the non-separation of physical and mathematical concepts. The physical world should be physically defined. The mathematical part is only for verification and for extracting predictable results. The uncertainty principle is purely mathematical. But the instant duality (the alive-and-dead Schrödinger's cat) that follows from it is a physical concept based entirely on a mathematical concept, and so it is misleading.

      • [deleted]

      In plain words, I think what you have said is that QM is fundamentally wrong, but QMS are helpful as a mathematical tool. Or, in other words, the uncertainty principle is mathematical and not physical. So do you think there is no duality?

      In my opinion, even if there is duality, there should be a mechanism, and also a time factor, for changing from the particle form to the wave form. There cannot be any instant duality.

      4 months later
      • [deleted]

      Spiridon DUMITRU :

      "ROUTES OF QUANTUM MECHANICS THEORIES"

      ' - a collage - '

      The conclusive view of quantum mechanics theory depends on its route with respect to CIUR (the Conventional Interpretation of Uncertainty Relations). According as CIUR is obligatorily assumed or interdicted, the mentioned view leads to ambiguous, deficient and unnatural visions, respectively to a potentially simple, mended and natural conception. The alluded dependence is illustrated in the attached poster.

      Attachment #1: 2_Routes_of_Quantum_Mechanics_Theories...pdf

      Uncertainty is a result of the fact that measurement always takes place in the past of the "wave/particle". A measurement of the probability of a particle's location results in the collapse of the wave function. This collapse has a moment (a lapse of time: the Planck time) and a place (a length: the Planck length); after this observation the wave function of the "particle" is reinstalled, and again the probabilities of a possible location are diverse. That is how the two-slit (and more-slit) experiment can be explained in a logical way. Beyond these until-now minimum lapse of time and minimum length, our universe becomes non-causal, because there is no place for the cause and the event together, so neither of them is reality. (See also "Realities out of Total Simultaneity".)

      • [deleted]

      Dear All,

      Singularity (soul or conscience or universal i) is the only absolute "string" (source of all the waves) in the universe, this truth can only be known by the self and not observed with our senses. What we observe with our senses is the limited view of this universal string, which results in discrete particle duality or relativity.

      Please see the absolute mathematical truth of singularity at

      zero = i = infinity

      Love,

      Sridattadev.

      2 months later
      • [deleted]

      By the way, it is interesting to note that, according to conventional quantum mechanics, physicists nowadays still do not know why and how a particle such as the electron can exhibit both wave and particle properties! Maybe knowing the mechanism of wave-particle duality (in the paper below) will guide us to understanding the mechanism of the uncertainty relation!

      [link]http://www.vacuum-mechanics.com/index.php?option=com_wrapper&view=wrapper&Itemid=17&lang=en[/link]

      16 days later
      • [deleted]

      Dear Sir,

      In a paper "Is Reality Digital or Analogue", published by the FQXi Community on Dec. 29, 2010, we have shown that uncertainty is not a law of Nature. It is the result of natural laws relating to measurement that reveal a kind of granularity at certain levels of existence that is related to causality. The left hand side of all valid equations or inequalities represents free will, as we are free to choose (or vary within certain constraints) the individual parameters. The right hand side represents determinism, as the outcome is based on the input in predictable ways. The equality (or inequality) sign prescribes the special conditions to be observed or matched to achieve the desired result. These special conditions, which cannot always be predetermined with certainty or chosen by us arbitrarily, introduce the element of uncertainty in measurements.

      When Mr. Heisenberg proposed his conjecture in 1927, Mr. Earle Kennard independently derived a different formulation, which was later generalized by Mr. Howard Robertson as: σ(q)σ(p) ≥ h/4π. This inequality says that one cannot suppress quantum fluctuations of both position σ(q) and momentum σ(p) lower than a certain limit simultaneously. The fluctuation exists regardless of whether it is measured or not implying the existence of a universal field. The inequality does not say anything about what happens when a measurement is performed. Mr. Kennard's formulation is therefore totally different from Mr. Heisenberg's. However, because of the similarities in format and terminology of the two inequalities, most physicists have assumed that both formulations describe virtually the same phenomenon. Modern physicists actually use Mr. Kennard's formulation in everyday research but mistakenly call it Mr. Heisenberg's uncertainty principle. "Spontaneous" creation and annihilation of virtual particles in vacuum is possible only in Mr. Kennard's formulation and not in Mr. Heisenberg's formulation, as otherwise it would violate conservation laws. If it were violated experimentally, the whole of quantum mechanics would break down.
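      Mr. Kennard's bound can also be verified directly. A symbolic sketch (assuming SymPy; the Gaussian wavepacket is my illustrative choice, not from the post): for a Gaussian of width σ, the momentum-space density has width ħ/(2σ), so the product σ(q)σ(p) lands exactly on ħ/2 = h/4π, the minimum the inequality allows.

```python
import sympy as sp

# Kennard's bound sigma(q)*sigma(p) >= hbar/2 (= h/4pi) is saturated
# by a Gaussian wavepacket; a symbolic check.
q, p = sp.symbols('q p', real=True)
s, hbar = sp.symbols('sigma hbar', positive=True)

# Position-space probability density of a Gaussian wavepacket of width sigma
rho_q = sp.exp(-q**2 / (2 * s**2)) / (s * sp.sqrt(2 * sp.pi))
var_q = sp.integrate(q**2 * rho_q, (q, -sp.oo, sp.oo))      # -> sigma**2

# Its momentum-space density (from the Fourier transform of the wavepacket)
rho_p = sp.exp(-2 * s**2 * p**2 / hbar**2) * s * sp.sqrt(2 / sp.pi) / hbar
var_p = sp.integrate(p**2 * rho_p, (p, -sp.oo, sp.oo))      # -> hbar**2/(4*sigma**2)

product = sp.sqrt(var_q) * sp.sqrt(var_p)
print(sp.simplify(product))    # hbar/2, independent of sigma
```

      Note that nothing in this check involves a measurement: as the post says, the fluctuation product exists whether or not anything is measured.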

      The uncertainty relation of Mr. Heisenberg was reformulated in terms of standard deviations, where the focus was exclusively on the indeterminacy of predictions, whereas the unavoidable disturbance in measurement process had been ignored. A correct formulation of the error-disturbance uncertainty relation, taking the perturbation into account, was essential for a deeper understanding of the uncertainty principle. In 2003 Mr. Masanao Ozawa developed the following formulation of the error and disturbance as well as fluctuations by directly measuring errors and disturbances in the observation of spin components: ε(q)η(p) + σ(q)η(p) + σ(p)ε(q) ≥ h/4π.

      Mr. Ozawa's inequality suggests that suppression of fluctuations is not the only way to reduce error, but it can be achieved by allowing a system to have larger fluctuations. Nature Physics (2012) (doi:10.1038/nphys2194) describes a neutron-optical experiment that records the error of a spin-component measurement as well as the disturbance caused on another spin-component. The results confirm that both error and disturbance obey the new relation but violate the old one in a wide range of experimental parameters. Even when either the source of error or disturbance is held to nearly zero, the other remains finite. Our description of uncertainty follows this revised formulation.

      While particles and bodies are constantly changing their alignment within their confinement, these changes are not always externally apparent. Various circulatory systems work within our body that affect its internal dynamics, polarizing it differently at different times; this becomes apparent only during our interaction with other bodies. Similarly, the interactions of subatomic particles are not always apparent. The elementary particles have intrinsic spin and angular momentum which continually change their state internally. The time evolution of all systems takes place in a continuous chain of discrete steps. Each particle/body acts as one indivisible dimensional system. This is a universal phenomenon that creates the uncertainty, because the internal dynamics of the fields that create the perturbations are not always known to us. We may quote an example.

      Imagine an observer and a system to be observed. Between the two let us assume two interaction boundaries. When the dimensions of one medium end and that of another medium begin, the interface of the two media is called the boundary. Thus there will be one boundary at the interface between the observer and the field and another at the interface of the field and the system to be observed. In a simple diagram, the situation can be schematically represented as shown below:

      O│ │S

      Here O represents the observer and S the system to be observed. The vertical lines represent the interaction boundaries. The two boundaries may or may not be locally similar (have different local density gradients). The arrows represent the effect of O and S on the medium that leads to the information exchange that is cognized as observation.

      All information requires an initial perturbation involving release of energy, as perception is possible only through interaction (exchange of force). Such release of energy is preceded by freewill or a choice of the observer to know about some aspect of the system through a known mechanism. The mechanism is deterministic - it functions in predictable ways (hence known). To measure the state of the system, the observer must cause at least one quantum of information (energy, momentum, spin, etc) to pass from him through the boundary to the system to bounce back for comparison. Alternatively, he can measure the perturbation created by the other body across the information boundary.

      The quantum of information (seeking) or initial perturbation relayed through an impulse (effect of energy etc) after traveling through (and may be modified by) the partition and the field is absorbed by the system to be observed or measured (or it might be reflected back or both) and the system is thereby perturbed. The second perturbation (release or effect of energy) passes back through the boundaries to the observer (among others), which is translated after measurement at a specific instant as the quantum of information. The observation is the observer's subjective response on receiving this information. The result of measurement will depend on the totality of the forces acting on the systems and not only on the perturbation created by the observer. The "other influences" affecting the outcome of the information exchange give rise to an inescapable uncertainty in observations.

      The system being observed is subject to various potential (internal) and kinetic (external) forces which act in specified ways independent of observation. For example, chemical reactions take place only after a certain temperature threshold is reached. A body changes its state of motion only after an external force acts on it. Observation doesn't affect these. We generally measure the outcome - not the process. The process is always deterministic. Otherwise there cannot be any theory. We "learn" the process by different means - observation, experiment, hypothesis, teaching, etc. - and develop these into a cognizable theory. Heisenberg was right that "everything observed is a selection from a plenitude of possibilities and a limitation on what is possible in the future". But his logic and the mathematical format of the uncertainty principle, ε(q)η(p) ≥ h/4π, are wrong.

      The observer observes the state at the instant of second perturbation - neither the state before nor after it. This is because only this state, with or without modification by the field, is relayed back to him while the object continues to evolve in time. Observation records only this temporal state and freezes it as the result of observation (measurement). Its truly evolved state at any other time is not evident through such observation. With this, the forces acting on it also remain unknown - hence uncertain. Quantum theory takes these uncertainties into account. If ∑ represents the state of the system before and ∑ ± δ∑ represents the state at the instant of perturbation, then the difference linking the transformations in both states (treating other effects as constant) is minimum, if δ∑

      2 years later

      QUANTUM TUNNELLING, CAUSALITY AND RADIOACTIVE DECAY

      We're all frightened by radioactivity. We associate it with high-level nuclear waste; atomic weapons and the mass destruction of nuclear war; Hiroshima and Nagasaki; Three Mile Island and Chernobyl; radioactive fallout that causes cancer and biological mutations. What frightens me most about radioactivity is that there is no rational scientific explanation for it! That's probably because radioactivity resides within the realm of quantum physics, and there's no rational scientific explanation for that either.

      In high school science classes, we are told about a class of elements that have unstable nuclei; these are the radioactive elements, and they emit radioactivity - Alpha, Beta and Gamma radiation. This emission is their attempt to go from an unstable state to a less unstable state and eventually to a stable state. This progression happens at a fixed mathematical rate, termed the element's half-life. In class you get an awful lot of the what - what decays; what the daughter products are; what the measured half-life is; what the significance is, etc. But you don't get very much, if any, explanation of the how and the why of events. That's probably because any attempt to actually explain the how and the why of radioactivity ends up as pure bovine fertilizer.

      There are two main anomalies here. Firstly, why two identical unstable particles in the exact same environment will decay or go poof at different times; secondly, why any collection of identical unstable particles will decay or go poof while marching to the beat of a mathematical drum.

      DESCRIPTION

      Radioactive Decay: We all know about radioactivity (nuclear fission) and how some atomic nuclei are unstable and will at some point decay into more stable forms. So far, so good. The first issue is that nobody can predict when any particular unstable nucleus will go poof. There is no ultimate reason why one nucleus will go poof in five minutes and its next-door neighbour won't go poof over the next five hundred years. There is no apparent causality involved. That alone is "Twilight Zone" stuff, but wait, there's more. As we learn in high school, though the why is never explained, unstable (radioactive) nuclei decay or go poof in a fixed mathematical way, known by the phrase "half-life". An example would be if half of the unstable nuclei went poof in one year; one half of what remains unstable goes poof during the next year; one half of what is still unstable decays in the third year; one half of what remains after that goes poof in the fourth year, and so on down the line until all the unstable nuclei have gone poof. So if you start in the beginning with say sixteen million unstable nuclei, after one year there are still eight million unstable nuclei; after two years there are four million left to go; after three years two million still haven't gone poof; after four years one million; one year later there's still a half million left, and so on and so on.
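      The two anomalies coexist quite peacefully in a simulation. A minimal sketch (assuming NumPy; the per-year decay probability of 1/2 is chosen to match the example above): no individual nucleus's fate is predictable, yet the population reproduces the halving curve.

```python
import numpy as np

# Each unstable nucleus independently survives a "year" with probability
# 1/2 (no nucleus "knows" when it will go poof), yet the population as a
# whole follows the deterministic half-life progression.
rng = np.random.default_rng(0)

nuclei = 16_000_000              # the sixteen million of the example
survivors = [nuclei]
for year in range(5):
    nuclei = int(rng.binomial(nuclei, 0.5))   # survivors of this year
    survivors.append(nuclei)

print(survivors)   # ~16M, ~8M, ~4M, ~2M, ~1M, ~0.5M
```

      The statistical regularity of the half-life is just the law of large numbers applied to identical per-nucleus probabilities; the mystery, as the post says, is why those probabilities are all there is.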

      On a human level, apart from the nasties given in the abstract, radioactivity provides an abundant energy supply without any greenhouse gas emissions as well as a ways and means of dating historical events. On a cosmic level, radioactive decay turns complex unstable parent nuclei into simpler stable daughter nuclei by emitting Alpha, Beta and Gamma radiation, the former two being nothing more exotic than helium nuclei (the Alpha) and electrons (the Beta). Gamma radiation is best avoided since it is extremely high energy photons that can do your body a mischief.

      STANDARD EXPLANATIONS

      The standard quantum model attributes radioactivity or radioactive decay to a magical phenomenon called Quantum Tunnelling. Translated, radioactive decay happens for absolutely no reason whatsoever. There is no causality. There is no cause and effect. Things go poof - well, things just go poof.

      To get your head around the concept of Quantum Tunnelling, imagine one hundred convicts milling around a prison courtyard with twenty foot walls and no external exits. Then, for no obvious reason, fifty of those convicts vanish from inside the courtyard just to reappear somewhere outside the courtyard, and hence quickly make themselves scarce. One second they are confined within the prison walls; one nanosecond later they are scattering in all directions heading for the hills. They have tunnelled their way past the prison courtyard wall without actually physically doing any tunnelling! The escaped convicts in this analogy are of course those bits and pieces confined (or imprisoned) in the quantum realm, the Alpha, Beta and Gamma radiation part and parcel of radioactive decay.

      In the quantum realm, though the nuclei might be unstable, the bits and pieces are held in place by an energy barrier, the equivalent of the twenty foot prison courtyard wall. In the macro world, they don't have enough energy to clear the barrier, just like a long fly ball that doesn't have enough oomph to clear the outfield fence and become a homerun - it's just a long out. But in the micro realm, for reasons nobody comprehends, the unstable and restless-to-escape bits of the unwieldy unstable nuclei can cheat and tunnel past the energy barrier even though they don't have sufficient theoretical oomph to do so. Not only can they quantum tunnel through, but when they do, they do so instantaneously. And there's no rhyme or reason behind it. There's no causality. One second they are inside the radioactive nucleus; the next nanosecond they are free as a bird and outward bound.

      That in and of itself is absurd enough, but absurdity is piled upon absurdity when you consider that the 'convicts' escape not only for no reason, but do so with military or mathematical precision. So our one hundred convicts become fifty in one hour; then twenty-five of those remaining 'tunnel' to freedom in the next hour; thirteen of those twenty-five vanish through the wall in the third hour; six of the remaining twelve head for the hills during the fourth hourly interval; three more go walkabout in the fifth hour; two more vanish in the sixth hour; and the last one standing makes an unexplainable vanishing act in the seventh hour, leaving the prison courtyard in a pristine and very stable state indeed without an inmate in sight.

      How can you have both a total lack of causality AND maintain such military or mathematical (half-life) precision? It's pure bovine fertilizer.

      PROBABILITY vs. CAUSALITY

      The standard model suggests that radioactive decay happens for no apparent reason at all since Quantum Tunnelling happens for no apparent reason at all. It's all pure probability, even if it dances to a precise military/mathematical tune. The idea that Quantum Tunnelling is just pure probability yet results in a really neat graph when plotted goes rather against the grain of common sense.
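That tension can at least be made concrete with a toy simulation of the standard picture: give each simulated nucleus an independent 50% chance of decaying per interval (pure chance, no coordination between nuclei), and watch what the aggregate does. This is a sketch under those stated assumptions, not a claim about real nuclei:

```python
import random

# Each "nucleus" independently flips a fair coin every interval (i.e.
# the interval is one half-life).  No nucleus knows about any other.
random.seed(42)  # fixed seed so the run is repeatable

survivors = 1_000_000
history = [survivors]
for interval in range(5):
    # count how many of the current survivors make it through this interval
    survivors = sum(1 for _ in range(survivors) if random.random() >= 0.5)
    history.append(survivors)

# each entry comes out close to half the one before it
print(history)
```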

      Dealing with radioactive decay, well we (the observers) say the odds (probability) that an unstable atomic nucleus will go poof in say one hour (just a measure of time, which is a human concept) is 50/50. Actually, it's 100% certainty if you replace "one hour" with the phrase "sooner or later". There is no actual probability involved. Now let's go up one level. Each kind of unstable atomic nucleus, be it uranium (U-235 or U-238), plutonium (Pu), technetium (Tc), radon (Rn), radium (Ra) and all those normally non-radioactive elements that have unstable isotopes, like radioactive carbon (C-14), and many others too numerous to mention, has its own unique half-life. That in itself tells you that causality must be operating. All differing nuclei are only different because they have different numbers of protons and neutrons. Yet each U-235 nucleus has the exact same number of protons and neutrons. That's what makes U-235, U-235. That's causality, not probability. And U-235 has a specific and unique half-life. That's causality, not probability. The fact that differing configurations of protons and neutrons result in differing half-lives, and any one unique configuration results in one unique half-life, tells you that things are not random. Causality is operating; certainty follows. I have no idea what the causality behind Quantum Tunnelling is, only that I'm certain there is one.

      DISCUSSION

      Now IMHO that radioactive half-life decay progression makes absolutely no sense. If nuclei go poof for no reason at all, all those that go poof should do so in a totally random fashion - no fixed pattern. Since there is a fixed pattern, that suggests to me that the unstable nuclei have to 'know' about this half-life obligation they are required to follow. They are self-aware enough to know when it is their turn to commit suicide (decay) in order to keep up appearances, maintain the quantum social order, and keep the half-life relationship valid.

      Regarding Quantum Tunnelling, well firstly this violates Einstein's cosmic speed limit - the velocity light travels in a vacuum. That's because any gap crossed instantaneously by a particle undergoing Quantum Tunnelling implies infinite velocity, and infinite velocity is greater than the speed of light.

      Even scientist and science writer Marcus Chown described quantum tunnelling as "The apparently miraculous ability of microscopic particles to escape from their prisons". When a scientist starts invoking miracles, you know something is weird!

      Presumably if it wasn't for that energy barrier holding together the bits and pieces of nuclei, stable or unstable, everything within would escape all at once and the micro world would go to hell in a hand-basket, just like if there were no prison walls all the convicts would flee in the immediate here and now. But if that energy barrier (or prison wall) could be breached (via Quantum Tunnelling) the question arises, if the 'convicts', macro or micro, can dematerialise and rematerialise elsewhere instantaneously, why don't they all escape at the same time?

      And I fail to see how invoking the wave property nature of elementary particles helps any, since that would apply equally to stable and unstable (radioactive) nuclei. The wavelength would be larger than the nucleus, or in our analogy, the convict would be so spread out that they would be larger than their prison courtyard. Everything, all the bits and pieces in each and every nucleus, should break out and break apart and escape immediately.

      When it comes to radioactivity, apparently nothing chemical or physical can be done that will alter the nature of that radioactivity. Something that's unstable, radioactive, will decay when it damn well feels like it. You can boil it in oil, sledgehammer it, soak it in acid, swear at it, even invoke the name of Jesus and it won't alter anything. That in itself is more than just a little bit anomalous - not the Jesus bit, but the fact that nothing you can do to an unstable nucleus in any chemical or physical shape, manner or form will cause it to decay before it feels like it.

      SUMMARY

      Enigma number one is why two identical non-living things in an absolutely identical environment should individually act as if possessing free will, seemingly with minds of their own. That's just plain bizarre. If they don't have self-awareness, and it's absurd to suggest that subatomic nuclei have consciousness, then the alternative is that things happen for absolutely no reason at all. That's also just plain bizarre. Further heading into "The Twilight Zone", well the mathematical half-life behind the concept of the decay of unstable radioactive nuclei is just not the sort of natural behaviour that you'd expect. All unstable nuclei of the same type and in the same environment should all go poof at nearly, if not exactly, the same moment. They don't. That too is an enigma, IMHO.

      QUANTUM UNCERTAINTY: WHERE'S JANE?

      Quantum physics is weird for a whole lot of reasons. One of the central reasons is that all things in the quantum realm are stated in terms of probabilities, or uncertainties, or indeterminacy. That's unlike the realm of classical physics, the realm our normal day-to-day lives are lived in. However, I use an analogy from the classical world to illustrate the realm of quantum uncertainty.

      The physical universe is pretty predictable. The rising and setting of the Sun, the phases of the Moon, the tides, the positions of the planets and their satellites, eclipses, etc. can all be predicted to a high degree of accuracy centuries in advance. Apples fall from trees. Two parts hydrogen combined with one part oxygen make water. Pure water boils at 100 degrees Centigrade at sea level. Spring follows winter. Entropy increases. Musical instruments play according to design. Bridges bridge according to design. Airplanes fly according to design and so on and so on and so on. Your macro (classical physics) world is as predictable for the most part as death and taxes.

      On the micro (quantum physics) scale however, quantum effects rule the roost, and that roost is anything but predictable. In other words, uncertainty rules in the tiny world of the micro, for at the heart of quantum physics lies the Heisenberg Uncertainty Principle. That is, when dealing with all things micro, what you know is only probability. In the world of the macro: The sun will rise tomorrow. In the world of the micro: Any specific atom of a radioactive substance may, or may not, decay within an hour, even if it's near certain that at least one atom will. You can't predict or know which one. In the world of the macro: You know where the moon is. In the world of the micro: Where is an electron that's 'in orbit' around an atomic nucleus? You don't know to any precise degree, unlike say, a satellite in orbit around the Earth. The very act of observing or measuring something at the micro level changes the very nature or the properties of what you are trying to observe or measure. You may know the general probability of the value of the property (say the location of an electron in the vicinity somewhere around an atomic nucleus), but never the exact value or location.
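To put a number on that fuzziness, here is a back-of-envelope sketch of the position-momentum uncertainty relation, Δx·Δp ≥ ħ/2, for an electron confined to atomic dimensions. The constants are hardcoded SI values; the specific Δx is our illustrative choice:

```python
# Heisenberg's position-momentum relation: Δx·Δp ≥ ħ/2.  Pinning an
# electron down to atomic dimensions forces a large velocity spread.
HBAR = 1.054571817e-34         # reduced Planck constant, J·s
ELECTRON_MASS = 9.1093837e-31  # kg

delta_x = 1e-10                # ~1 ångström, roughly the size of an atom (m)
delta_p_min = HBAR / (2 * delta_x)         # minimum momentum uncertainty
delta_v_min = delta_p_min / ELECTRON_MASS  # corresponding velocity spread

print(f"minimum velocity uncertainty ≈ {delta_v_min:.2e} m/s")
# ≈ 5.8e+05 m/s: you cannot pin down both where the electron is and
# how fast it is moving.
```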

      From the realm of the classical macro, say an observer observes a pebble on the beach. The observation comes about because photons (light) reflect off the pebble, enter the eyes of the observer, and carry sufficient energy to jiggle those retina receptors, causing an electrical nerve signal into the brain, which does its brain thingy and 'sees' the pebble in a specific place on the beach. The photons, while energetic enough to jiggle the receptors in the retina, aren't energetic enough to budge the pebble. However, if you substituted a revolver bullet for the photon, then the pebble would move, probably to an unexpected, indeterminate place, but with a certain probability of being within a certain radius of where it originally was; an even greater probability of being within twice that distance, etc. Instead of the pebble and the bullet, substitute an electron (pebble), which is small enough to be dislodged by a photon (bullet). You need the photon to see the electron, say in 'orbit' around an atomic nucleus, but after that photon enters your eye, the electron has gone walkabout. In other words, the very act of observing the electron changes the position of the electron, so you can't be certain post-observation where the electron is now and what its new velocity and direction might be. It might not even be in 'orbit' any more. That's part of the guts of the Heisenberg Uncertainty Principle, and better eyeballs or better measuring equipment won't decrease the level of uncertainty. The other part of the uncertainty phenomena is that the electron is behaving as a wave - the wave-particle duality - and thus the electron is not behaving like a little billiard ball travelling in a nice straight line, or a standard curved orbit, but waving all over the place like a flag in a stiff breeze.

      So when comparing the macro and the micro worlds, there are two kinds of probability or uncertainty or indeterminacy - call it what you will. There's uncertainty in the macro world due to lack of knowledge that you in theory could acquire, like is that flipped coin that rolled under the sofa heads or tails? Then there's uncertainty in the micro world due to lack of knowledge that you cannot ever acquire, even in theory. In general, the former tends to represent the classical physics of the macro; the latter, the quantum physics of the micro.

      To illustrate, I've thought up an example from the world of the classical macro world called 'where is Jane?' The starting point is that apparently, according to information on Facebook, Jane is to leave Adelaide, South Australia for Canberra, Australian Capital Territory at 9 am. That much is apparently certain, but that's all you know. The question is 'where is Jane?' at 10 am?

      It's highly probable that Jane will catch a direct flight from Adelaide to Canberra, and knowing the usual speed of a commercial airliner, you can predict where Jane will be at 10 am. BUT, what if Jane missed the flight? What if the flight was delayed? What if the plane hit high headwinds, tailwinds or crosswinds? What if the flight had to go around some nasty weather system? What if the flight was diverted or returned to Adelaide because of a mechanical problem? Then your prediction of where Jane is (latitude, longitude, altitude) at 10 am is fuzzier.

      Of course Jane, albeit with less probability, might have flown first to Melbourne, hence to Canberra. Or perhaps Jane went from Adelaide to Darwin to Brisbane to Canberra - improbable, but not impossible. Even more improbable (but not impossible) is that Jane flew from Adelaide to Perth then on to London via Africa (or the Middle East), hence to New York (or maybe Boston or Washington or Miami) then on to L.A. (or San Francisco) hence to Hawaii, Sydney and Canberra! To predict where Jane is at 10 am, you'd need to consider all those improbable but possible itineraries.

      To complicate things further, there's a reasonable possibility Jane went to Canberra not by plane, but by train. Or maybe she drove or took a taxi or bus. Maybe she decided to hitch-hike, or use her bicycle or walk the distance (say to raise and collect money for charity).

      So, where's Jane at 10 am? You don't know exactly, although you can assign various probabilities to all the possibilities and take your best guess. Of course if Jane knows you're looking for her, perhaps she deliberately took one of the low probability options - and then decided to head for Hobart instead as her port of call! Now you have an idea of how hard it is to pin down any property, such as the position of an electron, in the world of the quantum micro! In fact, it's even harder than that. You will be indecisive or indeterminate or uncertain that the electron in question is in fact anywhere even near that atomic nucleus it normally 'orbits' around. There's a possibility that the electron went totally walkabout. In our analogy, what if Jane went up - straight up. Maybe, just maybe, however improbable, our Ms. Jane took a suborbital rocket flight from Adelaide to Canberra, perhaps maybe via the Moon, or maybe she is currently heading outward bound towards Mars (and points beyond)!

      Actually, to satisfy your curiosity, Jane woke up, decided to hell with going to Canberra, rolled back over and went back to sleep!

      REFLECTIONS ON A PANE OF GLASS: IT'S A QUANTUM PAIN IN THE QUANTUM PANE

      In quantum physics, you often deduce that those residents of the micro realm, those elementary particles, have some very strange properties bordering on a quasi-free will. They sort of possess a 'mind' of their own. They seemingly have the ability to 'know' things about their external world and their relationship to it. They make decisions with respect to those relationships and act accordingly. They are not just little inert billiard balls. There are observations to back this up, including one you can make at home. Look outside your window. What do you see? A very big mystery is what you see, if your window is anything like my window or most windows.

      Even if you don't know or understand very much about quantum mechanics, or quantum physics (same difference), you have probably associated it with weirdness. Unlike the certainty and causality domination of your day-in and day-out macro world, the realm of the quantum is centred on probability, chance and randomness where things happen for absolutely no reason at all and identical scenarios will yield different results. One oft given example you can (and have) witnessed - how light (photons) interacts with a common pane of window glass.

      GENERAL DESCRIPTION

      Here is a common happening that you have experienced at home or in the office or in the car that you probably never gave a second thought to. That unregistered oddity is seeing the reflection of AND the passing through of light waves (photons) with respect to a pane of glass simultaneously. What's so odd about that? Well, what's odd is that light is both passing through and reflecting from the same pane of glass at the same time. Why both? Why not one or the other scenario? What's odder still, assuming you are inside, is that you can see not only your reflection or the reflection of what's in your background, but also what's outside, through your own reflection. You see your reflection and the outside image superimposed on top of each other. So photons are both passing through the glass from the outside to the inside (you can see the outside while you are inside) and at the same time reflecting from the inside to the inside (you can see the inside from the inside), both happenings at the same spot on the glass.

      And if you go outside the reverse is also true. The outside is partly reflected by the glass surface back to you while you are outside looking in while at the same time light photons from the inside are passing through the entire glass so you can see inside your room though you are standing outside, both inside and outside as superimposed images.

      Further, the ratio of pass through to reflection also depends on the thickness of the glass, so presumably the photon 'knows' in advance what that thickness is and acts accordingly. If all of that doesn't strike you as odd, nothing will, though it's so commonplace it probably doesn't strike you as odd.

      OTHER EXAMPLES

      This 'do I or don't I' oddity doesn't just apply to panes of glass. This applies to a wide range of transparent, even translucent stuff. The same pass through vs. reflect back applies for example to your eyeball. Some photons enter your eye and deliver their message; some photons hit the identical spot but are reflected back, and can then hit a mirror and reflect back again, this time entering your eye so that you see your eye reflected in the mirror.

      Speaking of eyes, you can 'see' an external bright light even with your eyelids shut, yet some of the light is also being reflected off the external surface of your eyelids.

      Sunglasses are another obvious example. You can see your reflection in the outer side of the lenses, but clearly the sunglasses let photons through too, without any obstruction.

      You can see your reflection in still water and the bottom beneath the surface too if the water is pretty clear and the bottom is fairly shallow. This should also apply to say a polished diamond or other similar gemstones or crystal(s).

      Another visual example - you see sunlight reflected off of the tops of clouds when in an aircraft that's flying above them. As you descend through them and land, though the day is now overcast, clearly some sunlight photons passed through the clouds. It's the same clouds; the same sunlight; the same observer; but differing outcomes. So the pass through vs. reflection enigma applies equally to translucent objects (like clouds) too.

      Though this is obviously a visual puzzle - obvious since we can only see visible light photons - photons come in a wide range of forms, from ultraviolet to radio; infrared to microwave; gamma rays to X-rays. Presumably this pass through vs. reflection phenomenon takes place with non-light photons too. The most obvious example is that radio, TV or cell phone reception tends to be better outside than inside - one reason for your TV aerial or antenna. So, some radio/TV/cell phone photons are reflected off of the outside of your solid building but some pass through too, and this has nothing to do with frequency or wavelength since these transmissions are on a very narrow bandwidth.

      In a similar vein, it's been advocated for decades that the ideal location to do radio astronomy and/or SETI, searching for alien radio signals, is on the far side of the Moon because the Moon's bulk is 100% opaque to terrestrial and human generated radio signals that just add unwanted noise to the signals the astronomers are looking for.

      One clue that the pass through vs. reflection conundrum must be density related, not just thickness related, comes from X-rays. We've all seen X-ray photos of the human hand. The bones stand out; the wedding ring more so, but the flesh is visible too though less so. So some X-ray photons were reflected, greater reflection related to the density of the stuff the X-ray photon was hitting. Yet clearly some X-ray photons passed through since the image of the fleshy bits isn't as strong as the bones and the bones weren't as solid an image as the ring. Yet it was the exact same X-ray dose that hit all three substances - flesh, bone and metal.

      THE STANDARD SOLUTIONS

      The basic postulate put forward by quantum physicists is that the photon pass through vs. reflection anomaly is an anomaly because it all happens for absolutely no reason at all. It's all random. It's all probability. Some photons pass through via the luck of the draw; other photons get reflected by that same random luck of the draw. How is that possible given that we have, in the original example, one identical pane of glass with identical photons impacting? Well, if you don't invoke causality, you can just about get away with anything anomalous.

      The other accepted answer is that any one photon is in a superposition of states. It can be in two places at the same time, so it can both reflect, and pass through the pane of glass at the same time. Either that or the photon has awareness of its external surroundings; it has a mind of its own and decides what it wants to do!

      Superposition of state has been experimentally demonstrated via the classic quantum double slit experiment whereby particles, like a photon (but any type of particle will do, like an electron) fired one at a time at two parallel slits, will pass through both slits and thus will interfere with itself and cause a classic wave interference pattern on a target board behind the slits. The only logical conclusion has to be that one particle was in two places at the same time. Personally, I find that absurd, but it's hard to debate hardcore experimental results.

      The one flaw I find in that standard pane of glass situation explanation is that if the photon is in two places at the same time, then both the inside reflected image and the external image - the pass through the glass image - should be equally as vivid. Usually the pass through the glass image is the more obvious of the two superimposed images assuming just one light source, say external sunshine, or the reflected image is the stronger, assuming the prime light source is inside, like say at night.

      CAUSALITY & CERTAINTY vs. PROBABILITY & CHANCE

      I need to state the obvious here - all photons are identical; the pane of glass in question is obviously identical to itself. Therefore, knowing that and only that, one could only conclude that when photon meets window pane, one and only one outcome is possible.

      We, the observer say the photon has such and such a probability of going through, or being reflected from, the pane of glass. If seven out of ten photons go through the glass window, then there's a 70% probability the next photon will go through. Wrong. As far as that photon is concerned, we, the observer, are irrelevant, and it's 100% certain to either go through the glass or be reflected by the glass. We can be pretty damn sure that a group of photons won't gather together in the middle of the glass pane and do an impromptu performance of a Wagnerian opera. There's no probability involved. It's one or the other. There's no superposition of state. The photons aren't in two places at once - passing through and being reflected.
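For reference, the observer-side bookkeeping being criticised here can be sketched as a toy simulation: each simulated photon does exactly one definite thing (pass or reflect), and the 70% figure only ever shows up as an aggregate ratio. The 70% value and function name are illustrative assumptions, not measured optics:

```python
import random

# Each simulated photon either passes or reflects - one definite
# outcome per photon.  The "70% probability" is purely observer-side
# bookkeeping that emerges from counting many outcomes.
random.seed(1)  # fixed seed so the run is repeatable

def send_photons(n, p_pass=0.7):
    """Return the fraction of n simulated photons that pass through."""
    passed = sum(1 for _ in range(n) if random.random() < p_pass)
    return passed / n

for n in (10, 1_000, 100_000):
    print(n, send_photons(n))  # the ratio settles toward 0.7 as n grows
```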

      Another way we can be sure causality is operating, albeit going up one level, is that every time you go to the inside of your window pane looking outside, you see both outside and a faint reflection of you and the interior. Not once in a while; not sometimes 100% outside and no reflection; not sometimes a 100% reflection but you can't see outside (your window isn't a mirror after all), but 100% of the time, each and every time, you see both the exterior outside the pane and the interior reflected inside the pane.

      SUMMARY, DISCUSSION & RESOLUTIONS

      In summary here, some photons from the inside pass through a pane of glass to the outside; some outside photons pass through that glass to the inside; some photons from the inside reflect off the glass back inside and some outside photons reflect off the glass back outside. The big question is, how does the photon decide what to do? Here comes Ms. Photon heading toward the pane of glass. She has to make up her mind whether to pass on through or reflect back: decisions, decisions. To reflect, or not to reflect, that is the question! IMHO, photons should all go through, or all reflect, from the same pane of clear glass at the same time.

      We note from the outset that the glass hasn't been tinted or polarized - not that that would alter the general picture. What we have here is just an ordinary pane of glass.

      Further, no external forces are apparently at work here. Both the photons and the glass are electrically neutral. Gravity plays no role and the strong and the weak nuclear forces are only applicable inside atomic nuclei.

      To make a long story shorter, causality rules IMHO! Photons are not in a state of superposition; they are not in two places at the same time. Clearly photons are not in a position to 'know' anything. Photons have no decision-making apparatus; they have no consciousness of any kind, no free will to be or not to be. That can be demonstrated by adding a little extra thickness and/or density and/or energy.

      But first, one could easily suggest that since even seemingly 'solid' stuff is 99.999% empty space, a photon passing through the glass is passing through that entire void, and a photon reflected has hit a glass molecule and bounced back. One objection to that is that the reflection takes place at the surface of the glass pane, none from the interior of the glass. A second objection would be that reflections off of a solid molecular bit in the mainly empty glass pane would be totally scattered in many directions, which is what we don't see. Basic optics - the angle of incidence equals the angle of reflection. Yet clearly if photons are being reflected, they are bouncing off something. Or, perhaps they are being absorbed by the electrons within the glass matrix and then re-emitted, though the photon that's re-emitted might not be the exact same photon - but that's of no consequence since all photons are identical.

      We note that the greater the thickness or the greater the density, the more the pass through to reflection ratio changes. If you look through the exact same pane of glass, but this time edgewise, no photons pass through from one edge to the other edge. The X-ray case study above shows the role of increasing density. Both are an illustration that ultimately things become so thick and/or so dense that while there might not be total reflection, there wouldn't be any pass through either. The option for the photon might then be reflection vs. partial penetration. Of course that in itself doesn't explain the either-this-or-that option the photon takes, at least until such time that it becomes one or the other. In a vacuum it's 100% pass through and 0% reflection; in the case of a metre thick lump of lead, no light photon passes through at all. Restrictions placed in the photon's way by density and thickness just tend to confirm an earlier notion that stuff is 99.999% void, such that pass through equals boldly going through that void; reflection is a collision with that rare bit of stuff that sometimes gets in your way.
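The thickness and density dependence discussed above is usually captured by the Beer-Lambert attenuation law, I = I0 · e^(-μx), where the coefficient μ grows with the material's density. A sketch with a made-up attenuation coefficient, not a measured property of any real glass:

```python
import math

# Beer-Lambert attenuation: the transmitted fraction of light falls off
# exponentially with thickness.  mu = 0.3 per cm is an invented
# illustrative value, not a measured property of real glass.
def transmitted_fraction(mu_per_cm, thickness_cm):
    """Fraction of light transmitted through the given thickness."""
    return math.exp(-mu_per_cm * thickness_cm)

# The same hypothetical pane, face-on (0.5 cm) versus edgewise (30 cm):
print(transmitted_fraction(0.3, 0.5))   # most of the light gets through
print(transmitted_fraction(0.3, 30.0))  # effectively nothing edgewise
```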

      But that's not the entire story. Thickness is also related to opaqueness, though they are not the same thing. Photons can pass through Earth's entire atmosphere from the fringes of outer space to ground level, yet if you dab a smear of black paint on your pane of glass, well that will stop the photons from passing through, albeit black paint is a lot less thick than the Earth's atmosphere.

      Energy plays a role too. X-ray photons are more energetic than visible light photons, which is why X-rays are better for detecting structural flaws (like tooth cavities and bone micro-fractures) which are concealed by external surfaces which are opaque to light.

      Air and glass are transparent to light photons, but are generally fairly opaque to the less energetic infrared photons. That's the general principle or concept behind both the botanical greenhouse and the environmental greenhouse effect, although in the latter case not all the components found in air are equally as opaque.

      Ultimately invoking variations in properties like density, thickness, energy levels and opaqueness doesn't totally explain why identical particles, with all other factors being equal too, have this Jekyll and Hyde property whereby some do and some don't; some will and some won't.

      But we see that while things aren't totally explained yet, we're well on the way to determining the real factors that decide the photon's fate, and it's not the photon's free will either.

        What if light is a wave and not a particle (photon) - would that alleviate your quantum pain?