• [deleted]

Why do people place so much emphasis on "measurement" and wave function collapse when surely it is "interaction". Measurement requires a measurer, interaction occurs regardless of whether anyone is measuring. I have seen the point made forcefully by much better minds than mine, but it just does not seem to stick in popular or casual usage.

Andrew Scott

25 days later
  • [deleted]

The whole of physics is centred around what is observed and the observer who perceives it. The tools of experimentation add further complexity. Instruments have their own, differing response times, so a measurement is never instantaneous: it averages over the response time. Thus two measurements of the same process or phenomenon may well differ. The truth can only be ascertained if we can know what happens at each instant of time, so experimental sensitivity and accuracy play a significant part in what has been observed. Further, the observer's subjectivity involves the mind, which may well limit the observer's objectivity. The best way to conduct an experiment is to observe a process as a function of shorter and shorter response times. It is further possible for an experimentalist to reduce the response time of the sensors used by clipping the electronic signal generated for a given event. This may well provide guidance for extrapolating the observed process toward a measurement at an 'instant'.
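A toy numerical sketch of this point (my own illustration; the signal and all numbers are invented, not from the post): model the detector as a boxcar average of the 'true' process over its response window, then shrink the window and watch the reading converge on the instantaneous value.

```python
import math

def signal(t):
    """A hypothetical rapidly varying 'true' process."""
    return math.sin(3.0 * t) + 0.5 * math.sin(7.0 * t)

def measure(t, response_time, samples=1000):
    """Model a detector that averages the signal over its response window."""
    dt = response_time / samples
    return sum(signal(t + i * dt) for i in range(samples)) / samples

t0 = 0.3
true_value = signal(t0)
for tau in (0.5, 0.1, 0.02, 0.004):
    m = measure(t0, tau)
    print(f"response time {tau:>5}: measured {m:+.4f}  (instantaneous {true_value:+.4f})")
# Shorter response times extrapolate toward the value 'at an instant'.
```

With this particular signal the averaging error shrinks steadily as the window does, which is the extrapolation the post describes.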

Theory also has its own limitations, by way of the chosen dependent and independent variables and the boundary conditions imposed on each of them. These can well result in an entirely different set of conclusions. Every researcher has a human tendency to ignore his shortcomings and exaggerate his strengths, thereby distorting the objectivity of the treatment presented.

11 days later

Here is how I view the uncertainty principle and the decoherence problem.

1) Macroscopic conceptual error:

An oil tanker half a kilometre long has an uncertainty of position of half a kilometre; anywhere on the tanker is 'the tanker'. This is normal. It appears a weird uncertainty only when we compare that half-kilometre precision with the metre-or-so precision of the small GPS antenna on the mast. Conclusion: reducing objects to point particles naturally introduces a conceptual uncertainty.

2) The Quantum conceptual error:

In the quantum realm, a moving particle is composed of the particle and of its associated wave (1 lambda); this is the whole tanker. The moving particle has a non-uniform existence, which is motion. Position and speed are part of the same boat; indissociable. A higher precision in position loses the associated wave, which is the expression of its non-uniform existence, i.e. speed. A higher precision in the speed, i.e. the wavelength, loses the position of the particle.

In the quantum realm, position and speed are replaced by a single concept: probability of existence. Associated with the moving particle is a single-lambda pilot wave that constitutes its probability wave, direction and speed.

Conclusion: Decoherence is the shared probability of existence.

3) Back to macroscopic

If one looks at the solar system from a distance, what is the probability of finding the earth in a specific position? This probability is equal to the ratio of the time spent by the earth in that position to the sum of the time spent in the rest of its orbit around the sun. The summed existence over the total orbit is one. The relative existence Re in one place is a function of the time spent in that place compared to all other available places. Where it spends more time, it is more likely to be, and to be found. Conversely, something that exists tends to exist more where it can stay longer, because it is there longer (e.g. gravitation).

Existence as a function of time spent in one place is valid for particles or planets ...
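This time-spent picture can be checked numerically for a planet. The sketch below is my own toy model (invented units with GM = 1 and an orbit of eccentricity 0.5, not anything from the post): integrating a Kepler orbit and sampling the planet's position at uniform time steps shows it is found far more often in the outer half of the orbit, where it moves slowly.

```python
import math

GM = 1.0
a, e = 1.0, 0.5                      # semi-major axis, eccentricity (invented)
x, y = a * (1 - e), 0.0              # start at perihelion
vx, vy = 0.0, math.sqrt(GM * (1 + e) / (a * (1 - e)))  # perihelion speed

dt = 0.001
period = 2 * math.pi * a ** 1.5      # Kepler's third law with GM = 1
steps = int(period / dt)

far_half = 0
for _ in range(steps):
    r = math.hypot(x, y)
    if r > a:                        # outer (slow) half of the orbit
        far_half += 1
    ax_, ay_ = -GM * x / r**3, -GM * y / r**3
    # leapfrog (kick-drift-kick) keeps the orbit stable over one period
    vx += 0.5 * dt * ax_; vy += 0.5 * dt * ay_
    x += dt * vx; y += dt * vy
    r = math.hypot(x, y)
    ax_, ay_ = -GM * x / r**3, -GM * y / r**3
    vx += 0.5 * dt * ax_; vy += 0.5 * dt * ay_

print(f"fraction of time with r > a: {far_half / steps:.3f}")
```

For e = 0.5, Kepler's equation predicts a time fraction of 1/2 + e/pi ≈ 0.659 in the region r > a, which the simulation reproduces: sampled uniformly in time, the planet "exists more" where it stays longer.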

I don't know if this decohere it for you ...

Marcel,

  • [deleted]

o.k. One background idea may be missing in the above post to understand the point.

The particle and the vacuum are made of the same stuff, but different forms of it. This vacuum is a continually exploding process that we refer to as the passage of time. In essence, because it is of the same nature as the vacuum, the particle replaces the vacuum by logical substitution, a substitution that is felt away from the particle as a local dip in the rate of passage of time. (See the model of explosive substitution in my essay.)

The particle jiggles at the bottom of this time-rate well, which takes the shape of a sphere of probability. This spherical shape describes a uniform existence, uniform in all directions. If we push this particle, we skew this sphere of probability and cause the particle to have a higher probability of existence in one specific direction. The sphere now takes the shape of a wave of probability: a lower rate in front, where the particle sits, and a higher rate in the back half of the wave. (Remember that the particle replaces the vacuum and therefore always sits in the lower-rate part of the wave.) The particle is not falling in a time-rate gradient (as in gravitation); it is actually being pushed by the higher-time-rate back half of the wave. In other words, a bump in the time rate in the vacuum makes things less likely to be there. A dip in the time rate in the vacuum makes things more likely to be or move there. Put them together and you have a directional traveling wave of time rate, i.e. a traveling wave of probability, i.e. pure motion.

In our calculations, this is where we have to use pi, to correlate values of spherical probability, or existence, to linear probability of existence, or motion.

hope this helps ...

Marcel,

p.s. Think this is a bit of metaphysics? Of course it is. The existence of the whole universe is metaphysical! Every experience or measurement exists only as a binary relationship. The universe does not require us in order to exist and to work.

15 days later
  • [deleted]

I wonder if my comments of Nov. 19 are not worthy of any rejoinders. I again visited the site and found some comments by Atomiton1. These have interesting aspects where physicists need to join neurologists in active collaboration. A year back, I read a science news report on an Internet site concerning experiments conducted by Prof. Eccles of Oxford University, who is also a Nobel Prize winner. He was studying the activities of the neurons in the supplementary motor area (SMA) of the brain, under sedation. He was not expecting any activity. However, he observed activity in a regular fashion. He postulated that it must have been caused by stimuli external to the human body. He then suggested that such activities must be getting recorded in a non-physical shield that surrounds the SMA of the brain. Such records are not expected to disappear after the death of that body. This aspect needs closer scrutiny using sensitive electrical sensors, where the physicist can help provide the technologies available today. For example, there are nano-structured dyes that can be directed into individual cells, where they act as nano-voltmeters measuring electric fields and changes therein. These changes will help us unravel the mysteries involved between the life-giving soul and the physical body, and what happens when a person dies and subsequent to it! The distinction between the brain and the human mind may be better understood, and what not. The future lies in a meaningful science conducted through closer collaboration between the physical and the life sciences! I happen to emphasize such linkage in the essay posted this year on the FQXi Essay Contest site, 'What Physics Can & Can't Do?'. It will be interesting to see what others think about such possible investigations.

11 days later
  • [deleted]

Ulrich,

"What bugs me about all these discussions (collapse or no collapse while evolution is taken for granted) is that they drown out the real questions concerning the ontological implications of the testable ways quantum mechanics assigns probabilities in actual measurement contexts."

I think that the wave collapse is a conceptual artefact. I don't see a radio photon 100 km in wavelength collapsing on the radio antenna. It goes by and, as it goes, induces the emf in the antenna. At the wavelength of a light photon it is easy to assimilate absorption to instantaneity, i.e. a particle. Let's make the photon a soliton and forget about this collapse thing!

The question as you say is what ontological conclusion or inferences can we draw from our experience of QM? In what concerns the fundamental nature of the universe, physics is but a method of gathering clues for us to deduce what the universe is made of and what makes it work. The true nature or state of the universe is metaphysical and we can never experience it directly, because an experience is a relationship, not an isolated state...

My essay shows the universe as being ruled by logic. Since the universe evolves by itself it must also be able to operate on logic (logical operations). This leaves us no choice but to admit only one substance and one cause in the universe. The logical creation of the universe from the rule of non-contradiction allows only time as a substance in the universe. This monism allows one to understand a lot of what physics describes.

This shows that local variable time rate and probability of finding a particle are interchangeable. We can't measure the first one, so we use the other. (This is your ontological connection.) A slower time rate in one spot makes the particle stay there longer, which translates into a greater chance of our finding it there. Period. In essence, the wave function describes the distribution of highs and lows in the time rate that determine where the particle stays longer or quickly moves on, which is the probability of our finding it or not. Is this too simple to understand? Or is there some other barrier I don't know about?

Marcel,

a month later
  • [deleted]

Most attempts to incorporate observation and measurement into physics seem too narrowly human-oriented or anthropomorphic. In line with Relational Quantum Mechanics, it can reasonably be posited that no physical event can happen without some information or signal being observed and responded to. Then the key requirements of the physical universe would seem to be not particles and/or waves and humans, but information emitters, information responders and response time, as in the review of Newton's Principia.

17 days later
  • [deleted]

Garrett,

In my unpublished paper "On Breaking the Time Barrier," I give time a physical definition ("n-dimensional infinitely orientable metric on self-avoiding random walk") and follow through the consequences of a time metric dissipative over n dimensions, n > 4. One of these consequences is an interpretation for collapse of the wave function:

"If time, as we assert, has an independent physical reality, it must be discontinuous with R^n and necessarily continuous with the n-dimensional Hilbert space. Necessarily so, because the continuity of a metric in n dimensional space implies discontinuity with R^n

by the invariance of dimension. The 3-space boundary in 3 1 dimension (Minkowski) space-time is the distance-time relation that changes sign in the metric signature - on a pseudo-Riemannian manifold of Lorentzian metric properties; "curved space." Compare with the Riemannian manifold in n dimensions of m eigenvalues, assuming non-degeneracy, which is smooth and positive definite. There are no timelike vectors, an important property for our definition of time, because an infinitely orientable

metric on a non-orientable surface is smoothly connected over dimensions

n - 1. We mean that the Lorentzian metric is compelled to obey local Minkowski space limits of time-distance. We mean that the time metric seeks the least dimension path in n-dimensional event space, which makes time an entirely scalar quantity. The magnitude {T} describes every time t --> T at which a measurement is recorded and all information about t is lost. This model preserves irreversibility in the measure space without sacrificing the

smoothly continuous function of a non-perturbative theory. I.e., we have the means to determine the large-scale bounds of quantum uncertainty in the relativistic limit:"

We then go on to calculate that result.

Tom

The fact that terms like 'demystified' are still being employed shows the crux of the issue in contemporary physics, which is the failure of many in the community to let go of their previously held ontological assumptions regarding the universe. The problem is that many want to see the structure of the world in a certain way and cannot accept any interpretation which does not conform to the neat and tidy world of classical reductionism.

For those who cling to the philosophical school of classical reductionism, which still describes the vast majority of the scientific community, the 'strangeness' of QM is the elephant in the room that everyone sees but nobody wants to admit is real. It is obvious that the scientific community has always been uncomfortable with the hand we have been dealt, and the last century has been full of attempts to explain why the elephant is an illusion, or why it must be an illusion because it does not jibe with the pre-packaged ontological assumptions born of the desire to see the world through the crystal-clear lens of classical reductionism.

The problem is not which interpretation of QM is correct; the problem is a direct result of trying to interpret the empirical results through the eyes of 18th-century analytical mechanics and classical determinism. We ain't in Kansas anymore, Toto. Pretending that we are will be an exercise in futility.

I wanted to present a question for discussion. Unfortunately, I cannot create a new post. There is no option available on the menu. What gives?

...

How much do the Philosophical prejudices held by Physicists preclude them from entertaining formulations that infer implications about the nature of reality that diverge from these Philosophical ideals?

The History of Physics is replete with such examples. One example is Einstein's failure to accept QM as a complete system due to the fact that the Ontological implications of the theory did not sit well with his notion of strict causal determinism.

We like to think of Physicists as entirely objective creatures who go about their business by appealing only to the empirical facts that are set before them. Physicists are human beings and like other humans, they carry with them metaphysical and emotional prejudices regarding the ontological nature of the world. Like other humans, they do not like to let go of these prejudices, even in the face of hard-core empirical facts, as was the case with Einstein.

All one has to do is read a few popular accounts of Modern Physics, both past and present, and you will often be presented with the opinion that any fundamental theory of physical structure will possess a beauty and elegance that is derived from its simplicity. Who says so? Is that something that we desire or something that is a logical necessity?

It is common to hear that some aspect of the Universe must be this way or that, and, more often than not, this notion comes from philosophical and emotional prejudices. The tendency then is to not even entertain any idea which does not pass an arbitrary litmus test. This fact, along with the sociology and politics of the scientific enterprise, is a cause for concern. These factors have generated a herd mentality in the Physics community and have created a stagnant environment in which straying too far from the herd is shunned. I can use a number of words or adjectives to describe the state of Modern Physics. Objective is not one of them.

    • [deleted]

    "How much do the Philosophical prejudices held by Physicists preclude them from entertaining formulations that infer implications about the nature of reality that diverge from these Philosophical ideals?"

    http://en.wikipedia.org/wiki/System_justification

    http://en.wikipedia.org/wiki/Falsifiability

    "All one has to do is read a few popular accounts of Modern Physics, both past and present, and you will often be presented with the opinion that any fundamental theory of physical structure will possess a beauty and elegance that is derived from its simplicity. Who says so? Is that something that we desire or something that is a logical necessity?"

    I think it was William of Ockham who said something like that, but he phrased it as more of a correlation.

    The classical elephant in the room was assumed to be spherical, frictionless and in a vacuum. The quantum elephant is assumed to be a zombified cat existing in multiple universes.

    There are a multiplicity of possible explanations. Which school of thought do you like and why?

    Bubba, thank you for the intelligent and insightful post.

    Hi Brian, thanks for the reply. I wanted to start a new topic on this subject so as not to diverge too much from the scope of the original thread. Unfortunately, I cannot create a new post. My apologies to the OP for hijacking his thread.

    Too often you find someone employing Occam's Razor as if it were a measuring tool for scientific truth or validity. Simplicity does not imply validity, and it is not a fundamental litmus test for scientific truth. The litmus test for scientific truth is empirical observation. If it looks like a duck, acts like a duck, and quacks like a duck, then it's a duck; even if the equations infer it must be a cow.

    Someone can come out of the woodwork and put forth the 18th-century hypothesis that the Sun is simply a burning lump of coal, and attempt to justify it by asserting that the coal theory is much more simple and elegant than the explanation provided by nuclear physics. Right out of the gate, however, the lack of dominant carbon emission lines throws a monkey-wrench into this theory. The empirical observations of the emission lines force a much higher level of complexity on any theory that attempts to explain them.

    Also, I think it is important to separate methodological simplicity from causal simplicity when discussing concepts like Occam's Razor. A scientist does not start with the goal of constructing a theory which is simple and elegant. A scientist sets out to construct, in the most general and fundamental terms possible, an explanation for the phenomenon under scrutiny. This explanation and the associated methodology may or may not be simpler than any arbitrary theory which precedes or competes with it. If anyone doubts this, simply read a paper on the theory of vortex flow or non-linear dynamics.

    In terms of methodology, relative simplicity results because we choose a frame of reference that often, but not always, makes it possible to intuitively conceptualize the problem under study. As a very simple example, I can choose to utilize polar coordinates to calculate the trajectory of an object undergoing rectilinear motion, even though it is much easier and more intuitive to utilize Cartesian 3-space or some arbitrarily simple vector space. Both reference frames will produce the same results and allow me to predict and explain. To me, that is the gist of Occam's Razor -- there is no need to trouble yourself with unneeded complexity when a simpler explanation will do. Occam's Razor does not imply that something is very wrong when a theory is not relatively simple, elegant, and beautiful.
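A minimal sketch of that coordinate-choice point (my example, not the poster's): straight-line motion described in Cartesian coordinates, converted to polar and back, gives the identical trajectory; the frames differ only in convenience.

```python
import math

def cartesian_position(t):
    """Rectilinear motion: constant velocity along x at fixed y = 3."""
    return (1.0 + 2.0 * t, 3.0)

def to_polar(x, y):
    return (math.hypot(x, y), math.atan2(y, x))

def to_cartesian(r, theta):
    return (r * math.cos(theta), r * math.sin(theta))

for t in (0.0, 0.5, 1.0, 2.0):
    x, y = cartesian_position(t)
    r, theta = to_polar(x, y)        # the same state, described in polar form
    x2, y2 = to_cartesian(r, theta)  # round trip back to Cartesian
    assert math.isclose(x, x2) and math.isclose(y, y2)
    print(f"t={t}: cartesian=({x:.2f}, {y:.2f})  polar=(r={r:.3f}, theta={theta:.3f})")
```

The polar description of a straight line is clumsier (both r and theta vary), but it predicts the same positions, which is the point about relative rather than absolute simplicity.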

    I also believe that it is important to separate heuristic simplicity from causal simplicity when discussing a subject such as Occam's Razor. We generalize concepts to make them more subtle and compact. This flows out of our operational requirements. We formulate more fundamental generalizations to eliminate higher levels of relative complexity in our understanding of phenomena. In terms of causal simplicity, a more fundamental theory is naturally going to be simpler and more elegant than those that precede it, because it provides a more fundamental and generalized way of viewing the phenomenon.

    ....

    As far as interpretations of QM go, an interpretation is only worthwhile for a scientist if it leads to a framework that allows novel and testable predictions that the current theory cannot make. Otherwise, the different camps would simply be doing the same things using different metaphysical labels. When a novel conceptual understanding of fundamental ontology leads to a new theory that can explain and predict where others cannot, the result is a paradigm shift that revolutionizes how one thinks about nature.

    IMO, Physics has been in a metaphysical funk for quite some time -- since the '20s, actually -- and it is a community that is not really sure of itself or where it wants to go. If any period in the history of Physics was in need of a paradigm shift, this is it. As I stated in an earlier post, we are not in Kansas anymore, and pretending that we are is likely only going to result in more time spent in this funk.

    Modern Theoretical Physics has become String Theory vs the Standard Model vs Loop Gravity vs [insert theory here]. Very few have stopped to consider that perhaps everyone is digging for oil in the wrong place. Perhaps the community needs to find a totally different conceptual landscape in which to probe for oil. Unfortunately, budgets and manpower are all tied up, and little is left for any undertaking which challenges the status quo or heads in a novel direction. Physics is no longer 19th-century Natural Philosophy, where scientists worked largely autonomously and could head out into new ventures on their own. The current politics, sociology, and budgetary considerations of the 21st-century scientific enterprise preclude the possibility of a paradigm shift happening any time soon.

    IMO, a Reductionist blind spot exists in our attempt to understand nature and formally relate lower levels of complexity to higher levels, and vice versa. One thing that is apparent from our current understanding of nature is that there often exists no clear and systematic causal delineation between different levels of structure in the physical world. I would liken this to the notion of the so-called missing links in evolution that don't permit one to empirically map one level of organizational structure to the next. For example, our distinction between atomic and molecular structures is methodological and heuristic -- nobody has managed to come up with a complete and internally consistent physical model of how, across the board, molecular structure arises from atomic structure.

    We can explain the structure of the water molecule by appealing to the concepts of energy levels, stability, valence, polarity, and all the other useful concepts employed in chemistry; however, we would be hard-pressed to deductively predict the existence of the water molecule and its properties based only on the laws governing physical structure at the atomic level. In chemistry, scientists still have trouble explaining certain unique molecular properties such as chirality. In fact, scientists are still debating the number of bonds each water molecule makes with its neighbors. In our study of nature, a reductive blind spot always seems to exist when we go from one level of structure to another.

    In short, knowing and understanding the properties of lower-level structures does not always allow us to predict or account for the properties of higher-level structures -- in any formal and systematic way -- based only on the properties of those at the lower level. There will always be gaps, special cases and oddities, along with the blind spots in our reductive schemes. To abstract even further, would an assumed TOE which accounts for the lowest-level structures be able, in theory or practice, to explain the migratory patterns of humpback whales, or the existence and content of Shakespeare's plays? I am withholding my opinion as to whether I believe emergence is due to ontological or methodological concerns.

    Historically, physicists have worked with the metaphysical presupposition that discrete and autonomous bits of substance are the most fundamental things that can be attributed to natural phenomena. The autonomous parts of a system are all that is needed to account for its complexity -- at least that's the current paradigm.

    This ontological and methodological framework is what I question.

      • [deleted]

      Bubba,

      That was well said, thank you. I am a dilettante so I am always appreciative of a wise physicist who will take the time to answer some of my questions.

      "Very few have stopped to consider that perhaps everyone is digging for oil in the wrong place. Perhaps the community needs to find a totally different conceptual landscape in which to probe for oil. Unfortunately, budgets and manpower are all tied up and little of anything is left for any undertaking which challenges the status quo or heads in a novel direction. Physics is no longer 19'th century Natural Philosophy where scientists work largely autonomously and can head out into new ventures on their own. The current politics, sociology, and budgetary considerations of the 21'st century scientific enterprise preclude the possibility of a paradigm shift happening any time soon."

      If what you're saying is true, then what are scientists doing? I understand that experimental physics is expensive and difficult. However, theoretical physics is dirt cheap by comparison, so why is there a lack of diversity and a resistance to new ideas? Why must it be expensive to shift the paradigm? I thought the overhead for a new idea was only the cost of paper, pencils and the physicist's time. Physics should be full of well-supported nascent ideas.

      If there is a precedent in physics for scientists working autonomously and heading out in new directions, then why does science have metrics that work against that creative process? Why is an isolated population necessary for an organism's evolution? Did Einstein support his initial research with a sinecure, or did he pursue tenure?

      Hello again Brian.

      From my perspective, I can sum up the issue in one word -- Politics.

      When discussing the subject, it is important to separate the institution of Science proper from the institutions that pay the theorists to conduct the business of science in an academic setting.

      A University is an exceedingly competitive environment. Every year, the typical university turns out more Physics PhDs than there are research positions available. An institution is very unlikely to hire a researcher who does not have either an interest or a background in the areas of research that represent the focal point of the institution.

      Scientists are like anyone else. They need to buy food, clothing, and shelter. The University will judge a scientist on a number of factors, one of which is the ability to publish as many papers as possible in one of these areas of research -- and in the shortest time possible. In short, unless you are already tenured and have established yourself, you will not risk your future by venturing outside the confines established by the status quo. By the time you have established yourself, you have usually reached such a level of complacency with the field that you will not venture out. By then, you are more interested in validating your life's work than in starting something new.

      There currently exists one large focal point of theoretical research that represents the status quo -- one which will go unnamed. If you are fresh and young and wish to find your niche, you will either get with this program or find your potential future to be FUBAR.

      Back to the subject of interpretations.

      I think it's worthwhile to take a quick survey of ideas and concepts throughout history that presented similar conceptual and ontological difficulties. It certainly won't offer any solutions to our problem but will give us an idea of how such issues eventually were resolved.

      If you ever get a chance, browse through Newton's Principia and Opticks, and compare and contrast the tones of the two works. In Opticks, you will find Newton offering many hypotheses and explanations for the phenomena he is investigating. In Principia, he explicitly refrains from offering any opinions whatsoever concerning the nature of the gravitational force.

      Action at a distance was so abhorrent to a mechanistic world-view as to be nonsensical. Forces were a local phenomenon, transmitted through direct contact. In the everyday world of the 18th century, a force was something that occurred only between two bodies in direct contact. It was impossible to fathom how two bodies separated by great distances could influence one another.

      Unlike with the corpuscular theory of light he put forth, Newton pretty much dealt with the subject of gravitation in a pedantic way and left it at that. The subject was never revisited. Action at a distance was a hot potato, and the subject was usually dropped from discussions. It was something that most everyone eventually accepted but nobody ever really spent a whole lot of time thinking about. There were no Copenhagen or Many-Worlds interpretations for this spooky behavior. Just as we do today, Newton and his contemporaries simply acknowledged that they did not understand the manner in which such a phenomenon manifested itself. Like us, they simply knew that the mathematics worked, made accurate predictions, and offered a complete accounting of the observations. Life goes on and the work continues. One does not need to understand the complete picture in order to continue to make use of the knowledge one has gained.

      In the 19th century, the development of analytical mechanics shifted the focus away from force and onto the energy of a system. In the formulations of Lagrange and Hamilton, force is never an explicit part of the picture. One does not need to visualize the system as individual particles exerting forces on one another. The goal of nature is to minimize the time integral of the difference between kinetic and potential energy. This is much easier and more efficient, as one can deal with scalar quantities and generalized coordinates. It also means that the notion of action at a distance subtly disappears from the scene. The problem of action at a distance basically gets tucked away in the attic. It isn't until Einstein that anyone seriously revisits the subject and attempts to make sense of the ontological nature of gravity.
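That minimization idea can be made concrete with a toy computation (mine, not from the thread): for a free particle, the discretized action S = sum of (1/2) m v^2 dt is smallest on the straight-line path, and any endpoint-preserving wiggle raises it.

```python
import math

def action(path, dt, m=1.0):
    """Discretized action S = sum of (1/2) m v^2 dt for a free particle."""
    s = 0.0
    for i in range(len(path) - 1):
        v = (path[i + 1] - path[i]) / dt
        s += 0.5 * m * v * v * dt
    return s

n, dt = 100, 0.01                        # path from x=0 at t=0 to x=1 at t=1
straight = [i / n for i in range(n + 1)]
# perturb the interior of the path while keeping the endpoints fixed
wiggled = [x + 0.05 * math.sin(math.pi * i / n) for i, x in enumerate(straight)]

s0, s1 = action(straight, dt), action(wiggled, dt)
print(f"straight-line action: {s0:.4f}")   # 0.5000 for this path
print(f"perturbed-path action: {s1:.4f}")  # strictly larger
```

No force appears anywhere in the calculation, yet picking the action-minimizing path recovers uniform straight-line motion, which is the shift in viewpoint described above.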

      A similar justification was used for accepting the field concept after its introduction by Maxwell and Faraday. With Maxwell, a non-local force between distant charges never explicitly enters the picture. The electromagnetic field permeates space and is the causal agent responsible for the interactions of parts of the 'electromagnetic fluid.' As the field permeates space, matter is influenced by its direct interaction with the field at the point at which it finds itself. The field is the causal mechanism of the electromagnetic force, and action at a distance becomes superfluous.

      Later developments obviously model the forces of nature as being mediated by the exchange of particles or energy borrowed from the vacuum.

      The spooky notion of action at a distance completely disappeared from the picture. Or so we thought. After a short hiatus, it seems to be back again, only in a slightly different form. Entangled particles seem to be working some strange magic, and we once again find ourselves in the unenviable position of trying to play poker with half a deck. It took almost 200 years and a radical shift in methodology and theory before we were able to form a coherent picture of gravitation that let go of the notion of action at a distance. How long it will take us to make sense of the present conundrum remains to be seen. At least we know how Newton and his contemporaries felt when trying to make sense of the nonsensical. The intrigue and wonder are what draw most people to science. If we had the answers to everything, it would be a very boring world.

      • [deleted]

      Continuing with the discussion (or monologue, in this case). I can never resist discussing this subject, even on Internet message boards. It's a good way to just sit back and unwind -- some might say a very geeky way, but a good way nonetheless. :)

      I am not trying to sustain a running monologue but am trying to add a different dimension to the subject. Jump in, y'all!

      There are lots of things we don't know.

      What do we know?

      -- QM in all its incarnations works. It represents the most successful theory to date in terms of its power to predict and account for observed phenomena on the scales at which we apply it.

      -- For nearly a century, the most brilliant minds in the scientific community have been trying to come to terms with the implications of this theory.

      -- These brilliant minds have put forth a myriad of interpretations but there is no consensus and currently there exists no methodological or empirical way of establishing the validity of one interpretation over another.

      Unfortunately, that's pretty much all we can really say at the moment. It's fun to speculate but we are all basically shooting in the dark when offering our interpretations.

      If someone were to ask me my opinion, I might just say, "I haven't the slightest idea. If Feynman or Wheeler couldn't figure it out, I sure as hell can't." Then I would put on my philosopher's hat and offer some metaphysical speculations, because that's really all that can be done without some new discovery or an entirely new reformulation of the subject. Obviously, the classical deterministic intuitions we have gleaned from the macro world have failed us on this most fundamental level.

      We did not build our current paradigm from the ground up. We occupy a very peculiar corner of the Universe. The majority of the visible matter in the universe exists in the peculiar high-energy state that we call plasma. Our tiny little corner of the cosmos is a freakish oddity characterized by stable low-energy states of matter known as solids. If we were plasma-beings living in that chaotic soup of high-energy substance, our physics textbooks would look very different. Our world would be dominated by forces so powerful that it would probably take quite some time before we ever got around to the idea of gravitation, if we ever did at all. We would probably never speculate on the existence of relatively stable entities such as molecules, solids, or polypeptide chains.

      We find ourselves on the opposite end of the spectrum. Our world is characterized by relatively low-energy states where the effects of forces such as gravity are the most discernible features of our existence. It is in this environment that we formulated the rules of the game. We had to start big and work our way backwards down the causal chain into the relatively high-energy world of particle interactions. There is every reason to believe that intuition gleaned from our oddball existence will fail us in this environment. This strange and tiny quantum world looks and behaves nothing like the one we live in here in our corner of the cosmos. If a plasma-being happened onto our little corner of the world, it would be equally perplexed and dumbfounded.

      All the plasma-scientists would be running amok shouting, "Nothing behaves like it should ! What the hell is going on?!!" We would simply retort, "Of course nothing behaves like it should -- you live in a plasma! What did you expect? What made you think that your world should behave like ours?"

      We always assumed that the micro-world must behave like our oddball world full of colliding pool balls and masses sliding down frictionless planes. We were wrong. Just one look around tells us that we ain't in Kansas anymore, Toto.

      • [deleted]

      Bubba

      Your lucid comments are a pleasure to read. Ever thought of writing physics books? What would you consider to be a measurement in quantum physics? Why should the wavefunction collapse to an eigenstate once there is a measurement? What specifically do you question in the ontological framework? We are solid beings, yet we still understand plasma physics (except for me; I got a C in that course). I'd also be interested if you have any historical connections or philosophical musings on entropy.

      I agree, we are not in Kansas anymore... quantum can be a trip.

      7 days later

      Hello again, Brian.

      As Feynman once stated, "If you think you understand Quantum Mechanics, you don't understand Quantum Mechanics."

      I am reluctant to offer any specific opinions without qualifying them with arguments as to why I believe one interpretation or opinion holds merit over another -- or why I formulate the specific opinions that I do. I don't want to come across as if I am skirting the question or engaging in hyperbole. I just prefer to think about these topics from a different angle.

      I believe it is more productive to start by trying to understand the factors that lead a scientist, or any individual, to choose one interpretation over another. How does someone come to choose the MW interpretation over Copenhagen, or vice versa? What are the criteria of selection, and where do those criteria originate?

      As I mentioned before, with our current state of knowledge, the problem with interpreting quantum mechanics is that we lack a formal means of deciding which interpretation corresponds to reality -- it is theoretically possible that we may never know. Outside of possessing data that contradicts an interpretation, all interpretations are equally valid answers to the question, 'What is QM really saying about the ontological nature of reality on both the macro and micro scales?' This is really what people want to know.

      The acceptance of an interpretation of QM often results more from a desire to make a selection that appeals to our metaphysical inclinations than from having thought through the issue in as objective a manner as possible and arrived at a conclusion. We end up grocery shopping and making selections that are agreeable to our palate -- "I will take two jelly doughnuts, two eclairs, and a bagel, please -- and throw in a coffee, extra cream." The unpopular items like Brussels sprouts are largely ignored and can be seen gathering dust on the shelf.

      In the absence of evidence to support any claim to validity, I believe it is especially important that we think about why we have come to prefer our own interpretation over another and can rationally articulate our choice to accept that interpretation as a probable ontological narrative describing reality. It is important that we do so because how we choose to interpret phenomena often determines the direction in which we head in trying to advance our knowledge of the subject. Just saying, "I think the MW interpretation is true" is no different than saying, "I think the Yankees are a better team than the Red Sox." Can you back up the opinion with some sort of argument, or is it something that arises from a whim or personal preference? Many interpretations come across as if they were formulated the way one would find a rational way to jam a square peg into a round hole, in an attempt to hold onto a classical metaphysical conception of physical reality.

      Continuing on with this subject...

      Violation of Bell's inequalities appears to have put us in the position of either rejecting the idea of hidden variables altogether or accepting hidden variables that are non-local.

      Some theorists who do not like the experimental results will say, "I will concede the existence of non-locality as long as I get to retain the ability to rely on hidden variables." Others will simply reject the experimental results entirely and argue the fine points of the experimental setups. They will concentrate on finding ways to show how the results are invalid. Again, going back to my points earlier, this sounds more like an attempt to hold onto a particular worldview rather than taking an objective look at the evidence and trying to reach the most logical conclusion that represents the most plausible interpretation. If the inequalities had not been violated and the results of experiment had told them what they wanted to hear, would they be pressing the issue of validity? I would tend to believe not.

      The experimental results have been replicated many times. If someone were to demonstrate and empirically prove that hidden variables indeed exist, I would have no choice but to accept their existence. That evidence, however, does not exist, and nobody has put forth any consistent theory that would account for or explain such hidden variables. This does not mean I am right. It simply means that I have formed my opinion in as objective a way as possible, irrespective of my own prejudices and metaphysical bias. It doesn't really matter to me if it turns out that hidden variables do indeed exist, any more than it would matter to me if we discovered that the Universe really was on the back of a tortoise floating in a gigantic pool of water. I simply see no reason to currently accept either proposition. They appear to be conjectures based on criteria that appeal to an individual's metaphysical desires and inclinations about how the world should operate.
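      For concreteness, here is a minimal numerical sketch (my own illustration, using the standard textbook angles rather than the setup of any particular experiment) of how the quantum singlet correlation violates the CHSH form of Bell's inequality:

```python
import math

def E(a, b):
    """Quantum correlation between spin measurements on a singlet pair,
    for analyzer angles a and b (radians): E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# Angle choices that maximize the CHSH combination
a, a_alt = 0.0, math.pi / 2
b, b_alt = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_alt) + E(a_alt, b) + E(a_alt, b_alt)

# Any local hidden-variable theory obeys |S| <= 2; QM gives 2*sqrt(2)
print(abs(S))  # ≈ 2.828
```

      The point of the sketch is only that the quantum prediction itself, not any measurement artifact, exceeds the local-hidden-variable bound of 2.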

      In the absence of evidence for hidden variables, I can only objectively infer that classical reductionism likely died a very slow death as we discovered more and more about the workings of nature on scales far removed from our senses. Many have refused to attend the funeral and hold out hope that it simply went AWOL and will one day reappear or manage to resurrect itself. The theoretical community appears to be working through the progressive stages of Kübler-Ross grief -- Denial, Anger, Bargaining, Depression, and Acceptance. Currently, many seem to be somewhere between the Bargaining and Depression stages.

      I believe that any resolution of the conceptual and ontological issues we face will likely result not from a reformulation of Quantum Mechanics or a discovery of hidden variables, but from a major paradigm shift that reorganizes our intuitive and causal notions of how nature is structured on a phenomenological level, not only microscopically but macroscopically as well.

      To give you an idea of what I mean, we should consider some examples from history. In our quest to understand the structure and properties of the phenomena of the natural world, there have been two fundamental reasons for shifts in thought that have led us to reformulate current principles. Such shifts either result from observations that force the change upon us, as is the case with QM, or they result from a radical ideological shift in perspective. Both lead to a reformulation of fundamental notions or concepts about the world, but the latter proves more powerful, as it forces one to reformulate ideas on a global scale. It forces a new approach to an old problem using entirely novel ontological conceptions about the nature of the Universe itself, not just a specific phenomenon associated with an arbitrary scale of influence.

      I will present two examples here, one illustrating each type of methodological shift.

      Contrary to popular opinion and many textbooks, Copernicus did not discover that the Earth orbited the Sun. Copernicus presented a hypothesis -- one that was neither logically nor empirically necessary, given the information available at the time. The Ptolemaic system could account for the observations of the behavior of heavenly bodies with equal precision. Unlike our venture into the quantum world, the heliocentric hypothesis was not formed as a result of some new observation that could not be accounted for within the confines of the accepted theory of the time. Copernicus did not come out and say to the world, "Hey guys, we've got this all wrong. I was just observing the sky last night and discovered that the Earth is actually circling the Sun. We've got this whole thing wrong!"

      The heliocentric hypothesis was one that had fermented in the cellar for centuries. You can find it present in some schools of thought all the way back to the Greeks. Copernicus framed the heliocentric hypothesis on a much more refined and accurate scale, using the precise data available at the time. He was able to do something the Greeks could not -- he could match empirical data to a hypothesis and give it some life. Still, I must emphasize that the heliocentric hypothesis did not need to be true in order to account for the data -- the Ptolemaic model did so with equal precision. Since displacing the Earth from its primal and privileged position was abhorrent to the common sensibilities of the time, why, then, did it gradually gain acceptance? It offered a simplicity and elegance that the Ptolemaic system could not. It accounted for the astronomical observations in a much more straightforward and rational way. In short, it made more intuitive sense, whereas the Ptolemaic system, with its bizarre machinery of epicycles and deferents, did not.

      The period that followed represents a time when the ideology of the world was turned totally upside down as a result of the inferences we derived from the theory. It marks the beginning of the scientific revolution and allowed us to proceed in ways previously undreamed of. We have never looked back since. Obviously, the idea was slow to be accepted. As with the theory of evolution today, it implies things about our existence that many people may not want to hear. If our position in the Universe is not special, then perhaps we are not as special as we thought. In the case of Copernicus, the idea started to take shape that perhaps the world does not revolve around us, either literally or figuratively. IMHO, the Copernican Revolution represents the most radical shift in naturalistic and rational thought in the history of the Western world.

      Compare this paradigm shift to the changes in reasoning that occur when nature, by means of direct observation, forces upon us certain tenets or principles which cannot be avoided but which conflict with the current paradigm.

      A misconception in the history of physics as it is portrayed in most texts is that Planck was a willing accomplice in the development of quantum theory. Planck's motivation for introducing the notion of quantization was entirely methodological, not thematic. He was simply using quantization as a neat mathematical trick. His goal was to create a mathematical model that could reproduce the spectral observations gleaned from the study of blackbody radiation.

      Another misconception is that he was motivated to do so by the Ultraviolet Catastrophe. This is not the case -- in any way, shape, or form. Planck held great reservations about the methodological application of Boltzmann's statistics to observed phenomena. He never trusted such an approach as an internally consistent method of relating the thermodynamic properties of macroscopic systems to micro-phenomena. He saw the application of M-B statistics as rather arbitrary and contrived. He set out to create his own methodology that would not rely on assumptions about the mathematical forms of the distributions themselves. In other words, he was asking, 'What mathematical form of the distribution equations would allow one to successfully model the spectral phenomena?' His goal was to model the behavior of the system, not to start with assumptions about the statistical behavior of an ideal ensemble.

      Using classical electrodynamics, Planck tackled the issue by imagining that all of the component structures in a body that gave rise to the EM waves could be pictured as charges on the ends of springs, vibrating back and forth like simple harmonic oscillators. He worked out an equation to model the energy distribution among the components that would give the exact results measured in experiment. He gradually found that he could never reproduce the empirical results with a continuous distribution of energy among the oscillators. The only way he could make the mathematics work was to assign each oscillator only discrete values of energy, proportional to an arbitrary constant. We know the rest of the story.
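      As a minimal numerical sketch of what the discreteness buys mathematically (my own illustration, not Planck's actual derivation), restricting an oscillator to levels E_n = n·h·ν and averaging with Boltzmann weights gives a mean energy hν/(e^(hν/kT) − 1), which recovers the classical equipartition value kT at low frequency but is exponentially suppressed at high frequency:

```python
import math

h = 6.62607015e-34  # Planck constant, J*s
k = 1.380649e-23    # Boltzmann constant, J/K

def planck_avg_energy(nu, T):
    """Mean energy of an oscillator restricted to levels E_n = n*h*nu,
    Boltzmann-averaged: h*nu / (exp(h*nu/(k*T)) - 1)."""
    x = h * nu / (k * T)
    return h * nu / math.expm1(x)  # expm1 keeps the low-frequency limit accurate

T = 300.0
# Low frequency: ratio to the classical equipartition value kT is close to 1
print(planck_avg_energy(1e9, T) / (k * T))
# High frequency: the mean energy is exponentially suppressed
print(planck_avg_energy(1e15, T) / (k * T))
```

      A continuous energy distribution would give kT at every frequency, which is exactly what the data refused to allow.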

      The gist here is that Planck thought of quantization only as a mathematical trick that would allow him to model the phenomenon. His goal was not to explain the nature of the underlying entities which made up the system. He and others refused to even entertain the idea that the quantization of energy states had anything to do with the entities themselves. When he published his results, he presented the equations as a mathematical tool for modeling the observed phenomenon. He was not making a statement about the nature of the underlying entities themselves.

      It wasn't until Einstein took up the study of the photoelectric effect, and direct experimental proof of the quantization of light was shown to exist, that Planck finally came to grips with the idea that quantization indeed occurs on a fundamental level. He admitted he only grudgingly accepted the notion. It is for this reason, btw, that many consider Einstein the founder of Quantum Theory -- or at least a co-founder. It is also ironic that both men eventually came to reject the theory which arose from their work. Neither would ever accept quantum theory as a complete and accurate representation of nature. Two of the most brilliant men in the history of physics let their prejudices concerning the ontological nature of reality interfere with their acceptance of a theory.

      The point here is not to bloviate but to offer a different perspective on the subject of interpretations. There are two different ways in which we come to alter our perspectives.

      Shifts in understanding may come from areas of research that are largely ignored and still in their infancy. In our current case, I believe such areas include chaos theory, complexity, and emergence. These are realms where processes take on a more fundamental role than discrete substance and represent different ways of tackling an issue. I believe that work in these areas could allow us to make sense of things we may be missing in the transitions from one level of structure to the next.

      We have made a lot of ontological assumptions about the nature of physical reality and how systems are constructed. Many of them are hard to hold onto in the face of current observations of the world beyond our senses. I think we need to ask the question: what is fundamental to nature? The assumption has always been that discrete substance (whatever that means on the subatomic level) is the most fundamental property of nature and is responsible for all observed systems. The notion that substance may not be fundamental may seem absurd and nonsensical, but on close inspection it is not as absurd as one might think -- can it be more absurd than believing that a photon possesses certain ontological properties that allow it to be simultaneously both a wave and a particle?

      Also, I wanted to add that a degree of 'weirdness' is not a valid criterion for accepting or rejecting a theory, and my last statement was not meant to imply that it is. If it were, nobody would ever have accepted Quantum Mechanics. I am pointing out that Nature need not conform to our metaphysical predilections, and we should never proceed from the assumption that a fundamental theory is obliged to jibe with neat and tidy ontological notions concerning physical reality. Nature plays its own game. We don't get to set the rules.