Dr. Leifer

Richard Feynman in his Nobel Acceptance Speech (http://www.nobelprize.org/nobel_prizes/physics/laureates/1965/feynman-lecture.html)

said: "It always seems odd to me that the fundamental laws of physics, when discovered, can appear in so many different forms that are not apparently identical at first, but with a little mathematical fiddling you can show the relationship. And example of this is the Schrodinger equation and the Heisenberg formulation of quantum mechanics. I don't know why that is - it remains a mystery, but it was something I learned from experience. There is always another way to say the same thing that doesn't look at all like the way you said it before. I don't know what the reason for this is. I think it is somehow a representation of the simplicity of nature."

I too believe in the simplicity of nature, and I am glad that Richard Feynman, a famous Nobel-winning physicist, also believed the same thing I do, but I had come to my belief long before I knew about that particular statement.

The belief that "Nature is simple" is, however, expressed differently in my essay "Analogical Engine" (http://fqxi.org/community/forum/topic/1865).

Specifically though, I said "Planck constant is the Mother of All Dualities" and I put it schematically as: wave-particle ~ quantum-classical ~ gene-protein ~ analogy-reasoning ~ linear-nonlinear ~ connected-notconnected ~ computable-notcomputable ~ mind-body ~ Bit-It ~ variation-selection ~ freedom-determinism ... and so on.

Taken two at a time, it can be read as "what quantum is to classical" is similar to (~) "what wave is to particle." You can choose any two from among the multitude of dualities found in our discourses.

I could have put the Schrodinger wave ontology-Heisenberg particle ontology duality in the list had it come to my mind!

Since "Nature is Analogical", we are free to probe nature in so many different ways. And you have touched some corners of it.

Good luck,

Than Tin

Dear Matthew Saul Leifer:

I am an old physician and I know nothing of mathematics and almost nothing of physics, so it is almost impossible for me to give an opinion on your essay. In this contest there are many theories; mine is not one.

Maybe you would be interested in my essay on a subject which, after ordinary people, the discipline of physics uses more than any other: the so-called "time".

I am sending you a practical summary, so you can easily decide whether or not to read my essay "The deep nature of reality".

I am convinced you would be interested in reading it. (Most people don't understand it, and not just because of my bad English.)

Hawking in "A brief history of time" where he said , "Which is the nature of time?" yes he don't know what time is, and also continue saying............Some day this answer could seem to us "obvious", as much than that the earth rotate around the sun....." In fact the answer is "obvious", but how he could say that, if he didn't know what's time? In fact he is predicting that is going to be an answer, and that this one will be "obvious", I think that with this adjective, he is implying: simple and easy to understand. Maybe he felt it and couldn't explain it with words. We have anthropologic proves that man measure "time" since more than 30.000 years ago, much, much later came science, mathematics and physics that learn to measure "time" from primitive men, adopted the idea and the systems of measurement, but also acquired the incognita of the experimental "time" meaning. Out of common use physics is the science that needs and use more the measurement of what everybody calls "time" and the discipline came to believe it as their own. I always said that to understand the "time" experimental meaning there is not need to know mathematics or physics, as the "time" creators and users didn't. Instead of my opinion I would give Einstein's "Ideas and Opinions" pg. 354 "Space, time, and event, are free creations of human intelligence, tools of thought" he use to call them pre-scientific concepts from which mankind forgot its meanings, he never wrote a whole page about "time" he also use to evade the use of the word, in general relativity when he refer how gravitational force and speed affect "time", he does not use the word "time" instead he would say, speed and gravitational force slows clock movement or "motion", instead of saying that slows "time". FQXi member Andreas Albrecht said that. When asked the question, "What is time?", Einstein gave a pragmatic response: "Time," he said, "is what clocks measure and nothing more." He knew that "time" was a man creation, but he didn't know what man is measuring with the clock.

I insist that for "measuring motion" we should always and only use one unique "constant" or "uniform" "motion" to measure the "non-constant motions" which integrate and form part of every change and transformation in every physical thing. Why? Because it is the only kind of "motion" whose characteristics allow it to be divided into equal parts, as the Egyptians and Sumerians did, giving birth to "motion fractions", which I call "motion units": hours, minutes and seconds. "Motion", which is the real thing, was always hidden behind time and covered by its shadow; it was hidden in front of everybody's eyes, for at least two millennia, within reach of almost everybody.

What is the difference in physics between using the so-called time and using "motion"? Time has only been used to measure the "duration" of different phenomena. Why only for that? Because it was impossible for physicists to relate a mysterious time to the rest of the physical elements of known characteristics, without knowing what time is and what its physical characteristics are. On the other hand, "motion" is not something mysterious; it is a quality or physical property of all things, and it can be related to all of them. This is a huge difference, especially for theoretical physics, I believe. As a physician, with this finding I was able to do quite a few things. I imagine a physicist could do marvelous things with it.

With my best wishes,

Héctor

Hi Matt,

This is a really thought-provoking essay. I'm in full agreement with your conclusion vis-à-vis an underlying reality. However, if I'm reading your essay correctly, your definition of noncontextuality seems to differ a bit from the usual Kochen-Specker sense (Ken Wharton tells me you're further developing this?). In my own essay I proposed *contextuality* as a sort of underlying principle, which I guess (?) would be in accord with the notion that noncontextuality must be derivable, but I think that assumes we're using the terms in a similar sense. Could you elaborate on your notion of noncontextuality a bit?

Ian

    There is some subtlety surrounding the terminology of contextuality and noncontextuality, so let me distinguish two types. Gleason noncontextuality is the idea that a generalized probability measure on the set of quantum measurement outcomes should assign the same probability to outcomes that are represented by the same projector. From this assumption and Gleason's theorem we get the set of quantum states and the quantum probability rule. Kochen-Specker noncontextuality is the idea that if we assign a deterministic outcome to each measurement then whether or not an outcome occurs only depends on the projector that represents it. It is not possible to find a hidden variable model of quantum theory that satisfies this, which we often summarize by saying that "quantum theory is contextual".
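
    To spell that out (this is just the standard textbook statement, not anything specific to my essay): a generalized probability measure is a map $\mu$ from projectors on a Hilbert space $\mathcal{H}$ to $[0,1]$ satisfying

    $$\mu(I) = 1, \qquad \mu(P + Q) = \mu(P) + \mu(Q) \quad \text{whenever } PQ = 0,$$

    and Gleason noncontextuality is what lets us define $\mu$ as a function of the projector alone, rather than of the outcome together with the measurement it appears in. Gleason's theorem then says that, for $\dim \mathcal{H} \geq 3$, every such measure has the form $\mu(P) = \mathrm{Tr}(\rho P)$ for a unique density operator $\rho$, which is how the quantum state space and the quantum probability rule fall out of the assumption.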

    Quantum theory satisfies Gleason noncontextuality, but not Kochen-Specker noncontextuality, and it is the former that I was talking about in my essay. Of course, the two notions are related, as pointed out by Bell. Because the Gleason noncontextual measures must be quantum states in dimension 3 or higher, this implies that there can be no Kochen-Specker noncontextual assignments in these dimensions. This is because an assignment of definite outcomes is a special case of a generalized probability measure, but all the Gleason noncontextual measures assign probabilities other than zero or one to at least some measurements, so there can be no noncontextual assignment of definite outcomes.
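
    To make the last step explicit (again, just the standard observation): if a Gleason noncontextual measure were $\{0,1\}$-valued, Gleason's theorem would give $\mu(P) = \mathrm{Tr}(\rho P)$ for some density operator $\rho$, but for any $\rho$ there is a unit vector $|\psi\rangle$ with $0 < \langle\psi|\rho|\psi\rangle < 1$ (take an eigenvector with eigenvalue strictly between 0 and 1 if $\rho$ is mixed, or an equal superposition of the state with an orthogonal vector if $\rho = |\phi\rangle\langle\phi|$ is pure), so $\mu$ cannot assign only zeros and ones.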

    One of the reasons that Gleason noncontextuality is not often emphasised is that it is often built into the assumptions of the operational frameworks used to derive quantum theory. We say that two outcomes are operationally equivalent if they are always assigned the same probability for all states, and then we go ahead and identify equivalent outcomes. However, this approach assumes that probability is a primitive, or that it is identified with something empirically observable like statistical frequencies. This is not appropriate on a subjective Bayesian interpretation of probability, which requires an explanation for why we should assign identical probabilities to these things. Arguably, even on other interpretations of probability it would be better to have an explanation, but whether it is strictly needed in a derivation of quantum theory depends on which assumptions we view as part of the background framework, and which we view as substantive.

    Thanks for the clarification. Personally I think there's more to the Kochen-Specker result than meets the eye. Or, rather, there's really a third possible view of (non)contextuality that has a limited Bayesian subjectivity to it, i.e. there are degrees of dependence of the projectors on other projectors (if that makes any sense).

    Dear Matthew,

    One single principle leads the Universe.

    Every thing, every object, every phenomenon

    is under the influence of this principle.

    Nothing can exist if it is not born in the form of opposites.

    I simply invite you to discover this in a few words,

    but the main part is coming soon.

    Thank you, and good luck!

    I rated your essay according to my appreciation.

    Please visit My essay.

    Dear Dr. Leifer,

    I found your essay to be very lucid and informative! I particularly enjoyed the use of the Dutch book argument to explain subjective Bayesian probability.

    A realist approach to physics is possible if quantum potential is construed as constituting "objectively existing external reality." Counterfactual assertions can be applied to all possible paths calculated by the Lagrangian, in which probabilities are bound globally in time.

    Contextual information, on the other hand, arises from the conditional entropy of the observer, who is locally part of the path selection process. The observer iteratively generates knowledge by erasing the entanglement information that encodes the quantum potential. (See my essay "A Complex Conjugate Bit and It".) In a sense, then, contextuality and noncontextuality are two sides of the same coin acting in reciprocity with each other.

    Best wishes,

    Richard

    Dear Matt,

    My admiring post, comments on commonality and 'thought experiment' seemed to upset you (July 5th). I sincerely apologise if it did. You say you mainly use abstracts to select essays to read. In my case this may be wrongly judging a book by its cover. A few have said the abstract is too dense but the essay excellent; indeed blog comments include: "groundbreaking", "clearly significant", "astonishing", "fantastic job", "wonderful", "remarkable!", "deeply impressed", etc.

    I do hope I can prevail on you to check it over and offer any views and advice.

    Thank you kindly and very best wishes

    Peter

      Matt,

      I've just re-read your essay in a 'round up' and confirm my initial favourable impressions. I think it very relevant and complementary to mine, particularly the Bayesian interpretation and the conclusion that "there is something wrong with our basic framework for realist models of quantum theory. The right framework ... should reveal that quantum theory is not nonlocal after all"!!

      I'm sad you haven't read mine, but note you choose mainly by abstract. I'm told my dense abstract has put some off. I hope I may persuade you to ignore it in this instance by quoting from my blog posts, which include:

      "groundbreaking", "remarkable!", "clearly significant", " fantastic job", "wonderful essay", "deeply impressed", "valuable contribution", "Technically challenging and philosophically deep", "Rubbish", etc. (ok, I made that last one up!).

      I hope you don't judge my book by its cover, as I think we may even be on the brink of an important step forward, or even a quantum leap!

      Best wishes

      Peter

      Dear Matthew,

      We are at the end of this essay contest.

      In conclusion, to the question of whether Information is more fundamental than Matter, there is good reason to answer that Matter is made of an amazing mixture of eInfo and eEnergy at the same time.

      Matter is thus eInfo made with eEnergy, rather than saying it is made with eEnergy and eInfo; because eInfo is eEnergy, and the one does not go without the other.

      eEnergy and eInfo are the two basic Principles of the eUniverse. Nothing can exist if it is not eEnergy, and any object is eInfo, and therefore eEnergy.

      And consequently our eReality is eInfo made with eEnergy. And the final verdict is: eReality is virtual, and virtuality is our fundamental eReality.

      Good luck to the winners,

      And see you soon, with good news on this topic, and the Theory of Everything.

      Amazigh H.

      I rated your essay.

      Please visit My essay.

      Late-in-the-Day Thoughts about the Essays I've Read

      I am sending to you the following thoughts because I found your essay particularly well stated, insightful, and helpful, even though in certain respects we may significantly diverge in our viewpoints. Thank you! Lumping and sorting is a dangerous adventure; let me apologize in advance if I have significantly misread or misrepresented your essay in what follows.

      Of the nearly two hundred essays submitted to the competition, there seems to be a preponderance of sentiment for the 'Bit-from-It' standpoint, though many excellent essays argue against this stance or advocate for a wider perspective on the whole issue. Joseph Brenner provided an excellent analysis of the various positions that might be taken with the topic, which he subsumes under the categories of 'It-from-Bit', 'Bit-from-It', and 'It-and-Bit'.

      Brenner himself supports the 'Bit-from-It' position of Julian Barbour as stated in his 2011 essay that gave impetus to the present competition. Others such as James Beichler, Sundance Bilson-Thompson, Agung Budiyono, and Olaf Dreyer have presented well-stated arguments that generally align with a 'Bit-from-It' position.

      Various renderings of the contrary position, 'It-from-Bit', have received well-reasoned support from Stephen Anastasi, Paul Borrill, Luigi Foschini, Akinbo Ojo, and Jochen Szangolies. An allied category that was not included in Brenner's analysis is 'It-from-Qubit', and valuable explorations of this general position were undertaken by Giacomo D'Ariano, Philip Gibbs, Michel Planat and Armin Shirazi.

      The category of 'It-and-Bit' displays a great diversity of approaches, which can be seen in the works of Mikalai Birukou, Kevin Knuth, Willard Mittelman, Georgina Parry, and Cristinel Stoica.

      It seems useful to discriminate among the various approaches to 'It-and-Bit' a subcategory that perhaps could be identified as 'meaning circuits', in a sense loosely associated with the phrase by J.A. Wheeler. Essays that reveal aspects of 'meaning circuits' are those of Howard Barnum, Hugh Matlock, Georgina Parry, Armin Shirazi, and especially that of Alexei Grinbaum.

      Proceeding from a phenomenological stance as developed by Husserl, Grinbaum asserts that the choice to be made of either 'It from Bit' or 'Bit from It' can be supplemented by considering 'It from Bit' and 'Bit from It'. To do this, he presents an 'epistemic loop' by which physics and information are cyclically connected, essentially the same 'loop' as that which Wheeler represented with his 'meaning circuit'. Depending on where one 'cuts' the loop, antecedent and precedent conditions are obtained which support an 'It from Bit' interpretation, or a 'Bit from It' interpretation, or, though not mentioned by Grinbaum, even an 'It from Qubit' interpretation. I'll also point out that depending on where the cut is made, it can be seen as a 'Cartesian cut' between res extensa and res cogitans or as a 'Heisenberg cut' between the quantum system and the observer. The implications of this perspective are enormous for the present It/Bit debate! To quote Grinbaum: "The key to understanding the opposition between IT and BIT is in choosing a vantage point from which OR looks as good as AND. Then this opposition becomes unnecessary: the loop view simply dissolves it." Grinbaum then goes on to point out that this epistemologically circular structure "...is not a logical disaster, rather it is a well-documented property of all foundational studies."

      However, Grinbaum maintains that it is mandatory to cut the loop; he claims that it is "...a logical necessity: it is logically impossible to describe the loop as a whole within one theory." I will argue that in fact it is vital to preserve the loop as a whole and to revise our expectations of what we wish to accomplish by making the cut. In fact, the ongoing It/Bit debate has been sustained for decades by our inability to recognize the consequences that result from making such a cut. As a result, we have been unable to take up the task of studying the properties inherent in the circularity of the loop. Helpful in this regard would be an examination of the role of relations between various elements and aspects of the loop. To a certain extent the importance of the role of relations has already been well stated in the essays of Kevin Knuth, Carlo Rovelli, Cristinel Stoica, and Jochen Szangolies although without application to aspects that clearly arise from 'circularity'. Gary Miller's discussion of the role of patterns, drawn from various historical precedents in mathematics, philosophy, and psychology, provides the clearest hints of all competition submissions on how the holistic analysis of this essential circular structure might be able to proceed.

      In my paper, I outlined Susan Carey's assertion that a 'conceptual leap' is often required in the construction of a new scientific theory. Perhaps moving from a 'linearized' perspective of the structure of a scientific theory to one that is 'circularized' is just one further example of this kind of conceptual change.

      Hello Matthew

      I found your discussion of Bayesianism quite fascinating. You said that there is nothing in logic that tells you what premises you have to start with.

      In my essay, one begins with all possible propositions. How this connects with yours, I am not quite sure yet. Logic is subordinate to the General Principle of Equivalence, as is every proposition, and the GPE filters out all propositions but two at the first step.

      I found the last parts quite hard to follow, but liked the main theme.

      Best wishes

      Stephen Anastasi

      Really excellent essay, Matthew, one of the very best here.

      I'm not sure I'd agree that quantum theory represents a situation in which there is a failure of the normative force of standard probability theory... I think that whatever normative force it has still holds within contexts, and shouldn't be expected to hold across contexts, so I prefer not to say that quantum theory (and other operational theories I work with) uses "generalized probability theory". But I suspect this is more a matter of terminological taste than a substantive difference with you.

      I also tend to be more accepting of the notion that "that's just the empirically verified way things are" than you are, regarding the noncontextuality of quantum probabilities. However, I also think there are sometimes aspects of reality, involving the macroscopic structure of the experiments we do to measure quantum observables (which, it is always good to remember, is usually quite a bit trickier than we assume in the idealized "you can measure any Hermitian operator" quantum-information-theoretic, or operational quantum theory, context), that suggest identifying certain outcomes across different contexts. This is particularly convincing when one can change the context by tweaking parts of the apparatus associated with other outcomes, while not tweaking the part associated with the outcome being identified across contexts. E.g. an interference experiment in which we don't change things on the pathway leading to one photodetector, but mess about with phase shifters and beamsplitters on the other paths (and don't interfere these with the former path). But I agree it's reasonable to look for deeper explanations of noncontextuality of probabilities...
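
      A toy numerical illustration of the kind of identification I have in mind (just a sketch I'm adding for concreteness, not anything from either essay): a three-path interferometer in which path 0 goes straight to its detector while the beamsplitters and phase shifters act only on paths 1 and 2. Changing those optics changes the measurement context, but the detection probability on path 0 is untouched.

      ```python
      import numpy as np

      def block_unitary(u2):
          """3x3 unitary acting as the identity on path 0 and as u2 on paths 1 and 2."""
          U = np.eye(3, dtype=complex)
          U[1:, 1:] = u2
          return U

      def beamsplitter(theta, phi):
          """A generic 2x2 unitary: a beamsplitter of angle theta with phase phi."""
          return np.array([[np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
                           [np.exp(-1j * phi) * np.sin(theta), np.cos(theta)]])

      psi = np.array([0.6, 0.48j, 0.64], dtype=complex)  # input amplitudes, already normalized

      for theta, phi in [(0.0, 0.0), (0.3, 1.1), (1.2, -0.7)]:
          out = block_unitary(beamsplitter(theta, phi)) @ psi
          # The probability at the path-0 detector is 0.36 in every context.
          print(round(abs(out[0]) ** 2, 6))
      ```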

      Incidentally, the improvements in Zurek's derivation in your citation [32], vs. [31], originated in discussions between Jerry Finkelstein and me, and appear in http://arxiv.org/abs/quant-ph/0312150. I'm still not convinced that these sorts of arguments, or the Wallace-type ones, provide a fully convincing derivation of quantum probability (with its noncontextuality) within many-worlds, though. The Wallace axioms don't seem to me to have normative force in a world with only superpositions, not actual outcomes. Perhaps "no-signalling" (which I viewed as the key ingredient of the Zurek-style argument in 0312150) is better motivated... but I feel like the arguments there, though formally correct, need to be motivated from a more thoroughly many-worlds point of view.

        Dear Matthew - My goal was to review and rate every single essay on this site. I thought I had finished a few hours ago, but my assistant just checked off the whole list, and discovered that I had missed you. I will fix this and review it now (may take me an hour or more to do a good job). I promise I will get a rating in before the contest closes in 3 1/2 hours from now. From skimming through it quickly, I can already tell it looks good, but in the spirit of academic integrity, I need to read it properly in order to rate it accurately.

        In the meantime, I suspect you were not even aware of my essay, in which case, I would be honored if you would take a look at it.

        You can find the latest version of it here:

        http://fqxi.org/data/forum-attachments/Borrill-TimeOne-V1.1a.pdf

        (sorry if the fqxi web site splits this url up, I haven't figured out a way to not make it do that).

        Kind regards, Paul

        p.s. I will delete and replace this message with my actual comments later.

        Matthew - yours was the last of the over 180 essays I reviewed. I apologize for overlooking it earlier.

        As the wormhole is about to close ... this is a superb piece of work, clearly one of the most outstanding essays on this site.

        It would appear to go hand in hand with the essay by Mark Freely: a beautiful explication of what probability theory means, and how we (as physicists) abuse the mathematics without quite knowing what it means.

        As a realist, I totally resonated with your conclusion: extra physical principles are needed in the agent-independent reality in order to justify the faith we have in probability.

        Well done. I believe this is a major contribution to the debate.

        Kind regards, Paul

        "I'm not sure I'd agree that quantum theory represents a situation in which there is a failure of the normative force of standard probability theory"

        It is pretty important for me to view things this way from a rhetorical point of view, even if it does not make much difference to the mathematics. The reason is that many Bayesians seem to think that their derivations of probability theory demonstrate that ordinary classical probability theory, with everything defined on a single sample space, does have normative force everywhere and always. Jaynes certainly thinks this, but I have seen it from subjective Bayesians as well. One of my main points is that the normative force fails in much more prosaic situations than quantum theory. In fact, I would argue that it fails almost all the time. The vast majority of rational decisions we make in our lives are not ones for which probabilities and utilities can or need be articulated precisely, e.g. a choice of which job offer to accept. The principles of decision theory still hold in these situations, but the space of possible decision scenarios is not rich enough to imply that probability theory must be used.
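
        As a concrete reminder of where the normative force comes from when it does apply (a completely standard toy Dutch book, nothing new): an agent who posts betting quotients on mutually exclusive, exhaustive outcomes that do not sum to one can be sold a book of bets that loses money whichever outcome occurs.

        ```python
        # Toy Dutch book. The agent quotes a betting rate p_i for each of a set of
        # mutually exclusive, exhaustive outcomes: a unit-stake bet on outcome i
        # costs p_i and pays 1 if i occurs. If the rates sum to more than 1, a
        # bookie sells the agent a bet on every outcome; the agent pays sum(p_i)
        # and receives exactly 1, whatever happens. (If they sum to less than 1,
        # the bookie buys the bets instead.)
        rates = [0.5, 0.4, 0.3]   # incoherent: sums to 1.2

        cost = sum(rates)
        for winner in range(len(rates)):
            net = 1.0 - cost      # exactly one outcome occurs and pays out
            print(f"outcome {winner}: net = {net:+.2f}")   # -0.20 in every case
        ```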

        In the case of quantum theory, I am pretty sure that the QBists have fallen into the trap of thinking that probability theory is always normative. That is why they place so much emphasis on viewing quantum theory as just ordinary probability with an extra empirical constraint rather than as a generalization of probability theory. This is part of the reason why they have gone down the SIC-POVM rathole rather than sticking to their earlier approach based on Gleason's theorem. It is pretty important for me to distance myself from this.

        "I also tend to be more accepting of the notion that "that's just the empirically verified way things are", than you are, regarding the noncontextuality of quantum probabilities."

        In that case I would ask you why the situation here is different from that of the equilibrium distribution in statistical mechanics. If this type of argument is acceptable for quantum theory then why not for stat. mech. as well?

        "However, I also think there are sometimes aspects of reality, involving the macroscopic structure of the experiments we do to measure quantum observables [..], that suggests identifying certain outcomes across different contexts."

        I agree with you, but my point is that we have to articulate exactly what this is. For example, consider a Stern-Gerlach measurement. It is pretty clear that if we rotate the device 180 degrees then we should consider the new "up" outcome as physically equivalent to the old "down" outcome and vice versa. This is actually not too different from what goes on with all noncontextually related outcomes, except that they are related by rotations of a subspace rather than the whole Hilbert space. The question is what goes wrong if we don't do this. Now, if we had a deterministic physics and we could argue from the way that the Stern-Gerlach device operates that the down outcome in the rotated experiment happens iff the up outcome would have happened had we made the non-rotated experiment, then we could derive a sure loss for anyone who believes that the device operates this way with certainty. However, the Kochen-Specker theorem blocks such assertions in the quantum case. Therefore, the question is to articulate precisely what we mean by saying that two outcomes are physically equivalent, if not that they always yield the same outcome. Saying that they always have the same probability is not enough if you do not believe that probabilities are objective physical facts.
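
        To make the Stern-Gerlach example concrete (a quick numerical check I'm adding, nothing deeper): a 180-degree rotation about the x-axis maps the spin-1/2 "up" projector onto the "down" projector and vice versa, which is the relabelling of outcomes being identified across the two orientations of the device.

        ```python
        import numpy as np

        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        up = np.array([[1, 0], [0, 0]], dtype=complex)    # projector onto spin-up along z
        down = np.array([[0, 0], [0, 1]], dtype=complex)  # projector onto spin-down along z

        # 180-degree rotation about x: R = exp(-i*pi*sx/2) = -i*sx
        # (the overall phase drops out under conjugation).
        R = -1j * sx

        print(np.allclose(R @ up @ R.conj().T, down))     # True
        print(np.allclose(R @ down @ R.conj().T, up))     # True
        ```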

        "I'm still not convinced that these sorts of arguments, or the Wallace-type ones, provide a fully convincing derivation of quantum probability (with its noncontextuality) within many-worlds, though."

        I have my own variant of this sort of derivation that I might write up some day, but there was not space to get into it in the essay. One of the issues with Wallace is that his axioms mix up things that have to do with pure decision theory with those that have to do with the structure of the way branching works in many-worlds quantum theory. I prefer an approach that starts with something like the Greaves-Myrvold axioms, or indeed just the naive Dutch book described in my essay. Then, once we have derived that we have probability on a semi-classical test space, we can impose a couple of axioms about indifference to certain aspects of the branching structure. This actually allows you to derive the noncontextuality of probability assignments, from which we get quantum probabilities via Gleason. Interestingly, this approach avoids all the Lewis nonsense in Wallace, as it does not depend on the actual amplitudes in the wavefunction. We get a purely subjective Bayesian account of probability from this.

        I am not sure that my derivation would convince you any more than the existing ones, but it does remove a number of objections. For example, Adrian Kent's argument that the amplitudes are just "numbers in the sky" which should not constrain rational decision making is avoided by saying that amplitudes do not constrain probabilities (only the branching structure does). Similarly the argument that probabilities should match world-counts is evaded by taking the position that probabilities are no more frequencies in the quantum case than they are in the classical case.

        Note also that, since amplitudes do not constrain probabilities in my approach, it may be possible to dispense with them as part of objective reality entirely, thus saving the epistemic interpretation of the wavefunction. The idea would be that the branching structure of the wavefunction is objectively real, but the precise values of the amplitudes are not. This is quite an appealing prospect to me as a way of showing that at least one realist interpretation of quantum theory with epistemic quantum states exists, although I think non many-worlds options should be investigated before we psi-epistemicists jump on this as a preferred interpretation.

        Matt, I'd love to see your derivation of probabilities in an Everett-style theory sometime. When you write:

        "For example, Adrian Kent's argument that the amplitudes are just "numbers in the sky" which should not constrain rational decision making is avoided by saying that amplitudes do not constrain probabilities (only the branching structure does)."

        I think Adrian's objection is close to my objections to the Wallace-style derivations.... everything happens, and why should I care what numbers are attached to the branches. But I also thought it was important to look at the specific Wallace axioms, and discuss in more detail why they aren't compelling...something I've never really gotten around to.

        Re Myrvold-Greaves...I only skimmed it, guess I should have a closer look.

        About Stat mech:

        "HB "I also tend to be more accepting of the notion that "that's just the empirically verified way things are", than you are, regarding the noncontextuality of quantum probabilities."

        ML: In that case I would ask you why the situation here is different from that of the equilibrium distribution in statistical mechanics. If this type of argument is acceptable for quantum theory then why not for stat. mech. as well? "

        Well, I don't think one shouldn't look for possible realistic descriptions underlying quantum theory (certainly the measurement outcomes --- at least to a non-Everettian like myself --- are real). But the difficulty we are having coming up with such things at least raises the possibility that with quantum theory they may not be forthcoming... or rather that they may mix the macroscopic and the microscopic, agents and systems, it and bit, in ways we haven't figured out how to conceptualize yet... If we had to settle for empirical success in the case of stat mech, then I'd settle... of course if we can say more, a more that is compatible with lots of other physics that is not described by equilibrium distributions, that is good... It has been good for me over the last few days to think about these issues with Bohmian mechanics in mind, too.

        The business about symmetries of measuring apparatus (or other structures) justifying noncontextual probability assignments resonates, just a bit, with the idea that we need to look toward a meatier, more spacetime conception of quantum theory (more aligned with the broader field of physics) than we sometimes do coming from the quantum information / operational theories perspective, in which we just have a complex Hilbert space and can do whatever unitary, POVM, or CP map on it we decide to... some idea that the "reality" of particles resides more in symmetries that are manifested by observable phenomena than in "microscopic entities" similar to little bits of furniture or fluff... just musing out loud here, more or less.

        Cheers,

        Howard