Dear Matt,

I want to be clear and to be able to quote you on this, as I am not sure you gave a committal answer above:

Question: Is existence/non-existence on the list of possible binary choices? That is, 'is IT existing or not existing' among the questions that can be asked in the game of twenty questions?

Regards,

Akinbo

    Within the context of the essay, there are two possible positions that make sense to me. One could be completely operational about things and say that the only things that exist are the readings on our experimental apparatus, e.g. detector clicks, instrument readings and so forth. In that case, we are assuming that our macroscopic apparatus exist, but not that there is necessarily any deeper reality underlying them. However, the position that I prefer is to assume that there are things called quantum systems that exist independently of us and we are just asking questions about the properties of those systems. In that sense, I am assuming that "it" in the sense of "a quantum system" exists and is not being questioned.

    Of course, one can also consider situations in which you are not sure whether a particular quantum system exists, e.g. a laser typically emits a superposition of different numbers of photons. In that case, it sometimes makes sense to include a question about whether the system exists if you can ask it without destroying the system. For example, there are heralded single photon sources that produce two photons in an entangled state and then one can use a measurement on the first photon as evidence that the second photon exists.

    I hope that answers your question, but of course it depends on precisely what you mean by IT.

    Dear Matthew,

    You have presented a very thoughtful and at times surprisingly subtle analysis of the argument that even a subjective Bayesian view of quantum mechanics must be grounded in some sort of external reality, although this reality need not be understood in terms of classical concepts.

    It seems to me that your argument could provide a starting point through which quantum Bayesianism could attain wider acceptance. As far as I can tell, a major point of criticism is that QBism does not really offer a clear ontology (if any at all), and I imagine most physicists do not see themselves in the business of figuring out degrees of belief. Your approach reminds me a bit of what I read in Howard Barnum's entry, to try to find some sort of neat integration between the subjectivist approach and an objective reality (at least at some level).

    It would seem, then, that the next step is to try to find out how to map the requirement for non-contextuality onto any purported features of the external reality. Is that correct? If it is, how do you intend to accomplish this? Also, your entry whetted my appetite for learning more about the philosophy of probability, about which I know next to nothing (other than what I learned in your article). Can you recommend one or more introductory texts?

    I have developed an original framework myself, and while it seems that it can reproduce major features of QM, I do not at present know how to get the Born Rule out of it. I strongly suspect that it is due to my ignorance in precisely this area, so I intend to change that.

    I enjoyed your essay and wish you all the best,

    Armin

      Thanks for your kind comments about my essay.

      It is important to distinguish two things:

      1) Believing in the subjective Bayesian interpretation of quantum theory and wanting to employ it in understanding the probabilities that arise in quantum theory.

      2) Being a QBist, i.e. believing in what Caves, Fuchs and Schack have called quantum Bayesianism.

      Quantum Bayesianism, or QBism, is a very specific work-in-progress interpretation of quantum theory, in the category that I call neo-Copenhagen. It employs a subjective Bayesian interpretation of probabilities, but goes way beyond that in denying the existence of a deeper reality underlying quantum theory.

      In contrast, just believing in subjective Bayesian probability does not imply any of that. As I mentioned in the essay, you can apply it in the many-worlds interpretation, and it would also be unproblematic in Bohmian mechanics, at least if the probabilities of statistical mechanics can be interpreted subjectively, because the quantum probabilities there are of the same type.

      My view is quite far from QBism because it ends up accepting realism of a fairly straightforward type. Fuchs and company are dead against this. It is rather unfortunate that they have co-opted the phrase "quantum Bayesianism" to refer to their approach, since by rights that should refer to any attempt to apply Bayesian probability and statistical methodology to quantum theory. As it happens, I did obtain my first exposure to Bayesian probability from Fuchs et al. I am very grateful for that and probably would not be doing quantum foundations were it not for Fuchs' eloquent papers and talks, but my view is definitely not a form of QBism. If my view gains more acceptance than QBism then I would be happy about that, but I doubt it would please the QBists.

      You are right that the next step is to try and construct an ontology in which noncontextuality emerges naturally. This is not true in Bohmian mechanics for example, since a generic probability distribution over the positions of particles would be contextual. The reason that I mentioned many-worlds in the essay is that I believe that the Deutsch-Wallace argument does achieve noncontextuality naturally. However, this is not too surprising because they assume the objective existence of the wavefunction and only the wavefunction, and this is the thing that encodes all the information about the quantum probabilities in the first place. As a proponent of the epistemic interpretation of the wavefunction this would not be where I would start. Instead, I would look at more exotic single-world ontologies than are usually encountered in the literature, such as those involving retrocausality.

      Finally, regarding reading on the foundations of probability, I wrote a blog post with a reading list some time ago: http://mattleifer.info/2010/11/01/a-reading-list-on-the-foundations-of-probability-and-statistics/ I update it periodically with mini reviews as I read things, but I haven't done so for some time, meaning that I have now read a lot of the things marked UNREAD. For the philosophy of probability, I would recommend starting with Donald Gillies' textbook and then the collection of essays edited by Antony Eagle. The former is the best overview, but is not perfect and is lacking in coverage of many contemporary issues. The latter remedies this as well as containing many of the classic papers.

      Dear Matthew,

      Thank you for clarifying the distinctions. I looked at the reading list and it is far more extensive than I imagined. Thank you!

      Armin

      Hi Matthew -

      Your discussion of subjectivity is crucial to the subject of this contest. It is interesting to consider how we might apply the concept to an evolving observer, one who makes decisions at every moment, and over a very long period of time, during which his relation to the physical world - his own biological configuration, if you will - is continuously altered.

      All conceivable decision scenarios are then possible, as you say in referring to It from Bit - 'with only certain subsets of all possible bets being jointly resolvable'.

      If evolution affects us at every moment (and it is impossible to argue that it doesn't) then It from Bit is true: We live in a Species Cosmos that is being evolved from ourselves. However, it can and should be countered that we do seem to possess a certain objectivity - that Bits appear to be founded upon a reality greater than the continually evolving Species Cosmos - a reality where our logical and scientific parameters are less applicable, and often not applicable at all.

      You might be interested to see how I treat this evolutionary argument as a realist interpretation of the field of reality, thus expanding the definitions of It and Bit far beyond those signified by Wheeler. I think you touch on this when you allude to '"it from bit from it", where the first "it" refers to classical ontology and the second refers to quantum stuff.'

      I give this It-Bit-It sequence a structure you might find useful.

      All the Best,

      John.

        Dear Matthew,

        Excellent essay. I like the realist views and how you back up this approach. Moreover, I really think you raise an excellent point and are dead right that we also need to consider the likes of wave functions, so it isn't as straightforward as It from Bit or vice versa.

        Hopefully you'll get a chance to look at my essay, which I hope you find of interest too.

        Best wishes & congratulations on your essay,

        Antony

          Hi Matt,

          Excellent essay! I never know quite what to make of it when someone carefully reasons out an argument that (in the end) agrees with my reasoning, but clearly is far more detailed and careful than my own considerations. Did I really just get lucky through sloppy reasoning? Are both arguments merely rationalizations, trying to justify a previously-held common conclusion? Or had I really worked out a parallel, simpler argument?

          In your case, at least, I'm pretty confident it's not the middle option... But I'm going back and forth between the other two.

          To the extent I think I might have a parallel argument, it basically revolves around the concept of "scientific explanation". By the end of section 6, after all, I think that your argument has basically boiled down to 1) noncontextuality as a necessary explanation for the Born rule, and 2) physical reality as a necessary explanation for noncontextuality. If I'm right, then the simpler, parallel argument would be 3) physical reality as a necessary explanation for the Born rule (and for that matter, any observed correlations in our universe). I guess the question then for you is this: Is there *any* set of observable (and reproducible) correlations that wouldn't need to have an explanation grounded in a physical reality? I would instinctively say no, but am curious as to your take on this.

          My only complaint is that you threw too large of a bone to the "It from Bit" concept at the very end... If classical ontology is put together in our heads, then it's not really an "it", is it? :-)

          Now, you just need to write up that general definition of contextuality you were telling me about, so that you can better explain the seeming disconnect between the (non)contextuality of "value assignments" and "probability assignments"!

          Cheers!

          Ken

          PS: After just reading a few of your responses to the above questions, I'm guessing that your answer to my above question will be that it depends on whether one is a "realist" or not! So maybe I did just 'get lucky' by assuming realism from the outset. I applaud your efforts to convert anti-realists, as I don't seem to have the faintest clue where to begin on that front...

            When attempting to argue against anti-realists, I think it is more effective to try and take all their assumptions on board and then show that it leads to a contradiction than to try and argue for realism a priori on the basis of what constitutes a "scientific explanation" or something like that. Of course, most realists will agree with such a priori arguments, but that is just preaching to the choir. An anti-realist is, by definition, someone who does not agree with such arguments and thinks that they have valid counterarguments, so they can just go ahead and adopt something Copenhagen-like anyway. If you want to persuade them then you have to argue on their home turf.

            Regarding correlations that don't require a physical explanation, I would say that, technically speaking, the probabilities on a "semi-classical test space" are of this sort. This is where you just have N different betting contexts, each of which gets assigned its own independent classical probability distribution and there are no additional constraints. In this case, all the constraints are coming from the Dutch book argument, so they don't depend on any assumptions about a pre-existing reality. You may be inclined to reject such cases as uninteresting because they are only a slight modification of classical probability theory, but, as I mentioned in a footnote in the essay, such a system can be thought of as half of a nonlocal box, so interesting things can happen with such systems when we compose them. In fact, I would argue that the nonsignalling condition does have a viable physical explanation if one accepts fundamental Lorentz invariance, so in that case we could regard the PR-box correlations as an example of a set of probabilities that does not require additional physical explanation beyond what we already have.
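
            For concreteness, here is a minimal sketch of the kind of correlations I have in mind (written in Python purely as an illustration; the numbers are just the standard PR-box table, not anything from the essay). Each pair of inputs (x, y) is a separate betting context carrying its own classical distribution over the outputs, and the only constraints beyond normalization are the nonsignalling conditions on the marginals:

import itertools

def pr_box(a, b, x, y):
    # Popescu-Rohrlich box: the outputs satisfy a XOR b = x AND y, each such pair with probability 1/2.
    return 0.5 if (a ^ b) == (x & y) else 0.0

for x, y in itertools.product((0, 1), repeat=2):
    # Each betting context (x, y) carries a normalized classical distribution over (a, b).
    assert sum(pr_box(a, b, x, y) for a in (0, 1) for b in (0, 1)) == 1.0
    # Nonsignalling: Alice's marginal does not depend on Bob's input, and vice versa.
    for a in (0, 1):
        assert sum(pr_box(a, b, x, y) for b in (0, 1)) == sum(pr_box(a, b, x, 1 - y) for b in (0, 1))
    for b in (0, 1):
        assert sum(pr_box(a, b, x, y) for a in (0, 1)) == sum(pr_box(a, b, 1 - x, y) for a in (0, 1))

print("PR-box: normalized in each betting context and nonsignalling.")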

            Thanks for your comments. Being an advocate of the epistemic interpretation of the wavefunction, I don't actually think that the wavefunction is a viable contender for the fundamental ontology, but I mentioned it in the essay because it is an obvious counterexample to the idea that the "it" has to be made of things that we view as real in classical physics, like particles and fields. I believe that the "quantum stuff" is something more exotic that does not yet appear in contemporary interpretations of quantum theory.

            Thanks for your comments. I think this is related to the longstanding debate about diachronic coherence in the literature on subjective Bayesian probability. Basically, the issue is about whether one can regard "you now" and "you at some point in the future" as one and the same agent. If you answer yes then it would be irrational to do something now that you believe with certainty that the future version of you will regard as irrational. Such kinds of argument are necessary to derive Bayesian updating as a rationality constraint. I have always been in the camp that regards diachronic coherence as unfounded. At most you can derive constraints on what you now expect that the future version of you will do, and not on what the future version of you actually should do. If this is the case then constraints on how probabilities evolve need to be grounded in physical reality rather than just rationality, but I believe that this point is already made just by considering alternative measurements at a single instant of time.

            Hi Matt,

            I really enjoyed reading your essay. While you defined the event space of subjective Bayesian probability as a discrete set, you mentioned that Newtonian mechanics seems to require a continuous event space. I am a little confused on this point. How do you explain that? Also, is there any difference, or any technical or mathematical problem, in extending from the discrete set to the continuous one? For example, in functional analysis there are many nontrivial results that hold in the continuous setting but not in the discrete one, and vice versa.

            Best wishes,

            Yutaka

              The example I gave from Newtonian mechanics does employ a continuous ontological state space, although we might imagine betting on some coarse graining of that, in which case the event space would be discrete.
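
              Just to illustrate what I mean by betting on a coarse graining, here is a toy sketch (in Python, with an arbitrarily chosen distribution, purely as an illustration): the ontic variable is continuous, but the events bet on are a finite set of cells, each carrying an ordinary discrete probability.

import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a continuous ontic variable, e.g. a particle position confined to [0, 1).
positions = rng.beta(2.0, 5.0, size=100_000)

# Coarse grain the continuum into four cells; the cells form the discrete event space.
edges = np.linspace(0.0, 1.0, 5)
counts, _ = np.histogram(positions, bins=edges)
probs = counts / counts.sum()

for (low, high), p in zip(zip(edges[:-1], edges[1:]), probs):
    print(f"P(x in [{low:.2f}, {high:.2f})) = {p:.3f}")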

              Nevertheless, there are of course additional issues that come up in the case of a sample space of infinite cardinality, even for a countably infinite space let alone the continuum. In classical probability, the main additional requirement is Kolmogorov's axiom of countable additivity. There are Dutch book arguments for countable additivity, but of course they involve considering bets on an infinite sequence of events. The status of these arguments depends on whether you view the Dutch book argument as the true operational definition of probability or merely as useful window dressing to help us understand why degrees of belief must satisfy probability theory. In the former case, there could be some trouble with considering sequences of bets that it is not practical for someone to actually enter into. Bayesians of this stripe, which include de Finetti, typically argue that probabilities should only be required to be finitely additive.
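
              The standard toy example of what is at stake here (it has nothing to do with quantum theory specifically; it is often associated with de Finetti's defence of finite additivity) is a "uniform" degree of belief over a countably infinite lottery:

\[
  P(\{n\}) = 0 \quad \text{for every } n \in \mathbb{N}, \qquad P(\mathbb{N}) = 1 .
\]

              Such an assignment can be extended to a finitely additive probability, but it violates countable additivity, since \(\sum_{n} P(\{n\}) = 0 \neq 1 = P\left(\bigcup_{n} \{n\}\right)\).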

              Whatever you think about this, the point is that it is equally an issue with classical probability theory and has nothing to do with the specific application to quantum theory. It is worth noting that most other interpretations of probability theory also have problems with the Kolmogorov axioms. For example, von Mises' version of frequentism, which is probably the most popular one, also does not support countable additivity.

              I don't think there really are any issues of this sort that are unique to quantum theory, at least at the level I am discussing it in my essay. Of course, to get quantum theory, the things we are assigning probabilities to must have the structure of the closed subspaces of Hilbert space, the projections in a von Neumann algebra, or something of that sort. The major part of a derivation of quantum theory would be to derive those structures and assumptions in functional analysis will come up there. However, in my essay I focussed on how probabilities should be assigned once we know the structure of the betting contexts, in which case those structures are already being assumed.

              Dear Matt,

              I very much enjoyed your essay. It contains a very profound part on probabilities, but it is also deep from the viewpoint of physics and philosophy. Indeed, the standard view on probabilities should be reviewed and generalized, to allow "a bunch of probability distribution, rather than just one". What mysteries remain in quantum mechanics, after this revision is done?

              I also like the angle your essay takes on the main theme of the contest, namely that 'it from bit' is compatible with your proposed 'bit from it' by the scheme "it-as-quantum-stuff => bit => it-as-particles-and-fields".

              There are many points of your essay that I would like to understand better, so I intend to reread it, and also more of your writings in this direction.

              Best regards,

              Cristi Stoica

              Hi Matthew,

              Exactly how I interpreted your essay. Well done on great work!

              Antony

              Dear Matthew Leifer,

              I have downloaded your essay and will soon post my comments on it. Meanwhile, please go through my essay and post your comments.

              Regards and good luck in the contest,

              Sreenath BN.

              http://fqxi.org/community/forum/topic/1827

              Dr. Leifer

              Richard Feynman in his Nobel Acceptance Speech (http://www.nobelprize.org/nobel_prizes/physics/laureates/1965/feynman-lecture.html)

              said: "It always seems odd to me that the fundamental laws of physics, when discovered, can appear in so many different forms that are not apparently identical at first, but with a little mathematical fiddling you can show the relationship. And example of this is the Schrodinger equation and the Heisenberg formulation of quantum mechanics. I don't know why that is - it remains a mystery, but it was something I learned from experience. There is always another way to say the same thing that doesn't look at all like the way you said it before. I don't know what the reason for this is. I think it is somehow a representation of the simplicity of nature."

              I too believe in the simplicity of nature, and I am glad that Richard Feynman, a Nobel-winning famous physicist, also believed in the same thing I do, but I had come to my belief long before I knew about that particular statement.

              The belief that "Nature is simple" is however being expressed differently in my essay "Analogical Engine" linked to http://fqxi.org/community/forum/topic/1865 .

              Specifically though, I said "Planck constant is the Mother of All Dualities" and I put it schematically as: wave-particle ~ quantum-classical ~ gene-protein ~ analogy-reasoning ~ linear-nonlinear ~ connected-notconnected ~ computable-notcomputable ~ mind-body ~ Bit-It ~ variation-selection ~ freedom-determinism ... and so on.

              Taken two at a time, it can be read as "what quantum is to classical" is similar to (~) "what wave is to particle." You can choose any two from among the multitudes that can be found in our discourses.

              I could have put the Schrodinger wave ontology-Heisenberg particle ontology duality in the list had it come to my mind!

              Since "Nature is Analogical", we are free to probe nature in so many different ways. And you have touched some corners of it.

              Good luck,

              Than Tin

              Dear Matthew Saul Leifer:

              I am an old physician and I know nothing of mathematics and almost nothing of physics, so it is almost impossible for me to give an opinion on your essay. In this contest there are many theories; mine is not one of them.

              Maybe you would be interested in my essay on a subject which, after ordinary people, the discipline of physics uses more than any other: the so-called "time".

              I am sending you a practical summary, so you can easily decide whether or not to read my essay, "The deep nature of reality".

              I am convinced you would be interested in reading it. (Most people don't understand it, and it is not just because of my bad English.)

              Hawking in "A brief history of time" where he said , "Which is the nature of time?" yes he don't know what time is, and also continue saying............Some day this answer could seem to us "obvious", as much than that the earth rotate around the sun....." In fact the answer is "obvious", but how he could say that, if he didn't know what's time? In fact he is predicting that is going to be an answer, and that this one will be "obvious", I think that with this adjective, he is implying: simple and easy to understand. Maybe he felt it and couldn't explain it with words. We have anthropologic proves that man measure "time" since more than 30.000 years ago, much, much later came science, mathematics and physics that learn to measure "time" from primitive men, adopted the idea and the systems of measurement, but also acquired the incognita of the experimental "time" meaning. Out of common use physics is the science that needs and use more the measurement of what everybody calls "time" and the discipline came to believe it as their own. I always said that to understand the "time" experimental meaning there is not need to know mathematics or physics, as the "time" creators and users didn't. Instead of my opinion I would give Einstein's "Ideas and Opinions" pg. 354 "Space, time, and event, are free creations of human intelligence, tools of thought" he use to call them pre-scientific concepts from which mankind forgot its meanings, he never wrote a whole page about "time" he also use to evade the use of the word, in general relativity when he refer how gravitational force and speed affect "time", he does not use the word "time" instead he would say, speed and gravitational force slows clock movement or "motion", instead of saying that slows "time". FQXi member Andreas Albrecht said that. When asked the question, "What is time?", Einstein gave a pragmatic response: "Time," he said, "is what clocks measure and nothing more." He knew that "time" was a man creation, but he didn't know what man is measuring with the clock.

              I insist that for "measuring motion" we should always and only use a unique "constant" or "uniform" "motion" to measure the "non-constant motions" which integrate and form part of every change and transformation in every physical thing. Why? Because it is the only kind of "motion" whose characteristics allow it to be divided into equal parts, as the Egyptians and Sumerians did, giving birth to "motion fractions", which I call "motion units": hours, minutes and seconds. "Motion", which is the real thing, was always hidden behind time and covered by its shadow; it was hidden in front of everybody's eyes, for at least two millennia, within reach of almost everybody. What is the difference in physics between using the so-called time and using "motion"? Time has just been used to measure the "duration" of different phenomena. Why only for that? Because it was impossible for physicists to relate a mysterious time to the rest of the physical elements of known characteristics, without knowing what time is and what its physical characteristics are. On the other hand, "motion" is not something mysterious; it is a quality or physical property of all things, and it can be related to all of them. This is a huge difference, especially for theoretical physics, I believe. As a physician, with this finding I was able to do quite a few things; I imagine a physicist could do marvelous things with it.

              With my best wishes

              Héctor

              Hi Matt,

              This is a really thought-provoking essay. I'm in full agreement with your conclusion vis-à-vis an underlying reality. However, if I'm reading your essay correctly, your definition of noncontextuality seems to differ a bit from the usual Kochen-Specker sense (Ken Wharton tells me you're further developing this?). In my own essay I proposed *contextuality* as a sort of underlying principle which I guess (?) would be in accord with the notion that noncontextuality must be derivable, but I think that assumes that we're using the terms in a similar sense. Could you elaborate on your notion of noncontextuality a bit?

              Ian

                There is some subtlety surrounding the terminology of contextuality and noncontextuality, so let me distinguish two types. Gleason noncontextuality is the idea that a generalized probability measure on the set of quantum measurement outcomes should assign the same probability to outcomes that are represented by the same projector. From this assumption and Gleason's theorem we get the set of quantum states and the quantum probability rule. Kochen-Specker noncontextuality is the idea that if we assign a deterministic outcome to each measurement then whether or not an outcome occurs only depends on the projector that represents it. It is not possible to find a hidden variable model of quantum theory that satisfies this, which we often summarize by saying that "quantum theory is contextual".

                Quantum theory satisfies Gleason noncontextuality, but not Kochen-Specker noncontextuality, and it is the former that I was talking about in my essay. Of course, the two notions are related, as pointed out by Bell. Because the Gleason noncontextual measures must be quantum states in dimension 3 or higher, this implies that there can be no Kochen-Specker noncontextual assignments in these dimensions. This is because an assignment of definite outcomes is a special case of a generalized probability measure, but all the Gleason noncontextual measures assign probabilities other than zero or one to at least some measurements, so there can be no noncontextual assignment of definite outcomes.
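
                To make the Gleason sense concrete, here is a tiny numerical sketch (in Python, with an arbitrarily chosen state and bases, purely as an illustration): the same projector embedded in two different measurement contexts receives the same Born-rule probability, and within each context the probabilities form an ordinary classical distribution.

import numpy as np

rho = np.diag([0.5, 0.3, 0.2])            # an arbitrary qutrit density matrix
shared = np.array([1.0, 0.0, 0.0])        # a vector common to both bases below

# Two different orthonormal bases (measurement contexts) containing the same vector.
c, s = np.cos(0.7), np.sin(0.7)
basis_A = [shared, np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
basis_B = [shared, np.array([0.0, c, s]), np.array([0.0, -s, c])]

def born(state, v):
    # Born-rule probability tr(rho P) for the rank-1 projector P = |v><v|.
    return float(np.real(v.conj() @ state @ v))

# Gleason noncontextuality: the shared projector gets the same probability in both contexts.
assert np.isclose(born(rho, basis_A[0]), born(rho, basis_B[0]))

# Within each context the outcome probabilities sum to 1, as a classical distribution should.
for basis in (basis_A, basis_B):
    assert np.isclose(sum(born(rho, v) for v in basis), 1.0)

print("The Born-rule probability depends only on the projector, not on the context.")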

                One of the reasons that Gleason noncontextuality is not often emphasised is that it is often built in to the assumptions of the operational frameworks used to derive quantum theory. We say that two outcomes are operationally equivalent if they are always assigned the same probability for all states, and then we go ahead and identify equivalent outcomes. However, this approach assumes that probability is a primitive, or that it is identified with something empirically observable like statistical frequencies. This is not appropriate on a subjective Bayesian interpretation of probability, which requires an explanation for why we should assign identical probabilities to these things. Arguably, even on other interpretations of probability it would be better to have an explanation, but whether it is strictly needed in a derivation of quantum theory depends on which assumptions we view as part of the background framework, and which we view as substantive.