Thanks for your comments. I think this is related to the longstanding debate about diachronic coherence in the literature on subjective Bayesian probability. Basically, the issue is whether one can regard "you now" and "you at some point in the future" as one and the same agent. If you answer yes, then it would be irrational to do something now that you believe with certainty the future version of you will regard as irrational. Arguments of this kind are needed to derive Bayesian updating as a rationality constraint. I have always been in the camp that regards diachronic coherence as unfounded. At most you can derive constraints on what you now expect the future version of you will do, not on what the future version of you actually should do. If this is the case then constraints on how probabilities evolve need to be grounded in physical reality rather than just rationality, but I believe that this point is already made just by considering alternative measurements at a single instant of time.
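For concreteness, the updating rule at stake is Bayesian conditionalization, stated here in its standard textbook form rather than quoted from the essay: on learning that an event $E$ has occurred, and nothing else, diachronic coherence is supposed to require

$$P_{\mathrm{new}}(A) = P_{\mathrm{old}}(A \mid E) = \frac{P_{\mathrm{old}}(A \cap E)}{P_{\mathrm{old}}(E)},$$

whereas the view sketched above only licenses the synchronic claim that you now expect your future self to update in this way.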
\"It from bit\" and the quantum probability rule by Matthew Saul Leifer
Hi Matt,
I really enjoyed reading your essay. While you defined the event space of subjective Bayesian probability as a discrete set, you mentioned that Newtonian mechanics seems to require a continuous event space. I am a little confused about this point. How do you explain it? Also, is there any technical or mathematical problem in extending from the discrete set to a continuous one? For example, in functional analysis there are many nontrivial results that hold in the continuous setting but not the discrete one, and vice versa.
Best wishes,
Yutaka
The example I gave from Newtonian mechanics does employ a continuous ontological state space, although we might imagine betting on some coarse-graining of it, in which case the event space would be discrete.
Nevertheless, there are of course additional issues that come up in the case of a sample space of infinite cardinality, even for a countably infinite space, let alone the continuum. In classical probability, the main additional requirement is Kolmogorov's axiom of countable additivity. There are Dutch book arguments for countable additivity, but of course they involve considering bets on an infinite sequence of events. The status of these arguments depends on whether you view the Dutch book argument as the true operational definition of probability or merely as useful window dressing to help us understand why degrees of belief must satisfy probability theory. In the former case, there could be some trouble with considering sequences of bets that nobody could practically enter into. Bayesians of this stripe, who include de Finetti, typically argue that probabilities should only be required to be finitely additive.
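For reference, the axiom in question, in its standard Kolmogorov form rather than anything specific to the essay, is countable additivity: for any countable collection of pairwise disjoint events $E_1, E_2, \dots$,

$$P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i),$$

whereas finite additivity only demands $P(E_1 \cup \dots \cup E_n) = \sum_{i=1}^{n} P(E_i)$ for finitely many disjoint events.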
Whatever you think about this, the point is that it is equally an issue for classical probability theory and has nothing to do with the specific application to quantum theory. It is worth noting that most other interpretations of probability also have problems with the Kolmogorov axioms. For example, von Mises' version of frequentism, which is probably the most popular one, also does not support countable additivity.
I don't think there really are any issues of this sort that are unique to quantum theory, at least at the level I am discussing it in my essay. Of course, to get quantum theory, the things we are assigning probabilities to must have the structure of the closed subspaces of Hilbert space, the projections in a von Neumann algebra, or something of that sort. The major part of a derivation of quantum theory would be to derive those structures, and assumptions from functional analysis will come up there. However, in my essay I focussed on how probabilities should be assigned once we know the structure of the betting contexts, in which case those structures are already being assumed.
Dear Matt,
I enjoyed your essay very much. It contains a very profound part on probabilities, but it is also deep from the viewpoint of physics and philosophy. Indeed, the standard view of probabilities should be reviewed and generalized, to allow "a bunch of probability distributions, rather than just one". What mysteries remain in quantum mechanics after this revision is done?
I also like the angle your essay takes on the main theme of the contest, namely that 'it from bit' is compatible with your proposed 'bit from it' by the scheme "it-as-quantum-stuff => bit => it-as-particles-and-fields".
There are many points of your essay that I would like to understand better, so I intend to reread it, and also more of your writings in this direction.
Best regards,
Hi Matthew,
Exactly how I interpreted your essay. Well done on great work!
Antony
Dear Matthew Leifer,
I have downloaded your essay and will soon post my comments on it. Meanwhile, please go through my essay and post your comments.
Regards and good luck in the contest,
Sreenath BN.
http://fqxi.org/community/forum/topic/1827
Dr. Leifer
Richard Feynman in his Nobel Acceptance Speech (http://www.nobelprize.org/nobel_prizes/physics/laureates/1965/feynman-lecture.html)
said: "It always seems odd to me that the fundamental laws of physics, when discovered, can appear in so many different forms that are not apparently identical at first, but with a little mathematical fiddling you can show the relationship. And example of this is the Schrodinger equation and the Heisenberg formulation of quantum mechanics. I don't know why that is - it remains a mystery, but it was something I learned from experience. There is always another way to say the same thing that doesn't look at all like the way you said it before. I don't know what the reason for this is. I think it is somehow a representation of the simplicity of nature."
I too believe in the simplicity of nature, and I am glad that Richard Feynman, a famous Nobel-winning physicist, also believed in the same thing I do, but I had come to my belief long before I knew about that particular statement.
The belief that "Nature is simple" is however being expressed differently in my essay "Analogical Engine" linked to http://fqxi.org/community/forum/topic/1865 .
Specifically though, I said "Planck constant is the Mother of All Dualities" and I put it schematically as: wave-particle ~ quantum-classical ~ gene-protein ~ analogy-reasoning ~ linear-nonlinear ~ connected-notconnected ~ computable-notcomputable ~ mind-body ~ Bit-It ~ variation-selection ~ freedom-determinism ... and so on.
Taken two at a time, it can be read as "what quantum is to classical" is similar to (~) "what wave is to particle." You can choose any two from among the multitudes that can be found in our discourses.
I could have put the Schrodinger wave ontology-Heisenberg particle ontology duality in the list had it come to mind!
Since "Nature is Analogical", we are free to probe nature in so many different ways. And you have touched some corners of it.
Good luck,
Than Tin
Dear Matthew Saul Leifer:
I am an old physician and I know nothing of mathematics and almost nothing of physics, so it is almost impossible for me to give an opinion on your essay. In this contest there are many theories; mine is not one of them.
Maybe you would be interested in my essay on a subject which, after ordinary people, the discipline of physics uses more than any other: the so-called "time".
I am sending you a practical summary, so you can easily decide whether or not to read my essay "The deep nature of reality".
I am convinced you would be interested in reading it. (Most people don't understand it, and it is not just because of my bad English.)
Hawking in "A brief history of time" where he said , "Which is the nature of time?" yes he don't know what time is, and also continue saying............Some day this answer could seem to us "obvious", as much than that the earth rotate around the sun....." In fact the answer is "obvious", but how he could say that, if he didn't know what's time? In fact he is predicting that is going to be an answer, and that this one will be "obvious", I think that with this adjective, he is implying: simple and easy to understand. Maybe he felt it and couldn't explain it with words. We have anthropologic proves that man measure "time" since more than 30.000 years ago, much, much later came science, mathematics and physics that learn to measure "time" from primitive men, adopted the idea and the systems of measurement, but also acquired the incognita of the experimental "time" meaning. Out of common use physics is the science that needs and use more the measurement of what everybody calls "time" and the discipline came to believe it as their own. I always said that to understand the "time" experimental meaning there is not need to know mathematics or physics, as the "time" creators and users didn't. Instead of my opinion I would give Einstein's "Ideas and Opinions" pg. 354 "Space, time, and event, are free creations of human intelligence, tools of thought" he use to call them pre-scientific concepts from which mankind forgot its meanings, he never wrote a whole page about "time" he also use to evade the use of the word, in general relativity when he refer how gravitational force and speed affect "time", he does not use the word "time" instead he would say, speed and gravitational force slows clock movement or "motion", instead of saying that slows "time". FQXi member Andreas Albrecht said that. When asked the question, "What is time?", Einstein gave a pragmatic response: "Time," he said, "is what clocks measure and nothing more." He knew that "time" was a man creation, but he didn't know what man is measuring with the clock.
I insist that for "measuring motion" we should always and only use a unique "constant" or "uniform" motion to measure the non-constant motions that integrate and form part of every change and transformation in every physical thing. Why? Because it is the only kind of "motion" whose characteristics allow it to be divided into equal parts, as the Egyptians and Sumerians did, giving birth to "motion fractions", which I call "motion units": hours, minutes and seconds. "Motion", which is the real thing, was always hidden behind time and covered by its shadow; it was hidden in front of everybody's eyes, for at least two millennia, at hand of almost everybody.
What is the difference in physics between using the so-called time and using "motion"? Time has only been used to measure the "duration" of different phenomena. Why only for that? Because it was impossible for physicists to relate a mysterious time with the rest of the physical elements of known characteristics, without knowing what time is or what its physical characteristics are. On the other hand, "motion" is not something mysterious; it is a quality or physical property of all things, and can be related to all of them. This is a huge difference, especially for theoretical physics, I believe. As a physician, with this finding I was able to do quite a few things. I imagine a physicist could do marvelous things with it.
With my best wishes
Héctor
Hi Matt,
This is a really thought-provoking essay. I'm in full agreement with your conclusion vis-à-vis an underlying reality. However, if I'm reading your essay correctly, your definition of noncontextuality seems to differ a bit from the usual Kochen-Specker sense (Ken Wharton tells me you're further developing this?). In my own essay I proposed *contextuality* as a sort of underlying principle which I guess (?) would be in accord with the notion that noncontextuality must be derivable, but I think that assumes that we're using the terms in a similar sense. Could you elaborate on your notion of noncontextuality a bit?
Ian
There is some subtlety surrounding the terminology of contextuality and noncontextuality, so let me distinguish two types. Gleason noncontextuality is the idea that a generalized probability measure on the set of quantum measurement outcomes should assign the same probability to outcomes that are represented by the same projector. From this assumption and Gleason's theorem we get the set of quantum states and the quantum probability rule. Kochen-Specker noncontextuality is the idea that if we assign a deterministic outcome to each measurement then whether or not an outcome occurs only depends on the projector that represents it. It is not possible to find a hidden variable model of quantum theory that satisfies this, which we often summarize by saying that "quantum theory is contextual".
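To spell that out in symbols, in the standard formulation rather than the essay's own notation: a generalized probability measure assigns to each projector $P$ a number $f(P) \in [0,1]$ such that $f(I) = 1$ and $f(P + Q) = f(P) + f(Q)$ whenever $P$ and $Q$ are orthogonal. Gleason's theorem then says that, in Hilbert space dimension $3$ or greater, every such measure takes the form

$$f(P) = \mathrm{Tr}(\rho P)$$

for some density operator $\rho$, which is the quantum probability rule.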
Quantum theory satisfies Gleason noncontextuality, but not Kochen-Specker noncontextuality, and it is the former that I was talking about in my essay. Of course, the two notions are related, as pointed out by Bell. Because the Gleason noncontextual measures must be quantum states in dimension 3 or higher, this implies that there can be no Kochen-Specker noncontextual assignments in these dimensions. This is because an assignment of definite outcomes is a special case of a generalized probability measure, but all the Gleason noncontextual measures assign probabilities other than zero or one to at least some measurements, so there can be no noncontextual assignment of definite outcomes.
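Here is a minimal numerical sketch of that last point (my illustration, assuming numpy, not anything from the essay): for a generic qutrit state, the Born-rule measure $f(P) = \mathrm{Tr}(\rho P)$ is additive over an orthonormal basis but assigns values strictly between 0 and 1, so it cannot be a deterministic 0/1-valued assignment.

```python
# Sketch: Born-rule probabilities Tr(rho P) form a Gleason-noncontextual
# measure (they sum to 1 over any orthonormal basis) and are generically
# not 0/1-valued, so no deterministic noncontextual assignment reproduces them.
import numpy as np

rng = np.random.default_rng(0)

# Random qutrit density matrix: rho = A A† / Tr(A A†)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# Random orthonormal basis from the QR decomposition of a random complex matrix
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
projectors = [np.outer(Q[:, i], Q[:, i].conj()) for i in range(3)]

probs = [np.trace(rho @ P).real for P in projectors]
print("probabilities:", np.round(probs, 4))
print("sum to one:", np.isclose(sum(probs), 1.0))
print("any deterministic (0/1) value:",
      any(np.isclose(p, 0) or np.isclose(p, 1) for p in probs))
```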
One of the reasons that Gleason noncontextuality is not often emphasised is that it is often built into the assumptions of the operational frameworks used to derive quantum theory. We say that two outcomes are operationally equivalent if they are always assigned the same probability for all states, and then we go ahead and identify equivalent outcomes. However, this approach assumes that probability is a primitive, or that it is identified with something empirically observable like statistical frequencies. This is not appropriate on a subjective Bayesian interpretation of probability, which requires an explanation for why we should assign identical probabilities to these things. Arguably, even on other interpretations of probability it would be better to have an explanation, but whether it is strictly needed in a derivation of quantum theory depends on which assumptions we view as part of the background framework, and which we view as substantive.
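In symbols (my formalization, not notation taken from the essay): outcome $k$ of measurement $M$ and outcome $k'$ of measurement $M'$ are operationally equivalent when

$$\Pr(k \mid M, \rho) = \Pr(k' \mid M', \rho) \quad \text{for every state } \rho,$$

and the operational frameworks in question then simply identify the two outcomes as a single event in the betting context.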
Thanks for the clarification. Personally I think there's more to the Kochen-Specker result than meets the eye. Or, rather, there's really a third possible view of (non)contextuality that has a limited Bayesian subjectivity to it, i.e. there are degrees of dependence of the projectors on other projectors (if that makes any sense).
Dear Matthew,
One single principle leads the Universe.
Every thing, every object, every phenomenon
is under the influence of this principle.
Nothing can exist if it is not born in the form of opposites.
I simply invite you to discover this in a few words,
but the main part is coming soon.
Thank you, and good luck!
I rated your essay according to my appreciation.
Please visit My essay.
Dear Dr. Leifer,
I found your essay to be very lucid and informative! I particularly enjoyed the use of the Dutch book argument to explain subjective Bayesian probability.
A realist approach to physics is possible if quantum potential is construed as constituting "objectively existing external reality." Counterfactual assertions can be applied to all possible paths calculated by the Lagrangian, in which probabilities are bound globally in time.
Contextual information, on the other hand, arises from the conditional entropy of the observer, who is locally part of the path selection process. The observer iteratively generates knowledge by erasing the entanglement information that encodes the quantum potential. (See my essay "A Complex Conjugate Bit and It".) In a sense, then, contextuality and noncontextuality are two sides of the same coin acting in reciprocity with each other.
Best wishes,
Richard
Dear Matt,
My admiring post, comments on commonality and 'thought experiment' seemed to upset you (July 5th). I sincerely apologise if it did. You say you mainly use abstracts to select essays to read. In my case this may mean wrongly judging a book by its cover. A few have said the abstract is too dense but the essay excellent; indeed, blog comments include "groundbreaking", "clearly significant", "astonishing", "fantastic job", "wonderful", "remarkable!", "deeply impressed", etc.
I do hope I can prevail on you to check it over and offer any views and advice.
Thank you kindly and very best wishes
Peter
Matt,
I've just re-read your essay in a 'round up' and confirm my initial favourable impressions. I think it very relevant and complementary to mine, particularly the Bayesian interpretation, and "there is something wrong with our basic framework for realist models of quantum theory. The right framework ... should reveal that quantum theory is not nonlocal after all"!!
I'm sad you haven't read mine, but note you choose mainly by abstract. I'm told my dense abstract has put some off. I hope I may persuade you to ignore it in this instance by quoting from my blog posts, which include:
"groundbreaking", "remarkable!", "clearly significant", " fantastic job", "wonderful essay", "deeply impressed", "valuable contribution", "Technically challenging and philosophically deep", "Rubbish", etc. (ok, I made that last one up!).
I hope you don't judge my book by its cover, as I think we may even be on the brink of an important step forward, or even a quantum leap!
Best wishes
Peter
Aha! The lost posts are returning from wandering in cyberspace! Sorry about the repeat.
Peter
Dear Matthew,
We are at the end of this essay contest.
In conclusion, on the question of whether Information is more fundamental than Matter, there is good reason to answer that Matter is made of an amazing mixture of eInfo and eEnergy at the same time.
Matter is thus eInfo made with eEnergy, rather than being made with eEnergy and eInfo, because eInfo is eEnergy, and the one does not go without the other.
eEnergy and eInfo are the two basic Principles of the eUniverse. Nothing can exist if it is not eEnergy, and any object is eInfo, and therefore eEnergy.
And consequently our eReality is eInfo made with eEnergy. And the final verdict is: eReality is virtual, and virtuality is our fundamental eReality.
Good luck to the winners,
And see you soon, with good news on this topic, and the Theory of Everything.
Amazigh H.
I rated your essay.
Please visit My essay.
Late-in-the-Day Thoughts about the Essays I've Read
I am sending to you the following thoughts because I found your essay particularly well stated, insightful, and helpful, even though in certain respects we may significantly diverge in our viewpoints. Thank you! Lumping and sorting is a dangerous adventure; let me apologize in advance if I have significantly misread or misrepresented your essay in what follows.
Of the nearly two hundred essays submitted to the competition, there seems to be a preponderance of sentiment for the 'Bit-from-It' standpoint, though many excellent essays argue against this stance or advocate for a wider perspective on the whole issue. Joseph Brenner provided an excellent analysis of the various positions that might be taken with the topic, which he subsumes under the categories of 'It-from-Bit', 'Bit-from-It', and 'It-and-Bit'.
Brenner himself supports the 'Bit-from-It' position of Julian Barbour as stated in his 2011 essay that gave impetus to the present competition. Others such as James Beichler, Sundance Bilson-Thompson, Agung Budiyono, and Olaf Dreyer have presented well-stated arguments that generally align with a 'Bit-from-It' position.
Various renderings of the contrary position, 'It-from-Bit', have received well-reasoned support from Stephen Anastasi, Paul Borrill, Luigi Foschini, Akinbo Ojo, and Jochen Szangolies. An allied category that was not included in Brenner's analysis is 'It-from-Qubit', and valuable explorations of this general position were undertaken by Giacomo D'Ariano, Philip Gibbs, Michel Planat and Armin Shirazi.
The category of 'It-and-Bit' displays a great diversity of approaches which can be seen in the works of Mikalai Birukou, Kevin Knuth, Willard Mittelman, Georgina Parry, and Cristinel Stoica.
It seems useful to discriminate among the various approaches to 'It-and-Bit' a subcategory that perhaps could be identified as 'meaning circuits', in a sense loosely associated with the phrase by J.A. Wheeler. Essays that reveal aspects of 'meaning circuits' are those of Howard Barnum, Hugh Matlock, Georgina Parry, Armin Shirazi, and especially that of Alexei Grinbaum.
Proceeding from a phenomenological stance as developed by Husserl, Grinbaum asserts that the choice to be made of either 'It from Bit' or 'Bit from It' can be supplemented by considering 'It from Bit' and 'Bit from It'. To do this, he presents an 'epistemic loop' by which physics and information are cyclically connected, essentially the same 'loop' as that which Wheeler represented with his 'meaning circuit'. Depending on where one 'cuts' the loop, antecedent and precedent conditions are obtained which support an 'It from Bit' interpretation, or a 'Bit from It' interpretation, or, though not mentioned by Grinbaum, even an 'It from Qubit' interpretation. I'll also point out that depending on where the cut is made, it can be seen as a 'Cartesian cut' between res extensa and res cogitans or as a 'Heisenberg cut' between the quantum system and the observer. The implications of this perspective are enormous for the present It/Bit debate! To quote Grinbaum: "The key to understanding the opposition between IT and BIT is in choosing a vantage point from which OR looks as good as AND. Then this opposition becomes unnecessary: the loop view simply dissolves it." Grinbaum then goes on to point out that this epistemologically circular structure "...is not a logical disaster, rather it is a well-documented property of all foundational studies."
However, Grinbaum maintains that it is mandatory to cut the loop; he claims that it is "...a logical necessity: it is logically impossible to describe the loop as a whole within one theory." I will argue that in fact it is vital to preserve the loop as a whole and to revise our expectations of what we wish to accomplish by making the cut. In fact, the ongoing It/Bit debate has been sustained for decades by our inability to recognize the consequences that result from making such a cut. As a result, we have been unable to take up the task of studying the properties inherent in the circularity of the loop. Helpful in this regard would be an examination of the role of relations between various elements and aspects of the loop. To a certain extent the importance of the role of relations has already been well stated in the essays of Kevin Knuth, Carlo Rovelli, Cristinel Stoica, and Jochen Szangolies although without application to aspects that clearly arise from 'circularity'. Gary Miller's discussion of the role of patterns, drawn from various historical precedents in mathematics, philosophy, and psychology, provides the clearest hints of all competition submissions on how the holistic analysis of this essential circular structure might be able to proceed.
In my paper, I outlined Susan Carey's assertion that a 'conceptual leap' is often required in the construction of a new scientific theory. Perhaps moving from a 'linearized' perspective of the structure of a scientific theory to one that is 'circularized' is just one further example of this kind of conceptual change.
Dear Matthew,
very interesting point of view. I completely agree that quantum mechanics forces us to adopt a new probability theory. von Weizsäcker was one of the first to notice it. Maybe have a look at my essay?
It takes a more geometrical point of view. I think that the structure of spacetime determines a lot.
Best wishes
Torsten
Hello Matthew
I found your discussion of Bayesianism quite fascinating. You said that there is nothing in logic that tells you what premises you have to start with.
In my essay, one begins with all possible propositions. How this connects with yours, I am not quite sure yet. Logic is subjugated to the General Principle of Equivalence, as is every proposition, and the GPE filters out all propositions but two at the first step.
I found the last parts quite hard to follow, but liked the main theme.
Best wishes
Stephen Anastasi