Dear Flavio,

As engineers we are taught early on that all the problems we solve are idealizations, and likewise that all the laws of physics, classical or otherwise, are idealizations of the problems being solved, theoretical or applied. So in that sense I don't know what the real discovery is. Moreover, the problem with the standard QM interpretation (the minimal one) in terms of predictability is fundamentally different in both quality and quantity, so I don't see how your casting doubt on uncertainty in classical physics helps in QM.

My solution has been to try to deal with the problem in the only and final way all these problems can be solved: by finding the correct ontology, which leads to a fairly simple system that can be comprehended easily.

It is also very baffling to me when all the respectable physicists (I am not talking about amateur philosophers) here in this contest go on about Turing and Gödel, yet we have been doing physics for the past 400 years with regular math and fairly good success. So it should have been obvious that it is only about a correction to the model and deducing some ontology, that is all, and not about going on a wild goose chase.

Sorry for being direct; I don't mean to be unfriendly, it is just my opinion. Thanks.

    I should just mention that I don't mean that my system is correct as is, but I think it is highly suggestive of the correct physics, since it closely follows present understanding and the general setup of today's physics, yet is more fundamental.

    Dear Adel Sadeq, there is a deep conceptual difference between the way engineers (and even experimental scientists) are trained and the foundations we ground our scientific theories upon. You are referring to the unavoidable finiteness of the error bars in actual measurements. Clearly this is not the same as having fundamental limits of determinacy, as I discuss in my essay.

    Dear Flavio,

    Thank you for the reply. Whether there are fundamental limits of determinacy is very hard to establish experimentally, in either the classical or the quantum case, as discussed in the link below by your colleague in the math department, Dr. A. Neumaier. Moreover, I agree with his interpretation as a basis, which my theory sort of confirms, because the probability density is shown to be just the density of energy and momentum, very much as in QFT. Also, EPR is trivial in my idea, since the system is inherently nonlocal. Thanks again.

    https://www.physicsforums.com/threads/the-thermal-interpretation-of-quantum-physics.967116/

    Dear Flavio Del Santo,

    Your essay seems to be exactly what the framers of this topic were looking to generate. My essay went in a totally different direction. Your writing style made this dense topic relatable. I was just going to glance at the essay, but ended up reading it in one sitting. Excellent work.

    A topic for a future essay could be a clearer link between quantum mechanics and measurable classical mechanics. Your phase-space diagram showed a strong similarity between the two. This relationship gave me an idea about how the square-well problem, which famously has opposite results in the classical and quantum cases, could be shown to be similar if measurement were accounted for in the set-up.
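
    To make that idea concrete, here is a minimal numerical sketch of what I have in mind (Python; the textbook infinite square well and all parameters are my own illustrative choices, not anything taken from your essay). Coarse-graining the quantum probability density over a finite detector resolution recovers the uniform classical density for a highly excited state:

    import numpy as np

    # Infinite square well of width L, particle in a highly excited state n.
    L, n, N = 1.0, 50, 10_000
    x = np.arange(N) * (L / N)

    quantum = (2.0 / L) * np.sin(n * np.pi * x / L) ** 2  # |psi_n(x)|^2
    classical = np.full_like(x, 1.0 / L)                  # uniform bouncing particle

    # "Measurement": average over a detector window of width L/n, i.e. one
    # full oscillation period of |psi_n|^2 (200 grid points here).
    window = N // n
    coarse = np.convolve(quantum, np.ones(window) / window, mode="same")

    interior = slice(window, N - window)  # ignore convolution edge effects
    print(np.max(np.abs(coarse[interior] - classical[interior])))  # ~0, up to float error

    Away from the walls the two densities become indistinguishable once the detector resolution is coarser than L/n, which is the kind of "accounting for measurement" I mean.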

    There is also the issue of larger-scale events like sound waves, which contain information but seem independent of the atomic scale.

    Sincerely,

    Jeff Schmitz

      Dear Flavio,

      This is really an impressive and excellent essay. I enjoyed it very much.

      I have not much to comment but to ask some questions with regards to where your essay could lead:

      - There is an interesting trend in mathematics (I find it interesting, at least) which attempts to extend concepts of standard mathematics to fuzzy sets. This is sometimes called fuzzification. I am wondering whether your essay could not be a starting point for a fuzzification of classical mechanics. Note that fuzzification also has the advantage of using fuzzy logic, which has a working procedure to determine entailments and the like (this could be relevant to your notion of cause and effect); a toy sketch of what I mean follows after this list.

      - Your last optimistic quote and paragraph on the openness of the future remind me of the interpretation of probability by Carl Friedrich von Weizsäcker. As far as I understand his view, probability can only be about the future because the future is "open", to reuse your wording. Are you aware of his work on the matter (like his temporal logic) and, if so, do you have particular thoughts about it?
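
      To sketch the fuzzification idea from the first point above (the triangular membership function and the min/max connectives are standard textbook fuzzy-logic choices, not anything from your essay): a sharp value is replaced by degrees of membership, and the connectives used for entailment become min/max operations.

      def triangular(a: float, b: float, c: float):
          """Triangular membership function peaking at b, zero outside [a, c]."""
          def mu(x: float) -> float:
              if x <= a or x >= c:
                  return 0.0
              return (x - a) / (b - a) if x < b else (c - x) / (c - b)
          return mu

      # A "fuzzy position": instead of x = 1.0 exactly, degrees of membership.
      near_one = triangular(0.8, 1.0, 1.2)

      # Zadeh's fuzzy connectives.
      def f_and(p, q): return min(p, q)
      def f_or(p, q): return max(p, q)
      def f_not(p): return 1.0 - p

      # Degree to which "x is near 1 AND x is NOT near 1" holds at x = 0.9:
      p = near_one(0.9)          # 0.5
      print(f_and(p, f_not(p)))  # 0.5, not 0: strict non-contradiction is relaxed

      A fuzzified classical mechanics would presumably replace sharp phase-space points with such membership functions and propagate them through the dynamics.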

      While your essay mostly focuses on indeterminacy, the essay I have submitted focuses on undecidability and asks a similar question in substance: "Has physics and science ever been decidable?". If you are interested you can read it here: https://fqxi.org/community/forum/topic/3477 .

      Best of luck for the contest.

      Fabien

        Dear Jeff Schmitz,

        thanks very much for your flattering words; I am glad that you liked my essay. I will surely have a look at yours, to see your different approach.

        I'll also be glad to hear what you have in mind about the square-well problem.

        All the best!

        Flavio

        Dear Fabien,

        thank you for writing and for kindly appreciating my work.

        Indeed, what you say about the "fuzzification" program is very pertinent and similar to the aims expressed in my essay. Gisin, together with other collaborators and me, is considering different (constructive) mathematical structures to capture this feature of fuzziness.

        I was not aware of von Weizsäcker's work that you mention. I will surely have a look at it, as I will at your essay as soon as time allows.

        Thanks once more and all the best,

        Flavio

        Flavio,

        Your essay seems to put the foundation's question, the 3 Uns, in a more reasonable light that helps to reset the anthropocentric tilt toward objectivity regarding outcomes in physics -- classical and quantum. Furthermore, your concepts are quite accessible. Your use of figure 3, for example, I feel introduces the graphic realism that is missing in the classical world, and perhaps in the realm of subatomic particles as well. A probability factor is always relevant in engineering projects. I hope you have time to give mine a look: https://fqxi.org/community/forum/topic/3396.

        Great job. My rating is your 22nd. Having that many protects you against the 1s given by some with no comments.

        Jim Hoover

          Dear Jim (if I may),

          Thanks for your kind feedback, I appreciate that.

          I will have a look at your essay and leave comments there should I have some interesting ones.

          Best of luck for the contest and kind regards,

          Flavio

          Dear Del Santo:

          It is a pleasure reading your essay. It is very well presented and worthy of receiving high grades.

          Fundamentally, we are in agreement. Nature does not work out of two orthogonal worlds of Classical Mechanics and Quantum Mechanics. Nature works out of one single undivided space using one set of rules, however complex they may be. However, as an experimentalist, I find that indeterminism in our universe emerges without the need for sophisticated math or philosophical deliberations.

          Please make time to read my essay and grade it.

          "Complete Information Retrieval: A Fundamental Challenge"

          https://fqxi.org/community/forum/topic/3565

          The universe is fundamentally stochastic because the observable radiations and particles are emergent oscillations/vibrations of the universal cosmic space, which is, at the same time, full of random "background" fluctuations. These fluctuations, however weak they may be, nonetheless perturb any and all other interactions going on in the universe. Since human-initiated measurements are also interactions between our chosen interactants in our apparatus, stochastic changes will be inevitable in every measurement, classical or quantum.

          I am an experimentalist. Our measurements will always be imprecise. When we enumerate the basic steps behind the measurement process, we find that we have been, we now are, and we will always be, information-limited:

          (i) Data are some physical transformation taking place inside the apparatus.

          (ii) The physical transformation in a detectable material always requires some energy exchange between the interactants, the "unknown" and the "known" as the reference interactant.

          (iii) The energy exchange must be guided by some force of interaction operating between the chosen interactants.

          (iv) Since we have started with an unknown universe, from the standpoint of building physics theories, the "known" entities are known only partially, never completely. This also creates an information bottleneck for the "unknown" entity. Note that in spite of innumerable experiments, we still do not know what electrons and photons really are.

          (v) All forces of interaction are distance-dependent. Hence, the interactants must be placed within the range of each other's mutual influence (force field). The force field creates the necessary physical "entanglement" between interacting entities for the energy transfer to proceed. In other words, the interactants must be "locally regional" within their physical sphere of influence. They must be "entangled" by a perceptible physical force. Our equations are built on such hard causality.

          (vi) The final data in all instruments suffer from a lack of 100% fidelity. This is another permanent problem of imprecision. We can keep reducing the error margin as our technology improves, but we do not know how to completely eliminate this error.

          Many of my earlier papers have also articulated this position. They can be downloaded from:

          http://www.natureoflight.org/CP/

          You can also download the paper: "Next Frontier in Physics--Space as a Complex Tension Field"; Journal of Modern Physics, 2012, 3, 1357-1368, http://dx.doi.org/10.4236/jmp.2012.310173

          Sincerely,

          Chandra.

            Your essay was interesting. However, it raises at least one major question that I don't think was sufficiently addressed.

            For example, in figure 3, why could the (indeterministic) graph on the right not be considered deterministic? It appears that it is interpreted as indeterministic simply because events could have gone in a different direction.

            Let me state this in a broader context, which is also the reason that I did not directly address unpredictability in my essay. A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak. It certainly is not the same thing proposed by undecidability and uncomputability. It is weak because it does not actually refute determinism as a philosophical concept; it simply seems to refute it from what appears to be an epistemic perspective.

            Subsequently, you seem to refute the argument that it is merely epistemic by appealing to something akin to Heisenberg's uncertainty principle, which is to say that attaining the necessary precision (to make accurate predictions) is fundamentally impossible. In other words, it is not merely a limitation of our formal systems or measurement equipment; it is inherent. To state this yet another way, our measurements are intrinsically fuzzy, not due to a lack of accuracy, but due to the fact that the measured quantities are actually fuzzy and thus, to some extent, indeterminate.

            I would then conclude that this isn't really a refutation of determinism, and thus an argument for indeterminism, but rather something more like an argument for a fuzzy determinism that appears, at least to some extent, indeterminate because reality is postulated to be intrinsically fuzzy.

            As a result, the question that I don't think was sufficiently addressed is precisely the question of determinism, because a fuzzy determinism is not indeterministic; it is just fuzzy. So, does God play dice or not?

              Hi Flavio,

              Thanks for the really well written essay. The principle of infinite precision is super useful in highlighting the tension, whether ontological or epistemic, between formal axiomatic systems and physics. I'd never read Max Born's essay that you quote on page 2, but it's an excellent reference!

              I completely sympathise with your idea of finite information quantities and loved the reference to Gisin's recent work. As you mentioned, Landauer has written extensively on this---I leaned heavily on Landauer's perspectives in my essay.

              You wrote that

              "In this view, the orthodox interpretation of classical physics can be regarded as a deterministic completion of an indeterministic model, in terms of hidden variables".

              The finite information quantities seem to suggest that indeterministic models are fundamental. For example, a lot of powerful claims can be made from statistical physics or thermodynamics. Would these kinds of ideas occupy a more fundamental, rather than derivative, role in a theory of finite information quantities?

              Again, thanks for the great essay! It was a pleasure to read!

              Cheers,

              Michael

                Dear Jason W Steinmetz, it seems that your criticisms are based on misconceptions. You write: "A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak". I disagree that this is popular, for that matter, but most importantly I disagree that it is weak. It is a totally legitimate way of thinking about plausible indeterministic theories. The other main one, which would perhaps make you happier, is a fundamentally stochastic dynamics. Please have a look at my paper, where we discuss these distinctions, https://arxiv.org/abs/1909.03697 (section "FORMS OF INDETERMINISM IN CLASSICAL AND QUANTUM PHYSICS"). The argument is NOT merely epistemic, for the initial conditions in my model are supposed to be indeterminate (as opposed to unknown, as an experimentalist would claim).

                Determinism is absolute; making it just a little bit fuzzy makes it untenable. Think of any chaotic system and you will see this (a minimal numerical illustration follows below). Moreover, we have Bell's inequalities: if there is even less than one bit of randomness in the universe, we can create unbounded randomness (again, see my paper cited above).
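
                As a sketch of the chaotic-system point (the logistic map is a stock chaotic example, chosen here only for brevity, not something from the essay): give the initial condition any finite fuzziness, however small, and after a few dozen iterations the outcome is completely indeterminate.

                def logistic(x: float, steps: int) -> float:
                    """Iterate the chaotic logistic map x -> 4x(1-x)."""
                    for _ in range(steps):
                        x = 4.0 * x * (1.0 - x)
                    return x

                x0, eps = 0.3, 1e-12  # a sharp value vs. a tiny fuzziness around it
                for steps in (10, 30, 50):
                    gap = abs(logistic(x0, steps) - logistic(x0 + eps, steps))
                    print(steps, gap)  # grows roughly like eps * 2**steps, saturating at order 1

                So a determinism that is "just a little bit fuzzy" yields, for chaotic dynamics, outcomes as indeterminate as if they were drawn at random.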

                Best regards,

                Flavio Del Santo

                Dear Chandra,

                Thank you for appreciating my work and for taking the time to comment and relate it to your own. I will have a look and possibly come back with comments on your essay.

                Best wishes,

                Flavio

                Dear Michael,

                Thanks for your kind feedback. Indeed, I think our essays have a great deal in common, although yours is more concerned with the limits of computation. (I am about to comment more precisely on the dedicated page of your essay.)

                As for your question: indeed, statistical (indeterministic) laws would be just a reflection of an indeterministic microscopic behaviour, so this would surely attribute a more fundamental value to them. Moreover, this would ease the tension between microscopic determinism and statistical irreversibility. Gisin and I have gone through this problem in our paper (https://arxiv.org/abs/1909.03697), where we stated: "In the perspective of the alternative indeterministic interpretation (based on FIQs), instead, the deterministic law of the ideal gas, ruling the behavior at the macroscopic level, emerges as a novel and not reducible [law]. Notice that the historical debate on the apparent incompatibility between Poincaré's recurrence theorem and Boltzmann's kinetic theory of gases does not arise in the framework of FIQs."

                Thank you once more and all the best,

                Flavio

                Dear Flavio,

                The rejection of actual infinities goes back to Aristotle, and those who followed always thought of measurement in terms of rational numbers with finite representations. And as the inverse-square laws were established, classical determinism came to be understood in the same way, with laws restricted to quadratic equations, or conic sections, with rational solutions.

                When Poincaré went beyond that to transversals, and the symbolic sequences that constrain them, he was exploring a new physics of chaotic dynamics, beyond what was possible for Laplace and classical rationalism. Those infinite sequences give you the Kolmogorov-Sinai entropy, as a measure of entropy production and thereby of unpredictability. The probability axioms of both Kolmogorov and Karl Popper assume infinite-sequence representations.
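
                To make the symbolic-sequence point concrete, here is a short sketch of the standard construction (the fully chaotic logistic map is my own illustrative choice; its symbolic dynamics under the partition at x = 1/2 has Kolmogorov-Sinai entropy ln 2 per step): estimate the entropy rate from block entropies of the symbolic sequence.

                import math
                from collections import Counter

                # Symbolic sequence of the logistic map x -> 4x(1-x): 0 if x < 1/2, else 1.
                x, symbols = 0.2345678, []
                for _ in range(200_000):
                    x = 4.0 * x * (1.0 - x)
                    symbols.append(0 if x < 0.5 else 1)

                def block_entropy(seq, n):
                    """Shannon entropy (in nats) of the empirical n-block distribution."""
                    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
                    total = sum(counts.values())
                    return -sum(c / total * math.log(c / total) for c in counts.values())

                # The entropy rate H(n+1) - H(n) estimates the KS entropy.
                for n in (1, 4, 8):
                    print(n, block_entropy(symbols, n + 1) - block_entropy(symbols, n))  # each ~0.693 = ln 2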

                Of course there is also the classical rationalism of games of chance, working from finite binomial distributions, but there measurement with complete precision becomes possible, and Maxwell's Demon is very much in business. If you insist on viewing reality as a computer, it becomes hackable.

                Dear Flavio,

                during the contest, I advanced a lot in trying to explain the meaning of pi = 1. It is like a mediation between two competitors: the small child and the old theoretical physicist. It is hard stuff to explain that all our pocket calculators are "fake news".

                As my native language is German, it is hard for me to explain this in English.

                I tried here to explain the final theory of everything in quantum biology (finalizing the ideas of A. Turing).

                For some people, of course, this is "shocking": https://www.youtube.com/watch?v=EIBjt_5TU0U

                (The language should be understandable also by non-scientists.)

                All good wishes.

                Manfred

                Dear Flavio,

                Your essay gave me an idea. If you have time, I wish to know what you think.

                Indeterminism leads to irreversibility, which leads to entropy. A truly reversible system would require infinite information density. Because we have the arrow of time, that is, entropy, we cannot be in a deterministic universe, since a net increase in entropy requires irreversibility. It does not matter whether this is a quantum or a classical system.

                Thank you,

                Jeff Schmitz

                  Dear Flavio,

                  Thank you for your very interesting essay. Your angle of attack is quite original, and I completely agree with your argument that indeterminism is a matter of interpretation and can be chosen for either quantum or classical theory. Indeed, Laplace's demon is often considered as being exorcised by quantum theory, but its existence already faces difficulties from the mere fact that if the demon is not external to the Universe, it would have to predict its own behaviour, which might lead to logical inconsistencies.

                  Are you familiar with Breuer's theorem? It reminds me of your principle of infinite precision. Breuer, analysing the measurement problem as emerging from self-reference, has shown that no observer can distinguish all the states of a system in which it is contained, irrespective of the nature of the system (classical or quantum) and of the time evolution (deterministic or stochastic). In a paper entitled "Quantum measurement and Gödel's proof", analysing Maxwell's demon, Zwick also presents, as you do, a measurement problem for classical mechanics: "To avoid infinite regress in the description of measurement, and paradoxes of self-reference, one must assume that at some point the perturbation by the measurement can be ignored, or introduce a statistical postulate as an arbitrary addition to the dynamics. If neither is allowed, one can simply accept the need for the two-levelled theory." Do you think that this might be in line with your essay?

                  The "future undecidability" that is described in Gisin's final quote of your essay is an old problem of classical logic, that goes back to Aristotle and "Sea battle tomorrow", his introduction of the notion of contingency. This problem of future contingents was especially studied by the scholastics, who were trying to conciliate Aristotle's logic and the Biblical narratives. Ernst Specker, who was one of the father of the theorem showing "quantum contextuality", was motivated by these scholastic study of "Infuturabilien" when he found a result that will become later the Kochen-Specker theorem.

                  If you ever find some time to do it, I will be glad to have your feedback on my essay, which defends an indeterministic interpretation of quantum theory based on contextuality and an analysis of the measurement problem as self-reference (an analysis that I believe could be extended to classical mechanics, as an incompatibility between absolute universality, measurement as a meta-theoretical process, and full measurability).

                  Best,

                  Hippolyte