Dear Jim (if I may),

Thanks for your kind feedback; I appreciate it.

I will have a look at your essay and leave comments there, should I have some interesting ones.

Best of luck for the contest, and kind regards,

Flavio

Dear Del Santo:

It was a pleasure reading your essay. It is very well presented and worthy of high grades.

Fundamentally, we are in agreement. Nature does not work out of two orthogonal worlds of Classical Mechanics and Quantum Mechanics. Nature works out of one single undivided space using one set of rules, however complex they may be. However, as an experimentalist, I find that indeterminism in our universe emerges without the need for sophisticated math or philosophical deliberation.

Please, make time to read my essay and grade it.

"Complete Information Retrieval: A Fundamental Challenge"

https://fqxi.org/community/forum/topic/3565

The universe is fundamentally stochastic because the observable radiations and particles are emergent oscillations/vibrations of the universal cosmic space, which is, at the same time, full of random "background" fluctuations. These fluctuations, however weak they may be, nonetheless perturb any and all other interactions going on in the universe. Since human-initiated measurements are also interactions between our chosen interactants in our apparatus, stochastic changes will be inevitable in every measurement, classical or quantum.

I am an experimentalist. Our measurements will always be imprecise. When we enumerate the basic steps behind the measurement process, we find that we have been, are now, and will always be information-limited:

(i) Data are the product of some physical transformation taking place inside the apparatus.

(ii) The physical transformation in a detectable material always requires some energy exchange between the interactants: the "unknown" and the "known", which serves as the reference interactant.

(iii) The energy exchange must be guided by some force of interaction operating between the chosen interactants.

(iv) Since we have started with an unknown universe, from the standpoint of building physics theories, the "known" entities are known only partially, never completely. This also creates an information bottleneck for the "unknown" entity. Note that in spite of innumerable experiments, we still do not know what electrons and photons really are.

(v) All forces of interaction are distance-dependent. Hence, the interactants must be placed within the range of each other's mutual influence (force-field). The force-field creates the necessary physical "entanglement" between interacting entities for the energy transfer to proceed. In other words, interactants must be "locally regional" within their physical sphere of influence. They must be "entangled" by a perceptible physical force. Our equations are built on such hard causality.

(vi) The final data in all instruments suffer from a lack of 100% fidelity. This is another permanent problem of imprecision. We can keep reducing the error margin as our technology improves, but we do not know how to eliminate this error completely.
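To make point (vi) concrete, here is a minimal numerical sketch (the "true value" and the Gaussian noise level are hypothetical stand-ins for an unknown interactant and instrument infidelity): averaging ever more samples shrinks the error margin, but never to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0   # the "unknown" interactant's actual value (hypothetical)
sigma = 0.05       # assumed instrument noise level (hypothetical)

for n in (10, 1_000, 100_000):
    samples = true_value + sigma * rng.normal(size=n)
    std_err = samples.std(ddof=1) / np.sqrt(n)   # shrinks like 1/sqrt(n)
    print(f"n = {n:>7}: estimate = {samples.mean():.5f}, margin = {std_err:.2e}")

# The margin falls as 1/sqrt(n) but is nonzero for every finite n:
# better technology reduces the imprecision; it never eliminates it.
```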

Many of my earlier papers have also articulated this position. They can be downloaded from:

http://www.natureoflight.org/CP/

You can also download the paper: "Next Frontier in Physics--Space as a Complex Tension Field"; Journal of Modern Physics, 2012, 3, 1357-1368, http://dx.doi.org/10.4236/jmp.2012.310173

Sincerely,

Chandra.

    Your essay was interesting. However, it raises at least one major question that I don't think was sufficiently addressed.

    For example, in figure 3, why could the (indeterministic) graph on the right not be considered to be deterministic? It appears that it is interpreted to be indeterministic simply because events could have gone in a different direction.

    Let me state this in a broader context, which is also the reason that I did not directly address unpredictability in my essay. A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak. It certainly is not the same thing proposed by undecidability and uncomputability. It is weak because it does not actually refute determinism as a philosophical concept; it simply seems to refute it from what appears to be an epistemic perspective.

    Subsequently, you seem to refute the argument that it is merely epistemic by appealing to something akin to Heisenberg's uncertainty principle, which is to say that attaining the necessary precision (to make accurate predictions) is fundamentally impossible. In other words, it is not merely a limitation of our formal systems or measurement equipment; it is inherent. To state this yet another way, our measurements are intrinsically fuzzy, not due to a lack of accuracy, but due to the fact that they are actually fuzzy and thus, to some extent, indeterminate.

    I would then conclude that this isn't really a refutation of determinism, and thus an argument for indeterminism, but rather something more like an argument for a fuzzy determinism that appears, at least to some extent, indeterminate because reality is postulated to be intrinsically fuzzy.

    As a result, the question that I don't think was sufficiently addressed is precisely the question of determinism, because a fuzzy determinism is not indeterministic; it is just fuzzy. So, does God play dice or not?

      Hi Flavio,

      Thanks for the really well-written essay. The principle of infinite precision is super useful in highlighting the tension, whether ontological or epistemic, between formal axiomatic systems and physics. I'd never read Max Born's essay that you quote on page 2, but it's an excellent reference!

      I completely sympathise with your idea of finite information quantities and loved the reference to Gisin's recent work. As you mentioned, Landauer has written extensively on this---I leaned heavily on Landauer's perspectives in my essay.

      You wrote that

      "In this view, the orthodox interpretation of classical physics can be regarded as a deterministic completion of an indeterministic model, in terms of hidden variables".

      The finite information quantities seem to suggest that indeterministic models are fundamental. For example, a lot of powerful claims can be made from statistical physics or thermodynamics. Would these kinds of ideas occupy a more fundamental, rather than derivative, role in a theory of finite information quantities?

      Again, thanks for the great essay! It was a pleasure to read!

      Cheers,

      Michael

        Dear Jason W Steinmetz, it seems that your criticisms are based on misconceptions. You write: "A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak". I disagree that this is popular, for that matter, but most importantly that it is weak. It is a perfectly legitimate way of thinking of plausible indeterministic theories. The other main one, which would perhaps make you happier, is a fundamentally stochastic dynamics. Please have a look at my paper, where we discuss these distinctions, https://arxiv.org/abs/1909.03697 (section "FORMS OF INDETERMINISM IN CLASSICAL AND QUANTUM PHYSICS"). The argument is NOT merely epistemic, for the initial conditions are, in my model, supposed to be indeterminate (as opposed to unknown, as an experimentalist would claim).

        Determinism is absolute: making it even a little bit fuzzy makes it untenable. Think of any chaotic system and you will see this. Moreover, we have Bell's inequalities: if there is even less than one bit of randomness in the universe, we can turn it into unbounded randomness (again, see my paper above).
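        To make the chaotic-system point concrete, here is a minimal numerical sketch, using the logistic map as an illustrative stand-in for "any chaotic system": two states differing by one part in 10^12 become completely uncorrelated within a few dozen steps, so a determinism that is even slightly fuzzy retains no predictive power.

```python
# Logistic map x -> 4x(1-x): fuzz the initial condition by 1e-12.
x, y = 0.4, 0.4 + 1e-12
for step in range(1, 61):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")

# The separation roughly doubles each step (Lyapunov exponent ln 2), so
# after ~40 steps the trajectories are as far apart as two random points.
```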

        Best regards,

        Flavio Del Santo

        Dear Chandra,

        Thank you for appreciating my work and taking the time to comment and relate it to your own. I will have a look and possibly come back with comments on your essay.

        Best wishes,

        Flavio

        Dear Michael,

        Thanks for your kind feedback. Indeed, I think our essays have a great deal in common, although yours is more concerned with the limits of computation. (I am about to comment more precisely on the dedicated page of your essay.)

        As for your question, indeed, statistical (indeterministic) laws would be just a reflection of an indeterministic microscopic behaviour, so this would surely attribute a more fundamental status to them. Moreover, this would ease the tension between microscopic determinism and statistical irreversibility. Gisin and I have gone through this problem in our paper (https://arxiv.org/abs/1909.03697), where we stated: "In the perspective of the alternative indeterministic interpretation (based on FIQs), instead, the deterministic law of the ideal gas, ruling the behavior at the macroscopic level, emerges as a novel and not reducible [...]. Notice that the historical debate on the apparent incompatibility between Poincaré's recurrence theorem and Boltzmann's kinetic theory of gases does not arise in the framework of FIQs."

        Thank you once more and all the best,

        Flavio

        Dear Flavio,

        The refusal of actual infinities goes back to Aristotle, and those who followed always thought of measurement in terms of rational numbers with finite representations. And as the inverse square laws were established, classical determinism came to be understood in the same way, with laws restricted to quadratic equations, or conic sections, with rational solutions.

        When Poincaré went beyond that to transversals, and the symbolic sequences that constrain them, he was exploring a new physics of chaotic dynamics, beyond what was possible for Laplace and classical rationalism. Those infinite sequences give you the Kolmogorov-Sinai entropy, as a measure of entropy production, and thereby unpredictability. The axioms for probability of both Kolmogorov and Karl Popper assume infinite-sequence representations.
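        As a minimal sketch of that point (taking the logistic map as the chaotic system and the standard partition at 1/2, both illustrative choices), the entropy rate estimated from finite blocks of the symbolic sequence converges to ln 2, the map's Kolmogorov-Sinai entropy:

```python
import numpy as np
from collections import Counter

# Symbolic dynamics of the logistic map x -> 4x(1-x), whose
# Kolmogorov-Sinai entropy is ln 2 per iteration.
x, symbols = 0.3141592653589793, []
for _ in range(200_000):
    x = 4 * x * (1 - x)
    symbols.append(0 if x < 0.5 else 1)   # generating partition at 1/2

def block_entropy(seq, k):
    """Shannon entropy (in nats) of the length-k blocks of seq."""
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum())

for k in range(1, 6):
    h = block_entropy(symbols, k + 1) - block_entropy(symbols, k)
    print(f"entropy rate from {k + 1}-blocks: {h:.4f} nats (ln 2 = {np.log(2):.4f})")
```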

        Of course there is also the classical rationalism of games of chance, working from finite binomial distributions, but there measurement with complete precision becomes possible, and Maxwell's Demon is very much in business. If you insist on viewing reality as a computer, it becomes hackable.

        Dear Flavio,

        During the contest, I advanced a lot in trying to explain the meaning of pi=1. It is like a mediation between two competitors: the small child and the old theoretical physicist. It is hard stuff to explain that all our pocket calculators are "fake news".

        As my native language is German, it is hard for me to explain in English.

        Here I tried to explain the final theory of everything in quantum biology (finalizing the ideas of A. Turing).

        For some people, of course, this is "shocking": https://www.youtube.com/watch?v=EIBjt_5TU0U

        (The language should be understandable also by non-scientists.)

        All good wishes.

        Manfred

        Dear Flavio,

        Your essay gave me an idea. If you have time, I wish to know what you think.

        Indeterminism leads to irreversibility, which leads to entropy. A truly reversible system would require infinite information density. Because we have the arrow of time given by entropy, we cannot be in a deterministic universe, since a net increase in entropy requires irreversibility. It does not matter if this is a quantum or classical system.

        Thank you,

        Jeff Schmitz

          Dear Flavio,

          Thank you for your very interesting essay. Your angle of attack is quite original, and I completely agree with your argument that indeterminism is a matter of interpretation and can be chosen for both quantum and classical theory. Indeed, Laplace's demon is often considered as having been exorcised by quantum theory, but its existence already faces difficulties from the mere fact that, if the demon is not external to the Universe, it might predict its own behaviour, which might lead to logical inconsistencies.

          Are you familiar with Breuer's theorem? It reminds me of your principle of infinite precision. Breuer, analysing the measurement problem as emerging from self-reference, has shown that no observer can distinguish all the states of a system in which it is contained, irrespective of the nature of the system (classical or quantum) and of the time evolution (deterministic or stochastic). In a paper entitled "Quantum measurement and Gödel's proof", analyzing Maxwell's demon, Zwick also presents, as you do, a measurement problem for classical mechanics: "To avoid infinite regress in the description of measurement, and paradoxes of self-reference, one must either assume that at some point the perturbation by the measurement can be ignored, or introduce a statistical postulate as an arbitrary addition to the dynamics. If neither is allowed, one can simply accept the need for the two-levelled theory." Do you think that this might be in line with your essay?

          The "future undecidability" that is described in Gisin's final quote of your essay is an old problem of classical logic, that goes back to Aristotle and "Sea battle tomorrow", his introduction of the notion of contingency. This problem of future contingents was especially studied by the scholastics, who were trying to conciliate Aristotle's logic and the Biblical narratives. Ernst Specker, who was one of the father of the theorem showing "quantum contextuality", was motivated by these scholastic study of "Infuturabilien" when he found a result that will become later the Kochen-Specker theorem.

          If you ever find some time to do it, I will be glad to have your feedback on my essay, which defends an indeterministic interpretation of quantum theory based on contextuality and an analysis of the measurement problem as self-reference (an analysis that I believe could be extended to classical mechanics, as an incompatibility between absolute universality, measurement as a meta-theoretical process, and full measurability).

          Best,

          Hippolyte

            Dear Jeff,

            Thanks for sharing your further ideas; I am very glad that my work stimulated you to have new ones ;)

            All the best,

            Flavio

            You are absolutely correct that my "criticisms are based on misconceptions."

            You wrote an interesting and well-reasoned essay based on the (legitimate) premise that the existence of what cannot be determined amounts to a refutation of determinism. Subsequently, since some things are indeterminate, they cannot be predicted, which is technically correct. I simply made the case that this does not refute determinism unless, as you postulated, these things are inherently indeterminate and thus determined by "nothing".

            It certainly appears that the concepts of what is (ontology) and what can be known (epistemology) are inextricably linked and their exact fundamental difference ineluctably fuzzy.

            I wish you well in the contest.

            Dear Hippolyte,

            thank you for your kind words; I am glad that you found some of my ideas interesting.

            I am especially thankful for the references you pointed out to me. In fact, I was aware neither of Breuer's theorem nor of Zwick's paper, but they look of the utmost interest to me. I will surely scrutinise their ideas. Offhand, from what you wrote, I am not sure that the classical measurement problem stems from the same motivations as mine (I never considered the self-reference problem, which seems central to Zwick's argument), but it is surely nice to see that others arrived at similar ideas from different paths.

            I will gladly read your essay and comment on its dedicated page soon.

            All the best,

            Flavio

            Dear Flavio,

            What an interesting topic. I was happy to see that you have a contribution in the contest. The problem of the origin of randomness that is not merely epistemic is a big puzzle to me. The investigation of indeterministic classical physics is interesting in order to see where the structural/conceptual differences between classical and quantum physics lie, and to recognize that it is neither the randomness itself nor the collapse part of the measurement problem (which I am not so sure is a real problem) that makes the difference.

            However, I am not sure your approach can reach beyond an epistemic randomness (a hidden-variable theory). If epistemic limits should have an influence on the ontological status of things - and I would be willing to follow that - that would need a good explanation, as I - and I think most physicists - have been trained to accept Laplace's view and in fact to imagine the underlying world to be like that. I call this 'simplistic realism' in my essay.

            In my essay I study the conceptual structure of scientific theories, specifically of physics. From there, one additional source of randomness emerges, which I would like to share. In the view I put forward, quantities and objects are only definable within closed systems, i.e. if systems and objects are separable from the environment. If forces outside the system were constantly shaking it around, no law or concept could ever manifest in that system. But separability is necessarily only ever an approximation. Hence the deterministic laws hold only approximately. The environment would create small random fluctuations impeding strictly deterministic laws, as in the sketch below. (Maybe one could connect this to Tejinder's theory in this contest.)
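            (A minimal numerical sketch of this separability point, taking a harmonic oscillator as the "system" and small random kicks, with hypothetical magnitudes, as the residual environmental coupling: the deterministic law holds only approximately for the open copy, and the two trajectories drift apart.)

```python
import numpy as np

rng = np.random.default_rng(2)
dt, steps, kick = 0.01, 5_000, 1e-4   # hypothetical step size and kick strength

# Two copies of the same oscillator law dx/dt = v, dv/dt = -x:
# one perfectly closed, one weakly shaken by its environment.
x_c, v_c = 1.0, 0.0   # closed (ideally separable) system
x_o, v_o = 1.0, 0.0   # open system with residual coupling
for _ in range(steps):
    v_c -= x_c * dt; x_c += v_c * dt                         # semi-implicit Euler
    v_o += -x_o * dt + kick * rng.normal(); x_o += v_o * dt  # ... plus a kick
print(f"t = {steps * dt:.0f}: closed x = {x_c:+.4f}, open x = {x_o:+.4f}, "
      f"drift = {abs(x_c - x_o):.2e}")

# No finite degree of isolation makes the deterministic law exact for
# the open system; the environment leaves a small stochastic residue.
```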

            I do not think this explains true randomness, nor do I think it is in conflict with classical physics. That is fine for me. The goal of my essay was, in part, to reconcile our realistic imagination of the world with our epistemic necessities.

            Luca

            Hi Flavio,

            I take it this excellent essay more or less summarizes your development of Gisin's application of constructive maths to classical mechanics? I'd be interested to know how you might see Brouwer's intuitionistic philosophy in relation to this mathematically indeterminate approach.

            As I understand it, his intuitionism was derived from Kantian intuition (Anschauung, intuitus) as intuiting/apprehending/perceiving the forms of sensibility, space, and time given in empirical (phenomenal) experience. From that perspective we intuit/perceive phenomenal patterns in our empirical experience of the world (thus information is physical!), and the constructive mathematics is based on that empirically intuited pattern perception. The formal, intersubjective communication of these empirical patterns (or information) is effected in that constructive mathematics, for which classical mechanics thus becomes necessarily indeterminate, at least from this intuitionist (also phenomenological) perspective.

            What is objectively real in this sense are the phenomenal patterns themselves (or real patterns cf. Dennett) as given in empirical/phenomenal experience, rather than say, the Laplace demon's idealized external world of point particles with infinitely precise, initial physical conditions. Does this mean intuitionism, in your view, must reject the notion of a classical world defined as 'objective external reality' in favour of the actual empirical experience of such a merely potential reality? Or can potentia remain an idealized unobservable continuum from which our discrete actualitas emerges?

            Best regards,

            Malcolm Riddoch

            Je suis, nous sommes Wigner! (I am, we are Wigner!)

              Dear Flavio,

              Congratulations on the well-organized essay on the new notion of randomness. As you visited my essay, I really enjoyed learning about this. In the previous essay contest, I wrote an essay on information that was eventually published in the book. In that essay, the FIQ is not discussed. However, one bit is not enough for it to be well defined; therefore, we seem to need more than two bits for a well-defined FIQ.

              Also, from your perspective, how do you understand integrated information theory (IIT)? In the mathematical formulation of IIT, a conditional probability related to a certain randomness is implicitly assumed. What do you think?

              Best wishes,

              Yutaka

                Flavio - Thanks. A coherent, well-written essay bringing a key, and incorrect, premise of classical physics to light. Your discussion and critique of the principle of infinite precision is impressive and should, if widely read, put the notion of classical determinism to bed permanently.

                Where I was disappointed, however, was in the very limited perspective your essay gives to the much broader epistemological issues of incompleteness and undecidability. These issues also point to invalid premises in our conceptual views and understanding of the world. I've taken a stab at these broader issues and would be very grateful if you gave my essay a look.

                Sincerely - George Gantz: The Door That Has No Key. https://fqxi.org/community/forum/topic/3494

                  Dear George, thank you for your feedback and your constructive criticisms. Indeed, I believe that undecidability and perhaps (but I am not really sure) even incompleteness play a central role in the foundations of science. My work and my priorities, hence the focus of my essay, are however on determinism and predictability at the moment. But I will be glad to read what you have to say about it.

                  All the best,

                  Flavio

                  Dear Yutaka,

                  thanks for your kind appreciation!

                  As for your question, I am unfortunately not familiar at all with IIT.

                  Best wishes,

                  Flavio