Dear Irek,

thank you for your kind feedback! I will have a look at your essay and send you my comments, should I have some.

All good wishes,

Flavio

Dear Flavio,

I found your paper persuasive and powerful. The only content issue I can think of is that a few comments on how your ideas relate to the popular block universe concept would have been nice. I am myself very solidly in the now-is-real column, but I also deeply respect both Einstein's concerns about reconciling foliations across frames, and quantum arguments such as Wheeler-Feynman retarded-advanced photon models. These are relevant since any pre-existence of complete, beginning-to-end world lines implies effectively infinite classical precision of all points in all foliations/slices of the resulting block universe.

One of the most profound and universal aspects of observable physics is a tendency for many, but by no means all, natural phenomena to converge towards well-defined limits. These limits often occur at scales far smaller than our unaided senses can perceive. If physics did not work this way, calculus would never have worked well enough to make modeling this universe worth the trouble.
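
To make this concrete, here is a quick numerical sketch (my own toy illustration, nothing from your paper): a difference quotient for sin(x) marches toward the true derivative cos(x) as the step h shrinks, until finite machine precision halts the convergence. The limit is approached, never instantiated.

import math

# Forward difference quotient for sin(x) at x = 1; the true derivative is cos(1).
x, true = 1.0, math.cos(1.0)
for h in [10.0**-k for k in range(1, 16, 2)]:
    approx = (math.sin(x + h) - math.sin(x)) / h
    print(f"h = {h:.0e}  error = {abs(approx - true):.2e}")

# The error shrinks roughly in proportion to h down to about h = 1e-8, then
# grows again: floating-point round-off caps how closely a real machine can
# approach the ideal limit.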

Thus I read your arguments as emphasizing this collection of limit-approaching processes as the reality, while the limit itself is the fictional goal, at least for processes in the real universe and in computation.

I suspect there is also a powerful anthropic component to why we do this.

That's because all forms of life are about using low-bit models to predict (and thus survive in) high-bit environments, the latter of which are never fully predictable with such models. In the absence of widespread asymptotic convergence, such modeling would become infeasible. Our brains thus have a built-in biological bias towards bit-efficient Platonic idealism, since it gives us a way of approximating reality that is far more computationally efficient than attempting more accurate convergence-level modeling.

Other items: I like your FIQ approach to defining numeric precision.
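
To make the FIQ idea tangible, here is a toy sketch (my own illustration in Python; the actual formalism is in the Gisin and Del Santo papers, not here): a quantity whose first few digits are determined, while every digit beyond them is genuinely indeterminate, so each readout past the determined prefix yields fresh random digits.

import random

class FIQ:
    """Toy finite-information quantity: only the digits in `prefix` are
    definite; anything beyond them is indeterminate and is sampled fresh
    on every readout."""

    def __init__(self, prefix):
        self.prefix = prefix  # e.g. "0.4142": the determined digits

    def readout(self, digits):
        """Read the quantity to `digits` decimal places."""
        determined = len(self.prefix) - 2  # count of digits after "0."
        extra = max(0, digits - determined)
        return self.prefix + "".join(random.choice("0123456789") for _ in range(extra))

x = FIQ("0.4142")
print(x.readout(10))  # e.g. 0.4142719305 -- the last six digits vary run to run
print(x.readout(10))  # e.g. 0.4142083941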

One observation there is that I suspect integers often seduce folks into sloppy thinking about such issues. Integers seem infinitely precise in ways that real numbers can never be. Thus integer thinking seems to enable forms of computation that are, at least for a subset of reality, "infinitely" precise.
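
A two-line demo of that seduction (my sketch): machine integers are exact at any size, while machine "reals" are not.

# Python integers are arbitrary-precision, so this identity holds exactly:
print(10**50 + 1 - 10**50 == 1)   # True

# Floating-point "reals" cannot even represent 0.1 exactly:
print(0.1 + 0.2 == 0.3)           # False
print(0.1 + 0.2)                  # 0.30000000000000004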

However, in an interesting analogy to decoherence in quantum theory, this concept of exact classical precision falls apart badly when the full environment in which such calculations operate is taken into account. In fact, here's a radical suggestion: integer counting is hugely more difficult in classical systems than in quantum systems. Uh... what??

Here's what I mean: When a rest-state helium atom selects (counts) its two electrons, those electrons are really and truly state variants of a single underlying and literally universal object type, the electron. That this is so can be seen in the necessity for quantum mechanics to treat them as objects that cannot be distinguished, giving rise to Fermi and Bose statistics. So: quantum "counts" are real counts of truly identical objects.

In sharp contrast, n classical objects are not and never can be absolutely identical. Thus the only way the seemingly perfectly precise integer counting process of, say, a computer can be attached ("decohered") to the environment to which it applies is for some set of entities to be observed and classified as the "same". This in turn implies the inclusion in the process of a rather sophisticated intelligent observer, one capable of deciding whether a particular time-evolving collection of atoms and energy is or is not an "object" as defined by the pristine counting process.

Thus not only is the object concept itself amorphous and blurry around the edges in both space and time (e.g. is a worn bearing still a bearing?), but all forms of classical counting -- including emphatically that of fingers and toes, since how often does one encounter atomically identical fingers? -- are arguably more observer-dependent than are quantum counting phenomena. Atoms at least can count (select) the number of base-state electrons they need quite precisely, and do so without the need for any intelligent observers.

A final thought is the simplest argument of all: If one assumes the existence of even one infinitely precise number anywhere in the universe, whether physical, such as an object trajectory, or as bits in some remarkable computer, the unavoidable result is that the universe collapses. It cannot do otherwise, since the energy required to instantiate any such number will always be infinite, no matter its physical form.
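
A back-of-envelope version of that argument (my sketch, using Landauer's bound of kT ln 2 joules per bit; Landauer comes up again later in this thread): the energy floor for physically registering a number grows linearly and without bound in its number of bits.

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # room temperature, K
per_bit = k_B * T * math.log(2)    # Landauer minimum per bit, ~2.9e-21 J

for bits in [64, 10**6, 10**12, 10**24]:
    print(f"{bits:.0e} bits -> at least {bits * per_bit:.2e} J")

# An infinitely precise number has infinitely many bits, so its minimum
# instantiation energy is unbounded -- the point made above.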

So again, this time in frustration mode vs. accommodation (of block universes) mode: For at least the century or so since folks figured out for sure that atoms are real, why the bleep do both math and science persist in pretending that every particle possesses the infinite precision needed to make determinism real, when such infinite precision (the very thing your FIQs rule out) is flatly impossible both in theory (e.g. quantum uncertainty) and experimentally?

Bottom line: I think you are hitting all the right notes. Great paper!

    Dear Terry,

    thank you very much for your feedback!

    Indeed, you correctly point out a difficulty of our approach, namely a tension between the time that really passes due to the determination of new physical quantities and relativistic time, which is just another component of the space-time manifold (which indeed is defined in terms of homeomorphisms with the Euclidean space R^n). We are working on this and have already discussed possible solutions.

    However, I am not sure whether I agree with what you then say about the continuous limits:

    "These limits often occurs at scales far smaller than our unaided senses can perceive. If physics did not work this way, the calculus never would have worked well enough for modeling this universe to have been worth the trouble."

    In fact, quantum mechanics occurs at scales far smaller than our unaided senses can perceive, and yet that is where we first ran into the impossibility of thinking in terms of continuity. So I would say that it is the other way around: our human scale made us think that everything is smooth and continuous, but when you observe from much closer you are forced to introduce quantization.

    Thank you once more and all the best,

    Flavio

    Flavio, thanks! I must have worded that badly: I was trying to say exactly what you just said. It's just the huge number of atoms in human-scale objects that fools us into believing in a "continuity" that is actually quantized.

    Dear Flavio,

    As engineers we are taught early on that all the problems we are going to solve are idealizations; likewise, all the laws of physics, classical or otherwise, are idealizations of the problems being solved, theoretical or applied. So I don't know, in that sense, what the real discovery is. Moreover, the problem with the standard QM interpretation (the minimal one) in terms of predictability is fundamentally, vastly different in quality and quantity, so I don't see how your casting doubt on certainty in classical physics helps in QM.

    My solution has been to try to deal with the problem in the only and final way to solve all these problems, that is, to find the correct ontology, which leads to a fairly simple system that can be comprehended easily.

    It is also very baffling for me when all the respectable physicists (not talking about amateur philosophers) here in this contest go on about Turing and Gödel, yet we have been doing physics for the past 400 years with regular math with fairly good success. So it should have been obvious that it is only about a correction to the model and deducing some ontology, that is all, and not about going on a wild goose chase.

    Sorry for being direct, I don't mean to be unfriendly, just my opinion. Thanks

      I should just mention that I don't mean that my system is correct as is, but I think it is highly suggestive of the correct physics, since it closely follows present understanding and the general setup of today's physics while being more fundamental.

      Dear Adel Sadeq, there is a deep conceptual difference between the way engineers (and even experimental scientists) are trained and the foundations we ground our scientific theories upon. You are referring to the unavoidable finiteness of the error bars in actual measurements. Clearly this is not the same as having fundamental limits of determinacy, as I discuss in my essay.

      Dear Flavio,

      Thank you for the reply. Whether there are fundamental limits of determinacy is very hard to determine experimentally, whether classically or in QM, as discussed in the link below by your colleague in the math department, Dr. A. Neumaier. Moreover, I agree with his interpretation as a basis, which my theory sort of confirms, because the probability density is shown to be just the density of energy and momentum, very much like in QFT. Also, EPR is trivial in my idea since the system is inherently a nonlocal theory. Thanks again.

      https://www.physicsforums.com/threads/the-thermal-interpretation-of-quantum-physics.967116/

      Dear Flavio Del Santo,

      Your essay seems to be exactly what the framers of this topic were looking to generate. My essay went in a totally different direction. Your writing style made this dense topic relatable. I was just going to glance at the essay, but ended up reading it in one sitting. Excellent work.

      A topic for a future essay could be a clearer link between quantum mechanics and measurable classical mechanics. Your phase space diagram showed a strong similarity between the two. This relationship gave me an idea about how the square well problem, which famously has opposite results in the classical and quantum cases, could be shown to be similar if measurement were accounted for in the set-up.

      There is also the issue of larger scale events like sound waves, which contain information, but seem independent of the atomic scale.

      Sincerely,

      Jeff Schmitz

        Dear Flavio,

        This is really an impressive and excellent essay. I enjoyed it very much.

        I do not have much to comment on, except to ask some questions about where your essay could lead:

        - There is an interesting trend in mathematics (I find it interesting, at least) which attempts to extend concepts of standard mathematics to fuzzy sets. This is sometimes called fuzzification (see the sketch after this list). I am wondering whether your essay could not be a starting point for a fuzzification of classical mechanics. Note that fuzzification also has the advantage of using fuzzy logic, which has a working procedure to determine entailments and the like (this could be relevant to your notion of cause and effect).

        - Your last optimistic quote and paragraph on the openness of the future remind me of the interpretation of probability by Carl Friedrich von Weizsäcker. As far as I understand his view, probability can only be about the future because the future is "open", to re-use your wording. Are you aware of his work on the matter (like his temporal logic) and, if so, do you have particular thoughts about it?
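
        To give a flavour of the fuzzification idea in the first point (a minimal sketch of textbook fuzzy-set machinery in Python; the names and numbers are illustrative only, nothing from Flavio's essay): membership is graded on [0, 1], and Zadeh's connectives turn logic into arithmetic on those grades.

        def triangular(a, b, c):
            """Membership function of a triangular fuzzy set, rising on [a, b] and falling on [b, c]."""
            def mu(x):
                if x <= a or x >= c:
                    return 0.0
                return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
            return mu

        slow = triangular(0.0, 20.0, 50.0)   # fuzzy set "slow speeds"
        fast = triangular(30.0, 60.0, 90.0)  # fuzzy set "fast speeds"

        v = 35.0
        # Zadeh's connectives: AND = min, OR = max, NOT = 1 - mu.
        print(min(slow(v), fast(v)))   # degree to which v is slow AND fast: ~0.167
        print(max(slow(v), fast(v)))   # degree to which v is slow OR fast: 0.5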

        While your essay mostly focuses on indeterminacy, the essay I have submitted focuses on undecidability and asks a similar question in substance: "Has physics and science ever been decidable?". If you are interested you can read it here: https://fqxi.org/community/forum/topic/3477 .

        Best of luck for the contest.

        Fabien

          Dear Jeff Schmitz,

          thanks very much for your flattering words; I am glad that you liked my essay. I will surely have a look at your essay, to see your different approach.

          I'll also be glad to hear what you have in mind about the square-well problem.

          All the best!

          Flavio

          Dear Fabien,

          thank you for writing and for your kind appreciation of my work.

          Indeed, what you say about the "fuzzification" program is very pertinent and similar to the aims expressed in my essay. Gisin, other collaborators, and I are considering different (constructive) mathematical structures to capture this feature of fuzziness.

          I was not aware of von Weizsäcker's work that you mention. I will surely have a look at it, as I will at your essay, as soon as time allows.

          Thanks once more and all the best,

          Flavio

          Flavio,

          Your essay seems to put the foundational question, the 3 Uns, in a more reasonable light that helps reset the anthropocentric tilt toward objectivity regarding outcomes in physics -- classical and quantum. Furthermore, your concepts are quite accessible. Your use of figure 3, for example, introduces a graphic realism that I feel is missing in the classical world, and perhaps in the realm of subatomic particles as well. A probability factor is always relevant in engineering projects. Hope you have time to give mine a look: https://fqxi.org/community/forum/topic/3396.

          Great job. My rating is your 22nd. Having that many protects you against the 1s given by some with no comments.

          Jim Hoover

            Dear Jim (if I may),

            Thanks for your kind feedback, I appreciate that.

            I will have a look at your essay and leave comments there, should I have some interesting ones.

            Best of luck for the contest and kind regards,

            Flavio

            Dear Del Santo:

            It is a pleasure reading your essay. It is very well presented. Worthy of receiving high grades.

            Fundamentally, we are in agreement. Nature does not work out of two orthogonal worlds of classical mechanics and quantum mechanics. Nature works out of one single undivided space using one set of rules, however complex they may be. However, as an experimentalist, I find that indeterminism in our universe emerges without the need for sophisticated math or philosophical deliberations.

            Please, make time to read my essay and grade it.

            "Complete Information Retrieval: A Fundamental Challenge"

            https://fqxi.org/community/forum/topic/3565

            The universe is fundamentally stochastic because the observable radiations and particles are emergent oscillations/vibrations of the universal cosmic space, which is, at the same time, full of random "background" fluctuations. These fluctuations, however weak they may be, nonetheless perturb any and all other interactions going on in the universe. Since human-initiated measurements are also interactions between our chosen interactants in our apparatus, stochastic changes will be inevitable in every measurement, classical or quantum.

            I am an experimentalist. Our measurements will always be imprecise. When we enumerate the basic steps behind the measurement process, we find that we had been, we now are, and we will always be, information-limited:

            (i) Data are the result of some physical transformation taking place inside the apparatus.

            (ii) The physical transformation in a detectable material always requires some energy exchange between the interactants, the "unknown" and the "known" as the reference interactant.

            (iii) The energy exchange must be guided by some force of interaction operating between the chosen interactants.

            (iv) Since we have started with an unknown universe, from the standpoint of building physics theories, the "known" entities are known only partially, never completely. This also creates an information bottleneck for the "unknown" entity. Note that in spite of innumerable experiments, we still do not know what electrons and photons really are.

            (v) All forces of interaction are distance-dependent. Hence, the interactants must be placed within the range of each other's mutual influence (force-field). The force-field creates the necessary physical "entanglement" between interacting entities for the energy transfer to proceed. In other words, interactants must be "locally regional", within their physical sphere of influence. They must be "entangled" by a perceptible physical force. Our equations are built on such hard causality.

            (vi) The final data in all instruments suffer from a lack of 100% fidelity. This is another permanent problem of imprecision. We can keep reducing the error margin as our technology improves; but we do not know how to completely eliminate this error.

            Many of my earlier papers have also articulated this position. They can be downloaded from:

            http://www.natureoflight.org/CP/

            You can also download the paper: "Next Frontier in Physics--Space as a Complex Tension Field"; Journal of Modern Physics, 2012, 3, 1357-1368 , http://dx.doi.org/10.4236/jmp.2012.310173

            Sincerely,

            Chandra.

              Your essay was interesting. However, it raises at least one major question that I don't think was sufficiently addressed.

              For example, in figure 3, why could the (indeterministic) graph on the right not be considered to be deterministic? It appears that it is interpreted to be indeterministic simply because events could have gone in a different direction.

              Let me state this in a broader context, which is also the reason that I did not directly address unpredictability in my essay. A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak. It certainly is not the same thing proposed by undecidability and uncomputability. It is weak because it does not actually refute determinism as a philosophical concept; it simply seems to refute it from what appears to be an epistemic perspective.

              Subsequently, you seem to refute the argument that it is merely epistemic by appealing to something akin to Heisenberg's uncertainty principle, which is to say that attaining the necessary precision (to make accurate predictions) is fundamentally impossible. In other words, it is not merely a limitation of our formal systems or measurement equipment; it is inherent. To state this yet another way, our measurements are intrinsically fuzzy, not due to lack of accuracy, but rather due to the fact that the measured quantities are actually fuzzy and thus, to some extent, indeterminate.

              I would then conclude that this isn't really a refutation of determinism, and thus an argument for indeterminism, but rather something more like an argument for a fuzzy determinism that appears, at least to some extent, indeterminate because it is postulated that reality is intrinsically fuzzy.

              As a result, the question that I don't think was sufficiently addressed is precisely the question of determinism, because a fuzzy determinism is not indeterministic; it is just fuzzy. So, does God play dice or not?

                Hi Flavio,

                Thanks for the really well written essay. The principle of infinite precision is super useful in highlighting the tension, whether ontological or epistemic, between formal axiomatic systems and physics. I'd never read Max Born's essay that you quote on page 2, but it's an excellent reference!

                I completely sympathise with your idea of finite information quantities and loved the reference to Gisin's recent work. As you mentioned, Landauer has written extensively on this---I leaned heavily on Landauer's perspectives in my essay.

                You wrote that

                "In this view, the orthodox interpretation of classical physics can be regarded as a deterministic completion of an indeterministic model, in terms of hidden variables".

                The finite information quantities seem to suggest that indeterministic models are fundamental. For example, a lot of powerful claims can be made from statistical physics or thermodynamics. Would these kinds of ideas occupy a more fundamental, rather than derivative, role in a theory of finite information quantities?

                Again, thanks for the great essay! It was a pleasure to read!

                Cheers,

                Michael

                  Dear Jason W Steinmetz, it seems that your criticisms are based on misconceptions. You write: "A popular interpretation of unpredictability is that the initial conditions cannot be known with sufficient precision to construct accurate predictions. I find this interpretation to be, at the very least, weak". I disagree that this is popular, for that matter, but most importantly I disagree that it is weak. It is a totally legitimate way of conceiving plausible indeterministic theories. The other main one, which would perhaps make you happier, is a fundamentally stochastic dynamics. Please have a look at my paper, where we discuss these distinctions: https://arxiv.org/abs/1909.03697 (section "FORMS OF INDETERMINISM IN CLASSICAL AND QUANTUM PHYSICS"). The argument is NOT merely epistemic, for the initial conditions in my model are supposed to be indeterminate (as opposed to unknown, as an experimentalist would claim).

                  Determinism is absolute; making it even a little bit fuzzy makes it untenable. Think of any chaotic system and you will see this. Moreover, we have Bell's inequalities: if there is as little as one bit of randomness in the universe, we can create unbounded randomness (again, see the paper linked above).
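
                  To see the chaos point numerically (a quick sketch of my own, not from our paper): two trajectories of the logistic map whose initial conditions differ by one part in 10^12 become fully uncorrelated within a few dozen steps, so even the slightest fuzziness in the initial condition destroys effective determinism.

                  # Logistic map x -> 4x(1-x), fully chaotic on [0, 1].
                  x, y = 0.3, 0.3 + 1e-12   # initial conditions differing by 1e-12
                  for step in range(1, 61):
                      x, y = 4 * x * (1 - x), 4 * y * (1 - y)
                      if step % 10 == 0:
                          print(f"step {step}: |x - y| = {abs(x - y):.3f}")
                  # The gap roughly doubles each step, so by around step 40 the two
                  # trajectories differ by order 1: total loss of predictability.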

                  Best regards,

                  Flavio Del Santo

                  Dear Chandra,

                  Thank you for appreciating my work and taking the time to comment and relate it to your work. I will have a look and possibly come back with comments on your essay.

                  Best wishes,

                  Flavio

                  Dear Michael,

                  Thanks for your kind feedback. Indeed, I think our essays have a great deal in common, although yours is more concerned with the limits of computation. (I am about to comment more precisely on the dedicated page of your essay.)

                  As for your question: indeed, statistical (indeterministic) laws would be just a reflection of indeterministic microscopic behaviour, so this would surely attribute a more fundamental value to them. Moreover, this would ease the tension between microscopic determinism and statistical irreversibility. Gisin and I have gone through this problem in our paper (https://arxiv.org/abs/1909.03697), where we stated: "In the perspective of the alternative indeterministic interpretation (based on FIQs), instead, the deterministic law of the ideal gas, ruling the behavior at the macroscopic level, emerges as a novel and not reducible [law]. Notice that the historical debate on the apparent incompatibility between Poincaré's recurrence theorem and Boltzmann's kinetic theory of gases does not arise in the framework of FIQs."

                  Thank you once more and all the best,

                  Flavio