Dear Israel,

I am flattered by your compliments on the potential you see in my modest work. I am glad that you find it interesting. I will look at your essay as well and comment if I have something to say about it.

Best regards,

Flavio


Dear Dr. Flavio Del Santo,

Thank you for presenting a wonderful essay. Your words, "classical physics (i.e., Newton's mechanics and Maxwell's electrodynamics) would allow, in principle, to predict everything with certainty", are very much true in the case of the Dynamic Universe Model. Many of its predictions came true.

Let me hope you will have time to visit my essay and give it a CRITICAL examination: "A properly deciding, Computing and Predicting new theory's Philosophy".

Best Regards

=snp

Dear Flavio

I came here to comment after seeing the popularity of your essay, and by reading the abstract I understood your viewpoint, at least I think so. Then, going through the successive posts, I encountered your comment: ``the whole point of my view is that there is ALWAYS an element of genuine randomness. If you accept my alternative interpretation, even the length of a metal rod would not be fully determined.'' I appreciate your attitude, and your thought is in the right direction (in my personal opinion). I wrote this, obviously, because it somewhat matches mine. However, I may humbly opine that you have assumed a lot and then expressed your view.

I believe you could have got to the crux of the problem by asking much more elementary questions related to what you probably learnt in school. It is about units, measurements and calculus. Have you ever wondered, when you write, following Cauchy, ``infinitely small quantity'' in the definition of the derivative, whether this phrase makes sense at all, or whether it only makes sense when you write ``infinitely small quantity with respect to another quantity''? Think about it.
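
To be concrete, the standard limit form of the derivative (just the textbook formula, written out here for reference) is

$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},$$

and nothing in the formula itself says with respect to what the increment h is supposed to be ``small''; that missing reference is exactly my point.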

Consider your metal rod. Can you tell me whether it is ``long or short''? Yes, my question is meaningless and you cannot answer, because you need to know ``with respect to what''. So, the comparison of two lengths (or similar physical quantities) only provides a number, and with this number you do mathematics and draw inferences. However, as you say, there is ALWAYS a randomness, or, as I say, an ``inexactness'', in measurement. Einstein himself ignored this fact by writing that this problem can be overcome by choosing subdivided rods (smaller units). If you don't believe me, I can give you the reference.

In spite of such practical inexactness, equations in physics are written as if they were exact. From Newton to Cauchy and other great men of science, including Einstein, all have treated science as exact, at least in writing (maybe not in attitude). Otherwise, science could have been very different, without any singularity problems or the difficulties regarding the classical-quantum distinction.

Anyway, I do not want to bore you further with my childish comments, because I have already written a childish essay, posted here. I would rather conclude by wishing you luck in winning this essay contest (of which I think you are the most probable winner).

Regards

Abhishek Majhi

Dear Flavio,

Your ideas are brilliant and the essay itself is extremely well prepared. You touch on the deep problems related to real numbers, the absence of infinities in the Universe, nondeterminism and unpredictability, and your arguments are impeccable. The only question which remains is why all this is the way it is, which is the domain of the mystical Theory of Everything. Arguments like yours and others led me to the conclusion that such a theory has to be grounded in uncomputability. But how? I sketched some out-of-the-mainstream ideas in my essay.

Best regards,

Irek

    Dear Irek,

    thank you for your kind feedback! I will have a look at your essay and send you my comments, should I have some.

    All good wishes,

    Flavio

    Dear Flavio,

    I found your paper persuasive and powerful. The only content issue I can think of is that a few comments on how your ideas relate to the popular block universe concept would have been nice. I am myself very solidly in the now-is-real column, but I also deeply respect both Einstein's concerns about reconciling frame foliations and quantum arguments such as Wheeler-Feynman retarded-advanced photon models. These are relevant since any pre-existence of complete, beginning-to-end world lines implies effectively infinite classical precision of all points in all foliations/slices of the resulting block universe.

    One of the most profound and universal aspects of observable physics is a tendency for many, but by no means all, natural phenomena to converge towards well-defined limits. These limits often occur at scales far smaller than our unaided senses can perceive. If physics did not work this way, the calculus never would have worked well enough for modeling this universe to have been worth the trouble.

    Thus I read your arguments as emphasizing this collection of limit-approaching processes as the reality, while the limit itself is the fictional goal, at least for processes in the real universe and in computation.

    I suspect there is also a powerful anthropic component to why we do this.

    That's because all forms of life are about using low-bit models to predict (and thus survive in) high-bit environments, the latter of which are never fully predictable with such models. In the absence of widespread asymptotic convergence, such modeling would become infeasible. Our brains thus have a built-in biological bias towards bit-efficient Platonic idealism, since it gives us a way of approximating reality that is far more computationally efficient than attempting more accurate convergence-level modeling.

    Other items: I like your FIQ approach to defining numeric precision.

    One observation there is that I suspect integers often seduce folks into sloppy thinking about such issues. Integers seem infinitely precise in ways that real numbers can never be. Thus integer thinking seems to enable forms of computation that are, at least for a subset of reality, "infinitely" precise.

    However, in an interesting analogy to decoherence in quantum theory, this concept of exact classical precision falls apart badly when the full environment in which such calculations operate is taken into account. In fact, here's a radical suggestion: Integer counting is hugely more difficult in classical systems than in quantum systems. Uh... what??

    Here's what I mean: When a rest-state helium atom selects (counts) its two electrons, those electrons are really and truly state variants of a single underlying and literally universal object type, the electron. That this is so can be seen in the necessity for quantum mechanics to treat them as objects that cannot be distinguished, giving rise to Fermi and Bose statistics. So: Quantum "counts" are real counts of truly identical objects.

    In sharp contrast, n classical objects are not and never can be absolutely identical. Thus the only way by which the seemingly perfectly precise integer counting process of, say, a computer can be attached ("decohered") to the environment to which it applies is for some set of entities to be observed and classified as the "same". This in turn implies the inclusion in the process of a rather sophisticated intelligent observer, one that is capable of deciding whether a particular time-evolving collection of atoms and energy is or is not an "object" as defined by the pristine counting process.

    Thus not only is the object concept itself amorphous and blurry around the edges in both space and time (e.g. is a worn bearing still a bearing?), all forms of classical counting -- including emphatically that of fingers and toes, since how often does one encounter atomically identical fingers? -- are arguably more observer-dependent than are quantum counting phenomena. Atoms at least can count (select) the number of base-state electrons they need quite precisely, and do so without the need for any intelligent observers.

    A final thought is the simplest argument of all: If one assumes the existence of even a single infinitely precise number anywhere in the universe, whether physically real, such as an object trajectory, or encoded as bits in some remarkable computer, the unavoidable result is that the universe collapses. It cannot do otherwise, since the energy required to instantiate any such number will always be infinite, no matter its physical form.

    So again, this time in frustration mode vs. accommodation (of block universes) mode: For at least the century or so since folks figured out for sure that atoms are real, why the bleep do both math and science persist in pretending that every particle possesses the infinite precision needed to make determinism real, when such precisions, such FIQs, are flatly impossible both in theory (e.g. quantum uncertainty) and experimentally?

    Bottom line: I think you are hitting all the right notes. Great paper!

      Dear Terry,

      thank you very much for your feedback!

      Indeed, you correctly point out a difficulty of our approach, which is a tension between the time that really passes due to the determination of new physical quantities and the relativistic time, which is just another component of the space-time manifold (which indeed is defined in terms of homeomorphisms with the Euclidean space R^n). We are working on this and have already discussed possible solutions.

      However, I am not sure whether I agree with what you then say about the continuous limits:

      "These limits often occurs at scales far smaller than our unaided senses can perceive. If physics did not work this way, the calculus never would have worked well enough for modeling this universe to have been worth the trouble."

      In fact, quantum mechanics occurs at scales far smaller than our unaided senses can perceive, and yet it is there that we found the first impossibility of thinking in terms of continuity. So I would say that it is the other way around: our human scale made us think that everything is smooth and continuous, but when you observe much more closely you are forced to introduce quantization.

      Thank you once more and all the best,

      Flavio

      Flavio, thanks! I must have worded that badly: I was trying to say exactly what you just said: it is just the huge number of atoms in human-scale objects that fools us into believing in a "continuity" that is actually quantized.

      Dear Flavio,

      As engineers we are taught early on that all the problems we are going to solve are idealizations; likewise, all the laws of physics, classical or otherwise, are idealizations of the problems being solved, theoretical or applied. So I don't know, in that sense, what the real discovery is. Moreover, the problem with the standard QM interpretation (the minimal one) in terms of predictability is fundamentally and vastly different in quality and quantity, so I don't see how your casting doubt on the uncertainty in classical physics helps in QM.

      My solution has been to deal with the problem in the only and final way that can solve all these problems: to find the correct ontology, which leads to a fairly simple system that can be comprehended easily.

      It is also very baffling for me when all the respectable physicists (not talking about amateur philosophers) here in this contest go on about Turing and Gödel, yet we have been doing physics for the past 400 years with regular math with fairly good success. So it should have been obvious that it is only about a correction to the model and deducing some ontology, that is all, and not about going on a wild goose chase.

      Sorry for being direct, I don't mean to be unfriendly, just my opinion. Thanks

        I should just mention that I don't mean that my system is correct as is, but I think it is highly suggestive of the correct physics, since it closely follows present understanding and the general setup of today's physics, yet is more fundamental.

        Dear Adel Sadeq, there is a deep conceptual difference between the way engineers (and even experimental scientists) are trained and the foundations we ground our scientific theories upon. You are referring to the unavoidable finiteness of the error bars in actual measurements. Clearly this is not the same as having fundamental limits of determinacy, as I discuss in my essay.

        Dear Flavio,

        Thank you for the reply. Whether there are fundamental limits of determinacy is very hard to determine experimentally, whether classical or quantum, as discussed in the link below by your colleague in the math department, Dr. A. Neumaier. Moreover, I agree with his interpretation as a basis, and my theory sort of confirms it, because the probability density is shown to be just the density of energy and momentum, very much like in QFT. Also, EPR is trivial in my idea, since the system is inherently a nonlocal theory. Thanks again.

        https://www.physicsforums.com/threads/the-thermal-interpretation-of-quantum-physics.967116/

        Dear Flavio Del Santo,

        Your essay seems to be exactly what the framers of this topic were looking to generate. My essay went in a totally different direction. Your writing style made this dense topic relatable. I was just going to glance at the essay, but ended up reading it in one sitting. Excellent work.

        A topic for a future essay could be a clearer link between quantum mechanics and measurable classical mechanics. Your phase space diagram showed a strong similarity between the two. This relationship gave me an idea about how the square well problem, which famously has opposite results in the classical and quantum cases, could be shown to be similar if measurement were accounted for in the set-up.

        There is also the issue of larger scale events like sound waves, which contain information, but seem independent of the atomic scale.

        Sincerely,

        Jeff Schmitz

          Dear Flavio,

          This is really an impressive and excellent essay. I enjoyed it very much.

          I have not much to comment but to ask some questions with regards to where your essay could lead:

          - There is an interesting trend in mathematics (I find it interesting at least) which attempts to extend concepts of standard mathematics to fuzzy sets. This is sometimes called fuzzification. I am wondering whether your essay could be a starting point for a fuzzification of classical mechanics. Note that fuzzification also has the advantage of using fuzzy logic, which has a working procedure to determine entailments and the like (this could be relevant to your notion of cause and effect). A toy sketch of what I have in mind appears after these two points.

          - Your last optimistic quote and paragraph on the openness of the future remind me of the interpretation of probability by Carl Friedrich von Weizsacker. As far as I understand his view, probability can only be about the future because the future is "open", to re-use your wording. Are you aware of his work on the matter (like his temporal logic) and, if so, do you have particular thoughts about it?
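
          To make the first point more concrete, here is a toy sketch in Python (my own illustration, not drawn from your essay or from any particular fuzzy-logic library; the function names and numbers are invented for the example): a length is represented by a triangular membership function instead of a single real number, and conjunction is handled by the usual min rule of fuzzy logic.

              def triangular(a, b, c):
                  # Membership function that peaks at b and is zero outside [a, c].
                  def mu(x):
                      if x <= a or x >= c:
                          return 0.0
                      if x <= b:
                          return (x - a) / (b - a)
                      return (c - x) / (c - b)
                  return mu

              # "The rod is about 1.000 m", with the vagueness spread over +/- 2 mm.
              about_one_metre = triangular(0.998, 1.000, 1.002)

              # "The rod fits the slot", modelled as a second fuzzy predicate.
              fits_slot = triangular(0.995, 0.999, 1.001)

              x = 1.0005  # one candidate value of the length, in metres
              degree = min(about_one_metre(x), fits_slot(x))  # fuzzy AND via the min rule
              print(f"degree of 'about 1 m AND fits the slot' at x = {x} m: {degree:.2f}")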

          While your essay mostly focuses on indeterminacy, the essay I have submitted focuses on undecidability and asks a similar question in substance: "Has physics and science ever been decidable?". If you are interested you can read it here: https://fqxi.org/community/forum/topic/3477 .

          Best of luck for the contest.

          Fabien

            Dear Jeff Schmitz,

            thanks very much for your flattering words, I am glad that you liked my essay. I will surely have a look at yours, to see your different approach.

            I'll also be glad to hear what you have in mind about the square-well problem.

            All the best!

            Flavio

            Dear Fabien,

            thank you for writing and for your kind appreciation of my work.

            Indeed, what you say about the "fuzzification" program is very pertinent and close to the aims expressed in my essay. Gisin, other collaborators and I are considering different (constructive) mathematical structures to capture this feature of fuzziness.

            I was not aware of von Weizsacker's work that you mention. I will surely have a look at it, as I will do with your essay as soon as time allows.

            Thanks once more and all the best,

            Flavio

            Flavio,

            Your essay seems to put the foundation's question, the 3 Uns, in a more reasonable light that helps to reset the anthropocentric tilt toward objectivity regarding outcomes in physics -- classical and quantum. Furthermore, your concepts are quite accessible. Your use of Figure 3, for example, I feel, introduces a graphic realism that is missing in the classical world, and perhaps in the realm of subatomic particles as well. A probability factor is always relevant in engineering projects. Hope you have time to give mine a look: https://fqxi.org/community/forum/topic/3396.

            Great job. My rating is your 22nd. Having that many protects you against the 1s given by some with no comments.

            Jim Hoover

              Dear Jim (if I may),

              Thanks for your kind feedback, I appreciate that.

              I will have a look at your essay and leave comments there, should I have some interesting ones.

              Best of luck for the contest and kind regards,

              Flavio

              Dear Del Santo:

              It is a pleasure reading your essay. It is very well presented. Worthy of receiving high grades.

              Fundamentally, we are in agreement. Nature does not work out of two orthogonal worlds of Classical Mechanics and Quantum Mechanics. Nature is working out of one single undivided space using one set of rules, however complex they may be. However, as an experimentalist, I find that indeterminism in our universe emerges without the need of sophisticated math or philosophical deliberations.

              Please, make time to read my essay and grade it.

              "Complete Information Retrieval: A Fundamental Challenge"

              https://fqxi.org/community/forum/topic/3565

              The universe is fundamentally stochastic because the observable radiations and particles are emergent oscillations/vibrations of the universal cosmic space, which is, at the same time, full of random "background" fluctuations. These fluctuations, however weak they may be, nonetheless perturb any and all other interactions going on in the universe. Since human-initiated measurements are also interactions between our chosen interactants in our apparatus, stochastic changes will be inevitable in every measurement, classical or quantum.

              I am an experimentalist. Our measurements will always be imprecise. When we enumerate the basic steps behind the measurement processes, we find that we have been, we now are, and we will always be, information limited:

              (i) Data are some physical transformation taking place inside the apparatus.

              (ii) The physical transformation in a detectable material always requires some energy exchange between the interactants: the "unknown" and the "known", which serves as the reference interactant.

              (iii) The energy exchange must be guided by some force of interaction operating between the chosen interactants.

              (iv) Since we have started with an unknown universe, from the standpoint of building physics theories, the "known" entities are known only partially, never completely. This also creates an information bottleneck for the "unknown" entity. Note that in spite of innumerable experiments, we still do not know what electrons and photons really are.

              (v) All forces of interaction are distance dependent. Hence, the interactants must be placed within the range of each other's mutual influence (force field). The force field creates the necessary physical "entanglement" between interacting entities for the energy transfer to proceed. In other words, interactants must be "locally regional" within their physical sphere of influence. They must be "entangled" by a perceptible physical force. Our equations are built on such hard causality.

              (vi) The final data in all instruments suffer from a lack of 100% fidelity. This is another permanent problem of imprecision. We can keep on reducing the error margin as our technology improves, but we do not know how to completely eliminate this error.

              Many of my earlier papers have also articulated this position. They can be downloaded from:

              http://www.natureoflight.org/CP/

              You can also download the paper: "Next Frontier in Physics--Space as a Complex Tension Field"; Journal of Modern Physics, 2012, 3, 1357-1368 , http://dx.doi.org/10.4236/jmp.2012.310173

              Sincerely,

              Chandra.