Dear Flavio,
I found your paper persuasive and powerful. The only content issue I can think of is that a few comments on how your ideas relate to the popular block universe concept would have been nice. I am myself very solidly in the now-is-real column, but I also deeply respect both Einstein's concerns about reconciling frame foliations and quantum arguments such as Wheeler-Feynman retarded-advanced photon models. These are relevant since any pre-existence of complete, beginning-to-end world lines implies effectively infinite classical precision of all points in all foliations/slices of the resulting block universe.
One of the most profound and universal aspects of observable physics is a tendency for many, but by no means all, natural phenomena to converge towards well-defined limits. These limits often occur at scales far smaller than our unaided senses can perceive. If physics did not work this way, calculus would never have worked well enough to make modeling this universe worth the trouble.
Thus I read your arguments as emphasizing this collection of limit-approaching processes as the reality, while the limit itself is the fictional goal, at least for processes in the real universe and in computation.
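To make the limit-approaching point concrete, here is a small sketch of my own (not from your paper): the finite-difference slope of f(x) = x² at x = 1 approaches the calculus limit 2 as the step shrinks, which is exactly the kind of convergence that makes calculus a worthwhile model of the world.

```python
def slope(f, x, h):
    """Two-point finite-difference approximation of the derivative of f at x."""
    return (f(x + h) - f(x)) / h

# The approximations converge toward the idealized limit 2.0 as h shrinks;
# the limit itself is never computed, only approached.
for h in (1e-1, 1e-3, 1e-6):
    print(h, slope(lambda x: x * x, 1.0, h))
```

Of course, push h small enough and floating-point granularity ruins the convergence, which rather nicely anticipates the finite-precision theme below.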
I suspect there is also a powerful anthropic component to why we do this.
That's because all forms of life are about using low-bit models to predict (and thus survive in) high-bit environments, the latter of which are never fully predictable with such models. In the absence of widespread asymptotic convergence, such modeling would become infeasible. Our brains thus have a built-in biological bias towards bit-efficient Platonic idealism, since it gives us a way of approximating reality that is far more computationally efficient than attempting more accurate convergence-level modeling.
Other items: I like your FIQ approach to defining numeric precision.
One observation there is that I suspect integers often seduce folks into sloppy thinking about such issues. Integers seem infinitely precise in ways that real numbers can never be. Thus integer thinking seems to enable forms of computation that are, at least for a subset of reality, "infinitely" precise.
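Here is a toy illustration of my own of that seductiveness: in a language with arbitrary-precision integers, integer arithmetic really is exact at any magnitude, while floating-point "reals" lose information almost immediately.

```python
# Integer arithmetic: exact even at 31 digits.
exact = 10**30 + 1 - 10**30
print(exact)   # 1

# Floating-point arithmetic: the added 1 falls below the representable
# granularity of 1e30 and simply vanishes.
approx = 1e30 + 1 - 1e30
print(approx)  # 0.0
```

The integer result feels "infinitely" precise, which is precisely the seduction.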
However, in an interesting analogy to decoherence in quantum theory, this concept of exact classical precision falls apart badly when the full environment in which such calculations operate is taken into account. In fact, here's a radical suggestion: Integer counting is hugely more difficult in classical systems than in quantum systems. Uh... what??
Here's what I mean: When a rest-state helium atom selects (counts) its two electrons, those electrons are really and truly state variants of a single underlying and literally universal object type, the electron. That this is so can be seen in the necessity for quantum mechanics to treat them as objects that cannot be distinguished, giving rise to Fermi and Bose statistics. So: Quantum "counts" are real counts of truly identical objects.
In sharp contrast, n classical objects are not and never can be absolutely identical. Thus the only way the seemingly perfectly precise integer counting process of, say, a computer can be attached ("decohered") to the environment to which it applies is for some set of entities to be observed and classified as the "same". This in turn implies the inclusion in the process of a rather sophisticated intelligent observer, one capable of deciding whether a particular time-evolving collection of atoms and energy is or is not an "object" as defined by the pristine counting process.
Thus not only is the object concept itself amorphous and blurry around the edges in both space and time (e.g. is a worn bearing still a bearing?), all forms of classical counting -- including emphatically that of fingers and toes, since how often does one encounter atomically identical fingers? -- are arguably more observer-dependent than are quantum counting phenomena. Atoms at least can count (select) the number of base-state electrons they need quite precisely, and do so without the need for any intelligent observers.
A final thought is the simplest argument of all: If one assumes the existence of even one infinitely precise number anywhere in the universe, whether physical, such as an object trajectory, or digital, as bits in some remarkable computer, the unavoidable result is that the universe collapses. It cannot do otherwise, since the energy required to instantiate any such number will always be infinite, no matter its physical form.
So again, this time in frustration mode vs. accommodation (of block universes) mode: For at least the century or so since folks figured out for sure that atoms are real, why the bleep do both math and science persist in pretending that every particle possesses the infinite precision needed to make determinism real, when such precisions, such FIQs, are flatly impossible both in theory (e.g. quantum uncertainty) and experimentally?
Bottom line: I think you are hitting all the right notes. Great paper!