Hello there, I came across this paper during the Christmas holidays, through Facebook.
I have a point or two about the issue which I'd like to discuss, ideally with the author himself, Giovanni.
( BTW: I'm Italian, so if he should come across this comment he may, of course, reply in Italian )
Not knowing whether a comment written two months after the last one published will even be read, I'm a little reluctant to write out my point of view extensively, but I'll take some time to present a few keywords and basic ideas. I hope I'm not doing this for nothing... for once that I encounter a VERY interesting article, it HAD to be on a page that probably no one ever visits any more! ;-))
======================
COMMENT to the paper:
======================
Firstly, I find the space-time inference absurd: to postulate a universe that gives itself the hard work of building an entire 4D (or more!!) continuum with highly complex mathematical properties is insane, to say the least; that's why I completely agree with the historical/operational explanation of our "space-time way of thinking" given in the paper. Also, a scheme of the type "emitter-detector" can be helpful for investigating some aspects of the entire problem, reducing it to its essentials. Thus: reductionism, approved ;-)
(THE BEEF): My understanding is that 'space-time' is a complex construct, most probably EMERGING as an average reference frame for classical objects, with no underlying reality of its own. A continuous, local (as opposed to "non-local" in Bell's sense) classical-relativistic reality, though, holds for EVENTS in space-time and cannot be eluded.
But EVENTS are classical phenomena; they are described by 'macro-states'. A macro-state consists of an incredibly numerous collection of micro- (or quantum) states; it is a portion of configuration space that ends up coherently describing a shared overall picture.
This "shared overall picture" is what we call the 'classical world', and it obeys relativity, causality, and locality (in the sense of Bell).
In my picture, however, the shared overall picture is only a kind of "average", consisting of a huge number of micro-states, all of them describing alternate "little" stories about the numerous constituents of the macro-system, à la Feynman's "sum over histories".
The expansion of one portion of configuration space, necessary to produce a "new" (or, better: "different") macro-state, generally happens in a measuring device; the coherent description of a definite macro-state with its classical properties (position, time, momentum, energy, etc.), which we call the "outcome", is classical and could be highly non-linear.
As a matter of fact, a micro-state (describing a single quantum object, or phenomenon) isn't observable; for it to become observable, the micro-state must necessarily undergo amplification in a measuring device, of the "avalanche expansion" type (snowball amplification).
The measurement process will then eventually produce an outcome, which is a classical one.
No quantum micro-state can ever be observed, only its amplification.
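To make the "avalanche expansion" idea concrete, here is a minimal sketch of a photomultiplier-style cascade; the branching factor and number of stages are purely illustrative assumptions of mine, not numbers from the paper, but they show how a single micro-event can be amplified to macroscopic particle counts:

```python
# Toy avalanche: one micro-event triggers a fixed number of secondaries per
# stage, as in a photomultiplier-style cascade. Numbers are illustrative only.
branching = 4    # hypothetical secondaries produced per particle per stage
stages = 34      # hypothetical number of amplification stages

particles = branching ** stages  # total particles after the full cascade
print(f"{particles:.2e} particles after {stages} stages")
# prints "2.95e+20 particles after 34 stages"
```

So a few dozen doubling-like stages are enough to turn one quantum event into the ~10^20-particle macro-states discussed below.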
More than that: my idea is that only certain amplifications can be observed, and so can be christened "real": in those observable configurations a large enough number of micro-states coherently describes a classical, physically self-consistent picture (*) that most other micro-states in the universe will share from that very point onwards.
On one side, the measuring device is clearly entangled with the totality of the micro-states involved in the measurement process in the laboratory; on the other side, it also communicates the classical result to the rest of the classical world.
Once we rename "communicates the classical result" with the periphrasis "is coherently described by a very large number of micro-states, which differ only by a tiny quantity but give the same overall picture, all entangled with observers across the rest of the universe", we get that the measuring system is quantum-mechanically entangled with everything else on the other side as well, but mainly via macro-states.
==========================
[1] Some curiosities tend to reinforce this view, in my opinion: in certain carefully prepared experiments, e.g., a particle is observed in two locations at the same time, or other "oddities" can be produced (superconductivity, superfluidity, Bose-Einstein condensates, etc...).
But these "oddities" are achievable only under conditions special enough that they can happen only through particular arrangements of detectors, in which great attention is paid to the segregation of the experiment from the rest of the universe.
In this case, it seems that oddities can be sustained as long as little or virtually no communication happens with the world outside the laboratory, i.e. via segregation from the rest of the universe.
Outside of a laboratory environment, "oddities" generally do not survive long enough to be observed by passers-by before degenerating into dull, definite 'classical' states.
==========================
During the measurement process, a number of "stories" can theoretically be connected with the micro-state examined, which will eventually produce alternate outcomes. Only rarely (see [1]) can these outcomes both exist at the same time.
But why is that so? In my picture it is conceivable that different outcomes are related to different portions of the configuration space of the system (the micro-state to be analyzed + "antenna" micro-states in the measuring device), and perhaps the amplification process favours one group of stories over another; the entire universe is then entangled with the option (the group of micro-stories) that has the most "offspring", outnumbering the other possible outcomes.
This, and only this, group of stories, or macro-state
[[ apart from exceptions in which an equilibrium can be maintained (with significant effort) between several possible outcomes, in the sense of [1] ]]
becomes "real", in the sense that we can observe it, and every physical system relying upon the measurement result will be entangled with all the micro-states produced by the amplification, via an incredibly large number of quantum interactions.
We are talking astronomical numbers here, as a single detection generally involves a number N of particles of the order of 10^20, with a portion of configuration space that can expand by a factor of the order not of N, but of N factorial!!
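As a back-of-the-envelope check of those magnitudes (a sketch assuming the expansion really scales like N!), Stirling's approximation ln(N!) ≈ N·ln(N) − N shows just how far beyond intuition that factorial takes us:

```python
import math

def log10_factorial(n: float) -> float:
    """Stirling's approximation: ln(n!) ≈ n·ln(n) − n, converted to base 10."""
    return (n * math.log(n) - n) / math.log(10)

N = 1e20  # order-of-magnitude particle count in a single detection
print(f"log10(N!) ≈ {log10_factorial(N):.3g}")
# prints "log10(N!) ≈ 1.96e+21", i.e. N! is roughly 10 to the power 2·10^21
```

That is, not a number with 20 digits, but a number with about 2·10^21 digits.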
The collection of micro-states comprised in a group of stories that differ only by a tiny little bit (ħ, for example?!?!) from one another describes the very same picture, only slightly differently. It is a fundamental property of micro-state statistics in a snowball expansion that the outcomes will concentrate around a bell-shaped curve of the Gaussian type.
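That Gaussian concentration is just the central limit theorem at work. Here is a toy simulation (the micro-state counts are arbitrary assumptions of mine) in which each "outcome" is the sum of many tiny, independent micro-contributions:

```python
import random
import statistics

random.seed(42)
n_micro = 1000     # micro-contributions per amplified outcome (arbitrary)
n_outcomes = 5000  # repeated "measurements" (arbitrary)

# Each outcome sums many tiny, nearly identical micro-contributions.
outcomes = [sum(random.uniform(-1.0, 1.0) for _ in range(n_micro))
            for _ in range(n_outcomes)]

# Central limit theorem: the outcomes cluster in a Gaussian with
# mean ≈ 0 and standard deviation ≈ sqrt(n_micro / 3) ≈ 18.3.
print(f"mean = {statistics.fmean(outcomes):.2f}, "
      f"stdev = {statistics.stdev(outcomes):.2f}")
```

Whatever the distribution of the individual micro-contributions, the aggregate concentrates around a single sharp Gaussian peak, which is the "classical" value.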
These outcomes relate to one another, creating, in my picture, a network of ever-growing degrees of freedom, in which some properties can be interpreted as localizations in space and in time.
Lastly: I have indirect evidence that this approach could be the right one for a better model of QM, and some of it concerns the nature of space-time on large scales. This line of thinking goes so far as to account for galaxy stability without dark matter, or even to change significantly our understanding of the universe's dynamics at very large scales (structures of the order of magnitude of the "Great Wall", filaments, or even the cosmic expansion rate, the dark energy problem and the like).
For the next time, however, my goal would be to show that in a space-time constructed this way, a limit would emerge: an insuperable velocity only for macro-states transporting information, whereas everything else could be thought of as non-local in space and timeless (or: "non-local" in time).
My aim is to build a special-relativistic space-time of EVENTS out of nothing that is itself thought of as local. Is that crazy enough? Any comments?
Kind regards, just in case... ;-)