Daniel Sudarsky

  • Joined Dec 18, 2018
  • Dear Antoine,

    Thank you very much for your kind and thoughtful reply.

    It would indeed be nice to have more opportunities to discuss this and other issues.

    Let me just react here to one of your statements:

    " I remain convinced that the price to pay to have fundamental non-linearity is much higher than people think. Nicolas Gisin's formulation of the no-go in terms of faster than light signaling is probably the most impressive, but in the end I do not think faster than light signaling is the main issue. In essence, I believe the problem is more one of predictability, and ability to separate systems into subsystems for all practical purposes. "

    I agree with you that linearity makes things manageable as far as our ability to analyze things is concerned. In particular, as you note, the 'for all practical purposes' (FAPP) separability is extremely convenient. But why should physics be that way? We already know (as shown for instance by Bell's theorem and related results) that locality, a premise that seems closely tied to separability, is not a fundamental feature of nature. It took us humans a long time to come to terms with that. Why, then, should separability hold at the practical level in general? In fact, it seems to me that if things in nature were so, we would have to contemplate some kind of fundamental conspiracy: the world is nonlocal, there is no separability, but nonetheless the laws of nature are such as to hide those facts from us to a dramatic and universal extent.

    Does it not seem more natural to think that we happen to live in a region of the universe where that separability works at the FAPP level to a very large extent, simply because it is only under such conditions that life might evolve? We would then be deceived by a clearly understandable but contingent condition of our immediate environment, rather than facing the fundamental realities of nature. No conspiracies there.

    Looking forward to seeing you somewhere and to continuing our exchange.

    In the meanwhile all the best,

    Daniel

  • Dear Antoine,

    I very much enjoyed your essay (as I have previously enjoyed your papers and our discussions).

    So, let me take the opportunity to ask for your reaction to my challenge to one particular aspect of your posture. It concerns an issue which we have previously discussed to some extent, but I think your essay gives us the opportunity to get a bit deeper into it.

    I am referring to the robustness of the linearity requirement. In fact, I would like to challenge the generality of the following statement: ``The price is stiff: non-linearity is allowed at the wave-function level, but it has to vanish exactly at the density matrix level".

    I know you have acknowledged explicitly that N. Gisin's no-go theorem (like all no-go theorems) is only as strong as its assumptions (which, as far as I understand, include no consideration of possible limitations on what can in fact be measured, other than those imposed by causality itself, as well as of other possibilities that you raised yourself, like deviations from Born's rule).
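
    Just to fix what is at stake (this is my paraphrase of the standard signaling argument, not a quotation from your essay): the theorem exploits the fact that distinct ensembles can share a single density matrix, e.g.

    $$ \rho \;=\; \tfrac{1}{2}\,|0\rangle\langle 0| + \tfrac{1}{2}\,|1\rangle\langle 1| \;=\; \tfrac{1}{2}\,|+\rangle\langle +| + \tfrac{1}{2}\,|-\rangle\langle -| \,. $$

    If the evolution is linear at the density matrix level, both preparations evolve identically and remain indistinguishable; if instead a nonlinear map acts on the individual wave functions, the two decompositions will generically evolve into different ensembles, and since Alice can remotely steer which decomposition Bob's half of an entangled pair falls into by her choice of measurement basis, the difference would amount to a superluminal signal. Hence the requirement you quote, that any nonlinearity cancel exactly at the density matrix level.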

    What I want to consider next are some concrete reasons to doubt the strict validity of that linearity as a characteristic of viable quantum theories.

    Let's start by assuming that some relativistic version of quantum theory involving spontaneous collapse is the adequate description of nature, to a very good approximation, at least in regimes where spacetime can be described by general relativity and the objects we are describing in quantum terms are so small and light that their gravitational effects can be ignored, i.e. they are just test objects as far as gravity is concerned. If we depart substantially from the last condition there are of course good reasons to think that we would eventually enter a regime where a full quantum theory of gravity (QG) will be needed, and that quite possibly the standard notions of spacetime would be lost. At that point our collapse theory will of course become meaningless as well, as it is formulated on the assumption that we have available a suitable notion of time (the nonrelativistic versions of spontaneous collapse theories are Schrödinger-like equations prescribing the time evolution of quantum states) or of spacetime (as used, say, in the relativistic versions such as Bedingham's, Tumulka's or Pearle's).

    So let me consider, instead of jumping right into that QG regime, an idealized one-parameter set of situations connecting the regime where gravity might be ignored, passing through one where it might be suitably approximated by its Newtonian description (as you have considered yourself), to one requiring that full QG regime where the very notion of spacetime itself might be gone. Along that one-parameter set of situations I expect we would encounter situations where spacetime retains its usual meaning but, still, truly general relativistic effects become relevant. As you know very well, general relativity involves fundamental nonlinearities. Moreover, GR implies that the state of matter, by affecting the spacetime geometry itself, does also, in general, affect its causal structure. Thus, it seems that there should be, along that one-parameter set of situations, points where we have both: notions of spacetime (so we might still be in the realm where one could sensibly use some version of spontaneous collapse theory) and, still, some nonlinearities starting to arise from the GR aspects of the problem.
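
    To be explicit about the nonlinearity I have in mind (standard GR, stated here only for emphasis): Einstein's equations,

    $$ G_{\mu\nu}[g] \;=\; 8\pi G \, T_{\mu\nu} \,, $$

    are nonlinear in the metric $g_{\mu\nu}$, so that if $g^{(1)}$ and $g^{(2)}$ are both solutions, $g^{(1)} + g^{(2)}$ generically is not; there is simply no gravitational superposition principle to inherit.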

    That would, it seems to me, imply that at some point the linearity must be broken, that the superposition principle will have to give way to something else. The superposition principle would then survive as a good approximation valid in situations where gravity could be ignored, or at least where it could be treated within some linear approximation, such as that provided by Newton's theory, or even linearized GR. In those situations the theory would indeed reduce to one satisfying linearity at the density matrix level. In more general situations it would not.

    Now, how could something like this avoid being ruled out by Gisin's no-go theorem? One possibility is that the experimental arrangements envisioned in the theorem (as, in the pertinent case, they would certainly involve important gravitational effects) would be impossible to realize as a result of the modified theory itself. That is, the arrangements devised so that Alice could send a faster-than-light signal to Bob might involve, for instance, the setting up of some sort of superposition of energy-momentum distributions, corresponding to spacetime superpositions, which according to the theory would be simply impossible to achieve. One might imagine, for instance, that according to the theory a collapse might have to take place, with probability 1, before Alice and Bob are able to complete the assembly of the experimental setup. In fact there is precedent for the impossibility of certain types of measurements (not involving gravitation) that at first sight seemed quite feasible [see for instance Y. Aharonov & D. Albert, ``Can we make sense out of the measurement process in relativistic quantum mechanics?", PRD 24, 359 (1981) and R. Sorkin, ``Impossible measurements on quantum fields", in Directions in General Relativity: Proceedings of the 1993 International Symposium, Maryland, Vol. 2, 293 (1993)].

    Another possibility, taking us a bit outside what I had been considering, is that the attempt to create the setup would involve, in a sense, creating something like a ``spacetime causal structure which is not well defined", so that in the end whether or not a signal was sent faster than light between Alice and Bob would remain undecidable. Actually, things might turn out differently, and the causal structure might end up being emergent, defined depending on the ``outcome" of the experimental development itself. Something of this nature is exemplified in [``Large fluctuations in the horizon and what they teach us about Quantum Gravity and Entropy", R. Sorkin & D. S., CQG 16, 3835 (1999); ``When is S = 1/4 A?", A. Corichi & D. S., MPLA 17, 1431 (2002); and ``A Schrodinger Black Hole and its Entropy", D. S., MPLA 17, 1047 (2002)].

    Another interesting option is that in the context at hand (and as a result of the fact that in the semiclassical description one would be ignoring the quantum nature of the gravitational degrees of freedom), Born's rule would end up being effectively modified, as you suggested yourself (though in what I took you to consider a highly improbable development; please correct me if I read it wrongly). In other words, to expect the unexpected does not seem out of place when dealing with the interface of gravitation and quantum theory.

    In fact, it seems to me that several of the steps used in the derivations presented in the essay, in particular those that involve taking averages over ensembles, ought to be revised in the kind of gravitational context I am describing, for various reasons. To start with, we do not even know how to make sense of the sum of two spacetimes, much less of ``the average of various spacetimes", and even if we did, it seems quite likely that the intrinsic nonlinearities of GR would invalidate some of the usual averaging procedures.

    It is of course a tall order to actually propose such a theory, but accepting that we might have to consider breaking with linearity (even at the level of the density matrix equation) might be the first necessary step.

    I clearly do not expect you (or anybody at this time) to have any clear and definite answers to the questions raised by the above considerations (I might be wrong, of course), but I am just curious to see what your first reaction would be.

    Again, congratulations on a very nice essay (which I will acknowledge with a very high mark).

    • Dear Flavio,

      This is quite an interesting essay.

      I was particularly intrigued by the alternative classical theory possessing some features analogous to quantum theory. The implementation of the ontological indeterminacy aspect is noteworthy.

      However, as I understand the scheme, there seem to be some problematic aspects concerning the minimal requirements for measurement.

      Let's recall that the framework is meant to deal with the in-principle infinite number of digits that determine a real number (associated with a physical quantity), and to pass to a modified scheme in which only a portion of those digits have well defined values.

      So let me start with the second requirement:

      2. Intersubjectivity: Different agents can access the same measurement outcomes.

      This seems to make a lot of sense at first sight; however, it must be emphasized that it is only so if the different agents are truly referring to the SAME quantity. That means, among other things, the quantity at the same time (or spacetime event). Consider now, for instance, a particle moving in one dimension. We would like to focus on the value of its velocity (in a given frame), but of course if we want to assign it a value we should specify at which time, or at what position. Once those data are specified, it makes all the sense to look at the corresponding value of the velocity... but if time and position are subject to the same indefiniteness as everything else, how can we make sure that two agents refer to the same quantity (i.e. the velocity at time t or at position x)?
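
      To make the worry quantitative (a back-of-the-envelope estimate of my own, not something taken from the essay): suppose the time label itself is only determined up to an indeterminacy $\delta t$. Then ``the velocity at time $t$" is itself fuzzy by roughly

      $$ \delta v \;\simeq\; |\dot v|\,\delta t \;=\; \frac{|F|}{m}\,\delta t \,, $$

      so for any accelerated particle, two agents whose nominally identical time labels differ within $\delta t$ would legitimately report different digits, and requirement 2 cannot even be stated without further qualification.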

      Related concerns arise when considering the first requirement:

      1. Stability: Consecutive measurements of the same quantity leave the already determined digits unchanged.

      To start with, it seems to me that this can only be sensibly demanded if the quantity in question is one not expected, classically, to depend on time (i.e. a quantity which in the standard version would have vanishing Poisson brackets with the Hamiltonian); otherwise a change can be expected even in standard situations. So perhaps one should at least place a bound on the time elapsed between consecutive measurements, but how to do so precisely if such precision is discarded ab initio?
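
      To be explicit about the classical statement I am invoking (standard Hamiltonian mechanics): for a phase-space quantity $Q(q,p,t)$ one has

      $$ \frac{dQ}{dt} \;=\; \{Q, H\} + \frac{\partial Q}{\partial t} \,, $$

      so only for quantities with $\{Q,H\}=0$ and no explicit time dependence is it reasonable to demand that consecutive measurements find the already determined digits unchanged; for any other quantity the digits should be expected to drift even classically.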

      Finally, similar concerns apply to the last postulate:

      3. Precision improvability: With more accurate measurement apparatuses, more digits become available (with the former two properties).

      It seems we might properly talk about an improvement of the measurement only as long as it is a measurement of the same quantity at the same time, etc., and, as we have seen, those notions seem to be made problematic by the basic ideas underlying the proposal.

      So perhaps Flavio could clarify these issues for us.

      In any event, as I said, the general idea is quite interesting.

      • Dear Vladimir,

        First of all, thanks for your comments.

        Now let me ask you something. In connection with my suggestion to consider a certain possibility:

        «My proposal: untheoretizability. That is, such position would contemplate the ultimate untheoretizability of nature.»

        You said:

        I strongly disagree with this conclusion. A breakthrough requires a new ontology and dialectics in the spirit of N. Kuzansky "coincidence of opposites", taking into account all the problems in understanding matter, a dialectic of the material and the ideal.

        Let me first clarify that it was not a conclusion but just the consideration of an unpalatable possibility. However, the point is that you are convinced such is not the case. Can I ask what arguments make you so certain about this? I.e., why should we think that any such program (based on dialectical considerations or anything else for that matter) will eventually succeed?

        Best regards,

        Daniel

      • This is certainly a very interesting essay; however, I find myself a bit baffled by a certain recurring theme that I consider disturbing, and that is the implicit anthropocentricity of some of the ideas.

        To exemplify my concern let us focus on a topic that appears prominently in the essay:

        Gödel's lesson, regarding the undecidability of the truth values of certain statements within an axiomatic (mathematical) system. What Gödel showed is that there are certain true statements that cannot be proven by following the ``axiomatic-logical path", but there is no issue at all regarding the truth of those statements. However, here the author seems to give a dramatically strong role to an aspect of the question that seems rather anthropocentric: the fact that there are meaningful statements for which no man could ever ascertain their truth value is somehow taken as casting doubt on the statement having a truth value in itself. In fact, the very possibility of producing Gödel's result relies on a notion of the ``truth value" of a statement (within an axiomatic system) that is independent of the notion of proof.

        This is taken further regarding computability: the fact that no man or man-made machine (for which a Turing machine is an idealized characterization) can compute a certain number in a finite time (a criterion further restricted to take into account the finiteness of the time made available by cosmology) is somehow taken to mean that such a number does not exist. Thus, existence is made strongly dependent on men (or similar thinking organisms). This is, in my view, a step back from the lessons often considered to have started with N. Copernicus, which shook our conception of being at the center of the universe, and were further enhanced by Darwin's theory of evolution, which shook our conviction that we were the center of creation [with newly adapted versions of the idea, in my view equally erroneous, which somehow see us humans as the ultimate goal of evolution: i.e. to create the simple cells and take life into more and more complex forms so that the world might end up with beings like us].

        Here physical laws are required to be such that their predictions are computable.

        This posture seems rather problematic to me. This is even so when taken in the realm of mathematics: should we take the view that, say, existence proofs of solutions to differential equations with given initial data are meaningless unless they are constructive? We are often content to know the solution exists and is unique, while the question of actually finding the solution is taken as one of quite a different nature.

        To adopt Landauer's view of physical laws as necessarily tied to the capacities of a computer, even the most conceivably versatile version thereof (say a superefficient universal Turing machine, or even a quantum version of it), is to place us humans at the center once again. This time not merely at the center of the universe, but at the basis of the very essence of existence. I think we can all imagine a world where there are no sentient beings, no computers, and nothing like that. In fact, our own theories of cosmology indicate that for a very long time that was precisely the state of the universe. Thus, unless one adopts a teleological posture, it seems that we must accept that the emergence of the conditions that made beings like us possible (i.e. the formation of galaxies, stars, solar systems, life, and the emergence of intelligence as a successful adaptation) are mere contingent facts, and that the universe could easily be conceived as having gone down a different path.

        I think it is hard to dispute the notion that physical laws can limit what is, in principle, computable; but one must recognize that the very notion of what is computable (say in terms of Turing machines and the like) has a very strong anthropocentric component (what the machines we can devise can compute), and thus the posture that computability limits physics is, in a sense, going back to placing ourselves at the center.

        It is natural to expect that the extent of the things we might be able to know is strongly anthropocentric; it is quite different to claim that the same applies to the world out there itself.

        I should say, however, that in certain respects what is considered here resonates with one idea I considered in my own essay, but I think we part ways dramatically at considerations like ``...reality is the total sum of what sentient beings can actually measure or observe (e.g. classical bits of information and rational numbers)...". Similarly, information is a notion that acquires meaning in a context where we have taken for granted the existence of sentient beings, who might store it in devices which, in idealized terms, they describe using the notion of bits; it seems hard to give a meaning to information in the absence of such beings. Placing information at the center, as in Wheeler's catchy phrase ``it from bit", is again a return to a world view that is admittedly more sophisticated than the older ones, but nonetheless clearly anthropocentric.

        As I said, I am convinced there is a world out there, and there is a question of the extent to which we can, through our own very human theoretical constructions, produce accurate descriptions thereof. Here the posture seems to make the very existence of the world out there dependent on us.

      • It seems to me there are various problematic aspects with the ideas discussed here.

        For one, it has long been known that theories of a massive graviton suffer from serious pathologies, including the Boulware-Deser ghost and a discontinuity with general relativity in the limit where the graviton mass goes to zero.
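
        For reference (standard background on massive gravity, not something spelled out in the work under discussion): at the linearized level the only ghost-free mass term is the Fierz-Pauli one, up to normalization conventions,

        $$ \mathcal{L}_{\rm FP} \;=\; -\frac{m^2}{4}\left( h_{\mu\nu}h^{\mu\nu} - h^2 \right), \qquad h \equiv \eta^{\mu\nu}h_{\mu\nu} \,, $$

        whose $m \to 0$ limit nonetheless fails to reproduce the predictions of GR (the van Dam-Veltman-Zakharov discontinuity), while generic nonlinear completions of it reintroduce the Boulware-Deser ghost; the dRGT construction is precisely a potential tuned order by order to keep that ghost absent.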

        Nowadays, when considering such a thing, one is drawn to rely on specific schemes that manage to evade those problems, such as the de Rham-Gabadadze-Tolley model or bi-metric gravity theories (i.e. theories with two spacetime metrics). In any event, all those proposals are based on specific action functionals which differ substantially from the standard Einstein-Hilbert action with a cosmological term, as used in eq. (1) and eq. (2) of the present work. In fact, nowhere do we find any hint of what the ``new action principle of the massive gravity" under consideration here is supposed to be.

        On the other hand, and regarding a different aspect, I am also quite puzzled by the following statement:

        "I also argue that this limit (i.e. a question concerning the link between experimental results and the actual number of inflationary e- folds) and is a physics counter part to the Godel incompleteness axioms, i.e. where in [14] the emphasis is upon the incompleteness of axiomatic logic, which Godel stated doomed Hilberts dream of a fully axiomatic treatment of physics [15] ."

        The point is, of course, that as far as I know Hilbert never dreamt of a fully axiomatic treatment of physics (which, in contrast with math, requires complex interpretative discussions involving ontological as well as epistemic issues). Hilbert certainly dreamt of a fully axiomatic treatment of mathematics, and that dream was shattered by Gödel's famous results.

        • I think that there is something in this proposal that is quite problematic.

          If we adopt the posture advocated in the essay, namely that ``if there can be no preexisting algorithm that is able to determine the value, the values simply do not exist", we would have to conclude that uncomputable numbers do not exist either. The only real numbers that would exist would be those that are in fact computable, and, as indicated by the author, those are countable. And if such numbers do not exist, then it would seem rather strange to argue that they nonetheless represent the state of any physical system.
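
          Just to spell out the counting the author alludes to (standard facts, restated here only for emphasis): a Turing machine is a finite string over a finite alphabet, so there are only countably many of them, and hence only countably many computable reals,

          $$ |\{ x \in \mathbb{R} : x \text{ computable} \}| \;=\; \aleph_0 \;<\; 2^{\aleph_0} \;=\; |\mathbb{R}| \,. $$

          Indeed, the computable reals form a set of Lebesgue measure zero, so on the view advocated in the essay ``almost every" candidate value of a physical quantity would have to be declared nonexistent.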

          Perhaps the author would care to clarify, if there is some confusion on my part.

        • Hi Cristi,

          I am glad you enjoyed the essay. Let me start by acknowledging that I viewed that essay simply as a chance to force ourselves to contemplate a very dire possibility, one which seems as unpalatable today as the other lessons must have felt to those who lived through them at the times they were uncovered. On the other hand, part of the motivation comes from my current feeling that there is nothing at all on which we can reasonably and firmly depend. You might see why I often feel that way by taking a look at two papers I coauthored (which have parts that are a bit technical, but I think the main lessons from both are rather straightforward):

          1) "Extracting Geometry from Quantum Spacetime: Obstacles down the road" Yuri Bon- der, Chryssomalis Chryssomalakos, & Daniel Sudarsky, Foundations of Physics, pg. 1, (2018); arXiv:1706.08221 [gr-qc].

          2) "On the status of conservation laws in physics: Implications for semiclassical gravity", Tim Maudlin, Elias Okon & Daniel Sudarsky, Studies in History and Philosophy of Modern Physics, B 69 67-81 (2020); arXiv:1910.06473 [gr-qc] .

          I do not know too much about Wheeler's exact point of view, but I had the feeling that he was focusing more on a general ``method" for obtaining useful regularities from some underlying physical laws that were just too complex to handle (Boltzmann's account of entropy is one such example, in which there are indeed fundamental laws, such as energy and momentum conservation, and then there are other contingent aspects that are far too complex to manage in detail but that we can overcome in practice by suitable averaging procedures). I wanted to consider a situation that is more dire, where not even the simple conservation laws are there as solid foundations (see 2). Regarding Smolin's ideas, I would say that evolving laws, as long as they are controlled by a well defined evolution, are nonetheless solid laws (even if in their evolution there is some level of randomness, just as occurs, say, in spontaneous collapse theories).

          Once more, I was thinking of something even worse. Can I be more specific and provide details? No! And that is why I thought it would be nice to provoke people into thinking about the possibility and arguing against it or in its favor.

          Cheers,

          Daniel

        • Essay Abstract

          I will consider how the issues at hand have affected our search for knowledge and understanding, emphasize the difference in status of quantum uncertainty, arguing that it even deserves a different denomination, and speculate inductively regarding possible limitations that go even further than those previously contemplated.

          Author Bio

          Born in Bogotá, Colombia. Undergraduate studies in Math and Physics at the Hebrew University of Jerusalem, PhD at Purdue, USA, and postdoctoral stay at U. Chicago, USA. Sabbatical stays at Penn State, USA; U. of Buenos Aires, Argentina; NYU, USA; and U. of Marseille, France. Currently Professor at the Institute for Nuclear Sciences, UNAM, Mexico. Interested in the interface between gravitation and quantum theory, with a strong focus on foundational issues.

          Download Essay PDF File