(continued from previous post)

2. You make the very helpful analogy that "the causal links are the 'wires' in the quantum circuit." If so, I don't see any disagreement on this point, because the directed graphs representing quantum circuits are not transitive graphs. Also, the classical causal networks in your arXiv paper seem not only intransitive, but almost "maximally so" in this sense.
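
As a toy illustration of this intransitivity (the two-edge "circuit" below is hypothetical, chosen only for the sake of the example):

```python
# A toy illustration of the intransitivity point above: with wires
# a -> b and b -> c but no direct wire a -> c, the edge relation of a
# circuit-style directed graph fails transitivity.

def is_transitive(edges):
    """True if (x, y) and (y, z) in edges always implies (x, z)."""
    return all((x, w) in edges
               for (x, y) in edges
               for (z, w) in edges
               if y == z)

wires = {("a", "b"), ("b", "c")}            # hypothetical "wire" relation
print(is_transitive(wires))                 # False: (a, c) is missing
print(is_transitive(wires | {("a", "c")}))  # True: the transitive closure
```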

3. Regarding my reasoning for not absolutely ruling out cycles, I actually think GR is very discouraging to fans of time travel, and I'm certainly not trying to rescue GR here. It's true that GR gives a sliver of hope to believers in causal cycles, but I don't take these solutions very seriously. My reasoning is partly caution and partly based on some potentially interesting or suggestive properties of graphs containing cycles. The models I have thought the most about are acyclic, however.

4. Of course you're correct that a binary relation doesn't determine a metric in general. Sorkin discusses this at length. His "order plus number equals geometry" motto is based on metric recovery theorems that take as input an appropriate binary relation together with some volume data. His choice of how to provide volume data is the simplest (counting), but there are other ways, defined by taking advantage of local data in the graphs. For a homogeneous graph, simple counting is probably the only option, but I don't prefer the assumption of homogeneity.
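
A minimal sketch of the simplest (counting) way of providing volume data; the five-element causal set below is hypothetical, listed with its full transitive order:

```python
# Sorkin-style "counting" prescription: the volume of the causal
# interval between two events is estimated by counting the elements it
# contains.  The relation below is a hypothetical causal set.
relation = {("p", "a"), ("p", "b"), ("a", "q"), ("b", "q"),
            ("p", "q"), ("p", "x")}   # x lies above p but not below q
elements = {e for pair in relation for e in pair}

def interval(lo, hi):
    """Events strictly between lo and hi in the causal order."""
    return {e for e in elements
            if (lo, e) in relation and (e, hi) in relation}

def volume(lo, hi):
    """Counting measure: interval size as a proxy for spacetime volume."""
    return len(interval(lo, hi))

print(sorted(interval("p", "q")))  # ['a', 'b']
print(volume("p", "q"))            # 2
```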

(continued below)

(continued from previous post)

5. I clearly did not explain my use of the sum over histories method adequately, which is no wonder given the length constraints. First, in his 1948 paper Feynman discussed summing over particle trajectories in Euclidean spacetime and thereby recovered "standard" quantum theory, with its Hilbert spaces, operator algebras, Schrödinger equation, etc. Feynman was able to take all the trajectories to be in the same space because he was working with a background-dependent model; the ambient Euclidean space is unaffected by the particle moving in it. Now, if GR has taught us anything, it is that "spacetime" and "matter-energy" interact, so different particle trajectories mean different spacetimes. Hence, in a background-independent treatment, Feynman's sum over histories becomes a sum over "universes," with a different classical spacetime corresponding to each particle trajectory. His original version is a limiting case in which the effect of the particle on the spacetime is negligible.

What are the "classical spacetimes" in my approach? Well, they are directed graphs. However, it is not quite right to just sum over graphs. The reason why can be understood by looking at Feynman's method more carefully. He considered a region R of spacetime, and interpreted his path sum as the amplitude associated with measuring the particle somewhere on the upper (i.e., future) boundary given an initial measurement on the lower boundary. Hence, the path sum measures not the amplitude of a particular universe, but the amplitude of "transition" from one family of universes to another. A discrete approximation of this represents each particle trajectory as a sequence of directed segments in the corresponding configuration space, which inherits a partial order from the time-orders of the individual spacetimes. It is now clear how to generalize to the nonmanifold case: the appropriate sums are sums over paths in causal configuration space.
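
As a toy version of such a path sum (my own sketch, not the construction described above), one can sum a phase exp(i·S) over the directed paths of a small graph, with a hypothetical action S given by path length:

```python
# A toy discrete "sum over histories": sum the phase exp(i * S(path))
# over all directed paths from a source to a sink in a small DAG.
# The graph and the action S = path length are hypothetical choices.
import cmath

graph = {"s": ["a", "b"], "a": ["t"], "b": ["t"], "t": []}

def paths(node, target):
    """All directed paths from node to target."""
    if node == target:
        return [[node]]
    return [[node] + rest
            for nxt in graph[node]
            for rest in paths(nxt, target)]

def amplitude(source, target):
    """Transition amplitude: sum of exp(i * S) over directed paths."""
    return sum(cmath.exp(1j * len(p)) for p in paths(source, target))

print(len(paths("s", "t")))  # 2 paths: s-a-t and s-b-t
# Both paths have S = 3, so the amplitude is 2 * exp(3i).
```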

Take care,

Ben

Your essay does an excellent job of summarizing the problems with current physics models, as well as pointing out some of the assumptions that need to be questioned.

The scope of the essay far exceeds what one would expect to find in so little space. Lots of food for thought requires lots of time to digest. In the case of your essay, we're easily talking months.

As I read the essay, a number of questions immediately came to mind.

First, I would like to understand better how you determined which of the fundamental assumptions needed to be questioned and why. What were your starting points?

Also, you mention the necessity for a new theory to recover established physics that has proven to be successful. Wouldn't this condition constrain the development of an entire class of new theories, particularly those from which established physics cannot be recovered and which, in some cases, may stand in direct opposition to it?

Wouldn't you think that all that should be required of a theory, besides internal consistency, is that it be in agreement with observational and experimental data, and not necessarily with any theoretical interpretation of it? For instance, a theory may account perfectly for the bending of light near a massive structure yet be based on an axiom set that excludes general relativity.

    Daniel,

    I appreciate the feedback. I will itemize my reply:

    1. The length requirement did limit my ability to explain my background thoughts. My emphasis on causality is principally motivated by two factors:

    First, most of science, and particularly the experimental method, consists of establishing "causal" relationships between things ("whenever we do so-and-so, we observe such-and-such"). The success of this approach is obvious, and I feel that its full potential should be exploited. Remarkably, much of modern theoretical physics has relegated causality to a secondary and sometimes obscure role, and many recent theories reject its fundamental importance altogether. Instead, they are based on objects like continuum manifolds, which have undeniable mathematical advantages, but which are disturbingly idealistic and exhibit properties that are obviously physically irrelevant (like the least upper bound property, nonmeasurable subsets, etc.).

    Second, it is startling how much of the structure of even such idealized theories can be recovered from causal relations, and how almost every "spacetime-related concept" you can think of has an analogous "causal interpretation" that is more natural and more general. For instance, a "light cone" in relativity is an abstract geometric locus of events in spacetime, and the rule that "information cannot escape its light cone" is viewed as secondary. In causal theory, the light cone is simply the scope of information flow, and the idealized view of a geometric locus is secondary. As another example, "frames of reference" in relativity are associated with abstract local coordinate systems on a manifold, and the relativity of simultaneity is viewed as secondary. In causal theory, frames of reference are different orderings of events compatible with the causal order, so the physical idea of relativity of simultaneity is direct, and the idealized view of a coordinate system is secondary. In both examples, I think it's obvious which is the more natural and physical point of view.
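
    A small sketch of the "frames as orderings" idea, with a hypothetical four-event causal order in which a and b are spacelike:

```python
# Each frame of reference corresponds to a total order of events
# compatible with the causal order (a linear extension).  The poset is
# hypothetical: p precedes a and b (causally unrelated, i.e. spacelike),
# and a and b both precede q.
from itertools import permutations

events = ["p", "a", "b", "q"]
causal = {("p", "a"), ("p", "b"), ("a", "q"), ("b", "q"), ("p", "q")}

def frames():
    """All total orders of events compatible with the causal relation."""
    return [order for order in permutations(events)
            if all(order.index(x) < order.index(y) for (x, y) in causal)]

for f in frames():
    print(f)
# Exactly two frames, ('p','a','b','q') and ('p','b','a','q'): they
# disagree only on the ordering of the spacelike pair a, b.
```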

    2. Regarding the recovery of established physics, I don't mean that every prediction of currently accepted models, including those that haven't been experimentally verified, must be reproduced; that would be pointless. What I mean is that a new theory must do at least as well as current theories in explaining and predicting what we actually see. For instance, general relativity modifies Newton's laws only very slightly for solar system dynamics; if general relativity had predicted an inverse cube force for the gravity between the earth and the moon, it would have been thrown out. Similarly, a new theory must look like general relativity and quantum theory in any situation where those theories have been proven to work.

    3. Regarding the general requirements for a theory, yes, I agree completely. Experimental evidence is the final judge.

    Take care,

    Ben

    • [deleted]

    Ben,

    Will you be able to recover Minkowski spacetime and relativity in general from your new principles without additionally assuming the constancy of the speed of light (Einstein's 1905 light postulate)? Einsteinians sometimes claim that "the constant speed of light is unnecessary for the construction of the theories of relativity" but this is a fraud of course:

    Jean-Marc Lévy-Leblond: "Suppose that tomorrow an experimenter were actually able to get hold of the photon, and to say that it does not have zero mass, that it has a mass of, say, 10^(-60) kg. Its mass is not zero, and so light no longer travels at the 'speed of light'. You can imagine the headlines in the newspapers: 'The theory of relativity collapses', 'Einstein was wrong', etc. Yet this possible observation would in no way contradict the theory of relativity!"

    Jean-Marc Lévy-Leblond, "De la relativité à la chronogéométrie ou: Pour en finir avec le "second postulat" et autres fossiles": "It could even be that future measurements reveal a tiny but nonzero mass of the photon; light would then no longer travel at the 'speed of light', or, more precisely, the speed of light, now variable, would no longer coincide with the invariant limiting speed. The operational procedures invoked by the 'second postulate' would become obsolete ipso facto. Would the theory itself thereby be invalidated? Fortunately, not at all..."

    Tom Roberts: "If it is ultimately discovered that the photon has a nonzero mass (i.e. light in vacuum does not travel at the invariant speed of the Lorentz transform), SR would be unaffected but both Maxwell's equations and QED would be refuted (or rather, their domains of applicability would be reduced)."

    Why Einstein was wrong about relativity, 29 October 2008, Mark Buchanan, NEW SCIENTIST: "A photon with any mass at all would imply that our understanding of electricity and magnetism is wrong, and that electric charge might not be conserved. That would be problem enough, but a massive photon would also spell deep trouble for the second postulate, as a photon with mass would not necessarily always travel at the same speed. Feigenbaum's work shows how, contrary to many physicists' beliefs, this need not be a problem for relativity."

    Tom Roberts: "As I said before, Special Relativity would not be affected by a non-zero photon mass, as Einstein's second postulate is not required in a modern derivation (using group theory one obtains three related theories, two of which are solidly refuted experimentally and the third is SR). So today's foundations of modern physics would not be threatened."

    Mitchell J. Feigenbaum: "In this paper, not only do I show that the constant speed of light is unnecessary for the construction of the theories of relativity, but overwhelmingly more, there is no room for it in the theory. (...) We can make a few guesses. There is a "villain" in the story, who, of course, is Newton."

    Pentcho Valev

      Dear Pentcho,

      Thanks for the feedback! By "recovering" relativity, I don't mean that I believe relativity in Einstein's original form is absolutely valid (otherwise, why would it need to be replaced or superseded?) What I mean is that in any case in which relativity makes good predictions, any theory that supersedes it must do at least as well. Hence, a new theory must be able to describe/predict anything that relativity can describe/predict.

      Regarding the constancy of the speed of light, my guess would be that a concept like this only makes sense at sufficiently large scales. "Speed" requires a notion of distance, and my view is that spatial distance (separation) is ultimately just a way of talking about the extent to which events are "unrelated." It begins to look like a traditional distance only at large enough scales.

      Regarding photon mass, it was thought for a long time that neutrinos had no mass, but it was eventually discovered that they do have mass after all. Hence, I would be inclined to keep an open mind even about something as "sacred" as that. However, mass itself is again an emergent concept in my view, so the questions of what a "photon" really is and what "mass" really is are things that cannot be taken for granted.

      You'll have to remember that my background is mostly mathematical, and therefore I'm inclined to consider the possibility of things that most physicists "know" are wrong. This might be useful in some cases; in others it only reflects my own ignorance. Take care,

      Ben

      • [deleted]

      Dear Benjamin

      You wrote: "Predictions based on quantum field theory and the Planck scale yield a value for the cosmological constant roughly 120 orders of magnitude greater than observation implies."

      If you read the posts on my essay thread attentively, you will find the following:

      Yuri Danoyan wrote on Sep. 4, 2012 @ 00:25 GMT

      Appendix 4 Solution of cosmological constant problem

      Theory: the cosmological constant is 10^94 g/cm^3

      Practice: the cosmological constant is 10^-28 g/cm^3

      The Planck constant h = 10^-28 g·cm^2/sec in 2D space embedded in 3D space

      Only the experimental value is right.

      The theoretical value is based on the wrong assumptions noted in my essay.
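
      A quick arithmetic check of the figures above (using the density values quoted in this post):

```python
# The quoted theoretical and observed densities, 10^94 versus 10^-28
# g/cm^3, differ by 122 orders of magnitude, consistent with the
# "roughly 120 orders" figure cited from the essay.
import math

theory, practice = 1e94, 1e-28
print(round(math.log10(theory / practice)))  # 122
```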

        Dear Yuri,

        I did look through the comments on your thread, but I am afraid I don't quite understand. It seems you are suggesting there is a simple dimensional relationship that explains the observed value of the cosmological constant. This would be great, but it's not obvious to me. Do you mind explaining a little more? Take care,

        Ben

        • [deleted]

        I argue that the Planck unit of length is not applicable at short distances, and that space has dimension 2, not 3.

        Hence the density of space coincides with the constant h.

        • [deleted]

        The Planck unit of length is not applicable because there is no G_N and hence no Newtonian law of gravity.

        See part 3 of my essay. There is no linear link between G and c, as there is in the Planck unit of mass.

        Thanks for the clarifications, Ben. I understand your essay much better now.

        Daniel

        • [deleted]

        Dear Benjamin

        For better clarification of my approach,

        I am sending you three keen articles by Frank:

        http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits393.pdf

        http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits393.pdf

        http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits400.pdf

        All the best

        Yuri

        • [deleted]

        I first sent all these links to the address

        bdribus@math.lsu.edu

        but got the answer:

        'bdribus@math.lsu.edu.' on 9/17/2012 12:54 PM

        Invalid recipient

        Dear Yuri,

        I got two out of the three articles, and I'm sure I can find the other one. I'm not sure why the first didn't come through, and I don't know why you got that error message... that is the correct address. In any case, thanks for the articles; fortunately, they were easy to read, and included some information I did not know. I think I understand what you are suggesting about the relationship between the cosmological constant and Planck's constant, but don't you think that perhaps the cosmological constant is a little too small? Take care,

        Ben

          • [deleted]

          http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits388.pdf

          http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits393.pdf

          http://ctpweb.lns.mit.edu/physics_today/phystoday/Abs_limits400.pdf

          Dear Ben,

          I enjoyed your comments on Brian's and my essay pages. As promised I have read and will comment on your learned fqxi contribution:

          I agree with you that causality (I suppose you mean local causality, but you also refer to universes in the plural so I am left wondering) is the substrate on which to build a rational theory unifying quantum mechanics, relativity and the standard model.

          Beyond this understanding, your essay is far too technical for me to follow. You couch your arguments in terms like "acyclicity, morphisms, multicategory theory, transitivity, complex Hilbert spaces" which leave me baffled. Well, at least as far as Hilbert spaces are concerned, Brian Swingle has thankfully dispensed with those in physics. It is wonderful that you, as a mathematician, approach physics with this background, as you just might find a new math to explain a whole range of physics, just as quaternions are now found to be useful for explaining quantum interactions.

          If you will forgive this image - the good wolf mathematicians huff and puff with their theories, circling around the various houses built by the little piggy physicists, and it is an excellent way to test those houses for good solid construction!

          I was reminded that we should peer-rate essays as only the top 35 rated essays get read by fqxi's expert panel of judges.

          With best wishes for your degree work,

          Vladimir

            Dear Vladimir,

            I appreciate the feedback! And I was quite amused by your metaphor of the three little pigs... although I think the physicists have given mathematicians at least as much of a headache over the years with ideas like path integrals and delta functions!

            As a matter of fact, as I wrote on your thread, the math I use here is simply whatever seems necessary to get the job done... the basic physical idea of cause and effect is the motivation. The length limitation for the contest makes it a bit difficult to explain things adequately and still fit in everything you want to say.

            I was going to wait to rate the essays until I had read them all, but I will rate yours now just so I don't forget. Take care,

            Ben

            Dear Benjamin,

            I am extremely sorry for the delay in replying to your query. I am glad to know that you have your own original way of looking at the fundamental problems of physics, and surprised to learn that you suspect so many basic assumptions of physics, whereas I consider only one basic assumption to be wrong. On the basis of your 'causal metric hypothesis', you have tried to explain, in a novel way, the origin of the classical concepts of space-time and also the role of space and time in the quantum world. On the basis of the 'causal metric hypothesis' you have attempted to unify both GR and QM, leading to a theory of QG. I am also interested in knowing how you account for the appearance of continuous manifolds on the basis of 'discrete reference frames'.

            Anyway, you have put a great deal of thought into the problems facing physics, and I wish you success in solving them in one stroke on the basis of the 'causal metric hypothesis'. I rate your essay highly because of its originality and want to know how you feel about mine.

            Good luck and best regards,

            Sreenath.

              Ben,

              As you said my essay was filled with ideas, I can reciprocate the comment about yours. The statement you make:

              "A number of existing proposals about spacetime microstructure lead naturally to noncommutative spaces in the sense of Connes [3] via the deformation theory of Hopf algebras, but noncommutative geometry is relevant more generally, and even classical spaces such as Minkowski spacetime possess important noncommutative structures."

              on the top of page 7 is pretty spot on. Take a look at Giovanni Amelino-Camelia, and the reference to his paper on κ-Minkowski spacetime. You can search down the blog comments to September 8 and see where I offer a connection to twistor theory. Giovanni's work is solid, and it is regrettable that it has fallen so far down the community ranking. Spacetime, under a certain kind of measurement, which I think pertains to high-energy processes or very small scales, is then noncommutative. In my paper "Noncommutative geometry of AdS coordinates on a D-brane" I take a somewhat different approach to noncommutative geometry.

              We do have to take pause, however. The NASA spacecraft Fermi measured the times of arrival of different wavelengths of EM radiation from very distant (billions of light years) bursters. Later the ESA spacecraft INTEGRAL made similar measurements. The times of arrival were virtually identical. However, if spacetime has a foamy or noncommutative structure, it is expected that shorter wavelengths of radiation will couple more strongly to this small-scale structure of spacetime. The result should be a dispersion of EM radiation. None was observed! Experiments count more than theory.

              Does this dash noncommutative geometry? Not necessarily, but it might mean something far more subtle is going on. These measurements are not directly small-scale measurements. They are not experiments where particles near the Planck energy are scattered, or where some Planck-scale microscope looks at spacetime structure. We are actually measuring physics on a grand scale. So we are observers making a particular choice of measurement. Under these conditions we might then expect spacetime to be completely smooth, with no foam or quantum noncommutative structure observed. Torsten Asselmeyer-Maluga connects exotic four-manifolds (the Donaldson theorem, etc.) with quantum spacetime. Yet this connection involves the strange business of spaces that are homeomorphic but not diffeomorphic, which he connects to quantum amplitudes. I suggested on his website that in 11 dimensions it might be easier to consider the dual 7-manifolds with Milnor's exotic structure. We might then have some deep complementarity at work here.

              The path integral issue you discuss might fit into this via the Polyakov measure in a path integral,

              ∫ (D[g, ψ]/diff(g, ψ)) exp(iS),

              where one "mods out" diffeomorphisms or gauge dependencies. This gadget is in some manner generalized within this perspective. We also have to keep in mind there might be some general complementarity with noncommutativity.

              The best thing about these contests is the exchange and interaction with people and different ideas and concepts.

              Cheers LC

                Dear Sreenath,

                I appreciate the feedback! It's true that I doubt a lot of the modern assumptions, but this arises mostly from my doubt about the ultimate physical relevance of manifolds. In my mathematical work, I have come to appreciate how very idealized and mathematically convenient objects like continuum manifolds and algebraic varieties are, and it seems to me that many of the properties that make them mathematically convenient do not arise in any natural or necessary way in physics. Many people think that convenient properties such as the least upper bound property in the order theory of the continuum can be assumed without worrying about their ultimate physical reality, based on the belief that any sufficiently fine approximation will suffice for measurement purposes. However, these properties determine the symmetry groups whose representation theory governs the properties of particle states, so the difference is an important qualitative one, not simply a small quantitative one that vanishes in the limit. My approach is to begin with the concept I view as most central to the scientific process, namely cause and effect, and explain as much as possible in these terms. Ultimately, it may not be enough, but it is an approach with obvious motivations and clear and simple principles, and one that has not been adequately explored.

                Regarding your essay, I view it positively even though your approach is very different from mine. I don't know if your equations will turn out to be correct, but the advantage of your approach is that you go into very specific details, and it should be possible to evaluate it one way or the other in a reasonable time frame. I think your approach, like mine, is worth trying, which is really all one can ask for. Take care,

                Ben