• [deleted]

Ben,

I can see now why you would like to do away with the manifold structure. It seems our essays run counter to each other, which is helping me develop an understanding of your intended meaning. To me, the constant multiple of the metric represents a static potential for curvature (a potential for energy), whereas the tensor (i.e., the Einstein tensor) represents the dynamic portion, which gives rise to what we perceive as matter and energy moving in spacetime.

As an analogy, for me it is the derivatives within the fabric and not the fabric itself that is important, but the fabric does exist, whereas you would like to propose that the fabric itself doesn't exist even if the derivatives do?

BTW, I hope you rate my essay as highly as I have yours.

Regards,

Jeff

Jeff,

Well, I don't absolutely object to differentiable manifolds, though I find anything so uniform rather hard to believe in at the fundamental scale. However, one had better recover a Lorentzian structure at large scales and low energies, or the idea won't work. That's part of the task for my approach, but there is good reason to believe that it can be done. What worries me more (but also interests me more) is recovering the representation theory that describes the particles in the standard model. This requires some mathematics that appears to be very little developed and should be a lot of fun to get a handle on. In any case, tensor fields would be emergent, just like the geometry they refer to.

I find the whole rating thing a bit embarrassing, because I'd prefer to just learn about other people's ideas rather than presume to judge the quality of their work. However, I feel justified in giving high ratings to essays that lead me to think about things in new ways, and your essay certainly did. Take care,

Ben

Johannes,

Thanks for those suggestions... both of them are right on target. Take care,

Ben

  • [deleted]

Benjamin, you wrote:

"The first few assumptions I reject are that spacetime is a manifold, that systems evolve with respect to an independent time parameter, and that the universe has a static background structure."

I reject them too.

See my essay

http://fqxi.org/community/forum/topic/1413

    Dear Ben, I liked your idea of a causal metric very much. You said in my thread that "you and I have perhaps different ideas on the nature of time," but I don't think so. In my mind, time is the expression of changes in energy state, and what can be more causative than that?

    Our major difference is that you regard matter and space as a single structure (like a true mathematician!), and on a certain scale and at certain energies this is right. But there is also an intermediate scale, at low everyday energies, where this approach is not well suited, imo.

    Here are the quotes from your essay that especially resonated with me:

    Re: "These phenomena suggest the promise of physical models that naturally incorporate scale-dependence..."

    Agree with you: scale is everything.

    Re: "The first few assumptions I reject are that spacetime is a manifold, that systems evolve with respect to an independent time parameter, and that the universe has a static background structure."

    Agree again: time as an independent parameter is suitable only on macro scales, while on the quantum scale, I believe, the micro-processes themselves (not 'particles'!) define spacetime volumes they trace, which can be mapped into time and distances at different scales. As for the universe having a static structure -- who actually thinks so? I can't even fathom it.

    Re: "Dimension becomes an emergent property, and is no longer assumed to be constant, nondynamical, or an integer."

    I see it exactly the same way.

    Re: "If spacetime has a sufficiently simple structure..."

    Yeah, what is spacetime?

    Re: "Finally, the dimension of space as well as its curvature might vary with energy density..."

    Just my thoughts. See, we have more in common than it seemed at first.

      Thanks for the thoughtful feedback! I re-read your section on time, and it does seem that we are in closer agreement than I thought at first. In particular, your concept of time seems to arise from local properties of the "fundamental energy units," while the overall order emerges from a tendency toward uniformity, which seems like a description of some sort of potential energy or entropic condition. I tried to suggest something similar in my essay, but only very briefly, since I don't know how to describe this condition precisely yet. You also describe "things in space" as a way of talking about alterations or defects of the structure, which I completely agree with.

      Also, when I said "if spacetime has a sufficiently simple structure," I guess I was being lazy... what I meant was "if the underlying structure, from which what is commonly called spacetime emerges, is sufficiently simple..."

      Take care,

      Ben

      Yuri,

      Thanks for the feedback. I just read your essay, which I found interesting in several regards. I note that you mention the idea that space can be described in terms of angles. Julian Barbour suggests something similar with his "shape dynamics," but doesn't suggest quantization.

      You point out that the strong, weak, and electromagnetic interactions are of similar strengths and that gravity is much weaker. This is true, of course, but it's also interesting to think about the size scales on which these interactions dominate. The strong and weak interactions have very short range, while electromagnetism dominates up to about the everyday scale, where gravity takes over.

      You also point out some interesting numerical relationships. There is much speculation about the dimensionality of space and the number of particle generations, but the 18-degree thing is something I have not heard of before. Take care,

      Ben

      • [deleted]

      I suggest that the 3:1 ratio (examples #1, #2, #3) is contained within the total interaction of Bose and Fermi particles or fields, and that there is a bootstrapping relationship among the pieces of evidence mentioned.

      Surprisingly, the container (space-time), the content (fermions-bosons), and the content (energy-matter) all obey the same 3:1 law.

      http://www.fqxi.org/community/forum/topic/946

      • [deleted]

      See also http://www.fqxi.org/community/forum/topic/946

      Benjamin,

      Wonderful dose of sense, and a lack of maths for a mathematician. A sound approach to the issues, nicely presented. I also agree most assumptions are reasonable, but I think you fall short of the path to the holy grail. First, some favourite bits:

      "...complete unification of relativity and quantum theory was gradually understood to be a particularly intractable problem..." and;

      "...a variety of unexplained phenomena have been recognized." also;

      "Recovery of a Lorentzian manifold from a physically relevant causal relation is necessary at some level of approximation."

      Lastly on Dark matter; "However, this phenomenon does behave like ordinary matter in many respects, as observed in the collision of galaxies and in certain examples of gravitational lensing." I think this last point has started to be forgotten.

      Certainly worth a good score. But I'd also like to invite you to study the mechanisms embodied in my own essay, which I think derives the R postulates directly from long-known QM. I hope you are well versed in logic. I referred to PDL but had to omit truth propositional logic, the exact hierarchical structure of which I've found applies to my emergent model of (non-manifold) dynamic space-time frames.

      Please do study it and see if you can assimilate the ontological structure from the components discussed. I throw in a bit of theatre just to help visualisation.

      Best of luck in the scoring. I hope mine will help.

      Peter

        Peter,

        Thanks for the feedback! I'll be sure to have a careful look at your essay when I get back from my trip. I won't be too discouraged if my approach "falls short of the path to the holy grail," as you put it; I believe it's fine to think, speculate, and theorize about the biggest questions, but I'm not quite that ambitious about my ideas; at best they're part of the story.

        Your ideas sound interesting as you describe them here, though I haven't yet had a chance to read your submission. Of course I have studied the common aspects of mathematical logic and some of the particular ideas applied to quantum settings, but I'm by no means an expert on this. Hopefully I can at least understand what you propose. Take care,

        Ben

        • [deleted]

        You wrote: "The strong and weak interactions have very short range, while electromagnetism dominates up to about the everyday scale, where gravity takes over".

        I think it is because c and G vary in the same way. See part 3 of my essay.

        h is an eternal constant, and the Planck unit of mass is also eternal.

        Yuri,

        Let me make sure I understand. So you think that the ratio c/G is constant, but neither G nor c is independently constant? Do you mean constant in "space" or constant in "time"? Take care,

        Ben

        • [deleted]

        The constants vary in time, within a single cycle.

        • [deleted]

        Big Bang          Present          Big Crunch

        c = 10^30         c = 10^10        c = 10^-10

        G = 10^12         G = 10^-8        G = 10^-28

        h = 10^-28        h = 10^-28       h = 10^-28

        alpha = 10^-3     alpha = 1/137    alpha = 1

        e = 0.1           e = e            e = 12

        Dear Ben

        I enjoyed reading your manuscript very much! You made your points about the current situation very clearly and concisely, and I agree with many of them. As you will see in my answer to your post on my essay http://fqxi.org/community/forum/topic/1506, even though there are some strong common points between our two manuscripts, there are also some relevant differences, about which I'll try to change your mind.

        POINTS ON WHICH WE AGREE.

        I agree on the inexactness of symmetries and covariance, and that they should hold only at the Fermi scale and above. In the quantum automaton theory, this is exactly the case.

        Lorentz covariance (more generally Poincaré covariance, though translations are almost trivial, since homogeneity is inherent in the automaton description) is recovered in the "thermodynamic" limit of infinitely many automaton steps and systems. Clearly, since no continuous symmetry holds exactly at the Planck scale, all conservation laws must also be rewritten, and a digital form of Noether's theorem should be given. As for the most general structure that will replace the Lie group of symmetry transformations, I agree it is likely to be something more primitive than a group; in my case it is likely a discrete semigroup that is approximated by a Lie group at the Fermi scale. In the automaton, all violations of symmetries can already be seen at the simplest level, that of the dispersion relations.

        We agree (and everybody should) that between two theories explaining the same physics, "parsimony" and "more possibilities for falsification" should be the main motivations for choosing one over the other. In my cellular automaton approach, parsimony comes from taking quantum theory as the only truly fundamental theory. Relativity is emergent. GR must come out as the description of an emergent gravity in the "thermodynamic limit" à la Jacobson-Verlinde. Clearly, for falsifiability we need experiments at the Planck scale, e.g. Craig Hogan's [Scientific American, Feb. 2012] (a really very nice experiment: I visited his lab).

        Background independence of the theory, and physics as emergent: no question about it!

        We need to incorporate scale-dependence in our theories: this is already the case once the Planck scale is incorporated!

        Proof is lacking that antimatter interacts gravitationally in the same way as ordinary matter: right! Something frequently forgotten!

        Spacetime is not a manifold: no question about it! And most likely it is not commutative (I hope to recover this from the Dirac automaton in 3D).

        We need to reinterpret the principles of causality and covariance, and covariance should be viewed in order-theoretic terms. Agreed, but my solution is different from the one you propose.

        POINTS ON WHICH IT SEEMS THAT WE DISAGREE

        I think that your causal metric hypothesis is in some way related to my quantum causality Deutsch-Church-Turing principle, i.e., in short, the quantum automaton.

        But my notion of causality, I think, is very different from yours! It is more similar to the canonical notion. The disagreement is that my causality is definitely transitive and acyclic! It is also countable (discreteness comes from the requirement of distinguishing cause and effect: see the definition) and locally finite (from the Deutsch-Church-Turing principle!). Why do I want a transitive and acyclic causality? Because I don't want to modify quantum theory! Causality is a postulate of quantum theory, as established in my recent axiomatic work with Paolo Perinotti and Giulio Chiribella [G. Chiribella, G. M. D'Ariano, P. Perinotti, Phys. Rev. A 84, 012311 (2011)]. Causality means independence of the probability distribution of state preparation from the choice of the following measurement (this is the same sense as in Lucien Hardy's framework). Very shortly, this means that the causal links are the "wires" in the quantum circuit, whence they are fixed and they themselves establish the dynamical theory. I don't need to introduce a meta-dynamics for moving the wires. The theory is perfectly causal in the usual sense! I want to keep quantum theory: gravity must emerge as a thermodynamic effect à la Jacobson-Verlinde.

        You say that "intransitivity of the binary relation generating the causal order is self-evident at large scales": where?? We have no evidence at all. We believe in General Relativity and take any astrophysical observation as evidence of the theory. I want direct evidence! I understand that you want to give up acyclicity (whence transitivity) in order to keep GR, but this is not an experimental motivation.

        "Metric" properties of space-time unfortunately involve additional information besides the binary relation generating the causal order, namely the fact that the causal relation is of a quantum nature: it is a quantum interaction. One of the main things I have come to understand is that we live in "a quantum digital universe": the quantum nature of the causal network is crucial in recovering Lorentz covariance. I explain more in my reply to your post on my essay. The scale factor must definitely come from the Planck distance.

        Is dimension an emergent property? I'm not sure. If you believe in causal networks, the graph dimension of the network (which equals the dimension of the emergent space-time) depends on the topology of the causal connections. These ARE the theory. Having dimension as emergent would correspond to having the most basic theory as emergent. Causality is not emergent: causality is our way of describing physics. Moreover, let me comment on your apparent connection with the Feynman path integral. The closed trajectories in the Feynman integral have nothing to do with acyclicity of causality, since the fact that you can evaluate the probability amplitude of coming back to the same state doesn't mean that the evolution is cyclic.

        Finally: do systems evolve with respect to an independent "time parameter"? Time is emergent, and time in the usual Einstein sense needs a synchronization procedure. But Lorentz covariance emerges from the automaton, and there we do have an independent "discrete time parameter," which is just the number of unitary steps of the automaton!

        Thank you again for your essay! I really liked it a lot!

        I hope to meet you soon for the easiness and the pleasure of discussing in person.

        Giacomo Mauro D'Ariano

          • [deleted]

          Hello Mr. D'Ariano, Mr. Dribus,

          I didn't know about this Deutsch-Turing machine. It is relevant. I tell myself that it is possible to insert the organic semiconductors with my equations and spheres. The information can be classed with the sortings and synchros of evolution. The fermionic spheres and the bosonic spheres can be seen in a pure 3D sphere and spherization evolution. The symmetry seems essential. I ask myself whether the system is fused or binary for the series of uniqueness?

          In reality, I prefer a fusion of fermions/bosons. For the simulations and the convergences in 3D, it seems interesting to insert the symmetry between the two systems for a better understanding of these synchros and sortings of evolution. mcosV = constant is very relevant when the series of uniqueness is inserted.

          I believe that for a real understanding of the system of uniqueness (this number!) it is essential to understand the decrease of volumes from the main central sphere, the number 1. We see that the lattices disappear in the perfect contact between spheres, just due to this decrease of spherical volumes. It is a little as if I said that all the cosmological spheres are attracted towards the universal central sphere. The finite series is thus essential for the two systems, quantum and cosmological. It is also relevant when we consider the volumes differentiating the bosons and the fermions, always with this series of uniqueness and its precise number. It seems impossible to calculate this number correctly; that said, it is possible to approach it. In logic, the cosmological number is the same, so... between 1... and x :)

          The algorithms... can converge!

          Regards

          Dear Mauro,

          I appreciate the excellent analysis. I will have to break my response into a couple of segments, so I will post them as new posts rather than replies. To the points on which we seem to agree, I have little more to add, though I am interested in the "Planck-scale experiment" you referenced. One point is that the generalization of covariance I have in mind is much more general than semigroup representations. For the points on which we may disagree, I will itemize.

          1. Regarding transitivity, I must insist on distinguishing between the "causal order" (of a classical universe) and the "binary relation generating the causal order." On large scales, the intransitivity I am talking about is as simple as the fact that the statement "Jane talked to Bill, then Bill talked to Susan," is not the same as the statement "Jane talked to Bill, then Jane and Bill talked to Susan." In either case, Susan received information from Jane, so the two statements are indistinguishable in their causal orders. However, in the first instance, the information is transmitted only through Bill, whereas in the second case it is transmitted both through Bill and directly. Thus, there are two different binary relations that generate the same causal order: the intransitive one in which information passes only through Bill, and the transitive one in which information also reaches Susan directly from Jane. These two are a priori different. At an ordinary scale, this is obvious to everyone.

          For fundamental physics, the reasoning is as follows. Many scientists (by no means all!) agree that "causality," however you define it, is one of the most fundamental concepts in physics. The question then becomes: how do you define/describe causality? Well, a cause and effect certainly seem to define a direction; you can imagine an arrow pointing from the cause to the effect. This is completely local. Include lots of causes and effects (vertices), and arrows (directed edges) without yet imposing any other conditions, and you get a directed graph, which is equivalent to a binary relation on the set of vertices. At this stage, there is nothing to rule out cycles, and certainly nothing to impose transitivity, which are both generally nonlocal phenomena. There is a "causal order" generated by this directed graph, which is the relation defined by closing the graph relation under transitivity. I put "order" in quotes because this is still more general at this stage than the usual definition of a partial order; it may still have cycles, for instance.
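          (A small aside, not from the essay: the distinction between a binary relation and the causal order it generates can be sketched in a few lines of Python. The names are hypothetical; this is just the Jane/Bill/Susan example rendered as directed graphs.)

```python
from itertools import product

def transitive_closure(edges):
    """Close a binary relation under transitivity:
    add (a, c) whenever (a, b) and (b, c) are present."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(closure), repeat=2):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True
    return closure

# Information reaches Susan only through Bill...
R1 = {("Jane", "Bill"), ("Bill", "Susan")}
# ...versus also reaching Susan directly from Jane.
R2 = {("Jane", "Bill"), ("Bill", "Susan"), ("Jane", "Susan")}

# The two binary relations differ a priori...
assert R1 != R2
# ...yet they generate the same causal order.
assert transitive_closure(R1) == transitive_closure(R2)
```

          Any theory that looks only at the causal order cannot tell the two relations apart; a theory built on the generating relation itself can.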

          This is all purely classical. To obtain a quantum theory, you need the superposition principle. The appropriate version here is a path sum over a configuration space of classical causal universes, i.e., directed graphs. I will explain why this is the appropriate version below. The question then becomes, "which graphs should be included in the configuration space?" This is the first real choice in the entire procedure, and involves a judgment about what types of graphs correspond to physical reality. My personal guess would be "acyclic locally finite directed graphs," but I want to make it clear that these are second-level assumptions that come further along in the development. I prefer acyclicity because we don't seem to observe causal cycles, and I choose local finiteness because I suspect that volume has something to do with counting (not necessarily as simple as Sorkin's "order plus number equals geometry," but in the same spirit).
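          (To illustrate what the acyclicity condition excludes, here is a minimal sketch, with hypothetical names, of the membership test it imposes on the configuration space: a depth-first search that rejects any directed graph containing a directed cycle.)

```python
def is_acyclic(vertices, edges):
    """DFS three-coloring: a directed graph is acyclic exactly when
    the search never revisits a vertex that is still on the stack."""
    adj = {v: [] for v in vertices}
    for a, b in edges:
        adj[a].append(b)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in vertices}

    def dfs(v):
        color[v] = GRAY
        for w in adj[v]:
            if color[w] == GRAY:                  # back edge: directed cycle
                return False
            if color[w] == WHITE and not dfs(w):
                return False
        color[v] = BLACK
        return True

    return all(color[v] == BLACK or dfs(v) for v in vertices)

# A "diamond" of causes and effects is admissible...
diamond = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")}
assert is_acyclic("abcd", diamond)
# ...but closing it into a loop is excluded.
assert not is_acyclic("abcd", diamond | {("d", "a")})
```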

          In particular, it makes an a priori difference if you include only transitive graphs (graphs in which there is an edge between two vertices whenever there is a path between them). It's conceivable that this difference would fall out of the path sum, but I see no justification for assuming it at the outset.

          (continued below)

          (continued from previous post)

          2. You make the very helpful analogy that "the causal links are the 'wires' in the quantum circuit." If so, I don't see any disagreement on this point, because the directed graphs representing quantum circuits are not transitive graphs. Also, the classical causal networks in your arXiv paper seem not only intransitive, but almost "maximally so" in this sense.

          3. Regarding my reasoning for not absolutely ruling out cycles, I actually think GR is very discouraging to fans of time travel, and I'm certainly not trying to rescue GR here. It's true that GR gives a sliver of hope to believers in causal cycles, but I don't take these solutions very seriously. My reasoning is partly caution and partly based on some potentially interesting or suggestive properties of graphs containing cycles. The models I have thought the most about are acyclic, however.

          4. Of course you're correct that a binary relation doesn't determine a metric in general. Sorkin discusses this at length. His "order plus number equals geometry" motto is based on metric recovery theorems that take as input an appropriate binary relation together with some volume data. His choice of how to provide volume data is the simplest (counting), but there are other ways, defined by taking advantage of local data in the graphs. For a homogeneous graph, simple counting is probably the only option, but I don't prefer the assumption of homogeneity.
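          (To make the counting idea concrete, here is a small Python sketch, with hypothetical names, of volume-by-counting in the spirit of Sorkin's motto: the "volume" of the causal interval between two events is the number of elements strictly between them.)

```python
def reachable(edges, src):
    """All vertices reachable from src along directed edges (src excluded)."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
    seen, stack = set(), [src]
    while stack:
        for w in adj.get(stack.pop(), []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

def interval_volume(edges, a, b):
    """Count events x with a < x < b in the causal order:
    a volume proxy in the spirit of 'order plus number equals geometry'."""
    return sum(1 for x in reachable(edges, a)
               if x != b and b in reachable(edges, x))

# Two causal chains from a to b: one through x1, x2 and one through y.
E = {("a", "x1"), ("x1", "x2"), ("x2", "b"), ("a", "y"), ("y", "b")}
assert interval_volume(E, "a", "b") == 3    # counts x1, x2, y
```

          Refining the graph inserts more elements into each interval, so the count grows with what one would like to call the volume; other choices would weight the count using local data in the graph.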

          (continued below)

          (continued from previous post)

          5. I clearly did not explain my use of the sum-over-histories method adequately, and it is no wonder given the length constraints. First, in his 1948 paper, Feynman discussed summing over particle trajectories in Euclidean spacetime and thereby recovered "standard" quantum theory, with its Hilbert spaces, operator algebras, Schrödinger equation, etc. Feynman was able to take all the trajectories to be in the same space because he was working with a background-dependent model; the ambient Euclidean space is unaffected by the particle moving in it. Now, if GR has taught us anything, it is that "spacetime" and "matter-energy" interact, so different particle trajectories mean different spacetimes. Hence, in a background-independent treatment, Feynman's sum over histories becomes a sum over "universes," with a different classical spacetime corresponding to each particle trajectory. His original version is a limiting case in which the effect of the particle on the spacetime is negligible.
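          (Purely to illustrate the mechanics of the 1948 construction on a fixed background, not Feynman's actual formulation or the essay's generalization: a toy Python sum over discretized 1D trajectories, with hypothetical names and units where m = ħ = dt = 1, adding a phase exp(iS) for each path.)

```python
import cmath
from itertools import product

def path_sum(x0, xN, positions, steps, m=1.0, dt=1.0, hbar=1.0):
    """Toy sum over histories: total amplitude = sum, over every discrete
    trajectory from x0 to xN through the allowed lattice positions, of
    exp(i * S / hbar), with the free-particle action
    S = sum over segments of (m/2) * (dx/dt)**2 * dt."""
    total = 0j
    for middle in product(positions, repeat=steps - 1):
        path = (x0,) + middle + (xN,)
        S = sum(0.5 * m * ((path[k + 1] - path[k]) / dt) ** 2 * dt
                for k in range(steps))
        total += cmath.exp(1j * S / hbar)
    return total

# Amplitude to return to the origin after 3 steps on a 3-point lattice.
amp = path_sum(0.0, 0.0, positions=(-1.0, 0.0, 1.0), steps=3)
assert abs(amp) > 0
```

          In a background-independent version, each `path` would instead label a different classical universe, and the sum would run over paths in the configuration space of such universes.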

          What are the "classical spacetimes" in my approach? Well, they are directed graphs. However, it is not quite right to just sum over graphs. The reason why can be understood by looking at Feynman's method more carefully. He considered a region R of spacetime, and interpreted his path sum as the amplitude associated with measuring the particle somewhere on the upper (i.e., future) boundary given an initial measurement on the lower boundary. Hence, the path sum measures not the amplitude of a particular universe, but the amplitude of "transition" from one family of universes to another. A discrete approximation of this represents each particle trajectory as a sequence of directed segments in the corresponding configuration space, which inherits a partial order from the time-orders of the individual spacetimes. It is now clear how to generalize to the nonmanifold case: the appropriate sums are sums over paths in causal configuration space.

          Take care,

          Ben