Yuri,

Thanks for the feedback. I just read your essay, which I found interesting in several regards. I note that you mention the idea that space can be described in terms of angles. Julian Barbour suggests something similar with his "shape dynamics," but doesn't suggest quantization.

You point out that the strong, weak, and electromagnetic interactions are of similar strengths and that gravity is much weaker. This is true, of course, but it's also interesting to think about the size scales on which these interactions dominate. The strong and weak interactions have very short range, while electromagnetism dominates up to about the everyday scale, where gravity takes over.

You also point out some interesting numerical relationships. There is much speculation about the dimensionality of space and the number of particle generations, but the 18-degree thing is something I have not heard of before. Take care,

Ben

  • [deleted]

I suggest that the 3:1 ratio (examples #1, #2, #3) is enclosed in the total interaction of Bose and Fermi particles or fields, and that there is a bootstrapping relationship among the evidences mentioned.

Surprisingly, the container (space-time) and the contents (fermions-bosons, energy-matter) obey the same 3:1 law.

http://www.fqxi.org/community/forum/topic/946

  • [deleted]

See also http://www.fqxi.org/community/forum/topic/946

Benjamin,

Wonderful dose of sense and lack of maths for a mathematician. Sound approach to the issues and nicely presented. I also agree most assumptions are reasonable, but I think you fall short of the path to the holy grail. First, some favourite bits:

"...complete unification of relativity and quantum theory was gradually understood to be a particularly intractable problem..." and;

"...a variety of unexplained phenomena have been recognized." also;

"Recovery of a Lorentzian manifold from a physically relevant causal relation is necessary at some level of approximation."

Lastly on Dark matter; "However, this phenomenon does behave like ordinary matter in many respects, as observed in the collision of galaxies and in certain examples of gravitational lensing." I think this last point has started to be forgotten.

Certainly worth a good score. But I'd also like to invite you to study the mechanisms embodied in my own essay, which I think finds the R postulates directly from long-known QM. I hope you are well versed in logic. I referred to PDL but had to omit truth-propositional logic, the exact hierarchical structure of which I've found applies to my emergent model of (non-manifold) dynamic space-time frames.

Please do study and see if you can assimilate the ontological structure from the components discussed. I throw in a bit of theatre just to help visualisation.

Best of luck in the scoring. I hope mine will help.

Peter

    Peter,

    Thanks for the feedback! I'll be sure to have a careful look at your essay when I get back from my trip. I won't be too discouraged if my approach "falls short of the path to the holy grail," as you put it. I believe it's fine to think, speculate, and theorize about the biggest questions, but I'm not quite that ambitious about my own ideas; at best they're part of the story.

    Your ideas sound interesting as you describe them here, though I haven't yet had a chance to read your submission. Of course I have studied the common aspects of mathematical logic and some of the particular ideas applied to quantum settings, but I'm by no means an expert on this. Hopefully I can at least understand what you propose. Take care,

    Ben

    • [deleted]

    You wrote: "The strong and weak interactions have very short range, while electromagnetism dominates up to about the everyday scale, where gravity takes over".

    I think it is because c and G vary in the same way. See my essay, part 3.

    h is an eternal constant, and the Planck unit of mass is also eternal.

    Yuri,

    Let me make sure I understand. So you think that the ratio c/G is constant, but neither G nor c is independently constant? Do you mean constant in "space" or constant in "time"? Take care,

    Ben

    • [deleted]

    The constants vary in time, within a single cycle.

    • [deleted]

    Big Bang         Present          Big Crunch

    c = 10^30        c = 10^10        c = 10^-10
    G = 10^12        G = 10^-8        G = 10^-28
    h = 10^-28       h = 10^-28       h = 10^-28
    alpha = 10^-3    alpha = 1/137    alpha = 1
    e = 0.1          e = e            e = 12

    Dear Ben

    I enjoyed reading your manuscript very much! You made your points about the current situation very clearly and concisely, and I agree with many of them. As you will see in my answer to your post on my essay http://fqxi.org/community/forum/topic/1506, even though there are some strong common points between our two manuscripts, there are also some relevant differences, about which I'll try to change your mind.

    POINTS ON WHICH WE AGREE.

    I agree on the inexactness of symmetries and covariance, and that they should hold only at the Fermi scale and above. In the quantum automaton theory, this is exactly the case.

    Lorentz covariance (more generally Poincaré covariance, but translations are almost trivial, since homogeneity is inherent in the automaton description) is recovered in the "thermodynamic" limit of infinitely many automaton steps and systems. Clearly, since the continuous symmetries do not hold at the Planck scale, all conservation laws must also be rewritten, and a digital form of Noether's theorem should be given. As for the most general structure that is going to replace the Lie group of symmetry transformations, I agree it is likely to be something more primitive than a group; in my case it is likely to be a discrete semigroup that is approximated by a Lie group at the Fermi scale. In the automaton, all violations of symmetries can already be seen at the simplest level, that of the dispersion relations.

    We agree (and everybody should) that between two theories explaining the same physics, "parsimony" and "more possibilities for falsification" should be taken as the main motivations in choosing one of the two. In my cellular automaton approach, parsimony comes from taking quantum theory as the only truly fundamental theory. Relativity is emergent. GR must come out as the description of an emergent gravity in the "thermodynamic limit" à la Jacobson-Verlinde. Clearly, for falsifiability we need experiments at the Planck scale, e.g. Craig Hogan's [Scientific American, Feb. 2012] (a really very nice experiment: I visited his lab).

    Background independence of the theory and physics as emergent: beyond question!

    We need to incorporate scale-dependence in our theories: this is already the case with the incorporation of the Planck scale!

    Proof is lacking that antimatter interacts gravitationally in the same way as ordinary matter: right! Something frequently forgotten!

    Spacetime is not a manifold: beyond question! And most likely it is not commutative (I hope to recover this from the Dirac automaton in 3D).

    We need to reinterpret the principles of causality and covariance, and covariance should be viewed in order-theoretic terms. Agreed, but my solution is different from the one that you propose.

    POINTS ON WHICH IT SEEMS THAT WE DISAGREE

    I think that your causal metric hypothesis is in some way related to my quantum-causality Deutsch-Church-Turing principle, i.e., in short, the quantum automaton.

    But my notion of causality, I think, is very different from yours! And it is more similar to the canonical notion. The disagreement is that my causality is definitely transitive and acyclic! It is also countable (discreteness comes from the requirement of distinguishing cause from effect: see the definition) and locally finite (from the Deutsch-Church-Turing principle!). Why do I want a transitive and acyclic causality? Because I don't want to modify quantum theory! Causality is a postulate of quantum theory, as established in my recent axiomatic work with Paolo Perinotti and Giulio Chiribella [G. Chiribella, G. M. D'Ariano, P. Perinotti, Phys. Rev. A 84, 012311 (2011)]. Causality means independence of the probability distribution of the state preparation from the choice of the following measurement (this is the same sense as in Lucien Hardy's framework). Very shortly, this means that the causal links are the "wires" in the quantum circuit, whence they are fixed and they themselves establish the dynamical theory. I don't need to introduce a meta-dynamics for moving the wires. The theory is perfectly causal in the usual sense! I want to keep quantum theory: gravity must emerge as a thermodynamical effect à la Jacobson-Verlinde.
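
    As a toy illustration of this causality condition (a minimal numpy sketch of my own, not the formalism of our axiomatic work): for any two-qubit pure state, the outcome distribution of the earlier measurement on one qubit is the same whichever basis is measured later on the other qubit.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Random two-qubit pure state; axis 0 is the earlier-measured qubit,
    # axis 1 is the qubit measured later.
    psi = rng.normal(size=4) + 1j * rng.normal(size=4)
    psi /= np.linalg.norm(psi)
    psi = psi.reshape(2, 2)

    def early_marginal(psi, later_basis):
        """p(a) for a Z measurement on qubit 0, when qubit 1 is later
        measured in the basis given by the columns of later_basis."""
        joint = np.abs(psi @ later_basis.conj()) ** 2  # joint p(a, b)
        return joint.sum(axis=1)                       # marginal over the later outcome

    Z = np.eye(2)                                 # later measurement in the Z basis
    X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # later measurement in the X basis

    print(early_marginal(psi, Z))  # the same, up to floating-point error,
    print(early_marginal(psi, X))  # whatever the later measurement choice is
    ```

    The equality holds because the later basis change is unitary, so it cannot affect the earlier marginal: no signaling from the future choice of measurement.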

    You say that "intransitivity of the binary relation generating the causal order is self-evident at large scales": where?? We have no evidence at all. We believe in General Relativity, and take any astrophysical observation as an evidence of the theory. I want a direct evidence! I understand that you want to give up acyclicity (whence transitivity) for keeping GR, but this is not an experimental motivation.

    "Metric" properties of space-time unfortunately involve an additional information besides the binary relation generating the causal order, and this is the fact that the causal relation is of quantum nature: is a quantum interaction. One of the main thing that I have well understood is that we live in "a quantum digital universe": the quantum nature of the causal network is crucial in recovering the Lorentz covariance. I explained more in my reply to your post on my essay. The scale factor definitely must come from the Planck distance.

    Is dimension an emergent property? I'm not sure. If you believe in causal networks, the graph dimension of the network (which equals the dimension of the emerging space-time) depends on the topology of the causal connections. These ARE the theory. Having dimension as emergent would correspond to having the most basic theory as emergent. Causality is not emergent: causality is our way of describing physics. Moreover, let me comment on your apparent connection with the Feynman path integral. The closed trajectories in the Feynman integral have nothing to do with acyclicity of causality, since the fact that you can evaluate the probability amplitude of coming back to the same state doesn't mean that the evolution is cyclic.

    Finally: do systems evolve with respect to an independent "time parameter"? Time is emergent, and time in the usual Einstein sense needs a synchronization procedure. But Lorentz covariance emerges from the automaton, and there we have an independent "discrete time parameter," which is just the number of unitary steps of the automaton!

    Thank you again for your essay! I really liked it a lot!

    I hope to meet you soon, for the ease and pleasure of discussing in person.

    Giacomo Mauro D'Ariano

      • [deleted]

      Hello Mr. D'Ariano, Mr. Dribus,

      I didn't know of this Deutsch-Turing machine. It is relevant. I tell myself that it is possible to insert the organic semiconductors with my equations and spheres. The informations can be classed with the sortings and synchros of evolution. The fermionic spheres and the bosonic spheres can be seen in a pure 3D sphere and spherization evolution. The symmetry seems essential. I wonder whether the system is fusioned or binary for the series of uniqueness?

      In reality, I prefer a fusion of fermions/bosons. For the simulations and the convergences in 3D, it seems interesting to insert the symmetry between the two systems for a better understanding of these synchros and sortings of evolution. mcosV = constant is very relevant when the series of uniqueness is inserted.

      I believe that for a real understanding of the system of uniqueness (this number!) it is essential to understand the decreasing of volumes from the main central sphere, the number 1. We see that the lattices disappear in the perfect contact between spheres, just due to this decreasing of spherical volumes; it is a little as if all the cosmological spheres were attracted towards the universal central sphere. The finite series is essential for the two systems, quantum and cosmological. It is relevant also when we consider the volumes differentiating the bosons and fermions, always with this series of uniqueness and its precise number. It seems not possible to calculate this number correctly; that said, it is possible to approach it. In logic, the cosmological number is the same, so ... between 1 ... and x :)

      The algorithms ... can converge!

      Regards

      Dear Mauro,

      I appreciate the excellent analysis. I will have to break my response into a couple of segments, so I will post them as new posts rather than replies. To the points on which we seem to agree, I have little more to add, though I am interested in the "Planck-scale experiment" you referenced. One point is that the generalization of covariance I have in mind is much more general than semigroup representations. For the points on which we may disagree, I will itemize.

      1. Regarding transitivity, I must insist on distinguishing between the "causal order" (of a classical universe) and the "binary relation generating the causal order." On large scales, the intransitivity I am talking about is as simple as the fact that the statement "Jane talked to Bill, then Bill talked to Susan," is not the same as the statement "Jane talked to Bill, then Jane and Bill talked to Susan." In either case, Susan received information from Jane, so the two statements are indistinguishable in their causal orders. However, in the first instance, the information is transmitted only through Bill, whereas in the second case it is transmitted both through Bill and directly. Thus, there are two different binary relations that generate the same causal order: the intransitive one in which information passes only through Bill, and the transitive one in which information also reaches Susan directly from Jane. These two are a priori different. At an ordinary scale, this is obvious to everyone.

      For fundamental physics, the reasoning is as follows. Many scientists (by no means all!) agree that "causality," however you define it, is one of the most fundamental concepts in physics. The question then becomes: how do you define/describe causality? Well, a cause and effect certainly seem to define a direction; you can imagine an arrow pointing from the cause to the effect. This is completely local. Include lots of causes and effects (vertices), and arrows (directed edges) without yet imposing any other conditions, and you get a directed graph, which is equivalent to a binary relation on the set of vertices. At this stage, there is nothing to rule out cycles, and certainly nothing to impose transitivity, which are both generally nonlocal phenomena. There is a "causal order" generated by this directed graph, which is the relation defined by closing the graph relation under transitivity. I put "order" in quotes because this is still more general at this stage than the usual definition of a partial order; it may still have cycles, for instance.
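
      To see the distinction in miniature (a toy sketch of my own, not anything from my essay), take the two relations from the Jane/Bill/Susan example: they differ as binary relations, yet they generate the same causal order under transitive closure.

      ```python
      def transitive_closure(edges):
          """Causal order generated by a binary relation: keep adding
          (a, c) whenever (a, b) and (b, c) are already present."""
          closure = set(edges)
          while True:
              new = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
              if new <= closure:
                  return closure
              closure |= new

      # "Jane talked to Bill, then Bill talked to Susan":
      # information reaches Susan only through Bill (intransitive relation).
      relation_1 = {("Jane", "Bill"), ("Bill", "Susan")}

      # "Jane talked to Bill, then Jane and Bill talked to Susan":
      # information also reaches Susan directly (transitive relation).
      relation_2 = {("Jane", "Bill"), ("Bill", "Susan"), ("Jane", "Susan")}

      assert relation_1 != relation_2  # different generating relations
      assert transitive_closure(relation_1) == transitive_closure(relation_2)  # same causal order
      ```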

      This is all purely classical. To obtain a quantum theory, you need the superposition principle. The appropriate version in this case is a path sum over a configuration space of classical causal universes, i.e., directed graphs. I will explain why this is the appropriate version below. The question then becomes, "which graphs should be included in the configuration space?" This is the first real choice in the entire procedure, and involves a judgment about what types of graphs correspond to physical reality. My personal guess would be "acyclic locally finite directed graphs," but I want to make it clear that these are second-level assumptions that come further along in the development. I prefer acyclicity because we don't seem to observe causal cycles, and I choose local finiteness because I suspect that volume has something to do with counting (not necessarily as simple as Sorkin's "order plus number equals geometry," but in the same spirit).
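
      As a toy illustration of what such a restriction on the configuration space amounts to (again just a sketch of my own), one can enumerate every directed graph on three labeled vertices and count how many survive the acyclicity requirement:

      ```python
      from itertools import combinations

      def is_acyclic(vertices, edges):
          """Kahn-style check: repeatedly delete vertices with no incoming
          edge; a cycle is present iff some vertices can never be deleted."""
          edges = set(edges)
          remaining = set(vertices)
          while remaining:
              sources = {v for v in remaining if not any(b == v for (_, b) in edges)}
              if not sources:
                  return False  # every remaining vertex has an incoming edge: cycle
              remaining -= sources
              edges = {(a, b) for (a, b) in edges if a in remaining and b in remaining}
          return True

      vertices = (0, 1, 2)
      possible_edges = [(a, b) for a in vertices for b in vertices if a != b]

      all_graphs = [g for n in range(len(possible_edges) + 1)
                    for g in combinations(possible_edges, n)]
      acyclic_graphs = [g for g in all_graphs if is_acyclic(vertices, g)]

      print(len(all_graphs))      # 64 directed graphs (self-loops excluded)
      print(len(acyclic_graphs))  # 25 of them are acyclic
      ```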

      In particular, it makes an a priori difference if you include only transitive graphs (graphs in which there is an edge between two vertices whenever there is a path between them). It's conceivable that this difference would fall out of the path sum, but I see no justification for assuming this at the outset.

      (continued below)

      (continued from previous post)

      2. You make the very helpful analogy that "the causal links are the "wires" in the quantum circuit." If so, I don't see any disagreement on this point, because the directed graphs representing quantum circuits are not transitive graphs. Also, the classical causal networks in your arXiv paper seem not only intransitive, but almost "maximally so" in this sense.

      3. Regarding my reasoning for not absolutely ruling out cycles, I actually think GR is very discouraging to fans of time travel, and I'm certainly not trying to rescue GR here. It's true that GR gives a sliver of hope to believers in causal cycles, but I don't take these solutions very seriously. My reasoning is partly caution and partly based on some potentially interesting or suggestive properties of graphs containing cycles. The models I have thought the most about are acyclic, however.

      4. Of course you're correct that a binary relation doesn't determine a metric in general. Sorkin discusses this at length. His "order plus number equals geometry" motto is based on metric recovery theorems that take as input an appropriate binary relation together with some volume data. His choice of how to provide volume data is the simplest (counting), but there are other ways, defined by taking advantage of local data in the graphs. For a homogeneous graph, simple counting is probably the only option, but I don't prefer the assumption of homogeneity.
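
      As a toy version of the counting idea in 1+1 dimensions (the density below is an arbitrary choice of mine): sprinkle points uniformly into a unit causal diamond written in light-cone coordinates, and estimate the volume of a sub-interval by counting the sprinkled elements causally between its endpoints.

      ```python
      import random

      random.seed(42)
      rho = 2000  # sprinkling density: expected number of points per unit volume

      # Light-cone coordinates (u, v) on a unit causal diamond: the volume
      # element is du dv, and x precedes y iff u_x < u_y and v_x < v_y.
      points = [(random.random(), random.random()) for _ in range(rho)]

      def precedes(x, y):
          return x[0] < y[0] and x[1] < y[1]

      # "Order plus number": estimate the volume of the interval between
      # events p and q as (number of elements between them) / density.
      p, q = (0.2, 0.3), (0.7, 0.9)
      count = sum(1 for z in points if precedes(p, z) and precedes(z, q))

      print(count / rho)  # close to the exact volume (0.7 - 0.2) * (0.9 - 0.3) = 0.30
      ```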

      (continued below)

      (continued from previous post)

      5. I clearly did not explain my use of the sum over histories method adequately enough, and it is no wonder given the length constraints. First, in his 1948 paper Feynman discussed summing over particle trajectories in Euclidean spacetime and thereby recovered "standard" quantum theory, with its Hilbert spaces, operator algebras, Schrodinger equation, etc. Feynman was able to take all the trajectories to be in the same space because he was working with a background-dependent model; the ambient Euclidean space is unaffected by the particle moving in it. Now, if GR has taught us anything, it is that "spacetime" and "matter-energy" interact, so different particle trajectories mean different spacetimes. Hence, in a background-independent treatment, Feynman's sum over histories becomes a sum over "universes," with a different classical spacetime corresponding to each particle trajectory. His original version is a limiting case in which the effect of the particle on the spacetime is negligible.
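
      As a toy caricature of this background-dependent version (my own sketch, in arbitrary units with m = hbar = dt = dx = 1): a discrete free-particle path sum on a line, where each trajectory contributes a phase exp(iS) and the amplitudes of all trajectories are accumulated step by step.

      ```python
      import cmath
      from collections import defaultdict

      # Discrete caricature of Feynman's sum over trajectories: integer
      # lattice positions, N time steps, each step moving dx in {-1, 0, +1}.
      # Free-particle action accumulated per step: S = dx**2 / 2.
      N = 20
      amps = {0: 1.0 + 0.0j}  # the particle starts at x = 0 with amplitude 1

      for _ in range(N):
          new_amps = defaultdict(complex)
          for x, amp in amps.items():
              for dx in (-1, 0, 1):
                  new_amps[x + dx] += amp * cmath.exp(1j * dx**2 / 2)
          amps = dict(new_amps)

      # Unnormalized amplitude to find the particle at each final position;
      # the sum over trajectories produces the interference pattern.
      for x in sorted(amps):
          print(x, abs(amps[x]))
      ```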

      What are the "classical spacetimes" in my approach? Well, they are directed graphs. However, it is not quite right to just sum over graphs. The reason why can be understood by looking at Feynman's method more carefully. He considered a region R of spacetime, and interpreted his path sum as the amplitude associated with measuring the particle somewhere on the upper (i.e., future) boundary given an initial measurement on the lower boundary. Hence, the path sum measures not the amplitude of a particular universe, but the amplitude of "transition" from one family of universes to another. A discrete approximation of this represents each particle trajectory as a sequence of directed segments in the corresponding configuration space, which inherits a partial order from the time-orders of the individual spacetimes. It is now clear how to generalize to the nonmanifold case: the appropriate sums are sums over paths in causal configuration space.

      Take care,

      Ben

      Your essay does an excellent job of summarizing the problems with current physics models, as well as pointing out some of the assumptions that need to be questioned.

      The scope of the essay far exceeds what one would expect to find in so little space. Lots of food for thought requires lots of time to digest. In the case of your essay, we're easily talking months.

      As I read the essay, a number of questions immediately came to mind.

      First, I would like to understand better how you determined which of the fundamental assumptions needed to be questioned and why. What was your starting point(s)?

      Also, you mention the necessity for a new theory to recover established physics that has proven to be successful. Wouldn't this condition constrain the development of an entire class of new theories, particularly those from which established physics cannot be recovered and which, in some cases, may stand in direct opposition to it?

      Wouldn't you think that all that should be required of a theory, besides internal consistency, is that it be in agreement with observational and experimental data, and not necessarily with any theoretical interpretation of it? For instance, a theory may account perfectly for the bending of light near a massive structure yet be based on an axiom set that excludes general relativity.

        Daniel,

        I appreciate the feedback. I will itemize my reply:

        1. The length requirement did limit my ability to explain my background thoughts. My emphasis on causality is principally motivated by two factors:

        First, most of science, and particularly the experimental method, consists of establishing "causal" relationships between things ("whenever we do so-and-so, we observe such-and-such"). The success of this approach is obvious, and I feel that its full potential should be exploited. Remarkably, much of modern theoretical physics has relegated causality to a secondary and sometimes obscure role, and many recent theories reject its fundamental importance altogether. Instead, they are based on objects like continuum manifolds, which have undeniable mathematical advantages, but which are disturbingly idealistic and exhibit properties that are obviously physically irrelevant (like the least upper bound property, nonmeasurable subsets, etc.).

        Second, it is startling how much of the structure of even such idealized theories can be recovered from causal relations, and how almost every "spacetime-related concept" you can think of has an analogous "causal interpretation" that is more natural and more general. For instance, a "light cone" in relativity is an abstract geometric locus of events in spacetime, and the rule that "information cannot escape its light cone" is viewed as secondary. In causal theory, the light cone is simply the scope of information flow, and the idealized view of a geometric locus is secondary. As another example, "frames of reference" in relativity are associated with abstract local coordinate systems on a manifold, and the relativity of simultaneity is viewed as secondary. In causal theory, frames of reference are different orderings of events compatible with the causal order, so the physical idea of relativity of simultaneity is direct, and the idealized view of a coordinate system is secondary. In both examples, I think it's obvious which is the more natural and physical point of view.
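
        As a minimal sketch of the second example (a toy of my own, not from my essay): for a four-event "diamond" causal order, the admissible frames of reference are exactly the total orderings of events compatible with the causal relation.

        ```python
        from itertools import permutations

        # Causal order on four events: a precedes b and c, which both
        # precede d; b and c are causally unrelated ("spacelike separated").
        precedence = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")}

        def compatible(ordering):
            """A total order is an admissible 'frame' iff it respects every causal link."""
            position = {event: i for i, event in enumerate(ordering)}
            return all(position[x] < position[y] for (x, y) in precedence)

        frames = [o for o in permutations("abcd") if compatible(o)]
        print(frames)  # b and c appear in either order: relativity of simultaneity
        ```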

        2. Regarding the recovery of established physics, I don't mean that every prediction of currently accepted models, including those that haven't been experimentally verified, must be reproduced; that would be pointless. What I mean is that a new theory must do at least as well as current theories in explaining and predicting what we actually see. For instance, general relativity modifies Newton's laws only very slightly for solar system dynamics; if general relativity had predicted an inverse cube force for the gravity between the earth and the moon, it would have been thrown out. Similarly, a new theory must look like general relativity and quantum theory in any situation where those theories have been proven to work.

        3. Regarding the general requirements for a theory, yes, I agree completely. Experimental evidence is the final judge.

        Take care,

        Ben

        • [deleted]

        Ben,

        Will you be able to recover Minkowski spacetime and relativity in general from your new principles without additionally assuming the constancy of the speed of light (Einstein's 1905 light postulate)? Einsteinians sometimes claim that "the constant speed of light is unnecessary for the construction of the theories of relativity" but this is a fraud of course:

        Jean-Marc Lévy-Leblond: "Supposez que demain un expérimentateur soit capable de vraiment mettre la main sur le photon, et de dire qu'il n'a pas une masse nulle. Qu'il a une masse de, mettons 10^(-60)kg. Sa masse n'est pas nulle, et du coup la lumière ne va plus à la "vitesse de la lumière". Vous pouvez imaginer les gros titres dans les journaux : "La théorie de la relativité s'effondre", "Einstein s'est trompé", etc. Or cette éventuelle observation ne serait en rien contradictoire avec la théorie de la relativité!"

        Jean-Marc Lévy-Leblond "De la relativité à la chronogéométrie ou: Pour en finir avec le "second postulat" et autres fossiles": "Il se pourrait même que de futures mesures mettent en évidence une masse infime, mais non-nulle, du photon ; la lumière alors n'irait plus à la "vitesse de la lumière", ou, plus précisément, la vitesse de la lumière, désormais variable, ne s'identifierait plus à la vitesse limite invariante. Les procédures opérationnelles mises en jeu par le "second postulat" deviendraient caduques ipso facto. La théorie elle-même en serait-elle invalidée ? Heureusement, il n'en est rien..."

        Tom Roberts: "If it is ultimately discovered that the photon has a nonzero mass (i.e. light in vacuum does not travel at the invariant speed of the Lorentz transform), SR would be unaffected but both Maxwell's equations and QED would be refuted (or rather, their domains of applicability would be reduced)."

        Why Einstein was wrong about relativity, 29 October 2008, Mark Buchanan, NEW SCIENTIST: "A photon with any mass at all would imply that our understanding of electricity and magnetism is wrong, and that electric charge might not be conserved. That would be problem enough, but a massive photon would also spell deep trouble for the second postulate, as a photon with mass would not necessarily always travel at the same speed. Feigenbaum's work shows how, contrary to many physicists' beliefs, this need not be a problem for relativity."

        Tom Roberts: "As I said before, Special Relativity would not be affected by a non-zero photon mass, as Einstein's second postulate is not required in a modern derivation (using group theory one obtains three related theories, two of which are solidly refuted experimentally and the third is SR). So today's foundations of modern physics would not be threatened.

        Mitchell J. Feigenbaum: "In this paper, not only do I show that the constant speed of light is unnecessary for the construction of the theories of relativity, but overwhelmingly more, there is no room for it in the theory. (...) We can make a few guesses. There is a "villain" in the story, who, of course, is Newton."

        Pentcho Valev

          Dear Pentcho,

          Thanks for the feedback! By "recovering" relativity, I don't mean that I believe relativity in Einstein's original form is absolutely valid (otherwise, why would it need to be replaced or superseded?). What I mean is that in any case in which relativity makes good predictions, any theory that supersedes it must do at least as well. Hence, a new theory must be able to describe/predict anything that relativity can describe/predict.

          Regarding the constancy of the speed of light, my guess would be that a concept like this only makes sense at sufficiently large scales. "Speed" requires a notion of distance, and my view is that spatial distance (separation) is ultimately just a way of talking about the extent to which events are "unrelated." It begins to look like a traditional distance only at large enough scales.

          Regarding photon mass, it was thought for a long time that neutrinos had no mass, but it was eventually discovered that they do have mass after all. Hence, I would be inclined to keep an open mind even about something as "sacred" as that. However, mass itself is again an emergent concept in my view, so the questions of what a "photon" really is and what "mass" really is are things that cannot be taken for granted.

          You'll have to remember that my background is mostly mathematical, and therefore I'm inclined to consider the possibility of things that most physicists "know" are wrong. This might be useful in some cases; in others it only reflects my own ignorance. Take care,

          Ben

          • [deleted]

          Dear Benjamin

          You wrote: "Predictions based on quantum field theory and the Planck scale yield a value for the cosmological constant roughly 120 orders of magnitude greater than observation implies."

          If you read the posts on my essay attentively, you will find the following:

          Yuri Danoyan wrote on Sep. 4, 2012 @ 00:25 GMT

          Appendix 4 Solution of cosmological constant problem

          Theory: the cosmological constant is 10^94 g/cm^3

          Practice: the cosmological constant is 10^-28 g/cm^3

          Planck constant h = 10^-28 g x cm^2/sec in 2D space embedded in 3D space

          Only the experimental value is right.

          The theory is based on the wrong assumptions noted in my essay.

            Dear Yuri,

            I did look through the comments on your thread, but I am afraid I don't quite understand. It seems you are suggesting there is a simple dimensional relationship that explains the observed value of the cosmological constant. This would be great, but it's not obvious to me. Do you mind explaining a little more? Take care,

            Ben