Dear David Brown,

Thank you for a very interesting essay.

As you may have noted from my posts to other essayists, I prefer a finite rotating universe composed of aether (one type of particle) and matter (also one type of particle), with simple rules (force laws) between aggregations of matter. This simple model predicts a baryonic CDM that is neutral and in equal proportion to normal matter. This means that there must be a lot more unseen normal matter than is counted in the numerous censuses (most likely H2!). My theory of gravity is different from all others but obeys Newtonian rules. I do not go with gravitons or any form of boson as a force carrier at all, as I can explain forces using fields instead. However, I do not cover these ideas in my essay, which is on other matters pertaining to the essay topic of the 3 Un's.

BTW I can calculate the masses of the various baryons quite simply, which makes me think the Koide formula is just a numerical coincidence. I tried to make sense of it, knowing how basic matter is constructed from quarks, but I failed.

Thanks for the Sanejouand link, I shall peruse it when I have finished with the plethora of essays.

LL&P

Lockie Cresswell

How might physicists give an empirical proof that "the Koide formula is just a numerical co-incidence"? Motl has argued the Koide formula is merely a meaningless curiosity.

"Could the Koide formula be real?" by Luboš Motl, 2012

I have speculated that string theory with the finite nature hypothesis implies square-root(mass) has a meaning in terms of area. In string theory with the infinite nature hypothesis, square-root(mass) might have a meaning in terms of Koide-uncertainty, where this uncertainty is somehow related to the string landscape. What might be the possibilities for introducing square-root(mass-energy) as an essential concept in physics?
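For reference, the Koide ratio at issue can be checked directly from the measured charged-lepton masses; here is a minimal Python sketch (the mass values are rounded PDG figures in MeV, my inputs):

# Koide ratio Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
# The empirical claim is Q ~= 2/3; the masses below are rounded PDG values in MeV/c^2.
from math import sqrt

m_e, m_mu, m_tau = 0.511, 105.658, 1776.86

Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau)) ** 2
print(Q, 2 / 3)  # prints roughly 0.66666 versus 0.66667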

    Let us assume that my basic theory is wrong.

    Consider: "The Discrepancy in Galaxy Rotation Curves" by Roy Gomel and Tomer Zimmerman, 2019.

    It seems to me that the idea as stated by Gomel and Zimmerman is wrong. However, I believe that their idea might be modified as follows:

    In the standard form of Einstein's field equations, replace the -1/2 by -1/2 times a Gomel-Zimmerman fictitious-force function.
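    As a sketch of what that replacement would look like (my notation; f_GZ denotes the hypothetical Gomel-Zimmerman fictitious-force function, and f_GZ = 1 recovers standard general relativity):

    R_{\mu\nu} - \tfrac{1}{2} f_{GZ} \, R \, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}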

    Replace the Einstein equivalence principle:

    (Inertial mass-energy) * (Acceleration) = (Intensity of the gravitational field) * (Gravitational mass-energy)

    by an Einstein-Gomel-Zimmerman equivalence principle:

    (Inertial mass-energy) * (Acceleration) = (Intensity of the gravitational field) * (Gravitational mass-energy in the universe + Gravitational mass-energy modified by the multiverse).

    With the Einstein equivalence principle, gravitons should have spin 2.

    With the Einstein-Gomel-Zimmerman equivalence principle, gravitons might have spin 2 + multiverse-epsilon. This excess energy might be the explanation for the empirical successes of Milgrom's MOND.

    What might be plausible ways of modifying the Gomel-Zimmerman explanation?

    Your essay is intriguing...

    I agree with many of your premises and most of your conclusions, but I feel like your understanding is incomplete. There is too little 'glue' in some of your explanations, so the reader is left to fill in the blanks with appreciation for your cited references. As you know, I am not in denial about Milgrom and I have some appreciation for string theory, but this essay falls a little short of the ideal, raising some great questions and showing evidence but not making clear sense of how the pieces fit together.

    I think the jury is out on the zero-mass spin-2 graviton. And some of the pieces you put in place also occur in some theories where the graviton has mass and therefore multiple polarization states, such as the DGP and cascading DGP models. So that piece has merit. As far as uncertainty coming with money goes, I think your view is a bit short-sighted. The story goes that money was invented when temple worshipers argued over the value of various commodities like pigs or cattle, sheaves of wheat, or baskets of fruit, and they had to come up with a standard unit of exchange.

    There may be a good reason why the root words coinus and coitus sound the same. And ironically, the cause of the market crash in 2008 (?) might have been something called the Gaussian copula function, used to estimate the risk of derivatives. David X. Li was very clever, and he gave his caveats first, then described the advantages of using his equation. So people had a false sense of security that risk estimation equaled risk containment. Mandelbrot warned people about some of this: not to trust the bell curve (Gaussian), because it was truncated due to attrition. People should have known better. It was greed and bad math that made things economically unstable.
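    For readers who have not met the object Jonathan names, here is a minimal Python sketch of the Gaussian-copula idea (the correlation and default probability below are made-up illustration values, not Li's calibration):

    # Gaussian copula: impose a single correlation parameter on the joint behaviour
    # of two default events by pushing correlated normals through the normal CDF.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    rho = 0.3                                  # assumed latent correlation
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
    u = norm.cdf(z)                            # uniform marginals, Gaussian dependence

    p = 0.05                                   # assumed marginal default probability
    joint = np.mean((u[:, 0] < p) & (u[:, 1] < p))
    print(joint, p * p)                        # copula joint default vs. independence

    The whole dependence structure is carried by the single parameter rho, and the Gaussian tails understate how strongly defaults cluster in a crisis, which is essentially the Mandelbrot-style complaint recounted above.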

    All the Best,

    Jonathan

      "... the cause of the market crash in 2008 ..." Why do markets crash? Are markets and money always permeated by hidden uncertainties? It is unclear what money is -- is it wampum, tulips, or Bitcoin?

      According to Ben Bernanke, "Financial panics have a substantial psychological component. Projecting calm, rationality, and reassurance is half the battle ..."

      "Crashed: How a Decade of Financial Crises Changed the World" by Adam Tooze, Penguin, 2018

      Consider the following hypothesis: General relativity theory is a mathematical formulation of the concept of Einsteinian reference frames. However, Einstein's model omits dark matter, which is a stringy Gomel-Zimmerman fictitious force. This hypothetical fictitious force arises from the gravitational mass-energy of the Calabi-Yau component of the string landscape. In other words, the string landscape does not yield Einsteinian cosmological models but instead yields modified Einsteinian models with the stringy Gomel-Zimmerman fictitious force.

      Calabi-Yau manifold, Wikipedia

      Absolutely!

      The cause is more than half panic, in many cases. I've been reading about the neoliberals and neoconservatives trying to outmaneuver each other for financial gain and to preserve the ideological conflict. Both schemes tend to concentrate wealth for the elites while leaving the commonwealth unfed. The only real difference is which special interests win.

      Disgusting! Only a strong middle class guarantees basic freedoms. Otherwise, the whole finance game goes down. The world is coming more and more to resemble the dark predictions in the movie "Rollerball", and what really suffers most is the push for individual accomplishment. People are being trained to act like cogs in a machine. Not healthy for society.

      I'll have to read what Bernanke has to say.

      Best,

      JJD

      Dear David,

      I notice that Jonathan comments that the jury is still out on the spin 2 graviton.

      I commented to Andrew Beckwith: "I am not a fan of bosons as fundamental force particles (except for the Higgs), and can provide alternative suggestions for photons, gluons, and W/Z bosons. Nor am I a believer in gravitons, as I have formulated my own 'action at a distance' theory of gravity using strings of what I suppose are Higgs particles, although I call them ginn (or aether particles). Because I have a working particle theory, I decided to do a back-of-the-envelope calculation of their (string) gram-equivalent mass and got a number of 10^-34 g, which is some 28 orders of magnitude greater than the 10^-62 g you (A.B.) mentioned for the graviton."
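      The quoted gap is easy to sanity-check in one line of Python, using only the two figures above:

      from math import log10
      print(log10(1e-34 / 1e-62))  # 28.0 -- the "28 orders of magnitude" quoted above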

      I also have a working theory that dark matter is in two forms, a halo of neutral antimatter, and a good deal of normal matter in a variety of forms, notably H2.

      Thanks for the Motl link, I shall have a look right away.

      Regards

      Lockie

      Hi David,

      The thread reply wasn't working, so I will post here instead. I checked out Motl and also Rivera and Gsponer. Most interesting! My gimli theory (nothing to do with Rivera's Gims) explains why the masses of the muon and tau are greater, but I cannot yet predict the values (hopefully in the future). What I didn't like about Koide was that, with the quarks, he included the strange with the up and down. That is mixing the families, which I think is a mistake. I am still a bit unnerved with respect to the leptons, given that I consider they are composed of preons that fit ratios of 1/3.

      As for Gomel-Zimmerman, I think fictitious forces that arise in non-inertial frames should be taken into account when constructing a theoretical rotation curve. I'm not sure about the G-Z stringy stuff, although I suspect DM's gravity is the same as normal matter's gravity (which in my theory is all stringy stuff of a sort).

      Cheers

      Lockie

        The question of whether "DM's gravity is the same as normal matter's gravity" is central to understanding Milgrom's MOND. Milgrom's MOND has many empirical successes -- there are 2 basic possibilities:

        (1) After quantum averaging, Einstein's field equations are 100% correct, but appear to be slightly wrong for some unknown reason.

        (2) After quantum averaging, Newtonian-Einsteinian gravitational theory is slightly wrong.

        Have the string theorists underestimated Milgrom? My guess is that supersymmetry occurs in nature if and only if nature is infinite.

        According to Halverson and Langacker, "Superstring theory ... yields a consistent and finite unification of quantum theory, gravity, and other interactions, at least at the perturbative level, and is therefore a promising candidate for an ultimate unified theory. However, it is not clear whether Nature actually takes advantage of string theory, or whether there is any way to confirm or falsify it. ... Part of the problem is that the fundamental string scale is most likely much larger than can ever be directly probed experimentally. ..."

        Halverson, James, and Paul Langacker. "TASI lectures on remnants from the string landscape." arXiv preprint arXiv:1801.03503 (2018)

        It is clear to me that if dark matter particles occur in nature then some (and perhaps all) of them must display MONDian weirdness (whatever that might be). However, it seems to me that the string landscape allows so many fudge factors that it can never be decisively refuted.

        7 days later

        To what extent are the foundations of physics undecidable? How is string theory related to undecidability, uncomputability, and unpredictability? On the basis of string theory is there a unified theory of algebraic geometry, differentiable geometry, and theoretical physics?

        According to Hirosi Ooguri in 2009, "The topological string theory was introduced by E. Witten about 20 years ago, and it has been developed by collaborations of physicists and mathematicians. Its mathematical structure is very rich, and it has led to discoveries of new connections between different areas of mathematics, ranging from algebraic geometry, symplectic geometry and topology, to combinatorics, probability and representation theory."

        Ooguri, Hirosi. "Geometry as seen by string theory." Japanese Journal of Mathematics 4, no. 2 (2009): 95.

        arXiv preprint, 2009

        Think of algebraic geometry as theoretical physics with the removal of time and energy. What is the fundamental role of lattices in algebraic geometry? What is the fundamental role of lattice vibrations in theoretical physics? My guess is that supersymmetry occurs in nature if and only if nature is infinite, and nature is finite if and only if string vibrations are confined to 3 copies of the Leech lattice.

        9 days later

        Hi David,

        I am still reading various essays and I came across some further comments of yours that got me reading and thinking.

        Firstly, my gimli theory makes a clear prediction of what dark matter is: it is baryonic in nature, and I would expect it to be in equal proportion to normal matter, which appears not to be the case. This might imply that there is more normal matter in the form of H2 that hasn't yet been detected, and possibly some warm dark matter component (I suspect neutrinos) that also needs to be 'weighed'.
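        To put a rough number on that tension, here is a back-of-the-envelope Python sketch using Planck-like density parameters; the parameter values are my own assumed inputs, not figures from Lockie's theory:

        # Assumed Planck-like density parameters (illustrative inputs only).
        omega_b = 0.049                 # baryons counted in standard censuses
        omega_c = 0.26                  # cold dark matter
        omega_m = omega_b + omega_c     # total matter

        # If dark and normal matter were in equal proportion, each would be omega_m / 2.
        normal_needed = omega_m / 2
        print(normal_needed / omega_b)  # ~3.2 -- how much larger the ordinary-matter
                                        # budget (H2, etc.) would need to be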

        On another tack, I noticed a reference to an old presentation of mine that discussed GM = tc^3, which is something I hypothesised back in the early 2000s, when I worked out a definition of time. When presenting a section on time in 2013, I included a section on variable-speed-of-light theories by Einstein, Dicke, Magueijo, and my own. I speculated that if the permittivity of free space was held constant (since I had no reason to expect charge to vary), then the permeability of free space would have to change over time, and that this may be seen in astrophysical observations. I have also used changing permeability to successfully explain neutrino oscillations, so there may be something worthwhile in it. Using your references I just found that Riofrio and Kulick have both used the same idea of GM = tc^3, although from different reasonings. As I said, mine came directly from my definition of time, which is based on the Einstein-Planck formula with an important modification to allow for the clock.
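        A minimal Python sketch of the scalings Lockie describes, taking GM = tc^3 at face value and holding the permittivity fixed in c = 1/sqrt(eps0 * mu0); the normalisation below is an illustrative assumption, not his fitted value:

        # If GM = t*c^3 with G and M fixed, then c(t) = (G*M/t)**(1/3); holding eps0
        # fixed in c = 1/sqrt(eps0*mu0) then forces mu0(t) = 1/(eps0*c(t)**2).
        eps0 = 8.8541878128e-12   # F/m, taken as constant per the hypothesis
        GM = 1.0                  # illustrative normalisation, not a fitted value

        def c_of_t(t):
            return (GM / t) ** (1.0 / 3.0)

        def mu0_of_t(t):
            return 1.0 / (eps0 * c_of_t(t) ** 2)

        for t in (1.0, 2.0, 4.0):  # arbitrary epochs: c falls and mu0 rises with age
            print(t, c_of_t(t), mu0_of_t(t))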

        Hope this may be of interest,

        Lockie Cresswell

        The prediction of "what dark matter is" seems to me to be a concept of string theory with the infinite nature hypothesis -- and not a concept of string theory with the finite nature hypothesis (i.e. there is no dark matter). My guess is that, in string theory with the infinite nature hypothesis, it is difficult to accommodate the hypothesis that the speed of light in a perfect vacuum decreases as our universe ages -- it leads to some extremely complicated modification of the prevailing paradigm of physics. In general relativity, there are the two following qualitative predictions: As energy-density increases, time slows down. As energy-density decreases, time speeds up. If the radius of our universe is a constant, the speeding up of time would seem to be equivalent to a loss of gravitational energy. According to Sean Carroll, "The Bullet Cluster and the CMB both provide straightforward evidence that there is gravity pointing in the direction of something other than the ordinary matter."

        "Dark matter vs. modified gravity: trialogue", Sean Carroll's blog (preposterousverse.com/blog), May 9, 2012

        However, Sean Carroll assumes that the gravitational bending of light is adequately predicted by Einstein's field equations -- I suggest dark-matter-compensation-constant = (3.9±.5) * 10^-5 .
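        If the compensation constant enters as a fractional correction to the Einstein deflection angle (that is my reading of the suggestion, not something stated explicitly), the size of the effect for light grazing the Sun can be sketched in Python:

        # Einstein deflection for light grazing the Sun, with a hypothetical fractional
        # correction of 3.9e-5 applied on top (interpretation assumed, not stated).
        import math

        G = 6.674e-11        # m^3 kg^-1 s^-2
        c = 2.998e8          # m/s
        M = 1.989e30         # kg, solar mass
        R = 6.957e8          # m, solar radius used as the impact parameter

        alpha_einstein = 4 * G * M / (c**2 * R)       # radians, about 1.75 arcsec
        delta = 3.9e-5                                # dark-matter-compensation-constant
        alpha_modified = alpha_einstein * (1 + delta)

        arcsec = 180 / math.pi * 3600
        print(alpha_einstein * arcsec, alpha_modified * arcsec)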

        23 days later
        a month later

        According to Denef, Douglas, Greene, & Zukowski, "By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it."

        Denef, Frederik, Michael R. Douglas, Brian Greene, and Claire Zukowski. "Computational complexity of the landscape II--Cosmological considerations." Annals of Physics 392 (2018): 93-127.

        arXiv preprint

        Does string theory with the infinite nature hypothesis allow too many vacua? In terms of string theory, is it possible to understand why Milgrom's MOND has many empirical successes?

        In the article "Did the Universe Just Happen?" in the April 1988 issue of "The Atlantic Monthly", Robert Wright stated, "I talked with Richard Feynman, a Nobel laureate at the California Institute of Technology, before his death, in February. Feynman considered Fredkin a brilliant and consistently original, though sometimes incautious, thinker. If anyone is going to come up with a new and fruitful way of looking at physics, Feynman said, Fredkin will."

        It seems to me that there are 2 alternatives: (1) Feynman overestimated Fredkin. (2) The string theorists have underestimated Fredkin.

        Assume that Majorana fermions do not occur in nature.

        In the Standard Model of particle physics, there are 36 different quarks, 12 leptons, and 13 bosons, giving a total of 61 fundamental particles. Add 3 bosons: the graviton, the axion, and the inflaton. Do 64 dimensions of particle paths correspond to 64 dimensions of uncertainty based upon h-bar and alpha-prime?
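        The bookkeeping behind those numbers can be tallied explicitly; here is a short Python sketch using the counting convention (colour states and antiparticles counted separately) that the 36/12/13 figures imply:

        # Standard Model count with colour states and antiparticles counted separately.
        quarks  = 6 * 3 * 2          # 6 flavours x 3 colours x (particle, antiparticle) = 36
        leptons = 6 * 2              # 6 flavours x (particle, antiparticle) = 12
        bosons  = 8 + 1 + 2 + 1 + 1  # gluons + photon + W+/W- + Z + Higgs = 13

        total_sm = quarks + leptons + bosons
        print(total_sm)              # 61
        print(total_sm + 3)          # 64 after adjoining graviton, axion, and inflaton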

        Why are there 6 basic quarks? Consider 2 other questions: What is measurement? Why does measurement exist?

        Let us consider 3 beliefs:

        (1) Quantum information is irreducible. (2) Measurement is an activity that experimental physicists do. (3) Measurements occur because experimental physicists want them to occur.

        Now, consider 3 alternate beliefs:

        (A) Quantum information reduces to Fredkin-Wolfram information. (B) Measurement is a natural process that separates the boundary of the multiverse from the interior of the multiverse. (C) Measurement always occurs in terms of quantum information, but quantum information is merely an approximation generated by Wolfram's cosmological automaton using the monster group and the 6 pariah groups.

        Are there 6 basic quarks because there are 6 pariah groups? If the answer to the previous question is "No!" then my guess is that string theory with the infinite nature hypothesis is the way to understand the foundations of physics. Is it possible to understand MOND in terms of gravitinos with variable effective mass based upon nearby gravitational acceleration?
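        For concreteness, here are the two sets of six that the question compares; the pairing printed below is purely illustrative, and no particular correspondence is being claimed:

        # The six pariah groups (sporadic groups not involved in the monster) and the
        # six quark flavours; the zip is an arbitrary illustrative pairing only.
        pariah_groups  = ["J1", "J3", "J4", "Ly", "O'N", "Ru"]
        quark_flavours = ["up", "down", "charm", "strange", "top", "bottom"]

        for group, quark in zip(pariah_groups, quark_flavours):
            print(group, "<->", quark)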

        According to Kroupa, dark matter particles are unlikely to exist. I agree with Kroupa. For alternate ideas, see:

        Arcadi, Giorgio, Abdelhak Djouadi, and Martti Raidal. "Dark matter through the Higgs portal." Physics Reports 842 (2020): 1-180.

        arXiv preprint

        14 days later

        Consider the following:

        Sanejouand, Yves-Henri. "A framework for the next generation of stationary cosmological models." arXiv preprint arXiv:2005.07931 (2020).

        My guess is the preceding article is a work of genius, but the section "How are photons lost?" is wrong -- photons are not lost, instead they display the effects of loss of gravitational energy -- such loss confirms the existence of the timing mechanism of Wolfram's cosmological automaton. I say that MOND is the key to understanding string theory with the finite nature hypothesis, i.e. Wolfram's cosmological automaton.

        Claim: The empirical successes of MOND prove that string theory (with the finite nature hypothesis) is correct -- the string vibrations are confined to 3 copies of the Leech lattice -- the monster group and the 6 pariah groups allow Wolfram's cosmological automaton to create approximations of string vibrations.

        MOND idea: Force = mass * (Newtonian-acceleration + excess-acceleration-based-on-a0)

        Alternate idea (string theory with the finite nature hypothesis):

        Force = (A0 + G)mM/r^2 = Newtonian-force + excess-gravitational-force

        Claim: the MOND idea and the alternate idea are approximately mathematically equivalent.
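        Here is a minimal Python sketch that puts the two expressions side by side for a point mass; the "simple" MOND interpolating function and the numerical values of a0, A0, M, and r are my assumptions, so this probes the claimed equivalence rather than confirming it:

        # Compare Newtonian, MOND ("simple" interpolating function), and the
        # "(A0 + G)" modification for a point mass M at radius r.
        import math

        G  = 6.674e-11    # m^3 kg^-1 s^-2
        a0 = 1.2e-10      # m/s^2, the usual MOND acceleration scale
        A0 = 1.0e-12      # hypothetical constant shift of G (same units as G), illustration only
        M  = 2.0e41       # kg, roughly a galaxy-scale mass

        def a_newton(r):
            return G * M / r**2

        def a_mond(r):
            # Simple interpolation: a = a_N * (1/2 + sqrt(1/4 + a0/a_N))
            aN = a_newton(r)
            return aN * (0.5 + math.sqrt(0.25 + a0 / aN))

        def a_shifted_G(r):
            return (G + A0) * M / r**2

        for r in (1e19, 1e20, 1e21):   # metres, outer-galaxy scales
            print(r, a_newton(r), a_mond(r), a_shifted_G(r))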

        What might be going on? According to Riofrio and Sanejouand, the radius of our universe is a constant -- our universe is not expanding. Astrophysicists mistakenly believe that our universe is expanding because they mistakenly believe that gravitational energy is conserved. Instead of our universe expanding, the observers and their associated reference frames are shrinking. During each Planck time interval, precisely one unit of Fredkin-Wolfram gravitational energy is transferred from the boundary of the multiverse into the interior of the multiverse. Gravitons do not have spin 2 -- instead they have spin 2 on average, and there is a new uncertainty principle for graviton spin. The graviton spin uncertainty principle allows some gravitons to escape from the boundary of the multiverse into the interior of the multiverse. The escape process causes 3 phenomena:

        (1) For people who believe that gravitational energy is conserved, our universe seems to be expanding with a nonzero cosmological constant, i.e. dark energy; instead of a weird negative pressure there is a steady, real loss of gravitational energy.

        (2) MOND has many empirical successes, i.e. MOND replaces the so-called dark matter -- the escape of gravitons causes a back reaction against gravitational lines of force, thus resulting in dark-matter-compensation-constant = (3.9±.5) * 10^-5 .

        (3) The escape of gravitons causes an excess flattening of spacetime -- thus causing a deflaton field which replaces the inflaton field.

        Is the preceding scenario nonsense?

        6 months later

        My guess is that string theory with the infinite nature hypothesis implies the Friedmann model and dark-matter-compensation-constant = 0, but string theory with the finite nature hypothesis implies the Riofrio-Sanejouand model and dark-matter-compensation-constant = (3.9±.5) * 10^-5 . Have string theorists overlooked several possibilities for restricting the string landscape? J. P. Lestone of Los Alamos National Laboratory has developed a theory of virtual cross sections.

        "Vacuum Polarization: A Different Perspective" by J. P. Lestone, report LA-UR-20-23090, April 2020

        "QED: A different perspective" by J. P. Lestone, report LA-UR-18-29048, September 2018

        "Possible reason for the numerical value of the fine-structure constant" by J. P. Lestone, report LA-UR-18-21550, February 2018

        "Possible Mechanism for the Generation of a Fundamental Unit of Charge (long version)" by J. P. Lestone, report LA-UR-17-24901, June 2017

        Lestone's theory is semi-classical and is not completely valid in terms of relativity theory. Many string theorists are unaware of Lestone's theory (which is loosely based upon string theory). Can Lestone's theory be made fully relativistic and, at least partially, compatible with string theory?

        Consider the following speculation:

        Tachyonic Network Hypothesis (with subsidiary hypotheses A,B,C,D):

        (A) String theory with the infinite nature hypothesis implies the Big Bang, the inflaton field, and that, after quantum averaging, Einstein's field equations are totally correct. For times shorter than one Planck-time unit, the concepts of measurement, energy, and spacetime fail. For times in the range from one Planck-time unit to one hundred thousand Planck-time units, the Heisenberg uncertainty principle needs to be replaced by a generalized uncertainty principle (GUP). Call this the GUP range. For times greater than one hundred thousand Planck-time units, the Heisenberg uncertainty principle is valid for almost all purposes.

        (B) In the GUP range, 3-dimensional space needs to be replaced by 9-dimensional virtual space, which consists of 3 copies of the unit sphere in quaternionic space. In the GUP range, 1-dimensional time needs to be replaced by 2-dimensional time, which consists of 1 dimension of measurable time and 1 dimension of non-measurable, imaginary time. Non-measurable, imaginary time is bounded. As non-measurable, imaginary time shrinks to zero, 9-dimensional virtual space shrinks to 3-dimensional measurable space. Massless bosons have 0-dimensional virtual cross sections. Massive bosons have 1-dimensional virtual cross sections. Leptons have 2-dimensional virtual cross sections. Quarks have 8-dimensional virtual cross sections.

        (C) To 36 quarks, 12 leptons, 9 massless bosons, and 4 massive bosons, adjoin the graviton, the axion, and the inflaton to get 64 fundamental particles. There is a 256-dimensional lattice approximation for string theory in which there are 64 virtual particle paths with each particle path having 4 dimensions of uncertainty. The 256-dimensional lattice approximation with 8 dimensions of virtual-cross-sectional uncertainty generates 264 dimensions of stringy uncertainty, which can be decomposed into 11 copies of the Leech lattice (the bookkeeping is tallied in the short sketch after (D) below).

        (D) Each virtual particle has 11 copies of the Leech lattice associated with it. As virtual particles collapse into measurable particles, there is an enormous network of non-measurable virtual tachyons, which link the measurable particles together. The virtual tachyons do not last long enough to be directly measured, but they do allow a transient violation of energy conservation which is postulated in Lestone's theory of virtual cross sections. The massless virtual particles collapse into 10 possible massless bosons. The massive virtual particles (which have imaginary mass-energy) collapse into 54 possible bosons. There might be other measurable particles for which this lattice approximation scheme fails (e.g. SUSY, magnetic monopoles, Majorana fermions), but the lattice approximation based upon 11 copies of the Leech lattice does work well enough to justify Lestone's theory of virtual cross sections.
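        The dimensional bookkeeping promised in (C) can be checked in a few lines of Python (24 is the dimension of the Leech lattice):

        # 64 particle paths x 4 dimensions of uncertainty, plus 8 virtual-cross-section
        # dimensions, decomposed into copies of the 24-dimensional Leech lattice.
        paths, dims_per_path, cross_section_dims, leech_dim = 64, 4, 8, 24
        total = paths * dims_per_path + cross_section_dims   # 256 + 8 = 264
        print(total, total // leech_dim, total % leech_dim)  # 264, 11, 0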
