My impression is that, as of the beginning of February 2020 C.E., both 't Hooft and Wolfram fail to realize that Milgrom is the Kepler of contemporary cosmology -- they have some of the correct concepts, but they have not appreciated the ideas of Milgrom, Riofrio, Sanejouand, and Pipino. MOND is data-based -- according to Kroupa, non-relativistic MOND is remarkably successful. The Riofrio-Sanejouand cosmological model is also data-based: Riofrio published her model in 2004, and by studying the same data, Sanejouand independently arrived at the model.

Sanejouand, Yves-Henri. "A simple varying-speed-of-light hypothesis is enough for explaining high-redshift supernovae data." arXiv preprint astro-ph/0509582 (2005)

What do I mean by the term "Einstein-Riofrio duality principle"?

In string theory with the infinite nature hypothesis, the assumption is that, after quantum averaging, Einstein's field equations are 100% correct. Can the ΛCDM model be empirically refuted?

Lambda-CDM model, Wikipedia

By using supersymmetry, D-branes, and D-brane charges, my guess is that mathematical models of dark matter particles and the inflaton field can be cleverly adjusted to match any plausible, or implausible, physics.

My guess is that the string theorists have discovered the "Einstein" part of Einstein-Riofrio duality. In the "Riofrio" part of Einstein-Riofrio duality, there are 3 modifications to Einstein's field equations: a cutoff for minimum wavelength, a cutoff for maximum wavelength, and dark-matter-compensation-constant = (3.9±.5) * 10^-5. Furthermore, the inflaton field is redefined: Guth's inflaton field is replaced by an inflaton field that is defined in terms of the Riofrio-Sanejouand model.

I think some of those things have already been explored...

Using D-branes and D-brane charges is what DGP gravity and Cascading DGP are about, so Pourhasan, Afshordi, and Mann's idea of a 5-d black hole --> 4-d white hole might implement that; combine it with VSL and you get something like what Afshordi and Magueijo came up with.

Standard inflation may not be able to give you what you want by varying parameters, but even if it could, Steinhardt thinks this is a pathology, because it fails to yield a single consistent picture. He is not a fan of having a String Theory landscape either, unless we also have a strategy at the ready to sub-select for physically realistic options.

One would wish for a formulation where it falls out of the model, rather than requiring adjusted parameters. Using the Mandelbrot Set as a guide suggests options like the above-noted theories, but it only works out if there are no adjusting factors applied. The pure form of M already describes a scenario like DGP but has no need to adjust brane tension and so forth.

In fact, it suggests there was torsion on the fabric - as well as tension - which Joy Christian and Fred Diether say is what gave us the spectrum of particles we see. I.e. spin fields and trapped spin in the fabric (torsion) in the early universe produce spinor particles in the current day, according to their model. There is a lot to explore. It's hard for me to keep up.

Best Wishes,

Jonathan

Or you could add Milgrom too...

Then it would be MMMM-theory.

Sorry couldn't resist,

Jonathan

Just another thought...

What if the speed of light is a measure of the universe's mass? If we imagine there was a matter-free regime in the radiation-dominated early universe, perhaps this translates into a higher light speed. Taking Einstein's venerable equivalence equation and flipping terms to solve for c, we obtain c^2 = E/m; then let m --> 0 and discover that c^2 is unbounded in a universe devoid of mass.
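The flip-the-terms algebra can be shown with a toy numerical sketch (purely illustrative; the fixed energy E = 1 and the arbitrary units are assumptions for illustration, not physics):

```python
import math

# Toy illustration of c^2 = E/m: holding the energy E fixed, the
# implied "speed of light" grows without bound as the mass m -> 0.
# E = 1 and the unit choices are assumptions for illustration only.
E = 1.0
for m in [1.0, 1e-4, 1e-8, 1e-12]:
    c = math.sqrt(E / m)  # from c^2 = E/m
    print(f"m = {m:.0e}  ->  c = {c:.0e}")
```

As m shrinks by four orders of magnitude, c grows by two, with no upper bound; this is just the algebra of the limit, not a claim about actual early-universe physics.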

I interpret this to mean the speed of light is infinite in the 2-d regime near the Planck scale as postulated by many Quantum Gravity theories. But it also might reproduce exactly the effect Sanejouand was talking about - assuming more matter congeals out of energy over time. My essay focuses on a mechanism to do exactly that. Perhaps we are more on the same page than you imagine.

All the Best,

Jonathan

    "What if the speed of light is a measure of the universe's mass?" The preceding question, according to my speculation, is the essential question in deciding between string theory with the infinite nature hypotheis (dark matter particles) versus string theory with the finite nature hypothesis (no dark matter particles). If dark-matter-compensation-constant = zero, then my guess is that the Riofrio-Sanejouand model is wrong. If the Riofrio-Sanejouand model is empirically valid, then my guess is that, during each Planck time interval, gravitational energy is transferred from the boundary of the multiverse into the interior of the universe -- implying that the mass-energy of our universe is related to the average speed of light in intergalactic space.

    According to Peebles, "The evidence for the dark matter (DM) of the hot big bang cosmology is about as good as it gets in natural science."

    Peebles, P. James E. "Dark matter." Proceedings of the National Academy of Sciences 112, no. 40 (2015): 12246-12248.

    My understanding of the evidence is that Kroupa is correct about dark matter particles and Peebles is wrong -- but perhaps dark matter particles with weird MONDian properties might be discovered.

    My guess is that string theory is empirically irrefutable. According to Schwarz, "... we explored whether it is possible to interpret the massless spin 2 state in the closed-string spectrum as a graviton. This required carrying out an analysis analogous to the earlier one of Neveu and Scherk. This time one needed to decide whether the interactions of the massless spin 2 particle in string theory agree at low energy with those of the graviton in general relativity (GR). Success was inevitable, because GR is the only consistent possibility at low energies (i.e., neglecting corrections due to higher-dimension operators), and critical string theory certainly is consistent. At least, it contains the requisite gauge invariances to decouple all but the transverse polarizations. Therefore, the harder part of this work was forcing oneself to ask the right question. Finding the right answer was easy. In fact, by invoking certain general theorems, due to Weinberg ... , we were able to argue that string theory agrees with general relativity at low energies ... . Although we were not aware of it at the time, Tamiaki Yoneya had obtained the same result somewhat earlier ... Scherk and I proposed to interpret string theory as a quantum theory of gravity, unified with the other forces. This meant taking the whole theory seriously, not just viewing it as a framework for deriving GR and Yang-Mills theory as limits."

    Schwarz, John H. "The early history of string theory and supersymmetry." arXiv preprint arXiv:1201.0981 (2012)

    Consider 2 viewpoints (A & B) concerning string theory:

    Viewpoint A. If an individual string could be measured, then a stringy generalization of Heisenberg's uncertainty principle would be revealed. Spacetime is doomed in terms of mathematical symmetries of the string landscape. At the Planck scale, there is curling up of extra spatial dimensions. An electron is a wave-particle possibility that probabilistically propagates through spacetime. Measurement is something that experimental physicists do. Strings are geometric completions of quantum probability amplitudes. After quantum averaging, Einstein's field equations are 100% correct.

    Viewpoint B. An individual string cannot in principle be measured. String vibrations are entirely virtual. Heisenberg's uncertainty principle is empirically valid whenever and wherever measurement can occur. Spacetime is doomed in terms of Wolfram's cosmological automaton (which is defined by 4 or 5 simple rules). At the Planck scale, there are no extra spatial dimensions -- string vibrations are confined to 3 copies of the Leech lattice. An electron is an approximate, computational possibility that discontinuously, computationally propagates in the interior of the multiverse generated by Wolfram's automaton. All measurement occurs in terms of quantum information, but time, space, energy, and quantum information are merely approximations generated by Wolfram's automaton. Measurement is a natural process that separates the boundary of the multiverse from the interior of the multiverse. During each Planck time interval, gravitational energy is transferred from the boundary of the multiverse into the interior of the multiverse. There are a huge (but finite) number of alternate universes, all of which are on the boundary of the multiverse. The multiverse is mathematically isomorphic to a 72-dimensional holographic, digital computer in the form of a giant ball controlled by the monster group, the 6 pariah groups, and Wolfram's cosmological automaton. There are 6 basic quarks because there are 6 pariah groups. Einstein's field equations, after quantum averaging, require 3 modifications: a Koide cutoff, a Lestone cutoff, and dark-matter-compensation-constant = (3.9±.5) * 10^-5. The 3 modifications are required to model Milgrom's MOND and the Riofrio-Sanejouand cosmological model (which allows the redefinition of the inflaton field). The string theorists think that Viewpoint B is nonsense -- are they correct?

    9 days later

    Does string theory with the finite nature hypothesis provide a new paradigm for understanding uncertainty?

    Louis Marmet founded "A Cosmology Group" (ACG) to publicize problems with the Lambda-CDM model.

    ACG Position (with empirical evidence against the Lambda-CDM model)

    Louis Marmet, York University

    In string theory with the infinite nature hypothesis, the string vibrations are not synchronized among alternate universes -- thus allowing a chaos of incomprehensibility in terms of physical experiments. In string theory with Wolfram's cosmological automaton, there are 2 highly testable predictions:

    (1) relativistic MOND (i.e. dark-matter-compensation-constant = (3.9±.5) * 10^-5) and

    (2) the Riofrio cosmological model (which allows the replacement of Guth's inflaton field by a new inflaton field defined in terms of the Riofrio cosmological model).

    Can supersymmetry be empirically refuted? No, because all of the superpartners might have wavelengths that are too short or too long for detection. Is supersymmetry useful in theoretical physics? Yes, supersymmetry is needed for the Einstein-Riofrio duality principle. In string theory with the infinite nature hypothesis, we assume that, after quantum averaging, Einstein's field equations are 100% correct. In string theory with the finite nature hypothesis, we assume that, after quantum averaging, Einstein's field equations need 3 corrections. Put D-brane supercharges on gravitons, gravitinos, inflatons, and inflatinos. This allows an embedding of a model of string theory with the finite nature hypothesis into a model of string theory with the infinite nature hypothesis.

      Is uncertainty infinite or finite? Is nature infinite or finite?

      According to Tegmark, " ... infinity is an extremely convenient approximation for which we haven't discovered convenient alternatives."

      "Infinity Is a Beautiful Concept -- And It's Ruining Physics" by Mag Tegmark, Discover Magazine, 2015

      I have suggested 3 modifications to Einstein's field equations:

      "Einstein's field equations: 3 criticisms", 2017

      One modification attempts to refute the hypothesis that energy-density continuously approaches zero. Another modification attempts to refute the hypothesis that energy-density can be infinitely large. The modification dark-matter-compensation-constant = (3.9±.5) * 10^-5 attempts to provide a model for relativistic MOND. The hypothesis that nature is finite seems to be in conflict with the hypothesis that, after quantum averaging, Einstein's field equations are 100% correct.

      Marmet is a very nice man...

      He spoke at one of the first Physics conferences I attended, CCC-2 (2nd Crisis in Cosmology Conf.), and Louis was inspirational. Many of the objections raised in his stated position linked above still stand. Possibly of interest would be the diagram from his proceedings paper, which is reproduced on its cover, detailing the redshift predictions for 10 different models: 'Angular distance as a function of redshift.' I think I still have an electronic copy of that volume on one of my working computers, and I can forward his paper with that diagram if you like.

      All the Best,

      Jonathan

      8 days later

      According to Kroupa, the Lambda-CDM model is now ruled out. Is there considerable uncertainty about dark matter particles? (According to my speculations, the Riofrio model needs a Koide cutoff and a Lestone cutoff.)

      According to contemporary scientific thought, the diameter of the observable universe is about 93 billion light years. However, if the Riofrio cosmological model is correct, the radius of our universe is constant, the speed of light in a perfect vacuum is steadily decreasing, and dark matter particles do not exist. If the Riofrio model is empirically valid, the universe is far smaller than most astrophysicists now believe it to be. The question is: How much smaller?

      Let us assume that the fundamental basis of nature is an Einstein-Riofrio duality principle, in which string theory with the infinite nature hypothesis corresponds to the (original) Einsteinian field equations with dark matter particles (Einstein part of the duality), and string theory with the finite nature hypothesis corresponds to modified field equations with the Riofrio cosmological model and without dark matter particles (Riofrio part of the duality). There is a serious problem in understanding the Riofrio model because all the cosmological data is presented in terms of the paradigm with dark matter particles. I speculate that the way to overcome this problem is to assume that ordinary matter is steadily converted into dark matter particles (which might, or might not, exist).

      A 3-sphere with radius r has 3-dimensional cubic hyperarea = (2 pi^2) * (r^3).

      3-sphere, Wikipedia

      The Planck time is approximately 5.39 * 10^-44 seconds.

      Planck time, Wikipedia

      Our universe is approximately 13.82 billion years old.

      Age of the universe, Wikipedia

      Hypothesis 1: Assuming that the Einstein-Friedmann model is valid and dark matter exists, the mass-density of our universe is approximately 9.9 * 10^-30 g/cm^3 . (See the WMAP data.)

      Hypothesis 2: Wolfram's Reset recurs every (81.6±1.7) billion years.

      Hypothesis 3: During each Planck time interval, precisely one unit of Fredkin-Wolfram energy is converted from ordinary matter to dark matter (which is equivalent to the loss of precisely one unit of Fredkin-Wolfram energy from the boundary of the multiverse into the interior of the multiverse). Here the assumption is made that astronomical time is different from atomic time. See the article "On the compatibility of a proposed explanation of the Pioneer anomaly with the cartography of the solar system" by Antonio Fernández-Rañada and Alfredo Tiemblo-Ramos, 2009.

      Step 1: Calculate the mass-energy of our universe at the beginning of the Big Bang, assuming that dark matter particles exist and that almost all of the mass-energy at the time of the Big Bang consisted of ordinary matter.

      (Planck mass) * (81.6±1.7 billion years)/(Planck time) =

      (Planck mass) * (4.733±.14) * 10^61 = (1.02±.02) * 10^57 grams .

      Step 2. Assuming that ordinary matter is steadily converted to dark matter, calculate how much ordinary matter is converted to dark matter in 13.8 billion years.

      (1.02±.02) * 10^57 g * 13.8/(81.6±1.7) = (.17±.01) * 10^57 g = (1.7±.1) * 10^56 g .

      Step 3. Calculate how much mass-energy in non-converted form now exists, according to the various hypotheses assumed. The answer is roughly 8 * 10^56 grams.

      Step 4. Estimate the radius of our universe (assuming the Riofrio model).

      (2 pi^2) * (r^3) = (8±.5) * 10^56 g / (9.9 * 10^-30 g/cm^3) = (8.1±.5) * 10^85 cm^3 =

      (8.1±.5) * 10^79 m^3 .

      r = (1.6±.1) * 10^26 m = approximately (17±1) billion light-years. This is considerably less than the diameter of the OBSERVABLE universe, according to the Einstein-Friedmann paradigm. There are a number of speculative hypotheses in the preceding estimate, so the estimate might be completely wrong and misrepresent the Riofrio model, even if the Riofrio model is empirically valid.
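As a sanity check, Steps 1 through 4 can be re-run numerically. This is a sketch under the hypotheses stated above (the Reset period, the conversion rate, and the WMAP-style mass-density are the post's speculative inputs, not established values):

```python
import math

# Re-running Steps 1-4 under the hypotheses stated above.
# Only the first four constants are standard; the rest are the
# post's speculative assumptions.
planck_mass = 2.176e-5   # grams
planck_time = 5.391e-44  # seconds
year = 3.156e7           # seconds (Julian year)
light_year = 9.461e15    # meters

reset_period_years = 81.6e9  # Hypothesis 2 (speculative)
density = 9.9e-30            # g/cm^3, Hypothesis 1 (WMAP-style)
age_years = 13.8e9           # assumed age of the universe

# Step 1: one Planck mass per Planck time interval over a Reset period
intervals = reset_period_years * year / planck_time
m_initial = planck_mass * intervals  # grams

# Step 2: ordinary matter converted to dark matter so far
m_converted = m_initial * age_years / reset_period_years

# Step 3: remaining (non-converted) mass-energy
m_remaining = m_initial - m_converted

# Step 4: 3-sphere radius r with hyperarea 2*pi^2*r^3 at the assumed density
volume_m3 = (m_remaining / density) * 1e-6  # cm^3 -> m^3
r_m = (volume_m3 / (2 * math.pi ** 2)) ** (1 / 3)

print(f"initial mass-energy ~ {m_initial:.2e} g")
print(f"remaining mass      ~ {m_remaining:.2e} g")
print(f"radius ~ {r_m:.1e} m ~ {r_m / light_year / 1e9:.0f} billion light-years")
```

Under these assumptions the script lands near r ≈ 1.6 * 10^26 m, i.e. roughly 17 billion light-years; any discrepancy with the hand arithmetic above is worth double-checking, since rounding and the error bars compound across the four steps.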

        In Hypothesis 3, assume that the conversion of ordinary matter to dark matter occurs with the dark matter having negligible mass-energy; in other words, ALL of the dark matter has negligible mass-energy.

        Is the Lambda-CDM model empirically valid?

        Sanejouand, Yves-Henri. "Has the density of sources of gamma-ray bursts been constant over the last ten billion years?" arXiv preprint arXiv:1803.05303 (2018)

        My guess is that the tired light hypothesis has been ruled out by empirical evidence, but the tired light hypothesis and string theory with the finite nature hypothesis share several predictions that contradict the Lambda-CDM model.

        How uncertain is the empirical validity of the Copenhagen Interpretation? My guess is that the Copenhagen interpretation is the correct "psychological" interpretation of string theory with the finite nature hypothesis. Consider 5 questions: (1) What is an observer? (2) What is an observation or a measurement made by an observer? (3) What is a measuring apparatus? (4) What, if anything, determines quantum probability distributions? (5) How are the empirical successes of MOND related to the foundations of physics?

        According to Famaey and McGaugh,

        "Either (i) there is a vast amount of unseen mass in some novel form--dark matter-- or (ii) the data indicate a breakdown of our understanding of dynamics on the relevant scales, or (iii) both."

        [link:link.springer.com/article/10.12942/lrr-2012-10]Modified Newtonian Dynamics (MOND): Observational Phenomenology and Relativistic Extensions, Living Reviews in Relativity, volume 15, 7 September 2012[/link]

        According to McGaugh, "One of the frustrating things about ΛCDM and MOND as competing scientific paradigms is that where one is elegant and predictive, the other tends to be mute. This makes a straightforward comparison difficult."

        The MOND Pages, ΛCDM and MOND compared

        What is the uncertainty concerning the economic value of string theory?

        I have conjectured that by the year 2025 C.E. the economic value of string theory will be at least 50 billion U.S. dollars per year -- because string theory with the finite nature hypothesis will make it easier for scientists and engineers to understand quantum field theory. Is something wrong with Big Bang cosmology?

        Site Web de Louis Marmet

        Does supersymmetry occur in nature? Are string vibrations confined to 3 copies of the Leech lattice? Which is empirically valid -- Big Bang or Wolfram's Reset?

        Viewpoint 1. Electrons travel through spacetime. Electrons are wave-like when they are not measured. Electrons are particle-like when they are measured.

        Viewpoint 2. Electrons and spacetime are approximations generated by Wolfram's cosmological automaton. Electrons do not travel through spacetime. Measurement is a natural process that separates the boundary of the multiverse from the interior of the multiverse. The multiverse is mathematically isomorphic to a 72-dimensional holographic, digital computer. An electron is an approximate pattern of Fredkin-Wolfram information, and the electron's pattern is computationally and holographically propagated through the interior of the multiverse. The electron approximately consists of discontinuous displays of Fredkin-Wolfram information that are psychologically merged together in the minds of those electromagnetic fields called the "minds of physicists". How might Viewpoint 2 be tested?

        Prediction 1: dark-matter-compensation-constant = (3.9±.5) * 10^-5 .

        Prediction 2: The Riofrio-Sanejouand cosmological model is empirically valid, i.e., the radius of our universe is a constant, and the speed of light in a perfect vacuum steadily decreases as our universe ages.

        I make 3 fundamental claims: (1) String theory is definitely the mathematical way to geometrize Feynman diagrams so as to derive Einstein's field equations -- the only question is whether nature is based upon string theory with the infinite nature hypothesis or string theory with the finite nature hypothesis. (2) Milgrom is the Kepler of contemporary cosmology -- on the basis of overwhelming empirical evidence. (3) If Riofrio, Sanejouand, and Pipino are not geniuses then my basic theory is wrong. Does quantum information reduce to Fredkin-Wolfram information? Is quantum field theory unsatisfactory in that it theoretically allows many types of fanciful quantum fields?

        According to Steven Weinberg,

        "Quantum mechanics is not itself a dynamical theory. It is an empty stage. You have to add the actors: You have to specify the space of configurations, an infinite-dimensional complex space, and the dynamical rules for how the state vector rotates in this space as time passes."

        "Towards the Final Laws of Physics: The 1986 Memorial Lecture" by Steven Weinberg, published in "Elementary Particles and the Laws of Physics: The 1986 Dirac Memorial Lectures", 1987, Cambridge University Press; 1999 pbk reprint, p. 72 (Feynman gave the other lecture.)

        Is it impossible to empirically disconfirm string theory with the infinite nature hypothesis?

        Argument 1. Put D-brane supercharges on gravitons and gravitinos. This allows arbitrary manipulation of the graviton field.

        Argument 2. Put D-brane supercharges on inflatons and inflatinos. This allows arbitrary manipulation of the inflaton field.

        Argument 3. By creating more and more complicated arguments, superpartners can always be assumed to have wavelengths that are too short or too long for detection.

        Dear David Brown,

        A general concern regarding String Theory: How can a theory that is limited to one slice of reality, say the Planck scale, determine and define the actions at the atomic scale (it should explain this), at the molecular scale (it can do some of this), at the macro-molecular and protein level, at the cellular level, at the ligament and tissue level, at the organ level, at the human body level, at the meteorological and planetary climate level, at the solar, black hole and solar system level, at the galactic level, at the galaxy cluster level?

        How can any theory limited to just one slice of this continuum of scale expect to describe and determine the actions and interactions at all these levels?

        When we touch our hand to a pane of glass, and visually see our hand touching the glass and feel our hand touch the glass, should we believe a theory that says "those experiences are not real, the only reality is what occurs at the atomic or particle level where your hand and the glass are mostly open space". Or should we require our theories to explain the Planck level, the cellular level, the surface of our skin and the glass surface level all together?

        We are measuring the universe in thin slices, like measuring only in the plane of Flatland when the universe is three-dimensional.

        The concept of MOND should indicate we are not looking at the universe correctly, rather than being the solution.

        We are barking up the wrong tree, using mathematical and measuring tools limited to only slices of this continuum of scale. We need to be able to measure and devise equations across scale, from the atomic to our macro level to the galactic level and back down.

        We need new tools.

        Don

          According to Wikipedia, "A Feynman diagram is a graphical representation of a perturbative contribution to the transition amplitude or correlation function of a quantum mechanical or statistical field theory." If we assign a positive number to each internal line in a Feynman diagram so that each internal line is associated with a gravitational energy-density, then there is a mathematical problem of how to formulate a quantum-gravitational action that yields an averaging procedure. It seems to me that the answer is the Nambu-Goto string action (or something mathematically equivalent to it). Study the following:

          "TASI Lectures on Perturbative String Theories" by Hirosi Ooguri and Zheng Yin, 1996, arXiv.org

          In string theory with the infinite nature hypothesis, there are string vibrations at the Planck scale -- and there is the string landscape at the cosmological scale. In string theory with the finite nature hypothesis, the string vibrations are entirely virtual and never emerge from Wolfram's cosmological automaton -- and there are a huge, but finite, number of alternate universes on the boundary of the multiverse (which is approximately generated by Wolfram's cosmological automaton using a network of Fredkin-Wolfram information).

          My guess is that the string theorists are not "looking at the universe correctly" because they have underestimated (or have remained ignorant of) Milgrom, McGaugh, Kroupa, Scarpa, Sanders, Koide, Lestone, Riofrio, Sanejouand, Pipino, and several other physicists and astronomers. I think that the problem is now to find Wolfram's conjectured 4 or 5 simple rules -- and to show that the rules are empirically valid.

          How does a point particle cross scale? The question needs to address objects and actions at all scales in between the largest and smallest - and not just the extremes or black holes.

          Statistics and probability are one-way tools, only able to move from the smaller to the larger. Are our tools biasing our theories? A bouncing beach ball is best described at the scale of the beach ball. The actions at this level impact what occurs at the molecular level, where we might want to describe the scuffing of the ball surface. How do any theories at only one scale perform both (plus what happens at in-between scales)? If the only tools we have go from smaller to larger, will we even be able to model action or movement in the reverse direction?

          MOND is an attempt to connect the large with the small, however do we even have appropriate (mathematical) tools to address this situation?

          Basing theories on geometric points, smallest spaces or point particles does not give any confidence that such a theory can traverse the scales from smallest to largest.

          We seem to be developing theories as if we live in Flatland, only able to measure in one plane at a time. However, we perceive the three dimensional scale aspects of reality. Our theories do not match our wide scale-continuum perceptions and the limitations of our mathematical tools might be the reason why.

          Don