Is there a unified theory of mathematics, theoretical physics, and theoretical computer science?

According to Steven Weinberg, "Everyone knows that electronic computers have enormously helped the work of science. Some scientists have had a grander vision of the importance of the computer. They expect that it will change our view of science itself, of what it is that scientific theories are supposed to accomplish, and of the kinds of theories that might achieve these goals. ... Wolfram goes on to make a far-reaching conjecture, that almost all automata of any sort that produce complex structures can be emulated by any one of them, so they are all equivalent in Wolfram's sense, and they are all universal. This doesn't mean that these automata are computationally equivalent (even in Wolfram's sense) to systems involving quantities that vary continuously. Only if Wolfram were right that neither space nor time nor anything else is truly continuous (which is a separate issue) would the Turing machine or the rule 110 cellular automaton be computationally equivalent to an analog computer or a quantum computer or a brain or the universe. But even without this far-reaching (and far-out) assumption, Wolfram's conjecture about the computational equivalence of automata would at least provide a starting point for a theory of any sort of complexity that can be produced by any kind of automaton. The trouble with Wolfram's conjecture is not only that it has not been proved--a deeper trouble is that it has not even been stated in a form that could be proved. What does Wolfram mean by complex? ..."

"Is the Universe a Computer?" (review of "A New Kind of Science"), The New York Review of Books, 24 October 2002
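Rule 110, the cellular automaton Weinberg mentions, is easy to experiment with directly; a minimal Python sketch (the single seed cell and periodic boundary are arbitrary illustrative choices):

```python
# Rule 110: each cell's next state is looked up from its 3-cell neighborhood.
# 110 = 0b01101110, so bit k of 110 is the next state for neighborhood value k.
def rule110_step(cells):
    n = len(cells)
    rule = [(110 >> k) & 1 for k in range(8)]
    return [rule[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

# Evolve a single seed cell for a few generations and print the pattern.
row = [0] * 31
row[15] = 1
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = rule110_step(row)
```

Run long enough from a suitable background, rule 110 produces the interacting "gliders" that underlie Matthew Cook's proof of its computational universality.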

What is a precise mathematical formulation of Wolfram's concept of complexity? What might be a decisive empirical test of Wolfram's cosmological automaton? My guess is that at the Planck scale the concepts of energy and spacetime somehow fail -- the failure is either in terms of higher mathematics (i.e. mathematical symmetries of the string landscape) or in terms of lower mathematics (i.e. Wolfram's cosmological automaton). I further speculate that the string landscape depends upon generalizing Heisenberg's uncertainty principle to include alpha-prime, while Wolfram's cosmological automaton depends upon modifying Einstein's field equations to include nonzero dark-matter-compensation-constant, Koide cutoff, and Lestone cutoff. Who might be the world's greatest living theoretical physicist? The answer might be Witten. Who might be the world's greatest living mathematician? The answer might be Mochizuki.

According to David Castelvecchi, "The overarching theme of inter-universal geometry, as Fesenko describes it, is that one must look at whole numbers in a different light -- leaving addition aside and seeing the multiplication structure as something malleable and deformable. Standard multiplication would then be just one particular case of a family of structures, just as a circle is a special case of an ellipse. Fesenko says that Mochizuki compares himself to the mathematical giant Grothendieck -- and it is no immodest claim. 'We had mathematics before Mochizuki's work -- and now we have mathematics after Mochizuki's work,' Fesenko says."

"The biggest mystery in mathematics: Shinichi Mochizuki and the impenetrable proof", Nature, 7 October 2015

How might the concepts of energy and spacetime be introduced into algebraic/arithmetic geometry? The answer to the preceding question might involve a unified theory of string theory and Mochizuki's IUT -- deformations of string vacua might have analogues within IUT. I have speculated that if my idea of dark-matter-compensation-constant = square-root((60±10)/4) * 10^-5 is empirically invalid, then MOND-chameleon particles might be the most likely alternative. Because individual gravitons cannot be detected, any suggested modification of Einstein's field equations might be interpreted in terms of weird particles combined with 100% truth of Einstein's field equations. I conjecture that there might be a theorem that, under plausible hypotheses, the following three assumptions imply that MOND-chameleon particles exist.

ASSUMPTION 1. Milgrom's MOND is approximately valid for a wide range of gravitational accelerations.

ASSUMPTION 2. Newton's 3 laws of motion are (non-relativistically) correct, and supersymmetry needs to be replaced by MOND-compatible-supersymmetry.

ASSUMPTION 3. String theory with MOND-compatible-supersymmetry explains the empirical successes of MOND.

Am I wrong about Fredkin-Wolfram information?

"So: Is Wolfram, as he plainly believes, the new Copernicus? Or is he merely a new Darwin or Einstein? Well, if it's comparisons you are seeking, the one that occurred to me was the astronomer in Dr. Johnson's Rasselas, who, after years of intense, solitary intellection, went quietly nuts." -- John Derbyshire

"Not Quite Copernicus", National Review, 16 September 2002

Is it true that Fredkin-Wolfram information somehow encodes quantum information?

13 days later

... Fredkin is one of those people who arouse either affection, admiration, and respect, or dislike and suspicion. The latter reaction has come from a number of professors at MIT, particularly those who put a premium on formal credentials, proper academic conduct, and not sounding like a crackpot. ... Fredkin doubts that his ideas will achieve widespread acceptance anytime soon. He believes that most physicists are so deeply immersed in their kind of mathematics, and so uncomprehending of computation, as to be incapable of grasping the truth. Imagine, he says, that a twentieth-century time traveler visited Italy in the early seventeenth century and tried to reformulate Galileo's ideas in terms of calculus. Although it would be a vastly more powerful language of description than the old one, conveying its importance to the average scientist would be nearly impossible. There are times when Fredkin breaks through the language barrier, but they are few and far between. He can sell one person on one idea, another on another, but nobody seems to get the big picture. It's like a painting of a horse in a meadow, he says. "Everyone else only looks at it with a microscope, and they say, 'Aha, over here I see a little brown pigment. And over here I see a little green pigment.' Okay. Well, I see a horse." -- Robert Wright

"Did the Universe Just Happen?", Atlantic Monthly, April 1988, pp. 29-44

Digital Philosophy (DP) is a new way of thinking about how things work. ... DP is based on two concepts: bits, like the binary digits in a computer, correspond to the most microscopic representation of state information; and the temporal evolution of state is a digital information process similar to what goes on in the circuitry of a computer processor. -- Edward Fredkin

"An Introduction to Digital Philosophy", International Journal of Theoretical Physics, February 2003

How might Fredkin's ideas on the foundation of physics be tested? I say that my 3 most important ideas are: (1) Milgrom is the Kepler of contemporary cosmology and the main problem with string theory is that string theorists fail to realize that Milgrom is the Kepler of contemporary cosmology. (2) The Koide formula is essential for understanding the foundations of physics. (3) Lestone's theory of virtual cross sections is essential for understanding the foundations of physics. How might the 3 preceding ideas be tested? The answer might be: (1) the Fernández-Rañada-Milgrom effect, (2) the Space Roar Profile Prediction, and (3) the "64 Particles Hypothesis". What is the explanation for dark matter? My guess is as follows: (A) String theory with the finite nature hypothesis implies that dark-matter-compensation-constant = sqrt((60±10)/4) * 10^-5 and that dark matter indicates that Wolfram's updating parameter transfers information within the Fredkin-Wolfram network of information. (B) String theory with the infinite nature hypothesis implies that MOND-chameleon particles have variable effective mass depending upon nearby gravitational acceleration and that dark matter is explained entirely by the existence of MOND-chameleon particles.

For the foundations of physics, should the concepts of spacetime and energy be replaced by the concepts of Fredkin distance, Fredkin time, and Fredkin digit transition?

In my previous posting at this website, the first phrase in the quotation from Fredkin should read "Digital Philosophy (DP) is a new way of thinking about how things work ..." Note that my idea of Fredkin-Wolfram information is that such information cannot, in principle, be directly measured and that string theory with the infinite nature hypothesis can never be refuted (although such a theory might be mathematically awkward).

On page 5 of the following article, when Fredkin refers to a "unit of Time" I suggest that "Time" should be understood as non-measurable "Fredkin time" in some as yet undescribed Fredkin-Wolfram network of information.

"Discrete Theoretical Processes" by Edward Fredkin

It seems to me that Fredkin's terminology should be based upon Fredkin time, Fredkin distance, and Fredkin digit transition with the true meanings of such concepts depending upon a hypothetical affirmative solution of what I call "Wolfram's Simple Rules Hypothesis".

2 months later

"The existence of exotic dark matter particles outside the standard model of particle physics constitutes a central hypothesis of the correct standard model of cosmology (SmoC). Using a wide range of observational data I outline why this hypothesis cannot be correct for the real Universe." -- Pavel Kroupa

"Lessons from the Local Group (and beyond) on dark matter", 2014

My guess is that Kroupa's analysis of the dark matter problem is correct. I say that my 3 most important ideas are: (1) Milgrom is the Kepler of contemporary cosmology. (2) The Koide formula is essential for understanding the foundations of physics. (3) Lestone's theory of virtual cross sections is essential for understanding the foundations of physics. Are my speculations consistent with empirical reality? I suggest that there is a unified theory of mathematics and theoretical physics which consists more-or-less of a unification of Mochizuki's IUT with the string landscape. Consider 10 basic concepts relevant to theoretical physics: time, distance, space, curvature, torsion, energy, causality, randomness, logic, and information. Fredkin and Wolfram have suggested that quantum field theory and general relativity theory can be explained in terms of the network logic of Fredkin-Wolfram information. My guess is that string theory is essential for carrying out the Fredkin-Wolfram program. Is the introduction of time and energy into algebraic geometry the key to a unified theory of theoretical physics and pure mathematics? I conjecture that string theory with the finite nature hypothesis is empirically valid if and only if Wolfram's "A New Kind of Science" is one of the greatest books ever written. Google "kroupa milgrom", "mcgaugh milgrom", "witten milgrom", and "einstein 3 criticisms".

"So, back to the telescope." -- Stacy McGaugh, 7 July 2017

Triton Station: A Blog About the Science and Sociology of Cosmology and Dark Matter

I say that Milgrom is the Kepler of contemporary cosmology.

Wikiquote for Pavel Kroupa

Wikiquote for Stacy McGaugh

My guess is that the Copenhagen interpretation is philosophically wrong but empirically irrefutable.

From Wolfram Alpha:

((reduced Planck mass) * (electron mass)^3)^(1/4) = 1.1 * (Higgs boson mass), where (Higgs boson mass) is approximated by 125 GeV/c^2

Is the preceding approximation a meaningless coincidence?
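The coincidence is easy to check numerically with standard values (reduced Planck mass about 2.435 * 10^18 GeV/c^2, electron mass about 5.11 * 10^-4 GeV/c^2, Higgs mass taken as the approximate 125 GeV/c^2):

```python
# Standard approximate values, all in GeV/c^2.
m_planck_reduced = 2.435e18
m_electron = 5.10999e-4
m_higgs = 125.0

estimate = (m_planck_reduced * m_electron**3) ** 0.25
print(estimate)            # ~134 GeV/c^2
print(estimate / m_higgs)  # ~1.07, roughly the quoted factor of 1.1
```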

Consider a just-so story. A virtual electron-positron pair acquired a Big Bang mass. The anomalous acquisition created a 26-dimensional bosonic black hole that consisted entirely of virtual Higgs bosons and virtual electron-positron pairs. Within the 26-dimensional bosonic black hole, nonmeasurable Fredkin time equalled the reduced Planck mass, and nonmeasurable Fredkin spatial distance equalled the electron mass. The 26-dimensional bosonic black hole collapsed into a 10-dimensional GR model with measurable particles. Nonmeasurable Fredkin time and nonmeasurable Fredkin spatial distance somehow transitioned into measurable time and measurable distance with measurable particles. Shortly after the transition, each Higgs boson occupied approximately the same 4-volume as each electron, thus suggesting an approximation for the mass of a Higgs boson.

Is the preceding scenario complete nonsense?

"The failures of the standard model of cosmology require a new paradigm" by Kroupa, Pawlowski & Milgrom, 2013

Consider 7 conjectures:

(1) String theory with the finite nature hypothesis implies MOND and no SUSY.

(2) The Koide formula is essential for understanding string theory with the finite nature hypothesis.

(3) String vibrations are confined to 3 copies of the Leech lattice.

(4) There are 6 basic quarks because there are 6 pariah groups.

(5) The monster group and the 6 pariah groups allow energy to exist.

(6) There exists a (2/3)-Koide formula that allows some quarks to have charge ± 2/3.

(7) There exists a (1/3)-Koide formula that allows some quarks to have charge ± 1/3.
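Conjectures (6) and (7) refer to variants of the Koide formula; for reference, the original Koide relation Q = (m_e + m_mu + m_tau)/(sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2 = 2/3 can be checked in a few lines using the charged-lepton mass ratios quoted just below (electron mass set to 1):

```python
from math import sqrt

# Koide: Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
# with masses expressed in units of the electron mass.
m_e, m_mu, m_tau = 1.0, 206.7683, 3477.48
Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau)) ** 2
print(Q)   # ~0.66667, very close to 2/3
```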

From Wolfram Alpha:

(muon mass)/(electron mass) = 206.7683

(tauon mass)/(electron mass) = 3477.48

(59^3 + 33 * 59^2 + 57 * 59 + 9)^(1/27) - 1.59983643131952544 = 0 approx.

For a = 1.5998364, x = 206.7683, y = 3477.48:

(a^3 + (a^2) * x + a * y)/(a^3 + (a^2) * x^0.5 + a * y^0.5)^2 = .333333 approx.
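Both Wolfram Alpha evaluations can be reproduced directly; a quick sketch:

```python
from math import sqrt

# 27th root of 59^3 + 33*59^2 + 57*59 + 9 = 323624
a = (59**3 + 33 * 59**2 + 57 * 59 + 9) ** (1 / 27)
print(a)   # ~1.5998364

# muon and tauon masses in units of the electron mass (values quoted above)
x, y = 206.7683, 3477.48
ratio = (a**3 + a**2 * x + a * y) / (a**3 + a**2 * sqrt(x) + a * sqrt(y)) ** 2
print(ratio)   # ~0.333333
```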

For the polynomial x --> x^3 + 33 * x^2 + 57 * x + 9, my guess is that 33 + 26 = 59 is meaningful because of 26-dimensional bosonic string theory and the fact that the three primes 59, 59 ± 12 divide the order of the monster group. My guess is that the constant term 9 is meaningful because of Lestone's heuristic string theory.

10 days later

Monstrous moonshine, wikipedia

My guess is that there might be two infinite series, both involving the (1/3)-Koide polynomial, yielding a numerical expression for Wolframian pseudo-supersymmetry.

Note that:

(59^3 + 33 * 59^2 + 57 * 59 + 9)/78^3 - .6819568772231494 = 0 approx.

72^3 / (72^3 + 33 * 72^2 + 57 * 72 + 9) = .680571738... approx.

Here the number 78 represents 3 copies of bosonic string theory, and the number 72 represents 3 copies of the Leech lattice.

The percentage of dark energy in the universe is about 68.3%.

Planck 2013 results
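Both ratios are elementary to verify (whether their closeness to the Planck dark-energy fraction is meaningful is, of course, the open question):

```python
# The (1/3)-Koide cubic from the text.
def p(x):
    return x**3 + 33 * x**2 + 57 * x + 9

r78 = p(59) / 78**3      # 78 = 3 * 26 (three copies of bosonic string theory)
r72 = 72**3 / p(72)      # 72 = 3 * 24 (three copies of the Leech lattice)
print(r78)   # ~0.6819569
print(r72)   # ~0.6805717
```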

11 days later

From Wolfram Alpha:

(1 + 1/2.5 + 1/2.5^2 + 1/2.5^3 + 1/2.5^4)^(1/24) / (1 + 1/47) = .99980215...

(1 + 1/2.5 + 1/2.5^2 + 1/2.5^3 + 1/2.5^4 + 1/2.5^5)^(1/24) / (1 + 1/47) = 1.0000600...

(1 + 1/3 + 1/3^2 + 1/3^3 + 1/3^4)^(1/24) / (1 + 1/59) = .9999154432...

(1 + 1/3 + 1/3^2 + 1/3^3 + 1/3^4 + 1/3^5)^(1/24) / (1 + 1/59) = 1.00003006648...

(1 + 1/3.5 + 1/3.5^2 + 1/3.5^3 + 1/3.5^4)^(1/24) / (1 + 1/71) = .99995403...

(1 + 1/3.5 + 1/3.5^2 + 1/3.5^3 + 1/3.5^4 + 1/3.5^5)^(1/24) / (1 + 1/71) = 1.0000108...

Do the 6 preceding estimates have something to do with the Leech lattice, the monster group, and the foundations of physics?
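All six estimates follow a single pattern: a truncated geometric series in 1/r raised to the power 1/24 (the dimension of the Leech lattice), divided by 1 + 1/p for p = 47, 59, 71, the three largest primes dividing the order of the monster group. A sketch that reproduces all six values:

```python
# Each estimate: (truncated geometric series in 1/r)^(1/24) / (1 + 1/p).
def estimate(r, p, n_terms):
    s = sum(1 / r**k for k in range(n_terms))   # 1 + 1/r + ... + 1/r^(n_terms-1)
    return s ** (1 / 24) / (1 + 1 / p)

# (r, p) pairs from the text; 47, 59, 71 divide the order of the monster group.
for r, p in [(2.5, 47), (3.0, 59), (3.5, 71)]:
    print(estimate(r, p, 5), estimate(r, p, 6))
```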

Leech lattice, Wikipedia

Monster group, Wikipedia

8 days later

I say that my 3 most important ideas are: (1) Milgrom is the Kepler of contemporary cosmology. (2) The Koide formula is essential for understanding the foundations of physics. (3) Lestone's theory of virtual cross sections is essential for understanding the foundations of physics.

The axion is coupled to the electromagnetic field in ways that should be testable.

Sikivie, P., 1983. Experimental tests of the "invisible" axion. Physical Review Letters, 51(16), p.1415.

Could most of the proton charge radius puzzle be explained by unexpected electromagnetic interactions caused by axions?

John P. Lestone has suggested that leptons might be quantum micro black holes according to his theory of virtual cross sections.

Possible Mechanism for the Generation of a Fundamental Unit of Charge (long version), Los Alamos Report LA-UR-17-24901, 16 June 2017

If Lestone is correct, then massive bosons might be (approximately) quantum micro black hole 1-spheres, leptons might be quantum micro black hole 2-spheres, and quarks might be quantum micro black hole 3-spheres.

The electromagnetic effects of axions (1-spheres with virtual cross sections) and leptons (2-spheres with virtual cross sections) suggest that some form of electromagnetic uncertainty might be approximated by (axion mass)/(electron mass).

Can the proton charge radius puzzle be resolved in terms of electromagnetic effects from axions?

Proton radius puzzle, Wikipedia

Axion, Wikipedia

According to Graf and Steffen, "If the Peccei-Quinn (PQ) mechanism is the explanation of the strong CP problem, axions will pervade the Universe as an extremely weakly interacting light particle species."

"Thermal axion production in the primordial quark-gluon plasma" by Peter Graf and Frank Daniel Steffen, 2011

According to Derbin et al., "If the axions or other axion-like pseudo scalar particles couple with electrons then they are emitted from Sun by the Compton process and by bremsstrahlung ..."

"Search for solar axions produced by Compton process and bremsstrahlung using the resonant absorption and axioelectric effect" by A. V. Derbin, et al., 2013

What is the magnitude of the uncertainty in determining the fine structure constant?

Fine structure constant, Wikipedia

Hanneke, Fogwell, and Gabrielse (2008) estimated 1/(fine structure constant) as: 137.035999084(51)

Hanneke, D., Fogwell, S., & Gabrielse, G. (2008). New measurement of the electron magnetic moment and the fine structure constant. Physical Review Letters, 100(12), 120801.

.000000051/137.035999084 = 3.72... * 10^-10

(electron mass) * 3.72 * 10^-10 = 1.901 * 10^-4 eV/c^2

P. J. Mohr, B. N. Taylor, and D. B. Newell (2015) estimated 1/(fine structure constant) as: 137.035999139(31)

"Fine structure constant" in CODATA Internationally recommended 2014 values of the fundamental physical constants, National Institute of Standards and Technology

.000000031/137.035999139 = 2.262... * 10^-10

(electron mass) * 2.262 * 10^-10 = 1.156 * 10^-4 eV/c^2
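The arithmetic above is straightforward: the relative uncertainty in 1/alpha multiplied by the electron rest mass (about 510998.95 eV/c^2) gives the quoted energy scales:

```python
m_electron_eV = 510998.95   # electron rest mass in eV/c^2

# Hanneke, Fogwell & Gabrielse (2008): 1/alpha = 137.035999084(51)
rel_2008 = 0.000000051 / 137.035999084
print(rel_2008)                    # ~3.72e-10
print(m_electron_eV * rel_2008)    # ~1.90e-4 eV/c^2

# Mohr, Taylor & Newell (2015, CODATA 2014): 1/alpha = 137.035999139(31)
rel_2014 = 0.000000031 / 137.035999139
print(rel_2014)                    # ~2.26e-10
print(m_electron_eV * rel_2014)    # ~1.16e-4 eV/c^2
```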

How might the proton radius puzzle be related to uncertainties in determining the fine structure constant?

In the article "The Rydberg constant and proton size from atomic hydrogen" Beyer et al. (2017) stated (p. 80), "Line shape distortions caused by quantum interference from distant neighboring atomic resonances have recently come into the focus of the precision spectroscopy community ... To the best of our knowledge, this effect has been considered in the analysis of only one of the previous H experiments and was found to be unimportant for that particular experimental scheme ... The effect was found to be responsible for discrepancies in the value of the fine structure constant α extracted from various precision spectroscopy experiments in helium ... The root of the matter is that natural line shapes of atomic resonances may experience deviations from a perfect Lorentzian when off-resonant transitions are present."

Beyer, Axel, Lothar Maisenbacher, Arthur Matveev, Randolf Pohl, Ksenia Khabarova, Alexey Grinin, Tobias Lamour et al. "The Rydberg constant and proton size from atomic hydrogen." Science 358, no. 6359 (2017): 79-85.

Are axions a likely source of off-resonant transitions? Could the older experiments for determining the proton charge radius have failed to eliminate axion interactions as a source of inflating the value of the proton radius? Could the 2017 experiment by Beyer et al. be failing to take into account axion interactions? What might happen if experiments for measuring the proton charge radius were performed at 4 different levels of shielding -- on the surface, in a slightly deep mine, in a moderately deep mine, and in a very deep mine -- to check for possible axion production of off-resonant transitions? By assuming a precise value for the axion mass could the 2017 experiment by Beyer et al. be modified for axion detection?

If most of the proton charge radius puzzle can be explained by uncertainty in estimating the fine structure constant due to unexpected axion detection, then (1.2 ± 1.0) * 10^-4 eV/c^2 might be a plausible estimate for the axion mass.

    According to John P. Lestone, "If black holes (once thought to be point objects) are amenable to statistical mechanics, then why not fundamental particles like leptons? (1988)"

    "Possible path for the calculation of the fine structure constant", Los Alamos Report LA-UR-16-22121, 4 April 2016

    Is my basic theory (i.e. string theory with the finite nature hypothesis) wrong? (Actually, my basic theory is string theory with the finite nature hypothesis and various simplifying assumptions.) Let us assume that string theory with the infinite nature hypothesis is correct. In that case, I would bet in favor of MOND-chameleon particles and a quantum foam theory based on Lestone's theory of virtual cross sections. Consider some hypotheses which might be a plausible alternative to my basic theory:

    (1) The flat spacetime of quantum field theory and the curved spacetime of general relativity theory emerge from a virtual quantum foam spacetime in which massless virtual particles travel at a virtual speed C which is vastly greater than c. The virtual speed C is so large that it appears to be infinite. Virtual quantum foam spacetime is amazingly hot -- so hot that ordinary spacetime decomposes into quantum foam. The non-virtual particles are the vastly cooler particles that are confined by gravitational geodesics.

    (2) The virtual quantum foam spacetime is 26-dimensional. It has 3 dimensions of ordinary space, 1 dimension of time, 3 dimensions of linear momentum, 3 dimensions of angular momentum, and 16 dimensions of uncertainty. The dimensions of uncertainty arise because 4-dimensional spacetime has 4 dimensions of hbar uncertainty multiplied by 4 dimensions of alpha-prime uncertainty. The quantum foam spacetime is the physical interpretation of 26-dimensional bosonic string theory, which has 25 dimensions of higher-dimensional space and 1 dimension of time.

    (3) Micro black holes with masses less than the Planck mass and with charges roughly approximated by the electron charge existed during the very early stages of the Big Bang but then rapidly evaporated down to the fundamental particles of the Standard Model and whatever massive particles might need to be added to the Standard Model.

    (4) All of the fundamental particles with mass are quantum micro black holes. Massive bosons are roughly like virtual 1-spheres in quantum foam spacetime. Leptons are roughly like virtual 2-spheres. Quarks are roughly like virtual 3-spheres.

    (5) Virtual photons and virtual gravitons can be exchanged between two micro black holes and there is a relevant transmission coefficient for widely separated micro black holes. Exchanges of virtual photons and virtual gravitons in quantum foam spacetime determine the strengths of the electromagnetic and gravitational fields.

    (6) The conventional, widely accepted theory of quantum evaporation of micro black holes is wrong because the theory ignores the virtual cross sections of massive virtual particles. If the virtual cross sections went to zero then the conventional theory would be correct.

    (7) There is widespread transient violation of conservation of electromagnetic energy but the violation averages out to zero on time scales larger than the Planck time.

    (8) The Heisenberg uncertainty principle is 100% correct for directly measured particles but for virtual particles the Heisenberg uncertainty principle needs to be replaced by an (hbar, alpha-prime) uncertainty principle which takes into account Lestone's theory of virtual cross sections.

    (9) After quantum averaging, Einstein's field equations are 100% correct but the -1/2 in the standard form of Einstein's field equations is apparently replaced by -1/2 MOND-chameleon-fake-function, where this function is caused by the presence of MOND-chameleon particles. These hypothetical MOND-chameleon particles have variable effective mass depending on nearby gravitational acceleration. The MOND-chameleon particles are bosons which form a Bose-Einstein condensate. This Bose-Einstein condensate forms an insulating barrier between the cooler non-virtual particles and the vastly hotter virtual particles.


    5 days later

    I say that my three most important ideas are: (Idea 1) Milgrom is the Kepler of contemporary cosmology. (Idea 2) The Koide formula is essential for understanding the foundations of physics. (Idea 3) Lestone's theory of virtual cross sections is essential for understanding the foundations of physics. How might (Idea 3) be tested?

    John P. Lestone of Los Alamos National Laboratory has suggested a theory of quantum micro black holes that might explain the value of the fine structure constant. He suggested 7 principles, the first 4 of which are as follows: "Properties used to calculate the fine structure constant for my imaginary particles: (1) My particles have a very high temperature. (2) Despite having a high temperature my imaginary particles can not change their rest mass upon the emission of electromagnetic energy. Using known physics my imaginary particles (if isolated) can not emit any 'real' photons. (3) However, I consider the possibility that my imaginary particles can emit and absorb unphysical L=0 'virtual' photons via the time-energy uncertainty principle. (4) The emission and absorption is controlled by statistical arguments involving their assumed 'classical' temperature and possibly other effective temperatures. ..."

    "Possible path for the calculation of the fine structure constant", Los Alamos Report LA-UR-16-22121, 4 April 2016, page 10

    How might Lestone's theory be related to the idea that quantum information can be explained in terms of Fredkin-Wolfram information? My guess is that if string theory with the finite nature hypothesis is empirically valid, then it is necessary that string theory with the infinite nature hypothesis should "almost" work.

    Dreiner, Grab, and Stefaniak considered the possibility of a supersymmetric extension of the standard model with a baryon-triality symmetry with a right-handed scalar electron (selectron) or scalar muon (smuon) as the lightest supersymmetric particle.

    "Discovery Potential of Selectron or Smuon as the Lightest Supersymmetric Particle at the LHC" by H. K. Dreiner, S. Grab, and T. Stefaniak, 2011

    Consider the following 6 hypotheses:

    (1) After quantum averaging, Einstein's field equations are 100% correct. The empirical successes of Milgrom's MOND are explained by MOND-chameleon particles which have variable effective mass depending upon nearby gravitational acceleration.

    (2) For every particle in the Standard Model, its superpartner is a MOND-chameleon particle.

    (3) The right-handed selectron is the lightest supersymmetric particle and explains most of the mass-energy of dark matter particles.

    (4) The bosons that are not MOND-chameleon particles are the photon, the graviton, the 8 distinct gluons, the Z boson, the W+ boson, the W- boson, the Higgs boson, the axion, and the inflaton. The string vibrations of these 16 distinct types of bosons determine a 25-dimensional space with 1 dimension of time. 26-dimensional bosonic string theory allows for a 10-dimensional general relativity model with 16 dimensions of virtual bosons.

    (5) The universe undergoes an amazingly fast cycle of heating and cooling. A cool phase of the universe lasts for one Planck time interval, at the end of which the universe becomes amazingly hot and instantaneously undergoes a hot phase. The temporal duration of the hot phase appears to be instantaneous because its time duration is (Planck time interval) multiplied by ((little c)/(big C)), where big C is the speed of light in 26-dimensional bosonic spacetime. The hot phase is instantaneous de-compactification and re-compactification of 4-dimensional spacetime with respect to the 26-dimensional spacetime of bosonic string theory. At the end of a hot phase, the universe undergoes another cool phase.

    (6) The cycle of cool phases and hot phases might provide a basis for Lestone's principles (1) through (4) in a bosonic string theoretical 26-dimensional spacetime, which consists of amazingly hot virtual particles. Thus, Lestone's principles might be explained in a higher-dimensional spacetime of virtual particles -- the de-compactified spacetime might be thought of as a quantum foam that is so hot that ordinary spacetime breaks down.
