...and therefore:

regarding *dark energy*

Does anyone really believe that it requires no thermodynamic work to run the Universe?

Thermodynamics is very common sense.

First law of thermodynamics:

You can't win, you can only break even.

Second law of thermodynamics:

You can't break even unless you reach the state of absolute zero.

Third law of thermodynamics:

You can't reach the state of absolute zero.

(Because,

you are only a finite machine.)

Following these laws, the engines running the Universe must produce waste streams.

Next consider the Probability Learning Game.

The experimenter throws loaded dice to determine a varying escape route for one after another of prey animals.

Each time, the subject predator guesses where.

Some always do escape.

Constituting a thermodynamic waste stream.

Which

expands the Universe

Dear Mr. Bloomquist,

Parmenides was wrong. Reality is. There is a lot more to abstraction for it to be in any way useful.

Joe Fisher.

14 days later

Dear Lee,

Very interesting essay and original ideas. Good idea - "system of basic states". You noted in 4: "Having considered 'time', then 'space' might seem a good next step." But on Science Friday (2/27/15), when asked about a concept in physics that he would abandon as being basic, Sean Carroll answered "space".

In conclusion, you have given good words: "But before we can define these maps, we need to understand the basic structure they need to preserve." (Barwise 1988, 256)

What do you think of constructing such an ontological chain: "basic system (limit, unconditional) states of matter (primordial generating structure)" → "space of limit states of matter (extreme forms of existence)" → "structural (ontological) memory of matter (limit states of matter)" → "phenomena of information and time"?

Kind regards,

Vladimir

Greetings Joe and Vladimir,

Thank you for requesting my comments!

To better understand what you are saying is there a diagram you could show me? I think you could upload it as a file in order to accompany any explanatory text you may want to post.

Best Regards,

L

4 days later

Lee,

I didn't find your essay easy to read, but you do introduce a lot of ideas that are new and intriguing to me. Re. your question, I think it depends on what you mean by basic. Does it mean most simple, or most foundational, or of primary importance? I like the idea of stateless coding. I like it because if we were to think about a spinning coin in free fall, it has no singular state: no reference frame is applied, it can be viewed from any direction, and no instant in time or position in space is specified for a measurement of a singular state. It is all states, but describing how it should move gives the flux of all states over time and space.

Action needs a where and a when to occur, but the example just given shows that position and orientation in space need not be the primary description. The action is the essence of what is occurring. That's like the flux of a river without the water. Which is the better, not most basic, description of the river? Certainly on a map the water content alone is shown; that's basic position-in-space info. But it is the flux that gives the river its character, and that has a greater information content than just the water content in space, and so is perhaps the greater part of the river. From that I get that state in space need not be the primary or most important information. Which is food for further thought, thank you. Kind regards, Georgina.

Vladimir, if there were a diagram of what you wrote I believe it would assist thinking about your ideas-- especially (for me at least) if the diagram were based on a language of objects and arrows.

Georgina-- I now see that the paper I submitted to this contest is too terse. More words are needed to clarify. So I will make the attempt. I also hope the result will be a useful example of a diagram written in the language of objects and arrows. To begin:

By expanding the model of proper time discovered by Minkowski (now basic to General Relativity) the paper shows a mathematical connection between General Relativity and Quantum Mechanics, specifically by linking the Born rule of Quantum Mechanics to proper time of General Relativity.

However this is not a link written in the language of Quantum Field Theory. Rather, it is a link between GR and QM that involves multiple mathematical languages-- non-wellfounded sets, nonstandard analysis, situation theory, channel theory, informationalism, game theory-- and then, the resulting, expanded model of time can be diagrammed using objects and arrows. It turns out to be compatible with something Einstein said:

"The only reason for time is so that everything doesn't happen at once"

Here's how I found this model of time:

Years ago, I used the programming language Smalltalk-80 and a diagram based on objects and arrows to simulate a complex and expensive automated manufacturing system-- before it was built. The engineers in charge wanted to know from a simulation whether or not the system would work in all anticipated situations. The answer was not obvious from the layout of machine drawings, because the system had to process a complex schedule of parts.

The objects in the diagram I used to model this system were of two types: (a) places, each drawn with something like a circle, and (b) transitions, each drawn with a rectangle. Hundreds of these objects had to be drawn and connected by arrows in order to model the system.

The result looked like a board for playing a game. There were rules for drawing this game board. There were also rules for moving the tokens on the game board. (The tokens represented the parts and information conveyed between the elements of the system).

Here are the rules for drawing the game board:

(1) Every arrow starting from a place must end at a transition.

(2) Every arrow starting from a transition must end at a place.

Here are the rules for moving tokens on the game board:

(1) When the places "upstream" from a particular transition (as determined by the direction of its attached arrows) become filled with tokens, remove one token from each upstream place.

(2) After the code written for that transition is completed, place a token into each of its downstream places.
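These two rules are the token game of a Petri net, and they can be sketched in a few lines. This is a hedged illustration in Python (the place and transition names are invented, not the original Smalltalk-80 simulation):

```python
# minimal token-game sketch of the two rules above (hypothetical names)
places = {"p1": 1, "p2": 1, "p3": 0}  # place -> token count
transitions = {"t1": {"upstream": ["p1", "p2"], "downstream": ["p3"]}}

def try_fire(name):
    t = transitions[name]
    # rule (1): fire only when every upstream place holds a token
    if all(places[p] > 0 for p in t["upstream"]):
        for p in t["upstream"]:
            places[p] -= 1          # remove one token from each upstream place
        # rule (2): after the transition's code runs, fill the downstream places
        for p in t["downstream"]:
            places[p] += 1
        return True
    return False

try_fire("t1")   # fires: p1 and p2 each lose a token, p3 gains one
```

Calling `try_fire("t1")` a second time returns False, because the upstream places are now empty.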

Smalltalk (today, the programming language Pharo) allows objects to spawn blocks of parallel code that wait for signals from "semaphores." A block of code will only run when its semaphore signals it. The result is not lines of code that run one after another exactly as they are written in the source text. Instead, the sequence of executing the blocks of code depends on when the semaphores trigger the code-- in "real time." The semaphores look at the places upstream from the transition object which, by means of the attached arrows in the diagram, owns them. So instead of a sequence following how the source code was written down, in the simulation the sequence of running the blocks of code was determined by the tokens placed on the game board.

This worked-- except when the parts entering the system according to the simulated schedule backed up. I found that in this case, when a transition had "fired" (by means of its underlying semaphore), another token in an upstream place could fire yet another copy of the transition object. So instead of just one of that particular transition existing in the simulation, as a result there could be more than one. The transition was no longer unique. There existed multiple copies of it in the simulation. To anthropomorphize a bit, the transition had no unique self identity. It had multiple identities-- which is an oxymoron except in spy circles and psychotherapy.

In this sense things could "happen all at once." As in Einstein's quote, I had to add something to the simulation in order to prevent "everything from happening at once."

The solution is in the attached diagram. For each transition where "everything happening at once" was possible, I drew an extra place reserved for the transition itself-- a place for itself and only for itself, to keep the transition from firing when it was already firing. I realized that I had to simulate time itself in order to make the simulation work properly.

When running the simulation, these places for each transition's "self" use looked like they were constant-- as if they were always constantly filled with a token. But each of these "self identity" tokens was really, according to the rules, being taken off and put back into its place faster than the eye could see. It was this mechanism, where each transition had a place for itself so that only one instance of itself existed, that kept "everything from happening at once."
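In modern terms, this "place for self" behaves like a one-token semaphore: the token is taken while the transition fires and put back when it finishes. A hedged sketch in Python (using threading in place of Smalltalk semaphores; the counts and timing are illustrative only):

```python
import threading
import time

# the transition's "place for self": one token, held while firing
self_place = threading.Semaphore(1)
concurrent = 0        # how many copies of the transition are firing right now
max_concurrent = 0    # the most copies that ever fired at once

def transition():
    global concurrent, max_concurrent
    with self_place:              # take the self token; wait if already firing
        concurrent += 1
        max_concurrent = max(max_concurrent, concurrent)
        time.sleep(0.01)          # the transition's "code"
        concurrent -= 1           # the self token goes back on exit

threads = [threading.Thread(target=transition) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# with the self place, only one copy of the transition ever exists at a time
```

Without `self_place`, several copies of `transition` could run simultaneously-- the "multiple identities" problem described above.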

Years later at a workshop at Stanford called "The Business Applications of Situation Theory," I learned from Jon Barwise about non-wellfounded sets.

It struck me that the "place for self" I had used years before in the simulation had a structure which looked like a non-wellfounded set:

Eq. 1: unique_constant_identity = (changes, unique_constant_identity).

"Unique_constant_identity" in eq.1 produces the stream of "changes" evident in all of the downstream places from the transition that owns the unique place.
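A hedged sketch of eq. 1 in Python, using a lazy generator to stand in for the non-wellfounded pair (the names follow eq. 1; nothing here is from the original simulation):

```python
from itertools import count, islice

# unique_constant_identity = (changes, unique_constant_identity):
# each step pairs the next change with the same, constant identity
def unique_constant_identity(changes):
    for change in changes:
        yield (change, unique_constant_identity)

stream = unique_constant_identity(count())
first_three = list(islice(stream, 3))

# the changes advance...
assert [change for change, identity in first_three] == [0, 1, 2]
# ...while the identity component is the same object at every step
assert all(identity is unique_constant_identity for _, identity in first_three)
```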

This perspective on proper time is not new. In fact it's ancient. Parmenides, whom Zeno was defending with his story about the race between Achilles and the tortoise, had described the same situation in the fragments left to us of his poem titled "On Nature." The poem tells of a journey from the domain of belief into the domain of knowledge. The person making this journey must be carried in a chariot.

In eq. 1 the chariot is "unique_constant_identity"-- the constant presence of a unique self existing in the present, always supporting and evident to a conscious person for as long as that person is alive. The unique constant self that's experienced, and the time in which this self is experienced-- i.e. the present-- are always the same to that person as long as the person is conscious and lives, just as for the place reserved for "self" in the above diagram, and just as for "unique_constant_identity" in eq. 1. The road on which the chariot travels is "changes." Like the old saying: wherever you go, there you are!

Although Minkowski discovered the modern representation of proper time which has become the basis of General Relativity, as far as I'm aware the ancient representation of proper time is due to Parmenides. The ticking of the clocks Einstein imagined to obtain relativity is like the trees and changes along the path travelled by Parmenides' chariot. But Einstein left the constant chariot itself implicit, not explicit, in his mathematics of relativity. Eq. 1 makes this constant chariot, "unique_constant_identity," explicit in the mathematics.

Even if not supportive of numerical calculation, this mathematical connection between General Relativity and Quantum Mechanics supports logical models of dark matter and dark energy.

Reference:

The diagrams in the following paper show the origin of this idea.

https://docs.google.com/file/d/0B9LMgeIAqlIET3B2NEE2MmxDOWM/edit?usp=docslist_api

Attachment #1: Petri_net.pdf

    Lee, it is really interesting, thank you. I especially like your description of how the system you drew out led to the later connection between proper time and QM.

    Regards Georgina

    Georgina wrote: "Re. your question, I think it depends on what you mean by basic. Does it mean most simple, or most foundational or of primary importance?"

    "Yes" would be way too cute, of course. I should be able to answer you by examples.

    In the context of the paper, I mean using equations without space variables would be more basic than using equations with space variables. Existing classical formulae for action involve moving from one set of space variables to another (possibly the same) for example. Then would moving in a game be more basic action than action based on space variables? Would it be action at all?

    In another sense it's the vacuum state, as Roger Penrose uses it in this paragraph from his book The Road to Reality (p 657):

    "The shadowy state-vector

    [math]|0 \rangle[/math]

    over on the extreme right is normally taken to be the 'vacuum state', representing the complete absence of particles of any kind. A succession of these creation operators then creates a succession of particles, added one by one into the vacuum, so that

    [math]\Psi\Phi\ldots\Theta |0\rangle[/math]

    is the state that results from introducing particles successively with wave-functions

    [math]\theta , \ldots , \phi ,\psi.[/math]"

    The formula looks like the product of factors of powerful numbers, like octonions or something like that. The basic idea being that in a product of factors like this, if any factor is zero, the whole product gets zeroed. So the vacuum state-vector, from which the sequence of products starts, can't be zero.
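The point that the starting vector cannot be zero can be illustrated with a small truncated Fock space in numpy. This is an illustration only-- the four-dimensional truncation is arbitrary, and the matrices are stand-ins for Penrose's operators:

```python
import numpy as np

n = 4  # arbitrary truncation of the Fock basis |0>, |1>, |2>, |3>
# creation operator: a_dag |k> = sqrt(k+1) |k+1>
a_dag = np.diag(np.sqrt(np.arange(1, n)), k=-1)

vacuum = np.zeros(n)
vacuum[0] = 1.0     # |0> is a unit vector-- the absence of particles, not zero

one_particle = a_dag @ vacuum        # introduces one particle into the vacuum
two_particle = a_dag @ one_particle  # then a second

# if the rightmost factor were literally zero, the whole product would be zeroed
assert np.allclose(a_dag @ a_dag @ np.zeros(n), 0)
```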

    Probably in QFT there is too much basic stuff-- way too much, 120 orders of magnitude too much stuff. Which leads to the vacuum catastrophe.

    If instead the sequence of products like this begins with the Born infomorphism, there is far less basic stuff to begin with.

    John Baez found "Quantropy" (it's on his Azimuth blog), which means to me that the sum of the complex numbers representing possibilities must add to one.

    In the case of multiple particles with Born infomorphisms-- in the nonstandard future, where the only "words" available involve possibilities, and particles do not yet exist, the state (previous comment here) "unique_constant_identity" is such a state of possibility in the nonstandard future. Using complex numbers to represent possibility and impossibility (as in the paper), information about a particle in this situation can be conveyed in terms of possibilities by saying that the particle on which we're focused has a possibility (complex number) of 1 for its particular "unique_constant_identity" and zero for every other "unique_constant_identity" in the universe of particles considered.

    In terms of quantropy, this looks OK. Because at this level of inquiry-- before space is introduced into the equations-- the 1 we have assigned conforms to the rule of quantropy, and, of course, adds to 1.

    That's compatible with the way complex numbers are concluded to represent possibility according to informationalism, as in the paper. 1 means it's possible with value 1.

    0 for all the other states of "unique-constant_identity" means all those other states are impossible for the particle.

    OK, now we start with a 1 in Penrose's equation.

    There is far less basic stuff to begin with. And the 1 can be ignored when later we introduce space as a possibility.

    [math] \Theta\times 1 =\Theta[/math]
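A hedged numerical sketch of this assignment (the size of the universe of identities is arbitrary; complex values represent possibilities, as in the paper):

```python
import numpy as np

n_identities = 5  # arbitrary universe of "unique_constant_identity" states
possibility = np.zeros(n_identities, dtype=complex)
possibility[2] = 1.0  # possibility 1 for the particle's own identity, 0 elsewhere

# the quantropy-style rule: the complex possibilities add to 1
assert np.isclose(possibility.sum(), 1.0)

# and the 1 can be ignored once later factors are introduced: Theta * 1 = Theta
theta = 0.3 + 0.4j
assert theta * possibility[2] == theta
```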

      Lee, thanks for taking the time to explain. It has for the most part gone over my head.

      I did watch some of Leonard Susskind's lectures on mechanics so I understand moving from one set of space variables to another. You've just got me thinking that the space variables or coordinates are not needed at all but just the transformation matrix. As that matrix describes how to move, not where to move and so is like the action rather than a body tied to space.

      I also wonder about the 'unique constant identity'. Take, for example, the spin of an electron. It seems that it might be interaction with the measurement apparatus that creates the observed state. Releasing a photon or not could be a reaction to the orientation and absolute motion of the electron interacting with the apparatus, rather than the presence of a unique constant identity or state.

      Thank you for trying to explain; you've given me lots to think about, even if it isn't exactly what you intended.

      "...over my head." Actually me too.

      On the equation in previous post--

      There's a horrible mistake in physics.

      It's somewhere in that equation.

      And, it's not a *calculational* error.

      It's a *logical error.*

      First let me show you how big this mistake is.

      Then you can say whether or not you think it really is horrible.

      That shadowy vacuum state vector in previous post is the source of dark energy.

      Astoundingly brilliant physicists have calculated how much energy this equation puts into the universe. And the result is horribly and logically wrong.

      Calculations say that the amount of dark energy there should be in the universe is 120 orders of magnitude greater than the actual dark energy in the universe, according to the expansion of the universe as measured by instruments.

      That amount of error signals a *logical* mistake somewhere in the equation.

      First I'll try to write it down. Then, after that huge number, I'll return to say why it's a *logical* mistake in physics.

      [math]10^{120}[/math]

      OK, with that much dark energy expanding the universe, the universe would have to expand at very close to the speed of light if the vacuum energy is massive.

      But if the vacuum is massless, then the Universe would expand faster than the speed of light.

      There's a *logical* error here.

      But now, because I said this, I have to give a concrete example in the history of physics of a *logical* error in physics.

      It's Einstein's thought experiment.

      Say that Einstein dreamed that he was riding a wave away from an electromagnet.

      But he was riding a normal magnet.

      The electromagnet held him back by pulling on the normal magnet that he was riding.

      So he applied more speed. He went faster than the speed of the waves of electromagnetism pulling backward on his normal magnet.

      But as soon as he went faster than the waves, from his perspective the electrons he was looking at in the electromagnet switched direction and began cycling around the core of the electromagnet in the opposite direction from which they had been turning.

      He was outpacing the waves, and seeing into the past of the electromagnet.

      Following the laws of electromagnetism, as soon as the electrons were seen to change rotation, the force from the electromagnet on the normal magnet changed direction. Instead of pulling back on the normal magnet, because of the change in direction of rotation, the electromagnet then pushed the normal magnet in the direction it was already going.

      With no limit on the speed of light, this process could continue without end. And the faster the speed of the normal magnet, the more force exerted on it from the electromagnet. There could even be infinite force, if this kept up forever.

      It's a logical violation of the laws of thermodynamics.

      You can't have a force that can increase itself by pushing something away from itself.

      It's beyond being a perpetual motion machine. It's like wall street.

      That's what I mean by a *logical* mistake in the physics.

      But since physics is logical, you can't go faster than the speed of light.

      Once in a while there comes a time when new logic, instead of a new method of calculation, has to be introduced in order to fix a logical mistake in physics.

      In previous post I tried to correct this logical mistake in physics by writing the final equation in the post.

      To get to that equation, I had to adopt the role of being an *informationalist*. It's in the references for the paper.

      I had to apply an "informationalist inquiry" in order to deduce what kinds of numbers represent the possibilities that exist in the nonstandard future.

      The science required to write that equation down is Not a science of calculational method.

      It's a science of information.

      Again, it's in the references.

      The same science is required to understand the mysterious connection between mathematics and physics.

      Where the mysterious connection occurs, it's an infomorphism.

      But you need a science of information to say that.

      6 days later

      Update:

      Interacting dark matter:

      http://arstechnica.com/science/2015/04/new-evidence-that-dark-matter-could-be-self-interacting/

      If Born infomorphisms are ordered commutative monoids, here's how they can be assembled into resources--

      https://johncarlosbaez.wordpress.com/2015/04/07/resource-convertibility-part-1/

      E.g.,

      [math]BornInfomorphism_1 \times BornInfomorphism_2 \times BornInfomorphism_3 = BornInfomorphism_{cartesianProductOf123}[/math]

      On the right hand side of the equation is the core of an information channel, from Channel Theory (in the references for the paper).

      On the left hand side of the equation might be quarks.
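As a hedged sketch of the combination (the two-state sets below are invented stand-ins; the paper's Born infomorphisms are richer objects), the Cartesian-product core can be formed with itertools:

```python
from itertools import product

# invented two-state "Born infomorphisms" (stand-ins for the real structures)
born_1 = {"up", "down"}
born_2 = {"up", "down"}
born_3 = {"up", "down"}

# the right-hand side: the core of the information channel, a set of tuples
core = set(product(born_1, born_2, born_3))
assert len(core) == 2 ** 3

# the monoid operation is commutative up to relabeling of the factors
assert len(set(product(born_3, born_1, born_2))) == len(core)
```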

      20 days later

      SOME ROUGH CALCULATIONS

      In the "basement" (see below) the vacuum catastrophe does not exist.

      Because the machinery at that level works only with what exists.

      And because "vacuum," there, means "the absence of anything that exists."

      ***

      How many hydrogen atoms are there? "The result is approximately 10^80 hydrogen atoms." http://en.m.wikipedia.org/wiki/Observable_universe

      For a rough calculation I'll ignore the rest.

      In the calculation I imagine the hydrogen atom to be like a building with three floors and a basement.

      In the basement, thermodynamic machinery connects by pipes and cables to the upper floors. And because of these information channels, the upper floors convey information about the basement; similarly the basement conveys information about the upper floors.

      In more detail, each floor supports a different type of particle. The top floor supports particles governed by Quantum Chromodynamics. The second floor supports particles governed by Electroweak Theory. The ground floor supports particles governed by Quantum Electrodynamics. The basement supports Born infomorphisms-- governed by Informationalism (the paper). Each particle on the top three floors is connected by the pipes and cables to its very own Born infomorphism in the basement.

      In the paper by SE Rugh, H Zinkernagel (2002. The quantum vacuum and the cosmological constant problem. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (4): 663-705) the results are presented of calculations of vacuum energy for the top three floors, assuming that the vacuum energy is indeed located only on those floors.

      But here the calculation assumes that all of the vacuum energy is in the basement. So each Born infomorphism not only has its own proper time (the paper), but also each Born infomorphism has its own background energy (previous post).

      Only part of the vacuum energy calculated for the top three floors conveys information about the posited background energy in the basement.

      The portion of the vacuum energy for particles in the top three floors which carries information about the background energy of the Born infomorphisms in the basement is just the portion of the vacuum energy held in the volume occupied by the particle(s) housed in the upper floor.

      The calculations answer the following two questions:

      For each floor, what volume in the Universe would a particular type of particle require in order to resolve the vacuum catastrophe?

      Is there anything interesting about this volume?

      First the volume of the Universe:

      4*10^83 liters

      Here's the equation I'll use to get the size needed for each type of particle in order to resolve the vacuum catastrophe:

      (numberOfParticlesOfThisTypeInTheUniverse)

      times

      (the ratio of

      (volumeNeededForThisTypeToResolveTheVacuumCatastropheWithDiameterInMeters)

      to

      (volumeOfTheUniverse))

      times

      (theOrderMagnitudeUsuallyCalculatedforThisTypeParticle)

      must equal

      1

      Quantum Chromodynamics:

      "One thus frequently estimates: ρ_QCD,vac ∼ 10^-3 - 10^-2 GeV^4 ∼ 10^35 - 10^36 erg/cm^3, which is more than 40 orders of magnitude larger than the observational bound (3) on the total vacuum energy density." (SE Rugh, H Zinkernagel, p. 16)

      ((10^(80)) * ((4/3)*3.14159*(d*100/2)^3/1000) / (4*10^(83))) * 10^(40) = 1

      d=9*10^(-14) meters

      Perhaps of interest:

      From http://en.m.wikipedia.org/wiki/Orders_of_magnitude_(length)

      Range (m) from 10^-15 to 10^-12, "atomic nucleus, proton, neutron"

      Electroweak Theory:

      "we are left with a Higgs vacuum energy density of the order of ρ_Higgs,vac = -µ^4/4g = -gv^4 ≈ -10^5 GeV^4 = -10^43 erg/cm^3 which, in absolute value, is roughly 52 orders of magnitude larger than the experimental bound on Λ" (SE Rugh, H Zinkernagel, p. 15)

      ((10^(80)) * ((4/3)*3.14159*(d*100/2)^3/1000) / (4*10^(83))) * 10^(52) = 1

      d=9*10^-18 meters

      Perhaps of interest:

      From http://www.quora.com/What-is-the-physical-size-of-a-Higgs-boson

      "The most appropriate length scale I'd associate with the Higgs is about 10^-17 m."

      Quantum Electrodynamics:

      "Assuming this energy to be of the QED zero-point energy type, we get roughly (by inserting the Planck energy in eq.(6) with E_P = ℏω_max), ρ_Planck,vac ∼ (10^19 GeV)^4 ∼ 10^76 GeV^4 ∼ 10^114 erg/cm^3, thus over-estimating the vacuum energy, relative to the observational constraint (3), by more than ∼ 120 orders of magnitude!" (SE Rugh, H Zinkernagel, p. 14)

      ((10^(80)) * ((4/3)*3.14159*(d*100/2)^3/1000) / (4*10^(83))) * 10^(120) = 1

      d= 10^-40 meters

      Perhaps of interest:

      This size is in the range discussed for "the black hole electron" in this reference:

      http://en.m.wikipedia.org/wiki/Black_hole_electron
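The three calculations above can be checked by solving the word-equation for d directly. A hedged sketch (the function name and defaults are mine; the inputs are the post's round numbers):

```python
import math

def particle_diameter(order_of_magnitude,
                      n_particles=1e80,        # ~10^80 hydrogen atoms
                      v_universe_liters=4e83): # ~4*10^83 liters
    # solve: n * ((4/3)*pi*(d*100/2)^3 / 1000) / v_universe * 10^k = 1
    # (d in meters; d*100/2 is the radius in cm; /1000 turns cm^3 into liters)
    sphere_factor = (4.0 / 3.0) * math.pi * (100.0 / 2.0) ** 3 / 1000.0
    d_cubed = v_universe_liters / (
        n_particles * sphere_factor * 10.0 ** order_of_magnitude)
    return d_cubed ** (1.0 / 3.0)

print(particle_diameter(40))   # QCD floor: ~9e-14 m
print(particle_diameter(52))   # Electroweak floor: ~9e-18 m
print(particle_diameter(120))  # QED floor: ~2e-40 m
```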

      18 days later

      "a theory that sounds much too simple to be right, yet fits the data surprisingly well"

      John Carlos Baez

      https://johncarlosbaez.wordpress.com

      8 days later

      Theory of entropic forces-- each has an opposing force

      First start to write down Hamilton's first equation:

      [math]\frac{dC_hp}{dt} = \frac{\partial C_hp}{\partial psbltyCount_1}\frac{dpsbltyCount_1}{dt} + \frac{\partial C_hp}{\partial infonCount_1}\frac{dinfonCount_1}{dt} + \cdots [/math]

      As soon as you write down the first two terms on the right hand side of the equal sign, you can see the pattern.

      Say that

      [math]C_hp[/math]

      means "Chariot horsepower" in Parmenides' story. The more horsepower that a chariot has, the more horses will have to vanish before the chariot (existence) stops.

      To increase the possibilities for this first information channel is to increase the number of horses pulling the chariot. To have no possibilities at all means death. The higher the possibility count, the farther away from death. Assume:

      [math]\frac{\partial C_hp}{\partial psbltyCount_1} = 1 [/math]

      To increase the information carried by this first information channel is also to increase the number of horses pulling the chariot. To have no information at all means death. The higher the information count, the farther away from death. Assume:

      [math]\frac{\partial C_hp}{\partial infonCount_1}= 1 [/math]

      Now look at Hamilton's first equation. To model a constant number of horses pulling the chariot, in Hamilton's first equation it must hold that

      [math]\frac{dC_hp}{dt} = 0 [/math]

      Consider two cases. First: the count of possibilities increases in time (dt) by 1.

      Then given Hamilton's first equation, how must the count of information change in order to maintain the count of horses as a constant number?

      First case:

      [math]\frac{dpsbltyCount_1}{dt} = 1 [/math]

      means that this must hold:

      [math]\frac{dinfonCount_1}{dt} = -1 [/math]

      in order to make

      [math] \frac{dC_hp}{dt} = 0 [/math]

      Second case:

      [math]\frac{dpsbltyCount_1}{dt} = -1 [/math]

      means that this must hold:

      [math] \frac{dinfonCount_1}{dt} = 1 [/math]

      in order to make

      [math] \frac{dC_hp}{dt} = 0 [/math]

      In the paper this is the Inverse Principle of Informationalism.

      Corollary: Every entropic force must have an opposite.
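Under the unit-partial assumptions above, the two cases reduce to a one-line arithmetic check. This is a sketch of that arithmetic only, not of the paper's full principle:

```python
# assumed unit sensitivities, as in the two partial-derivative equations above
dC_dpsblty = 1.0  # partial of C_hp with respect to psbltyCount_1
dC_dinfon = 1.0   # partial of C_hp with respect to infonCount_1

def dChp_dt(dpsblty_dt, dinfon_dt):
    # first two terms of Hamilton's first equation for C_hp
    return dC_dpsblty * dpsblty_dt + dC_dinfon * dinfon_dt

assert dChp_dt(+1, -1) == 0  # case 1: possibilities up, information down
assert dChp_dt(-1, +1) == 0  # case 2: possibilities down, information up
```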

      6 months later

      It's been a while since I've written notes here. In case anyone reads this: I've made a so far apparently successful attempt to delete all comments from others here. Almost every comment was from persons clearly not expert in physics or mathematics. From my experience elsewhere on this site (trying to make some math comments on the Physics of the Observer), this characterizes those making comments in these blogs. So I've deleted all my comments elsewhere on this site. I suspect no one expert in math or physics is reading these comments, and anyone who is very likely holds the opinion that anybody posting in the blog comments on this site is a crackpot.

      My purpose here and now is to write some notes on recent findings about dark matter. Infant galaxies have been observed deeply nested in the early web of dark matter.

      If dark matter does host ordinary matter (as suggested in the paper here), it seems from this recent observation that not every particle of dark matter is selected to be a host for ordinary matter. Some particles do host ordinary matter. Others do not. The question then becomes: How do the lucky dark-matter hosts of ordinary matter get selected? There is an idea in the paper that might answer this. The lucky particles of dark matter that get to host ordinary matter are those selected to be locations from which dark energy feeds possibilities to ordinary matter particles. Not every particle of dark matter seems to host such a feeding location for dark energy. (Feeding possibilities to the particle of ordinary matter, as described by the probability learning algorithm in the paper.)

      And again as in the paper, this describes a thermodynamic engine which drives the time variable in the equations of the Standard Model in one direction only: forward. The waste dark energy from this thermodynamic engine feeding possibilities to particles of ordinary matter expands the Universe. These ideas from the paper seem compatible with the recent observations.

      Who knows, maybe there will someday be a reason to write again.

      21 days later

      I worked on this FQXI essay contest because it suggested to me the idea of an information channel from the physical world into some minimum mathematical model, a channel through which information is carried and conserved by means of "infomorphisms." (The references for this term are in my paper for the contest: "Simple math for questions to physicists," which I'll abbreviate as SMQP in the rest of this post.) Long after the contest has ended, this is my final post in this thread.

      ***

      "Dark matter is a hypothetical kind of matter that cannot be seen with telescopes..." From the Wikipedia entry on dark matter.

      Dark matter seems like a puzzle about what exists. Because to interact with light and be seen with a telescope, it seems an object must occupy some finite region of space-- the particle must *exist* in space.

      So if dark matter does Not interact with light in this manner-- which would require an existence in some finite region of space-- then a particle of dark matter must not exist in any finite region of space. Somehow the particle of dark matter must exist only in a finite region of time.

      Hypothesis: A particle of dark matter has a center of gravity but no extension in space. A particle of dark matter exists Not in space, but only in time.

      ***

      "In the standard model of the evolution of the universe, galactic filaments form along and follow web-like strings of dark matter." From the Wikipedia entry on Galaxy Filament.

      It seems there are regions of the Universe with no matter, regions with dark matter, and regions with both dark and ordinary matter.

      Hypothesis: Where a particle of ordinary matter exists there exists a particle of dark matter. Where a particle of dark matter exists there exists the possibility of a particle of ordinary matter.

      ***

      "In physical cosmology and astronomy, dark energy is an unknown form of energy which is hypothesized to permeate all of space, tending to accelerate the expansion of the universe." From the Wikipedia entry on Dark Energy.

      Hypothesis: It requires work to expand something that exists only in time into something that exists in space. Dark energy fuels this expansion of possibilities. But thermodynamic work is inefficient. So dark energy disperses into the Universe from some kind of thermodynamic inefficiency.

      ***

      The mathematics in SMQP seems like a minimum model for these hypotheses, which also supports the attached diagram about what seems to be a kind of thermodynamics.

      ***

      A cycle produces the stream of time.

      For the above hypotheses-- rather than a field in space-- a "non-wellfounded set" would be the preferred mathematical object for modeling a particle of dark matter that has extension only in time, since the non-wellfounded set is useful for expressing a stream in time that is associated with a repeating cycle. (I learned what I know about non-wellfounded sets from the book by Barwise and Moss, "Vicious Circles: On the Mathematics of Non-Wellfounded Phenomena.")

      For example, the experience of an individual existing in proper time can be expressed by using such a set to model such a cycle:

      1. individual_existence = (changes, individual_existence)

      Every instant of time in an individual's existence is the same "now"; every day the individual existence-- in human terms, the "self"-- is constant. As long as individual existence lasts, it's the same individual existence, or in the case of a human being, the same "self."

      By repeatedly applying equation 1, we get a model of individual existence associated with a stream of changes:

      individual_existence = (changes, (changes, individual_existence))

      individual_existence = (changes, (changes, (changes, individual_existence)))

      individual_existence = (changes, (changes, (changes, (changes, individual_existence))))

      And so on.
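      The unfolding above can be carried out mechanically. Here is a minimal sketch in Python (the function name and the string encoding of equation 1 are mine, not from SMQP):

```python
# Unfold equation 1, individual_existence = (changes, individual_existence),
# a chosen number of times. Each unfolding substitutes the right-hand side
# for individual_existence, growing the stream of changes while the
# individual existence itself stays constant.

def unfold(depth):
    """Return equation 1 after `depth` unfoldings of its right-hand side."""
    term = "individual_existence"
    for _ in range(depth):
        term = f"(changes, {term})"
    return f"individual_existence = {term}"

print(unfold(1))  # individual_existence = (changes, individual_existence)
print(unfold(3))  # individual_existence = (changes, (changes, (changes, individual_existence)))
```

However deep the unfolding, the same constant "individual_existence" remains on both sides, which is the point of using a non-wellfounded (circular) set here.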

      The non-wellfounded set can also model that individual existence in time goes through cycles of destruction of the old and creation of the new. Since the particle of dark matter is assumed by these hypotheses to exist only in time, the standard idea of time as a point moving on a line would be the wrong kind of circularity, because "moving" in this idea implies that, in some interval of time, space is traversed. But it seems that space should be removed from the concepts used to model existence of something that only exists in time.

      Here is how the syntax of the non-wellfounded sets models an individual existence using-- not the motion of a point along a line in space-- but instead, a cycle of destroying the old and creating the new:

      "-" means "destroying the old"

      "" means "creating the new"p

      A - B = C

      C D = A

      A. "individual_existence = ( changes, individual_existence)"

      Minus (destroying the old) B. "(changes, )"

      Equals C. "individual_existence = individual_existence"

      Adding (creating the new) D. "(changes, )"

      Equals A. "individual_existence = ( changes, individual_existence)"

      And so on.

      In this way the idea is modeled that the stream of changes associated with a constant individual existence, and its stream of proper time, arise from a cycle of destroying the old and creating the new.

      Therefore, for these hypotheses, the particle of dark matter is (I think) minimally modeled as a non-wellfounded set which, through a cycle of destroying the old and creating the new, produces the stream of changes associated with individual particle existence while maintaining constant individual existence.

      ***

      "Gearing" dark matter to the standard model of time, as modeled by the non-wellfounded set.

      2. time = (real_number, time)

      Equation 2 uses the non-wellfounded set to model the stream of real numbers measuring time. Notice that unlike the model of a point moving on a line, there is no need to assume the previous existence of a line with one part past and another part future, with the location of the moving point being the "now."

      The idea is first to "gear" this real number to the non-wellfounded set used to model the particle of dark matter, thus modeling that the stream of real numbers in proper time of an individual existence is actuated by a particle of dark matter.

      3. time = (nonstandard_monad, time)

      Equation 3 models a stream of nonstandard monads, each monad containing in its standard part the real number of equation 2. (References are in SMQP.) The halo of nonstandard points around the standard point in the monad comprises the "nonstandard past" and the "nonstandard future," both infinitesimal in the mathematical model of nonstandard analysis.
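      In the notation of nonstandard analysis (a sketch; the symbols μ, ε, and st are the standard usage of that field, not taken from SMQP), the monad of a real time t is:

```latex
% The monad \mu(t): the real number t together with its halo of
% nonstandard points at infinitesimal distance \epsilon; the standard
% part map st recovers the real "now".
\mu(t) = \{\, t + \epsilon : \epsilon \text{ infinitesimal} \,\}, \qquad
\operatorname{st}(t + \epsilon) = t .
```

Here the points t + ε with ε &lt; 0 play the role of the nonstandard past, those with ε &gt; 0 the nonstandard future, and the standard part t is the standard "now."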

      With this step in modeling the particle of dark matter, room is made available for the particle of dark matter to exist outside of space but still extended in time. The idea modeled is that the particle of dark matter exists extended in time in the nonstandard past and the nonstandard future, while space exists only in the standard part of the monad, the standard "now." The standard "now" is actuated by the particle of dark matter which exists around it in the nonstandard past and nonstandard future.

      ***

      An iterated transformation powers the cycle.

      Having introduced a monad that is a system of the nonstandard future, the nonstandard past, and the standard present, all inside a non-wellfounded set that models a particle of dark matter, the question becomes how this system actuates proper time of a particle of ordinary matter.

      In SMQP, this is where some mathematics of information enters the picture, specifically the mathematical theory of information called Situation Theory. Inside the particle of dark matter (inside the non-wellfounded set) the monad starts to carry information "about" the particle of ordinary matter for which it actuates proper time. The idea being modeled by the mathematics is that this must be powered by dark energy.

      Without the fuel of dark energy, the particle of dark matter is "about" nothing. It exists by itself in the vacuum. But given the fuel of dark energy, it becomes "about" something-- the particle of ordinary matter for which it carries the proper time.

      In SMQP this addition of ordinary matter is modeled by a complex number (supported as information, through situation theory, by the nonstandard future in the monad inside the non-wellfounded set) becoming nonzero. When this complex number is zero, the particle of dark matter exists by itself, with no associated particle of ordinary matter and no dark energy fueling it. When this complex number becomes nonzero, dark energy is fueling the particle of dark matter, and there now exists a particle of ordinary matter-- about which the complex number, as modeled by situation theory, carries information, and for which the non-wellfounded set actuates the stream of proper time.

      In SMQP the Born rule is incorporated into a "Born infomorphism," a transmission of information from the nonstandard future to the nonstandard past which carries information about the particle of ordinary matter for which the particle of dark matter actuates proper time. The idea being modeled is that here is a kind of thermodynamics.

      The Born infomorphism has the same mathematical form as the game of Probability Learning, much studied in laboratories and modeled in SMQP using the mathematics of Shannon's theory of information. The idea being modeled is that in Probability Learning, "fuel" is fed into a system, but not all the fuel is consumed. In the Born infomorphism, this unconsumed fuel would escape into the Universe at large, dispersing dark energy throughout and allowing it to increase possibilities in the form of space, just as it increases possibilities (from zero to nonzero) in the Born infomorphism.

      The limitation of the essay contest to nine pages kept me from writing more about this in SMQP.

      Attachment #1: 5_image.jpg

      8 days later

      This is a continuation of the previous note in this thread about the paper "Simple Math for Questions to Physicists" (SMQP).

      "Statement: 'The Universe is not a perpetual motion machine. It requires thermodynamic work to actuate time while holding everything that exists inside the constraints of physical law. Hypothesis: The minimum mathematical situation to model this statement involves a mathematical model of possibilities and impossibilities, to account for physical laws, and also a mathematical model of time that's not based on motion but instead, on a cycle of destroying the old and creating the new."

      The existence of the Born infomorphism depends on the existence of the Born rule.

      [math]P(x_i) = \Psi(x_i)\Psi^*(x_i)[/math]

      [math]i \in\left\{ {1, 2, 3...}\right\}[/math]
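      The Born rule itself is easy to check numerically. A minimal sketch in Python (the amplitudes below are invented for illustration; any normalized vector of complex amplitudes would do):

```python
# The Born rule: P(x_i) = Psi(x_i) * conj(Psi(x_i)).
# For any complex amplitude this product is a real, nonnegative number,
# and for a normalized state the probabilities sum to 1.

psi = [0.6 + 0.0j, 0.0 + 0.8j]   # a normalized two-state amplitude vector
P = [round((a * a.conjugate()).real, 12) for a in psi]

print(P)                          # [0.36, 0.64]
print(abs(sum(P) - 1.0) < 1e-9)  # True: probabilities sum to 1
```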

      The equal sign in the Born rule is either basic-- or not. If the equal sign is not basic, then something else is basic, which results in the relation.

      There is a possibility that the Born relation is an equation rather than an identity, and that this equation represents something like the outcome of strategies of players in a game called Probability Learning (which falls under the mathematical theory of games)-- specifically, if the algorithmic strategy of one player in the game can be represented by the same number that represents the other player's algorithmic strategy. In this situation, the equals relation itself is not basic. What's basic are the algorithmic strategies of the players, represented by the same number.

      Imagining the mathematics of a game underneath the Born rule is like fabricating a model airplane for the wind tunnel. If the logical details are correct, the toy airplane in the wind tunnel should be able to model a fact or two about the real thing. In what follows, this approach may suggest a role for dark energy.

      But first here is the game in question:

      There is a set of locations, each member of the set being a possibility, or a possible location, for the payoff. Say that the first player in the game is the Experimenter. In each cycle, the Experimenter chooses in private (and hides the move) at which possibility the payoff will occur. Say that the other player is the Subject. In each cycle the Subject chooses at which possibility to look for the payoff, based only on history, without seeing the move of the Experimenter. When the Subject misses a payoff, at the end of the cycle the Experimenter removes that payoff from the game. Then the cycle is repeated.

      The references in SMQP review the many laboratory studies in which the following outcome is always found: For each possibility i where a payoff can occur, there is a probability with some value, say p_i, that the Experimenter will place the payoff at that possibility. The Subject will then choose that possibility with some probability, say q_i. Repeated experiments show that for each possibility i these two probabilities are the same. The Subject is said to have "learned" the probabilities with which the Experimenter places the payoff at the possibilities.

      The algorithm which models the strategy of the Subject in this game minimizes the difference between two opposing "forces" associated with each possible location for the payoff. One of the forces "pushes" the Subject away from possibility i in subsequent moves. The other force "pulls" the Subject toward possibility i in subsequent moves. In human terms these forces could be called regret.

      Of course there will be some percentage of the Subject's choices of possibility i when the Subject sees that the payoff occurred at some other possibility, different from the possibility i that was chosen. These occasions of missing the payoff produce regret in the Subject at having chosen possibility i, which in subsequent moves "forces" the Subject away from choosing possibility i.

      And, there will also be some percentage of choices by the Subject of other possibilities different from possibility i, when the Subject sees that the payoff did indeed occur at the possibility i, which was not chosen. These occasions of missing the payoff produce regret in the Subject at having Not chosen possibility i, which in subsequent moves "forces" the Subject toward choosing possibility i.

      The algorithm which models the observed behavior of the Subject is then to balance these two opposing "forces" for each of the possibilities for occurrence of the payoff. The solution (check it out) is that p_i = q_i.
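      The probability-matching outcome p_i = q_i can be reproduced in a small simulation. The Subject below simply chooses in proportion to the payoff frequencies it has observed-- one simple stand-in strategy of my own, not necessarily the regret-balancing algorithm of SMQP-- whose long-run choice frequencies nevertheless match the Experimenter's probabilities:

```python
import random

random.seed(0)

p = [0.7, 0.3]           # Experimenter's hidden payoff probabilities
payoff_seen = [1, 1]     # Subject's memory of payoff locations (seeded at 1)
chosen = [0, 0]          # how often the Subject picks each location

for _ in range(20000):
    # Subject's move: pick a location in proportion to remembered payoffs.
    guess = random.choices([0, 1], weights=payoff_seen)[0]
    chosen[guess] += 1
    # Experimenter's hidden move; the Subject observes where the payoff was.
    payoff = random.choices([0, 1], weights=p)[0]
    payoff_seen[payoff] += 1

q = [c / sum(chosen) for c in chosen]
print(q)   # q_0 near 0.7 and q_1 near 0.3: the Subject "learns" p
```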

      The Subject "learns" the probabilities. In more detail--

      Statement 1: [math]\Psi(x_i)\Psi^*(x_i)[/math]

      In the Born infomorphism, the above product of complex numbers would represent the Experimenter's strategy in the Probability Learning game.

      Statement 2: [math]P(x_i)[/math]

      The above probability number would represent the Subject's strategy in the Probability Learning game.

      The players in the game have different contexts, which determine the meaning of the above symbols and numbers. The context for the Experimenter is possibilities. The context for the Subject is history of types of events. In the Born infomorphism, the minimum context for possibilities is the nonstandard future, and the minimum context for history of events is the nonstandard past.

      In the context of the nonstandard future and the game of probability learning, the symbols and syntax in statement 1 mean that a number represents the conjunction of two possibilities: (1) the possibility that the region of space x_i will be occupied by something that exists AND (2) the possibility that No region of space other than x_i will be occupied by something that exists.

      Space does not yet exist in the context of statement 1. In statement 1 the term x_i is just the possibility of that type of space.

      In the context of the nonstandard past and the game of probability learning, the Subject has the next move, and the symbols and syntax in statement 2 mean that this number represents the probability of the type of event in which a particle is observed in the region of space x_i. That is...if the space to support the existence of such a particle were to exist.

      Neither space nor the particle exists in the context of statement 2. Both are historical events-- in the laboratory, just memories in the Subject.

      The only situation where both space and the particle can exist together in the game is on what can be imagined as the game board: the standard present in the nonstandard monad comprising nonstandard future, standard present, and nonstandard past. And then, only if the Experimenter and Subject choose the same i.

      This may be imagined as follows. When the Experimenter makes a move in the game, this makes all the possibilities impossible, except the chosen possibility. The chosen possibility is, on the Experimenter's move in the game, entered into the standard present in the monad.

      If in its move the Subject chooses an impossible location, one indexed in the set of possibilities j instead of i, then the conditions to support existence of such a particle do not hold, because such a particle cannot exist in a region where it is impossible for such a particle to exist. As a result, an instance of such a particle is not created in the standard present on this kind of move.

      The only situation when both the particle and the space required for such a particle both exist is when the Experimenter and Subject choose the same possibility i for their moves. In that case, the particle and the space required for it to exist as a particle both exist.

      In the case where the possibility-set indices i and j are different, no particle exists in the standard present in the monad, only the space chosen by the Experimenter. Following what happens in the lab, this possibility for existence is not destroyed in the game. Rather, it is removed from play.

      This may model the logic of dark energy. The possibility, the possible location for existence of such a particle, is a region of space. As in the laboratory game, the Experimenter removes the possibility not chosen from the game but does not destroy it. Although it is no longer in play for the game, the space is given to the Universe, which logically would expand. Keeping the details of this game imagined to be underneath the Born infomorphism tidy supports at least this observation of physics.

      Since the laws of physics say what's possible and impossible, in this sense the moves imagined of the Experimenter in the game are those expressing the laws of physics. But rather than "obeying" the laws of physics, because of what looks like inefficiency due to the excess space used by the game, this image of a game might suggest that a thermodynamic engine is at work.

      Imagine it takes work to implement the laws of physics. And as for all thermodynamic engines performing work, the engine would be inefficient. This idea about game theory underneath the Born infomorphism is a model for the idea that it takes the work of a thermodynamic engine to drive the laws of physics, and the idea that the lost energy due to inefficiency compatible with the laws of thermodynamics is the dark energy that's been observed to be expanding the Universe.

      Is this a minimum mathematical model of the idea?

      In the previous post in this thread there is a model supporting statements about time and dark matter. This post adds a model supporting statements about possibilities and impossibilities that involves dark energy.

      Are there infomorphisms to more involved mathematics - for example the theories of fields?

        The previous post by "anonymous" is mine. I wasn't logged in!
