(continued from the above post)

Re "apart from the algorithmic relationship, there is no necessary connection between the information and the outcome":

The demon's opening of the chamber door is an outcome that was not a logical consequence of the true-or-false information. Any outcome could have been linked to the true-or-false information, e.g. buy a red hat or wear the blue socks.

  • [deleted]

"Only those who attempt the absurd will achieve the impossible. I think it's in my basement... let me go upstairs and check." - Maurits Cornelis Escher

On the infinite two-dimensional complex plane of quantum probability, M. C. Escher could construct a perpetual waterfall or a continuously ascending and descending staircase. Perhaps the choice of geometry could therefore allow Maxwell's Demon uninformed results. What would his friend, Roger Penrose, think?

  • [deleted]

Temperature, like mass, suffers a lack of general definition in theoretical terms. Both are treated operationally. Yet thermodynamics must become generalized with both Relativistic and Quantum mechanics to progress beyond the current assemblage that is the Concordance of the cosmological standard model. In short, quantization needs Canons of general terms.

Consider Maxwell's convergence and divergence functions, curl and div. These commonly apply to the coefficients of permeability (mu) and permittivity (epsilon) of a field associated with a mass. Mass : energy equivalence provides no law of proportionality to prescribe an upper density bound for any quantity of energy, and so attempts to determine a finite field fall into a mathematical singularity.

In analysis, permeability and permittivity are separate operations, yet in reality both would physically coexist. Both limit to a light velocity proportion in free space, but operate under opposite signs. The product would therefore be (-c^2), and to be consistent with mass : energy equivalence a postulate of proportionate density should also be quadratic. Thus a hypothetical upper density bound could be prescribed as:

lim = m(mu * epsilon)^2 = mc^4

This does not in itself define what portion of the total mass must exist at that upper density, but that proportionate density would prescribe a finite quantity if a finite volume were derived as existent at a constant density. A differentiated quantity of energy would not need to become any more dense to become inertially bound as a finite field. And within that constant-density volume, temperature could equate to relativistic rate of response.
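For reference, the measured free-space values do fix a definite product, though with the light-velocity proportion entering inversely: in standard SI electromagnetism, mu0 * epsilon0 = 1/c^2. A minimal numerical cross-check of that textbook relation (using CODATA values; this checks the standard relation only, not the signed product posited above):

```python
# Cross-check of the standard free-space relation mu0 * epsilon0 = 1/c^2 (SI units).
mu0 = 1.25663706212e-6       # vacuum permeability, N/A^2 (CODATA)
epsilon0 = 8.8541878128e-12  # vacuum permittivity, F/m (CODATA)
c = 299_792_458.0            # speed of light, m/s (exact by definition)

product = mu0 * epsilon0
print(product)          # ~1.11265e-17
print(1.0 / c**2)       # ~1.11265e-17

# The two agree to the precision of the quoted constants.
assert abs(product * c**2 - 1.0) < 1e-9
```

Any sign convention that yields (-c^2) rather than 1/c^2 is the post's own postulate, not standard electromagnetism.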

  • [deleted]

Information, in the modern theoretical sense, is the attempt to formalize what has long been the colloquial ambiguity, "how does 'it' know X?". Mathematically this is confronted by the conundrum of discrete vs. continuous.

As adults we are accustomed to a working knowledge of water being comprised of discrete molecules electrostatically bound as a viscous fluid. We lose that childlike facility to be fascinated by the seemingly seamless quality of a shimmering slender stream running from a faucet that breaks into rivulets and drops as we curiously poke it with our finger. We begin our quantum discretion with the wise question, "How does the water 'know' how to do that?" How does what appears seamless become differentiated? Where does the break in symmetry occur? It becomes easier to simply start with things being made of pieces that cling to each other than to retain the fascination of a continuum and seek how the stream can no longer 'know' itself as a continuous stream. The knowing *itself* is information that is simply connected. Simply means to not over-think it. Pieces knowing each other, or of each other, is information that is complexly connected.

How does light know what the rate of passage of time is? Given that light velocity is accepted as a universal absolute, light could be simply connected with the highest limit of the rate of passage of time, at least in the one dimension of its linear propagation. But we can only assume that the rate of passage of time anywhere is somewhere between nil and light velocity. So light would be complexly connected, in the two other orthogonal spatial dimensions, to the rate of time. So it would be quite possible for light to consist of an elastic light velocity particle and also a one-dimensional light velocity interval where the information of rate of time is physically sought laterally, and that relativistic effect is what registers to an observing system as a wave.

    • [deleted]

    ...And a tip of my hat to Thomas H. Ray for his theoretic examination of One Dimensional Solitons.

    Anonymous, could you please put your name or initials at the end of your post if you are not signing in? There may be several Anonymouses posting, and it would be good to know who the posts belong to.

    • [deleted]

    Hi Georgi, it's been me, jrc, hogging the bandwidth. I had to get a new cheap laptop that is so overloaded by the OS that I don't use it for anything that requires 'creating an account'. Tough enough to keep a clean machine.

    I thought of you while composing the point in argument that we can only assume the rate of passage of time anywhere is somewhere between nil and light velocity. Einstein's aphorism that 'time stops at light velocity' is provocative but neglects the obvious point that it would only be so in relation to light velocity being the upper limit to the rate of passage of time. But that is all within the very limiting constraints of SR. In GR, temperature can be generalized to higher energy density = higher gravitation = lower temperature = slower passage of time. The challenge is to formalize that at the quantum level, and GR treats a region of constant density as an averaged mass density rather than a proportional mass density.

    Condensed Matter Physics, which is by far the largest generic discipline in today's practicing physics community, recognizes this. Yet there remains the conspicuous absence of consensus on a definitive formulation of a particle of condensed energy which meets conventional standards, not least of which would be distribution in accord with the inverse square law.

    John, I'll try not to overthink this. Doesn't the question 'how does it know?', come with the prior assumption that it knows? But I don't think it knows; to have knowledge seems to require some temporarily persistent structure or organisation independent of the phenomenon 'known'. The water is simply taking the path of least resistance around rather than through the finger. Isn't asking how it knows to do that a bit like asking how the battery knows when the distant switch is open?

    There is something odd about asking the rate of passage of time, let alone how light knows it. (Jumping back to what it means to know something, rather than just acting as the physical circumstances 'prescribe'.) To measure the rate requires that the standard used for comparison is already fixed, yet it is also the very thing to be measured.

    • [deleted]

    Georgi,

    Yes, 'knowing' is a colloquialism. It's really about what we don't know, which is the greater part of anything we do seek to understand. So it is quite common and getting worse in the hubris of the Information Age of super-connectivity. Which is why the logical constraint of Rob McEachern, that information is a purely mathematical concept, is the better guide.

    And yes, asking what the rate of passage of time might be is rather odd. As it is to ask what might be meant by connectivity. But in the here and now, these are time-critical questions. In 2017 the Chinese achieved what they later publicly announced as "a significant sequence of singlet pair production" and successfully transmitted a video via quantum key encryption between their research facility near Beijing and their partner lab in Vienna, Austria. At the time of President Trump's State visit to China in 2017, it was stunningly obvious that his body language had deserted him in making the usual political platitudes following the initial meet-and-greet exchange of portfolios. Everyone in the U.S. entourage, with the exception of Gen. Mattis, ret., looked like someone who goes to a party, encounters something everybody else seems to know, and thinks everyone is playing a trick on them. News commentators seemed to conclude that it was simply a lack of personal preparedness on the part of an inexperienced President, but it was apparent to seasoned observers of geopolitics that the U.S. strategic estimate going in had been caught flat-footed.

    So it isn't one's preference of paradigm that matters. It is a matter of who gets it to work with the infrastructure to support it, who then corners the global market on electronic transfer of funds and holds the algorithmic keys to pick winners and losers. And it doesn't matter who I am, in what I might say here; it's just another grain of sand in the data mine.

      John, emergence of effects can happen at the macroscopic scale that aren't a property of the constituents at the atomic or subatomic scale. Surface tension is one such effect of water. Also, whether the water passes through or around depends upon the material of the finger. A foam finger will fill with water and then it will pass through; not so for a flesh finger. The material's structure is a macroscopic arrangement of matter, not a part of an atomic or subatomic particle's existence alone. Though relevant, if thinking about a particle rather than a continuous model, is the fact that fermionic particles cannot occupy the same space but must occupy their own space. You say "the water appears seamless". Appearance is something else from the water, an observation product. Formed using input of EM radiation, the product does not consist of water molecules.

      John, I agree that the states prior to measurement, that is, the information purportedly carried by the 'entangled' particles, is a mathematical model. The particles do not know their test outcome states, which don't exist until the tests are carried out. As I understand it, the entangled pairs can be used to tell whether there has been tampering, an attempt to look at an encrypted signal. It isn't like a super padlock that will keep intrusion out, but it does let on that intrusion has happened. Like a hair stuck to door and door frame. It is a game changer for covert espionage. Which wouldn't be a problem if everyone had their cards on the table and was working at mutual cooperation, rather than winner takes all. I haven't seen the footage with the body language you mention. Maybe there was some trepidation about possibly being caught with a hand in the cookie jar. Who knows? Not me.

      Prior to 'what is the rate of passage of time?' comes the question 'what is passage of time?' For a human observer it would be the updating of the experienced present, which is usually about 0.1 s. But MIT researchers have found that the shortest duration at which an image can be identified, more often than chance alone would give, is 13 ms. So if seen (and recalled), the update has only taken 13 ms. If passage of time is considered instead to be the change in configuration of all that exists, the smallest conceivable change in location (of some thing or phenomenon) and the fastest motion ought to be considered. The smaller the scale, the more change is happening, presumably until, beyond the particulate nature of matter, the limit is reached where there is no longer differentiation whereby change can be identified. Rate of change is not uniform but time is, as there is just the one configuration.

      My last point wasn't very clear. At larger scales change tends to happen more slowly, making it less clear when the configuration is different (as on the larger scale the change that has happened on the smaller scale may not be discernible).

      Different parts and scales of the whole pattern of existence can be undergoing different amounts of spatial change simultaneously. The parts and scales are not experiencing different rates of passage of time. Each unique time is the entire, all scale configuration of existence.

      • [deleted]

      Georgi, right, the appearance of smoothness macroscopically emerges from the granular proliferation of the quantum realm. So much so that metaphorical illustration of a continuum is subsumed by the predominant theme in physics, which all trends towards ever more of the same analysis of probabilities of the granular resulting in apparent smooth transitions.

      I'm just going the other direction. A field is not a vector probability space, classical or quantum, finite or otherwise. It is a region of continuously connected energy. A particle IS a continuum, and its physical property of density varies as a function of gravitation. It does not require a photon to transfer energy, though it can produce a projection of continuously connected energy that conventionally would be interpreted as a photon. And within a finite field, motion and translation of inertia are generally and relativistically covariant. So primary force effects between otherwise discrete fields are conditions of merging regions of energy. That doesn't mean that two electrons or two subatomic particles can exist in the same space, but at densities lower than electrostatic separation those density regions can and do meld into a consolidated magnetic and gravitational domain. So thermodynamics is also relativistic and consistent with the Cosmological Standard Model of particle physics. It doesn't reject QM, but does grant something in the way of a rational ontology for some QM results. Not least of which is that entanglement is not so spooky as Quants like it to be. Light velocity is a universal absolute, but it is so because it's the root exponential mean. In one light second 'spin characteristics' are rigidly connected across a separation of 2.143×10^14 cm. (I prefer cgs, no apologies, no more hints) jrc

        John, BTW if you post in 'Reply to this thread' rather than 'Add new post' the conversation joins up and replies don't get 'lost'.

        • [deleted]

        I know, Georgi, but I don't want to reinitiate an account to do that. No offense, but I was intending only generic comments rather than dialogue, and have pretty much spent my dollar. And I never care if anyone agrees with me or not; I'm not doing anything new except in regard to competing models, of which there is no lack. This is Theory, after all. Heck, there's more theories than there are people! But neither am I a lone wolf, more a shepherd really, and for more than half my life. So I always keep things pretty generic so as not to give away the store. I'll tell you what interested me in this article about Maxwell's Demon letting cold molecules collect. In the CMP model I like, a cold spot could be expected to develop under conditions of particle interactions, so it is conceivable that thermo-asymmetric molecules have a probability of passing as homogeneously hot molecules. And I didn't have to initiate an account to post. Best jrc

          John, you wrote "it is conceivable that thermo-asymmetric molecules have a probability of passing as homogeneously hot molecules." It's an interesting idea, like detergent molecules having hydrophilic and hydrophobic parts (I'm thinking). What keeps the gradient? Is one end more flexible, able to move more, and the other more rigid, able to move less? Are such molecules common or unusual?

          • [deleted]

          Georgi, I'm not familiar with the types of molecule you refer to, and the concept is a potential area of application, but you seem to get the idea. I can think of one example, and that would be weak hydrogen bonding in water molecules, where the two hydrogen atoms are more on one side of the oxygen atom, and so a hydrogen will become attracted towards the 'bald spot' on an oxygen in an adjacent molecule. Which suggests that a concentration of energy develops in the electrostatic range by the covalent bonding and migrates towards a center of gravity of the combined atomic masses.

          This would be a good place to explain the build-up I was getting to. The central idea of the theoretical model is that there's too much energy to exist at a smooth constant density in the universe, so it has to slow down and condense to save space. But it will do so at an exponential rate of negative acceleration, spherically. You can immediately see the problem! How do you account for the quantity of energy between the radii as density starts to stack up on itself as it slows from light velocity to form a rest mass, and still incrementalize that on any one radius? Well... in linear algebra you can't use the exponential function (e) as the index, only as the base, otherwise it would possibly extrapolate out as its own root. But on a radius in a sphere, the natural log would only express the energy on that radius, not between radii; yet any radius would have the same exponentiation, and a change in spherical volume would algebraically be non-linear. So on one radius the exponential root of light velocity would express the exponential function on every radius of the gravitation concentrating energy into density as it slows from c to rest. That provides a scale-independent proportion of c(c)^(1/e) to shrink a sphere at an average constant density to one of smoothly differentiated density, across a density range of a light velocity proportion of density difference; such as the c-proportionate difference between electric and magnetic intensity in a point charge. So each primary force can be theoretically defined as a c proportion of density difference, and you have the rudiment of a unified field. It gets messy from there but has some interesting results. And I'll put my jrc to that. Cheers
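          Taking the post's own notation at face value, the proportion c(c)^(1/e) can at least be evaluated numerically in cgs (c in cm/s). This is only an arithmetic check of the quoted figure, not an endorsement of the model; it reproduces the 2.143 (×10^14 cm) separation figure quoted earlier in the thread:

```python
import math

# Evaluate the proportion c * c**(1/e) with c in cgs units (cm/s),
# purely as an arithmetic check of the figure quoted in the thread.
c = 2.99792458e10                    # speed of light, cm/s
proportion = c * c ** (1.0 / math.e)

print(f"{proportion:.4e}")           # ~2.1430e+14
```

So the quoted 2.143×10^14 cm is numerically consistent with c·c^(1/e) in cgs, whatever physical meaning one attaches to it.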

            Hello,

            Well explained, thanks for sharing. PS: I also liked the extrapolations with the radii of spheres. :)

            Best Regards

            • [deleted]

            Hi Steve!

            I thought you would like it. It tickles me every time I try to wrap my head around it. You can see that while a sphere's radius varies by a factor of 2 its volume varies by a factor of 8, so that's a simple power law, but the exponential deceleration compounding density is nonlinear. The amount of energy required to constitute the density gets progressively less and less in ever smaller volumes as the density compounds exponentially, so a huge density value needs a minuscule amount of actual energy to be extremely effective. But the corker is that energy quantity is the hidden variable! It's there! but only expressed in the quotient and divisor; density = erg/cm^3. Energy quantity is the dividend, and only expressed as a non-dimensional point density value. (Aarghh! Where is IT!) It's there; it's about energy. But its total quantity is expressed through density. Like in GR, force is the hidden variable as the product, and expressed in the multiplier and multiplicand, mass*acceleration.
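            The cube-law scaling mentioned above is easy to verify directly (plain geometry, independent of the model):

```python
import math

# Volume of a sphere scales with the cube of the radius:
# doubling r multiplies V by 2**3 = 8.
def sphere_volume(r):
    return 4.0 / 3.0 * math.pi * r ** 3

print(sphere_volume(2.0) / sphere_volume(1.0))  # 8.0
```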

            It can be done (and was) geometrically and algebraically in Euclidean R4 space and time, but winds up forcing the issue of relativistic time dilation. It's not difficult to accept that energy will condense by slowing to rest from light velocity... sure. But then comes the ontological twist: once you are out there on the edge, where energy is at an empirical minimum density at light speed... where can it go without sucking the energy out of the field? So the model forces acceptance that light velocity is the limit of the speed of time, just like in SR. If time is going at light velocity, then the inertially bound energy need not move in space. Rather it cannot go any further than its zero boundary of minimum density, and saves space for the overabundance of energy in the universe. And in context with previous posts, the continuously connected energy in a quantum-level field equates higher density with lower temperature, while in the general gravitational reference higher density equates with higher temperature as a function of random particle motion. Kind of nice given today's announcement of the first real photograph of a black hole, where extreme density quiets particle motion down to the quantum level of high density and colder temperature.

            What a nice day! jrc