• [deleted]

Speed and position, while relative between objects, also apply within any single object in terms of the reactivity of its intrinsic properties. That reactivity we might designate as information, given concise constraints.

Einstein once retorted in argument with Bohr: "I would just like to know what an electron Is". And this despite conventional assumptions at the time that the photoelectric effect had to be physically a single whole Quantum of multiple-quanta value, because individual quanta would radiate away before the observed multiple could accumulate to liberate an electron from the sample of base metal exposed to modulated frequencies in experiment. Yet Compton had already formulated an analysis of elastic and inelastic scattering.

Elasticity is the measure of the mechanical Speed at which something can be stretched, and resilience is the inverse function, the Speed at which it will return to a relaxed state (experiment by trying to cut a tire tread and then a rubber band). Macroscopically this occurs electrostatically between molecules, while at the quantum level it is inherent to the electrostatic field of an electron. Experimentally there is nothing small enough that is electrically neutral that can be used to probe the electronic field to precisely define an inelastic core, and photons emitted or absorbed exhibit their own electromagnetic field. This is where the deductive reasoning of theory can provide means of hypothesis that can be subjected to falsification.

Assume, hypothetically, that an inelastic core exists in an electron. While Spin characteristics are intentionally designed to be geometric projections of an instant of observed measurement and not a real measure of physical rotation, that is a theoretical paradigm constraint and does not preclude a locally realistic physical rotation of a subject electron. At the zero boundary between inelastic response and elastic (however slow) response, becoming ever more elastic at greater distance from the core, a physical rotation of the core would translate axially as counter-rotating torque imparted diametrically to an emission of electromagnetic energy we witness as a photonic stream. This would account for the polarity of 'light' as the direction of angular momentum, like playing with a button on a string.

So the information sought would be what part field elasticity plays in determining frequency in relation to the rate of spin of a hypothetical core.

9 days later

Is Maxwell's demon "governed by natural law" (https://fqxi.org/community/articles/display/234)?:

1. The universe runs on "fundamental level" information:

Fundamental-level information in the universe seems to exist as natural categories e.g. mass, position and velocity (speed and direction). And information also seems to exist as the numbers that apply to these categories.

2. Maxwell's demon acquires "higher level", algorithmic information:

IF the number representing the particle's speed is greater than a cut-off number, AND IF the numbers representing the particle's extrapolated direction position the particle within the set of numbers representing the chamber door plane, THEN the demon acts to open the chamber door, letting the particle through.
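The demon's rule above can be put in a few lines of code. This is a minimal sketch only; the names (`demon_should_open`, `speed_cutoff`, `door_plane`) are illustrative, and the door plane is modeled as a toy set of grid points:

```python
# Hypothetical sketch of the demon's decision rule in step 2.
# The door plane is modeled as a set of grid points that the
# particle's extrapolated position may or may not fall within.

def demon_should_open(particle_speed, extrapolated_position,
                      speed_cutoff, door_plane):
    """IF speed > cutoff, AND IF the extrapolated position lies
    within the door plane, THEN open the door (return True)."""
    return (particle_speed > speed_cutoff
            and extrapolated_position in door_plane)

# A fast particle headed at the door is let through;
# a slow or off-target particle is not.
door = {(0, y) for y in range(5)}
print(demon_should_open(3.2, (0, 2), 2.0, door))  # True
print(demon_should_open(1.1, (0, 2), 2.0, door))  # False
```

The point of the sketch is that the rule operates entirely on higher-level true-or-false conditions built from the fundamental-level numbers, which is the distinction developed in the following points.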

3. The information that Maxwell's demon acquires is the type of higher-level algorithmic information that living things acquire. This highly-structured higher-level information is not a logical consequence of fundamental-level information; however, higher-level information is necessarily built on top of fundamental-level information.

4. But these higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for "higher-level outcomes": opening the chamber door was not a logical consequence of the laws of nature, because laws of nature do not operate on true-or-false conditions; nor was opening the chamber door a logical consequence of the true-or-false conditions themselves.

5. Maxwell's demon is not "governed by natural law".

    • [deleted]

    Working from Faraday, whose work Maxwell analysed, the field is a physical attenuation of the same energy as the portion reckoned to be 'matter'. So it would be energy density in situ, as a physical property of a particle, which would equate with temperature, whereas macroscopically temperature is associated with the motion of particulate matter. So macro-realm temperature would be 'higher order information', and energy density would equate as higher temperature = higher velocity. But at the quantum level, energy density would equate as higher density = colder temperature.

    This apparent dichotomy might be resolved if higher order information obtains from lower energy density regions of the particle field, such as that evidenced by electrical repulsivity, being the impetus of macro particle motion.

    • [deleted]

    Not to detract from Maxwell, but in unifying the electric and magnetic fields mathematically, his analytic method also segregated the full field. Where Faraday was essentially an intuitive experimentalist and made only the barest of mathematical analysis, Maxwell "associated" the electric and magnetic fields with a nondescript particle. So it must be kept in mind that more questions abound from Maxwell than are answered.

    The (very) low speed experiments of Faraday, however meticulously measured and documented, could only be mathematically extrapolated by Maxwell to converge with the light-velocity oscillations of Hertz's radio wave experiments. And experimentally it is only the transition zone of EMR which is ever actually observed.

    This is a re-write of 4. from the above post:

    4. An algorithmic relationship has the structure: IF particular situational information is true THEN cause particular outcome information to be true. The algorithmic relationship links particular information with particular outcomes, but apart from the algorithmic relationship, there is no necessary connection between the information and the outcome.

    So, if outcomes that are 100% due to fundamental-level information continue to apply, then higher-level true-or-false information has no reason to exist. Higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for "higher-level outcomes" i.e. outcomes where at least one of the numbers representing the outcome information is not a logical consequence of fundamental law of nature relationships.

    The demon's opening of the chamber door is a "higher-level outcome". I.e. at least one of the numbers that represent the opening-the-chamber-door outcome was not a logical consequence of laws of nature. In any case, law of nature relationships are based on categories of information and numbers, they are not based on true-or-false conditions.

    (continued from the above post)

    Re "apart from the algorithmic relationship, there is no necessary connection between the information and the outcome":

    The demon's opening of the chamber door is an outcome that was not a logical consequence of the true-or-false information. Any outcome could have been linked to the true or false information e.g. buy a red hat, or wear the blue socks.
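    That arbitrariness of linkage can be made concrete in a few lines. This is a minimal sketch with illustrative outcome names only (including the red hat and blue socks from the example above):

```python
# Sketch of an algorithmic relationship: IF the situational
# information is true THEN cause the linked outcome. The linkage
# itself is arbitrary; nothing in the condition selects the outcome.

def run_algorithm(condition, outcome_if_true):
    return outcome_if_true() if condition else None

open_door = lambda: "chamber door opened"
buy_red_hat = lambda: "bought a red hat"

# The identical true-or-false information supports either linkage:
print(run_algorithm(True, open_door))    # chamber door opened
print(run_algorithm(True, buy_red_hat))  # bought a red hat
```

    The condition evaluates the same either way; only the externally supplied linkage determines which outcome follows, which is the sense in which the outcome is not a logical consequence of the true-or-false information.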

    • [deleted]

    "Only those who attempt the absurd will achieve the impossible. I think it's in my basement... let me go upstairs and check." - Maurits Cornelis Escher

    On the quantum-probability, infinite two-dimensional complex plane, M. C. Escher could construct a perpetual waterfall or a continuously ascending and descending staircase. Perhaps choice of geometries can therefore allow Maxwell's Demon uninformed results. What would his friend, Roger Penrose, think?

    • [deleted]

    Temperature, like mass, suffers a lack of general definition in theoretical terms. Both are treated operationally. Yet thermodynamics must become generalized with both Relativistic and Quantum mechanics to progress beyond the current assemblage that is the Concordance of the cosmological standard model. In short, quantization needs Canons of general terms.

    Consider Maxwell's convergence and divergence functions, curl and div. These commonly apply to the coefficients of permeability (mu) and permittivity (epsilon) of a field associated with a mass. Mass : energy equivalence provides no law of proportionality to prescribe an upper density bound for any quantity of energy, and so attempts to determine a finite field fall into a mathematical singularity.

    In analysis, permeability and permittivity are separate operations, yet in reality both would physically coexist. Both limit to a light-velocity proportion in free space, but operate under opposite signs. The product would therefore be (-c^2), and to be consistent with mass : energy equivalence a postulate of proportionate density should also be quadratic. Thus a hypothetical upper density bound could be prescribed as:

    lim = mass (mu * epsilon)^2 = mc^4

    This does not in itself define what portion of the total mass must exist at that upper density, but that proportionate density would prescribe a finite quantity if a finite volume were derived as existing at a constant density. A differentiated quantity of energy would not need to become any more dense to become inertially bound as a finite field. And within that constant-density volume, temperature could equate to relativistic rate of response.
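    For comparison, the conventional free-space relation is a standard result, stated here only as a reference point for the speculative bound above:

```latex
% Standard relation between light speed and the free-space constants:
c = \frac{1}{\sqrt{\mu_0 \, \varepsilon_0}}
\quad\Longrightarrow\quad
\frac{1}{(\mu_0 \, \varepsilon_0)^{2}} = c^{4}
```

    Note that in SI units the product mu_0 * epsilon_0 is 1/c^2 and positive, so the (-c^2) product and the quadratic density postulate above are deliberate departures from the conventional formalism, not consequences of it.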

    • [deleted]

    Information, in the modern theoretical sense, is the attempt to formalize what has long been the colloquial ambiguity, "how does 'it' know to do X". Mathematically this is confronted by the conundrum of discrete vs. continuous.

    As adults we are accustomed to a working knowledge of water being comprised of discrete molecules electrostatically bound as a viscous fluid. We lose that childlike facility to be fascinated by the seemingly seamless quality of a shimmering slender stream running from a faucet that breaks into rivulets and drops as we curiously poke it with our finger. We begin our quantum discretion with the wise question, "How does the water 'know' how to do that?" How does what appears seamless become differentiated? Where does the break in symmetry occur? It becomes easier to simply start with things being made of pieces that cling to each other than to retain the fascination of a continuum and seek how the stream can no longer 'know' itself as a continuous stream. The knowing *itself* is information that is simply connected. Simply means to not over-think it. Pieces knowing each other, or of each other, is information that is complexly connected.

    How does light know what the rate of passage of time is? Given that light velocity is accepted as a universal absolute, light could be simply connected with the highest limit of the rate of passage of time, at least in the one dimension of its linear propagation. But we can only assume that the rate of passage of time anywhere is somewhere between nil and light velocity. So light would be complexly connected, in the two other orthogonal spatial dimensions, to the rate of time. So it would be quite possible for light to consist of an elastic light-velocity particle and also a one-dimensional light-velocity interval where the information of the rate of time is physically sought laterally, and that relativistic effect is what registers to an observing system as a wave.

      • [deleted]

      ...And a tip of my hat to Thomas H. Ray for his theoretic examination of One Dimensional Solitons.

      Anonymous, could you please put your name or initials at the end of your post if you are not signing in, as there may be several Anonymous-es posting and it would be good not to be unsure of who the posts belong to.

      • [deleted]

      Hi Georgi, it's been me, jrc, hogging the bandwidth. I had to get a new cheap laptop that is so overloaded by the OS that I don't use it for anything that requires 'creating an account'. Tough enough to keep a clean machine.

      I thought of you while composing the point in argument that we can only assume the rate of passage of time anywhere is somewhere between nil and light velocity. Einstein's aphorism that 'time stops at light velocity' is provocative, but neglects the obvious point that it would only be so in relation to light velocity being the upper limit to the rate of passage of time. But that is all within the very limiting constraints of SR. In GR, temperature can be generalized to higher energy density = higher gravitation = lower temperature = slower passage of time. The challenge is to formalize that at the quantum level, and GR treats a region of constant density as an averaged mass density rather than a proportional mass density.
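      For reference, the 'slower passage of time with higher gravitation' is, in standard GR (the static Schwarzschild exterior), the familiar dilation factor:

```latex
% Gravitational time dilation for a static observer at radius r
% outside a mass M (Schwarzschild solution):
\frac{d\tau}{dt} = \sqrt{1 - \frac{2GM}{r c^{2}}}
```

      A larger M/r gives a smaller factor, i.e. slower local passage of time; generalizing that from an averaged mass density to a quantum-level proportional mass density is the open step described above.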

      Condensed Matter Physics, which is by far the largest generic discipline in today's practicing physics community, recognizes this. Yet there remains the conspicuous absence of consensus on a definitive formulation of a particle of condensed energy which meets conventional standards, not least of which would be distribution in accord with the inverse square law.

      John, I'll try not to overthink this. Doesn't the question 'how does it know?' come with the prior assumption that it knows? But I don't think it knows; to have knowledge seems to require some temporarily persistent structure or organisation independent of the phenomenon 'known'. The water is simply taking the path of least resistance around rather than through the finger. Isn't asking how it knows to do that a bit like asking how the battery knows when the distant switch is open?

      There is something odd about asking the rate of passage of time, let alone how light knows it. (Jumping back to what it means to know something, rather than just acting as the physical circumstances 'prescribe'.) To measure the rate requires that what is used for comparison is already fixed, yet it is also the thing to be measured.

      • [deleted]

      Georgi,

      Yes, 'knowing' is a colloquialism. It's really about what we don't know, which is the greater part of anything we do seek to understand. So it is quite common and getting worse in the hubris of the Information Age of super-connectivity. Which is why the logical constraint of Rob McEachern, that information is a purely mathematical concept, is the better guide.

      And yes, asking what the rate of passage of time might be is rather odd. As it is to ask what might be meant by connectivity. But in the here and now, these are time-critical questions. In 2017 the Chinese achieved what they later publicly announced as "a significant sequence of singlet pair production" and successfully transmitted a video via quantum key encryption between their research facility near Beijing and their partner lab in Vienna, Austria. At the time of President Trump's state visit to China in 2017, it was stunningly obvious that his body language had deserted him in making the usual political platitudes following the initial meet-and-greet exchange of portfolios. Everyone in the U.S. entourage, with the exception of Gen. Mattis, ret., looked like someone who goes to a party, encounters something everybody seems to know, and thinks everyone is playing a trick on them. News commentators seemed to conclude that it was simply a lack of personal preparedness on the part of an inexperienced President, but it was apparent to seasoned observers of geopolitics that the U.S. strategic estimate going in had been caught flat-footed.

      So it isn't one's preference of paradigm that matters. It is a matter of who gets it to work with the infrastructure to support it, who then corners the global market on electronic transfer of funds and holds the algorithmic keys to pick winners and losers. And it doesn't matter who I am, in what I might say here; it's just another grain of sand in the data mine.

        John, emergence of effects can happen at the macroscopic scale that aren't a property of the constituents at the atomic or subatomic scale. Surface tension is one such effect of water. Also, whether the water passes through or around depends upon the material of the finger. A foam finger will fill with water and then it will pass through; not so for a flesh finger. The material's structure is a macroscopic arrangement of matter, not a part of the existence of an atomic or subatomic particle alone. Though relevant, if thinking about a particle rather than a continuous model, is the fact that fermion particles cannot occupy the same space but must occupy their own space. You say "the water appears seamless". Appearance is something else from the water, an observation product. Formed using input of EM radiation, the product does not consist of water molecules.

        John, I agree that the states prior to measurement, that is the information purportedly carried by the 'entangled' particles, are a mathematical model. The particles do not know their test outcome states, which don't exist until the tests are carried out. As I understand it, the entangled pairs can be used to tell whether there has been tampering, an attempt to look at an encrypted signal. Which isn't like a super padlock that will keep intrusion out, but just lets on that it has happened. Like a hair stuck to a door and door frame. It is a game changer for covert espionage. Which wouldn't be a problem if everyone had their cards on the table and was working at mutual cooperation, rather than winner takes all. I haven't seen the footage with the body language you mention. Maybe there was some trepidation about possibly being caught with a hand in the cookie jar. Who knows? Not me.

        Prior to the question of what is the rate of passage of time comes the question 'what is passage of time?' For a human observer it would be the updating of the experienced present, which is usually about 0.1 s. But MIT have found the shortest duration of an image identified, at better than chance alone would give, is 13 ms. So if seen (and recalled) it has only taken 13 ms for the update. If passage of time is considered instead to be the change in configuration of all that exists, the smallest conceivable change in location (of some thing or phenomenon) and the fastest motion ought to be considered. The smaller the scale, the more change is happening, presumably until, beyond the particulate nature of matter, reaching the limit where there is no longer differentiation whereby change can be identified. Rate of change is not uniform, but time is, as there is just the one configuration.

        My last point wasn't very clear. At larger scales change tends to happen more slowly, making it less clear when the configuration is different, as on the larger scale the change that has happened on the smaller scale may not be discernible.

        Different parts and scales of the whole pattern of existence can be undergoing different amounts of spatial change simultaneously. The parts and scales are not experiencing different rates of passage of time. Each unique time is the entire, all scale configuration of existence.

        • [deleted]

        Georgi, right, the appearance of smoothness macroscopically emerges from the granular proliferation of the quantum realm. So much so that metaphorical illustration of a continuum is subsumed by the predominant theme in physics, which all trends towards ever more of the same analysis of probabilities of the granular resulting in apparently smooth transitions.

        I'm just going the other direction. A field is not a vector probability space, classical or quantum, finite or otherwise. It is a region of continuously connected energy. A particle IS a continuum, and its physical property of density varies as a function of gravitation. It does not require a photon to transfer energy, though it can produce a projection of continuously connected energy that conventionally would be interpreted as a photon. And within a finite field, motion and translation of inertia are generally and relativistically covariant. So primary force effects between otherwise discrete fields are conditions of merging regions of energy. That doesn't mean that two electrons or two subatomic particles can exist in the same space, but at densities lower than electrostatic separation those density regions can and do meld into a consolidated magnetic and gravitational domain. So thermodynamics are also relativistic and consistent with the Cosmological Standard Model of particle physics. It doesn't reject QM, but does grant something in the way of a rational ontology for some QM results. Not least of which is that entanglement is not so spooky as Quants like it to be. Light velocity is a universal absolute, but it is so because it's the root exponential mean. In one light second, 'spin characteristics' are rigidly connected across a separation of 2.143^14 cm. (I prefer cgs, no apologies, no more hints) jrc

          John, BTW if you post in 'Reply to this thread' rather than 'Add new post' the conversation joins up and replies don't get 'lost'.