To be accurate, I ought to say the value output can be a source of knowable information. To be knowable it must be in a form accessible to the senses: most usually visual or auditory potential stimuli, emitted or reflected from the 'read out', or emitted 'sound waves'. The beable read-out of an un-illuminated apparatus is not (generally speaking) knowable, though it might be felt if the numbers are raised or indented.

Re: "The framework, which is being developed by physicist Benjamin Schumacher... and mathematician Michael Westmoreland... put information at the center--along with a hypothetical, microscopic observer known as Maxwell's demon. ...Schumacher and Westmoreland ...were interested in how information theory could help illuminate quantum mechanics... I started wondering if information is more fundamental than probabilities":

In the real world, fundamental-level information cannot exist as binary digits (true/false, 1/0, yes/no or on/off), because a string of these symbols can have no inherent context or meaning, and no inherent relationship to any other such string of binary digits.

In the real universe, fundamental-level information seems to exist as categories: mass, position and velocity (speed and direction) seem to be natural categories, where every category has context and meaning because it is built out of relationships between other, seemingly pre-existing, categories. Categories seem to exist as part of a network of relationships, relationships that we represent with mathematical symbols.

Information also seems to exist as (what we would represent as) numbers. Seemingly categories must have pre-dated numbers in the universe because: 1) categories/relationships can seemingly exist without numbers being applied to them; 2) the number one can be constructed by dividing a category by itself, and (rational) numbers can be constructed using the number one; and 3) standalone numbers have no inherent context unless they are applied to a category.

Information can also be built out of (what we would represent as) algorithms. So, Maxwell's demon 1) does an algorithmic analysis of the particle's velocity, and 2) acts on the results of that analysis. If the number representing the particle speed is greater than a cut-off number, and if the particle's direction is towards the chamber door, then the demon opens the door to let the particle through. To put it another way: If condition 1 is true, and condition 2 is true, then action is true. The true/false binary digit information only exists in the context of the existing category/relationship information, the existing number information, and the algorithmic question asked about that information.
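The two-condition structure described above can be sketched as code. This is an illustration only: the cut-off value and the function names are assumptions for the sketch, not part of Maxwell's original thought experiment.

```python
# A minimal sketch of the demon's decision algorithm described above.
# The cut-off speed (500.0) and the parameter names are illustrative
# assumptions, not numbers from the thought experiment itself.

def demon_decides(speed: float, towards_door: bool,
                  cutoff: float = 500.0) -> bool:
    """Return True (open the door) only if both conditions hold."""
    condition_1 = speed > cutoff   # is this a high-energy molecule?
    condition_2 = towards_door     # is it heading for the trap door?
    return condition_1 and condition_2

# The true/false outputs exist only relative to the category (speed),
# the numbers applied to it, and the question asked about them.
print(demon_decides(620.0, True))   # -> True: fast and on target
print(demon_decides(620.0, False))  # -> False: fast but off target
```

The point of the sketch is that the binary digit (the returned `True`/`False`) only arises once categories, numbers, and an algorithmic question about them are already in place.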

So, the information that the hypothetical Maxwell's demon acquires is highly sophisticated: it is not fundamental-level information.

    (continued from the above post)

    Re "there is a trap door in the partition operated by a tiny being [Maxwell's demon] ...If the being saw a high-energy molecule approaching the partition from, say, the left half of the box, it could briefly open the trap door to let that molecule pass through to the right side. And likewise, it could let low energy molecules pass through from right to left":

    The door open/closed outcome was not random.

    And the door open/closed outcome was not necessary, i.e. it was not necessitated/caused by any law of nature relationship (a law of nature relationship is represented as an equation).

    The not-random, not-necessary outcome was due to (what we would represent as) an algorithm.

    But there is no way you can derive an algorithm from an equation i.e. there is no way that nature can evolve (what we would represent as) an algorithm from (what we would represent as) an equation.

    So, in any universe that included Maxwell's demon, (what we would represent as) algorithms must be an inherent part of the nature of that universe.

    • [deleted]

    "(what we would represent as) algorithms must be an inherent part of the nature of the universe." L. Ford

    This is a clear (and fine) distinction between math and its assignment for analysis. There may be something we would call a 'Math' that would adequately describe a physical phenomenon of space, time and energy in unity; such that a quantity of energy would naturally assume a preferred volume and shape in a universe with an abundance of energy, and thus determine the extremely limited number of sub-atomic particle species currently identified by the Standard Model.

    From a perspective of Topology, wherein an object is defined in relation to its own constituent reference points independent of an external reference, the Ford/Woodward dialogue seems to be wrestling with a challenge of establishing a convention of terminology in strictly limiting definition to an If and Only If constraint.

    (continued from the above post)

    Re "Or to put it another way, the being [Maxwell's demon] could cause heat to spontaneously flow from cold to hot--a violation of the Second Law of Thermodynamics...Maxwell left this paradox to later generations of physicists as a kind of homework assignment: Where was the flaw in this thought experiment? What would keep Maxwell's 'demon', as other physicists took to calling it, from violating the second law? Did the demon's ability to observe, think, and act change the fundamental physics in some way? Or was its 'intelligence' still governed by natural law?":

    As explained in the above 2 posts: 1) Maxwell's demon possesses sophisticated contextual high-level algorithmic information; and 2) Maxwell's demon possesses the ability to create non-random, non-deterministic (i.e. not determined by any laws of nature) outcomes.

    The possession of contextual high-level information, plus the ability to create required outcomes is what can "violat[e]...the Second Law of Thermodynamics".

    "[T]he demon's ability to observe, think, and act" DOESN'T "change the fundamental physics" IF contextual algorithmic information, and the ability to act independent of the laws of nature, is already part of the physics of the universe.

    Maxwell's demon has free will. The demon acquires the following information, derived from observation of a particle, as a basis for action:

    1) the speed and direction (velocity) categories [1]; and 2) the numbers that apply to these categories.

    Depending on this information, the demon opens the trap door. The demon acts, causing outcomes that are:

    1) not random; and 2) not determined by laws of nature.

    The above process can be represented as an algorithm [2]. I.e. the structure of free will is represented by algorithms:

    1) the algorithms are not necessitated by laws of nature or numbers that apply to e.g. speed or direction; 2) the algorithms represent the acquisition of information about information [3]; 3) the algorithms represent causal factors that are independent of laws of nature; and 4) the algorithms could be one-off, or "physically locked in".

    ..........

    1. Categories are essentially transposed law of nature relationships.

    2. See "Lorraine Ford wrote on Mar. 16, 2019 @ 21:37 GMT"

    3. The "lower level" information is the numbers pertaining to the speed and direction of the particle; the "higher level" information is whether these numbers are greater than, or less than, some reference numbers.

    • [deleted]

    Speed and position, while being relative between objects, also apply within any single object in terms of the reactivity of its intrinsic properties; and that we might designate as information, given concise constraints.

    Einstein once retorted in argument with Bohr: "I would just like to know what an electron Is". And this despite conventional assumptions at the time that the photoelectric effect had to be physically a single whole Quantum of multiple quanta value, because individual quanta would radiate away before the observed multiple could accumulate to liberate an electron from the sample of base metal exposed to modulated frequencies in experiment. Yet Compton had established a formulated analysis of elastic and inelastic scattering.

    Elasticity is the measure of mechanical Speed at which something can be stretched, and resilience is the inverse function of the Speed at which it will return to a relaxed state (experiment by trying to cut a tire tread and then a rubber band). Macroscopically this occurs electrostatically between molecules, while at the quantum level it is inherent to the electrostatic field of an electron. Experimentally there is nothing small enough that is electrically neutral that can be used to probe the electronic field to precisely define an inelastic core, and photons emitted or absorbed exhibit their own electromagnetic field. This is where the deductive reasoning of theory can provide means of hypothesis that can be subjected to falsification.

    Assume, hypothetically, that an inelastic core exists in an electron. While Spin characteristics are intentionally designed to be geometric projections of an instant of observed measurement and not a real measure of physical rotation, that is a theoretical paradigm constraint and does not preclude a local realistic physical rotation of a subject electron. At the zero boundary between inelastic response and elastic (however slow) response, becoming ever more elastic at greater distance from the core; a physical rotation of the core would translate axially as counter-rotating torque imparted diametrically to an emission of electromagnetic energy we witness as a photonic stream, accounting for polarity of 'light' as direction of angular momentum, like playing with a button on a string.

    So the information sought would be what part field elasticity plays in determining frequency in relation to rate of spin of a hypothetical core.

    9 days later

    Is Maxwell's demon "governed by natural law" (https://fqxi.org/community/articles/display/234)?:

    1. The universe runs on "fundamental level" information:

    Fundamental-level information in the universe seems to exist as natural categories e.g. mass, position and velocity (speed and direction). And information also seems to exist as the numbers that apply to these categories.

    2. Maxwell's demon acquires "higher level", algorithmic information:

    IF the number representing the particle's speed is greater than a cut-off number, AND IF the numbers representing the particle's extrapolated direction position the particle within the set of numbers representing the chamber-door plane, THEN the demon acts to open the chamber door, letting the particle through.
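    The IF-AND-IF-THEN structure above can be sketched as code. This is a hedged illustration: the partition plane at x = 0, the door's extent on the y-axis, and the cut-off speed are all assumed numbers introduced for the sketch.

```python
# Hedged sketch: extrapolate the particle's straight-line path to an
# assumed partition plane (x = 0) and test whether the crossing point
# lies within the door's assumed extent. All numbers are illustrative.

def crosses_door(pos, vel, door_y=(-1.0, 1.0), cutoff_speed=500.0):
    """True only if the particle is fast enough AND its extrapolated
    trajectory passes through the door region of the partition."""
    x, y = pos
    vx, vy = vel
    speed = (vx**2 + vy**2) ** 0.5
    if speed <= cutoff_speed or vx == 0:
        return False                 # too slow, or never reaches the plane
    t = -x / vx                      # time to reach the plane x = 0
    if t <= 0:
        return False                 # moving away from the partition
    y_at_plane = y + vy * t          # extrapolated crossing point
    return door_y[0] <= y_at_plane <= door_y[1]
```

    As in point 2 above, the demon's true-or-false answer is higher-level information computed from the fundamental-level categories (position, velocity) and the numbers applying to them.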

    3. The information that Maxwell's demon acquires is the type of higher-level algorithmic information that living things acquire. This highly-structured higher-level information is not a logical consequence of fundamental-level information; however, higher-level information is necessarily built on top of fundamental-level information.

    4. But these higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for "higher-level outcomes": opening the chamber door was not a logical consequence of laws of nature because laws of nature do not operate on true-or-false conditions; also, opening the chamber door was not a logical consequence of the true-or-false conditions.

    5. Maxwell's demon is not "governed by natural law".

      • [deleted]

      Working from Faraday whom Maxwell analysed, the field is a physical attenuation of the same energy as the portion reckoned to be 'matter'. So it would be energy density in situ as a physical property of a particle which would equate with temperature, whereas macroscopically temperature is associated with motion of particulate matter. So macro realm temperature would be 'higher order information' and energy density would equate as higher temperature = higher velocity. But at the quantum level, energy density would equate as higher density = colder temperature.

      This apparent dichotomy might be resolved if higher order information obtains from lower energy density regions of the particle field, such as that evidenced by electrical repulsivity, being the impetus of macro particle motion.

      • [deleted]

      Not to detract from Maxwell, but in unifying the electric and magnetic fields mathematically his analytic method also segregated the full field. Where Faraday was essentially an intuitive experimentalist and made only the barest of mathematic analysis, Maxwell "associated" the electric and magnetic fields with a nondescript particle. So it must be kept in mind that more questions abound from Maxwell than are answered.

      The (very) low speed experiments of Faraday, however meticulously measured and documented, could only be mathematically extrapolated by Maxwell to converge with the light velocity oscillations of Hertz' radio wave experiments. And experimentally it is only the transition zone of EMR which is ever actually observed.

      This is a re-write of 4. from the above post:

      4. An algorithmic relationship has the structure: IF particular situational information is true THEN cause particular outcome information to be true. The algorithmic relationship links particular information with particular outcomes, but apart from the algorithmic relationship, there is no necessary connection between the information and the outcome.

      So, if outcomes that are 100% due to fundamental-level information continue to apply, then higher-level true-or-false information has no reason to exist. Higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for "higher-level outcomes" i.e. outcomes where at least one of the numbers representing the outcome information is not a logical consequence of fundamental law of nature relationships.

      The demon's opening of the chamber door is a "higher-level outcome". I.e. at least one of the numbers that represent the opening-the-chamber-door outcome was not a logical consequence of laws of nature. In any case, law of nature relationships are based on categories of information and numbers, they are not based on true-or-false conditions.

      (continued from the above post)

      Re "apart from the algorithmic relationship, there is no necessary connection between the information and the outcome":

      The demon's opening of the chamber door is an outcome that was not a logical consequence of the true-or-false information. Any outcome could have been linked to the true or false information e.g. buy a red hat, or wear the blue socks.
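      The arbitrariness of that linkage can be sketched as code: the same true-or-false information can be bound to opening the door, buying a red hat, or any other action, and nothing outside the algorithmic relationship itself connects condition to outcome. All names below are illustrative assumptions.

```python
# Sketch of the claim above: an algorithmic relationship is the ONLY
# link between a true/false condition and an outcome; the same
# condition can be bound to any outcome whatsoever.

def link(condition_fn, action_fn):
    """Return a rule: perform the action only when the condition is true."""
    def rule(observation):
        if condition_fn(observation):
            return action_fn()
        return None
    return rule

fast = lambda speed: speed > 500.0          # the true/false information
open_door = lambda: "open the chamber door"
buy_hat = lambda: "buy a red hat"           # an equally valid linkage

rule_a = link(fast, open_door)
rule_b = link(fast, buy_hat)
print(rule_a(620.0))  # -> open the chamber door
print(rule_b(620.0))  # -> buy a red hat
```

      Both rules consume identical information; only the algorithmic relationship determines which outcome follows.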

      • [deleted]

      "Only those who attempt the absurd will achieve the impossible. I think its in my basement... let me go upstairs and check." - Maurits Cornelius Escher

      On the Quantum probability, infinite two dimensional complex plane, M. C. Escher could construct a perpetual waterfall or a continuous ascending and descending staircase. Perhaps choice of geometries can therefore allow Maxwell's Demon uninformed results. What would his friend, Roger Penrose think?

      • [deleted]

      Temperature, like mass, suffers a lack of general definition in theoretical terms. Both are treated operationally. Yet thermodynamics must become generalized with both Relativistic and Quantum mechanics to progress beyond the current assemblage that is the Concordance of the cosmological standard model. In short, quantization needs Canons of general terms.

      Consider Maxwell's convergence and divergence functions, curl and div. These commonly apply to coefficients of permeability (mu) and permittivity (epsilon) of a field associated with a mass. Mass : energy equivalence provides no law of proportionality to prescribe an upper density bound for any quantity of energy and so attempts to determine a finite field fall into a mathematic singularity.

      In analysis, permeability and permittivity are separate operations, yet in reality both would physically coexist. Both limit to a light velocity proportion in free space, but operate under opposite signs. The product would therefore be (-c^2), and to be consistent with mass : energy equivalence a postulate of proportionate density should also be quadratic. Thus a hypothetical upper density bound could be prescribed as;

      lim = mass (mu * epsilon)^2 = mc^4

      This does not in itself define what portion of the total mass must exist at that upper density, but that proportionate density would prescribe a finite quantity if a finite volume were derived as existent at a constant density. A differentiated quantity of energy would not need to become any more dense to become inertially bound as a finite field. And within that constant density volume temperature could equate to relativistic rate of response.

      • [deleted]

      Information, in the modern theoretical sense, is the attempt to formalize what has long been the colloquial ambiguity, "how does 'it' know to do X?". Mathematically this is confronted by the conundrum of discrete vs. continuous.

      As adults we are accustomed to a working knowledge of water being comprised of discrete molecules electrostatically bound as a viscous fluid. We lose that childlike facility to be fascinated by the seemingly seamless quality of a shimmering slender stream running from a faucet that breaks into rivulets and drops as we curiously poke it with our finger. We begin our quantum discretion with the wise question, "How does the water 'know' how to do that?" How does what appears seamless become differentiated? Where does the break in symmetry occur? It becomes easier to simply start with things being made of pieces that cling to each other, than to retain the fascination of a continuum and seek how the stream can no longer 'know' itself as a continuous stream. The knowing *itself* is information that is simply connected. Simply, means to not over-think it. Pieces knowing each other, or of each other, is information that is complexly connected.

      How does light know what the rate of passage of time is? Given light velocity is accepted as a universal absolute, light could be simply connected with the highest limit of rate of passage of time, at least in the one dimension of its linear propagation. But we can only assume that the rate of passage of time anywhere is somewhere between nil and light velocity. So light would be complexly connected in the two other orthogonal spatial dimensions to the rate of time. So it would be quite possible for light to consist of an elastic light velocity particle and also a one dimensional light velocity interval where the information of rate of time is physically sought laterally, and that relativistic effect is what registers with an observing system as a wave.

        • [deleted]

        ...And a tip of my hat to Thomas H. Ray for his theoretic examination of One Dimensional Solitons.

        Anonymous, could you please put your name or initials at the end of your post if you are not signing in, as there may be several Anonymous-es posting and it would be good not to be unsure of who the posts belong to.

        • [deleted]

        Hi Georgi, it's been me, jrc, hogging the bandwidth. I had to get a new cheap laptop that is so overloaded by the OS that I don't use it for anything that requires 'creating an account'. Tough enough to keep a clean machine.

        I thought of you while composing the point in argument that we can only assume the rate of passage of time anywhere is somewhere between nil and light velocity. Einstein's aphorism that 'time stops at light velocity' is provocative but neglects the obvious point that it would only be so in relation to light velocity being the upper limit to the rate of passage of time. But that is all within the very limiting constraints of SR. In GR, temperature can be generalized to higher energy density = higher gravitation = lower temperature = slower passage of time. The challenge is to formalize that at the quantum level, and GR treats a region of constant density as an averaged mass density rather than a proportional mass density.

        Condensed Matter Physics, which is by far the largest generic discipline in today's practicing physics community, recognizes this. Yet there remains the conspicuous absence of consensus on a definitive formulation of a particle of condensed energy which meets conventional standards, not least of which would be distribution in accord with the inverse square law.

        John, I'll try not to overthink this. Doesn't the question 'how does it know?', come with the prior assumption that it knows? But I don't think it knows; to have knowledge seems to require some temporarily persistent structure or organisation independent of the phenomenon 'known'. The water is simply taking the path of least resistance around rather than through the finger. Isn't asking how it knows to do that a bit like asking how the battery knows when the distant switch is open?

        There is something odd about asking the rate of passage of time, let alone how light knows it. (Jumping back to what it means to know something, rather than just acting as the physical circumstances 'prescribe'.) To measure the rate requires that what is used for comparison is already fixed, yet it is also the thing to be measured.

        • [deleted]

        Georgi,

        Yes, 'knowing' is a colloquialism. It's really about what we don't know, which is the greater part of anything we do seek to understand. So it is quite common and getting worse in the hubris of the Information Age of super-connectivity. Which is why the logical constraint of Rob McEachern, that information is a purely mathematical concept, is the better guide.

        And yes, asking what the rate of passage of time might be, is rather odd. As it is to ask what might be meant by connectivity. But in the here right now, these are time critical questions. In 2017 the Chinese achieved what they later publicly announced as "a significant sequence of singlet pair production" and successfully transmitted a video via quantum key encryption between their research facility near Beijing and their partner lab in Vienna, Austria. At the time of President Trump's State visit to China in 2017, it was stunningly obvious that his body language had deserted him in making the usual political platitudes following the initial meet and greet exchange of portfolios. Everyone in the U.S. entourage, with the exception of Gen. Mattis (ret.), looked like someone who goes to a party and encounters something everybody seems to know, and thinks everyone is playing a trick on them. News commentators seemed to conclude that it was simply a lack of personal preparedness on the part of an inexperienced President, but it was apparent to seasoned observers of geopolitics that the U.S. strategic estimate going in had been caught flat-footed.

        So it isn't one's preference of paradigm that matters. It is a matter of who gets it to work with the infrastructure to support it, who then corners the global market on electronic transfer of funds and holds the algorithmic keys to pick winners and losers. And it doesn't matter who I am, in what I might say here; it's just another grain of sand in the data mine.