Hi Ines,
So first of all, you have a lovely sense of language, and if English is not your first language, that is all the more impressive.
The basics of thermodynamics and information theory we clearly agree on, and you do a great job with Maxwell's demon. I might steal some of your phrases the next time I teach it!
OK, and then you go and say something truly interesting and, in retrospect, obviously true, just in a way that had never occurred to me before. Of course I knew that metabolism involves reducing entropy by exporting it to the environment, but I had never taken the additional step of noticing that this reverses our standard view of the relationship between information gains and entropy losses. The second I read that, I started having lots of thoughts about how our minds operate, many of which you also discuss, because they follow so naturally from that simple point.
Would you agree with this heuristic assessment: it's as though open systems can gain information, paradoxically(?), by ignoring things. Abstracting, coarse-graining, tossing out details, assuming spherical cows, and all the other approximations we make are information losses in one sense. But what we are really doing is discarding detail so that our internal possibilities are reduced, which raises the probabilities of certain 'rare' states in our own phase space and thus constitutes an information gain for ourselves. Animals "pay attention" to the world, but to be good at this they must also actively ignore an immense amount of information available in their environment. For example, a species in the jungle hears the mating calls of its fellow creatures like a siren above the din of all the other noises that other species are using to transmit information through the same environment.
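Just to convince myself this was more than a slogan, I sketched a toy calculation (the microstates and probabilities here are entirely made up, and the coarse-graining rule is arbitrary): lumping fine-grained states into macrostates always produces a distribution with lower (or equal) Shannon entropy, which is the "information gain by ignoring" in miniature.

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy in bits of a distribution given as {state: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# A made-up fine-grained distribution over eight equally likely microstates.
fine = {f"s{i}": 1 / 8 for i in range(8)}

def coarse_grain(p, label):
    """Lump microstates into macrostates by ignoring the detail `label` discards."""
    coarse = defaultdict(float)
    for state, q in p.items():
        coarse[label(state)] += q
    return dict(coarse)

# Toss out everything except whether the state index is below 6.
coarse = coarse_grain(fine, lambda s: "A" if int(s[1:]) < 6 else "B")

print(entropy(fine))    # 3.0 bits over the microstates
print(entropy(coarse))  # about 0.811 bits: ignoring detail lowered internal entropy
```

The coarse description has less entropy not because the world changed, but because the system's own state space shrank, which I think is exactly your point.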
I always thought of that as a result of the greater mutual information between members of the same species, and that is another way to see it, one which should be mathematically identical. But I like this thought very much: that it is also a way for an open system to reduce its entropy by selectively ignoring the much larger amount of meaningless information pervading its environment. In a sense, the ability to hear the mating call above the rest of the noise is equivalently the ability to ignore the "noise." Actually, even in calling it noise I have already done that: it is all information, the "signals" and the "noise" alike, so differentiating signals from noise is how open systems reduce their entropy.
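To check that the two pictures really do coincide, here is a second toy sketch (again with invented numbers): a "tuned" listener whose response tracks the rare mating call has high mutual information with the sound source, while an "indiscriminate" listener that reacts to everything has none, even though it processes far more of the environment.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits for a joint distribution given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Hypothetical numbers: the call occurs 10% of the time amid ambient noise.
# A tuned listener responds almost only to the call:
tuned = {("call", "respond"): 0.09, ("call", "ignore"): 0.01,
         ("noise", "respond"): 0.01, ("noise", "ignore"): 0.89}
# An indiscriminate listener responds half the time regardless of source:
indiscriminate = {("call", "respond"): 0.05, ("call", "ignore"): 0.05,
                  ("noise", "respond"): 0.45, ("noise", "ignore"): 0.45}

print(mutual_information(tuned))          # positive: filtering creates information
print(mutual_information(indiscriminate)) # 0.0: no filter, no shared information
```

So the mutual-information view and the "entropy reduction by selective ignoring" view give the same verdict here: what the tuned listener gains, it gains precisely by not responding to the rest.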
Is this a fair characterization of your point about "observers"? And finally, I think you would agree with this, but some of the language in your essay left me wondering: anything that can perform a "measurement" is an observer, right? As long as molecular recognition, driven by nothing but thermal noise and structural mutual information, counts as an act of "observation" just as much as my reading of your writing does, then I agree with literally everything you say.
Great work!
Joe