A new thread to discuss an issue suggested by Alex Rhatushnyak (thank you).
Alex's summary follows:
At many complexity levels there seem to be two fundamental classes of
information systems:
1. Elementary particles are first of all either bosons or fermions.
Bosons are generally force carrier particles, whereas fermions
are usually associated with matter.
2. The atomic nucleus versus the electron cloud.
3. Molecules of life, including RNA and DNA, versus all other, lifeless molecules.
Living creatures are able to be proactive towards their environment, whereas
lifeless matter is adaptive (reactive) to the environment rather than proactive.
4. Starting with rather simple living creatures, most organisms are either
proactive (animals) or adaptive (plants) with respect to the resources of their environment.
5. More advanced living creatures are either proactive (males) or
adaptive (females) with respect to other individuals of their species.
6. The placebo effect suggests that there are both a proactive and an adaptive agent in the psyche.
7. It's hard to look above and beyond, but it seems likely that individuals
in advanced societies are either mostly creators or mostly consumers
of informational products (decisions, innovations, virtual realities...).
Thus again, mostly either proactive or adaptive, now towards their society.
8. It also seems likely that most societies position themselves first of all as
either proactive or adaptive towards their space-time neighbors.
Supposedly this is because information systems evolve better when there are features
enabling bimodal distributions, and proactive/adaptive seems to be one such feature.
If the Mathematical Universe Hypothesis[1] is true, that is, the hypothesis that
our external physical reality is isomorphic to a huge mathematical structure (HMS),
then it seems possible that the gradual increase of the total complexity of the HMS is
what we subjectively perceive as time, and that an increase of complexity in a self-aware
core (perhaps the proactive agent of the psyche) could be essential for what we
subjectively perceive as qualia and consciousness.
Note this is descriptive (Kolmogorov) complexity, not computational complexity.
The descriptive complexity of a 10^122-bit-long sequence[5] is below 1000 bits
if the sequence is fully describable by a short pattern and a repeat count.
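As a rough, self-contained illustration of that point (mine, not part of Alex's summary): encoding a sequence as a short pattern plus a repeat count keeps the description far shorter than the sequence itself. The particular pattern, repeat count, and bit-accounting below are illustrative assumptions, not a formal Kolmogorov-complexity measure.

# Sketch: a (pattern, repeat_count) description of a ~10^122-bit sequence.
def description_length_bits(pattern: str, repeats: int) -> int:
    """Bits to store the pattern literally plus the repeat count as a binary integer."""
    return len(pattern) + max(1, repeats.bit_length())

pattern = "01"                              # a short pattern
repeats = 5 * 10**121                       # enough repeats to reach ~10^122 bits
expanded_length = len(pattern) * repeats    # length of the sequence being described

print("described sequence length ~", expanded_length, "bits")
print("description length        ~", description_length_bits(pattern, repeats), "bits")
# The description comes to roughly 400 bits (well below 1000),
# even though the described sequence is ~10^122 bits long.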
We can assume that the HMS always contains an infinite amount of "random" data, which
gradually becomes "non-random". For a finite, binary, one-dimensional illustration,
consider a set of 2^Z files such that the first N bits are the same in all files
of the set, while the last Z bits are different in each file of the set.
Then at the next moment t+tp (for this reference frame), where tp is the Planck time,
the set contains 2^(Z-x) files such that the first N+x bits are the same and the last
Z-x bits are all different. Or M groups of files (if we assume that M variants of the future
may be equally real), such that the first N+x bits are the same within each group.
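The counting in this illustration can be checked with a toy sketch; the concrete values N=4, Z=6, x=2 and the choice of which suffix bits become fixed are my own assumptions, chosen only to keep the sets small enough to enumerate.

# Sketch: a set of 2^Z "files" sharing an N-bit prefix, then one tick later
# x more bits have become "non-random", leaving 2^(Z-x) files.
from itertools import product

N, Z, x = 4, 6, 2
prefix = "1011"                     # the N bits shared by every file at time t

# Time t: 2^Z files, identical first N bits, all possible last Z bits.
files_t = {prefix + "".join(bits) for bits in product("01", repeat=Z)}

# Time t + tp: x more bits are now fixed (here to "00"), so the surviving
# files agree on their first N+x bits and differ only in the last Z-x bits.
fixed_x = "00"
files_t_plus = {f for f in files_t if f[N:N + x] == fixed_x}

print(len(files_t))        # 64 == 2**Z
print(len(files_t_plus))   # 16 == 2**(Z - x)
# Grouping by all 2^x possible values of the newly fixed bits, instead of
# picking a single value, gives the "M groups of equally real futures" variant.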