Hector,
Regarding the "Unreasonable Effectiveness of Mathematics": in an earlier post under Matt Visser's essay (and repeated somewhere under my own), I wrote:
"In your summary, you ask "Exactly which particular aspect of mathematics is it that is so unreasonably effective?" in describing empirical reality.
I would argue that this is not an aspect of mathematics at all, but rather an aspect of physics. Specifically, some physical phenomena are virtually devoid of information. That is, they can be completely described by a small number of symbols, such as mathematical symbols. Physics has merely "cherry-picked" these sparse-information-content phenomena as its subject matter, and left the job of describing high-information-content phenomena to the other sciences. That is indeed both "trivial and profound", as noted in your abstract."
Regarding the effectiveness of "Shannon's information theory" as compared to "algorithmic information theory", I am very strongly of the opinion that the former is much more effective than the latter in all the areas, like measurement theory, that have real relevance to "an observer". The difference lies in the relative importance of "Source Coding" vs. "Channel Coding": lossless compression algorithms, in spite of any "astonishing fact", are virtually useless in the realm of observing and measuring. One of the biggest problems hindering the advancement of modern physics is that physicists "don't get it"; contrary to popular belief, observation and measurement are not about preserving source information, they are about distinguishing "relevant information" from "irrelevant information" as quickly and efficiently as possible. A lossless Source Coder with sunlight as its input would preserve huge amounts of information about the solar spectrum that is absolutely irrelevant to any human observer, other than an astrophysicist. That is why the channel coder formed by the visual pigments in the retina totally ignores this "irrelevant information". The same is true of auditory information: well over 99% of the "Source Information" is discarded before the information stream ever exits the inner ear. While this information has great relevance to a modern telephone modem, it has none at all to a human observer.
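The distinction can be sketched in a few lines of Python. This is only a toy illustration: the "spectrum", the sample count, and the crude three-band "pigment" projection are all my own hypothetical stand-ins, not a model of the retina. A lossless source coder must retain every sample; a retina-like channel coder reduces the whole input to a handful of band responses.

```python
import random
import zlib

random.seed(0)

# Hypothetical "solar spectrum": 1000 wavelength samples carrying fine
# detail that no ordinary observer cares about.
spectrum = [100 + random.gauss(0, 5) for _ in range(1000)]
raw = bytes(int(max(0, min(255, s))) for s in spectrum)

# A lossless Source Coder must preserve every sample, relevant or not.
lossless = zlib.compress(raw, 9)

# A retina-like channel coder projects the entire spectrum onto just
# three broad "pigment" bands, discarding almost all source information.
third = len(spectrum) // 3
bands = [sum(spectrum[i * third:(i + 1) * third]) / third for i in range(3)]

print(len(raw), len(lossless), len(bands))  # 1000 samples -> 3 numbers
```

Even at maximum compression, the lossless output still carries hundreds of bytes of "irrelevant information"; the band projection keeps three numbers.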
Since, as Shannon demonstrated, all channels have a limited information-carrying capacity, it is imperative for any multi-stage, information-processing observer to remove capacity-consuming "irrelevant information" as quickly as possible. This presents a "chicken and egg" dilemma that has been debated since at least the time of Socrates, 2500 years ago: how can you find what you are looking for, when you don't even know what you are looking for?
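Shannon's result makes that capacity limit concrete: for a noisy channel, C = B·log2(1 + S/N). A minimal worked example, using an assumed telephone-like channel (3 kHz bandwidth, 30 dB SNR; both numbers are merely illustrative choices of mine):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity in bits/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed voice-grade channel: 3 kHz bandwidth, 30 dB SNR (S/N = 1000).
c = channel_capacity(3000.0, 1000.0)
print(f"{c:.0f} bits/s")  # roughly 30,000 bits/s for these assumed numbers
```

No matter how much "Source Information" arrives at the input, only about 30 kilobits per second can get through such a channel; everything beyond that must be discarded somewhere.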
Nevertheless, as I pointed out in the essay, when you do know, such as when you have created an experiment in which only a single frequency or energy exists, looking for (attempting to observe and model) a Fourier superposition, rather than a single frequency or energy, is a positively dumb thing to do. It is no wonder that there is so much "weirdness" in the interpretations given to such inappropriate models.
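A toy numerical check of that point (the signal length and bin number below are arbitrary choices of mine): when the source genuinely contains a single frequency, the full Fourier "superposition" model assigns an amplitude to every bin, yet all of them but one are zero — the superposition machinery contributes nothing beyond one number.

```python
import cmath
import math

N = 64
k0 = 5  # the single frequency actually present in the "experiment"

# A pure single-frequency signal.
x = [math.cos(2 * math.pi * k0 * n / N) for n in range(N)]

# Naive DFT: the "superposition" model assigns an amplitude to every bin...
X = [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)))
     for k in range(N // 2)]

# ...but for a single-frequency source, only one bin is non-negligible.
dominant = max(range(N // 2), key=lambda k: X[k])
print(dominant)  # 5
```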
You stated that "Your view... seem to suggest most of the world information content is actually algorithmic random, hence not "capturable" by mathematical equations". That is not my view. My view is that much of the "world information content" is very predictable. HOWEVER, the function of any good design for a sensor, instrument, or observer (in other words, a Channel Coder) is to make sure that its output is devoid of all such redundant predictabilities. Hence, although the world may not be random, any good channel coder will render all observations of that world into algorithmically random output. One does not need to observe the solar spectrum today, precisely because one can predict that it will look the same tomorrow. Evolution via natural selection has ensured that biological observers do KNOW what they are looking for, and actively and very, very effectively avoid ever looking at anything other than what they are looking for. Consequently, equations may very well be able to capture the "Source Information" about observed elementary particles. But they cannot capture the "Source Information" of a human observer attempting to interpret any information. Such an observer has spent its entire life recording outputs, and basing all its behaviors on sensory channel coders that are attempting to produce algorithmically random inputs to the brain. The brain's function is then to look for "higher-level" correlations between these multi-sense, multi-time inputs, in order to generate higher-level models of exploitable, predictable correlations.
Unfortunately, for the better part of a century, quantum physicists have thought they should be looking for superpositions. But as Josh Billings (Henry Wheeler Shaw) once said:
"The trouble with people is not that they don't know, but that they know so much that ain't so."
Rob McEachern