Fundamental as (Literally) Finding the Cusp of Meaning
Terry Bollinger, 2018-02-25
NOTE: The purpose of a mini-essay is to capture some idea, approach, or even a prototype theory that resulted from idea sharing by FQXi Essay contestants. This mini-essay was inspired primarily by two essays:
The Perception of Order by Noson S Yanofsky
The Laws of Physics by Kevin H Knuth
Relevant quotes:
Yanofsky (in a posting question): "I was wondering about the relationship between Kolmogorov Complexity and Occam's razor? Do simpler things really have lower KC?"
Knuth: "Today many people make a distinction between situations which are determined or derivable versus those which are accidental or contingent. Unfortunately, the distinction is not as obvious as one might expect or hope."
Bollinger: "...the more broadly a pattern is found in diverse types of data, the more likely it is to be attached deeply within the infrastructure behind that data. Thus words in Europe lead back 'only' to Proto-Indo-European, while the spectral element signatures of elements on the other side of the visible universe lead all the way back to the shared particle and space physics of our universe. In many ways, what we really seem to be doing there is (as you note) not so much looking for 'laws' as we are looking for points of shared origins in space and time of such patterns."
Messages, Senders, Receivers, and Meaning
All variations of information theory include not just the concept of a message, but also of a sender who creates that message, and of a receiver who receives it. The sender and receiver share a very special relationship: they both understand the structure of the message in a way that assigns to it yet another distinct concept, which is that of meaning.
Meaning is the ability to take specific action, directed by the sender, as the result of receiving the message. Meaning, also called semantics, should never be confused with the message itself, for two reasons. The first is that a message in isolation is nothing more than a meaningless string of bits or other characters. In fact, if the message has been fully optimized -- that is, if it is near its Kolmogorov minimum -- it will look like random noise (the physical incarnation of entropy) to any observer other than the sender and receiver. The second is that the relationship between messages and meaning is highly variable. Depending on how well the sender and receiver "understand" each other, the same meaning can be invoked by messages that vary wildly in length.
This message-length variability is a common phenomenon in human relationships. Couples who have lived together for decades often can convey complex meaning by doing nothing more than subtly raising an eyebrow in a particular situation. The very same couple in the distant past might well have argued (exchanged messages) for an hour before reaching the same shared perspective. Meaning and messages are not the same thing!
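The claim that a near-optimally encoded message looks like noise to outsiders can be made concrete with any off-the-shelf compressor. The Python sketch below (the message text is invented for illustration) uses the standard-library zlib module: the compressed form is far shorter than the original, and its bytes are statistically noise-like to anyone who does not share the decompression protocol.

```python
import zlib

# A long but highly patterned message: its redundancy makes it compressible.
message = ("meet me at the usual place at the usual time. " * 40).encode()

# Squeeze the message toward its Kolmogorov-style minimum.
compressed = zlib.compress(message, 9)

# The compressed bytes look like random noise to any observer who lacks
# the shared protocol (here, the DEFLATE format zlib implements).
print(len(message), "bytes in,", len(compressed), "bytes out")

# Only a receiver that shares the protocol can recover the meaning:
assert zlib.decompress(compressed) == message
```

Note that the "protocol" here was itself delivered in advance: both sides had to agree on DEFLATE before the short message could mean anything.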
But the main question here is this: What makes the sender and receiver so special?
That is, how does it come to be that they alone can look at a sequence of what looks like random bits or characters, and from it implement meaning, such as real-world outcomes in which exquisitely coordinated movements by the sender and receiver accomplish joint goals that neither could have accomplished on their own?
In short: How does meaning, that is, the ability to take actions that forever alter the futures of worlds both physical and abstract, come to be attached to a specific subset of all the possible random bit or character strings that could exist?
Information Theory at the Meta Level
The answer to how senders and receivers assign meaning to messages is that at some earlier time they received an earlier set of messages that dealt specifically with how to interpret this much later set of messages. Technologists call such earlier deployments of message-interpretation messages protocols, but that is just one name for them. Linguists for example call such shared protocols languages. Couples who have been together for many years just call their highly custom, unique, and exceptionally powerful set of protocols understanding each other.
But it doesn't stop there. Physicists also uncover and identify shared protocols, protocols that they had no part in creating. They have, however, slowly learned how to interpret some of them, and so can now read some of the messages that these shared protocols enable. Physicists call such literally universal protocols the "laws" of physics, and use them, for example, to receive messages from the other side of the universe. These shared protocols enable us to look at the lines in light spectra and, amazingly, discern how the same elements that we see on earth can also be entrained within the star-dismantling heat and power of a quasar polar plasma jet billions of light years distant in both space and time.
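This spectral-line reading can be sketched as a toy calculation. In the Python below, the rest-frame hydrogen Balmer wavelengths are real, well-established values, while the "observed" spectrum is simulated: a receiver who knows the rest-frame line pattern can both recognize the element and recover how far the light has been stretched (its redshift) on the way here.

```python
# Rest-frame hydrogen Balmer lines in nanometres (well-established values):
# H-alpha, H-beta, H-gamma.
BALMER_NM = [656.28, 486.13, 434.05]

def infer_redshift(observed_nm, rest_nm):
    """Estimate redshift z by pairing observed lines with rest-frame lines.

    Assumes both lists describe the same spectral lines; each pair then
    yields the same stretch factor (1 + z) if the light obeys the shared
    electrons-photons-and-elements protocol.
    """
    pairs = zip(sorted(observed_nm), sorted(rest_nm))
    ratios = [obs / rest for obs, rest in pairs]
    return sum(ratios) / len(ratios) - 1.0

# Simulate light arriving from a source receding at redshift z = 1.5:
z_true = 1.5
observed = [w * (1.0 + z_true) for w in BALMER_NM]
z_est = infer_redshift(observed, BALMER_NM)
```

The consistency check is the whole trick: if the three ratios did not agree, the lines would not be hydrogen, and the "message" would remain unread.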
Protocols as Meaning Enablers
While the word "protocol" has a mundane connotation as the rules and regulations by which either people or electronic equipment interact in clear, understandable ways (share information), I would like to elevate the stature of this excellent word by asserting that in terms of the meta level at which all forms of information theory first acquire their senders and receivers, a protocol is a meaning enabler. That is, to create and distribute a protocol is to create meaning. Protocols allow previously isolated components of the universe, at any scale from that of fundamental particles to light from distant quasars, to alter their behaviors and adopt new sets of coordinated, "future selective" behaviors that no longer leave the future entirely open to random chance. This in turn means that the more widely a protocol is distributed and used, the "smarter" the universe as a whole becomes. The enhancements can vary enormously in scale and scope, from the tiny sound-like handshakes that enable electrons to pair up and create superconductive materials, through the meaning exchanged by an aging couple, and up to scales that are quite literally universal, such as the shared properties of electrons. The fact that those shared electron properties define a protocol can be seen by imagining what would happen if electrons on the other side of the universe did not have the same quantum numbers and properties as the electrons we know. The protocol would be broken, and the light that we see would no longer contain a message that we understand.
Historically, such protocol deficiencies, that is, a lack or misunderstanding of the protocols that enable us to assign meaning to data, are the norm rather than the exception. Even in the case I mentioned earlier of how the electrons-photons-and-elements protocol enables us to know what elements are in a quasar on the other side of the universe, there was a time in the 1800s when scientists mourned that we would never be able to know the composition of distant stars, which by that time they had realized were forever unreachable by any means of transportation they could envision. It was not until the electrons-photons-and-elements protocol was deciphered that the availability of this amazing information became known.
And even then that new information created its own mysteries! The element helium should have and would have been named "helion" had it been known on earth at the time of its discovery in solar spectra. That is because "-ium" indicates a metal (e.g. titanium), while "-on" indicates a gas (e.g. argon). In this case the newly uncovered electron-photon-element protocol sent us a message we did not yet understand!
Many more such messages are still awaiting protocols, with biology, especially at the biochemical level, being a huge and profound area in need of more protocols, of more ways to interpret with meaning the data we see. Thus, for example, despite our having successfully unearthed the protocol for how DNA codes amino acids and proteins at the connection level, we remain woefully lacking in protocols for understanding how the non-protein components of DNA really work, or even how those amino acids, once strung together, almost magically fold themselves into a working protein.
Naturally Occurring Protocols
To understand the full importance of protocols, however, it is vital, as Kevin Knuth strongly advocates in his essay, that we get away from the human-centric view that calls such discoveries of meaning "laws" in the human sense. In particular, the emergence and expansion of meaning-imbuing protocols is not limited just to relationships between humans (the aging couple) or ending with human-only receivers (we blew it for helion). The largest and most extensive protocols exist entirely independently of humans, in domains that include physics and especially biology.
In the case of physics, the protocols that count most are the shared properties and allowed operations on those properties that enable matter and energy to interact in a huge variety of extremely interesting, and frankly bizarrely unlikely, ways. Kevin Knuth dives into some of these anthropic issues in his essay, primarily to point out how remarkable and, at this time at least, inexplicable they are. But in any case they exist, almost literally like fine-tuned machinery custom made to enable still more protocols, and thus still more meaning, to emerge over time.
The First Open-Ended Protocol: Biochemistry
Chemistry is one such protocol, with carbon-based biochemistry as an example in which the layering of protocols -- the emergence of compounds and processes whose very existence depends on earlier protocols, such as proteins out of amino acids -- is essentially unlimited.
It is flatly incorrect to view computer software and networks as the first example of open-ended protocols that can be layered to create higher and higher levels of meaning. The first truly open-ended protocols capable of supporting almost unlimited increases in meaning were the remarkable cluster of basic protocols centered around the element carbon. Those elemental protocols -- their subtleties include far more than just carbon, though carbon is literally the "backbone" upon which the higher-level protocols obtain the stability they require to exist at all -- enabled the emergence of layer upon layer of chemical compounds of increasing complexity and sophistication. As exploited by life in particular, these compounds grow so complex that they qualify as exceptionally powerful machines capable of mechanical action (cutting and splicing DNA), energy conversion (photosynthesis), lens-like quantum calculation (chlorophyll complexes), and information storage and replication (DNA again).
Each of these increasingly complex chemical machines also enables new capabilities, which in turn enable new, more sophisticated protocols, that is, new ways of interpreting other chemicals as messages. This interplay can become quite profound, and has the same ability to "shorten" messages that is seen in human computer networking. Fruit, for example, responds to the gas ethylene by ripening faster, a protocol that creates enticing (at first!) smells to attract seed-spreading animals. The brevity of the message -- the smallness of the ethylene molecule -- is a pragmatic customization by plants to enable easy spreading of the message.
Humans do this also. When after an extended effort (think of Yoda after lifting Luke Skywalker's space ship out of the swamp) we inhale deeply through our nose, we are self-dosing with the two-atom vasodilator nitric oxide, which our nasal cavities generate slowly over time for just such purposes.
Cones Using Shared Protocols (Cusps)
To understand Kevin Knuth's main message, it's time to take this idea of protocols to the level of physics, where it recursively becomes a fundamental assertion about the nature of fundamental assertions.
Minkowski, the former professor of Albert Einstein who more than anyone else created the geometric interpretation of Einstein's originally algebraic work, invented the four-dimensional concept of the light cone to describe the maximum limits for how mass, energy, and information spread out over time. A 4D light "cone" does not look like a cone to our 3D-limited human senses. Instead, it appears as a ball of included space whose spherical surface expands outward at the speed of light. Everything within that expanding ball has potential access to -- that is, detailed information about -- whatever event created that particular cone. The origin of the light cone becomes the cusp of an expanding region that can share all or some subset of the information first generated at that cusp. Note that the cusp itself has a definite location in both space and time, and so qualifies as a well-defined event in spacetime, to use relativistic terminology.
Protocols are a form of shared information, and so form a subset of the types of information that can be shared by such light cones. The cusp of the light cone becomes the origin of the protocol, the very first location at which it exists. From there it spreads at speeds limited by the speed of light, though most protocols are far less ambitious and travel only slowly. But regardless of how quickly or ubiquitously a new protocol spreads, it must always have a cusp, an origin, an event in spacetime at which it comes into being, and thereby creates new meaning within the universe. Whether that meaning is trivial, momentous, weak, powerful, inaccurate, or spot-on remains to be determined, but in general it is the protocols that enable better manipulations of the future that will tend to survive. Meaning grows, with stronger meanings competing against and generally overcoming weaker ones, though as in any ecology the final outcomes are never fixed or certain. The competitive multi-scale ecosystem of meaning, the self-selection of protocols as they vie for receivers who will act upon the messages that they enable, is a fascinating topic in itself, but one for some other place and time.
In an intentional double entendre, I call these regions of protocol enablement via the earlier spread of protocols within a light cone "cones using shared protocols", or cusps. (I hate all-cap acronyms, don't you?) A protocol cusp is both the entire region of space over which the protocol applies or is available, and the point in spacetime -- the time and location -- at which the protocol originated.
Levels of Fundamentality as Depths of Protocol Cusps
And that is where Kevin Knuth's focus on the locality and contingency of many "fundamental" laws comes into play. What we call "laws" are really just instances where we are speculating, with varying levels of confidence, that certain repeated patterns are messages with a protocol that we hope will give them meaning.
Such speculations can of course be incorrect. However, in some instances they prove to be valid, at least to the degree that we can prove it from the data. Thus the existence of the Indo-European language group was at first just a speculation, but one that proved remarkably effective at interpreting words in many languages. From it the cusp or origin of this truly massive "protocol" for human communications was given a name: Proto-Indo-European. The location of this protocol cusp in space was most likely the Pontic-Caspian steppe of Eastern Europe, and the time was somewhere between 4,500 BCE and 2,500 BCE.
Alphabets have cusps. One of the most amazing and precisely located examples is the Korean phonetic alphabet, Hangul, which was created in the 1400s by Sejong the Great. It is a truly masterful work, one of the best and most accessible phonetic alphabets ever created.
Life is full of cusps! One of the earliest and most critical cusps was also one of the simplest: the binary choice between the left and right chiral (mirror-image) subsets of amino acids, literally to prevent confusion as proteins are constructed from them. Once this choice was made it became irrevocable for the entire future history of life, since any organism that went against it faced instant starvation. Even predators cooperate in such situations. The time and place of this cusp remain a deep mystery, one which some (the panspermia hypothesis) would assign to some other part of the galaxy.
The coding of amino acids by DNA is another incredibly important protocol, one whose features are more easily comparable to the modern communications-network concept of a protocol. The DNA-amino protocol is shared with minor deviations by all forms of life, and is very sophisticated. It has been shown to perform superbly at preventing the vast majority of DNA mutations from damaging the corresponding proteins. The odds of that property popping up randomly in the DNA-to-amino-acid translation mechanism are roughly one million to one. I recall from as recently as my college years reading works that disdained this encoding as random and an example of the "stupidity" of nature. It is not, though its existence does provide a proof of how easily stupidity can arise, especially when accompanied by arrogance.
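One facet of this mutation tolerance can be checked directly against the standard genetic code. The Python sketch below builds the canonical RNA codon table and measures what fraction of all single-nucleotide substitutions leave the encoded amino acid unchanged; note that the "one million to one" odds quoted above are the essay's own figure, not something this toy calculation reproduces.

```python
from itertools import product

BASES = "UCAG"
# Standard genetic code, codons enumerated in the order UUU, UUC, UUA,
# UUG, UCU, ... ('*' marks the three stop codons).
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): AMINO[i]
               for i, c in enumerate(product(BASES, repeat=3))}

def synonymous_fraction():
    """Fraction of single-nucleotide substitutions preserving the amino acid."""
    same = total = 0
    for codon, aa in CODON_TABLE.items():
        for pos in range(3):          # mutate each of the three positions
            for base in BASES:
                if base == codon[pos]:
                    continue          # skip the non-mutation
                mutant = codon[:pos] + base + codon[pos + 1:]
                total += 1
                same += CODON_TABLE[mutant] == aa
    return same / total
```

Running synonymous_fraction() shows that roughly a quarter of random point mutations are silent, thanks largely to the code's redundancy in the third codon position -- one simple, measurable aspect of the robustness described above.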
The Bottom Line for Fundamentality
In terms of Kevin Knuth's concepts of contingency and context for "fundamental" laws (protocols) and rules, the bottom line in all of this is surprisingly simple:
The fundamentality of a "law" (protocol for extracting meaning from data) depends on two factors: (1) How far back in time its cusp (origin) resides, and (2) how broadly the protocol is used.
Thus the reason physics gets plugged so often as having the most fundamental rules and "laws" is because its cusp dates to the same time as the universe itself, presumably in the big bang, and because its protocols are so widely and deeply embedded that they enable us to "read" messages from the other side of the universe.
Nearly all other protocol cusps, including those of life, are of a more recent vintage. But as Kevin Knuth points out in his essay, and as I have given examples of through the very existence of physics-enabled open protocols in biochemistry, deeper anthropic mysteries remain afoot, since strictly in terms of what we can see, the nominally "random" laws of physics were in fact direct predecessor steps necessary for life to begin creating its own upward-moving layers of protocols and increased meaning.
It was a huge mistake to think that DNA-to-amino coding was "random."
And even if we haven't a clue why yet, it is likely also a huge mistake to assume that the protocols of physics lead so directly and perfectly into the protocols of life by mere accident. We just do not understand yet what is going on there, and we likely need to do a better job of fully acknowledging this deeply mysterious coincidence of continuity before we can make any real progress in resolving it.