All,

I just did an evaluation of Karl Coryat's excellent essay The Four Pillars of Fundamentality. It is both funny and profound, and I recommend it highly!

For anyone interested, I once again inadvertently got "into the zone" while contemplating Karl's Pillar #3 (Relations), resulting in another one of my on-the-fly mini-papers. This one addresses two topics: (a) the deep-physics-level fundamentality of "relations", which is the topic of Karl's Pillar #3, and (b) a years-old space-as-entanglement idea from my personal physics notes.

I had not intended to present the space-as-entanglement idea here, but it just seemed too relevant. It is equivalent to a hugely simplified, non-holographic approach to constructing 3-space out of a direct 3D (not 4D) web of group-level entanglements. The entangled "unit of space" is an overlooked direction-only conjugate component of particle spin. Since these were just personal musings, I was genuinely surprised to find out that a lively community for exploring the idea that space is a form of 4D holographic entanglement has existed for years. My version is much simpler (3D), much more direct (just a web), and I think kind of fun to read as a mind-stretching exercise if nothing else!

Cheers,

Terry

Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

Essayist's Rating Pledge by Terry Bollinger

All,

Another very well written essay that I must recommend is Marc Séguin's Fundamentality Here, Fundamentality There, Fundamentality Everywhere.

It was one of my most enjoyable reads. It is lucid, learned, well-stated, well-ordered, addresses the topic in an interesting and engaging way, and has a sly self-deprecating sense of humor that had me chuckling multiple times. It is also spot-on for the question that FQXi asked this year.

On looking back at my assessment of Marc's essay, it looks like I got a bit carried away again. This time the topic was the nature of qualia. That is the word for the internal sensations and emotions that you can bring up in your mind without external sensory inputs. Try it: Close your eyes and imagine red and green lights, alternating. Those are qualia.

Notice that even though your optical system consistently maps the external light frequencies that we call red and green into the corresponding qualia in your head, the very fact that you can bring up the qualia without any external stimulation shows that all that is going on here is mapping: the light frequencies get mapped into those "somethings" in your head that you can also bring up from memory. For all you or I know, what red light brings up in my head might be what you would have called green. That sort of thing happens all the time for folks with synesthesia (which makes me jealous!).

So if you happen to have any interest in qualia, you can see what I wrote in my comments on Marc's essay.

Cheers,

Terry

Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

Essayist's Rating Pledge by Terry Bollinger

    Dear Terry,

    Thank you for the kind words about my essay! To keep the ball rolling, may I recommend another excellent essay,

    "What if even the Theory of Everything isn't fundamental" by Paul Bastiaansen

    fqxi.org/community/forum/topic/3063

    I too got carried away with my comments on his thread... I used, of course, your very helpful and honest "what I liked/what I liked less" approach, and even referred to your essay contestant pledge!

    Cheers,

    Marc

    Terry,

    I like your definition (quote?) of QM. The thing about history is that nobody can see it as history at the time.

    There's history being written in my essay that you've so far missed due to normal embedded assumptions. To make it more visible I've posted the checklist below, which the ontology builds on; I hope you can find the time to look with a fresh mind.

    AS MOST STRUGGLE WITH THE CLASSICAL SEQUENCE (TOO MUCH TO HOLD IN MIND ALL AT ONCE) A QUICK OUTLINE INTRO IS HERE:

    1. Start with Poincare sphere OAM, with 2 orthogonal momenta pairs, NOT 'singlets'.

    2. Pairs have antiparallel axes (random shared y,z). (photon wavefront sim.)

    3. Interact with identical (polariser electron) spheres rotatable by A,B.

    4. Momentum exchange as actually proved, by Cos latitude at tan intersection.

    5. Result 'SAME' or 'OPP' dir. Re-emit polarised with amplitude phase dependent.

    6. Photomultiplier electrons give a 2nd cos distribution & 90° phase values.

    7. The non-detects are all below a threshold amplitude at either channel angle.

    8. Statisticians then analyse using CORRECT assumptions about what's 'measured'!

    The numbers match CHSH > 2 and steering inequality > 1, as shown by the matching computer code & plot in Declan Traill's short essay. All is Bell compliant, as he didn't falsify the trick with reversible green/red socks (the TWO pairs of states).

    After deriving it in last year's figures, I only discovered thanks to Ulla M during this contest that the Poincare sphere already existed. I hope that helps introduce the ontology.

    Very best. Peter

    Conrad,

    Out of sheer luck I managed to find this posting just now! Speaking of how difficult it is to find reply postings and such, it would be sooooo nice if FQXi did things like:

    -- When people sign up to get alerts for new or reply postings to an essay, send them emails with real, exact links to the new posts or replies, as opposed to mindlessly repeating only the generic link to the top-level essay;

    -- Make linking to sub-posts trivial and intuitive;

    -- Fix the "invisible sub-post" problem;

    -- Add more meaningful titles to links, instead of labeling absolutely everything as "FQXi Community";

    -- Stop taking people to some new or wrong location after they do something like logging out to log back in again (which should keep you on the same login page, not send you off to the FQXi home page!);

    -- And worst of all, stop logging people out invisibly and for no reason!

    Other than that, I'm good... :)

    Conrad, it will be my great pleasure to respond in more detail to you today. I'll also try to fix some of the invisibility issues. I am for example considering consolidating those two very radical on-the-fly postings into one top-level mini-essay of some sort.

    More later!

    Cheers,

    Terry

    The Crowther Criteria for Fundamental Theories of Physics

    The source for this consolidated and lightly edited list is the 2017 FQXi essay When do we stop digging? Conditions on a fundamental theory of physics, by Dr Karen Crowther at the University of Geneva. You can download her essay and read the discussion about it here:

    https://fqxi.org/community/forum/topic/essay-download/3034/__details/Crowther_Crowther_-_when_do.pdf

    To qualify under the Crowther Criteria, a fundamental theory of physics must be:

    CC#1. Unified: It must address all of reality using a single set of self-consistent premises.

    CC#2. Unique: It should be the only possible theory once its premises have been stated formally.

    CC#3. UV complete: There should not exist any phenomena that are outside of its formal scope.

    CC#4. Non-perturbative: Its formalisms should be exactly solvable, not merely perturbative approximations.

    CC#5. Internally self-consistent: It should be well-defined formally, and should not generate singularities.

    CC#6. Scale smooth: Its explanation of reality should be continuous across all scales (levels) of space and time, with no gaps, overlaps, or other discontinuities.

    CC#7. Fully generative: It should require no pre-existing fixed or "given" structures, such as space itself, that have complex and non-trivial properties.

    CC#8. Natural: It should require no arbitrary, inexplicable "fine-tuning" of numeric parameters.

    CC#9. Not weird: The underlying premises should be simple, easily comprehensible, and subject to Occam's razor.

    Don,

    My apologies, I completely forgot this one.

    I have now created a response folder for your essay and my responses. (Yes, I create an entire folder for each essayist with whom I interact.)

    Most likely I got distracted (left my laptop) right after responding to you. With so many essays and so many posts (and other distractions), I tend to forget my promises if I do not immediately create the corresponding folder.

    Please note in advance that due to my own pledge (see link at bottom) I can be a pretty tough reviewer. So, when folks request reviews I reserve the right just to make comments and not to score the essay in cases where I know I would give a low score. I don't mind giving blunt feedback -- sometimes we all need that -- but I just don't feel good giving low scores in response to a polite request for a review.

    It's best to mention all of this before I look at your essay, since I have no idea in advance what I'll be seeing or how I may react.

    Cheers,

    Terry

    Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

    Essayist's Rating Pledge by Terry Bollinger

    Terry,

    If #7 were true, physics would have no foundational theories. Complex, non-trivial properties are often the result of dynamics with specified boundary conditions.

    Dear Terry,

    An original and daring idea, to define a numerical measure to answer the question 'what is fundamental'. It is really interesting to literally view scientific theories as a concise way to represent measurement data.

    I have a few comments. Your example of decimals of pi nicely illustrates that an unambiguous measure of Kolmogorov complexity is not possible. The example doesn't suffice, however, because the string of decimals is too short. I'm quite sure that within the space of 20 characters, you cannot write a program that enumerates the decimals of pi. So probably the shortest way to represent your example string is the string itself (or a zipped variant). But of course, if you took a string of decimals of pi that was much longer, the example would work.
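
    (A rough sketch of what I mean, in Python -- the classic Gibbons spigot below generates the digits of pi in a few hundred characters of source code, so as a "compression" of the digit string it only pays off once the string is much longer than the program itself.)

    ```python
    import inspect

    def pi_digits():
        # Gibbons' unbounded spigot: yields the decimal digits of pi one at a time.
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < n * t:
                yield n
                q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
            else:
                q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                    (q * (7 * k + 2) + r * l) // (t * l), l + 2)

    program_len = len(inspect.getsource(pi_digits))            # fixed cost: a few hundred chars
    first20 = "".join(str(d) for _, d in zip(range(20), pi_digits()))
    print(first20, len(first20) < program_len)                 # for 20 digits, the literal string wins
    ```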

    A more serious objection to your view of physics as information theory is that I would want to see arguments for why it makes sense to view a physical theory as a concise way of reproducing data. This view misses the semantic part, the meaning of the theory. In practice, the question of how to check measurement data against the predictions of a theory is not a straightforward one. A lot of theory interpretation is needed to calculate what outcome the theory predicts for a certain measurement. This aspect is absent from your view.

    Let me put it a different way: if I understand you right, your claim is that the most fundamental physical theory is somewhat like the best compression algorithm, in that both are the shortest possible way to represent a set of data. But there is an important difference between the two: any scientific theory is a finite description of an infinite set of data, whereas the size of a compressed set of data still scales with the size of the original data.

    In the end, I tend to think that how fundamental a theory is, is not a concept that can be given a numerical measure. But I admit that the idea is really interesting.

    Let me read up on the Spekkens principle, because I don't think I understand it. And I really like the three challenges you conclude with. I must confess, as a physicist, that I never appreciated the mysteries around spin and the difference between fermions and bosons. My bad, because indeed this must be profound.

    All the best,

    Paul Bastiaansen

    All,

    I'm introducing a new FQXi process idea here, which is this: I want to create a new format for capturing important essay contest conversations in a more explicit, more accessible form that makes them easier to cite and reference.

    Specifically, I will be posting for reference a number of supplemental "mini-essays" that capture, clean up, and document some of the particularly interesting ideas that have emerged from what have been for me very stimulating interactions with other essays and their authors. My goal is to make these synergistic outcomes more explicit and easier to reference in the future. Putting aside the competitive aspects of the FQXi Essay contests, I would judge that the greatest value of these contests emerges instead from the interactions between essay authors. Our essays are far more valuable as an interactive whole than they are if viewed only in isolation.

    The Crowther Criteria posting is my first example of such a mini-essay, though it is more an example of a reference summary than a mini-essay. For any of the mini-essays I post, please feel free to add your thoughts with (preferably) a reply-to at that posting. Note however that in the case of the Crowther Criteria I'm just trying to capture her ideas in a simple format. So if you want to debate any of the points in her list you should go to her essay thread rather than mine.

    I'm posting the Crowther Criteria in part because I need them as a reference for a rather unusual mini-essay that I will be posting soon. To be frank, that mini-essay ends up directly contradicting her CC#4. I didn't expect to wind up there, but such an unexpected journey is worth documenting!

    Each time I post a mini-essay I will try (I was slow this time) to quickly post a short addendum that provides links to the mini-essay. Below is my link addendum for the Crowther Criteria post.

    --------------------

    To link to the mini-essay titled:

    [link:fqxi.org/community/forum/topic/3099#post_145551]The Crowther Criteria for Fundamental Theories of Physics[/link]

    ... Please copy and paste either the named link above or the direct URL below:

    https://fqxi.org/community/forum/topic/3099#post_145551

      Terry, some quick short notes as I work my way to your essay:

      1. FQXi Essay Contestant Pledge = Suggested FQXi Voting Pledge

      Your Pledge is so refreshing that I've hot-linked it above. LHS wording of the title is yours; to me, it reads "official" and is thus too hopeful (for now). RHS is my suggested edit as we work with FQXi to improve things!

      2. Under current circumstances, my own position is clear:

      (i) As an independent researcher, I'm here to discuss, learn, teach, debate, respond to every question, critique others, etc. Result = Fail; eg, next-to-no questions, few responses.

      (ii) I'm not here for the votes: Result = Just-as-well; eg, given a 0 without explanation: how can I learn, respond, correct, defend, revise, acknowledge, etc?

      3. While we await (with many others) for FQXi improvements, why don't we develop an OPEN voting system? Add to your Pledge a (say, for argument's sake) 5-category [each numbered; #1-5] scoring sheet [maximum vote per category = 2??] with space for explanations, plus identifier (say, for you, hot-linked Terry Bollinger [or with hot-linked email-addresses also allowed] so that we ALWAYS get an alert -- with easy-return access. [You get the idea.]

      Recipient can respond to Terry Bollinger#2, for all to see: thus promoting open learning, debate, progress, support for one view or the other, or a middle view, etc. Given the teaching/learning, who then here, as a serious researcher, would focus on "fake-scores"?

      The advantage of this OPEN proposal is that you, with your background, could lead us to something truly useful, actionable, within the current rules, a worthwhile experiment, ready for the next "contest" (surely the wrong word here) -- which FQXi can monitor before refining (if need be), and accepting as the new gold-standard in OPEN teaching/learning/essay-exchange!

      4. To your (for me) excellent essay:

      (i) I counted 8 important fundamental symbols in Challenge #1.

      (ii) Re Challenge #2: in my [hurried] essay, see hot-linked Reference [12], p.639! It's part of my theory.

      (iii) NB: Your editorial red-pen will be very welcome there at any time; hopefully after you've read [in the first thread], the Background to my theory (which dates from 1989).

      (iv) Maybe, with hard work and insight, you might just become the person who finds a hidden gemstone of simplicity by unravelling the threads of misunderstanding that for decades have kept it hidden.

      PS: Terry, if/when you reply to my post (at any time), please copy it to my essay-thread so that I'm alerted to it. I will do likewise.

      Enough (for now): With many thanks and much appreciation for your lovely work;

      Gordon Watson More realistic fundamentals: quantum theory from one premiss.

      ... and of course, not having a line return at the end of the text URL caused the FQXi software to invalidate the URL by placing its final digit on a separate line. I'm pretty sure that did not show up in the preview, but maybe I just didn't notice it. Tsk, why didn't I anticipate such an obvious bug in advance?... :)

      Trying again:

      --------------------

      To link to the mini-essay titled:

      [link:fqxi.org/community/forum/topic/3099#post_145551]The Crowther Criteria for Fundamental Theories of Physics[/link]

      ... Please copy and paste either the named link above or the direct URL below:

      https://fqxi.org/community/forum/topic/3099#post_145551

      AAAAARRRRRGGGGGHHHHH!!!!! EVEN IF I WIN THIS IS NOT WORTH $10,000!!

      --------------------

      To link to the mini-essay titled:

      [link:fqxi.org/community/forum/topic/3099#post_145551]The Crowther Criteria for Fundamental Theories of Physics[/link]

      ... Please copy and paste either the named link above or the direct URL below:

      https://fqxi.org/community/forum/topic/3099#post_145551

      Mary had a little lamb, I hope it eats the bug...

      (Why THANK YOU nice uniformed people that my wife just called in! Yes, I would just LOVE to wear that nice white jacket to help keep my arms from spontaneously beating my own head! Just be sure to send the bill to FQXi!)

        Philip,

        You asked whether there is a distinction between descriptive data (which I interpret as more "English-like") and data that can be reduced through symmetry groups.

        The best answer I can give is that (a) I really don't know, and (b) I nonetheless rather strongly suspect that even the most random-looking descriptive parts of a theory are just finer-scale compositions of symmetry operations. That is because I have difficulty visualizing data compression processes that do not at some level invoke symmetries. Saying that two pieces of data are really one is, after all, just another way of stating a symmetry.
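
        As a toy illustration of that last point (my own sketch, nothing from your essay): a string with an exact translational symmetry -- it is just k copies of some unit -- can be stored as the unit plus a count, and the "compressor" succeeds precisely by detecting that symmetry.

        ```python
        def compress_repeat(s: str):
            # Exploit translational symmetry: if s is k copies of a unit u, store (u, k).
            for size in range(1, len(s) // 2 + 1):
                unit = s[:size]
                k, rem = divmod(len(s), size)
                if rem == 0 and unit * k == s:
                    return ("repeat", unit, k)   # many pieces of data declared "really one"
            return ("raw", s, 1)                 # no symmetry found: keep the literal string

        def decompress(tag: str, data: str, k: int) -> str:
            return data * k if tag == "repeat" else data

        token = compress_repeat("abcabcabcabcabcabc")
        print(token)                             # ('repeat', 'abc', 6)
        assert decompress(*token) == "abcabcabcabcabcabc"
        ```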

        I read your essay and found your approach intriguing and resonant with some of my own perspectives. I was struck in particular by this description:

        "In category theory a generalisation can be formulated using coequivalence and epimorphisms. The assimilation of information is an algebraic process of factorisation and morphisms. In algebraic terms then, the ensemble of all possibilities forms a freely generated structure in a universal algebra. Information about the world that forms part of life experience defines substructures and epimorphisms onto further algebraic structures that represent the possible universes that conform to the observed information."

        The image that brought to mind for me was Kolmogorov compression with a focus on free algebras, and applied not just to observed data in our universe, but to the definition of all possible universes. Intriguing!

        I note from the book chapter below that there seems to have been some coverage of algebraic generalizations of quantum mechanics (or at least of Hilbert space) in some of the side branches of physics, even if they are not dominant topics in the mainstream:

        Landsman N.P. (2009) Algebraic Quantum Mechanics. In: Greenberger D., Hentschel K., Weinert F. (eds) Compendium of Quantum Physics. Springer, Berlin, Heidelberg.

        Cheers,

        Terry

        Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

        Essayist's Rating Pledge by Terry Bollinger

        You're new here, aren't you? :-)

        Being also retired from the DoD, you must have experienced the difficulties making legacy software work with rapidly advancing technology and updates to Windows. You're suffering from PTSD.

        The world is catching up with us, Terry.

        Oh yes indeedy! The stories either of us could tell... DISA alone...

        But in recent years I had the true privilege of working almost entirely with (a) Leading-edge commercial tech (I saw Google's Earth tech before Google owned it, and some amazing drones long before anyone had them at home); and (b) AI and robotics research. In short, I got spoiled!

        Fundamental as (Literally) Finding the Cusp of Meaning

        Terry Bollinger, 2018-02-25

        NOTE: The purpose of a mini-essay is to capture some idea, approach, or even a prototype theory that resulted from idea sharing by FQXi Essay contestants. This mini-essay was inspired primarily by two essays:

        The Perception of Order by Noson S Yanofsky

        The Laws of Physics by Kevin H Knuth

        Relevant quotes:

        Yanofsky (in a posting question): "I was wondering about the relationship between Kolmogorov Complexity and Occam's razor? Do simpler things really have lower KC?"

        Knuth: "Today many people make a distinction between situations which are determined or derivable versus those which are accidental or contingent. Unfortunately, the distinction is not as obvious as one might expect or hope."

        Bollinger: "...the more broadly a pattern is found in diverse types of data, the more likely it is to be attached deeply within the infrastructure behind that data. Thus words in Europe lead back 'only' to Proto-Indo-European, while the spectral signatures of elements on the other side of the visible universe lead all the way back to the shared particle and space physics of our universe. In many ways, what we really seem to be doing there is (as you note) not so much looking for 'laws' as we are looking for points of shared origins in space and time of such patterns."

        Messages, Senders, Receivers, and Meaning

        All variations of information theory include not just the concept of a message, but also of a sender who creates that message, and of a receiver who receives that message. The sender and receiver share a very special relationship: they both understand the structure of the message in a way that assigns to it yet another distinct concept, that of meaning.

        Meaning is the ability to take specific action, directed by the sender, as the result of receiving the message. Meaning, also called semantics, should never be confused with the message itself, for two reasons. The first is that a message in isolation is nothing more than a meaningless string of bits or other characters. In fact, if the message has been fully optimized -- that is, if it is near its Kolmogorov minimum -- it will look like random noise (the physical incarnation of entropy) to any observer other than the sender and receiver. The second is that the relationship between messages and meaning is highly variable. Depending on how well the sender and receiver "understand" each other, the same meaning can be invoked by messages that vary wildly in length.
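
        A quick way to see the first point for yourself (a sketch in Python; zlib stands in here for a true Kolmogorov-minimal encoding, which is uncomputable): compress some highly structured text and measure the byte-level entropy before and after.

        ```python
        import math
        import zlib
        from collections import Counter

        def bits_per_byte(data: bytes) -> float:
            # Shannon entropy of the byte histogram; 8.0 is indistinguishable from noise.
            counts = Counter(data)
            n = len(data)
            return -sum(c / n * math.log2(c / n) for c in counts.values())

        message = " ".join(str(i) for i in range(5000)).encode()   # structured: digits and spaces
        packed = zlib.compress(message, 9)

        print(len(message), len(packed))             # the structure compresses away
        print(round(bits_per_byte(message), 2))      # low: visibly structured content
        print(round(bits_per_byte(packed), 2))       # near 8: looks like random noise
        ```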

        This message-length variability is a common phenomenon in human relationships. Couples who have lived together for decades often can convey complex meaning by doing nothing more than subtly raising an eyebrow in a particular situation. The very same couple in the distant past might well have argued (exchanged messages) for an hour before reaching the same shared perspective. Meaning and messages are not the same thing!

        But the main question here is this: What makes the sender and receiver so special?

        That is, how does it come to be that they alone can look at a sequence of what looks like random bits or characters, and from it implement meaning, such as real-world outcomes in which exquisitely coordinated movements by the sender and receiver accomplish joint goals that neither could have accomplished on their own?

        In short: How does meaning, that is, the ability to take actions that forever alter the futures of worlds both physical and abstract, come to be attached to a specific subset of all the possible random bit or character strings that could exist?

        Information Theory at the Meta Level

        The answer to how senders and receivers assign meaning to messages is that at some earlier time they received an earlier set of messages that dealt specifically with how to interpret this much later set of messages. Technologists call such earlier deployments of message-interpretation messages protocols, but that is just one name for them. Linguists for example call such shared protocols languages. Couples who have been together for many years just call their highly custom, unique, and exceptionally powerful set of protocols understanding each other.
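
        A minimal sketch of that idea in Python (the codebook and its entries are invented purely for illustration): once the protocol has been deployed in advance, a single later byte can carry a meaning that would otherwise take a whole sentence to transmit.

        ```python
        # The "earlier message": a shared codebook, agreed upon before any traffic flows.
        PROTOCOL = {
            0x01: "meet at the usual place at noon",
            0x02: "abort today; try again tomorrow",
            0x03: "all clear",
        }
        ENCODE = {meaning: code for code, meaning in PROTOCOL.items()}

        def send(meaning: str) -> bytes:
            return bytes([ENCODE[meaning]])      # a whole sentence shrinks to one byte

        def receive(message: bytes) -> str:
            # Without PROTOCOL this byte is meaningless noise; with it, it directs action.
            return PROTOCOL[message[0]]

        assert receive(send("all clear")) == "all clear"
        print(len(send("all clear")), "byte on the wire")   # the raised-eyebrow effect
        ```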

        But it doesn't stop there. Physicists also uncover and identify shared protocols, protocols that they had no part in creating. They have however slowly learned how to interpret some of them, and so can now read some of the messages that these shared protocols enable. Physicists call such literally universal protocols the "laws" of physics, and use them to receive messages literally from the other side of the universe. For example, these shared protocols enable us to look at the lines in light spectra and, amazingly, discern how the same elements that we see on earth can also be entrained within the star-dismantling heat and power of a quasar polar plasma jet billions of light years distant in both space and time.

        Protocols as Meaning Enablers

        While the word "protocol" has a mundane connotation as the rules and regulations by which either people or electronic equipment interact in clear, understandable ways (share information), I would like to elevate the stature of this excellent word by asserting that at the meta level at which all forms of information theory first acquire their senders and receivers, a protocol is a meaning enabler. That is, to create and distribute a protocol is to create meaning. Protocols enable previously isolated components of the universe, at any scale from that of fundamental particles to light from distant quasars, to alter their behaviors and adopt new sets of coordinated, "future selective" behaviors that no longer leave the future entirely open to random chance. This in turn means that the more widely a protocol is distributed and used, the "smarter" the universe as a whole becomes.

        The enhancements can vary enormously in scale and scope, from the tiny sound-like handshakes that enable electrons to pair up and create superconductive materials, through the meaning exchanged by an aging couple, and up to scales that are quite literally universal, such as the shared properties of electrons. The fact that those shared electron properties define a protocol can be seen by imagining what would happen if electrons on the other side of the universe did not have the same quantum numbers and properties as the electrons we know. The protocol would be broken, and the light that we see would no longer contain a message that we understand.

        Historically such protocol deficiencies, that is, a lack or misunderstanding of the protocols that enable us to assign meaning to data, are the norm rather than the exception. Even in the case I mentioned earlier of how the electrons-photons-and-elements protocol enabled us to know what elements are in a quasar on the other side of the universe, there was a time in the 1800s when scientists mourned that we would never be able to know the composition of distant stars, which by that time they had realized were forever unreachable by any means of transportation that they could envision. It was not until the electrons-photons-and-elements protocol was deciphered that the availability of this amazing information became known.

        And even then that new information created its own mysteries! The element helium should have and would have been named "helion" had it been known on earth at the time of its discovery in solar spectra. That is because "-ium" indicates a metal (e.g. titanium), while "-on" indicates a gas (e.g. argon). In this case the newly uncovered electron-photon-element protocol sent us a message we did not yet understand!

        Many more such messages are still awaiting protocols, with biology, especially at the biochemical level, being a huge and profound area in need of more protocols, of more ways to interpret with meaning the data we see. Thus for example, despite our having successfully unearthed the protocol for how DNA codes amino acids and proteins at the connection level, we remain woefully lacking in protocols for understanding how the non-protein components of DNA really work, or even how those amino acids, once strung together, almost magically fold themselves into a working protein.

        Naturally Occurring Protocols

        To understand the full importance of protocols, however, it is vital, as Kevin Knuth strongly advocates in his essay, that we get away from the human-centric view that calls such discoveries of meaning "laws" in the human sense. In particular, the emergence and expansion of meaning-imbuing protocols is not limited to relationships between humans (the aging couple), nor does it end with human-only receivers (we blew it for helion). The largest and most extensive protocols exist entirely independently of humans, in domains that include physics and especially biology.

        In the case of physics, the protocols that count most are the shared properties and allowed operations on those properties that enable matter and energy to interact in a huge variety of extremely interesting, and frankly bizarrely unlikely, ways. Kevin Knuth dives into some of these anthropic issues in his essay, primarily to point out how remarkable and, at this time at least, inexplicable they are. But in any case they exist, almost literally like fine-tuned machinery custom made to enable still more protocols, and thus still more meaning, to emerge over time.

        The First Open-Ended Protocol: Biochemistry

        Chemistry is one such protocol, with carbon-based biochemistry as an example in which the layering of protocols -- the emergence of compounds and processes whose very existence depends on earlier protocols, such as proteins out of amino acids -- is essentially unlimited.

        It is flatly incorrect to view computer software and networks as the first example of open-ended protocols that can be layered to create higher and higher levels of meaning. The first truly open-ended protocols capable of supporting almost unlimited increases in meaning were the remarkable cluster of basic protocols centered around the element carbon. Those elemental protocols -- their subtleties include far more than just carbon, though carbon is literally the "backbone" upon which the higher-level protocols obtain the stability they require to exist at all -- enabled the emergence of layer upon layer of chemical compounds of increasing complexity and sophistication. As exploited by life in particular, these compounds grow so complex that they qualify as exceptionally powerful machines capable of mechanical action (cutting and splicing DNA), energy conversion (photosynthesis), lens-like quantum calculation (chlorophyll complexes), and information storage and replication (DNA again).

        Each of these increasingly complex chemical machines also enables new capabilities, which in turn enable new, more sophisticated protocols, that is, new ways of interpreting other chemicals as messages. This interplay can become quite profound, and has the same ability to "shorten" messages that is seen in human computer networking. Fruit for example responds to the gas ethylene by ripening faster, a protocol that creates enticing (at first!) smells to attract seed-spreading animals. The brevity of the message, the shortness of the ethylene molecule, is a pragmatic customization by plants to enable easy spreading of the message.

        Humans do this also. When after an extended effort (think of Yoda after lifting Luke Skywalker's spaceship out of the swamp) we inhale deeply through our nose, we are self-dosing with the two-atom vasodilator nitric oxide, which our nasal cavities generate slowly over time for just such purposes.

        Cones Using Shared Protocols (Cusps)

        To understand Kevin Knuth's main message, it's time to take this idea of protocols to the level of physics, where it recursively becomes a fundamental assertion about the nature of fundamental assertions.

        Minkowski, the former professor of Albert Einstein who more than anyone else created the geometric interpretation of Einstein's originally algebraic work, invented the four-dimensional concept of the light cone to describe the maximum limits for how mass, energy, and information spread out over time. A 4D light "cone" does not look like a cone to our 3D-limited human senses. Instead, it appears not as a cone but as a ball of included space whose spherical surface expands outward at the speed of light. Everything within that expanding ball has potential access to -- that is, detailed information about -- whatever event created that particular cone. The origin of the light cone becomes the cusp of an expanding region that can share all or some subset of the information first generated at that cusp. Note that the cusp itself has a definite location in both space and time, and so qualifies as a well-defined event in spacetime, to use relativistic terminology.
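
        In standard notation (nothing here beyond the textbook definition), that "expanding ball" is the future light cone of the cusp event E = (t0, x0):

        ```latex
        % Future light cone of a cusp event E = (t_0, \mathbf{x}_0): all events that
        % information leaving E can reach -- a ball of radius c(t - t_0) expanding from x_0.
        J^{+}(E) = \left\{ (t, \mathbf{x}) : \lVert \mathbf{x} - \mathbf{x}_0 \rVert \le c\,(t - t_0),\ t \ge t_0 \right\}
        ```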

        Protocols are a form of shared information, and so form a subset of the types of information that can be shared by such light cones. The cusp of the light cone becomes the origin of the protocol, the very first location at which it exists. From there it spreads at speeds limited by the speed of light, though most protocols are far less ambitious and travel only slowly. But regardless of how quickly or ubiquitously a new protocol spreads, it must always have a cusp, an origin, an event in spacetime at which it comes into being and thereby creates new meaning within the universe. Whether that meaning is trivial, momentous, weak, powerful, inaccurate, or spot-on remains to be determined, but in general it is the protocols that enable better manipulations of the future that will tend to survive. Meaning grows, with stronger meanings competing against and generally overcoming weaker ones, though as in any ecology the final outcomes are never fixed or certain. The competitive multi-scale ecosystem of meaning, the self-selection of protocols as they vie for receivers who will act upon the messages they enable, is a fascinating topic in itself, but one for some other place and time.

        In an intentional double entendre, I call these regions of protocol enablement via the earlier spread of protocols within a light cone "cones using shared protocols", or cusps. (I hate all-cap acronyms, don't you?) A protocol cusp is both the entire region of space over which the protocol applies or is available, and also the point in spacetime -- the time and location -- at which the protocol originated.

        Levels of Fundamentality as Depths of Protocol Cusps

        And that is where Kevin Knuth's focus on the locality and contingency of many "fundamental" laws comes into play. What we call "laws" are really just instances where we are speculating, with varying levels of confidence, that certain repeated patterns are messages with a protocol that we hope will give them meaning.

        Such speculations can of course be incorrect. However, in some instances they prove to be valid, at least to the degree that we can prove it from the data. Thus the existence of the Indo-European language group was at first just a speculation, but one that proved remarkably effective at interpreting words in many languages. From it the cusp or origin of this truly massive "protocol" for human communications was given a name: Proto-Indo-European. The location of this protocol cusp in space was most likely the Pontic-Caspian steppe of Eastern Europe, and the time was somewhere between 4,500 BCE and 2,500 BCE.

        Alphabets have cusps. One of the most amazing and precisely located examples is the Korean phonetic alphabet, Hangul, which was created in the 1400s by Sejong the Great. It is a truly masterful work, one of the best and most accessible phonetic alphabets ever created.

        Life is full of cusps! One of the earliest and most critical cusps was also one of the simplest: the binary choice between the left and right chiral (mirror-image) subsets of amino acids, made literally to prevent confusion as proteins are constructed from them. Once this choice was made it became irrevocable for the entire future history of life, since any organism that went against it faced instant starvation. Even predators cooperate in such situations. The time and origin of this cusp remain a deep mystery, one which some (the panspermia hypothesis) would assign to some other part of the galaxy.

        The coding of amino acids by DNA is another incredibly important protocol, one whose features are more easily comparable to the modern communications network concept of a protocol. The DNA-amino protocol is shared with minor deviations by all forms of life, and is very sophisticated. It has been shown to perform superbly at preventing the vast majority of DNA mutations from damaging the corresponding proteins. The odds of that property popping up randomly in the DNA to amino acid translation mechanism are roughly one million to one. I recall from as recently as my college years reading works that disdained this encoding as random and an example of the "stupidity" of nature. It is not, though its existence does provide a proof of how easily stupidity can arise, especially when accompanied by arrogance.

        The Bottom Line for Fundamentality

        In terms of Kevin Knuth's concepts of contingency and context for "fundamental" laws (protocols) and rules, the bottom line in all of this is surprisingly simple:

        The fundamentality of a "law" (protocol for extracting meaning from data) depends on two factors: (1) How far back in time its cusp (origin) resides, and (2) how broadly the protocol is used.

        Thus the reason physics gets plugged so often as having the most fundamental rules and "laws" is that its cusp dates to the same time as the universe itself, presumably in the big bang, and that its protocols are so widely and deeply embedded that they enable us to "read" messages from the other side of the universe.
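
        As a toy restatement of those two factors (a sketch only; the cusp ages and "reach" figures below are illustrative guesses, not data):

        ```python
        # Rank example protocols by the two fundamentality factors:
        # (1) how far back in time the cusp lies, (2) how broadly the protocol is used.
        protocols = [
            # (name, cusp age in years, rough fraction of the universe using it)
            ("laws of physics",     13.8e9, 1.0),
            ("DNA-to-amino coding",  3.8e9, 1e-30),   # one planet's biosphere (so far!)
            ("Proto-Indo-European",  6.0e3, 1e-33),
            ("Hangul",               6.0e2, 1e-34),
        ]

        # One crude composite score: an older cusp and a wider reach both raise fundamentality.
        for name, age, reach in sorted(protocols, key=lambda p: p[1] * p[2], reverse=True):
            print(f"{name:22s} cusp age {age:9.2e} yr   reach {reach:.0e}")
        ```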

        Nearly all other protocol cusps, including those of life, are of a more recent vintage. But as Kevin Knuth points out in his essay, and as I gave examples of through the very existence of physics-enabled open protocols in biochemistry, deeper anthropic mysteries remain afoot, since strictly in terms of what we can see, the nominally "random" laws of physics were in fact direct predecessor steps necessary for life to begin creating its own upward-moving layers of protocols and increased meaning.

        It was a huge mistake to think that DNA-to-amino coding was "random."

        And even if we haven't a clue why yet, it is likely also a huge mistake to assume that it is mere accident that the protocols of physics lead so directly and perfectly into the protocols of life. We just do not understand yet what is going on there, and we likely need to do a better job of fully acknowledging this deeply mysterious coincidence of continuity before we can make any real progress in resolving it.