... and of course, not having a line return at the end of the text URL caused the FQXi software to invalidate the URL by placing its final digit on a separate line. I'm pretty sure that did not show up in the preview, but maybe I just didn't notice it. Tsk, why didn't I anticipate such an obvious bug in advance?... :)

Trying again:

--------------------

To link to the mini-essay titled:

[link:fqxi.org/community/forum/topic/3099#post_145551]The Crowther Criteria for Fundamental Theories of Physics[/link]

... Please copy and paste either the named link above or the direct URL below:

https://fqxi.org/community/forum/topic/3099#post_145551

AAAAARRRRRGGGGGHHHHH!!!!! EVEN IF I WIN THIS IS NOT WORTH $10,000!!

--------------------

To link to the mini-essay titled:

[link:fqxi.org/community/forum/topic/3099#post_145551]The Crowther Criteria for Fundamental Theories of Physics[/link]

... Please copy and paste either the named link above or the direct URL below:

https://fqxi.org/community/forum/topic/3099#post_145551

Mary had a little lamb, I hope it eats the bug...

(Why THANK YOU nice uniformed people that my wife just called in! Yes, I would just LOVE to wear that nice white jacket to help keep my arms from spontaneously beating my own head! Just be sure to send the bill to FQXi!)

    Philip,

    You ask whether there is a distinction between descriptive data (which I interpret as more "English-like") and data that can be reduced through symmetry groups.

    The best answer I can give is that (a) I really don't know, and (b) I nonetheless rather strongly suspect that even the most random-looking descriptive parts of a theory are just finer-scale compositions of symmetry operations. That is because I have difficulty visualizing data compression processes that do not at some level invoke symmetries. Saying that two pieces of data are really one is, after all, just another way of stating a symmetry.
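
    For the software-minded, here is a minimal sketch of what I mean (the code and names are mine, purely illustrative): even the simplest compression scheme, run-length encoding, works only because the data has a symmetry, namely invariance of a symbol under translation along a run. Stating that symmetry once replaces many copies of the data with one.

```python
# Minimal sketch: run-length encoding as symmetry exploitation.
# A run like "AAAA" is unchanged by shifting one symbol along the run;
# RLE states that symmetry once -- ("A", 4) -- instead of repeating data.

def rle_encode(s: str) -> list[tuple[str, int]]:
    runs: list[tuple[str, int]] = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    return "".join(ch * n for ch, n in runs)

data = "AAAABBBCCD"
encoded = rle_encode(data)
print(encoded)                      # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
assert rle_decode(encoded) == data  # "two pieces of data are really one"
```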

    I read your essay and found your approach intriguing and resonant with some of my own perspectives. I was struck in particular by this description:

    "In category theory a generalisation can be formulated using coequivalence and epimorphisms. The assimilation of information is an algebraic process of factorisation and morphisms. In algebraic terms then, the ensemble of all possibilities forms a freely generated structure in a universal algebra. Information about the world that forms part of life experience defines substructures and epimorphisms onto further algebraic structures that represent the possible universes that conform to the observed information."

    The image that passage brought to mind for me was Kolmogorov compression with a focus on free algebras, applied not just to observed data in our universe, but to the definition of all possible universes. Intriguing!

    I note from the book chapter below that there seems to have been some coverage of algebraic generalizations of quantum mechanics (or at least of Hilbert space) in some of the side branches of physics, even if they are not dominant topics in the mainstream:

    Landsman N.P. (2009) Algebraic Quantum Mechanics. In: Greenberger D., Hentschel K., Weinert F. (eds) Compendium of Quantum Physics. Springer, Berlin, Heidelberg.

    Cheers,

    Terry

    Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

    Essayist's Rating Pledge by Terry Bollinger

    You're new here, aren't you? :-)

    Being also retired from the DoD, you must have experienced the difficulties of making legacy software work with rapidly advancing technology and updates to Windows. You're suffering from PTSD.

    The world is catching up with us, Terry.

    Oh yes indeedy! The stories either of us could tell... DISA alone...

    But in recent years I had the true privilege of working almost entirely with (a) Leading-edge commercial tech (I saw Google's Earth tech before Google owned it, and some amazing drones long before anyone had them at home); and (b) AI and robotics research. In short, I got spoiled!

    Fundamental as (Literally) Finding the Cusp of Meaning

    Terry Bollinger, 2018-02-25

    NOTE: The purpose of a mini-essay is to capture some idea, approach, or even a prototype theory that resulted from idea sharing by FQXi Essay contestants. This mini-essay was inspired primarily by two essays:

    The Perception of Order by Noson S Yanofsky

    The Laws of Physics by Kevin H Knuth

    Relevant quotes:

    Yanofsky (in a posting question): "I was wondering about the relationship between Kolmogorov Complexity and Occam's razor? Do simpler things really have lower KC?"

    Knuth: "Today many people make a distinction between situations which are determined or derivable versus those which are accidental or contingent. Unfortunately, the distinction is not as obvious as one might expect or hope."

    Bollinger: "...the more broadly a pattern is found in diverse types of data, the more likely it is to be attached deeply within the infrastructure behind that data. Thus words in Europe lead back 'only' back to Proto-Indo-European, while the spectral element signatures of elements on the other side of the visible universe lead all the way back to the shared particle and space physics of our universe. In many ways, what we really seem to be doing there is (as you note) not so much looking for 'laws' as we are looking for points of shared origins in space and time of such patterns."

    Messages, Senders, Receivers, and Meaning

    All variations of information theory include not just the concept of a message, but also of a sender who creates that message, and of a receiver who receives that message. The sender and receiver share a very special relationship, which is that they both understand the structure of the message in a way that assigns to it yet another distinct concept, which is that of meaning.

    Meaning is the ability to take specific, directed (by the sender) action as the result of receiving the message. Meaning, also called semantics, should never be confused with the message itself, for two reasons. The first is that a message in isolation is nothing more than a meaningless string of bits or other characters. In fact, if the message has been fully optimized -- that is, if it is near its Kolmogorov minimum -- it will look like random noise (the physical incarnation of entropy) to any observer other than the sender and receiver. The second is that the relationship between messages and meaning is highly variable. Depending on how well the sender and receiver "understand" each other, the same meaning can be invoked by messages that vary wildly in length.
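
    For anyone who wants to see that first reason concretely, below is a minimal sketch that uses zlib as a very rough stand-in for a true Kolmogorov-minimal encoder (which is uncomputable). The redundant raw message has low byte-level entropy, while the compressed stream climbs toward the 8 bits/byte of pure noise; without the decompression protocol, noise is exactly what it is.

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# A structured, redundant "message": low entropy, highly compressible.
message = "".join(f"reading {i}: temperature {i % 37} C\n"
                  for i in range(20000)).encode()
packed = zlib.compress(message, 9)

print(f"raw:        {len(message):7d} bytes, {byte_entropy(message):.2f} bits/byte")
print(f"compressed: {len(packed):7d} bytes, {byte_entropy(packed):.2f} bits/byte")
# The compressed stream sits near 8 bits/byte: indistinguishable from
# noise to any receiver lacking the protocol (here, zlib's DEFLATE).
```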

    This message-length variability is a common phenomenon in human relationships. Couples who have lived together for decades often can convey complex meaning by doing nothing more than subtly raising an eyebrow in a particular situation. The very same couple in the distant past might well have argued (exchanged messages) for an hour before reaching the same shared perspective. Meaning and messages are not the same thing!

    But the main question here is this: What makes the sender and receiver so special?

    That is, how does it come to be that they alone can look at a sequence of what looks like random bits or characters, and from it implement meaning, such as real-world outcomes in which exquisitely coordinated movements by the sender and receiver accomplish joint goals that neither could have accomplished on their own?

    In short: How does meaning, that is, the ability to take actions that forever alter the futures of worlds both physical and abstract, come to be attached to a specific subset of all the possible random bit or character strings that could exist?

    Information Theory at the Meta Level

    The answer to how senders and receivers assign meaning to messages is that at some earlier time they received an earlier set of messages that dealt specifically with how to interpret this much later set of messages. Technologists call such earlier deployments of message-interpretation messages protocols, but that is just one name for them. Linguists for example call such shared protocols languages. Couples who have been together for many years just call their highly custom, unique, and exceptionally powerful set of protocols understanding each other.
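
    A toy sketch of that meta-level, with the codes and their meanings invented purely for illustration: the first transmission installs the protocol (a codebook), and only then do the later, much shorter messages carry any meaning at all.

```python
# Toy sketch: a protocol is an earlier message about later messages.
# Step 1: the sender first transmits a codebook (the protocol).
protocol = {0x01: "advance", 0x02: "retreat", 0x03: "hold position"}

# Step 2: later messages are just bytes -- meaningless in isolation...
message = bytes([0x03, 0x01, 0x01])

# Step 3: ...but a receiver holding the earlier protocol can extract
# meaning, i.e. take the specific actions the sender directed.
actions = [protocol[b] for b in message]
print(actions)        # ['hold position', 'advance', 'advance']

# A receiver without the protocol sees only entropy:
print(message.hex())  # '030101' -- no meaning attached
```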

    But it doesn't stop there. Physicists also uncover and identify shared protocols, protocols that they had no part in creating. They have however slowly learned how to interpret some of them, and so can now read some of the messages that these shared protocols enable. Physicists call such literally universal protocols the "laws" of physics, and use them to receive messages literally from the other side of the universe. For example, these shared protocols enable us to look at the lines in light spectra and, amazingly, discern how the same elements that we see on earth can also be entrained within the star-dismantling heat and power of a quasar polar plasma jet billions of light years distant in both space and time.

    Protocols as Meaning Enablers

    While the word "protocol" has a mundane connotation as the rules and regulations by which either people or electronic equipment interact in clear, understandable ways (share information), I would like to elevate the stature of this excellent word by asserting that, at the meta-level where all forms of information theory first acquire their senders and receivers, a protocol is a meaning enabler. That is, to create and distribute a protocol is to create meaning. Protocols enable previously isolated components of the universe, at any scale from fundamental particles to light from distant quasars, to alter their behaviors and adopt new sets of coordinated, "future selective" behaviors that no longer leave the future entirely open to random chance. This in turn means that the more widely a protocol is distributed and used, the "smarter" the universe as a whole becomes. The enhancements can vary enormously in scale and scope, from the tiny sound-like handshakes that enable electrons to pair up and create superconductive materials, through the meaning exchanged by an aging couple, and up to scales that are quite literally universal, such as the shared properties of electrons.

    The fact that those shared electron properties define a protocol can be seen by imagining what would happen if electrons on the other side of the universe did not have the same quantum numbers and properties as the electrons we know. The protocol would be broken, and the light that we see would no longer contain a message that we understand.

    Historically such protocol deficiencies -- that is, a lack or misunderstanding of the protocols that enable us to assign meaning to data -- have been the norm rather than the exception. Even in the case I mentioned earlier of how the electrons-photons-and-elements protocol enabled us to know what elements are in a quasar on the other side of the universe, there was a time in the 1800s when scientists mourned that we would never be able to know the composition of distant stars, which by then they had realized were forever unreachable by any means of transportation they could envision. It was not until the electrons-photons-and-elements protocol was deciphered that the availability of this amazing information became known.

    And even then that new information created its own mysteries! The element helium should have and would have been named "helion" had it been known on earth at the time of its discovery in solar spectra. That is because "-ium" indicates a metal (e.g. titanium), while "-on" indicates a gas (e.g. argon). In this case the newly uncovered electron-photon-element protocol sent us a message we did not yet understand!

    Many more such messages are still awaiting protocols, with biology, especially at the biochemical level, being a huge and profound area in need of more protocols -- more ways to interpret with meaning the data we see. Thus for example, despite our having successfully unearthed the protocol for how DNA codes amino acids and proteins at the connection level, we remain woefully lacking in protocols for understanding how the non-protein components of DNA really work, or even how those amino acids, once strung together, almost magically fold themselves into a working protein.

    Naturally Occurring Protocols

    To understand the full importance of protocols, however, it is vital as Kevin Knuth strongly advocates in his essay that we get away from the human-centric view that calls such discoveries of meaning "laws" in the human sense. In particular, the emergence and expansion of meaning-imbuing protocols is not limited just to relationships between humans (the aging couple) or ending with human-only receivers (we blew it for helion). The largest and most extensive protocols exist entirely independently of humans, in domains that include physics and especially biology.

    In the case of physics, the protocols that count most are the shared properties and allowed operations on those properties that enable matter and energy to interact in a huge variety of extremely interesting, and frankly bizarrely unlikely, ways. Kevin Knuth dives into some of these anthropic issues in his essay, primarily to point out how remarkable and, at this time at least, inexplicable they are. But in any case they exist, almost literally like fine-tuned machinery custom made to enable still more protocols, and thus still more meaning, to emerge over time.

    The First Open-Ended Protocol: Biochemistry

    Chemistry is one such protocol, with carbon-based biochemistry as an example in which the layering of protocols -- the emergence of compounds and processes whose very existence depends on earlier protocols, such as proteins out of amino acids -- is essentially unlimited.

    It is flatly incorrect to view computer software and networks as the first example of open-ended protocols that can be layered to create higher and higher levels of meaning. The first truly open-ended protocols capable of supporting almost unlimited increases in meaning were the remarkable cluster of basic protocols centered around the element carbon. Those elemental protocols -- their subtleties include far more than just carbon, though carbon is literally the "backbone" upon which the higher-level protocols obtain the stability they require to exist at all -- enabled the emergence of layer upon layer of chemical compounds of increasing complexity and sophistication. As exploited by life in particular, these compounds grow so complex that they qualify as exceptionally powerful machines capable of mechanical action (cutting and splicing DNA), energy conversion (photosynthesis), lens-like quantum calculation (chlorophyll complexes), and information storage and replication (DNA again).

    Each of these increasingly complex chemical machines also enables new capabilities, which in turn enable new, more sophisticated protocols -- that is, new ways of interpreting other chemicals as messages. This interplay can become quite profound, and has the same ability to "shorten" messages that is seen in human computer networking. Fruit, for example, responds to the gas ethylene by ripening faster, a protocol that evolved to create enticing (at first!) smells to attract seed-spreading animals. The brevity of the message -- the smallness of the ethylene molecule -- is a pragmatic customization by plants to enable easy spreading of the message.

    Humans do this also. When after an extended effort (think of Yoda after lifting Luke Skywalker's space ship out of the swamp) we inhale deeply through our noses, we are self-dosing with the two-atom vasodilator nitric oxide, which our nasal cavities generate slowly over time for just such purposes.

    Cones Using Shared Protocols (Cusps)

    To understand Kevin Knuth's main message, it's time to take this idea of protocols to the level of physics, where it recursively becomes a fundamental assertion about the nature of fundamental assertions.

    Minkowski, the former professor of Albert Einstein who more than anyone else created the geometric interpretation of Einstein's originally algebraic work, invented the four-dimensional concept of the light cone to describe the maximum limits of how mass, energy, and information spread out over time. A 4D light "cone" does not look like a cone to our 3D-limited human senses. Instead, it appears not as a cone, but as a ball of included space whose spherical surface expands outward at the speed of light. Everything within that expanding ball has potential access to -- that is, detailed information about -- whatever event created that particular cone. The origin of the light cone becomes the cusp of an expanding region that can share all or some subset of the information first generated at that cusp. Note that the cusp itself has a definite location in both space and time, and so qualifies as a well-defined event in spacetime, to use relativistic terminology.
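
    For concreteness, here is a minimal numerical version of that membership test (units chosen so that c = 1): an event lies inside or on the future light cone of the cusp exactly when Δt ≥ 0 and Δt² − |Δx|² ≥ 0.

```python
# Minimal sketch: could information from a cusp event have reached
# another event? Units with c = 1; coordinates are (t, x, y, z).

def in_future_light_cone(cusp, event) -> bool:
    dt = event[0] - cusp[0]
    dist2 = sum((e - c) ** 2 for e, c in zip(event[1:], cusp[1:]))
    return dt >= 0 and dt * dt - dist2 >= 0  # timelike or lightlike

cusp = (0.0, 0.0, 0.0, 0.0)
print(in_future_light_cone(cusp, (2.0, 1.0, 0.0, 0.0)))  # True: inside
print(in_future_light_cone(cusp, (1.0, 3.0, 0.0, 0.0)))  # False: outside
```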

    Protocols are a form of shared information, and so form a subset of the types of information that can be shared by such light cones. The cusp of the light cone becomes the origin of the protocol, the very first location at which it exists. From there it spreads at speeds limited by the speed of light, though most protocols are far less ambitious and travel only slowly. But regardless of how quickly or ubiquitously a new protocol spreads, it must always have a cusp, an origin, an event in spacetime at which it comes into being and thereby creates new meaning within the universe. Whether that meaning is trivial, momentous, weak, powerful, inaccurate, or spot-on remains to be determined, but in general it is the protocols that enable better manipulations of the future that will tend to survive. Meaning grows, with stronger meanings competing against and generally overcoming weaker ones, though as in any ecology the final outcomes are never fixed or certain. The competitive multi-scale ecosystem of meaning, the self-selection of protocols as they vie for receivers who will act upon the messages that they enable, is a fascinating topic in itself, but one for some other place and time.

    In an intentional double entendre, I call these regions of protocol enablement via the earlier spread of protocols within a light cone "cones using shared protocols", or cusps. (I hate all-cap acronyms, don't you?) A protocol cusp is both the entire region of space over which the protocol applies or is available, and the point in spacetime -- the time and location -- at which the protocol originated.

    Levels of Fundamentality as Depths of Protocol Cusps

    And that is where Kevin Knuth's focus on the locality and contingency of many "fundamental" laws comes into play. What we call "laws" are really just instances where we are speculating, with varying levels of confidence, that certain repeated patterns are messages with a protocol that we hope will give them meaning.

    Such speculations can of course be incorrect. However, in some instances they prove to be valid, at least to the degree that we can confirm them from the data. Thus the existence of the Indo-European language group was at first just a speculation, but one that proved remarkably effective at interpreting words in many languages. From it the cusp or origin of this truly massive "protocol" for human communications was given a name: Proto-Indo-European. The location of this protocol cusp in space was most likely the Pontic-Caspian steppe of Eastern Europe, and its time was somewhere between 4,500 BCE and 2,500 BCE.

    Alphabets have cusps. One of the most amazing and precisely located examples is the Korean phonetic alphabet, the Hangul, which was created in the 1400s by Sejong the Great. It is a truly masterful work, one of the best and most accessible phonetic alphabets ever created.

    Life is full of cusps! One of the earliest and most critical cusps was also one of the simplest: the binary choice between the left and right chiral (mirror-image) subsets of amino acids, made literally to prevent confusion as proteins are constructed from them. Once this choice was made it became irrevocable for the entire future history of life, since any organism that went against it faced instant starvation. Even predators cooperate in such situations. The time and place of this cusp remain a deep mystery, one which some (the panspermia hypothesis) would assign to some other part of the galaxy.

    The coding of amino acids by DNA is another incredibly important protocol, one whose features are more easily comparable to the modern communications network concept of a protocol. The DNA-amino protocol is shared with minor deviations by all forms of life, and is very sophisticated: it has been shown to perform superbly at preventing the vast majority of DNA mutations from damaging the corresponding proteins. The odds of that property popping up randomly in the DNA-to-amino-acid translation mechanism are roughly one million to one. I recall from as recently as my college years reading works that disdained this encoding as random and an example of the "stupidity" of nature. It is not, though its existence does provide a proof of how easily stupidity can arise, especially when accompanied by arrogance.
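
    A hedged back-of-the-envelope check of that robustness claim, counting only strictly synonymous single-base changes (the full one-in-a-million result involves chemical similarity of the substituted amino acids as well, not just exact matches):

```python
from itertools import product

# Standard genetic code, codons in TCAG order (NCBI transl_table=1).
BASES = "TCAG"
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDDDEEGG"
CODE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AAS)}

silent = total = 0
for codon, aa in CODE.items():
    for pos in range(3):                    # every single-base mutation
        for base in BASES:
            if base == codon[pos]:
                continue
            mutant = codon[:pos] + base + codon[pos + 1:]
            total += 1
            silent += (CODE[mutant] == aa)  # synonymous: protein unchanged

print(f"{silent}/{total} single-base mutations are silent ({silent/total:.0%})")
# Roughly a quarter of all point mutations change the protein not at all,
# and the code's layout tends to map many of the rest onto chemically
# similar amino acids.
```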

    The Bottom Line for Fundamentality

    In terms of Kevin Knuth's concepts of contingency and context for "fundamental" laws (protocols) and rules, the bottom line in all of this is surprisingly simple:

    The fundamentality of a "law" (protocol for extracting meaning from data) depends on two factors: (1) How far back in time its cusp (origin) resides, and (2) how broadly the protocol is used.

    Thus the reason physics gets plugged so often as having the most fundamental rules and "laws" is that its cusp occurred at the same time as the universe itself, presumably in the big bang, and that its protocols are so widely and deeply embedded that they enable us to "read" messages from the other side of the universe.

    Nearly all other protocol cusps, including those of life, are of a more recent vintage. But as Kevin Knuth points out in his essay, and as I gave examples of above through the very existence of physics-enabled open protocols in biochemistry, deeper anthropic mysteries remain afoot, since strictly in terms of what we can see, the nominally "random" laws of physics were in fact direct predecessor steps necessary for life to begin creating its own upward-moving layers of protocols and increased meaning.

    It was a huge mistake to think that DNA-to-amino coding was "random."

    And even if we haven't a clue why yet, it is likely also a huge mistake to assume that the protocols of physics just happen to lead so directly and perfectly into the protocols of life. We just do not understand yet what is going on there, and we likely need to do a better job of fully acknowledging this deeply mysterious coincidence of continuity before we can make any real progress in resolving it.

      FQXi Essay Contestant Pledge

      Author: Terry Bollinger. Version 1.3, 2018-02-15

      ----------------------------------------

      When evaluating essays from other FQXi Contest participants, I pledge that I will rate and comment on essays based only on the following criteria:

      -- My best, most accurate judgement of the quality of the essay, without regard to how my ratings and comments on that essay could affect my own contest status.

      -- How well the essay makes its argument to back up its answer.

      -- How accurately and reliably an essay uses reference materials.

      -- How focused the essay is on answering the question as posed and intended by FQXi. (This is secondary to criteria above.)

      Furthermore, I will consciously strive to:

      -- Avoid rating an essay low just because it has a novel approach.

      -- Avoid rating an essay low because I disagree with its answer. Instead, I will focus on how well the essay argues for that answer.

      -- Avoid rating an essay high solely because I like its conclusion. Even if I agree, my rating will reflect the overall essay quality.

      -- Avoid ratings inflation. If an essay does very poorly at arguing its conclusion, I pledge to give it the appropriate low rating, versus an inflated "just being nice" number such as a 5 or 6.

      -- Avoid reprisal behavior. I pledge that I will never knowingly assign unfair point ratings or make false comments about another essay as a form of reprisal against another contestant who gave my essay low ratings or negative comments.

      -- Avoid rudeness towards other contestants. If other contestants become abusive, I will appeal to FQXi to intervene, rather than attempt to respond in kind on my own.

      ...btw, the offer to work on a javascript part of the problem still stands.


      Hi Terry,

      I read your mini-essay and like it.

      I consider such mini-essays and/or addenda very helpful - after one has read dozens of different essays with different ideas, I at least need a somewhat more compact summary of the main ideas of the many different authors.

      A couple of thoughts about your mini-essay:

      'Protocols' sounds like a rather mechanical term to capture the distinction between message and meaning. It is really a big puzzle how 'meaning' can arise from rather mechanical processes. 'Meaning' traditionally is connected to awareness of the orderedness of the external reality - and additionally the orderedness of the internal reality of a subject that is capable of being aware of something. With this, the circle of meaning is closed. I suspect that 'meaning' is a tautology somewhat similar to the one I describe in my own addendum to my essay: meaning confirms itself in the same manner as my purported fundamental truths do.

      I think you are totally on the right track to suspect that 'meaning' has exactly the meaning we ascribe to it: by finding some meaning in nature, we find a certain truth that speaks to us through nature. By finding some meaning that we have epistemologically facilitated by means of our preference to emotionally conclude something, we may gain some truth or some falsehood about this 'something'. In summary: whereas meaning about the external reality is more likely to be stable and to point to objective truths, with more subjective conclusions about very specific circumstances - circumstances that do not really justify making a general rule out of them - we are more in danger of concluding something that could be objectively false or at least incomplete.

      Your example of the aging couple is to the point, since it shows that the problem of subjective conclusions and their real meaning is solved over time by compressing the message as far as possible: raising an eyebrow then has a very precise meaning - regardless of whether the couple loves one another or is in permanent confrontation. In either case one's emotions are perfectly understood by the other via the compressed message, which has a well-suited meaning for the couple.

      Interestingly, this could be a complementary example of 'internalizing some external reality as a model, as a set of symbols', as is done when modelling some of the brain's perceptual abilities in order to understand the brain in information-theoretic terms. The raised eyebrow does the complementary thing: it *externalizes* not a model, but a precise emotional state, by means of a compressed and very specific symbol / action. Together with the model one has of the partner's emotional landscape, one can even reliably deduce how to further interpret the raised eyebrow, since in general it can be read as disliking something, and the internal model can further specify what that dislike is specifically about in the actual situation.

      Another interesting aspect of protocols, it seems to me, is that they limit or exclude other possibilities. This is what we all want to achieve by searching for some more fundamental level of nature. Limiting the options that are left makes it easier to determine the more fundamental level.

      Just a couple of thoughts :-)

      Best wishes,

      Stefan Weckbach

      A reply of mine to this comment was unfortunately edited in a mutilating manner. I merely recall that you mentioned fractional calculus. Maybe I should have a look at this, because for instance half differentiation implies boundaries.

      Eckard

      Gordon,

      Thank you for supporting the Pledge!

      Your title is intriguing; look at my signature line and its single-concept definition of QM and you can see why. My queue on this last day is long, but I will follow your link and take a look at your essay.

      Cheers,

      Terry

      Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)

      Essayist's Rating Pledge by Terry Bollinger

      "Quantum mechanics is simpler than most people realize. It is no more and no less than the physics of things for which history has not yet been written."

      Gordon,

      Wow! That is one of the best arguments for locality that I think I've seen. I like your Bell-ish style of writing and focus on specifics. You are of course in very good company, since both Einstein and Bell were localists.

      I can't do a detailed assessment today -- too many equations that would need careful examination to assess your argument meaningfully -- but what I've seen at a quick look seems pretty solid.

      That said, there is an expanding class of pro-entanglement data anomalies that you need somehow to take into account:

      ID230 Infrared Single-Photon Detector Hybrid Gated and Free-Running InGaAs/InP Photon Counter with Extremely Low Dark Count

      This field has moved way beyond the Aspect studies. A lot of hard-nosed business folks figured out years ago that arguments against the existence of entanglement don't matter much if they can simply build devices that violate Bell's inequality. Which they did, and now they sell them to some very smart, physics-savvy customers who use them on a daily basis to encrypt some critical data transmissions. Many of these customers would be, shall we say, upset in interesting ways if some company sold them equipment that did not work.

      Again, thanks for a well-argued essay! I'll try (no promises though) to take a closer look at your essay at some later (post-commenting-close) date. Again assuming the equations are solid, yours is the kind of in-depth analysis needed to sharpen everyone's thinking about such topics.

      Cheers,

      Terry

      Terry -

      That was wonderfully clear and readable, not to mention vast in scope - an excellent summary of what I think are the key issues here. I agree with pretty much everything, except - there's a basic missing piece to your concept of meaning. Naturally, it happens to be what I've been trying to articulate in my essays.

      You write, "To create and distribute a protocol is to create meaning." This describes the aspect of information-processing that's well-understood: data gets transferred from sender to receiver and decoded through shared protocols - a very good term for the whole range from laws of physics to human philosophies. But this concept of meaning takes it for granted that the underlying data is distinguishable: that there are physical contexts - for both sender and receiver - in which the 1's and 0's (or any of the many different kinds of information that actually constitute the physical world), make an observable difference.

      This is hard not to take for granted, I know - both because such contexts are literally everywhere we look, and because it's very difficult to describe them in general terms. But I've argued both on logical grounds and empirically, from "fine-tuning", that it takes an extremely special kind of universe to make any kind of information physically distinguishable.

      The physical world is essentially a recursive system in which information that's distinguished (measured) in one context gets communicated out to help set up other contexts, to distinguish more information. Quite a number of distinct protocols are apparently needed to make this work, and I've tried to sort some of them out in my current essay, to suggest how they might have emerged. In my 2017 essay I compared the way this system works with the other two basic recursive systems that make up our world, biological evolution and human communication.

      Regarding biological and human systems, you're right that there's "natural selection" for meanings that "enable better manipulations of the future." But while this also applies to the evolution of human theories about the physical world, I don't think it's quite right for the generation of meaning in the physical world itself. Rather, the meanings that get selected are the ones that keep on enabling the future itself - that is, that constantly set up new situations in which the same protocol-system can operate to create new meaning.

      I don't mean to detract at all from your remarkable mini-essay - I give it a 10. But please fix your next-to-last sentence. I think you mean that it's a mistake to suppose the protocols of physics just happen to support the protocols of life. That's a complex issue... that can't become clear, I think, until we have some idea where the protocols of physics come from.

      Thanks for your many eye-opening contributions to this contest - once again, I'm in awe.

      Conrad

      Terry,

      In our (Feb 17th) string above we didn't resolve the non-integer spin video matter; the 100 sec video Classic QM. It's just occurred to me that you were after a POLAR spin 1/2, 2, etc.! Now that's not quite what the original analysis implied, but, lest it may have been, YES, the 3 degrees of freedom also produce that.

      Just one y-axis rotation with each polar rotation gives spin 1/2. Imagine the polar axis horizontal. Now rotate around the vertical axis to switch the poles horizontally. HALF a polar rotation at the same time brings your start point back.

      Now a y-axis rotation at HALF that rate means it takes TWO rotations of the polar axis to return to the start point.

      Occam never made a simpler razor! It's a unique quality of a sphere that there's no polar axis momentum loss from y or z axis rotations.

      Was there anything else? (Apart from confusing the random number distributions explained in Philip's essay with real 'action at a distance'!) Of course tomography works, but within strict distance limits. Just checked through Karen's list again and can't find one the DFM doesn't qualify for, apart from a few particle physics bits. Can you check & see if I can stop digging now and leave those to the HEP specialists!?

      Peter

      PS; Not sure if that link hasn't suddenly died!

        Hi,

        This is a wonderful essay, with deep fundamental knowledge. I am impressed.

        Nothing to ask for now.

        Ulla Mattfolk https://fqxi.org/community/forum/topic/3093