Terry -

That was wonderfully clear and readable, not to mention vast in scope - an excellent summary of what I think are the key issues here. I agree with pretty much everything, except - there's a basic missing piece to your concept of meaning. Naturally, it happens to be what I've been trying to articulate in my essays.

You write, "To create and distribute a protocol is to create meaning." This describes the aspect of information-processing that's well-understood: data gets transferred from sender to receiver and decoded through shared protocols - a very good term for the whole range from laws of physics to human philosophies. But this concept of meaning takes it for granted that the underlying data is distinguishable: that there are physical contexts - for both sender and receiver - in which the 1's and 0's (or any of the many different kinds of information that actually constitute the physical world) make an observable difference.

This is hard not to take for granted, I know - both because such contexts are literally everywhere we look, and because it's very difficult to describe them in general terms. But I've argued both on logical grounds and empirically, from "fine-tuning", that it takes an extremely special kind of universe to make any kind of information physically distinguishable.

The physical world is essentially a recursive system in which information that's distinguished (measured) in one context gets communicated out to help set up other contexts, to distinguish more information. Quite a number of distinct protocols are apparently needed to make this work, and I've tried to sort some of them out in my current essay, to suggest how they might have emerged. In my 2017 essay I compared the way this system works with the other two basic recursive systems that make up our world, biological evolution and human communication.

Regarding biological and human systems, you're right that there's "natural selection" for meanings that "enable better manipulations of the future." But while this also applies to the evolution of human theories about the physical world, I don't think it's quite right for the generation of meaning in the physical world itself. Rather, the meanings that get selected are the ones that keep on enabling the future itself - that is, that constantly set up new situations in which the same protocol-system can operate to create new meaning.

I don't mean to detract at all from your remarkable mini-essay - I give it a 10. But please fix your next-to-last sentence. I think you mean that it's a mistake to suppose the protocols of physics just happen to support the protocols of life. That's a complex issue... that can't become clear, I think, until we have some idea where the protocols of physics come from.

Thanks for your many eye-opening contributions to this contest - once again, I'm in awe.

Conrad

Terry,

In our (Feb 17th) string above we didn't resolve the non-integer spin matter from the 100 sec 'Classic QM' video. It's just occurred to me that you were after a POLAR spin 1/2, 2, etc.! Now that's not quite what the original analysis implied, but, in case it may have been, YES, the 3 degrees of freedom also produce that.

Just one y-axis rotation with each polar rotation gives spin 1/2. Imagine the polar axis horizontal. Now rotate around the vertical axis to swap the poles horizontally. HALF a polar rotation at the same time brings your start point back.

Now a y-axis rotation at HALF that rate means it takes TWO rotations of the polar axis to return to the start point.

Occam never made a simpler razor! It's a unique quality of a sphere that there's no polar axis momentum loss from y or z axis rotations.

Was there anything else? (Apart from confusing random number distributions, explained in Phillips's essay, with real 'action at a distance'!) Of course tomography works, but within strict distance limits. Just checked through Karen's list again and can't find one that the DFM doesn't qualify for, apart from a few particle physics bits. Can you check & see if I can stop digging now and leave those to the HEP specialists!?

Peter

PS: Not sure whether that link has suddenly died!

    Hi,

    This is a wonderful essay, with deep fundamental knowledge. I am impressed.

    Nothing to ask for now.

    Ulla Mattfolk https://fqxi.org/community/forum/topic/3093

      The Illusion of Mathematical Formality

      Terry Bollinger, 2018-02-26

      Abstract. Quick: What is the most fundamental and least changing set of concepts in the universe? If you answered "mathematics," you are not alone. In this mini-essay I argue that far from being eternal, formal statements are actually fragile, prematurely terminated first steps in perturbative sequences that derive ultimately from two unique and defining features of the physics of our universe: multi-scale, multi-domain sparseness and multi-scale, multi-domain clumping. The illusion that formal statements exist independently of physics is enhanced by the clever cognitive designs of our mammalian brains, which latch on quickly to first-order approximations that help us respond effectively to survival challenges. I conclude by recommending recognition of the probabilistic infrastructure of mathematical formalisms as a way to enhance, rather than reduce, their generality and analytical power. This recognition makes efficiency into a first-order heuristic for uncovering powerful formalisms, and transforms the incorporation of a statistical method such as Monte Carlo into formal systems from being a "cheat" into an integrated concept that helps us understand the limits and implications of the formalism at a deeper level. It is not an accident, for example, that quantum mechanics simulations benefit hugely from probabilistic methods.

      ----------------------------------------

      NOTE: A mini-essay is my attempt to capture and make more readily available an idea, approach, or prototype theory that was inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

      1. When do we stop digging? Conditions on a fundamental theory of physics by Karen Crowther

      2. The Crowther Criteria for Fundamental Theories of Physics

      3. On the Fundamentality of Meaning by Brian D Josephson

      4. What does it take to be physically fundamental by Conrad Dale Johnson

      5. The Laws of Physics by Kevin H Knuth

      Additional non-FQXi references are listed at the end of this mini-essay.

      ----------------------------------------

      Background: Letters from a Sparse and Clumpy Universe

      Sparseness6 occurs when some space, such as a matrix or the state of Montana, is occupied by only a thin scattering of entities, e.g. non-zero numbers in the matrix or people in Montana. A clump is a compact group of smaller entities (often clumps themselves of some other type) that "stick together" well enough to persist over time. A clump can be abstract, but if it is composed of matter we call it an object. Not surprisingly, sparseness and clumping tend to be closely linked, since clumps often are the entities that occupy positions in some sparse space.
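      As a toy illustration of that definition (my own sketch, not part of the original text; the grid and its numbers are invented), sparseness can be measured as the fraction of occupied cells in a space:

        # Toy sketch: sparseness as the fraction of occupied cells in a grid.
        # The grid below is a made-up 5x5 "Montana" with three occupied cells.

        def sparseness(grid):
            """Fraction of cells that are occupied (non-zero)."""
            cells = [cell for row in grid for cell in row]
            return sum(1 for cell in cells if cell != 0) / len(cells)

        montana = [[0, 0, 0, 0, 0],
                   [0, 1, 0, 0, 0],
                   [0, 0, 0, 1, 0],
                   [0, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0]]

        print(sparseness(montana))   # 0.12 -- a thinly occupied, i.e. sparse, space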

      Sparseness and clumping occur at multiple size scales in our universe, using a variety of mechanisms, and when life is included, at varying levels of abstraction. Space itself provides a universal basis for creating sparseness at multiple size scales, yet the very existence of large expanses of extremely "flat" space is still considered one of the greatest mysteries in physics, an exquisitely knife-edged balancing act between total collapse and hyper expansion.

      Clumping is strangely complex, involving multiple forces at multiple scales of size. Gravity reigns supreme for cosmic-level clumping, from involvement (not yet understood) in the 10-billion-light-year-diameter Hercules-Corona Borealis Great Wall down to kilometer-scale gravel asteroids that just barely hold together. From there a dramatically weakened form of the electromagnetic force takes over, providing bindings that fall under the bailiwick of chemistry and chemical bonding. (The unbridled electric force is so powerful it would obliterate even large gravitationally bound objects.) Below that level the full electric force reigns, creating the clumps we call atoms. Next down in scale is yet another example of a dramatically weakened force, which is the pion-mediated version of the strong force that holds neutrons and protons together to give us the chemical elements. The protons and neutrons, as well as other more transient particles, are the clumps created by the full, unbridled application of the strong force. At that point known clumping ends... or does it? The quarks themselves notoriously appear to be constructed from still smaller entities, since for example they all use multiples of a mysterious 1/3 electric charge, bound together by unknown means at unknown scales. How exactly the quarks have such clump-like properties remains a mystery.

      Nobel Laureate Brian Josephson3 speculates that at least for higher-level domains such as biology and sociology, the emergence of a form of stability that is either akin to or leads to clumping is always the result of two or more entities that oppose and cancel each other in ways that create or leave behind a more durable structure. This intriguing concept can be translated in a surprisingly direct way to the physics of clumping and sparseness in our universe. For example, the mutually cancelling positive and negative charges of a proton and an electron can combine to leave an enduring and far less reactive result, a hydrogen atom, that in turn supports clumping through a vastly moderated presentation of the electric forces that it largely cancels. More generally, the hydrogen atom is an example of incomplete cancellation, that is, cancellation of only a subset of the properties of two similar but non-identical entities. The result qualifies as "scaffolding" in the Josephson sense due to its relative neutrality, which allows it for example to be a part of chemical compounds that would be instantly shredded by the full power of the mostly-cancelled electric force. Physics has many examples of this kind of incomplete cancellation, ranging from quarks that mutually cancel the overwhelming strong force to leave milder protons and neutrons, to protons and electrons that then cancel to leave charge-free hydrogen atoms, to unfilled electron states that combine to create stable chemical bonds, to hydrogen and hydroxide groups on amino acids that combine to enable the chains known as proteins. At higher levels of complexity, almost any phenomenon that reaches an equilibrium state tends to produce a more stable, enduring outcome. The equilibrium state that compression-resistant matter and ever-pulling gravity reach at the surface of a planet is another more subtle example, one that leads to a relatively stable environment that is conducive to, for example, us.

      Bonus Insert: Space and gravity as emerging from hidden unified-force cancellations

      It is interesting to speculate whether the flatness of space could itself be an outcome of some well-hidden form of partial cancellation.

      If so, it would mean that violent opposing forces of some type of which we are completely unaware (or have completely misunderstood) largely cancelled each other out except for a far milder residual, that being the scaffolding that we call "flat space." This would be a completely different approach to the flat space problem, but one that could have support from existing data if that data were examined from Josephson's perspective of stable infrastructure emerging from the mutual cancellation of far more energetic forces.

      The forces that cancelled would almost certainly still be present in milder forms, however, just as the electric force continues to show up in milder forms in atoms. Thus if the Josephson effect -- ah, sorry, that phrase is already taken -- if the Josephson synthesis model applies to space itself, then the mutually cancelling forces that led to flat space may well already be known to us, just not in their most complete and ferocious forms. Furthermore, if these space-generating forces are related to the known strong and electric forces -- or more likely, to the Standard Model combination of them with the weak force -- then such a synthesis would provide an entirely new approach to unifying gravity with the other three forces.

      Thus the full hypothesis in summary: Via Josephson synthesis, it is speculated that ordinary xyz space is a residual structural remnant, a scaffolding, generated by the nearly complete cancellation of two oppositely signed versions of the unified weak-electric-strong force of the Standard Model. Gravity then becomes not another boson force, but a topological effect applied by matter to the "surface of cancellation" of the unified Standard Model forces.

      Back to Math: Is Fundamental Physics Always Formal?

      In her superb FQXi essay When do we stop digging? Conditions on a fundamental theory of physics, Karen Crowther1 also created an exceptionally useful product for broader use, The Crowther Criteria for Fundamental Theories of Physics.2 It is a list of nine succinctly stated criteria that in her assessment need to be met by a physics theory before it can qualify as fundamental.

      There was however one criterion in her list about which I was uncertain, the fourth one:

      CC#4. Non-perturbative: Its formalisms should be exactly solvable rather than probabilistic.

      I was ambivalent when I first read that one, but I was also unsure why I felt ambivalent. Was it because one of the most phenomenally accurate predictive theories in all of physics, Feynman's Quantum ElectroDynamics or QED, is so deeply dependent on perturbative methods? Or was it the difficulty that many fields and methods have in coming up with closed equations? I wanted to understand: if exactly solvable equations were the "way to go" in physics for truly fundamental results, why then were some of the most successful theories in physics perturbative? What does that really imply?

      As it turns out, both the multi-scale clumpiness and sparseness of our universe are relevant to this question because they lurk behind such powerful mathematical concepts as renormalization. Renormalization is not really as exotic or even as mathematical as it seems in, say, Feynman's QED theory. What it really amounts to is an assertion that our universe is, at many levels, "clumpy enough" that many objects (and processes) within it can be approximated when viewed from a distance. That "distance" may be real space or some other more abstract space, but the bottom line is that this sort of approximation option is a deep component of whatever is going on. I say that in part because we ourselves, as discrete, independently mobile entities, are very much part of this clumpiness, as are the large, complex molecules that make up our bodies... as are the atoms that enable molecules... as are the nucleons that enable atoms... and as are the fundamental fermions that make up nucleons.

      This approximation-at-a-distance even shows up in everyday life and cognition. For example, let's say you need an AA battery. What do you think first? Probably you think "I need to go to the room where I keep my batteries." But your navigation to that room begins as room-to-room navigation. You don't worry yet about exactly where in that room the batteries are, because that has no effect on how you navigate to the room. In short, you will approximate the location of the battery until you navigate closer to it.

      The point is that the room is itself clumpy in a way that enables you to do this, but the process itself is clearly approximate. You could in principle super-optimize your walking path so that it minimizes your total effort to get to the battery, but such a super-optimization would be extremely costly in terms of the thinking and calculations needed, and yet would provide very little benefit. So, when the cost-benefit ratio grows too high, we approximate rather than super-optimize, because the clumpy structure of our universe makes such approximations much more cost-beneficial overall.
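      Here is a minimal sketch of that cost argument (my own illustration; the room and drawer counts are invented), comparing "flat" planning over every fine-grained spot in the house with room-first, then drawer-level planning:

        # Toy cost model (illustrative numbers only): flat fine-grained planning
        # versus hierarchical room-then-drawer planning.

        rooms = 10            # coarse-level choices (which room)
        spots_per_room = 500  # fine-level choices (where within a room)

        flat_cost = rooms * spots_per_room          # consider every spot in the house at once
        hierarchical_cost = rooms + spots_per_room  # pick a room first, then search only that room

        print(flat_cost)          # 5000
        print(hierarchical_cost)  # 510 -- the clumpy structure makes approximation cheap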

      What happens after you reach the room? You change scale!

      That is, you invoke a new model that tells you how to navigate the drawers or containers in which you keep the AA batteries. This scale is physically smaller, and again is approximate, enabling tolerance for example of highly variable locations of the batteries within a drawer or container.

      This works for the same reason that Feynman's QED is incredibly accurate and efficient at modeling an electron probabilistically. The electron-at-a-distance can be safely and very efficiently modeled as a point particle with a well-defined charge, even though that is not really correct. That is the room-to-room level. As you get closer to the electron, that model must be replaced by a far more complex one that involves rapid creation and annihilation of charged virtual particle pairs that "blur" the charge of the electron in strange and peculiar ways. That is the closer, smaller, dig-around-in-the-drawers-for-a-battery level of approximation. In both cases, the overall clumpiness of our universe makes these special forms of approximation both very accurate and computationally efficient.

      At some deeper level, one could further postulate that this may be more than just a way to model reality. It is at least possible (I personally think it probable) that this is also how the universe actually works, even if we don't quite understand how. I say that because it is always a bit dangerous to forget that, however much we like to model space as a given and particles as points within it, those are in the end just models, ones that actually violate quantum mechanics in the sense of postulating points that cannot exist in real space due to the quantum energy cost involved. A real point particle would require infinite energy to isolate, so a model that invokes such particles to estimate reality really should be viewed with a bit of caution as a "final" model.

      So bottom line: While Karen Crowther's Criterion #4 makes excellent sense as a goal, our universe seems weirdly wired for at least some forms of approximation. I find that very counterintuitive, deeply fascinating, and likely important in some way that we flatly do not yet understand.

      Perturbation Versus Formality in Terms of Computation Costs

      Here is a hypothesis:

      In the absence of perturbative opportunities, the computational cost of fully formal methods for complete, end-to-end solutions trends towards infinity.

      The informal proof is that full formalization implies fully parallel combinatorial interaction of all components of a path (functional) in some space, that being XYZ space in the case of approaching an electron. The computational cost of this fully parallel optimization then increases both with decreasing granularity of the path segment sizes used, and with path length. The granularity is the most important parameter, with the cost rapidly escalating towards infinity as the precision (inverse of segment length) increases towards the limit of representing the path as an infinitely precise continuum of infinitely precise points.

      Conversely, the ability to use larger segments instead of infinitesimals depends on the scale structure of the problem. If that scale structure enables multiscale renormalization, then the total computational cost remains at least roughly proportional to the level of precision desired. If no such scale structure is available, the cost instead escalates towards infinity.
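      A toy version of that scaling claim (my own sketch, with assumed numbers): cutting a path of length L uniformly into segments of size h costs about L/h segments, which diverges as h shrinks, whereas refining only where the extra precision is actually needed grows far more slowly:

        # Toy scaling comparison (illustrative only): uniform fine-graining vs.
        # multiscale refinement of a single region of the path.
        import math

        L = 1.0  # total path length, arbitrary units

        def uniform_cost(h):
            """Segments needed if the whole path is cut into pieces of size h."""
            return L / h

        def multiscale_cost(h, coarse=0.25):
            """Segments if only one coarse region is refined, halving the size per level."""
            levels = max(0, math.ceil(math.log2(coarse / h)))
            return L / coarse + 2 * levels   # a few coarse segments plus a short refined ladder

        for h in [1e-1, 1e-3, 1e-6, 1e-9]:
            print(h, uniform_cost(h), multiscale_cost(h))
        # the uniform cost blows up toward infinity; the multiscale cost grows only slowly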

      But isn't the whole point of closed formal solutions that they remain (roughly) linear in computational cost versus the desired level of precision?

      Yes... but what if the mathematical entities we call "formal solutions" are actually nothing more than the highest-impact granularities of what are really just perturbative solutions made possible by the pre-existing structure of our universe?

      Look for example at gravity equations, which treat stars and planets as point-like masses. However, that approximation completely falls apart at the scale of a planet's surface, and so is only the first and highest-level step in what is really a perturbative solution. It's just that our universe is pre-structured in a way that makes many such first steps so powerful and so broadly applicable that it allows us to pretend they are complete, stand-alone formal solutions.
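      A quick numerical illustration (mine, using standard constants for Earth): the point-mass first step reproduces surface gravity nicely, but it says nothing correct about what happens at or below the surface, which is exactly where finer perturbative steps have to take over:

        # Point-mass gravity as the "first perturbative step": g = G*M / r^2.
        # Standard constants for Earth; the planet is treated as a single point.

        G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
        M_EARTH = 5.972e24   # kg
        R_EARTH = 6.371e6    # m

        def g_point_mass(r):
            """Acceleration from treating the whole planet as a point mass at distance r."""
            return G * M_EARTH / r**2

        print(g_point_mass(R_EARTH))      # ~9.8 m/s^2: the first step works well at the surface
        print(g_point_mass(R_EARTH / 2))  # ~39 m/s^2: wrong inside the planet, where the
                                          # point-mass step must give way to a finer model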

      A More Radical Physics Hypothesis

      All of this leads to a more radical hypothesis about formalisms in physics, which is this:

      All formal solutions in physics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing clumpy structure of our universe.

      But on closer examination, even the above hypothesis is incomplete. Another factor that needs to be taken into account is the neural structure of human brains, and how they are optimized.

      The Role of Human Cognition

      Human cognition must rely on bio-circuitry that has very limited speed, capacity, and accuracy. It therefore relies very heavily in the mathematical domain on using Kolmogorov programs to represent useful patterns that we see in the physical world, since a Kolmogorov program only needs to be executed to the level of precision actually needed.
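      A minimal sketch of that idea (my own, using only the Python standard library): a few lines of code can stand in for an "infinite" object such as the decimal expansion of the square root of 2, and are only ever run out to the precision actually requested:

        # A tiny "Kolmogorov program" for sqrt(2): a short program that can be
        # elaborated to any requested precision, instead of an infinite digit string.
        from decimal import Decimal, getcontext

        def sqrt2(significant_digits):
            """Evaluate sqrt(2) to exactly the precision requested -- and no further."""
            getcontext().prec = significant_digits
            return Decimal(2).sqrt()

        print(sqrt2(10))   # 1.414213562
        print(sqrt2(50))   # same short program, run further only because more was asked for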

      Furthermore, it is easier and more compact to process suites of such human-brain-resident Kolmogorov programs as the primary data components for reasoning about complexity, as opposed to using their full elaborations into voluminous data sets that are more often than not beyond neural capacities. In addition to shrinking data set sizes, reasoning at the Kolmogorov program level has the huge advantage that such programs capture in direct form at least many of the regularities in such data sets, which in turn allows much more insightful comparisons across programs.

      We call this "mathematics."

      The danger in not recognizing mathematics as a form of Kolmogorov program creation, manipulation, and execution is that as biological intelligences, we are by design inclined to accept such programs as representing the full, to-the-limit forms of the represented data sets. Thus the Greeks assumed the Platonic reality of perfect planes, when in fact the physical world is composed of atoms that make such planes flatly impossible. The world of realizable planes is instead emphatically and decisively perturbative, allowing the full concept of "a plane" to exist only as an unobtainable limit of the isolated, highest-level initial calculations. The reality of such planes falls apart completely when the complete, perturbative, multi-step model is renormalized down to the atomic level.

      That is to say, exactly as with physics, the perfect abstractions of mathematics are nothing more than top-level stages of perturbative programs made possible by the pre-existing structure of our universe.

      The proof of this is that whenever you try to compute such a formal solution, you are forced to deal with issues such as scale or precision. This in turn means that the abstract Kolmogorov representations of such concepts never really represent their end limits, but instead translate into huge spectra of precision levels that approach the infinite limit to whatever degree is desired, but only at a cost that increases with the level of precision. The perfection of mathematics is just an illusion, one engendered by the survival-focused priorities of how our limited biological brains deal with complexity.

      Clumpiness and Mathematics

      The bottom line is this even broader hypothesis:

      All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "clumpy" structure of our universe.

      In physics, even an equation such as E = mc², which holds absolutely at large scales, cannot be interpreted "as is" at the quantum level, where virtual particle pairs distort the very definition of where mass is located. E = mc² is thus more accurately understood as a high-level subset of a multi-scale perturbative process, rather than as a complete, stand-alone solution.

      In mathematics, the very concept of an infinitesimal is a limit that can never be reached by calculation or by physical example. That makes the very foundations of real mathematics into a calculus not of real values, but of sets of Kolmogorov programs for which the limits of execution are being intentionally ignored. Given the indifference to, and often outright unawareness of, the implementation spectra that are necessarily associated with all such formalisms, is it really that much of a surprise how often unexpected infinities plague problems in both physics and math? Explicit awareness of this issue changes the approach and even the understanding of what is being done; math in general becomes a calculus of operators, of programs, rather than of absolute limits and concepts.

      One of the most fascinating implications of the hypothesis that all math equations ultimately trace back to the clumpiness and sparseness of the physical universe is that heuristic methods can become integral parts of such equations. In particular they should be usable in contexts where a "no limits" formal statement overextends computation in directions that have no real impact on the final solution. This makes methods such as Monte Carlo into first-order options for expressing a situation correctly. As one example, papers by Jean Michel Sellier7 show how the carefully structured "signed particle" applications of Monte Carlo methods can dramatically reduce the computation costs of quantum simulation. Such syntheses of theory (signed particles and negative probabilities) with statistical methods (Monte Carlo) promise not only practical algorithmic benefits, but also deeper insights into the nature of quantum wavefunctions themselves.
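      A bare-bones illustration of the general point (my own sketch, not Sellier's signed-particle method): a plain Monte Carlo estimate trades a "no limits" exact integral for an answer whose accuracy scales with the sampling effort actually spent, roughly as 1/sqrt(N):

        # Plain Monte Carlo estimate of a definite integral, standing in for the idea
        # that statistical methods can be first-order parts of a formalism, not a "cheat".
        import math
        import random

        def mc_integral(f, a, b, n):
            """Estimate the integral of f over [a, b] from n uniform random samples."""
            total = sum(f(random.uniform(a, b)) for _ in range(n))
            return (b - a) * total / n

        exact = 2.0  # integral of sin(x) over [0, pi]
        for n in [100, 10_000, 1_000_000]:
            estimate = mc_integral(math.sin, 0.0, math.pi, n)
            print(n, estimate, abs(estimate - exact))  # error shrinks roughly like 1/sqrt(n)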

      Possible Future Expansions of this Mini-Essay

      My time for posting this mini-essay here is growing short. Most of the above arguments are the original stream-of-thought arguments that led to my overall conclusion. As my abstract shows, I have a great many more thoughts to add, but likely not enough time to add them. I will therefore post the following link to a public Google Drive folder I've set up for FQXi-related postings.

      If this is OK with FQXi -- basically if they do not strip out the URL below, and I'm perfectly fine if they do -- then I may post updated versions of this and other mini-essays in this folder in the future:

      Terry Bollinger's FQXi Updates Folder

      ----------------------------------------

      Non-FQXi References

      6. Lin, H. W., Tegmark, M., and Rolnick, D. Why does deep and cheap learning work so well? Journal of Statistical Physics, Springer, 168:1223-1247 (2017).

      7. Jean Michel Sellier. A Signed Particle Formulation of Non-Relativistic Quantum Mechanics. Journal of Computational Physics, 297:254-265 (2015).


      An Exceptionally Simple Space-As-Entanglement Theory

      Terry Bollinger, 2018-02-26

      Abstract. There has been quite a bit of attention in recent years to what has been called the holographic universe. This concept, which originated somehow from string theory (!), postulates that the universe is some kind of holographic image, rather than the 3D space we see. Fundamental to this idea is space as entanglement, that is, that the fabric of space is built out of the mysterious "spooky action" links that Einstein so disdained. In keeping with its string theory origins, the holographic universe also dives down to the Planck foam level. The point of this mini-essay is that except for the point about space being composed of entanglements between particles, none of this complexity is needed: there are no holograms, and there is no need for the energetically impossible Planck foam. All you need is group entanglement of the conjugate of particle spin, which is an overlooked "ghost direction" orthogonal to spin. Particles form a mutually relative consensus on these directions (see Karl Coryat's Pillar #3) that allows them to ensure conservation of angular momentum, and that consensus becomes xyz space. Instead of a complicated hologram, its structure is that of an exceptionally simple direct-link web that interlinks all of the participating particles. It is no more detailed than it needs to be, and its level of detail is determined solely by how many particles participate in the overall direction consensus. Finally, it is rigid in order to protect and preserve angular momentum, since the overriding goal in all forms of quantum entanglement is absolute conservation of some quantum number.

      ----------------------------------------

      NOTE: A mini-essay is my attempt to capture an idea, approach, or prototype theory inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

      1. The Four Pillars of Fundamentality by Karl Coryat

      ----------------------------------------

      Introduction

      For this mini-essay I think the original text gives the thought pretty well "as is," so I am simply quoting it below. My thanks again to Karl Coryat for a fun-to-read and very stimulating essay.

      A quote from my assessment of Karl Coryat's Pillar #3

      If space is the fabric of relations, if some vast set of relations spread out literally across the cosmos, defining the cosmos, are the true start of reality instead of the deceptive isolation of objects that these relations then make possible, what are the components of that relation? What are the "bits" of space?

      I don't think we know, but I assure you it's not composed of some almost infinite number of 10^-35 meter bubbles of Planck foam. Planck foam is nothing more than an out-of-range, unbelievably extrapolated extremum created by pushing to an energetically impossible limit the rules of observation that have physical meaning only at much lower energies. I suspect that the real components of space are much simpler, calmer, quieter, less energetic, and well, space-like than that terrifying end-of-all-things violence that is so casually called "Planck foam."

      I'll even venture a guess. You heard it here first... :)

      My own guess is that the units of space are nothing more radical than the action (Planck) conjugation complements of the angular momenta of all particles. That is, units of pure direction, which is all that is left after angular momentum scarfs up all of the usual joule-second units of action, leaving only something that at first glance looks like an empty set. On closer examination, though, a given spin must leave something behind to distinguish itself from other particle spins, and that "something" is the orientation of the spin in 3-space, a ghostly orthogonality to the spin plane of the particle. But more importantly, it would have to be cooperatively, relationally shared with every other particle in the vicinity and beyond, so that their differences remain valid. Space would become a consensus fabric of directional relationships, one in which all the particles have agreed to share the same mutually relative coordinate system -- that is, to share the same space. This direction consensus would be a group-level form of entanglement, and because entanglement is unbelievably unforgiving about conservation of conserved quantum numbers such as spin, it would also be extraordinarily rigid, as space should be. Only over extreme ranges would it bend much, to give gravity, which thus would not be an ordinary quantum force like photon-mediated electromagnetism. It would also be loosely akin to the "holographic" concept of space as entanglement, but this version is hugely simpler and much more direct, since neither holography, nor higher dimensions, nor Planck-level elaborations are required. The entanglements of the particles just create a simple, easily understood 3-space network linking all nodes (particles).

      But space cannot possibly be composed of such a sparse, incomplete network, right?

      After all, space is also infinitely detailed as well as extremely rigid, so there surely are not enough particles in the universe to define space in sufficient detail! Many would in fact argue that this is precisely why any phenomenon that creates space itself must operate at the Planck scale of 10^-35 meters, so that the incredible detail needed for 3-space can be realized.

      Really? Why?

      If only 10 objects existed in the universe, each a meter across, why would you need a level of detail that is, say, 20 orders of magnitude finer for them to interact meaningfully and precisely with each other? You would still be able to access much higher levels of relational detail, but only by asking for more detail, specifically by applying a level of energy proportional to the level of detail you desired. Taking things to the absolute limit first is an incredibly wasteful procedure, and incidentally, it is emphatically not what we see in quantum mechanics, where every observation has a cost that depends on the level of detail desired, and even then only at the time of the observation. There are good and deeply fundamental quantum reasons why the Large Hadron Collider (LHC) that found the Higgs boson is 8.6 km in diameter!
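      To make that last point concrete (my addition; this is just the standard quantum relation between probe energy and resolvable length, not something from the original post), the finest length scale a probe of energy E can resolve is roughly

          \lambda \;\sim\; \frac{\hbar c}{E}

      so resolving ~10^-19 m already costs TeV-scale energies (hence the LHC), while resolving the 10^-35 m Planck scale would cost on the order of 10^19 GeV, hopelessly beyond any conceivable accelerator.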

      The bottom line is that in terms of as-needed levels of detail, you can build up a very-low-energy universal "directional condensate" space using the spins of nothing more than the set of particles that exist in that space. It does not matter how sparse or dense those particles are, since you only need to make space "real" for the relationships that exist between those particles. If for example your universe has only two particles in it, you only need one line of space (Oscillatorland!) to define their relationship. Defining more space outside of that line is not necessary, for the simple reason that no other objects with which to relate exist outside of that line.

      So regardless of how space comes to be -- my example above mostly shows what is possible and what kinds of relationships are required -- its very existence makes the concept of relations between entities as fundamental as it gets. You don't end with relations, you start with them.

      Conclusions

      Quite a few people reading this likely do not even believe in entanglement! So for you I am the cheerful ultimate heretic, the fellow who not only believes fervently in the reality of entanglement, but would make it literally into the very fabric of space itself. Sorry about that, but I hope you can respect that I have my reasons, just as I very much respect localism. Two of my top physicist favorites of all time, Einstein and Bell, were both adamant localists!

      If you are a holographic universe type, I hope you will at least think about some of what I've said here. I developed these ideas in isolation from your community, and frankly was astonished when I finally realized that such a community existed. I deeply and sincerely believe that you have a good and important idea there, but history has convoluted it in very unfortunate ways. Take a stab at my much simpler 3D web approach, and I think interesting things could start popping out fairly quickly.

      If you are a MOND or dark matter enthusiast, think about the implications of space being a direct function of the presence or absence of matter. One of my very first speculations on this topic was that as this fabric of entanglement thins, you could very well get effects relevant to the anomalies that both MOND and dark matter attempt to explain.

      Finally, I gave this fabric a name a long time ago, a name with which I pay respect to a very great physicist who literally did not get respect: Boltzmann. I call this 3D fabric of entanglements the Boltzmann fabric, represented (I can't do it here) by a lower-case beta with a capital F subscript. His entropic concepts of time become cosmic through this fabric.

      It's Time to Get Back to Real String Theory

      Terry Bollinger, 2018-02-26

      Abstract. There is a real string theory. It is experimentally accessible and verifiable, at scales comparable to ordinary baryons and mesons, as opposed to the energetically impossible Planck foam version of string theory. It most likely has perhaps 16 or so solutions, as opposed to the 10^500 vacua of Planck foam string theory. It was abandoned in 1974. It's time we got back to it.

      ----------------------------------------

      NOTE: A mini-essay is my attempt to capture an idea, approach, or prototype theory inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

      A well-founded formulation for quantum chromo- and electro-dynamics by Wayne R Lundberg

      ----------------------------------------

      A Long Quote from my Lundberg Essay Assessment

      Most folks aren't aware of it, but nucleons like protons and neutrons have additional spin states that appear like heavier particles built from the same set of quarks. Thus in addition to uud forming a spin 1/2 proton, the same three quarks can also form a heavier particle with spin 3/2 (1 added unit of spin) and spin 5/2 (2 added units of spin). These three variations form a lovely straight line when plotted as mass versus spin, which in turn implies a fascinatingly regular relationship between mass and nucleon spin.

      These lines are called Regge trajectories, and back in the late 1960s and early 1970s they looked like a promising hint for how to unify the particle zoo. Analyses of Regge trajectories indicated string-like stable resonance states were creating the extreme regularity of the Regge trajectories. These "strings" consisted of something very real, the strong force, and their vibrations were highly constrained by something equally real, the quarks that composed the nucleons (and also mesons, which also have Regge trajectories). These boson-like resonances of a string-like incarnation of the strong force were highly unexpected, extremely interesting, and experimentally accessible. Theorists were optimistic.
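      For readers who want the textbook form of those straight lines (my addition, not part of the original exchange), a Regge trajectory is usually written as spin rising linearly with mass squared,

          J \;=\; \alpha(0) + \alpha' M^{2}, \qquad \alpha' \approx 0.9~\mathrm{GeV}^{-2}\ \text{(typical fits)},

      with roughly the same slope alpha' appearing across the meson and baryon families.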

      Then it all went to Planck.

      Specifically, the following paper caught on like wildfire (slow wildfire!) and ended up obliterating any hope or future funding for understanding the quite real, experimentally accessible, proton-scale, strong-force-based string vibrations behind Regge trajectories. They did this by proposing what I like to call the Deep Dive:

      Scherk, J. & Schwarz, J. H., Dual Models for Non-Hadrons, Nuclear Physics B, Elsevier, 1974, 81, 118-144.

      So what was the Deep Dive, and why did they do it?

      Well, it "went down" like this: Scherk and Schwarz noticed that the overall signature of some of the proton-sized strong-force vibrations behind Regge trajectories was very similar to the spin 2 signatures of the (still) hypothetical gravitons that were supposed to unify gravity with the other three forces of the Standard Model. Since the emerging Standard Model was having breathtaking success in that time period for explaining the particle zoo, quantum gravity and the Planck-scale foam were very popular at the time... and very tempting.

      So, based as best I can tell only on the resemblance of these very real vibration modes in baryons and mesons to gravitons, Scherk and Schwarz made their rather astonishing, revelation-like leap: They decided that the strong-force-based vibrations behind Regge trajectories were in fact gravitons, which have nothing to do with the strong force and are most certainly not "composed" of the strong force. The Planck-scale vibrations of string theory are instead composed of... well, I don't know what, maybe intense gravity? I've never been able to get an answer out of a string theorist on that question of "what is a string made of?" This is not an unfair question, since for example the original strings behind Regge trajectories are "composed" of the strong force, and have quite real energies associated with their existences.

      I still don't quite get even the logic behind the Deep Dive, since gravity had exactly zero to do with either the substance of the strings (a known force) or the nature of the skip-rope-like, quark-constrained vibrations behind Regge trajectories. Nonetheless they did it. They took the Deep Dive, and it only ended up costing physics the following:

      ... 20 orders of magnitude of shrinkage in size, since protons are about 10^-15 meters across, and the gravitons were nominally at the Planck foam scale of 10^-35 meters (!!!), which is a size scale that is inaccessible to any conceivable direct measurement process in the universe; plus:

      ... 20 orders of magnitude of increased energy costs, which is similarly universally inaccessible to any form of direct measurement; plus:

      ... a complete liberation from all of those annoying but experimentally validated vibration constraints that were imposed in real nucleons and mesons by the presence of quarks and the strong force. That's a cost, not a benefit, since it explodes the range of options that have to be explored to find a workable theory. Freeing the strings from... well... any appreciable experimental or theoretical constraints... enabled them instead to take on the nearly infinite number of possible vibration modes that a length or loop of rope gyrating wildly in outer space would have; and finally:

      ... just to add yet a few more gazillion unneeded and previously unavailable degrees of freedom, a huge increase in the number of available spatial dimensions, always at least 9 and often many more.

      And they wonder why string theory has 10^500 versions of the vacuum... :)

      Oh... did I also mention that the Deep Dive has cost the US (mainly NSF plus matching funds from other institutions) well over half a billion dollars, with literally not a single new experimental outcome, let alone any actual working new process or product, as a consequence?

      This was only to be expected, since the Deep Dive plunged all research into real string-like vibrations down into the utterly inaccessible level of the Planck foam. Consequently, the only product of string theory research has been papers. This half a billion dollars' worth of papers has built on itself, layer by layer of backward citations and references, for over 40 years. In many cases, the layers of equations are now so deep that no human mind could possibly verify them. Errors only amplify over time, and if there is no way to stop their propagation by catching them through experiments, it's the same situation as trying to write an entire computer operating system in one shot, without having previously executed and validated its individual components.

      In short, what the US really got for its half billion dollars was a really deep stack of very bad programming. Our best hope for some eventual real return on string theory investments is that at least a few researchers were able to get in some real, experimentally meaningful research in all of that, to produce some real products that don't depend on unverifiable non-realities.

      Giovanni,

      Well... hmm, it's Feb 27 but this is still working, at least for a while.

      Thank you for your very kind remarks! I'll be sure to read your essay, as I try to do whenever anyone posts, even though the rating period is over.

      (Or can we still post, just not rate? Sigh. I must read the rules again...)

      Cheers,

      Terry

      Peter,

      Thank you for the follow-up, but at 12:30 AM I'm not quite sure I followed all of that? I assume you did see my long posting at your site? I'll try to read your posting above again when I'm awake... :/ zzz

      Cheers,

      Terry

      Ulla,

      Thank you for your generous and kind remarks! It's past the rating period now, but I'll be sure to take a look at your essay tomorrow (today?)

      Cheers,

      Terry

      Dear Terry,

      there is no hurry to read my essay, if you want to do it. The forum remains open until the nomination of the winners (and even beyond), although I fear it will be very little frequented from now on.

      Mine is the modest contribution of a non-specialist. Read it without obligation, when you have time.

      Regarding the scoring system, I know it well enough, having participated in the last three contests. I feel able to say (and I'm not the only one) that it works pretty badly and is the worst aspect of the contest. The problem is that almost none of us uses a rigorous and correct voting pledge like yours, and the score is often given out of sympathy, or resentment, or to return a high mark, or because absurd alliances and consortia emerge...

      As a rule, I have never asked anyone to score my essay, but I have certainly sometimes been influenced by the requests of others, or by a too-high rating that I received, or by the desire not to disappoint someone, and I certainly ended up rating too highly some essays that perhaps did not deserve it, or that I simply could not understand. My mistake, no doubt.

      Fortunately, I rarely take part in the scoring and, having difficulties with English, unfortunately rarely in the discussions either; but others are not so restrained, and this way of doing things negatively affects the final ranking of the community. Thus, some objectively mediocre essays often end up in the upper part of the ranking, while other objectively valid ones end up in undeservedly low positions. Your own essay, in my opinion one of the best, if not the best, deserved to end up in a position much higher than the one it had (after blasts of 1 or 2 given without adding any motivation). But I also think of other contributions, like that of Karl Coryat, which you have appreciated and discussed in detail. Or of even more neglected essays, like that of A. Losev, which seemed to me very interesting and original. Or the suggestive one by Joe Becker (founder of the Unicode system!), who may have been penalized by his clearly holistic and metaphysical perspective (though similar to that of a great visionary scientist and philosopher like Leibniz), as well as by his very shy and humble attitude. Or that of Bastiaansen, which certainly offers food for thought. But there are certainly many others, perhaps even lower scored, but certainly valid, which I have forgotten or have not even read, because there are 200 essays and time is lacking...

      You will ask me: why do you put this in my thread, instead of writing it in a more appropriate and general context? In fact, these considerations may be out of place here, and I apologize for that. But they came to me immediately after the closing of the community vote, while I was reading some of your posts. Moreover, I have some hope that your tireless, qualified, very correct contribution to this year's contest forum can help make the FQXi community better, and help it avoid the risk of becoming a confused and scientifically sterile ground of personalism and preconceptions.

      Thanks again for all your contributions and, in particular, for the latest precious mini-essays, which will be for me a material for reading and reflection, in the coming days or weeks.

      Cheers,

      Giovanni

      Giovanni,

      I have finally figured out how to find posts like yours! I simply search by date, e.g. "Feb. 27" for this one. It has been very hard for my poor brain to find entries when they show up in the middle of the blogs, both in mine and in others.

      Thank you for your positive and constructive comments! Also, thanks for that bit of info on how just the ratings close, not the commenting. I for one will be more likely to show up, not less. The ratings part is designed like a Hunger Games incentive program, so having it gone makes me feel like a more unfettered form of synergistic interaction is now possible.

      I am particularly appreciative of your quick list of essays worth examining. I plan to look at them, hopefully all of them! I keep finding unexpectedly interesting points in so many of these essays.

      Finally, please feel very free to post in my essay thread anytime you want to. It never even occurred to me that it might not be the right "spot" for you to do so. (Come to think of it, considering some of the humongous posts that I've put on other folks' threads, I guess it's sort of a given that I'm not too worried about people cross-posting, isn't it?)

      Cheers,

      Terry

      Terry,

      Going back to what spawned string theory and Len Susskind's thoughts, an even simpler interpretation in another direction seems to yield a whole lot more useful stuff without infinite recursion; i.e. here: VIDEO Time Dependent Redshift. Are we locked in a circular one-way street without the exit of helical paths?

      My present classic QM derivation emerged from a test of the model and SR components, via the 2015 top scorer: The Red/Green Sock Trick.

      Might it not be time to step back and review other routes?

      Peter

      REPOSTED TO CORRECT FORMATTING ERROR NOT PRESENT IN PREVIEW! Adding: my comments below are to minimize some apparent misunderstandings.

      Terry: NB: your time is valuable to me; so no need to rush! Seeking to minimize misunderstandings -- see below -- from the get-go, your comments follow [with some editing for efficiency] -- with some bolding for clarity (and sometimes emphasis).

      TB: "Your title is intriguing; look at my signature line and its single-concept definition of QM and you can see why."

      GW: Here it is: "(i) Quantum mechanics is simpler than most people realise. (ii) It is no more and no less than the physics of things for which history has not yet been written."

      We agree: 'Quantum mechanics is simpler than most people realise.' I would add: It's little more than an advanced [and experimentally-supported] probability/prevalence theory. But please, for me, translate your 2nd sentence (ii) into a few more words: "(ii) It is no more and no less than the physics of things for which history has not yet been written = ..." ??

      TB: "My queue on this last day is long."

      GW: Rightly so! But (NB) the threads can remain open for years!!

      TB: "But I will follow your link and a look at your essay."

      GW: Please take your time with the essay and communicate directly by email (it's in the essay) when you have difficulties; especially if you're rusty with delta-functions in ¶13. I am here for critical feedback and questions, etc. And I cannot be offended.

      TB: "Wow! That is one of the best arguments for locality that I think I've seen. I like your Bell-ish style of writing and focus on specifics."

      GW: Tks.

      TB: "You are of course in very good company, since Einstein was a localist."

      GW: Yes; without doubt!

      TB: "And Bell was a localist."

      GW: ??? Not from my readings! For me, a true localist would have reviewed his theorem and spotted the error. Further, here's Bell's dilemma from as late as 1990:

      'I cannot say that AAD is required in physics. I can say that you cannot get away with no AAD. You cannot separate off what happens in one place and what happens in another. Somehow they have to be described and explained jointly. That's the fact of the situation; Einstein's program fails ... Maybe we have to learn to accept not so much AAD, but the inadequacy of no AAD. ... That's the dilemma. We are led by analyzing this situation to admit that, somehow, distant things are connected, or at least not disconnected. ... I don't know any conception of locality that works with QM. So I think we're stuck with nonlocality ... I step back from asserting that there is AAD and I say only that you cannot get away with locality. You cannot explain things by events in their neighbourhood. But, I am careful not to assert that there is AAD,' after Bell* (1990:5-13); emphasis added.

      *Bell, J. S. (1990). "Indeterminism and nonlocality." Transcript of 22 January 1990, CERN, Geneva. In Driessen, A. & Suarez, A. (1997), Mathematical Undecidability, Quantum Nonlocality and the Question of the Existence of God, pp. 83-100.

      TB: "I can't do a detailed assessment today -- too many equations that would need careful examination to assess your argument meaningfully -- but what I've seen at a quick look seems pretty solid."

      GW: PLEASE: Do not get bogged down; send me emails when you have difficulties. For me, your time is precious!

      TB: That said, there is an expanding class of pro-entanglement data anomalies that you need somehow to take into account:

      ID230 Infrared Single-Photon Detector Hybrid Gated and Free-Running InGaAs/InP Photon Counter with Extremely Low Dark Count

      GW: Terry: My theory expects "entanglement" to be strengthened with better equipment; and you [thankfully] next supply the supporting evidence!

      TB: "This field has moved way beyond the Aspect studies. A lot of hard-nosed business folks figured out years ago that arguments against the existence of entanglement don't matter much if they can simply build devices that violate Bell's inequality. Which they did, and now they sell them to some very smart, physics-savvy customers who use them on a daily basis to encrypt some critical data transmissions."

      GW: We agree, 100%.

      TB: "Many of these customers would be, shall we say, upset in interesting ways if some company sold them equipment that did not work."

      GW: NBB Why wouldn't it work? My theory would be kaput if it didn't!

      TB: "Again, thanks for a well-argued essay! I'll try (no promises though) to take a closer look at your essay at some later (post-commenting-close) date. Again assuming the equations are solid, yours is the kind of in-depth analysis needed to sharpen everyone's thinking about such topics."

      GW: Please take your time; every word of criticism is like a kiss from my wife.

      Tingling in anticipation; with my thanks again; Gordon

      More realistic fundamentals: quantum theory from one premiss.

      Terry, while I liked her essay and criteria a lot, I'm sure that there are more of them than are really necessary. Especially when you consider that the mathematical-uniqueness criterion can only be fulfilled by a cosmology with 11 dimensions, one of which is a cyclic variable. I don't know of any other besides my own theta-mass-time.

      WRL

      "The World's Most Famous Equation" is also one of the most misunderstood. First, it was first derived by Poincaré, not Einstein, and it is better written as

      E₀ = mc²

      "Thus the 20 digit sequence could in principle be replaced by a short binary program that generates and indexes pi". Which would consume more memory than simply storing the original 20 digit string.

      "In physics the sole criterion for whether a theory is correct is whether it accurately reproduces the data in foundation messages." A theory can be refuted without even running a single experiment. We have other criteria to evaluate data, including internal consistency checks.

      "The implication is that a better way to think of physics is not as some form of axiomatic mathematics, but as a type of information theory". It is neither.

      Challenge 2. Bosons are in reality virtual combinations of fermions that arise in the formalism when one switches from a non-local real picture to the approximate local picture of QFT. All the properties of bosons are derived from the properties of fermions, including spin. E.g. for photons the available spin states are

      (±1/2) - (±1/2) = {0, +1, -1, 0}.

      The Standard Model needs to postulate the properties of bosons: mass, spin, charge. I can derive those properties from first principles.

      "There are after powerful theoretical reasons for arguing that gravity is not identical in nature to the other forces of the Standard Model. That reason is the very existence of Einstein's General Theory of relativity, which explains gravity using geometric concepts that bear no significant resemblance to the quantum field models used for other forces". Gravity can be formulated non-geometrically. So there is nothing special about it regarding this. On the other hand the gauge theory used in QFT for the other interactions can be given a geometrical treatment with the gauge derivatives playing a role similar to the covariant derivatives in GR, and the field potentials playing a role similar to the Christoffel symbols.