Dear Terry,

Thank you for your feedback on my essay, '

I will use a marking system similar to the one you used.

What I liked:

Easy to read. Well set out. Your core idea came through well.

What I thought about as I read it:

The initial idea seemed quite a lot like Occam's razor, shifted from a philosophical stance to a mathematical stance.

Einstein's derivation of E=mc^2 was moderately complex at certain points, and each step re-expressed the previous one using more symbols, yet each says the same thing, so each must express the same level of fundamentality. The difference is only in whoever reads it. Would fundamentality then be in the eye of the beholder?

There is an implicit assumption in your example of replacing a gigabyte file. Say I write this program, but before it ends, I pull the plug! The program is time dependent while the gigabyte file is more space dependent. This was a cool idea, but I don't think the file and the system are equivalent, only potentially equivalent. Also, reproducing the original file on a computer, even a perfect computer, would require an expenditure of energy under Landauer's 'information is physical' principle, making the file more costly to regenerate than simply to keep. I mention this because it hints at hidden aspects that make some things seem more fundamental than they really are.
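
(A rough room-temperature figure for scale, my own back-of-envelope arithmetic rather than anything from your essay: erasing the ~8 x 10^9 bits of a gigabyte dissipates at least [math]8\times 10^{9}\,k_B T\ln 2 \approx 8\times 10^{9}\times 2.9\times 10^{-21}\,\mathrm{J}\approx 2\times 10^{-11}\,\mathrm{J}[/math], and any realistic computation that regenerates the file dissipates vastly more than this lower bound.)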

I thought the reference to pi was quite fun. It occurred to me that given that pi is irrational (strictly, one needs the conjecture that it is normal), if one had some way of referencing a point in its decimal expansion, then every finite number would appear within it, and the larger the number, the more efficient it would be to specify the start and end points? I see this would likely have some limit due to a trade-off, but it is fun to think about. I suppose the same would apply to any irrational number with that property.

You say, 'the role of science is first to identify such pre-existing structures and behaviors, and then to document them in sufficient detail to understand and predict how they work.' For philosophical reasons, there is a well-known (Hume, Kant, Popper) strong epistemic schism between science and reality. We can never identify, from an empirical standpoint, pre-existing structures. We can only guess at them from hints provided by experiment. But you probably know this already. My essays were written exactly to span the gap from a rationalist perspective.

On the Spekkens Principle. I haven't read that essay, but the terminology suggests his work echoes that of Rafael Sorkin and his Causal Set Theory, only interpreted in an informational-universe context (I think). It is exactly the dynamics, the ultimate cause, which my work addresses.

On your 'Challenge 3'. I am a bit surprised that you didn't mention, in your comments on my essay, my argument for an increasing baryon mass (whether due to intrinsic change of the baryon or extrinsic change due to the 'shape of space', for which my model is not yet sufficiently advanced), which would answer at least one part of this challenge, namely why the ratio of electron to proton mass is as it is.

'The Standard Model qualifies overall as a remarkably compact...framework.' Oi! Are we reading the same model? That said, I see your point that compared to SUSY and string theory and so forth it is relatively compact, but with all those parameters, might they be hiding a very large set of other theories?

I hope you take the time to read my previous essay in 'It from Bit'. In it there may be a gemstone of simplicity. My whole model comes from a single principle, and that principle is no more than an expression of our idea of equivalence.

I am providing a short response to your comments on my paper.

Best wishes,

Stephen

    Stephen,

    Thank you for your thoughtful and intriguing comments! I will look at your comments on your essay thread. A few quick responses:

    -- Ironically, I'm not a big fan of E=mc2, since I immensely prefer the energy-complete, Pythagorean-triangle-compatible form:

    [math]E^2=(\overrightarrow{p}c)^2+(mc^2)^2[/math] E=mc2 addresses only the very special case of mass at rest, and so is not very useful in any actual physics problem. I also find it deeply amusing that Einstein's original paper in which he announced his mass-equals-energy insight, "Does the Inertia of a Body Depend on Its Energy Content?", uses L in place of E, and for some odd reason never actually gives the equation! Instead, Einstein uses a sentence only to describe an equivalent equation with c on the other side:

    "If a body gives of the energy L in the form of radiation, its mass decreases by L/c2."

    -- Your points about the subtleties of the gigabyte example are correct, very apt, and interesting! In some of my unpublished work I am strong on distinguishing very carefully between "potentiality" (your word) versus "actuality", and not just in physics, but especially in mathematics. The very ease with which our brains take potentials to their limits and then treat them as existing realities is both an interesting efficiency aspect of how capacity-limited biological cognition works, and a warning of the dangers of being sloppy about the difference. Computer science flatly forces one to face some of these realities in ways that bio-brain mathematical abstractions do not. This is why I think it is very healthy for mathematicians to contemplate what happens if they convert their abstract equations into software.
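
    As a trivial illustration of that gap (my toy example, nothing more): the Basel sum 1 + 1/4 + 1/9 + ... has the "potential" limit pi^2/6, but any actual program halts at a finite partial sum that falls measurably short of it:

        # Toy example: the "potential" limit pi^2/6 versus the "actual"
        # finite partial sum that any halting computation must settle for.
        import math

        limit = math.pi**2 / 6
        partial = sum(1.0 / (n * n) for n in range(1, 1_000_001))

        print(f"partial sum (10^6 terms): {partial:.10f}")
        print(f"potential limit pi^2/6:   {limit:.10f}")
        print(f"gap still remaining:      {limit - partial:.2e}")  # ~1e-6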

    In any case, issues such as the ones you mention are where the real fun begins! While a short essay is great for introducing a new landscape to a broader audience, there is a lot more going on underneath the hood, as you have just pointed out with your comments on both Einstein's mass equation and my gigabyte file example. An essay is at its best more like a billboard enticing viewers to visit new land, including at most a broad, glossy, overly simplified view of what that new land looks like. The real fun does not begin until you start chugging your vehicle of choice over all the &%$^# potholes and crevices that didn't show up in that enticing broad overview!

    -- I would note that while in principle it may be possible to find any random sequence of numbers somewhere in pi (wow, that would be an interesting proof or disproof...), there is a representation cost for the indices themselves that must also be taken into account. A full analysis of the potential of pi or other irrational numbers as data compression mechanisms would be mathematically fun, and who knows, might even end up pointing to some subset of methods with actual compression value. The latter to me seems unlikely, though, since the computation cost is likely to get huge pretty quickly.
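
    Here is a quick sketch of that index-cost point (my toy code; the digit count and search strings are arbitrary choices, and it needs the mpmath package):

        # Find short digit strings in pi and compare the bits needed to
        # encode the index against the bits the raw digits would need.
        from math import log2
        from mpmath import mp

        N_DIGITS = 50_000                       # how far into pi we search
        mp.dps = N_DIGITS + 10                  # working precision, with margin
        pi_digits = mp.nstr(mp.pi, N_DIGITS).replace(".", "")

        for target in ["14159", "999999", "1234567"]:
            pos = pi_digits.find(target)
            data_bits = len(target) * log2(10)  # ~3.32 bits per decimal digit
            if pos < 0:
                print(f"{target}: not found in the first {N_DIGITS} digits")
            else:
                print(f"{target}: index {pos} costs ~{max(pos, 1).bit_length()}"
                      f" bits vs ~{data_bits:.0f} bits for the digits themselves")

    Since a random k-digit string typically first appears somewhere around index 10^k, the index usually costs about as many bits as the data it points to, which is exactly why the scheme should not compress on average.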

    -- As an everyday passionate Hume in being, I Kant speak all that knowledgeably about the biological structures behind how our minds perceive reality. But I try hard to avoid the subconscious assumptions of truth that Popper up so often in physics and even math, when all one can really do is prove that at least some of these assumptions are false. Thus when writing for a broad audience, I try hard to make it easier for the reader to follow an argument and stay focused on it by simply explaining any necessary philosophical point "in line" and as succinctly as possible. The less satisfying alternative would be to give them a hyperlink to some huge remote body of literature that would then require a lot of reading on their part before they could even get back to the original argument... which by that point they have likely forgotten... :)

    -- I am also surprised that I did not mention your baryon mass idea in my comment, since I certainly was thinking about it when I wrote the comment! I guess I was just more focused on the experimental constraints on the idea?

    -- On the Standard Model being "relatively compact": that is in comparison to, say, string theory. But oi indeed, my emphasis was very definitely on the word "relatively"! The Standard Model as it stands is a huge, juicy, glaringly obvious target for some serious Occam outtakes and Kolmogorov compressions.

    By the way, as someone who is deeply convinced that space, a very complex entity if you think about it, is emergent from simpler principles, I found your constructive approach to it interesting.

    I will try to read It from Bit soon.

    Cheers,

    Terry

    Edwin Eugene Klingman,

    I am delighted and more than a little amused at how badly I misunderstood your intent! I would have bet that your answer was going to be "yes, I was just being subtle about simulation"... and I was so wrong!

    I'll look more closely to figure out why I got that so wrong. I may even look up your thesis, but no guarantee on that -- theses tend to be long in most cases!

    I downloaded your ref [10] and definitely look forward to looking at that one! I would say immediately that a lot of particle and especially atomic-nuclei folks would vehemently disagree, since e.g. things like flattening have to be taken into account when trying to merge nuclei to create new elements. But that's not the same as me having a specific reference at hand, as you do here.

    So: More later on that point. Thanks for an intriguing reference in any case!

    Cheers,

    Terry

    Dear Terry,

    With great interest I read your essay, which of course is worthy of the highest praise.

    In Kadin's forum thread I found your questions, which are much more interesting and relevant than the questions of FQXi.

    My opinion on these issues:

    (1) Entanglement is the only remote mechanism in the Universe for forming the force of interaction between elements of matter; it is realized through the interaction of de Broglie toroidal gravitational waves at common parametric-resonance frequencies.

    This quantum mechanism of gravity is shown in photos of phenomena observed in outer space in my 2017 essay, "The reason of self-organization systems of matter is quantum parametric resonance and the formation of solitons" (https://fqxi.org/community/forum/topic/2806).

    For example, a molecule is a state of entanglement (interaction) of atoms at common resonant frequencies of the de Broglie toroidal gravitational wave complex (including tachyon waves) belonging to different levels of matter.

    (2) The full fundamental fermion zoo is described by simple similarity relations of the fractal structure of matter, based on the parameters of the electron and the laws of conservation of angular momentum and energy.

    Fermions of different levels of matter are neutrinos for each other. All this is given in my 2018 essay, "'Fundamental' means the underlying principles, laws, essence, structure, constants and properties of matter" (https://fqxi.org/community/forum/topic/3080).

    Also given are the ratios for the deterministic grids of all the main resonance frequencies of the zoo of toroidal gravitational waves (fundamental fermions), and comparisons are made with known observed resonant frequencies.

    (3) Recreating GR's predictive power is possible only after understanding that potential stability pits exist in all fundamental interactions, including the strong interaction.

    With such an understanding, the paradox of electrodynamics (why the orbital electron does not radiate) is easily and logically solved.

    Potential stability pits (de Broglie toroidal gravitational waves, orbital solitons) are formed due to quantum parametric resonance in the medium of a physical vacuum.

    With understanding of potential pits comes an understanding of inertia and mass.

    (4) Clarifying waves vs. superposed states: this is the result of the interaction of de Broglie toroidal gravitational waves (fundamental fermions); it can be determined by solving classical quantum parametric-resonance problems, for example using the Mathieu equations (as in radio engineering).

    The solutions of these equations can be represented as a Fourier series, which is actually a set of real toroidal gravitational waves interacting (entangled) in a system on deterministic grids of a set of resonance frequencies.
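
    For the curious, here is a generic numerical illustration of the classical Mathieu equation x'' + (a - 2q cos 2t)x = 0 (my own sketch; the parameter pairs are arbitrary choices, one outside and one inside the first parametric-resonance tongue, and it needs numpy and scipy):

        # Integrate the Mathieu equation for a stable and an unstable
        # (parametric-resonance) parameter pair and compare the growth.
        import numpy as np
        from scipy.integrate import solve_ivp

        def mathieu(t, y, a, q):
            x, v = y
            return [v, -(a - 2.0 * q * np.cos(2.0 * t)) * x]

        for a, q, label in [(3.0, 0.2, "stable"), (1.0, 0.4, "resonant")]:
            sol = solve_ivp(mathieu, (0.0, 60.0), [1.0, 0.0],
                            args=(a, q), max_step=0.01)
            print(f"a={a}, q={q} ({label}): max |x| = {np.abs(sol.y[0]).max():.2e}")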

    I'm sorry, but everything is wrong with «how very much like space curvature could create such observed effects».

    Instead of curvature of space-time, there is a derivative of spatial coordinates in time. The equivalent of "curvature of space" is the speed of propagation of the gravitational interaction.

    I hope that my modest achievements can give you some food for thought.

    Vladimir Fedorov

    https://fqxi.org/community/forum/topic/3080

      Vladimir,

      Thank you for your kind remarks and comments. I will take a look at your essay sometime today, Friday Feb 16.

      Cheers,

      Terry

      Hi Edwin Eugene Klingman,

      Let's get to the main point: You surely realize that the Curt Renshaw paper (your ref 10) contains no data whatsoever disproving special relativity? I assume you do, since you worded your description of the paper as "arguing" that SR length contraction does not exist, versus saying that the paper actually provides data contradicting SR.

      The Renshaw paper instead only asserts that when the NASA Space Interferometry Mission (SIM) satellite is launched in 2005 (it is an old paper), it will disprove SR, because the author says it will:

      "The author has demonstrated in several previous papers that the Lorentz length contraction likely does not exist, and, therefore, will not be found by SIM."

      SIM was supposed to be launched in 2005 but was cancelled. Its nominal successor, SIM Lite, was also cancelled. Thus no such data exists, either for or against SR. The title of the paper, "A Direct Test of the Lorentz Length Contraction", is at best misleading, although it could perhaps be generously interpreted as a paper about a proposed test that never happened.

      In sharp contrast to this absence of data, all of the effects of SR, including time dilation, relativistic mass increases, and squashing of nuclei, are unbelievably well documented by hundreds or thousands of people who use particle accelerators around the world. Particle accelerators easily explore velocities very close to the speed of light, and so can produce extremely strong signals regarding the effects of SR. Shoot, even ordinary vacuum tubes prove SR if you crunch the numbers, since the electrons must accumulate energy under the rules of SR. Denying the existence of this gigantic body of work, engineering, and detailed SR-dependent published papers is possible only by saying "I don't like that enormous body of data, so I will just ignore it."
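
      (To show just how easy that crunch is, here is a back-of-envelope script; the 25 kV accelerating potential is my own illustrative choice, roughly that of an old CRT vacuum tube:)

          # Electron accelerated through 25 kV: classical mechanics and SR
          # predict measurably different speeds, expressed in units of c.
          import math

          ENERGY_KEV = 25.0    # accelerating potential (illustrative value)
          REST_KEV = 511.0     # electron rest energy, m*c^2

          gamma = 1.0 + ENERGY_KEV / REST_KEV
          v_sr = math.sqrt(1.0 - 1.0 / gamma**2)                # relativistic
          v_classical = math.sqrt(2.0 * ENERGY_KEV / REST_KEV)  # Newtonian

          print(f"SR speed:        {v_sr:.4f} c")        # ~0.30 c
          print(f"Classical speed: {v_classical:.4f} c") # ~0.31 c, a few % high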

      Bottom line: You gave me a data-free reference with a title that fooled me into thinking it had actual data in it. I like you and your willingness to explore alternatives, but I sincerely wish you had not done that. My advice: Go look at the thousands of papers from the particle accelerator community, and stop focusing on a single deceptively titled non-paper (or author, since Renshaw has other papers).

      Sincerely,

      Terry Bollinger

      Terry,

      There is a lot to like in your essay. It gives good guidance for simplicity-driven discovery, including your 3 challenges, which add a relatively out-of-the-box perspective on simple processes of investigation. We all marvel over Einstein's equation and its simple epiphany of the duality of energy and mass. Euler's identity is intriguing to all, and fermion-boson spin baffling. And if we programmed in our careers, we staggered over the mind-numbing immensity of the mishmash of recursive equations that years of coding piled on. I speak of a new approach and discovery as well in my essay. Fundamental does involve fewer bits, but also new discovery in following a simpler thread, as you mention. I rate your essay high on several points. Hope you get a chance to look at mine.

      Jim Hoover

        Thanks Terry,

        There are questions that arise concerning our interests in simplification that are not commonly admitted. For example:

        1. Is simplification 'simply' a means of reducing complexity to a level of understanding that is acceptable (i.e. comfortable) and thereby communicable to others?

        2. Is the search for simplification acknowledgement that the subject under consideration is beyond the capacity of a person to comprehend in its totality?

        3. Is simplification a means by which one can get connected to people operating at a higher (or lower) level of consciousness?

        4. If simplification is assumed to promote a common cause, the purpose of which is to unite one's interests with those of others, at what point does the process of simplification become too simple and thereby confuse rather than clarify issues?

        5. Is the FQXi question so simple that it stimulates multiple lines of enquiry rather than serving to unite people in a common understanding?

        At issue is how many people are reasonably expected to benefit from any process of simplification. If that family is limited to professional physicists, mathematicians, or people that happen to speak a particular 'foreign' language, then is the quest for simplification really justified?

        Does being 'more fundamental in the sense of having the deepest insights' really contribute to understanding, or was Einstein the only person that truly understood what he was saying at the time?

        Thank you Terry for inviting us along your chosen path. You carry my best wishes.

        Gary.

          Jim,

          Thank you for your positive and thoughtful remarks! I look forward to seeing your essay, and will download a copy of it shortly.

          Cheers,

          Terry

          Gary,

          Thank you for your positive remarks! And wow, that is an intriguing set of questions you just asked!

          I like in particular that you are addressing the human and social interaction aspects of communications simplification. These are critical aspects of what I call collaborative or collective intelligence, that is, the "IQ" of an entire group of people, systems, and environments. The idea of a collective IQ addresses, for example, why free market economies tend, in comparison to authoritarian economies, to be hugely more clever, efficient, and adaptable in their use of available resources. The intelligences that emerge from free market economies are examples of intelligences that are beyond detailed human comprehension; that is precisely why the human-in-charge authoritarian structures are so ineffective.

          Intelligence is never fully spatially localized, and that is the source of many deep misunderstandings about its nature. Even when you do something as simple as read a book, you have extended your intelligence beyond the bounds of your own body, since you are now relying on an external memory. I would suggest that the main reason human intelligence can be oddly difficult to distinguish from animal intelligence is that it is not the innate cleverness of any one human that defines human intelligence, but rather the extraordinarily high level of networking in both time (writing) and space (language) of human intelligence that makes us unique. For example, a very clever bonobo can, I think, be individually not that different from a human in terms of innate problem solving and cleverness. But that same bonobo lacks the scaffolding of language, both internally (e.g. for postulating complex imaginary worlds) and externally (for sharing with other bonobos), and so is unable to "build on the shoulders of others," as we like to say.

          (A bit of a physics tangent: I would also suggest that intelligence is deeply intertwined with the emergence of information within our universe, in ways we do not yet fully comprehend. At the very origin of our universe the emergence of "I need my own space!" fermions in flat space enabled the emergence of what we call information, via persistent configurations of fermions within that accommodating flat space. But only obstinately persistent and isolationist fermions can readily create the kinds of unique configurations that we call history. Once the universe made history (information) possible, higher levels of complexity also became possible, including only very recently networked human intelligence.)

          Your particular questions can be answered specifically only by first grappling with the curiously probabilistic issues that underlie all forms of distributed intelligence, but which are particularly conspicuous in human interactions. Pretty much by definition, an intelligent system must deal with issues that cannot be fully anticipated in advance, but which also can be at least partially anticipated. These complex underlying probabilities in turn affect the nature of the "simplifications" needed in any one messaging event. Three major simplification options include subsetting (sending only a small but specific subset), generalizing (capturing an overall message, but leaving the recipient to synthesize the details), and complete transfer (e.g., loading a new app onto a smartphone).

          The nature and state of the recipient are of course also critical, and just to confuse everything a bit more, often highly variable over time. The general trend is that, due to the accumulation of earlier messages and their implications, meaning-per-message increases over time. That also complicates the idea of summarization, since what previously was an incomplete message may over time become entirely adequate. You can watch that effect in slow-but-real time as your Alexa or Hey Google or whatever grows a little smarter each week about how to interpret exactly the same human sentence.

          I will address your specific questions after I've read your essay. Again, thank you for such excellent questions!

          Cheers,

          Terry

          Hi Terry,

          I read your essay and I loved the last paragraph...

          If you see such a thread and find it intriguing, your first step should be to find and immerse yourself in the details of any high-quality experimental data relevant to that thread. Some obscure detail from that data could become the unexpected clue that helps you break a major conceptual barrier. With hard work and insight, you might just become the person who finds a hidden gemstone of simplicity by unravelling the threads of misunderstanding that for decades have kept it hidden.

          Now even though I am going to say this - I still loved your essay... Your conclusion is completely wrong and this is the reason why...

          I can assure you with utmost confidence that no high-quality experiment with its high-quality data will help in revealing what is hidden from us, which is required to figure out the theory of everything. Yes, I know I am making a very bold statement, but I just wanted you to hear this for future reference when physicists start looking into Gordon's Theory of Everything.

          The law of conservation of energy is what is preventing us from realizing what dark energy is... Yes it would actually break the law of physics to solve the theory of everything the way you are proposing. :)

          Anyway - if you have any interest - a very limited exposure to my theory is presented in my essay, "The Day After the Nightmare Scenario"

          All the best to you

          Scott S Gordon, MD/Engr.

          Hi Scott,

          I love it!!

          Yep, you are right: Details of past data are unlikely to do squiddly for such incredibly important issues as "dark matter" and "dark energy". You nailed me royally on that point! I was thinking in particular about overlooked issues in the Standard Model, but hey, even there the whole dark-dark issue has to come in somehow.

          I've added you to my reading list, which is getting a bit long, but I hope to get to it soon.

          Thanks again! Since I am Missourian by upbringing, it is the well-stated critiques that make my day. I've found by hard experience that if I start getting way too confident in my own ideas, I start looking and acting like the rear end of one of those Missouri mules. :)

          Cheers,

          Terry


          Hi Terry,

          I liked that you provided a simple model of what is fundamental. And your essay followed its own premise: "Fundamental as Fewer Bits". I really enjoyed reading it.

          In particular I liked:

          "Because gravity is so weak, principles of quantum mechanics drove the scale of such models into both extremely small length scales and extraordinarily high energies. This in turn helped unleash so many new options for "exploration" that the original Standard Model simply got lost in an almost unimaginably large sea of possibilities.[9]"

          In my essay "The Thing that is Space-Time" I attempt to pull gravity out of the Standard Model.

          I postulate that a graviton is not a boson, and that in general it has very low energy and a very large extent (i.e., wavelength) that spans all the matter in the universe. Thus it is a very low energy particle. I use three basic equations to produce this theory: 1. The Planck-Einstein equation. 2. E=mc^2. 3. The equation for the Planck mass. The general overview is that the graviton is much like a guitar string that is anchored on opposing Planck masses. This quantum mechanical guitar string (the graviton) has a mass, and instead of supporting musical notes it supports the different frequencies of light (photons).
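
          (In their standard textbook forms, those three equations are [math]E=h\nu[/math], [math]E=mc^2[/math], and [math]m_P=\sqrt{\hbar c/G}[/math].)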

          Question: Would you take a look at my entry and let me know if this version of gravity has any merit in terms of meeting your criteria of having fewer bits? Any response appreciated!

          Thanks,

          Don Limuti

            Thanks, Terry Bollinger, for your criticism of my essay. I understand that it was written poorly. Its main aim is to attract researchers to continue Descartes' theory of everything, taking into account modern achievements in physics. Descartes' principle of the identity of physical space and matter allows us to remodel Heisenberg's uncertainty principle into a principle of definiteness of the points of physical space, according to which an infinitely large momentum is required to pin down a point of it. Look at my essay, "FQXi Fundamental in New Cartesian Physics" by Dizhechko Boris Semyonovich, where I showed how radically physics can change if it follows this principle. Evaluate it and leave your comment there. Do not allow New Cartesian Physics to go away into nothingness.

            Sincerely, Dizhechko Boris Semyonovich.

            Terry,

            This is a fine essay with many interesting points, eminently clear and sensible. On your main theme of simplicity, you should check out Inés Samengo's excellent essay. She has a similar take, but also considers the scope of a theory as a second key factor in determining what's fundamental. And she makes the point that these two criteria are not necessarily in synch. FYI, though essay ratings have to be done by 2/26, I believe we can continue reading and posting comments afterwards. So no rush!

            As you know from looking at my essay, I agree that "a better way to think of physics is not as some form of axiomatic mathematics, but as a type of information theory." And I like the way you characterize the difficulties we face when we have a theory that seems close to being fundamental -- your description of "the trampoline effect" was especially vivid and on point, with the Standard Model. Most of all, though, I like your general attitude - you can get seriously involved in specific issues (your "challenges"), but also really broad ones - like the "lumpiness" you mention in your comments to Karen Crowther's essay: "Our universe is, at many levels, "lumpy enough" that many objects (and processes) within it can be approximated when viewed from a distance."

            You were writing about renormalization, and making an interesting shift in perspective. Physicists have tried to understand this by delving into the mathematics, which by now is apparently well-understood. You suggest that a different viewpoint might also help, comparing this with many other cases in which the "approximate" (or "effective") properties of a complex system define it more usefully at a higher level. I agree that this is a deep and important characteristic of our universe, where lower-level complexity supports new and simpler kinds of relationships, within which new kinds of complexity can become important. I hope this perspective can eventually elucidate the amazing complications of our current physics.

            Your summary credo is excellent: "the belief that simplicity is just as important now as it was in the early 1900s heydays of relativity and quantum theory." The wonder of our situation is that we're still trying to grasp exactly what kind of simplicity those two foundational theories are showing us.

            By the way, I'm much in sympathy with your remarks to Flavio, above. The earliest-submitted essays in these contests can be discouraging, and it's a marvelous relief when a really good one shows up - in my case it was Emily Adlam's that rescued me from despair. So thanks for joining in!

            Conrad

              Dear Terry,

              I was most impressed, even inspired. Your ability to find the right questions is leagues above most who can't even recognize correct answers! Lucid, direct, one of the best here.

              I entirely agree on simplicity, as the title of my own essay suggests, but isn't a reason we haven't advanced that our brains can't quite yet decode the complex puzzle (information)?

              But now more importantly. I'd like you to read my essay, as two of your sought answers are implicit in an apparent classical mechanism reproducing all QM's predictions and CHSH > 2. Most academics (& editors) fear to read, comment or falsify due to cognitive dissonance, but I'm sure you're more curious and honest. It simply follows Bell, tries a new starting assumption about pair QAM using Maxwell's orthogonal states, and analyses momentum transfers.

              Spin 1/2 & 2 etc. emerged early on and are in my last essay (scored 8th but no chocs). Past essays (inc. scored 1st & 2nd) described a better logic for SR which led to 'test by QM'. Another implication was cosmic redshift without accelerating expansion, closely replicating Euler at a 3D Schrodinger sphere surface, and Susskind's seed for strings.

              By design I'm quite incompetent to express most things mathematically. My research uses geometry, trig, observation & logic (though my red/green socks topped the 2015 Wigner essay). But I do need far more qualified help (consortium forming).

              On underlying truths & SM, gravity etc., have you seen how closed, multiple & opposite helical charge paths give toroids? ...but let's take things 2 at a time!

              As motion is key, I have a 100 sec video giving spin half (and QM etc.) which you may need to watch 3 times, then a long one touching on Euler but mainly redshift, SR, etc. But maybe read the essay first.

              Sorry that was a preamble to mine but you did ask! I loved it, and thank you for those excellent questions and encouragement on our intellectual evolution.

              Of course I may be a crackpot. Do be honest, but I may also crack a bottle of champers tonight!

              Very best

              Peter

                Don,

                Thank you both for your supportive remarks, and for your intriguing comments on a non-boson approach to gravity! I will definitely take a look, though I should warn you that my reading queue is getting a bit long.

                I'd say that your proposing a "non-boson" approach sounds pretty radical... except that after about 40 years of trying, the boson approaches still haven't really worked, have they? Also, general relativity, which does succeed very well experimentally (well, there is that dark energy thing), is anything but "boson" based. I think folks underestimate just how utterly incompatible the boson approach of quantum gravity and the geometric approach of general relativity are! The very languages are so utterly different that it's hard even to say what either one means in the language of the other.

                So, thanks again, and I'll get to your essay as soon as I can.

                Cheers,

                Terry

                Dear Terry Bollinger,

                My challenge #0:

                Accept that the border between past and future is a non-arbitrary point of reference; hence the cosine transformation is more concise than the complex-valued Fourier transformation. Only the redundant information of a chosen point t=0 is missing.
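
                (One concrete way to see the redundancy claim, my illustration using numpy and scipy rather than anything from the essay: for a real signal, the DCT yields N real coefficients, while the complex FFT yields N complex ones whose second halves are fixed by Hermitian symmetry.)

                    # Real signal: the DCT gives real coefficients, while the
                    # complex FFT is redundant via X[k] == conj(X[N-k]).
                    import numpy as np
                    from scipy.fft import dct, fft

                    x = np.random.default_rng(0).standard_normal(8)
                    X_dct = dct(x, norm="ortho")   # 8 real numbers
                    X_fft = fft(x)                 # 8 complex numbers (16 reals)

                    print("DCT (real):", np.round(X_dct, 3))
                    print("FFT Hermitian symmetry:",
                          np.allclose(X_fft[1:], np.conj(X_fft[1:][::-1])))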

                Thank you for encouragement,

                Eckard

                  Conrad,

                  [Argh, I almost became Anonymous! Why in the world does FQXi automatically sign people out after a few hours, without even giving a warning like everyone else in the world? And similarly, why do they keep expiring the reCAPTCHA? That's not security, that's just annoying, argh^2! Keeping folks signed in is the norm these days!]

                  First, I should probably mention that I've posted a follow-up to my contemplation of the perturbative issue (post 144023) you just mentioned. That is the one in which I took a deeper look at the issues underlying Criterion 4 from Karen Crowther's superb essay.

                  Sleeping on that issue precipitated a rather unusual early-morning chain of analysis that I documented in real time in post 144220. Here is my final, fully generalized hypothesis from the end of that analysis chain:

                  All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "lumpy" structure of our universe.

                  If that assertion makes your eyes pop a bit, please take a look at my analysis chain at the above link. Once I went to this (for me at least) new place... well, it became very hard to go back. That's because even equations like E=mc2 have a scale-dependent, perturbative component if you look at them across the full scale spectrum of the universe, since at the quantum scale mass fuzzes out due to virtual pairs, just as in QED for electrons. Including math in that assertion was the final part of the sequence. Again, take a look at why at the above link if you are interested.

                  Since I don't know yet whether I'm using links correctly, I'll keep this reply separate and create another one to address the main content of your thoughtful and generous post.

                  Cheers,

                  Terry

                  Dear Terry,

                  It has always been the case that the very high-end computing requirements of theoretical physics produce machines and codes specialized to the theoretic structure. So of course the IT community is always a key player. Lattice Gauge Theory, among others, is very compute-intensive stuff! But let's get to the fundamental physics of the subject...

                  Have you, in your 'broad' research on the subject, run across the Rishon model? It requires only two types of quanta (T & V) to create the algebraic group of quarks and leptons (QC/ED actually).

                  H. Harari and N. Seiberg, "The Rishon Model", Nucl. Phys. B, Vol 204 #1, p 141-167, September 1982.
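
                  (As a reminder of the model's 'binary' bookkeeping, if I recall it correctly: T carries charge +1/3 and V is neutral, so TTT builds the positron, TTV combinations the up quark, TVV the anti-down quark, and VVV the neutrino, with antirishons supplying the rest.)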

                  So the reductionist approach to the 'minimal quantum basis' problem does reveal a somewhat 'binary' solution.

                  More to the IT-ish point, though, your software skills and devotion to the algebra of the quantum subjects could well be of GREAT use. Do you by chance write JavaScript? There is a nice Java code for displaying the QC/ED group theory for some academic research applications as well as public explanations.

                  Another interesting point you raise earlier:

                  In it, you offer Challenge #1 - "What is the full physics meaning of Euler's identity, [math]e^{i\pi}+1=0[/math]?". That is actually an elegantly simple fundamental question/criterion, but just a little off the mark. Of course mathematically we know that for physics to have a unique solution it must have a cyclic variable. At least, all the best formal Proofs of Uniqueness reduce a conformal mathematics problem to a cyclic variable, removing all true singularities (including the point-like particle approximation).
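
                  (Recall Euler's formula [math]e^{i\theta}=\cos\theta+i\sin\theta[/math]; at [math]\theta=\pi[/math] it yields the identity above, and the cyclic variable in question is precisely the phase angle theta.)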

                  So how does [math]e^{i\theta}[/math] fit in?? Well, it seems that the universe is cyclic in mass and time... NOT radius and time, as the astronomic observables would hope for / make easy. So the general theta is actually theta_(mass-time)! For a more complete answer why this works, read my essay, if you please.

                  Further you discuss: "If someone can succeed in uncovering a smaller, simpler, more factored version of the Standard Model, who is to say that the resulting model might not enable new insights into the nature of gravity?" so please see

                  C.W. Misner, K.S. Thorne and J.A. Wheeler, Gravitation, W.H. Freeman and Co., p. 536, 1973, in which the Nobel-winning author (Thorne) notes that mass is area-like at small (Planck) scale.

                  Here the discussion can go into the finite representation geometries, which are area-like, and their respective quantum state algebras. Or it could look at the influence of r_alpha'/R on BH theory, as I've long advocated with Prof Mathur (see his essay), in which the strong (conical) lensing effects observed are due to "PRESERVED" matter in Black Holes. Interesting inquest; again, read further into the literature.

                  Best regards,

                  Dr Wayne Lundberg

                  https://fqxi.org/community/forum/topic/3092

                  p.s.

                  I too have a 30-yr civil service career but started publishing on physics topics in 1992. More DoD stories...