Jim,
Thank you for your positive and thoughtful remarks! I look forward to seeing your essay, and will download a copy of it shortly.
Cheers,
Terry
Gary,
Thank you for your positive remarks! And wow, that is an intriguing set of questions you just asked!
I like in particular that you are addressing the human and social-interaction aspects of communications simplification. These are critical aspects of what I call collaborative or collective intelligence, that is, the "IQ" of an entire group of people, systems, and environments. The idea of a collective IQ addresses, for example, why free market economies tend, in comparison to authoritarian economies, to be hugely more clever, efficient, and adaptable in their use of available resources. The intelligences that emerge from free market economies are examples of intelligences that are beyond detailed human comprehension; that is precisely why the human-in-charge authoritarian structures are so ineffective.
Intelligence is never fully spatially localized, and that is the source of many deep misunderstandings about its nature. Even when you do something as simple as read a book, you have extended your intelligence beyond the bounds of your own body, since you are now relying on an external memory. I would suggest that the main reason human intelligence can be oddly difficult to distinguish from animal intelligence is that it is not the innate cleverness of any one human that defines human intelligence, but rather its extraordinarily high level of networking in both time (writing) and space (language) that makes us unique. For example, a very clever bonobo can, I think, be individually not that different from a human in terms of innate problem solving and cleverness. But that same bonobo lacks the scaffolding of language, both internally (e.g., for postulating complex imaginary worlds) and externally (for sharing with other bonobos), and so is unable to "build on the shoulders of others," as we like to say.
(A bit of a physics tangent: I would also suggest that intelligence is deeply intertwined with the emergence of information within our universe, in ways we do not yet fully comprehend. At the very origin of our universe the emergence of "I need my own space!" fermions in flat space enabled the emergence of what we call information, via persistent configurations of fermions within that accommodating flat space. But only obstinately persistent and isolationist fermions can readily create the kinds of unique configurations that we call history. Once the universe made history (information) possible, higher levels of complexity also became possible, including only very recently networked human intelligence.)
Your particular questions can be answered specifically only by first grappling with the curiously probabilistic issues that underlie all forms of distributed intelligence, but which are particularly conspicuous in human interactions. Pretty much by definition, an intelligent system must deal with issues that cannot be fully anticipated in advance, but which can also be at least partially anticipated. These complex underlying probabilities in turn affect the nature of the "simplifications" needed in any one messaging event. Three major simplification options include subsetting (sending only a small but specific subset), generalizing (capturing an overall message, but leaving the recipient to synthesize the details), and complete transfer (e.g., loading a new app onto a smartphone).
The nature and state of the recipient is of course also critical and, just to confuse everything a bit more, often highly variable over time. The general trend is that, due to the accumulation of earlier messages and their implications, meaning-per-message increases over time. That also complicates the idea of summarization, since what previously was an incomplete message may over time become entirely adequate. You can watch that effect in slow-but-real time as your Alexa or Hey Google or whatever grows a little smarter each week about how to interpret exactly the same human sentence.
I will address your specific questions after I've read your essay. Again, thank you for such excellent questions!
Cheers,
Terry
Hi Terry,
I read your essay and I loved the last paragraph...
If you see such a thread and find it intriguing, your first step should be to find and immerse yourself in the details of any high-quality experimental data relevant to that thread. Some obscure detail from that data could become the unexpected clue that helps you break a major conceptual barrier. With hard work and insight, you might just become the person who finds a hidden gemstone of simplicity by unravelling the threads of misunderstanding that for decades have kept it hidden.
Now even though I am going to say this - I still loved your essay... Your conclusion is completely wrong and this is the reason why...
I can assure you with utmost confidence that no high-quality experiment, with its high-quality data, will help reveal what is hidden from us, which is what is required to figure out the theory of everything. Yes, I know I am making a very bold statement, but I just wanted you to hear this for future reference, for when physicists start looking into Gordon's Theory of Everything.
The law of conservation of energy is what is preventing us from realizing what dark energy is... Yes, it would actually break a law of physics to solve the theory of everything the way you are proposing. :)
Anyway - if you have any interest - a very limited exposure to my theory is presented in my essay, "The Day After the Nightmare Scenario"
All the best to you
Scott S Gordon, MD/Engr.
Hi Scott,
I love it!!
Yep, you are right: Details of past data are unlikely to do squiddly for such incredibly important issues as "dark matter" and "dark energy". You nailed me royally on that point! I was thinking in particular about overlooked issues in the Standard Model, but hey, even there the whole dark-dark issue has to come in somehow.
I've added you to my reading list, which is getting a bit long, but I hope to get to it soon.
Thanks again! Since I am Missourian by upbringing, it is the well-stated critiques that make my day. I've found by hard experience that if I start getting way too confident in my own ideas, I start looking and acting like the rear end of one of those Missouri mules. :)
Cheers,
Terry
Hi Terry,
I liked that you provided a simple model of what is fundamental. And your essay followed its own premise: "Fundamental as Fewer Bits". I really enjoyed reading it.
In particular I liked:
"Because gravity is so weak, principles of quantum mechanics drove the scale of such models into both extremely small length scales and extraordinarily high energies. This in turn helped unleash so many new options for "exploration" that the original Standard Model simply got lost in an almost unimaginably large sea of possibilities.[9]"
In my essay, "The Thing that is Space-Time," I attempt to pull gravity out of the Standard Model.
I postulate that the graviton is not a boson, and that in general it has very low energy and a very large distance (aka wavelength) that spans all the matter in the universe. Thus it is a very low-energy particle. I use three basic equations to produce this theory: 1. the Planck-Einstein equation; 2. E = mc^2; and 3. the equation for the Planck mass. The general overview is that the graviton is much like a guitar string anchored on opposing Planck masses. This quantum mechanical guitar string (the graviton) has a mass, and instead of supporting musical notes it supports the different frequencies of light (photons).
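For reference, the three equations named above are all standard physics, whatever one makes of the theory built on them. A minimal sketch evaluating them with rounded CODATA-style constants (the photon frequency chosen below is an arbitrary illustrative value, not taken from the essay):

```python
import math

h_bar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8          # speed of light, m/s
G = 6.67430e-11           # gravitational constant, m^3 kg^-1 s^-2
h = 2 * math.pi * h_bar   # Planck constant, J*s

# 3. Planck mass: m_P = sqrt(h_bar * c / G), about 2.18e-8 kg
m_planck = math.sqrt(h_bar * c / G)

# 1. Planck-Einstein relation E = h*f, for a photon of green-ish light
f = 5.0e14                # Hz (illustrative choice)
E_photon = h * f

# 2. Mass-energy equivalence E = m*c^2, for the Planck mass
E_planck = m_planck * c**2

print(f"m_P = {m_planck:.3e} kg, E_photon = {E_photon:.3e} J")
```

The enormous ratio E_planck / E_photon is one way to see why single-graviton effects are expected to be so hard to probe experimentally.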
Question: Would you take a look at my entry and let me know if this version of gravity has any merit in terms of meeting your criteria of having fewer bits? Any response appreciated!
Thanks,
Don Limuti
Thank you, Terry Bollinger, for your criticism of my essay. I understand that it was written poorly. Its main aim is to attract researchers to continue Descartes' theory of everything, taking into account modern achievements in physics. Descartes' principle of the identity of physical space and matter allows us to remodel Heisenberg's uncertainty principle into a principle of definiteness of points of physical space, according to which an infinitely large momentum is required to get to a point of it. Look at my essay, "FQXi Fundamental in New Cartesian Physics" by Dizhechko Boris Semyonovich, where I showed how radically physics can change if it follows this principle. Evaluate it and leave your comment there. Do not allow New Cartesian Physics to go away into nothingness.
Sincerely, Dizhechko Boris Semyonovich.
Terry,
This is a fine essay with many interesting points, eminently clear and sensible. On your main theme of simplicity, you should check out Inés Samengo's excellent essay. She has a similar take, but also considers the scope of a theory as a second key factor in determining what's fundamental. And she makes the point that these two criteria are not necessarily in synch. FYI, though essay ratings have to be done by 2/26, I believe we can continue reading and posting comments afterwards. So no rush!
As you know from looking at my essay, I agree that "a better way to think of physics is not as some form of axiomatic mathematics, but as a type of information theory." And I like the way you characterize the difficulties we face when we have a theory that seems close to being fundamental -- your description of "the trampoline effect" was especially vivid and on point, with the Standard Model. Most of all, though, I like your general attitude - you can get seriously involved in specific issues (your "challenges"), but also really broad ones - like the "lumpiness" you mention in your comments to Karen Crowther's essay: "Our universe is, at many levels, "lumpy enough" that many objects (and processes) within it can be approximated when viewed from a distance."
You were writing about renormalization, and making an interesting shift in perspective. Physicists have tried to understand this by delving into the mathematics, which by now is apparently well-understood. You suggest that a different viewpoint might also help, comparing this with many other cases in which the "approximate" (or "effective") properties of a complex system define it more usefully at a higher level. I agree that this is a deep and important characteristic of our universe, where lower-level complexity supports new and simpler kinds of relationships, where new kinds of complexity can become important. I hope this perspective can eventually elucidate the amazing complications of our current physics.
Your summary credo is excellent: "the belief that simplicity is just as important now as it was in the early 1900s heydays of relativity and quantum theory." The wonder of our situation is that we're still trying to grasp exactly what kind of simplicity those two foundational theories are showing us.
By the way, I'm much in sympathy with your remarks to Flavio, above. The earliest-submitted essays in these contests can be discouraging, and it's a marvelous relief when a really good one shows up - in my case it was Emily Adlam's that rescued me from despair. So thanks for joining in!
Conrad
Dear Terry,
I was most impressed, even inspired. Your ability to find the right questions is leagues above most who can't even recognize correct answers! Lucid, direct, one of the best here.
I entirely agree on simplicity as the title of my own essay suggests, but isn't a reason we haven't advanced that our brains can't quite yet decode the complex puzzle (information)?
But now, more importantly: I'd like you to read my essay, as two of your sought answers are implicit in an apparent classical mechanism reproducing all of QM's predictions and CHSH > 2. Most academics (& editors) fear to read, comment, or falsify due to cognitive dissonance, but I'm sure you're more curious and honest. It simply follows Bell, tries a new starting assumption about pair QAM using Maxwell's orthogonal states, and analyses momentum transfers.
Spin 1/2 & 2, etc., emerged early on and are in my last essay (scored 8th but no chocs). Past essays (inc. those scored 1st & 2nd) described a better logic for SR which led to a 'test by QM'. Another implication was cosmic redshift without accelerating expansion, closely replicating Euler at a 3D Schrodinger sphere surface and Susskind's seed for strings.
By design I'm quite incompetent to express most things mathematically. My research uses geometry, trig, observation & logic (though my red/green socks topped the 2015 Wigner essay). But I do need far more qualified help (consortium forming).
On underlying truths & the SM, gravity, etc., have you seen how closed, multiple & opposite helical charge paths give a toroid? ...but let's take things 2 at a time!
As motion is key, I have a 100-sec video giving spin half (QM, etc.) which you may need to watch 3 times, then a long one touching on Euler but mainly redshift, SR, etc. But maybe read the essay first.
Sorry that was a preamble to mine but you did ask! I loved it, and thank you for those excellent questions and encouragement on our intellectual evolution.
Of course I may be a crackpot. Do be honest, but I may also crack a bottle of champers tonight!
Very best
Peter
Don,
Thank you for both your supportive remarks and your intriguing comments on a non-boson approach to gravity! I will definitely take a look, though I should warn you that my reading queue is getting a bit long.
I'd say that your proposing a "non-boson" approach sounds pretty radical... except that after about 40 years of trying, the boson approaches still haven't really worked, have they? Also, general relativity, which does succeed very well experimentally (well, there is that dark energy thing) is anything but "boson" based. I think folks underestimate just how utterly incompatible the boson approach of quantum gravity and the geometric approach of general relativity are! The very languages are so utterly different that it's hard even to say what either one means in the language of the other.
So, thanks again, and I'll get to your essay as soon as I can.
Cheers,
Terry
Dear Terry Bollinger,
My challenge #0:
Accept that the border between past and future is a non-arbitrary point of reference; hence the cosine transformation is more concise than the complex-valued Fourier transformation. Only the redundant information of a chosen point t = 0 is missing.
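Eckard's conciseness claim can be illustrated numerically: for a real signal with a fixed t = 0 reference, the type-II DCT stores N real coefficients where the complex FFT stores N complex ones, with no loss of information. A minimal sketch using NumPy and SciPy (the test signal and its length are arbitrary illustrative choices):

```python
import numpy as np
from scipy.fft import dct, idct

# A real-valued signal of length N with a fixed origin (t = 0 at index 0).
N = 64
t = np.arange(N)
x = np.cos(2 * np.pi * 3 * t / N) + 0.5 * np.cos(2 * np.pi * 7 * t / N)

# Complex FFT: N complex coefficients = 2N real numbers (half redundant
# for real input, but phase bookkeeping must still be carried).
X_fft = np.fft.fft(x)

# Type-II DCT: N real coefficients, no phase bookkeeping at all.
X_dct = dct(x, norm='ortho')

# Both transforms are lossless; the DCT inverts back to the signal
# while storing half as many raw numbers.
x_back = idct(X_dct, norm='ortho')
assert np.allclose(x_back, x)
print(X_fft.size * 2, X_dct.size)  # real numbers stored: 128 vs 64
```

Whether this counts as evidence for a non-arbitrary past/future border is of course Eckard's philosophical claim, not something the arithmetic itself decides.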
Thank you for encouragement,
Eckard
Conrad,
[Argh, I almost became Anonymous! Why in the world does FQXi automatically sign people out after a few hours, without even giving a warning like everyone else in the world? And similarly, why do they keep expiring the reCAPTCHA? That's not security, that's just annoying, argh2! Keeping folks signed in is the norm these days!]
First, I should probably mention that I've posted a follow-up to my contemplation of the perturbative issue (post 144023) you just mentioned. That is the one in which I took a deeper look at the issues underlying Criterion 4 from Karen Crowther's superb essay.
Sleeping on that issue precipitated a rather unusual early-morning chain of analysis that I documented in real time in post 144220. Here is my final, fully generalized hypothesis from the end of that analysis chain:
All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "lumpy" structure of our universe.
If that assertion makes your eyes pop a bit, please take a look at my analysis chain at the above link. Once I went to this (for me at least) new place... well, it became very hard to go back. That's because even equations like E = mc² have a scale-dependent, perturbative component if you look at them across the full scale spectrum of the universe, since at the quantum scale mass fuzzes out due to virtual pairs, just as in QED for electrons. Including math in that assertion was the final part of the sequence. Again, take a look at why at the above link if you are interested.
Since I don't know yet whether I'm using links correctly, I'll keep this reply separate and create another one to address the main content of your thoughtful and generous post.
Cheers,
Terry
Dear Terry,
It has always been the case that the very high-end computing requirements of theoretical physics produce machines and codes specialized to the theoretic structure. So of course the IT community is always a key player. Lattice gauge theory, among others, is very compute-intensive stuff! But let's get to the fundamental physics of the subject...
Have you, in your 'broad' research on the subject, run across the Rishon model? It requires only two types of quanta (T & V) to create the algebraic group of quarks and leptons (QC/ED, actually).
H. Harari and N. Seiberg, "The Rishon Model", Nucl. Phys. B, Vol 204 #1, p 141-167, September 1982.
So the reductionist approach to the 'minimal quantum basis' problem does reveal a somewhat 'binary' solution.
More to the IT-ish point, though, your software skills and devotion to the algebra of these quantum subjects could well be of GREAT use. Do you by chance write JavaScript? There is a nice Java code for displaying the QC/ED group theory for some academic research applications as well as public explanations.
Another interesting point you raise earlier in your essay: you offer Challenge #1 - "What is the full physics meaning of Euler's identity, e^(iπ) + 1 = 0?". That is actually an elegantly simple fundamental question/criterion, but just a little off the mark. Of course, mathematically we know that for physics to have a unique solution it must have a cyclic variable. At least, all the best formal proofs of uniqueness reduce a conformal mathematics problem to a cyclic variable, removing all true singularities (including the point-like particle approximation).
So how does e^(iθ) fit in? Well, it seems that the universe is cyclic in mass and time... NOT radius and time, as the astronomic observables would hope/make easy. So the general theta is actually theta_mass-time! For a more complete answer as to why this works, read my essay, if you please.
Further you discuss: "If someone can succeed in uncovering a smaller, simpler, more factored version of the Standard Model, who is to say that the resulting model might not enable new insights into the nature of gravity?" so please see
C.W. Misner, K.S. Thorne and J.H. Wheeler, Gravitation, W.H. Freeman and Co., p. 536, 1973, in which the Nobel-winning author (Thorne) notes that mass is area-like at small (Planck) scale.
Here the discussion can go into the finite representation geometries, which are area-like, and their respective quantum state algebras. Or it could look at the influence of ralpha'/R on BH theory, as I've long advocated with Prof. Mathur (see his essay), in which the strong (conical) lensing effects observed are due to "PRESERVED" matter in black holes. An interesting inquiry; again, read further into the literature.
Best regards,
Dr Wayne Lundberg
https://fqxi.org/community/forum/topic/3092
p.s.
I too have a 30-yr civil service career but started publishing on physics topics in 1992. More DoD stories...
All,
I'm having some difficulty getting in the amount of FQXi time I wanted to this weekend, but I still hope to get to all of your excellent comments and questions this evening, Sun 18 Feb.
Cheers,
Terry
Terry,
Seems to be some subterfuge on scoring. My score for you on 2/16 was an 8, reflecting a high opinion of your piece. Hope you can check out my essay.
Jim
...this time with the first 'h' (ttp): https://youtu.be/WKTXNvbkhhI - 100 sec... Classic QM
Dear Terry,
Congratulations for the essay contestant pledge that you introduced (goo.gl/KCCujt) --- I think we should all follow it, and I will certainly attempt to from now on. Congratulations also on the truly constructive comments that you have left so far on the threads of many of the participants in this contest. I thought I would use a similar format and comment on your essay!
What I liked:
- Your essay is well written and interesting to read: at the end, I wanted more of it!
- You introduce vivid/memorable expressions to describe your main points: the principle of binary conciseness, the trampoline effect, foundation messages. I specially like the trampoline effect, defined as the bouncing-off of the near-minimum region of Kolmogorov simplicity by adding new ideas that seem relevant, yet in the end just add more complexity without doing much to solve the original simplification goal. I think you will agree that, when you read some of the essays submitted to your typical FQXi contest, you can observe spectacular examples of the trampoline effect. It seems easy to diagnose a trampoline effect in accepted theories that we find lacking, or in alternative theories that we find even more flawed. True wisdom, of course, would be to be able to become aware of the trampoline effect in our own thinking... which is so hard to do!
- You directly address the specific essay contest question, "What is fundamental?" (at least, in the first half of your essay)
- Nicely worded and accessible introduction to the famous equation E = mc²
- Pedagogical presentation of Kolmogorov complexity for the reader not already familiar with the concept
- Interesting parallel between the increased difficulty in reducing Kolmogorov complexity in an already well-compressed description and the increased difficulty in improving an already well-developed theory
- It was interesting to end with challenges to the physics community, although it fits only tangentially with the essay topic (it would make a great essay topic for a future contest!)
- Your challenges #2 and #3 are profound questions: WHY the spin-statistics theorem? WHY the three generations in the Standard Model? There is certainly much to be learned if we can make progress with these fundamental "Why?" questions --- although the particular physics of our particular universe might just be arbitrary at the most fundamental level, forever frustrating our hopes of ultimate unification and simplicity.
What I liked less / constructive criticism:
- You say that the content of foundation messages (data sets expressing structures and behaviors of the universe that exist independently of human knowledge and actions) must reflect only content from the as-is universe, despite the extensive work that humans must perform to obtain them. But this presupposes that we can have reasonable access to the "as-is" universe, which many historians and philosophers of science would deny, saying that observations are always more-or-less theory-dependent (there is no such thing as a pure observation, independent of the previous knowledge of the observer): see for instance the articles "Theory-ladenness" and "Duhem-Quine Thesis" in Wikipedia.
- You say that in physics, the sole criterion for whether a theory is correct is whether it accurately reproduces the data in foundation messages. It is true that reproducing data is an important criterion, but is it the sole one? For example, a modern, evolved, computer assisted epicycles-based Ptolemaic model (with lots and lots of epicycles) could probably reproduce incredibly well the planetary positions data, but we could use other criteria (simplicity, meshing with theories explaining other phenomena) to strongly criticize it and ultimately reject it.
- I am not sure that the map analogy and the associated figure help clarify the concept of a Kolmogorov minimum. Maybe it's because I was distracted by the labels: Why πr² in one of the ovals? Why Euler's identity? Why the zeros and ones along the path? Why is the equation E = mc² written along a path that goes from Newton to Einstein, since it is purely an Einsteinian equation?
- Your short section on the "Spekkens Principle" is very compact and will probably remain obscure to many readers (it was for me). It might have been beneficial to expand it (I understand there was a length limit to the essay...) or to drop it altogether.
- Concerning your challenge no. 1... Like many mathematicians and physicists, I am in awe of Euler's identity, but I am not sure that there is explicit undiscovered physics insight hiding within it. Once you understand that the exponential function is its own derivative, that the exponent i in e to the i*t comes in front when you differentiate with respect to time, that multiplication by i rotates a vector by 90° in the complex plane, and that the velocity vector in uniform circular motion is perpendicular to the position vector, it becomes "evident" that you can model uniform circular motion (hence, the trigonometric circle) with an exponential function with an imaginary argument: Euler's identity then follows from the fact that pi radians corresponds to half a turn, which is the same as multiplying by -1! If there is anything truly remarkable in all this basic math, it is perhaps that the ratio pi (or, more often, 2 times pi) appears so often in the fundamental equations of physics, even in phenomena that do not seem related in any way to circles or rotations.
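The chain of reasoning in that last point is easy to check numerically; a minimal sketch using Python's standard cmath module:

```python
import cmath
import math

# e^(i*pi) rotates the unit vector 1+0j by half a turn (pi radians).
z = cmath.exp(1j * math.pi)
print(z)  # approximately -1 + 1.22e-16j: exactly -1 up to floating-point rounding

# Euler's identity e^(i*pi) + 1 = 0, to floating-point precision.
assert abs(z + 1) < 1e-12

# Uniform circular motion: e^(i*t) traces the trigonometric circle,
# and a quarter turn (pi/2 radians) lands on the imaginary axis, at i.
q = cmath.exp(1j * math.pi / 2)
assert abs(q - 1j) < 1e-12
```

As Marc's argument suggests, the identity falls out of the rotation interpretation of multiplication by i; the numerics merely confirm the half-turn and quarter-turn cases.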
And finally, a question:
In your expression "principle of binary conciseness", what does the "binary" stand for exactly? The fact that it deals with TWO (or more...) theories that address the same data, or the fact that Kolmogorov complexity is often applied to strings of BINARY digits?
Congratulations once again, and welcome to the FQXi community! I hope you have the time to take a look at my essay and leave comments --- especially constructive criticism, which is unfortunately so hard to get in these contests, because of the fear of rating reprisal.
Marc
Dear Terry,
This is a well-written essay for a general science reader (by far the hardest type of essay to write). Looks like you have a good shot at winning. The word "tree" is simple, but a tree is complex. Does a simple equation mean a simple thing? Perhaps a simple equation just fits with how we communicate or think.
A side note: I thought spin 1/2 is the way it is because of interaction with photons, a spin-1 particle.
All the best,
Jeff Schmitz
A nice essay. I think you would be interested in my 2012 FQXi essay titled "A Classical Reconstruction of Relativity" located here:
https://fqxi.org/community/forum/topic/1363
And my work on modelling the electron/positron wavefunctions as 3D standing waves, located here: http://vixra.org/pdf/1507.0054v6.pdf
I also have an essay in this year's contest titled "A Fundamental Misunderstanding" about a Classical explanation for QM entanglement (EPR experiment).
Regards,
Declan Traill
Dear Jeff,
Thank you for your kind comments! I looked up your interesting short essay and added a posting on it.
Regarding spin, it's definitely the interaction between identical fermions, e.g. a bunch of tightly packed electrons, that makes them unique. What happens is that the antisymmetric nature of the fermion wave functions causes surfaces of zero probability of finding an electron to form between them. This compresses the electrons, which do not like that at all and fight back by trying to expand the space within these zero-probability cells that form around them. The result is a kind of probability foam that we so casually call "volume" in classical physics. Without this effect, earth would be just a centimeters-ish black hole.
This Pauli exclusion occurs for any cluster of identical fermions, regardless of electromagnetic or any other kind of charge, and so is completely unrelated to electromagnetism and the spin 1 photons that make electromagnetism possible.
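The exclusion effect described above can be made concrete with a toy two-particle wave function; a minimal sketch in Python (the two one-particle orbitals are arbitrary illustrative choices, not taken from any specific physical system):

```python
import numpy as np

# Two arbitrary one-particle orbitals on a 1D grid (illustrative only).
def phi_a(x):
    return np.exp(-x**2)        # Gaussian centered at the origin

def phi_b(x):
    return x * np.exp(-x**2)    # odd orbital, first-excited-like

# Antisymmetric (fermionic) two-particle wave function:
# psi(x1, x2) = phi_a(x1)*phi_b(x2) - phi_b(x1)*phi_a(x2)
def psi_fermion(x1, x2):
    return phi_a(x1) * phi_b(x2) - phi_b(x1) * phi_a(x2)

# Symmetric (bosonic) combination, for contrast.
def psi_boson(x1, x2):
    return phi_a(x1) * phi_b(x2) + phi_b(x1) * phi_a(x2)

x = np.linspace(-3, 3, 201)

# Pauli exclusion: the antisymmetric amplitude vanishes identically
# wherever the two identical fermions coincide (x1 == x2) - the
# "surface of zero probability" between them. The bosonic one does not.
assert np.allclose(psi_fermion(x, x), 0.0)
assert not np.allclose(psi_boson(x, x), 0.0)

# Exchanging the two particles flips the sign of the fermionic amplitude.
assert np.allclose(psi_fermion(x, -x), -psi_fermion(-x, x))
```

Note this vanishing depends only on antisymmetry under exchange, not on any charge, which is the charge-independence point made above.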
By far the best short explanation of antisymmetric (spin ½) and symmetric (spin 1) wave functions that I've encountered on the web is this pair of teaching notes by Simon Connell, a physics professor in South Africa:
Symmetric / antisymmetric wave functions
Cheers,
Terry Bollinger, Fundamental as Fewer Bits