An excellent essay, I liked it very much! A very well-thought-out criticism of the prevailing idea that there are some terminal building blocks. Moreover, it contains as a remedy the Parmenidean principle, and various holistic ideas endorsed by modern physics. I also liked the Weinberg-Anderson duality, which is a synthesis that goes beyond the two limiting extremes. On a personal level, I enjoyed it because it touches on some of my favorite topics, including Wheeler's geometrodynamics, his "one-electron" idea, the role of mathematics in physics, and Bohm's "implicate order". A great read!
Of Lego and Layers (and Fundamentalism) by Dean Rickles
Dear Dean Rickles,
Since I disliked the unreasonable ineffectiveness of mathematics in the physical description of hearing, I arrived at something extremely simple.
My last boss refused to comment on it because it is too fundamental.
Admittedly I am not a fan of Parmenides, and I tend to suspect symmetries of being redundant, void information. Therefore I can only hope for your fierce resistance.
Best,
Eckard Blumschein
Dear Dean Rickles,
Aside from the philosophical review of the last centuries' ideas, I found it interesting that you note:
"Attempts to geometrize physics (e.g. John Wheeler's geometrodynamics, or even Einstein's unified field theory) are of this kind: from pure geometry one tries to extract the particulate nature of the world as we find it (with discreteness, charge, mass, and so on, all falling out of the spacetime metric, or metric and topology)."
My first paper at DPF 92 (Fermilab) was titled "Topological Combinatorics of a Quantized String Gravitational Metric", physics/9712042. By the way, it isn't topology so much as curvature and area that define a finitary basis for particle theory. But topology does explain why exactly three generations appear in the representative geometry and non-commutative matrix algebra.
Anyway, I have a nice physics paper if you're interested in a different mathematical insight.
Wayne L.
https://fqxi.org/community/forum/topic/3092
Dean,
I would have to disagree on one point you made.
"thus, we might say: 'I believe that Max Tegmark is really a bunch of excitations of quantum fields'; or, if we have read Tegmark's book, 'I believe that Max Tegmark is really a mathematical sub-structure in a multiverse of such structures.' It is rare these days to find people espousing this radical eliminitivism."
I would change the last sentence to read: "It is rare these days to find people espousing this radical eliminitivism except in FQXi.org essay contests."
In my own essay I went after the lesser flea of gravity. Do check it out.
I would like to surpass the other positive comments on your essay, but there were so many of them that I will just say: "It is that good".
Don Limuti
"The history of philosophy provides many responses: atoms and void, ONE, numbers, four elements, geometry, substrata, mind-stuff (truly the funda-mentalists!), states-of-affairs, etc. Physics often informs (and perhaps corrects) these fundamental theories, for naturalists at least; but physics in this case is not considered to provide the most fundamental description of reality: it leaves too much out." Whereas all the responses given by philosophy are invalid, physics provides fundamental descriptions of reality. And why has philosophy failed? Feynman gives some thoughts about this: "what is an object? Philosophers are always saying, "Well, just take a chair for example." The moment they say that, you know that they do not know what they are talking about any more."
Of course, there are cosmologists sharing similar levels of ignorance about the nature of reality; Lawrence Krauss is a notable example. David Albert doesn't show a much better level of understanding either; his review of Krauss' book is an example of it.
The idea that nature has infinite layers is not a scientific notion of reality.
"A philosopher might think, "why spend all that money on particle accelerators when you could pay just me to think, with no equipment other than my brain, to find out what is truly the fundamental structure of reality?"", because the money spend on the philosopher would be lost money, as the history of philosophy shows. One cannot find the fundamental structure of reality without first describing reality, and philosophers fail at this, as Feynman mentioned.
"It is a widespread assumption that scientific progress means finding more basic constituents." Not even close. There are lots of progress made on studying new properties of existing constituents or new ways of combining existing constituents. There are even new scientific disciplines built around such matters! Stating there is no scientific progress if no new fundamental particle has been discovered is ab incorrect conception of science, even if we reduce science to mean only physics.
"physics (elementary particle physics, or something like it) should (and can) furnish a complete account of the world: any and all things should be traceable back to the fundamental layer". This is the older reductionist picture, which has been disproved time ago. This picture doesn't present science not even physics alone.
"The idea is, of course, that the compositional structure of physical reality is something like stacking blocks of Lego to produce a bigger, more complicated object possessing different properties to those found at the level of individual Legos. But we aren't supposed to ask what the blocks are made of, since we would have to then ask the question again, possibly ad infinitum. The fundamentalist intuition is that there must be some end to the questioning." This is not about intuition; it is about being practical, a concept is often missing in the philosophers toolkit.
"So why believe that there is a fundamental level? Why not an infinite descending hierarchy of levels?" One can imagine the existence of infinite layers of existence, an infinite sequence of turtles --paraphrasing that old lady--, but those infinite layers cannot be detected or refuted by us, finite beings living in a finite place. So if one cannot confirm the existence of an infinite layers of existence, debating about them is a waste of time, and the best we (scientists) can do is to propose some fundamental layer as hypothesis and see if we can disprove that hypothesis by founding data that requires a more fundamental layer of existence. What if some day we find some layer that explains all that we know up to that moment? Would we take that layer as truly fundamental or would we play philosophers' games and imagine more turtles?
The "atomic principle" and the "Parmenidean principle" correspond to the synthetic and analytic components of science, but in science we are fully aware that the intermediate states are so fully real as the extremes. We scientists must consider elementary particles as fundamental entities, but we don't consider nuclei, atoms, molecules, cells, planets or any other combination of particles as "mere 'appearance'".
"Fundamental" refers to the foundations of something, or the basis on which other things rest (fundare= 'to found'). Hence it often implies that something is being generated (built) from it, or being made to rest on it (i.e. reduced to it). There exists a dependence relation between less and more fundamental things that define 'levels' of reality." There is only one level of reality. Elementary particles and clusters of macromolecules represent one and the same reality, not two different 'levels'. We can represent this unity using equations as AB=A+B.
"What can be reduced (what has parts) is not fundamental according to this mindset. [...] If something can be reduced, then it is often asserted that that thing does not really exist (mere appearance versus reality)---less derogatory is to say that it is emergent, or scale-dependent". Same missunderstanding again. As stated above composite objects really exist. Reality is unique, there is no scale-dependent reality. The Milky Way, at the cosmological level of sizes, is so real as H2O at the atomic-molecular level. And the concept of emergence is better left to characterize properties of the composite object that don't exist at the components level.
"thus, we might say: `I believe that Max Tegmark is really a bunch of excitations of quantum fields'; or, if we have read Tegmark's book, `I believe that Max Tegmark is really a mathematical sub-structure in a multiverse of such structures.' It is rare these days to find people espousing this radical eliminitivism". Maybe it is rare because people today is better informed and knows that such claims are incorrect.
"The mereological account, of reduction to simples, is already in trouble in standard quantum field theory in which there is, strictly speaking, no basic, elementary, eternally persisting, concrete, physical stuff" as such." At contrary, the basic elementary physical stuff are elementary particles and, so far as we know, everything in the universe is made of them:
"The theories and discoveries of thousands of physicists since the 1930s have resulted in a remarkable insight into the fundamental structure of matter: everything in the universe is found to be made from a few basic building blocks called fundamental particles, governed by four fundamental forces."
https://home.cern/about/physics/standard-model
Dirac didn't discover antimatter. In fact, he believed the mysterious positively charged particle appearing in his flawed equation would be the proton.
"the fact that quantum field theory makes any particle a complex dynamical system (of virtual particles which comprise the `physical' particle) implies that A-TOMs are dead". No, he confounds reality with model. Quantum field theory is only an approximated description of reality built around the concept of bare particle. It is only after a renormalization procedure that those unphysical bare particles are surrounded by clouds of virtual particles to generate real particles. The real particles that we measure in experiments aren't the bare particles that one finds in the Lagrangians of quantum field theory. One can formulate the world directly in terms of real particles, one simply abandons quantum field theory by direct-particle interaction theories.
"Nobel prizes are routinely awarded for finding the smaller, simpler constituents of complex systems." Only if one ignores the history of Nobel Prizes.
"Of course, it doesn't show that reduction to more basic elements and laws is impossible, only that generation of complexity from these basic parts is often not possible. This has been taken to indicate that a theory of everything based on these simples and their laws alone would not enable us to `deduce the world.'" That is not the content of Anderson paper. The problem is not computational. The problem is not it is difficult to extract the properties of composite objects from properties of components. The main remark in Anderson's paper is on the existence of emergent properties that don't exist at the component level. So a full understanding/study of components cannot provide a description of the composite object.
Anderson's critique is directed against classic reductionism, understood as the study of components only, not against the existence of a fundamental level of description with entities and associated laws. The reduced theory remains less fundamental; simply, the old reductionist attitude has to be replaced by a modern integrationism.
"Hence, such systems are irreducible in the sense that one cannot find a unique micro-grounding which would imply the properties and laws of the macro-level". But this irreducibility is computational rather than ontological. Twenty years ago we could study certain complex systems at the molecular level, because computers lacked the needed performance. Today we can run simulations of those systems. The same happens with systems are too complex for current computers. In 50 years we can run certain simulations thoday we can only dream.
"Basically, the ultimate approach is grounded in mathematical laws that aim to represent a unique system (some basic field or particle): they are specific and are usually based on symmetry principles (with elementarity defined in terms of invariances). In contrast, complex systems, inasmuch as they admit a representation in terms of exact mathematical laws at all, possess much universality or what philosophers call `multiple realizability.'" Again this confounds reality with specific models. Quantum field theory is not an ultimate approach, but a model used in certain situations. There is no objective reason to believe that different complexity layers of description require qualitatively different ways of description. A simple example is classical chaos. The same equations, the same 'mathematical' laws, that describe the simple behavior of an harmonic oscillator, describe the very complex behavior of chaotic systems.
"This indicates that the critical exponents are independent of the microscopic details of the matter, so that the systems occupy the same universality class. Systems at critical points obey conformal symmetry: one can rescale in various ways and the system looks identical
(i.e. it is a fractal). One can adopt the view that it is such symmetry that is doing the work in generating the properties of critical systems, just as it is the symmetries (e.g. U(1), SU(2), and SU(3)) that generate the physics of elementary particles." Those critical exponents are just a consequence of the microscopic details of the matter. What happens is that the system evolves towards a final state where correlations are minimized and different systems satisfy the same symmetries, the symmetries of the state, but the correlations continue here, the physics has not vanished, and the system can evolve to another less-symmetric state if properly perturbed.
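For reference, the scaling relations at issue here, in their standard textbook form (my addition):

```latex
% Near a critical point T_c, with reduced temperature t = (T - T_c)/T_c,
% the correlation length, order parameter, and specific heat scale as
\xi \sim |t|^{-\nu}, \qquad M \sim (-t)^{\beta}, \qquad C \sim |t|^{-\alpha}.
% The exponents (\nu, \beta, \alpha) depend only on the universality
% class (spatial dimension and symmetry of the order parameter), not on
% microscopic details; at t = 0 the correlation length diverges and the
% system becomes scale-invariant.
```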
"The approach was developed to understand hadrons (which quantum field theory was then struggling with), and supposed that there was an infinite spectrum of particles (laid along a `Regge trajectory,' with ever rising masses), but, crucially, no one was more fundamental than any other,
thus bypassing a standard particle physicist's question: which particles are fundamental and which are composite? One could in fact view the particles as either fundamental (part of a composite system) or composite themselves." And another confusion between model and reality. The S-matrix approach only provides an approximated description of scattering processes. It is this approximated character which does that different Hamiltonians can be equivalent from the point of view of S-matrix theory. By inspecting only the S-matrix we couldn't differentiate certain fundamental from certain non-fundamental entities, which we could differentiate at the Hamiltonian level directly.
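Schematically (a standard statement, added here for clarity, not part of the comment above): the S-matrix records only asymptotic transition amplitudes between free states,

```latex
S_{fi} = \langle f,\ \mathrm{out}\,|\,i,\ \mathrm{in} \rangle
% Field redefinitions that leave these asymptotic amplitudes invariant
% can relate different interaction Hamiltonians, so the S-matrix alone
% cannot distinguish between them.
```

so information that distinguishes Hamiltonians at finite times simply never appears in it.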
Attempts to geometrize physics (e.g. John Wheeler's geometrodynamics, or even Einstein's unified field theory) have failed miserably. Feynman devotes part of his lectures to explaining the differences between physics and geometry. Not only that, but Feynman and others demonstrated how one can formulate gravity non-geometrically.
"Bohm had personal reasons for following this `wholeness' view, since he believed that how we conceptualise the fundamental nature of reality has a bearing on how we relate to the world and one another. Viewing the world as so many independent, separate entities leads to an independent, separate existence, with all that entails in terms of divisions. Adopting a mentality of one, unified system eliminates divisions and establishes us as part of the same whole". This is misleading, the atomic-molecular picture of the world, the 'reductionist' approach do not see reality as a collection of independent, separate entities. Universe is not made of a collection of independent particles. So this 'wholeness' view, is unneeded.
Dean,
Anything physical has to be defined, which makes it finite, making anything physical not fundamental.
So if you ask for "fundamental," do you mean "absolute," or "infinite?"
I would argue space is both, if we accept that geometry only maps space.
For one thing, GR implicitly assumes an equilibrium state (absolute zero) to the vacuum/space, with the observation that clocks and rulers dilate equally in a moving frame, so the frame with the fastest clocks and longest rulers would be closest to the equilibrium of the universal frame, i.e., space.
As for the idea of space emerging from a point: in BBT, when it was first realized that galaxies being redshifted in proportion to distance made us appear to be at the center of the universe, it was then argued it must be an expansion of space, not simply in space, because Spacetime! Then every point would appear as the center.
What got overlooked is the central premise of GR: that the speed of light is always measured at c, in any frame. Necessarily, if the light is being redshifted because it is taking longer to cross this intergalactic frame, then it is NOT constant relative to the ruler of that frame. So two metrics of space are being derived from the same intergalactic light: a stable one, based on the speed, and an expanding one, based on the spectrum. Given that c is being used as the denominator (or it would be a "tired light" issue), even the cosmologists must subconsciously realize the idea is nonsense, but since BBT can never be falsified, only patched, it keeps them employed.
Given we do appear to be at the center, and we are at the center of our own point of view, logically an optical effect might be worth considering.
Which is to say that space is infinite, as in unbounded, not limited. Since both infinity and equilibrium are non-physical qualities, they don't need a cause, and that is useful when one is trying to discover the fundamental.
Regards,
John B Merryman
Dear Dean,
congratulations on a well thought-out and written essay. More than any other I've seen so far, you position the question of fundamentality within the historical discourse, drawing on an impressive background knowledge to do so. There are many threads in your essay for me to chase down!
I like your picture of the Parmenidean world view as a sort of limit of the atomist one: instead of many different kinds of things in the void, there's only one, making the void superfluous. You can take that in the other direction, too, arriving at something like Anaxagoras' worldview of infinite divisibility. This puts the options on a kind of spectrum: with Anaxagoras, you have an infinitely deep bottom, from which everything flows, while for the Eleatics, the top dictates what goes on below. One could imagine various kinds of atomism interpolating between them, each with N kinds of atoms, with the extremes being N=1 and N=infinity.
Maybe, then, it doesn't really matter which of these one chooses to be 'fundamental': just as with a differential equation, where you can stipulate its initial conditions, or its final conditions, or indeed conditions at any point in between, you can take each of these levels as fundamental with equal justification.
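A minimal worked example of that analogy (my own, assuming the simplest linear case):

```latex
% For the harmonic oscillator \ddot{x} = -\omega^2 x, the solution is
x(t) = x(t_0)\,\cos\omega(t - t_0) + \frac{\dot{x}(t_0)}{\omega}\,\sin\omega(t - t_0).
% Data (x(t_0), \dot{x}(t_0)) given at ANY instant t_0 fixes the whole
% trajectory, so no particular time is privileged as the 'fundamental' one.
```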
I'm not quite sure the world really is structured like that, however. Rather, it's another suggestion you make that has a much greater appeal to me: the idea that the competing approaches are complementary in some Bohr-like sense. Then, the universe is neither top-down nor bottom-up, nor some sort of mixture of the two; rather, one needs both the top-down and the bottom-up view in certain situations, even though the two are ultimately incompatible: there's simply not a single story you can tell that neatly covers everything, just as there's not a single story you can tell about quantum systems that accounts for all phenomena (rather, there's one for every complete set of compatible observables).
This fits rather well with my own views---every single model of the world leaves something out, hence, there's no such thing as 'the' fundamental level; instead, each model will have its own notion of fundamentality.
Thanks again for an edifying read!
Cheers,
Jochen
good luck on the grant app.
Dear Dean
If you are looking for another essay to read and rate in the final days of the contest, will you consider mine please? I read all essays from those who comment on my page, and if I can't rate an essay highly, then I don't rate it at all. In fact, I haven't issued a rating lower than ten. So you have nothing to lose by having me read your essay, and everything to gain.
Beyond my essay's introduction, I place a microscope on the subjects of universal complexity and natural forces. I do so within the context that clock operation is driven by quantum mechanical forces (atomic and photonic), while clocks also serve to measure General Relativity's effects (spacetime, time dilation). In this respect clocks can be said to possess a split personality, giving them the distinction that they are simultaneously a study in QM, while GR is a study of clocks. The situation stands whereby we have two fundamental theories of the world, but just one world. And we have a single device which serves the study of both those fundamental theories. Two fundamental theories, but one device? Please join me and my essay in questioning this circumstance.
My essay goes on to identify natural forces in their universal roles, and how they motivate the building and maintaining of complex universal structures and processes. Star fusion processes sit within a "narrow range of sensitivity" such that stars are neither led to explode nor to collapse under gravity. We think how lucky we are that the universe is just so. We can also count our lucky stars that the fusion process that marks the birth of a star also leads to an eruption of photons from its surface. And again, how lucky we are! For if it didn't, then gas accumulation wouldn't be halted and the star would again be led to collapse.
Could a natural organisation principle have been responsible for fine tuning universal systems? Faced with how lucky we appear to have been, shouldn't we consider this possibility?
For our luck surely didn't run out there: these photons stream down on Earth, liquefying oceans which drive the geochemical processes that we "life" are reliant upon. The Earth is made up of elements that possess the chemical potentials that life is entirely dependent upon. Those chemical potentials are not expressed in the absence of water solvency. So again, how amazingly fortunate we are that these chemical potentials exist in the first instance, and additionally within an environment of abundant water solvency such as Earth, able to express these potentials.
My essay is an attempt at something audacious. It questions the fundamental nature of the interaction between space and matter, G_uv = T_uv, and hypothesizes that the equality between space curvature and atomic forces is due to a common process. Space gives up a potential in exchange for atomic forces in a conversion process, which drives atomic activity. Furthermore, Baryons only exist because this energy potential of space exists and is available for exploitation. Baryon characteristics and behaviours, complexity of structure and process, might then be explained in terms of being evolved and optimised for this purpose and existence, removing the need for so many layers of extraordinary luck to eventuate our own existence. It attempts an interpretation of the above-mentioned stellar processes within these terms, but also extends much further. It shines a light on the molecular structure that binds matter together, as potentially being an evolved agency that enhances rigidity and therefore the persistence of universal systems. We then turn a questioning mind towards Earth's unlikely geochemical processes (to which we living things owe so much) and look at their central theme and propensity for molecular rock-forming processes. The existence of chemical potentials and their diverse range of molecular bond-formation activities? The abundance of water solvent on Earth, without which many geochemical rock-forming processes could not be expressed? The question of a watery Earth is then implicated as being part of an evolved system that arose for purpose and reason, alongside the same reason and purpose that molecular bonds and chemistry processes arose.
By identifying atomic forces as having their origin in space, we have identified how they perpetually act and deliver work products. Forces drive clocks, and clock activity is shown by GR to dilate. My essay details the principle of force dilation and applies it to a universal mystery. My essay raises the possibility that nature, in possession of a natural energy potential, will spontaneously generate a circumstance of Darwinian emergence. It did so on Earth, and perhaps it did so within a wider scope. We learnt how biology generates intricate structure and complexity, and now we learn how it might explain intricate structure and complexity within universal physical systems.
To steal a phrase from my essay "A world product of evolved optimization".
Best of luck for the conclusion of the contest
Kind regards
Steven Andresen
Darwinian Universal Fundamental Origin
"Krauss argues that quantum fields and the vacuum are all one needs to explain the genesis and structure of every other thing." Has Krauss ignored Milgrom's MOND? According to Kroupa: (1) MOND has passed all empirical tests so far (in the realm of MOND's applicability). (2) Conventional physics cannot explain MOND's empirical successes. Is Kroupa wrong? Google "kroupa milgrom".
Thank you, Dean Rickles, for the references you cited in a comment under my essay!
I was not aware of "Mill-Ramsey-Lewis" philosophy. Actually, I have never read any serious philosophy. I am an experimental physicist. So, pardon my ignorance.
I will read your citations carefully and cite them as references in my future articles.
Re-realization of patterns in nature has been happening for thousands of years, which is a good sign that ancient people were at least as smart as we think we are!
Chandra.