Dear jcns

Adaptive selection is one of the most important types of top-down causation. I did not have space to go into that aspect of things in the essay, but it is discussed in two papers accessible as follows:

On the nature of causation in complex systems,

Top down causation and emergence: some comments on mechanisms (http://www.mth.uct.ac.za/~ellis/Top_down_gfre.pdf)

Adaptive selection is top-down because the selection criteria are at a different level than the objects being selected: in causal terms, they represent a higher level of causation. Darwinian selection is the special case when one has repeated adaptive selection with heredity and variation. It is top-down because the result is crucially shaped by the environment [as demonstrated by numerous experiments: e.g. a polar bear is white because the polar environment is white].

However, adaptive selection occurs far more widely than that; e.g. it occurs in state vector preparation, as I indicate in the essay.
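To make this concrete, here is a purely illustrative toy sketch of repeated adaptive selection with heredity and variation (the target pattern, mutation rate and population size are arbitrary choices for the example, not anything from the papers above): the fitness function encodes the higher level environment, and it is that higher level criterion, not the lower level bit strings themselves, that determines which variants survive.

import random

# Toy sketch of repeated adaptive selection with heredity and variation.
# The "environment" is encoded in the fitness function: it sits at a higher
# level than the bit strings being selected, yet it shapes the outcome.

TARGET = [1] * 20          # an arbitrary environmental "niche"

def fitness(genome):
    # Higher level selection criterion: how well the genome fits the niche.
    return sum(1 for g, t in zip(genome, TARGET) if g == t)

def mutate(genome, rate=0.02):
    # Variation: each bit may flip with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(50)]

for generation in range(100):
    # Selection: the fittest half survives (heredity)...
    population.sort(key=fitness, reverse=True)
    survivors = population[:25]
    # ...and reproduces with variation.
    population = survivors + [mutate(random.choice(survivors)) for _ in range(25)]

print("best fitness after selection:", fitness(max(population, key=fitness)))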

Hope that clarifies this.

George

  • [deleted]

George,

Thank you very much for the references. I'll take a close look at your paper 'Top-down causation and emergence: some comments on mechanisms' as well as your paper 'Physics in the Real Universe: Time and Spacetime.' It's my preliminary sense that we share more than a few ideas in common about the nature of time. More later.

jcns

"As to your second para, the issue is what are "fundamental aspects of reality". One does not have to agree that the only such aspects are those described by physics: what abut mathematics for example? Or logic?"

Interesting questions...

How we define "fundamental" determines how we interpret data and build models.

While in mathematics one can arbitrarily choose any consistent set of axioms as the basis of an axiomatic system, the axioms of a physics theory should represent fundamental aspects of reality. This raises the essential question: What constitutes a fundamental aspect of reality?

What I am exploring (which I briefly discuss in my essay and at length in another work) is the idea that reality obeys a principle of strict causality. From the principle of strict causality, it follows that an aspect of reality is fundamental if it is absolutely invariable. That is, regardless of the interactions or transformations to which it is subjected, a fundamental aspect of reality remains unaffected.

Reality, I suggest, can be thought of as an axiomatic system in which fundamental aspects correspond to axioms and non-fundamental aspects correspond to theorems.

The empirical method is essentially a method by which we try to deduce the axiom set of reality, the fundamental components and forces, from theorems (non-fundamental interactions). Therein lies the problem. Even though reality is a complete and consistent system, the laws extracted from observations at different scales of reality, which form the basis of physics theories, do not together form a complete and consistent axiomatic system.

The predictions of current theories may agree with observations at the scale from which their premises were extracted, but they fail, often catastrophically, when it comes to making predictions at different scales of reality.

This may indicate that current theories are not axiomatic in the sense I described above; that is, they are not based on true physical axioms: the founding propositions of the theories do not correspond to fundamental aspects of reality (as per the above definition of "fundamental"). If they were, then the axioms of distinct theories could be merged into consistent axiomatic sets. There would be no incompatibilities.

Also, if theories were axiomatic systems in the way we describe here, their axioms would be similar or complementary. True axioms can never be in contradiction.

This raises important questions with regard to the empirical method and its ability to extract true axioms from the theorems it deduces from observations. Even theories based on observations of phenomena at the microscopic scale have failed to produce true axioms (if they had, they would explain interactions at larger scales as well). The reason may be that everything we hold as fundamental, the particles, the forces, etc., is not. So we end up with theorems which can be applied successfully at the scale they were extracted from, but not at other scales.

Also, theories founded on theorems rather than axioms cannot be unified. That suggests that the grand unification of the reigning theories, which has been the dream of generations of physicists, may be mathematically impossible because their axiom sets are incompatible or mutually exclusive.
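As a minimal toy illustration of that last point (a made-up propositional example, not tied to any actual physical theory), two axiom sets can each be consistent on their own while their union admits no model at all, which is exactly the sort of incompatibility I have in mind:

from itertools import product

# Toy illustration: each axiom set is consistent by itself, but the merged
# set has no satisfying truth assignment, i.e. the union is inconsistent.

def consistent(axioms, variables=("p", "q")):
    # Brute-force search for a truth assignment satisfying every axiom.
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(axiom(assignment) for axiom in axioms):
            return True
    return False

theory_A = [lambda v: v["p"],            # axiom: p
            lambda v: v["p"] or v["q"]]  # axiom: p or q
theory_B = [lambda v: not v["p"]]        # axiom: not p

print(consistent(theory_A))              # True
print(consistent(theory_B))              # True
print(consistent(theory_A + theory_B))   # False: no model satisfies both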

So, what I find interesting is that our approaches are diametrically opposed. While you propose a top-down model of causality and the representations that model it, which I see as a deconstructive approach (gathering observational and experimental data and mathematically processing it in an attempt to extract or deduce from it the fundamental laws of the Universe), I propose a bottom-up approach in which physics emerges from the smallest and simplest possible axiom set. An axiomatic approach, as I define it, is the opposite of an empirical method. I suspect that there may be a limit to heuristics, a limit to the empirical method, and when this limit is reached, physics may have to rely on an axiomatic approach. The exploration of top-down causality may actually help find this heuristic limit, if such a limit exists. That limit is the point at which reality is unobservable, and somewhere beyond it would be the true fundamental scale.

But the fact that fundamental reality is unobservable does not imply we cannot design a physics theory that describes it. It may very well be possible to devise a complete and consistent set of axioms to which interactions at all scales of reality can be reduced. This means that even if the fundamental scale of reality remains unobservable, an axiomatic theory would make precise predictions at scales that are.

Dear George:

Excellent paper, clearly written to provide a wholesome perspective of reality through top-down causation. In other words, the sum of the parts is not the Whole, which could be more than and different from the linear sum of the parts.

The theme of your paper is vindicated by the fact that a top-down causation model with simple boundary conditions is shown to predict the observed expansion of the universe and galaxies without any bottom up causation used in the standard model of particle physics. As described in my posted paper - "From Absurd to Elegant Universe", the current paradoxes, singularities, and inconsistencies in the standard cosmology are shown to be artifacts of the absence of the top-down wholesome approach. The proposed Relativistic Universe Expansion (RUE) model, based on the top-down conservation of the relativistic mass-energy-space-time continuum, accurately predicts the observed accelerated expansion of the universe, dark energy or the cosmological constant, and galactic star velocities without the concept of dark matter. It also predicts the dilation and creation of mass without any anti-matter and eliminates the black hole singularity without the need for any superluminous inflation. The model also explains/predicts the inner workings of quantum mechanics and resolves paradoxes of the measurement problem, quantum gravity and time, and inconsistencies with relativity theory.

The evidence presented in my paper directly and mechanistically vindicates the following statements regarding the top-down causation in your paper:

" ...the foundational assumption that all causation is bottom up is wrong, even in the case of physics."

"The key feature is that the higher level dynamics is effectively decoupled from lower level laws and details of the lower level variables:......you don't have to know those details in order to predict the higher level behavior."

My paper also proves as true the following concluding statement in your paper:

".... recognizing this feature will make it easier to comprehend the physical effects underlying emergence of genuine complexity, and may lead to useful new developments, particularly to do with the foundational nature of quantum theory. It is a key missing element in current physics."

I am delighted to read your paper as it mirrors the overall theme and results of my paper. I would greatly appreciate and welcome your comments on my paper.

Sincerely,

Avtar Singh

    • [deleted]

    Dear George Ellis,

    I found your essay very comprehensible, succinct and eloquent, as others have also found. It was enjoyable to read. The subject matter is interesting to me. I was also interested to see, in your comments, that you have written another essay in which natural selection is discussed. I have just touched on this kind of pattern control at the end of my essay and if there had been space I would have liked to have discussed it further. So it was really good to see your essay here because you have done a really thorough and clear job of getting the very important concept across.

    One complaint often given by intelligent design supporters is that complex forms or functions cannot arise by random chance. I think we are both saying that the outcome is not chance but a consequence of the organisation that already exists -and- the rules of physics and biology. Your photoshop example made me think of how an egg shell is formed. Calcium carbonate from ground-up oyster shell or cuttlefish bone may be input to a bird (organised structure) and a beautifully formed eggshell is output. That egg form would not occur without the complex bird organism. It is a product of the organisation and rules, not just self assembly of atoms.

    I really like that you have considered this over many different scales, from the smallest to the largest. There seems to be organisation at whatever scale is investigated, and I think we agree that to concentrate on the smallest scales, and to expect all of the answers to come from there, is "myopic". It is also really good that you have explained your work in this discussion thread. I have found your comments helpful and think your full participation and patience is admirable.

    You are sure to have many more appreciative readers. Good luck in the competition.

      "Reality, I suggest, can be thought as an axiomatic system in which fundamental aspects correspond to axioms and non-fundamental aspects correspond to theorems."

      - a very old dream, and one that is probably unattainable, both because of Gödel's theorem (on the logical side, showing the problems with axiomatic systems) and because of the issues Laughlin raises (on the physical side, showing the limits of bottom up deduction; see his quote in the appendix to my essay).

      In any case, suppose it were true; this raises a whole new set of issues:

      * in what way do these axioms and theorems exist, and where do they exist? Are they Platonic forms for example?

      * what decides the form they have? (there are various possible forms of logic: who chose this one?)

      * how do they have the power to create any physical entity whatever?

      Actually axiomatic systems are rather limited in their powers and in their ability to represent reality. I suggest you take a look at Eddington's book The Nature of the Physical World regarding our use of mental models, and the limits to their use. They are partial representations of reality, and should not be confused with reality itself.

      The FQXi essay contest a few years ago was on the nature of time. I don't want to go into that again here. You'll find an in-depth presentation of my view there.

      Paul

      I really don't want to go into the issue of time here; it is a separate issue from what I am focusing on in my essay. Nevertheless I'll respond this time:

      * I agree with your first main paragraph, interpreted as regards the passage of time along world lines in spacetime

      * In the third paragraph, you state "this involves a vanishingly small degree of change and duration, but it must be so." This is a physics assumption that may or may not be true. Many assume spacetime is quantised, in which case there is a minimum unit of time, and what you say is not true.

      George

      Dear Joe

      Thanks for that. Your comments on identity strike to the heart of what I say about multiple realisability. You are right, the keyboard letters are never exactly identical: yet the abstract letter "A" represented by them is still the letter "A" despite all the variations you mention.

      It is also the letter "A" if

      * you change font (Times New Roman to Helvetica)

      * you change to bold or italic

      * you change size of the font

      * you change colour of the font

      * you change the medium from light on a computer screen to ink on paper

      One of the key problems in Artificial Intelligence is to assign all these different representations to the same abstract entity that they all represent. The way varied lower level representations of a higher level entity occur is characteristic of top-down causation: what matters is the equivalence class of all these representations, which characterises the higher level entity, not which particular representation has been chosen.

      So all those different appearances represent the same thing. And our minds easily handle this and recognize the higher level abstract thing all these phenomena represent, whether it is the letter "A" or the plan for a jumbo jet airliner. Those higher level entities (such as the plan for the airliner) really exist as entities in their own right. Proof: jumbo jet airliners exist. They could not do so unless the abstract plan, with all its multiple representations, were real.
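      To make the equivalence class idea concrete, here is a purely illustrative toy sketch (the glyph attributes and names are invented for the example, nothing more): many distinct lower level representations are grouped under the same higher level abstract letter, and it is that class, not any particular member of it, that carries the higher level meaning.

from dataclasses import dataclass
from collections import defaultdict

# Toy sketch of multiple realisability: many lower level representations
# (font, style, size, colour, medium) realise the same abstract letter.

@dataclass(frozen=True)
class Glyph:
    letter: str      # the abstract higher level entity being represented
    font: str
    bold: bool
    size: int
    colour: str
    medium: str

representations = [
    Glyph("A", "Times New Roman", False, 12, "black", "screen"),
    Glyph("A", "Helvetica",       True,  18, "red",   "screen"),
    Glyph("A", "Helvetica",       False, 10, "black", "paper"),
    Glyph("B", "Times New Roman", False, 12, "black", "screen"),
]

# The equivalence relation: two glyphs are equivalent iff they represent
# the same abstract letter, whatever their lower level details.
classes = defaultdict(list)
for glyph in representations:
    classes[glyph.letter].append(glyph)

for letter, members in classes.items():
    print(f"abstract letter {letter!r} has {len(members)} lower level realisations")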

      Dear Georgina Parry

      Many thanks for that. I like your eggshell example - yes, it is a nice illustration. For an in-depth discussion of top-down causation in developmental biology, the book by Gilbert and Epel ("Ecological Developmental Biology") is excellent.

      The key point about adaptive selection (once-off or repeated) is that it lets us locally go against the flow of entropy, and this lets us build up useful information. In this regard, I can't resist the following comment: it is often said that you can't unscramble an egg. Yes you can. How? By feeding the omelette to a chicken! (You get less egg than you started with: that's the Second Law in operation.)

      Good luck to you too.

      George

      My problem here is that the very way in which physical reality occurs, which is what I am really writing about, albeit generically, involves an implication for time, i.e. the lack of it therein. And an understanding as to what timing is reveals the same point. Anyway, there is no "passage of time", in the sense that time is 'something'. Physically, there is alteration, in a sequence, and one aspect of that is the rate at which that occurs, which we can calibrate using timing. But that concerns difference between realities, not a feature of a reality. Spacetime is an invalid model of physical reality. Time, or more precisely timing, is extrinsic thereto.

      That was not a "physics assumption". All we can know (and physical reality is what we can know of it) is that there is something independent of sensory systems, because they receive it (it being the result of a physical interaction between other physically existent phenomena), and when such inputs are compared, difference is identifiable. A unit of time is, by definition, the fastest change to occur, because timing is rating change, per se.

      Paul

      Joe

      Exactly, as I have said elsewhere to you, and this is my fundamental point. As at any given point in time (as in timing), there is a specific physically existent state. To discern it, we would have to identify the particular state of the properties, and the relative spatial position, of every elementary particle involved. An impossible task, but our inability to do that does not detract from the fact that that is what constitutes physical reality (aka the present) as at that point in time. Even your A is more than one of these physically existent states. Misconceptualising this leads to problems. Neither does sensory detection have any impact on that, because it occurred before it was sensed (something which the Copenhagen interpretation does not recognise).

      Paul

      • [deleted]

      George,

      That's understandable. Thanks for the reply.

      • [deleted]

      Dear George,

      The paper to which you referred me, 'Top down causation and emergence: some comments on mechanisms,' did indeed help to answer my earlier question about squaring your ideas with natural selection. Thank you.

      I'd like to comment on the point you made in your example illustrated by the question: "Why is an aircraft flying?" You wrote, "And why was it designed to fly? Because it will potentially make a profit for the manufacturers and the airline company! Without the prospect of that profit, it would not exist. This is the topmost cause for its existence."

      I question whether there may be an even higher level cause: some human somewhere along the line posed the question "If birds can fly, why can't I?" And then our fellow humans refused to stop seeking until they found a satisfactory answer. Human curiosity about the way things work.

      We might ask why all these essays have been written and submitted to the FQXi essay competition. Was it primarily because all these authors hope to win some easy money? I suspect not. More likely it is because they all have thought about the workings of the universe and have developed their own ideas and explanations that they believe are sensible, and they seek to share their ideas with similarly thoughtful people and, hopefully, perhaps to receive validation in the form of recognition and appreciation, regardless of any potential monetary reward.

      Is it possible that human curiosity and creativity and eagerness for constructive collaboration are among the top of the topmost causes?

      jcns

        • [deleted]

        jcns,

        With George's point about planes existing because they make a profit for airlines and manufacturers, it is a top down logic of careful analysis of the situation and how it might be incrementally expanded. With your observation about flight being a consequence of human curiosity, it leans more toward a bottom up evolutionary striving, where all possible options get tried and those which succeed are the most repeated. Obviously there is no clear line between the two, but a constant feedback between experimentation and planning.

        One might define the basis or bottom, as simple, while the elevated state is simply more complex, rather than "higher." So that initial question, "If birds can fly, why can't I?" is not so much a higher cause, but a more elemental cause. George's top down position is rather a vantage point from where one might plan on how to push even further up.

        • [deleted]

        What sort of role might Erdős-Rényi networks play here? The sort of nearest neighbor approach with probability weights is a neural model of sorts. These networks are the basis for percolation theory and mean field theory. When the number of connected nodes reaches some threshold, the properties of the system can change. In the case of percolation theory this can lead to a rapid failure of a material.
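        As a rough numerical sketch of that threshold behaviour (assuming the standard G(n, p) random-graph model and the networkx library; the parameter values here are arbitrary): below the critical connection probability the largest cluster stays small, and just above it a giant component suddenly appears, which is the kind of abrupt change of system properties that percolation theory describes.

import networkx as nx

# Erdos-Renyi G(n, p) random graphs: a giant connected component
# emerges around the critical probability p_c = 1/n.

n = 1000
p_c = 1.0 / n

for factor in (0.5, 1.0, 1.5, 2.0):
    G = nx.erdos_renyi_graph(n, factor * p_c, seed=42)
    giant = max(nx.connected_components(G), key=len)
    print(f"p = {factor:.1f}/n  ->  largest component: {len(giant)} of {n} nodes")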