Ken Wharton, in his excellent essay, shows that boundary conditions are what is fundamental. In this, he supports my definition of a truth: "A truth is an absence of choice for everyone." The strongest absence of choice is an impossibility, and this, in the most universal sense, is failing the rule of non-contradiction (RNC). All truths are bound by the rule of non-contradiction. In other words, respecting the rule of non-contradiction de-fines, or makes finite and real, a truth. The RNC is the basis of maths, logic, and pretty much everything else. The biggest gap in all this was, I believe, not having a clear definition of a truth...

Neither Ken nor I may lay claim to this. This was Aristotle's claim all along: "The rule of non-contradiction is the most important rule in the universe..."

Right now Aristotle is spinning in his grave shouting..

" I told you SOOOOOOooooooooooooooo........!!!!!

Thanks Ken,

Marcel,

    Dear Ken Wharton, You have invested a lot of energy to write this good essay. You write: "What's needed is some 'starting point'." This starting point can be Descartes' principle of the identity of space and matter. According to Descartes, space is matter, and matter is space that moves. Thus, space is the foundation for constructing fundamental theories. Space has information, which is then realized in the workings of the world. Look at my essay, FQXi Fundamental in New Cartesian Physics by Dizhechko Boris Semyonovich, where I showed how radically physics can change if it follows this principle. Evaluate it and leave your comment there. Then I'll give you a rating as the bearer of Descartes' idea. Do not allow New Cartesian Physics, which wants to be the theory of everything, to go away into nothingness. OO.

    Sincerely, Dizhechko Boris Semyonovich.

    Dear Edwin -- very perceptive! Yes, lots of side connections to my main research interests, although I tried to keep that to a minimum... :-) As you can probably tell, though, I'm quite skeptical of treating time independently from space, and tend to think about things in 4D as much as possible. I'll try to get to your own essay next week. Best, Ken

    Dear Flavio,

    Good point! Indeed, I certainly could have written a much longer essay about my take on the use of probability in quantum theory. (That's my primary research interest, after all.) But I think it's important to distinguish probability from randomness -- at least randomness of the 'equal a priori probability postulate' variety that I talk about here. That sort of randomness shows up in quantum statistical mechanics, but not really in quantum theory per se. Probabilities in quantum theory are notoriously *not* random in this manner -- some events are more probable than others -- which makes the quantum issue a somewhat different question than the main point I'm making here.

    On that issue, I'm firmly of the opinion that all probabilities -- even quantum probabilities -- are due to things that we don't know. But I'm not at all sure whether there's still room for *non*-fundamental randomness (of the "all-at-once" variety I describe here: http://www.mdpi.com/2078-2489/5/1/190/htm ) , or whether the boundary constraints on the universe really fix the whole history of the universe down to the last detail. Both options still seem to be in play, as I see it.

    Thanks for your comment and interest! -Ken

    Dear Marcel-Marie,

    Basically everyone agrees on the RNC; the question is what it means: is it logical (A; not-A), complementary (A; everything A is not), or categorical (orthogonal) (∫AB = 0)?

    Heinrich

    Heinrich,

    This is an excellent question! In the system described in my essay, it works much like the "logical" RNC you define: A is not A*. The asterisk only indicates a different form of A (while a minus sign (-A) would suggest a mathematical context).

    For the system to work logically, its elements must be comparable, i.e. A, A*, As, etc., and distinguishable (different), so that the RNC may work. On the other hand, A and B are not comparable, and "A not equal B" is simply the definition of them being two different elements (substances A and B in my essay).

    So, for example, the following are "not A":

    A* (or any variation of A): comparable and distinguishable, therefore logically operational within the logical system based on A.

    B (or the set of all except A): not comparable, but distinguishable by definition.

    Nothingness: both comparable and distinguishable; it logically supports the existence of A, A*, As, etc. (all forms of A)... and (if not in the same system) B and anything else.

    So, to answer your question, the RNC, INSIDE a substantial system (real stuff), operates in a logical sense (comparable and distinguishable) between all the A and A* and other A variations.

    Outside a substantial system, the RNC works in a complementary way (A; everything A is not), differentiating different logical systems: say, one based on A and its variations and another based on B and its variations, or any other.

    Heinrich, could you describe the categorical (orthogonal) RNC (∫AB = 0)?

    Thanks,

    Marcel,

    Dear Ken Wharton

    Just letting you know that I am making a start on reading your essay, and hope that you might also take a glance over mine, please? I look forward to the sharing of thoughtful opinion. Congratulations on your essay rating as it stands, and best of luck for the contest conclusion.

    My essay is titled

    "Darwinian Universal Fundamental Origin". It stands as a novel test for whether a natural organisational principle can serve a rationale, for emergence of complex systems of physics and cosmology. I will be interested to have my effort judged on both the basis of prospect and of novelty.

    Thank you & kind regards

    Steven Andresen

    Dear Ken,

    thank you for your essay, which I found very interesting and pleasurable. It reminded me of a famous poem by the Italian poet Montale, who said, "Codesto solo oggi possiamo dirti, ciò che non siamo, ciò che non vogliamo." [This alone today we can tell you: what we are not, what we do not want.]

    You write that

    > Explaining the relationship between two things does not really explain either of them. What's needed is some 'starting point'.

    This is very interesting for me, since I proposed in my essay about absolute relativism that everything is relational. But I agree that we can fully know something just within boundaries.

    All the best,

    Francesco D'isa

    I do not see why an appeal to randomness would be considered less fundamental. Of course, many phenomena in our universe are not random and can be explained in terms of "this" or "that", but there is no objective reason to expect that everything in the universe has a cause. I do not find any reason to believe that the Universe is deterministic.

    Does energy conservation follow from Noether's theorem? Or is it just that the Lagrangian formalism is only valid for non-dissipative systems and thus has the conservation law hidden in its symmetries? Moreover, all treatises on Noether's theorem I know confound conservation of energy (d_iE/dt = 0) with invariance of energy (dE/dt = 0).

    The equal a priori probabilities postulate is not associated with the second law. The postulate is needed in equilibrium statistical mechanics to get thermodynamic properties for systems at equilibrium, and it is routinely used for the description of reversible processes. In fact, the postulate is not valid outside equilibrium and is thus not valid for studying the irreversible evolution of a system towards equilibrium.

    The "past hypothesis" not only do not explain the second law, but shows a basic missunderstanding about the second law. The second law is not about features of the initial state.

    Indeed, the time-asymmetry encoded in the second law cannot come about from time-symmetric dynamical laws. We need time-asymmetric dynamical laws.

    "The Second Law tells us that entropy always increases". Not true. That is only a superficial and misguided formulation of the law. The second law says that the production of entropy is non-zero. The secondf law is not dS>0. The second law is diS >=0. And this is the classic formulation, where thermal fluctuations are ignored.

    All the subsequent attempts to explain that superficial and misguided formulation of the law are invalid as well. Assigning a low entropy to the initial instant of the Universe does not explain anything, and the incompatibility between the second law of thermodynamics and (time-reversible) mechanics remains. Effectively, we solve the Liouville equation (or its von Neumann quantum analog), set an initial state of very low entropy, and the evolutions predicted by the equations continue contradicting the second law and observations.

    "Boundary explanations" do not explain anything. Noticing that the systme evolved from A to B because it was first on A and latter was found on B is vacous of content. Moreover this kind of boundary 'explanations' often hide another serious missunderstanding of the second law; if all what was needed to explain that the system evolved irreversibly as A --> B because it was initially on A, then the second law would not be needed. The first law would be enough.

    Initial states and boundaries are already used in the laws of mechanics and electrodynamics, but those laws cannot describe irreversibility. And that is the reason why thermodynamics and the second law were born as a separate field of physics.

    The arrow of time, the irreversibility of the second law, has a dynamical origin: resonances. There is a broad literature on the topic.

      Thanks, Juan, for a careful reading and interesting points. Lots to parse here.

      >I do not see why an appeal to randomness would be considered less fundamental. Of course, many phenomena in our universe are not random and can be explained in terms of "this" or "that", but there is no objective reason to expect that everything in the universe has a cause.

      Agreed -- see my response to Flavio above. Some things don't need an explanation. But the Second Law does need an explanation, for several reasons. 1) It supplies many other subsidiary explanations, so it's not devoid of content. 2) It can't be fundamental in its own right, because it only applies to macrostates, not microstates. 3) It can't be explained from our time-symmetric dynamical laws, or randomness.

      > I do not find any reason to believe that the Universe is deterministic.

      I probably agree with you here, but the real question is whether it's time-symmetric. Even our indeterministic theories predict time-symmetric micro-phenomena.

      >Does energy conservation follow from Noether's theorem? Or is it just that the Lagrangian formalism is only valid for non-dissipative systems and thus has the conservation law hidden in its symmetries? Moreover, all treatises on Noether's theorem I know confound conservation of energy (d_iE/dt = 0) with invariance of energy (dE/dt = 0).

      Well, when you do it correctly in classical field theory, you get constraints on the Stress Energy tensor, which is really the right way to go. "E" is a bit of a fiction, certainly in GR.

      >The equal a priori probabilities postulate is not associated with the second law.

      I'm skeptical. Could you point me to a stat mech argument for the 2nd law that doesn't implicitly assume it at some stage?

      >The postulate is needed in equilibrium statistical mechanics to get thermodynamic properties for systems at equilibrium, and it is routinely used for the description of reversible processes. In fact, the postulate is not valid outside equilibrium and is thus not valid for studying the irreversible evolution of a system towards equilibrium.

      Applying the word "valid" to the EAPPP seems like a category error. Of course, it's never *truly* valid: at any given time the actual system is in 1 microstate, with 100% certainty, and all others with 0%. The "a priori" means that you use it as a Bayesian prior, when you have no other information. And as you note, if you did this, you would predict it would be in an equilibrium macrostate. (Which it might not be, certainly, but that would be your best bet given no other knowledge.) And if you *knew* it wasn't in equilibrium, or knew anything else at all, you'd update your priors. But usually that just means applying the EAPPP to all possible states that were compatible with your updated knowledge. That's how you get to the 2nd Law from the EAPPP. Knowledge always trumps randomness.

      >The "past hypothesis" not only do not explain the second law, but shows a basic missunderstanding about the second law. The second law is not about features of the initial state. Indeed the time-asymmetry encoded in the second Law cannot come about from time-symmetric dynamical laws. We need time-asymmetric dynamical laws.

      That was certainly what Eddington thought -- but that challenge has been open for a century with no answer in sight. (If there are time-asymmetric dynamical laws, what are they?) By now, the question has been settled by computer simulations that show entropy increasing (from low initial boundary constraints!) using explicitly time-symmetric dynamics. In computer simulations, there is no possibility of hidden dynamics we don't know about.
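
      For readers who want to try this themselves, here is a toy version of such a simulation (my own illustrative sketch, not from any particular paper): non-interacting particles in a 2D box with perfectly reflecting walls, started in a low-entropy corner. The dynamics is explicitly time-symmetric, yet the coarse-grained entropy rises; and reversing all velocities at the end retraces the whole trajectory, so the same dynamics with a low-entropy *final* boundary shows entropy decreasing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Non-interacting particles in the unit box with reflecting walls.
# The dynamics is time-symmetric: flipping every velocity exactly
# retraces the trajectory.
N, steps, dt, grid = 2000, 200, 0.01, 8
pos = rng.uniform(0.0, 0.25, size=(N, 2))  # low-entropy start: one corner
vel = rng.normal(0.0, 1.0, size=(N, 2))

def coarse_entropy(p):
    """Shannon entropy of occupation fractions on a grid x grid coarse-graining."""
    h, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=grid, range=[[0, 1], [0, 1]])
    f = h[h > 0] / N
    return float(-np.sum(f * np.log(f)))

def step(pos, vel):
    pos = pos + vel * dt
    hit_hi, hit_lo = pos > 1.0, pos < 0.0
    pos[hit_hi] = 2.0 - pos[hit_hi]  # reflect off the walls
    pos[hit_lo] = -pos[hit_lo]
    vel = np.where(hit_hi | hit_lo, -vel, vel)
    return pos, vel

S = [coarse_entropy(pos)]
for _ in range(steps):
    pos, vel = step(pos, vel)
    S.append(coarse_entropy(pos))
# Entropy climbs from ~ln(4) (4 occupied cells) toward ~ln(64):
print(f"S start: {S[0]:.2f}, S end: {S[-1]:.2f}")

# Time-reverse the final state: identical dynamics, but now satisfying a
# low-entropy *final* boundary condition; entropy decreases.
vel = -vel
S_back = [coarse_entropy(pos)]
for _ in range(steps):
    pos, vel = step(pos, vel)
    S_back.append(coarse_entropy(pos))
print(f"reversed run, S end: {S_back[-1]:.2f}")
```

      The asymmetry clearly comes from the boundary condition, not the update rule: the same `step` function produces an entropy increase in one run and a decrease in the other.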

      >"The Second Law tells us that entropy always increases". Not true. That is only a superficial and misguided formulation of the law. The second law says that the production of entropy is non-zero. The secondf law is not dS>0. The second law is diS >=0. And this is the classic formulation, where thermal fluctuations are ignored.

      Agreed! (But in our universe, at any reasonable coarse graining, it does increase.) Also agreed about the fluctuation issue; I talk about this in the 'anthropic' section.

      > All the subsequent attempts to explain that superficial and misguided formulation of the law are invalid as well. Assigning a low entropy to the initial instant of the Universe does not explain anything, and the incompatibility between the second law of thermodynamics and (time-reversible) mechanics remains.

      See the computer example above. Some of the best work on this topic has been done by Larry Schulman. He puts a low entropy *final* condition on systems and shows that entropy *decreases* in computer simulations. He also used initial and final boundaries and showed that entropy went up and then down again. There's no incompatibility whatsoever; all the asymmetries come from the boundaries.

      > Effectively, we solve the Liouville equation (or its von Neumann quantum analog), set an initial state of very low entropy, and the evolutions predicted by the equations continue contradicting the second law and observations.

      I don't understand what your point is here.

      >"Boundary explanations" do not explain anything.

      Obviously, I disagree.

      >Noticing that the system evolved from A to B because it was first at A and later was found at B is vacuous of content.

      True... But those boundaries can still explain what happens in between. And if you don't impose anything at B, the boundary at A can also be used to explain asymmetries, if A is "special" or essentially different from how it ends up at B. Furthermore (the case I'm most interested in), consider *partial* boundary constraints (say, half the Cauchy parameters), constrained on both ends. Once the system is solved, these partial boundaries then explain the un-constrained parameters, at the beginning and the end, and all the parameters in the middle, too. So boundaries can absolutely be used to explain things, when combined with some way to solve the system.
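
      A toy example of this last case (my own illustration, with made-up numbers): for a unit-frequency harmonic oscillator, x'' = -x, fixing only the position (half the Cauchy data) at both ends determines the velocities everywhere, including at the boundaries themselves.

```python
import numpy as np

# Partial boundary constraints on x'' = -x: specify x(0) and x(T) only.
# The general solution is x(t) = a*cos(t) + B*sin(t), with x(0) = a;
# imposing x(T) = b then fixes B, and with it every velocity.
a, b, T = 1.0, 0.5, 2.0  # illustrative numbers (T not a multiple of pi)

B = (b - a * np.cos(T)) / np.sin(T)

def x(t):
    return a * np.cos(t) + B * np.sin(t)

def v(t):
    # velocity: determined everywhere, though never directly constrained
    return -a * np.sin(t) + B * np.cos(t)

print(f"v(0) = {v(0.0):.3f}, v(T) = {v(T):.3f}")
```

      The two position constraints play the role of "half the Cauchy parameters" above; the unconstrained velocities at both ends come out as consequences of solving the system.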

      > Moreover, this kind of boundary 'explanation' often hides another serious misunderstanding of the second law; if all that was needed to explain that the system evolved irreversibly as A --> B was that it was initially at A, then the second law would not be needed. The first law would be enough.

      I think you're perhaps mixing up macro- and micro- concepts here. There are no irreversible events at a micro scale. (Or so most physicists believe; maybe we're all wrong.)

      >Initial states and boundaries are already used in the laws of mechanics and electrodynamics, but those laws cannot describe irreversibility. And that is the reason why thermodynamics and the second law were born as a separate field of physics.

      Yes, it was born separate, but Boltzmann (and others) figured out how to reunite them. The key difference is that when you zoom out to the macrostate, tossing away some of the data as unknown, then apparently irreversible (macro-) processes enter the story (assuming you have a special low-entropy boundary condition, so that the Second Law is in play). But if you know everything, even that apparent irreversibility goes away.

      >The arrow of time, the irreversibility of the second law, has a dynamical origin: resonances. There are a broad literature in the topic.

      There is indeed a very broad literature, and the vast bulk of physicists are perfectly happy with the boundary-based account, even if they're not willing to treat boundaries as fundamental in their own right. If special time-asymmetric resonances were needed, then why would entropy increase in computer simulations that lacked them? Don't you think you're already using Second-Law-style logic when you try to infer a time-asymmetry from a resonance? (Classical chaos is time-symmetric, too, but you can get an arrow from it if you impose a low entropy initial boundary.)

      In general, I'd note that we have lots of time-asymmetric intuitions, and they're all too easy to slip into our analysis without properly seeing where they come in (as happened to Boltzmann himself). The discovery of fundamental time symmetry has been a big surprise to those intuitions, but a surprise that we should take very seriously.

      All the Best,

      Ken

      Dear Ken,

      Very nice essay. Actually, I noticed that you have written fantastic essays in the previous essay contests as well, so I will check them out one after the other. One idea I missed in your discussion, though, is that time itself could be emergent - in the sense that entropy increase defines time, and that the very notion of "initial" boils down to small entropy. For example, Claus Kiefer and Dieter Zeh have shown that it is quite reasonable that such an emergent arrow of time can be retrieved by tracing out uninteresting degrees of freedom, and that this arrow of time always points in the direction of an increasing scale factor of the Universe. In practice, I believe, this is equivalent to assuming an initial boundary condition, though it applies to the macrostate while the microstate itself has entropy zero and is timeless (this is what I'm arguing for in my own essay). Best regards, Heinrich

        Thanks, Heinrich! The previous essay that got the most attention was "The Universe is not a Computer", which you might enjoy. There's an extended version on the arXiv.

        I am in agreement that the *arrows* of time could certainly be emergent -- indeed, they *must* be if all the laws are time-symmetric. But not time *itself* -- at least not as you describe. For one thing, you can't talk about *anything* increasing without having a concept of time already on the table. For another, entropy isn't fundamental; it applies to macrostates of partial knowledge, not microstates.

        As far as Kiefer+Zeh's idea, it doesn't sound like a boundary explanation at all -- sounds like they're linking it to the dynamics. And I would expect that even if large-scale arrows of time emerged due to an increasing scale factor on large scales, that one would be hard pressed to make sure the same arrow emerged at much smaller scales, in all instances. In that respect, I don't see much of a distinction between it and Barbour's "Janus Point" idea that I critique in the essay. But perhaps I'm missing some nuance there.

        I took a peek at your own essay, and agree that if entanglement is a "real thing", that certainly pushes one in the direction you take. I happen to be a contrarian on the topic, though, and I strongly align myself with the 'psi-epistemic' viewpoint. Now I just have to figure out what the ontic state really is... Details, details. :-)

        Thanks again! --Ken

        Ken,

        Time runs slower toward the ground. So, an object falling toward the ground is moving spontaneously toward "slower time". Slower time means "longer seconds". In order for c (m/s) to remain constant, space must increase just as much as the seconds get longer. In other words, the object is falling spontaneously into larger space, which is dispersion, a classic example of an entropic process.

        Both gravity and entropy are spontaneous processes that show a higher probability of existence in one direction. They have the same underlying logical cause.

        My essay shows (?) that the universe, as a logical system, admits only one type of stuff or substance and only one type of logical cause. This logical system also appears to operate using a single logical operation, the logical substitution. Both types of motion, in gravity and in entropic dispersion, are the resolution in progress of an illogical state of affairs, a non-uniform state of existence due to a non-uniform logical substitution.

        Finally, a clock is a spontaneous device (energy in spring is ok). As such, it represents, for comparison, the local rate of evolution of other (co-located) spontaneous processes, including time. In order for the clock to respond to the local rate of evolution of time "via a logical operation", they both have to be of the same nature, same type of stuff. The clock is just a more complex form of time.

        All mumbo jumbo...Right?

        All the bests,

        Marcel,

        As I see physics develop, we find the individual processes happening in Nature to be random or probabilistic. But if we go into the details of a process, we find logic or order in it. Thus, to me, randomness and order appear as two sides of the same coin that nature throws as dice to us! The spontaneity of a process is random or probabilistic, while the logic behind the process has an order to it. All logical aspects of any process are constrained by conservation laws. But then two canonically conjugate quantities are governed by the Uncertainty principle according to quantum theory. Energy relates to time, while space relates to momentum/motion. It therefore seems that any infirmity in space gives rise to motion, while any energy infirmity leads to a phase change in time. Classically we cannot understand the reasons behind this, and that is where quantum theory comes in to explain the process. It mostly governs microscopic phenomena, while classical theory explains the gross picture of the same process. To make the QM predictions visible, a teacher has to invoke the classical analogue, as reality becomes difficult to visualize quantum mechanically! Such dichotomy has become the rule by which we proceed in physics today!

        Hi Ken:

        Completely agree with your conclusion - "Although we use randomness when we don't know any better, a principle of indifference cannot be used to explain anything interesting or fundamental."

        The above is vindicated in my paper, "What is Fundamental - Is C the Speed of Light", which describes the fundamental physics of antigravity missing from the widely-accepted mainstream physics and cosmology theories, resolving their current inconsistencies and paradoxes. The missing physics depicts a spontaneous relativistic mass creation/dilation photon model that explains the yet-unknown dark energy and the inner workings of quantum mechanics, and bridges the gaps between relativity and Maxwell's theories. The model also provides field equations governing the spontaneous wave-particle complementarity or mass-energy equivalence. The key significance or contribution of the proposed work is to enhance fundamental understanding of C, commonly known as the speed of light, and the Cosmological Constant, commonly known as the dark energy.

        The paper not only provides comparisons against existing empirical observations but also forwards testable predictions for future falsification of the proposed model.

        I would like to invite you to read my paper and would appreciate any feedback comments.

        Best Regards

        Avtar Singh

        Respected Prof Ken Wharton

        Wonderful arguments.... " Looking at this problem from a different perspective reveals a natural solution: boundary-based explanations that arguably should be viewed as no less fundamental than other physical laws." Best wishes for your essay sir...

        I hope you will not mind that I am not following mainstream physics...

        By the way, here in my essay energy-to-mass conversion is proposed. Yours is a very nice essay; best wishes. I hope you may please spend some of your valuable time on the Dynamic Universe Model also and give some of your valuable & esteemed guidance.

        Some of the Main foundational points of Dynamic Universe Model :

        -No Isotropy

        -No Homogeneity

        -No Space-time continuum

        -Non-uniform density of matter, universe is lumpy

        -No singularities

        -No collisions between bodies

        -No blackholes

        -No wormholes

        -No Bigbang

        -No repulsion between distant Galaxies

        -Non-empty Universe

        -No imaginary or negative time axis

        -No imaginary X, Y, Z axes

        -No differential and Integral Equations mathematically

        -No General Relativity and Model does not reduce to GR on any condition

        -No Creation of matter like Bigbang or steady-state models

        -No many mini Bigbangs

        -No Missing Mass / Dark matter

        -No Dark energy

        -No Bigbang generated CMB detected

        -No Multi-verses

        Here:

        -Accelerating Expanding universe with 33% Blue shifted Galaxies

        -Newton's Gravitation law works everywhere in the same way

        -All bodies dynamically moving

        -All bodies move in dynamic Equilibrium

        -Closed universe model no light or bodies will go away from universe

        -Single Universe no baby universes

        -Time is linear as observed on earth, moving forward only

        -Independent x,y,z coordinate axes and Time axis no interdependencies between axes..

        -UGF (Universal Gravitational Force) calculated on every point-mass

        -Tensors (Linear) used for giving UNIQUE solutions for each time step

        -Uses everyday physics as achievable by engineering

        -21000 linear equations are used in an Excel sheet

        -Computerized calculations uses 16 decimal digit accuracy

        -Data mining and data warehousing techniques are used for data extraction from large amounts of data.

        - Many predictions of Dynamic Universe Model came true....Have a look at

        http://vaksdynamicuniversemodel.blogspot.in/p/blog-page_15.html

        I request you to please have a look at my essay also, and give some of your esteemed criticism for your information........

        Dynamic Universe Model says that energy in the form of electromagnetic radiation passing grazingly near any gravitating mass changes in frequency and finally will convert into neutrinos (mass). We all know that there is no experiment or quest in this direction. Energy conversion happens from mass to energy via the famous E=mc2; the other side of this conversion was not thought of. This is a new fundamental prediction by Dynamic Universe Model, a foundational quest in the area of Astrophysics and Cosmology.

        In accordance with Dynamic Universe Model, a frequency shift happens on both sides of the spectrum when any electromagnetic radiation passes grazingly near a gravitating mass. With this new verification, we will open a new frontier that will unlock a way to form the basis for continual Nucleosynthesis (continuous formation of elements) in our Universe. The amount of frequency shift will depend on the relative velocity difference. All the papers of the author can be downloaded from "http://vaksdynamicuniversemodel.blogspot.in/"

        I request you to please post your reply in my essay also, so that I can get an intimation that you replied

        Best

        =snp

        Ken,

        Yours is the first essay I have been able to comprehend, from first word to last, and on first reading. Thank you!

        I find that it's fully consistent with Einstein's wish for boundary conditions that would eliminate the need to specify boundary conditions--and therefore lead to a singularity-free general relativity. You write:

        " ... the initial state of the universe is often referred to as an 'initial boundary condition'. The only problem is that many physicists want to then explain this boundary condition, via dynamics or randomness."

        If it's true, however, that the 3-dimensional boundary is identical to the 4-dimensional horizon ("(3D spatial volumes have 2D boundaries; 4D spacetime volumes such as our universe have 3D boundaries.)"), then the 3-d boundary signature has one negative element, + + + -, i.e. (-1), and the 4-d spacetime signature one positive element, - - - +, i.e. (+1). We always measure changes in relations between center-of-mass points, so the positive mass theorem must apply here, for a non-arbitrary initial condition.

        Reduce to a 1-dimensional model, and you have my essay: https://fqxi.org/community/forum/topic/3124

        All best,

        Tom

        TH,

        You say "....such as our universe have 3D boundaries". The 3D belongs to our reality, not to the universe; there's a difference. The 3D is just the definition of a point like relational observation, the observer. A lot of what we think we learn about the universe is in fact about ourselves.

        Bests,

        Marcel,

        Dear Ken Wharton,

        I like your essay, though I don't agree with all of it. You actually get to grips with the concepts, rather than jumping through them, as some do. And I enjoyed your way of writing. I agree that randomness can't be used to explain things, but with one exception - unless one suspects that it's at the very deepest level.

        The symmetries and patterns the laws contain, which you mention as perhaps suggesting they didn't arise randomly, might be caused by something underneath that happens to generate a lot of symmetry in the levels further up. If so, how that underlying layer was selected would be a very open question, and in general, the question of how the laws arose is unanswered, and to me separate from these questions.

        But looking at randomness within existing physics, at each level of description, there are things that behave with a mixture of randomness and predictability, and these mixtures make patterns. But at the next level of description down, the randomness disappears, and what was random gets predictable. So when we find something very deep that appears partly random, as in QM, we wonder if there's some even deeper level where it goes away. And we've found we can limit the possibilities for that, and that only non-local theories have the option, if they can find a way to do it.

        But without knowing what the underlying picture is, if there is one, we don't know if the randomness is fundamental or superficial. I'd say it could be either, and unless one happens to believe one of the existing interpretations for QM (as I don't), one can choose to say it's an open question. I agree with what you say about boundary explanations, and that taking boundaries as fundamental is a possibility.

        I'd appreciate it if you'd rate my essay - I've only had four ratings so far, and (although that included high ratings and nice comments from Fabio and Edwin), I've found one needs ten ratings for the average to be taken seriously. The essay deals with what relates the levels of description in physics, and argues that explanation does, alongside emergence. It also looks at questions to do with time, and makes a new point near the top of p2, which I'd say removes emergent time as a possibility.

        Anyway, best regards,

        Jonathan