I was intending to respond with a longer discussion, but I will have to postpone that until later. I will say that I think the quantum measurement problem, viewed as an undecidable proposition, is related to an aspect of quantum gravitation. If you read my essay you will find I propose a duality between the unitarity principle of quantum mechanics and the equivalence principle of general relativity. In a quantum context this leads to a sort of paradox of the same nature as the quantum measurement issue.

I will return to this in the near future. It got late today and I will probably write a better discussion tomorrow.

Cheers LC

Dear Olaf Dreyer,

Just letting you know that I am making a start on reading your essay, and I hope that you might also take a glance over mine. I look forward to sharing thoughtful opinions. Congratulations on your essay rating as it stands, and best of luck for the contest conclusion.

My essay is titled "Darwinian Universal Fundamental Origin". It stands as a novel test of whether a natural organisational principle can serve as a rationale for the emergence of complex systems in physics and cosmology. I will be interested to have my effort judged on the basis of both prospect and novelty.

Thank you & kind regards

Steven Andresen

Dear Olaf,

I have enjoyed reading your concise and to-the-point essay. I am just wondering whether I got it right that you use 'emergence' not in the naturalistic (bottom-up) sense, but rather in the sense of a 'spontaneous idea' implying human agency.

Heinrich

To continue with what I started, I thought I would break this out a bit with regard to quantum gravitation.

In the measurement problem, a quantum system with some set of states, usually rather small in number or with few degrees of freedom, is measured by a classical-like system. The qualifier "like" refers to the fact that this measuring system is itself quantum mechanical. In the sense that Zurek outlines, there is a form of quantum decoherence that causes the superposition or entanglement phase of the system to leak into a large reservoir of states. Hartle then illustrates how decoherence leads to a form of macrostates that are decoherent sets of states. These are subjectively assigned groups of states, similar to the idea of macrostates in phase space in statistical mechanics. We then see in this physics the split between objective physics, which might be seen as the dynamical evolution of quantum states, and subjective physics, which occurs with the phenomenological reports observers make.
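
To make the decoherence mechanism described above concrete, here is the standard schematic in the Zurek vein (a textbook sketch, not taken from the essay): the system entangles with environment states $|E_i\rangle$, and tracing out the environment suppresses the interference terms once the environment has recorded which state occurred,

$$|\Psi\rangle = \sum_i c_i\,|s_i\rangle \otimes |E_i\rangle, \qquad \rho_S = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi| = \sum_{i,j} c_i c_j^{*}\,\langle E_j|E_i\rangle\,|s_i\rangle\langle s_j| .$$

As the reservoir states become orthogonal, $\langle E_j|E_i\rangle \to \delta_{ij}$, the reduced density matrix approaches the diagonal mixture $\sum_i |c_i|^2\,|s_i\rangle\langle s_i|$: the surviving classical-like alternatives, with the phase information lost to the reservoir.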

I do not think it is unreasonable to see this as a case of quantum states encoding information about quantum states. Fully understanding this process requires some understanding of how the reservoir of quantum states defines the final needle states, which requires a measuring system for the measuring system, which of course sets up a recursion. The many-worlds interpretation has this feature in its product structure of increasingly entangled states. This ψ-ontological interpretation has its mirror in ψ-epistemic interpretations, such as GRW objective collapse models. Further, quantum interpretations tend to be incomplete and to contradict each other. I see this as a possible feature of a Gödelian nature of quantum physics.

When it comes to quantum gravity there is a similar gap. Currently the firewall is a major obstruction to a unitary description of quantized gravity. Hawking bet that information, here meaning the number of quantum bits, qubits or qu-Nits for N >= 2, is not conserved. The conservation of information appears to be a reasonable requirement of physics, which was the stance of Susskind. Susskind won the bet with Hawking, but then Almheiri, Marolf, Polchinski, and Sully demonstrated something interesting: a careful accounting of the entanglements of states shows that either the unitarity of quantum mechanics or the equivalence principle of general relativity fails. In my essay I illustrate how these two principles are complementary and not compatible in a classical sense.

The vacuum is filled with virtual pairs of fields. With a black hole, the gravitational field causes one member of such a pair to fall into the black hole while the other escapes. The quantum particle or photon that escapes as Hawking radiation is thus entangled with the partner that falls in, and so Hawking radiation is entangled with the black hole. At first blush there seems to be no problem. However, consider a thermal cavity heated to a high temperature: the photons that escape are entangled with the quantum states of the atoms composing the cavity. Once the entanglement entropy reaches a maximum, at half the energy released, the subsequent photons released are entangled with the previously released photons. This holds for black holes as well, but because of the virtual-pair nature of this radiation, Hawking radiation previously emitted in a bipartite entanglement is now entangled not just with the black hole but with the more recently emitted radiation as well. A bipartite entanglement is thus transformed into a tripartite entanglement. Such transformations are not permitted by quantum unitary evolution; this is the quantum monogamy requirement, and it suggests that unitarity fails. To prevent the failure of quantum mechanics, some proposed a firewall at the horizon, which violates the equivalence principle.
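
The monogamy requirement invoked here has a standard quantitative form, the Coffman-Kundu-Wootters inequality for three qubits (quoted for reference; the argument above does not depend on this particular formulation):

$$\mathcal{C}^2_{A|B} + \mathcal{C}^2_{A|C} \le \mathcal{C}^2_{A|BC},$$

where $\mathcal{C}$ is the concurrence. If the early radiation A is maximally entangled with the late radiation B, the bound is saturated and forces $\mathcal{C}_{A|C} = 0$, so A cannot also be entangled with the black hole interior C. That is exactly the bipartite-to-tripartite conflict described above.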

The firewall occurs when half the possible radiation has been emitted, which is also the Page time. This also corresponds to the failure of a quantum error correction code. Error correction codes involve some deep mathematics; they are connected with the RT formula, and I illustrate in my essay the connection with Mirzakhani's mathematics on geodesics in hyperbolic spaces. Error correction is also tied to the packing of spheres, or how oranges stack at the grocery store, the Kepler problem. This gets into the guts of what my paper is about. The focus here, however, is that error correction undoes the mixing of information. Think of a library, in particular an elementary school library with little kids, where the patrons scramble the order of the books. The distance a book ends up from its correct position is the Hamming distance. As the library gets mixed up, an algorithm can manage the disordering. However, once things are about half mixed up, the scheme breaks down and the librarian has to virtually start over.
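
As a toy illustration of the Hamming-distance bookkeeping in the library analogy (my own sketch with made-up names, not code from the essay, and only a cartoon of what a real error-correcting code does):

```python
import random

def hamming_distance(shelf, correct):
    """Count the positions where the shelf differs from the correct order."""
    return sum(a != b for a, b in zip(shelf, correct))

def scramble(shelf, n_swaps):
    """Little kids at work: randomly swap pairs of books n_swaps times."""
    shelf = list(shelf)
    for _ in range(n_swaps):
        i = random.randrange(len(shelf))
        j = random.randrange(len(shelf))
        shelf[i], shelf[j] = shelf[j], shelf[i]
    return shelf

correct = list(range(1000))  # the books in their proper shelf order
for n_swaps in (50, 250, 500, 2000):
    shelf = scramble(correct, n_swaps)
    d = hamming_distance(shelf, correct)
    print(f"{n_swaps:5d} swaps -> {d:4d} books displaced ({d / len(correct):.0%})")
```

Once roughly half the books are displaced, any scheme that relies on most books still sitting at or near their proper slots has nothing left to anchor on, which is the regime where the librarian has to start over.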

In the end it may be that the equivalence principle and the unitarity principle are complementary, and in a quantum setting cannot both be observed in a single simultaneous observation. This is similar to the Heisenberg uncertainty principle with position and momentum. Returning to the Gödelian issue, this means the universe presents itself in entirely different ways depending on the type of measurement performed. This is also a sort of "collapse" if thought of in a ψ-epistemic sense, which would agree with Hawking and Penrose; a ψ-ontological perspective would be more in line with Susskind. These perspectives are, I think, ultimately a form of G and NOT-G, for G a true but undecidable Gödelian proposition.

Cheers LC

I just boosted your voting score a bit. I realized that I had not voted for your paper yet.

Cheers LC

Horrors, the message above that I copied from MSW looks terrible. Here it is again:

Thanks for your remarks on my paper. I can say that a part of this was motivated by Maryam Mirzakhani's death. She died of breast cancer last July, and the news, for various reasons, made me angry. I had read one of her papers back in 2014 when she won the Fields medal, and at the time I thought it might have something to do with physics. Last spring I studied the Ryu-Takayanagi (RT) formula, and for some reason, the day I heard of Maryam's death, the insight into how her work connects with this hit me.

There is the problem of how gravitation and quantum mechanics merge or function in a single system. It is often said we understand nothing of quantum gravity, but this is not quite so. Even the basic canonical quantization of gravity from the 1970s is computable in a weak-field limit and tells you something. That theoretical understanding is very limited and big open questions remain, but much more progress has been made since then. The AdS/CFT correspondence, Van Raamsdonk's equivalence between entanglement and spacetime, and the RT formula are some of the more recent developments. These indicate how spacetime physics has a correspondence, or maybe an equivalence, with quantum mechanics or quantum Yang-Mills fields. However, an obstruction exists that appears very stubborn.
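
For readers who have not met it, the RT formula mentioned here has a strikingly simple standard form (in units with $\hbar = c = 1$): the entanglement entropy of a boundary region A equals the area of the minimal bulk surface $\gamma_A$ anchored to the boundary of A,

$$S(A) = \frac{\mathrm{Area}(\gamma_A)}{4 G_N},$$

which is what makes the reading of spacetime geometry as entanglement possible.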

The vacuum is filled with virtual pairs of fields. With a black hole, the gravitational field causes one member of such a pair to fall into the black hole while the other escapes. The quantum particle or photon that escapes as Hawking radiation is thus entangled with the partner that falls in, and so Hawking radiation is entangled with the black hole. At first blush there seems to be no problem. However, consider a thermal cavity heated to a high temperature: the photons that escape are entangled with the quantum states of the atoms composing the cavity. Once the entanglement entropy reaches a maximum, at half the energy released, the subsequent photons released are entangled with the previously released photons. This holds for black holes as well, but because of the virtual-pair nature of this radiation, Hawking radiation previously emitted in a bipartite entanglement is now entangled not just with the black hole but with the more recently emitted radiation as well. A bipartite entanglement is thus transformed into a tripartite entanglement. Such transformations are not permitted by quantum unitary evolution; this is the quantum monogamy requirement, and it suggests that unitarity fails. To prevent the failure of quantum mechanics, some proposed a firewall at the horizon, which violates the equivalence principle.

The firewall occurs when half the possible radiation has been emitted, which is also the Page time. This also corresponds to the failure of a quantum error correction code. Error correction codes involve some deep mathematics; they are connected with the RT formula, and I illustrate in my essay the connection with Mirzakhani's mathematics on geodesics in hyperbolic spaces. Error correction is also tied to the packing of spheres, or how oranges stack at the grocery store, the Kepler problem. This gets into the guts of what my paper is about. The focus here, however, is that error correction undoes the mixing of information. Think of a library, in particular an elementary school library with little kids, where the patrons scramble the order of the books. The distance a book ends up from its correct position is the Hamming distance. As the library gets mixed up, an algorithm can manage the disordering. However, once things are about half mixed up, the scheme breaks down and the librarian has to virtually start over.

The solution from Susskind and others is to say that spacetime variables and quantum states are equivalent. I do not disagree completely, but I think this is a complementarity rather than an equivalence. It means that with either spacetime or quantum states you can account for the system, but at the expense of abandoning a description of the system in terms of the other. You cannot describe quantum gravity completely by both in the same measurement description. So this is a sort of Heisenberg uncertainty, if you will.

Cheers LC

"influences" ,but then everything "touches" everything else to varying degrees. This is the problem. So, the mechanism of "touching" should be one of the fundamentals.

4 days later

Dear Olaf,

Thank you for a very nice essay. Gödel's theorem is indeed very deep.

One must also look at Turing's halting theorem. You might also look at the following book, which deals with such issues: "Computability in Analysis and Physics" by Marian B. Pour-El and J. Ian Richards.

Please take a look at my essay.

Thank you again for a very readable essay.

All the best,

Noson Yanofsky

8 days later

Dear Olaf,

Very nice essay, a clear and deep message in a brief and well-written form. I like your statement "emergent phenomena that are to fundamental physics what true and unprovable statements are to mathematics". In fact, I wrote about incompleteness and the limits of computability in relation to emergence and reductionism in my previous essay. In particular, if you look at the figure in section 4, you may see how much we agree. Your example with solids touching is intriguingly simple, but at this point I don't understand why we can't prove they touch. If some of the positions in the list repeat, I think they touch; am I missing something? Also, if they touch, the dynamics will "know", because the evolution will include a collision. Perhaps the meaning is that the list of positions and momenta contains only positions and momenta, and the touching or collision can only be inferred from the list?

Best regards,

Cristi Stoica

Dear Olaf,

If you are looking for another essay to read and rate in the final days of the contest, will you please consider mine? I read all essays from those who comment on my page, and if I can't rate an essay highly, then I don't rate it at all. In fact, I haven't issued a rating lower than ten. So you have nothing to lose by having me read your essay, and everything to gain.

Beyond my essay's introduction, I place a microscope on the subjects of universal complexity and natural forces. I do so within the context that clock operation is driven by Quantum Mechanical forces (atomic and photonic), while clocks also serve as a measure of General Relativity's effects (spacetime, time dilation). In this respect clocks can be said to possess a split personality, giving them the distinction that they are simultaneously a study in QM, while GR is a study of clocks. The situation stands whereby we have two fundamental theories of the world, but just one world, and we have a single device that serves the study of both those fundamental theories. Two fundamental theories, but one device? Please join me and my essay in questioning this circumstance.

My essay goes on to identify natural forces in their universal roles, and how they motivate the building and maintenance of complex universal structures and processes. When we look at how stellar fusion processes sit within a "narrow range of sensitivity", such that stars are neither led to explode nor to collapse under gravity, we think how lucky we are that the universe is just so. We can also count our lucky stars that the fusion process that marks the birth of a star also leads to an eruption of photons from its surface. And again, how lucky we are! For if it didn't, then gas accumulation wouldn't be halted, and the star would again be led to collapse.

Could a natural organisation principle have been responsible for fine-tuning universal systems? Faced with how lucky we appear to have been, shouldn't we consider this possibility?

For our luck surely didn't run out there: these photons stream down on Earth, liquefying oceans which drive the geochemical processes that we, "life", are reliant upon. The Earth is made up of elements that possess the chemical potentials that life is entirely dependent upon. Those chemical potentials are not expressed in the absence of water solvency. So again, how amazingly fortunate we are that these chemical potentials exist in the first instance, and additionally within an environment of abundant water solvency such as Earth, able to express these potentials.

My essay attempts something audacious. It questions the fundamental nature of the interaction between space and matter, Guv = Tuv, and hypothesizes that the equality between space curvature and atomic forces is due to a common process: space gives up a potential in exchange for atomic forces in a conversion process, which drives atomic activity. Furthermore, baryons only exist because this energy potential of space exists and is available for exploitation. Baryon characteristics and behaviours, and the complexity of structure and process, might then be explained in terms of being evolved and optimised for this purpose and existence, removing the need for so many layers of extraordinary luck to eventuate our own existence. The essay attempts an interpretation of the above-mentioned stellar processes within these terms, but also extends much further. It shines a light on the molecular structure that binds matter together as potentially being an evolved agency that enhances rigidity, and therefore the persistence, of universal systems. We then turn a questioning mind towards Earth's unlikely geochemical processes (to which we living things owe so much) and look at their central theme and propensity for molecular rock-forming processes. The existence of chemical potentials and their diverse range of molecular bond-formation activities; the abundance of water solvent on Earth, without which many geochemical rock-forming processes could not be expressed; the question of a watery Earth: these are then implicated as being part of an evolved system that arose for purpose and reason, alongside the same reason and purpose that molecular bonds and chemistry processes arose.

By identifying atomic forces as having their origin in space, we have identified how they perpetually act and deliver work products. Forces drive clocks, and clock activity is shown by GR to dilate. My essay details the principle of force dilation and applies it to a universal mystery. It raises the possibility that nature, in possession of a natural energy potential, will spontaneously generate a circumstance of Darwinian emergence. It did so on Earth, and perhaps it did so within a wider scope. We have learnt how biology generates intricate structure and complexity, and now we learn how it might explain intricate structure and complexity within universal physical systems.

To steal a phrase from my essay: "A world product of evolved optimization".

Best of luck for the conclusion of the contest

Kind regards

Steven Andresen

Darwinian Universal Fundamental Origin
