To continue with what I started, I thought I would break this out a bit with regard to quantum gravitation.
With the measurement issue, a quantum system with some set of states, usually rather small in number or with few degrees of freedom, is measured by a classical-like system. The word "like" refers to the fact that this measuring system is really quantum mechanical itself. In the sense that Zurek outlines, quantum decoherence causes the superposition or entanglement phase of the system to leak into a large reservoir of environmental states. Hartle then illustrates how decoherence leads to macrostates that are decoherent sets of states. These are subjectively assigned groupings of states, similar to the idea of macrostates in phase space in statistical mechanics. We then see in this physics the split between objective physics, which might be seen as the dynamical evolution of quantum states, and subjective physics, which occurs with the phenomenological reports observers make.
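As a concrete illustration of the Zurek picture, here is a minimal numpy sketch of my own (the per-qubit coupling angles are arbitrary assumptions): the system qubit starts in (|0⟩ + |1⟩)/√2, each environment qubit imperfectly records it, and the off-diagonal coherence of the system's reduced density matrix decays exponentially as the reservoir of environmental states grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch_overlap(n_env):
    # Overlap <E1|E0> between the two environment branch states.
    # Each environment qubit records the system imperfectly: branch 0
    # leaves it in |0>, branch 1 rotates it by a random angle theta,
    # so the per-qubit overlap is cos(theta) and the total overlap is
    # a product that decays exponentially in n_env.
    overlap = 1.0
    for _ in range(n_env):
        theta = rng.uniform(0.1, 0.5)   # arbitrary coupling strength
        overlap *= np.cos(theta)
    return overlap

# After (|0>|E0> + |1>|E1>)/sqrt(2), tracing out the environment
# leaves rho_01 = 0.5 * <E1|E0> as the system's coherence term.
for n in [1, 5, 10, 20, 40]:
    print(f"n_env = {n:3d}   |rho_01| = {0.5 * abs(branch_overlap(n)):.3e}")
```

The coherence is not destroyed; it is spread into the environment, which is the sense in which the phase enters a large reservoir of states.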
I do not think it is unreasonable to see this as a case of quantum states encoding information about quantum states. To fully understand this process requires some understanding of how the reservoir of quantum states defines the final needle states, which requires a measurement of the measuring system, which then of course leads into a recursion. The many-worlds interpretation has this feature in the form of a product structure of increasingly entangled states, as sketched just below. This ψ-ontological interpretation has its mirror in ψ-epistemic interpretations, and in objective collapse models such as GRW. Further, quantum interpretations tend to be incomplete and to contradict one another. I see this as a possible sign of a Gödelian nature of quantum physics.
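Schematically, the recursion is the von Neumann measurement chain: the system entangles with apparatus A, which must itself be measured by apparatus B, and so on. In a standard textbook rendering (my notation, not from any particular source):

|ψ⟩|A_0⟩|B_0⟩ → (Σ_i c_i |s_i⟩|A_i⟩)|B_0⟩ → Σ_i c_i |s_i⟩|A_i⟩|B_i⟩ → ⋯

Each step enlarges the product structure without ever terminating the chain, which is the many-worlds feature referred to above.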
When it comes to quantum gravity there is a similar gap. Currently the firewall is a major obstruction to a unitary description of quantized gravity. Hawking bet that information, here meaning the number of quantum bits, qubits or qu-Nits for N ≥ 2, is not conserved. The conservation of information appears to be a reasonable requirement of physics, which was the stance of Susskind. Susskind won his bet with Hawking, but then Almheiri, Marolf, Polchinski, and Sully demonstrated something interesting: a failure to account for the entanglements of states means that either the unitarity of quantum mechanics or the equivalence principle of general relativity fails. In my essay I illustrate how these two principles are complementary and not compatible in a classical sense.
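Conservation of information here can be read as the statement that unitary evolution preserves the von Neumann entropy of a state, so no quantum information is created or destroyed. A small self-contained check of this, with an arbitrary random two-qubit state of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))

# Random mixed state of two qubits (4x4 density matrix).
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = a @ a.conj().T
rho /= np.trace(rho).real

# Random unitary U = exp(-iH) built from a random Hermitian H.
h = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
h = (h + h.conj().T) / 2
w, v = np.linalg.eigh(h)
u = v @ np.diag(np.exp(-1j * w)) @ v.conj().T

rho_evolved = u @ rho @ u.conj().T
print(von_neumann_entropy(rho))          # these two agree,
print(von_neumann_entropy(rho_evolved))  # whatever U is
```

Hawking's original calculation, in which a pure state evolves into thermal radiation, violates exactly this invariance.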
The vacuum is filled with virtual pairs of fields. With a black hole, the gravity field causes one member of such a pair to fall into the black hole while the other escapes. The quantum particle or photon that escapes as Hawking radiation is thus entangled with the partner that falls in, and so Hawking radiation is entangled with the black hole. At first blush there seems to be no problem. However, think of a thermal cavity heated to a high temperature: the photons that escape are entangled with the quantum states of the atoms composing the cavity. Once the entanglement entropy reaches a maximum, at half the energy released, the subsequent photons released are entangled with the prior photons released. This would hold with black holes as well, but because of the virtual-pair nature of this radiation, Hawking radiation previously emitted in a bipartite entanglement would now be entangled not just with the black hole but with more recently emitted radiation as well. A bipartite entanglement would be transformed into a tripartite entanglement. Such transformations are not permitted by unitary quantum evolution; this is the monogamy of entanglement requirement, and what it suggests is that unitarity fails. To prevent the failure of quantum mechanics, some proposed a horizon of high-energy quanta that violates the equivalence principle. This is called a firewall.
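The "maximum at half" statement is essentially Page's result: for a random pure state, the entanglement entropy of the emitted radiation rises until it holds half the degrees of freedom, then falls. A toy numerical sketch of my own (10 qubits standing in for hole plus radiation, an arbitrary size):

```python
import numpy as np

rng = np.random.default_rng(2)

def entanglement_entropy(psi, dim_a, dim_b):
    """Entropy of subsystem A of the pure state psi, via SVD."""
    s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
    p = s**2
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

n = 10  # toy "black hole + radiation" register of 10 qubits
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)  # Haar-like random pure state

# Entropy of the first k "emitted" qubits, for k = 1 .. n-1.
for k in range(1, n):
    s = entanglement_entropy(psi, 2**k, 2**(n - k))
    print(f"emitted qubits = {k:2d}   S = {s:.3f} bits")
```

The entropy climbs by roughly one bit per emitted qubit, peaks at k = n/2, and then descends: the Page curve. The monogamy problem above bites precisely on the descending half.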
The firewall occurs when half the possible radiation has been emitted, which is also the Page time. This also corresponds to the failure of a quantum error correction code. Error correction codes involve some deep mathematics; they are connected with the RT formula, and I illustrate in my essay the connection with Mirzakhani's mathematics on geodesics in hyperbolic spaces. Error correction is also tied to the packing of spheres, or how oranges stack at the grocery store, the Kepler problem. This gets into the guts of what my paper is about. The main point, however, is that error correction corrects the mixing of information. Think of a library, in particular an elementary school library with little kids, where the patrons scramble up the order of the books. The distance a book ends up from its proper position is the Hamming distance. As the library gets mixed up, an algorithm can manage this disordering. However, at about half mixed up, things break down, and the librarian has to virtually start over.
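The breakdown at about half is exactly how classical codes fail: a code of distance d corrects errors only while they number fewer than d/2. A minimal sketch with the 5-bit repetition code (my own illustration, not one of the codes discussed in the essay):

```python
import numpy as np

rng = np.random.default_rng(3)

N = 5  # 5-bit repetition code: distance 5, corrects up to 2 flips

def decode(word):
    # Majority vote over the N copies of the logical bit.
    return int(word.sum() > N / 2)

for n_flips in range(N + 1):
    word = np.ones(N, dtype=int)                        # encode logical 1
    flip_at = rng.choice(N, size=n_flips, replace=False)
    word[flip_at] ^= 1                                  # corrupt n_flips bits
    print(f"flips = {n_flips}   recovered: {decode(word) == 1}")
```

With zero, one, or two flips the logical bit is recovered; at three or more, past the halfway point, the majority vote returns the wrong answer every time, and the librarian has to start over.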
In the end it may be that the equivalence principle and unitarity are complementary, and in a quantum setting are not observable simultaneously. This is similar to the Heisenberg uncertainty principle for position and momentum. Back to the Gödelian issue, this means the universe presents itself in entirely different ways depending on the type of measurement performed. This is also a form of "collapse" if thought of in a ψ-epistemic sense, which would agree with Hawking and Penrose. A ψ-ontological perspective would be more in line with Susskind. These perspectives are, I think, ultimately a form of G and NOT-G for G a true but undecidable Gödelian proposition.
Cheers LC