EmeraldBeetle wrote: "Mathematical equations plus physical structures plus logical relationships is literally the dynamic form of Schrödinger's equation:
i\hbar\,\partial_t \psi = \hat H\,\psi
it does not require a belief in your infinite and eternal consciousness, or Lorraine's pan-agency, for it to 'move' or to be applied to real physical systems in our actual world."
The Schrödinger equation, as time-dependent, lives in 'the infinite, eternal consciousness', not in the physicality. I think Lorraine and Steve are right here. Time is not a physical structure, and the wave is not physical either (that can be discussed); you need a particle or a position to have that. The Schrödinger equation describes the potential for motion, but agency (a 'collapse' or reduction) is required to execute the motion and make it real and irreversible. Do you need an arrow of time for the exclusion?
It is an interesting model you make. You describe the work of 'a quantum computer' without measurement: never halting, never making a decision, just an eternal oscillation between entangled parts. Time is symmetric here and you get no entropy. Penrose described this situation where everything is entangled as 'Schrödinger's Sea'. Here the more parts you entangle, the more stable the system gets, and the more it unitarily knows. https://www.nature.com/articles/s41586-024-08449-y
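To make the point concrete, here is a small toy sketch of mine (not from the thread): a single qubit evolving unitarily under a fixed Hamiltonian. The oscillation never halts, the norm never decays, and running time backwards restores the initial state exactly, which is the time symmetry and zero-entropy behaviour described above. The Hamiltonian here is an arbitrary choice for illustration.

```python
import numpy as np

hbar = 1.0
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # arbitrary two-level Hamiltonian (Pauli-X)

def U(t):
    """Propagator exp(-i H t / hbar), built from the eigendecomposition of H."""
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.exp(-1j * w * t / hbar)) @ V.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)   # start in |0>

psi_t = U(2.7) @ psi0                # evolve forward: oscillates, never stops
psi_back = U(-2.7) @ psi_t           # evolve backward: fully reversible

print(np.linalg.norm(psi_t))         # norm stays 1: nothing is lost
print(np.allclose(psi_back, psi0))   # True: time-symmetric, no entropy produced
```

Nothing in this evolution ever 'decides' anything; a measurement (non-unitary step) would have to be added by hand.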
Quote: "We published results showing that the more qubits we use in Willow, the more we reduce errors, and the more quantum the system becomes. We tested ever-larger arrays of physical qubits, scaling up from a grid of 3x3 encoded qubits, to a grid of 5x5, to a grid of 7x7 — and each time, using our latest advances in quantum error correction, we were able to cut the error rate in half. In other words, we achieved an exponential reduction in the error rate. This historic accomplishment is known in the field as 'below threshold' — being able to drive errors down while scaling up the number of qubits. You must demonstrate being below threshold to show real progress on error correction, and this has been an outstanding challenge since quantum error correction was introduced by Peter Shor in 1995."
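The arithmetic behind 'below threshold' can be sketched in a few lines. The factor-of-2 suppression per grid-size step is taken from the quoted passage; the base error rate is a made-up number purely for illustration.

```python
# Toy sketch (my numbers, for illustration only): "below threshold" means the
# logical error rate FALLS as the code distance grows. Per the quote, each step
# from 3x3 to 5x5 to 7x7 roughly halves the error.
base_error = 3e-3        # hypothetical logical error rate at the 3x3 grid
suppression = 2.0        # error cut in half with each distance step

for step, grid in enumerate(["3x3", "5x5", "7x7"]):
    rate = base_error / suppression**step
    print(f"{grid} grid: logical error ~ {rate:.1e}")
```

Halving per step is what makes the reduction exponential: above threshold the same loop would show the rate growing instead.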
This is called redundancy, which is in itself an interesting thought. Can we redefine what decoherence is? That would be an extremely important question for 'the quantum of life', especially in the warm and wet situation.
This goes against common sense, where we have entropy increasing and ignorance growing. These are two DIFFERENT worlds. But say you are a quantum computer with 500 entangled qubits (a quantized 'particle') and you are just one of them: you definitely do not know much. But entanglement is non-local, so your ignorance is 'smeared out' over all 500 qubits, and then you are a continuous wave, taking information from all the other parts, so together you know much more. But this is not a physicality. And you cannot measure it.
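This individual-versus-collective contrast has a standard quantitative form, which I can sketch with the smallest case, a two-qubit Bell pair instead of 500 qubits: the joint state is pure (zero entropy, the whole 'knows' everything), while each single qubit on its own is maximally mixed (one full bit of ignorance).

```python
import numpy as np

# Toy sketch of the point above: global knowledge vs local ignorance
# in an entangled state.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

rho = np.outer(bell, bell.conj())                        # global state: pure
rho_A = np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))   # trace out qubit B

def entropy(r):
    """Von Neumann entropy in bits, from the eigenvalues of a density matrix."""
    w = np.linalg.eigvalsh(r)
    w = w[w > 1e-12]                 # drop numerical zeros before taking logs
    return float(-(w * np.log2(w)).sum())

print(entropy(rho))     # 0.0 -> the pair, jointly, is in a definite state
print(entropy(rho_A))   # 1.0 -> one qubit alone is maximally ignorant
```

The single part's ignorance really is 'smeared out': no local measurement on one qubit can reveal the correlations that make the joint state pure.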
Basically you put a collective world against an individual world. An equation, being a logical relationship, cannot execute itself (Gödel). It must be broken, or non-unitary, not a non-local continuum. Should we recognize the odd numbers as such 'breakers'? And consciousness is such a 'breaker', through selection, reducing and focusing, like we do here and now.
This is what I don't understand in Everett: how do you come to knowledge without a breaking?
Regards Ulla.