Ralph,
Thanks for the positive word. The incompleteness potentially extends to various levels, including the dichotomy between the quantum and classical worlds. Complex structures such as organisms impose constraints on the micro-causal systems that compose them. How this occurs is not well understood. It is possible that there is in addition a top-down element to this: the physics of molecules or atoms does not predict biological systems, but biological systems as emergent structures impose constraints on how molecules behave.
The physical universe doubtless has a computational aspect to it. It is less clear that this defines all of physics. The discrete paradigm of reality clearly has some problems or limits. In particular, discrete models of spacetime, such as those offered by LQG, imply violations of Lorentz symmetry, yet recent observations of distant GRBs have found none of the dispersion this predicts. Symmetry breaking implies mass or some dispersion due to longitudinal modes. Much of physics can also be expressed through a path integral, which is derived using variational methods: the initial and final states of the system are specified, and the intervening states are derived without any causal, state-by-state evolution perspective.
A path integral has some classical path, usually the extremal path of stationary action. This is one motivation for the einselection paradigm, which assigns states as the stable classical configurations of a system. The odd part of this is that quantum mechanics is contextual by the Kochen-Specker theorem. However, the path integral implies some sort of classical "shadow" to QM. Classical or macroscopic physics is noncontextual, and this seems to imply there is a theorem (the KS theorem etc.) which in the broader context of physics is undecidable. This would be the case if one considers classical physics as having some reality, even if only in a coarse-grained perspective.
It should also be mentioned that chaos theory does not involve halting algorithms. Chaotic algorithms are recursively enumerable; they do not halt, and they carry on to an arbitrary level of floating-point precision. So certain disciplines of physics already embody theory that involves nonhalting algorithms. Halting algorithms are recursive, and their complements are recursive. Recursively enumerable algorithms have complements which are not decidable. Recursive algorithms are BlooPs, to use Hofstadter's term, and their duals are BlooPs. Recursively enumerable algorithms are FlooPs (free loops), but their duals are GlooPs (Gödel loops).
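The nonhalting character of chaotic computation can be sketched with the logistic map, a standard example chosen here for illustration: the iteration carries no halting condition of its own, any cutoff is imposed from outside, and nearby initial conditions are driven apart until the available floating-point precision is exhausted. A minimal sketch, with the map, its parameter, and the iteration count all being illustrative assumptions:

```python
def logistic(x, r=3.9):
    """One step of the logistic map x -> r*x*(1-x); chaotic for r = 3.9."""
    return r * x * (1.0 - x)

def iterate(x0, n, r=3.9):
    """Iterate n steps. The cutoff n is imposed from outside; the map
    itself has no halting condition and can be run indefinitely."""
    x = x0
    for _ in range(n):
        x = logistic(x, r)
    return x

# Two initial conditions differing by 1e-12 are amplified apart until
# the separation saturates at the scale of the attractor itself.
a = iterate(0.3, 100)
b = iterate(0.3 + 1e-12, 100)
separation = abs(a - b)
```

Running the trajectory longer only demands more precision of the input; there is no point at which the algorithm declares itself finished, which is the recursively enumerable character described above.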
I wrote an essay for the FQXi contest, which at last check is #4 out of 182, which demonstrates how a causal (state-to-state in time) perspective of physics will have some level of undecidability. This has some relationship to Hume's is-ought problem (the naturalistic fallacy) and to Wallace's refutation of Taylor's fatalism. Such a perspective motivates the suggestion that associativity of operators may be the axiom that can be toggled on or off.
Trying to understand how the nonassociative mathematics of operators fits into physics is really the hard part. I think that quantum mechanics is purely complex, or C. Of course classical mechanics is R. Gauge theory can be written according to quaternions H, though much of it is done in standard vector form without them. It is interesting, though, that Maxwell formulated electromagnetism, the first gauge field theory, in quaternions. Field operators in second quantization act on a Fock space basis to give quantum amplitudes. So we have a relationship that might be heuristically written as π: H → C. The question is then whether there is some sort of higher-level structure π: O → H.
Spacetime, I think, offers a clue. A black hole horizon has some quantum uncertainty on a scale near the string or Planck length. There will then be an associative uncertainty among three quantum fields, where one of those fields is identified near the horizon. The standard approach to QFT is to assign a harmonic oscillator to every point in space, impose equal-time commutators on that spatial surface with the Wightman criterion for commutation, and work from there. Yet on a small scale that spatial surface will have some noncommutative structure, and this will lead to a host of uncertainties in assigning QFT operators. If there are event horizons, this should lead to an associative uncertainty.
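The associative uncertainty at least has a concrete algebraic counterpart: the octonions O fail to associate, while their quaternionic subalgebra does not. A minimal sketch, where the only assumption is one standard Fano-plane convention for the basis products (nothing here is specific to the horizon argument):

```python
# Octonion basis products, one standard Fano-plane convention:
# e_i e_j = e_k cyclically for each triple below, and e_i^2 = -1.
TRIPLES = [(1, 2, 3), (1, 4, 5), (1, 7, 6), (2, 4, 6),
           (2, 5, 7), (3, 4, 7), (3, 6, 5)]

def build_table():
    """table[(i, j)] = (k, sign), meaning e_i * e_j = sign * e_k."""
    table = {(0, 0): (0, 1)}
    for i in range(1, 8):
        table[(0, i)] = (i, 1)    # 1 * e_i = e_i
        table[(i, 0)] = (i, 1)    # e_i * 1 = e_i
        table[(i, i)] = (0, -1)   # e_i^2 = -1
    for a, b, c in TRIPLES:
        for i, j, k in [(a, b, c), (b, c, a), (c, a, b)]:
            table[(i, j)] = (k, 1)
            table[(j, i)] = (k, -1)
    return table

TABLE = build_table()

def omul(x, y):
    """Multiply octonions represented as length-8 coefficient lists."""
    z = [0.0] * 8
    for i, xi in enumerate(x):
        for j, yj in enumerate(y):
            if xi and yj:
                k, s = TABLE[(i, j)]
                z[k] += s * xi * yj
    return z

def basis(n):
    return [1.0 if i == n else 0.0 for i in range(8)]

def associator(x, y, z):
    """(xy)z - x(yz); zero exactly when the triple associates."""
    left, right = omul(omul(x, y), z), omul(x, omul(y, z))
    return [l - r for l, r in zip(left, right)]

e1, e2, e3, e4 = basis(1), basis(2), basis(3), basis(4)
quat_triple = associator(e1, e2, e3)  # quaternionic subalgebra: all zeros
oct_triple = associator(e1, e2, e4)   # nonzero: (e1 e2) e4 = e7, e1 (e2 e4) = -e7
```

The associator vanishes on the quaternionic triple e1, e2, e3 but not on e1, e2, e4; this is the kind of obstruction an "associative uncertainty" among three fields would have to quantify.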
The above "maps" between C, H and O, where a similar map π: C → R would be the relationship between quantum mechanics and classical mechanics, are really just forms of the Hopf fibration. The relationship between quantum and classical mechanics is of course a difficult subject in its own right. With each of these "ladders" on the Hopf fibration there is some increased uncertainty. Quantum mechanics saved physics from the UV divergence that classical mechanics predicted for the hydrogen atom. Similarly, this may protect physics from divergences with black holes, such as the singularity, and maybe from the current big problem of firewalls.
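The lowest nontrivial rung of this ladder can be written out explicitly: the complex Hopf map sends a unit pair (z0, z1) in C^2, i.e. a point of S^3, to a point of S^2, with a common phase rotation as the circle fiber that gets quotiented away. A minimal sketch, using the standard textbook parametrization rather than anything specific to the letter:

```python
import math

def hopf(z0, z1):
    """Complex Hopf map S^3 -> S^2: a unit pair (z0, z1) in C^2
    goes to (2 Re(z0 z1*), 2 Im(z0 z1*), |z0|^2 - |z1|^2)."""
    w = z0 * z1.conjugate()
    return (2.0 * w.real, 2.0 * w.imag, abs(z0) ** 2 - abs(z1) ** 2)

# A point of S^3: |z0|^2 + |z1|^2 = 1.
z0, z1 = complex(0.6, 0.0), complex(0.0, 0.8)
p = hopf(z0, z1)
norm = math.sqrt(sum(c * c for c in p))  # image lies on the unit 2-sphere

# The fiber: a common phase rotation of (z0, z1) maps to the same point.
phase = complex(math.cos(1.0), math.sin(1.0))
q = hopf(phase * z0, phase * z1)
```

The quotiented phase is the information lost in descending a rung, which is at least suggestive of the increased uncertainty the letter associates with each step of the ladder.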
There may be a general level of undecidability in physics, which tells us that an algorithmic perspective will not be able to define all of physics. The "bit" or qubit perspective of physics is important, but it may not embody all of what might be called physical truth.
I see that you have an essay. I just got my voting code retransmitted to me. The computer I had it on suffered a big virus meltdown and I lost that for a couple of weeks. I will get to voting this week as time permits.
Lawrence B. Crowell