The point about truncation is that it is a cut-off preventing what might be called infinite navel gazing. Given a formal system with a countably infinite list of provable predicates, Cantor diagonalization of the Gödel numbering of those predicates yields ever more predicates that are not provable. Gödel's theorem is really a form of Cantor's diagonalization, or "slash" operation, applied to a list of numbers. As a result the totality a formal system points to is uncountably infinite, and of course Gödel and Cohen used related methods to show the continuum hypothesis is independent of ZF set theory: consistent with it, but not provable from it. From a computational perspective we really do not want to go there!
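The diagonal "slash" itself takes only a few lines. A minimal sketch, assuming we model the enumerated predicates as infinite 0/1 sequences; `enumeration` is an arbitrary computable stand-in and both names are illustrative, not from any real library:

```python
# Cantor's diagonal construction: given any enumeration of 0/1 sequences,
# build a sequence that cannot appear anywhere in the enumeration.

def enumeration(i, j):
    """Hypothetical enumeration: bit j of the i-th listed sequence.
    An arbitrary computable rule serves as a stand-in here."""
    return (i * j + i) % 2

def diagonal_complement(n):
    """First n bits of the diagonal sequence with every bit flipped.
    By construction it differs from the i-th listed sequence at
    position i, so it appears nowhere in the list."""
    return [1 - enumeration(i, i) for i in range(n)]

# The diagonal sequence disagrees with row i at column i:
for i in range(8):
    assert diagonal_complement(8)[i] != enumeration(i, i)
```

Whatever rule `enumeration` encodes, the same two lines of `diagonal_complement` defeat it; that insensitivity to the enumeration is the whole force of the argument.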
For the most elementary hyper-computation, consider a switch flipped on and off according to Zeno's prescription (Thomson's lamp). What is the final state of the switch? The problem is that as the remaining time goes to zero the switch must move with so much energy that it becomes a black hole. The answer is not revealed to us. Hyper-computation has some funny connections with black holes, and this sort of puts an event horizon over any ability to beat Turing and Gödel.
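The Zeno switch can be modeled directly. A toy sketch, assuming each flip occurs after half of the remaining time; `flip_time` and `state_after` are illustrative names, and the point is that the flip times converge while the state never settles:

```python
# Thomson's lamp / Zeno switch: the flip schedule converges to a finite
# time, but the switch state oscillates and has no limiting value.

def flip_time(n):
    """Time of the n-th flip: partial sum 1 + 1/2 + 1/4 + ..., which
    converges to 2 as n grows."""
    return sum(2.0 ** -k for k in range(n))

def state_after(n, initial=0):
    """Switch state after n flips: it simply alternates, so the sequence
    of states has no limit at t = 2."""
    return (initial + n) % 2

# Flip times pile up just below t = 2, yet the state keeps alternating:
assert abs(flip_time(50) - 2.0) < 1e-9
assert state_after(50) != state_after(51)
```

The supertask is "finished" at t = 2 in the sense that all flips are in the past, yet no final state is defined; that is the gap the physical argument (infinite switching energy, collapse) closes off.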
Thinking of this with Turing machines: the universal TM is a sort of Cantor diagonalization slasher on TMs, and there will always be TMs outside its list of halting and non-halting machines. Enter the Malament–Hogarth (MH) spacetime, which exploits properties of the inner Cauchy horizon of a black hole. It is in principle possible for an observer crossing this horizon to receive information about any possible algorithmic process in the exterior. This is then in principle a sort of UTM that can complete the list, even if the task is infinite, and that is hyper-computation. However, it relies upon the properties of an eternal black hole. Black holes can exist for a long time; the largest that might exist in the future are around a trillion solar masses (from the end point of galactic clusters, say 10^{40} years from now), and these might endure for 10^{110} years. However, this is not eternal, and it cuts off or truncates any possible hyper-computation. In reality I don't suspect much would be entering such a black hole, as the exterior world will be a dark and cold void. The evaporation of a black hole thus limits even hyper-computation in the interior.
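The sense in which machines always fall outside the list is the standard diagonal argument against a halting decider. A minimal sketch, assuming a counterfactual oracle `halts` (here a fake stand-in that always answers True; the names are illustrative):

```python
# Diagonalization against a universal halting decider: no matter how
# `halts` answers, the machine `contrary` does the opposite of the
# prediction made about itself.

def halts(f):
    """Pretend oracle deciding whether calling f() eventually returns.
    This stand-in just answers True; any real implementation must be
    wrong on some input, as `contrary` below shows."""
    return True

def contrary():
    """Does the opposite of what the oracle predicts about itself."""
    if halts(contrary):
        while True:      # loop forever when the oracle says we halt
            pass
    return None          # halt when the oracle says we loop

# Whatever `halts` answers about `contrary`, it is wrong by construction:
# "halts" makes contrary loop, "loops" makes contrary halt.
assert halts(contrary) is True   # our fake oracle says halts...
# ...so calling contrary() would in fact loop forever.
```

An MH spacetime sidesteps this not by deciding halting internally but by letting the infinite exterior run supply the answer, which is why its loss of eternity truncates the scheme.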
What I do outline, though, is that this adjusts the Chaitin Ω-number for the halting probability. If we had perfect hyper-computation available, every halting question would be resolved, and each contribution to the Chaitin Ω-number would be a definite 1 or 0. Without that we do not know it with any certainty. However, with truncated hyper-computation the Ω-number may be adjusted closer to 1 or 0, and in a quantum mechanical tunneling setting, or with just plain probabilities and loaded dice, this may bias outcomes. These outcomes may or may not work, but in a truncated hyper-Turing machine setting they permit more favorable outcomes; in effect you can hedge your bet, or there is some pink noise.
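The effect of truncation on Ω can be illustrated with a toy machine. A sketch under loud assumptions: the real Ω is defined over a universal prefix-free machine, whereas `toy_run` below is a made-up prefix-free semantics (a program halts iff its first pair of consecutive 1 bits ends exactly at its last bit), so every name and number here is purely illustrative:

```python
# Truncated estimate of a halting probability: enumerate programs up to a
# length bound, run each within a step budget, and sum 2^-|p| over the
# observed halters. Relaxing the truncation only raises the estimate.

from itertools import product

def toy_run(bits, budget):
    """Toy prefix-free semantics: halt iff the first '1,1' pair ends at
    the final bit (so no halting program is a prefix of another),
    and it is found within the step budget."""
    for step in range(min(len(bits) - 1, budget)):
        if bits[step] == 1 and bits[step + 1] == 1:
            return step + 1 == len(bits) - 1
    return False

def omega_lower_bound(max_len, budget):
    """Sum 2^-|p| over programs observed to halt under the truncation;
    a lower bound on the toy machine's halting probability."""
    total = 0.0
    for n in range(1, max_len + 1):
        for bits in product([0, 1], repeat=n):
            if toy_run(bits, budget):
                total += 2.0 ** -n
    return total

# Prefix-freeness keeps the sum below 1 (Kraft inequality), and a looser
# truncation can only move the estimate up toward the true value:
assert omega_lower_bound(6, 10) <= 1.0
assert omega_lower_bound(6, 1) <= omega_lower_bound(6, 10)
assert omega_lower_bound(4, 10) <= omega_lower_bound(6, 10)
```

This is the sense in which truncated hyper-computation "adjusts" Ω: each enlargement of the budget settles more halting questions, nudging the estimate monotonically toward the exact value it can never reach.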
Then if nature is dual, so that what happens at the extreme UV limit with black hole quantum hair is dual to low-energy IR physics such as chemistry or biology, then ultimately this sort of structure is encoded into the nature of reality. The main argument I give, then, is that the emergence of self-directed systems that exhibit intentionality is scripted into the structure of the universe.
Cheers LC