Dear Paul,
I understand your viewpoint and your explanation of the difficulties we encounter when we try to describe the world. Let me rephrase the idea, since I think I can do so in a way that avoids the discussions about math and new hidden levels of complexity.
The idea is that at any time t we have a set of experimental data, say Dt (I will use such symbols as short names, not equations). For each set of data there is a set of possible theories (explanations/descriptions of reality, possibly not limited to mathematics, if that is really a limitation), E(Dt), compatible with that set of data. I take E(Dt) to contain all possible theories compatible with Dt. I will assume that the data increase with time, hence for t smaller than t' we have Dt included in Dt'. What happens with E(Dt)? I claim that E(Dt) contains E(Dt'). Clearly, each possible theory can be refuted by new data. But no new theory can appear because of the new data at t': any theory compatible with Dt' must also be compatible with the earlier data Dt, and therefore must already be contained in E(Dt).
So my claim is that there is an abstract set of all possible theories, and this set is only reduced by new data, never increased. As an example, consider the true theory, the one that really describes the world: it must exist, because the world exists, even though we don't know it yet. Therefore, I claim that E(D2009-11-08) is not empty; we simply don't yet know what it contains.
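(If a compact notation helps, and purely as a restatement of the above rather than a new claim, the picture is

\[
E(D) := \{\, T \mid T \text{ is compatible with every observation in } D \,\}, \qquad
D_t \subseteq D_{t'} \;\Rightarrow\; E(D_{t'}) \subseteq E(D_t),
\]

since a theory compatible with all of Dt' is in particular compatible with the subset Dt. And if T* is just my shorthand for the true theory, then T* belongs to E(Dt) for every t, so no E(Dt) is empty.)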
I would add here that E(Dt) should also contain incredibly complicated explanations, which involve data unobservable at that time and are usually ruled out by the physicists of the time because of Occam's razor. These theories are more complicated than the current data require, but may become meaningful with new data. Examples may be the string theories and some other quantum gravity approaches (if they indeed explain all the data we have observed so far).
Taking your example with atoms, I would say that E(D1800-01-01) should contain explanations based on indivisible atoms, but also explanations based on composite atoms (which were not considered at that time; we recognized their relevance only because of new data). So I would say that there is also our knowledge, at a time t, of the possible theories explaining all the data, Kt := K(E(Dt)), and this set changes in a much less predictable way. Perhaps at the end of the 19th century it contained a theory based on Newton's and Maxwell's equations. But now it is empty, because we don't have an accepted theory explaining all the data we currently have. Possibly it will contain at least one theory again, in time. And, of course, it may become empty again because of new data.
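(In the same compact notation, and again only as a restatement of how I read the definition:

\[
K_t := K(E(D_t)) \subseteq E(D_t),
\]

that is, Kt collects only those members of E(Dt) that are actually known at time t. Unlike E(Dt), which can only shrink as data accumulate, Kt may grow when a new theory is formulated and shrink when new data rule a known theory out.)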
According to this view, it is imaginable that Kt will keep changing with time, so that we will feel we are getting closer but will never be confident enough that this is all. But it is also imaginable that we will find a theory which is never invalidated by new data. So I would keep my mind open to all possibilities. That's what I wanted to say in my previous comment.
I want to emphasize that even if we someday found a theory fitting all the data of that time, one which is never invalidated despite new data, there would still be potentially infinitely many consequences to explore and tests for the theory to pass (perhaps this is a weak version of the fairness principle?). It is even possible that the principles are already known but their consequences not yet understood, and that future generations will only asymptotically come to see that the theory fits the data.
Thank you for your interest in the link I gave in the previous comment; here is the direct link to the PDF.
Best regards,
Cristi