Dear Inés,
First, I love the clarity of your writing and thinking. The argument is so coherent I could follow it easily, even though every sentence adds something new to consider. The historical background sets the stage very nicely, and the analogy between math and physics makes perfect sense - though I would never have imagined a connection between axioms and initial conditions, or between theorems and evidence. Interesting! Treating theories as computer programs gets to the heart of the matter - matching experimental results - and makes clear the two essential criteria for theories to be more or less fundamental. And by pointing out that generality and conceptual depth (algorithmic simplicity) don't necessarily go hand in hand, you directly address the question of this contest.
The key issue in your essay seems to be this: to what extent is what happens in the world determined by rules? Since ancient times, it's the rule-determined aspect of things that's been taken as basic (and even divine), because it corresponds to how we've learned to think. To the extent that the data is not predictable, as in quantum theory, it can seem alien to our understanding, like undecidable propositions in mathematics. You say, "The experiences we collect throughout our (single) life do not form a valid string, nor a target for a theory."
This reminds me of the Medieval definition of truth - adaequatio intellectus et rei, the conformity of the intellect and the thing. It implies not only that our minds are made to understand the world, but also that the world is made to fulfill our potential for understanding. But now it seems as though this mutual adequacy might be broken. If I may, I'll try to sketch how I think it might be restored.
The world of classical physics is strictly deterministic - in simple situations like the sphere falling from the tower, our algorithms reproduce the measured data as closely as we like. Yet even a situation as simple as three bodies moving in space, per Newton's law of gravity, can only be approximated, by reiterating the two-body algorithm over and over. We can prove the three-body problem has a unique solution, but there's no closed-form expression for it, so it's not directly computable. So even before we get to quantum mechanics, there's a gap between what our algorithms can do and what the physical world can do.
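To make the gap concrete, here is a minimal sketch (in Python, purely illustrative - the masses, starting positions, and step size are all invented) of the kind of reiteration I mean. There is no formula to evaluate, only a small step to repeat:

    # Purely illustrative: stepwise approximation of Newtonian three-body
    # motion. All numbers are invented; units are chosen so that G = 1.
    import numpy as np

    G = 1.0
    m = np.array([1.0, 1.0, 1.0])                            # three equal masses
    r = np.array([[1.0, 0.0], [-0.5, 0.9], [-0.5, -0.9]])    # positions in a plane
    v = np.array([[0.0, 0.5], [-0.4, -0.25], [0.4, -0.25]])  # velocities

    def acc(r):
        # Each body feels the sum of the pairwise two-body pulls.
        a = np.zeros_like(r)
        for i in range(3):
            for j in range(3):
                if i != j:
                    d = r[j] - r[i]
                    a[i] += G * m[j] * d / np.linalg.norm(d) ** 3
        return a

    dt = 0.001                    # no closed-form solution exists, so we
    for _ in range(10_000):       # just repeat one small step, over and over
        v += 0.5 * dt * acc(r)    # half kick
        r += dt * v               # drift
        v += 0.5 * dt * acc(r)    # half kick

    print(r)  # an approximation we can refine (smaller dt), never close exactly

However small we make the step, the output remains an approximation - one we can refine without limit, but never replace with the solution itself.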
QM formalizes this gap. Its algorithms give the statistics, very accurately, but leave the measured results up to chance. We can see this as a failure we have to live with. But maybe the algorithms and measurement-contexts are two distinct aspects of the system, both needed to make it work. One determines the regularities that support classical physics; the other determines the unique events that constitute our world of experiences. Maybe these two kinds of determining - lawful and random - are what enable the world to do what the algorithms alone could not.
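In the same illustrative spirit (an invented qubit state, nothing more), here is how that division of labor looks: the algorithm fixes the statistics exactly, while each single event is left open:

    # Purely illustrative: the quantum algorithm delivers the statistics,
    # while each individual outcome is left to chance. Invented qubit state.
    import numpy as np

    rng = np.random.default_rng()

    psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])  # amplitudes for |0> and |1>
    probs = np.abs(psi) ** 2                      # Born rule: the lawful part

    clicks = rng.choice(2, size=10_000, p=probs)  # the random part: single events

    print("predicted probabilities:", probs)
    print("observed frequencies:   ", np.bincount(clicks) / clicks.size)
    print("first ten events:       ", clicks[:10])  # unpredictable one by one

Run it twice and the frequencies agree closely, while the two sequences of individual events do not - the regularity and the unique happenings come apart, just as above.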
In any case, in today's physics, the evidence is well explained, but the theories themselves are not. They're so far from being simple or self-evident that physicists refuse to consider them fundamental, despite the lack of any empirical evidence pointing beyond them. What's missing is not generality but conceptual depth. What I've tried to show in my current essay is that the structural requirements for a system of measurement-contexts might provide this.
Thanks for an excellent and highly relevant essay. And congratulations on your well-deserved prize in last year's contest!
Best wishes, Conrad