Hi Lorraine,
Ok, thank you for laying out some more constraints for the conversation. :)
The following is what had run through my head when I made my last comment, and I believe that it relates to what nmann and Lawrence are saying too. Forgive me if none of this is new to anyone; I just want to eliminate the dead-end paths in my understanding.
I think that symbols exist in order to defer (or outright eliminate) the gathering of materials and expenditure of effort (heat generation) required to fully re-present some physical system.
A lateral case would be the symbols that we have for the word "blue". Ancient people couldn't generally shoot out blue photons at will, and so the spoken/written word was created in order to defer the gathering of materials / expenditure of effort required to make blue light. In the end, neither the word "blue" written on paper nor the brain cell/chemical/electrical pattern is very similar to a blue photon in terms of spatial or temporal physical configuration.
A vertical case is the modeling of the Solar system.
We can partially defer our re-presentation of this system by making a wood and metal model that is driven by a small electric motor. The spatial configuration is much different from the real thing, and so the dynamics must be driven manually by the motor.
An even more deferred approach for modeling the Solar system is to write down on paper the equation F = GMm/r^2 alongside a list of real vectors that correspond to the
current locations and velocities of some planets. The ink on the page looks absolutely nothing like a Solar system in terms of either spatial or temporal physical
configuration -- the dynamics are entirely deferred (F = ... doesn't even move around the page). If we use a computer simulation to calculate and drive the dynamics
of the system, we take up some of that slack caused by our previous deferral, but not all of the slack since it still would take much more material and effort to
create an actual copy of the Solar system.
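To make the "taking up the slack" step concrete: the equation F = GMm/r^2 plus a list of positions and velocities just sits inert on the page, and a loop like the one below is the computer doing the work that spacetime would otherwise do for free. This is only an illustrative sketch; the figures are rough Sun-Earth values chosen for the example, and the integrator is the simplest one that stays stable.

```python
import math

# Rough physical constants for a Sun-Earth toy model (illustrative values only).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30       # solar mass, kg
dt = 86400.0           # time step: one day, in seconds

# Earth's "list of real vectors": position (m) and velocity (m/s) in the plane.
x, y = 1.496e11, 0.0
vx, vy = 0.0, 29780.0

for _ in range(365):               # drive the dynamics for ~one year
    r = math.hypot(x, y)
    a = -G * M_sun / r**2          # F = GMm/r^2, divided by m, pointed at the Sun
    ax, ay = a * x / r, a * y / r
    vx += ax * dt                  # update velocity first (semi-implicit Euler,
    vy += ay * dt                  # which keeps the orbit from drifting badly)
    x += vx * dt
    y += vy * dt

# After roughly one orbit, Earth should be back near its starting radius.
ratio = math.hypot(x, y) / 1.496e11
print(round(ratio, 2))
```

The point is that none of the lines above look anything like a Solar system either; they just convert deferred dynamics back into actual state changes, at the cost of the computer's materials and heat.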
As for what Lawrence and nmann are saying, I think it's fair to say that we have deferred our re-presentation of the actual physical configuration of some thing over space and time by writing down the density matrix, which encodes the physical states, and by writing down some equations that encode the dynamics. As in the F = ... case, we can then do a computer simulation to drive the dynamics and take up a bit of the slack from our deferral.
This is the point in my line of thought where I had come to the conclusion that the most faithful representation of the Universe is the one where we defer no gathering of materials or expenditure of effort. One 'simply' puts the materials in the right places, and the dynamics take care of themselves from then on. In effect, spacetime and the interactions take over the role of the computer that was required to re-present the dynamics. However, it's clear to me now that this isn't the end of the symbolism spectrum that you were looking for.
I think that the 'Universe is a digital computer' idea got a real boost when the papers on the holographic principle came out in the early-to-mid 90s. One of the most literal interpretations of the holographic principle is that the microstates represented by the density matrix map directly to a set of oscillators (that flip up or down). If the black hole's event horizon can be in any one of n microscopic states (and all of the states are equally probable), then this oscillator paradigm says there must be log_2(n) oscillators. It's like how an 8-bit integer can be in any one of 2^8 = 256 different microscopic states. The dynamics are still another matter altogether, though. For instance, the original holographic principle paper (Dimensional Reduction in Quantum Gravity) does not just lay out this idea of mapping the states to binary oscillators; it also starts to lay out a kind of cellular automaton rule that would govern how exactly the black hole evolves from one specific microstate to the next as time proceeds.
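The counting argument is just this (nothing beyond what's in the paragraph above; the function name is mine):

```python
import math

def oscillators_needed(n_microstates):
    """Number of binary oscillators needed to label n equally probable microstates."""
    return math.log2(n_microstates)

print(oscillators_needed(256))   # 8.0 -- an 8-bit register covers 256 states
print(2 ** 8)                    # 256 -- and conversely, 8 bits span 2^8 states
```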
In the end, I think that these people who take the holographic principle literally are just looking for something that can be subsequently framed as an object-oriented cellular automaton (just slap it into some classes, compile and run in order to re-present the dynamics). How different is this from what you are looking for, besides the fact that they take the existence of binary "bits" literally?