Hi Edwin and nmann,
OK, so far I think I'm on the right page, insofar as Shannon entropy is not what's being searched for here.
I really like that "one if by land, two if by sea" example. For one, it was very informative, but it also helps me see that meaning is a big can of worms here. The analogy of "one if spin up, two if spin down" is also great, because it points back to Boltzmann and von Neumann entropy, which are pretty much analogous to Shannon entropy, if I'm not mistaken. As with Shannon entropy, the problem with von Neumann entropy is that meaning is also involved, so not even this version of entropy is helpful. For instance, for a black hole, one person might say that the entropy S means there are "bits", which could represent a binary spin up/down, which would mean the event horizon area is quantized (and thus the black hole mass is quantized); another person (Edwin, myself) might say that it's about more than just binary spin, and that the entropy is not discretized like that.
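Just to check my own understanding of why the two look "analogous": when the density matrix is diagonal in some basis, the von Neumann entropy S = -Tr(rho log2 rho) reduces to the Shannon entropy of the eigenvalues. Here's a little plain-Python sketch I wrote to convince myself (my own toy example, not from any of the papers mentioned):

```python
import math

def shannon_entropy(probs):
    # H = -sum p * log2(p), skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probs if p > 0)

# "One if spin up, two if spin down" with equal probabilities:
# exactly one bit of (Shannon) entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# For a density matrix diagonal in its eigenbasis, the von Neumann
# entropy is just the Shannon entropy of the eigenvalues, so the same
# function applies to, say, a biased qubit state.
print(shannon_entropy([0.9, 0.1]))
```

Which is exactly why, as far as I can tell, neither one escapes the "meaning" problem: both only see the probabilities, not what the outcomes stand for.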
So, how would our description of physics change under this von Neumann approach? Is that what this search is for? To eliminate room for interpretation? I apologize if this is frustrating.
Would it be easier to discuss this in terms of object-oriented programming, where there are classes (categories) and class member functions (morphisms that act on the objects in a way that doesn't alter the fundamental structure of the category)? I know this kind of thing far better than I know physics.
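To make the analogy concrete, here's the kind of thing I have in mind (a toy sketch in Python, with names I made up, so take it loosely): objects are types, morphisms are functions between them, and composition has to respect identities without changing the underlying structure.

```python
# Toy "category" sketch: morphisms are plain functions, and composition
# is associative and respects the identity morphism. Just my programming
# analogy, not a formal construction.

def compose(g, f):
    # Morphism composition: (g . f)(x) = g(f(x))
    return lambda x: g(f(x))

identity = lambda x: x

# Two "morphisms" between ordinary types
double = lambda n: 2 * n            # int -> int
describe = lambda n: f"value={n}"   # int -> str

h = compose(describe, double)       # int -> str
print(h(21))  # value=42

# Identity laws: composing with identity changes nothing
assert compose(identity, double)(5) == double(5)
assert compose(double, identity)(5) == double(5)
```

That's roughly the picture I'm groping toward: the structure-preserving maps are what matter, not the labels on the objects.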
I'm just having trouble visualizing what kind of "thing" would be in hand when the search is complete -- like, what's its structure? I read one of the papers mentioned earlier, about Kahre's Mathematical Theory of Information, and right near the beginning it says that their theory gives Shannon entropy as a limiting case. I couldn't follow the argument, but it seems they weren't eliminating Shannon entropy so much as generalizing it. Help? Please? :)
- Shawn