The first sentence of my long post should have read something like:

"Some of what you're talking about sounds like it's relevant to the Newell-Simon theory in which meaning is claimed to reside in the symbols ("tokens") themselves instead of THE SYMBOLS possessing WHATEVER meaning a given culture or subculture assigns, often arbitrarily, to THEM."

More or less.

    Hi Edwin and nmann,

    Ok, so far I think I am on the right page, inasmuch as Shannon entropy is not what's being searched for here.

    I really like that example of "one if by land, two if by sea". It was very informative for one, but it also helps me see that meaning is a big can of worms here. The analogy of "one if spin up, two if spin down" is also great, because it points back to Boltzmann and von Neumann entropy, which are pretty much analogous to Shannon entropy, if I'm not mistaken. Like Shannon entropy, von Neumann entropy has the problem that meaning is also involved, so not even this version of entropy is helpful. For instance, in terms of a black hole, one person might say that the entropy S means there are "bits" which could represent a binary spin up/down, which in turn means that the event horizon area is quantized (and thus black hole mass is quantized), while another person (Edwin, myself) might say that it's about more than just binary spin and that the entropy is not discretized as such.

    So, how would our description of physics change from this von Neumann approach? Is that what this search is for? To eliminate room for interpretation? I apologize if this is frustrating.

    Would it be easier to discuss it in terms of object-oriented programming, where there are classes (categories) and class member functions (morphisms that act on categories in a way that does not alter the fundamental structure of the category)? I know this kind of thing far better than I know physics.
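    For example, here's roughly what I have in mind (a toy sketch of my own; the class and method names are all made up): a class plays the role of a category's objects, and methods that always return another instance of the same class play the role of morphisms that leave the underlying structure intact.

```python
# Toy sketch of the classes-as-categories analogy. Every "morphism"
# maps a Vector2D to a Vector2D, so composing them never takes you
# outside the class -- the "fundamental structure" is preserved.

class Vector2D:
    """Objects of our toy 'category': 2-D vectors."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    # Each method is a morphism: Vector2D -> Vector2D.
    def scale(self, k):
        return Vector2D(k * self.x, k * self.y)

    def rotate90(self):
        return Vector2D(-self.y, self.x)

v = Vector2D(1.0, 2.0)
w = v.rotate90().scale(3.0)       # morphisms compose
print(type(w).__name__, w.x, w.y)  # -> Vector2D -6.0 3.0
```

    Whether that analogy really carries over to physics is exactly what I'm unsure about.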

    I'm just having trouble visualizing what kind of "thing" would be in hand when the search is complete -- like what's its structure? I read one of those papers mentioned earlier about the Mathematical Theory of Information by Kahre, and it said right near the beginning that their theory gives Shannon entropy in a limiting case. I could not follow their argument, but it seems that they weren't eliminating Shannon entropy, but more like generalizing it. Help? Please? :)

    - Shawn

    Thank you for yet another reference to look up. I am reading it and trying to digest it all.

    I suppose a better question is: what do you mean by 'information qua information'? Perhaps knowing that will help me see Lorraine's point of view better.

    I apologize for taking this off on a tangent by bringing up "meaning", especially after Edwin had already pointed out that it was a can of worms. I just wanted to eliminate what's *not* being searched for. I do appreciate your patience with me.

    Haha, I have to laugh at myself a little bit here. You and Edwin definitely bring up a good point about cultural differences -- even the word information is in and of itself context-specific. I think that this is why I am having trouble getting off the ground here, because I have thought of the word information in terms of entropy for so long. Like, when someone says to me "So, what's new?" I think of them as saying "Please give me information." I'm not trying to be intentionally slow-witted here, I'm just having some trouble reprogramming my brain. I do appreciate all the answers to the questions that I'm asking. Again, apologies if it's frustrating.

    Regardless of whether or not this has anything to do with what you're trying to explain... it sure would be neat if we created a neural network that takes an input state and gives an output state, and by luck (huge luck, involving how we picked the training to take place) the network started predicting experimentally-verifiable results that our current rule-based theories didn't call for. :)

    Well. I was hoping not to have to get into all this but...

    Every coin has two sides. And so does Information Theory. Consider nmann's earlier quote, that "traditional Shannonian Information Theory you're talking about, which doesn't address "meaning" instead limiting itself to the abstracted transmission of physical data". Like a weird quantum superposition, that statement is simultaneously true and false. It is true that it does not address the "final meaning" that a human will eventually slap onto the received message. But it very powerfully addresses a much more important meaning: one that enables the human to receive any message at all.

    Here is the problem: Suppose you receive a message. That message has been corrupted. In addition to the information that the sender intends for you to receive, you are also receiving a great deal of information about all the things contributing to the corruption: distortion, multi-path interference, co-channel interference, Doppler frequency shifts, time-varying amplitude attenuation, additive noise. You get the picture. It can be a really messy picture. So here is the problem: How do you know that all that "crap" is in fact "crap"? How do you know that that is not exactly what the sender intended you to receive?

    If you know, a priori, that the sender is using only a very limited set of "channel coding symbols" (think of each as a uniquely modulated, short waveform), then anything that does not appear exactly as expected must be corrupted. In the past thirty years, very powerful techniques have been developed to FORCE corrupted symbols to appear uncorrupted, by exploiting a priori information about these symbols.
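    A toy sketch of the idea (my own illustrative alphabet and "noise", not any particular standard): because the receiver knows the legal symbols a priori, it simply snaps whatever it receives back to the nearest one.

```python
import numpy as np

# Toy a-priori decoder: the receiver knows the only legal "channel
# coding symbols" in advance, so any corrupted waveform is forced
# back to the nearest legal one -- maximum-likelihood detection
# under additive Gaussian noise.  Alphabet is illustrative.
alphabet = {
    "0": np.array([+1.0, +1.0, +1.0, +1.0]),
    "1": np.array([+1.0, -1.0, +1.0, -1.0]),
}

def decode(received):
    # Pick the known symbol closest (in Euclidean distance) to what
    # was actually received; everything else is treated as "crap".
    return min(alphabet, key=lambda s: np.sum((received - alphabet[s]) ** 2))

sent = alphabet["1"]
corrupted = sent + np.array([0.4, -0.3, 0.2, 0.5])   # deterministic "noise"
print(decode(corrupted))   # -> 1
```

    Notice that the decoder never analyzes the noise itself; it only asks which known symbol the mess is closest to.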

    This means that you know where the intended, as opposed to the unintended (corrupting), meaning lies. Think of this as always receiving two simultaneous messages, all mixed and garbled together: one containing the intended meaning, the other all the undesired, unintended meaning. And it does "mean" something; if you cared to, you might be able to learn a lot about the sources of the corruption by analyzing it. But we are not even going to try. It is the dirt; we are going to go for the gold.

    In effect, such a system knows EXACTLY what it is looking for, and is trying very hard, often with very great success, to completely ignore all the "crap" that is known, a priori, to be of no interest. Have you ever wondered why your eyes cannot detect the absorption lines in the solar spectrum of visible light? That is why. Have you ever wondered why your auditory system cannot understand a fax transmission, sent via an acoustic modem over a telephone line? That is why.

    So what does this have to do with physics? Well. If you make a guess about what the entity being observed "IS", then you might be able to exploit that information in the same way that the above channel coding can be exploited. For example, if you guess how many components "SHOULD" be measured, and you guess correctly, you may be able to extract a far "cleaner" observation than if you made no guess at all. But if you guessed wrong...

    Rob McEachern

    Shawn,

    Never apologize! Never explain!

    The "qua information" is maybe easier to get a handle on if you use "qua energy" as a paradigm. Nobody has ever observed Energy as such. But still we feel at home with the idea of Energy. We even have an intuitive sense of the concept Energy (or believe we do). It's like the feeling you get when you're running fast or taking off in an airplane. Or seeing horses galloping. Whatever. But we're not actually observing Energy as such. We're observing ourselves and other physical objects channeling energy and being energetic.

    So back off and detach the sense of Energy from anything specifically energetic. You do that all the time. E=mc^2. That works as "pure energy." Even though in actuality you've only ever observed or sensed motion, heat, lightning ... coded energy.

    Lorraine does something similar with Information, I suspect. She imagines, intuits something existing behind and prior to the coded messaging more generally associated with Information. Here's a link ... the most relevant material is the first part. I don't think Lorraine would agree with the general tenor of the paper, but it gives me a sense of what pure (or reasonably pure) Information might be like:

    Quantum Physics as a Science of Information

    Shawn, you say, "even the word information is in and of itself context-specific".

    *Everything* is context specific. In fact, in an essay posted yesterday, "Cosmic Solipsism", Amanda Gefter states, "Instead, I argue, each observer has their own universe, which constitutes a complete and singular reality." While I would not go so far as solipsism, I think that she is right if she means that "each observer has their own MENTAL universe" or singular world-view. The initial genetic data that grew our unique brains and the information that shapes our learning brains could not possibly produce the same total picture. This is the beauty of math, and hence Shannon's treatment of information as data in a channel -- we abstract meaning from the symbols and treat them as meaningless entities, to which only logic applies. Once we move these from the world of math into a real physical universe in which every thing must [I assume] fit together in self-consistent fashion with everything else, then context [initial data] is all.

    I believe that is what Robert McEachern is saying, and I believe his essay is one of the most important here.

    I plan on taking no more of Lorraine's thread, but thanks all.

    Edwin Eugene Klingman

    Hi Robert and Edwin,

    Thank you both very much for the further explanations. I have a basic idea of some error correction techniques used in TCP (checksums, acknowledgements, sliding window), and I'll use that as a kind of base to investigate more into what Robert is talking about.
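    For instance, the checksum part can be sketched like this (written from memory in the style of the RFC 1071 one's-complement sum used in TCP/IP, so treat the details as illustrative rather than authoritative):

```python
def internet_checksum(data: bytes) -> int:
    """One's-complement 16-bit checksum in the style of RFC 1071.

    The sender stores this value in the header; the receiver recomputes
    it and discards the segment on a mismatch -- a crude way of knowing,
    a priori, what an uncorrupted message must look like.
    """
    if len(data) % 2:                # pad odd-length data with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]   # big-endian 16-bit words
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

msg = b"one if by land, two if by sea"
print(hex(internet_checksum(msg)))   # 16-bit value the sender would transmit
```

    It detects things like single-bit flips, but of course it says nothing about what the bits mean -- which I guess is the whole point of this thread.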

    I will definitely give Robert and Amanda's essays another read.

    Thank you very much for taking the time to help me sort through this.

    Ok, I think I'm catching your drift now. I can definitely see what you mean about how energy can be considered to be an intangible object that only makes itself known somewhat indirectly via motion. I think that some physicists say that there is an equivalence between energy and information (http://arxiv.org/abs/1207.6872), so perhaps this 'qua energy' is very much one and the same thing as 'qua information'.

    Thank you for the link to that paper about Quantum Physics as a Science of Information.

    I will stop hijacking Lorraine's thread for now and read up some more before asking other questions.

    Robert,

    I have found answering your post to be quite testing, but it is also true that I am tired from a meeting last night.

    You say "...All sensory outputs are similarly coded. Without such coding, you would experience nothing." But a code itself is not meaningful information - you can "experience nothing" meaningful until it is decoded, and to do that you need to understand the code.

    I contend that from the point of view of certain physical elements of your retina/brain, certain other related physical elements are objects that represent information (i.e. they are like a code). It's as if "code" originally derives from a subject/object relationship. I also contend that the main content of information/experience/physical reality is categories and relationships. From the point of view of physical elements in your retina/brain, subjective experience consists of interrelated information categories and relationships that derive from self or other objects.

    Although they derive from light interacting with physical elements of the eye/brain, I would think that red, green or yellow information categories only exist from the subjective point of view of the physical elements of the eye/brain (which are in turn part of the experience of a larger organism). I would think that this subjective information derives from information category relationships, certainly not from code.

    Hi Shawn,

    thanks for the compliment on my essay.

    What I'm really saying is connected to my assertion that the universe is not like a computer. I'm saying that what Information Theory calls "information" is actually "coded information". But what underlies all this code? Computers only deal with coded information; and words, symbols and sentences are codes too. Code represents something else. What is the something else that that code represents?

    I'm saying that when all the layers of code have been deciphered, you are left with "uncoded" "pure" information, and this "pure" information is experience.

    Shawn, see above for my reply to your original post of 5 September 2012, 17:33 GMT

    Also above, I have replied to Robert's post of 5 September 2012, 16:45 GMT

    nmann, re your post of 6 September 2012, 00:41 GMT:

    In "Quantum Physics as a Science of Information" by Caslav Brukner and Anton Zeilinger, they seem to be talking about how much coded information a quantum system might represent from a human point of view, and how it can be used. They really have no answer to what "pure" unencoded information might exist from a particle (entangled or not) point of view.

    I think that the "subjective information derived from information relationships" IS the code. That is probably the only really important difference between our two points of view.

    This link between observation and experience is accomplished simply by treating observations not as "real numbers", but as "serial numbers", which are used to "look up" how to behave towards an observation; these behaviors are, of course, based on accumulated past experience.
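    As a cartoon (my own toy table, purely illustrative): the observed value is not used as a number to calculate with, but as a key for looking up a learned response.

```python
# Cartoon of the "serial number" idea: an observation is a lookup key,
# not an arithmetic quantity.  The table stands in for "accumulated
# past experience"; the entries are invented for illustration.
experience = {
    "spin_up":   "deflect toward +z detector",
    "spin_down": "deflect toward -z detector",
}

def respond(observation):
    # Unrecognized observations have no learned behavior attached.
    return experience.get(observation, "no learned response")

print(respond("spin_up"))       # -> deflect toward +z detector
print(respond("polarized_45"))  # -> no learned response
```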

    A somewhat more detailed description of this can be found in my post on Sept 3, 2012 @ 18:26, under the essay "Is Quantum Linear Superposition an Exact Principle of Nature?"

    Rob McEachern

    Robert,

    Point taken. Also you can treat aspects of direct human communication -- social interaction -- from a Shannon perspective. The Whisper Game. The final output can often be outrageous. Or conflicting eyewitness accounts of a robbery or an automobile accident.

    Still, though, I believe that Jan Kahre, building in no small part on Shannon, takes it farther. His Law of Diminishing Information specifically accounts for more phenomena and incorporates subjective meaning. Example: if information in any given context is something that can only be diminished, and you can't read Chinese, and you're sitting on a park bench in Shanghai staring with incomprehension at a Chinese newspaper, and then an English-speaking native comes along and translates an article for you -- hasn't information actually been increased? Answer: No, because for you the material in the paper was never information in the first place.

    In other words, while the newspaper's content was objectively information, your inability to recognize it as such doesn't count as a measure of the information content. "Any information measure that depends so strongly on the prior knowledge of the recipient is too subjective to be of use and should be discarded."

    The universe is a computer in the same way the brain is a computer. The computer is a model system which we devise that processes information. The double slit experiment, for instance, is a case where letting the electron pass through the two slits is an OR process, while trying to measure which slit the particle passes through is an AND process. The two, in a quantum logical setting, lead to the experimentally determined observation which supports quantum mechanics. We may use these rules to construct a Feynman path integral. In this sense quantum physics can be seen as a type of logic, and the processes by which a quantum state evolves or is observed are due to such logical processes. This does not necessarily mean the universe is a computer in the sense of what we generically call a computer. This is similar to the brain, where while local circuits of neurons have flip-flop type arrangements for the execution of action potentials, clearly on larger scales there are huge departures from our usual von Neumann architecture of a computer.

    The universe as a computer, to use this as a parallel, is one which computes itself. The computer is its own program and data stack. If the above idea of using quantum logic to construct a path integral is used, then e^{-iHt} is in general written according to a large Lie group: the free Hamiltonian is the Cartan center, interaction Hamiltonians involve transitions of root vectors, and so forth. In this case the "logical operations" correspond to how quantum states or elementary particles transition between each other. The most elementary example of this is a two-state system with a state vector

    |ψ> = |1> + e^{iθ}|0>

    for a qubit. This state vector evolves according to the Schrödinger equation. The free Hamiltonian is given by the Pauli matrix σ_z, with diagonal entries 1 and -1 for the two states, and the interaction terms will involve σ_+ and σ_-, with off-diagonal entries that transition between the two states.
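    Taking the two-state example at face value, here is a short numerical sketch (my conventions: hbar = 1, H = σ_z): since σ_z is diagonal, the free evolution just winds the relative phase θ between the two basis states while leaving the populations fixed.

```python
import numpy as np

# Two-state system evolving under the free Hamiltonian H = sigma_z
# (hbar = 1).  exp(-iHt) is diagonal, so it attaches opposite phases
# to |1> and |0>, advancing the relative phase theta in
# |psi> = |1> + e^{i theta}|0> (up to normalization).
sigma_z = np.diag([1.0, -1.0])       # basis order: |1>, |0>

def evolve(psi0, t):
    # exp(-i sigma_z t) applied componentwise: phases e^{-it}, e^{+it}.
    phases = np.exp(-1j * np.diag(sigma_z) * t)
    return phases * psi0

psi0 = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition
psi = evolve(psi0, t=np.pi / 2)

print(np.abs(psi) ** 2)               # populations unchanged: [0.5 0.5]
print(np.angle(psi[1] / psi[0]))      # relative phase advanced by 2t = pi
```

    Adding the σ_+ and σ_- interaction terms would make the populations oscillate as well, but the free case already shows the "logical" structure.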

    We may of course form a concept of information from the density matrix ρ as

    S = -k tr[ρ log(ρ)],

    which reduces to

    S = -k sum_i P_i log(P_i),

    for P_i the probabilities. Information theory makes no distinction between whether this information has some meaning or not. We might make this a bit firmer with the notion that this information is decrypted in some form. A process may run quantum information through some very complex process, say a black hole, where the subsequent Hawking radiation which emerges has no apparent "signal." The observer in this case lacks the encryption/decryption code. There is then a subjective aspect to information. However, whether the measured information has "meaning" or the observer lacks the decryption processor, this subjectivity does not mean that information was not in some way processed.
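    The reduction of S = -k tr[ρ log(ρ)] to the Shannon form is easy to check numerically (taking k = 1 and toy density matrices of my own choosing): diagonalizing ρ turns the trace into a plain sum over eigenvalue probabilities, and the off-diagonal coherences are exactly what the Shannon form misses.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr[rho log rho] with k = 1, computed via the eigenvalues."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                 # 0 log 0 -> 0 by convention
    return float(-np.sum(p * np.log(p)))

def shannon_entropy(p):
    """S = -sum_i P_i log(P_i) for a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# For a diagonal (fully decohered) density matrix the two coincide:
probs = [0.7, 0.3]
print(von_neumann_entropy(np.diag(probs)), shannon_entropy(probs))

# A pure state |+><+| has zero von Neumann entropy even though its
# diagonal looks like a 50/50 coin -- the coherences matter.
plus = np.full((2, 2), 0.5)
print(von_neumann_entropy(plus), shannon_entropy([0.5, 0.5]))
```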

    The upshot is that I can meet you half way on this. The universe is not a computer in the standard sense of the definition of "computer." However, this does not necessarily mean that how quantum states are processed does not have a computer process model structure.

    Cheers LC

      Rob,

      I've read your 3 September 2012 18:26 post to "Is Quantum Linear Superposition an Exact Principle of Nature?". I think that physicists have been forced to use these mathematical tools to try to model the reality they find experimentally. The difficulty is in trying to interpret what the models mean. My interpretation is that the nature of reality is such that not all of fundamental reality is amenable to precise mathematical representation, i.e. reality is not fully deterministic. But I think that many, including many physicists, cannot accept such a view of reality. I think that you must see reality as deterministic.

      Re numbers: I think that numbers can't be taken for granted. I think they must derive from information category self-relationships. See my 6 April 2012 post to David Tong's essay "Physics and the Integers" (http://fqxi.org/community/forum/topic/897) for more details.

      Hi Lawrence,

      I'd like to make the point that "Information Theory" is NOT about information - it's about CODED information. Do you agree?

      Computers only deal with code (coded information), and when people communicate via speech or writing they are also using code. Code is meaningless until it is decoded, and to do this you need a code book or you need to understand the code. I contend in my essay that when all the layers of coded information are decoded, what is left is "pure" information, i.e. experience.