Hi Shawn,

Thanks for the compliment on my essay.

What I'm really saying is connected to my assertion that the universe is not like a computer. I'm saying that what Information Theory calls "information" is actually "coded information". But what underlies all this code? Computers only deal with coded information; and words, symbols and sentences are codes too. Code represents something else. What is the something else that that code represents?

I'm saying that when all the layers of code have been deciphered, you are left with "uncoded" "pure" information, and this "pure" information is experience.

Shawn, see above for my reply to your original post of 5 September 2012, 17:33 GMT

Also above, I have replied to Robert's post of 5 September 2012, 16:45 GMT

nmann , re your post of 6 September 2012, 00:41 GMT :

In "Quantum Physics as a Science of Information" by Caslav Brukner and Anton Zeilinger, they seem to be talking about how much coded information a quantum system might represent from a human point of view, and how it can be used. They really have no answer to what "pure" unencoded information might exist from a particle (entangled or not) point of view.

  • [deleted]

Hi Lorraine,

Thank you for taking the time to respond. I think I see now what you're aiming for, and what you're pointing out.

I think that the "subjective information derived from information relationships" IS the code. That is probably the only really important difference between our two points of view.

This link between observation and experience is accomplished simply by treating observations not as "real numbers", but as "serial numbers", which are used to "look up" how to behave towards an observation; these behaviors are, of course, based on accumulated past experience.
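The "serial number" lookup described here can be sketched as follows; all names and behaviors are purely illustrative, not part of the original description:

```python
# Sketch of the "serial number" idea: an observation is not treated as a
# real-valued quantity, but as a key that indexes accumulated past
# experience to select a behavior. All names here are illustrative.
experience = {
    "loud_noise": "flee",
    "food_smell": "approach",
}

def respond(observation):
    """Look up how to behave towards an observation; fall back when unknown."""
    return experience.get(observation, "explore")

print(respond("loud_noise"))   # flee (learned behavior)
print(respond("unknown_cue"))  # explore (no accumulated experience yet)
```

The point is that the observation's numerical value never enters the behavior; only its role as an index into past experience does.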

A somewhat more detailed description of this can be found in my post on Sept 3, 2012 @ 18:26, under the essay "Is Quantum Linear Superposition an Exact Principle of Nature?"

Rob McEachern

  • [deleted]

Robert,

Point taken. Also you can treat aspects of direct human communication -- social interaction -- from a Shannon perspective. The Whisper Game. The final output can often be outrageous. Or conflicting eyewitness accounts of a robbery or an automobile accident.

Still, though, I believe that Jan Kahre, building in no small part on Shannon, takes it farther. His Law of Diminishing Information specifically accounts for more phenomena and incorporates subjective meaning. Example: if information in any given context is something that can only be diminished, and you can't read Chinese, and you're sitting on a park bench in Shanghai staring with incomprehension at a Chinese newspaper, and then an English-speaking native comes along and translates an article for you -- hasn't information actually been increased? Answer: No, because for you the material in the paper was never information in the first place.

  • [deleted]

In other words, while the newspaper's content was objectively information, your inability to recognize it as such doesn't count as a measure of the information content. "Any information measure that depends so strongly on the prior knowledge of the recipient is too subjective to be of use and should be discarded."

The universe is a computer in the same way the brain is a computer. The computer is a model system which we devise that processes information. The double slit experiment, for instance, is a case where letting the electron pass through the two slits is an OR process, while trying to measure which slit the particle passes through is an AND process. The two, in a quantum logical setting, lead to the experimentally determined observation which supports quantum mechanics. We may use these rules to construct a Feynman path integral. In this sense quantum physics can be seen as a type of logic, and the processes by which a quantum state evolves or is observed as logical operations. This does not necessarily mean the universe is a computer in the sense of what we generically call a computer. This is similar to the brain: while local circuits of neurons have flip-flop types of arrangements for the execution of action potentials, clearly on larger scales there are huge departures from the usual von Neumann architecture of a computer.

The universe as a computer, to use this as a parallel, is one which computes itself. The computer is its own program and data stack. If the above idea of using quantum logic to construct a path integral is used, then e^{-iHt} is in general written according to a large Lie group: the free Hamiltonian is the Cartan center, interaction Hamiltonians involve transitions along root vectors, and so forth. In this case the "logical operations" correspond to how quantum states or elementary particles transition between each other. The most elementary instance of this is a two-state system with a state vector

|ψ> = |1> + e^{iθ}|0>

for a qubit. This state vector evolves according to the Schrödinger equation. The free Hamiltonian is given by the Pauli matrix σ_z, with diagonal entries 1 and -1 for the two states, and the interaction terms will involve σ_+ and σ_- with off-diagonal entries that transition between the two states.
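A minimal numerical sketch of this two-state system (assuming ħ = 1 and arbitrary illustrative values for θ and t):

```python
import numpy as np

# Two-state system sketch (assumptions: hbar = 1, illustrative theta and t).
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)    # free Hamiltonian: diagonal +1, -1
sigma_plus = np.array([[0, 1], [0, 0]], dtype=complex)  # off-diagonal raising operator
sigma_minus = sigma_plus.conj().T                       # off-diagonal lowering operator

theta = 0.3
# |psi> proportional to |1> + e^{i theta}|0>, normalized; basis order (|1>, |0>)
psi = np.array([1.0, np.exp(1j * theta)]) / np.sqrt(2)

# Free evolution |psi(t)> = e^{-i H t}|psi(0)>; since H = sigma_z is diagonal,
# the matrix exponential is just a phase on each diagonal entry.
t = 1.0
U = np.diag(np.exp(-1j * np.diag(sigma_z).real * t))
psi_t = U @ psi

# Under the free Hamiltonian each amplitude only acquires a phase,
# so the occupation probabilities are unchanged.
print(np.abs(psi) ** 2, np.abs(psi_t) ** 2)
```

Transitions between the two states only appear once the σ_± interaction terms are added to the Hamiltonian; the free part alone just rotates the relative phase.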

We may of course form a concept of information from the density matrix ρ as

S = -k tr[ρlog(ρ)],

which reduces to

S = -k sum_i P_i log(P_i),

for P_i the probabilities. Information theory makes no distinction between whether this information has some meaning or not. We might make this a bit firmer with the notion that this information is decrypted in some form. A process may run quantum information through some very complex process, say a black hole, where the subsequent Hawking radiation which emerges has no apparent "signal." The observer in this case lacks the encryption/decryption code. There is then a subjective aspect to information. However, whether the measured information has "meaning," or the observer lacks the decryption processor, this subjectivity does not mean that information was not in some way processed.
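The reduction from the density-matrix form to the probability form can be checked numerically; this is a sketch with k = 1 and an arbitrary two-outcome distribution:

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S = -k tr[rho log(rho)], computed from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                 # treat 0 log 0 as 0
    return -k * np.sum(p * np.log(p))

# For a diagonal (fully decohered) density matrix, this is exactly the
# probability form S = -k sum_i P_i log(P_i):
P = np.array([0.25, 0.75])
shannon = -np.sum(P * np.log(P))
print(von_neumann_entropy(np.diag(P)), shannon)  # the two values agree
```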

The upshot is that I can meet you half way on this. The universe is not a computer in the standard sense of the definition of "computer." However, this does not necessarily mean that how quantum states are processed does not have a computer process model structure.

Cheers LC

    Rob,

    I've read your 3 September 2012 18:26 post to "Is Quantum Linear Superposition an Exact Principle of Nature?". I think that physicists have been forced to use these mathematical tools to try to model the reality they find experimentally. The difficulty is in trying to interpret what the models mean. My interpretation is that the nature of reality is such that not all of fundamental reality is amenable to precise mathematical representation, i.e. reality is not fully deterministic. But I think that many, including many physicists, cannot accept such a view of reality. I think that you must see reality as deterministic.

    Re numbers: I think that numbers can't be taken for granted. I think they must derive from information category self-relationships. See my 6 April 2012 post to David Tong's essay "Physics and the Integers" (http://fqxi.org/community/forum/topic/897) for more details.

    Hi Lawrence,

    I'd like to make the point that "Information Theory" is NOT about information - it's about CODED information. Do you agree?

    Computers only deal with code (coded information), and when people communicate via speech or writing they are also using code. Code is meaningless until it is decoded, and to do this you need a code book or you need to understand the code. I contend in my essay that when all the layers of coded information are decoded, what is left is "pure" information, i.e. experience.

    Lorraine,

    You hit the nail on the head:

    "The difficulty is in trying to interpret what the models mean. My interpretation is that the nature of reality is such that not all of fundamental reality is amenable to precise mathematical representation"

    But the main problem is not determinism versus non-determinism. The problem is that physicists have misinterpreted the meaning of the mathematics, due to some very specific misconceptions about the nature of information. This was the subject of my own essay. My major point is this: in addition to the equations per se, one needs to know other things, like initial conditions. It is the vast information content of the initial conditions, not the tiny information content of the equations, that determines almost everything about how the physicists themselves behave. By totally ignoring this fact, they have totally misunderstood how physicists interact with their own observations, and consequently, misinterpreted the observations. The math is not the problem. The problem is the slapped-on interpretations.

    Rob McEachern

    nmann,

    Boy, we are all in deep trouble...

    "Any information measure that depends so strongly on the prior knowledge of the recipient is too subjective to be of use and should be discarded."

    I guess we will all have to discard this forum, our computers, cell phones, smart phones, big-screen TVs, GPS navigators...

    And here I thought that all these devices, that depend so strongly on the prior knowledge of the recipient, for almost every aspect of their functioning, were really pretty cool and even, dare I say, useful. I stand corrected.

    On a more serious note: If I was sitting on a park bench, staring at a paper with your credit card number written on it, in incomprehensible Chinese, and then an English-speaking native translated the number for me, do you really believe I will not increase my information content, and perhaps even my bank account?

    I agree that the paper was not information for me. But the translated speech sure was.

    Could you explain what you mean by "if information in any given context is something that can only be diminished..."? The whole point of communication is to increase information, and this is accomplished very successfully in many contexts.

    One other thought occurred to me, about a possible point of confusion:

    "while the newspaper's content was objectively information"

    It was not. It was merely data, not information. The whole concept of information is not about what is in the message, but about how much you are able to recover, error free, from the message. In other words, it is measured at the output, not the input, of the information recovery process.

    Rob McEachern

    • [deleted]

    Pure information theory deals with any string of bits, whether coded or not. Coding comes in if you impose some Golay or elliptic-curve encryption or error correction. If one sends a signal through a noisy channel, a coding system is used to filter the signal at the other end of the channel. A black hole is a sort of quantum bit channel. A major goal is to figure out the encryption algorithm, or equivalently the algebra, which permits information to be communicated through a black hole. By this I mean: if you send a certain amount of information into a black hole, how is that information or signal ciphered in the Hawking radiation?
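As a toy illustration of recovering a signal from a noisy channel in the classical setting, here is a 3x repetition code; it is far simpler than Golay or elliptic-curve codes (and says nothing about black holes), but shows the same basic idea of decoding after channel noise:

```python
def encode(bits):
    """3x repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

# A single bit flip in the channel is corrected by the majority vote:
sent = encode([1, 0])   # [1, 1, 1, 0, 0, 0]
sent[0] ^= 1            # channel noise flips one bit
print(decode(sent))     # [1, 0] -- the original message is recovered
```

Without knowledge of the code in use, the received string is just bits; with it, the receiver can undo what the channel did, which is the sense of "filtering" above.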

    Cheers LC

    • [deleted]

    Robert,

    Please forgive me for starting a new internal thread here, but I really hate the layout of these topic forums. Re: Kahre ... and you are going to despise this absolutely, I just know it (from Hans C. von Baeyer's blurb or review)(quote):

    In The Mathematical Theory of Information Jan Kahre presents a bold new approach to classical information theory. With profound erudition, refreshing iconoclasm, and an appealing sense of humor he leaps over half a century's received wisdom and starts afresh. From common sense observations such as "information is ABOUT something", "transmission tends to degrade information," and "noise can be beneficial", he constructs a rigorous framework that is general enough to encompass a great variety of conventional results in different fields, yet discriminating enough to prune the accumulation of dead wood. His most lasting accomplishment is the demonstration that Claude Shannon's measure of information, which has dominated the subject since 1948, is by no means the only candidate, and not always the most appropriate one. This remarkable book is sure to stretch the horizon and inspire the imagination of every physicist, computer scientist, mathematician, engineer, and philosopher interested in the theory and practice of information science.

    (end quote)

    In all fairness it's no fruitier than most blurbs. Anyway, here's Kahre himself, getting more or less technical (and I too tend to wince at "This would make the Second Law of Thermodynamics a special case of the Law of Diminishing Information"):

    Toward an Axiomatic Information Theory

    He says "Towards" but then his native languages are Finnish and Swedish.

      nmann,

      "Z gets all its information about X from Y." Not too many entities get all the knowledge they possess from a single source. Even if there is only a single input message source, the a priori knowledge built into the receiver, and being exploited to recover info from that source, probably did not come from that source.

      "Shannon's famous entropy measure is not a general information measure, but a consequence of the statistical properties of long strings of symbols." This statement represents a profound misunderstanding. As I mentioned in a post under my essay, Shannon's Capacity measure is simply equal to the number of bits required to "digitize" a band-limited signal. His equality statement simply says that the number of recovered bits of information cannot exceed the number of bits required to digitize a band-limited signal, in the first place.

      The entropy measure is probably the best known, but least significant aspect of Shannon's work. A much more significant aspect was the non-constructive proof he gave, which demonstrated that some coding scheme must exist, which would enable messages to be received error free, right up to the maximum limit. The proof gave no clues how to do it, but it inspired two generations of communications engineers to search for the answers. After fifty years of research, and for the first time ever, that research enabled dial-up telephone modems to operate, virtually error free, at rates over 90% of the maximum limit. When I started graduate school, forty years ago, dial-up modems operated at 110 bits per second. It was a BIG deal, a year or so later, when the Physics dept. bought the new, latest and greatest modems, that operated at 300 bits per second. By the beginning of the 21st century, discoveries in coding theory and signal processing algorithms had pushed it up over 30,000 bits per second, aided by the recent development of integrated circuit technology that was powerful enough to run the algorithms.
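The limit in question is the Shannon-Hartley capacity, C = B log2(1 + S/N). With rough, illustrative voice-band numbers (a ~3 kHz telephone channel at ~30 dB SNR; these are textbook-style figures, not historical measurements), it lands near the dial-up rates quoted above:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Rough, illustrative numbers (not historical measurements):
B = 3000.0               # ~3 kHz voice-band telephone channel
snr = 10 ** (30.0 / 10)  # 30 dB signal-to-noise ratio, as a linear ratio
C = shannon_capacity(B, snr)
print(round(C))          # on the order of 30,000 bits per second
```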

      Rob,

      looking back over the posts, I think I can't agree with what you've said about the nature of information e.g. in your last 3 posts:

      You say "The problem is, there is no such thing as "uncoded information" in experience. Experience is all about coding, such as sensory coding."

      But the physical coded information we use in everyday life (speech, writing, math equations, computer code) always represents something else. I think experience is uncoded information in that it doesn't represent something else.

      You say "I think that the "subjective information derived from information relationships" IS the code."

      But what does this code represent? Does subjective information represent something else? I don't think so. I think subjective information (experience) is the end of the line - it doesn't represent anything else except maybe itself. Are you saying that experience is equivalent to information, and that this information represents itself?

      Re "My major point is this: in addition to the equations per se, one needs to know other things, like initial conditions. It is the vast information content of the initial conditions, not the tiny information content of the equations, that determines almost everything about how the physicists themselves behave.":

      I would think that a lot of fundamental information (including quantity/number) derives from a sort of network of information category relationships that we humans attempt to symbolically represent by law of nature equations. The trouble is that we may never be able to precisely mathematically represent the source of some of this fundamental information - so from our point of view we can't understand it, and perhaps that is why the mathematical equations don't correspond to our normal view of reality. But given that from our point of view, the equations attempt to represent actual information (number, category, relationship) that is occurring at the foundations of reality - I don't know what you mean by saying that the equations have a "tiny information content".

      Lawrence,

      Re: "A black hole is a sort of quantum bit channel. A major goal is to figure out the encryption algorithm, or equivalently algebra, which permits information to be communicated through a black hole. By this is means if you send a certain amount of information into a black hole then, how is that information or signal ciphered in the Hawking radiation?"

      Sorry, but I wouldn't be too sure that what you say has any relationship to reality - it's very speculative stuff!

      I can only repeat that a string of bits is NOT information - it can only represent information. A string of bits has no information value until every layer of code it represents is decoded, at which stage it is "pure" information, i.e. experience.

      Lorraine,

      You asked : "But what does this code represent?"

      It represents how to behave towards future, similar experiences. We learn from experience, but there is no point in accumulating learning for its own sake. If you never actually "use" it, then it is a pointless waste of time, from a "survival of the fittest" point-of-view. What makes this so interesting to me, is that the system never has to learn "why" anything is useful. It merely has to treat it as though it is - and then it will be. In this sense, it becomes a self-fulfilling prophecy.

      • [deleted]

      Robert,

      The fact that a real X generally doesn't get all its information from any single Y is one of those obvious things a critical reader processes automatically. It's akin to the way thermodynamic entropy is sometimes introduced by discussing two hypothetical molecules, X and Y or A and B with different temperatures, and what occurs when you put them in isolated juxtaposition. Writ large you have what used to be called the heat-death of the Universe.

      Here's the first chapter of the magnum opus with considerably more detail, including the original mention of the Chinese newspaper. Kahre stubbornly insists on calling its content Information.

      1. About Information

      Lorraine,

      "I feel that the content of 'pure' information is necessarily categories and relationships, and that new information categories and even numbers can be built from existing categories and relationships. For example mass is a category of information that we humans can represent with a symbol; and our symbols ' - x /' etc. represent the only types of relationships that categories of information can have. Laws of nature are our representation of information category relationships."

      Okay ... well over a decade ago the physicist David Mermin originated what's called the Ithaca Interpretation of Quantum Mechanics, named after Ithaca, NY where Cornell U is located. In a paper from the end of the last century called "What is Quantum Mechanics Trying to Tell Us?" he presents the concept of "correlations without correlata" (which Piet Hut, a FXQi member, popularized in a piece titled "There Are No Things"). I can supply links. "Correlations have physical reality; that which they correlate does not," Mermin says. Anyway, do you see this as at all analogous to "pure information"?

      • [deleted]

      Got my X and Y juxtaposed.