If Kahre wishes to allow the unintelligible text on the newspaper (the input, rather than the output) to be "information", then he must also allow for the fact that, unbeknownst to him, I secretly genetically engineered the newspaper fibers. To decode my message, he must recover every DNA molecule from every cell of every fiber, string them all end-to-end, and read my preamble, which then informs him that he must acquire every newspaper on Earth and do the same in order to read my complete message. Preventing such absurdities is why information is defined on the output of the recovery process, not the input.

Rob,

I have read your essay, but I think that at the foundations of reality there can only be fairly simple things like information relationships. You seem to be saying that, effectively, there is a block of code at the foundations of reality that consists of a whole set of logical rules. I can't see that at the level of fundamental particles there has been any time for a "survival of the fittest" set of rules to evolve. Also, where are these rules, in what form do they exist (Platonic realm?), how are these rules enforced, and in what way are they different to our attempted representations of "laws of nature"?

Of course things have changed by the time you get to complex molecules. I surmise that by the time you get to DNA molecules, a simple information experience has given way to a more complex information encounter in which the physical RNA represents information to itself, and in which, in relationship to the RNA, the physical DNA and its physical environment represent further information. We can describe and represent what is happening from our point of view, but this is not the same as what is happening information-wise from the RNA molecule's point of view.

  • [deleted]

Hi Lorraine,

I hope that I'm making some progress in understanding this problem that you pose.

I ran into some theorems by Tarski and Gödel about the limitations of self-referential languages. They are interesting, but I'm not sure how these may or may not apply to what you are trying here. In any case, these theorems indirectly lead me to the thought that the only representation of the Universe that is not self-referential is the Universe itself. The problem I am seeing is that in order to have such a representation would be for us to create a Universe. This is pretty much where my thought process implodes and my ears begin to emit a little smoke. Have your thoughts about this essay ever traveled down this path? If so, did you make it further than I did?

- Shawn

    • [deleted]

    Sorry,

    "The problem I am seeing is that in order to have such a representation would be for us to create a Universe."

    should have read

    "The problem I am seeing is that in order to have such a representation, we would have to create a Universe."

    Lorraine,

    I never said anything at all about "the foundations of reality." It is you, not I, that speaks of such things. I merely spoke about an observer's perception of that reality. And like it or not, all of our perceptions of reality are indeed being filtered through, as you put it, "a block of code", residing in our brains. And, like it or not, that is our reality.

    nmann, re the online chapter 1 of The Mathematical Theory of Information by Jan Kahre:

    I have read it, and as expected, it's about coded information, not information. Noticeable is the way it mixes up information and coded information as though there were no essential difference. Look at these representations:

    (1.5.1) "rel(B@A)" Here A represents a coded message sent and B represents a coded message received.

    (1.2.1) "inf(B@A)" Here B represents a (coded) newspaper article about an event, and A represents the event. (The newspaper article is coded because words are a code)

    But actually A does not represent an objective event. So-called event A is actually the reporter's subjective experience of event A, and B is the coded representation of that subjective experience.

    "Information" theories fail to honestly identify what is going on - they fail to properly identify and highlight the reality of subjective experience. Subjective experience is actually part of these theories, but instead of saying that A represents the reporter's subjective experience they claim that A represents an event.

    As part of my essay, I tried to point out the connection with subjective experience.

    • [deleted]

    I am basing my definition of information on the Khinchin-Shannon measure

    S = -k sum_n P(n) log(P(n))

    Here P(n) is the probability of occurrence of the n-th bit. Entropy is the basic measure of information. If there is a process which changes the distribution of probabilities, and thus this entropy measure, then information has been lost or erased. If you have a bit stream that enters a channel and at the output side the entropy measure is the same, you conclude your information has been conserved. The largest entropy occurs when you have the equipartition of probabilities, so P(n) = P for all n, and the minimal entropy occurs for P(m) = 1 for a unique m and P(n) = 0 for all n != m.
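    As a toy illustration of this measure (a minimal Python sketch with k = 1; the function and variable names are mine, purely illustrative): equipartition maximizes the entropy, and a single certain outcome minimizes it.

    import math

    def shannon_entropy(probs, k=1.0):
        # Khinchin-Shannon measure: S = -k * sum_n P(n) log(P(n)).
        # Terms with P(n) == 0 contribute nothing, since p*log(p) -> 0 as p -> 0.
        return k * sum(-p * math.log(p) for p in probs if p > 0)

    n = 8
    uniform = [1.0 / n] * n            # equipartition: maximal entropy, log(n)
    certain = [1.0] + [0.0] * (n - 1)  # one certain outcome: minimal entropy

    print(shannon_entropy(uniform))    # ~2.079 = log(8)
    print(shannon_entropy(certain))    # 0.0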

    It might be, of course, that you don't understand how it is conserved. Suppose you have a bit stream, or a quantum bit stream, with probabilities P(n). You have a code for this bit stream; for example, the bit stream is an image and the code is a gif or jpeg algorithm. You then run this into a channel to communicate it to the output. The output is corrupted in some way, and the image algorithm gives no picture or only part of it. This means there has been a scrambling of your information with other information. If all that other stuff is given by some density of probabilities ρ_a for random bits, the total entropy is

    S' = -k sum_n P(n) log(P(n)) - k sum_a ρ_a log(ρ_a)

    = S + S(channel),

    where you have a problem with mixing your signal with the channel noise. What you want is some error correction system which unscrambles this so you can trace out the channel noise. This can be accomplished by various means of computing Hamming distances or the application of a Steiner system.
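    As a minimal sketch of the Hamming-distance idea (using a toy 3-bit repetition code as a stand-in, far simpler than the Steiner systems mentioned above; all names here are illustrative): decoding picks the codeword closest to the received word, which unscrambles a single flipped bit.

    def hamming(a, b):
        # Number of positions at which two equal-length bit strings differ.
        return sum(x != y for x, y in zip(a, b))

    # Toy codebook: a 3-bit repetition code.
    codebook = {"000": "0", "111": "1"}

    def decode(received):
        # Nearest-codeword decoding: the codeword at minimal Hamming distance.
        nearest = min(codebook, key=lambda c: hamming(c, received))
        return codebook[nearest]

    print(decode("010"))  # "0" -- a single flipped bit is corrected
    print(decode("110"))  # "1"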

    So this I think connects to the different definitions of information used here. The distinction is between information as bit probabilities on a channel and information as a pattern or code whose structure is conserved.

    The entropy of a black hole is given by the Bekenstein formula S = kA/(4L_p^2). Here L_p = sqrt{Għ/c^3} is the Planck length, and L_p^2 is a unit of Planck area. A is the area of a black hole event horizon. The entropy is equal to the number of Planck units of area that comprise the event horizon, S = Nk/4. This is given by the total density matrix of the black hole, where the quantum density matrix is ρ_{op} = sum_n ρ_n |ψ_n)(ψ_n|. A trace over the density matrix in the Khinchin-Shannon formula determines the entropy. If you threw a bunch of quantum states into a black hole, the entropy of the black hole is

    S = -k sum_n ρ_n log(ρ_n) + S_{BH}

    The black hole entropy increases. You are not able to disentangle the entropy of your signal from the black hole by performing the proper partial trace necessary. However, if you kept an accounting of all the states in the black hole and the joint entropies of the states you put into the black hole, which can be negative, then you could in principle extract the information you put into the black hole. That joint entropy can be negative is a consequence of quantum entanglement, and putting a quantum bit stream into a black hole entangles it with the black hole.
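    For concreteness, here is the Bekenstein formula above evaluated numerically for a solar-mass black hole (a rough sketch using standard constant values; it reproduces the well-known order of magnitude S/k ~ 10^77):

    import math

    G = 6.674e-11      # m^3 kg^-1 s^-2
    c = 2.998e8        # m/s
    hbar = 1.055e-34   # J s
    M_sun = 1.989e30   # kg

    L_p2 = G * hbar / c**3        # Planck area L_p^2 ~ 2.6e-70 m^2
    r_s = 2 * G * M_sun / c**2    # Schwarzschild radius ~ 2.95 km
    A = 4 * math.pi * r_s**2      # horizon area ~ 1.1e8 m^2

    S_over_k = A / (4 * L_p2)     # entropy in units of Boltzmann's constant k
    print(S_over_k)               # ~1.0e77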

    A detailed understanding of this requires the use of error correction codes. The most general one is the Leech lattice Λ_{24}, which is constructed from a triplet of E_8 lattices; string theory has a heterotic sector with gauge group E_8 x E_8 or SO(32). The so-called sporadic groups, such as those associated with the automorphisms of the Leech lattice, form a system of automorphisms and normalizers which define something called the Fischer-Griess (or monster) group.

    At any rate the idea of the universe as a computer is really just a mathematical gadget. It is not as if the universe is identical to the Matrix computer in the movies of that title.

    Cheers LC

      Hi Shawn,

      Self-referential languages, while an interesting topic, don't really apply to what I'm trying to say here. I have just posted to nmann on the subject of The Mathematical Theory of Information by Jan Kahre, and that post might be relevant to what you are trying to follow up.

      Rob,

      My apologies for getting the wires crossed.

      I think this happened because I am actually talking about information/experience mainly from the point of view of fundamental reality. This was what my essay was about - that unlike the physical bits in a computer, which represent coded information, fundamental physical particles/fundamental particle states do not represent coded information - they are "pure" information.

      Lawrence,

      thank you for a very detailed reply.

      I contend that, in the case of communications, your above equations refer to coded information, and the problems of ensuring that the received coded message in a noisy environment is the same as the sent coded message. At no point does this coded message constitute information until a decoded version is subjectively experienced by a human being. Shannon himself used the title Communication Theory, not Information Theory. See my 9 September 2012 @ 12:45 GMT post to nmann about Information Theory's failure to honestly acknowledge subjective experience i.e. pure, uncoded information.

      In the case of black holes, aren't you referring to a connection between entropy and signals, between entropy and codes? Isn't there an assumption here that quantum bits are in fact a code that exists at the level of fundamental reality?

      I think that the idea of the universe as a computer is an idea about the essential nature of fundamental reality. This is what I am interested in discussing.

      nmann,

      Re "correlations without correlata" (which Piet Hut, a FXQi member, popularized in a piece titled "There Are No Things")..."Correlations have physical reality; that which they correlate does not," Mermin says. Anyway, do you see this as at all analogous to "pure information"?

      No, I do not see this as being analogous to pure information.

      I think that the content of the complex multiple streams of pure information that we experience is not things, it is not actions. The content is categories of things, and relationships between these categories.

      Similarly, the content of information that exists at the fundamental level of reality must be categories and relationships.

      Lorraine,

      No need to apologize. You are in good company. Most of the world's physicists have their wires crossed in exactly the same way. Imagine a "Skyscraper of Perception", built on "The Foundations of Reality". As one journeys upwards, bottom to top, one passes through realms of increasing levels of perception: physical behaviors, chemical behaviors, biological behaviors, and, at the top, conscious behaviors.

      You and the physicists are on the roof, looking downwards, through the mists, trying to "See" "The Foundations of Reality". I went to graduate school, in physics, expecting to spend my career doing the same thing. But before I had even finished school, I had noticed all the communication antennae on the roof, and began to wonder what all that was about. That was much nearer, and not surrounded by mists, and thus more readily discerned. Then, after having satisfied my curiosity about information and communications systems, I once again took note of all my former colleagues, the physicists, still peering intently down into the mists. But now, rather than joining them, I began to peer at them. I wondered if they were really doing anything all that different from the other antennae on the roof. They certainly believed that they were. But I had my doubts. They believe that they are "seeing", but like all the other antennae on the roof, they are only "perceiving". And when they set up their instruments, to enhance their "seeing", they merely perceive the perceptions of the instruments, in addition to their own.

      The difference between "seeing" and "perceiving" is important. As noted elsewhere in these posts, you can "see" "data", but "information" can only be "perceived". Data is what exists "out there", but perceptions and "information" only reside at the output, not the input, of an information recovery process. By confusing the two, you confuse everything you can ever know about what actually resides within the mists. Physicists have assumed that they were "seeing" "The Foundations of Reality", but they merely perceive "Our Reality", the false-colored, coded information generated by our entire collection of perceptual apparatus.

      By failing to take into account the "instrumental effects" produced by their own perceptual apparatus, they have convinced themselves that "Our Reality" must necessarily be identical to "The Foundations of Reality." But that is not in fact necessary, and as I attempted to demonstrate in my essay, it is in fact not the case. At present they are still quite different. The subject of this essay contest is ultimately about why they no longer seem to be growing any closer together. My reply is: "Because you have failed to clearly perceive perception itself, because you do not clearly understand what information even is."

      Thus, while the physicists continue to debate whether information is lost when a book is dropped into a black hole, I respond "NO!" Information only exists at the output of a recovery process, a perception. For information to be lost, all the observers capable of reading the book must be dropped into the black hole.

      • [deleted]

      Hi Lorraine,

      Ok, thank you for laying out some more constraints for the conversation. :)

      The following is what ran through my head when I made my last comment, and I believe that it kind of relates to what nmann and Lawrence are saying too. Forgive me if none of this is new to anyone. I just want to eliminate the dead-end paths in my understanding.

      I think that symbols exist in order to defer (or outright eliminate) the gathering of materials and expenditure of effort (heat generation) required to fully re-present some physical system.

      A lateral case would be the symbols that we have for the word "blue". Ancient people couldn't generally shoot out blue photons at will, and so the spoken/written word was created in order to defer the gathering of materials / expenditure of effort required to make blue light. In the end, neither the word "blue" written on paper nor the brain cell/chemical/electrical pattern is very similar to a blue photon in terms of spatial or temporal physical configuration.

      A vertical case is the modeling of the Solar system. We can partially defer our re-presentation of this system by making a wood and metal model that runs by a small electrical motor. The spatial configuration is much different from the real thing, and so the dynamics must be driven manually by the motor.

      An even more deferred approach for modeling the Solar system is to write down on paper the equation F = GMm/r^2 alongside a list of real vectors that correspond to the current locations and velocities of some planets. The ink on the page looks absolutely nothing like a Solar system in terms of either spatial or temporal physical configuration -- the dynamics are entirely deferred (F = ... doesn't even move around the page). If we use a computer simulation to calculate and drive the dynamics of the system (as in the sketch just below), we take up some of that slack caused by our previous deferral, but not all of the slack, since it still would take much more material and effort to create an actual copy of the Solar system.
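      A minimal sketch of that computer-driven deferral, assuming a single planet, a Sun held fixed at the origin, and a crude semi-implicit Euler step (all values and names here are illustrative only):

      import math

      G = 6.674e-11                           # m^3 kg^-1 s^-2
      M = 1.989e30                            # kg, the Sun
      pos = [1.496e11, 0.0]                   # m, planet starts at 1 AU
      vel = [0.0, math.sqrt(G * M / pos[0])]  # m/s, circular-orbit speed

      dt = 3600.0                   # one-hour time step
      for _ in range(24 * 365):     # step through roughly one orbital period
          r = math.hypot(pos[0], pos[1])
          # Acceleration from F = GMm/r^2, directed toward the origin (m cancels).
          acc = [-G * M * x / r**3 for x in pos]
          vel = [v + a * dt for v, a in zip(vel, acc)]
          pos = [x + v * dt for x, v in zip(pos, vel)]

      print(pos)  # ends up roughly back near the starting point after one orbit

      A more careful model would use a better integrator and real ephemeris data, but the point is only that the computer, rather than the ink, carries the dynamics.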

      As for what Lawrence and nmann are saying, I think that it's fair to say that we have deferred our re-presentation of the actual physical configuration of some thing over space and time by writing down the density matrix, which encodes the physical states, and by writing down some equations that encode the dynamics. Like in the F = ... case, we can then do a computer simulation to drive the dynamics and take up a bit of the slack from our deferral.

      This is the point in my line of thought where I had come to the conclusion that the most faithful representation of the Universe is the one where we defer no gathering of materials or expenditure of effort. One `simply' puts the materials in the right places and the dynamics take care of themselves from then on. In effect, spacetime and the interactions take over the place of our computer that was required to re-present the dynamics. However, it's clear to me now that this isn't the end of the symbolism spectrum that you were looking for.

      I think that the `Universe is a digital computer' idea got a real boost when the papers on the holographic principle came out in the early-to-mid 90s. One of the most literal interpretations of the holographic principle is that those microstates which are represented by the density matrix actually map directly to a set of oscillators (that flip up or down). If the black hole's event horizon is in any one of n microscopic states (and the states all occur with the same probability), then this oscillator paradigm says there must be log_2(n) oscillators. It's like how an 8-bit integer can be in any one of 2^8 = 256 different microscopic states. The dynamics are still another matter altogether, though. For instance, the original holographic principle paper (Dimensional Reduction in Quantum Gravity) does not just lay out this idea of mapping the states to binary oscillators, but also starts to lay out a kind of cellular automaton rule that would govern how exactly the black hole evolves from one specific microstate to the next as time proceeds.
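      The state counting here is just a base-2 logarithm; as a small illustration (the function name is mine, purely for this sketch):

      import math

      def oscillators_needed(n_microstates):
          # Binary (two-state) oscillators needed to label n equally likely microstates.
          return math.log2(n_microstates)

      print(oscillators_needed(256))    # 8.0 -- an 8-bit integer has 2**8 microstates
      print(oscillators_needed(2**10))  # 10.0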

      In the end, I think that these people who take the holographic principle literally are just looking for something that can be subsequently framed as an object-oriented cellular automaton (just slap it into some classes, compile and run in order to re-present the dynamics). How different is this from what you are looking for, besides the fact that they take the existence of binary "bits" literally?

      Hi Lorraine,

      The only possible way to seriously think of the universe as a computer, in my opinion, is to think of it as an analog computer, which is based on the continuum and adjusts flow rates according to physical parameters. In this sense the universe is not 'computing' anything. It is simply evolving. The idea that the universe must 'calculate' its next 'state' is a weird fantasy.

      Edwin Eugene Klingman

      Hi Edwin,

      I completely agree with you. I don't know if it's quite right to say that spacetime and the interactions are really computation per se. I mean, if we wish to symbolize the physics of a bear by dressing up in a bear costume and running around going "roar", is that really computation? I'm hard-pressed to draw such a parallel, and so I think that you're right that the universe itself is not so much a computer.

      - Shawn

      • [deleted]

      Hi Shawn,

      To my way of thinking, symbols and representation probably naturally arise from the subject/object configuration of reality:

      Experience (i.e. pure information) has a point of view: experience is subjective. The simplest subjective experience is that of a "particle" - a particle is pure experience, pure information. The content of subjective experience is mainly categories and relationships. To a particle (a subject), the rest of reality, including itself, represents information in relationship with the particle, i.e. the rest of reality, including itself, is like a set of objects that represent information.

      Re cellular automata:

      I think that reality cannot be represented by cellular automata because fundamental reality is not entirely representable, not entirely "lawful", not entirely automatic.

      Re "the density matrix", which represents "those microstates states":

      This seems to be an attempt to model the reality that we observe, reality as some people understand it. (Not that we really understand or observe much about black holes.) I'm saying that this is an attempt to precisely model reality from our point of view, reality as we see it. This has worked pretty well until recently, but perhaps it has limits. Perhaps we can't model reality entirely from our point of view, perhaps we can't model reality precisely, perhaps we have to try to envision reality from the point of view of fundamental particles.

      Hi Edwin,

      Re "...the universe is not 'computing' anything...The idea that the universe must 'calculate' its next 'state' is a weird fantasy.":

      Yes, it certainly is a weird fantasy about the nature of reality - I don't think there can be anything analogous to the calculations performed by a computer going on. I think that what looks like the outcomes of calculations must be due to the nature of information and information relationships.

      • [deleted]

      The transition from one state to another is an operation on a quantum bit. Actually it can be more than just a qubit, for this could involve 3 states or a ternary system (qutrit) or some n-ary system.

      I think in some ways we are getting into some definition issues. The measure of information is just the entropy formula. If you send a stream of bits or "letters" down a channel and the information is constant then the communication has conserved the information content of the input signal. A code or cipher can be a method for measuring that entropy according to Hamming distances. In that case one is looking for a conservation of a pattern, structure, or some algorithmic content.

      Quantum gravity is some putative system whereby quantum states of gravity, say the Hartle-Hawking vacuum states or graviton states of the heterotic string etc, transform into each other in a way which does not change the phase-space volume given by the density matrix of states. If that volume is constant, it is equivalent in quantum information theory to saying that qubits, qutrits, GHZ-quadits etc, transform into each other without an increase in the quantum entropy (entanglement entropy) of these states. A black hole in this perspective is a processor which takes states from the external environment, absorbs them in entanglements, and produces Hawking radiation as the "processed output" of these states.
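      As a small sketch of the entanglement entropy mentioned above (using numpy; purely illustrative of the partial-trace bookkeeping): one half of a maximally entangled Bell pair carries exactly ln 2 of entropy, in units of k.

      import numpy as np

      # Bell state |psi> = (|00> + |11>)/sqrt(2), written as a 4-component vector.
      psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
      rho = np.outer(psi, psi)       # density matrix of the pure joint state

      # Partial trace over the second qubit gives the reduced state of the first.
      rho_A = np.einsum("ikjk->ij", rho.reshape(2, 2, 2, 2))

      # Von Neumann (entanglement) entropy S = -Tr(rho_A log rho_A), in units of k.
      p = np.linalg.eigvalsh(rho_A)
      S = -sum(x * np.log(x) for x in p if x > 1e-12)
      print(S, np.log(2))            # both ~0.693: one bit of entanglement entropy

      Tracing out one half (the analogue of losing the signal behind the horizon) leaves a maximally mixed, maximal-entropy state.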

      In my paper I look at elementary issues which lead to computations of how two particles, or two superstrings, scatter off each other, where an intermediate black hole state results and this decays (Hawking radiation) into product particles or strings. I reference work in progress, which should be submitted for publication this month, on the process: two input particles or strings --> quantum black hole --> two output particles or strings. This is the most elementary process one can consider, and the input and output states share the same information content.

      In saying the universe is a computer, this really means that the processes which transform quantum bits or states have elements that are isomorphic to the mathematics of computation laid down by Turing, von Neumann, Chomsky, and others.

      Cheers LC

        • [deleted]

        Hi Lawrence,

        So, any process (operation) that gives any kind of change to anything is a computation. That strikes me as particularly anthropomorphic, because it forbids the possibility that there is no such distinction between data and operations at the deepest level of reality. As for the non-black hole -> black hole -> non-black hole chain of transformations, can I assume that this would break unitarity, because the input state (a non-black hole) would necessarily have less entropy than the intermediary state (a black hole)? Perhaps that's going too far afield of the conversation here, though it definitely does relate.

        - Shawn