I would not say myself that light is information. I would say it carries information and it might be emergent from a theory of pure information.

In any case, statements like "It From Bit", "Bit From It", "It Is Bit", etc. are just philosophical interpretations. They are things that help guide us towards a more concrete mathematical theory that is consistent with observation. If you think of it a different way round from me I can't argue with it. The important thing is where it leads in real operational terms.

  • [deleted]

Excellent summary. I have learned very much from your essay!

Philip

"I would not say myself that light is information. I would say it carries information"

That sounds semantic. And what is "emergent" & "a theory of pure information"?

Light is physically existent. It results from an interaction with something else, which is also physically existent, and it is known that an understanding of that something else can be extrapolated from it. So it is therefore information, because it is representational of something else. I am only thinking in terms of what occurs, well generically anyway. Whatever form of knowledge (maths, words, graphics) is used to depict that must correspond with it.

Paul

OK, that is quite interesting. Look at it this way.

If I had a bag full of books I might say that the bag is carrying information in the books. That is clearly different from saying that the bag of books is information. The bag has properties of its own aside from the information.

However, you might say that everything we know about the book and the bag is described by information and there is nothing else. OK, I like that way of looking at things, but I want to come at it from the other direction. I want to start with some pure information that looks like something more fundamental, but then when I look at the information more carefully I find that it has the same characteristics as a bag of books. In other words, if I use the information to answer questions in a particular way, the answers are the same as I would get playing twenty questions when the answer is a bag of books.

Coming back to light and particles, I might start with bits of information on a grid. The information might evolve according to some simple rules. In other words, it is a cellular automaton. When I study how the system evolves I might find that different types of particles form and travel across the grid. With luck I might replicate something that looks like the physics of photons. Then I would say that the light was emergent from a theory of pure information.
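This kind of emergence is easy to demonstrate with a classical toy example (my illustration only; nobody claims this model reproduces photons). In Conway's Game of Life, a five-cell "glider" reappears shifted one cell along the diagonal every four steps, so it behaves like a particle travelling across the grid even though the rules only speak of bits:

```python
from collections import Counter

def step(live):
    """One synchronous update of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of live cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A glider: a five-cell pattern that acts like a travelling particle.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After four steps the pattern has moved one cell along the diagonal.
shifted = {(x + 1, y + 1) for (x, y) in glider}
print(state == shifted)  # True
```

The "particle" here is purely a pattern in the information; nothing in the update rule mentions it.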

Some people have looked at cellular automata (Fredkin, Wolfram, etc.). The results are interesting, but the problem is that quantum field theory includes non-local entanglement and you can't get that with classical cellular automata. 't Hooft has explored quantum cellular automata where he has a Hilbert space spanned by the possible states of the system. This is more interesting. He first described the holographic principle using a model of that form and now he claims that string theory can be described that way. Perhaps he can really get it to work.

However, I think this is not enough. Cellular automata already have grid structures that represent space, and time is a discrete process put in by hand. I want to see a theory that starts from something more fundamental so that space and time are also emergent. I think that symmetries are important because they reflect the idea of redundant information. So I want to start from algebras that describe symmetries over Hilbert spaces of information and then reproduce physics as emergent structures from that. It is ambitious, but there are mathematical principles that suggest it might just be possible, such as the mappings I describe in my essay.

I hope that makes it a little clearer how my thinking works.

Philip

But that all just turns on what we are labelling as being light, ie is light really the physical state which eyes can, upon receipt, utilise. And the rest of the physical entity is the 'carrier'. But this distinction, as with many others one can identify when going into detail, is irrelevant to the point. Light is a representation of something else, ergo, it is information.

Indeed, there is only information about/knowledge of reality. We cannot 'directly access' it. But this is not the point either, because that is a statement of the obvious, and pointless. In the same way that everything provides us with information, so the concept of information again is pointless, as there is no differentiation from not-information. Incidentally, physically, the book is not information, it is ink & paper, or whatever.

Space and time are not emergent. Distance is an artefact of physically existent entities, it being a difference between them in terms of spatial position. Existence necessitates physical space, but that can only be assigned via entities. So distance can only involve entities which exist at the same time. And they can only exist in one physically existent state at a time. Time is the turnover rate of existent states (ie realities).

Paul

Paul, that is a very reasonable view of information and I can't disagree with it.

I am however confused by your last statement. You say space and time are not emergent but then you express the relational view of space and time which is usually identified with an emergent approach. I am probably misunderstanding what you mean.

  • [deleted]

Okay Phil,

Once again, a thought provoking essay! Of course I'm sympathetic to your position but I feel you open yourself up to a bit of critique so I'll take advantage. Where did you open yourself up? I quote:

"Should we base our theoretical foundation on basic material constructs such as particles and space-time or do these things emerge from the realm of pure information? Wheeler argued for the latter. But no amount of philosophizing can tell us if this is how the universe works. There is no point in asking where the information comes from, or where it is stored."

So, with this in mind, I'm going to actually propose to you a fundamental question!

For the last couple of weeks I've been reading FQXi essays, past and present, together with some of the fascinating articles provided. Julian Barbour's essay [JB] is, of course, relevant to this year's subject and although I'm not sympathetic to Mr. Barbour's position it seems to me Mr. Barbour does a rather excellent job of analyzing the nature of information. He divides information into three categories:

"In summary, we must distinguish three kinds of information: Shannon's information, the uncertainty as to which message will be selected from a source; factual information, the content of such a message; and intrinsic semantic information, which distinguishes a random message, or configuration, from one that carries meaning and to some extent explains its very genesis."

After establishing the different kinds of information, Mr. Barbour spends a great deal of time talking about probabilities in the context of quantum information theory; his position is that ITs, quantum configurations or fields, create qubits:

"The key point is this. If we are to speak about ontology, as opposed to efficient coding in communication channels, the most important symbol in (1) is not p for probability but i for the thing, or configuration, that has the probability pi. Probabilities are for outcomes: what you find when you open the box. Thus, even if quantum probabilities are an integral and essential part of the world [PBR] (reference mine), they are meaningless in themselves. They are at best secondary essentials, not primary essentials. They must always be probabilities for something."

Now this is the thing that struck me! When speaking of quantum mechanics, whether regarding the Standard Model or Cosmology, the emphasis is always on probabilities and statistical configurations; but what about the Hilbert Space?

Consider the winning FQXi essay by mathematician George Ellis [GE]:

"Causation: The nature of causation is highly contested territory, and I will take a pragmatic view:

Definition 1: Causal Effect - If making a change in a quantity X results in a reliable demonstrable change in a quantity Y in a given context, then X has a causal effect on Y.

Existence: Given this understanding of causation, it implies a view on ontology (existence) as follows: I assume that physical matter (comprised of electrons, protons, etc.) exists. Then the following criterion for existence makes sense:

Definition 2: Existence - If Y is a physical entity made up of ordinary matter, and X is some kind of entity that has a demonstrable causal effect on Y as per Definition 1, then we must acknowledge that X also exists (even if it is not made up of such matter)."

Now, it's generally agreed that Bell's work proves the mutually exclusive relationship between locality and counterfactual definiteness and, of course, Bell's work has been further elaborated upon since, for example by Mateus Araujo in [MA]. The EPR experiments of Aspect et al. and the Mach-Zehnder experiments of Herzog et al. unequivocally support counterfactual definiteness at the expense of locality. Furthermore, the experiments of Aspect et al. would seem to place quantum entanglement on a firm ontological footing. So then while Hilbert Space is the space of all possible configurations it's much more than a simple Universe of Discourse; the inseparability of Hilbert Space describes quantum entanglement! Therefore, based on experimental evidence and according to the definitions of George Ellis, the Hilbert Space must have an ontological referent!
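To make the Bell/CHSH point concrete (my own sketch, not taken from the cited papers): for the singlet state the quantum correlation at analyser angles a and b is E(a,b) = -cos(a - b), and the standard CHSH combination of four settings reaches |S| = 2√2 ≈ 2.83, beyond the local-realist bound of 2:

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements on a singlet pair
    at analyser angles a and b (radians): E = -cos(a - b)."""
    return -math.cos(a - b)

# Standard CHSH settings that maximise the quantum violation.
a1, a2 = 0.0, math.pi / 2              # Alice's two analyser angles
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two analyser angles

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, above the local-realist bound of 2
```

No assignment of pre-existing outcomes to all four settings can exceed |S| = 2, which is the content of the CHSH inequality.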

So I propose to you a "Hogwartian" and certainly fundamental question:

What ontological entity does the Hilbert Space refer to?

References

[MA] Araujo, M., Bell inequalities refined to Boole inequalities in: Quantum Realism, Quantum Surrealism (http://arxiv.org/abs/1208.6283), accessed 26 April, 2013.

[GE] Ellis, G., An excellent exposition revealing the essence of organicism in: Recognising Top-Down Causality (http://fqxi.org/data/essay-contest-files/Ellis_FQXI_Essay_Ellis_2012.pdf), accessed 26 April, 2013.

[JB] Barbour, J., An informative discourse on information in: Bit from It (http://fqxi.org/data/essay-contest-files/Barbour_Wheeler.pdf), accessed 26 April, 2013.

[PBR] Pusey, M. F. et al., Are wavefunctions ontological entities? in: A boost for quantum reality (http://www.nature.com/news/a-boost-for-quantum-reality-1.10602), accessed 26 April, 2013.

I shall wait for a few more essays to present themselves before I rate your essay but, rest assured, I'll levy an upper range rating.

With regards,

Wes Hansen

    • [deleted]

    Wes, thanks for this interesting question. I don't think I can give a very short answer because the ontology I use is quite elaborate. It could be a whole essay in itself, which is why I don't try to cover such issues in this essay, but I can't really answer the question without telling you how my ontology works, so I will give you the outline.

    Before I do I should make the point that philosophical ontologies are, in my opinion, just scaffolding that we use to build a house, where the house is the real physical theory. When you are finished you can take the scaffolding away. Ellis and Barbour are offering you a different brand of scaffolding that does not work in the same way. It may work just as well and you may use it to build the same house. The only difference is you might build it faster or slower depending on how good their scaffolding is compared to mine. I am not going to criticize their scaffolding but I will try to sell you mine instead.

    My starting point is basically Pl*t*n*c. I have censored this word because some people find it very offensive. This includes people who don't like maths of course, but also many good physicists will complain about it. I find that the ones with a more mathematical background like myself are more comfortable with it. Anyone who isn't will need to find a different brand of scaffolding.

    The pl*t*n*c principle says that all mathematical systems exist in a realm outside of the physical universe. This realm is essentially unique. Even if you start with different axioms you end up in the same realm because any set of axioms that is consistent and sufficiently general gives the same set of mathematical systems. This even transcends questions of undecidability in my opinion.

    In this realm all possible universes with any self-consistent set of laws exist, and there are relationships such as equivalence or overlap between them. The number of complex systems is much larger than the number of simple systems. You can think of it as an outer level of m*lt*v*rs* if you like (sorry, more censorship required there).

    To find the place of our universe in this m*lt*v*rs* we must first of all understand the concept of universality. Universality is the idea from complexity theory that some kind of universal behavior is found in large collections of complex systems. For example, computer languages are defined by complex arbitrary rules, but aside from limitations due to memory size most of them provide equivalent definitions of computability, provided they are not too simple. Another example is the chaotic behavior of non-linear systems, which has a type of universality characterized by the Feigenbaum constants; these are the same no matter how you construct the original system, provided it is not too simple. More examples appear in critical systems where, as you approach a critical point, the correlation lengths go to infinity. You renormalise to get a consistent macroscopic limit which does not depend on all the microscopic details, provided they are not too simple.
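    As a small illustration of the Feigenbaum universality just mentioned (my own sketch; any iterated map with a quadratic maximum yields the same constant), one can estimate delta from the "superstable" parameter values of the logistic map, i.e. the r at which the critical point x = 0.5 is periodic with period 2^n:

```python
def g(r, n):
    """Iterate f_r(x) = r x (1 - x) for 2**n steps from the critical
    point x = 0.5; returns f^(2**n)(0.5) - 0.5, which is zero at the
    superstable parameter of period 2**n."""
    x = 0.5
    for _ in range(2 ** n):
        x = r * x * (1.0 - x)
    return x - 0.5

def superstable(n, guess):
    """Secant-method root of g(., n) starting near `guess`."""
    r0, r1 = guess, guess + 1e-5
    for _ in range(60):
        g0, g1 = g(r0, n), g(r1, n)
        if g1 == g0:
            break
        r0, r1 = r1, r1 - g1 * (r1 - r0) / (g1 - g0)
    return r1

# Exact starting points: period 1 at r = 2, period 2 at r = 1 + sqrt(5).
R = [2.0, 1.0 + 5 ** 0.5]
for n in range(2, 7):                       # periods 4, 8, ..., 64
    guess = R[-1] + (R[-1] - R[-2]) / 4.7   # extrapolate the cascade
    R.append(superstable(n, guess))

# Successive spacing ratios converge to Feigenbaum's delta = 4.669...
delta = (R[-2] - R[-3]) / (R[-1] - R[-2])
print(delta)
```

The printed ratio is already close to 4.669, and the same limit appears for the sine map or any similar unimodal map: the microscopic rule does not matter, provided it is not too simple.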

    I think that if you could analyse the pl*t*n*c realm in the right way you would find that there is also a universal behavior at work and this is what forms the laws of physics. In essence this behavior is so dominant that anything living in the pl*t*n*c realm (i.e. everything) would only notice this universal behavior so these would describe their top level laws of physics. I call this idea the theory of theories and first wrote about it twenty years ago.

    So how do these laws work? The first thing we can say is that they are indeterministic. Different mathematical systems can overlap, so from your experience you can never determine exactly what system you are in, even given the universal behavior, so experience must be indeterministic; but there are statistical laws of some kind telling you what is most likely, i.e. quantum mechanics. The second thing is that this is a complex system, even though it has clear fundamental laws. There are many solutions. Probably there are different universes all fulfilling these laws as different solutions. This is the second level of m*lt*v*rs*, and indeterminism within each universe provides a third layer of m*lt*v*rs*. Because of the second layer we can't expect to derive the low energy effective theory of physics. That is part of the solution. It must be there because that is where the *nthr*p*c principle comes in and allows the universe to be fine-tuned for life.

    Trying to derive the top level laws of physics from first principles is also going to be hard. Perhaps if we were hugely intelligent we could do it and would then only need to do experiments to figure out the solution level, but we are not that smart. Some people have tried to model the pl*t*n*c realm as a kind of statistical ensemble with a big path integral over all possible mathematical systems. This captures the ontology of the theory of theories quite well, but in my opinion the actual path integral is based on a more algebraic principle than a statistical one. Even in quantum mechanics you get complex numbers and Grassmann variables rather than plain probabilities, etc.

    However, one useful observation is that however it works, the "path integral" will itself define a mathematical system. That means that it can also be found under the path integral. Forming a path integral is a version of quantisation, so effectively it gets quantised a second time, and then that system is also under the path integral. Continuing recursively ad infinitum, we can argue that multiple quantisation is included in the system. My conjecture is that multiple quantisation actually describes the universal behavior that dominates the path integral. This means that we just need to find the right general construction for quantisation and we are done. We have then built the house and pulled away the scaffolding.

    You asked what is the ontological origin of the Hilbert space. Complex numbers and vector spaces are universal features in mathematics. They were discovered by mathematicians long before they were used in quantum theory. Although real numbers were originally inspired by geometry, even they are needed in mathematics to solve problems defined in purely combinatorial terms. Mathematicians would have discovered them independently of physics if they had needed to. It is no surprise that Hilbert spaces would emerge in the ultimate universality structure that controls the ensemble of all mathematical systems.

    Some people like to think in terms of "simulations". That is to say that we are in a simulation, but there can be lots of simulations. There are even simulations within simulated systems, so you can get a similar idea of multiple quantisation that way. I don't like the simulation idea quite so much because it seems to imply that causality is fundamental, and perhaps time too. Causality is part of thermodynamics, so it is emergent in the lower layers of physical law. Causality, locality, space and time are not fundamental in the top level laws described by multiple quantisation, but they are emergent as features of some solutions. That is why my ontology does not discuss why the big bang happened, for example. That is not a relevant question.

    However, there are some important things that are emergent in the formation of the top level laws as a feature of universality and multiple quantisation. These include the role of information, quantum mechanics, qubits, etc. It also includes symmetry. It is a feature of critical systems that symmetries emerge at critical points; e.g. you can do field theory on a grid lattice which does not have full spacetime symmetry, but when you move towards the critical point the scale is renormalised and rotational and translational symmetry can emerge. This is one reason why I expect symmetry to be so important. It is emergent at a higher level than space and time and can be hidden by symmetry breaking at lower levels.

    OK I will stop there because I have probably confused everyone that tried to read this. As I said, it would take a full essay to describe this in comprehensible terms and even in my mind it is quite vague, but perhaps I have given you an idea of how my scaffolding holds up.

    Oops, did not log in. That was me in case it was not obvious.

    Philip

    I have a suspicion about this word 'emergent': like several other 'buzz' words (information is probably one), what lies behind it probably does not correspond with reality.

    Reality is spatial. My point was that whilst we know that there must be 'space' for there to be physical existence, we only measure it indirectly, ie in terms of not something. That is, this something occupies that space, or these somethings have that much space between them. So space is not an illusion, it is 'physically existent'. Though in those terms it may be nothing, as the space is completely full of somethings, ie there is no space as such, only somethings. So space does not 'emerge', it is 'there', like everything else. What it 'constitutes' differs in that the somethings differ, and at any given time, it has a definitive configuration, because the somethings have a definitive configuration (ie physically existent state). And any given reality is one of those. It is the physically existent state of the somethings (and hence space) at that time. And as with everything else, in order to measure something we invoke a process of comparison to identify difference.

    Time is concerned with the turnover rate of realities. There is no time in a reality. It is a calibration of the speed at which difference occurs. And if there is difference that is another reality. There can only be one at a time. And to be existent it must be definitive. So again, the concept is not an illusion, though it is not properly attributed to the physical event which lies behind it. And again, to calibrate time, ie rate of change, we compare rates of change and identify difference. If you use a quartz timing device, for example, then you are comparing number of oscillations with whatever.

    So spatial configuration alters, realities alter, at a rate. If this is what emergent really refers to then it is a pointless concept. But I know there is reification, and this concept embodies incorrect conceptions. It attributes difference with existence, and presumes more than one reality is co-existent at any time.

    Perhaps you could define the commonly used meaning of this term emergent.

    Paul

    • [deleted]

    Very interesting. Just wondering if there are any loopholes in info theory. Gravitons carry some information about the contents of a black hole, if a gauge theory of quantum gravity is a reality. The black hole must be exchanging gravitons with all other masses in the universe. These gravitons can be viewed as carrying information about the mass in the black hole. Measure the strength of the gravitational field and you have info on how much mass is in the hole. Is this relevant to the definition of "information"?

    Also, is Hawking radiation perfectly random? If Hawking's own heuristic mechanism for his radiation is right (pair production at the event horizon, with one virtual particle falling in and the other escaping to become onshell and real), surely it will be affected by the electric charge of the black hole? E.g. if mainly positive ions (not light electrons) fall into the black hole, it gets positive charge, and this polarizes the virtual fermion pairs, so after the pair production it's no longer a random virtual charge that falls in, but mostly a negative (attracted to the positive matter in the black hole). The Hawking radiation escaping would then be positrons, rather than gamma rays from annihilation of random charges usually assumed to escape. So is it possible that information may be carried out by Hawking radiation?

      Paul, "emergent" means the opposite of "fundamental". Anything we know about that is physically real is either fundamental, in that it is written into the underlying laws of physics, or it is emergent, in that it appears as a collective behavior or derived property of those laws. Emergence is fairly well understood in the context of complex systems, condensed matter, etc.

      However, I agree that such terms can be confusing if they are applied to physical realms that we are less familiar with. Things that appear fundamental at one time may be emergent in a deeper theory. I have even bigger problems with words like "real", "illusion", "physical", "existence" etc. I know what I mean by these words in my view of things but you may use them differently.

      You keep objecting to the words I use but really you need to look more closely at what I am saying to see how I am using these words. It is the model of reality that I describe that really counts, not the words. What you are saying about space, information etc. is perfectly acceptable to me if I interpret the meaning of your words in a way that makes sense, so it may be that any disagreement you may have with me is purely semantic.

      Hi Philip,

      Your hypothesis is that quantum information is fundamental and all material entities including spacetime are emergent. But do we need something more fundamental than spacetime itself? In my opinion (presented in my essay http://fqxi.org/community/forum/topic/1609) we do not. I propose a simple experiment to prove that particles, fields and information are the same, i.e. they are only spacetime deformations, but perceived and defined by human beings in different ways.

      My prediction is not very exotic or new, but is generally a conclusion of general relativity (at least its great discovery that gravity is not a force but only spacetime geometry). I ask: why not apply the same rule to the other "forces" and find them somewhere in spacetime geometry? Using your words, ...that would be the amazing power of consistency. That simple question should be followed by a complicated answer. The answer is a universal metric. As we do not have any, I do not want to wait, and instead I propose to start from a prediction of the idea itself and an experiment that could falsify that prediction or confirm it. In the latter case the time will come to look for the metric, being at least sure of its existence.

      We are human beings, so we depend on our perception abilities. Everything we perceive is a creation of our minds. We call it Reality. If we want to be sure that Reality exists, and not only in our minds, we desperately need a real experiment. Creating highly sophisticated ideas like the holographic principle does not change that fact. Do the holographic idea or string theories give any predictions?

      We need the experiment's outcome, and maybe then we could propose an even stronger equivalence principle, claiming that any interaction is entirely geometrical by nature.

      If you are looking for energy conservation you will find it, assuming that spacetime is the fabric of everything, i.e. particles and field forces. A spacetime deformation needs energy and is the energy itself. That is not my idea; that is GR. Then any randomly chosen region of spacetime gives you that perfect conservation needed. Every entity, e.g. a particle, is spread out over the entire spacetime (or Universe if you like) according to a Gaussian distribution, i.e. a continuous probability distribution, with the apparently strongest deformation (spacetime density?) in the center of the observed entity.

      Thanks

        "Measure the strength of the gravitational field and you have info on how much mass is in the hole. Is this relevant to the definition of "information"?"

        Yes, this is a point I am making in my essay. The gravitational field around a black hole can tell you its mass, momentum, angular momentum and position. That's ten numbers. The electric field can tell you its charge. Other gauge fields would give more information. If there were a huge hidden gauge symmetry it may be that all information could be accounted for in this way. That could be how holography works.

        I think you are right that Hawking radiation is not completely random. If the BH is charged it will at some point radiate away that charge. However the radiation should be thermal with energy distribution of a black body. To resolve the information loss problem we may need to accept that it is non-random in other more subtle ways.
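        For scale (my own back-of-envelope numbers, not part of the argument above): the black-body temperature of the Hawking spectrum is T = ħc³/(8πGMk_B), so a stellar-mass hole is far colder than the cosmic microwave background, while a small primordial hole would be extremely hot. A quick sketch using standard SI constants:

```python
import math

# Hawking temperature T = hbar c^3 / (8 pi G M k_B): the radiated
# spectrum is, to leading order, a black body at this temperature.
hbar  = 1.0545718e-34   # J s
c     = 2.99792458e8    # m / s
G     = 6.674e-11       # m^3 / (kg s^2)
k_B   = 1.380649e-23    # J / K
M_sun = 1.989e30        # kg

def hawking_T(M):
    """Hawking temperature in kelvin for a black hole of mass M (kg)."""
    return hbar * c ** 3 / (8 * math.pi * G * M * k_B)

print(hawking_T(M_sun))  # ~6e-8 K: colder than the 2.7 K CMB
print(hawking_T(1e12))   # a billion-tonne primordial hole: ~1e11 K
```

Since T scales as 1/M, only very light black holes radiate appreciably, which is why the information-loss question is so hard to probe observationally.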

        • [deleted]

        Just a few more comments on the black hole Hawking radiation information claims. (Apologies if this wastes space, please delete if it seems off topic.)

        The idea that heavy positive ions may be more likely to fall into a black hole is conventional fractionation of ionized gas in a gravitational field. If a large cloud of gas is falling into a black hole, it accelerates and heats up, becoming ionized. The positive ions in this gas plasma have less kinetic energy than the lighter electrons, so the heavy ions effectively fall faster in the gas and enter the black hole first. Obviously in a vacuum all masses fall at the same rate, but in a gas the more massive, highly inert molecules move more slowly and so end up at the bottom, while the lighter ones pick up greater velocity in collisions and end up preferentially at higher altitudes, as explained by Maxwell's kinetic theory of gases. E.g., hydrogen gas rises in the atmosphere while heavier molecules concentrate at sea level.

        There are issues with Hawking's claim that all black holes radiate at a rate simply dependent on 1/mass. Schwinger's vacuum polarization calculation for the running coupling in QED shows that there's a 0.5 MeV threshold (IR or low energy cutoff) on polarizable pair production, corresponding to an electric field of ~10^18 v/m. Below that electric field strength there's a sharp exponential fall in spontaneous pair production, which would prevent Hawking's radiation mechanism from working. Hence, if Hawking's heuristic mechanism for Hawking radiation (spontaneous pair production at the event horizon) is true, then Schwinger's experimentally verified QED calculation of the magnetic moment of leptons necessitates the condition that you need to have >10^18 v/m electric field strength at the event horizon of a black hole, or it won't radiate Hawking radiation. So a black hole must have a massive electric charge to radiate Hawking radiation, a fact that isn't mentioned by Hawking or included in his equation! The only ways around this would be to either forget Hawking's heuristic mechanism (that's easier for mathematicians than physicists), or else for quantum gravity (rather than existing QFT) to provide the energy density for spontaneous pair production in the vacuum.
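        For reference (my own calculation, just to pin down the order of magnitude quoted above): the Schwinger critical field E_c = m_e²c³/(eħ) comes out at about 1.3 × 10^18 V/m, so "~10^18 v/m" is the right ballpark:

```python
# Schwinger critical field E_c = m_e^2 c^3 / (e * hbar): the electric
# field strength above which spontaneous electron-positron pair
# production is no longer exponentially suppressed.  SI units.
m_e  = 9.1093837e-31    # electron mass, kg
c    = 2.99792458e8     # speed of light, m / s
e    = 1.602176634e-19  # elementary charge, C
hbar = 1.0545718e-34    # reduced Planck constant, J s

E_c = m_e ** 2 * c ** 3 / (e * hbar)
print(E_c)  # ~1.3e18 V/m
```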

        So it seems to me that a particular quantum gravity theory is needed in order to validate the mechanism for Hawking radiation. The energy density of an electric field is the product of half the vacuum permittivity and the square of the electric field strength, and since for protons the gravitational field is about 10^40 times weaker than the electromagnetic field, you can estimate pair production in a quantum gravity field by scaling from electromagnetism. But there is the question of whether the gravity coupling runs with energy as assumed in supergravity, or not. If couplings do unify at the Planck scale, the difference in EM to gravity force strength for fundamental particles will decrease from 10^40 to 1 as energy (or inverse of distance) increases. Fundamental particles, if treated as black holes, should radiate intensely due to their small mass. Is the physical basis of gauge theory, the exchange of offshell gauge bosons between charges, causing fundamental forces?

        Jacek, welcome to the contest. I will be reading the new essays later today.

        • [deleted]

        Hi Philip,

        I have one question and one request. I have been reading your papers and trying to see what you are trying to do exactly. The use of event symmetry is interesting, but I am not clear about multiple quantization and the path integral; could you clarify it, please?

        The discussion of this thread and with Jochen is most interesting for me. Now, I don't want to make this thread about my theory, but I would like a line or two worth of feedback: can you see any link to your ideas? My theory is very platonic; it links space, energy and matter all in one concept; time is a change of state and does not appear explicitly. As you can see, the Lagrangian falls out of the system for the Bohr-like model and you get the usual relation between c, h_bar and alpha. I have many other results that I have not shown, like the g-factor, the fine structure constant and the full QM hydrogen 1S (in it, if you change the proton width, which is very close to the measured value, even a little, the energies come out wrong). I hope I can show all that in time for this contest.

        Philip

        No I do not object to words, per se, and I am always searching for the underlying correspondence with reality.

        So space is not emergent then, given that definition, because physical existence involves space. Neither is time emergent, assuming what it relates to is understood, because there definitely is difference in physical existence. And difference involves a rate at which that occurs. Without space nothing could be existent, and without change nothing would differ, which must occur at a rate.

        Certainly we can only 'assign' space via entities, and we can only calibrate time via entities, but surely that is not the point about emergent? So I am now really lost.

        In the meantime I will re-read your essay.

        Paul

        Nige, it seems to me that you might be conflating some unrelated things. Hawking radiation has nothing to do with pair production in an electric field. It is an effect coming from the horizon. The "heuristic" explanation involving virtual pair production that Hawking and others have used in popular explanations does not really represent the detailed calculation that Hawking did. That was based on methods of semi-classical quantum gravity that are hard to describe well in general terms.

        Quantisation is the procedure that takes you from a classical theory to a quantum theory by replacing classical observables with non-commuting observables in a specific way (as formulated by Dirac). An equivalent procedure is to use path integrals as defined by Feynman. Multiple quantisation is the idea that this can be repeated iteratively if you treat the quantum Schrödinger equation as if it were a classical field equation and quantise again. The dream is that if you define quantisation in a very general algebraic way you can derive physics with just this idea.
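        To make "replacing classical observables with non-commuting observables" concrete, here is a minimal sketch (mine, not Dirac's presentation) using truncated harmonic-oscillator matrices. The canonical commutator [x, p] = iħ holds exactly except in the last diagonal entry, an artefact of truncating the infinite-dimensional Hilbert space:

```python
import numpy as np

# Canonical quantisation in matrix form (units with hbar = 1):
# build ladder operators a, a† on an N-level truncated Fock space
# and check the commutator [x, p] = i, the hallmark of quantisation.
N = 8
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
ad = a.T                                      # creation operator a† (real matrix)

x = (a + ad) / np.sqrt(2)          # position observable
p = (a - ad) / (1j * np.sqrt(2))   # momentum observable

comm = x @ p - p @ x
# [x, p] = i * identity, except the last diagonal entry, which is
# spoiled by the truncation of the infinite-dimensional space.
print(np.allclose(comm[:-1, :-1], 1j * np.eye(N - 1)))  # True
```

The classical observables x and p commute; promoting them to these matrices is exactly the step that turns Poisson brackets into commutators.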

        I had a quick look at your theory and of course it has a lot in common with other frameworks, including mine. I like the idea of trying to simulate systems because I started out in Lattice Gauge Theories. I can't give you a proper review here, but I think it would be good if you could write an essay and submit it here. You may then get some feedback from other authors.