I know, I will get round to reading it again, there is plenty of time.
An Acataleptic Universe by Philip Gibbs
Philip
Meant to say, there was a piece in the Times today (it's actually published in Nature Communications) about why we forget some things short-term and then they come back. This is the sort of knowledge we need, because obviously the sensory system/brain processing 'interferes' with what we physically receive, ie we cannot presume physical input equals perception output. So, leaving aside individualism (another issue; remember, input is only received at individual level), we need to understand, generically, how these processes work, so that from the output we can extrapolate the input. Which is the start point for physics.
Paul
Is there a link for that or was it a hardcopy?
The formulation that they used for quantisation is probably not too important now. The idea comes from decades ago and is probably a bit dated. I just like the multiple quantisation idea in general. I think Weizsäcker ended up with some large-number-type arguments that don't really make sense anymore.
Hi Philip,
Thank you for the detailed answers. I will study your reply to get a better understanding of your idea and then I will formulate more specific questions.
Philip
Look, when I was young, a wind-up toy was the new technology. Hardcopy. Not sure if you can read it on their website without paying, but the reference was from something called Nature Communications.
Paul
Dear Philip,
Do you think that bits could have real form to make it?
If so, I would propose that those bits have a complex 4-fold convertible torus shape
to fill the particle bill. See the two attachments and:
3-D particles the deeper bit-reality of it (matter)
http://vixra.org/abs/1103.0002
http://www.flickr.com/photos/93308747@N05/sets/72157633110734398/
Attachment #1: It_from_4x_macaroni_Bit_rotation.jpg
Attachment #2: quarks_it_made_from_bits.jpg
Leo, Good to see you over here.
I think bits of information are real but they do not have any physical form or shape themselves. However, they have relationships with each other such as entanglement and these relationships have real form. That is just the way I see it.
I like your pictures. I will reread your essay.
Thank you very much Philip,
So, Bits of information have relationships with each other such as entanglement and these relationships have real form.
By "real form" do you mean "can be described by maths formulas"?
By "form" I mean some kind of geometric representation.
Philip
"I think bits of information are real but they do not have any physical form or shape themselves"
What about light, vibration, noise, etc, then? This is information, because it is representational of something else, but is also physically existent in its own right.
No, information, as in knowledge, is not physically existent, unless one expresses it in terms of neural activity, or a chemical known as ink on a substance known as paper, etc. But of course that is not the point. So this cannot do anything physical.
Paul
I would not say myself that light is information. I would say it carries information and it might be emergent from a theory of pure information.
In any case, statements like "It From Bit", "Bit From It", "It Is Bit", etc. are just philosophical interpretations. These are just things that help guide us to a more concrete mathematical theory that is consistent with observation. If you think of it a different way round from me, I can't argue with it. The important thing is where it leads in real operational terms.
Excellent summary. I have learned very much from your essay!
Philip
"I would not say myself that light is information. I would say it carries information"
That sounds semantic. And what is "emergent" & "a theory of pure information"?
Light is physically existent. It results from an interaction with something else, which is also physically existent, and it is known that an understanding of that something else can be extrapolated from it. So it is therefore information, because it is representational of something else. I am only thinking in terms of what occurs, well generically anyway. Whatever form of knowledge (maths, words, graphics) is used to depict that must correspond with it.
Paul
OK That is quite interesting. Look at it this way.
If I had a bag full of books I might say that the bag is carrying information in the books. That is clearly different from saying that the bag of books is information. The bag has properties of its own aside from the information.
However, you might say that everything we know about the book and the bag is described by information and there is nothing else. OK, I like that way of looking at things, but I want to come at it from the other direction. I want to start with some pure information that looks like something more fundamental, but then when I look at the information more carefully I find that it has the same characteristics as a bag of books. In other words, if I use the information to answer questions in a particular way, the answers are the same as I would get playing twenty questions when the answer is a bag of books.
Coming back to light and particles, I might start with bits of information on a grid. The information might evolve according to some simple rules. In other words, it is a cellular automaton. When I study how the system evolves I might find that different types of particles form and travel across the grid. With luck I might replicate something that looks like the physics of photons. Then I would say that the light was emergent from a theory of pure information.
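To make this concrete, here is a minimal sketch using Conway's Game of Life, a standard toy cellular automaton (used purely as an illustration, not as the model proposed here), in which "glider" patterns form and travel across the grid:

```python
from collections import Counter

# Live cells are stored as a set of (row, col) pairs on an unbounded grid.
def step(live):
    # count live neighbours of every cell adjacent to a live cell
    counts = Counter((r + dr, c + dc) for (r, c) in live
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    # birth on exactly 3 neighbours, survival on 2 or 3
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# the classic "glider" pattern
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# after 4 generations the glider reappears shifted one cell diagonally:
# a particle-like excitation travelling across the grid
assert state == {(r + 1, c + 1) for (r, c) in glider}
```

Even though the rules only speak of bits on cells, the glider behaves like a particle with a fixed velocity, which is exactly the sense in which particles could be emergent from pure information.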
Some people have looked at cellular automata (Fredkin, Wolfram, etc.). The results are interesting, but the problem is that quantum field theory includes non-local entanglement and you can't get that with classical cellular automata. 't Hooft has explored quantum cellular automata where he has a Hilbert space spanned by the possible states of the system. This is more interesting. He first described the holographic principle using a model of that form and now he claims that string theory can be described that way. Perhaps he can really get it to work.
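The basic move behind such quantum cellular automata can be sketched in a few lines: let the Hilbert space be spanned by the classical configurations, and a reversible classical rule becomes a permutation of basis states, hence a unitary operator. The cyclic-shift rule below is a deliberately trivial stand-in chosen for illustration, not 't Hooft's actual model:

```python
import numpy as np

# Hilbert space spanned by the 2^3 = 8 classical states of a 3-cell
# binary automaton; the reversible rule here is a cyclic shift of cells.
n = 3
dim = 2 ** n

def shift(s):
    # cyclic left shift of an n-bit configuration encoded as an integer
    return ((s << 1) | (s >> (n - 1))) & (dim - 1)

U = np.zeros((dim, dim))
for s in range(dim):
    U[shift(s), s] = 1.0

# a permutation matrix is unitary: U U^T = I
assert np.allclose(U @ U.T, np.eye(dim))

# superpositions of classical configurations evolve linearly under U
psi = np.zeros(dim)
psi[0b001] = psi[0b010] = 1 / np.sqrt(2)   # (|001> + |010>)/sqrt(2)
psi2 = U @ psi   # amplitude moves to the shifted configurations 010 and 100
```

The point of the sketch is only that a classical reversible evolution sits naturally inside a quantum one; getting genuine entanglement out of such models is exactly the hard part discussed above.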
However, I think this is not enough. Cellular automata already have grid structures that represent space, and time is a discrete process put in by hand. I want to see a theory that starts from something more fundamental so that space and time are also emergent. I think that symmetries are important because they reflect the idea of redundant information. So I want to start from algebras that describe symmetries over Hilbert spaces of information and then reproduce physics as emergent structures from that. It is ambitious, but there are mathematical principles that suggest it might just be possible, such as the mappings I describe in my essay.
I hope that makes it a little clearer how my thinking works.
Philip
But that all just turns on what we are labelling as being light, ie is light really the physical state which eyes can, upon receipt, utilise. And the rest of the physical entity is the 'carrier'. But this distinction, as with many others one can identify when going into detail, is irrelevant to the point. Light is a representation of something else, ergo, it is information.
Indeed, there is only information about/knowledge of reality. We cannot 'directly access' it. But this is not the point either, because that is a statement of the obvious, and pointless. In the same way that everything provides us with information, so the concept of information again is pointless, as there is no differentiation from not-information. Incidentally, physically, the book is not information, it is ink & paper, or whatever.
Space and time are not emergent. Distance is an artefact of physically existent entities, it being a difference between them in terms of spatial position. Existence necessitates physical space, but that can only be assigned via entities. So distance can only involve entities which exist at the same time. And they can only exist in one physically existent state at a time. Time is the turnover rate of existent states (ie realities).
Paul
Paul, that is a very reasonable view of information and I can't disagree with it.
I am however confused by your last statement. You say space and time are not emergent but then you express the relational view of space and time which is usually identified with an emergent approach. I am probably misunderstanding what you mean.
Okay Phil,
Once again, a thought provoking essay! Of course I'm sympathetic to your position but I feel you open yourself up to a bit of critique so I'll take advantage. Where did you open yourself up? I quote:
"Should we base our theoretical foundation on basic material constructs such as particles and space-time or do these things emerge from the realm of pure information? Wheeler argued for the latter. But no amount of philosophizing can tell us if this is how the universe works. There is no point in asking where the information comes from, or where it is stored."
So, with this in mind, I'm going to actually propose to you a fundamental question!
For the last couple of weeks I've been reading FQXi essays, past and present, together with some of the fascinating articles provided. Julian Barbour's essay [JB] is, of course, relevant to this year's subject and although I'm not sympathetic to Mr. Barbour's position it seems to me Mr. Barbour does a rather excellent job of analyzing the nature of information. He divides information into three categories:
"In summary, we must distinguish three kinds of information: Shannon's information, the uncertainty as to which message will be selected from a source; factual information, the content of such a message; and intrinsic semantic information, which distinguishes a random message, or configuration, from one that carries meaning and to some extent explains its very genesis."
After establishing the different kinds of information, Mr. Barbour spends a great deal of time talking about probabilities in the context of quantum information theory; his position is that ITs, quantum configurations or fields, create qubits:
"The key point is this. If we are to speak about ontology, as opposed to efficient coding in communication channels, the most important symbol in (1) is not p for probability but i for the thing, or configuration, that has the probability pi. Probabilities are for outcomes: what you find when you open the box. Thus, even if quantum probabilities are an integral and essential part of the world [PBR] (reference mine), they are meaningless in themselves. They are at best secondary essentials, not primary essentials. They must always be probabilities for something."
Now this is the thing that struck me! When speaking of quantum mechanics, whether regarding the Standard Model or Cosmology, the emphasis is always on probabilities and statistical configurations; but what about the Hilbert Space?
Consider the winning FQXi essay by mathematician George Ellis [GE]:
"Causation: The nature of causation is highly contested territory, and I will take a pragmatic view:
Definition 1: Causal Effect - If making a change in a quantity X results in a reliable demonstrable change in a quantity Y in a given context, then X has a causal effect on Y.
Existence: Given this understanding of causation, it implies a view on ontology (existence) as follows: I assume that physical matter (comprised of electrons, protons, etc.) exists. Then the following criterion for existence makes sense:
Definition 2: Existence - If Y is a physical entity made up of ordinary matter, and X is some kind of entity that has a demonstrable causal effect on Y as per Definition 1, then we must acknowledge that X also exists (even if it is not made up of such matter)."
Now, it's generally agreed that Bell's work proves the mutually exclusive relationship between locality and counterfactual definiteness and, of course, Bell's work has been further elaborated upon since, for example by Mateus Araujo in [MA]. The EPR experiments of Aspect et al. and the Mach-Zehnder experiments of Herzog et al. unequivocally support counterfactual definiteness at the expense of locality. Furthermore, the experiments of Aspect et al. would seem to place quantum entanglement on a firm ontological footing. So then while Hilbert Space is the space of all possible configurations it's much more than a simple Universe of Discourse; the inseparability of Hilbert Space describes quantum entanglement! Therefore, based on experimental evidence and according to the definitions of George Ellis, the Hilbert Space must have an ontological referent!
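The tension between locality and counterfactual definiteness referred to here is usually quantified by the CHSH inequality: any local, counterfactually definite model satisfies |S| ≤ 2, while the singlet state reaches 2√2. A short textbook-style check (the angle choices are the standard ones, not taken from the cited papers):

```python
import numpy as np

# CHSH correlation test for the singlet state.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    # spin measurement along angle theta in the x-z plane
    return np.cos(theta) * sz + np.sin(theta) * sx

# singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # correlation <psi| spin(a) (x) spin(b) |psi>; for the singlet, -cos(a - b)
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # 2.828..., i.e. 2*sqrt(2), violating the classical bound of 2
```

The violation is what places entanglement, and hence the inseparability of the Hilbert space, on the experimental footing invoked above.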
So I propose to you a "Hogwartian" and certainly fundamental question:
What ontological entity does the Hilbert Space refer to?
References
[MA] Araujo, M., Bell inequalities refined to Boole inequalities in: Quantum Realism, Quantum Surrealism (http://arxiv.org/abs/1208.6283), accessed 26 April, 2013.
[GE] Ellis, G., An excellent exposition revealing the essence of organicism in: Recognising Top-Down Causality (http://fqxi.org/data/essay-contest-files/Ellis_FQXI_Essay_Ellis_2012.pdf), accessed 26 April, 2013.
[JB] Barbour, J., An informative discourse on information in: Bit from It (http://fqxi.org/data/essay-contest-files/Barbour_Wheeler.pdf), accessed 26 April, 2013.
[PBR] Pusey, M. F. et al., Are wavefunctions ontological entities? in: A boost for quantum reality (http://www.nature.com/news/a-boost-for-quantum-reality-1.10602), accessed 26 April, 2013.
I shall wait for a few more essays to present themselves before I rate your essay but, rest assured, I'll levy an upper range rating.
With regards,
Wes Hansen
Wes, thanks for this interesting question. I don't think I can give a very short answer because the ontology I use is quite elaborate. It could be a whole essay in itself, which is why I don't try to cover such issues in this essay, but I can't really answer the question without telling you how my ontology works, so I will give you the outline.
Before I do I should make the point that philosophical ontologies are in my opinion just scaffolding that we use to build a house, where the house is the real physical theory. When you are finished you can take the scaffolding away. Ellis and Barbour are offering you a different brand of scaffolding that does not work in the same way. It may work just as well and you may use it to build the same house. The only difference is you might build it faster or slower depending on how good their scaffolding is compared to mine. I am not going to criticize their scaffolding but I will try to sell you mine instead.
My starting point is basically Pl*t*n*c. I have censored this word because some people find it very offensive. This includes people who don't like maths of course, but also many good physicists will complain about it. I find that the ones with a more mathematical background like myself are more comfortable with it. Anyone who isn't will need to find a different brand of scaffolding.
The pl*t*n*c principle says that all mathematical systems exist in a realm outside of the physical universe. This realm is essentially unique. Even if you start with different axioms you end up in the same realm because any set of axioms that is consistent and sufficiently general gives the same set of mathematical systems. This even transcends questions of undecidability in my opinion.
In this realm all possible universes with any self-consistent set of laws exist and there are relationships such as equivalence, or overlap, between them. The number of complex systems is much larger than the number of simple systems. You can think of it as an outer level of m*lt*v*rs* if you like (sorry, more censorship required there).
To find the place of our universe in this m*lt*v*rs* we must first of all understand the concept of universality. Universality is the idea from complexity theory where some kind of universal behavior is found in large collections of complex systems. For example, computer languages are defined by complex arbitrary rules, but aside from limitations due to memory size most of them provide equivalent definitions of computability, provided they are not too simple. Another example is the chaotic behavior of non-linear systems, which has a type of universality characterized by Feigenbaum constants that are the same no matter how you construct the original system, provided it is not too simple. More examples appear in critical systems where you approach a critical point and the correlation lengths go to infinity. You renormalise to get a consistent macroscopic limit which does not depend on all the microscopic details, provided they are not too simple.
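The Feigenbaum universality mentioned here can be reproduced numerically in a few lines. This sketch uses the logistic map; the bracketing intervals are hand-chosen for this particular map, but the limiting ratio δ ≈ 4.669 would come out the same for any map in the same universality class:

```python
# Locate the "superstable" parameters r_n of the logistic map
# x -> r x (1 - x), at which the critical point x = 0.5 is periodic
# with period 2^n; the ratio of successive gaps approaches the
# Feigenbaum constant delta ~ 4.669.

def iterate(r, x, n):
    for _ in range(n):
        x = r * x * (1 - x)
    return x

def superstable_r(n, lo, hi):
    # bisection on g(r) = f_r^{2^n}(0.5) - 0.5, assuming one sign
    # change in [lo, hi] (brackets checked by hand for this map)
    g = lambda r: iterate(r, 0.5, 2 ** n) - 0.5
    for _ in range(80):
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

brackets = [(1.0, 3.0), (3.0, 3.4), (3.4, 3.55), (3.545, 3.564)]
rs = [superstable_r(n, lo, hi) for n, (lo, hi) in enumerate(brackets)]
delta = (rs[2] - rs[1]) / (rs[3] - rs[2])
print(rs)      # r_0 = 2, r_1 = 1 + sqrt(5) = 3.2360..., ...
print(delta)   # ~4.68, already close to Feigenbaum's 4.669...
```

Changing the map (say to a sine map) changes every r_n, but not the limit of the ratios, which is the sense of "universal behavior" used in the argument.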
I think that if you could analyse the pl*t*n*c realm in the right way you would find that there is also a universal behavior at work and this is what forms the laws of physics. In essence this behavior is so dominant that anything living in the pl*t*n*c realm (i.e. everything) would only notice this universal behavior so these would describe their top level laws of physics. I call this idea the theory of theories and first wrote about it twenty years ago.
So how do these laws work? The first thing we can say is that they are indeterministic. Different mathematical systems can overlap, so from your experience you can never determine exactly what system you are in, even given the universal behavior, so experience must be indeterministic, but there are statistical laws of some kind telling you what is most likely, i.e. quantum mechanics. The second thing is that this is a complex system, even though it has clear fundamental laws. There are many solutions. Probably there are different universes all fulfilling these laws as different solutions. This is the second level of m*lt*v*rs*, and indeterminism within each universe provides a third layer of m*lt*v*rs*. Because of the second layer we can't expect to derive the low energy effective theory of physics. That is part of the solution. It must be there because that is where the *nthr*p*c principle comes in and allows the universe to be fine-tuned for life.
Trying to derive the top level laws of physics from first principles is also going to be hard. Perhaps if we were hugely intelligent we could do it and would then only need to do experiments to figure out the solution level, but we are not that smart. Some people have tried to model the pl*t*n*c realm as a kind of statistical ensemble with a big path integral over all possible mathematical systems. This captures the ontology of the theory of theories quite well, but in my opinion the actual path integral is based on a more algebraic principle than a statistical one. Even in quantum mechanics you get complex numbers and Grassmann variables rather than plain probabilities, etc.
However, one useful observation is that however it works, the "path integral" will itself define a mathematical system. That means that it can also be found under the path integral. Forming a path integral is a version of quantisation, so effectively it gets quantised a second time, and then that system is also under the path integral. Continuing recursively ad infinitum, we can argue that multiple quantisation is included in the system. My conjecture is that multiple quantisation actually describes the universal behavior that dominates the path integral. This means that we just need to find the right general construction for quantisation and we are done. We have then built the house and can pull away the scaffolding.
You asked what is the ontological origin of the Hilbert space. Complex numbers and vector spaces are universal features in mathematics. They were discovered by mathematicians long before they were used in quantum theory. Although real numbers were originally inspired by geometry, even they are needed in mathematics to solve problems defined in purely combinatorial terms. Mathematicians would have discovered them independently of physics if they had needed to. It is no surprise that Hilbert spaces would emerge in the ultimate universality structure that controls the ensemble of all mathematical systems.
Some people like to think in terms of "simulations". That is to say that we are in a simulation, but there can be lots of simulations. There are even simulations within simulated systems, so you can get a similar idea of multiple quantisation that way. I don't like the simulation idea quite so much because it seems to imply that causality is fundamental, and perhaps time too. Causality is part of thermodynamics, so it is emergent in the lower layers of physical law. Causality, locality, space and time are not fundamental in the top level laws described by multiple quantisation, but they are emergent as features of some solutions. That is why my ontology does not discuss why the big bang happened, for example. That is not a relevant question.
However, there are some important things that are emergent in the formation of the top level laws as a feature of universality and multiple quantisation. These include the role of information, quantum mechanics, qubits, etc. It also includes symmetry. It is a feature of critical systems that symmetries emerge at critical points, e.g. you can do field theory on a grid lattice which does not have full spacetime symmetry, but when you move towards the critical point the scale is renormalised and rotational and translational symmetry can emerge. This is one reason why I expect symmetry to be so important. It is emergent at a higher level than space and time and can be hidden by symmetry breaking at lower levels.
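The emergence of rotational symmetry from a lattice can be illustrated with an even simpler toy than lattice field theory (a stand-in illustration, not the field-theory case itself): random walks on a square lattice have only 4-fold symmetry at each step, yet at large scales the endpoint distribution becomes rotationally symmetric, which shows up as a vanishing 4-fold anisotropy ⟨cos 4θ⟩ of the endpoint angles:

```python
import math
import random

# Measure the 4-fold anisotropy <cos(4*theta)> of endpoint angles for
# random walks of a given length on the square lattice.
def anisotropy(steps, walkers=4000, seed=1):
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(walkers):
        x = y = 0
        for _ in range(steps):
            dx, dy = rng.choice(moves)
            x += dx
            y += dy
        # odd step counts can never end at the origin, so atan2 is safe
        total += math.cos(4 * math.atan2(y, x))
    return total / walkers

print(anisotropy(1))    # ~1.0: a single step lies exactly on a lattice axis
print(anisotropy(101))  # ~0: rotational symmetry emerges at large scales
```

The microscopic rule never changes; only the scale does, which is the renormalisation intuition behind symmetries emerging at critical points.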
OK, I will stop there because I have probably confused everyone who tried to read this. As I said, it would take a full essay to describe this in comprehensible terms, and even in my mind it is quite vague, but perhaps I have given you an idea of how my scaffolding holds up.
Oops, did not log in. That was me in case it was not obvious.