Hi Hector,

of course I also sympathize with the idea that no pure randomness is continuously injected into physical reality, and that everything that appears random is still the result of a deterministic process.

You write that with the algorithmic-universe approach one ends up with 'an organized, structured world with a very specific distribution (Levin's universal distribution)'.

Levin's distribution m(x) provides the a-priori probability of a binary string x: it is determined by the programs, of any length, that trigger a computation on a prefix universal Turing machine terminating with output x (each program of length l contributing weight 2^-l). Thus, the sum of m(x) over all x is determined by all the programs that trigger a terminating computation (with ANY output x), and this is Chaitin's Omega! Nice! I imagine you knew this already, but I didn't!
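Just to fix notation (assuming a fixed prefix-free universal machine U), the standard definitions I have in mind are

$$ m(x) = \sum_{p \,:\, U(p)=x} 2^{-|p|}, \qquad \Omega = \sum_{p \,:\, U(p)\ \mathrm{halts}} 2^{-|p|} = \sum_{x} m(x). $$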

So, are you saying that you have been able to measure the extent to which distributions of data sets (binary strings) from our real world vs. from an artificial, algorithmic world approximate the m(x) distribution (which, I read, is 'lower semi-computable', that is, knowable only approximately)? This sounds very challenging. But I am curious about the type of artificial universe that you have experimented with, and the type of data that you analyzed in it.

For example, if I gave you a huge causal set, intended as an instance of discrete spacetime, where would I look for a data set to be tested against Levin's distribution?

By the way, do these distributions refer to an internal or an external view of the universe (Tegmark's frog vs. bird view)? The problem is that in the real universe we collect data as frogs, whereas with a simulated universe it is much easier to act as birds.

A final question for you. By introducing the a-priori probability of a string x one shifts the focus from the space S of strings to which x belongs to the space P of programs that can compute x. But then, why not assume that the elements of the space P -- themselves strings -- also enjoy an a-priori probability? (This is not reflected in the definition of m(x).) How, or why, should one avoid an infinite regress?

  • [deleted]

And Solomonoff will say... wow, AIXI is possible... but a string is divisible and a sphere is not. Hihihi, I love this platform.

Of course a string in computing is something different. But, but... confusions, hihihi.

Now I insist: for a correct universal Turing machine, the real fractal of the main central sphere with its pure number is essential... if not, it's just wind.

Second, a machine will never be intelligent, because we shall never reproduce the first code, at the Planck scale if you prefer.

The system must really be quantized with rotating spheres, and furthermore it must have an evolutive spherical topology.

The algorithms are real, universal or human... and the universal probability does the rest, no?

If I understand well, you invent codes, some algorithms and series, for some applications.

The Universe is totally different. I thus understand why some people invent time machines and multiverses, or other ironic sciences.

We thus understand why it's important to compute universally. These codes invented by humans are even intriguing.

The conjectures must respect the sphere and its distribution. If not, it's just a superimposing of logical series where some limits can be analyzed and sorted. But in a series of polarity of evolution, if the numbers are respected... then it's relevant.

The Universe is a sphere and our particles are too... our computing must respect that; if not, we shall never find where we are inside this beautiful universal sphere at this moment. Evolution is a specific series with its intrinsic codes... computing is a human invention, still very young... and it's wonderful, but artificial intelligence will never be possible, only by our hands and brains. The automatic series of intelligence and encoding is not possible for a computer; even if the Blue Gene and Cray systems, or Jaguar, fuse and have children in 10000 years, they will always be machines. Terminator too, no... hihihih. Arrogance and humility, universality and computing... hihihi, they are crazy, these scientists.

Cheers ....vanity of vanities, all is vanity .....

Steve

  • [deleted]

Tommaso,

Yes, m(x) and Chaitin's Omega are deeply connected; in fact, as you noticed, the former contains the latter. While Chaitin's Omega is the halting probability, m(s) is the output probability (over halting programs running on a prefix-free universal Turing machine), so knowing the latter gives you the former.

Yes, we (jointly with Jean-Paul Delahaye) have been able to measure (to some limited extent) the statistical correlations between several real-world datasets (binary strings) and an empirical, purely algorithmically generated m(s).

As you say, m(s) is 'lower semi-computable', which means one can (with a lot of computational resources) approximate it but never entirely compute it (because of the halting problem). But halting runtimes are known for (relatively) large spaces of abstract machines, for example for Turing machines, thanks to the busy beaver game. So one of the routes we took was to calculate an experimental m(s) from the known busy beaver values.
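As a rough illustration of the kind of procedure involved, here is a minimal sketch in Python. It is not our actual setup: it enumerates 2-state, 2-symbol machines with an explicit halt state, runs each from a blank tape for at most S(2,2) = 6 steps (the busy beaver bound, so non-halters are identified exactly), and reads the output off the visited portion of the tape; all names and conventions are illustrative choices.

```python
from itertools import product
from collections import Counter

STATES, SYMBOLS, HALT, BB_STEPS = 2, 2, -1, 6   # S(2,2) = 6 steps

def run(rules, max_steps=BB_STEPS):
    """Return the visited portion of the tape as a string if the machine halts, else None."""
    tape, pos, state = {}, 0, 0
    lo = hi = 0
    for _ in range(max_steps):
        sym = tape.get(pos, 0)
        write, move, state = rules[(state, sym)]
        tape[pos] = write
        pos += move
        lo, hi = min(lo, pos), max(hi, pos)
        if state == HALT:
            return ''.join(str(tape.get(i, 0)) for i in range(lo, hi + 1))
    return None  # did not halt within the busy beaver bound, hence never halts

def empirical_m():
    """Output-frequency distribution over all halting 2-state, 2-symbol machines."""
    counts, halting = Counter(), 0
    cases = list(product(range(STATES), range(SYMBOLS)))                             # (state, read symbol)
    actions = list(product(range(SYMBOLS), (-1, 1), list(range(STATES)) + [HALT]))   # (write, move, next state)
    for table in product(actions, repeat=len(cases)):
        out = run(dict(zip(cases, table)))
        if out is not None:
            counts[out] += 1
            halting += 1
    return {s: c / halting for s, c in counts.items()}

for s, p in sorted(empirical_m().items(), key=lambda kv: -kv[1])[:8]:
    print(s, round(p, 5))
```

The frequencies printed are the kind of empirical, output-frequency analogue of m(s) that we work with (here normalized over halting machines only); the real calculation of course uses much larger machine spaces.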

An important side result is that if you have a way (even a limited one) to calculate m(s), then you have a way to calculate C(s), the Kolmogorov-Chaitin complexity of the string s, by way of the Chaitin-Levin coding theorem! And that is where the applicability of our research goes beyond the statistical comparison between real-world datasets and artificial/digital datasets (to test the computational hypothesis).
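For reference, the coding theorem makes the link explicit, up to an additive constant that depends on the chosen universal machine:

$$ C(s) = -\log_2 m(s) + O(1). $$

So any procedure that estimates m(s), even partially, yields an estimate of C(s).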

The calculation we performed produced enough data to build an empirical m(s) for relatively short strings (from small Turing machines of up to 4 states), for which we could then evaluate C(s), something never done before because of the difficulty involved: the usual way to approximate C(s) is through compression algorithms, but these fail for short strings for obvious reasons (compressors have a hard time finding patterns in strings that are too short, so the values they return are too unstable).
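To see why compression is hopeless at these lengths, a quick check with zlib (used here just as a stand-in for any general-purpose compressor) shows that the 'compressed' size of a very short string is dominated by format overhead rather than by any structure in the string:

```python
import os, zlib

for s in [b'01', b'0101', b'01010101', b'0' * 64, os.urandom(64)]:
    print(len(s), '->', len(zlib.compress(s)))
```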

You ask where one would look for a data set to be tested against Levin's distribution. The answer is here: http://arxiv.org/abs/1101.4795, and we can of course provide full tables.

You also ask whether this is an internal or an external view of the universe. I have trouble placing myself in this dichotomy at the moment; I should think further about it. I think, however, that our view may be a bird view (even at a level above the physical). In fact m(s) is also called the universal distribution because it assumes nothing but the use of a universal Turing machine, so it dominates (as proven by Levin himself) any other semi-computable distribution (it has also been called a 'miraculous' distribution, see http://www.springerlink.com/index/867P162741726288.pdf).

So if one were, for example, to create candidate universes, one should probably first check whether each universe is capable of approaching the empirically calculated m(s), which would be an indication that it can produce enough complexity, both structured complexity and apparent randomness, distributed as m(s) says; and then one should check whether the universe fulfils all the other physical properties (such as Lorentz invariance).

You also ask another interesting question, about assuming a prior for the distribution of strings (I guess strings acting as initial conditions for the programs). The beauty of m(s) is that it basically does not matter where (or what) you start from: you end up with the same distribution, because what matters is the computational filter, that is, the random distribution of programs. I think only the distribution of programs would have an impact on m(s) (our experiments also confirm this). An interesting question is indeed whether one can impose restrictions on the distribution of programs, for example restrictions imposed by physical laws that one might model with game theory (something close to Kevin Kelly's critique of Bayesian approaches to learning theory, and connected to the old problem of induction).

But in fact, as a consequence of our research, m(s) is no longer a prior: our experimental m(s) (with the caveat that it has a limited scope) is no longer Bayesian but an empirical (hence posterior) distribution, which according to Kevin Kelly would give our approach, and the algorithmic complexity approach in general, greater legitimacy as a theory. One can see complexity emerging from our experimental distributions in the way m(s) and C(s) were believed to operate.

Sincerely.

Dear Tommaso

Indeed, it is very interesting. I also agree with case A. On the other hand, I believe that even the TOE must have parameters that have to be defined by experiment. Anyway, I invite you to see my essay, which is in essence a philosophical approach in which I cite a unified theory based on the tenet that space is a material continuum.

Kind Regards

Israel

  • [deleted]

I insist: you confuse computing with reality. You compute under your laws, that's all.

The Universe is totally different.

If your encoding is not rational, I understand your conclusions hihihihih

You can invent your codes along a deterministic road; that doesn't mean that your laws are universally true. It's simpler than that. A computer has its codes; what if you insert bizarre maths????

The only free universal Turing machine is one with spheres, for a correct quantization. At this moment the system is logical, but not for simulations; that depends, to my knowledge.

Well, on that note: good contest and vanity, as usual.

signed Steve the humble arrogant

8 days later
  • [deleted]

Hi Tommaso,

1. Thanks for your clear presentation on the groundwork needed to find emergence from simple computations.

2. How did the "Experiments with emergence in computational systems modeling spacetime and nature" event (ISTI-CNR, Pisa, Italy, July 10-11, 2009) turn out? Were there any dramatic experiments demonstrated, in your opinion?

3. I like your point that: "At all levels, including the lowest ones, something 'interesting' must happen. Objects, localized structures distinguishable from a background, waves, the mix of order and disorder, are examples of 'interesting' things."

4. I believe that one of the most interesting events in physics is the progression of particle masses from buckyballs to fleas (a Planck mass). We go from indistinguishable objects to identifiable objects, and from objects that can exhibit interference to those that exhibit none. This seems to me an area that may be appropriate for a computational approach.

Thanks again,

Don Limuti

    Dear Tommaso,

    I think your essay is very interesting.

    I was wondering if you could clarify something: You say, "There exists a tiniest scale at which the fabric of spacetime appears as a pomegranate...made of indivisible atoms, or seeds." How do you reconcile this with the fact that photons of any wavelength travel at the same rate through space? Would photons of a smaller wavelength be more impeded by the atoms of space? (Forgive me if you have already addressed this.)

    Also, I feel I should clarify my response to your question about my essay. I do believe that discrete space and time are the ultimate bottom layer of nature, but when I say 'nature,' I'm thinking of the universe as a working system, not its basic, fundamental components. Only through discrete space and time do you get matter, force, (relative) energy, and all the workings of the universe. However, on a fundamental level, I believe space at least is a continuous object. Its discreteness would come from particular one-, two- or three-dimensional regions becoming more 'timelike' in their nature, though they would continue to be space...in my opinion.

    All the best with the contest,

    Lamont

      Hi Williams,

      you ask whether photons of a smaller wavelength would be more impeded by the atoms of space.

      I wish I were already there!

      The general question behind this, I guess, would be: what are, really, particles in a causal set, intended as a discrete model of spacetime?

      A general answer would be: since the only tool we have for building the universe is a discrete spacetime -- a directed graph made of nodes (events without attributes) and causal relations among them -- a particle is a trajectory, a worldline, a periodic pattern made of events.

      But talking about the SPEED of a particle in a causal set is already quite difficult, since we need a reference frame, and that is not easy to define either, since we cannot enjoy the advantages of a continuous setting, such as Minkowski spacetime. One way to proceed would be to identify the origin of a reference frame with a particle, as defined above, so that you end up with a system of particles that observe each other and detect their mutual speeds... But I have not yet investigated this line of thought.

      Defining what a photon is in a causal set is indeed particularly challenging, for the following reason.

      While the definitions of time-like and space-like sets of events are immediately available, via the notions of chain and anti-chain for partial orders, the definition of light-like path is problematic, since you do not have a continuous background where you can take limits.

      The difficulty is as follows. Consider the set of nodes forming the photon's trajectory.

      If these are in time-like relation, we have something for which Lorentz distance progresses (the Lorentz distance between two events in a causal set is the length of the longest directed path between them; this works fine, as the people in the Causal Set Programme know well, and a small sketch of the computation is given after the two cases below), but then we would have a progression of the photon's proper time, which contradicts its living on a null spacetime cone.

      If the points are in a space-like relation, no information can be carried by the photon, violating the idea that this particle is the fastest causality carrier.
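      As an aside, the Lorentz distance itself is easy to compute on a finite causal set; here is a minimal sketch, where the toy graph and all names are purely illustrative (not one of my simulated causets):

```python
from functools import lru_cache

# Toy causal set as a DAG: event -> list of immediate successors (illustrative only).
children = {
    'a': ['b', 'c'],
    'b': ['d'],
    'c': ['d'],
    'd': [],
}

def lorentz_distance(x, y):
    """Length of the longest directed path from x to y; -1 if y is not in x's causal future."""
    @lru_cache(maxsize=None)
    def longest(u):
        if u == y:
            return 0
        best = -1
        for v in children[u]:
            d = longest(v)
            if d >= 0:
                best = max(best, d + 1)
        return best
    return longest(x)

print(lorentz_distance('a', 'd'))  # 2: both a->b->d and a->c->d have two links
```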

      The attitude one assumes with the type of research I have described is to set up computational experiments and, basically, see what happens, without preconceived expectations. I find this approach justified in light of the highly creative power exhibited by emergence in simple models of computation. Of course the idea is then to establish relations between what emerges and familiar physical phenomena, as I suggested, for example, with the entanglement-like effect in Figure 3 of my essay.

      At the moment, photons and null cones appear to be still at large, in my causets.

      Hi Don,

      1) Thank you.

      2) You refer to the JOUAL 2009 conference. Four of the six regular papers presented at the event are now published in Complex Systems. I could add that one of those authors, Alex Lamb, has just submitted an interesting essay to this contest, which again matches the JOUAL focus very well! Go take a look.

      3) That's what I call the 'Teilhard conjecture' (after T. de Chardin). We are still very far from its experimental validation, but if it turned out to be false, all the experiments I am running on computational, causet-based big bangs would be, mostly, wasted CPU time...

      4) I fully agree.

      • [deleted]

      Dear Tommaso Bolognesi,

      "There exists a tiniest scale at which the fabric of spacetime appears as a pome-granate (Figure 1), made of indivisible atoms, or seeds. This view is reflected in models such as Penrose's spin networks and foams, and is adopted in theories such as Loop Quantum Gravity [14] and in the so called Causal Set Programme [6, 13]...."

      Thank you for pointing out that this is a view.

      "At that level, a universal computation keeps running. We do not know yet the program code, but, in accordance with a fundamental principle of minimality ('Occam razor'), we like to believe that it is small, at least initially."

      And it grows by what means?

      "Perhaps it is a self-modifying program: code and manipulated data might coincide. ..."

      Self? In other words by mechanical magic?

      "...This does not mean that we have to postulate the existence of a divine digital Computer that sits in some outer space and executes that code, for the same reason that, under a continuous mathematics viewpoint, we do not need a transcendental analog Computer that runs the Einstein field equations for animating spacetime. Computations may exist even without computers (and, incidentally, the concept of computation is much older than computer technology). 1"

      Of course computations may exist without computers. Your remark about "outer space" could be applied equally well to extra dimensions and other universes.

      I have printed off your essay and am going to read it; but, I must admit that your beginning appears to be ideological rather than scientific. If I am incorrect, I will learn that by reading your essay and will return to apologize.

      James

        Dear James Putnam,

        your post triggers a number of reactions, probably (and unfortunately) more than I can put in a post here.

        The first general point I need to clarify is that, in an attempt to be concise in writing the essay, I adopted a style, especially in the opening that you quote, which may indeed sound more 'assertive' than that of a standard scientific paper. On the other hand, I believe that a bit of 'assertiveness' may be helpful for stimulating discussions, in a context such as the FQXi forum.

        And in the remainder of the essay, in particular in Section 3, 'Experimental validation', I do put my views in the right perspective.

        I did not feel like using more space just to support the validity of the idea of working with discrete models such as causal sets, in light of the abundance of solid papers that share that view as a very reasonable working hypothesis, to say the least.

        You ask: By which means does a discrete spacetime grow?

        Rideout and Sorkin [Phys. Rev. D 61, 024002, 2000] discuss classical dynamics for causal set growth, using *probabilistic* techniques.

        Out of the several ways I have explored (by simulations) for obtaining causal sets by *deterministic algorithms*, the one that I personally consider most attractive consists in letting a 'stateless ant' walk on a trivalent graph G while applying, step by step, one out of two possible graph rewrite rules (known also in Loop Quantum Gravity, and used by Smolin, Markopoulou & friends). G can be equated to a growing space, while spacetime corresponds to the causal set of the computation being performed on it (the relevant references are [1-5], [8], [14] and [17]).

        One of the two graph rewrite rules introduces a new node in G, so this rule would be responsible for the growth of space. But spacetime (the causal set) grows at *every* rewrite rule application, since:

        1 rewrite rule application = 1 computation step = 1 node (event) added to the causal set (spacetime).
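        Just to make the bookkeeping concrete, here is a deliberately simplified sketch. It is a generic stand-in (an 'ant' flipping bits on a small cyclic array), NOT the trivalent-graph rewrite rules of the references above, but it shows how each computation step contributes exactly one event, causally linked to the events it depends on:

```python
def grow_causal_set(n_cells=8, n_steps=12):
    """Toy 'ant' computation; returns the directed edges of the induced causal set."""
    cells = [0] * n_cells            # a small cyclic 'space' (fixed size, for simplicity)
    last_writer = [None] * n_cells   # which event last wrote each cell
    edges, prev_event, pos = [], None, 0
    for event in range(n_steps):     # 1 rewrite application = 1 step = 1 new event
        if prev_event is not None:
            edges.append((prev_event, event))            # link to the ant's previous event
        if last_writer[pos] is not None and last_writer[pos] != prev_event:
            edges.append((last_writer[pos], event))      # link to the last writer of this cell
        cells[pos] ^= 1              # the 'rewrite': flip the visited cell
        last_writer[pos] = event
        pos = (pos + 1) % n_cells if cells[pos] else (pos - 1) % n_cells
        prev_event = event
    return edges

print(grow_causal_set())
```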

        In this respect, my approach is indeed in contrast with the following statement from your essay:

        "The universe evolves, but not because simplicity can generate complexity. Complexity can come only from pre-existing complexity. The greatest possible effects of complexity in the universe exist right from its start in a potential state."

        I tend to disagree on that, based on the phenomenon of deterministic chaos, or algorithmic randomness. The best example of this is perhaps Wolfram's elementary cellular automaton no. 30: in spite of the elementary initial condition and the very simple rule (coded by a Boolean function of three variables), the computation dynamics exhibit a surprising pseudo-random character, as if information were continuously injected into the process from outside.
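        For readers who want to see this on screen, here is a minimal sketch of rule 30 (with a wraparound boundary as a simplification); the central column already looks random after a handful of steps:

```python
RULE, WIDTH, STEPS = 30, 63, 20

row = [0] * WIDTH
row[WIDTH // 2] = 1                  # elementary initial condition: a single black cell

for _ in range(STEPS):
    print(''.join('#' if c else '.' for c in row))
    # each new cell is the rule bit selected by its (left, center, right) neighborhood
    row = [(RULE >> ((row[i - 1] << 2) | (row[i] << 1) | row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]
```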

        I like to believe (ideology again!) that our universe started from nothing, or almost nothing, and that space, spacetime, and complexity did increase: they were not 'already there' at the beginning, unless you mean that all the complexity of the mentioned rule 30 computation is 'potentially' present in the few bits that code its initial condition and Boolean function.

        I remember a workshop at the Phys. Dept. of the Padova Univ. (Jan 15, 2008) with lively discussions on String Theory vs. Loop Quantum Gravity. During the panel, Gabriele Veneziano and Carlo Rovelli did agree on one thing: they both were unable to point out a crucial validation/falsification experiment for their respective theories. I believe I am in good company in claiming that, in physics, both imaginative hypotheses and accurate experimental verification are necessary. If you call the former 'ideological', then I am afraid my essay has a lot of ideology in it. And, although I also provide simulation results (more in the references) that suggest interesting analogies with physical phenomena (e.g. the emergence of interacting objects!), my research on causal sets (as well as that of many others) is still very far from precise numerical predictions.

        Since this is getting long, I answer the remark on the self-modifying program in the next post. Looking forward to your comments after reading the essay. Thanks for your stimulating remarks!

        Tommaso

        Dear James,

        you quote my essay:

        "Perhaps it is a self-modifying program: code and manipulated data might coincide. ..."

        and ask:

        Self? In other words by mechanical magic?

        The funny thing is that I had removed this line from the quasi-final version of the essay, because I did not have enough space for expanding on it. But then I put it back, being ready to accept questions or criticism. Thank you for giving me an opportunity to explain.

        The idea of a self-modifying program is, again, an attempt to satisfy the requirement of minimality. While I support the computational universe idea, what I find a bit annoying is the separation between (i) a (fixed) program P and (ii) the data D that it manipulates. Under this view, D represents our Reality, while P would be the rule that governs it, without itself enjoying the status of a Real entity. They are two things, and one of them is even 'unreal'. Two is bigger than one. If the program operates on itself, we would have only one thing: P = D = Reality. That would be more elegant. I believe that the Mathematical Universe idea by Max Tegmark also achieves this unity: there's only one thing, namely a mathematical structure.

        By the way, the concept of a self-modifying program is quite familiar in computer science, e.g. in logic programming (in the Prolog language etc.). Furthermore, self-reference is a recurrent concept when dealing with formal systems (Gödel's theorem) and computation (universal Turing machines), not to mention consciousness. I would not be surprised at all if it played a crucial role in an ultimate, computational theory of physics.
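        Just to make 'code = data' concrete with a deliberately silly toy (plain Python, nothing to do with my causal set experiments): the 'tape' below holds both the rule, stored as a string, and the data, and executing the rule updates the data and rewrites the rule itself.

```python
# tape[0] is the 'program' (a string of statements), tape[1] is the 'data'.
tape = ["tape[1] += 1; tape[0] = tape[0].replace('+= 1', '+= 2')", 0]

for step in range(3):
    exec(tape[0])        # the rule acts on the tape, i.e. on the data AND on itself
    print(step, tape)
```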

        If you claim that there is something magic in a program that modifies itself (= Reality), then I'd expect you to claim the same for a program that modifies 'external' data (= Reality). In my opinion, there is no more magic in a program that runs our Reality, than in a set of differential equations that does essentially the same thing.

        Cheers. Tommaso

        PS

        So far I've done only a few experiments on self-modifying Turing machines, without exciting results. In all the experiments mentioned in the essay, data and program are separated, and the latter is fixed.

        • [deleted]

        Yes of course, and a micro BH also, because the singularity says that.

        Me too, I am a musician and poet, and my father was a bus driver and now he has died. And after that, at the age of 20 I was in a coma. No but frankly, hihihihii, several papers and this and that... a big joke, yes, and big pub.

        Hop, under review. Christi, hiihhi, you see I play everywhere like a child now. I love this platform and the vanities of scientists.

        Computer vs. rationality of our universe. Big Bangs with an S? No but frankly, you simulate what, a universe or your universe? A big joke, all that. It's just computing, not physics. On that, goodbye.

        Don't be offended, I am just a little crazy. Hop, I am going to take my meds. Until soon.

        Best

        Steve

        Dear Tommaso,

        Perhaps it would help to bring up the contribution of Alan Turing and his concept of universality, which unifies data and programs. While one can think of a machine input as data and a machine as a program, each as separate entities, Turing proved that there is a general class of machines of the same type (defined in the same terms) capable of accepting descriptions of any other machine and simulating their evolution on any input, hence taking program+data as data and unifying both.

        This is why one can investigate the 'computational universe' today either by following an enumeration of Turing machines, or by using one (universal Turing) machine running an enumeration of programs as data inputs. The two approaches are exactly the same thanks to Turing's universality.

        Best.

        - Hector Zenil

        • [deleted]

        Dear Tommaso Bolognesi,

        I have printed your response. I am impressed. Your response is not in agreement with me; but that is a minor point. Your response was directed at my questions and even referred to my own essay. I appreciate your time and effort in putting your response together. I will follow the leads you referenced. I will respond when I put something together worth your time.

        James

        5 days later
        • [deleted]

        Tommaso,

        Thanks for a fascinating and extremely well constructed essay. Since Wolfram is scheduled to speak at ICCS in Boston this summer, I think it might be interesting to see how your multi-level hierarchy compares to Bar-Yam's multiscale variety -- hierarchies of emergence vs. lateral distribution of information.

        Interesting conceptual equation, "spacetime geometry = order + number." Suppose one were to make another equation: "order = organization feedback." Then one would get -- substituting terms in my equation for yours -- the theme of my ICCS 2006 paper ("self-organization in real and complex analysis") that begs self-organization of the field of complex numbers, z, in the closed algebra of C.

        One more comment (though I could go on; your paper is rich in quotable points), concerning global and local (4.1) time-dependent relations among point particles. Research in communication network dynamics (e.g., Braha--Bar-Yam 2006, Complexity vol. 12) shows often radical shifts in hub-to-node connectivity over short time intervals, while in the aggregate the system changes very little. Taking point particles as network nodes, perhaps something the same or similar is happening.

        Good luck in the contest. (I also have an entry.) I expect that you will rank deservedly high.

        All best,

        Tom

          Dear Tom,

          thanks for the positive comments. Following your links I reached Robert Laughlin's 2005 book 'A Different Universe: Reinventing Physics from the Bottom Down', in which emergence is given an important role in theoretical physics. Good to hear; another book on the pile!

          In the equation 'spacetime geometry = order plus number', introduced by people in the Causal Set programme, 'number' simply refers to counting the number of events in a region of the causal set, which is then equated to the volume of that region. And 'order' is the partial order among events. You mention self-organization in the context of the field of complex numbers, and this does not seem much related to 'number' in the above sense (if this is what you meant to suggest). But of course I am curious about everything that has to do with self-organization.

          Usually a self-organizing system is conceived as a multitude of simple active entities. Does this happen in your ICCS 2006 paper?

          One peculiarity of the 'ant-based' (or Turing-machine-like) approach discussed in my essay is that you actually have only ONE active entity -- the 'ant' -- and expect everything else to emerge, including the multitude of interacting particles or entities that one normally places at the bottom of the hierarchy of emergence.

          Ciao. Tommaso

          • [deleted]

          Dear Tommaso,

          Thank you for your essay. You write a lot about emergence and computation, chaos, self-organization and automata: all elements I have touched on in my essay, because they are closely connected to the evolution of the spacetime concept.

          E.g. you write: "Computations may exist even without computers". It seems to have something in common with Computational LQG by Paola Zizzi. In my essay I have even quoted Paola.

          My own view is that the universe is a dissipative coupled system that exhibits self-organized criticality. This structured criticality is a property of complex systems where small events may trigger larger events. It is a kind of chaos where the general behavior of the system can be modeled on one scale, while smaller- and larger-scale behaviors remain unpredictable. A simple example of the phenomenon is a pile of sand.

          While QM and GR are computable and deterministic, the evolution of the universe (a naturally evolving, self-organized critical system) is non-computable and non-deterministic. This does not mean that computability and determinism are related: Roger Penrose argues that computability and determinism are different things.

          Let me try to summarize: the actual universe is computable within the Lyapunov time, so it is digital; but its evolution is non-computable, so it remains at the same time analog (the Lyapunov time is the length of time it takes for a dynamical system to become chaotic).

          Your work seems to be an attempt to develop a computable model of the universe within the Lyapunov time. Good luck!

          Jacek

          • [deleted]

          Ciao Tommaso,

          Yes, I do mean to suggest that the non-ordered set z (the universal set of complex numbers) is organized to allow not a partial order of events, but a well-ordered sequence in the specified domain of topology and scale, with analytic continuation over n-dimensional manifolds. It is nontrivial that this is accomplished without appeal to Zorn's lemma (axiom of choice). And time is given a specifically physical definition. I followed up at ICCS 2007 with a nonmathematical paper ("Time, change and self-organization") that incorporated and expanded on some of these results.

          You pick up right away the difference between the hierarchical distribution of information and multiscale variety. I am thinking that your "multitude of entities" may be dual to the "ant" analogy, because with activities occurring at varying rates at different scales, new hierarchies may form and feed back into the system dynamics.

          You know, Boston is very beautiful in the summer. :-)

          All best,

          Tom

          Hi again Tommaso,

          Just to let you know, I dropped you a comment on Feb. 18, 2011 @ 21:07 GMT concerning the data vs. code question, in case you hadn't seen it.

          Best.