Mr. Klingman,

glad you enjoyed my essay. I see that you are a prolific author and consider yourself an authority in both physics and computer hardware. And you think that my view of the universe as an immense computer is not deep enough. Well, your opinion of my view is based on an assumption that has no basis in reality. For your information, I started to entertain this idea only after I had studied biology and physiology long enough to become convinced that life is a program. Once you make this leap, everything else follows logically.

So I understand very well how you, having never studied life sciences, may have difficulty coming to the same view. Knowing physics and computers may not be enough to come to this conclusion, just as it was not enough for me until I took on biology.

Take care!

Hi Ian,

Thanks for your comments! For your first point about time-symmetry, I don't think this is necessarily an NSU/LSU issue at all, as both viewpoints can be consistent with time-symmetry (see NSU classical physics). But LSU is more "automatically" time-symmetric, and in the particular case of QM, NSU approaches are forced into time-asymmetric stories, while LSU approaches aren't. That's not surprising; the NSU assumes a causal arrow of time, so it's more likely to diverge from pure time-symmetry than the LSU.

Now, I know you're on Eddington's side of the fence concerning the need for a fundamental asymmetry to explain the large-scale arrows of time. But I'm perfectly happy with the more standard explanation, where it's our proximity to one particular cosmological boundary condition (the big bang) that is fully responsible for all those arrows, even given CPT-symmetric laws. After all, as you zoom down to microscopic scales, phenomena get *more*, not less, time-symmetric. So it's baffling to me why anyone would want the laws that apply at small-scales to be even *less* time-symmetric than large-scale classical laws. Maybe this is the assumption that all large-scale behavior must result from small-scale behavior that George Ellis is rightly complaining about. Anyways, I think I did make the point in the essay that the LSU helps to force the preparation and the measurement to be time-reverses of each other, even down to the way we impose boundary conditions.

Any other reason you felt "unfulfilled" by the conclusion, or was it mainly the issue in that footnote? I expect it's also because I don't yet have a working model that exhibits all these features, but I'm getting there... :-)

Your point about anthropocentrism is interesting, but you sort of mixed in another point about realism. To be crystal clear: I am a realist. Something objective exists. In fact, I'm a "spacetime realist", as I'm interested in (only!) entities associated with particular points on the spacetime manifold. Thus my disinterest in approaches where entities that live in configuration space are somehow viewed as "real".

Which leads into my follow-up question as to exactly what you mean by models that are "at least partially correct". If a classical stat mech physicist saw the usefulness of configuration spaces, and concluded that the fundamental entities in the universe were classical partition functions that lived in such huge-dimensional spaces (rather than spacetime), and built theories around those entities, would that count as "partially correct"? (I'm sure you see where I'm going here, but regardless, I think the real question is which approximations and misapprehensions are leading us away from deeper, more fundamental discoveries.)

Finally, when it comes to entropy, I'm actually on board with you to a large extent. So long as the Big Bang is part of a cosmological boundary condition (a logical input rather than a logical output, to use my essay's language), I have no trouble with the gravitational degrees of freedom being so tightly constrained. And the "disorder" language, granted, is imprecise -- and to some extent meaningless at the fine-grained realistic level that I'm pursuing.

Thanks again!

Ken

Hao Yau,

Thanks for the pointer to your essay. I think the clearest connection is that you're interested in the second-order Klein-Gordon equation. For my earlier not-quite-LSU take on this equation, you might see my reference [8]. I've actually backed away from this approach to some extent, but still think Klein-Gordon is far preferable to the Schrodinger equation, and that everyone tends to misinterpret the so-called "negative frequency" solutions by viewing them in the light of a totally different equation. So if anything in [8] strikes a chord with your efforts, let me know and I might be able to at least steer you away from my various failed ideas... :-)

Best,

Ken

Hi Jerzy,

I'm glad you think my thesis is obvious, but I wager you're in the minority.

Also, it sounds like you agree for quite different reasons than the ones I'm using... Arguing for under-determined universes as compared to determined universes is a somewhat different issue than NSU/LSU. (After all, classical action extremization is an LSU approach that leads to just as "determined" a result as NSU equations of motion.)

That said, these days I am on the fundamentally-underdetermined side of the fence, so I guess we agree after all. :-)

Cheers,

Ken

Hi Pentcho,

I actually quite like GR, but I grant that the details may be wrong. Maybe one needs to add a new field or two, or even a few new dimensions, or even something stranger.

But the point is that GR (and such extensions) are our best models of the structure of our universe. Ignoring this structure when building up a fundamental quantum theory seems reckless.

Best,

Ken

Ms. Vasilyeva,

Quite an assumption: "So I understand very well how you, having never studied life sciences, may have difficulty coming to the same view."

You are correct that I have not published any books on the life sciences, but from 2001 to 2006 I took UC Berkeley and UC Santa Cruz university extension courses completely covering Bruce Alberts' "Molecular Biology of the Cell", as well as courses in Proteomics, Immunology, Epigenetics and Embryogenesis. This was done "just for fun", but still, I don't feel ignorant of biology.

My point was that "us physicists" are not just a "headstrong bunch" but are also an almost uniquely curious and well-rounded bunch of people, and any **assumption** that we are ignorant is not likely to be true. As I've mentioned in other comments, everyone who submits an essay in this contest tends to feel that "they've figured it out", yet probably some of us are wrong. I'm glad that you've figured it out.

I won't intrude on Ken's space any more, so you get the last word...

Best,

Edwin Eugene Klingman

The inconsistent theory, if adopted, is much more dangerous for science than the false but consistent theory. The latter is easily falsifiable; the former easily overcomes any hurdle, whether logical or experimental. Peter Hayes has explained this quite nicely:

Peter Hayes, "The Ideology of Relativity: The Case of the Clock Paradox", Social Epistemology, Volume 23, Issue 1, January 2009, pages 57-78: "In the interwar period there was a significant school of thought that repudiated Einstein's theory of relativity on the grounds that it contained elementary inconsistencies. Some of these critics held extreme right-wing and anti-Semitic views, and this has tended to discredit their technical objections to relativity as being scientifically shallow. This paper investigates an alternative possibility: that the critics were right and that the success of Einstein's theory in overcoming them was due to its strengths as an ideology rather than as a science. The clock paradox illustrates how relativity theory does indeed contain inconsistencies that make it scientifically problematic. These same inconsistencies, however, make the theory ideologically powerful."

Pentcho Valev


Dear Ken Wharton,

Ian Durham wrote: "we may all view slightly different realities". My notion of reality is different, and I guess you too believe in just one objective reality.

You did not respond to my curiosity concerning superdeterminism and Huw Price. Perhaps these are not important. I am not really interested in questions like ultrafinitism.

I would rather appreciate at least one serious argument against my reasoning, which runs contrary to your position and which I tried to make immediately obvious in my Figures.

Just off topic: in Germany, police and intelligence failed for many years to become aware of an NSU (National Socialist Underground) that committed murders. They were misled by mysterious female DNA that did not belong to the criminals. It happens that mysterious things have a simple explanation.

Sincerely,

Eckard

Hi LC,

Thanks for the interesting comments.

>A Lagrangian model of a physics system has the initial state and the final state specified. This carries over to the quantum path integral, where the extremization procedure is generalized to a variational calculus on many paths with quantum amplitudes. The analyst is then faced with the task of finding some dynamical process or quantum evolution which maps the initial state of the system to the final state. The analyst will then use what tools they have in their box to solve this problem.

Ah, but this is my point: there may not *be* some master dynamical process that takes the initial state to the final state, so such an analyst that you describe is implicitly assuming a "NSU". Meanwhile, an LSU analyst will have a broader set of tools to use, as the intermediate solution can take the future constraint into account.

>With the universe in total this picture becomes more problematic. If we are to think of an observer as reading the output, there is no boundary region where this observer can read this output without that procedure also being a part of the "computation".

Yes, exactly. In that case I think it only makes sense to think of the end/asymptotic state of the universe as a "logical input" (even though it's a final boundary condition), just as I argue that quantum measurements should be viewed as boundary constraints on subsystems. It would be nice to have a conceptual framework that would work the same way for both subsystems and the universe as a whole.

>The problem of establishing a Cauchy region of initial and final data is much more difficult to work.

You may be interested in my response to Sean Gryb above in regards to the thick-sandwich problem in GR, and Sean's "Oracle"; looking for (unique) solutions to Cauchy-type problems may be more restrictive than is strictly necessary.

Thanks again!

Ken

Hector,

We're in complete agreement on your first point: there's no guarantee that models will map to a particular implementation. But to me, that's all the more reason to not limit ourselves to models that are only computable via some temporally-linear fashion (where the algorithm is completely blind to its eventual output).

As for your statement, "I think it is far from obvious that science assumes that a particular implementation of a mathematical model is the specific way nature operates.", I also agree, and think Spekkens' essay makes some useful points in this regard. But if you replace the word "particular" with "Newtonian Schema" (as defined in my essay), I think the situation changes. To me, anyway, it's become strikingly obvious that most scientists assume the only models worth considering are those that might have temporally-linear algorithmic implementations. Meanwhile, promising LSU-style models, which have no such implementation, are not explored.

As you say, the proof will be a "loophole"-free implementation. I just hope that if such a model is developed, it won't be ruled out merely on the grounds that it's not in an NSU framework.

Best,

Ken


Dear Ken Wharton,

Hector Zenil finds it difficult, in light of your own arguments, to reject the idea that the universe is a quantum computer. Am I right that this conjecture cannot be experimentally falsified until quantum computers that work as envisioned are available?

Eckard Blumschein

I'm not sure I want to wade into this thread, but I would like to assure M.V. that I'm certainly not speaking on behalf of all physicists, and indeed my main point is very much out of the mainstream (or else the NSU wouldn't be such a pervasive assumption). My view is also in opposition to the 'shut up and calculate' position, in that I'm asking questions about what is happening between measurements that such an operationalist refuses to answer.

Best,

Ken

Hi Georgina,

Thank you for your kind comments. As for whether the 'universe as computer' claim is prevalent or not, you're probably right that many physicists would not say that they take such a position. But when pressed, many of those physicists retreat to the question: "But what would it mean for the universe *not* to be a computer?" And being in an NSU-mindset, and not hearing much of an answer from the universe-as-computer proponents, they may conclude that such a claim is basically meaningless.

So I hope that at the very least, my essay draws some lines on which one can have a meaningful debate. Given that there is an objective difference between NSU- and LSU-based models, linking the "universe as computer" to the NSU provides an answer to the above question. (One such non-computer-universe is the LSU, a universe that is some solution to a 4D boundary-value problem.)

The point is that many physicists who wouldn't say "the universe is a computer" still assume the NSU is the right framework to best explain the way things really "work". It's the NSU that's the prevalent assumption (or that's the way I see it, anyway).

Cheers!

Ken

Hi Eckard,

Thanks for your comments. While we certainly do disagree on the role of time-symmetry, I don't think our disagreements are *quite* as profound as you made them sound. I'm also interested in describing one objective reality, and I also think we have accidentally blundered into foundational mistakes. (Although, granted, we disagree on which mistakes those are...)

If you're pursuing time-asymmetric interpretations, then it would probably be time well spent to read Huw Price's book, "Time's Arrow and Archimedes' Point". Price lays out the key issues on this topic in a clear and accessible manner, and poses concrete challenges that anyone pursuing your approach should directly address. His discussion of superdeterminism in that book is what I had in mind in that earlier post, although you can find a summary of the point in this recent profile piece. In a nutshell, merely postulating correlations between a future choice F and a past hidden variable P doesn't require a common cause in the past of both F and P, if one instead entertains straight retrocausal influences from F to P. In this way, the choice at F can still be "free", in every meaningful sense of the word.

Also, we don't have to wait to build a quantum computer to address the question of whether the universe is an NSU-style quantum computer. In fact, I don't see any new insights that would come out of actually having such a computer; see my response to James Lee Hoover above.

I hope to get to your essay at some point, and if I have any comments that I think you might find useful, I'll post them on your essay thread.

Best,

Ken


Ken Wharton,

Congratulations, your excellent essay is in the top 35 essays of this contest.

In spite of the excellent write-up in your essay, its title remains highly misleading. When you say "the Universe is not a computer", it implies that some physicists actually believe the Universe is a computer, or a machine, or an engine, etc.

However, most physicists know that the state of the Universe undergoes causal evolution through dynamic interactions among its constituent particles and fields. We humans, in our quest for grasping the physical phenomena, have developed various mathematical models to represent this causal evolution of physical phenomena. The computations which you are attributing to the Universe are essentially the operative parts of our mathematical models and not of the Universe. When you say that Universe is not a computer, you are actually implying that your current mathematical models, representing physical reality, are incapable of computing causal evolution of ALL types of physical phenomena. This inability refers to the weakness of your current mathematical models and not to the Universe itself.

In light of the above clarifications, you are requested to kindly summarize, in a few sentences, which of our 'Basic Physical Assumptions' are wrong in your opinion.

Anonymous

    Dear Anonymous,

    I do recognize that the phrase "the universe is a computer" will be interpreted differently by different people, and I agree that this makes the title somewhat imprecise (when taken on its own). But hopefully it is not *too* misleading; some physicists, at least, interpret this phrase as I'm defining it in the essay abstract and body (using the more-precise concept of the "NSU").

    So, to answer your question, the incorrect Basic Physical Assumption is that the fundamental "rules" that govern our universe are "computer-like", in that they causally evolve an initial "input state" to generate future states. Because of this assumption, we physicists tend to only look at mathematical models that work in the same manner -- aka the Newtonian Schema.

    Hand-in-hand with this assumption is another, more subtle one: The notion that the non-NSU, Lagrangian-style approach is not a valid Schema in its own right, at least when it comes to looking for a fundamental explanation of the evident correlations across space and time. Without giving up this second assumption, it's hard to give up the first.
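To make the NSU/LSU contrast concrete, here is a toy numerical sketch (my own illustration, not from the essay, with made-up discretization choices) of the same physical problem, a harmonic oscillator x'' = -x, solved Newtonian-Schema style by marching forward from initial data, and Lagrangian-Schema style by imposing values at *both* temporal boundaries and solving for the whole history at once:

```python
# Toy contrast between the two schemas for the harmonic oscillator x'' = -x.
# (Illustrative sketch only; the step sizes and iteration counts are arbitrary.)

N, dt = 100, 0.02

# NSU style: march forward in time from initial data x(0)=1, v(0)=0.
x, v = 1.0, 0.0
nsu = [x]
for _ in range(N):
    v -= x * dt            # semi-implicit Euler step
    x += v * dt
    nsu.append(x)

# LSU style: impose x at BOTH temporal boundaries and relax the whole
# history at once toward the discrete equation x[i+1]-2x[i]+x[i-1] = -dt^2 x[i].
lsu = [0.0] * (N + 1)
lsu[0], lsu[N] = 1.0, nsu[N]           # same two boundary values
for _ in range(20000):                 # Gauss-Seidel relaxation sweeps
    for i in range(1, N):
        lsu[i] = (lsu[i - 1] + lsu[i + 1]) / (2.0 - dt * dt)

# Both schemas pick out (essentially) the same classical history,
# but only the first one "computes" it by time-evolving an input state.
assert max(abs(a - b) for a, b in zip(nsu, lsu)) < 0.01
```

The point of the sketch is not the numerics but the logic: the second solver never evolves an "input state" forward at all, yet it arrives at the same history.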

    All the Best,

    Ken


    Dear Ken Wharton,

    Price adds: "Should we assume that particles in physics know about the past but not the future? ..."

    Of course not, because particles do not know anything. That style of question is as clever as in the case of "Is the universe a computer?"

    Is it a foundational question? Time will tell future generations whether back-causation is more than a silly hope for explaining experiments that were possibly designed incorrectly or misinterpreted. I see back-causation not as a new idea but as related to an old, futile belief. Ritz and Einstein agreed to disagree. If my interpretation of my Fig. 5 is correct, then both were possibly misled.

    I cannot expect any mercy: not from those who simply trust in the correctness of current mainstream mathematics and physics, and also not from those like you who put even the direction of causality in question in order to remedy problems that seem to have no solution as long as certain tenets remain taboo.

    Huw Price reiterated what I consider imprecise and misleading: "Classical physics is symmetrical." I agree with George Ellis: Differential equations are not the primary models of reality.

    You are right; we may agree on a lot. However, I do not expect you to have a serious argument against 1364.

    Congratulations to you personally, though without my approval of back-causation.

    Eckard

    If you do not understand why your rating dropped, here is how ratings in this contest appear to be calculated. Suppose your rating is [math]R_1[/math] and [math]N_1[/math] people have rated you, so you have [math]S_1 = R_1 N_1[/math] points. If someone then gives you [math]dS[/math] points, you have [math]S_2 = S_1 + dS[/math] points from [math]N_2 = N_1 + 1[/math] raters, and your new rating satisfies [math]S_2 = R_2 N_2[/math]. For [math]R_2 > R_1[/math] you need [math]S_2 / N_2 > S_1 / N_1[/math], that is, [math](S_1 + dS) / (N_1 + 1) > S_1 / N_1[/math], which reduces to [math]dS > S_1 / N_1 = R_1[/math]. In other words, to increase someone's rating you must give them more points [math]dS[/math] than their rating [math]R_1[/math] was at the moment you rated them. This is why some participants are confused by what has happened to their ratings. Moreover, since community ratings are hidden, some participants, unsure how to increase others' ratings, give the maximum 10 points; in that case the 1-to-10 scale stops working, so some essays are overrated and others drop down. In my opinion this is a serious problem with the Contest's rating process, and I hope the FQXi community will change it.
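The update rule above is just averaging arithmetic; a short sketch makes it explicit (illustrative only, with made-up numbers; this is my reading of the rule, not FQXi's actual code):

```python
# Sketch of the contest rating update: a new vote of ds points moves the
# average up only if ds exceeds the current rating R1.

def updated_rating(r1: float, n1: int, ds: float) -> float:
    """Return the new average rating after one more vote of ds points."""
    s1 = r1 * n1          # total points so far: S1 = R1 * N1
    s2 = s1 + ds          # total after the new vote: S2 = S1 + dS
    n2 = n1 + 1           # one more rater: N2 = N1 + 1
    return s2 / n2        # new average rating R2 = S2 / N2

# A vote above the current rating raises it; a vote below lowers it.
assert updated_rating(7.0, 9, 10.0) > 7.0   # dS = 10 > R1 = 7, rating rises
assert updated_rating(7.0, 9, 5.0) < 7.0    # dS = 5 < R1 = 7, rating falls
```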

    Sergey Fedosin

    Ken,

    Your essay, and your clear framing of the NSU and LSU was the intellectual equivalent of getting hit with a Taser. Coming from having spent years building a Schrodinger-based conceptual framework, I went into a kind of mental paralysis after reading your essay.

    I've long intuited that least-action must be one of the most fundamental aspects of Nature, with *something* akin to a Wheeler-Feynman absorber mechanism, but there seemed to be little room for this in the Schrodinger frameworks. I did quick reads of every arxiv paper of yours and am still absorbing the contents.

    As this is the end-game for this contest, I'm not expecting a reply, but felt it worth tossing out a few things to consider:

    1) If you are in need of arguments in support of the LSU: many EPR experiments utilize transparent materials, which, going by Feynman's QED explanations, seem very likely to be similar to the laser cavities you bring up. It has often bugged me that fragile entanglements are not destroyed by all the quantum jumping up and down in a lens or a beam splitter, and yet EPR papers also state that "any interaction with the environment" destroys entanglement. (My essay is a bit of a disaster, but I was quite clear that passing through a beamsplitter is fundamentally different from a measurement ... the LSU might help explain why!)

    2) If you haven't already, it might be interesting to see how the LSU applies to such beasts as Cooper pairs in superconductors. When proposing new viewpoints, or testing new theories, my motto is: Think Crazy, Prove Yourself Wrong. The faster you can break your own model, the faster you learn how to fix it. Since discovering that Cooper pairs have spatial separation but remain correlated, I have found them a good place to test concepts. This will either strengthen the LSU or cause head scratching!

    I'm not completely sold on the LSU, but that could well be because it has been almost a month since I've read your paper, and I'm still feeling the effects of being tasered by the mathematical gymnastics required to internalize the impact.

    I look forward to following your research and think your concepts, at the bare minimum, provide a *much* needed new perspective for fostering debate.

    Dean

      Dear Ken

      It is easy to follow derivations where we use energy, Hamiltonians, and so on, because we can imagine energy, momentum, etc. But when I try to follow a derivation with a Lagrangian, I do not know what to imagine -- it is some type of energy, but what type? Feynman's book "QED: The Strange Theory of Light and Matter" gives some intuition about the optimization of the Lagrangian. You also gave some useful examples. I hope someone will clarify Lagrangians as much as possible.
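For what it's worth, one concrete way to "imagine" the Lagrangian is numerically: for a free particle, the classical path between fixed endpoints is the one that minimizes the discretized action. A minimal sketch (my own illustration, not from this thread; the perturbation shape is arbitrary):

```python
import math

# For a free particle of unit mass in 1D, L = (1/2) v^2 and the action is
# S = sum of L * dt over the path. The straight-line path between fixed
# endpoints has smaller action than any wiggly path with the same endpoints.

def action(path, dt):
    """Discrete action sum for a 1D path sampled at interval dt."""
    s = 0.0
    for i in range(len(path) - 1):
        v = (path[i + 1] - path[i]) / dt   # finite-difference velocity
        s += 0.5 * v * v * dt              # contribution L * dt
    return s

dt, n = 0.1, 11
t = [i * dt for i in range(n)]
straight = t[:]                                         # x(t) = t: classical path
wiggly = [x + 0.05 * math.sin(math.pi * x) for x in t]  # same endpoints, perturbed

assert action(straight, dt) < action(wiggly, dt)   # least action wins
```

The sin(pi*x) perturbation vanishes at both endpoints, so both paths satisfy the same boundary conditions; only the classical (straight) history extremizes the action.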

      You can see in my article what I consider the right way to imagine derivations in physics.

      In my essay I speculate about the importance of the uncertainty principle. I claim that its simplest derivation shows that this principle is more fundamental than wave functions, and that wave functions in quantum gravity are not important. Do you have any idea how, with a Lagrangian, one could derive the uncertainty principle more simply and then use it further? Do you know Cramer's interpretation of quantum mechanics? It also shows that the "future" influences the past. (A photon does a handshake and then flies; it is a comparison of all paths.) Such an approach is also used in Mark Hadley's theory about a gravitational explanation of quantum mechanics. Namely, they do not need the simple "input-output schema" from past to future.

      Best regards Janko Kokosar