Torsten,

Thank you for the comments... I actually quite like the continuum, although that probably didn't come across in this essay (see my last contest entry for more details). The question of whether EPR/Bell-inequality experiments can be explained by hidden variables actually depends on whether one is in an NSU or LSU framework: it works in the latter but not the former. Unfortunately, most analyses simply assume the NSU, which has biased many people against hidden variables, but that option opens up again in the LSU.

Best,

Ken

James,

Thanks! I'm not sure I understand your point, but even if one had a working quantum computer, one couldn't "model" quantum phenomena with it; one could merely "reproduce" quantum phenomena. And, unfortunately, such a computer would give no more information about what was happening between quantum measurements than the bare phenomena themselves, because one can't make an "extra" intermediate measurement in the quantum computer without destroying the match to the actual phenomena in the first place. (Weak measurements aside for now...)

Really, a quantum computer would *be* a quantum phenomenon, in and of itself. It would not be a useful "model", in that it would not add to our understanding... any more than pointing out that a star is an excellent model of that same star would add to our understanding.

Please let me know if I'm totally off-target with your intended comment!

Cheers,

Ken

Hi Sean! Yes, of course I remember you, and thanks for your very thoughtful comments.

Your GR points are all perfectly valid, and you're right that the LSU-style "thick sandwich problem" is unsolved (specifically, the question of whether there is one unique classical solution for a given closed-hypersurface boundary metric, and how to find it). But the "problematic questions" that I had in mind were issues involving the intersection of GR with QM... namely, the question of whether one can even impose all of the needed boundary data without violating the HUP. (If not even the Big Bang can beat the HUP, having a formal solution of Newtonian-GR for a HUP-violating initial boundary seems sort of useless to me, even leaving aside whether "lapse functions" are natural to include in an initial boundary to begin with.)

Even though the thick-sandwich problem is clearly in the LSU camp, the way it's normally phrased makes it clear that it was developed in an NSU mindset. After all, who says that there has to be only one unique solution? (Especially given what we know about QM.) I'd be far more interested in a result that showed *many* solutions for a given boundary, of which our actual universe would be merely one possibility. (And yet, if this result were discovered, I think many physicists would view it as a failure.) At the end of the day, though, it's not the job of LSU approaches to recover classical results, or even NSU-like results. Hopefully it will lead to *new* insights that actually look like our GR+QM universe.

As for the measurement problem, the clearest discussion I've written so far is my entry in the previous FQXi contest, but even after reading that you'll still probably have many of the same questions. A stronger argument will require a fully fleshed-out toy model, one that I'm still plugging away at, but for now I'll leave you with the following thoughts.

If I'm right that configuration spaces live in our heads, not in reality, then we need to delve deeper into the foundations of the path integral to find a realistic LSU story. The good news is that the action itself lives in spacetime. Sinha and Sorkin (1991) pointed out that by doubling the particle path integral (folding the paths back on themselves in time), you no longer need to square the integral to get out probabilities -- and at this point it almost looks like the configuration spaces used in stat mech (which I assume you'll agree would be perfectly acceptable for a realistic theory). But those particle paths still need negative probabilities to get interference. However, fields can interfere in spacetime, so extending these ideas to fields can arguably solve this problem (although this will require further alterations of the path integral, with some master restriction on field configurations to make it mathematically well-defined).

The last piece of the puzzle -- how to objectively define an external measurement in the first place -- is addressed in the previous contest entry. The key is that a measurement on a subsystem is not merely the future boundary constraint itself, but also the chain of correlations that links it to the cosmological boundary. Thus, in a quantum eraser experiment, the future chain is broken, and all observers eventually concur that no "measurement" was ever made in the first place. Note this only works in an LSU; in a realistic NSU theory, one needs to know "right away" whether a given interaction is a measurement or not.
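To sketch the Sinha-Sorkin point schematically (my own shorthand notation here, not their exact formulation): squaring the usual sum-over-paths is equivalent to a single, unsquared sum over *pairs* of paths.

```latex
P \;=\; \left| \int \mathcal{D}x \; e^{\,i S[x]} \right|^{2}
  \;=\; \int \mathcal{D}x \int \mathcal{D}x' \; e^{\,i \left( S[x] - S[x'] \right)}
```

Each pair (x, x') can be read as one path that runs forward and then folds back on itself in time, so probabilities come out directly, without squaring. The catch, as noted above, is that the pair-weights exp(i(S[x]-S[x'])) are not non-negative, which is exactly where the negative probabilities for particle paths enter.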

For your final idea, about some "Oracle" choosing amongst possibilities, I'm entirely on board with that notion and am growing more confident about it all the time. But it only makes sense for the choice to happen *once*; some global choice that picks one reality out of all possible universes (given all the boundary data, and some master constraint on the total Lagrangian density). Maybe it's now clearer why I'd prefer lots of solutions to the thick-sandwich problem. From our perspective, individual quantum experiments all might seem to have a different random choice, but really it's all one big choice that manifests itself as computable patterns between similar experiments (computable from the size of the different solution spaces.) Getting the probabilities right will be the ultimate test of all this, but if you read my conclusion again with this in mind, you might see what I'm going for.

I'm looking forward to reading your own essay... And I hope we cross paths again soon!

Cheers,

Ken

Dear Ken,

I think there is a conflation between computational models and mathematical representations. It is true that mathematical representations almost always turn out to be computable models, especially when used numerically to approximate the solution to a real-world problem. But a mathematical model doesn't always come with its implementation (in fact it rarely does). That is, one has to find a computer implementation for a mathematical model. There is no one-to-one correspondence between models and algorithms. For example, many Turing machines can compute the same computable function, but each can do so in a completely different way (e.g. a function that can be computed in linear time can also be computed in exponential time).
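To make this concrete, here is a minimal sketch (a toy illustration of the claim, nothing more): two programs computing the very same mathematical function, one in linear and one in exponential time.

```python
# Two algorithms for one function (n -> the nth Fibonacci number).
# The function is a single mathematical object; the implementations
# are many, and they differ radically in how they compute it.

def fib_linear(n: int) -> int:
    """Iterative: O(n) additions."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_exponential(n: int) -> int:
    """Naive recursion: exponentially many calls in n."""
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

# Identical outputs on every input, wildly different costs:
assert all(fib_linear(n) == fib_exponential(n) for n in range(20))
```

So identifying "the model" with any one of these implementations would be a mistake; the mapping from model to algorithm is one-to-many.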

While I may agree that the working assumption of science is that nature is mathematical, I think it is far from obvious that science assumes that a particular implementation of a mathematical model is the specific way nature operates. This is also a common conflation: people think that saying a natural process is Turing computable means that it is computed exactly by something like a Turing machine (nobody thinks the brain is a Turing machine, but most science works under the assumption that the brain is Turing computable, which is completely different). In your essay you point out Seth Lloyd's claim that the universe is a computer; this illustrates my point, because Seth Lloyd does not in fact think that the universe is a Turing machine, but rather a quantum computer, which in the light of your own arguments I find difficult to reject.

On the other hand, I don't think that physicists will ever be able to close all the loopholes in quantum experiments at the same time, and it hasn't yet been done. While your ideas are provocative, I still find the computable ground less subject to particular interpretations.

    Dear Prof. Wharton,

    as a physicist, you argue that the universe is not a computer. As a professional in the field of computers, it is hard for me to agree with you. To me the universe appears as an enormous computer, and I find parallels between the two everywhere, from the structure of space, with the visible universe confined to a 3-dimensional display akin to a 3D touch screen, to the origins of life itself. The most striking similarities I find in our creation myths and in the ways complex systems are put together in practice. But my essay is not about that. In my essay I infringe on your territory, just as you infringe on mine, and argue that the organization of the space housing the universe dictates the laws of physics.

    My analysis of the current state of physics zeroed in on the paradox of space, which ~100 years ago was substituted with the wave-particle duality. The incongruous notion prevailing today, that waves can propagate in emptiness without a supporting medium, is called a workaround in my field, where such compromises are common and constitute the norm rather than the exception. The difference between you physicists and us programmers is that programmers fully appreciate that, in the long run, such workarounds come with a heavy price and must be addressed sooner or later. Better sooner than later.

    In contrast, you physicists seem a headstrong bunch. Having decided long before computers, when the ability to compute was deemed the height of human ability, that the understanding and visualization of the underlying reality could be discarded as long as the mathematics appeared adequate, you as a group still stubbornly stick to it. Apparently you do not realize that you have put yourselves in danger of being replaced by the very calculators you still strive to emulate. The current motto in physics, "shut up and calculate", has put you on par with computing machines, as if you have forgotten that whatever a human can calculate, a machine can do far better. As a professional in my field, I fully appreciate that the main difference between calculators and humans is that humans understand how the calculations relate to physical reality and have a vision of how their own experience fits in it.

    And so I find it ironic that modern-day physicists appear unaware that their phenomenal ability to compute could in fact be a vestige of our very origins, as well as the origins of our universe. Arguing that the universe is not a computer, you physicists don't seem to appreciate that mathematics divorced from the understanding of the underlying reality, and from the vision that comes with it, is what has turned you into glorified calculators and thus raised the question of your own utility to the rest of humanity.

      Dear M. V. Vasilyeva,

      Perhaps if you were a little older and had experience with analog computers, you might not jump to such conclusions.

      You say, "In contrast, you physicists seem a headstrong bunch."

      But when your only tool is a hammer, everything looks like a nail. When one is mostly familiar with computer concepts, they seem to apply everywhere. Believe me, there's much more to it than you seem to see.

      "And so I find it ironic that modern-day physicists appear unaware that their phenomenal ability to compute could in fact be a vestige of our very origins, as well as the origins of our universe. Arguing that the universe is not a computer, you physicists don't seem to appreciate that mathematics divorced from the understanding of the underlying reality, and from the vision that comes with it, is what has turned you into glorified calculators and thus raised the question of your own utility to the rest of humanity."

      Since I have been far more professionally successful in the field of computer design (see here) than as a physicist, and have published university texts on computer design and a dissertation on an 'automata theory' of physics, I do know about computers. All I can say is that, no matter how much everything looks like a nail to you, it's a little deeper than that.

      But I still enjoyed your essay.

      Edwin Eugene Klingman

      Dear Ken Wharton,

      A well written, accessible essay. I found it very interesting. I am not sure of the prevalence of the basic assumption that the universe is a computer. However, having chosen that assumption to talk about, you do a very good job of clearly communicating your viewpoint.

      I do like that your essay is forward-thinking, suggesting a potentially useful direction for future research rather than just pointing out the problems. Good luck in the competition. Regards, Georgina.

        Ken

        Not 'start' with logic, but found the 'structure' emerging to be precisely that of Truth Propositional Logic (infinite hierarchical 'nested' compound propositions), and the kinetics analogous to dynamic logic's (PDL) 'Interleaved modes'.

        I also started from a more Lagrangian than Newtonian view, but consider real data more than theory and don't habitually 'compartment' findings or assume interpretations. Issues with Bell are well covered elsewhere and in the references (lack of space), but I'll follow up yours.

        All I'm after is falsification. To me it's entirely self-apparent, and all attempts to falsify it have only resolved other anomalies. But because it's unfamiliar, everyone seems to do what you have done: write it off and ignore it as 'different'. So half of physics is looking for a better solution, but all of physics completely ignores it when one arises (and has done for 4 years). How do we overcome that? Any ideas?

        Best wishes. I think you are correct, but it seems there may be no point in being so.

        Peter

        Hi Ken,

        Thanks for your detailed response! If you have had a chance to read my essay, you'll probably find that we have very different intuitions for how to address some of these problems (in our approach, we treat configuration space as completely fundamental!). Nevertheless, I am very interested in hearing more about your ideas. I think I can see where you are going and would be interested to see a toy model worked out.

        In regards to the "thick sandwich" problem, I can see now why the classical result doesn't bother you. However, our universe seems pretty classical now. It takes some real mental gymnastics to try to think in the way you are suggesting!

        Where our intuitions do seem to overlap is with the Oracle issue. Even in the setting described in my essay, I think this is potentially the only way to really understand measurement although I didn't really say this in the text. I would be happy to discuss ways to make this idea more concrete. I've always found it very intriguing.

        Hope to see you some time again!

        Cheers,

        Sean.

        Dear Ken Wharton,

        Do not belittle what you called "typical engineering-physics". I agree that your LSU exactly corresponds to the monist view of Einstein, Hilbert, and the present mainstream. You are not questioning the fundamentals; you are trying to defend and extend the philosophy on which spacetime and time symmetries arose. I see my essay as a challenge to you because it clearly distinguishes between past and future.

        You already seem to be not very precise when you write in the abstract "predict the future from what we know about the present" but then "predict the future from the past". This would mean the past is what we know about the present. Noting your obviously anthropocentric point of view (in words like "we know" and "predict"), I prefer the notion of objective reality that I described in my essay.

        You denied being a superdeterminist and declared that you agree on the issue with Huw Price, who is unknown to me. Could you please explain his position and yours?

        Let me out myself as a fan of Karl Popper: I see it as justified to assume potentially infinite influences, no matter whether the world is actually open in the sense of being potentially infinite, or whether there is objectively no chance of a complete and trustworthy mathematical description of it.

        Do not mistake this as support for Roger Schlafly's almost nihilistic attitude (certainly also welcome at FQXi). I am an optimist who hopes for revelations of foundational mistakes. Maybe my Figure 5 can be refuted; so far it seems to refute a basic assumption that led to Einstein's work.

        Sincerely,

        Eckard Blumschein

          Mr. Klingman,

          glad you enjoyed my essay. I see that you are a prolific author and that you consider yourself an authority in both physics and computer hardware. And you think that my view of the universe as an immense computer is not deep enough. Well, your opinion of my view is based on an assumption that has no basis in reality. For your information, I started to entertain this idea only after I had studied biology and physiology long enough to become convinced that life is a program. Once you make this leap, everything else follows logically.

          So I understand very well how you, having never studied life sciences, may have difficulty coming to the same view. Knowing physics and computers may not be enough to come to this conclusion, just as it was not enough for me until I took on biology.

          Take care!

          Hi Ian,

          Thanks for your comments! For your first point about time-symmetry, I don't think this is necessarily an NSU/LSU issue at all, as both viewpoints can be consistent with time-symmetry (see NSU classical physics). But LSU is more "automatically" time-symmetric, and in the particular case of QM, NSU approaches are forced into time-asymmetric stories, while LSU approaches aren't. That's not surprising; the NSU assumes a causal arrow of time, so it's more likely to diverge from pure time-symmetry than the LSU.

          Now, I know you're on Eddington's side of the fence concerning the need for a fundamental asymmetry to explain the large-scale arrows of time. But I'm perfectly happy with the more standard explanation, where it's our proximity to one particular cosmological boundary condition (the big bang) that is fully responsible for all those arrows, even given CPT-symmetric laws. After all, as you zoom down to microscopic scales, phenomena get *more*, not less, time-symmetric. So it's baffling to me why anyone would want the laws that apply at small-scales to be even *less* time-symmetric than large-scale classical laws. Maybe this is the assumption that all large-scale behavior must result from small-scale behavior that George Ellis is rightly complaining about. Anyways, I think I did make the point in the essay that the LSU helps to force the preparation and the measurement to be time-reverses of each other, even down to the way we impose boundary conditions.

          Any other reason you felt "unfulfilled" by the conclusion, or was it mainly the issue in that footnote? I expect it's also because I don't yet have a working model that exhibits all these features, but I'm getting there... :-)

          Your point about anthropocentrism is interesting, but you sort of mixed in another point about realism. To be crystal clear: I am a realist. Something objective exists. In fact, I'm a "spacetime realist", as I'm interested in (only!) entities associated with particular points on the spacetime manifold. Thus my disinterest in approaches where entities that live in configuration space are somehow viewed as "real".

          Which leads into my follow-up question as to exactly what you mean by models that are "at least partially correct". If a classical stat mech physicist saw the usefulness of configuration spaces, and concluded that the fundamental entities in the universe were classical partition functions that lived in such huge-dimensional spaces (rather than spacetime), and built theories around those entities, would that count as "partially correct"? (I'm sure you see where I'm going here, but regardless, I think the real question is which approximations and misapprehensions are leading us away from deeper, more fundamental discoveries.)

          Finally, when it comes to entropy, I'm actually on board with you to a large extent. So long as the Big Bang is part of a cosmological boundary condition (a logical input rather than a logical output, to use my essay's language), I have no trouble with the gravitational degrees of freedom being so tightly constrained. And the "disorder" language, granted, is imprecise -- and to some extent meaningless at the fine-grained realistic level that I'm pursuing.

          Thanks again!

          Ken

          Hao Yau,

          Thanks for the pointer to your essay. I think the clearest connection is that you're interested in the second-order Klein-Gordon equation. For my earlier not-quite-LSU take on this equation, you might see my reference [8]. I've actually backed away from this approach to some extent, but still think Klein-Gordon is far preferable to the Schrodinger equation, and that everyone tends to misinterpret the so-called "negative frequency" solutions by viewing them in the light of a totally different equation. So if anything in [8] strikes a chord with your efforts, let me know and I might be able to at least steer you away from my various failed ideas... :-)

          Best,

          Ken

          Hi Jerzy,

          I'm glad you think my thesis is obvious, but I wager you're in the minority.

          Also, it sounds like you agree for quite different reasons than mine... Arguing for under-determined universes as compared to determined universes is a somewhat different issue than NSU/LSU. (After all, classical action extremization is an LSU approach that leads to just as "determined" a result as NSU equations of motion.)

          That said, these days I am on the fundamentally-underdetermined side of the fence, so I guess we agree after all. :-)

          Cheers,

          Ken

          Hi Pentcho,

          I actually quite like GR, but I grant that the details may be wrong. Maybe one needs to add a new field or two, or even a few new dimensions, or even something stranger.

          But the point is that GR (and such extensions) are our best models of the structure of our universe. Ignoring this structure when building up a fundamental quantum theory seems reckless.

          Best,

          Ken

          Ms. Vasilyeva,

          Quite an assumption: "So I understand very well how you, having never studied life sciences, may have difficulty coming to the same view."

          You are correct that I have not published any books on the life sciences, but from 2001 to 2006 I took UC Berkeley and UC Santa Cruz university extension courses completely covering Bruce Alberts' "Molecular Biology of the Cell", as well as courses in Proteomics, Immunology, Epigenetics and Embryogenesis. This was done "just for fun", but still, I don't feel ignorant of biology.

          My point was that "us physicists" are not just a "headstrong bunch" but are also an almost uniquely curious and well-rounded bunch of people, and any **assumption** that we are ignorant is not likely to be true. As I've mentioned in other comments, everyone who submits an essay in this contest tends to feel that "they've figured it out", yet probably some of us are wrong. I'm glad that you've figured it out.

          I won't intrude on Ken's space any more, so you get the last word...

          Best,

          Edwin Eugene Klingman

          An inconsistent theory, if adopted, is much more dangerous for science than a false but consistent one. The latter is easily falsifiable; the former easily overcomes any hurdle, whether logical or experimental. Peter Hayes has explained this quite nicely:

          Peter Hayes, "The Ideology of Relativity: The Case of the Clock Paradox", Social Epistemology, Volume 23, Issue 1, January 2009, pages 57-78: "In the interwar period there was a significant school of thought that repudiated Einstein's theory of relativity on the grounds that it contained elementary inconsistencies. Some of these critics held extreme right-wing and anti-Semitic views, and this has tended to discredit their technical objections to relativity as being scientifically shallow. This paper investigates an alternative possibility: that the critics were right and that the success of Einstein's theory in overcoming them was due to its strengths as an ideology rather than as a science. The clock paradox illustrates how relativity theory does indeed contain inconsistencies that make it scientifically problematic. These same inconsistencies, however, make the theory ideologically powerful."

          Pentcho Valev

          Dear Ken Wharton,

          Ian Durham wrote: "we may all view slightly different realities". My notion of reality is different, and I guess you also believe in just one objective reality.

          You did not respond to my curiosity concerning superdeterminism and Huw Price. Perhaps these are not important. I am not really interested in questions like ultrafinitism.

          I would rather appreciate at least one serious argument against my reasoning, which runs contrary to your position and which I tried to make immediately obvious in my Figures.

          Just off topic: in Germany, police and intelligence failed for many years to become aware of an NSU (National Socialist Underground) that committed murders. They were misled by a mysterious female DNA trace that did not belong to the criminals. It happens that mysterious things have a simple explanation.

          Sincerely,

          Eckard

          Hi LC,

          Thanks for the interesting comments.

          >A Lagrangian model of a physics system has the initial state and the final state specified. This carries over to the quantum path integral, where the extremization procedure is generalized to a variational calculus on many paths with quantum amplitudes. The analyst is then faced with the task of finding some dynamical process or quantum evolution which maps the initial state of the system to the final state. The analyst will then use what tools they have in their box to solve this problem.

          Ah, but this is my point: there may not *be* some master dynamical process that takes the initial state to the final state, so the analyst you describe is implicitly assuming an "NSU". Meanwhile, an LSU analyst will have a broader set of tools to use, as the intermediate solution can take the future constraint into account.
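          Here's a toy numerical illustration of that contrast (just my own sketch, with made-up function names, not a claim about any real theory): for a discretized oscillator, an NSU-style solver steps forward from initial data, blind to where the trajectory ends up, while an LSU-style solver fixes *both* boundary values and relaxes the whole history at once toward the discrete stationary-action condition.

```python
# Toy contrast between schemas, for x'' = -omega^2 x on a time grid.
# (Hypothetical illustration; 'shoot' and 'relax' are my own names.)

def shoot(x0, v0, omega, dt, n):
    """NSU-style: march forward from initial position/velocity,
    with no knowledge of the final state."""
    xs = [x0]
    x, v = x0, v0
    for _ in range(n):
        v += -omega**2 * x * dt   # semi-implicit Euler step
        x += v * dt
        xs.append(x)
    return xs

def relax(x0, xN, omega, dt, n, sweeps=20000):
    """LSU-style: fix BOTH boundary values and relax the interior of
    the whole trajectory toward the discrete Euler-Lagrange condition
        x[i-1] + x[i+1] = (2 - (omega*dt)**2) * x[i],
    i.e. the stationary point of the discretized action."""
    xs = [x0 + (xN - x0) * i / n for i in range(n + 1)]  # initial guess
    k = (omega * dt) ** 2
    for _ in range(sweeps):
        for i in range(1, n):          # the endpoints never move
            xs[i] = (xs[i - 1] + xs[i + 1]) / (2 - k)
    return xs
```

          The forward-marcher cannot "aim" at a final constraint; the relaxer treats the final value as a logical input and determines the entire intermediate history from both boundaries together.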

          >With the universe in total this picture becomes more problematic. If we are to think of an observer as reading the output, there is no boundary region where this observer can read this output without that procedure also being a part of the "computation".

          Yes, exactly. In that case I think it only makes sense to think of the end/asymptotic state of the universe as a "logical input" (even though it's a final boundary condition), just as I argue that quantum measurements should be viewed as boundary constraints on subsystems. It would be nice to have a conceptual framework that would work the same way for both subsystems and the universe as a whole.

          >The problem of establishing a Cauchy region of initial and final data is much more difficult to work.

          You may be interested in my response to Sean Gryb above in regards to the thick-sandwich problem in GR, and Sean's "Oracle"; looking for (unique) solutions to Cauchy-type problems may be more restrictive than is strictly necessary.

          Thanks again!

          Ken

          Hector,

          We're in complete agreement on your first point: there's no guarantee that models will map to a particular implementation. But to me, that's all the more reason to not limit ourselves to models that are only computable via some temporally-linear fashion (where the algorithm is completely blind to its eventual output).

          As for your statement, "I think it is far from obvious that science assumes that a particular implementation of a mathematical model is the specific way nature operates.", I also agree, and think Spekkens' essay makes some useful points in this regard. But if you replace the word "particular" with "Newtonian Schema" (as defined in my essay), I think the situation changes. To me, anyway, it's become strikingly obvious that most scientists assume the only models worth considering are those that might have temporally-linear algorithmic implementations. Meanwhile, promising LSU-style models, which have no such implementation, are not explored.

          As you say, the proof will be a "loophole"-free implementation. I just hope that if such a model is developed, it won't be ruled out merely on the grounds that it's not in an NSU framework.

          Best,

          Ken