Essay Abstract

This essay discusses a possible route towards the removal or deformation of existing physical postulates, examined in light of history. The foundations of interest are unitarity and locality. How these principles might be changed or abandoned is first considered against previous shifts in our understanding of physical foundations. The firmness of these foundations is then questioned: if they give way, to what do they yield as emergent properties? Approaches are then proposed in light of the AdS/CFT correspondence, nonlocal BCFW amplitudes in QCD, cosmological quantum phase structure and, ultimately, the replacement of unitarity by deeper principles of modularity.

Author Bio

My graduate work was at Purdue University. I have worked on problems of clock synchronization with general relativity, spacecraft navigation, quantum optics and more recently with IT/programming. I spend much time thinking about issues concerning foundations.


  • [deleted]

Hi, Lawrence

What does it mean to say that unitarity is emergent?

More simply, please.

Correct quote of Ludwig Wittgenstein

"Whereof one cannot speak,thereof one must be silent"

Major Works: Selected Philosophical Writings p.82

2009 by HarperCollins Publishers

    • [deleted]

    Dr. Crowell,

    In a message I just read, posted on Dr. Gibbs' blog, your essay is listed. Glad to see you entered and are sharing your expert opinions again in discussions.

    James

      If you look at P. Gibbs' page you will see I break this out in a bit more detail. Unitarity is a limiting case where wave functions are analytic everywhere. Physics based on modularity and nonlocality has no reference to spacetime. Causality in physics is based on propagators or Green's functions that push a field from (x, t) to (x', t'). Without spacetime this simply does not exist. The removal of the pole or singularity occurs when there are no black holes, or in a region of spacetime that excludes big bang singularities.

      One of the things I think comes from this is that the universe contains only one of each particle. The universe has only one electron, one up quark, one muon, one photon, one Z, one Higgs, and so on. What we observe as individual particles are the same particle within different configuration variables, whether in spacetime or in momentum-energy. Spacetime is in effect a sort of emergent property, in many ways an illusion, where the particles we observe are mirror images of the same particle with different configurations. Baruch Spinoza wrote about something like this, which he called monads.

      Cheers LC

      • [deleted]

      1.The "Monads" belong to Leibniz, The "Modes" coined by Spinoza.

      2.The Universe has:

      Fermions 12(6 quarks+3 leptons+3 neutrino).

      Bosons 12(8 gluons+3 vector(2W+1Z)+1photon).

      Numerical supersymmetry not broken.

      3.From other side the Universe has:

      Fermions 3(proton,electron,neutrino),neutron non-stable

      Boson only 1 photon.

      See my essay http://www.fqxi.org/community/forum/topic/946

      Metasymmetry is broken

      In this case Lawrence B Crowell is right.

      • [deleted]

      Lawrence,

      Do you still believe that Banesh Hoffmann and John Norton are wrong in their claim that the Michelson-Morley experiment CONFIRMED the variable speed of light predicted by Newton's emission theory of light?

      http://www.pitt.edu/~jdnorton/papers/companion.doc

      John Norton: "These efforts were long misled by an exaggeration of the importance of one experiment, the Michelson-Morley experiment, even though Einstein later had trouble recalling if he even knew of the experiment prior to his 1905 paper. This one experiment, in isolation, has little force. Its null result happened to be fully compatible with Newton's own emission theory of light. Located in the context of late 19th century electrodynamics when ether-based, wave theories of light predominated, however, it presented a serious problem that exercised the greatest theoretician of the day."

      http://philsci-archive.pitt.edu/1743/2/Norton.pdf

      John Norton: "In addition to his work as editor of the Einstein papers in finding source material, Stachel assembled the many small clues that reveal Einstein's serious consideration of an emission theory of light; and he gave us the crucial insight that Einstein regarded the Michelson-Morley experiment as evidence for the principle of relativity, whereas later writers almost universally use it as support for the light postulate of special relativity. Even today, this point needs emphasis. The Michelson-Morley experiment is fully compatible with an emission theory of light that CONTRADICTS THE LIGHT POSTULATE."

      http://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768

      "Relativity and Its Roots" By Banesh Hoffmann: "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether."

      Pentcho Valev pvalev@yahoo.com

        Banesh Hoffmann is right that a particle, even a photon, emitted in a frame moving relative to a stationary frame has more energy than the same particle emitted in the stationary frame. In the case of light, a photon emitted from a frame moving in the same direction as its propagation is blue shifted, with more energy. These matters concerning the measurement of light speed are old and clearly demonstrated.
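
        As a quick numerical sketch of that statement (a toy Python calculation of my own, not anything from the essay): for a source approaching the observer at speed v, the photon energy E = hf is multiplied by the relativistic Doppler factor sqrt((1 + β)/(1 - β)) with β = v/c, so it comes out blue shifted with more energy.

import math

h = 6.62607015e-34        # Planck constant, J*s
c = 299792458.0           # speed of light, m/s

def boosted_photon_energy(freq_rest, v):
    """Energy of a photon emitted toward the observer by a source moving at speed v."""
    beta = v / c
    doppler = math.sqrt((1.0 + beta) / (1.0 - beta))   # relativistic Doppler factor
    return h * freq_rest * doppler

E_rest = h * 500e12                                    # 500 THz photon, source at rest
E_moving = boosted_photon_energy(500e12, 0.1 * c)      # same source approaching at 0.1 c
print(E_moving / E_rest)                               # ~1.106: blue shifted, more energy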

        To be honest I did not write this essay with the intention of debating century-old physics that is well established. I doubt I am going to seriously get around to reading these papers, for they are long and not likely very enlightening. I am not sure why people decide that some aspect of physics is all wrong and devote their lives and work to doing battle with it. This happens in biology with the ongoing reaction to Darwin, but at least there the deniers are upholding some theology, which gives some sense of why they do it. There is no such motivating ideology for denying a physical theory that is well established.

        The point of my essay is not that relativity is all wrong. It is rather that in a quantum field setting, at small scales, it becomes incomplete. This pertains to black holes that are smaller than a nucleus, or to the first 10^{-30} seconds of the big bang, and so forth. This does not mean that relativity is overthrown, nor is what I advocate here something that would show up in basic measurements such as the Michelson-Morley experiment.

        Special relativity is not so much a subject of research these days as it is an application. It is so well established within its proper domain of experience that its validity is beyond reasonable doubt. General relativity is a subject of research, but it is pretty well tested, with no empirical evidence that it fails.

        Cheers LC

        I thought that Spinoza first advanced the idea of monads.

        I write more on this below. If I am right then any particle, say an electron, in some wave function ψ(r, t) at a point in spacetime (r, t), is a projection of a single electron onto those configuration variables. The same holds if the particle is described in momentum-energy variables by the Fourier transform

        φ(k, ω) = (2π)^{-2} ∫ d^3x dt ψ(r, t) e^{i(k·r - ωt)}.

        This projection occurs due to the nonlocality of fields, and their physics is described not by analytic functions or unitarity, but rather by modularity.
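
        A one-dimensional toy version of this projection can be sketched numerically (my own illustration, using NumPy's FFT on an arbitrary Gaussian wave packet; the expression above is the full 3+1-dimensional transform):

import numpy as np

# One-dimensional toy version of the projection psi(x) -> phi(k).
N = 1024
x = np.linspace(-50.0, 50.0, N)
dx = x[1] - x[0]

k0 = 2.0                                             # central wave number of the packet
psi = np.exp(-x**2 / 8.0) * np.exp(1j * k0 * x)      # Gaussian wave packet

k = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2.0 * np.pi)

# The same state, viewed in the momentum configuration variables: |phi| peaks near k0.
print(k[np.argmax(np.abs(phi))])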

        I write more about this below.

        Cheers LC

        • [deleted]

        In 1887 (FitzGerald and Lorentz had not yet advanced the ad hoc length contraction hypothesis) the Michelson-Morley experiment unequivocally confirmed the assumption that the speed of light varies with the speed of the light source (c' = c + v) and refuted the assumption that the speed of light is independent of the speed of the light source (c' = c). That is what John Norton and Banesh Hoffmann suggest. Do you agree, Lawrence?

        Pentcho Valev pvalev@yahoo.com

        To be honest I would prefer that my essay page not be filled with posts over this imagined controversy.

        LC

        There have been some developments along these lines which may give support to my thesis here. The paper "Black Holes: Complementarity or Firewalls?" by Almheiri, Marolf, Polchinski and Sully raises an important point: it points out an inconsistency in the holographic principle. They focus on the suggestion that postulate #2, "Outside the stretched horizon of a massive black hole, physics can be described to good approximation by a set of semi-classical field equations," is to be "relaxed." I take it that this relaxation focuses on the issue of "massive" as the mass approaches around 10^3 to 10^4 Planck units of mass. That still leaves the black hole massive compared to the masses of elementary particles.

        In discussions with Stoica on singularities I suggested the following form of the metric, with 1 - 2m/r = e^u, so that

        ds^2 = e^u dt^2 - e^{-u} dr^2 - r^2 dΩ^2.

        We now have to get dr from

        dr = 2m[e^u/(1 - e^u)^2] du.

        Now the metric is

        ds^2 = e^u dt^2 - 4m^2[e^u/(1 - e^u)^4] du^2 - r^2 dΩ^2, with r = 2m/(1 - e^u).
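
        The substitution can be checked symbolically (a minimal sketch with SymPy, assuming only the definition 1 - 2m/r = e^u above):

import sympy as sp

u = sp.symbols('u', real=True)
m = sp.symbols('m', positive=True)

r = 2*m / (1 - sp.exp(u))                   # inverting 1 - 2m/r = e^u
dr_du = sp.simplify(sp.diff(r, u))          # dr/du
g_uu = sp.simplify(sp.exp(-u) * dr_du**2)   # coefficient of du^2 coming from e^{-u} dr^2

print(dr_du)   # 2*m*exp(u)/(1 - exp(u))**2, up to how SymPy arranges the expression
print(g_uu)    # 4*m**2*exp(u)/(1 - exp(u))**4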

        The singularity is at u = ∞, where the dt term blows up, and the horizon coordinate singularity at u = 0 is obvious in the du term. My rationale was that the singularity has been removed "to infinity" in these coordinates. This makes the black hole metric similar to the Rindler wedge coordinates, which do not contain a singularity; in the accelerated frame or Rindler wedge there is no singularity. The treatment of the Schwarzschild metric in the near horizon approximation that Susskind uses is one where the singularity is sufficiently removed so that fields in the Rindler wedge may be continued across the horizon without concern. In this metric of mine the singularity is at infinity, so the analytic functions for fields in the Rindler wedge are replaced with meromorphic functions with a pole at infinity.

        Stoica made the observation that this runs into trouble with Hawking radiation. The singularity at infinity causes trouble with the end point of the radiation process, for it has to "move in" from infinity. The final quantum process of a black hole is a problem not well understood in any coordinates. Your objection does have a certain classical logic to it. However, by the time the black hole is down to its last 10^4 or 10^3 Planck mass units the black hole itself is probably quantum mechanical. In my coordinates (assuming they are unique to me, which is not likely) the singularity at infinity may not have to "move" from infinity. There may be some nonlocal physics which causes its disappearance without it having to move at all. This nonlocality is a correspondence between states interior to a black hole and those on the stretched horizon. The Susskind approach does not consider the interior, and he raises this as a question towards the end of his book "The Holographic Principle."

        This nonlocality would be a relaxation of postulate #2. The issue of unitarity then comes into play. If the theory is built instead on meromorphic functions, say analytic only in a portion of the complex plane, then fundamentally quantum fields in curved spacetime, or quantum gravity, are not unitary but modular.

        Unitarity is represented by a complex function such as e^{-iHt}, which is analytic. The loss of unitarity does not mean there is a complete loss of everything; in particular quantum information can still be conserved. A simple analytic function of this sort describes standard quantum physics. Gravity, as we know, is governed by a hyperbolic group, such as SO(3,1) ~ SL(2,C), where the latter has a map to SL(2,R)^2. The functions over these groups have posed difficulties for quantum gravity, for they are explicitly nonunitary. The trick of performing a Wick rotation on time, τ = it, is a way of recovering the compact groups we know from quantum physics.
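
        A toy numerical illustration of that contrast (my own sketch, not part of the essay): for a Hermitian H the operator e^{-iHt} preserves the norm of a state, while the Wick rotated e^{-Hτ} does not.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2                 # a random Hermitian "Hamiltonian"

psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi = psi / np.linalg.norm(psi)          # normalized state

t = 1.3
U = expm(-1j * H * t)                    # e^{-iHt}: unitary evolution
K = expm(-H * t)                         # Wick rotated e^{-H tau}: not unitary

print(np.linalg.norm(U @ psi))           # 1.0, norm preserved
print(np.linalg.norm(K @ psi))           # generally not 1.0, norm not preserved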

        It does turn out, I think, that we can approach quantum gravity directly by realizing that SL(2,R) is related to a braid group through Z → B → PSL(2,Z), and that the braid group is contained in SL(2,R). Braid groups have a correspondence with Yang-Baxter relations and quantum groups. The group SL(2,Z) is the linear fractional group, which is the elementary modular group. An elementary modular function is

        f(z) = Σ_{n=-∞}^{∞} c(n) e^{-2πi nz},

        which in this case is a Fourier series. Here we are safely in the domain of standard QM and QFT. In general modular functions are meromorphic (analytic everywhere except at poles, here at infinity), with the analyticity condition holding on the upper half of the complex plane.

        Of particular interest to me are the Eisenstein series of modular functions or forms. These define an integer partition function, which is an acceptable partition function or path integral for a stringy black hole. I include a graphic here illustrating an Eisenstein function. It has a certain self-similar structure to it, or what might be called an elementary form of a fractal. In this picture unitarity is replaced with modularity. In this more general setting the transformations do not promote a field through time by some operator; rather, the operator simply counts the number of states or degrees of freedom in a consistent way. Unitarity is then a special case of this, which happens to fit into our standard ideas of causality.
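
        As a concrete handle on such a series (a small sketch of my own, using the standard weight-4 Eisenstein series rather than the specific stringy partition function referred to above), the Fourier coefficients are divisor sums and are easy to generate:

from sympy import divisor_sigma

# Fourier (q-expansion) coefficients of the weight-4 Eisenstein series
#   E_4(z) = 1 + 240 * sum_{n >= 1} sigma_3(n) q^n,   q = e^{2 pi i z}.
def eisenstein_E4_coeffs(nmax):
    coeffs = [1]                                            # constant term
    coeffs += [240 * divisor_sigma(n, 3) for n in range(1, nmax + 1)]
    return coeffs

print(eisenstein_E4_coeffs(5))   # [1, 240, 2160, 6720, 17520, 30240]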

        The Eisenstein series describes a partition function or path integral for a black hole. The theory is not one of unitary evolution, but simply one of counting states or degrees of freedom on the horizon. In effect physics is more general than unitarity, where unitarity is a necessary condition to describe the semi-classical states in postulate #2.

        Cheers LC

        • [deleted]

        Dr. Crowell,

        Nice work! Just the right amount of historical and introductory theoretical information to help the educated non-physicist have a reasonable opportunity to follow the logic of your essay. That extra information is clearly not filler material or added in an author's attempt to appear to be well informed. Not at all! Your professional viewpoint is made accessible while presenting advanced theoretical concepts. Thank you for the lift-up.

        James

        Hi Lawrence,

        here are some ideas ...

        Hello thinkers,

        These extrapolations are very interesting. But I insist on the finite groups. Furthermore, a photon in my line of reasoning possesses the series of uniqueness, a series beginning from the main central sphere. After that, the series is a fractalization with specific spherical volumes; the series runs between 1 and x. So a photon is in fact very complex, because its quantum number is very important. See also that this number is the same as our cosmological number of spheres (without the quantum spheres, of course). So it is very relevant to the broken symmetry, due to information correlated with volumes and the spinal and orbital rotations. The tori of stability take on all their meaning. See that the system is a fusioned system. The density is relevant, correlated with mass, and the polarity m/hν is due to evolution. So the exchange is probably a fusioned system and not a binary system in its pure generality.

        See also that the information is very relevant when we consider the VOLUMES OF SPHERES!!! The information can be bosonic or fermionic. Personally I believe that the volumes of fermions are more stable due to the encoding of evolution. The bosonic encoding is more subtle due to its quantum number and the fractal cited above. The sortings and synchronizations appear with a universal proportionality.

        Regards

          1st timer submission, not yet submitted, while reviewing selected works for possible End Notes.

          Good history and a very interesting paper, but I got lost in the heavy math; I wish I were abreast of all it covers... but I love your ideas "G implying 1/mass and QFT as unitless coupling, have different times" (if I interpret correctly).

          First, I see E/f = h and Power = E/t. Dividing one by the other gives t/f, so IF t = 1/f it implies either t squared or 1/f squared. Square roots generate plus and minus, a past and future with no present?

          Second, mass and energy, respectively, as the inscribed sphere, tangent to the face of a regular tetrahedron where sphere and tetrahedron have equal surface-to-volume ratios at ANY size, e.g. equivalent "activity" as free energy.

          Comment? (may use in end notes)

            The gravitational constant in natural units is an "area," so it is sqrt{G} that is ~ length, or the reciprocal of a mass.

            Verlinde has a proposal that the work done by gravity, W = ∫F•dr, is equal to a thermal entropy term, nkT. There has been some controversy over this. However, if we look at an increment of work done through some increment of time δt (I will not worry about the relativistic issues with this definition of time for now), then the increment of work is

            δW = F•(dr/dt)δt = Pδt.

            Power in natural units has dimensions of reciprocal length squared, L^{-2}, or ~ 1/G. Consequently, this increment of work can be written as δW = δt/G ~ n/G. We interpret G as the fundamental Planck unit of area, and n = the number of Planck units of area generated by this work. This then corresponds (roughly) to the Bekenstein bound, or the entropy S = kA/(4L_p^2). This is why I think his entropic force of gravity pertains to moving the holographic screen.
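
            As a back-of-the-envelope check of the counting (my own sketch, in SI units rather than natural units, and only of the standard Bekenstein-Hawking formula, not of Verlinde's derivation): the Planck area ħG/c^3 sets the unit of area, and S = kA/(4L_p^2) counts the horizon area in those units.

import math

hbar = 1.054571817e-34   # J*s
G    = 6.67430e-11       # m^3 kg^-1 s^-2
c    = 299792458.0       # m/s
kB   = 1.380649e-23      # J/K

L_p2 = hbar * G / c**3                  # Planck area

def bh_entropy(mass_kg):
    """Bekenstein-Hawking entropy S = kB*A/(4*L_p^2) of a Schwarzschild horizon."""
    r_s = 2.0 * G * mass_kg / c**2      # Schwarzschild radius
    area = 4.0 * math.pi * r_s**2       # horizon area
    return kB * area / (4.0 * L_p2)

print(L_p2)                      # ~2.6e-70 m^2
print(bh_entropy(1.989e30))      # ~1.5e54 J/K for a solar-mass black hole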

            Cheers LC

            Thanks guys for the response. I think the situation we face with quantum gravity may mirror something in the past. The solution might in part be under our noses.

            LC

            Dear Lawrence Crowell,

            You begin your essay with a well written summary of physics history, beginning with "the motion of particle executes little variations to find the proper path", then undergoing a "radical shift [from] variation of the least action in classical physics [to] the path integral in the quantum mechanics of fields." Like some others in this essay contest, I am more inclined to attempt to derive quantum theory from classical fields than vice versa, so I particularly liked your explanation that "constructing a propagator for a field on that very same field" leads to problems.

            In analyzing the limits of space-time, you point out that we are limited by the fact that beyond a certain point, our probe creates black holes that hide the information from us. [That's one reason I treat non-relativistic quantum mechanics and weak field gravity, where we know, at least potentially, whereof we speak.] Thus you point out, "space-time itself is a barrier to the complete specification of an observable." You then say "information conservation demands...". If you'd care to comment on the grounds on which you base a belief in "information conservation" I would be interested. I know it is often assumed nowadays, but I'm not sure on what it is based. I assume you do not begin with quantum error correcting code to achieve this.

            While I don't buy either quantum gravity or supergravity, nevertheless your observations about "the breakdown in the ability to measure everything about the universe" are quite interesting, as is your conjecture that this implies time, unitarity, locality, and causality to be emergent. You seem to agree with Philip Gibbs, so I suspect these are the waters the "math beyond physics" school swim in today. In my previous essays and in my dissertation, "The Automatic Theory of Physics", I presented logic and mathematics as emergent, so I tend to question any ultimate conclusions based on math that go beyond physical barriers to observation. Frank de Meglia may have as much claim to this territory as anyone.

            Nevertheless, having chosen to play the game of 'math beyond physics', you do a bang-up job of it, ending up with one electron, one quark, one photon in a universe based on underlying quantum error correction codes.

            Best of luck in the contest,

            Edwin Eugene Klingman