• [deleted]

Lawrence,

Do you still believe that Banesh Hoffmann and John Norton are wrong in their claim that the Michelson-Morley experiment CONFIRMED the variable speed of light predicted by Newton's emission theory of light?

http://www.pitt.edu/~jdnorton/papers/companion.doc

John Norton: "These efforts were long misled by an exaggeration of the importance of one experiment, the Michelson-Morley experiment, even though Einstein later had trouble recalling if he even knew of the experiment prior to his 1905 paper. This one experiment, in isolation, has little force. Its null result happened to be fully compatible with Newton's own emission theory of light. Located in the context of late 19th century electrodynamics when ether-based, wave theories of light predominated, however, it presented a serious problem that exercised the greatest theoretician of the day."

http://philsci-archive.pitt.edu/1743/2/Norton.pdf

John Norton: "In addition to his work as editor of the Einstein papers in finding source material, Stachel assembled the many small clues that reveal Einstein's serious consideration of an emission theory of light; and he gave us the crucial insight that Einstein regarded the Michelson-Morley experiment as evidence for the principle of relativity, whereas later writers almost universally use it as support for the light postulate of special relativity. Even today, this point needs emphasis. The Michelson-Morley experiment is fully compatible with an emission theory of light that CONTRADICTS THE LIGHT POSTULATE."

http://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768

"Relativity and Its Roots" By Banesh Hoffmann: "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether."

Pentcho Valev pvalev@yahoo.com

    Banesh Hoffmann is right that a particle, even a photon, emitted from a frame moving relative to a stationary frame has more energy than the same particle emitted in the stationary frame. In the case of a photon, one emitted from a moving frame in the direction of motion is blueshifted, with more energy. These matters concerning the measurement of light speed are old and have been clearly demonstrated.

    To be honest, I did not write this essay with the intention of debating century-old physics that is well established. I doubt I will seriously get around to reading these papers, for they are long and not likely very enlightening. I am not sure why people decide that some aspect of physics is all wrong and devote their lives and work to doing battle with it. This happens in biology with the ongoing reaction to Darwin, but at least there the deniers are upholding some theology, which gives some sense of why they do this. There is no such motivating ideology for denying a physical theory that is well established.

    The point of my essay is not that relativity is all wrong. It is more that in a quantum field setting at small scales it becomes incomplete. This pertains to black holes that are smaller than a nucleus, or to the first 10^{-30} seconds of the big bang, and so forth. This does not mean that relativity is overthrown, nor is what I advocate here something that shows up in basic measurements such as the Michelson-Morley experiment.

    Special relativity is not just a subject to be researched, but it really is more of an application these days. It is so well established within its proper domain of experience that its validity is beyond reasonable doubt. General relativity is a subject of research, but it is pretty well tested with no empirical evidence that it fails.

    Cheers LC

    I thought that Spinoza first advanced the idea of monads.

    I write more on this below. If I am right, then any particle, say an electron, in some wave function ψ(r, t) at a point in spacetime (r, t) is a projection of a single electron onto those configuration variables. The same holds if the particle is described in momentum-energy variables by the Fourier transform

    φ(k, ω) = sqrt{1/2π} ∫ d^3x dt ψ(r, t) e^{i(k•r - ωt)}.

    This projection occurs due to the nonlocality of fields, and their physics is described not by analytic functions or unitarity, but rather by modularity.
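    The projection onto momentum-energy variables can be illustrated numerically in one spatial dimension; a minimal sketch (not from the essay, just the standard unitary Fourier convention for a Gaussian wave packet):

```python
import numpy as np

# Illustrative 1D version of the projection onto momentum variables:
# a Gaussian wave packet psi(x) and its Fourier transform phi(k), with the
# unitary convention phi(k) = (1/sqrt(2*pi)) * Int psi(x) e^{-ikx} dx.
# (The post uses 3+1 variables; one dimension suffices to show the idea.)
N = 4096
L = 80.0
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 2.0
psi = (2*np.pi*sigma**2)**-0.25 * np.exp(-x**2 / (4*sigma**2))

k = 2*np.pi*np.fft.fftfreq(N, d=dx)
dk = 2*np.pi / L
phi = np.fft.fft(psi) * dx / np.sqrt(2*np.pi)   # |phi(k)| correct up to a phase

# The transform is unitary: total probability agrees in both representations
norm_x = np.sum(np.abs(psi)**2) * dx
norm_k = np.sum(np.abs(phi)**2) * dk

# Width in k is 1/(2*sigma) for this Gaussian (uncertainty relation saturated)
sigma_k = np.sqrt(np.sum(k**2 * np.abs(phi)**2) * dk / norm_k)
```

    The same packet is one object seen in two sets of configuration variables; nothing is lost in passing between them.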

    I write more about this below.

    Cheers LC

    • [deleted]

    In 1887 (FitzGerald and Lorentz had not yet advanced the ad hoc length contraction hypothesis) the Michelson-Morley experiment unequivocally confirmed the assumption that the speed of light varies with the speed of the light source (c'=c+v) and refuted the assumption that the speed of light is independent of the speed of the light source (c'=c). That is what John Norton and Banesh Hoffmann suggest. Do you agree, Lawrence?

    Pentcho Valev pvalev@yahoo.com

    To be honest I would prefer that my essay page not be filled with posts over this imagined controversy.

    LC

    There have been some developments along these lines which may give support for my thesis here. The paper "Black Holes: Complementarity or Firewalls?" by Almheiri, Marolf, Polchinski, and Sully raises an important point: it points out an inconsistency in the holographic principle. They focus on the suggestion that postulate #2, "Outside the stretched horizon of a massive black hole, physics can be described to good approximation by a set of semi-classical field equations," is to be "relaxed." I take it that this relaxation focuses on the issue of "massive" as the mass approaches around 10^3 to 10^4 Planck units of mass. This still makes the black hole massive compared to the masses of elementary particles.

    In discussions with Stoica on singularities I suggested the following metric, with 1 - 2m/r = e^u, so that

    ds^2 = e^u dt^2 - e^{-u} dr^2 - r^2 dΩ^2.

    We now have to get dr from

    dr = 2m e^u/(1 - e^u)^2 du.

    Now the metric is

    ds^2 = e^u dt^2 - 4m^2 [e^u/(1 - e^u)^4] du^2 - r^2 dΩ^2.

    The singularity is at u = ∞, where the dt term blows up, and the horizon coordinate singularity at u = 0 is obvious in the du term. My rationale was that the singularity had been removed "to infinity" in these coordinates. This makes the black hole metric similar to the Rindler wedge coordinates; the accelerated frame or Rindler wedge contains no singularity. The treatment of the Schwarzschild metric in the near-horizon approximation Susskind uses is one where the singularity is sufficiently removed that fields in the Rindler wedge may be continued across the horizon without concern. In this metric of mine the singularity is at infinity, so the analytic functions for fields in the Rindler wedge are replaced with meromorphic functions with a pole at infinity.

    Stoica made the observation that this runs into trouble with Hawking radiation. The singularity at infinity causes trouble with the end point of the radiance process, for it has to "move in" from infinity. The final quantum process of a black hole is a problem not well understood in any coordinates. The objection does have a certain classical logic to it. However, by the time the black hole is down to its last 10^4 or 10^3 Planck mass units, the black hole itself is probably quantum mechanical. In my coordinates (assuming they are unique to me, which is not likely) the singularity at infinity may not have to "move" from infinity. There may be some nonlocal physics which causes its disappearance without its having to move at all. This nonlocality is a correspondence between states interior to a black hole and those on the stretched horizon. The Susskind approach does not consider the interior, and he raises this as a question towards the end of his book "The Holographic Principle."

    This nonlocality would be a relaxation of postulate #2. The issue of unitarity comes into play. If the theory is built instead on meromorphic functions, say analytic only in a portion of the complex plane, then fundamentally quantum fields in curved spacetime, or quantum gravity, are not unitary but modular.

    Unitarity is represented by a complex function e^{-iHt} and so forth, which is analytic. The loss of unitarity does not mean there is a complete loss of everything; in particular, quantum information can still be conserved. A simple analytic function of this sort describes standard quantum physics. Gravity, as we know, is given by a hyperbolic group such as SO(3,1) ~ SL(2,C), where the latter has a map to SL(2,R)^2. The functions over these groups have posed difficulties for quantum gravity, for they are explicitly nonunitary. The trick of performing a Wick rotation on time, τ = it, is a way of recovering the compact groups we know from quantum physics.

    It does turn out, I think, that we can think directly about quantum gravity by realizing that SL(2,R) is related to a braid group via the extension Z → B → PSL(2,Z), and that the braid group is contained in SL(2,R). Braid groups have a correspondence with Yang-Baxter relations and quantum groups. The group SL(2,Z) gives the linear fractional transformations, which underlie elementary modular forms. An elementary modular function is

    f(z) = sum_{n=-∞}^{n=∞} c(n) e^{2πi nz},

    which in this case is a Fourier series. Here we are safely in the domain of standard QM and QFT. In general, modular functions are meromorphic (analytic everywhere except at infinity), with the analyticity condition holding on the upper half of the complex plane.
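    The linear fractional group can be made concrete through its standard generators; a small numeric check (illustrative only):

```python
import numpy as np

# PSL(2,Z) acts by linear fractional maps z -> (az + b)/(cz + d).
# Its standard generators S: z -> -1/z and T: z -> z + 1 satisfy
# S^2 = -I and (ST)^3 = -I, both the identity in PSL(2,Z).
S = np.array([[0, -1], [1, 0]])
T = np.array([[1, 1], [0, 1]])
I = np.eye(2, dtype=int)

assert (S @ S == -I).all()
ST = S @ T
assert (ST @ ST @ ST == -I).all()

# Both matrices have determinant 1, so they lie in SL(2,Z)
assert round(np.linalg.det(S)) == 1 and round(np.linalg.det(T)) == 1
```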

    Of particular interest to me are the Eisenstein series of modular functions or forms. These define an integer partition function, which is an acceptable partition function or path integral for a stringy black hole. I include a graphic here illustrating an Eisenstein function. It has a certain self-similar structure, or what might be called an elementary form of a fractal. In this picture unitarity is replaced with modularity. In this more general setting the transformations do not propagate a field through time by some operator; rather, the operator simply counts the number of states or degrees of freedom in a consistent way. Unitarity is then a special case of this, which happens to fit our standard ideas of causality.
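    As a concrete instance of such a series, the Eisenstein series E_4 has the q-expansion 1 + 240 Σ σ_3(n) q^n; a short sketch of its coefficients (a standard example, not the specific stringy partition function of the essay):

```python
from sympy import divisor_sigma

# q-expansion coefficients of the Eisenstein series
# E_4(q) = 1 + 240 * sum_{n>=1} sigma_3(n) q^n,
# where sigma_3(n) is the sum of cubes of the divisors of n.
def e4_coefficients(nmax):
    return [1] + [240 * divisor_sigma(n, 3) for n in range(1, nmax + 1)]

coeffs = e4_coefficients(4)
# Known first coefficients: 1, 240, 2160, 6720, 17520
```

    The integer coefficients are what play the role of a state count in the counting interpretation above.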

    The Eisenstein series describes a partition function or path integral for a black hole. The theory is not one of unitary evolution, but simply one of counting states or degrees of freedom on the horizon. In effect physics is more general than unitarity, where unitarity is a necessary condition to describe the semi-classical states in postulate #2.

    Cheers LC

    • [deleted]

    Dr. Crowell,

    Nice work! Just the right amount of historical and introductory theoretical information to help the educated non-physicist have a reasonable opportunity to follow the logic of your essay. That extra information is clearly not filler material or added in an author's attempt to appear to be well informed. Not at all! Your professional viewpoint is made accessible while presenting advanced theoretical concepts. Thank you for the lift-up.

    James

    Hi Lawrence,

    here are some ideas ...

    Hello thinkers,

    Very interesting, these extrapolations. But I insist on the finite groups. Furthermore, a photon in my line of reasoning possesses the series of uniqueness, a series beginning from the main central sphere. After the series there is a fractalization with specific spherical volumes; the series is between 1 and x. So a photon is in fact very complex, because its quantum number is very important. See also that this number is the same as our cosmological number of spheres (without the quantum spheres, of course). So it is very relevant to the broken symmetry, due to information correlated with the volumes and the spinal and orbital rotations. The tori of stability take on all their meaning. See that the system is a fused system. The density is relevant, correlated with mass, and the polarity m/hv due to evolution. So the exchange is probably a fused system and not a binary system in its pure generality.

    See also that the information is very relevant when we consider the VOLUMES OF SPHERES!!! The information can be bosonic or fermionic. Personally I believe that the volumes of fermions are more stable due to the encoding of evolution. The bosonic encoding is more subtle due to its quantum number and the fractalization cited above. The sortings and synchronizations appear with a universal proportionality.

    Regards

      A first-time submission, not yet submitted; I am reviewing selected works for possible end notes.

      Good history and a very interesting paper, but I got lost in the heavy math; I wish I were abreast of all it covered... but I love your ideas, "G implying 1/mass and QFT as unitless coupling, have different times" (if I interpret correctly).

      First, I see E/f = h and Power = E/t. Dividing one by the other gives t/f, so IF t = 1/f, it implies either t squared or 1/f squared. Square roots generate plus and minus: a past and a future with no present?

      Second, mass and energy, respectively, as the inscribed sphere, tangent to the face of a regular tetrahedron where sphere and tetrahedron have equal surface-to-volume ratios at ANY size, e.g. equivalent "activity" as free energy.

      Comment? (may use in end notes)

        The gravitational constant in naturalized units is an "area," so it is sqrt{G} that is ~ length, or the reciprocal of mass.

        Verlinde has a proposal that the work done by gravity, W = ∫F•dr, is equal to the heat TδS of an entropy change δS ~ nk. There has been some controversy over this. However, if we look at an increment of work done through some increment of time δt (I will not worry about the relativistic issues with this definition of time for now), then the increment of work is

        δW = F•(dr/dt)δt = Pδt.

        Power in natural units has dimensions of reciprocal length squared, L^{-2}, or ~ 1/G. Consequently, this increment in work can be written as δW = δt/G ~ n/G. We interpret G as the fundamental Planck unit of area, and n = the number of Planck units of area generated by this work. This then corresponds (roughly) to the Bekenstein bound, or entropy S = kA/4L_p^2. This is why I think his entropic force of gravity pertains to moving the holographic screen.
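        The Planck-area counting behind S = kA/4L_p^2 can be illustrated with rough SI numbers for a solar-mass black hole (an order-of-magnitude sketch, not a precise computation):

```python
import math

# Counting Planck areas on the horizon of a solar-mass black hole.
# Rough SI values for the constants; only the order of magnitude matters here.
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
hbar = 1.055e-34     # J s
M_sun = 1.989e30     # kg

r_s = 2 * G * M_sun / c**2      # Schwarzschild radius, ~3 km
A = 4 * math.pi * r_s**2        # horizon area
L_p2 = hbar * G / c**3          # Planck area

# Entropy in units of k = number of Planck areas / 4; ~1e77 for a solar mass
S_over_k = A / (4 * L_p2)
```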

        Cheers LC

        Thanks guys for the response. I think the situation we face with quantum gravity may mirror something in the past. The solution might in part be under our noses.

        LC

        Dear Lawrence Crowell,

        You begin your essay with a well written summary of physics history, beginning with "the motion of particle executes little variations to find the proper path", then undergoing a "radical shift [from] variation of the least action in classical physics [to] the path integral in the quantum mechanics of fields." Like some others in this essay contest, I am more inclined to attempt to derive quantum theory from classical fields than vice versa, so I particularly liked your explanation that "constructing a propagator for a field on that very same field" leads to problems.

        In analyzing the limits of space-time, you point out that we are limited by the fact that beyond a certain point, our probe creates black holes that hide the information from us. [That's one reason I treat non-relativistic quantum mechanics and weak field gravity, where we know, at least potentially, whereof we speak.] Thus you point out, "space-time itself is a barrier to the complete specification of an observable." You then say "information conservation demands...". If you'd care to comment on the grounds on which you base a belief in "information conservation" I would be interested. I know it is often assumed nowadays, but I'm not sure on what it is based. I assume you do not begin with quantum error correcting code to achieve this.

        While I don't buy either quantum gravity or supergravity, nevertheless your observations about "the breakdown in the ability to measure everything about the universe" are quite interesting, as is your conjecture that this implies time, unitarity, locality, and causality to be emergent. You seem to agree with Philip Gibbs, so I suspect these are the waters the "math beyond physics" school swim in today. In my previous essays and in my dissertation, "The Automatic Theory of Physics", I presented logic and mathematics as emergent, so I tend to question any ultimate conclusions based on math that go beyond physical barriers to observation. Frank de Meglia may have as much claim to this territory as anyone.

        Nevertheless, having chosen to play the game of 'math beyond physics', you do a bangup job of it, ending up with one electron, one quark, one photon in a universe based on underlying quantum error correction codes.

        Best of luck in the contest,

        Edwin Eugene Klingman

          Thanks for the thumbs up. I seem to be falling in the community rankings, though my paper has only been up for about 36 hours. I am not sure what is going on there, for I know the physics I present is better than that of a whole lot of the papers ranking higher.

          Cheers LC

          It is not difficult to quantize weak gravity. This is usually written as a bimetric theory, g_{ab} = η_{ab} + h_{ab}, where η_{ab} is a flat spacetime (Minkowski) metric and h_{ab} is a perturbation on top of flat spacetime. We may write a theory of the sort g_{ab} = (e^{ω})_a^c η_{cb}, where the bimetric theory is recovered to O(ω) in the series expansion

          g_{ab} =~ (δ_a^c + ω_a^c) η_{cb}.

          Gravitons enter in if you write the perturbing metric term as h_{ab} = φ_aφ_b, or ω_a^c = φ_aφ^c. The Ricci curvature in this weak field approximation is

          R_{ab} - (1/2)R g_{ab} = □h^t_{ab},

          with h^t_{ab} the traceless part of the metric perturbation and □ the d'Alembertian operator. In a source-free region this gives plane waves. The two polarization directions of the graviton may then be interpreted as a form of diphoton, or two photons in an entanglement or "bunching," as in Hanbury Brown-Twiss quantum optical physics.
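          That a source-free perturbation admits plane waves can be verified symbolically; a minimal sketch in 1+1 dimensions (units with c = 1):

```python
import sympy as sp

# In the weak-field, source-free region a traceless perturbation component obeys
# Box h = (d^2/dt^2 - d^2/dz^2) h = 0 (1+1 dimensions, c = 1).
# A plane wave with omega = k solves this; one schematic polarization component:
t, z, k = sp.symbols('t z k', real=True)
h = sp.cos(k*(t - z))

box_h = sp.diff(h, t, 2) - sp.diff(h, z, 2)
# The d'Alembertian annihilates the plane wave
assert sp.simplify(box_h) == 0
```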

          If we now think of extending this to the strong field limit, there are squares of the connection terms Γ^a_{bc} in the Ricci curvature, cryptically written as R ~ ∂Γ + ΓΓ, where the nonlinear quadratic term in the connection appears. This nonlinear term indicates that the group structure is nonabelian, so the photon interpretation breaks down. The graviton in this case is a form of di-gluon, or gluons in an entangled state or chain that has no net QCD color charge. This connects with the AdS_n ~ CFT_{n-1} correspondence, where for n = 5 the conformal field theory is quark-gluon QCD physics. Further, D-branes have QCD correspondences, and this takes one into the general theory I lay out. One does need to look at the references to learn the specifics. The quantum phase transition to entanglement states is given in the paper cited in ref. 11 (L. B. Crowell).

          The simple fact is that as physics develops it will invoke new mathematics. I don't think I am overly mathematical in this essay, and I leave most of those details in the references. A theoretical physicist I think is wise to have a decent toolbox of mathematical knowledge and thinking. Physics invokes ideas of symmetries, remember Noether: symmetry corresponds to conservation law, and invariant quantities can also have connections with topology and number theory. I think the more one is familiar with advanced mathematics the more capable one is of thinking deeply about these matters.

          It is true that my work is commensurate with P. Gibbs'. If field theoretic locality and spacetime are emergent structures then so is causality. This emergence is connected with a quantum phase transition, or a quantum critical point (tricritical point of Landau), and something occurring on a scale much larger than the string length.

          Cheers LC

          hello to both of you ,

          Mr Corda,

          Happy to see you again.

          Regards

          "Physics invokes ideas of symmetries, remember Noether: symmetry corresponds to conservation law, and invariant quantities can also have connections with topology and number theory."

          "Invokes" is an excellent choice of word. My impression is that many physicists today would go farther and claim that symmetry is the basis from which the universe 'emerges' -- a very questionable assumption.

          I also agree that "the more one is familiar with advanced mathematics the more capable one is of thinking deeply about these matters." But that doesn't address the issue that "mathematics hangs on logic." And to assume that when space and time are abolished (coming "close to what we might call nothingness") somehow logic and math still exist, is to assume a lot. I believe it is a wrong assumption.

          Edwin Eugene Klingman

          Large symmetries are clearly important. The more general a symmetry group is, say with a larger Lie group, the more general the vacuum its transformations can maintain as a vacuum. In other words, symmetry preserves the ground state (vacuum), while broken symmetry does not, or maintains a more restricted ground state. There may of course be other elements to the foundations of physics than simply ever larger Lie groups, such as removing certain postulates like the locality of field data.

          A lot of this about mathematics and logic relies upon the philosophy of mathematics, which I have read about and find somewhat interesting. However, I am not that steeped in the subject, nor does it concern me that deeply. Some mathematical subjects have no reference to geometry, such as most of number theory. Of course we humans have to exist with all our causal structure in spacetime to study it. However, a mathematical realist would say that number theoretic proofs are true whether we know them or not.

          Cheers LC

          • [deleted]

          Dear Lawrence Crowell,

          While you have now correctly spelled annus, I see you are wrong again: "The introduction of the Monad, as Leibniz conceived it, is a direct result of his disagreement with Descartes and Spinoza." (http://www2.sunysuffolk.edu/schievp/file22m.html)

          You wrote on p. 3: "This would not quantum mechanics in any natural or realistic way." I do not understand this sentence.

          According to the title of your essay, unitarity is a foundation that is not a foundation. I wonder why you did not anticipate readers like Yuri and me who do not feel forced to immediately understand such play with words just because you mentioned the nebulous word "emerging". Having searched for "unitarity" in the text of your essay, I did not get the due explanation but only two hits.

          The abstract promised replacement of unitarity by modularity, a word that is not at all mentioned in the text.

          We merely learn that unitarity "might be emerging." In "abandonment of locality and unity" did you perhaps mean unitarity, not unity? The next sentence speaks of the loss of unitarity.

          I do not just criticize some imperfections; I am also ready to question your view factually, if you are willing to deal with my admittedly quite different one.

          Sincerely,

          Eckard
