
Thanks, John R. To briefly explain: just as there are 6 coordinate points that fix the location of the observer's origin in 3-dimensional space (up, down, left, right, forward, backward) -- and symmetry assures us that only 3 are required for an observer to describe a vector from the origin -- a 4-dimensional set (Minkowski spacetime) requires 16 coordinate points, of which 6 are the redundant points of 3-dimensional space, leaving 10, of which 2 are vectors of opposite sign. That leaves 8 coordinate points to fix the origin, with one vector reversible to the origin.
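For reference, a standard counting that parallels this, assuming the usual split of a 4 x 4 array of components into its symmetric and antisymmetric parts:

$$ 4 \times 4 = 16 = \underbrace{\tfrac{4(4+1)}{2}}_{\text{symmetric: }10} \; + \; \underbrace{\tfrac{4(4-1)}{2}}_{\text{antisymmetric: }6}. $$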

Clear as mud?

As applied to physics, the Minkowski formulation is conceptually easy and operationally hard -- the vector algebras (Hamilton's quaternions and Clifford algebra, the Cayley-Dickson octonions, Hestenes' spacetime algebra) are operationally easy and conceptually hard. I mean that the linear and complete vector formulation of a continuous spacetime is much easier to calculate with than one built of partial differential equations.
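As a reminder of what makes those algebras "operationally easy", Hamilton's defining relations pack the whole quaternion multiplication table, and with it spatial rotation, into a single line:

$$ i^2 = j^2 = k^2 = ijk = -1. $$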

What we gain in simplicity, however, we lose in the possibility of analytic continuation to higher-dimensional physics. Because time is vectorized in the linear model with symmetric reversibility, there is no true scalar value -- or rather, the scalar is identical to the sum of vectors. In my own research, I characterize this result as: "The four dimension horizon is identical to the ten dimension limit." There is no calculable scalar value to account for the evolution of the time parameter beyond the horizon of observability.

Though we often speak of "vector analysis" as if it were truly analytical, it is not. Higher-dimensional theories beyond Euclidean R^3 have the advantage of explaining, in a mathematically complete way, how hidden variables affect results in the continuous-function physics that we experience. Whether these hidden variables can be explained as nonlocal, as in conventional string theory (which is extended from quantum field theory), or local, as allowed by Joy Christian's measurement framework, is the next big question.

Best,

Tom

Last was mine.

Jason,

I don't have the interest in ghosts that you do. I am only making the point that if such spirits exist, they require a continuous-field model of physical reality. Evidence by itself doesn't mean anything to science in the absence of a theory that incorporates it.

Best,

Tom

Thanks Tom,

I can follow the general idea, but I'm the muddy character here. Much food for thought, though I think it also illustrates why there is a growing appreciation for topology and its simple connectivity as a way to make space algebraic. Not that I have yet gotten much of a handle on it, but it's fascinating to contemplate how, given the finite ('astonishingly low' - Fitzgerald) velocity of physical communication possible, space can be connected at all. I'm taking a little break at present. Somewhere I tripped on the Hopf fibration and fell headlong into an impossible image of a one-dimensional continuous line being turned inside out like a sweater sleeve that got caught on a cufflink. Wish me luck.

I'm not much for ghosts pushing fiancés down the stairs either, doesn't hold up in court. jrc

Tom,

It's OK not to be interested in ghosts or experiments performed to detect ghosts. They don't really lend themselves to mathematical descriptions.

John Cox,

"I'm not much for ghosts pushing fiancés down the stairs either, doesn't hold up in court. jrc "

A court of law would consult a physicist; but physicists require a theory along with the evidence. Therefore the physicist would call it "pure buncombe," because they don't have a theory. Of course this has nothing to do with the fact that it actually happened.


Eckard,

Despite any criticisms of Michelson's methodology, it was Maxwell, and the refined apparatus results of Michelson and Morley, that are pre-eminent in proving that light velocity is constant in ideally empty space. Any arguments to the contrary are just that: contrary to all physical chemistry and electronics.

My naïve model is purely classical, so I doubt it would be applicable to Fourier transformations applied to the purely mathematical probability wave function in QM. I only mentioned it because it incorporates a classical continuous function that resolves to a discrete particle, and as such contributes to why we have a universe that follows quantum rules. If I were to publish I'd want some modern electronic-media technical help; this computer world slays me.

My point about measuring from the waveform is simply this: ideally empty space does not contain an observer, nor a position for an observer. If we accept M&M, we should look to the wavelength of light in empty space as the only location available to find a benchmark from which to measure its velocity. I hypothesized density varying directly with velocity. Another approach might be found in the rectified length of curvature of the sinusoidal curve in relation to Planck's constant, or in relation to the proper time span of that wavelength. This is an admittedly empirical approach, because we would first accept a priori that the velocity of light is constant in our arbitrary system of unit measures. And Einstein didn't establish that constancy; he worked from it. jrc
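(A sketch of what that 'rectified length' could mean, assuming a simple sine profile of amplitude A and wavelength \lambda: the arc length over one wavelength is

$$ s = \int_0^{\lambda} \sqrt{1 + A^2 \left(\frac{2\pi}{\lambda}\right)^2 \cos^2\!\left(\frac{2\pi x}{\lambda}\right)}\, dx, $$

an elliptic integral with no elementary closed form, which may be part of why the approach stays empirical.)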

Yes we are that great.

1. America provides an amazingly good life for the ordinary guy.

2. America offers more opportunity and social mobility than any other country, including the countries of Europe.

3. Work and trade are respectable in America, which is not true elsewhere.

4. America has achieved greater social equality than any other society.

5. People live longer, fuller lives in America.

6. In America, the destiny of the young is not given to them but is created by them.

7. America has gone further than any other society in establishing equality of rights.

8. America has found a solution to the problem of religious and ethnic conflict that continues to divide and terrorize much of the world.

9. America has the kindest, gentlest foreign policy of any great power in world history.

10. America, the freest nation on earth, is also the most virtuous nation on earth.

http://www.phillytalkradioonline.com/comment/10-great-things.html

But more to the point, America guarantees my right to free speech, freedom of religion, and freedom of the press. Also, without us, the neutral nations of Europe would be gobbled up by Muslim countries and other predatory nations.

New Physics! And strong support for the causal QM of my essay, well timed!

Have We Been Interpreting Quantum Mechanics Wrong This Whole Time?

The only thing it doesn't really do, and it's the key to everything, is show how 'non-locality' can be produced classically. I recently lodged a short (2-page) 'summary' resume of the fuller derivation in my essay, consistent with the above, here:

Classical reproduction of quantum correlations.

Paradigm changes can't be instant, but my original 2020 estimate now looks more realistic: '2020 Vision. A model of Discretion in Space', http://fqxi.org/community/forum/topic/803

The same electron (Compton/Raman) scattering mechanism at c in the electron centre-of-mass rest frame ('discrete field dynamics', or DFM) appears able to coherently rationalise both SR and QM without paradox, allowing convergence (see the other 3 essays). If anybody can spot any apparent flaws, do please flag them up. Thanks.

Could this be a red letter day for fqxi? Hmmm.

Peter

    Eckard,

    I should have added that I agree with you about the interpretation of Lorentz, which in every reading I have encountered requires Lorentz invariance to mean that any acceleration to light velocity results in 'infinite mass'. I have long thought that an absurdity. My argument about energy density varying in direct proportion to velocity is that as density diminishes, greater applied energy would be necessary to maintain continued acceleration. Also, Lorentz can be treated as an exponential function within the limits of the mass-energy equivalence. With your practical knowledge of mathematics, is that defensible? I do not mean to question your arguments as to SR being unnecessary; I think you have a good point. I'm just looking at a rationale for Lorentz covariance rather than invariance. Thanks, jrc
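    (One sense in which the Lorentz factor is exponential, offered only as a reference point: writing the rapidity as \phi = \tanh^{-1}(v/c), a boost is the exponential of its generator, and \gamma = \cosh\phi grows without bound as v approaches c:

    $$ \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \cosh\phi, \qquad \Lambda(\phi) = \exp\begin{pmatrix} 0 & \phi \\ \phi & 0 \end{pmatrix} = \begin{pmatrix} \cosh\phi & \sinh\phi \\ \sinh\phi & \cosh\phi \end{pmatrix}. $$
    )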

    It is simply impossible to say with certainty that a wavefunction exists or does not exist, because of the very nature of language. Existence is a question of the nature of matter in either of the two realities of the gravity and quantum universes. Without a complete and self-consistent set of axioms for the universe that we have, you cannot state with certainty what existence really means, and therefore whether a wavefunction exists or not. In other words, the patchwork of gravity and quantum action within mainstream science means that there are at least these two somewhat inconsistent meanings for the existence of matter.

    Does matter exist? Certainly. Does matter exist with both amplitude and phase? Once again, I say yes. Does a wavefunction represent the existence of such a matter wave? Most certainly.

    You say that since there is no measure of a wavefunction, a wavefunction does not exist. But science infers that a great many things exist from just indirect evidence. There is no direct measure of a quark, but quarks exist, and we cannot directly measure a Higgs boson, but science still argues that it exists. And of course there is by definition no measure of a black hole, and yet science presumes black holes exist from much indirect evidence. Since we also have much indirect evidence for the existence of matter waves, it therefore seems useful to suppose that wavefunctions also exist and represent those matter waves.

    In fact, it is very useful to use the concept of matter waves to represent the stuff that makes up objects. This is because matter waves not only make up atoms and molecules; the exchange of matter waves is the glue that bonds those matter particles together. Quantum electrodynamics associates vacuum oscillators with the propagation of light through space by the exchange of photons. Light is a matter wave, and that is self-evident, and light's movement occurs by a particle exchange that excites and de-excites those vacuum oscillators in space.

    Correspondingly, matter waves evolve by a similar process of matter exchange with the same boson matter that is the basis of the entire universe. When a particle gains velocity, it gains mass, and when a particle loses velocity, it loses mass. If a particle accelerates due to a force, the particle's mass evolves continuously. In fact, a particle's mass is then due to its absolute velocity, which is the speed of light and is the rate at which the universe shrinks. Since the universe shrinks at the speed of light, light itself is stationary in the frame of reference of our shrinking universe of matter.

    In other words, all motion of a particle is equivalent to a change in its mass. Everywhere in the universe is subject to acceleration due to the gravity force, so it is therefore convenient to associate a boson matter exchange with a change in mass to describe both gravity and charge forces. The vector potential of Maxwell's equations becomes a matter acceleration, and is now the unification that drives both gravity and charge forces.

      If wave-functions do exist as real things, then I think we should consider that the speed of light/permittivity/permeability is a property of the wave-function, not a property of just empty space. The reasoning would be as follows: the best way to explain the invariance of the speed of light for all matter and all energy is to say that it's because all matter and all energy have a wave-function associated with them. All interactions have a wave-function that spans between the interacting elements. Wave-functions are the invisible "existent" thing that imposes the invariance of c. Maybe in a thousand years we will know how to disconnect a spaceship from the wave-functions of the rest of space-time, which will disconnect the spaceship from the speed-of-light restriction. This would allow us to travel outside of space-time.

      Your presentations are interesting. I have done some presentations, live and in front of people, of related material. I focused on homotopy theory. The double slit experiment is a form of homotopy, where there are two sets of trajectories that are distinct by a topological obstruction. The measurement of which slit the particle passes through transfers the superposition into an entanglement with a needle state. Entanglements can then be a case of topology or homotopy. I am particularly interested in the case of where a quantum system entangles with a black hole.
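      (A minimal statement of that obstruction, assuming the barrier between the slits is idealized as a puncture in the plane: the fundamental group is nontrivial,

      $$ \pi_1\big(\mathbb{R}^2 \setminus \{p\}\big) \cong \mathbb{Z}, $$

      so a trajectory through one slit cannot be continuously deformed into a trajectory through the other without crossing the barrier.)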

      Cheers LC

      Lawrence,

      Nonlocality can now be produced classically, as shown in the links above. You suggested:

      "Physically the nonlocal properties of QM simply can't be reduced to a classical realization."

      That has certainly become the established viewpoint, but is it just a 'cop out'? John Bell was unhappy with it, calling it 'sleepwalking' and saying:

      "in my opinion the founding fathers were in fact wrong on this point. The quantum phenomena do not exclude a uniform description of micro and macro worlds...systems and apparatus." (Speakable..p171)

      In fact he went further; "It may be that a real synthesis of quantum and relativity theories requires not just technical developments but radical conceptual renewal."

      And in 'Beables..'; "I think that conventional formulations of quantum theory, and of quantum field theory in particular, are unprofessionally vague and ambiguous. Professional theoretical physicists ought to be able to do better."

      I tend to agree, and think the implications of the links I posted above may prove very important. Views?

      Peter

      jrc,

      Yesterday Peter Jackson challenged me to watch, even without sound, a video by Teufel. IIrc, when he explained the theory of guiding waves, Teufel argued that velocities can be derived from positions. An Italian co-worker referred in her dissertation to Albers (?), who considered this view incomplete. Common sense tells me that the velocity of light equals the distance from the position of the emitter AT THE MOMENT OF EMISSION to the position of the receiver AT THE MOMENT OF ARRIVAL, divided by the time of flight. Doesn't this easily explain the experiments in Potsdam by Michelson in 1881 and in Cleveland, together with Morley, in 1887, on condition one abandons Maxwell's idea of a light-carrying medium?

      In an earlier essay I dealt with an acoustic Michelson experiment by Norbert Feist, who was shocked by my explanation. The prediction outlined by Michelson and Morley in 1887 was not quite correct. They admitted this, and there is anyway no need for a correction; the null result is plausible provided one is ready to abandon the idea of a light-carrying medium. Lorentz defended such a medium with his hypothesis of length contraction according to the so-called Lorentz transformation.

      You described the history a bit differently. I agree that Morley was an important scientist too. However, it seems to me that the basic method concerning a light-carrying medium was created by Michelson. Also, it seems to me you are confusing the effect of increased mass, which was already found by Thomson, with its later attribution to the Lorentz transformation.

      Why do you deny the possibility that a receiver/observer of light from an emitter may be located within an otherwise ideally empty space? I agree that the wavelength of light does not change there.

      As for the Fourier transformation between position representation and momentum representation, I maintain that a cosine transformation might be sufficient and even more appropriate.
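      (For comparison, in one common convention the two transforms are

      $$ F(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt, \qquad F_c(\omega) = 2\int_{0}^{\infty} f(t)\cos(\omega t)\, dt, $$

      and they coincide whenever f is real and even, which is the case where the cosine transformation suffices.)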

      Eckard

      There are a number of counter-arguments to these. These statements are typical propaganda slogans, which have, or had, maybe some element of truth. However, there are a number of things that can be raised to at least raise questions. The United States has since its origin been in a fairly major war every 20 years. Recently we left Iraq after causing, directly or indirectly, the deaths of over a million Iraqis. With Vietnam we killed over 3 million, and Korea was similar. I am not here to get into the geo-political reasons for these wars, but we do have a serious history of bombing and attacking nations. As for equality and related matters, that might have been the case up to the 1980-90 time period. There are a lot of metrics which challenge these agitprop-type statements.

      On the whole, the halcyon statements about this country had more truth to them in the past, that is, from the 1950-1960 to the 1980-1990 time period. Since then we have been backsliding.

      LC

      The de Broglie-Bohm theory of QM is perfectly acceptable. It has no particular flaw, unless one takes it as a theory of local hidden variables. The problem with that is that the pilot wave must adjust to a quantum outcome. Identically prepared quantum systems would have the same quantum pilot wave. As the system approaches the slits, the pilot wave must "decide" which configuration to assume. It must either go left or right, and this is a nonlocal connection. I take a picture from this article and change it slightly to illustrate a quantum OR condition. The pilot wave as it approaches the double slit must adjust to either situation, and this is so even if the particle or "beable" is heading directly towards the midpoint between the two slits. This is a nonlocal property, and it is reflected in how the quantum AND logical condition does not distribute across the OR condition.
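      (A standard textbook illustration of that failure of distributivity, not specific to the pilot wave, uses closed subspaces of the qubit space C^2 with AND (\wedge) as intersection and OR (\vee) as closed span:

      $$ A = \operatorname{span}|0\rangle,\quad B = \operatorname{span}|{+}\rangle,\quad C = \operatorname{span}|{-}\rangle: \qquad A \wedge (B \vee C) = A, \qquad (A \wedge B) \vee (A \wedge C) = \{0\}. $$
      )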

      The nonlocal property of the pilot wave means that this particular "picture" of the pilot wave is a special condition, or analogous to a gauge. The pilot wave is in fact in an infinite number of configurations. All one has to do is perform a symplectic (canonical) transformation of the classical variables to get another configuration for the beable and pilot wave. Each of these configurations is related to the others nonlocally, and they form a congruency that is a form of path integral.

      These results are interesting, but I suspect that if the statistics were carefully analyzed, they would be found to obey the Bell inequalities. I would be genuinely surprised if these turn out to produce the inequality violations.
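      (The check in question would presumably be against the CHSH form: with correlations E at settings a, a', b, b',

      $$ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \ \text{for local models}, \qquad |S| \le 2\sqrt{2} \ \text{for quantum mechanics}, $$

      so any claimed classical reproduction of the quantum correlations has to be tested against that bound.)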

      LC

      Attachment #1: double-slit-quantum-or.jpg

      Eckard,

      Please understand that I am acutely aware that my own lack of education might lead you and others into some embarrassment attempting discourse with me, and I can only apologize. Until about two years ago I was in a poverty/political trap that constrained any mobility. Where I'm coming from, I could carry all the hard science and math books from the new modern public library under one arm, and the reactionary right-wing politics threatening European stability at present has always been pervasive where I have lived. Now I have a computer and can begin to find some intelligent reading. (And I still have two of my upper teeth in more or less one piece!) And I've relocated to a small liberal arts college town.

      So... I am not familiar with Michelson's discussions with Thomson, but I am not surprised they would have been engaged. I am aware that at the time Maxwell conducted his exhaustive analysis of Faraday's results, the 'aether' medium in Newtonian space was the prevailing thinking. What I find important about Maxwell's discovery of the 'c' proportionality between the electric and magnetic intensities of point charges is not only that it means light is only one segment of the spectrum, but that all physical chemistry is dependent on that proportion being constant. Perhaps that is why Morley, a chemist, was interested in Michelson's efforts. I understood it was Michelson who attempted to prove the aether's existence, which failed in the null result. It is an interesting footnote that some aether-based calculations had determined that the aether would have had to have a rigidity equivalent to steel for the wave form to transmit an energetic response across space (I think I read that in an Asimov book). I have wondered, however, whether the evidence gathered by the time of Michelson's first interferometer had not called the aether hypothesis into enough question that it was simply politically expedient to present the experiment in terms of 'confirming the aether wind'. It seems many concepts that become the prevailing wisdom at any time, such as our current understanding that the electromagnetic spectrum is independent of any medium, have had a lengthy incubation period. Relativistic ideas go back to Galileo and beyond.
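      (In modern notation, the proportion Maxwell identified is fixed by the electric and magnetic constants of free space,

      $$ c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8 \ \mathrm{m/s}, $$

      which is why any change in it would ripple through all of physical chemistry and electronics.)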

      Actually, Eckard, though I have striven to understand both SR and GR at least conceptually, I am quite comfortable with your own 'good old notion of ubiquitous time', and I also find the present theoretical climate more productive of a 'snowstorm of mathematics' than of real progress. I like the tried and true method of experimentation on a controlled workbench, and if truth were admitted, the much-hyped success of QM is in reality the product of engineers who, having tested the theoretical predictions to no avail, have gone back to the reliable technique of 'poke it with a stick and see what it does'! And if an arbitrary scalar increment of time is acceptable in QM such that it 'zeroes out', then 'tick, tick, tick' is as good a methodology in classical physics to explain the null result of Michelson. I think you summed it up correctly that 'all numbers are ideal'; it's our science, and we choose what method we restrict our inquiries to and how we want to devise our reference frames. The only final criterion is that in following any inquiry we do not violate any of our own axioms along the way.

      I won't digress about mass increase with velocity, other than to propose that there might be a 'break-even' point where a quantity of energy at rest will prescribe a proportional density that constitutes matter, and which will behave under acceleration as a mass increase, while a smaller quantity will prescribe a proportional density which will behave as an electrodynamic charge, and behave as a decreased density under acceleration with applied energy. And if Lorentz and Poincaré don't like the speed I'm going now, they sure as hell won't like the other one.

      Seriously, on the topic of Why Quantum, first we have to resolve the 'zero point particle' absurdity. Singularity is a mathematical property, not a physical property. And for more than a century science has said 'E=mc^2', and the EM spectrum is a wave, and the EM spectrum is a particle. Those who write checks on a black budget (government, industry, academia) want a damned particle! Put some energy in one spot and make one that will last long enough to put a dent in something else! Then and only then will science save itself from itself, and the public will again think of it as was commonplace in the era of Newtonian application to industrialized progress. There have to be discrete somethings made manifest to argue about quantum probabilities in the first place. And until we do that, the public and politicos will consider science just a pissing contest among prima donnas.

      I'm not versed in Fourier analysis, but I would think the direct geometric connectivity of cosines would be readily adaptable to the continuous transformation of wave characteristics.

      What is the acronym, IIrc? Pardon the length of this, jrc

      I interpreted the two-slit experiment to mean that the electron somehow became un-manifested in such a way that it went through the two slits as a wave. The pilot-wave theory says that the waves are real, but that the electron is always manifested like a hard sphere. I disagree with the pilot-wave interpretation. I believe that the particle somehow becomes un-manifested, as if melting back into the wave, until something directly measures its properties, at which time the electron fully manifests again.
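      (The wave behaviour being interpreted here is the familiar two-slit pattern; for slit separation d and wavelength \lambda, and neglecting the single-slit envelope, the screen intensity goes as

      $$ I(\theta) \propto \cos^2\!\left(\frac{\pi d \sin\theta}{\lambda}\right), $$

      and single electrons build this pattern up one detection at a time.)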

      Can anyone dispute or point out an error in my interpretation that the electron becomes UN-manifested at the two slits?

      By suggesting that the electron becomes "un" manifested by the 2 slit diffraction, have I violated a major law of physics? Or created a paradox?

      The good news is that once we figure this out, it will most likely lead to new physics. Maybe we'll find out that we have grey alien neighbors.

      Lawrence, Jason. The 'delayed choice' simply resolves on choosing different starting assumptions. Yes, Jason, as in QED's sum over paths and the Huygens construction (foundational to quantum and laser optics and photonics), there can be no 'photon' or 'path' until the combined Schrodinger sphere surfaces (NLS equation) are forced to interact with matter (a 'detector'), where only ONE position has adequate constructive interference to quantize the new particle (at different 'ranges' the 'positions' also differ).

      'Non-locality' then emerges from particle 'reversibility', which 'weak (statistical) measurement' can't discern. I invoke electron 'spin flip', which in fact Bell also did, relating measurement to the direction of DETECTOR field electron spin - which REVERSES with EM field orientation (setting angle)! Bell also admitted his (Bohr) assumption: Bertlmann's socks always differ. A sphere has BOTH spins (poles), and a sock can be randomly worn inside out (pink becomes green). OAM of a sphere is conserved through x and y axis rotation. 'Direction' is NOT conserved!!!!! That's the DFM derivation of 'non-locality'.

      To explain in terms of Wheeler's view and delayed choice (i.e. Jacques):

      The focussed waves follow BOTH 'paths' from splitter 1, so each detector has a 50:50 chance of clicking.

      Introducing a second splitter COMBINES them, so phase can be tuned so EITHER detector can have 100% constructive interference, leaving the other 0%.

      As Wheeler said: "No elementary phenomenon is a phenomenon until it is a registered (observed) phenomenon. ... It is wrong to speak of the 'route' of the photon in the experiment of the beam splitter. It is wrong to attribute a tangibility to the photon in all its travel from the point of entry to its last instant of flight."

      Timed pair experiments then CAN access data which gives A,B aa or bb if just one detector dial is reversed. My previous essay identifies that 99.999% of Aspect's data confirmed this, but he couldn't theoretically rationalise it, so he discarded it. Weihs (et al., inc. Zeilinger) found the same, so just 'corrected' for it! Perfect examples of 'theory bias' in experimentation.

      My short summary completes the work in the link above by classically explaining non-locality as well as entanglement.

      Classical reproduction of quantum correlations. But we seem now permanently trapped by 'theory bias'. Can you now see the solution Lawrence? Most surely won't.

      Jason

      "The good news is that once we figure this out, it will most likely lead to new physics. Maybe we'll find out that we have grey alien neighbors." The figuring out was the simple bit. It's done. The real job seems to be is to overcome our human failings to make it visible!.