Unitarity just means that any operator U which transforms a quantum state, Uψ = ψ', satisfies U^† = U^{-1}, where † denotes the complex-conjugate transpose in a matrix representation. For an operator H which is the generator of U, with U = exp(iH), then

U^† = exp(-iH^†) = U^{-1} = exp(-iH)

And so H^† = H. The theory of quantum mechanics based on this has been experimentally tested thousands of times. So unitarity is more than a hypothesis.
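As a quick numerical illustration (my sketch, not from the post; the random 3x3 generator is an arbitrary choice), a Hermitian H really does generate a unitary U = exp(iH):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (A + A.conj().T) / 2            # Hermitian generator: H^dagger = H

# exp(iH) via the spectral decomposition H = V diag(w) V^dagger
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(1j * w)) @ V.conj().T

assert np.allclose(U.conj().T @ U, np.eye(3))     # U^dagger U = I
assert np.allclose(U.conj().T, np.linalg.inv(U))  # U^dagger = U^{-1}
print("U is unitary")
```

The same check fails for a non-Hermitian generator, which is the content of H^† = H.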

Ergodic theory is more hypothetical because it is based on a certain interpretation of statistics, a form of frequentism.

Cheers LC

In 1954 Einstein confessed, in a somewhat enigmatic way, that his 1905 constant-speed-of-light postulate was both false and fatal for contemporary physics:

http://www.perimeterinstitute.ca/pdf/files/975547d7-2d00-433a-b7e3-4a09145525ca.pdf

EINSTEIN'S 1954 CONFESSION: "I consider it entirely possible that physics cannot be based upon the field concept, that is on continuous structures. Then nothing will remain of my whole castle in the air, including the theory of gravitation, but also nothing of the rest of contemporary physics."

The analysis showing that "physics cannot be based upon the field concept, that is on continuous structures" is equivalent to "physics cannot be based upon the 1905 constant-speed-of-light postulate" is not difficult to perform.

Pentcho Valev pvalev@yahoo.com

  • [deleted]

John,

You wrote, "Yet you don't think that various of today's theories might also be scales that will eventually fall from our eyes?"

It doesn't matter what I (or anyone) thinks. I can't emphasize enough that personal belief plays no role in science. I cited previously the example of phlogiston as a once widely accepted physical principle that we now know is superfluous. Where chemists once thought that an unknown substance (phlogiston) was necessary to make fire, because they couldn't understand combustion otherwise, after Lavoisier demonstrated that combustion is simply a process of rapid oxidation, phlogiston ceased to be an issue. It hardly serves science to look back and say "How stupid could one have been to accept such an idea in the first place?" when patient researchers were producing results step by step regardless of their innocence of the correct solution. And it's only the crankiest of accusers who would say before or after Lavoisier's discovery, "I told you that you were wasting your time on that idea," based on nothing more than some vague intuition, and having invested no intellectual toil and sweat in trying to produce the correct result themselves.

Scientists make conjectures and offer refutations based on the state of knowledge as it is, not on what they personally believe. The overwhelming proliferation of information today has made it all but impossible for an individual to hold all the knowledge, so debates are more intricate and sophisticated than they ever were. Nevertheless, facts and theorems still rule, and a counterexample is worth a thousand words.

Tom

  • [deleted]

John,

Apropos what Tom is talking about regarding how science has progressed over the course of history, in case you've not already read Thomas S. Kuhn's book 'The Structure of Scientific Revolutions' it's a wonderful read, imo. Puts some of the history of science in an interesting and useful light. Paradigms play an important role in shaping the way we think about the things we observe empirically, and Kuhn's book addresses the issue of paradigm change. Fascinating stuff.

In our discussions here at FQXi I've attempted to focus on the fundamental paradigms we're using when we talk about slippery topics such as "time." I continue to believe that this focus on paradigms is a potentially fruitful avenue to pursue. Paradigms are necessary to permit the creation of scientific theories, but paradigms also can tend to create tunnel vision in the way we think about the things we observe. It's something to be mindful of. Anyway, I can't recommend Kuhn's book too highly. I've read it several times and will undoubtedly revisit it for a refresher.

jcns

I have to second TH Ray's comment. A theoretical physicist might get some idea about things and work hard to develop a hypothesis. Even though that person might really hope their work is found in nature, there is no believing. This is not about wanting nature to uphold one's hopes, for we should not equate truth with hope.

Cheers LC

  • [deleted]

While I cannot see from Stachel's presentation that Einstein gave up the constant speed of light, the late Einstein did indeed explicitly say that the NOW worried him seriously.

As indicated by Stachel, several physicists already looked for ways out of some problems by elaborating the idea of discrete spacetime. Such an approach seemed to be promising because it could be declared a legacy of the idol.

I wonder whether someone was courageous enough to create a non-tense-less physics, which will presumably indeed render a lot of "contemporary physics" a "castle in the air".

Eckard

  • [deleted]

jcns,

There was a book written in the 70's called Paradigm Shift; it was one of the first books on physics I read and it was influential to my thinking. I tried looking it up, but having forgotten the author, and it likely being out of print, it didn't show up. The term gets lots of use these days. I tried writing a blog when they first became common, and titled it Peeling Paradigms. There definitely seems to be something of a cyclical nature to the process, as old paradigms age and congeal, which creates the need and space for fresh paradigms to rise.

Unfortunately I'm working two jobs lately and the time to read much of anything has become quite limited.

Tom and Lawrence,

"I cited previously the example of phlogiston as a once widely accepted physical principle that we now know is superfluous."

So spacetime as a "physically real model" is irrefutable. And blocktime and wormholes and all the other fantastical ideas springing from it.

"This is not about wanting nature to uphold one's hopes, for we should not equate truth with hope."

TH,

I was recently trying to work out what maths means for the universe as opposed to what it means to us. For example, if I have $5 in the bank and $5 in my pocket, I "have" a total of $10. This is about adding some knowledge. But if we look at planets, they increase or add mass by getting matter closer to their surface. So, I figure that the natural equivalent to addition is grouping, or getting things closer together. Subtraction would be the reverse: moving two things apart. But what happens if we increase the distance between all the parts of a group? We dilute its concentration, and that would be a "natural" division. A multiplication would amount to a general volume reduction or concentration.

How could one make some sense of this kind of maths? First, it would seem that natural maths is intrinsically tied to geometry. In our minds we may make calculations, but in the real world of real stuff ... addition of things that exist requires a place for those things to exist before we get to add them.

Secondly, my earlier exploration was meant to explain why nanometre-size light photons have "more energy" than kilometre-long radio photons... In that context, it was explained that both photons have the same "energy", the Planck. Their real difference was their power. The nanometre photon just delivers its Planck more quickly than the radio photon. In other words, in a universe where passing time is the basic canvas, how quickly something happens does matter. So, a grouping could be associated with an increase of power of the whole group... ?? Power of what? Power of existence? Maybe (for whatever it means). Because the background is passing time, existence (matter existing) is not a state! It is an actual dynamic process that can therefore be associated with a time rate and a kind of power of existence, of sorts. ...

Strangely, while the universe is expanding (in division), everything existing in it is in a grouping mode (multiplication a.k.a. universal gravitation.)

Just having fun!

Marcel,

  • [deleted]

LC,

You argued: "The theory of quantum mechanics based on this has been experimentally tested thousands of times. So unitarity is more than a hypothesis."

While QM is a well-confirmed theory, its application to cosmic scales might be rather a wild guess, at least to me. Is it not a bit far-fetched to absolutely exclude the possibility that CMB radiation can be explained without cosmic inflation on this basis?

Eckard

  • [deleted]

Eckard,

The irony is that the math is likely far more comprehensible for a non-inflationary cosmology. Einstein originally proposed a Cosmological Constant to maintain a stable universe, and the expansion has been shown to resemble a CC. So it would seem "space" expands between galaxies proportional to the rate it collapses within them, and this is not coincidence, but two sides of the same process. Yet it seems the case has been closed and cannot be reopened, because this generation of theorists was raised to this view.

The CMB exists without inflation. It is predicted by the standard FLRW spacetime model of the big bang. What inflation does explain is why the CMB is so smooth, with only small anisotropy. It also tells us how different regions of the CMB originated from the same initial conditions.

The quantum aspect of inflation is that the vacuum energy stretched out space enormously, by about 63 e-folds. The vacuum energy just defines an energy density through

Λ g_{00} = (8πG/c^4) T_{00}, with T_{00} = ρ c^2 g_{00}, i.e. Λ = (8πG/c^2)ρ,

for ρ the mass density of the vacuum. So the vacuum energy provides the energy for the dynamics of space, where that dynamics is purely classical.
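For a sense of scale (the numbers here are my assumptions, not from the post; the value of Λ is roughly the observed one), the relation Λ = (8πG/c^2)ρ gives the familiar tiny vacuum density:

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2, Newton's constant
c = 2.998e8          # m/s, speed of light
Lam = 1.1e-52        # m^-2, roughly the observed cosmological constant

# Invert Lambda = (8*pi*G/c^2) * rho for the vacuum mass density
rho = Lam * c**2 / (8 * math.pi * G)
print(f"rho ~ {rho:.1e} kg/m^3")   # ~6e-27 kg/m^3, a few proton masses per m^3
```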

Cheers LC

Lawrence,

What would light look like if it had been redshifted far beyond the visible scale? Wouldn't it resemble the blackbody radiation of the CMBR?

Wouldn't it also have small distortions around where the emitting source is located, like ripples in a stream around a submerged rock?

Anil Ananthaswamy wrote: "Unfortunately, physics treats time rather differently. Einstein's theory of special relativity presents us with a four-dimensional spacetime, in which the past, present and future are already mapped out."

Equivalently, Anil Ananthaswamy could have written, referring to Banesh Hoffmann's text below: "Unfortunately, Einstein resisted the temptation to account for the null result of the Michelson-Morley experiment in terms of particles of light and simple, familiar Newtonian ideas":

http://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768 "Relativity and Its Roots" By Banesh Hoffmann "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd... (...) And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether."

    The CMB has a thermal or blackbody distribution of frequencies. This peaks in the microwave region. We identify infrared with heat because the blackbody distribution for most hot sources peaks in that region. There is actually nothing fundamental about that. A blackbody curve can peak in the extreme gamma radiation frequency range, and in fact it did so in the rather early universe.
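Where the peak lands follows from Wien's displacement law, λ_peak = b/T; a minimal sketch (the law and constants are standard textbook values, not spelled out in the post):

```python
# Wien's displacement law: a blackbody at temperature T peaks at lambda = b/T
b = 2.898e-3                      # m*K, Wien displacement constant

for name, T in [("CMB today", 2.725), ("Sun", 5772.0), ("early universe", 1e9)]:
    lam = b / T                   # peak wavelength in metres
    print(f"{name}: T = {T:g} K -> peak ~ {lam:.3e} m")
```

The 2.725 K CMB peaks near 1 mm (microwave), the Sun in the visible, and a 10^9 K plasma deep in the gamma range, illustrating that nothing about the microwave peak is fundamental.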

    Other galaxies outside our local group will redshift increasingly. In 10 billion years they will only be observed in the IR band. However, the distribution of frequencies from these sources will not be Gaussian or blackbody. As this occurs, the arrival rate of photons will decrease, their wavelengths will increase, and it will become very difficult to detect them.

    Cheers LC

    Lawrence,

    Just in theory, assume there are infinite numbers of galaxies in infinite space, so that even radiation from much further than 13.7 billion light years away eventually reaches us. Wouldn't that distribution of sources fill in all the gaps and result in black body radiation below the infrared and in the microwave?

    " ... assume there are infinite numbers of galaxies in infinite space ..."

    Why?

    • [deleted]

    I am answering this in a new text box.

    An infinite universe is plausible. The RxR^3 topology of the universe may not have the same vacuum structure everywhere. There may be zones where the vacuum energy Λ_0 >> Λ, where Λ_0 and Λ are the bare vacuum and the broken vacuum cosmological constants. So our observable universe may have a sort of barrier, and a signature of that might exist on the CMB. This is the Linde pocket universe model, and these bubbles are due to the physics of Coleman --- called bubble nucleation. There are some good reasons to think this is the case.

    However, we can assume the vacuum is the same for the entire R^3. Any observer looking out will see galaxies and objects further out move at increasing velocity, indeed faster than light. If the space is infinite, the velocity as one looks out to infinity becomes infinite. I outline how this works below. So if that happens, then any photon emitted sufficiently far out is not just redshifted to the IR or into the radio-wave band. It redshifts beyond the horizon length ~ 10^{10} light years. Again, I illustrate what the horizon is below. So this quells Olbers' paradox, for anything sufficiently far out is redshifted to such low frequencies that it is not very observable. Further, anything that far out emits photons along our light cone from behind the CMB opaque region. This means such photons are swamped by that boundary. This would not be the case if these quanta were in the form of neutrinos. In principle, with neutrinos we could observe the universe at a time far earlier than the CMB. Gravitons similarly could permit us to observe right back to the quantum gravity event. These gravitons would be redshifted into long-wavelength gravity waves which might perturb the CMB in so-called B-modes.

    The problem with an infinite universe with a single vacuum is that our past light cone extends infinitely outwards. In order to have a finite time, the initial inflationary period would have involved an infinite expansion. That is a bit of trouble. So to prevent some problems, the Linde pocket universe idea (eternal inflation etc.) is a better candidate. However, the whole R^3 contains an infinite number of these pockets expanding out at an extremely rapid rate. So in some sense we have pushed the problem out to another level. So this R^3 might have started out as a three-sphere S^3 where a point was removed, and that topological information is involved with the quantum information of these bubbles, which are finite in number. Further, for reasons I will go into right now, the huge vacuum these bubbles are contained in may run down, which would also mean the creation of pocket cosmologies is finite and of brief duration.

    The expansion of the universe is described by a scale factor a(t). Given a radial distance r, the scale factor gives a new radial distance r'(t) = a(t)r. I will use Newtonian mechanics and gravity, for it turns out that this gives the same thing as general relativity for flat space, but curved spacetime. General relativity is somewhat complicated to work with. In Newtonian mechanics with gravity, the kinetic energy of a moving object is (1/2)mv^2 and the potential energy is -GMm/r'. The velocity is determined in our situation by the scale factor, so that r'(t) = a(t)r and v(t) = (da/dt)r. A little bit of calculus is entering in here. Setting the total energy, the sum of these, to zero, we have

    (a')^2 = 2GM/(a r^3), a' = da/dt

    We then have M = (4π/3)d (ar)^3, for d = density of matter in a spherical region of radius r' = a(t)r. We then write this dynamical equation as

    (a')^2 = (8πGd/3) a^2,

    where the Hubble factor is H = a'/a. Now if I assume that the density is constant, then this is a differential equation a' = Ha, with H = sqrt{8πGd/3}, and the solution is

    a(t) = (1/H)exp(Ht).

    So the scale factor expands exponentially. This is approximately a de Sitter spacetime configuration.
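The exponential solution can be checked numerically; this is just a sketch (the step size and units are arbitrary choices of mine), integrating a' = Ha with simple Euler steps:

```python
import math

H = 1.0                 # Hubble factor, arbitrary units
n = 10_000
dt = 1.0 / n            # integrate from t = 0 to t = 1
a = 1.0
for _ in range(n):      # forward-Euler steps of a' = H*a
    a += H * a * dt

print(a, math.exp(H))   # Euler result vs exact exp(Ht) at t = 1
```

The two numbers agree to a few parts in 10^4, confirming the exponential, de Sitter-like growth of the scale factor for constant H.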

    For small distances the Hubble factor gives v = Hr, for r a small radius out. Setting v = c, one can compute the radius where that occurs: r = c/H, which is the cosmological horizon distance

    R = c/H = sqrt{3c^2/(8πGd)} = sqrt{3/Λ},

    where Λ = (8πG/c^2)d is the cosmological constant. This horizon distance is about 10 billion light years.
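Plugging in a roughly observed value for Λ (my assumption, not stated in the post) reproduces the quoted order of magnitude for the horizon R = sqrt(3/Λ):

```python
import math

Lam = 1.1e-52                    # m^-2, assumed cosmological constant
ly = 9.461e15                    # metres per light year

R = math.sqrt(3.0 / Lam)         # de Sitter horizon distance in metres
print(f"R ~ {R / ly:.1e} light years")   # order 10^10 light years
```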

    This event horizon is not a barrier to our ability to observe things. It is similar to the event horizon of a black hole, but it is analogous to looking out into the exterior world from inside a black hole. It is a barrier to our ability to send a signal to anything beyond this distance. A galaxy with z > 1 is beyond this event horizon, and the CMB has z ~ 1000. What happens with disappearing galaxies is that they will accelerate away and become highly redshifted. In about 10 billion years all galaxies outside our local group will be redshifted out of the optical band. An intelligent life form could observe other galaxies if it uses IR or microwave instruments. The CMB will recede into the radio-wave band and to longer wavelengths.
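A small consistency check on the z ~ 1000 figure (T0 = 2.725 K is my assumed present-day CMB temperature, not given in the post): a blackbody temperature scales as (1+z), which puts the CMB's emission near the recombination temperature:

```python
T0 = 2.725            # K, assumed present-day CMB temperature
z = 1000              # redshift of the CMB surface quoted above

T_emit = T0 * (1 + z) # blackbody temperature scales as (1+z)
print(f"T at emission ~ {T_emit:.0f} K")   # ~2728 K, near the ~3000 K of recombination
```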

    LC

      • [deleted]

      Tom,

      If redshift is a function of distance and not recession, then that black body radiation is the light from those infinite numbers of stars. It has just fallen off to the microwave spectrum.

      • [deleted]

      Tom,

      Lawrence's post, 5/23, at 13:46;

      "As this occurs, the arrival rate of photons will decrease, their wavelengths will increase, and it will become very difficult to detect them."

      From my essay in the recent contest;

      "As the light from a star expands out to fill the volume around it, it necessarily grows more diffuse, as the same amount of energy must cover ever more volume. The further away that star is, the smaller it appears and the fainter its light gets. Since the smallest measurable quantity of light we can detect is what will trip that electron, eventually it reaches the point that barely enough is reaching our detectors to even trip one atom on the detector. Beyond that and the duration between the detections start getting further apart, so that the resulting wave pattern created by the continuing process of measuring these photons will have longer wavelengths."