• Cosmology
  • A Self-Gravitational Upper Bound on Localized Energy

Hi Jonathan,

You're welcome. For sure, Schiller's work supports what Steven is presenting; I think Steven took it a bit further. However, as I alluded to above, I suspect there is more to maximum force than what Planck units might be showing us. I believe there probably is a maximum force in nature, but it probably is not quite so clear-cut as c^4/4G. With the advent of Joy's work and also Michael Goodband's work, we probably should account for extra spatial dimensions, and for torsion. I am hoping that this will actually pull Nature's maximum force down from the Planck scale. Realistically, there is no experimental evidence at all that the Planck length, etc., mean anything whatsoever. And Newton's G is one of the most poorly determined constants, measured over only a limited range. So we may have a long way to go here, but speculation is always fun. And... have fun we must. :-)
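
Just to put a number on the conventional value (an illustrative check with standard constants, nothing more):

```python
# Rough numerical check of the conjectured maximum force built purely from
# constants, F_max = c^4 / (4G). Values are standard CODATA-level figures.
c = 2.998e8       # speed of light, m/s
G = 6.674e-11     # Newton's constant, m^3 kg^-1 s^-2 (known to only ~4 digits)
F_max = c**4 / (4 * G)
print(f"c^4/4G ~ {F_max:.2e} N")   # ~3.0e43 N
```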

Best,

Fred

Edwin and Jonathan,

Thanks, I'd read it only once, quickly. Always limiting.

A lovely apparent dichotomy seems to emerge: that dark energy contributes gravitational potential as well as the opposing expansion force. Is that a fair point?

I actually find much to agree with. Indeed, I've also suggested, for instance, that the Unruh effect has nothing to do with 'acceleration' per se but with motion through the medium, so it resolves to propagation of matter (yes, virtual or 'photoelectrons') via photoionization and is therefore speed dependent. I'm only able to do so because I've allowed the QV, Higgs field, etc., and dark energy a local kinetic identity.

If gravity then emerges topologically, i.e. as a dark energy density gradient, then does not the 'cold spot' cluster theory also fit nicely into place? Do you have any particular citations for that one, Jonathan? I haven't picked it up in that way. I'm months behind with my AAS & RAS paper reviewing, but I think it's just a good, different characterisation I've missed that seems to fit the topographical 'energy density' model. In a nutshell, the energy for matter is provided locally, leaving a 3D Dirac/Newton/Yukawa-shaped 'cold zone'?

Edwin, thanks re morphology. I should have focussed my comment more on the dynamic aspects. As in the 'Montevideo interpretation' of QM it's incalculable, but I'm quite convinced we're missing an important trick in resolving anomalies by ignoring its effects.

Finally, Jonathan, I agree. ArXiv, like most of science, is rather too parochial to academia, and Steven's excellent work is proof of that.

Best wishes

Peter

At the beginning of the article, Kauffmann notes that:

"But the uncertainty principle of quantum theory can manifest a disconcerting predilection to throw up infinite energies, and if we understandably quail at abandoning so firmly established a principle, it behooves us to at least try to ponder its self-gravitational implications."

I do not believe that the uncertainty principle needs to be abandoned. But I do believe that it is time to recognize that few physicists are familiar with its mathematical origins, the assumptions built into it, and the consequently limited circumstances to which it can be applied. Its predilection to throw up all sorts of quantum oddities is largely due to misapplying and/or misinterpreting it, usually by violating one of the assumptions deep within its foundations.

Rob McEachern

    Rob,

    I agree that the uncertainty principle is based as much on Fourier analysis as on physics. I believe the key aspect of 'reality' underlying this principle is the apparent fact that nature does nothing below a certain threshold of action. This, in my view, is what keeps the whole thing together. I've tried to imagine a universe with no minimum action, where anything goes, at any level down to zero (all noise, no signal?), and it's inconceivable to me that structure would survive in this situation.

    The dimensional form of action, (M*L*L)/T, leads to convenient formulas in terms of position-momentum, energy-time, and angular momentum, and the ability to describe energy as h/T fits in perfectly with Fourier frequency analysis. But in my mind there is no necessity to generate infinite energy based on this fact; yet since I resist postulating a "minimum time", I've not been quite certain where the Fourier "prediction" breaks down for such high-frequency components, as it must. I rather like Kauffmann's natural approach to self-limiting energies.
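
    To put numbers on that divergence (just an illustrative sketch of the h/T bookkeeping, nothing deeper):

    ```python
    # Treating Planck's constant as the fixed quantum of action, E = h/T assigns
    # an energy to a time interval T, and that energy grows without bound as T -> 0.
    h = 6.626e-34   # Planck's constant, J*s
    for T in (1e-15, 1e-21, 1e-27, 1e-43):   # progressively shorter intervals, s
        print(f"T = {T:.0e} s  ->  E = h/T = {h/T:.2e} J")
    # Nothing in the Fourier picture alone cuts this off; that is the role a
    # self-limiting mechanism like Kauffmann's would have to play.
    ```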

    Of course, much of this problem is predicated on the possibility of virtual particles, which may have made sense with a vacuum energy 123 orders of magnitude greater than seems to be the actual case, but which I find highly unlikely. Yet assuming there is some corner of the universe where these energies actually exist, say some future super-super-LHC, it's still nice to know that there's a natural limiting mechanism.
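
    For the record, the "123 orders of magnitude" is easy to reproduce as a back-of-envelope estimate (rough figures of my own, using the standard naive Planck-density argument):

    ```python
    # Naive Planck-scale vacuum energy density versus the observed dark-energy
    # density. The observed figure (~5e-10 J/m^3) is approximate.
    import math

    hbar = 1.055e-34   # J*s
    c    = 2.998e8     # m/s
    G    = 6.674e-11   # m^3 kg^-1 s^-2

    E_planck   = math.sqrt(hbar * c**5 / G)   # Planck energy, ~2e9 J
    l_planck   = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
    rho_planck = E_planck / l_planck**3       # ~5e113 J/m^3

    rho_obs = 5.3e-10                         # observed dark-energy density, J/m^3
    print(f"ratio ~ 10^{math.log10(rho_planck / rho_obs):.0f}")   # ~10^123
    ```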

    As you can tell by my previous comments, I find Kauffmann's work fascinating (probably because he is so in line with my own bias) and I think you would also. I would be interested in any comments you might have after looking at some of his other work.

    Best,

    Edwin Eugene Klingman

    Edwin,

    Let me be more specific about the nature of the problem. The uncertainty principle is a statement about how much information an observer can obtain from an observation; it says the minimum number of bits obtainable is one. Anything less, and an observation has failed to occur, which is of course possible. It is not a statement about *any* characteristic, attribute or property of the entity being observed. It is merely a statement about observations of such properties.

    Now consider Kauffmann's statement, on page 7, that:

    "Upon quantization, each such oscillator has a minimum positive energy...being completely mandated by the quantum uncertainty principle...always has infinite energy."

    The uncertainty principle mandates *nothing* of this sort. It is a statement about how much information about the oscillator energy can be *observed* (how many significant bits are contained within the energy measurement), not how much energy the oscillator *has*. Consequently, what the principle mandates in this situation is:

    *IF* you can successfully make an observation of each oscillator's energy, *THEN* that observation must, of necessity, contain a minimum of one bit of information about the amount of energy detected, *BUT*, you may fail to succeed in making any such observation, and thus obtain 0 bits of information.

    The correct use of the uncertainty principle cannot enable one to deduce "infinite energy". There is no "infinite energy" that must somehow be explained away.

    Rob McEachern

    I beg to differ;

    The uncertainty principle refers to how pairs of measurements yield a result that depends on the order in which the two observations are made, such that any one definitive measurement clouds subsequent measurements of other quantities, and there is thus a minimum for the product of the two uncertainties. Of course, this is a non-commutative relation, where the two measurements are usually taken to be of orthogonal properties - say position and momentum.
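
    (A toy numerical aside, just to make the non-commutativity concrete, with two-state spin operators standing in for position and momentum:)

    ```python
    # Applying two non-commuting operators in different orders gives different
    # results; their commutator is non-zero.
    import numpy as np

    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

    print(np.allclose(sigma_x @ sigma_z, sigma_z @ sigma_x))   # False: order matters
    print(sigma_x @ sigma_z - sigma_z @ sigma_x)               # [sigma_x, sigma_z] = -2i*sigma_y
    ```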

    However, experiment shows we can bend the rules somewhat by taking repeated weak measurements, as explained in this article by Steinberg et al.:

    In Praise of Weakness from Physics World

    There is some question in my mind, though, about whether uncertainty is a property intrinsic to sub-atomic particles. Is there, in fact, a situation of their being loosely defined - except in relation to other forms? We have a kind of observer bias, from the fact that any definitive measurement we make is taken from a platform that occupies a certain location in space at a particular moment in time. While one could argue that observation is irrelevant to the state of a system, one can also say that the system's state is defined by its interactions with its surroundings.

    More on this later,

    Jonathan

    On the roots of uncertainty;

    It is my understanding that Heisenberg first came to discover this principle when studying the Rydberg-Ritz combination principle and the process of using two spectral lines to find a third. He discovered that the order in which they are specified yields a unique result, and thought this was curious enough to deserve further study. He first discovered a principle about pairs of quantum measurements. Then later, he came to find out there was a minimum limit to the combined uncertainties thereof.

    Please correct me if my history (from my recollection of Connes' retelling) is inaccurate. In fairness, I should have used the word 'determinations' instead of 'quantum measurements', because in QM any measurement is a participatory process, much like constructive geometry. But I'll return to this point if there is time.

    Have Fun,

    Jonathan

    • [deleted]

    Jonathan,

    "We have a kind of observer bias, from the fact that any definitive measurement we make is taken from a platform that occupies a certain location in space at a particular moment in time. While one could argue that observation is irrelevant to the state of a system, one can also say that the system's state is defined by its interactions with its surroundings."

    It seems to me that all perception requires some form of frame, much as taking pictures requires a specific shutter speed, filtering, aperture, direction, distance, lensing, etc. Otherwise there is blurring, washing out, etc., as the amount of available information quickly goes to infinity and the result is white noise.

    This goes to the relationship of energy to information: while information defines energy, energy manifests information. So when we combine energies there are canceling effects, and combining the resulting information also causes canceling, much as a top-down, generalized view tends to blend the details while a bottom-up, specialized view cannot see the larger context.

    So locating a particle means having to filter out its motion, and measuring its motion means blurring its details.

      Jonathan,

      You have correctly stated the conventional misunderstanding of the principle. What you said is true. But it does not change the fact that it is frequently possible to measure a third "thing" and then infer, without measurement, what the two observations whose product forms the principle *must* be, with far greater accuracy than they could ever be directly measured. Things like FM receivers do this all the time. They accomplish this by exploiting *a priori* information about what the *CORRECT* mathematical model for the observations *IS*. The uncertainty principle was derived under the assumption that no such a priori information is being exploited.

      But more importantly, the principle merely states a limitation upon what an observer can know about an entity. That is not the same as stating that the entity has a corresponding limitation. It may indeed have such a limitation. But the uncertainty principle says nothing about it.

      More specifically, the principle has much more to do with "resolution" than with "accuracy". As an analogy, a telescope has a resolution that depends upon the size of its aperture. When one uses a telescope to observe a binary star, one may not be able to observe more than one speck of light. But that does not enable one to deduce that there is only one star. What is "uncertain" is how many entities there *are* to be observed within the two-parameter space, not the values of those two parameters. Two spectral lines may be far too close together to *resolve* in a short time period, but this fact does not prevent a dual-frequency FM receiver from determining the two frequencies, with great accuracy, within the same time period.
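
      To illustrate the distinction with a toy sketch (illustrative only): given the a priori knowledge that a short record contains a single sinusoid, its frequency can be estimated far more finely than the 1/T Fourier "resolution" of that record.

      ```python
      # A 0.1 s record has a Fourier resolution of 1/T = 10 Hz, yet a model-based
      # search (matched filter over a fine frequency grid) pins the frequency far
      # more accurately. Noiseless complex tone used for simplicity.
      import numpy as np

      fs, T = 1000.0, 0.1
      t = np.arange(0, T, 1 / fs)
      f_true = 123.456
      x = np.exp(2j * np.pi * f_true * t)

      f_grid = np.arange(100.0, 150.0, 0.01)
      power = [abs(np.sum(x * np.exp(-2j * np.pi * f * t))) for f in f_grid]
      f_est = f_grid[int(np.argmax(power))]
      print(f"1/T = {1/T:.1f} Hz, estimation error = {abs(f_est - f_true):.3f} Hz")
      ```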

      Rob McEachern

      • [deleted]

      Edwin,

      In your Mar. 7, 2013 @ 18:30 GMT reply to Rob you write, "describe energy as h/T". Please help me get this 'buzzing phrase' out of my mind. What does it mean? Why do I ask? Because in The Thermodynamics in Planck's Law I derive that the duration of time required for an 'accumulation of energy' h to occur is given by h/kT. So how does this relate to "energy as h/T"?

      Constantinos

      • [deleted]

      Jonathan is correct. Heisenberg uncertainty does not refer to a system's information content. If such were true, we would know with certainty the point where quantum phenomena become classical.

      Tom

      Thanks John,

      I think the camera analogy is rather appropriate here. There is always a question about what you are trying to capture or emphasize. Is sharpness of focus on the subject more important, or is the depth of field paramount because we need to see the background to establish context? Is sharpness of time definition of greatest value, as when determining the winner of a race, or do you want to preserve the blur of motion for artistic effect and leave the shutter open longer?

      Some phenomena are too faint to photograph without a long exposure, and others are too swift for anything but the shortest exposure possible. So you correctly point out that even a single observation involves a trade-off of sorts. I'll have to think more on this.

      Regards,

      Jonathan

      Rob, Jonathan, all,

      I believe part of the problem is 'whose?' uncertainty principle we're discussing. The above arguments seem to assume there is AN uncertainty principle. I suspect (without going to the trouble to prove it) that both Rob and Jonathan can find a reputable text or paper that supports very closely their own interpretation. So, without arguing about what THE uncertainty principle says, I believe the issue is 1) the informational approach, largely based on Fourier analysis, that Rob describes and 2) the fact that the order of measurement *does* determine the outcomes in the cases of interest, when the measurements interact with the system under consideration to a degree that changes the system. This is the significance of "weak measurement". (Jonathan, thanks for the link to Physics World. It shows the same 'wave function measurement' diagram that is on the first page of my FQXI essay, "The Nature of the Wave Function", and I provide 3 references to such articles.)

      Rob, I had the same problem with Kauffmann's 'mandatory' phraseology, and yet I believe that this is the perspective of most quantum field theorists, working within "the most successful theory ever". My own goal is not to dismiss it as a stupid misinterpretation but to try to understand the physical essence of what's going on. I was in a discussion the night before last with a very bright physicist who was singing the theme song, "but our intuitions did not evolve to understand quantum reality, yada-yada-yada." That's what Bohr claimed, and it is the basis of the Copenhagen interpretation, yet the weak measurements clearly show otherwise, as stated in Jonathan's Physics World link: "But it is striking that the average result of such a measurement will yield exactly what common sense would have suggested."

      I began my above comment by stating my belief that 'quantum action' is the fundamental reality of our universe, more fundamental than momentum, energy, what have you, as there is no universal measure of momentum, or of energy, but Planck's constant IS the universal measure of action. Action has units of mass*length-squared*inverse-time (MLL/T), but it is the product of these terms that is constant, not the terms themselves, the mass, the length, or the time. Thus when one analyzes energy over very short durations, the constant is divided by time, and if there are no limits on the time interval, the energy heads toward infinity. I don't believe this happens, but neither do I believe that there is a "shortest time" in the universe, so the question is what to believe about physical reality. I think most quantum field theorists agree with Kauffmann's wording. For this reason I was happy to see his 'self-gravitating' approach as a possible self-limiting solution to the 'problem'. I do not believe in the physical reality of infinity, so when the math seems to imply infinity, I look for the point or mechanism at and by which it breaks down. Kauffmann may have found one.

      Rob, while I agree with your information-based analysis, and with your FM examples, there is still a physical aspect that is interaction-based in the sense that measurement of quantum 'objects' interacts with, interferes with, and changes what is being observed. That's why weak measurement theory and approach is so significant.

      Constantinos, I am speaking of action as the fundamental unit of the universe. We speak of momentum and energy because our minds find it easier to grasp these, since everyday experience does not illuminate units of action. Yet quantum theory began with Planck's discovery that action allowed him to match the data when all else failed, and quantum theory is intertwined with action every step of the way, from variational principles, to uncertainty principles, to spin, Bohr orbits, you name it. There appears to be no meaningful physics between zero action and a Planck unit of action, hence nothing to 'accumulate'. When you refer to h/kT you are working with the thermodynamic relation based on statistical ensembles (indicated by your use of the Boltzmann constant, k) and temperature T, which is a measure of average energy. Averages, of course, are NOT limited to integral multiples of h, as they are mathematical, not physical. The statistics show that the averaging scale factor, 1/kT, is useful for expressing probabilities in terms of units of h, where h is multiplied by a frequency to get the energy that is then compared with the average energy. The T in h/T is not temperature but time, and refers to what one gets when one tries to separate time from energy in a world built around quantum action. I hope this clarifies it somewhat.
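
      (A pair of quick numbers, just to keep the two expressions apart; illustrative figures only:)

      ```python
      # In h/T the T is a time interval and the quotient is an energy; in h/kT the
      # T is a temperature and the quotient is a time.
      h = 6.626e-34    # Planck's constant, J*s
      k = 1.381e-23    # Boltzmann's constant, J/K

      T_time = 1e-15                       # a time interval, s
      print(f"h/T  = {h / T_time:.2e} J   (an energy)")

      T_temp = 300.0                       # room temperature, K
      print(f"h/kT = {h / (k * T_temp):.2e} s   (a time, ~1.6e-13 s)")
      ```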

      Best,

      Edwin Eugene Klingman

      I'd like to recap here, one of Edwin's comments above..

      Steven Kauffmann "is talking about the interaction of gravity with energy (including the energy of the gravitational field.) But more specifically he is saying that localized energy -- such as the 'virtual' particles of infinite energy that appear in QED -- will have a mass equivalence that generates its own gravitational field, and this field, if energy is to be conserved, will not be 'extra' energy added to the situation but will be energy of the particle that is effectively 'converted' to gravitational energy."

      This has the effect of restoring the sense of physical realism to our picture of the world. I think Kauffmann would agree with Rob that the infinities arising from the point particle assumption plus QFT do not appear to be a physically realistic possibility, and this may be part of why he sought a reason things could be resolved otherwise.
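
      A crude back-of-envelope version of that self-limiting idea (a toy estimate of my own, emphatically not Kauffmann's derivation): ask at what energy a region of radius r would sit inside its own Schwarzschild radius.

      ```python
      # From r_s = 2GM/c^2 with E = Mc^2, a region of radius r can hold at most
      # roughly E ~ r*c^4/(2G) before self-gravitation dominates.
      c = 2.998e8       # m/s
      G = 6.674e-11     # m^3 kg^-1 s^-2

      for r in (1e-15, 1.6e-35):            # a proton-sized region, a Planck-length region (m)
          E_max = r * c**4 / (2 * G)
          print(f"r = {r:.1e} m  ->  E ~ {E_max:.2e} J")
      ```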

      Have Fun,

      Jonathan

        • [deleted]

        Jonathan,

        There is complexity in simplicity and simplicity in complexity.

        • [deleted]

        Edwin,

        I was mistaking the T in your h/T for temperature rather than time. Thanks for clarifying. Consider the following in response to your comments.

        1) Physical reality is 'manifested' (observed, measured) at or above Planck's constant h.

        2) Physical reality exists the same but 'unmanifested' (not measurable or observable) below h.

        3) Planck's constant, though considered as minimal 'action', can also be thought of as the minimal 'accumulation of energy' (the time-integral of energy -- what in my formulation is the prime physis 'quantity eta') that can be observed or measured.

        4) Though E(average)=kT is a statistical derivation, the same equation can be derived without any statistical thermodynamics, as is the case in my formulation. In this non-statistical formulation, h/kT is the duration of time required for an 'accumulation of energy' (quantity eta) h to occur or manifest.

        5) If in h/T the T is 'time', what 'time'? If this time T is the 'duration of time for h to occur', then h/T would be 'average energy' according to my results. Is that the reference made to "energy as h/T" in your comment? That is indeed very interesting and fits my formulation!

        6) Indeed the most basic 'physical reality' is 'action'. This is the 'quantity eta' in my formulation. In terms of this 'quantity eta' all other physical quantities can be defined and basic laws of physics can be mathematically derived.

        See my chapter The Thermodynamics in Planck's Law for all of the above.

        Best,

        Constantinos

        Edwin,

        "weak measurement" - a new name for an old concept? Increase the signal-to-noise ratio, by measuring a large number of identical entities, perform "post selection" by basing the final computation on the detectors (within a filter-bank) with the highest SNR. This is what a filter-bank, of FM detectors, tuned to different center frequencies accomplishes. This is what I discussed in my FQXI essay. The basic idea is older than I am, and I'm retired. The article I published in "EDN", twenty years ago, showed how Gaussian windows could be combined with Fast Fourier Transforms (FFT), to perform this very efficiently. The "weak measurement" is the Gaussian window, that has a wider bandwidth than those used by the "strong measurement", narrow bandwidth "uncertainty principle" filter-bank. "Post selection" consists of only using the filters with the highest amplitude, to derive the final frequency estimate.

        Rob McEachern

        Constantinos,

        I am familiar with your mathematical approach, and admire it very much, as I've said several times in comments on your essay blog. I realize that your 'eta' is 'action'. You say: "If in h/T the T is 'time', what 'time'? If this time T is the 'duration of time for h to occur', then h/T would be 'average energy' according to my results." I agree that thermodynamically it may be 'equivalent' to your formulation. But for a single event "average energy" doesn't have a well-defined meaning. While I think I understand statistical phenomena, I tend to focus on the more fundamental events and/or entities and try to understand them. The answer to "what time" is somewhat artificial, and is a result of trying to break action into 'energy' * 'time'. You could call it the "time to accumulate" enough energy for the action to occur, but I don't really believe that the universe works like that, just as I don't believe the universe "computes its next step (or state)". When we break a unified whole into pieces, we can say a number of things about the pieces, but what we say and the pieces themselves may be fictitious.

        Rob, I would not argue with anything you say. There are very few ideas that do not recycle an older idea from a different context. But for quantum theory, suffering from a century of Bohr and half a century of Bell, weak measurements are a big deal, and I'm excited about them.

        Best,

        Edwin Eugene Klingman

        • [deleted]

        Edwin,

        I understand the difficulty of making sense of my results in statistical terms. For example, you write "for a single event 'average energy' doesn't have a well-defined meaning".

        If you consider 'average energy' to be that of an ensemble of many events, then clearly this would not apply to a single event. But 'average energy' can also, equivalently, be thought of as 'accumulation / time'. In this sense, there is no need to consider an ensemble of events in order to define 'average energy'. This is how I am using the term.

        Another example along the same lines is defining the 'temperature of radiation'. Clearly, with temperature defined as it currently is in statistical thermodynamics (as the 'average kinetic energy of moving particles'), this concept would not make sense for radiation.

        I have defined 'temperature' in such a way that it does not depend on 'moving particles', and I am able to get the same results and equations as statistical thermodynamics. Furthermore, this approach enables us to make sense of Planck's constant as being that constant 'accumulation of energy' used as a standard to define (in this way) the Kelvin temperature scale. So there is no mystery about why Planck's constant exists or what it actually means. Mystery resolved!

        Best,

        Constantinos