• Cosmology
  • Black Holes Do Not Exist, claims Mersini-Houghton

" ... understanding complex situation means being able to step back and sense where the energy flowing through them is going and that is the signal that will have the most effect."

No it won't, because demonstrably a system changes very little over long time scales and distances. That view is fine if one doesn't want to do actual science -- and it's the main problem with philosophers and politicians who won't acknowledge that fuzzy thinking lags far behind the feedback effects that effect the most physical change in a system. So we end up a day late and a dollar short in response to almost every important situation that affects the health and growth of our society.

Tom,

Philosophers have never run anything. They are like Truman's two-handed economists. Power tends to flow to those most obsessively focused on acquiring it, not to those who look at the broader situation. Personally, I'm one of those more interested in knowing what's going on than in being the one in charge.

The politicians no longer run things in the west, as they are being carried along by the same financial vortex as the rest of us. Those nominally in control are the ones with the money, but even they are simply riding an enormous wave, based on the acquisition of abstracted value. As that particular wave is about to crash on the shores of limits to real value to back it up, there is a fair amount of turmoil on the horizon.

I think one of the more ironic headlines of the last few days was one accusing Putin of having his stable of pliant bankers to support him. What? Politicians owning bankers!?!? Surely it should be the other way around!

It brings to mind an old spy novel I read many years ago, where the Soviet spy says to the American, "We both want power. You people want money to acquire it, while we go for the power directly."

For any system, it is normal for those running it to patch up problems when they appear and not seriously question its premises, as they would more likely lose their position of authority than actually fix it. The result is overshoot, as the system becomes hidebound and more a problem than a solution. Eventually it too breaks up and is washed away.

It's all wave action at work. No flatlines here.

Regards,

John M

Steve,

The only result Google provided for my searches of 'gaechron' was a link to this FQXi article. So I'm wondering if this primordial mass is of your invention?

It does seem as if QM assumes matter to be the same thing as energy, but without an ontological rationale for what makes it so. My own perspective has long been that the QM/Relativity divide originated when the dual wave-like and particle-like characteristics of matter were realized, and that it was not so much a failure on the part of physicists as the success of non-physicist mathematicians in proclaiming it impossible to determine the distribution of energy in a spherical volume in accord with the inverse square law, which led to the conventional acceptance of the 'zero point particle' expedient. (Fortunately, at one time, long away and far ago, I didn't know that.) Nor have I ever heard from any quant how 'exchange particles' physically accomplish the work of stitching things together. Are 'amplitude' and 'phase' truly physical, or are they more the probability that the 'gaechron' will materialize 'here' on 'this' timeline?

Any success in finding a 'quantum' of gravitational correlation with the geometry of generally relativistic spacetime will likely have to accept that Michael Faraday was way ahead of the pack: the discrete particle exists as only a portion of the energy of a full, self-gravitational and self-limiting field of raw energy. All quantum states would resolve from the peculiarities arising in the transition from a linear function for mass quantities associated with EMR to a non-linear function for greater mass quantities, where the relative inertial density equals or exceeds a density that exhibits an inelastic quality, and hence responds to applied acceleration in accord with Lorentz, of energy-conserving space. A good starting point is the well-established, experimentally defined E = hv, but with the recognition that a 'photon' is a bundle of one or more quanta (h), and that (h) is constant for any wavelength; so there must also be a corresponding constant volume. Not that (h) is an invariable single entity, but a composite, coupled charge of elastic density quantities. And EMR is a special case because it is a uni-directional motion of condensate energy seeking equivalent existent light velocity.
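(A minimal Python sketch of the standard E = hv relation the paragraph above builds from; the 500 nm wavelength is just an illustrative choice, and nothing here encodes the extended 'bundle' reading:)

h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
wavelength = 500e-9   # illustrative: 500 nm (green light)
nu = c / wavelength   # frequency of that light, Hz
E = h * nu            # photon energy from E = h*nu, J
print(E)              # ~3.97e-19 J, about 2.5 eV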

The limit of a gravitational field domain need not be the determinant of an orbit, far from it. There would be no rationale for a massive object (however minute) to find location at the least density. "But that lucky ol' sun, ain't got nothin' to do, but roll 'round heaven all day." :-} jrc

John, you have an amazing facility for diverting the topic.


Yes, the gaechron is mine. Since my background is spectroscopy, I tend to think in terms of Fourier transforms. Light pulses in time are due to spectral superposition, and matter pulses in time (i.e. the universe) are likewise due to a universal matter spectrum. The matter spectrum is dominated by the smallest particle, so I named it gaechron.

Of course, matter and energy are equivalent in both GR and QM. The ontology is a little obscure, but in matter time it is straightforward. Since the universe collapses at c, all matter is comoving at c with energy E = mc^2. Any change in velocity always adds energy as (1/2)mv^2 to the object with that velocity. This also means that light is, in effect, standing still and we are the ones moving.
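(For concreteness, the two standard energy terms invoked here, in a minimal Python sketch; the electron mass and velocity are illustrative choices, and the 'comoving at c' interpretation is Steve's model, not encoded below:)

m = 9.10938e-31       # electron rest mass, kg (illustrative)
c = 2.99792458e8      # speed of light, m/s
v = 1.0e6             # illustrative velocity, m/s
rest_energy = m * c**2              # E = mc^2, ~8.2e-14 J
kinetic_energy = 0.5 * m * v**2     # (1/2)mv^2, ~4.6e-19 J
print(rest_energy, kinetic_energy)  # the kinetic term is tiny by comparison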

The concept of fields in space, a la Faraday and Maxwell, is a very useful one, for then you do not have to keep track of the particles that cause those fields. Semiclassically, gaechron acts kind of like an aether that fills space and decays everywhere, so that charge cross-section results in force for properly phased particles. The flows or fluxes of gaechron are a quantum basis of both charge and gravity.

Instead of inventing a new particle just for gravity, the gaechron flux is responsible for both the charge and the gravity force; light is not only the dipole particle bonding electrons to protons, light multipoles are also the binding particles bonding gaechron to neutral matter, which is gravity.

I actually think that I finally understand what you are saying.

"As an example; you say; "The luminosity curve is the findings" That's the conventional view, allowing the LC much more import than it may contain. The LC is a 'CURVE' built from data, not the data itself. It is CREATED FROM the data by applying certain assumptions, then implying interpretations (really other assumptions)."

Of course, intensity, spectrum, and direction are what astronomers measure. So luminosity is intensity, what you measure, along with a factor to account for distance. This is still what is measured, and there are a lot of corrections and calibrations, but if you do not believe in a luminosity model, come up with your own.
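(The 'intensity plus a distance factor' step in its usual inverse-square form, L = 4*pi*d^2*F, as a minimal sketch; the flux value and the 1 Mpc distance are illustrative, and the distance itself is of course model-dependent:)

import math

F = 1.0e-15           # measured flux, W/m^2 (illustrative)
d = 3.086e22          # 1 Mpc in metres (illustrative distance)
L = 4 * math.pi * d**2 * F   # inverse-square luminosity
print(L)              # ~1.2e31 W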

I assume that recycling luminosity must have been done...I just cannot find a single example among the zillions of other models for quasar luminosity. The way that you were talking, I thought that you might have done something to show a recycling luminosity that agreed with the findings. When you say the galaxy disk evaporates, there is absolutely no evidence for that statement as far as I can tell. Since there is no evidence, that is what I would call falsification, and then I would move on.

For matter time, I try very hard to find out if things are measured well enough so that there is no chance of a 0.283 ppb/yr decay. It would be great to have more precise measurements, but instead, everywhere I look, I actually see evidence for decay of matter. So I would greatly appreciate anyone who can show me that matter does not decay over time because then I can be done with it.
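(To see what that rate implies, a minimal sketch assuming simple exponential decay, which is my assumption for illustration, over the conventional ~13.8 Gyr age:)

import math

rate = 0.283e-9            # fractional decay per year (the figure quoted above)
t = 13.8e9                 # years; the conventional age of the universe
fraction_left = math.exp(-rate * t)
print(fraction_left)       # ~0.02, i.e. ~98% decayed over that span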

You keep saying the recycling model fits perfectly, but the figure that you show does not seem to show that. Plus, you really need to be careful about smoothing and just bin the data without smoothing. The red shift ratio z is a little ropey as an axis since it is, as you say, a derived and dimensionless number. Why not plot versus velocity of recession?
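(Binning without smoothing is a one-liner; a minimal numpy sketch with placeholder redshifts, not real data:)

import numpy as np

z = np.random.uniform(0, 5, 1000)         # placeholder redshifts, not real data
counts, edges = np.histogram(z, bins=20)  # raw counts per bin, no interpolation
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.2f}-{hi:.2f}: {n}")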

Thanks, Tom.

I like to think of it as expanding on it, but then we have different perspectives.

Regards,

John M

Steve,

"Why not plot versus velocity of recession?" because that is the most uncertain of all the derivatives! - it's tied to the assumptions; 'time' and 'distance'. The ONLY data we actually gather is redshift. 'z' is the only fundamental not derived from assumptions. We don't even know what's affected 'z', and know even THAT has inconsistencies, (apparently interacting galaxies at different redshifts) but it's only certain data we have. ALL else is 'derived', using guesswork.

It's also important to ALWAYS remember to use the scalar 'wavelength'. As soon as theorists slip into 'f' and mix that up with waves, it can all collapse into nonsense. We only know lambda and amplitude, and Doppler shift is only ever a delta-lambda/lambda function. I identified the errors from using f in my 2012 essay. Even 'direction' has been modulated! (aberration/refraction/rotation of the optical axis), and certainly polarity has.
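(Peter's point in its scalar form: z is a pure wavelength ratio, with no time or distance assumed. A minimal sketch; the Lyman-alpha rest wavelength is standard, the observed value is illustrative:)

lambda_emitted = 121.567e-9    # Lyman-alpha rest wavelength, m
lambda_observed = 364.701e-9   # illustrative observed wavelength, m
z = (lambda_observed - lambda_emitted) / lambda_emitted   # z = delta-lambda/lambda
print(z)                       # 2.0 here; nothing but wavelengths went in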

We do have many other ways of 'observing', but again many forget the results are ALL based on assumptions and interpretations. That's the big problem. So we must clear the foundations of the muddy nonsense and only build from the solid ground. (exactly as Popper identified).

I somewhat agree with your point about smoothing. Nature is graded, not 'binned', and when we look more closely we find the curves ARE smoothed, but 'joining the dots' is also assumption.

I agree matter decays, and is re-ionized right down to protons. Even fermions cancel over the Debye length, apparently reversing the condensation process of pair production. After negotiating the AGN contraflow torus and z-pinch, the quasar jet outflows at full power (luminosity) are free protons, which we might assume are not 'new' ones! The fermions are produced at the collimation hypersurfaces.

All this is quantified precisely in many hundreds of papers in the AJ, ApJ, MNRAS, etc. Many are slightly contradictory, as there's too much 'interpretation', but reading them all gives a coherent picture. I'll post a few of the free-access ones at random (many links are in the refs in my paper):

Yue et al., MNRAS, 2014.

Gardner & Done, MNRAS, 2013.

Hirota et al., Proc. Royal Soc., 2014.

Begue et al., MNRAS, 2014.

Hopefully these will start to give a glimpse of the masses of data we have from detailed observation.

Best wishes

Peter

Of course, what we gather are spectra: intensities or luminosities versus frequencies of light, and z comes from comparing the frequencies of spectral features, like hydrogen lines, with those on Earth. Assuming constant c, we can calculate a velocity or a z with equivalent precision, because both z and v depend on the assumption of constant c.
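(The z-to-v conversion described here, under the constant-c assumption, via the special-relativistic Doppler relation; a minimal sketch with an illustrative z:)

c = 2.99792458e5               # speed of light, km/s
z = 2.0                        # illustrative redshift
v = c * ((1 + z)**2 - 1) / ((1 + z)**2 + 1)   # relativistic Doppler inversion
print(v)                       # ~2.4e5 km/s; the same information as z, reparametrized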

To get time, we need a Hubble constant, but that is just a number, and as a number it does not change the precision of the plot either, just its interpretation. Luminosity is J/s and so has an implicit time model, as an integration of intensity over a spectral feature. Luminosity can be apparent, which is relative to our time, or absolute, which depends on a time cosmology. Plotting luminosity versus z, then, really gets confusing because it ignores the built-in time assumption for L, which ties L pretty dramatically to a particular cosmology.
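(And the Hubble constant really is 'just a number' that sets a timescale; a minimal sketch for an illustrative H0 of 70 km/s/Mpc:)

H0 = 70.0                      # illustrative Hubble constant, km/s/Mpc
km_per_Mpc = 3.086e19          # kilometres in a megaparsec
seconds_per_year = 3.156e7
t_H = km_per_Mpc / H0 / seconds_per_year   # Hubble time 1/H0, in years
print(t_H)                     # ~1.4e10 yr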

For a decaying universe, c increases over proper time along with alpha and h, so the spectral features of hydrogen appear redshifted when in fact they are simply from an early epoch and are actually blueshifted instead. Fortunately, this just means that L is divided by gamma^2 for each epoch and, lo and behold, quasar L is a more modest function of z or time (the second plot in the link).

plot quasar numbers and luminosities

Now the luminosity peaks before the quasar number peaks, which implies a very nice progression from bright quasars to more numerous quasars. Once again, there is not much recycling implicit in such a model, but really nice kinetics and dynamics. The shrinking universe is actually a lot smaller than it appears...

You are preaching to the choir here. I do not understand why Mersini-Houghton did not reference your very nice papers, even though your papers did not seem to make it to the "big show."

What you do not mention at all (and neither did Mersini-Houghton) is the possibility of boson stars being the quantum equivalent of an ECO. Is there any room for boson stars in your cosmology? There is a huge literature on boson stars, and such objects are much better quantum objects than SMBHs or ECOs; I can't help but think that these objects are inherently quantum ground states of large matter accretions.

If you follow a photon, I think it will go all the way to the center of the black hole (in principle). That is to say, the event horizon is a mirage from the point of view of flat space. If such a photon were to make it all the way to the center, it would blueshift all the way in, until gravity gave it far more energy than any gamma ray.
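(The blueshift described here, in the simplest static-observer Schwarzschild form; a minimal sketch showing the factor 1/sqrt(1 - rs/r) growing without bound as r approaches the horizon, which is the standard-GR counterpart of the claim:)

import math

def blueshift_factor(r, rs=1.0):
    # Energy gain factor for a photon arriving from far away, as measured
    # by a static observer at radius r; rs is the Schwarzschild radius.
    return 1.0 / math.sqrt(1.0 - rs / r)

for r in [10.0, 2.0, 1.1, 1.01, 1.001]:
    print(r, blueshift_factor(r))   # diverges as r -> rs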

Jonathan, Tom, Akinbo,

Fig. 3 of 1364 illustrates what I consider a non-Euclidean notion of numbers behind putative singularities within the reals. Dedekind's pebble-like notion of number is responsible for the distinction between ] and ) in so-called point-set topology. The latter should rather be called pebble-set theory. It contradicts Euclid's definition of a point as something that has no parts, while a genuine continuum is something every part of which has parts.

While Buridan's pebble-critical donkey is most likely of ancient origin, Johannes Buridan and also Nicole Oresme were early critics of Claudius Ptolemy.

Eckard

Cosmology and fundamental physics have come to the limit of knowledge, to the limit in the process of cognition. Therefore, the issue of "black holes" should be considered not only at the level of mathematics and observation, but at the deepest ontological level. In fundamental physics it is necessary to introduce an ontological standard of justification of knowledge in addition to the standard empirical justification. The Universum must be considered as a whole ("whole" vs. "hole"?).

The principle: total ontological unification of matter at all levels of the Universum. In this construction, the Universum acquires a "source" and an "outflow" of matter, and the primordial generating structure ("general framework structure") serves as the ontological framework, carcass and basis of fundamental knowledge. We must remember that today both mathematics and physics are fundamental sign systems without ontological justification. This is the philosophical nonsense of modern fundamental science.

Yes, today it is necessary "to reimagine the fabric of space-time, but also rethink the origins of the universe", "to rethink their ideas of the Big Bang and whether it ever happened." The concept of the "black hole" is a beautiful and fascinating metaphor (for example in art, the "Black Square"), but it does not give a new heuristic. The Universum, basic science and modern society need a new ontology, one that gives an insight into the nature of information and time, an insight into the primordial ontological structure of space and its ontological dimension.

Sincerely,

Vladimir Rogozhin

    Eckard,

    I did not fully understand Fig. 3. But I see you keep repeating, here and in your essay, that the 'point' having no parts means it has zero extension. That is not the correct meaning! The 'point' has extension, probably about the Planck scale, ~10^-35 m in size. 'Has no parts' does not mean zero extension. As I mentioned in my essay, 'has no parts' means not divisible into further geometric objects. That is, there IS an end to geometric divisibility. Tom believes there is no end to divisibility, which is the concept of the continuum, but it gives rise to numerous paradoxes.
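    (The ~10^-35 m figure quoted here is the Planck length, l_P = sqrt(hbar*G/c^3); a minimal check:)

    import math

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 2.99792458e8         # speed of light, m/s
    l_P = math.sqrt(hbar * G / c**3)   # Planck length
    print(l_P)               # ~1.6e-35 m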

    Once you give 'points' their respect as the non-fictitious fundamental unit of space, an accumulation of them can give you lines, surfaces and bodies.

    Regards,

    Akinbo

    Vladimir,

    It might be that black holes are simply the limit of linear projection and we need to reconsider a more cyclical paradigm.

    Less narrative bottom line and more yin and yang.

    Regards,

    John M

    "Tom believes there is no end to divisibility ..."

    No, you are confusing arithmetic operations on the real number line with the topology of the 3-sphere, where R^3 is compactified by a simple pole at infinity. This results in the definition: 1/0 = oo.

    The limit of divisibility is division by zero, which is a forbidden operation in arithmetic. In 3-sphere topology, however, covariant operations are nondegenerate near the singularity, because complex analysis substitutes lines for points in the fundamental underlying geometry.
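    (The compactification behind '1/0 = oo', written out in its standard one-complex-dimension form, the Riemann sphere; the 3-sphere statement above is the higher-dimensional analogue:)

    \[
      \hat{\mathbb{C}} = \mathbb{C} \cup \{\infty\}, \qquad
      \lim_{z \to 0} \frac{1}{z} = \infty, \qquad
      \lim_{z \to \infty} \frac{1}{z} = 0 .
    \]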

    Tom, not being a mathematician I cannot argue with what you say since I am told that mathematicians operate in a Platonic realm of which I 'have no part'.

    But I think I can ask whether arithmetic operations on the real number line are more applicable in the real world than the other type of operations, which though useful and of value may not actually correspond to the real world.

    The Romans were more practical and had no number called zero, preferring to use I, II, III, IV, C, X, etc., which can be said to correspond to what can physically exist. The Arabs were more into metaphysics and added the number 0 to the 1, 2, 3, 4, etc. that we know. This has been a kind of curse to physics but a blessing to mathematicians. It is also the source of infinities in physical theories, resulting in the need for 'renormalisation', etc.

    A zero bank balance simply does not exist anymore! It is no longer part of the world economy, yet by various manipulations it is still kept there to participate in the economy. Any surprise then that we now have a bubble economy waiting to burst? No more gold-backed currencies, but 'promissory' and 'In God we trust' notes, when everyone knows promises can be broken and God admonishes us to shun money as the root of all evil.

    I am consoled, however, that once in a while, when confronted by the need to be real, as you say, "analysis substitutes 'what is not zero' (lines) for 'zero' (points) in the fundamental underlying geometry". In the essay that I referred Eckard to, you will see the reference where Plato himself, when boxed into a corner by Aristotle's dialectic, admitted that points are not fictitious, meant only for book-keeping, but are part and parcel of the real physical world, preferring however to still refer to them as the ends of lines. Can a Platonic 'zero entity' be part of a real 'non-zero entity'? Why then not start from non-zero geometry instead of ad hoc substitution?

    All the best,

    Akinbo.

    Dear Steve,

    Boson stars are a result of QM rather than any QG, the same way white dwarfs and neutron stars are QM (degeneracy pressure at T=0) rather than any QG effect. A boson star's stability depends on the uncertainty principle instead of degeneracy pressure. Since no appropriate stable boson of the desired properties is known, one often invokes imaginary "scalar fields" for theoretically constructing them. The upper mass limit of a white dwarf (1.4 solar masses) is obtained by using a fermion mass ~4 GeV (He nuclei), and this is fixed. But if one were to imagine fermions of much lighter mass, the upper mass limit of a Fermi-Dirac star would be higher. Similarly, boson stars are expected to have a fixed upper mass limit, unless one conveniently imagines a different boson mass or a different scalar field. Thus neither Fermi-Dirac nor boson stars can explain BHCs, whose MASS RANGE runs from ~3 solar masses to billions of solar masses. On the other hand, ECOs have neither any lower nor any upper mass limit. Thus they are indeed appropriate candidates for BHCs. More importantly, their formation is a GENERIC effect: TRAPPING OF RADIATION BY self-gravity.

    Abhas
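    (Abhas's scaling argument can be sketched numerically: a Chandrasekhar-type limit goes roughly as (hbar*c/G)^(3/2)/m_f^2, so halving the fermion mass quadruples the limit. Order-unity polytropic factors are dropped here, which is why the 4 GeV case lands near 0.1 rather than his 1.4 solar masses; the scaling itself is exact:)

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 2.99792458e8         # speed of light, m/s
    M_sun = 1.989e30         # solar mass, kg
    GeV = 1.783e-27          # kg per GeV/c^2

    def m_max(m_f):
        # Chandrasekhar-type mass limit, order-unity constants dropped
        return (hbar * c / G)**1.5 / m_f**2

    print(m_max(4 * GeV) / M_sun)   # ~0.1 for the 4 GeV fermion quoted above
    print(m_max(2 * GeV) / M_sun)   # four times larger for half the mass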

      Tom,

      "complex analysis substitutes lines for points"

      Doesn't oo x 0 still = 0?

      Akinbo,

      There is nothing wrong with a number/symbol for nothing, if we keep in mind that it is nothing and don't try surreptitiously creating something from nothing.

      Currencies not gold- or asset-backed are necessarily backed by public obligation, which can be used to drain the value out of the people and into the hands of those wishing to own everything, including the people.

      When you try to put everything into a zero point, the result is pressure, and the consequence is eventually an explosion. When you try to create something from nothing (or very little), the result is a bubble, and eventually it pops, no matter how much those selling it insist otherwise.

      Regards,

      John M