John RC,

"difficult to keep...spacetime from coming apart at the particles." Impossible I'd say. 'Spacetime' means quite different things to different people. After all the nonsensical interpretations that many still cling to Einstein ended up precisely where Minkowski started;

"not 'space' but infinitely many 'spaces' in relative motion" This gives the discrete inertial system or 'field' model (DFM). The particle interactions then divide them.

I couldn't penetrate far through the haze into your sample jar, so it's best I offer some simplifications which I think are compatible with that and the above.

Inertia is simply gyroscopic. It's then not bizarre to find a ton weight accelerating under G at the same rate as a pea. Imagine a ton of spinning gyroscopes fixed to a framework, and beside it one tiny pea-sized gyroscope. Which is easiest for you to accelerate by pushing? Correct, the tiny one. So why do we expect gravity to do the opposite and accelerate the big one faster?

That shows the fundamental errors in all our assumptions. Errors we've become so familiar with that most are unable to challenge them. I did, and a very simple model emerged, but it seems most can't, so confusion remains.

If you wish to see the model you only need to read the essays. The limit on the propagation speed of EM waves is then simple and relates to the minimum (gamma) wavelength, which is reached at optical-breakdown-mode plasma density. (I'll post the link to the paper on that if you wish.)

Best wishes

Peter

PS. It will make perfect sense to you when you read it, but if you don't then also 'rehearse' it, the whole dynamic will evaporate, as our neural networks don't have a pre-set default mode to 'hang it on'.

Akinbo,

Light does c through all matter systems wrt the rest frame of the matter. Simple as that. The G field then has only a secondary effect, because it affects particle density. That resolves Q1.

"The concept speed is inherently relative, but ONLY relative to the matter it propagates in (including lenses)." These are then local 'discrete frames' or fields.

Q2; What is absolute? Time rate locally, NOT emitted signals universally. Propagation speed, but again LOCALLY wrt the propagating medium only. Space, which may be quantized, is then logically modelled as a fluid, or multiple sets of two frames (2-fluid plasma).

You ask; What is a 'good' vacuum? You should know a perfect vacuum cannot exist. A good vacuum has insignificant massive particles at n=>1. It will always still have free electrons at n=1, but though their coupling constant is high they are of course 'invisible' spectroscopically. If there are a lot and they are blowing across the light path they'll 'drag' the light (actually "rotate the optical axis of re-emissions" is the more precise description). This is the well-known kSZ effect, also giving elliptical polarization and Faraday Rotation (IFR), all 'anomalous' under current theoretical assumptions.

Make sense?

Peter

Tom,

I agree SR's maths may be 'complete'. But it's clear that it doesn't completely describe nature. SR as finally defined by Einstein, however, reduced to the postulates, does so perfectly, with the soupçon of uncertainty inherent in the 'local' quantum mechanism and the LT.

All empirical evidence supports that definition of SR. It does not support the assumed mathematical description beyond that. If you think you can suggest some evidence that does so, please specify it and I'll show you where the wrong assumption lies.

Peter

Pete,

Actually I'd like to have a link to your papers; it looks like what you are developing is what gets too complex for me after the rationalization of the volumetric determination for rest-mass energy distribution in what I've modeled.

Beg pardon... the brief synopsis of the method for determining distribution to density variation was just that. It starts with an (intentionally) simplistic model that is 'ballpark' parametric: it establishes density values for magnetic response; then G as a 'c'-proportion lower density, which is assumed to be the limit of cohesive coherence of energy; a 'c'-proportion greater density than magnetic, with the characteristic of translating electric response; and another 'c'-proportion greater density with an assumed characteristic of inelasticity as the kinetic particle boundary. The parametric particle volume is then rationalized to a 'real' base radius by theory terms applied to a formulation of Coulomb's Law, and the exponential radial set of density-difference equations is then applied. The differentiation comes from the inertial density in relation to the kinetic density. If inertial > kinetic, the result is particulate; if inertial < kinetic, the electrical density allows the charge to expand to fill its container. In this manner the EM spectral range is equated the same way mass accumulation in particulate matter is equated.

In the range of particulate matter (the model produces a rationale for an upper limit of stable mass quantity at 263.11 amu) the difference between inertial and kinetic density, and the volumetric requirement of energy quantity at a constant density in the central core, results in the same "Incredible Shrinking Area of a Surface of a Volume" that Tom keeps trying to get people to understand is what is meant by 'curvature of space'. It's not that GR removes the field from the volume; it just uses acceleration instead of force to define its size.

If I get around to dusting it off, I'll be looking at what appears to be an optimal differential that results in some mass quantities, and the volumes those quantities prescribe, having a greater propensity than other quantity values. It's been 26 years and nobody but Reagan's SDI was interested; the best I could do as an amateur was to make sure that the D.O.D. knew first that I was tinkering on something, so nobody could use it to sneak into DARPA.

Your own effort in treating discrete fields in aggregate gets into a complexity that naturally must treat inertia as an operation between masses. The resemblance to all those studies Einstein made of Brownian motion is what always dissuades me from attempting the math. But the overriding conclusion is that in aggregate, domains develop, and in relation to any dominant gravitational frame, gram molecular weight is one thing but size matters.

NASA has some very interesting results from Voyager's transition to deep space you might want to look at. jrc

p.s. I really have to get started on a noisy wheel bearing.

John,

'Domain size matters.' Right. Space 's' moving within larger space 'S'. There is always a local background reference frame for defining the concept 'speed', but one that moves within its own larger background.
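For illustration only, here's a toy sketch of that nesting (my own construction, not from the essays, using naive one-dimensional Galilean composition purely to keep it readable): each 'space' carries a velocity relative to its parent, and 'speed' is only ever defined against the local background.

```python
# Toy sketch of nested 'spaces': velocity is only defined against the local
# background frame, which itself moves within a larger background.
# (My own illustration; the numbers are rough, commonly quoted values.)
from dataclasses import dataclass
from typing import Optional

@dataclass
class Space:
    name: str
    v_rel_parent_km_s: float = 0.0      # velocity relative to the enclosing space
    parent: Optional["Space"] = None

    def velocity_wrt(self, ancestor: "Space") -> float:
        """Naive Galilean sum of velocities up the chain to a named ancestor."""
        v, frame = 0.0, self
        while frame is not ancestor:
            if frame.parent is None:
                raise ValueError(f"{ancestor.name} is not an ancestor of {self.name}")
            v += frame.v_rel_parent_km_s
            frame = frame.parent
        return v

S = Space("larger space S")                      # e.g. the CMB rest frame
galaxy = Space("galaxy", 600.0, S)               # ~600 km/s wrt S (illustrative)
solar = Space("solar system", 220.0, galaxy)     # ~220 km/s wrt the galaxy

print(solar.velocity_wrt(galaxy))   # 220.0 -- the locally meaningful figure
print(solar.velocity_wrt(S))        # 820.0 -- only meaningful wrt the larger background
```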

That nesting is the hierarchical structure of Truth Function Logic I refer to in my previous essay. The essays are written as conceptual, so-called 'toy' models, as mathematics can't expose the logical dynamics. In sequence, the last three are here (all top-ten finalists, but overlooked as they don't use current assumptions). I've just posted the fqxi links to Petcho, but here are the less abridged versions;

2020 Vision

Much Ado About Nothing

The Intelligent Bit

An optics-based, fuller picture is here, finally explaining why reflections from a moving mirror in a vacuum do c wrt the vacuum frame, not the mirror; arXiv;

Kantor and Babcock-Bergman Emission Theory Anomalies.

When you've waded through those, ask for the LT derivation link and any other areas, such as the emergent cyclic cosmology. The mechanism is dead simple. Unravelling all the present complex confusion is less so at first.

Best of luck. Do report back with any questions or comments.

Peter

Pete

Thanks for the links. I'll put on my waders, but give me some time. I think you will like what NASA obtained; there was an abrupt and sustained increase in plasma pressure that can be precisely pegged to Voyager's transition out of the solar system's gravitational 'bubble'. When I read the piece I thought it was right up your alley. jrc

Lorraine,

To continue that thought, there are lots of different ways to think about all the enormous complexities of life and it doesn't take ten billion dollars to explore the possibilities.

One of the basic points I keep trying to raise about numbers and simple addition is that when we add, we are adding sets and ending up with larger sets, not the contents of the sets. Say when we add 4 apples and 6 apples, we are taking the two sets and creating a set of 10 apples. If we actually added the apples, we would have a jar of apple sauce.
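If it helps, here is a toy sketch of the distinction (my own example, just to make it concrete):

```python
# A toy sketch (my own example): we add the sizes of the sets and end up with
# a larger set of apples; we never add the apples themselves.
apples_a = {"a1", "a2", "a3", "a4"}                    # a set of 4 apples
apples_b = {"b1", "b2", "b3", "b4", "b5", "b6"}        # a set of 6 apples

combined = apples_a | apples_b                         # a set of 10 apples
print(len(apples_a) + len(apples_b), len(combined))    # 10 10

# Actually 'adding' the contents -- blending the apples into one thing -- is a
# different operation entirely: the applesauce case, where the individual
# apples no longer exist as countable elements.
```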

This seems like a very obvious and nit-picky point, but I think it is part of the thinking that leads physics to currently try to say everything is discrete.

Think about it in terms of the body: all our organs and cells and thoughts and relationships and physical context, etc., add up to ourselves as a whole person. Then when we step back, there is no real, distinct line where one person ends and the next begins. Yes, we can draw lines, but they are more like horizon lines, more a matter of selective perception. We all exist as each other's context, have the same DNA and energy, often share thoughts, etc. The lines we draw are as subjective as we are.

Now on the other hand, when everything is its own entity and we keep looking at smaller and smaller scales, we eventually get down to the quantum level of distinct objects. Even though the lines between them seem pretty fuzzy and there are statistical waves and superpositions and non-locality and entanglement and all those other factors that make them seem connected, we are not fooled! We know that if we just keep looking and poking, we will find true clarity of distinction and all the parts will be truly separate, even if we have no theory of how they all work together.

We need distinctions, but we also need connections. Like painting a picture, we don't want all the colors to run together, but we still need them to make connections.

' Shut up and calculate" is not even philosophy, but belief, if you can't examine how the factors and functions operate.

Regards,

John M

John,

You're right. It predicted the Voyager findings, against doctrine. The latest VLBA findings are entirely consistent with the model, confirming plasma refraction. Those are on top of the 20 I listed for Tom. But it seems even in science, when we weigh up irrefutable evidence and logic against established beliefs, it's beliefs that win every time. It's an interesting insight into human nature and neural networks. Just a shame about all those billions, really!

I look forward to your comments. Don't hold back on any flaws you may find.

Peter

Pete

A clarification, please, on the 2020 article. Right off you state axiomatically that the 'speed of the wave' is a resultant of the refraction index. This is consistent with Fitzgerald, whose contraction is of the wave, not time. The way I read the various experimental methods which are cited as evidence that 'light slows down in a medium' is not Lorentzian. The velocity of propagation is not altered; it is still celeritas. The wave contraction, due to (pardon) quantum effects in and of the medium, computes as a lower velocity that does not really distinguish between the speed of the wave and the velocity of propagation. That ambiguity is inherent to the LT: Lorentz packed the same amount of electric charge into a smaller volume and declared density equals mass. If you measure the strength of the charge by the inverse-square law, why would it behave as an average density of charge within a smaller volume?

The shorter wave is still propagating its volumetric change at the same velocity along the same timeline, through the media. Its physical extension of that volumetric dimension along the timeline is reduced, not the volume of its wave event, nor its constituent energy quantity. It's more analogous to compression than to velocity.
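To put rough numbers on the phraseology (a minimal sketch with my own illustrative figures): the single factor n ≈ 1.0003 gets booked either as a transit of ~1.0003 s over a one-light-second column of air (the 'lower speed' reading, c/n) or as a wavelength contracted to λ/n at unchanged frequency (the 'wave contraction' reading), which is exactly the ambiguity in 'wave speed' I'm pointing at.

```python
# Minimal numeric sketch (illustrative figures, my own) of the two readings
# that get conflated: 'the light slows to c/n' versus 'the wave contracts by
# 1/n while the propagation speed is unchanged'.
C = 3.0e10          # cm/s
N_AIR = 1.0003      # rough refraction index of air
PATH = 3.0e10       # cm -- a one-light-second column of atmosphere
LAMBDA = 1.0        # cm, the chosen waveform

transit_if_slower = PATH / (C / N_AIR)     # ~1.0003 s: booked as a lower speed
contracted_wavelength = LAMBDA / N_AIR     # ~0.9997 cm: booked as wave contraction
frequency = C / LAMBDA                     # unchanged across the boundary

print(transit_if_slower, contracted_wavelength, frequency)
```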

I'm trying to digest a bit at a time, so don't take me too seriously. jrc

John M,

My guess is that what you are trying to say is something like:

- we have current knowledge of fundamental reality represented/symbolised by law of nature equations. These equations don't necessarily imply any further equations.

- to go further than these equations you need a philosophy of what these equations mean. For example, if you philosophise that a multiverse exists, you will create a new set of equations (consistent with the original law of nature equations) to represent your ideas, and then look for evidence that might confirm these new equations.

- that is, to progress the equations, you definitely need a philosophy.

Cheers,

Lorraine

    Lorraine,

    A life without a philosophy is a fish without water. Even physicists have one; they just can't measure it, and part of their philosophy is that if it can't be measured, it doesn't exist. Silly, but true.

    Regards,

    John M

    John,

    Abott's view agrees with the 'Dirac Line' I postulate in my essay distinguishing reality from mathematical representation, so it falsifies the 'Law of the Excluded Middle' (i.e. binary 0 or 1, but nothing between) as having any real relevance as a descriptor of nature. Great link, thanks.

    But it seems the alternative Bayesian/Gödel "Law of the 'Reducing' Middle" I postulated as the one applying to nature was a step too far. Nobody even commented on it! Did it appear too presumptuous? Is it wrong? Is it understandable?
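    To make the question concrete, here is a minimal sketch of one reading of it (my own reading and my own illustrative numbers, not a formal statement): under the excluded middle a proposition is exactly 0 or 1, whereas under a Bayesian 'reducing' middle, evidence only ever drives the value towards 0 or 1, so the middle ground shrinks but never quite vanishes.

```python
# A minimal sketch of one reading of the 'Law of the Reducing Middle':
# evidence drives a Bayesian truth value towards 0 or 1 but never onto them,
# so the excluded-middle endpoints are limits, not attainable states.
# (Illustrative likelihoods; my own reading, not a formal statement.)
def update(p, like_if_true=0.9, like_if_false=0.1):
    """One Bayes update of P(proposition) given a piece of supporting evidence."""
    num = like_if_true * p
    return num / (num + like_if_false * (1.0 - p))

p = 0.5                      # start squarely in the 'middle'
for k in range(1, 7):
    p = update(p)
    print(f"after {k} observations: P = {p:.8f}")   # approaches 1, never reaches it
```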

    Views please, anybody.

    Peter

    Tom, Akinbo,

    The 'Shapiro effect' was immediately exposed as a sham, seemingly encouraged by the US military who sponsored the work. Shapiro was to give the keynote speech at the 5th Texas Symposium on Relativistic Physics, but Prof. Dicke and others rumbled the con and Shapiro was a no-show!

    It was a simple trick. He'd found the delay/curvature in the (Venus) radar reflection, which was many times the SR prediction, made a very large allowance (guess) for refraction in the ionosphere of Venus, and lo and behold he ended up with the SR prediction!! Of course refraction isn't allowable, and the same refraction allowed for the ionosphere around Venus wasn't allowed for the denser ionosphere around the Sun!

    So don't set any store by that effect! If curved space-time is implemented by diffuse ion diffraction, then all anomalies resolve without issues and the SR postulates are proved (DFM). The empirical evidence is now overwhelming, and consistent with all evidence cited for SR. Can you suggest anything other than 'belief' and fear that prevents this logical explanation from being studied and evaluated in comparison with 'old school' doctrine?

    Peter

    Peter,

    As usual, you are not quite exact. You certainly meant Irwin, not Stuart L. Shapiro, at the symposium on ... Astrophysics, not ... Physics, held in 1971, while the Shapiro delay was already found in 1964.

    Eckard

    Eckard,

    Who is 'Stuart' Shapiro? The 1964 test Akinbo referred to was indeed Irwin Shapiro's. The Fifth Texas Symposium was in 1970, and it's a matter of record that Shapiro was to be keynote speaker. The much-respected Robert Dicke replaced him, but was quite diplomatic, explaining that due to "systematic variations" in the radar data Shapiro 'had doubts about the solar interplanetary radar test of GR.'

    B.G. Wallace also spoke, explaining; "It is now apparent that the (Lincoln) Lab used computer methods to artificially rectify the 1961 radar data. They did so in order to remove large frequency-related variations that were obviously related to intervening plasma." See 'Spectroscopy Letters' 3(4&5), 115-121, 1970; also 4(3&4), 79-84, 1971; and 4(5), 123-127, 1971; also Foundations of Physics, Vol. 3, No. 3, 1973. I can give you more exact information if you feel you need it.

    Wallace was subsequently attacked, but had obtained the original data, so was never refuted. [Interestingly, the web references I've kept seem to be regularly 'tidied away', but nobody can remove all the printed documents.] Not exactly an auspicious time for physics, but I'm sure they meant well.

    Peter

    Peter,

    I think people will look back and see a lot of what we think today as hopelessly naive. For one thing, the idea of dichotomies will likely be much more taken for granted: that most observations admit not just opposing views, but a variety of reflections.

    To paraphrase Newton; "For every thought, there is an equal and opposite thought."

    Regards,

    John M

    PS. One thought which has been occurring to me, in terms of my point about time and looking at where the world seems to be headed: I keep pointing out that if time were a vector from past to future, the faster clock would move into the future more rapidly, but the opposite is true. Since time is a process by which the future becomes past, the faster clock burns/ages more rapidly, so it moves into the past quicker. If this idea were to take hold, then maybe civilization might begin to appreciate the value of "slowing down and thinking", since more speed often just means one peaks sooner.

    Tom,

    Them's the facts. Only you can decide if there's some conspiracy. The (Lincoln Lab) project was sponsored by the army and was supposed to be in collaboration with the Russians! As it was at the height of the Cold War misinformation era, I'd not be too surprised at the generals wanting to throw a curved ball. But the data remain the data regardless, both the raw figures and the very different figures after the (army's Lincoln) computer was used to 'clean them up' (which is openly admitted).

    But to gain any credibility back you can't claim all findings support the predictions of SR and/or GR and also assume that includes all the 'interpretations' of SR and/or GR, while blithely ignoring those which very clearly don't!

    Quite apart from the 20 anomalies I posted to you, the recent VLBA findings and the confirmed apparent superluminal motion of quasar jet pulses at up to 46c make nonsense of the interpretations which you claim are implicit. All I'm pointing out is that, as Einstein said, they are not implicit, so SR and GR do not have to fall if any of the assumptive decorations fall.
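    For reference, here is the conventional projection formula behind the interpretation in question (a minimal sketch with illustrative numbers of my own, not from either of our posts): to reproduce apparent speeds of tens of c it needs jet speeds within a small fraction of a percent of c, aimed within a degree or two of the line of sight.

```python
# Minimal sketch (my own illustrative numbers) of the conventional geometric
# account of apparent superluminal motion: the apparent transverse speed, in
# units of c, is beta*sin(theta) / (1 - beta*cos(theta)) for a jet moving at
# speed beta toward us at angle theta to the line of sight.
import math

def beta_apparent(beta, theta_deg):
    theta = math.radians(theta_deg)
    return beta * math.sin(theta) / (1.0 - beta * math.cos(theta))

for beta, theta in [(0.99, 8.0), (0.999, 2.5), (0.9998, 1.2)]:
    print(f"beta = {beta}, theta = {theta} deg  ->  apparent speed ~ {beta_apparent(beta, theta):.1f} c")
```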

    You always have the other option: to explain the findings in terms of your beliefs. Can you do so? If not, what's your objection to a more consistent interpretation?

    Peter

    Peter Jackson

    I think I see where your criteria must rely on quantum averaging, and for that sort of regime perhaps Lorentz transforms are well suited; they sure worked well for Einstein in relation to averaging mass with measurable volume, such as in our own solar mass. Incidentally, you may be putting yourself outside modern convention more than convention is excluding you. NASA theoreticians were ecstatic when the data were crunched; they had been afraid it could take years.

    Where I find difficulty with what is really a 'short answer' concerning refraction index is the common phraseology of 'wave speed'. Rigorously, the speed of the wave could only slow down if a (chosen) 1 cm wavelength waveform were to take 1.0003 seconds to transit 3×10^10 cm of average Earth atmosphere. It ceases being a 1 cm wavelength at the refraction plane, and each waveform event is contracted by the reciprocal of the refraction index from the original 1 cm length. It shifts toward the blue, and carries with it the absorption-line signature of its source of emission. Upon exit from the refractive medium, transiting the refraction plane, the wavelength does not 'stretch' back out to 1 cm. That does not mean it necessarily retains the refracted length; any alteration of the wavelength by a medium is always in a proportion of the refraction index to 'zero' refraction. A wavetrain in atmosphere transiting through glass would experience firstly the atmospheric contraction, then the additional glass contraction, but would then have that combined contracted length rebound very slightly, by the reciprocal value of atmospheric to 'zero' refraction, when the waveform transits from glass to the atmosphere. There is, definitely, an arrow of time.
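    For comparison, here is the ordinary textbook bookkeeping (a minimal sketch with my own illustrative indices), in which the wavelength in each medium depends only on that medium's index and fully rebounds on exit; that is the point where the description above parts company with it:

```python
# Minimal sketch (illustrative indices, my own numbers) of the ordinary
# textbook bookkeeping: the wavelength in a medium is lambda_vacuum / n, so it
# is fixed by whichever medium is currently being transited, with no memory of
# the media already crossed.
LAMBDA_VACUUM = 1.0   # cm, the chosen waveform

media = [("vacuum", 1.0), ("atmosphere", 1.0003), ("glass", 1.5), ("atmosphere", 1.0003)]

for name, n in media:
    wavelength = LAMBDA_VACUUM / n      # textbook relation: lambda / n
    speed = 3.0e10 / n                  # cm/s, phase speed c / n
    print(f"{name:10s}  n = {n:6.4f}  wavelength = {wavelength:.6f} cm  speed = {speed:.4e} cm/s")
```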

    In a gross evaluation, such as is necessary in a treatment of aggregates of discrete field entities, it might not be a problem to average as many things as reasonably possible and box refraction as a slower speed. I can't say. If you are confident that the results you obtain correlate to experimental observations, then go for it. With this caveat: you have expressed something of a disconnect between the heuristic conceptual construct and its representation by mathematical formalization, and that does leave the uninitiated a bit disoriented. The more simplified the math introduction, the more quickly the physical relation can be grasped.

    Onward! through the fog! jrc