Akinbo,

Eric Reiter has just reiterated the loading theory proposed by Max Planck in 1911. At unquantum.net he explains: "Light is emitted in a photon's worth of energy hv, but thereafter the narrow cone of light spreads classically. There are no photons!" Reiter's experiment with gamma rays convincingly confirmed Planck's loading theory. Constantinos Ragazas also advocated for it. I will check whether my worry about an unrealistic-looking result by Gompf with single-photon measurement can be explained by this theory too.

Peter,

You criticized Caroline Thompson for not yet providing the solution. I have to admit that I never delved into the loopholes of overly theory-based experimental confirmations of theories that contradict simple reasoning and invite invoking ghosts, consciousness, and the like. Can you please specify Thompson's failure?

Did you deal with my argument that the velocity of light in vacuum is not related to any frame of reference at all, but to the distance between the emitter at the moment of emission and the receiver at the moment of arrival?

Eckard

Akinbo,

Thank you kindly. On "trying too hard to explain this inexplicable" you misunderstand. I tried to explain how ridiculous QM's rationale is despite its predictions working with perfect precision! I invoke no unproven science, bring in nothing irrelevant and 'force' nothing! But you must adopt the axioms and follow the ontological construction that uses them if you wish to understand the solution they lead to. That's how axiomatic theories work! You don't have to 'swallow' anything along the way, just understand it. The 'assessment' can only come at the end, on assimilating the results.

I'm confused by your question re electron frames. It seems you have some quite different conception of rest frames. All bodies and particles have their own 'centre of mass' rest frame, but only ONE! A particle can know NO OTHER FRAME! A frame is simply defined here as a 'state of motion'. Did you have some other conception?

"How can there be more entanglement when things are further apart rather than when they are nearer to each other as Copenhagen interpretation believes, but which Caroline Thompson rightly disputes (in my opinion)."

There can't. Perhaps re-read what I wrote. There is an ADDITIONAL 'entanglement' effect at short range, well known in tomography etc. More homework!! Most assume the effects are the same, thus the confusion. I identify the difference (if and when you get there!). I agree with Caroline's brilliant work pretty well 100% (and have never 'criticised' it!) but she well knew it wasn't 'complete'. She also does NOT dispute the interpretation I described.

However, while brilliantly identifying almost all the flaws, she was frustrated at not being able to derive the actual classical description of the key effects: entanglement and non-locality, or the intermediate cosine distribution. Her 'random sphere' idea was similar to Joy's, but with the same limitations. (Joy agrees that his own more sophisticated description still does not extend to an actual theory.)

Did you understand from the summary what the effect termed 'non-locality' actually is and how it emerges? (It needed careful reading). I'm working on improved figures, and descriptions if you need them.

Best wishes

Peter

The question posed by the article is not why quantum granulation arises in the microscopic realm, but only why the quantum mechanical probabilities seem to be statistically confirmed across the board, while other probabilistic theories also seem to prove out, though not always in all their parts. It would be nice to hear from those with enough practical (practiced) familiarity with probability methodologies to explain what the Barrett and Leifer proposal attempts to discover regarding the question of how spacetime, incarnate as energy, resolves into clumps in the first place.

In our macro world things seem to be solid and smoothed over, while the micro atomic world seems to be granular. Only in extreme cases is entropy zero, as is theoretically the case for electromagnetic radiation in a background-free frame. So the question becomes: below diffeomorphism, is entropy zero, or does diffeomorphism of matter-like energy clumps evolve in anentropic spacetime? jrc

    JC,

    "your DFM is consistent and applicable but runs into the conundrum of momentum being associated with mass which cannot achieve light velocity. It suffers from an apparent lack of any physical property in the helical structure necessary to transmit the angular momentum of 'the photon'.

    It becomes perplexing, an angular moment in time orbiting a line. Where is the photon? to have momentum? What's not spooky about that kind of action across a distance?"

    A 'photon' was only ever cited as a 'quantum of energy', manifesting as an 'entity' on interaction. Like many I disagree with ballistic theory. I don't pretend to answer 'what is waving'; nobody can (unless you'll accept "Ground Comprathene"!), so just consider motion itself as energy. A 'dipole' is then a positive and negative charge, each spinning, orbiting each other, so describing a double helix path through space when propagating, each path consisting of a smaller 'helical path' described by each charge. That is the 'hyperfine spin' of the 'spin orbit' relation found and used in optics. It may be considered as a higher order (smaller) 'dimension' or fractal.

    There are more helices at greater scales, i.e. the light may propagate on a planet which itself spins so describes a helical path round the sun, then has a greater helical path through the galaxy, etc. Like rope, the 'amplituhedron', or perhaps even string theory, the pattern repeats. But we're just considering two.

    But the photon can ONLY propagate at c (and can be considered as a 'speck' on an expanding Schrödinger sphere surface 'wave'). An electron on the other hand has OAM 'instantaneously' as the spin is 'bound', so can be seen as an 'entity' with a rest frame and lower translational speed limit. I like the toroidal model (paired vortices) but considering it as a sphere works just fine.

    Now when an orbiting dipole meets an electron, where is the problem with the (small) dipole OAM being added to the electron spin, then 'spat out' again on the same axis (subject to KRR) but now with the ELECTRON'S spin 'direction'? That's what atomic scattering (coupling) 'is'! Would it not be far harder to imagine a 'cannonball' doing anything like that!?

    There is then NO 'action at a distance' required. The assumption that if Alice changes setting to find "up" then Bob's finding has to magically change to be "down" is no longer needed. Each of them can find 50:50 'up' and 'down' entirely independently.

    The nature of randomness means that we can reverse ALL Alice's findings, but STILL find she gets a 50:50 result! That little gem simply hasn't been understood in 'statistical analysis', which could make either assumption but makes the wrong one (that you can't get two 'up's together).

    The problem seems to be, as Bell identified, that Quantum physicists are 'sleepwalking', so don't believe there even IS any rational solution, so perhaps little wonder they'll just deny and ignore any that emerge.

    That was helpful, thanks. Did the explanation help allay your reservations at all? Do identify where if not.

    Best wishes

    Peter

    Eckard,

    Finding QM's logic was an aim I shared with Caroline. But as my previous essay identified, QM started as a falsification of the DFM, which predicted something apparently 'ridiculous': that Aspect's finding should have been very different from that reported. It was frankly a relief (I could go sailing) but I checked anyway. I found his data (French-language paper only!) WAS as predicted, but was 'corrected' (most omitted) to match the theory! That was shocking, but how could I challenge such a 'well reputed experimentalist'! (lol!)

    If it weren't for then finding Caroline's work I'd probably have given up there and then. She didn't find the actual classical solution or theory, but did brilliantly analyse and identify all the faults and shortcomings of quantum theory and Aspect's experiments. I was devastated to find she'd died, but her work was an inspiration and gave me the confidence to believe I was on the right lines. I'd never criticise it!

    I'd have loved to have just passed the predicted solution over to Caroline to present properly (no doubt with reams of maths) and use to slap the faces of all the peer reviewers that rejected her papers. So again I was left without the expert help needed.

    The actual assumptive flaw Bell adopted and mechanism which Caroline didn't identify, reproducing QM's predictions, involved the 'extent' of randomness and the consequences of using a different assumption.

    To understand this in DFM terms let's consider a planet. Take Earth. We spin it up and fire it through space on its axis, either North or South pole first. We do this with 1,000 planets, randomly 50% north first and 50% south.

    We may then split each planet in half at the equator and send the other half the other way, so the OPPOSITE pole will always then lead the way, still 50:50.

    Now if one half of each pair hits a strong magnetic field and is turned around, they will then be led by the OPPOSITE poles. i.e. the SAME poles as led the other way. If they arrive at Bob and Alice who note down the poles for each pair they will then find the SAME pole in each case! But STILL 50:50 north/south.

    Think carefully about this as it's very easy to misunderstand or forget. It means that we can REVERSE ALL the findings of either Bob or Alice but NOBODY CAN TELL unless each individual pair is timed and matched. The 'statistical' approach of most experiments assumes we can't have Alice 'S' together with Bob 'S', but can't tell whether we did or not!
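[Editor's sketch] The reversal point above is easy to check numerically. The toy simulation below is my own illustration, not Peter's code or part of the DFM; all names and values are arbitrary. It builds pairs whose halves lead with opposite poles, optionally reverses every one of Bob's outcomes, and shows his marginal statistics stay 50:50 while only the pairwise matching changes.

```python
import random

def simulate(n=100_000, reverse_bob=True, seed=1):
    """Toy version of the split-planet analogy: Alice's half leads with
    a random pole (50:50); Bob's half leads with the OPPOSITE pole.
    Optionally a 'field' reverses every one of Bob's outcomes."""
    random.seed(seed)
    flip = {'N': 'S', 'S': 'N'}
    alice, bob = [], []
    for _ in range(n):
        a = random.choice('NS')
        b = flip[a]               # opposite pole leads the other way
        if reverse_bob:
            b = flip[b]           # the field turns Bob's half around
        alice.append(a)
        bob.append(b)
    bob_north = bob.count('N') / n
    matches = sum(x == y for x, y in zip(alice, bob)) / n
    return bob_north, matches

# Reversing ALL of Bob's outcomes leaves his marginal at ~50:50;
# only the pairwise matching changes (from 0% to 100% here).
print(simulate(reverse_bob=False))
print(simulate(reverse_bob=True))
```

Without timing and matching individual pairs, an observer looking only at Bob's column of results cannot tell whether the reversal happened, which is the statistical point being made.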

    Now as the 'axis' is common to both halves of the planet, if Bob and Alice measure at different 'angles' the angles CAN THEN BE RELATED (solving the most important part of the conundrum). The common axis then performs the role of "entanglement."

    What nobody previously noticed is that Bell assumed the axes of the two halves of Earth would ALSO be entirely randomly orientated (p146). It is only THAT assumption which means that the Wigner-d'Espagnat (and 'Bell') inequality limit applies. If we consider a photon as propagating as part of a Schrödinger sphere surface then it seems reasonable to assume that the spin axis may be normal to the surface plane.

    Bell's other problem was deriving the cosine curve. The solution simply emerges from the DFM dynamic: between the lines of latitude on Earth the orbital velocity varies by the cosine of the angle of that latitude, measured from the centre of the Earth and the equatorial plane for momentum, and inversely from the polar axis for spin 'direction' (at the equator there is no clockwise or anticlockwise).

    Nobody should assume absolute causality emerges. It doesn't. But 'non-locality' (apparent action at a distance) is explained rationally, free of any spookiness. Of course it does also seem to fulfil its original task, so allows convergence of classical and quantum physics (a hierarchy of LOCAL 'preferred frames').

    That dynamic geometrical ontological construction is what the paper lodged on Academia describes, developing the essay. Has that made it clearer? Do you still perceive any 'shortcomings'? It's complex at first but entirely consistent with Caroline's conclusions, and owes much to her work; it might eventually just manage to 'emerge' one day for the benefit of mankind. Or are we now beyond paradigm changes?

    Best wishes

    Peter

    Yes, Pete,

    that is a good clarification and goes to entropy occurring in the absorption and emission 'frames', for want of a full understanding of what is physically occurring, the pursuit of which is what your efforts are all about. Eckard's referral to Planck's loading hypothesis has long been assumed in my own modeling, and I wonder how closely the math involved might compare to that of a Zener diode, which builds to a threshold; rather than sending a pulse back into the loading circuit, at threshold the resistance collapses and allows the collective potential to pass across the junction. There must be some time element for that event in draining the loaded potential while also being a drain on the resistance level, rebuilding the threshold. I learned a little about that refurbing a little garden tractor mower powered by an old B&S with a flywheel alternator charging circuit that used a Zener in the regulator instead of rectification through a Wheatstone bridge.
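[Editor's sketch] The accumulate-and-discharge behaviour described above can be put in a few lines of code. This is only a qualitative toy of my own, not Planck's mathematics or a real Zener model; the rate and threshold values are arbitrary. Energy loads continuously, and a full-sized discrete event fires each time the threshold is crossed.

```python
def loading_events(rate, threshold, steps):
    """Accumulate energy continuously at a fixed rate per step; each
    time the store reaches the threshold, release one full 'quantum'
    (keeping any remainder) and record the step of the event."""
    stored = 0.0
    events = []
    for t in range(steps):
        stored += rate
        if stored >= threshold:
            stored -= threshold   # dump one quantum, like a breakdown
            events.append(t)
    return events

# A weak steady input still yields full-sized discrete events, just
# less often -- the qualitative point of the loading picture.
print(loading_events(rate=0.25, threshold=1.0, steps=10))  # [3, 7]
```

Halving the rate spaces the events out but never shrinks them, which is what distinguishes a loading model from a ballistic photon model.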

    My question about the Barrett & Leifer protocols is how they are going to apply the 'entropy' parameter in rejecting theories they don't like. Entropy is not an inherent element in the maths Einstein built up to construct the GR model; it emerges in application. Is there entropy inherent to SR? What razor of Ockham's are they going to strop? jrc

    Yes, it is true that this thread has not taken the "Why Quantum" essay to heart. But the stated intent of the essay seems to be somewhat different from what you describe. All the essay seems to propose is that entropy might show why the microscopic universe follows quantum logic instead of a host of other possible models.

    Frankly, the essay asks why the universe is the way that it is, which is actually one of those questions that have no answer... at least no unique answer. This question appears most often in the context of religion or philosophy and so I was quite surprised to find it funded as a "Foundational Question."

    Actually, I have been a little disappointed that the FQXi website does not actually ever seem to recognize the nature of some foundational questions that really do not have answers. There are foundational questions that people have been asking for tens of thousands of years and that have no answers. First and foremost is the question:

    "Why is the universe the way that it is?"

    Religion and philosophy address this question endlessly and the result is always more discourse about the answer, not a single testable answer.

    I use entropy quite a lot in my work with solution thermodynamics and am always disappointed when the first definition of entropy is not the logarithm of the total number of states of the system. If the system has one state, its entropy is zero because the logarithm of one is zero.

    It is very true that the entropy of an isolated system must increase over time, and yet the entropy of many systems within the universe, e.g. galaxies, has decreased significantly over time. Since the number of possible states of a system always increases over time, entropy always increases over time. However, there are no truly isolated systems in the universe, and so the laws of thermodynamics are highly scale dependent.

    The entropy of quantum action is quite well behaved, while the entropy of gravity action can be quite peculiar. Trying to use entropy, which is the way the universe is, to explain why the microscopic universe is the way that it is, i.e. quantum, seems circular.

    The universe is the way it is because that is the way it is...

    Steve,

    Thanks again for a clarifying moment. So if QM is inherently probabilistic, it's a log of the number of statistically averaged states, compared with the numbers predicted by other theories. Assuming we live in a perfect universe, everything that could or should or would happen always will, eventually. So we have to swallow a multiverse.

    Which I doubt. The one thing that you can invariably count on is Murphy's Law of Perversity (attributed to a guy on the early rocket-powered test-sled project), which states that anything that can go wrong, will. At the least opportune time. Applying that generally to all of reality, in whole or in macro part: it isn't a paranormal question of something popping into existence without any cause; it is a matter of something that should occur not occurring, for no reason at all. And there is no predicting that. But once something does not happen which predictably should, it alters the terrain, and other things occur that are not 100% predictable classically. But that applies to classical mechanics as well as quantum mechanics. Murph rules! jrc

    Peter,

    I share Steven's opinion that addressing basic questions is not an excuse for wild guesses up to mysticism. Also I share his opinion that closed systems are always something ideal rather than real. Concerning the cat I already reduced Akinbo's autopsy to Buridan's naïve donkey even if this insight does not yet reveal any really basic flaw in QM.

    Because the link to your Academia paper didn't work for me, I tried to understand from your last posting: why do you believe that "she didn't find the actual classical solution"? Before I read her paper I looked at its figures and became aware that Fig. 1 was obviously wrong. In [14] she explained that she was well aware of that error. Bell 1964 and Aspect were thinking in terms of QM, and d'Espagnat revealed the naivety of his argumentation already in his figure on p. 160 of http://www.scientificamerican.com/media/pdf/197911_0158.pdf by including separability in the premises of local realistic theories instead of questioning some basics of QM.

    Actio = reactio. I question Markov models. Nothing imaginable to me is separated from its history.

    Peter, I think you mistook Thompson. She did already reveal the conserved common axis of polarization for half-pairs of particle-like wavelets propagating in opposite directions.

    Of course, Planck's, Reiter's, and Thompson's thoughts are utterly unwelcome in mainstream FQXi.

    I attribute individual frames of reference to anything that propagates, including light. It doesn't matter whether we imagine it as waves or particles; light obviously propagates in empty space regardless of the velocities of the emitter as well as the receiver. Empty space merely constitutes instantaneous distances, without any naturally preferred location to refer to.

    Eckard

    Peter,

    I agree with JRC. That analogy (To understand this in DFM terms let's consider a planet....) makes your line of thought very much clearer. The points of disagreement are now narrowed to whether or not there was any encounter with an orientation-changing mechanism like a magnetic field. And must the probability always be 50:50? Can the system not be started as 70:30 and measured when separated to see if experiment shows this?

    I also agree with you that Caroline did not proffer any solution (at least from what I have read so far). But her identification of the gaping loopholes and falsehoods in the quantum theory assumptions appears impeccable, and remains uncontroverted.

    I googled 'Ground Comprathene'; the term is attributed to you, which means you 'invented' it?

    When you say, "Like many I disagree with ballistic theory. I don't pretend to answer 'what is waving',..."

    Have you not heard in Einsteiniana that space (or space-time) can vibrate? Have you not heard that this vibration also travels at the same speed, c, as light? If so, why can't gravitational waves and light waves be the same but occupying different parts of the spectrum? Pentcho also posted a link to some of Einstein's thoughts on the possible discrete nature of space here. What is discrete can 'wave', don't you think?

    Eckard,

    The Buridan's ass story may have implications in physics. I also saw this quote: "...a man, being just as hungry as thirsty, and placed in between food and drink, must necessarily remain where he is and starve to death" -- Aristotle, On the Heavens, ca. 350 BCE

    JRC and Steve,

    You are right. Much has not been said on the relationship between entropy and quantum theory. I hope to make some comments later on this.

    Regards,

    Akinbo

    No, Akinbo and Peter,

    Thompson wrote at NPA 2000: "I shall attempt to explain what the debate is all about, and how the real experiments can be modelled without any need to invoke quantum weirdness." Finding QM's logic was NOT her aim. She concluded:

    "The scientific community seems to have gone off on the wrong track" and

    "8. Theorists realise that quantum theory itself is at risk. Regardless of all the "conceptual difficulties", it is too "successful" to abandon without good experimental evidence.

    9. "Quantum computing" etc. depends on quantum theory being right, and the computing industry is currently an important source of funds. (It has so far tolerated the fact that nothing spectacular has been achieved. Hopefully the research will produce useful results - advances in optical computers, for example - even if the original idea is totally misguided.)"

    The claimed success of QM relates rather to the discoveries by Franck and G. Hertz and by Stern and Gerlach than to the mathematical guesswork of the 1920s, which I revealed as improper use of Heaviside's trick.

    Akinbo, thank you for the reference to Aristotle. It was already known to me that Buridan's donkey considerably predates Buridan.

    Eckard

    Eckard,

    Like Akinbo I found no falsifiable solution emergent from Caroline's excellent demolition job. Do identify the reference to common axis so we can track down any implications derived.

    You persistently cite 'empty' space, which is a misnomer. 'No space is free of field' (AE, and as now found). Media are simply more or less diffuse. Much of space is very VERY diffuse, but very very BIG to balance that! A galactic halo lenses light exactly as the lens in your glasses does, as it has the same number of particles. If you treat yourself to a subscription to MNRAS, AJ or ApJ and read a few papers it'd soon become clear why space as 'nothing' is a non-starter. It needs to, and does, do far more.

    I agree re 'frames'. As a galaxy complete with its halo 'propagates' through the local group (at known velocity) it also defines such a discrete inertial system. But where you still seem to struggle is in recognising that light IN the galaxy is shifted to c wrt the galaxy rest frame. Similarly light INSIDE the heliospheric shock propagates at c wrt the sun's rest frame. Only THEN can we make rational sense of findings, AND in line with SR's postulates! But the current SR 'interpretation' only gives anomalies and paradox.

    Nobody has ever shown that free electrons and protons scatter at anything other than c wrt their OWN rest frame. Most are confused because they fail to distinguish between bulk media relative v and relative refractive index n.

    On QM, I look at it a little differently from Caroline. She suggests "it's wrong", where in fact its predictions are of course precisely confirmed, just not 'classically' explained. So yes, the 'explanation' (weirdness) it adopts is wholly wrong, but its findings are correct. The big error then (as Bell agreed) is in 'giving up' and 'accepting' that it's not POSSIBLE to explain classically.

    Caroline agreed Aspect's ACTUAL findings didn't need 'correcting' to match the false theoretical assumption; however, from what I've seen she never managed to derive the CORRECT classical mechanism ('theory') reproducing QM's predictions: both non-locality and the cos^2 distribution. Or do correct me if you find it.

    Best wishes

    Peter

    Akinbo;

    "..narrowed to whether or not there was any encounter with an orientation changing mechanism like a magnetic field. And must the probability always be 50:50? Can the system not be started as 70:30 and measured when separated to see if experiment shows this?"

    Of course. If we start with coins 70% heads that works fine (or some planets with two north poles!). It's nature that insists on 50:50, not me!

    To initially prove your first point is even simpler: the modulator (filter/ analyser/ polariser/ magnets/ whatever we call it) IS an EM field! That's what's being rotated. Weihs et al (inc. Zeilinger) even used an 'electro-optic' one, and specifically reported its rotational effects. We also know from circularly polarised coherent light that application of a strong field can reverse its polarity. Wade into arXiv and you'll find it all there.

    The problem with physics is that we do it 'incrementally', isolated from other parts. All are studying parts of trees in detail but none can see the forest. That's why my studies have been multi-disciplinary, and why I say we need 'joined-up physics'. We just need to consistently apply each part to construct the coherent ontology. Might that approach be possible by 2020?

    My point about 'Ground Comprathene' is that we ONLY know ANYTHING by its properties. We can call anything any name we wish; it's meaningless, just a 'tag' to help communication. Sure, the dark energy/condensate or whatever is 'something', but it's not 'matter'. So we don't have any clue or way of describing it except from its properties and their effects. If you look closely you'll see that the DFM ontology is the only model fully consistent with those properties and effects. But what it 'IS' (the name tag) is completely meaningless!

    The model suggests there are no 'gravity waves', only effects which will vary with a body's proximity, and that the BICEP finding will be found repeated at a smaller scale in collimated quasar jet emissions.

    Also entropy is an irrelevant and confusing misnomer, as the only consistent long-term cosmology is cyclic. Random matter accreted to an AGN 'self-organises' into two counter-rotating helical paths ('helicoil') as in a fusion tokamak. So much for 'entropy'!

    Best wishes

    Peter

    Pete,

    What I hear from Eckard's use of 'empty space' is the practical theorist taking an item out of complexity and examining it on the classic workbench of 'background independent' measure. Just as you cannot cut planets in half. The question does not depend on your not attempting to examine what the EM wave might be physically; the question is: can we divide the quantum (~h) and build an understanding of the wave without having more than one physically discrete state, which would therefore result in the EM spectrum being subject to entropy and decay?

    Speaking of entropy, do you have any comments on the article? jrc

    JC,

    You may be right, but Eckard specified the conception previously. The bit 'on the bench' is fine, but there's much else to explain, all left in the 'engine bay'. A little like Akinbo's comments about appearing to 'introduce' unnecessary effects, it read to me like a driver looking into the engine bay and asking the mechanic 'why did you introduce all these irrelevant complications around the engine'?

    The answer is of course that the whole thing has to work coherently in all circumstances. An engine is really quite simple, and only as complex as it needs to be. At present the effects we call QM are still 'just' too complex for most to understand classically, but only because most now 'believe' something different. Same with SR. The same simple solution solves both at once. It's just 'unfamiliar'.

    Which entropy article did you mean? Have I missed a link? I see entropy as having one major connection with nature; They're both misunderstood. I agree Einstein's '1,000th of 1%.'

    Eckard,

    Yes, I did again comment on your proposal a number of posts ago, but not in detail. Just one quick example of the dozens of problems it would raise from astrophysics: AGN/quasar accretion, jet speed, energy, plasma collimation and propagation rates etc. are all related and comparable. The whole process relies absolutely on an ambient local rest frame (observable via the halo stars and gas, and normally also the AGN centre-of-mass frame except for the longest jets). The general term is the 'intergalactic medium', necessary for many other effects.

    The ejected protons densely propagate new fermion pairs at the collimation shear hypersurfaces in proportion to their speed through this 'medium'. This is the same process as all astronomical shocks.

    Sure, present theory is nonsense, as although light is modulated by these shocks, so often Doppler blue-shifted to form the GRBs we find from them, that can't be theoretically assimilated as it appears to violate SR! But then so does the 'speed' of the jets, measured trigonometrically at up to 46c. There is only one possible logical solution, and calling space 'empty' isn't it! It's just unfamiliar. But very simple: light is continually scattered to the local c, due to 'continuous spontaneous localisation' (CSL, Pearle). But of course things always REMAIN unfamiliar if all are looking elsewhere.

    Best wishes

    Peter

    I think the number of states of the universe has increased over time by virtue of the fact that the universe is expanding. Space itself is filled with states. If the universe expands, then it adds more states to itself.

    JRC and Steve,

    I have browsed the article. Peter, click 'back to article' at the top of this page. I may not make much comment because 'entropy' as a concept is itself yet to be fully understood, and trying to combine it with an equally controversial theory like QM will only lead to the invention of more mechanisms that then lead to absurdities and paradoxes. I suggest an FQXi grant be spent instead on falsifying or confirming Caroline Thompson's work. Only after this falsification, and if Aspect's finding passes the test, can we look at a combination of entropy with QM. But JRC, you said, "...Only in extreme cases is entropy zero". Will the beginning of the universe not be an extreme situation? If the second law holds, then the initial state will be of zero entropy. Can a state of zero entropy be a very hot thing of quantum size at 10^32 K temperature, or was there a state before that which was at 0 K, and thus zero entropy? Sorry, this is moving towards cosmology, which may not belong here...

    Eckard, I find that it will be useful for me to read Caroline's work again and again.

    Peter, the part where I agree with you fully is that light IN the galaxy is shifted to c wrt the galaxy rest frame, and similarly light INSIDE the heliospheric shock propagates at c wrt the sun's rest frame..., although the c's may not be of the same value, which you fail to mention here although you have agreed before that your c can be any value. To make the DFM progress further you must give a list of unknown, yet falsifiable claims or postulates, the finding of which means that the DFM must be abandoned. JRC has asked you for one by asking, "can we divide the quantum (~h)...", which I rephrase as: if the DFM admits of the photon, is it divisible? List out other claims on which a bet can be taken and on which the success or failure of the DFM can rest. Don't be afraid. Einstein himself said that if space is found to be discontinuous his whole theory of relativity would vanish into the air. He was also ready to sacrifice his special theory of relativity and said so in a quote. So, what is it that, if found, means the DFM should be abandoned? This may give you heartache though if you lose the bet.

    Regards,

    Akinbo

    Pete,

    You asked..."Which entropy article did you mean? Have I missed a link?"

    The article which is the topic of this blog. Go back to top and click the subscript to 'Why Quantum' that reads; 'Back to Article'. That's the typical format. Hope you're having good bike weather, jrc

    Peter,

    In order to decipher AGN as active galactic nucleus and GRB as gamma-ray burst, I looked into http://www.spacedaily.com/reports/Gamma_ray_burst_challenges_particle_acceleration_theories_999.html which was a pleasure to read, while your statement "the 'speed' of the jets, measured trigonometrically at up to 46c" is an old misleading one. Perhaps there is no plausible argument that confirms your idea of "reemission locally at c".

    I merely argue that Michelson's 1881/87 unexpected null result is quite logical if ideally empty space does not behave like a medium; the explanation of the result does not require length contraction. I am well aware that the cosmos is anything but empty, and a space that is considered empty for experiments with light may nonetheless contain static electric fields etc. You still didn't show what could be wrong with my argument.

    Already Thompson's comment 14 on her Fig. 1 revealed to me that she understood the common axis of "entangled particles". I will look for further evidence in her text. We should happily be in agreement with her, Planck, and Reiter. She dared to call a spade a spade. Do you hope for a compromise?

    Eckard

    Akinbo,

    Yes you are right, this thread has gotten long in the tooth while not chewing on the topic. I think Steve Agnew summed it well, and I'm going to try to refrain from digression... with one last wild fling.

    '...Will the beginning of the universe not be an extreme condition?...'

    I'm not willing to close the door on the Steady State Theory, and that is currently heresy. In the 1972 3rd edition of the introductory compendium, "Asimov's Guide to Science", good ol' Isaac gives a typical thumbnail sketch of Fred Hoyle's elaboration of Hermann Bondi and Thomas Gold's development, stating that the then-current estimates of the expansion rate of the observable universe would require an undetectable mass quantity to evolve in the continual-creation model. The energy creation would amount to the equivalent of one simple hydrogen atom per year, per one billion litres of space. Given the 'dark energy' conundrum of more recent times, and the contributions by Hoyle to the evolutionary production of isotopes heavier than helium, I think a renewed look at Hoyle and company is rational and warranted. And it raises the question as to whether entropy is only a quantum macroscopic phenomenon.
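[Editor's sketch] The mass-creation rate implied by that "one hydrogen atom per year per billion litres" figure is easy to work out. This is my own back-of-envelope arithmetic with standard constants, not Asimov's or Hoyle's calculation:

```python
# Implied mass-creation rate for "one hydrogen atom per year
# per one billion litres of space".
M_H = 1.67e-27            # mass of a hydrogen atom, kg
VOLUME_M3 = 1e9 * 1e-3    # one billion litres = 1e6 cubic metres
SECONDS_PER_YEAR = 3.156e7

rate = M_H / (VOLUME_M3 * SECONDS_PER_YEAR)  # kg per m^3 per second
print(f"{rate:.2e} kg m^-3 s^-1")  # on the order of 5e-41
```

A rate of roughly 5e-41 kg per cubic metre per second makes the "undetectable" label plain: no conceivable instrument could register creation that slow.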

    I bow now to the statisticians, jrc